Libraries are for Use

Demonstrating the value of librarianship

New ways of viewing data


There has been a series of articles in Nature and The Scholarly Kitchen regarding (yet) another way of comparing the impact of journals via citations.  Among the problems with the more traditional journal impact factor (JIF) that the authors of the original paper were attempting to address are these:

  1. The JIF is an arithmetic mean, but the underlying citation distribution is heavily skewed to the right (that is, a few papers receive a great many citations).
  2. Because of that skew, the JIF is a poor measure of the distribution's central tendency.
  3. The underlying data are not openly available.
  4. The measure is not reproducible.

They then provide an alternative way of addressing at least three of these concerns by using Web of Science to gather data, compile it and generate a histogram of the distribution of citations.  Here is an example:

[Figure 1 of Lariviere, et al. 2016: distributions of citations from 11 different journals.]

(Side note: The appendix to this article provides wonderfully detailed steps, including screenshots of Excel workbooks, exact formulas, and graphs.  I am beginning to believe that all social science research should include such detailed instructions, just as biological and physical scientists include or reference protocols for their studies.)

Distribution of citations of articles from Biological Letters, as documented in Lariviere, et al. (http://biorxiv.org/content/biorxiv/early/2016/07/05/062109.full.pdf)

The authors recommended, among other things, that journals provide not only their JIF, but also a graph or histogram showing the distribution of citations.
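As a rough illustration of that recommendation, here is a minimal sketch of building such a histogram from per-article citation counts. The counts and bin edges below are hypothetical, not the paper's actual Web of Science data:

```python
# Sketch of the recommended citation-distribution histogram.
# The citation counts and bin edges here are hypothetical/illustrative,
# not the bins or data used by Lariviere, et al.
from collections import Counter

# Hypothetical per-article citation counts for one journal;
# note the long right tail (a few heavily cited papers).
citations = [0, 0, 1, 1, 1, 2, 2, 3, 3, 4, 5, 6, 8, 12, 40, 95]

def bin_label(c):
    """Assign a citation count to a coarse bin (edges are illustrative)."""
    if c == 0:
        return "0"
    if c <= 5:
        return "1-5"
    if c <= 10:
        return "6-10"
    return "11+"

hist = Counter(bin_label(c) for c in citations)

# Print a simple text histogram of the distribution.
for label in ["0", "1-5", "6-10", "11+"]:
    print(f"{label:>5} | {'#' * hist[label]} ({hist[label]})")
```

Even in this toy example, most articles cluster in the low bins while a handful of outliers sit far to the right, which is exactly the shape the JIF, as a mean, obscures.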

Now, the whole issue of measuring scientific outputs is controversial, and there has been a lively discussion of this particular recommendation, most of it critical; indeed, to me, the critics (Phil Davis and Kent Anderson) seem to dismiss the idea a little too quickly.  True, graphs and charts can be crafted to misrepresent the truth, but they can also clarify concepts.  One of the original paper’s authors called Davis’s blog post an obvious “straw man” argument, pointing to its loaded question about the misconceptions of the JIF and to Davis’s own re-plotting of the data without axis labels.  However, he did concede that when comparing journals’ distributions, they should be plotted on the same scales.  Here is a blog post that supports this idea: https://quantixed.wordpress.com/2015/05/05/wrong-number-a-closer-look-at-impact-factors/

I’ve been fascinated by distributions for quite some time, and the severe skewness of the data used in LIS research has always bothered and intrigued me.  It is now clear to me that we should no longer be reporting averages (means), which are sensitive to outliers, but rather medians, which are far less so.  It is also clear that no single measure or metric is useful by itself for summarizing the value of journals (or articles, or researchers, or institutions, or countries), though there is evidence that the measures we do have are often correlated.
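To make the mean-versus-median point concrete, here is a minimal sketch using hypothetical citation counts; a single highly cited outlier drags the mean well above the typical article while the median barely moves:

```python
# Mean vs. median on right-skewed data (hypothetical counts):
# one heavily cited paper dominates the arithmetic mean, while the
# median still describes the "typical" article.
import statistics

citations = [0, 1, 1, 2, 2, 3, 3, 4, 5, 120]  # one extreme outlier

mean = statistics.mean(citations)      # 14.1 -- pulled up by the outlier
median = statistics.median(citations)  # 2.5  -- unaffected by it

print(f"mean = {mean}, median = {median}")
```

This is the same reason the JIF, as a mean, can sit far above the citation count of a journal's typical paper.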

So I am intrigued by this recommendation, and will investigate integrating such distributions and images into our reports of data related to collections.

 
