Libraries are for Use

Demonstrating the value of librarianship

Assessment in C&RL


Just ahead of ALA, ACRL has released the latest issue of C&RL, with several articles on library assessment. Here are two I wanted to comment on:

The first describes a method I hope to put into practice soon: local citation analysis to assess the utility of library collections. The authors’ primary objective was “to find out the extent to which the collections of the Hesburgh Libraries of Notre Dame met the needs of the graduate doctoral student population.”  They looked at all references in all electronic dissertations (which account for “most” of the dissertations held by the library). They manually entered over 27,000 unique citations from over 39,000 references (whew! Even with various tricks to reduce the labor, that is a lot of work). Then they checked their catalog and other resources to (a) validate each citation and (b) determine library ownership or access.
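For anyone wanting to try something similar, here is a minimal sketch, in Python, of how the ownership check might be tallied once the references have been extracted and deduplicated. The CSV layouts, field names, and the holdings export are all hypothetical; the authors themselves did this checking manually against their catalog and other resources.

```python
import csv
from collections import Counter

def load_holdings(path):
    """Load a set of (normalized title, year) pairs the library owns or licenses.
    The single-file CSV layout here is a simplifying assumption."""
    holdings = set()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            holdings.add((row["title"].strip().lower(), row["year"].strip()))
    return holdings

def coverage_by_discipline(citations_path, holdings):
    """Return, per discipline, the fraction of cited items the library can supply."""
    owned, total = Counter(), Counter()
    with open(citations_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            discipline = row["discipline"]
            key = (row["title"].strip().lower(), row["year"].strip())
            total[discipline] += 1
            if key in holdings:
                owned[discipline] += 1
    return {d: owned[d] / total[d] for d in total}

if __name__ == "__main__":
    holdings = load_holdings("holdings.csv")  # hypothetical export of catalog holdings
    for discipline, rate in sorted(coverage_by_discipline("citations.csv", holdings).items()):
        print(f"{discipline}: {rate:.0%} of cited items owned or accessible")
```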

The results included several tables demonstrating differences in citation patterns across major disciplines, which were pretty much expected: more books than journals were cited in the arts and humanities, the sciences cited almost exclusively journal articles, and the social sciences cited about 60% articles. One notable difference: dissertations in computer science referenced conference papers most often (32%), followed closely by journal articles.
The authors analyzed the data by title and year (and by author, for books), which had the effect of listing the same journal title multiple times, once per year, among the top sources cited. While this may be useful for books (although the same thing would happen with different editions), I think it would be better to condense the data one more level and analyze by title alone. This would let other resources appear in the “top 25” lists.
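As a rough illustration of that point, the sketch below (again with hypothetical field names) contrasts ranking by title-and-year pairs with ranking by title alone; collapsing the year frees up slots in a “top 25” list for additional titles.

```python
import csv
from collections import Counter

def top_cited(citations_path, by_year=False, n=25):
    """Rank cited journals by (title, year) pairs or, with by_year=False, by title alone."""
    counts = Counter()
    with open(citations_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            title = row["title"].strip().lower()
            counts[(title, row["year"]) if by_year else title] += 1
    return counts.most_common(n)

# Counting by (title, year) can fill the top-25 list with repeats of one journal;
# counting by title alone lets other heavily cited sources surface.
print(top_cited("citations.csv", by_year=True))
print(top_cited("citations.csv", by_year=False))
```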
The most interesting part was Table 10, which details library ownership of the cited resources. Overall, the authors could boast that their library provided access to 63-76% of all citations referenced, and 75-93% of the top 1,000 items referenced. Across broad disciplines, the sciences had the greatest rate of ownership, with the social sciences and engineering having lower rates. The arts and humanities were mixed: low for all references cited (63%) but high for the top 1,000 references (90%).
Of course, this raises the question: does this actually reflect “need” or just “use”? Were there unmet needs among the doctoral candidates? How was the material not owned by the library acquired (ILL, loans from faculty)? How did the students become aware of these references (faculty referrals and personal libraries, references in other works)?
The other article is both a literature review of methods of assessment of online library instruction (or is it online methods of assessment of library instruction?) and a report on a survey of such assessment in academic libraries. While the literature review section starts out discussing online instruction and metrics of usage (e.g., “hits”), it evolves into a broad overview of instruction assessment in general. Mentioned here is the lack of good measures for evaluating non-traditional delivery of instruction, that is, instruction not given in a single face-to-face “session.” I really couldn’t tell, though, whether the authors’ primary concern was the technical aspects of measuring online instruction or the broader issue of more appropriate measures of impact and outcome in assessment.
This was followed by a description of their survey, but the authors did not effectively state its purpose beyond wanting “to go beyond the existing literature, opinion, and experience in the realm of statistical reporting of online LI activities.” Invitations to complete the survey were sent out on various listservs, asking librarians who provide instruction or manage such programs to respond. Because the survey was anonymous, no attempt was made to combine responses from the same library, which may have affected the results.
The results revealed that accounting for the delivery of online instruction is hardly standardized across institutions. Beyond there being no clear method of counting and recording these “sessions” (for lack of a better word), the comments showed quite a bit of confusion over whether and how to do so. One thing that was clear: online instruction takes much more time to prepare than face-to-face instruction. That does not surprise me; I have done both and have noticed the difference myself.
Overall, I think this issue is very important, but the survey revealed that the practice is still too immature to measure reliably. This seems like a good opportunity for a qualitative study: focus groups of assessment and instruction librarians could help identify key terms and ideas, which could then be clarified and condensed through a Delphi study.
All in all, good reading for a cold January night.

This entry was posted on January 12, 2012 in Assessment, Bibliometrics, Collections, LIS Data, and LIS Research.