Usage Data for Electronic Resources
WRAPS/FRIP Presentation
April 24, 2007
Gayle Baker, Maribeth Manoff, Eleanor Read
MaxData
http://web.utk.edu/~tenopir/imls/index.htm

“Maximizing Library Investments in Digital Collections Through Better Data Gathering and Analysis”

Funded by the Institute of Museum and Library Services (IMLS), 2005-2007
MaxData Project Purpose

- Evaluate and compare methods of usage data collection and analysis
- Develop a cost/benefit model to help librarians select appropriate method(s) for electronic resource usage assessments
MaxData Project Teams

- UT Libraries: COUNTER data from vendors, link resolver, database usage logs, federated search engine
- David Nicholas et al. (Ciber): deep log analysis of OhioLINK journal usage data
- Carol Tenopir and Donald King: readership surveys at UT and four Ohio universities
FRIP Equipment Award (Fall 2005)

- Requested:
  - PC with extra capacity for handling data
  - HP LaserJet printer
  - Microsoft Office 2003 Professional
  - Archival DVDs
  - $2,477
- Consulted with David Ratledge
- Housed in faculty study in Hodges
Project File Sharing

- Account (Usestat) on library server for project files for UT Libraries team
- Blackboard group site for MaxData team
Presentations

- Charleston 2005 (GB, ER / project intro)
- ER&L 2006 (GB / vendor data issues)
- Lib Assessment 2006 (ER, MM / combining data)
- Charleston 2006 (GB / vendor data results)
- ER&L 2007 (GB / vendor data survey)
- ELUNA 2007 (MM / SFX data)
- ALA/ACRL/EBSS 2007 (MM / data presentation)
- Charleston 2007 (all 3 / comparing data types)
Publications

- “MaxData: A Project to Help Librarians Maximize E-Journal Usage Data.” In Usage Statistics of E-Serials (summer 2007)
- “All That Data: Finding Useful and Practical Ways to Combine Electronic Resource Usage Data from Multiple Sources.” Library Assessment Conference Proceedings (May 2007)
- Article on vendor data survey results in Learned Publishing (due June 1, 2007)
The Usage Data Challenge

- Vendor-supplied data
- Other data
Vendor Reports: Background

- Vendor-supplied data are the primary source of e-journal usage information
- Project COUNTER is helpful, but…
- Manipulation may be required to compare use among vendors
Vendor Reports: Consolidating

- COUNTER Journal Report 1 (JR1)
- Data from each vendor combined in an Excel spreadsheet
- Facilitates additional analyses (see the sketch below):
  - Sorting by selected fields
  - Subject analysis
  - Cost-per-use calculations
COUNTER: JR1 Format
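A JR1 report is essentially one row per journal (title, publisher, platform, print and online ISSN), monthly columns of full-text article requests, and year-to-date totals. As a rough illustration of the consolidation and cost-per-use steps above, here is a minimal Python sketch, assuming each vendor's JR1 has been saved as a CSV file using those column labels; the directory name and price list are hypothetical:

# Combine per-vendor JR1 exports and compute cost per use.
# File locations, column labels, and prices are illustrative only.
import csv
from pathlib import Path

rows = []
for path in Path("jr1_exports").glob("*.csv"):  # one CSV per vendor
    with open(path, newline="", encoding="utf-8") as f:
        for rec in csv.DictReader(f):
            rows.append({
                "journal": rec["Journal"],
                "issn": rec.get("Print ISSN", ""),
                "platform": rec.get("Platform", path.stem),
                "ytd_total": int(rec["YTD Total"] or 0),
            })

rows.sort(key=lambda r: r["ytd_total"], reverse=True)  # sort by selected field

prices = {"1234-5678": 4995.00}  # hypothetical subscription prices keyed by ISSN
for r in rows:
    if r["issn"] in prices and r["ytd_total"]:
        r["cost_per_use"] = prices[r["issn"]] / r["ytd_total"]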
Vendor Reports: Challenges

- Inconsistencies in data fields:
  - Journal title (leading articles, upper/lower case, extra information)
  - ISSN (with and without hyphen)
- Time-consuming to fix (see the sketch below)
- ScholarlyStats, SUSHI, ERMS may help
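The cleanup largely amounts to normalizing the matching keys before merging. A minimal sketch; the matching rules here are illustrative, not the project's actual procedure:

import re

def normalize_issn(issn):
    """Reduce '1234-5678' and '12345678' to one canonical, hyphenated form."""
    digits = re.sub(r"[^0-9Xx]", "", issn)
    return f"{digits[:4]}-{digits[4:]}".upper() if len(digits) == 8 else issn.strip()

def normalize_title(title):
    """Lowercase, drop a leading article, strip trailing extra information."""
    t = title.strip().lower()
    t = re.sub(r"^(the|a|an)\s+", "", t)          # leading articles
    t = re.sub(r"\s*[\(\[].*?[\)\]]\s*$", "", t)  # e.g. '(Online)' suffixes
    return re.sub(r"\s+", " ", t)

assert normalize_issn("12345678") == normalize_issn("1234-5678")
assert normalize_title("The Lancet") == normalize_title("Lancet (Online)")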
Survey: Purpose

- How much effort is involved in working with vendor-supplied use data?
- How are the data used?
- What data are most useful in managing electronic resources?
Survey: Subjects

- Sent to Library Directors at Carnegie I and II research institutions (360+)
- April 2006
- 92 respondents
Survey results (presented as charts):

- Number of Vendors Providing Usage Reports
- Reports for Different Types of Resources
- Purpose for Reviewing and/or Analyzing Vendor Data
- Number of Hours Processing Usage Reports in 2005
- Percentage of Time Processing Vendor Data
Biggest Challenges

- Lack of consistency / standards (61)
- Takes too much time (27)
- COUNTER standards help, but… (14)
Most Useful Statistic(s)

- Number of full-text downloads (67)
- Number of searches (41)
- Number of sessions (27)
- COUNTER statistics (26)
- Number of turnaways (17)
- Other (17)
Other (Local) Data

- UT: database “hits” recorded from database menu pages
- Federated search system (MetaLib) statistics
- Some libraries use proxy server logs (see the sketch below)
- Link resolver (SFX) data
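As one example of the effort a local source requires, a minimal sketch of mining a proxy server log: count hits per vendor host, assuming a common-log-format file that records the full requested URL (the file name is hypothetical):

import re
from collections import Counter

hits = Counter()
url_host = re.compile(r'"(?:GET|POST) https?://([^/\s:]+)')
with open("proxy_access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        m = url_host.search(line)
        if m:
            hits[m.group(1).lower()] += 1  # tally requests per vendor host

for host, count in hits.most_common(10):
    print(f"{count:8d}  {host}")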
Link Resolver Data

- SFX includes a statistical module with a number of “canned” reports
- For journal-level data, one report in particular (“Requests and clickthroughs by journal and target”) is analogous to COUNTER JR1
SFX “Request” and “Clickthrough” Data

- A UT student searching in an SFX “source” discovers an article of interest
- Clicks on the FindText button
- The article is available electronically in Journal A from Packages Y and Z: a “Request” statistic is recorded for each target
- The student chooses the link to Journal A in Package Y: a “Clickthrough” statistic is recorded
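As a toy model (hypothetical data structures, not SFX's internals), the walk-through above tallies like this:

from collections import Counter

requests = Counter()
clickthroughs = Counter()

def findtext(journal, targets, chosen=None):
    for target in targets:              # one "request" per target offered
        requests[(journal, target)] += 1
    if chosen is not None:              # one "clickthrough" for the link used
        clickthroughs[(journal, chosen)] += 1

# The walk-through above: Journal A offered in Packages Y and Z, user picks Y
findtext("Journal A", ["Package Y", "Package Z"], chosen="Package Y")
assert requests[("Journal A", "Package Y")] == 1
assert requests[("Journal A", "Package Z")] == 1
assert clickthroughs[("Journal A", "Package Y")] == 1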
SFX “Clickthroughs” vs. JR1 “Full-Text Article Requests”

- A clickthrough is less specific; it does not measure an actual download
- But a clickthrough is a “known quantity,” not dependent on the package interface
- The SFX report is a useful supplement to JR1 for comparing trends and patterns
- SFX contains data not in JR1 reports, e.g., non-COUNTER packages, open access journals, backfiles
Formatting the SFX Report

- The report from SFX is not formatted like JR1, but it does contain the data elements
- Options:
  - Request to the software vendor: include in the statistical module
  - Incorporate into an ERMS
  - Manual or programming approach, depending on time and expertise available (a sketch follows)
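For the programming approach, a minimal sketch that pivots an export of the “Requests and clickthroughs by journal and target” report into a JR1-style one-row-per-journal table; the input file name and column labels are assumptions, since the actual export layout varies:

import csv
from collections import defaultdict

clicks = defaultdict(int)  # (journal, issn) -> clickthroughs summed over targets
with open("sfx_by_journal_and_target.csv", newline="", encoding="utf-8") as f:
    for rec in csv.DictReader(f):
        clicks[(rec["Journal"], rec.get("ISSN", ""))] += int(rec["Clickthroughs"] or 0)

with open("sfx_jr1_style.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Journal", "ISSN", "Clickthroughs"])
    for (journal, issn), total in sorted(clicks.items()):
        writer.writerow([journal, issn, total])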
Other Useful Link Resolver Reports and Data

- Unmet user needs:
  - Journals “requested” with no electronic full text available
  - Interlibrary loan requests
- Unused full-text report
- Overlap reports
- Subject categories
Conclusions So Far

- Collecting, consolidating, and analyzing vendor data is time-consuming and difficult
- The survey of electronic resource librarians indicates many do not have enough time
- Acquiring data from local systems provides consistency, but also requires time and effort
- Libraries face difficult decisions about which methods are most practical and useful
Into the Future

- Present selected data sets to subject librarians to see what they find useful
- Investigate the usefulness of new COUNTER standards
- Will SUSHI solve our problems? ERMS? (see the sketch below)
- Compare our findings with those of the other MaxData teams
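For context, what SUSHI (NISO Z39.93) promises is machine-to-machine harvesting: a SOAP request that returns a COUNTER report as XML instead of a manually downloaded spreadsheet. A rough sketch of the shape of such a call; the endpoint URL, identifiers, and envelope details are placeholders, as the exact schema is defined by the SUSHI standard and varies by vendor implementation:

import urllib.request

# Placeholder SOAP envelope: element names and namespaces sketched from the
# SUSHI standard; a real request must match the vendor's published schema.
ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ReportRequest xmlns="http://www.niso.org/schemas/sushi">
      <Requestor><ID>example-requestor</ID></Requestor>
      <CustomerReference><ID>example-customer</ID></CustomerReference>
      <ReportDefinition Name="JR1" Release="2">
        <Filters>
          <UsageDateRange>
            <Begin>2006-01-01</Begin>
            <End>2006-12-31</End>
          </UsageDateRange>
        </Filters>
      </ReportDefinition>
    </ReportRequest>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    "https://vendor.example.com/sushi",  # placeholder endpoint
    data=ENVELOPE.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
)
with urllib.request.urlopen(request) as response:
    counter_xml = response.read()  # the COUNTER report, returned as XML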