Empowering Data: Persuasion Through Presentation
ALA 2007 Annual Conference, Washington, DC
Saturday, June 23, 2007, 1:30-3:30

Sponsored by:

ACRL and EBSS

Presentation One

Robert Molyneux

Academic Library Data: An overview of their use and condition

[email protected]

What are data?

Numerical evidence

What do librarians, traditionally, use data for?

- Funding justification
- Decision support
  - An index to like-libraries
  - Planning, budgeting, etc.
- Episodic, expensive mega-studies

What don't we do?

- Research
- Search for cause

How do academic libraries work?

- What works well?
- If I get more X, what will happen to Y?
- What trends?

Structure of academic library data

- We have lots
- Primarily annual
  - That is, compilers do not create longitudinal files
  - Ergo, you cannot study trends except by the crudest methods
- Data collected by others are well documented but often not organized to answer our questions
- Data collected by librarians, when available electronically, are usually poorly documented
  - Recompiling them into longitudinal files is dauntingly complex

What do we have?

- ARL, 1907/08-current. The premier library data series.
- ACRL1, about 1930-1960, census
- ACRL2, 1977/78-1996/97. "Second 100"
- ACRL3, 1998/99-current, census
- Academic Library Survey. NCES, census. Broken but reviving.

PUBLIC DOMAIN.

- LSU, 1926/78-1986/87, select Southern colleges and Universities.

And others that are foreseeable

- Oberlin Group. Prestigious undergraduate institutions. Unpublished.
- Association of Southeastern Research Libraries
- Canadian Association of Research Libraries

Where are we now?

- Mostly annual data
- Mostly badly documented
- For a few large libraries
- What is documented still will usually require programming skill
- Little is in the public domain

What is coming for certain (thank you, National Commission on Libraries and Information Science)

- Academic Library Statistics Longitudinal File, and derivative data products
- What I hope is coming: WBOW

Jump in. There are many questions that need answers. Meanwhile, some of my rules:

- Not all questions are data questions
- Data will not always answer the question but they might get you closer to an answer and constrain mistakes
- It's always apples and oranges
- The true lesson from the Garden of Eden: bad data

But ALWAYS remember:

- Never take counsel of your fears

Presentation Two

Steve Hiller

Director of Assessment and Planning, University of Washington

Make Data Meaningful

Select, Summarize, Compare, Analyze, Present

• Select the appropriate method or data sets
  – Know the limitations of the data
• Compile/review results, summarize and analyze
  – Descriptive statistics
  – Apparent themes or patterns
  – Variations and differences
• Comparisons add understanding and context
  – Chronological
  – Within and between groups
  – Institutions and norms

Presenting Data and Results

• Make sure data/results are:
  – Timely
  – Understandable
  – Usable
• Identify important findings/key results
  – What’s important to know
  – What’s actionable
• Present key/important results to:
  – Library administration/institutional administration
  – Library staff
  – Other libraries/interested parties/stakeholders

Be Graphic!!!

“Often the most effective way to describe, explore and summarize a set of numbers – even a very large set – is to look at pictures of those numbers. Furthermore, of all methods for analyzing and communicating statistical information, well designed data graphics are usually the simplest and at the same time the most powerful.”

Edward Tufte, The Visual Display of Quantitative Information

"The Leonardo da Vinci of data." (The New York Times)

Traditional Library Core Business

• Physical Collections
  – Print (primarily)
  – Microforms
  – Other
• Facilities
  – House collections
  – Customer service & work space
  – Staff work space
• Services
  – Reference
  – Instruction
  – Access

UW Libraries Input/Output Use Measures Down

In-Library Use: 2.12 million in 1995-96; 0.49 million in 2005-06

Weekly Visits    1998   2007
Faculty           47%    21%
Grad student      78%    46%
Undergrad         67%    66%

Reference: 142,000 in 2002-03; 106,000 in 2005-06

Time for a New Business Model?

Not just better libraries . . . Not better customers . . . but

Demonstrate the Value the Library Provides the University Community

“Documenting the libraries’ contributions to quality teaching, student outcomes, and research productivity will become critical.” (Yvonna Lincoln, 2006)

What We Need to Know to Support Our Communities

Who are our customers (and potential customers)?

• What are their teaching, learning, clinical and research interests? How do they work? What’s important to them?

• What are their library and information needs?

• How do they currently use library/information services?
• How would they prefer to do so?
• How do they differ from each other in library use/needs?

How does the library add value to their work?

UW Libraries Biosciences Review Task Force:

Reasons for Review (2005-06)

• Better understand how bioscientists work
• Understand significance and value of bioscience and research enterprise to University
• Growing interdisciplinarity in research and teaching
• Implications of significant change in library use patterns
• Viability of Libraries organizational structure/footprint
• Strengthen library connection to research enterprise

Some Potential Existing Data Sources

• National/International
  – Program rankings
  – Research awards
  – Scholarly productivity
• Institutional
  – Student enrollments; faculty/researcher appointments
  – Degrees granted
  – Research awards by area
• Library
  – Use and usage patterns
  – Information needs
  – Value and impact on teaching, learning and research

University of Washington Libraries

Assessment Methods Used

• Large-scale user surveys every 3 years (“triennial survey”): 1992, 1995, 1998, 2001, 2004, 2007
  – All faculty
  – Samples of undergraduate and graduate students
  – Research scientists, Health Sciences fellows/residents (2004)
• In-library use surveys every 3 years beginning 1993
• Focus groups/interviews (annually since 1998)
• Observation (guided and non-obtrusive)
• Usability
• Use statistics/data mining
• Information about the assessment program available at:

http://www.lib.washington.edu/assessment/

Biosciences Review Process (2006)

• Define scope (e.g. what is “bioscience”?)
• Identify and mine existing data sources
• Acquire new information (primarily qualitative)
  – Environmental scan
  – Interviews (biosciences faculty)
  – Focus groups (biosci faculty & students)
  – Peer library surveys
  – NO USER SURVEYS
• Review, analyze and understand information
• Final report and recommendations
• Actions

Institutional Data: UW Students, Faculty and Doctorates Awarded by Academic Area

Undergraduate Majors: Bioscience 21%, Health Sci 5%, Phy Sci-Eng 22%, Other 52%
Grad/Professional Students: Bioscience 12%, Health Sci 31%, Phy Sci-Eng 18%, Other 38%
Faculty: Bioscience 10%, Health Sci 47%, Phy Sci-Eng 18%, Other 20%
Doctorates Awarded: Bioscience 19%, Health Sci 20%, Phy Sci-Eng 30%, Other 31%

Institutional Data: FY 2005 External Funding by Source and UW Faculty Area (in millions)

Awards by Source: Health and Human Services 536, National Science Foundation 88, Other Federal 81, US Dept of Educ 46, US Dept of Def 40, Industry-Found 79, State of Wash 21, Other Non-Fed 105
Awards by Area: School of Medicine 470, Other Health Sciences 198, Sciences & Natural Resources 173, Engineering 67, Other Programs 45, Hum/SocSci/Arts 39

100% Faculty External Funding by Academic Area (2004 survey)

[Stacked bar chart. Columns: Health Sciences (n=728), Sciences-Engineering (n=389), Humanities/Soc Sci/Arts (n=365). Humanities/Soc Sci/Arts: Fed Only 8%, Fed and Other 9%, Other Only 21%, None 62%. Health Sciences and Sciences-Engineering faculty are predominantly externally funded: Fed Only 28-34%, Fed and Other 42-43%, Other Only 8-11%, None 16-17%.]

UW Faculty Mode of Use by Academic Area, 1998/2007 (weekly)

                   Visit Only   Remote & Visit   Remote Only   Non-Weekly
Health Sci 1998        6%            32%             45%           17%
Health Sci 2007        0%             7%             87%            5%
Science-Eng 1998      10%            39%             26%           25%
Science-Eng 2007       1%            17%             71%            9%
Hum-Soc Sci 1998      10%            51%             23%           15%
Hum-Soc Sci 2007       1%            42%             47%            9%

Journal Article Downloads 2005

Packages with more than 100,000 downloads

Highwire          1,320,817
Science Direct      858,865
ProQuest            781,042
Nature              314,079
Expanded AI         232,131
OVID                203,314
Blackwell           167,029
Wiley               143,641
ACS                 134,055
Science Mag         122,188
PsycArticles        116,871
Oxford UP           114,476
Springer/Kluwer     104,213

Biology Undergrads Library Use

(2005 In-Library Use Survey)

Libraries Used by Biology Undergrads (n=136): OUGL 44%, Main 33%, Chemistry 9%, plus other science and health sciences libraries

[Bar chart (0-70% scale), What They Did: Worked alone, Group work, Used computer, Looked for material, Asked for help]

The Qualitative Provides the Key

• Increased use and importance of such qualitative methods as comments, interviews, focus groups, usability, observation
• Statistical data often can’t tell us
  – Who, how, why
  – Value, impact, outcomes
• Qualitative provides information directly from users
  – Their language
  – Their issues
  – Their work
• Qualitative provides understanding

Faculty Interview Themes

• Library seen primarily as E-Journal provider
• Physical library used only for items not available online
• Start information search with Google and PubMed
• Too busy to attend training, instruction, workshops
• Faculty who teach undergrads use libraries differently
• Could not come up with “new library services” unprompted

Focus Group Themes:

Print Is Dead, Really Dead
Our Virtual Space, Not Yours

• Google, PubMed, Web of Science starting points for all
• Faculty identify library with E-journals
• Want all content online; if not online, deliver digitally
• Faculty/many grads go to physical library as last resort
• Too many physical libraries
• Lack understanding of many library services, resources
• Increasing overlap between “bio” & other science research

Sources Consulted for Information on Research Topics, 2007 Survey (scale of 1 “Not at All” to 5 “Usually”)

[Chart comparing use of Bibliographic DBs and the Open Internet by Health Science, Sci-Engineer, and Hum-Soc Sci graduate students and faculty; group means fall roughly between 3 and 5 on the scale]

BioScience Task Force Recommendations

• Make physical libraries more inviting/easier to use
  – Consolidate collections and service points
  – Reduce print holdings; focus on service
• Integrate search/discovery tools into users’ workflow
• Expand/improve information/service delivery options
• Use an integrated approach to collection allocations
• Increase integration of librarians with user workflow
• Lead/partner on scholarly communication and e-science issues
• Provide more targeted communication and marketing

Using the Data: 2007 Actions

• Appointed Director, Cyberinfrastructure Initiatives and Special Assistant to the Dean of University Libraries for Biosciences and e-Science
• Work underway to standardize interlibrary loan and article delivery regardless of the academic program the request comes from
• Informed development of the Libraries 2007 Triennial Survey
• Libraries Strategic Plan priority initiatives for 2007 include:
  – Improve discovery to delivery (WorldCat Local etc.)
  – Reshape our physical facilities as discovery and learning centers for the University community
  – Strengthen existing delivery services, both physical and digital, while developing new, more rapid delivery services
  – Enhance and strengthen the Libraries’ support for UW’s scientific research infrastructure

Presentation Three

Maribeth Manoff

Coordinator for Networked Services Integration, University of Tennessee

E-Resource Usage Data for Decision Making and More

Outline

• MaxData Project
• UT Libraries and E-Resource Usage Data
  – What data do we have available?
  – What questions do we want to answer?
  – Which data do we use to answer our questions?
• Next Steps
  – Survey of librarians
  – Gathering a variety of data sets and views

MaxData Project

“Maximizing Library Investments in Digital Collections Through Better Data Gathering and Analysis”

Funded by the Institute of Museum and Library Services (IMLS), 2005-2007

Carol Tenopir, PI

MaxData Project Purpose

• Evaluate and compare methods of usage data collection and analysis
• Develop cost/benefit model to help librarians select appropriate method(s) for electronic resource usage assessments

MaxData Project Teams

• Carol Tenopir and Donald King: readership surveys at UT and four Ohio universities
• David Nicholas et al. (Ciber): deep log analysis on OhioLINK journal usage data
• UT Libraries: COUNTER data from vendors, data from several local systems
  – Gayle Baker, Electronic Services Coordinator
  – Eleanor Read, Data Services Librarian
  – Maribeth Manoff, Systems Librarian

UT Libraries – What Data Do We Have Available?

• Database or package level
  – “Hits” recorded from database menu pages
  – COUNTER reports from vendors (e.g. Database Report 1: Total Searches and Sessions by Month and Database)
  – Federated search system (MetaLib) statistics
  – Totals from journal-level data

UT Libraries – What Data Do We Have Available?

• Journal level
  – COUNTER reports from vendors (e.g. Journal Report 1: Number of Successful Full-Text Article Requests by Month and Journal)
  – Link resolver (SFX) statistics (e.g. Report 10: Requests and Clickthroughs by Journal and Target)
  – Some libraries using proxy server logs

UT Libraries – What Questions Do We Want to Answer?

• We want insights into user behavior
  – Which databases and e-journals are they using?
  – Which are they not using?
  – How are they navigating to full-text?

UT Libraries – What Questions Do We Want to Answer?

• We want to make collection management decisions and improve access
  – Does e-resource usage justify money spent?
  – Which of our e-subscriptions do we need to keep?
  – Are there new e-titles we need to obtain?
  – Are there low-use titles that would benefit from PR efforts?
  – Are there access points that could be added or improved?

UT Libraries – Which Data Do We Use?

• The good news and the bad news: all the data we’ve looked at thus far has potential.
• Next steps in determining costs and benefits for the model:
  – Survey of librarians who collect and work with usage data from vendors
  – Gathering a variety of data sets and views for analysis and presentation

Next Steps – Survey Purpose

• How much effort is involved in working with vendor-supplied use data?
• How are the data used?
• What data are most useful in managing electronic resources?

Next Steps – Survey Subjects

• Sent to Library Directors at 284 Carnegie/ARL research institutions
• Forwarded to the librarian who collects and works with usage data from vendors
• April 2006
• 92 respondents

Next Steps – Survey Results

• Purpose for reviewing and/or analyzing vendor data
  – Subscription decisions (94%)
  – Justify expenditures (86%)
  – Reporting (61%)
  – Other (28%)

Next Steps – Survey Results

• “Other” purposes for reviewing and/or analyzing vendor data
  – Collection management
    - Cost/use
    - Cancellation decisions
    - Change to electronic-only
    - Promotion/marketing/training for lower-use e-resources
  – Administrative
    - Strategic planning/budget
  – Curiosity

Next Steps – Survey Results

• Most useful statistics
  – Number of full-text downloads (67)
  – Number of searches (41)
  – Number of sessions (27)
  – COUNTER statistics (26)
  – Number of turnaways (17)
  – Other (17)

Next Steps – Survey Results

• Combining data for added value
  – Combine vendor stats (36)
  – Combine/compare with other use data gathered electronically (SFX, web logs, consortia reports) (17)
  – Cost per use (12)
  – Fund code/subject (5)
  – Other (12)

Next Steps – Data Gathering and Analysis

• Since 1999, homegrown system to record “hits” to web page database links
• Subscription management
  – Number of simultaneous users
  – Pattern of use of a database over time
  – Continuation decisions
  – Cost per request
• Services management
  – Use patterns by day, week or semester
  – Location of users (campus, off-campus, wireless)
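The subscription-management arithmetic behind these hit counts is simple cost-per-use division. A minimal sketch in Python, where the database names, costs, and hit counts are purely hypothetical illustrations, not actual UT figures:

```python
# Hypothetical cost-per-request calculation from annual subscription
# cost and recorded "hits" (clicks on a database's menu-page link).
# All names and numbers below are illustrative, not actual UT data.
subscriptions = {
    "Database A": {"annual_cost": 12000.00, "hits": 4800},
    "Database B": {"annual_cost": 3500.00, "hits": 140},
}

for name, s in subscriptions.items():
    cost_per_request = s["annual_cost"] / s["hits"]
    print(f"{name}: ${cost_per_request:.2f} per request")
```

Here Database A works out to $2.50 per request and Database B to $25.00, the kind of gap that feeds a continuation decision.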

Next Steps – Data Gathering and Analysis

• Totals from COUNTER vendor reports for reporting to ARL (Supplementary Statistics)
  – Number of sessions (logins) to databases or services
  – Number of searches (queries) in databases or services
  – Number of successful full-text article requests

Next Steps – Data Gathering and Analysis

• Journal-level analysis for the MaxData project
  – Spreadsheet combining JR1 reports for 30+ vendors
  – Using the Sep-Nov 2005 time period to correspond with MaxData readership surveys
  – Interest in high-use titles, also low or zero use
  – Calculate % of titles accounting for 80% of full-text downloads
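The last step above, finding the share of titles that accounts for 80% of full-text downloads, is a small cumulative-sum calculation. A sketch, with hypothetical download counts standing in for the combined JR1 data:

```python
# What fraction of journal titles accounts for a given share (here 80%)
# of full-text downloads? The download counts below are hypothetical.
def pct_titles_for_share(downloads, share=0.80):
    """Fraction of titles (highest-use first) needed to reach `share` of downloads."""
    counts = sorted(downloads, reverse=True)
    total = sum(counts)
    running = 0
    for i, count in enumerate(counts, start=1):
        running += count
        if running / total >= share:
            return i / len(counts)
    return 1.0

jr1_downloads = [900, 450, 300, 150, 100, 50, 30, 15, 5, 0]
print(f"{pct_titles_for_share(jr1_downloads):.0%} of titles account for 80% of downloads")
# prints: 30% of titles account for 80% of downloads
```

Sorting descending first means the result reflects the most-used titles, the usual way this 80/20-style figure is quoted.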

Next Steps – Data Gathering and Analysis

• In order for journal-level data to be useful to collection managers, we need a way to present data by subject area
• Subject categories from link resolver (SFX), mapped to local subjects and added to combined JR1 spreadsheet

Next Steps – Data Gathering and Analysis

• Data from SFX statistical module to complement and supplement vendor data
• Elements from the SFX report correspond to the COUNTER JR1 report:
  – ISSN, Title, SFX Target – JR1 Platform
  – SFX Clickthrough – JR1 Full Text Download
• Look at these two numbers alongside each other to see a fuller picture of use
  – Compare trends and patterns
  – See data not in JR1 reports, e.g., non-COUNTER packages, open access journals, backfiles
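The pairing described above amounts to a merge on ISSN that keeps titles appearing in only one source, which is exactly where non-COUNTER packages and open-access journals surface. A minimal sketch, with hypothetical ISSNs and counts:

```python
# Place SFX clickthroughs alongside COUNTER JR1 full-text downloads,
# keyed by ISSN. All ISSNs and counts here are hypothetical examples.
jr1_downloads = {"1234-5678": 410, "2345-6789": 95}
sfx_clickthroughs = {"1234-5678": 120, "9999-0000": 40}  # no JR1 row, e.g. open access

def combine(jr1, sfx):
    """Union of ISSNs with both counts side by side (0 where a source has no row)."""
    return {
        issn: {"jr1": jr1.get(issn, 0), "sfx": sfx.get(issn, 0)}
        for issn in set(jr1) | set(sfx)
    }

for issn, use in sorted(combine(jr1_downloads, sfx_clickthroughs).items()):
    print(issn, use["jr1"], use["sfx"])
```

Taking the union of the two key sets, rather than the intersection, is what lets a title with clickthroughs but no JR1 downloads show up in the comparison.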

Empowering Data: An Iterative Process

• Ask questions
• Envision the data possibilities
• Gather data sets and views
• Have we answered our questions?
• Ask more questions, envision more possibilities, gather more data…

Thank you for attending this EBSS Program!

Please join us for the EBSS Research Forum

http://www.ala.org/ebss/empoweringdata