Value and Impact Measurement: a UK perspective

Building a resource for practical
assessment: adding value to value
and impact
Stephen Town
University of York, UK
Library Assessment Conference, Seattle
Wednesday 6th August, 2008
Summary
• The SCONUL Value & Impact Measurement Programme (“VAMP”) recap
• The Performance Portal
• ‘Value’ options
– UK drivers (TRAC)
– Institutional Case: The Open University’s Best Value project
– Benchmarking & national statistics
Introduction & recap
The University Context
(from the 2006 Library Assessment Conference,
after Lombardi)
Universities have two “bottom lines”
1. Financial (as in business)
2. Academic, largely through reputation in
• Research (the priority in “leading” Universities)
• Teaching (& maybe Learning)
Library Pressures for Accountability
The need is therefore to demonstrate the Library
contribution in these two dimensions:
1. Financial, through “value for money” or related
measures
2. Impact on research, teaching and learning
This also implies that “competitive” data will be highly
valued
The Aim & Role of Universities & their
Libraries: cautions for measurement
• Research, Teaching & Reductionism
– ‘Mode 1’ Research & impact ‘transcendental’
– ‘Mode 2’ Research & impact ‘instrumental’
– Value, Price & ‘Mandarinisation’ of research and its support
– Interdisciplinary research
– Collaborative research across institutions
– Learning as a set of discrete assessed modules
• All of this may damage the idea of Libraries
as ‘transcendent’, collective and connective
services
SCONUL Member Survey Findings
• 70% had undertaken value or impact measurement
• Main rationales are advocacy, service improvement,
comparison
• Half used in-house methodologies; half used
standard techniques
• Main barrier is a lack of tools
– creating issues of time and buy-in
Member Survey Conclusions
• There is a need to demonstrate value and that
libraries make a difference
• Measurement needs to show ‘real’ value
• Need to link to University mission
• Libraries are, and intend to be, ahead of the game
• Impact may be difficult or impossible to measure
• All respondents welcomed the programme, and the
prospect of an available toolkit with robust and
simple tools
VAMP Objectives
• New measurement instruments & frameworks to fill identified gaps
• A full coherent framework for performance,
improvement and innovation
• Persuasive data for University Senior
Managers, to prove value, impact,
comparability, and worth
Missing methods
• An impact tool or tools, for both teaching &
learning and research (from the
LIRG/SCONUL initiative?)
• A robust Value for Money/Economic Impact
tool
• Staff measures
• Process & operational costing tools
Programme Benefits
1. Attainment & retention of Library institutional
income
2. Proof of value and impact on education and
research
3. Evidence of comparability with peer institutions
4. Justification of a continuing role for libraries and
their staff
5. Meeting national costing requirements for
separating spend on teaching and research
Communities of Practice
“groups of people who share a passion
for something that they know how to do,
and who interact regularly to learn how to
do it better”
“coherence through mutual engagement”
Etienne Wenger, 1998 & 2002
VAMP Project Structure
• Analysis: March–June 2006
• Tools I (Impact): to June 2007
• Site Development: to June 2007
• Tools II (Value): to ?
• CoP development
• Maintenance
The Performance Portal
[Diagram: the VAMP Home Page linking to the Community of Practice, a Members’ Forum (blog? chat?), Techniques in Use (wiki?), and Techniques, split into Simple Introductions and Detailed Techniques]
The ‘Performance Portal’
• A Wiki of library performance measurement
containing a number of ‘approaches’, each
(hopefully) with:
– A definition
– A method or methods
– Some experience of their use in libraries (or links to this)
– The opportunity to discuss use
• Content submission
• User guide
• Discussion Tools
• An experiment in social networking & Web
2.0 technologies
The Ontology of Performance
• ‘Frameworks’
• ‘Impact’
• ‘Quality’
• ‘Statistics’
• ‘Value’
• A visual Mind map?
Frameworks
Mounted
• European Foundation for Quality Management (EFQM) Excellence Model
• Key Performance Indicators
• The Balanced Scorecard
Desired
• Critical Success Factors
• The Effective Academic Library
Impact
Mounted
• Impact tools
• Detailed UK experience from LIRG/SCONUL Initiatives
Desired
• Outcome based evaluation
• Information Literacy measurement
• More on research impact
Quality
Mounted
• Charter Mark
• Customer Surveys
  – LibQUAL+
  – SCONUL Survey
  – Priority Research
• Investors in People
Desired
• Benchmarking
• Quality Assurance
• ISO 9000s
• ‘Investors in People’ experience
• Opinion meters
• Quality Maturity Model
Statistics
Mounted
• SCONUL Statistics & interactive service
• HELMS statistics
Desired
• Institutional experience of using SCONUL statistics for local advocacy
• COUNTER
• E-resource tools
Value
Mounted
• Contingent valuation
• ‘Transparency’ costing
Desired
• Staff & process costing, value & contribution
• E-resource value tools
Value
What is value?
• Cost efficiency
• Cost effectiveness
• Cost comparison (Case 3)
• Financial management process standards & audit
• Financial allocation (Case 1)
• Valuation
• Value added
• Return on investment
• Best value (Case 2)
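As a concrete anchor for these terms, return on investment reduces to a simple ratio; a minimal sketch with invented figures (not results from any study cited here):

```latex
% Return on investment as a ratio of net benefit to cost;
% the £ figures are invented purely for illustration
\[
\mathrm{ROI} = \frac{\text{benefit} - \text{cost}}{\text{cost}}
= \frac{\text{£4.4M} - \text{£1.0M}}{\text{£1.0M}} = 3.4
\]
```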
Case 1: TRAC
UK Higher Education Transparency initiative, 2000–09
The Transparent Approach to Costing
• The standard method for costing in UK HEIs
• Government requirement
• Ending of cross-subsidy (teaching vs research)
• Research funding based on full economic costing (fEC)
• Positive effects on funding
• Positive effect on pricing
Implications
• All activity to be identified as ‘research’,
‘teaching’ or ‘other’
• Library as ‘other’? or
• All library activities either research or
teaching, or a simplistic apportioning to
each
• Libraries omitted as a component of
research costs, and therefore as a share
recipient
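To make the ‘simplistic apportioning’ risk concrete, a minimal sketch of how a library budget might be split across the TRAC categories; the budget and percentages are invented for illustration, not TRAC figures:

```latex
% Hypothetical TRAC-style apportionment of a £5M library budget;
% all figures invented for illustration
\[
\text{£5.0M} =
\underbrace{0.45 \times \text{£5M}}_{\text{teaching: £2.25M}} +
\underbrace{0.35 \times \text{£5M}}_{\text{research: £1.75M}} +
\underbrace{0.20 \times \text{£5M}}_{\text{other: £1.0M}}
\]
```

Under such a split, whatever share the activity survey assigns to research determines how much library cost can be counted within full economic costing.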
Case 2: the UK Open University
Library’s Best Value Programme
OU Best Value Programme Objectives
• To increase the business skills of library
managers & staff
• To develop skills to support customer-focused, cost-efficient management decision making
• To develop benchmarking evaluation skills
that balance quality, value and cost
efficiency
Strands
• Business reporting
• Process costing and continuous improvement
• Service planning
• Benchmarking
‘to generate real accountability’
Business reporting elements
• Library business areas
• Five performance indicators (PIs) per area, including cost, quality & customer impact
• Forecast, variance & remedial action
Has improved use of management information,
efficiency, prioritisation and expenditure
control
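A minimal sketch of the forecast-and-variance element for a single business area; all figures are invented for illustration:

```latex
% Hypothetical forecast vs actual for one business-area PI;
% figures invented for illustration
\[
\text{variance} = \text{actual} - \text{forecast}
= \text{£118,000} - \text{£110,000} = \text{£8,000}
\]
\[
\text{£8,000} / \text{£110,000} \approx 7.3\% \text{ over forecast}
\;\Rightarrow\; \text{remedial action}
\]
```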
Process Costing
• Complete process and stage costing
• Average times and skill levels
• Included enquiries, cataloguing, e-resources, IT
support, document delivery, counter services
Has delivered justification for staffing levels against
activity, staffing formulae, redeployment to
priority areas, and process improvements
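A minimal sketch of the arithmetic behind process and stage costing, where \(t_s\) is the average time spent at stage \(s\) and \(r_s\) the staff rate for the skill level used there; all figures are invented for illustration:

```latex
% Hypothetical unit cost of an enquiry, built from stage times and
% skill-level rates, then scaled by annual volume; figures invented
\[
\text{unit cost} = \sum_{s \in \text{stages}} t_s \, r_s
= (10\,\text{min} \times \text{£0.30/min})
+ (5\,\text{min} \times \text{£0.50/min}) = \text{£5.50}
\]
\[
\text{annual process cost} = \text{£5.50} \times 20{,}000 \text{ enquiries}
= \text{£110,000}
\]
```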
Service plans
• Costed service plans to achieve medium
term improvement and development through
a rolling programme
Included document delivery, enquiries,
information literacy, and e-resources
Programme benefits and outcomes
• Staff development
– cost-conscious decision-making
– business skills
• Management information improvement
• Clarity about customers and use
• Improved quality
• Ability to ‘sell benefits’
Case 3: Financial benchmarking
International Benchmarking initiatives
• The OU was able to engage in and lead a benchmarking exercise against distance education Universities worldwide
• In one 2008 international benchmarking
study
– Only one institution (out of eight) had a comprehensive
costing model
Financial Statistical Convergence
• York Meeting, 2008
– OCLC/RLG
– ARL
– SCONUL
– CAUL
Conclusion & Questions
• What do we mean by value?
• Why do we not yet have a collective view on
costing approaches?
– Skills deficiency?
– Lack of real need or real financial performance
accountability?
– We would rather not know?
– Are we more intent on increasing budgets than seeking
efficiency improvement?
Acknowledgments
• The VAMP Subgroup of SCONUL WGPI
Maxine Melling, Philip Payne, Rupert Wood
• The Cranfield VAMP Team, Darien Rossiter,
Michael Davis, Selena Lock, Heather Regan
• The Open University, Ann Davies
• ‘Value’ Consultants, Sue Boorman, Larraine
Cooper
• Attendees at the York Statistics meeting