Measuring the quality of our service to the CSIR


Measuring library leadership
and competence at the CSIR
Martie van Deventer
Portfolio Manager: CSIR
Information Services
CICD Winter Seminar, Pretoria: NLSA,
29 June 2009
© CSIR 2009
Roadmap
• What do we regard as competence?
• How do we measure @ the CSIR
• Indicators of technical competence
• Leadership competence
Competence
• > knowledge about a subject
• > skill & experience in the field
• > behaviour & attitude
• Combination ~ phases of competence (beginner to seasoned professional)
• A brain surgeon fainting at the scene of an accident … competent?
Library schools … do they provide knowledge or competence?
• Information and knowledge management
• Information seeking, searching and retrieval
• Knowledge organization and representation
• Management
• User studies
• Information literacy
• ICTs
• Ethical, legal and economic aspects of information
• The information / knowledge society and globalisation
• … and other academic courses
Source: Bothma & Britz, 2007
Library schools … cont.
• Within (L)IS programmes
  – Focus on (Information and) Knowledge Management
  – Computer troubleshooting skills
  – Multimedia and web development
  – Media and Publishing Studies
  – ICTs
  – Records management
  – Archival studies
• As new programmes
  – Multimedia
  – Publishing studies
  – Archival studies
Source: Bothma & Britz, 2007
CICD-type programmes … do they provide knowledge or competence?
• Strive for continuing education & professional development
• Promote lifelong learning
• Intend to act as facilitator between employer and employee
• All these are essential … but where & when do experience & skill enter the equation?
Where can we go to gain the necessary competence to …
• Make our clients happy?
• Make us happy?
  – Turn us into true leaders?
  – Make us comfortable in a virtual world?
  – Embed our services?
  – Support open access: install institutional repositories and publish online, using Web 2.0 or other appropriate tools, not as toys but with the explicit goal of bringing about fame?
  – Simultaneously be IP astute?
  – Be smart at negotiating contracts?
  – Curate digital content?
  – Collaborate in international teams?
Measuring competence is difficult
• Often need to use indicators of competence
• CSIR measures portfolio/unit performance against indicators
  – Life isn't fair: the CSIR cannot compensate for past wrongs!
  – Perceptions rule: the CSIR cannot fix perceptions about the library profession!
• We, in CSIRIS, appoint against competence
  – Interview for knowledge & experience
  – Practical evaluation to test skill & attitude
CSIR measuring performance
• Weighted scorecard system (Key Performance Indicators (KPIs))
  – Set targets approved by the Executive
• Elements measured
  – Operational service delivery
  – Financial sustainability
  – Customer satisfaction
  – HR & staff transformation
  – Corporate governance & citizenship
• Progress measured quarterly (KPIs)
• Annual evaluation (KPIs)
• Performance measured six-monthly (Key Result Areas (KRAs)) … competence?
  – Measured against negotiated set targets
Evaluation dashboard 2009 … executives happy!
• Operational service delivery (50%): TOdB reliable publication-equivalent statistics; growing repository content; records management moving to the eLibrary; eJournal usage increasing; curation strategy; training excellence
• Customer satisfaction (20%): > 80%
• HR & staff transformation (10%): > 50% black professionals; staff development
• Financial sustainability (10%): balanced budget + R100k external income
• Corporate governance & citizenship (10%): BBBEE purchasing; saving on electricity; social responsibility
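The weighted-scorecard arithmetic behind the dashboard can be sketched in a few lines of Python. The element weights come from the slides; the quarterly scores below are invented purely for illustration:

```python
# Weighted scorecard: each element's score (0-100) is multiplied by its
# dashboard weight, and the products are summed into one performance figure.
weights = {
    "Operational service delivery": 0.50,
    "Customer satisfaction": 0.20,
    "HR & staff transformation": 0.10,
    "Financial sustainability": 0.10,
    "Corporate governance & citizenship": 0.10,
}

def scorecard(scores: dict) -> float:
    """Weighted sum of element scores (each on a 0-100 scale)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must total 100%
    return sum(weights[k] * scores[k] for k in weights)

# Hypothetical quarterly scores, for illustration only.
scores = {
    "Operational service delivery": 85,
    "Customer satisfaction": 90,
    "HR & staff transformation": 70,
    "Financial sustainability": 80,
    "Corporate governance & citizenship": 75,
}
print(round(scorecard(scores), 1))  # 83.0
```

The heavy 50% weight on operational service delivery means the overall figure tracks that element most closely, which matches the dashboard's emphasis.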
Executives are happy when …
• We save them money
• We make the organization famous!
Researchers are happy when …
• We save their time, and
• We make them famous!
Measuring impact
[Figure: impact messages: "Save me time!", "Save me more time!", "Make me famous!"]
Source: Botha, Erasmus & Van Deventer, 2009

Indicators of financial competency
Ensuring that information products & resources provide value for money
[Table: list price, offer to the CSIR, final negotiated price and savings in Rand for Blackwell (refused to pay extra after the publisher changed to US$ invoicing), Sabinet SACat, RefWorks, Business Insights, EbscoHost, Wiley, WorldCat and Articles First (cancelled: not cost effective, WorldCat available free on the Internet, enough article databases, SAJS now free), SA ePublications, SPIE Digital Library (Aug) and Beilstein (Aug). Negotiation highlights included querying an invoice issued at R10.54/$ when the Rand was approximately R9.60/$ and picking up incorrect calculations on a quote. Total savings: R 461,709.]
Value for money, 2006-2009

                           Oct-06    Mar-07    Mar-08    Mar-09
Researchers                   815       849       788       722
Downloads                  60 562    92 598   141 428   140 752
Downloads per researcher       74       109       179       195

Equals: 195 × $25 = $4 875 (approx R46k) of value per researcher, or R11.30 vs $25 / R35 per download
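As a sanity check, the per-researcher arithmetic above can be reproduced directly. The $25 price per download and the approximately R9.60/$ exchange rate are taken from the slides; note that 195 × $25 works out to $4 875:

```python
# Value-for-money arithmetic from the Mar-09 column of the table above.
researchers = 722            # researchers, Mar-09
downloads = 140_752          # downloads in the year to Mar-09
usd_per_download = 25        # assumed commercial price per download
rand_per_usd = 9.60          # approximate 2009 exchange rate

per_researcher = round(downloads / researchers)   # 195
value_usd = per_researcher * usd_per_download     # 4875
value_rand = value_usd * rand_per_usd             # ~46 800, i.e. approx R46k

print(per_researcher, value_usd, round(value_rand))
```

The same three lines reproduce the earlier columns (for example, 60 562 / 815 ≈ 74 for Oct-06), so the comparison across years only depends on the assumed price per download.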
Technical competencies required
• Negotiation: we deal with sharp businessmen who are selling their product
• Statistical analysis: to check whether investments were correctly made
• ROI investigations
• Report writing
• Marketing and promotional activities to encourage usage
• Technical installation of products
• Managing contracts that cost the organisation in excess of R1m each
• Better searches on these products than what the average information-literate person can do?
Indicators of client satisfaction
User education & training to enable our clients
[Chart: overall evaluation of the "Access to Peer Reviewed Scientific Publications" training, 01/04/2008 - 31/03/2009; average score per question on a 0-5 scale, with an annotation highlighting technical expertise. Questions covered:]
• B1: Dealing with enquiries
• B2: Booking process
• B3: Pre-event communication (sufficient / clear / timely)
• C1: Comprehensiveness
• C2: Challenging
• C3: Usefulness to your job
• C4: Sufficient learning-by-doing exercises
• C5: The event content was presented at a level which …
• C6: Difficult concepts were explained
• C7: The ideas fitted into a meaningful picture by the …
• D1: Workbook/handouts quality
• D2: Exercises and case studies
• D3: Videos and transparencies
• D4: Software and systems
• D5: The materials contributed to my understanding
• D6: The materials were well organised
• E1: Knowledge of subject
• E2: Preparedness
• E3: Presentation skills
• E4: Clarity of instructions / explanations
• E5: Feedback on exercises / questions
• E6: Creation of group rapport ("vibe")
• E7: Encouragement of group participation
• E8: The facilitator asked questions that made me think
• E9: The facilitator stimulated my interest in the content
• F1: Comfort
• F2: Lighting
• F3: Refreshments and meals
• F4: Audio-visual equipment
• G1: Overall rating of the learning event
Tracking customer satisfaction
[Chart: information specialists' client satisfaction, Apr 2008 - Mar 2009; monthly scores ranging between 86% and 95%]
Technical competencies behind the scene
• Ability to do significantly higher-quality literature searches, faster than clients can, knowing both the product and the search strategies and having both skill and experience
• Develop training that incorporates knowledge about learning styles and end-user behaviour
• Ability to utilise a variety of technologies to assist with skills transfer
• Provide guidance and develop tips and tricks to assist with information overload
• Provide guidance and lead others to work faster and smarter
Indicators of operational excellence
Bringing fame to the institution & our researchers

                                2007/8     2008/9      Total
Records added                    1,349      1,075      2,424
.pdf files downloaded          113,327    340,616    453,943
Showing usage
[Chart: Research Space financial-year comparison of the total number of PDFs viewed per month, April to March, 2007/8 vs 2008/9; monthly counts up to about 40,000]
Technical competencies required …
• Indexing to an internationally acceptable standard
• Open-source coding to address unique needs
• Statistical analysis of usage
• Curation of repository content
• Repackaging of content
• Due diligence in terms of legal requirements
• Training so that others can use the facility effectively
• Enhancements to the open-source products we are using
Leadership KRAs
• Collaboration (internal & external)
• Guiding the organization
• Partnerships (internal & external)
• New developments in eResearch enablement (curation, collaboration, information infrastructure)
• Managing service enablers
• Staff development & satisfaction
• Records management
• Open access
NeDICC: attempting to collaborate with the USGS on establishing a WDCBHH (World Data Center for Biodiversity & Human Health) in SA
CSIR/UP partnership: looking at the components of VREs (virtual research environments)
Source: Van Deventer & Pienaar, 2009
Finally
• I am not convinced that we, as a profession, really understand the new competencies for high-quality service relevant to our stakeholders, or that we have a place to go for help when we need practical assistance!
• I do not know how to measure whether new recruits are rapidly acquiring essential competencies.
• I need an independent qualification authority to assist when it comes to identifying leadership competence.
• I know it is time to increase the pace … and we are doing the best we can!
• If you can help … speak to me!!
References
• Botha, E., Erasmus, R. & Van Deventer, M. 2009. Evaluating the impact of a Special Library and Information Service. Journal of Librarianship and Information Science, 41(2), 108-123. Available: http://lis.sagepub.com/cgi/reprint/41/2/108 (Accessed 6 June 2009)
• Bothma, T. & Britz, J. 2007. Collaboration amongst (L)IS Schools in South Africa. Presented at the World Library and Information Congress (WLIC): 73rd IFLA General Conference and Council, 22 August 2007. Education and Training Section, University of KwaZulu-Natal Library. URL: http://www.ifla.org/IV/ifla73/index.htm (Accessed 6 June 2009)
• Frame, M. 2009. World Data Center for Biodiversity & Human Health: Infrastructure Requirements of a National Informatics System (edited version)
• Satgoor, U. 2005. CICD & Library schools: partnership for LIS CEPD. RETIG Indaba, 28 February. Available: http://www.infs.ukzn.ac.za/cicd.doc (Accessed 6 June 2009)
• Van Deventer, M. 2009. Knowledge Retention. Presentation to the Pretoria Chapter of the Knowledge Practitioner's Group, Pretoria.
• Van Deventer, M. & Pienaar, H. 2009. Virtual research environments: learning gained from a situation and needs analysis for malaria researchers. 2nd African eScholarship and Digital Curation Conference, Pretoria, 12-13 May. Available: http://www.library.up.ac.za/digi/docs/mvdeventer_paper.pdf (Accessed 6 June 2009)