Assessment Blitz!
Getting Started with Techniques and Tips You Can Use
Session Overview
 How to use a simple, practical approach to tackle user-focused assessment
 Who is this session for?
 New to assessment
 Wanting a refresher
 Supervisors/leaders wanting to support
Why an assessment plan?
 Problem: “Assessment is hard”
 Problem: People often get stuck in the planning stages
 Problem: Assessment fizzles out because of poor planning
Warning!
 Thinking required!
 This is a blitz…
 Result – mental framework & “first draft” assessment plan
So, I need to know…
 Who are you?
 Library type
 Assessment experience
Let’s Get Started…
Types of User Assessment
 Program/Process/Service Evaluation
- Assessment of what we are already doing
 (Bonus Round) Discovery or Foundational Assessments
- Assessment for starting-up programs and innovating
Introducing: The Worksheet
 Once completed, it will be your 1st draft
 Break it down!
 Breaks the process into pieces
 Multiple worksheets needed for a full assessment
 Housekeeping…
 Formal assessment vocabulary italicized
 Be careful about writing ahead
 Questions on slides are short/informal versions of the form
General Information
 Like an introduction outline for a paper
 Can be fleshed out to give others context
1. What “Program” do you want to assess?
2. Who are you doing this for?
 What do they value?
 How do they like to be communicated to?
 What do they need to know?
Section: Formulating the Objective
 What is the objective of the assessment? How will it show how the program impacts users’ lives?
 Example Objective: Demonstrate that the library-supported tutoring program increases participants’ grades.
3. Why do you offer the service?
 What is the “big picture” reason for funding/supporting it?
 List several, but which one(s) are the most important?
4. Who is the target user?
 Who is/was it intended to impact?
 List all if more than one group, but choose one to focus on
5. What behavior change should we be seeing?
 Tough question! - “Operationalizing the Variable”
 Why behavior?
 We have to “see” it/it must be measurable
Designing the Assessment
 Sampling (deciding what type)
 Designing the “Instrument”
 Analysis approach
P.S. This section will let you determine the assessment’s logistics
6. Where’s the sample?
 They should be a (representative) subset of #4
 Where will you find them?
 How many should be available? (a quick sample-size sketch follows)
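A minimal sketch of one way to answer the “how many” question, assuming a simple random sample drawn from a target group (#4) of known size. The function name, the 95% confidence level, 5% margin of error, and the 1,000-person group are illustrative assumptions, not figures from the session.

```python
import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Rough survey sample size for estimating a proportion
    from a finite target user group (simple random sampling)."""
    # Base sample size for a very large population
    n = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction for smaller user groups
    return math.ceil(n / (1 + (n - 1) / population))

# Illustrative numbers only: a target group of ~1,000 users at 95%
# confidence and a 5% margin of error needs roughly 278 responses.
print(sample_size(population=1000))
```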
7. What are the right tools?
 Surveys
 Interviews
 Focus Groups
 Formalized Observations
 Photo Diaries
 Report Requests (Does someone else already have it?)
 Mapping Diaries/Logs
8. How are you going to look at the data?
 Actual approach varies by method (a small analysis sketch follows this list)
 Quantitative analysis (e.g. charts, graphs, tables)
 Qualitative analysis (e.g. content analysis, coding)
 Many overlook logistics of analysis!
 What number of subjects will you use? (be sufficient, but efficient)
 What method are you going to use?
 What equipment is needed?
 What staff time is needed?
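As a minimal sketch of the quantitative side only, assuming the survey responses have been exported to a simple table; pandas, the column names, and the toy values are all hypothetical choices for illustration, not part of the session’s toolkit.

```python
import pandas as pd

# Hypothetical export: one row per respondent from the tutoring example
responses = pd.DataFrame({
    "attended_tutoring": [True, True, False, True, False],
    "grade_change":      [0.5, 1.0, 0.0, 0.7, -0.2],
})

# Quantitative analysis: a summary table of average grade change for
# participants vs. non-participants, ready to turn into a chart or report
summary = responses.groupby("attended_tutoring")["grade_change"].agg(["count", "mean"])
print(summary)
```

Qualitative methods (content analysis, coding) take a different path, but the same logistics questions above still apply.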
Applying the Results
 Go beyond demonstrating effectiveness
 Use assessment to improve the program
 Ensure final reports aren’t dust-gatherers
 Publicize your progress - “close the loop”
9. How will you improve?
 Think about all the possible outcomes in advance – don’t allow yourself to be surprised.
 If the outcome is positive, celebrate! Now tweak things…
 What makes the program successful?
 What is ‘wrong’ with the program?
 Negative outcome isn’t a failure!
 What did you discover that will improve the program?
 Was something in the assessment process at fault?
 Can you afford to continue the program?
10. How will you communicate?
 Intended audience (#2)
 Internal audience (staff)
 Target audience (#4)
Assessment Plan First Draft Done!
“Discovery” or Foundational Assessments
 Who are our users?
 Who in our population aren’t our users (and why)?
 Would our patrons use “X” innovative program?
 Why are users doing that???
Differences between the 2 Worksheets
 Question #3
 “Program” is about users
 “Discovery” can also be about the population
 “Formulating the Objective”
 Discovery = Learning about behaviors and attitudes
 Program/Project = Measuring behaviors as outcomes
 A broad generalization… “Discovery” takes longer than “Program”
Questions?
Want to learn more? Consider joining LLAMA MAES
 Has a broad and accessible assessment community
 Gives access to an assessment toolkit