Centre presentations: An easy guide to using the new template

Transcript

How do measures measure up?
What is the Centre?
Bringing people and knowledge together to promote the best mental health and well-being for every child and youth.
 Knowledge
 Capacity
 Partnerships
Full continuum of effective and accessible mental health services for children and youth.
Brief Introductions
 Name and affiliation
 Program to be evaluated
 One outcome of interest
Objectives
 To provide an overview of concepts on measurement
 To provide guidelines for assessing and selecting measures
 To reduce fear on the topic of measures and indicators
Outline
A. What is measurement?
B. What are the different sources of data and types of measures?
C. What affects the quality of measures?
D. How can we select appropriate measures for our evaluation?
A. What is measurement?
Getting Ready for Evaluation:
Develop Logic Model → Identify Priority Outcomes → Review and Select Measures → Develop Data Collection Procedures → Evaluation Heaven!
What is measurement?
Evaluation question → Information needed → Sources of information → Types of data → Evaluation measure

Measurement refers to the process of “operationalizing” the evaluation question.
Sample evaluation questions:
 What are clients’ perceptions of our services?
 Has client satisfaction with our services improved?
 What areas of the program do we need to improve on?
Information needed from parents and/or youth on:
 Satisfaction with location
 Ease in accessing the services
 Perceptions of outcomes as a result of the service
 Appropriateness of the services
 Perceptions of therapeutic alliance or relationship
B. What are the different sources of data and types of measures?
Levels of measurement
 Nominal
 Ordinal
 Interval
 Ratio
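The level of measurement determines which summary statistics are meaningful. As a rough illustration (the example variables below are assumptions, not from the deck), each level supports different summaries:

```python
# Hypothetical examples of the four levels of measurement and a
# summary statistic that is meaningful at each level
levels = {
    "nominal":  {"example": "diagnosis category", "summary": "mode / counts"},
    "ordinal":  {"example": "satisfaction rating (1-5)", "summary": "median"},
    "interval": {"example": "standardized test score", "summary": "mean"},
    "ratio":    {"example": "number of sessions attended", "summary": "mean, ratios"},
}

for level, info in levels.items():
    print(f"{level}: e.g. {info['example']} -> {info['summary']}")
```

A nominal variable, for instance, has no ordering, so an "average diagnosis" would be meaningless; only counts and the mode apply.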
Sources of information
 Questionnaires
 Interviews or focus groups
 Observation
 Administrative data and/or health records
Sources of information from parents and/or youth on how they experience our services:
 Interview
 Focus groups
 Administrative data: number and type of complaints
 Self-administered survey
 Standardized questionnaire
Types of Data
 Qualitative data: verbal and pictorial
 Numeric scores:
• Basic units such as frequency, duration, length
• Scores from rating scales
• Standardized scores (z scores)
• Age- or grade-equivalent or adjusted scores
 Criteria for evaluating scores:
• Standardization: procedural comparability
• Psychometric properties
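A standardized score re-expresses a raw score relative to a norming sample, which is what makes scores comparable across measures. A minimal sketch of the z-score calculation (the raw score, norm mean and SD below are hypothetical):

```python
def z_score(raw, norm_mean, norm_sd):
    """Standardize a raw score against a norming sample's mean and SD."""
    return (raw - norm_mean) / norm_sd

# Hypothetical example: a raw score of 72 against norms with mean 60, SD 8
print(z_score(72, 60, 8))  # 1.5, i.e. 1.5 SDs above the norm mean
```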
Type of data on clients’ satisfaction or perception of care:
 Scores on a standardized measure of client perception of care:
• Average of ratings on all items
• Changes in average ratings every 6 months
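The two computations above (average of ratings on all items, and the change in that average over 6 months) can be sketched in a few lines; the 5-point ratings below are hypothetical:

```python
from statistics import mean

# Hypothetical 5-point satisfaction ratings from one client, one value per item
baseline_ratings = [4, 3, 5, 4, 4]
six_month_ratings = [5, 4, 5, 4, 5]

avg_baseline = mean(baseline_ratings)   # average of ratings on all items
avg_followup = mean(six_month_ratings)
change = avg_followup - avg_baseline    # change in average rating over 6 months

print(avg_baseline, avg_followup, round(change, 2))
```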
C. What affects the quality of measures?
Cultural and Historical Context
 Use of tests and measures as a “cultural tool”
 Cultural appropriateness of the measures
 Revisions or updates to measures to accommodate changes in scores in the population
Psychometric Properties: Validity
Does the measure REALLY measure what it is supposed to measure?
Types of validity: concurrent, construct, criterion, factorial, discriminant or divergent
Psychometric Properties: Reliability
= Dependable
= Trustworthy
= Same old, Same old
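One common way to quantify this “same old, same old” dependability is internal-consistency reliability, often reported as Cronbach’s alpha. A minimal sketch of the standard formula, using hypothetical item scores (this is an illustration, not the method any particular measure cited here used):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal-consistency reliability.

    item_scores: one list per item, each holding that item's scores
    across all respondents (all lists must be the same length).
    """
    k = len(item_scores)
    # Sum of the variances of the individual items
    sum_item_vars = sum(pvariance(item) for item in item_scores)
    # Variance of each respondent's total score across items
    totals = [sum(scores) for scores in zip(*item_scores)]
    return k / (k - 1) * (1 - sum_item_vars / pvariance(totals))

# Hypothetical data: 3 items, 4 respondents, perfectly consistent answers
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```

Values near 1 indicate that items move together; conventions often treat roughly .7 and above as acceptable for group-level evaluation use.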
Psychometric Properties: Feasibility
 Cost and availability
 Time for administering, scoring and analyzing
 Staff involvement
 Information management
Psychometric Properties: Relevance
 Clinical utility
 Culturally appropriate
 Administrative uses for decision-making
 Usefulness for improving the program
 Usefulness for information in public reports
Other measurement considerations when involving children
 Developmental level
 Educational level
 Health status
 Family
D. How can we select appropriate measures for our evaluation?
Criteria for selecting measures
Evidence-based
Feasible
Relevant and meaningful
Evidence-based
 Sound psychometric properties: valid and reliable
 Used in similar settings
 Recommended by experts, if no previous literature
Sample Summary Tables
Sample Summary Table columns: Name of Measure | Reference or Developer | Areas measured | Evidence on reliability and validity | Feasibility (cost, training, who administers) | Relevance | Comments
Youth Services Surveys (YSS, YSSF)
General Information
Developer/Authors: Children’s Indicator Workgroup of Sixteen State Study, Centre for Mental Health Services.
Date of publication, versions available: 2001. Separate forms for parents and youth.
Constructs measured: The YSS is used as a measure of youth service usage and satisfaction with services. The YSSF is used as a measure of parent report of youth service usage and satisfaction with services.
Population for which designed: Adolescents ages 13 and up.
Method of administration: Self-report questionnaire, paper or telephone interview.
Subscales and number of items: 25 items. Five scores based on a factor analysis can be obtained: good access to services; participation in treatment; cultural sensitivity of staff; appropriateness of services; positive outcome of service.
Estimated time to administer: Not mentioned.
Costs, availability and permission to use: In the public domain. See online resources below.
Examiner: Information not available at this time.
Youth Services Surveys (YSS, YSSF) http://www.mhsip.org
Technical Information
Sample for development of norms: The YSS was created as part of the Mental Health Statistics Improvement Program for the Sixteen State Study. To date, 430 youth have completed the survey.
Reliability: Reliability indices for the 5 factors of the survey are good: Access = .713, Participation in Treatment = .776, Cultural Sensitivity = .863, Appropriateness = .863, Outcome = .893 (Brunk et al., 2000).
Validity: This indicator was developed by the U.S. Substance Abuse and Mental Health Services Administration (SAMHSA) for the National Mental Health Performance Measures, and feasibility was assessed across 16 states (Lutterman et al., 2003). The Virginia Department of Mental Health, Mental Retardation, and Substance Abuse Services coordinated the development of the surveys for children and youth (Brunk et al., 2000). The survey is widely used in the U.S., and reports by various states are available in the grey literature.
Uses: The YSS may be used (1) as a data collection method for client perception of care, consumers linked to physical health services, and children in family-like arrangements and other 24-hour residential care programs; and (2) as a clarification method for findings from other indicators.
Measures of Therapeutic Alliance
 Blatt & Zuroff 2005: Importance of including therapeutic alliance as a factor moderating treatment outcomes
 Elvins & Green 2008: Review of concepts and measures of therapeutic alliance
 Green 2006: Measures of therapeutic alliance for child and youth mental health
Measures of Cultural Competency
 Cultural Competency Self-Evaluation Questionnaire
Some common issues
 To translate or not
 Picking and choosing items from various measures
 Making sense of data: indicators, benchmarks and standards
Decision process for identifying, selecting and using measures
Identifying: Specify topic of inquiry → Conduct literature review
Selecting: Identify existing measures → Examine evidence, feasibility & relevance
Using: Use measure, or revise the measure
If a gap is identified (NO existing measures): Develop new measure → Collect & analyze data (as developed) → Gather new evidence on validity and reliability
Steps in Translating Measures
Translate to second language → Back-translate to first language → Pilot-test with experts & end-users → Gather new evidence on validity and reliability → Compare results with original version
Measures Matrix
 Evaluation question
 Outcome or process variable
 Indicator or measure
 Where data comes from
 How data is collected & how frequently
 Who collects data
 When data is collected
 How data will be analyzed
Sample Measures Matrix
Evaluation question: Do clients perceive outcomes as a result of the care/services they receive?
Process or outcome variable: Client perception of care
Indicator/Measure: Youth Services Survey
Source of data: Youth 16 to 18 years, and parents of children and youth below 16 years
How data is collected: Self-administered questionnaire, in person and by mail
Who collects data: Receptionist
When data is collected: At termination of the program, after at least 3 months in the program
Plan for analysis: Compare means of total scores for those in Program A with those in Program B
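The analysis plan above (comparing mean total scores between two programs) can be sketched in a few lines. The scores below are hypothetical, and in practice the comparison would typically be run in a statistics package such as SPSS or Excel:

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical total satisfaction scores for clients in two programs
program_a = [78, 82, 75, 90, 84]
program_b = [70, 68, 74, 72, 69]

# Difference in mean total scores between the programs
diff = mean(program_a) - mean(program_b)

# Welch's t statistic for the difference (significance would be checked
# against a t distribution, e.g. in SPSS or Excel)
se = sqrt(stdev(program_a) ** 2 / len(program_a)
          + stdev(program_b) ** 2 / len(program_b))
t = diff / se

print(round(diff, 1), round(t, 2))
```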
Questions?
Key insights?
Summary
 Measurement involves specifying or “operationalizing” the evaluation question.
 Process or outcome variables can be assessed using numerical or qualitative data.
 Sources of information include questionnaires, interviews, focus groups and administrative data.
Summary, Cont.
 Summary tables assist in organizing information about various measures, and help in making informed decisions on the most appropriate measure for the evaluation.
 A matrix of the indicators and measures to be used in the evaluation summarizes how, where and when data is to be collected.
Next Steps
 Review logic model and identify evaluation questions
 Specify process and outcome variables
 Summarize literature review
 Review and select measures
 Create indicators and measures matrix
 Develop protocol for collecting data
 Develop plan for analyzing data
Future webinars?
 February 2009: Using Excel and SPSS for basic statistics
 March 2009: Writing the Final Report
For more information
Evangeline Danseco, PhD
Head, Evaluation and Research
[email protected]
613.737.7600, ext. 3319
www.onthepoint.ca