Going Beyond the Basics
Using CQI to Integrate Fidelity Measurement into All Core Functions of an Organization
Kimberly Gentry Sperber, Ph.D.
Objectives

- Identify opportunities for assessing fidelity
- Identify resources for monitoring and improving fidelity
- Address barriers to monitoring fidelity
- Create a work plan for starting fidelity reviews
Key Terms

Evidence-Based Practice: A practice that has been shown to work through the use of scientific research.

Fidelity: The extent to which an intervention is delivered as designed.
Opportunities to Monitor Fidelity

- Training
- Assessments
- Treatment groups
- Individual sessions
- Case management
- Milieu
- Documentation review
- Program assessments
Ensuring Training Transfer

- Use of knowledge-based pre/post-tests
- Use of knowledge-based proficiency tests
- Use of skill-based ratings upon completion of training
- Mechanism for use of data:
  - Staff must meet certain criteria or a minimum score to be deemed competent.
  - Failure to meet criteria results in follow-up training, supervision, etc.
Assessments

- Desktop reviews:
  - Accurate scores
  - Reviews of overrides
  - Integration with service plan/dosage
- Observations:
  - Use of a standardized audit sheet
  - Assess interviewing skills
  - Assess accuracy of item ratings
Treatment Groups

- Observation-based ratings of adherence to the treatment model.
- CBT:
  - Frequency of role-plays
  - Structure of role-plays
  - Appropriateness of role-plays
  - Use of behavioral reinforcers
  - Effective use of authority and disapproval
  - Teaches the thought-behavior chain
  - Teaches structured skill building
  - Follows the curriculum
Individual Sessions

- Observation-based ratings of adherence to the treatment model.
- CBT:
  - Teaches the thought-behavior chain
  - Teaches problem-solving
  - Teaches structured skill building
  - Conducts role-plays
  - Appropriate use of thinking reports/homework
  - Graduated practice
  - Appropriate use of reinforcers
Case Management

- Observation-based ratings of adherence to the treatment model.
- CBT:
  - Teaches the thought-behavior chain
  - Teaches problem-solving
  - Teaches structured skill building
  - Appropriate use of reinforcers
  - Helps the client integrate skills learned into the real-world environment (e.g., employment)
Milieu

- Observation-based ratings of competence in core correctional practices:
  - Focus is more on effective use of authority and disapproval, and on appropriate use of reinforcers and sanctions.
  - Standardized list of behavioral indicators
  - Structure for observing and rating staff interacting with clients in the milieu
- Can also review incident data for trends.
Documentation Review: Why Do It?

- Clinical implications:
  - Documentation is not separate from service delivery.
  - Did the client receive the services he/she needed?
  - Good documentation should drive decision-making.
- Operational implications:
  - Means of communication
  - Permanent record of what occurred in the facility
  - Source of staff training
- Risk management implications:
  - If it isn't documented, it didn't happen.
  - Reflection of the provider's and the organization's competency:
    - EBP
    - Outcome of care
Program Assessments

- Correctional Program Checklist (CPC)
- Correctional Program Assessment Inventory (CPAI)
- ICCA Treatment Survey
Sample Measures

- Percentage of groups containing role-plays
- Percentage of successful completers receiving appropriate dosage based on risk/needs assessment
- Percentage of staff achieving a 4:1 ratio
- Percentage of groups observed where staff modeled the skill prior to having clients engage in role-play
- Percentage of role-plays containing practice of the correctives
- Percentage of role-plays that required observers to identify skill steps and report back to the group
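Measures like these reduce to simple counts over audit-sheet records. As a rough sketch of how such percentages could be computed (the record fields and values below are hypothetical, not from the presentation):

```python
# Hypothetical audit-sheet records: one dict per observed group session.
observations = [
    {"group": "A", "role_play": True,  "staff_modeled_first": True},
    {"group": "B", "role_play": True,  "staff_modeled_first": False},
    {"group": "C", "role_play": False, "staff_modeled_first": False},
    {"group": "D", "role_play": True,  "staff_modeled_first": True},
]

def pct(records, predicate):
    """Percentage of records satisfying the predicate."""
    if not records:
        return 0.0
    return 100.0 * sum(1 for r in records if predicate(r)) / len(records)

# Percentage of groups containing role-plays
groups_with_role_plays = pct(observations, lambda r: r["role_play"])
print(groups_with_role_plays)  # 75.0

# Among groups with role-plays, percentage where staff modeled the skill first
with_role_plays = [r for r in observations if r["role_play"]]
modeled_first = pct(with_role_plays, lambda r: r["staff_modeled_first"])
print(round(modeled_first, 1))  # 66.7
```

Note the denominator choice: "staff modeled the skill first" is computed only over groups that actually contained a role-play, which matches how the measure is worded above.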
Required Resources

- Observation-based ratings require:
  - Creation of audit sheets
  - Schedule for conducting the reviews
  - Staff qualified to conduct and rate the observations
  - Time for staff to conduct observations
  - Mechanism to record and use the data:
    - Supervision and individual staff development
    - QI and training initiatives
Required Resources

- Documentation review requires:
  - Staff to conduct the review
  - Schedule for the review rotation
  - Audit sheet
  - Time to conduct the review
  - Mechanism for recording and using the data:
    - Action planning
Common Barriers

- Strength of conceptual understanding of the EBP to be measured
- Resources
- Setting priorities
- Understanding/skill sets required for measurement
- Conflicting philosophies (helper vs. evaluator)
- Time!
Potential Strategies

- Start small:
  - For example, desktop review of assessments versus observation-based ratings
- Use technology to increase efficiencies:
  - For example, videotape interactions for observation-based ratings
- Take the time to build expertise:
  - Train on the model
  - Train on the evaluation methodology
  - Ensure understanding of the purpose (e.g., QI versus punishment)
Starting a Work Plan

- What services do you say/promise that you deliver?
  - What does your contract say?
  - What do referral sources expect?
- List all programming components:
  - What is the model (e.g., CBT, IDDT, IMR, TFM, etc.)?
  - What curricula are in use?
- Identify which is most important.
- Make a selection for measurement.
Creating/Finding Tools

- The scale should adequately sample the critical ingredients of the EBP.
- It needs to differentiate between programs/staff that follow the model and those that do not.
- The scale should be sensitive enough to detect progress over time.
- Investigate what measurement tools may already exist.
Identify Process Workflow

- Identify who will be observing, measuring, and documenting.
- Define who will be observed, what data will be collected, and from what sources.
- Define the frequency of observations and who will be observed:
  - Consider use of technology (e.g., videotaping groups).
- Decide how data will be coded, stored, aggregated, and reported.
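Once observations are coded into a consistent record format, aggregation for reporting is straightforward. A minimal sketch, assuming a hypothetical 0-2 adherence rating scale and illustrative staff and item names:

```python
from collections import defaultdict
from statistics import mean

# Coded observation rows: (staff member, rated item, rating on a 0-2 scale).
# Names, items, and the scale are illustrative assumptions.
rows = [
    ("Lee",   "models_skill",   2),
    ("Lee",   "uses_role_play", 1),
    ("Patel", "models_skill",   0),
    ("Patel", "uses_role_play", 2),
]

# Aggregate: mean adherence rating per staff member, for supervision and QI reporting.
by_staff = defaultdict(list)
for staff, item, rating in rows:
    by_staff[staff].append(rating)

report = {staff: mean(ratings) for staff, ratings in by_staff.items()}
print(report)
```

Keeping the coded rows in one flat format like this makes it easy to re-aggregate the same data by staff member, by item, or by program as reporting needs change.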
Identify Process Workflow

- Determine responsible parties for all parts of the process.
- Determine how data are to be used (e.g., are there expectations that program staff will document improvement plans?).
- Agree on common definitions for key terminology contained in the measurement tools (e.g., modeling).
- Determine training needs for assigned evaluators (internal vs. external evaluators).
Create Training Package

- Formally document and share the details of the work plan with all staff:
  - Defining fidelity
  - Outlining the importance and details of the fidelity initiative
  - Who is impacted and how
  - Expectations for all staff (e.g., tie performance appraisal scores to use of the model)
Create Training Package

- Train staff as evaluators:
  - Clarify the role of evaluation and how it might differ from their assigned operational role in the program
  - Define all terminology contained in the measurement tools/process
  - Agree on what constitutes evidence that staff are utilizing the model correctly
  - Specify how they are to document results and any responsibilities they have for data coding, storage, etc.
  - Identify expectations (e.g., tie performance evaluation to the integrity of measurement)
Conclusions

- Many agencies are allocating resources to the selection and implementation of EBPs with no evidence that staff are adhering to the model.
- There is evidence that fidelity directly affects client outcomes.
- There is evidence that internal evaluation processes directly affect client outcomes.
- Therefore, agencies have an obligation to routinely assess and assure fidelity to EBPs.
- Doing so requires a formal infrastructure to routinely monitor fidelity performance.