How to implement EBPs - JBS International, Inc.


Evidence-Informed
Planning:
Benefiting from Evidence-Based Interventions
Defending
Childhood
January 26, 2011
Melissa K. Van Dyke, LCSW
Associate Director
National Implementation Research Network
University of North Carolina, Chapel Hill
Two Sides of the Same Coin
To successfully implement and sustain
evidence-based and evidence-informed
interventions, we need to know:
The WHAT - What is the intervention
(e.g. Al’s Pals, FFT, PCIT, Second Step)
AND
The HOW - Effective implementation and
sustainability frameworks (e.g. strategies to
change and maintain behavior of adults)
Copyright © Dean L. Fixsen and Karen A. Blase, 2010
The Challenge
“It is one thing to say with the
prophet Amos,
‘Let justice roll down like
mighty waters’ …
… and quite another to work
out the irrigation system.”
William Sloane Coffin
Social activist and clergyman
Science “to” Service
Why Focus on Implementation?
“Children and families cannot benefit from
interventions they do not experience.”
SCIENCE
IMPLEMENTATION
GAP
SERVICE
Implementation is defined as a specified set
of activities designed to put into practice an
activity or program of known dimensions.
Goals for Today’s Session
The “What”
Review general information about evidence-based practices
The “How”
Present ‘stage-related’ work necessary for
successful service and system change
Present the Implementation Drivers that result in
competence and sustainability
Explore “improvement cycles” and how to use
them at a number of levels
The “Who”
Discuss the roles and responsibilities of
implementation team and program purveyors
Which intervention is right for you?
What are the needs of your population?
What interventions are available to
address those needs?
What is the strength of the evidence of
those interventions?
Which interventions are a good fit for
our community?
Do we have what is required to fully
and effectively implement these
interventions?
Assessing Evidence-Based Programs and Practices

[Hexagon assessment tool: rate each of six dimensions for the candidate EBP]

Need in Agency, Setting
• Socially significant issues
• Parent & community perceptions of need
• Data indicating need

Fit with current
• Initiatives
• State and local priorities
• Organizational structures
• Community values

Resource Availability
• IT
• Staffing
• Training
• Data systems
• Coaching & supervision
• Administrative & system supports needed

Evidence
• Outcomes – Is it worth it?
• Fidelity data
• Cost-effectiveness data
• Number of studies
• Population similarities
• Diverse cultural groups
• Efficacy or effectiveness

Intervention Readiness for Replication
• Qualified purveyor
• Expert or TA available
• Mature sites to observe
• Number of replications
• How well is it operationalized?
• Are Implementation Drivers operationalized?

Capacity to Implement
• Staff meet minimum qualifications
• Able to sustain Implementation Drivers, financially and structurally
• Buy-in process operationalized (practitioners, families, agency)

5-Point Rating Scale: High = 5; Medium = 3; Low = 1. Midpoints can be used and scored as a 2 or 4. Rate each dimension (Need, Fit, Resource Availability, Evidence, Readiness for Replication, Capacity to Implement) and sum the ratings for a Total Score.

© National Implementation Research Network 2009
Adapted from work by Laurel J. Kiser, Michelle Zabel, Albert A. Zachik, and Joan Smith at the University of Maryland
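The 5-point rubric above lends itself to a simple scoring sketch. The following is a minimal illustration, not part of the tool itself: the six dimension labels come from the slide, but the `total_score` function and the example ratings are hypothetical.

```python
# Illustrative sketch of the hexagon-style rating described above.
# Dimension names come from the slide; the function is hypothetical.

DIMENSIONS = (
    "Need", "Fit", "Resource Availability",
    "Evidence", "Readiness for Replication", "Capacity to Implement",
)

VALID_SCORES = {1, 2, 3, 4, 5}  # Low = 1, Medium = 3, High = 5; 2 and 4 are midpoints


def total_score(ratings: dict) -> int:
    """Sum one 1-5 rating per dimension into the tool's Total Score."""
    missing = set(DIMENSIONS) - set(ratings)
    if missing:
        raise ValueError(f"Missing ratings for: {sorted(missing)}")
    for dim, score in ratings.items():
        if score not in VALID_SCORES:
            raise ValueError(f"{dim}: score must be 1-5, got {score}")
    return sum(ratings[d] for d in DIMENSIONS)


# Hypothetical ratings for one candidate program:
example = {
    "Need": 5, "Fit": 4, "Resource Availability": 3,
    "Evidence": 5, "Readiness for Replication": 2, "Capacity to Implement": 3,
}
print(total_score(example))  # 22 (of a possible 30)
```

A low score on any single dimension (for example, Readiness for Replication) is worth examining on its own, not just in the total.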
Becoming an Informed Consumer
NREPP
http://www.nrepp.samhsa.gov/
- Descriptive information
- Outcomes
- Quality of Research
- Study Population
- Readiness for Dissemination
- Costs
- Replications
Questions to ask model developers or
model purveyors:
http://www.nrepp.samhsa.gov/pdfs/Questions
_To_Ask_Developers.pdf
NREPP Program Review Sample

For example – Second Step
Readiness for Dissemination Ratings by Criteria (0.0-4.0 scale):
Implementation Materials: 4.0
Training and Support: 4.0
Quality Assurance: 3.5
Overall Rating: 3.8
Science-to-Service Gap
Implementation Gap
What is adopted is not used with
fidelity and good outcomes
What is used with fidelity is not
sustained for a useful period of time
What is used with fidelity is not
used on a scale sufficient to impact
social problems
Implementation
Review and synthesis of the
implementation research and
evaluation literature
(1970 – 2004)
Multi-disciplinary
Multi-sector
Multi-national
Insufficient Methods
Implementation by laws/ compliance by itself
does not work
Implementation by “following the money” by
itself does not work
Implementation without changing supporting
roles and functions does not work
Diffusion/dissemination of information by
itself does not lead to successful
implementation
Training alone, no matter how well done,
does not lead to successful implementation
Fixsen, Naoom, Blase, Friedman, Wallace, 2005
Implementation Pre-Requisites
Start with Data related to Need
Look for “best evidence” to Address the Need
An Evidence-Based Practice or Program
An Evidence-Informed Initiative
Systems Change and Its Elements
Clearly operationalize the program and/or
practice features or the systems change
elements
Operationalize (verb): to define a concept or variable so that it can be measured or expressed quantitatively.
Webster’s New Millennium™ Dictionary of English, Preview Edition (v 0.9.7), Copyright © 2003-2008 Lexico Publishing Group, LLC
What Works

[2 x 2 matrix of intervention effectiveness by implementation effectiveness:]
• Effective intervention, effective implementation: actual benefits
• Effective intervention, ineffective implementation: inconsistent; not sustainable; poor outcomes
• Ineffective intervention, effective implementation: poor outcomes; sometimes harmful
• Ineffective intervention, ineffective implementation: poor outcomes; sometimes harmful

(Institute of Medicine, 2000; 2001; 2009; New Freedom Commission on Mental Health, 2003; National Commission on Excellence in Education, 1983; Department of Health and Human Services, 1999)
What Works

From Mark Lipsey’s 2009 meta-analytic overview of the primary factors that characterize effective juvenile offender interventions:

“. . . in some analyses, the quality with which the intervention is implemented has been as strongly related to recidivism effects as the type of program, so much so that a well-implemented intervention of an inherently less efficacious type can outperform a more efficacious one that is poorly implemented.”
Implementation Frameworks
Practice, program and systems
change through…
Multi-dimensional, fully integrated
use of
Implementation Drivers
Implementation Stages
Implementation Teams
Improvement Cycles
Implementation Drivers
Common features of successful supports that help make full and effective use of a wide variety of innovations
Staff Competency
Organizational Supports
Leadership
[Implementation Drivers diagram: Competency Drivers (Selection, Training, Coaching, Performance Assessment/fidelity measurement) and Organization Drivers (Decision Support Data System, Facilitative Administration, Systems Intervention), Integrated & Compensatory, resting on a base of Leadership, all in service of improved outcomes for children and families.]

[The same diagram viewed through an “Implementation Lens,” highlighting the Competency Drivers: Selection, Training, Coaching, and Performance Assessment (fidelity measurement).]

© Fixsen & Blase, 2008
Graphics by Steve Goodman, 2009
Performance Assessment
Purposes:
Measure fidelity
Ensure implementation
Reinforce staff and build on strengths
Feedback to agency on functioning of
Recruitment and Selection Practices
Training Programs (pre and in-service)
Supervision and Coaching Systems
Interpretation of Outcome Data
Coaching
Purposes:
Ensures fidelity
Ensures implementation
Develops clinical and practice judgment
Provides feedback to selection and
training processes
Grounded in “Best Practices”
Training and Coaching

OUTCOMES: % of participants who demonstrate knowledge, demonstrate new skills in a training setting, and use new skills in the classroom

TRAINING COMPONENTS                        Knowledge   Skill Demonstration   Use in the Classroom
Theory and Discussion                         10%              5%                    0%
. . . + Demonstration in Training             30%             20%                    0%
. . . + Practice & Feedback in Training       60%             60%                    5%
. . . + Coaching in Classroom                 95%             95%                   95%

Joyce and Showers, 2002
Training
Purposes:
“Buy-in”
Knowledge acquisition
Skill Development
Selection
Purposes:
Select for the “unteachables”
Screen for pre-requisites
Set expectations
Allow for mutual selection
Improve likelihood of retention after
“investment”
Improve likelihood that training,
coaching, and supervision will result in
implementation
Organizational Change
"All organizations [and systems] are
designed, intentionally or unwittingly,
to achieve precisely the results they
get."
R. Spencer Darling
Business Expert
© Dean Fixsen, Karen Blase, Robert Horner, George Sugai, 2008
Decision Support Data System
Purposes:
To make a difference for children and families
Provide information to assess effectiveness
of evidence-based practices
Analyze the relationship of fidelity to
outcomes
To guide further program development
Engage in continuous quality improvement
Interaction with Core Implementation
Components
Celebrate success
Be accountable to consumers and funders
Facilitative Administration
Purposes:
Facilitates installation and
implementation of the Drivers
Aligns policies and procedures
Takes the lead on Systems
Interventions
Looks for ways to make work of
practitioners and supervisors easier!!
Systems Intervention
Purposes:
Identify barriers and facilitators for
the new way of work
Create an externally and internally
“hospitable” environment for the new
way of work
Contribute to cumulative learning in
multi-site projects.
[Implementation Drivers diagram again, with Leadership at the base divided into Adaptive and Technical challenges, and the drivers Integrated & Compensatory.]
Integrated and Compensatory
Implementation Drivers
Integrated
Consistency in philosophy,
goals, knowledge and skills
across these processes
(S/T/C/SE/DSDS/FA/SI)
Compensatory
At the practitioner level
At the program level
Implementation Takes Time

Major Implementation Initiatives occur in stages:
Exploration (Sustainability)
Installation (Sustainability)
Initial Implementation (Sustainability)
Full Implementation (Sustainability & Effectiveness)

[Stages of Implementation graphic: Exploration through Full Implementation unfolding over 2-4 years, supported by Integrated & Compensatory drivers and Leadership.]

Fixsen, Naoom, Blase, Friedman, & Wallace, 2005
Exploration
Goals:
Examine the degree to which the Evidence-Based Practice, best practice, or systems change meets the needs in the settings identified
Determine whether moving ahead with the
initiative and implementation is desirable
and feasible
Create readiness for change at many
levels
“Pay now or pay later.”
Sustainability
Goals:
Financial:
Ensure funding streams for desired change and
necessary infrastructure
Programmatic:
Ensure high fidelity and positive outcomes
through infrastructure improvement and
maintenance
Plan for turnover
“The only thing harder than getting there is
staying there.”
Installation
Goal:
To make the structural and
instrumental changes necessary
to initiate services
“If you build it, they will come” . . . but you actually have to build it!
Initial Implementation
Goals:
Survive the awkward stage!
Learn from mistakes
Continue “buy-in” efforts
Manage expectations
“Anything worth doing…is worth doing
poorly.”
Full Implementation
Goals:
Maintaining and improving skills and activities
throughout the system
Components integrated, fully functioning
Skillful practices by front line staff, supervisors,
administrators
Changes in policy that are reflected in practice
at all levels
Ready to be evaluated for expected outcomes
“The only thing worse than failing and not knowing why you failed,
is succeeding and not knowing why you succeeded.”
~ Jane Timmons-Mitchell
Stages of Implementation
Major Implementation Initiatives
occur in stages:
Exploration (Sustainability)
Installation (Sustainability)
Initial Implementation
(Sustainability)
Full Implementation
(Sustainability & Effectiveness)
Fixsen, Naoom, Blase, Friedman, & Wallace, 2005
Exploration
“Many implementation efforts fail
because someone underestimated
the scope or importance of
preparation. Indeed, the
organizational hills are full of
managers who believe that an
innovation’s technical superiority and
strategic importance will guarantee
acceptance.”
Leonard-Barton & Kraus,
Harvard Business Review, 1985
CREATE READINESS
Exploration: In Depth

What happens during Exploration?
• Form “exploration workgroup”
• Analyze data related to “needs”
• Identify options and assess feasibility
• Reassess, revise, prioritize, re-scope
• Formalize structures
Form an “Exploration Workgroup”
Formation of an exploration
workgroup
Focal point for the exploration work
Empowered to make decisions and/or
to make recommendations
Representative of the “stakeholders”
Develop collaboration / co-ownership in the community
Analyze Data Related to “Needs”
Assessment of current outcomes
Dimensions – Root cause analysis
(5 Whys)
Prevalence of the problem(s) – How
frequent and pervasive?
Persistent nature of the problem – Have
we been struggling for a long time?
Social significance – If this changed, would it make a significant difference for students?
Leverage point – If these few indicators
changed then other outcomes would be
likely to be “pulled along.”
Identify Options & Assess Feasibility
Needs
Fit
Resource availability
Evidence
Readiness for replication or degree to
which it is operationalized
Capacity
Assessing Fit and Feasibility of Initiatives

[Hexagon assessment tool applied to an initiative: rate each of six dimensions]

Need in State, District, Schools
• Socially significant issues
• Parent & community perceptions of need
• Data indicating need

Fit with current
• Initiatives
• State, district, school priorities
• Organizational structures
• Community values

Resources
• Staffing
• Training
• Data systems
• Coaching & supervision
• Administrative & system supports needed
• Time

Evidence – is there any?
• Outcomes – Is it worth it?
• Fidelity or process data
• Cost-effectiveness data
• Number of studies
• Population similarities
• Diverse cultural groups
• Efficacy or effectiveness

Intervention Readiness for Replication
• Qualified purveyor
• Expert TA available
• Mature sites to observe
• Number of replications
• How well is it operationalized?
• Are Implementation Drivers operationalized?

Capacity to Implement
• Staff meet minimum qualifications
• Able to sustain Implementation Drivers, financially and structurally
• Buy-in process operationalized (practitioners, families, agency and departments)

5-Point Rating Scale: High = 5; Medium = 3; Low = 1. Midpoints can be used and scored as a 2 or 4. Rate each dimension for the initiative under consideration and sum the ratings for a Total Score.
Reassess. . .
Does this change initiative still address the
most critical needs?
Does it fit our current political and social
context?
Do we have the necessary resources and
support?
Do we have the capacity and access to
necessary expertise to proceed?
Have we bitten off more than we can chew?
Is this our leverage point?
What has emerged during Exploration that
impacts our decisions?
Formalize Structures
Formalize structures and processes –
develop implementation teams based
on…
Design elements (e.g. components of
the initiative)
Legal
Practitioner level
Agency level
District/State/Tribal work
Stakeholders
Identify linkages among the ‘structures’
Resistance to Change
There is no such thing – only
inadequate preparation
It is not “their” problem, it is
ours.
Creating Readiness for Change
Individual readiness for change
Transtheoretical Model or Stages of
Change
Precontemplation
Contemplation
Preparation
Action
Maintenance
Prochaska and DiClemente
Stages of Change
Stage of Change for Pre-Action
Individuals:
Precontemplation – 40%
Contemplation – 40%
Preparation – 20%
“If only 20% of employees in organizations
are prepared to take action. . . .”
Janice M. Prochaska, James O. Prochaska, and Deborah A.
Levesque (2001)
“Who”
Purveyors
Intermediary Organizations
Technical Assistance Centers
Implementation Teams
Organized Implementation Support
Developers
Technical Assistance
Implementation Team
Simultaneous, Multi-Level Interventions
Practitioner Competence
Provider Agency Supports
Management (leadership, policy)
Administration (HR, structure)
Supervision (nature, content)
Regional Authority Supports
State and Tribal Leadership
System Change
"All organizations are designed,
intentionally or unwittingly, to achieve
precisely the results they get.”
…R. Spencer Darling
Systems trump programs!
…Patrick McCarthy, Annie E. Casey
Changing on Purpose
New practices do not fare well in
existing organizational structures
and systems
• Effective practices are changed to fit the system, as opposed to existing systems changing to support effective evidence-based practices.
Effective System Change

[Diagram: EXISTING SYSTEM (Host System) alongside EFFECTIVE APPROACH]

Either the existing system is changed to support the effectiveness of the approach, or effective approaches are changed to fit the system or operate in the shadows (Ghost System).
Changing on Purpose

People, organizations, and systems . . .
• Cannot change everything at once (too big; too complex; too many of them and too few of us)
• Cannot stop and re-tool (have to create the new in the midst of the existing)
• Cannot know what to do at every step (we will know it when we get there)
• Many outcomes are not predictable (who knew!?)
Trial & Learning
PDSA Cycles: Trial & Learning
Shewhart (1924); Deming & Juran (1948);
Six-Sigma (1990)
Plan – Decide what to do
Do – Do it (be sure)
Study – Look at the results
Act – Make adjustments
Cycle – Do over and over again until the
intended benefits are realized
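The Plan-Do-Study-Act cycle above can be sketched as a simple loop. Everything in this sketch (the function names, the fidelity example, the stopping rule) is an illustrative assumption, not part of the PDSA framework itself.

```python
# Illustrative PDSA loop: the plan/do/study/act callables and the
# fidelity example below are hypothetical.

def pdsa(plan, do, study, act, goal_met, max_cycles=10):
    """Repeat Plan-Do-Study-Act until the intended benefits are realized."""
    for cycle in range(1, max_cycles + 1):
        intention = plan()        # Plan: decide what to do
        result = do(intention)    # Do: do it
        findings = study(result)  # Study: look at the results
        act(findings)             # Act: make adjustments
        if goal_met(findings):    # Cycle: repeat until benefits are realized
            return cycle
    return None  # benefits not yet realized within max_cycles


# Hypothetical example: coaching nudges practice fidelity up each cycle.
state = {"fidelity": 0.5}
cycles = pdsa(
    plan=lambda: "coach staff on the model",
    do=lambda intention: state,
    study=lambda result: result["fidelity"],
    act=lambda findings: state.update(fidelity=findings + 0.2),
    goal_met=lambda findings: findings >= 0.8,
)
print(cycles)  # 3
```

The point of the loop structure is the "over and over again" part: the cycle ends on the stopping condition, not after a fixed number of passes.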
Improvement Cycle Uses
Rapid Cycle Teams
Problem-solving
Practice Improvement
Usability Testing
Practice-Policy Feedback Loops
Transformation Zones
Implications
Clearly understand/define the “What”
Stage-matched activities guide the
process
Build processes/systems to continuously
improve “drivers”
Local and/or state systems will need time
to implement effectively
Support the development of organized,
skilled implementation support to build
organization and system capacity to
implement well
For More Information
Melissa Van Dyke
[email protected]
http://www.fpg.unc.edu/~nirn/resources/publications/Monograph/