Transcript Slide 1

Implementing EBHV Models in Communities
Lessons Learned – translating science into practice
17th National Conference on Child Abuse &
Neglect
April 1, 2009
Presented by:
Nancy Gagliano, MSW, LICSW
Programs & Evaluation Director
Council for Children & Families
&
Nicole Rose, BA
Project Associate
Washington State University Area Health Education Center of Eastern
Washington
Topics we will cover
• CCF Historical Funding Approach
• New EBHV Dollars – New Game
• Initial Logic Model for Funding EBHV
• Assumptions, Expectations & Early Realizations
• Design of an Evaluation Plan
• Case Study – Strengths & Directions for Continued Development
• What Does It Really Take to Implement EBHV Programs in Diverse Communities?
Council for Children & Families
• Created by the legislature in 1982
• Supported by the state general fund, CBCAP, the Children's Trust Fund, and private donations
• Three activities: funding community-based programs, public awareness/education, and partnerships
CCF Historical Funding Approach –
organizational capacity building in the development and use of information to guide services
• 12–15 programs each year; 3-year funding cycle
  o Parent Education/Training, Parent Support and Mentoring, Home Visiting, and Crisis Nursery
• Local communities choose program type/focus based on local needs, capacity, and interest
• Capacity-Building Framework:
  o Community needs assessment
  o Research
  o Support programs in developing evaluation processes for quality assurance, program development, and sustainability!
New EBHV Dollars…New Game!
• In 2007, the Washington State legislature dramatically increased its investment in child abuse and neglect prevention and early school readiness by providing new dollars for implementation of EBHV programs.
• $3.2 million over a 2-year period for CCF to fund Evidence-Based Home Visitation programs across the state.
• Earmarked:
  o $185K of the EBP funding for underserved rural and/or tribal applicants
  o $400K based on input from the NFP consortium
  o $150K per Thrive by Five demonstration site
New Dollars, New Game cont…
Which EBHV Models to Fund?
• CCF Research Advisory Committee approved EBHV models
• A matrix of approved models was recommended for those submitting a proposal
• Three levels of evidence approved:
  o Best Support
  o Good Support
  o Promising Practice
A Portfolio is Created…
• Fast turn-around
• Legislative approval
• Out to 13 different organizations
Implementing Five Different EBHV Models:
• NFP – Nurse Family Partnership
• PAT – Parents as Teachers
• STEEP – Steps Toward Effective, Enjoyable Parenting
• PCHP – Parent Child Home Program
• Project SafeCare
October 2007 Logic Model for Funding EBHV Programs

Inputs:
• CCF staff
• Consultants
• Funding – state, CBCAP
• Programs implementing EBHV
• National offices/EBHV developers
• Materials/research on implementing EBHV

Activities:
1. CCF staff participate in TA and staff training
2. Review research on evaluating EBHV and implementing with fidelity; fund 13 EBHV programs
3. Develop plan for TA to EBHV programs
4. Develop and provide training for EBHV programs on process evaluation (fidelity measures if model specific)
5. Share collected information & resources

Outputs:
1. Staff participate in summit, quarterly TA calls, and NIRN training
2. Establish process measures related to implementing with fidelity
3. Produce TA plan and timeline
4. Provide 4 training sessions to EBHV programs – workshop on EBHV implementation, fidelity & process; understanding and communicating data
5. Produce outcomes – EBHV outcomes & process evaluation outcomes

Intermediate Outcomes:
• Increased understanding of critical elements of EBHV by CCF staff and funded programs
• Increased internal capacity to support community-based agencies in implementing EBHV

Intermediate Indicators:
• Increased number of programs reporting on fidelity measures
• Increased number of TA events on fidelity
• Programs report satisfaction with TA provided

Long-Term Outcomes:
• Implement EBHV programs with fidelity = outcomes
• Demonstrate child/parent benefits
• Benefit of multiple home visiting programs across the state
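For readers who track a logic model in software rather than on a slide, here is a minimal sketch of one way to encode the numbered activity-to-output pairings above. This is an illustration only; the class names, fields, and the pending_steps helper are hypothetical and not part of any actual CCF system.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModelStep:
    """One numbered row of the logic model: an activity paired with its output."""
    number: int
    activity: str
    output: str
    completed: bool = False  # updated as the funded work progresses

@dataclass
class LogicModel:
    inputs: list[str]
    steps: list[LogicModelStep]
    intermediate_outcomes: list[str] = field(default_factory=list)
    long_term_outcomes: list[str] = field(default_factory=list)

    def pending_steps(self) -> list[LogicModelStep]:
        """Return the steps whose outputs have not yet been produced."""
        return [s for s in self.steps if not s.completed]

# Example: the first two rows of the October 2007 model.
model = LogicModel(
    inputs=["CCF staff", "Consultants", "State/CBCAP funding"],
    steps=[
        LogicModelStep(1, "CCF staff participate in TA and staff training",
                       "Staff participate in summit, quarterly TA calls, and NIRN training"),
        LogicModelStep(2, "Review research; fund 13 EBHV programs",
                       "Establish process measures related to implementing with fidelity"),
    ],
)
print(len(model.pending_steps()))  # -> 2
```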
Assumptions & Expectations
• Implementing with fidelity yields effective practice – programs are willing to engage around fidelity implementation:
  o Good understanding of the model
  o Clear definition of model elements
  o Guidelines for model elements actually exist
  o Able to implement with some degree of fidelity
• Model developers are offering adequate technical assistance, monitoring and supporting implementation and development
• Programs have internal capacity for outcomes and process evaluation – are using data to inform practice
• Programs have organizational capacity – infrastructure and support internally
Early Realizations
Capacity Challenges – some organizations had limited to no understanding of:
• logic models
• collecting basic demographic data
• contract compliance
• reporting on outputs (a source of confusion)
Process Measures Related to Fidelity Challenges –
• implementation demonstrating fidelity to the model
• "We are implementing the program with 100% fidelity."
Early Realization…
Programs don't know what they don't know
Long-Term Outcome – Programs: implement EBHV programs w/ fidelity = outcomes.
• Asking the question was not enough!
• Programs said they were implementing with fidelity
• Did they really understand fidelity?
• How did the different program models actually measure fidelity?
• How consistently?
• How much fidelity was enough fidelity?
• Were programs going to achieve the outcomes that the models promised if they didn't achieve fidelity?
Early Realization…
Funders don't know what they don't know either!
Long-Term Outcome – Funder:
• Demonstrate child/parent benefits of a significant degree to justify the investment of state dollars.
• Document the benefit of implementing multiple home visiting models under this state program.
How was CCF going to achieve these outcomes, and were there other outcomes?
System-level outcomes to be addressed:
• Document state standards for program delivery and improvement of quality in Washington State home visiting
• Develop a home visiting learning community to support progressive improvements in quality
Design of an Evaluation Plan
Called in WSU – original evaluation questions:
• Does the routine use of home visiting programs using various evidence-informed protocols collectively result in better child and caregiver outcomes?
• Can we demonstrate significant benefit to justify the investment of state dollars?
• Can we document benefit across a portfolio approach to support the continuation of this approach?
Design of an Evaluation Plan
WSU
• Look at the research – comprehensive literature review around home visiting
• Exploration of the state of home visiting in all fifty states
• Start with the programs before looking at a multi-method approach and child/parent outcomes
• Design a rigorous program evaluation rather than a research design
WSU Reviewed the Research:
Evaluating EBHV & Implementing with Fidelity
• Translation of evidence-based home visiting models from randomized controlled trials into local program practice is very challenging.
• Improving program quality and implementation of the model with fidelity is a major issue for the field.
• Organizational conditions and capacity are the key to a successful implementation of an EBHV model.
Organizational conditions for adopting Evidence-Based Programs
(Fixsen et al., 2006)
• Support for adoption across leadership & treatment staff
  o Organizational leadership skills to support adoption of new practices
  o Staff skill level – training in specific home visiting model skills
  o Information management system and use of data for quality improvement
• Capacity to implement
  o Staff retention
  o Supervisory capacity and skills
  o Family engagement capacity and skills
• Capacity to develop & sustain information-driven problem solving
  o Quality improvement practices, staff development, continuing family engagement
  o Use of information and outcomes in program development
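As a rough illustration of how the conditions above could drive a structured self-assessment (in the spirit of the capacity checklist discussed later in this talk), here is a small sketch. The domain and item names come from the slide; the 0–2 rating scale, the readiness threshold, and the readiness function are assumptions for the example, not part of Fixsen et al. (2006) or any CCF/FRIENDS instrument.

```python
# Illustrative self-assessment over the Fixsen-style conditions above.
# The 0-2 rating scale and readiness threshold are assumptions for this
# sketch, not part of Fixsen et al. (2006) or any CCF/FRIENDS instrument.
CONDITIONS = {
    "Support for adoption": [
        "Organizational leadership skills",
        "Staff skill level / model-specific training",
        "Information management system and data use",
    ],
    "Capacity to implement": [
        "Staff retention",
        "Supervisory capacity and skills",
        "Family engagement capacity and skills",
    ],
    "Information-driven problem solving": [
        "Quality improvement practices and staff development",
        "Use of information and outcomes in program development",
    ],
}

def readiness(ratings: dict[str, int], threshold: float = 1.5) -> dict[str, bool]:
    """Average each domain's item ratings (0 = absent, 1 = emerging,
    2 = in place) and flag domains that meet the assumed threshold."""
    flags = {}
    for domain, items in CONDITIONS.items():
        scores = [ratings.get(item, 0) for item in items]
        flags[domain] = sum(scores) / len(scores) >= threshold
    return flags

# Example: a program strong on leadership but thin on data use.
print(readiness({"Organizational leadership skills": 2,
                 "Staff skill level / model-specific training": 2,
                 "Information management system and data use": 0}))
```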
Participatory Evaluation in Action
• Initial site visits were completed with 13 programs across the state of Washington
• Topics covered in the initial site visit included the following:
  o Program elements
    - Program outcomes and goals
    - Program implementation
    - Client population
  o Program offerings
    - Strengths and needs assessments of clients
    - Supports provided to families
    - Staffing
  o Data resources and collection
    - Current data systems, collection, and access
    - Capacity issues around data collection and data constraints (HIPAA, etc.)
    - Integration of CCF requirements with current program data collection
    - Ways to use currently collected data
The Reality Sets In –
findings support the research
• Programs vary in terms of organizational capacity to deliver their programs
• Data collection and information use is a common area that needs further development and support
• Existing outcome assessment of the model is either limited or involves measuring strategies that do not meet reliability and validity standards
• Bottom line – programs need significant support in outcomes assessment and in using the information for program improvement and clinical decision making
The New Evaluation Proposal
• Increase the capacity of funded home visiting programs to collect model-specific outcome information. Evaluate the impact of key program implementation variables on individual child and parental outcomes.
• The following next steps were set in motion:
  o Assessment tool exploration
  o Contact with model developers
  o A second round of site visits
Tool Exploration
• Was there a common tool that could be used across the balanced portfolio of five EBHV models to measure caregiver/child outcomes?
• Began an extensive search of outcome measures to see if there was one that could be used across programs.
• Looked for tools that were valid and reliable; wanted to incorporate tools that programs might already be using to decrease the burden on program staff.
Findings from the tool exploration
• Most programs were using valid and reliable child outcome measures
• Because of specific differences in the child outcomes being looked at by different models, it would be difficult to come up with one common child outcome measure
• The Protective Factors Survey could be used by all 13 programs to measure programmatic impact on family/caregiver outcomes
Contact with National Model Developers
• Contacted NFP, PAT, PCHP, and STEEP to discuss fidelity measures and core implementation components
• Model developers were open and willing to talk about this whole idea of fidelity – what is fidelity, and how much fidelity is enough?
• Model-specific questions were designed around the core implementation components to take to the next round of site visits
Second site visit and the "Discussion Tool"
• How does a program's organizational capacity affect implementing with fidelity?
• Not only do we need to ask core component/fidelity questions, but we also have to find a way to assess organizational capacity.
• We get a little help from our "FRIENDS" at the National Resource Center for Community-Based Child Abuse Prevention
FRIENDS and the Tailored Discussion Tool
• Integrating Evidence-Based Practices into CBCAP Programs: A Tool for Critical Discussions – utilized Appendix C, the Capacity Checklist for Implementing with Fidelity
  o CQI Self-Assessment Document
• WSU incorporated questions on data management capacity & programs' ability to use data to inform program practice
Framework for the second round of site visits
• Model Components/Fidelity
• Staff Experience
• Staff Training and Monitoring
• Outcome Measurement/Quality Assurance
• Community Capacity
• Support Available from the Program Developer or Other Technical Assistance Provider
• Funding Availability
• Overall Assessment
Two Programs
Implementing Parents as Teachers
Program One:
• 20 low-income, rural, unemployed, single parents and their children
• Two 30-minute home visits a month
• Parent support groups – 2 to 4 times monthly
• Born to Learn curriculum
Program Two:
• 40 low-income, Native American teen parents, grandparents, and foster families
• Minimum of one personal home visit a month
• Group parent support sessions offered monthly
• Born to Learn curriculum
Two Programs – second site visit:
the details
Strengths
The Framework Comes to Life
PAT Program One:
• Training of staff is intentional and comprehensive – recognition of the staffing needed to effectively run the program
• Community recognizes the need for the program, and families have been receptive; fits in well with other family services offered
• Home visits are offered according to family need
• Program has strong funding sources and is supported within the agency
PAT Program Two:
• One person has set up the pieces for a PAT program without any additional support
Continued Development
The Framework Comes to Life
PAT Program One:
• Program to allocate more resources to meaningful data analysis to inform program practice
PAT Program Two:
• Community and families do not recognize the need for the program
• Staff state the program does not fit well with other services offered
• Groups are the only services being provided to families
• Reflection on what keeps families from engaging in home visits
• Organizational capacity needs on all levels
Direction for Continued Development – all programs
• Allocating more resources to analyze, reflect on, and respond to data at both the individual and programmatic level to inform continuous quality improvement of the evidence-based home visiting model
• True cost of program implementation: administrative costs, in-kind, evaluation, etc.
• Clearly articulate guidelines for income eligibility
Direction for Continued Development – all models
• Pulling data from the MIS system to serve the CQI process is difficult – programs do not have control over pulling individualized reports.
• How much fidelity is enough – what things can't change?
What does it really take to implement an EBHV program in diverse communities?
It is not just about buying a model.
What does it take?
Capacity building
• Support – community, organizational, and program level
• Delivery of core components with fidelity:
  o Model developers – identify core components
  o Organization and program – understand, implement, and monitor core components
  o Program developer – clear monitoring guidelines and targeted technical assistance
• Well-developed program evaluation – using data to inform program service delivery and CQI
• Technical assistance to build capacity – a targeted, individualized, tailored TA provider
Questions?
Nancy Gagliano, LICSW
206-389-3297
[email protected]
www.ccf.wa.gov
Nicole Rose
509-358-7608
[email protected]
www.ahec.spokane.wsu.edu