Developmental Disabilities Program Independent Evaluation

Transcript: Developmental Disabilities Program Independent Evaluation

Developmental Disabilities Program Independent Evaluation (DDPIE) Project
UCEDD Meeting – Technical Assistance Institute
May 31, 2007
Lynn Elinson, Ph.D., Project Director

Developmental Disabilities Program Independent Evaluation (DDPIE) Project
Also known as the “ADD Independent Evaluation”
Purpose of PowerPoint
 To understand the background and progress of the ADD independent evaluation
 To obtain background and context for giving feedback on ADD independent evaluation materials
PowerPoint Outline
1. Background of ADD Independent Evaluation
   A. Purpose of the DDPIE Project
   B. Challenges
2. Research design
3. Project implementation
   A. Overview
   B. Project activities
   C. Evaluation tools
   D. Validation
4. Seeking individualized input
5. Progress and timing
1. Background
A. Purpose of the DDPIE Project
 Demonstrate impact of DD Network programs on:
   – Individuals
   – Families
   – Service providers
   – State systems
 Provide feedback to ADD to help improve the effectiveness of its programs and policies
 Promote positive achievements of DD Network programs by “storytelling”
 Promote accountability to the public
Why the independent evaluation?
 In 2003, ADD conducted a Program Assessment Rating Tool (PART) self-assessment under OMB guidance.
 PART is a series of questions designed to provide a consistent approach to rating programs across the Federal Government.
 PART has four parts: (1) Program Purpose & Design; (2) Strategic Planning; (3) Program Management; and (4) Program Results.
 Part 4 asks whether an agency has conducted an independent evaluation of sufficient scope and quality to indicate that the program is effective and achieving results.
 ADD answered “no,” which lowered its overall score.
Challenges
Each UCEDD program is unique. The challenge is to develop performance standards that:
 are relevant to all UCEDD programs;
 capture the differences among the programs (variability); and
 will be useful to ADD in demonstrating impact.
2. Research design
Design Considerations
 PART prefers experimental or quasi-experimental research designs.
 The structure of the ADD programs does not lend itself to conducting randomized trials or pre- and post-tests.
Research Design: Standards-Based Evaluation
 NOT a randomized controlled trial or quasi-experimental design
 IS a standards-based evaluation that will:
   - Set national standards
   - Determine levels that characterize the extent to which national standards are being met
   - Determine the impact DD Network programs (and collaboration among programs) are having on people with developmental disabilities, family members, State systems, and service providers
Reporting at national level
 Data will be collected on individual programs and rolled up to the national level (see the sketch below).
 The independent evaluation will NOT compare programs to one another.
 The independent evaluation will NOT replace MTARS, which is specific to individual programs.
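
As a rough illustration of what “rolling up” could look like, here is a minimal Python sketch, assuming program-level ratings on a shared set of standards are averaged into a national summary. The standard names, rating scale, and program labels are invented for illustration and are not part of the DDPIE design.

    from collections import defaultdict

    def roll_up(program_ratings):
        """Aggregate per-program ratings on each performance standard
        into a national-level summary. Programs are never compared or
        ranked against one another; only the pooled picture is reported."""
        totals = defaultdict(list)
        for ratings in program_ratings.values():   # one dict of ratings per program
            for standard, level in ratings.items():
                totals[standard].append(level)
        # National summary: mean performance level per standard
        return {std: sum(levels) / len(levels) for std, levels in totals.items()}

    # Hypothetical ratings on a 1-4 performance-level scale
    national = roll_up({
        "Program A": {"training": 3, "research": 4},
        "Program B": {"training": 2, "research": 3},
    })
    print(national)   # {'training': 2.5, 'research': 3.5}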
2 Types of Standards
 Evidence-based
 Consensus-based
 Performance standards for DDPIE are consensus-based.
 Performance standards will be developed for each DD Network program and for collaboration among the three DD Network programs.
Key assumptions for designing performance standards
 State programs vary in their level of performance across the standards.
 Consistently high performance across the standards is related to better outcomes.
 Consistently low performance across the standards is related to poorer outcomes.
Research design: seeks input and participation from stakeholders
Seeks input from:
 Project Advisory Panel
 DD Network Program Working Groups
 All State programs
 Validation Panels
 The public
Role of Advisory Panel
To provide balance, impartiality, and expertise
To provide advice on:
 DDPIE process
 Benchmarks, indicators, performance standards, and performance levels
 Data collection protocols
 Pilot study
 Synthesis of findings and recommendations
Composition of Advisory Panel
 Self-advocates
 Family members
 Representatives from 3 programs – Richard Carroll from Arizona UCEDD
 Child/disability advocates
 Evaluation expert
 Federal representative (for PAIMI evaluation)
Working Groups
 4 Working Groups (P&A, UCEDD, DD Council, Collaboration)
 Process: In-person and telephone meetings
 Role:
   - To assist Westat in understanding the programs
   - To provide feedback on benchmarks, indicators, and performance standards
UCEDD Working Group members
 Carl Calkins – Kansas City, MO
 Tawara Goode – Washington, DC
 Gloria Krahn* – Portland, OR
 David Mank – Bloomington, IN
 Fred Orelove* – Richmond, VA
 Fred Palmer – Memphis, TN
 Lucille Zeph – Orono, ME
*Collaboration Working Group
3. Project implementation
A. Overview
Phases of DDPIE Project
 DDPIE will be conducted in 2 phases.
   - Phase 1 – development and testing of evaluation tools (measurement matrices and data collection protocols)
   - Phase 2 – full-scale evaluation
 Westat was contracted by ADD to implement Phase 1.
   - Project began September 30, 2005
   - End of contract – September 29, 2008
 Phase 2 will be funded upon completion of Phase 1.
B. Project activities
Steps in Phase 1
 Construct evaluation tools (measurement matrices and data collection protocols) that contain performance standards and performance levels
 Conduct a Pilot Study to test the evaluation tools
 Revise the evaluation tools
C. Evaluation tools
2 types of evaluation tools
 Measurement matrices, which include:
   - Key functions, benchmarks, indicators, performance standards
   - Performance levels
 Data collection protocols
Definitions of key terms in measurement matrices
 Key functions
 Benchmarks
 Indicators
 Performance standards
   - Outcome performance standards
   - Program performance standards
Logic model/format for measurement matrices
 Key Functions
 Benchmarks
 Indicators
 Performance Standards
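
One way to picture how these terms nest is as a simple data structure. The sketch below is a hypothetical Python model, assuming the hierarchy described in this deck (key functions group benchmarks, benchmarks are measured through indicators, and indicators carry performance standards); all class names, fields, and example text are invented for illustration, not part of the DDPIE tools.

    from dataclasses import dataclass, field

    # Hypothetical model of one measurement-matrix branch.
    @dataclass
    class PerformanceStandard:
        text: str
        kind: str                # "outcome" or "program"

    @dataclass
    class Indicator:
        text: str
        indicator_type: str      # outcome, output, process, or structural
        standards: list[PerformanceStandard] = field(default_factory=list)

    @dataclass
    class Benchmark:
        text: str
        indicators: list[Indicator] = field(default_factory=list)

    @dataclass
    class KeyFunction:
        name: str                # e.g., "Dissemination of information"
        benchmarks: list[Benchmark] = field(default_factory=list)

    # Invented example showing how one branch of a matrix might be filled in
    matrix = KeyFunction(
        name="Dissemination of information",
        benchmarks=[Benchmark(
            text="Information products reach intended audiences",
            indicators=[Indicator(
                text="Number of products disseminated per year",
                indicator_type="output",
                standards=[PerformanceStandard(
                    text="Products are disseminated at least quarterly",
                    kind="program")])])])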
Key Functions
 Groups of activities carried out by DD Network programs
 Cover all aspects of program activity
 5 UCEDD key functions
 The first four key functions were identified by the Working Group (core functions in the DD Act)
 Governance and Management – relevant to the other four key functions
 Benchmarks, indicators, and performance standards are being developed for all key functions.
UCEDD Key Functions
A. Interdisciplinary pre-service training and continuing education
B. Conduct of basic and/or applied research
C. Provision of community services
D. Dissemination of information
E. Governance and management
Benchmarks
 Broad, general statements
 Set the bar for meeting the expected outcome(s) of each key function
 About 20 UCEDD benchmarks
 3-4 benchmarks for each key function
Indicators
 Identify what gets measured to determine the extent to which benchmarks and performance standards are being met
 4 types of indicators: outcome, output, process, structural
 Will guide the development of data collection instruments
Performance standards
 Criterion-referenced (measurable)
 Consensus-based
 2 types:
   - Outcome performance standards
   - Program performance standards
Outcome performance standards
 Linked to expected outcomes of each key function
 Answer the questions:
   - Were the expected outcomes met?
   - To what extent?
Program performance standards
What the program should achieve, have, and do to effectively:
 - meet the principles and goals of the DD Act; and
 - have an impact on people with developmental disabilities, family members, State systems, and service providers
Program performance standards (continued)
 Linked to the structures, processes, and outputs of the UCEDD program
 Answer the questions:
   - What structures should be in place to carry out UCEDD network key functions? What should they be like?
   - What processes should be used? What should they be like?
   - What should the UCEDD network produce? What should products be like? To what extent should they be produced (e.g., how often, how many)?
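
To make “criterion-referenced” concrete, here is a minimal hypothetical sketch of how a measured indicator value might be scored against a fixed, consensus-set criterion and mapped onto a performance level. The threshold values, the 1-4 scale, and the example indicator are invented for illustration; the DDPIE tools themselves will define the actual criteria.

    # Criterion-referenced scoring: a program is judged against fixed,
    # consensus-set criteria, never against other programs.
    def meets_standard(measured_value: float, criterion: float) -> bool:
        """True if the program meets or exceeds the consensus criterion."""
        return measured_value >= criterion

    def performance_level(measured_value: float, cutoffs: list[float]) -> int:
        """Map a measured value onto ordered performance levels (1 = lowest).
        cutoffs must be sorted ascending, e.g., [0.25, 0.5, 0.75]."""
        level = 1
        for cut in cutoffs:
            if measured_value >= cut:
                level += 1
        return level

    # e.g., share of trainees completing interdisciplinary training
    print(meets_standard(0.82, criterion=0.75))         # True
    print(performance_level(0.82, [0.25, 0.5, 0.75]))   # 4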
D. Validation
Overview of validation
 There is no “gold standard” for an effective UCEDD, so another approach needs to be used to identify performance standards.
 The ADD independent evaluation uses a consensus approach. This implies participation in the process and validation from a wide variety of stakeholders.
 There will be several opportunities for validation throughout the development of performance standards.
 Stakeholders hold a variety of perspectives and, therefore, may not always agree with one another.
Validation approach for DDPIE project
 Consists of obtaining input, feedback, and consensus
 Consists of validating measurement matrices (indicators and performance standards) and data collection instruments
 Is a multi-step process
 Provides validation opportunities to several types of stakeholders (e.g., consumers, family members, program representatives, advocates, evaluation experts)
 Provides opportunities for validation at different points in the process
Opportunities for validation
 Working Group process
 Advisory Panel meetings
 State programs (at TA meetings, by telephone, in writing)
 Validation Panel process
 OMB process
 Pre-test and pilot study
Validation Panels
 There will be 4 Validation Panels (UCEDDs, P&As, DD Councils, Collaboration).
 Process:
   - Telephone call orientation
   - “Paper” approach (not face-to-face) – accommodation will be provided
   - Opportunity for discussion by telephone
Criteria for Validation Panel selection
 Stakeholder groups (e.g., people with developmental disabilities, family members, advocates, programs, service providers)
 Researchers

Criteria for Validation Panel selection (continued)
 Understands consumer needs
 Understands DD Network programs
 Diverse composition (gender, race/ethnicity)
 Mix of junior and senior program staff
 Urban and rural representation
Focus of Validation Panel process
 Will achieve consensus
 Formal process
 Builds in objective methodology (e.g., criteria for eliminating and accepting indicators and performance standards)
OMB approval process is another form of validation
 The OMB approval process results from the Paperwork Reduction Act.
 The Act is administered by the Office of Management and Budget (OMB).
 The purpose of the Act is to ensure that information collected from the public imposes minimal burden and has maximal public utility.
 All Federal agencies must comply.
OMB approval process (continued)
 When contemplating data collection from the public, Federal agencies must seek approval from OMB.
 Agencies must submit an OMB package consisting of a description of the study and data collection effort, an estimate of burden, and the data collection instruments.
 The approval process includes making the data collection instruments available for public comment in the Federal Register.
 ADD will be submitting an OMB package; all interested parties will have an opportunity to comment during the public comment period.
Pre-test and Pilot Study – additional form of validation
 Data collection protocols will be pre-tested in one state.
 A pilot study will be conducted in up to 4 states.
 Pilot study states will be chosen randomly.
 The pilot study will test the reliability and validity of the measurement matrices and the feasibility of data collection.
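
As one concrete example of what a reliability check in the pilot could involve, the hypothetical sketch below computes exact agreement between two independent raters applying the same measurement matrix. The DDPIE pilot’s actual reliability and validity methods are not specified in this presentation; the function and data here are invented for illustration.

    # Hypothetical inter-rater reliability check: exact agreement between
    # two raters who assign performance levels using the same matrix.
    def percent_agreement(rater_a: list[int], rater_b: list[int]) -> float:
        """Share of items on which two raters assigned the same level."""
        assert len(rater_a) == len(rater_b), "raters must score the same items"
        matches = sum(a == b for a, b in zip(rater_a, rater_b))
        return matches / len(rater_a)

    # Performance levels (1-4) assigned to ten standards by two raters
    print(percent_agreement([3, 2, 4, 1, 3, 3, 2, 4, 1, 2],
                            [3, 2, 4, 2, 3, 3, 2, 4, 1, 3]))   # 0.8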
4. Seeking individualized input
Opportunities for individualized input
 UCEDD TA meeting (May 31, 2007)
   - Distribution of draft benchmarks, indicators, and a few examples of performance standards
   - Small group discussions facilitated by AUCD
 Telephone meetings scheduled in June and July
 In writing
Small Group Discussions at UCEDD Technical Assistance Meeting (May 31, 2007)
 Westat will:
   - Distribute draft performance standards on UCEDD Network and Collaboration
   - Review organization of materials
   - Describe feedback process for individual UCEDD programs
   - Answer questions on process for feedback
 UCEDD programs will:
   - Continue to meet in small groups to discuss the materials (facilitated by AUCD)
   - Report out in a large group on first impressions
Type of Input Sought
 Benchmarks and indicators: Are they the concepts that need to be addressed?
 Benchmarks and performance standards: Do they represent what the programs should be achieving/should have/should do in order to be effective in meeting the principles and goals of the DD Act and have an impact on people with developmental disabilities, families, State systems, and service providers?
 Indicators: Which seem the most important and feasible to measure? Which could be eliminated?
 If not these, then what?
5. Progress and Timing
Progress to Date
 Meetings with ADD, heads of national associations, and TA contractors – November 2006
 Site visit to programs in one state – December 2006
 Review of background materials (provided by ADD, Working Groups, national websites, other) – October 2005 – February 2007
 Meetings with Working Groups – March 2006 – September 2006
 Meetings with Advisory Panel – March 2006, October 2006, March 2007
 Synthesis of all information by Westat – September 2006 – February 2007
 Draft benchmarks, indicators, and performance standards – February 2007
Upcoming DDPIE Project Milestones
 Feedback from UCEDD Working Group – April–May 2007
 UCEDD TA meeting – May 31, 2007
 Feedback from all UCEDD programs – June–July 2007
 UCEDD Validation Panel – September–December 2007
 DD Council Validation Panel – October 2007 – January 2008
 P&A Validation Panel – November 2007 – February 2008
 Collaboration Validation Panel – February–April 2008
DDPIE Project Milestones (continued)
 Data collection instruments – June 2008
 Measurement matrices – July 2008
 Final report (with evaluation tools) – September 2008
 OMB Comment Period
 Pilot Study
 New contract