
Webinar: Program Evaluation Toolkit
August 9, 2012
Toolkit Series from the Office of Migrant Education

Comprehensive Needs Assessment (CNA) Toolkit

Service Delivery Plan (SDP) Toolkit

Program Evaluation Toolkit




Third in a series of three documents
Purpose is to provide guidance and resources to state directors for conducting useful Program Evaluations (PE)
Draws upon good practices gleaned from the Program Evaluation field
Aims to help fill gaps between available guidance and OME's expectations for high-quality evaluation reports.



Designed to help state directors align their Program Evaluation with the requirements of the Federal legislation and Guidance
Adaptable to programs of different sizes
Can be used as a "gauge" of how much outside expertise on Program Evaluation you may need, or as a "guiding document" for your current program evaluator.

Resource Document
Step-by-step approach
Flexible Tool – can use the whole thing or just certain sections
Customizable tools and charts in appendices
Includes questions for application
Web- and Word-based document



Online Documentation
Code of Federal Regulations, Title 34, Section 200.83
http://ecfr.gpoaccess.gov/cgi/t/text/text-idx?c=ecfr&tpl=/ecfrbrowse/Title34/34tab_02.tpl
The Code of Federal Regulations (CFR) is the codification of the general and permanent rules
published in the Federal Register by the executive departments and agencies of the Federal
government. Title 34, Section 200.83 specifies that it is the responsibility of the state education
agency to implement projects through a Comprehensive Needs Assessment (CNA) and a
comprehensive state plan for service delivery.
Elementary and Secondary Education Act (ESEA), Section 1306
http://www2.ed.gov/policy/elsec/leg/esea02/index.html
ESEA is the federal statute overseeing primary and secondary education. Section 1306 refers
specifically to the CNA and Service Delivery Plan (SDP), as well as other authorized activities.
Non-Regulatory Guidance for Title I, Part C Education of Migratory Children
http://www.ed.gov/programs/mep/mepguidance2010.doc
The Guidance (for Title I, Part C of the ESEA) is designed to help state education agencies and local operating agencies use migrant education program funds to develop and implement supplemental educational and support services to assist migratory children. Chapter IV of the document refers specifically to the procedures for conducting a CNA and the development of an SDP.
CHECKLIST FOR STATE MIGRANT EDUCATION PROGRAM EVALUATION

Includes a written evaluation plan in the statewide Service Delivery Plan (SDP), which specifies how you will collect data related to (1) the implementation of MEP activities and services and (2) the results achieved through these services and activities

Collects data on state performance targets related to performance goals 1 and 5 for each grade, disaggregated for Priority for Services (PFS), other migrant, and non-migrant students

If applicable, collects data on additional state performance targets for school readiness and other needs, disaggregated for PFS, other migrant, and non-migrant students

Collects data on six Government Performance and Results Act (GPRA) measures and reports them annually to the Office of Migrant Education, to be used in the evaluation of the federal MEP

Collects data on measurable program outcomes (MPOs) established for all MEP activities and services, disaggregated for PFS and other migrant students

Notifies local MEPs in advance of specific data needed for the statewide evaluation and provides guidance for how to collect the necessary data

Provides guidance to local MEPs on what to evaluate locally and how to evaluate it

Develops a plan for monitoring the progress of local MEPs against state and local MPOs and for making decisions about the continuation of subgrants

Develops a plan for reviewing all evaluation findings and using the results to improve services to migrant children

Documents the evaluation in a written report, including the purpose of the evaluation, what data were collected and how they were collected, the findings of the implementation evaluation, results for PFS and other migrant students compared to all other students, and the implications for making decisions about MEP activities and services
1. What is a Continuous Improvement Cycle?
2. What is the purpose of the MEP Evaluation Toolkit?
3. What flexible features of the Toolkit will be most useful to you?

Please dial *1 on your telephone keypad to provide a brief response to any of the above questions.

Section A: Introduction & Overview
Section B: Overview of Statutes, Regulations, and Non-Regulatory Guidance
Section C: Planning the Evaluation
Section D: Collecting Evaluation Data
Section E: Analyzing and Interpreting Data
Section F: Communicating Evaluation Findings
Section G: Using Evaluation Findings

Preview:
http://center.serve.org/nche/ome_toolkits/pe.html

MPOs are outcomes the MEP sets in order to enable migrant students to achieve the state migrant student performance targets. MPOs are:
  Focused
  Detailed
  Quantifiable
  Clear about what you would consider a "success" in meeting a particular need

In addition, MPOs clearly define:
  Which students will participate
  What will happen in the program
  What is expected to happen as a result of participation in the migrant program
  In what time frame this will occur
Example of how a need, strategy, and MPO align with a state performance target:

Need: In grades 3 & 8, the percentage of migrant students attaining proficiency or advanced in reading & language arts was 73.2%, compared to 86.7% for all students.

Strategy 1.1: Provide parents of migrant students with a menu of support services to help them support their child's academic accomplishments.

MPO: From 2013-2015, children whose parents attended at least 50% of the reading and homework support workshops will perform, on average, 10 percentage points higher on the state reading test than children of non-participants.

State performance target: All students will reach high standards, at a minimum attaining proficiency or better, in reading/language arts.
Strategy: Provide parents of migrant students with a menu of support services to help them support their child's academic accomplishments.
MPO: From 2013-2015, children whose parents attended at least 50% of the reading and homework support workshops will perform, on average, 10 percentage points higher on the state reading test than children of non-participants.

What are some relevant Implementation Questions?
To what extent were programs delivered as intended?

What are some relevant Results Questions?
How much change occurred as a result of the strategy?
State performance target 1: All students will reach high standards, at a minimum attaining proficiency or better in
reading/language arts and math.
Strategy: Provide parents of migrant students with a menu of support services to help them support their child’s
academic accomplishments.
MPO 1a: From 2013-2015, children whose parents attended at least 50% of the reading and homework support
workshops will perform on average 10 points higher on the state reading test than children of non-participants.
Evaluation questions, with data drawn from parent focus groups, service provider interviews, and state test scores:

Implementation
1. What support services are most (and least) requested by parents?
2. Are program staff trained/qualified to provide these services?
3. Do services help parents support children academically?

Results
1. How much better do the children of parents who attended at least 50% of the workshops perform on the state reading test compared to a sample of children whose parents did not participate in the reading and homework support workshops?
2. How much better do the PFS children of parents who attended at least 50% of the workshops perform on the state math test compared to a sample of PFS children whose parents did not participate in the math and homework support workshops?
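As a rough illustration of how a results question like MPO 1a might be checked once test scores and workshop attendance are in hand, the short Python sketch below compares the average state reading score of children whose parents attended at least 50% of the workshops with children of non-participants. It is not part of the toolkit; the file name and column names are hypothetical placeholders, and a real evaluation would also address sampling and statistical significance.

# Illustrative sketch only -- not part of the toolkit. Checks a results
# question like MPO 1a by comparing average state reading scores for
# children whose parents attended at least 50% of the workshops with
# children of non-participants. The file name and column names
# (reading_score, workshops_attended, workshops_offered) are hypothetical.
import csv

def mean(values):
    return sum(values) / len(values) if values else float("nan")

participants, comparison = [], []
with open("mpo1a_students.csv", newline="") as f:
    for row in csv.DictReader(f):
        attendance = int(row["workshops_attended"]) / int(row["workshops_offered"])
        score = float(row["reading_score"])
        if attendance >= 0.5:
            participants.append(score)
        elif attendance == 0:
            comparison.append(score)
        # children whose parents attended some, but fewer than half, of the
        # workshops fall outside both MPO groups and are left out here

gap = mean(participants) - mean(comparison)
print(f"Participant group mean: {mean(participants):.1f}")
print(f"Comparison group mean:  {mean(comparison):.1f}")
print(f"Difference: {gap:.1f} points (MPO 1a target: 10 points higher)")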
1. What is a Measurable Program Outcome (MPO)?
2. What are specific components of a strong MPO?
3. What is an implementation question?
4. What is a results question?

Please dial *1 on your telephone keypad to provide a brief response to any of the above questions.
Gives an overview of the main types of data and statistical analyses
Provides 2 key appendices:
  Using Microsoft Excel to analyze quantitative data
  Using Microsoft Access to analyze qualitative data
Meant as a primer and resource
The concepts that underlie high-quality data analysis require expertise and training, and entire volumes have been written on any single topic.
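As one small illustration of the kind of quantitative summary the Excel appendix walks through, the Python sketch below disaggregates the percentage of students scoring proficient or above by subgroup (PFS, other migrant, non-migrant). It is not taken from the toolkit; the file name, column names, and performance-level labels are assumptions.

# Illustrative sketch only -- not drawn from the toolkit's Excel appendix.
# Disaggregates the percentage of students scoring proficient or above by
# subgroup (PFS, other migrant, non-migrant). The file name, column names,
# and level labels are hypothetical placeholders.
import csv
from collections import defaultdict

counts = defaultdict(lambda: [0, 0])  # subgroup -> [proficient_or_above, total]

with open("assessment_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        subgroup = row["subgroup"]  # e.g. "PFS", "other migrant", "non-migrant"
        is_proficient = row["performance_level"].lower() in ("proficient", "advanced")
        counts[subgroup][0] += int(is_proficient)
        counts[subgroup][1] += 1

for subgroup, (n_prof, n_total) in sorted(counts.items()):
    print(f"{subgroup}: {100 * n_prof / n_total:.1f}% proficient or above (n={n_total})")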

Knowledge check:
  What are aspects of effective Evaluation Reports?
  What detracts from an effective Report?
Please dial *1 on your telephone keypad to provide a brief response to any of the above questions.

Communicating Findings
  Pointers
  Reflection questions
  Online resources

Using Findings
  Protocol for using findings in the Continuous Improvement Cycle
  Online resources
1. With whom should you share the MEP evaluation report?
2. What are aspects of effective evaluation reports?
3. How can an evaluation report be used as part of the continuous improvement cycle?

Please dial *1 on your telephone keypad to provide a brief response to any of the above questions.

The finalized document will be available on the RESULTS website.

Questions can be directed to Irene Harwarth: [email protected]