Washington 21st Century Community Learning Centers Program Evaluation
Copyright © 2010 American Institutes for Research. All rights reserved.
February 2011
Introduction
• Provide a summary of our proposed approach to undertaking the evaluation:
  - Objectives
  - Methods
  - Instruments
  - Timeline
The Evaluation Team
• American Institutes for Research
  Recent merger with Learning Point Associates
  Demonstrated 21st CCLC and afterschool content knowledge
  - Statewide 21st CCLC and afterschool evaluation and research studies in New Jersey, South Carolina, Texas, and Wisconsin
  - Responsible for the development and maintenance of the Profile and Performance Information Collection System (PPICS)
  - Supports the U.S. Department of Education in monitoring state delivery of 21st CCLC
  - Provider of afterschool training and technical assistance based on our Beyond the Bell toolkit; currently serves as the statewide training and technical assistance provider for 21st CCLC in Illinois
  Robust methodological, statistical, and psychometric skills
The Evaluation Team
• David P. Weikart Center for Youth Program Quality
  Developers of the Youth and School-Age PQAs
  Working to build program quality systems in 20 states, including 12 statewide 21st CCLC implementations, among them Washington
  Rigorously tested intervention strategy for improving the quality of youth-serving programs
Timely and Highly Relevant Topics
• The current afterschool literature indicates:
  An uneven level of effectiveness in supporting the achievement of positive academic outcomes.
  Various paths to supporting student achievement are possible.
  A need to define quality criteria and to intentionally support quality improvement efforts.
Evaluation Objective One
• Provide an Assessment of the Current State of 21st CCLC Program Impact
  Conduct an analysis of 2009-10 regular attendee and other program data obtained from PPICS (February to April 2011)
  Synthesize results from the local evaluation reports submitted for the 2009-10 programming period (February to April 2011)
  Meet with program directors and local evaluators to discuss findings and ways to further align local and statewide evaluation efforts (May 2011)
Evaluation Objective Two
• Support the PPICS Reporting Process and the Collection of Student-Level Data
  Conduct five webinars on entering data into PPICS and support the collection of PPICS-related data (April 2011–June 2011)
  Modify the Regular Attendee module in PPICS to collect additional information about students to support data merges with the state assessment data warehouse (July 2011–October 2011)
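The planned merge of Regular Attendee records with the state assessment data warehouse is, at its core, a join on a shared student identifier. A minimal sketch of that idea follows; the field names, IDs, and scores are purely illustrative, not the actual PPICS or warehouse schema.

```python
# Hypothetical sketch of the planned data merge: attendee records joined to
# state assessment records on a shared student identifier. Field names and
# values are mock illustrations, not the real PPICS or warehouse layout.

attendees = [
    {"student_id": "S001", "days_attended": 42},
    {"student_id": "S002", "days_attended": 17},
    {"student_id": "S003", "days_attended": 65},
]
assessments = {
    "S001": {"reading": 412, "math": 398},
    "S003": {"reading": 455, "math": 440},
    # S002 has no assessment record, so it drops out of the inner join
}

# Inner join on student_id: keep only attendees with a matching assessment
merged = [
    {**a, **assessments[a["student_id"]]}
    for a in attendees
    if a["student_id"] in assessments
]

for row in merged:
    print(row)
```

In practice, unmatched records would be tracked rather than silently dropped, which is one reason the module modification collects additional identifying information about students.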
Evaluation Objective Three
• Document the Extent to Which 21st CCLC Programs Are Meeting Local, State, and Federal Targets and Goals
  Collect data and conduct analyses to assess the impact of program participation on state assessment results in reading and mathematics, as compared with a nonparticipant group (July 2011–October 2011)
  Create a new data collection module in PPICS to collect information about the nature of local evaluation efforts and how evaluation data are being used to support program improvement efforts
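The participant-versus-nonparticipant comparison can be illustrated with a toy example: each participant is paired with the nonparticipant whose prior-year score is closest, and the mean difference in current-year scores is computed. This is only a sketch of the general idea; the matching rule, field names, and scores are hypothetical, not the evaluation's actual methodology.

```python
# Hypothetical sketch of a matched participant-vs-nonparticipant comparison.
# All values and the nearest-neighbor matching rule are illustrative only.
from statistics import mean

participants = [
    {"prior": 400, "current": 418},
    {"prior": 430, "current": 451},
]
nonparticipants = [
    {"prior": 401, "current": 410},
    {"prior": 432, "current": 445},
    {"prior": 470, "current": 480},
]

def closest_match(student, pool):
    """Pick the nonparticipant whose prior score is nearest the participant's."""
    return min(pool, key=lambda s: abs(s["prior"] - student["prior"]))

# Difference in current-year scores for each matched pair
diffs = [
    p["current"] - closest_match(p, nonparticipants)["current"]
    for p in participants
]
print(f"mean matched difference: {mean(diffs):.1f}")
```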
Evaluation Objective Four
• Identify Characteristics Associated With High-Performing Programs
  Collect additional information about center and staff practices from site coordinator and staff surveys (February 2011–May 2011)
  Conduct analyses to assess the relationship between student outcomes and both (a) the level of program participation and (b) program and student characteristics (July 2011–October 2011)
  Replicate these analyses with a subset of centers, incorporating Youth PQA data
Analysis of Program Impact
• Evidence that students participating more frequently demonstrated better performance.
• Evidence of a relationship between center and student characteristics and the likelihood that students demonstrated better performance.
• Evidence that students participating in 21st CCLC demonstrated better performance as compared to similar students not participating in the program.
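The first type of evidence, a link between attendance frequency and performance, amounts to checking whether attendance and outcome measures move together. A minimal sketch using a Pearson correlation on mock data follows; the records and field meanings are hypothetical, not drawn from actual PPICS or assessment data.

```python
# Hypothetical sketch of the attendance-performance evidence check: does more
# frequent participation go with better performance? The (days attended,
# score gain) pairs below are mock values, not real student records.
from statistics import mean

records = [(12, -1.0), (25, 2.0), (40, 3.5), (55, 4.0), (70, 6.5)]

def pearson_r(pairs):
    """Pearson correlation between days attended and score gain."""
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

r = pearson_r(records)
print(f"attendance-gain correlation: {r:.2f}")
```

A real analysis would control for student characteristics rather than rely on a raw correlation, which is why the objectives also collect center- and student-level covariates.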
Evaluation Objective Five
• Increase the Capacity of Grantees to Meet Their Program Improvement and Evaluation Obligations
  Design and prepare leading indicator reports (February 2011–June 2011)
  Conduct two regional trainings on how technical assistance providers/coaches can use the leading indicator reports in their site-level work (September 2011)
  Conduct six webinars on improvement strategies and techniques that are aligned with improvement priorities identified in the leading indicator reports (September 2011–October 2011)
How Grantees Will Be Impacted
• Rolling PPICS deadlines
  - Operations, Staffing, Feeder Schools, Partners
  - Activities, Attendance
  - Objectives, Regular Attendee Data
• Reporting of all student data in the Regular Attendee module of PPICS
• Completion of the new local evaluation module in PPICS
How Grantees Will Be Impacted
• Participation in site coordinator and staff surveys
  Two-stage process: (1) grantees provide information about staff, and (2) staff complete the surveys
• Participation in events to shape and review the leading indicators
• Participation in training events on how to use leading indicators to inform program improvement efforts
Neil Naftzger
E-Mail: [email protected]
1120 East Diehl Road, Suite 200
Naperville, IL 60563-1486
General Information: 800-356-2735
Website: www.learningpt.org