Washington 21st Century
Community Learning Centers:
From State to Local Program
Evaluation
Copyright © 2010
American Institutes
for Research
All rights reserved.
June 2011
The Evaluation Team
American Institutes for Research
• Recent merger with Learning Point Associates
• Responsible for the development and
maintenance of the Profile and Performance
Information Collection System (PPICS)
• Demonstrated 21st CCLC and afterschool content
knowledge
• Other statewide evaluations of 21st CCLC in New
Jersey, Texas, and Oregon
2
The Evaluation Team
David P. Weikart Center for Youth Program
Quality
• Developers of the Youth and School-Age PQAs
• Working to build program quality systems in 20 states, 12 of them statewide 21st CCLC
implementations, including Washington
• Rigorously tested intervention strategy for improving
the quality of youth-serving programs
3
Evaluation Objectives
• Provide an Assessment of the Current State of 21st CCLC
Program Impact
• Support the PPICS Reporting Process and the Collection of
Student-Level Data
• Document the Extent to Which 21st CCLC Programs Are
Meeting Local, State, and Federal Targets and Goals
• Identify Characteristics Associated With High-Performing
Programs
• Increase the Capacity of Grantees to Meet Their Program
Improvement and Evaluation Obligations
4
Overview
• Provide an overview of the 21st CCLC
program at the state level
• Provide an overview of the 21st CCLC
program at the local level
• Provide recommendations for future
directions
5
Assessment of the Current
State of 21st CCLC
• Define the characteristics of the current 21st
CCLC programs in Washington
• Assess program outcomes with participating
students
• Examine local evaluation reports and their alignment
with state-level data
• Examine the quality of local evaluation reports to
support state guidance
6
Data Sources
• 21st CCLC Profile and Performance
Information Collection System (PPICS)
 Grantee- and center-level data
 From the 2010 Annual Performance Report (APR)
• 45 local evaluation reports prepared
for the 2009-10 reporting period
7
Characteristics of 21st CCLC
Programs in Washington
• 46 active 21st CCLC grantees (21 new, 23
mature, 2 sustaining)
• A total of 172 centers
• Median first-year award: $333,886
• 54 percent of grants have a school district
as the fiscal agent
• 96 percent of centers located in schools
• 34 percent offer summer programming
8
Characteristics of 21st CCLC
Centers in Washington
During Academic Year:
• Average of 9.9 hours of programming after
school each week
• Average of 4.4 days per week over 32
weeks
During Summer:
• Average of 20 hours of programming per
week
• 4.4 weeks of programming
9
Characteristics of 21st CCLC
Centers in Washington
• 68 percent of students were regular attendees
• Multiple grade levels served; 38 percent of
centers were “elementary only”
• Of the staff, 25 percent were paid school
teachers and 35 percent were volunteers
• Multiple types of activities offered; 39 percent of
centers were “mostly enrichment” and 19 percent
were “mostly homework help”
10
Characteristics of 21st CCLC
Centers in Washington
[Slides 11-13 presented charts of center characteristics; graphics not reproduced in this transcript]
11-13
21st CCLC Program Student
Outcomes
1. Are higher levels of attendance in 21st
CCLC programming related to the desired
academic and behavioral outcomes?
2. Are particular center and student
characteristics associated with student
academic and behavioral improvement?
14
State Assessment Outcomes
15
Attendance & Program Outcomes
• Significant positive relationship between # of
days in the program and improved behavior
(based on teacher surveys).
• Higher levels of program attendance were not
significantly related to improved performance
on state assessments in reading and
mathematics in 2009-10 (among students
who scored below proficiency in 2008-09).
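A rough illustration of the kinds of checks behind these two findings is sketched below. The file name, column names, and coding are hypothetical (not the actual PPICS or state assessment extract), and the evaluation's actual analyses would be more elaborate than these simple significance tests.

```python
# Simplified sketch of the attendance-outcome checks described above.
# The CSV file and column names are hypothetical placeholders, and a
# full evaluation would use models that account for students being
# nested within centers.
import pandas as pd
from scipy import stats

df = pd.read_csv("wa_21cclc_students_2009_10.csv")  # hypothetical extract

# (1) Do days attended differ between students whose teachers reported
#     improved behavior (1) and those whose teachers did not (0)?
improved = df.loc[df["behavior_improved"] == 1, "days_attended"]
not_improved = df.loc[df["behavior_improved"] == 0, "days_attended"]
t, p = stats.ttest_ind(improved, not_improved, equal_var=False)
print(f"behavior improvement: t = {t:.2f}, p = {p:.4f}")

# (2) Among students below proficiency in 2008-09, are days attended
#     correlated with 2009-10 reading and mathematics score gains?
below = df[df["proficient_2008_09"] == 0]
for gain in ("reading_gain", "math_gain"):
    r, p = stats.pearsonr(below["days_attended"], below[gain])
    print(f"{gain}: r = {r:.2f}, p = {p:.4f}")
```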
16
Program Characteristics &
Outcomes
School-based centers:
• More likely to be associated with teacher reports
of higher improvement rates.
• More likely to demonstrate significant
improvement in elementary students’
mathematics proficiency level.
• Students who scored in the lowest proficiency
category in the previous year (e.g., Well Below
the Standard) were more likely to demonstrate
improvement than higher-performing students.
17
Program Characteristics &
Outcomes
• In centers classified as “mostly tutoring,”
elementary students’ mathematics assessment
scores were more likely to improve.
• In centers classified as “mostly recreation,”
teachers reported lower rates of improvement in
student behaviors.
• In centers staffed by “mostly teachers,” teachers
reported higher levels of improvement in
motivation, attentiveness, and
homework completion and quality.
18
Local Evaluation Reports
1. What types of student outcomes do the
programs target, and how do they
measure these outcomes?
2. What evidence can be obtained from the
local evaluation reports regarding how
programs may impact student outcomes?
19
Student Attendance
• Grantee goals: All grantees took attendance,
but only 20 identified increasing attendance
and retention as a program objective.
• Attendance was highest in programs serving
elementary school students and averaged 80
percent or higher.
• Many programs aiming for a high percentage
of regular attendees did not meet their
objectives.
20
Academic Performance
• Grantee goals: All grantees aimed to increase
academic performance.
• The PPICS teacher survey and change in state
assessment scores were the most commonly
used measures of academic performance.
• Many reports stated objectives to achieve a
specific percentage increase in student
achievement, but only a few reported the percent
change in achievement.
21
Academic Performance
• Almost all reports shared findings descriptively.
• Findings were more positive for teacher reports
of improvement than for changes in state
assessment scores.
• A large number of reports shared only the
percentages of students who improved,
ignoring those whose behaviors declined and
those who had no need to change.
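A hypothetical illustration of why improvement-only percentages mislead: suppose 100 students are rated, 45 improve, 30 decline, and 25 were already meeting expectations. Reporting only “45 percent improved” hides both the 30 percent who declined and the 25 percent who had no room to grow, and the same 45 percent figure could equally describe a program in which no student declined.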
22
Student Behavior and Attitudes
Grantee goals:
• Increase positive attitudes towards and sense
of connection to school (most common)
• Decrease referrals and negative behavior
• Increase student skills
• Increase student exposure to enrichment and
community involvement activities
• Increase exposure to career and post-secondary
education opportunities
23
Student Behavior and Attitudes
• Data were collected from student surveys, teacher
surveys, student focus groups, and parent surveys.
• Teacher reports suggested that the majority of
students improved, although significant
variation was commonly reported.
• Evaluations that targeted more specific skills
and behaviors reported positive change (e.g.,
reduced discipline referrals, fewer missed
days, increased knowledge of careers).
24
Parental Involvement
Grantee goals:
• Increase parental involvement in children’s
education
• Increase parent attendance at program activities
• Increase parent skills and knowledge in
English, literacy, and community resources
25
Parental Involvement
• Data were collected through parent surveys,
interviews, and informal feedback, but consisted
primarily of reports of parent attendance at
program activities.
• Only a small number of programs achieved
their parental involvement goals.
 Mostly reports of attendance at social events
 Few reported increased parent involvement
in students’ education
 Few reported increased levels of skills and
knowledge in parenting, ESL, or GED preparation
26
Analysis of Local Evaluation
Reports
• Largely descriptive analysis to assess program
impact
• Assessment of outcomes based on PPICS
teacher surveys and district-level assessments
• Lack of meaningful assessment of student
behavioral growth common
• Assessment of parental involvement based on
attendance
• Results not rigorously compiled
27
Quality of Local Evaluation
Reports
Outcome                                        Average   Above       Acceptable   Marginal   Below
                                                         Standards   Quality      Quality    Standards
                                                         (4)         (3)          (2)        (1)
Clarity of Presentation                          3.0        18          13           13          1
Comprehensiveness of Content                     2.8         9          21           13          2
Relevance of Content to Goals and Objectives     3.0        14          17           13          1
Rigor of Evidence                                2.4         1          18           25          1
Report Requirements                              3.4        18          27            0          0
28
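Each figure in the Average column is the count-weighted mean of the four rating values across the 45 reports. For the Report Requirements row, for example:

Average = (4 × 18 + 3 × 27 + 2 × 0 + 1 × 0) / 45 = 153 / 45 = 3.4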
Quality of Presentation
• Average is 3.0
• A large number of reports were rated
Above Standards:
 Clearly organized
 Materials presented in an order that described
the program’s goals and objectives
 Activities described clearly, providing
an overview of the program
29
Comprehensiveness of Content
• Average is 2.8
• A large number were rated Acceptable or Marginal
Quality
 Varying degrees of detail
 Inconsistent reporting of data collection tools
30
Relevance of Content to Goals &
Objectives
• Average is 3.0
• A large number were rated Acceptable or
Marginal Quality
 Did not reflect the program’s theory of action
 Objectives were not presented according to SMART criteria
 Assessments were not aligned with program goals
and intended outcomes
31
Rigor of Evidence
• Average is 2.4
• More than half of the reports were rated
Marginal Quality
 Did not triangulate findings from different sources to
support the evidence
 Descriptive reporting provides little evidence that
outcomes are due to program participation
 Limited information on the number of respondents (e.g.,
survey response rates, focus group participants)
 Little insight into program implementation and quality
32
Alignment with Report
Requirements
• Average is 3.4
• In general, reports followed state
requirements and guidelines.
• The weakest aspect was meaningful discussion of
findings and recommendations for future
decision making.
33
Recommendations on Evaluation
Design
• Identify evaluation questions
• Develop a logic model to guide program
evaluation
• Consider the most rigorous evaluation design
that resources allow
• Align data collection tools with goals and
objectives rather than with what is available
 Use measures that better assess student
behavioral outcomes
34
Recommendations on Program
Implementation and Quality
• Incorporate program monitoring into the annual
evaluation
• Adopt a self-assessment tool on
program supports and quality of activities
• Develop an evaluation team
35
Recommendations on Reporting
• Use a template to report findings
 Developed and disseminated by OSPI
• Require more insightful recommendations in
evaluation reports that will guide action
plans.
36
State-Level Recommendations
• Consider the adoption of one or more measures to
assess social-emotional functioning and other
behaviors related to academic functioning (e.g., task
persistence, organizational skills)
• Leverage local assessment data to both (a) inform
the design and delivery of programming and (b)
assess student growth and development
• Consider adopting indicators that ask programs to
measure within-year student growth on formative
assessments employed by the districts they are
working with
37
State-Level Recommendations
• Further examine the relationship between student
recruitment and enrollment policies and the
achievement of desired outcomes
• Find ways to connect the leading indicators currently
being developed with local evaluation efforts
38
Activity
- Pair up or form a group with the participants sitting next to you.
- Reflect on the findings that were presented.
- Share with the group:
1. How can the local evaluations supplement state
data rather than duplicate the findings?
2. What are the most feasible short-term and long-term
recommendations for improving local evaluations?
3. How can the state provide guidance to
implement the short-term and long-term
recommendations?
39
Manolya Tanyu, Ph.D.
Researcher
American Institutes for Research
20 North Wacker Dr., Suite 1231
Chicago, IL 60606
P: 312-288-7611
E-Mail: [email protected]

Neil Naftzger
Principal Researcher
American Institutes for Research
1120 East Diehl Road, Suite 200
Naperville, IL 60563
P: 630-649-6616
E-Mail: [email protected]

General Information: 800-356-2735
Website: www.air.org
40