Transcript Slide 1

Do After-school Programs Affect Important
Youth Outcomes? If So, Do We Know Why?
Robert C. Granger, Ed.D.
Remarks prepared for “Making a Difference in After-school - Measuring and Improving Program
Quality”
Sacramento, CA / March 17, 2009
Two questions
• Do after-school programs improve academic
performance?
• Do we know why some programs make a
difference while others do not?
Two answers
• Yes*
• Starting to…
*Yes, but…
The review
Background
Policymakers and practitioners want to know if after-school
programs affect academic achievement.
Goal
Review strong evidence regarding the effects of after-school
programs and examine the practices of effective programs.
Method
Summarize the results from three rigorous reviews of over 90
evaluations of after-school programs.
Society for Research in Child Development. (2008, April).
After-school Programs and Academics: Implications for
Policy, Practice, and Research. (Social Policy Report Vol.
XXII, No. 2). Ann Arbor, MI: Robert C. Granger.
Society for Research in Child Development. (2008, April).
Improving After-school Programs in a Climate of
Accountability. (Social Policy Report Brief Vol. XXII, No. 2).
Ann Arbor, MI.
http://www.srcd.org/spr.html
The findings
• On average, after-school programs improve important academic outcomes like test scores and grades.
• A subset of the evaluated programs that achieved outstanding results accounts for the overall positive picture.
• The most effective programs had explicit goals, aligned their activities with those goals, and got youth actively involved in their own learning.
The two most important questions facing
policymakers and practitioners in education
and youth programs:
• What do effective teachers, youth workers, or
mentors do differently than their less effective
colleagues?
• Can you make teachers, youth workers, or
mentors more effective?
Sources of useful information about
both questions
• Practitioner consensus on best practices (Forum for Youth Investment, 2003)
• In-depth studies of program practices (Halpern, Larson, Hirsch)
• Practitioner efforts to improve program effectiveness (many)
• Measures of program quality (Forum for Youth Investment, 2009)
Measuring what matters
• Importance of the point-of-service.
• Good measures have clear, unambiguous items.
• The best measures also teach.
Making a Difference in After School:
Measuring and Improving
After School Quality
Nicole Yohalem,
Forum for Youth Investment
Sacramento, CA
March 17, 2009
Quality assessment tools
• Assessing Afterschool Program Practices Tool (APT)
  National Institute on Out-of-School Time and the MA Department of Education
• CORAL Observation Tool (CORAL)
  Public/Private Ventures
• Out-of-School Time Observation Instrument (OST)
  Policy Studies Associates
• Program Observation Tool (POT)
  National Afterschool Association
• Program Quality Observation (PQO)
  Deborah Vandell and Kim Pierce
• Promising Practices Rating Scale (PPRS)
  WI Center for Education Research and Policy Studies Associates, Inc.
• Quality Assurance System (QAS)
  Foundations Inc.
• Program Quality Self-Assessment Tool (QSA)
  New York State Afterschool Network
• School-Age Care Environment Rating Scale (SACERS)
  Frank Porter Graham Child Development Center, UNC
• Youth Program Quality Assessment (YPQA)
  High/Scope Educational Research Foundation
Measuring Youth Program Quality: A Guide to Quality Assessment Tools (updated January 2009)
Quality assessment tools
There is a lot of similarity in how quality
practice is defined. All tools assess:
• Relationships
• Environment
• Engagement
• Social/Behavioral Norms
• Skill Building Opportunities
• Routine/Structure
Note: CA self-assessment tool includes items that address these areas.
Measuring what matters
• Importance of the point-of-service.
• Good measures have clear, unambiguous items.
• The best measures also teach.
Emphasis on point-of-service
• CA Tool: 16 of 77 items focus on POS
• SACERS & NAA: fewer than half of items focus on POS
• APT & YPQA: more than half of items focus on POS
Clear and unambiguous?
Examples from the CA tool:
High inference:
• Ensures staff & volunteers have respectful
interactions with participants & families.
Low inference:
• Regularly provides families with program
information in multiple languages and literacy
levels.
Measures that teach?
Examples from the CA Tool:
Diagnostic
• Provides opportunities & support for participants to
take on leadership roles.
Diagnostic and prescriptive
• Regularly provides collaborative partners with
program information, such as program progress and
evaluation reports and information about program
events, in a variety of formats and in multiple
languages if appropriate.
Quality improvement
Key components of quality improvement
systems:
• Quality standards that include what should happen
at the point of service
• Ongoing assessment of how well services compare
to the standards
• Targeted plans for how to improve
• Training and coaching that fits improvement plans
Emerging examples and lessons
• Afterschool Program Assessment System (APAS)
National Institute on Out-of-School Time
• Youth Program Quality Intervention (YPQI)
Weikart Center for Youth Program Quality
APAS pilot
• Conducted by NIOST, Wellesley College
• October 2006-July 2008
• Atlanta, Boston, Charlotte, Middlesex Cnty NJ
• 65 individuals, 28 programs, 3 intermediaries
• Well-established K-8 after-school programs
• Low stakes
• Emphasis on continuous improvement, flexibility
Core APAS tools and supports
Tools
• Survey of Afterschool Youth Outcomes Tool (SAYO)
• Assessing Afterschool Program Practices Tool (APT)
• Web-Based Data Management System
Supports
• Training (2 days up front, online training ongoing)
• 1-day site visit
• Local coach
Findings from the APAS pilot
• APAS helped programs identify areas for improvement and staff development
• Most sites said they made program changes as a result
• Coaches are key to implementation and useful to sites
• Engagement across staff levels is important
• Engaging funders is important (even with low stakes)
(Based on follow-up phone interviews with sites and coaches)
For more on APAS: www.niost.org/content/view/1654/282/
Youth Program Quality Intervention
Systemic quality improvement systems (QIS) anchored by the YPQA are being developed in:
– Statewide strategies: MI, ME, RI, KY, NM, AR, MN, IA, WA, NY
– Cities and counties: Austin, Chicago, Rochester, Detroit, Grand Rapids, Palm Beach County, Baltimore, Nashville, St. Louis, Louisville, Georgetown Divide/Sacramento, Columbus IN, Indianapolis IN, Tulsa OK
[Map of the United States highlighting the states, cities, and counties listed above.]
YPQI Focus: POS quality in context
Youth PQA Form A measures point-of-service (POS) quality: engagement, interaction, support, and safety.
POS quality sits within two broader contexts: the professional learning community (PLC) and the system accountability environment (SAE).
Youth PQA Form B
• Org policies/practices
• Management values
• Performance feedback
• Continuity/staffing
• Standards and metrics
• Staff development
The Providence AfterSchool Alliance (PASA)
Quality Improvement Strategy
Improvement Efforts
- Learning communities
- Site visits
- Model curricula
- School alignment

Quality Standards
- What exists
- What we know
- What works
- Based on national examples

Capacity Building/Professional Development
- Staffing & Prof. Dev. Survey
- Workshop series tied to RIPQA
- BEST Youth Worker Training
- Standards workshops aligning academics with enrichment

Tracking Tool
- Youthservices.net
- Participation & retention data
- Citywide data management system

Quality Indicators
- Measure of standards
- Promising practices
- Provider/community input

Self-Assessment Tool
- Partnership with High/Scope
- Rhode Island Program Quality Assessment Tool (RIPQA)
- Adopted by 21st CCLC initiative and in use statewide
Incentivizing participation
PASA “endorsed” programs must:
• Maintain certain enrollment and retention
benchmarks
• Have a written curriculum
• Undergo self-assessment using RIPQA annually
In exchange for:
• Streamlined grant application process
• Small administrative funding supplement
Requiring participation
Excerpt from Rhode Island 21st CCLC RFP
“Applicants must participate in the 21st CCLC Rhode
Island Youth Program Quality Assessment Process
(RIPQA), which includes the use of a self-assessment
tool, outside observations, development and
implementation of action plans to strengthen the
program over time, working with a Technical Advisor,
including designation of staff to coordinate the
process.”
Rhode Island 21st CCLC pilot
Assessment & Planning
1. Kick-off, 2-day training on RIPQA
2. Quality Advisor (QA) meets with programs individually to orient
3. Observation visits (3-8 programs per site)
4. QA develops progress report, teams meet with instructors to share reports and develop action plans
5. ED and other key staff complete Form B individually
6. QA summarizes, meets with team to discuss scores and improvement strategies
7. QA generates overall report on strengths and improvement steps
Training & Technical Assistance
• Series of 2-hour workshops focused on RIPQA content
• Additional training on behavior management
• AYD training (32 hours) offered twice annually
• 4-session supervisor training
• 5 hours of on-site coaching per site from QA
RI 21st CCLC pilot – lessons
Lessons Learned
• Programs liked the tool and found the process worthwhile
• Initial data collection model was time consuming
• Timing is important to ensure changes get implemented
• Needs across sites are very similar
• Strong desire for on-site TA/coaching
Adjustments for Cohort 2
• Smaller observation teams, fewer observations per site
• One program report as opposed to individualized reports
• Additional TA/training
• Start with Form B, then observations (Form A)
For more information: www.mypasa.org/pasa-strategies
Palm Beach County QIS Pilot
• Centerpiece of the Prime Time Initiative
• 38 providers in pilot; now working with 90
• January 2006 – fall 2007
• Based on the PBC-PQA
• Financial incentives for programs
Findings from the Palm Beach pilot
• Most programs completed all phases of QIS
• Quality improved
• Quality improvement is a long-term process
• On-site TA is a very important component
• Clarity of purpose is critical
Spielberger & Lockaby, 2008 (www.chapinhall.org)
Coaching
Characteristics:
• Willing to listen
• Experienced
• Accessible
• Flexible
• Responsive
• Creative
• Resourceful

Roles/functions:
• Keep programs engaged
• Deliver training
• Answer questions on tools, process
• Participate in observations
• Generate reports
• Facilitate improvement planning
• Provide on-site feedback, modeling

Key considerations:
• Program vs. system-level coaching, role of intermediaries
• Dosage
Purposes and methods
Lower Stakes
• Methods: Site-based self-assessment teams
• Purposes: Rough data to get staff thinking & discussing program quality in the context of best practice
• Resources: Less time, lower cost
• Audience: Impacts internal audiences

Hybrid Approaches
• Methods: Trained, reliable assessors recruit site-based self-assessment teams to co-produce quality scores
• Purposes: Rough & precise data co-mingled; supports planning & staff development but not appropriate for evaluation or accountability
• Resources: Most expensive, potentially highest learning impact
• Audience: Impacts internal & external audiences

Higher Stakes
• Methods: Trained, reliable assessors not connected to the program
• Purposes: Precise data for internal & external audiences for evaluation, monitoring, accountability, improvement, reporting
• Resources: More time, higher cost
• Audience: Impacts internal & external audiences

Smith, Devaney, Akiva & Sugar, forthcoming in New Directions
Lessons for California
1. Have well-defined purposes for the system.
2. Focus on the point of service.
3. Anchor quality improvement efforts with data about the POS.
4. Create incentives for continuous improvement.
5. Build in on-site, ongoing technical assistance/coaching.
6. Be intentional about pilot participation.
7. Build learning communities.
8. Recognize that management is a key lever.
9. Worry about the quality of your measures and data.
For more information:
Nicole Yohalem, Program Director
Forum for Youth Investment
[email protected]
www.forumfyi.org