ECO Longitudinal - OSEP Leadership Mtng


Comprehensive Assessment in Early Childhood: How Assessments Can Shape Policy and Improve Practice
Kathy Hebbeler
SRI International
Presented at the State Kindergarten Entry Assessment (KEA) Conference
San Antonio, Texas
February 2012
The purpose of the assessment is a critical driver of many other decisions
Why build a statewide measurement system for young children?
• Accountability
• Program Improvement
More specifically…
• Improve transition [to kindergarten]
• Inform instruction in the classroom
• Inform parents of the child’s status
• Screen for delays or disabilities
• Determine the status of the population (social benchmarking)
And even more purposes…
• Make planning decisions at the building, district, or state level
• Examine the effectiveness of the program
  – Growth over one year*
  – Performance years later (longitudinal)
*Need more than an entry assessment
Decisions that follow from purposes
• Who is assessed?
• Who administers the assessment?
• How often to assess?
  – [For KEA: Entry only or Entry plus?]
• What kind of assessment?
• Who sees the results?
• Who will (or is expected to) act on the results?
Results Matter
Colorado and Nebraska
http://www.cde.state.co.us/resultsmatter/rm_system.htm
System Features
• Observation-based assessment completed by teachers
  – Significant investment in professional development
• Assessment administered several times a year to all children in participating programs
• Data entered online
• Different levels of reports for different audiences (child, classroom, building, district, state)
• Plans to link data to K-12
Program Improvement Purposes: Inform instruction
• Data on all children (no sampling)
• Multiple time points (Fall, Winter, Spring)
• Child and classroom level reports
What was learned…
• Ongoing stakeholder involvement has been critical for buy-in
  – Initial decision-making and ongoing feedback
• Teachers have embraced the observation-based assessments
  – Created an assessment culture
  – Results filled an information void
• Up to 3 years for a teacher to become proficient
Program Improvement Purposes: Improve transition
• Receiving teacher has access to previous year’s assessments
Program Improvement Purposes: Make planning decisions at the building, district, or state level
• Online system generates reports at the building, district, or state level
• Administrators at these levels have access to the data
Program Improvement Purposes: Determine program effectiveness
• Multiple points of data to examine growth over the program
• System will link preschool assessment data for program attendees to K-12 data
Accountability Purposes
• Generate state-level analyses for federal reporting (only for children in EI or ECSE)
• Produce information for the state’s annual report
• State has access to the data
• State can analyze the data and produce many different kinds of analyses
Quality Indicators

Purpose
1. State has articulated purpose(s) of COMS.

Data Collection and Transmission
2. Data collection procedures are carried out efficiently and effectively.
3. Providers, supervisors, and others involved in data collection have the required knowledge, skills, and commitment.
4. State's method for entering, transmitting, and storing data is effective and efficient.

Analysis
5. State identifies accountability and program improvement questions related to child outcomes.
6. Local programs identify accountability and program improvement questions related to child outcomes.
7. State agency analyzes data in a timely manner.
8. Local programs analyze data in a timely manner.
9. State agency ensures completeness and accuracy of data.

Reporting
10. State agency interprets, reports, and communicates information related to child outcomes.
11. Local programs interpret, report, and communicate information related to child outcomes.

Using Data
12. State agency makes regular use of information on child outcomes to improve programs.
13. Local programs make regular use of information on child outcomes to improve programs.

Evaluation
14. State evaluates its COMS regularly.

Cross-System Coordination
15. Part C and 619 coordinate child outcomes measurement.
16. Child outcomes measurement is integrated across early childhood (EC) programs statewide.
17. Child outcomes measurement is aligned with state’s early learning guidelines/standards.
18. State has a longitudinal data system to link child outcomes data from EC program participation to K–12 data.

www.the-eco-center.org
Conclusions
• Articulate the purpose(s)
• Design the assessment system to align with the purpose(s)
• Invest in building capacity
  – To collect data
  – To interpret and act on the data
Additional Resource
Kindergarten Assessment Process Planning Report (2008)
http://policyweb.sri.com/cehs/publications/WA_DEL_KRA_ProcessPlanning.pdf
Report on a process to help Washington State plan for their KEA.