Transcript CSPR

Using Data for Program Quality Improvement
Stephanie Lampron, Deputy Director
Session Overview
 The Title I, Part D Data Collection
 Importance of Data Quality and Data Use
 Actively Using Data for Program Improvement
The Title I, Part D Data Collection
What are Title I, Part D and NDTAC?
 Title I, Part D (TIPD) of the Elementary and Secondary Education Act, as amended in 2001
– Subpart 1: State Agency programs
– Subpart 2: LEA programs
 National Evaluation and Technical Assistance Center for the Education of Children and Youth Who Are Neglected, Delinquent, or At-Risk (NDTAC)
NDTAC's Mission Related to Data and Evaluation
 Develop a uniform evaluation model for State Education Agency (SEA) Title I, Part D, programs
 Provide technical assistance (TA) to States in order to increase their capacity for data collection and their ability to use those data to improve educational programming for N & D youth
Background: NDTAC's Role in Reporting and Evaluation
Specific to Title I, Part D, Collections
 TA prior to collection: webinars, guides, and tip sheets
 TA during collection: data reviews, direct calls, and summary reports for ED
 Data analysis and dissemination: GPRA, Annual Report, and online Fast Facts
Related TA
 Data use and program evaluation
TIPD Basic Reporting and Evaluation Requirements
Where do requirements come from?
 Elementary and Secondary Education Act, amended in 2001 (No Child Left Behind)
– Purpose of Title I, Part D (Sec. 1401)
– Program evaluation for Title I, Part D (Sec. 1431, Subpart 3)
How does ED use the data?
 Government Performance and Results Act (GPRA)
 Federal budget requests for Title I, Part D
 Federal monitoring
 Data provided to NDTAC for dissemination
Collection Categories for TIPD in the Consolidated State Performance Report (CSPR)
 Types/number of students and programs funded
 Demographics of students within programs
 Academic and vocational outcomes
 Pre- and post-testing results in reading and math
Title I, Part D in Pennsylvania

                            State Agency (S1)              Local Agency (S2)
                      2008-09   2009-10   2010-11    2008-09   2009-10   2010-11
Number of Programs
  US                      771       720       861      2,712     2,889     2,689
  PA                        7         8        11        295       286       288
Number of Students Served
  US                  125,456   109,146   106,747    373,071   367,121   354,591
  PA                    1,643     1,189     1,123     24,863    24,562    26,510
                         (1%)      (1%)      (1%)       (7%)      (7%)      (7%)
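The parenthetical percentages are Pennsylvania's students as a share of the US total, and they can be reproduced directly from the counts. A quick check in Python, using the 2008-09 values from the table above:

```python
# Reproduce the shares reported for PA (2008-09 counts from the table above):
# PA students served as a percentage of the US total, by subpart.
pa_s1, us_s1 = 1_643, 125_456      # State Agency (Subpart 1)
pa_s2, us_s2 = 24_863, 373_071     # Local Agency (Subpart 2)
print(f"Subpart 1: {pa_s1 / us_s1:.1%}")   # ~1.3%, reported as 1%
print(f"Subpart 2: {pa_s2 / us_s2:.1%}")   # ~6.7%, reported as 7%
```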
Local Education Agency (S2) Academic Outcomes
[Chart: Percentage of students earning high school course credits and earning a GED/diploma, US vs. PA, 2008-09 through 2010-11. 2010-11 data are preliminary.]
Long-term Students' Improvement in Reading (Subpart 2)
[Chart: Percentage of long-term students improving in reading, US vs. PA, 2008-09 through 2010-11. 2010-11 data are preliminary.]
Long-term Students' Improvement in Math (Subpart 2)
[Chart: Percentage of long-term students improving in math, US vs. PA, 2008-09 through 2010-11. 2010-11 data are preliminary.]
Data Quality & Data Use
Functions of Data
 Help us identify whether goals are being met (accountability)
 Tell our departments, delegates, and communities about the value of our programs and the return on their investments (marketing)
 Help us replace hunches and hypotheses with facts concerning the changes that are needed (program management and improvement)
 Help us identify root causes of problems and monitor the success of changes implemented (program management and improvement)
Why Is Data Quality Important?
You need to TRUST your data because they inform:
 Funding decisions
 Technical assistance (TA) needs
 Student/facility programming
What Is “high data quality”?
If data quality is high, the data can be used in the manner intended because they are:
 Accurate
 Consistent
 Unbiased
 Understandable
 Transparent
What data are the most useful?
Useful data are those that can be used to answer critical questions and are…
 Longitudinal
 Actionable (current, user-friendly)
 Contextual (comparable, part of bigger picture)
 Interoperable (matched, linked, shared)
Source: Data Quality Campaign
Should you use data of lower quality?
YES!! You can use these data to…
 Become familiar with the data and readily identify problems
 Know when the data are ready to be used more broadly, or how they can be used
 Incentivize and motivate others
Data Quality Support Systems
 Ensure systems, practices, processes, and/or policies are in place
− Understand the collection process
− Provide/request TA in advance
− Develop relationships
− Develop multilevel verification processes (see the sketch after this list)
− Track problems over time
− Use the data (even when problematic)
− Link decisions (funding, hiring, etc.) to data evidence
 Indicate needs to others
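To make the multilevel verification idea concrete, here is a minimal sketch in Python (pandas). The file layout, column names, and thresholds are illustrative assumptions for this transcript, not the actual CSPR submission format or any required ED edit checks.

```python
# Minimal multilevel-verification sketch (hypothetical columns, illustrative
# checks): flag records that fail basic quality tests before the data are used
# for reporting or program decisions, and keep the log to track problems over time.
import pandas as pd

def verify(df: pd.DataFrame) -> pd.DataFrame:
    """Return one row per failed check so problems can be tracked over time."""
    issues = []

    # Check 1: counts should never be negative.
    for _, row in df[df["students_served"] < 0].iterrows():
        issues.append((row["facility"], row["year"], "negative student count"))

    # Check 2: a subgroup can never exceed the total it belongs to.
    for _, row in df[df["students_tested"] > df["students_served"]].iterrows():
        issues.append((row["facility"], row["year"], "tested count exceeds students served"))

    # Check 3: large year-over-year swings are not errors by themselves,
    # but they are worth a direct call to the subgrantee.
    df = df.sort_values(["facility", "year"])
    change = df.groupby("facility")["students_served"].pct_change()
    for _, row in df[change.abs() > 0.5].iterrows():
        issues.append((row["facility"], row["year"], "students served changed >50% from prior year"))

    return pd.DataFrame(issues, columns=["facility", "year", "issue"])

if __name__ == "__main__":
    sample = pd.DataFrame({
        "facility": ["A", "A", "B", "B"],
        "year": [2009, 2010, 2009, 2010],
        "students_served": [120, 130, 80, 30],
        "students_tested": [100, 135, 60, 25],
    })
    print(verify(sample))
```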
Using Data Actively
Essential Steps Related to Data Use
1. Identify problem or goal to address
2. Explore & analyze existing data
3. Develop and implement change
 Set targets and goals
4. Develop processes to monitor and review data
Step 1: Identify concerns or goals
Identify your level of interest
 State
 Facility / School
 Classroom
Define issues, priorities, or goals
 Upcoming decisions
 State or district goals or initiatives
 Information from needs assessments (or conduct one)
Identify how data will be used and the questions to answer
Resource: NDTAC Program Administration Planning Guide, Tool 3 on Needs Assessments
Program Components by Data Function

Student demographics
– Program accountability: Are the appropriate students being served?
– Program marketing/promotion: How are you addressing the needs of diverse learners?
– Program improvement: Which students need to be better served?

Student achievement
– Program accountability: Are students learning?
– Program marketing/promotion: What are students learning? What gains have they made?
– Program improvement: How can we help improve student achievement?

Student academic outcomes
– Program accountability: Are students continuing their education?
– Program marketing/promotion: What are students doing to continue their education?
– Program improvement: How can we help improve student academic outcomes?
Focusing the Questions
Break the question into inputs and outcomes:
 Inputs (what your program contributes):
− Teacher education, experience, full-time/part-time
− Instructional curriculum
− Hours of instruction per week
 Outcomes (indicators of results):
− Improved posttest scores
− Completed high school
− Earned GED credentials
Focusing/Refining the Question
Weak Question:
 Does my school have good teachers?
Good Question:
 Does student learning differ by teacher?
Better Question:
 Do students in classes taught by instructors who have more teaching experience have higher test scores than those taught by new teachers? (See the sketch below.)
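To show what answering the "better question" could look like, here is a minimal sketch, assuming a hypothetical student-level extract with invented columns (teacher_years_experience, pretest, posttest). It is not an NDTAC tool or a required analysis; a real analysis would also control for students' entering skill levels.

```python
# Sketch: do students taught by more experienced instructors show larger
# pre/post test gains? (Invented data and column names, for illustration only.)
import pandas as pd

students = pd.DataFrame({
    "teacher_years_experience": [1, 2, 9, 12, 0, 7, 15, 3],
    "pretest":  [22, 30, 28, 35, 18, 25, 40, 27],
    "posttest": [26, 33, 38, 46, 20, 34, 52, 30],
})

students["gain"] = students["posttest"] - students["pretest"]

# Group teachers into "new" (0-2 years) and "experienced" (3+ years).
students["experience_group"] = pd.cut(
    students["teacher_years_experience"],
    bins=[-1, 2, 100],
    labels=["new (0-2 yrs)", "experienced (3+ yrs)"],
)

print(students.groupby("experience_group", observed=True)["gain"].agg(["count", "mean"]))
```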
Step 2: Explore Existing Data
 Locate the data you do have
 Put it in a useful format (see the sketch after this list)
− Trends, comparisons
 What story are the data telling you?
− What jumps out at you about the data?
− Are the data telling you something that is timely and actionable?
− What questions arise? What are the data not telling you that you wish you knew?
− What data could help answer those questions?
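A minimal sketch of "put it in a useful format", assuming a hypothetical long-format file of facility outcomes by year (column names are illustrative): pivot the records into a trend table and compare each facility with the average of the group.

```python
# Sketch: reshape facility-by-year outcomes into trend and comparison views.
import pandas as pd

outcomes = pd.DataFrame({
    "facility": ["A", "A", "B", "B", "C", "C"],
    "year": ["2009-10", "2010-11"] * 3,
    "pct_earning_credits": [0.62, 0.70, 0.45, 0.40, 0.25, 0.20],
})

# Trend view: one row per facility, one column per year.
trend = outcomes.pivot(index="facility", columns="year", values="pct_earning_credits")
trend["change"] = trend["2010-11"] - trend["2009-10"]

# Comparison view: distance from the average of the facilities shown.
trend["vs_avg_2010_11"] = trend["2010-11"] - trend["2010-11"].mean()

print(trend.round(2))
```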
Local Education Agency (S2) Academic Outcomes
[Chart, repeated from the earlier slide: percentage of students earning high school course credits and earning a GED/diploma, US vs. PA, 2008-09 through 2010-11.]
LEA 1: Comparison Data (1)
[Chart: Percent of students earning HS course credits, with the state average and LEA average shown for reference: Facility A 70%, Facility B 40%, Facility C 20%, Facility D 33%.]
Comparison Data (2): Context

              Per Pupil     Earning HS        FT        Entering below    % LEP
              Expenditure   Course Credits    teachers  grade level
Facility A    $500          70%               5         65%               25%
Facility B    $450          40%               5         10%               40%
Facility C    $550          20%               5         91%               70%
Facility D    $600          33%               5         50%               30%
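A table like the one above can be assembled by linking separate sources on a shared facility identifier, which is where interoperable (matched, linked, shared) data pay off. The sketch below uses hypothetical outcome, finance, and intake extracts whose values mirror the table; it shows one way to line the pieces up so comparisons carry their context.

```python
# Sketch: merge outcome, finance, and intake extracts on a common facility ID.
import pandas as pd

outcomes = pd.DataFrame({"facility": ["A", "B", "C", "D"],
                         "pct_earning_credits": [0.70, 0.40, 0.20, 0.33]})
finance = pd.DataFrame({"facility": ["A", "B", "C", "D"],
                        "per_pupil_expenditure": [500, 450, 550, 600]})
intake = pd.DataFrame({"facility": ["A", "B", "C", "D"],
                       "pct_below_grade_level": [0.65, 0.10, 0.91, 0.50],
                       "pct_lep": [0.25, 0.40, 0.70, 0.30]})

context = outcomes.merge(finance, on="facility").merge(intake, on="facility")

# Sorting by entering skill level reframes the comparison: Facility C's 20%
# looks different once 91% of its students enter below grade level.
print(context.sort_values("pct_below_grade_level", ascending=False))
```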
Longitudinal data: more context
[Chart: Percent of students earning HS course credits at Facilities A–D across years 1–3.]
Do you know enough?
Sometimes, the data will lead to more questions and a need for more information…
 Compare to other LEAs' facilities
 Use student-level data and disaggregate
 Look at monitoring information and applications
 Collect additional information: surveys, interviews
*Keep data quality in mind
Step 3: Implement improvement plan
 Implement new programming, change, etc.
 Set benchmarks and performance targets (see the sketch after this list)
− In terms of your priorities, where do you want your subgrantees and facilities to be in one year? Two years? Three years?
− What performance benchmarks might you set to measure progress along the way?
− How will you know when to target a subgrantee or facility for technical assistance? At what point might you sound the alarm?
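A minimal sketch of the benchmarking idea in Steps 3 and 4, using illustrative targets and column names (not ED or State requirements): compare each subgrantee's result with the annual target and flag those that may need technical assistance.

```python
# Sketch: flag subgrantees whose results fall well below an annual target.
import pandas as pd

TARGETS = {"2011-12": 0.45, "2012-13": 0.50, "2013-14": 0.55}  # illustrative targets

results = pd.DataFrame({
    "subgrantee": ["LEA 1", "LEA 2", "LEA 3"],
    "year": ["2011-12"] * 3,
    "pct_earning_credits": [0.52, 0.38, 0.44],
})

results["target"] = results["year"].map(TARGETS)
results["gap"] = results["pct_earning_credits"] - results["target"]

# Flag anyone more than 5 points below target as a candidate for TA outreach.
results["flag_for_TA"] = results["gap"] < -0.05

print(results)
```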
Step 4: Develop processes for reviewing data
Keep using it!
 Monitor change and compare against benchmarks
 Review data in real time
 Share it and discuss it
Keep in mind
 Data use is not easy*
 Data should be a flashlight, not a hammer*
 Change takes time; set realistic goals
 “No outcome” can be a useful finding
 Aggregated data can usually be shared
*Source: Data Quality Campaign
Data Capacity Exists!
(Data Quality Campaign, 2011 Report)
10 Essential Elements of Longitudinal Data Systems, with the number of states reporting each element:
 A unique student identifier: 52
 Student-level enrollment, demographic, and program participation information: 52
 The ability to match individual students’ test records from year to year to measure academic growth: 51
 Information on untested students and the reasons why they were not tested: 52
 A teacher identifier system with the ability to match teachers to students: 44
 Student-level transcript data, including information on courses completed and grades earned: 50
 Student-level college readiness test scores: 41
 Student-level graduation and dropout data: 52
 The ability to match student records between the P–12 and postsecondary systems: 49
 A state data audit system assessing data quality, validity, and reliability: 52
Next Step: Data Use
10 State Actions (DQC, 2011), with the number of states reporting each action:
1. Link state K-12 data systems with early learning, postsecondary education, workforce, social services, and other critical agencies: 11
2. Create stable, sustained support for robust state longitudinal data systems: 27
3. Develop governance structures to guide data collection, sharing, and use: 36
4. Build state data repositories that integrate student, staff, financial, and facility data: 2
5. Implement systems to provide all stakeholders with timely access to the information they need while protecting student privacy: 44
6. Create progress reports with individual student data that provide information educators, parents, and students can use to improve student performance: 29
7. Create reports that include longitudinal statistics on school systems and groups of students to guide school-, district-, and state-level improvement efforts: 36
8. Develop a purposeful research agenda and collaborate with universities, researchers, and intermediary groups to explore the data for useful information: 31
9. Implement policies and promote practices, including professional development and credentialing, to ensure that educators know how to access, analyze, and use data appropriately: 3
10. Promote strategies to raise awareness of available data and ensure that all key stakeholders, including state policymakers, know how to access, analyze, and use the information: 23
Accessible Data – N or D Related
Title I, Part D Data
 ED Data Express: www.eddataexpress.ed.gov
 NDTAC State Fast Facts Pages: http://data.neglected-delinquent.org/index.php?id=01
 Title I, Part D, Annual Report: www.neglected-delinquent.org/nd/data/annual_report.asp
Civil Rights Data Collection (district and school)
http://ocrdata.ed.gov/
Accessible Data – N or D Related
OSEP Data Collection
https://www.ideadata.org/default.asp
Youth Risk Behavior Survey (CDC)
http://www.cdc.gov/healthyyouth/yrbs/index.htm
OJJDP Juvenile Justice Surveys / Data Book
http://www.ojjdp.gov/ojstatbb/
Resources
 NDTAC reporting and evaluation resources:
http://www.neglected-delinquent.org/nd/topics/index2.php?id=9
 Data Quality Campaign: www.dataqualitycampaign.org
– Data for Action 2011: Empower With Data
Questions?
Stephanie Lampron
NDTAC Deputy Director
[email protected]
202-403-6822
NDTAC Data Team
 Dory Seidel: [email protected]
 Liann Seiter: [email protected]