
Unified Improvement Planning: Analyzing Data
Version 2.0
Hosted by: Colorado Department of Education
Provided by: Center for Transforming Learning and Teaching
Introductions
Center for Transforming Learning and Teaching:
– Julie Oxenford-O’Brian
– Mary Beth Romke
Colorado Department of Education:
– Lindsey Dulin
– Judy Huddleston
– Christina Larson
– Erin Loften
– Lisa Medler
– Alyssa Pearson
Session Purpose
Ensure school planning teams are prepared to identify notable trends and prioritize performance challenges as part of the unified improvement plan data narrative.
Introductions
Share:
– Name, Job Title, School/District
– Your role in facilitating unified improvement planning
– Your most important outcome for this session
Materials
The materials used during this session were developed in partnership with the Center for Transforming Learning and Teaching, located in the School of Education and Human Development at the University of Colorado Denver.
Norms
The standards of behavior by which we agree to operate while we are engaged in learning together.
Session Outcomes
Engage in hands-on learning activities and dialogue with colleagues. Access additional resources. Complete follow-up activities.
• Explain how unified improvement planning (UIP) will improve student learning and system effectiveness.
• Identify the data analysis process included in UIP and how the results will be captured in the Data Narrative.
• Determine what data reports/views will be used.
• Interpret required performance metrics.
• Review current school or district performance.
• Describe notable trends (over at least 3 years).
• Determine which performance challenges will focus school/district improvement activity for the coming year.
• Apply the UIP Quality Criteria to evaluate trend statements and priority performance challenges.
• Document the process used to identify trends and prioritize performance challenges for the Data Narrative.
• Develop a plan for completing data analysis for the school or district UIP.
Agenda
• UIP & Data Narrative Overview
• Interpret Performance Metrics
• Review Current Performance
• Identify Notable Trends
• Prioritize Performance Challenges
• Plan Data Analysis
Purposes of Unified Improvement Planning
• Provide a framework for performance management.
• Support school and district use of performance data to
improve system effectiveness and student learning.
• Shift from planning as an event to continuous
improvement.
• Meet state and federal accountability requirements.
• Give external stakeholders a way to learn about how
schools and districts are making improvements.
How will engaging in unified improvement planning result in improvements in performance?
Theory of Action: Continuous Improvement
FOCUS
Monitor progress at least quarterly
Performance Indicators
• Achievement: percent proficient and advanced
– Reading (TCAP, Lectura, and CoAlt)
– Writing (TCAP, Escritura, and CoAlt)
– Math (TCAP and CoAlt)
– Science (TCAP and CoAlt)
• Growth: normative and criterion-referenced
– Median Student Growth Percentiles in Reading, Writing, Mathematics, and English Language Proficiency
– Median Adequate Growth Percentiles
• Growth Gaps: Median Student Growth Percentiles and Median Adequate Growth Percentiles for disaggregated groups:
– Poverty
– Race/Ethnicity
– Disabilities
– English Learners
– Below proficient
• Postsecondary and Workforce Readiness
– Colorado ACT
– Graduation Rate
– Disaggregated Graduation Rates
– Dropout Rate
Planning Terminology
Consider the Unified Improvement Planning Terminology (in the Unified Improvement Planning Handbook, Appendix A).
Work in a triad to answer the following questions:
1. What is the relationship between performance indicators, measures, metrics, expectations and targets?
2. What is the difference between a measure and a metric?
Unified Improvement Planning Processes
• Preparing to Plan: Gather and Organize Data
• Section III: Data Narrative (Today): Review Performance Summary, Describe Notable Trends, Prioritize Performance Challenges, Identify Root Causes
• Section IV: Target Setting: Set Performance Targets, Identify Interim Measures
• Section IV: Action Planning: Identify Major Improvement Strategies, Identify Implementation Benchmarks
• Ongoing: Progress Monitoring
Colorado Unified Planning Template
Major Sections:
I. Summary Information about the School or District
II. Improvement Plan Information
III. Narrative on Data Analysis and Root Cause Identification
IV. Action Plan(s)
Section I: Summary Information about the School/District
– Student Performance Measures for State and Federal Accountability
– Accountability Status and Requirements for Improvement Plan
Section II: Improvement Plan Information
– Additional Information about the School/District
Section III: Data Narrative
– Description of School/District and Process for Data Analysis
– Review Current Performance
– Trend Analysis
– Priority Performance Challenges
– Root Causes
– Progress Monitoring of Prior Year’s Targets
– Data Worksheet: Notable Trends, Priority Performance Challenges, Root Causes
Section IV: Action Plan(s)
– School Target Setting Form: Priority Performance Challenges, Targets, Interim Measures, Major Improvement Strategies
– Action Planning Form: Major Improvement Strategies, Research Supporting, Associated Root Causes, Action Steps, Timeline, Key People, Resources, Implementation Benchmarks, Progress
Updates to UIP Data Analysis
• Clarification regarding the role of the Data
Narrative
• Two additional metrics on the SPF/DPF
and UIP Template
• Removal of AYP and Educator
Qualification from UIP Template
• Additional reports required for UIP
Planning and Accountability Timeline
• When should local teams engage in developing or revising unified improvement plans?
• Review the Planning Timeline (UIP Handbook, p. 38) and the Sample Planning Calendar for Developing/Revising the UIP (Toolkit, p. 5).
• Consider:
– How do these calendars compare to the timeline in which your schools engaged in planning for the 2011-12 school year?
– Will you submit your UIP for one of the early posting dates?
The Role of the Data Narrative
• Turn to: Narrative on Data Analysis and
Root Cause Identification (UIP Handbook,
p. 11)
• Work with a partner to explain:
– What is the role of the Data Narrative?
– Why were two additional worksheets included
in this section of the UIP template?
Capturing Notes Today
• Capture notes for the UIP Data Narrative
in the Data Narrative Outline.
• Plan for completing the Data Narrative
using the Planning Data Analysis note
catcher.
• Bookmark the Data Narrative Outline
(Toolkit, p. 11) and the Planning Data
Analysis (Toolkit, p. 79).
Agenda
• UIP Processes Overview
• Interpret Performance Metrics
• Review Current Performance
• Identify Notable Trends
• Prioritize Performance Challenges
• Plan Data Analysis
Data are like ___________ because ______________.
Accountability Measures and Metrics
Consider the table of performance indicators,
measures, metrics and expectations (UIP
Handbook, p. 8-11).
• What measures are required?
• What metrics are required?
• What are minimum state and federal expectations
for each metric?
Metrics included in the SPF
• Take out your SPF/DPF and turn to the detailed reporting by performance indicator (p. 2).
• Identify which metrics are included for each performance indicator:
– Academic Achievement
– Academic Growth
– Academic Growth Gaps
– Postsecondary and Workforce Readiness (secondary only)
Indicators and Metrics
Indicate your current level of comfort explaining each of the following metrics to a colleague (on a scale of 1 to 5).
• Academic Achievement: % Proficient/Advanced; School’s Percentile
• Academic Growth: Median Growth Percentile; Median Adequate Growth Percentile
• Academic Growth Gaps: Subgroup Median Growth Percentile; Subgroup Median Adequate Growth Percentile
• Postsecondary and Workforce Readiness: Graduation Rate; Disaggregated Graduation Rate; Dropout Rate; Colorado ACT Composite Score
Reviewing SPF and Required UIP Metrics
• Growth:
– Median Growth Percentiles
– Median Adequate Growth Percentiles (catch-up and keep-up growth)
– Growth Gaps
– Growth in English Language Proficiency (CELApro growth)
• Disaggregated Graduation Rates
Percentage vs. Percentile
Percentiles:
• Range from 1 to 99.
• Indicate the standing of a student’s score relative to the norm group (i.e., how a particular student compares with all others who took the same test).
Growth Percentiles:
• Range from 1 to 99.
• Indicate the standing of a student’s progress relative to his or her academic peers, students with a similar score history (i.e., how the student’s recent change in scores compares to the change in scores of others who started at the same level).
[Figure: example 4th grade scale scores and the corresponding student growth percentiles for students who had low (295), medium (540), and high (671) 3rd grade scores.]
Student Growth Percentiles
• Require 2 consecutive years of state assessment
results.
• Calculated for individual students (reading, writing, math,
English proficiency).
• Compare individual student’s change in performance to
that of his/her academic peers (statewide).
• Are based on all of the sequential years for which prior
state assessment results are available.
• Provide a normative basis for asking about how much
growth a student could make.
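To make the normative idea concrete, here is a toy Python sketch: it simply ranks one student’s current score against the current scores of academic peers who started at roughly the same place. The operational Colorado Growth Model uses a more sophisticated quantile-regression approach across all available prior years, so treat this only as an illustration; all scores below are hypothetical.

```python
from bisect import bisect_left

def toy_growth_percentile(current_score, peer_current_scores):
    """Rank one student's current score against the current scores of
    academic peers (students with a similar prior score history).
    Returned value is clamped to the 1-99 range."""
    peers = sorted(peer_current_scores)
    below = bisect_left(peers, current_score)          # peers scoring lower
    percentile = round(100 * below / (len(peers) + 1))
    return min(max(percentile, 1), 99)

# Hypothetical academic peers who all had roughly the same prior-year score:
peers = [520, 535, 541, 548, 552, 560, 566, 571, 580, 592]
print(toy_growth_percentile(563, peers))   # about the 55th percentile
```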
[Figure: scale scores and student growth percentiles for individual students at two example schools. Mountain School students have growth percentiles of 11, 19, 24, 31, 35, 50, 58, and 64; Valley School students have growth percentiles of 39, 52, 61, 82, 86, 95, and 99.]
Median Growth Percentile
[Figure: the same two schools with student growth percentiles sorted. The median growth percentile is 33 for Mountain School and 82 for Valley School.]
Median Growth Percentile
• Aggregate measure of the growth of a
group of students:
– District/ School
– Grade-Level
– Disaggregated Group (ELL, IEP, FRL, Minority)
• Middle (median) growth percentile for the
students in the group.
• “Typical” student growth for the group.
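For example, a minimal sketch of the median calculation, using the student growth percentiles from the Mountain School and Valley School figures above (the function is illustrative, not CDE’s reporting code):

```python
import statistics

def median_growth_percentile(student_growth_percentiles):
    """Middle (median) student growth percentile for a group:
    a school, district, grade level, or disaggregated group."""
    return statistics.median(student_growth_percentiles)

mountain = [11, 19, 24, 31, 35, 50, 58, 64]   # growth percentiles from the example
valley = [39, 52, 61, 82, 86, 95, 99]

print(median_growth_percentile(mountain))   # 33.0 -> "typical" growth at Mountain School
print(median_growth_percentile(valley))     # 82   -> "typical" growth at Valley School
```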
Adequate Growth (CSAP/TCAP)
• What is adequate growth?
• Based on catch-up and keep-up growth.
• So... a quick refresher on catch-up and keep-up growth.
• See Adequate Growth Basics (Toolkit, p. 19).
Catch-Up Growth
To be eligible to make catch-up growth:
• The student scores below proficient
(unsatisfactory or partially proficient) in the
previous year.
To make catch-up growth:
• The student demonstrates growth adequate
to reach proficient performance within the
next three years or by 10th grade, whichever
comes first.
Calculating Catch-Up Growth
[Chart sequence, grades 6-10, years 2011-2015: a student scores below proficient in 6th grade in 2011. Growth at the 95th percentile each year would reach proficient by 7th grade (2012), at the 85th percentile by 8th grade (2013), at the 80th percentile by 9th grade (2014), and at the 76th percentile by 10th grade (2015). 76 is the minimum, so it is this student’s adequate growth percentile.]
Adequate Growth Percentile for
Catch Up
• For students eligible to make catch-up growth
(those who scored unsatisfactory or partially
proficient in the previous year).
• Adequate Growth Percentile = the minimum
growth percentile he/she would have needed
to make catch-up growth.
Calculating Catch-Up Growth
[Chart: the trajectory corresponding to the 76th percentile adequate growth path.]
Calculating Catch-Up Growth
[Chart: the student’s actual growth is at the 55th percentile each year. 55th percentile growth will not be enough for this student to catch up; she did not make catch-up growth.]
Keep-Up Growth
To be eligible to make keep-up growth:
• The student scores at the proficient or
advanced level in the previous year.
To make keep-up growth:
• The student demonstrates growth adequate
to maintain proficiency for the next three
years or until 10th grade, whichever comes
first.
Calculating Keep-Up Growth
[Chart sequence, grades 6-10, years 2011-2015: a student scores proficient in 6th grade in 2011. Growth at the 12th percentile would maintain proficiency through 7th grade (2012), at the 25th percentile through 8th grade (2013), at the 38th percentile through 9th grade (2014), and at the 50th percentile through 10th grade (2015). 50 is the maximum, so it is this student’s adequate growth percentile.]
Adequate Growth for Keep-Up
• For students eligible to make keep-up growth (those who scored proficient or advanced in the previous year).
• Adequate Growth Percentile = the maximum of the growth percentiles the student would need in each of the next three years (or until 10th grade) to keep scoring at least proficient.
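A minimal sketch of how the adequate growth percentile is selected, using the needed growth percentiles from the two chart sequences above (95, 85, 80, 76 for the catch-up example; 12, 25, 38, 50 for the keep-up example):

```python
def adequate_growth_percentile(needed_growth_by_year, eligible_for):
    """needed_growth_by_year: the growth percentile the student would need in
    order to reach (catch-up) or maintain (keep-up) proficient by each future year.
    Catch-up AGP = the minimum of those values; keep-up AGP = the maximum."""
    if eligible_for == "catch-up":
        return min(needed_growth_by_year)
    if eligible_for == "keep-up":
        return max(needed_growth_by_year)
    raise ValueError("eligible_for must be 'catch-up' or 'keep-up'")

# Values from the two chart examples above:
print(adequate_growth_percentile([95, 85, 80, 76], "catch-up"))   # 76
print(adequate_growth_percentile([12, 25, 38, 50], "keep-up"))    # 50
```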
Calculating Keep-Up Growth
[Chart: the trajectory corresponding to the 50th percentile adequate growth path.]
Calculating Keep-Up Growth
[Chart: the student’s actual growth is at the 79th percentile each year. 79th percentile growth will be enough for this student to keep up; he made keep-up growth.]
Calculating Median Adequate Growth Percentiles for CSAP/TCAP
• Start with the adequate growth percentiles for all catch-up and keep-up students (for example: 45, 78, 99, 32, 11, 91, 55, 67, 43, 10, 77).
• Sort the AGPs and search for the middle value.
• The median adequate growth percentile for this school is 55.
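A short sketch of that calculation using the eleven adequate growth percentiles from the example above:

```python
import statistics

# Adequate growth percentiles for all catch-up and keep-up students (from the example):
agps = [45, 78, 99, 32, 11, 91, 55, 67, 43, 10, 77]

print(sorted(agps))              # search for the middle value...
print(statistics.median(agps))   # 55: the median adequate growth percentile for this school
```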
Move-Up Growth
To be eligible to make move-up growth:
• The student scores at the proficient level in the previous year.
To make move-up growth:
• The student demonstrates enough growth to move up to advanced within the next three years or by 10th grade, whichever comes first.
Catch-up ● Keep-up ● Move-up
• Check your understanding. . .
– Which students could make catch-up growth?
– Which students could make keep-up growth?
– Which students could make move-up growth?
• Draw a Venn diagram to show if/how these
groups overlap.
Catch-up ● Keep-up ● Move-up
[Venn diagram: Eligible to make Catch-Up Growth; Eligible to make Keep-Up Growth; Eligible to make Move-Up Growth.]
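One way to check the Venn diagram is to notice that eligibility depends only on the prior year’s performance level. The sketch below applies the eligibility definitions from the preceding slides; the performance-level labels are written out as plain strings for illustration.

```python
def growth_eligibility(prior_year_level):
    """Growth categories a student is eligible for, based only on the
    prior year's performance level (per the definitions above)."""
    eligible = []
    if prior_year_level in ("unsatisfactory", "partially proficient"):
        eligible.append("catch-up")        # scored below proficient last year
    if prior_year_level in ("proficient", "advanced"):
        eligible.append("keep-up")         # scored proficient or advanced last year
    if prior_year_level == "proficient":
        eligible.append("move-up")         # scored proficient (not yet advanced)
    return eligible

print(growth_eligibility("partially proficient"))   # ['catch-up']
print(growth_eligibility("proficient"))             # ['keep-up', 'move-up']
print(growth_eligibility("advanced"))               # ['keep-up']
```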
Percent Making Catch-Up Growth
• Denominator: The number of students who scored below
proficient (unsatisfactory or partially proficient) in the
previous year (i.e. students eligible for catch-up growth).
• Numerator: The number of students who made catch-up
growth (i.e. demonstrated enough growth to reach proficient
performance within the next three years or by 10th grade,
whichever comes first).
• Performance is improving if:
– The denominator is getting smaller (approaching zero)
– The numerator is increasing
– The percent is increasing (approaching 100)
Percent Making Keep-Up Growth
• Denominator: The number of students who scored proficient
or advanced in the previous year (i.e. students eligible to
make keep-up growth).
• Numerator: The number of students who made keep-up
growth (i.e. demonstrated enough growth to maintain
proficiency for the next three years or until 10th grade,
whichever comes first).
• Performance is improving if:
– The numerator is increasing
– The percent is increasing (approaching 100)
Percent Making Move-Up Growth
• Denominator: The number of students who scored proficient
in the previous year (i.e. students eligible to make move-up
growth).
• Numerator: The number of students who made move-up
growth (i.e. demonstrated enough growth to move up to
advanced within the next three years or by 10th grade,
whichever comes first).
• Performance is improving if:
– The numerator is increasing.
– The percent is increasing (approaching 100)
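All three percentages follow the same arithmetic with different numerators and denominators. A brief sketch with hypothetical student counts (note that the three denominators are different groups of students, so the three percentages need not sum to 100):

```python
def percent_making_growth(num_making_growth, num_eligible):
    """Percent of eligible students who made the given type of growth."""
    return round(100 * num_making_growth / num_eligible, 1)

# Hypothetical counts for one school:
print(percent_making_growth(18, 60))    # catch-up: 60 students scored below proficient last year
print(percent_making_growth(95, 140))   # keep-up: 140 scored proficient or advanced last year
print(percent_making_growth(20, 90))    # move-up: 90 scored proficient last year
```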
Catch-up ● Keep-up ● Move-up
Does the sum of these percentages add up to
100?
• The percent of students making catch-up growth
• The percent of students making keep-up growth
• The percent of students making move-up growth
Catch-Up in Different Contexts
• School or District Growth Summary Reports:
– The percent of students in the school/district making catch-up growth
– Number of students making catch-up growth / number of students eligible to make catch-up growth
• SPF or DPF:
– For students eligible to make catch-up growth
– Median Growth Percentile
– Median Adequate Growth Percentile
Comparing SGP & CUKUMU
• Student Growth Percentiles
– Normative
– Compare student progress to that of their
academic peers
• Adequate growth/Catch-up, Keep-up, Move-up
– Growth to standard
– Compare student growth to how much growth
they need to reach or stay proficient
Academic Growth Gaps
• Consider the definition of “Academic Growth
Gaps” in the Planning Terminology (UIP
Handbook p. 28)
• Talk with a partner:
– Is this definition consistent with the interpretation
of “growth gaps” used in your district?
– If not, how is it different?
– How could trends in growth gaps be described
using this definition? What data are needed?
Adequate Growth Percentiles Over Time
• Used in conjunction with median growth
percentiles to describe growth gap trends.
• Accessed through:
– www.schoolview.org, data lab (see, Accessing
Median Adequate Growth Percentiles over Time)
– SPF reports over time
• How will you access adequate growth
percentiles over time for disaggregated
groups? [Planning Data Analysis note catcher]
New Measures and Metrics
• Indicator: Student Academic Growth
– Sub-Indicator: English Language Proficiency
– Measure: CELApro
– Metrics: Median Student Growth Percentile, Median Adequate
Growth Percentile (calculated differently)
• Indicator: Postsecondary and Workforce Readiness
– Sub-Indicator: Graduation Rate
– Measure/Metrics: Disaggregated 4-, 5-, 6-, and 7-year graduation rates
– Disaggregated groups: Minority, FRL, ELL, IEP
Measuring Growth of English Language
Development
• Uses CELApro as the measure (instead of TCAP/CSAP)
• Applies the Colorado Growth Model methodology to CELApro
results
• Reported only for schools/districts with 20 or more ELLs
• Measures how much normative growth a student has made
towards attaining English proficiency (MGP)
• Measures how much growth would be adequate to attain the
desired level of English language proficiency within a given
timeframe (AGP)
Disaggregated Graduation Rates
Consider the definition of Graduation Rate in the
Planning Terminology (UIP Handbook, Appendix A, p.
28) and the SPF Scoring Guides and Reference Data
(Toolkit, p. 27)
– How are 4-, 5-, 6-, and 7-year graduation rates calculated?
– Which disaggregated groups are included in the
SPF/DPF disaggregated graduation rates?
– What disaggregated graduation rate meets
expectations?
Disaggregated Graduation Rates
• 4-year rate numerator: number of students graduating in 4 years + number of students from the base year who graduated early.
• 5-year rate numerator: the 4-year numerator + number of students graduating in 5 years.
• 6-year rate numerator: the 5-year numerator + number of students graduating in 6 years.
• 7-year rate numerator: the 6-year numerator + number of students graduating in 7 years.
• Denominator for every rate: number of students in 9th grade in the base year + transfers in - transfers out.
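A sketch of that arithmetic with hypothetical student counts; it follows the layout above (early graduates from the base year are added to the numerator, and the adjusted base-year cohort is the denominator for every rate) and does not model any further adjustments CDE may apply:

```python
def graduation_rates(base_9th_graders, transfers_in, transfers_out,
                     early_grads, grads_in_4_5_6_7_years):
    """Extended-year graduation rates: each rate adds the next year's
    graduates to the numerator while the adjusted base-year cohort
    (the denominator) stays the same."""
    denominator = base_9th_graders + transfers_in - transfers_out
    rates = {}
    numerator = early_grads
    for years, grads in zip((4, 5, 6, 7), grads_in_4_5_6_7_years):
        numerator += grads
        rates[f"{years}-year"] = round(100 * numerator / denominator, 1)
    return rates

# Hypothetical cohort counts:
print(graduation_rates(base_9th_graders=120, transfers_in=15, transfers_out=10,
                       early_grads=2, grads_in_4_5_6_7_years=(80, 10, 5, 2)))
# {'4-year': 65.6, '5-year': 73.6, '6-year': 77.6, '7-year': 79.2}
```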
Disaggregated Achievement Data
Accessing Disaggregated Achievement Data
• Most districts already use this data and
access it through a local data tool.
• Also available through:
www.schoolview.org, Data Center
• Job-aide: Accessing Disaggregated
Achievement Data (UIP Data Analysis
Toolkit, p.)
Small N?
• What if summary reports have little or no data?
• CDE does not report data for small N to protect
student privacy.
• Options?
– Student-Level Data
– Summary statistics for smaller N
• Accessed through
– District data reporting tool
– Downloading student-level records from CEDAR
– The Colorado Growth Model web-based application
(student-level)
Accessing Data Reports/Views
• Turn to the Planning for Data Analysis
note catcher.
• Make notes about how you will access
required state metrics to finalize your data
analysis.
• Include CELApro Growth if appropriate.
• Make notes about how you will access
local performance data.
Agenda
• UIP Processes Overview
• Interpret Performance Metrics
• Review Current Performance
• Identify Notable Trends
• Prioritize Performance Challenges
• Plan Data Analysis
Reviewing Current Performance
1. Use the SPF to identify and describe:
– School or district accountability status
– Indicators (and sub-indicators) where performance did not at least meet state/federal expectations
– Magnitude of the over-all school/district performance challenge
2. Describe how current performance compares to the prior year’s plan (using the Progress Monitoring of Prior Year’s Performance Targets Worksheet).
Review SPF Report
Capture your answers to the following questions in the Data
Narrative Outline:
1. What was the school’s plan type assignment?
2. In which indicator areas did school performance not
at least meet state and federal expectations?
3. In which sub-indicators did school performance not at
least meet state and federal expectations?
4. In which indicators and sub-indicators did school
performance not at least meet local expectations?
Magnitude . . .
• From the UIP Quality Criteria:
Schools/districts must identify priority
performance challenges and root causes that
reflect the magnitude of the overall
performance challenge.
• What does this mean?
Identifying the magnitude of the
performance challenge
Do the school’s performance challenges include:
• 80% or more of the students or closer to 15% of
the students?
• All students or only some disaggregated groups
of students? Which ones?
• All content areas? One or two content areas?
Which ones?
Determining Magnitude
• Using the Identifying the Magnitude of the Performance Challenge Worksheet, answer each question in the 3rd column in reference to your school (or a school in your district).
• Describe the magnitude of your performance challenge in your Data Narrative Outline (Toolkit, p. 12).
Describing Performance in Relationship
to Prior Year’s Targets
Consider: Progress Monitoring of Prior Year’s Performance
Targets Worksheet.
Use your UIP from 2011-12 (School Target Setting Form)
and your 2012 SPF to answer the following questions:
• Which annual targets from 2011-12 were met? Which
were not met?
• For targets that were met: Is this worth celebration?
Were the target(s) rigorous enough?
• For targets that were not met: Should this continue to be
a priority for the current year?
Reflecting on Prior Year’s
Targets
• Brainstorm answers to the following questions:
Why were the school’s performance targets
met?
Why were the school’s performance targets
not met?
• Select one or two explanations to share.
• Capture your “best thinking” on your Data
Narrative Outline
Data Analysis Planning
• Turn to the Planning for Data Analysis
note catcher.
• Make notes about how you will complete
the following:
– Review Current Performance (Toolkit, p. 80)
– Progress Monitoring of Prior Year’s Targets
(Toolkit, p. 81)
Agenda
• UIP Processes Overview
• Interpret Metrics
• Review Current Performance
• Identify Notable Trends
• Prioritize Performance Challenges
• Plan Data Analysis
Collaborative Inquiry for Data Analysis
• Choose a partner. Take out: Guiding Assumptions for Collaborative Inquiry (Toolkit, p. 35).
• Individually, read one row in the chart.
• When each partner has completed a row, look up and “say something.” Something might be a question, a brief summary, a key point, an interesting idea, or a personal connection to the text.
• Continue until you complete all of the rows in the table.
What are notable trends?
• Review Step Two: Identify Notable Trends (UIP Handbook, p. 13-15).
• Discuss:
– What are the most critical things to remember about performance trends?
– How can we determine if a trend is notable?
– What are some examples of “notable” performance trends?
Trends
• Include all performance indicator areas.
• Include at least three years of data.
• Consider data beyond that included in the school
performance framework (grade-level data, K-2).
• Consider local performance data.
• Include positive and negative performance patterns.
• Identify where the school did not at least meet state and
federal expectations.
• Include information about what makes the trend notable.
Inventory Local Performance Data
• Consider the following tool: Inventory of Performance Data Sources (Toolkit, p. 17)
• Components (see Legend):
– Content Area
– Assessment
– Grade Levels
– Which Students
– Content Focus
– Metrics
– Questions
• Determine how you will complete the inventory of locally available performance data. Capture notes in the Planning Data Analysis note catcher (Toolkit, p. 79).
Trend Statements Include
• Measure/Metric
• Content Area
• Which students (grade-levels, disaggregated groups)
• Direction (stable, increasing, decreasing)
• Amount (percentages, percentiles, rates, scores)
• Time period (years)
• What makes the trend notable
How to Describe Notable Trends
1. Determine what metrics will be considered and what questions will guide analysis.
2. Make predictions about performance.
3. Interact with data (at least 3 years).
4. Look for things that pop out, with a focus on patterns over time (at least three years).
5. List positive and negative facts about the data (with a focus on patterns over time, or trends).
6. Identify which trends are notable (narrow) and which require additional analysis.
7. Write notable trend statements.
Levels of Performance Data
• School/System
• Grade-Level
• Program (Tier I)
• Disaggregated group
• Standard/Sub-content Area
• Program (Tier II/Tier III)
• Classroom
• Student work
• Individual
Levels and Performance Metrics
Level: Performance Metric (examples)
• Aggregate school or district level: % and number scoring at each performance level, MGP, & AGP (overall and by grade-level)
• Standard/strand: number and % meeting standard
• Disaggregated group: % and number (within group) scoring at each performance level, MGP, AGP (overall and by grade-level)
• Classroom (formal)/Individual: scale score, individual performance rating, student growth percentile
Questions
Different metrics make it possible to answer different questions. For example:
• Could you determine which students were likely to be proficient within the next three years if the metric you are considering is the % of students who scored proficient or better this year?
Organizing Data for Continuous
Improvement
• Consider Organizing Data for Continuous
Improvement (Toolkit, p. 41)
• Components:
– Path through the data
– Measures and metrics
– Critical questions for each metric
– Associated data reports (or views)
A path through the data. . .
Select one content area on which to focus. Look for and describe positive and negative trends in:
• Performance (achievement/growth) by grade level for 3+ years
• Performance by disaggregated groups by grade level for 3+ years (disaggregate groups further; look across groups)
• Within grade levels, achievement by standard/sub-content area
• Cross-content area performance (3+ years)
• Post-Secondary and Workforce Readiness metrics (3+ years)
Performance Metrics
• Academic Achievement (overall and by grade-level)
– % proficient or better
– % and number scoring at each performance level (unsatisfactory, partially proficient, proficient, and advanced)
• Academic Growth (overall and by grade-level)
– Median Student Growth Percentiles
– Median Adequate Growth Percentiles
– % catch-up
– % keep-up
– % move-up
Metrics for Achievement at the Standard/Sub-Content Area Level
• TCAP Achievement by standard or sub-content area by grade-level
– % proficient and above
Disaggregated Group Metrics
• Disaggregated Groups:
– Minority (combines: Asian, Black, Hispanic, Native American)
– Free/Reduced
– ELL
– IEP
– Below Proficient
• Academic Achievement Metrics (% P/A, % and N for each achievement level)
• Academic Growth Metrics (MGP, AGP, % catch-up, keep-up, move-up)
Disaggregating Disaggregated Groups
• Minority (Asian, Black, Hispanic, Native American)
• ELL (FEP, LEP, NEP, monitoring status)
• IEP (limited intellectual capacity, emotional disability, specific learning disability, hearing disability, visual disability, physical disability, speech/language disability, deaf-blind, multiple disabilities, infant disability, autism, traumatic brain injury)
Post-Secondary and Workforce Readiness Metrics
• Graduation Rate
• Disaggregated Graduation Rates
• Drop-out Rate
• Average Colorado ACT Composite Score
Identifying questions to guide analysis
• Use Organizing Data for Continuous Improvement and
Data Analysis Questions.
• Consider the magnitude of the performance challenge and
make-up of the student population to determine which
disaggregated data will be considered.
• Determine which local performance data will be used.
• Capture the questions that will guide the analysis for each
metric on the Data Analysis Questions chart.
Some Questions for Academic
Achievement
Over-All Aggregated and by Grade level
Achievement
• What are trends in % proficient and advanced
over the last 3-5 years?
• What are the trends in % proficient and
advanced by grade level for the last 3-5 years?
• How do our trends compare to the state trends
for the same time period?
Some Questions for Academic Growth
Overall and Grade-Level Growth
• What has been the school-level trend in median growth percentiles
over the last 3-5 years?
• What has been the trend in median growth percentiles by grade level
for the last 3-5 years?
• How do the MGPs for the last 3-5 years compare to minimum state
expectations?
• What has been the trend in % of students making catch-up growth
overall and by grade level?
• What has been the trend in % of students making keep-up growth
overall and by grade level?
• How do the school’s trends in CUKU compare to the state?
Some Questions for Disaggregated Group Performance
• What have been the trends in % proficient and advanced for each disaggregated group present at our school over the last 3 years?
• What have been the trends in median growth percentiles for each disaggregated group present at our school over the last 3 years?
• How does the MGP compare to the median AGP for each disaggregated group at our school for the last 3 years?
Focus and Reports
• In what content area will you focus your initial
analysis?
• Organize your data reports for that content area,
including:
– TCAP/CSAP Summary by grade level (at least 3
years)
– Growth Summary by grade level
– Achievement and Growth by disaggregated groups
– Achievement at the standard and sub-content area
level
How to Describe Performance Trends
1. Determine what metrics will be considered and identify questions to guide analysis.
2. Make predictions about performance.
3. Interact with data (at least 3 years).
4. Look for things that pop out, with a focus on patterns over time (at least three years).
5. List positive and negative facts about the data (observations).
6. Identify which trends are notable (narrow) and which require additional analysis.
7. Write trend statements.
Why Predict?
• Access prior learning
• Name the frames of reference through
which we view the world
• Make the assumptions underlying our
predictions explicit, trying to understand
where they came from
• Activate our engagement with the data
Preparing to Predict
1. Select a recorder for your table.
2. On a piece of flip-chart paper, create a T-chart.
3. Put “predictions” on one side and “assumptions” on the other side of the T-chart.
4. The recorder will capture predictions on the left side of this chart.
Questions Guide Predictions
• Take out your Data Analysis Questions chart.
• Use your questions to make predictions about what you will see in your data.
• Capture predictions and assumptions on the T-chart.
• Post predictions and assumptions on your data wall.
How to Describe Performance Trends
1. Determine what metrics will be considered and identify questions to guide analysis.
2. Make predictions about performance.
3. Interact with data (at least 3 years).
4. Look for things that pop out, with a focus on patterns over time (at least three years).
5. List positive and negative facts about the data (observations).
6. Identify which trends are notable (narrow) and which require additional analysis.
7. Write trend statements.
Analyzing Data
• Be patient and hang out in uncertainty
• Don’t try to explain the data
• Observe what the data actually shows
• No “because” statements yet
Interacting with data
• Consider strategies for interacting with data:
– Highlight (color code) based on a legend.
– Do origami – fold the paper so you can compare
columns.
– Create graphic representations.
• Agree on an approach
– How will you interact with your data?
– Plan to include a visual representation (consider the
Interacting with Data Job Aide, Toolkit, p. 65)
Capture your Observations
1. Consider the questions to guide your analysis.
2. Identify things that “pop out.” Note patterns over time (3-5 years).
3. Include both strengths and challenges.
4. Capture observations about your data on a flip chart.
How to Describe Performance Trends
1. Determine what metrics will be considered and identify questions to guide analysis.
2. Make predictions about performance.
3. Interact with data (at least 3 years).
4. Look for things that pop out, with a focus on patterns over time (at least three years).
5. List positive and negative facts about the data (observations).
6. Identify which trends are notable (narrow) and which require additional analysis.
7. Write trend statements.
What makes a trend notable?
• Consider the UIP Handbook, What makes a trend
notable? (p. 14)
• With a partner, discuss: to what could we compare our performance trends?
– How did our performance compare to a specific
expectation (criterion)?
– How did our performance compare to others (groups
of students within the school, district, state)?
• Use CSAP/TCAP Historical Trends (Toolkit, p. 69) as
reference for trends in % proficient and advanced.
Trend Statement Example
• Measure/Metric: Percent of students proficient or advanced on TCAP/CSAP
• Content Area: Math
• Which students (grade-levels, disaggregated groups): 4th grade (all students in school)
• Direction: Declined
• Amount: 70% to 55% to 48%
• Time period: 2009 to 2011
• What makes the trend notable? This was well below the minimum state expectation of 71%.
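One optional way to make sure a trend statement contains every component is to fill in a simple template. The sketch below is illustrative only; it reproduces the example from the table above.

```python
TREND_TEMPLATE = (
    "The {metric} in {content_area} for {students} {direction} "
    "from {amount} between {time_period}. {notable}"
)

example = dict(
    metric="percent of students proficient or advanced on TCAP/CSAP",
    content_area="math",
    students="4th grade (all students in the school)",
    direction="declined",
    amount="70% to 55% to 48%",
    time_period="2009 and 2011",
    notable="This was well below the minimum state expectation of 71%.",
)

print(TREND_TEMPLATE.format(**example))
```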
Examples of Notable Trends
• The median growth percentile of English Language Learners in writing increased from 28 to 35 to 45 between 2009 and 2011, meeting the minimum expectation of 45 in 2011 and exceeding the district trend over the same time period.
• The dropout rate has remained relatively stable (15, 14, 16) and much higher than the state average for each year between 2009 and 2011.
Identify Notable Trends
1. Consider your observations.
2. Compare school performance trends to other points of reference (a criterion, or others’ performance over the same time period).
3. Determine which of the identified patterns in school performance are notable.
4. Continue analysis until at least 8 notable trends (positive and negative) are identified.
How to Describe Performance Trends
1. Start with a performance focus and relevant data report(s) and identify questions to guide analysis.
2. Make predictions about performance.
3. Interact with data (at least 3 years).
4. Look for things that pop out, with a focus on patterns over time (at least three years).
5. List positive and negative facts about the data (observations).
6. Identify which trends are notable (narrow) and which require additional analysis.
7. Write trend statements.
Write Observations as Trend
Statements
Use the “Developing Trend Statements” template
1. Specify the measure/metrics and for which
performance indicator the trend applies.
2. Describe for which students the trend applies (grade
level and disaggregated group).
3. Describe the time period.
4. Describe the trend (e.g. increasing, decreasing,
stable).
5. Determine if the trend is notable and describe why.
Checking our Thinking
• Work with your “partner table.” Assign an ‘A’ and a ‘B’ table.
• Take turns presenting trends and providing/receiving feedback:
– The Table A facilitator presents their team’s notable trends, explaining why each was identified as “notable.”
– Table B team members ask clarifying questions.
– The Table A facilitator responds.
– Table B team members provide warm and cool feedback about Table A’s notable trends.
– Switch roles.
Capturing Trends in the UIP
Template
• Capture notable trends (positive and
negative) in the Data Analysis Worksheet,
(Toolkit, p. 75 excerpted from the UIP
template).
• Note: this worksheet is organized by
performance indicator.
Make Notes for Data Narrative
• Take out the Data Narrative Outline.
• What data did the planning team review to
identify notable trends? Capture this
information.
• Describe the process in which your team
engaged to analyze the school’s data and
identify notable trends.
• What were the results of the analysis (which
trends were identified as notable)?
Completing Trend Analysis
• Take out Planning for Data Analysis
• Make notes on how you will complete your
trend analysis. . .
– Who will participate?
– When?
– What materials and tools will you use?
Agenda
• UIP Processes Overview
• Interpret Metrics
• Review Current Performance
• Identify Notable Trends
• Prioritize Performance Challenges
• Plan Data Analysis
Priority Performance Challenges
• Review Step Four: Prioritize Performance Challenges in the UIP Handbook, p. 15.
• Discuss:
– What are the most critical things to remember about priority performance challenges? Why do we prioritize performance challenges?
– How do performance challenges relate to trends?
– How do priority performance challenges relate to the magnitude of the over-all school challenges?
Priority Performance Challenges
Priority performance challenges are. . .
• Specific statements about performance
• Strategic focus for the improvement efforts
• About the students
Priority performance challenges are NOT
• What caused or why we have the performance challenge
• Action steps that need to be taken
• Concerns about budget, staffing, curriculum, or
instruction
• About the adults
Priority Performance Challenges: Non-Examples
• To review student work and align proficiency levels to the Reading Continuum and Co. Content Standards.
• Provide staff training in explicit instruction and adequate programming designed for intervention needs.
• Implement interventions for English Language Learners in mathematics.
• Budgetary support for para-professionals to support students with special needs in regular classrooms.
• No differentiation in mathematics instruction when student learning needs are varied.
Prioritizing Performance Challenges
1. Review for which performance indicators priorities must be
identified and the magnitude of the over-all performance
challenge.
2. Consider notable trends.
3. Focus the list, combining related trends.
4. Identify trends that are most urgent to act on.
5. Do a reality check (initial prioritization).
6. Evaluate the degree to which the proposed priorities reflect
the magnitude of the over-all performance challenge.
7. Achieve consensus on the top three (or four) priorities.
What guides our prioritization?
Take out the Data Narrative Outline, consider:
• In which indicator areas (Academic
Achievement, Academic Growth, Academic
Growth Gaps, Postsecondary and Workforce
Readiness) did school/district performance not
at least meet state/federal expectations?
• Review the magnitude of the school’s over-all
performance challenge.
Prioritizing Performance Challenges
1. Review for which performance indicators priorities must be
identified and the magnitude of the over-all performance
challenge.
2. Consider notable trends.
3. Focus the list, combining related trends.
4. Identify trends that are most urgent to act on.
5. Do a reality check (initial prioritization).
6. Evaluate the degree to which the proposed priorities reflect
the magnitude of the over-all performance challenge.
7. Achieve consensus on the top three (or four) priorities.
Combine Related Trends
• Consider your notable trend statements.
• Do any of these trends address the same performance challenge (e.g., growth and achievement trends for the same students in the same content area)?
• Combine related trend statements.
• Note: a combined trend statement can include more than one metric (MGPs and % proficient/advanced) for the same students.
• Capture combined trend statements (and those that could not be combined) on a flip chart.
Prioritizing Performance Challenges
1. Review for which performance indicators priorities must be
identified and the magnitude of the over-all performance
challenge.
2. Consider notable trends.
3. Focus the list, combining related trends.
4. Identify trends that are most urgent to act on.
5. Do a reality check (initial prioritization).
6. Evaluate the degree to which the proposed priorities reflect
the magnitude of the over-all performance challenge.
7. Achieve consensus on the top three (or four) priorities.
Initial Prioritization
• Identify trends that are urgent to act on (those
that represent performance challenges).
• Do a preliminary check on team priorities using
“dot voting”
– Each person gets 2 (or 3) votes.
– Team members can spend their votes on different
performance challenges or all on one.
– Identify the performance challenges with the highest
number of votes (“proposed priorities”).
Prioritizing Performance Challenges
1. Review for which performance indicators priorities must be
identified and the magnitude of the over-all performance
challenge.
2. Consider notable trends.
3. Focus the list, combining related trends.
4. Identify trends that are most urgent to act on.
5. Do a reality check (initial prioritization).
6. Evaluate the degree to which the proposed priorities reflect
the magnitude of the over-all performance challenge.
7. Achieve consensus on the top three (or four) priorities.
Aligning Priorities to Magnitude
• Review “How to determine the appropriate level for a priority performance challenge” (UIP Handbook, p. 15-16).
• Work with a partner:
– What does it mean to say the priority performance
challenge is aligned to the magnitude of the overall
performance challenges for the school?
– Identify an example of a priority performance
challenge that would not be aligned to the magnitude
of the school or district’s over-all performance
challenge.
Evaluating Proposed Priorities
• As a team, consider all of the proposed priority
challenges.
• Eliminate priorities that do not reflect the over-all
magnitude of the performance challenge for the
school or district.
• Identify remaining priority performance
challenges.
Prioritizing Performance Challenges
1. Review for which performance indicators priorities must be
identified and the magnitude of the over-all performance
challenge.
2. Consider notable trends.
3. Focus the list, combining related trends.
4. Identify trends that are most urgent to act on.
5. Do a reality check (initial prioritization).
6. Evaluate the degree to which the proposed priorities reflect
the magnitude of the over-all performance challenge.
7. Achieve consensus on the top three (or four) priorities.
Capturing Priority Performance
Challenges in the UIP Template
• Capture priority performance challenges by
performance indicator in the Data Analysis
Worksheet (Toolkit, p. 75 excerpted from the
UIP template).
• Some priority performance challenges may be
listed by more than one performance indicator.
Apply Quality Criteria Section III: Priority Performance Challenges
• Use the Quality Criteria for Unified Improvement Planning, Trends and Priority Performance Challenges.
• Consider:
– How are your trends and priority performance challenges similar to and/or different from what is reflected in the quality criteria?
– How could these sections be improved upon?
Data Narrative Notes
• Take out the Data Narrative Outline. (Toolkit, p.
14)
• Describe the process in which your team
engaged to prioritize your performance
challenges.
• What were the results? Which performance
challenge(s) were selected as priorities for the
current school year? Why was each prioritized?
• List your priority performance challenges.
Completing Prioritization of
Performance Challenges
• Take out Planning for Data Analysis note
catcher (Toolkit, p. 84).
• Make notes on how you will complete your
prioritization of performance challenges. . .
– Who will participate?
– When?
– What materials and tools will you use?
Agenda
• UIP Processes Overview
• Interpret Metrics
• Review Current Performance
• Identify Notable Trends
• Prioritize Performance Challenges
• Plan Data Analysis
Data Narrative Notes
• In the Planning for Data Analysis/Data Narrative note catcher (Toolkit, p. 79-85), make any final notes about the following components of the data narrative:
– Review of Current Performance
– Trend Analysis
– Priority Performance Challenges
• Consider the tasks involved in completing the Data Analysis portion of the Data Narrative.
• Make notes about how these tasks will be completed, when, and by whom.
Next Steps
• Bring Prioritized Performance
Challenges to the Root Cause
Analysis session.
Give us Feedback!!
• Written: Use sticky notes
+ The aspects of this session that you liked or that worked for you.
Δ The things you will change in your practice or that you would change about this session.
? Questions that you still have or things we didn’t get to today.
Ideas, ah-has, innovations
• Oral: Share one ah-ha!