Denver Public Schools
Unified Improvement
Planning 101
July 16, 2012
UIP 101 Road Map
 Overview of the UIP Process
 Structure and Components of the UIP Template
 New Updates to the UIP Process
 Developing a UIP
 Data Narrative
 School Performance Frameworks and the UIP
 Data Analysis:
 Review Past Performance/Describe Trends/Performance
Challenges/Root Causes
 Action Plans:
 Target Setting/Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
Intent, Power, Possibility and
Purpose of the UIP
School Improvement Historically at DPS:
 Prior to the UIP process, DPS developed School Improvement Plans (SIPs).
 In 2008, Colorado introduced the Unified Improvement Plan (UIP) to streamline state and federal accountability requirements.
 The UIP was established by the Education Accountability Act of 2009 (SB 09-163).
 Colorado is entering its 3rd year (2012-13) of requiring UIPs for all schools throughout the state.
Planning Requirements met by the UIP

State Accountability (SB09-163)
 School-level plan types: Performance, Improvement, Priority Improvement, Turnaround
 District-level plan types: Distinction, Performance, Improvement, Priority Improvement, Turnaround

Student Graduation and Completion Plan
 Focus Schools

ESEA Program Plans
 Title IA Schoolwide Program Plan
 Title IA Targeted Assistance Program Plan
 Title I (priority improvement or turnaround)
 Title IIA 2141c (priority improvement or turnaround)
 Title III Improvement (AMAO)

Competitive Grants
 Tiered Intervention Grant (TIG) (Priority Schools)
 School Improvement Grant (SIG)/Implementation Support Partnership (ISP)
 Targeted District Improvement Grant (TDIP)
 Other grants reference the UIP
Intent, Power, Possibility and
Purpose of the UIP
Theory of Action: Continuous Improvement
FOCUS
How Has the UIP Process Focused
Your School Improvement Efforts?
Dori Claunch,
Principal Morey
Middle School
“The UIP focuses our school community
on the areas in which our students are
needing the most intensive support or
enrichment. The collaborative work to
create the document provides us with a
clear vision of our goals and the steps
that will be taken to accomplish those
goals. All stakeholders play a part in the
creation and implementation of the
plan, thus leading to a strong cohesive
learning community. The UIP is at the
center of all we do.”
How Has the UIP Process Focused
Your School Improvement Efforts?
Anthony Smith, Principal
Martin Luther King
6-12 School
“The UIP has helped us to decipher all
relevant data and narrow down our
focus into high impact areas. The UIP
is an organic and living document that
has served as the anchor for our work
this year.”
How Has the UIP Process Focused
Your School Improvement Efforts?
Jeannie Peppel, Principal
John F. Kennedy
High School
“Developing a meaningful UIP is the
key to focused instruction for the
academic and post secondary
success for all
students. Remembering that the
UIP is meant to be a living
document ensures that revisions are
expected based on the work being
done and the results that are
achieved.”
How Has the UIP Process Focused
Your School Improvement Efforts?
Alex Magaña, Principal
Grant Beacon Middle
School
“The UIP reminds us to do what we
expect from our teachers…analyze the
data, set goals, develop action steps and
measure the results. Here at GBMS we
align our UIP to the professional
development, SGO's and data team
process. Although we have set up our
plan, there are times we have not met our
goals. This allows us to take a step back
and talk about why we did not meet the
goal. For example, we were going to
incorporate small group instruction in our
reading classes but did not provide
enough PD to support the goal.”
Intended Session Outcomes
Participants will understand:
 The purpose, power and possibility of the UIP.
 The accountability measures associated with the UIP.
 How to develop each component of the UIP template.
 How local, state and federal accountability connect to
the UIP process.
 How to progress monitor and engage stakeholders in
the UIP process.
 The state and district tools and resources available to
assist school leaders in engaging in UIP work.
Purpose of the UIP
UIP Purpose
 Provide a framework for performance
management.
 Support school and district use of performance
data to improve system effectiveness and student
learning.
 Shift from planning as an event to continuous
improvement.
 Meet state and federal accountability
requirements.
 Give external stakeholders a way to learn about
how schools and districts are making
improvements.
Structure and Components of the UIP
Template
Major Sections:
I. Summary Information about the School
(pre-populated template)
II. Improvement Plan Information
III. Narrative on Data Analysis and Root
Cause Identification
IV. Action Plan(s)
V. Appendices (addenda forms)
Unified Improvement Planning Processes

Data Analysis (Data Narrative):
 Gather and Organize Data
 Review Performance Summary
 Describe Notable Trends
 Prioritize Performance Challenges
 Identify Root Causes

Target Setting and Action Planning:
 Set Performance Targets
 Identify Major Improvement Strategies
 Identify Interim Measures
 Identify Implementation Benchmarks

Ongoing: Progress Monitoring
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
 New Updates to the UIP Process
 Developing a UIP
 Data Narrative
 School Performance Frameworks and the UIP
 Data Analysis:
 Review Past Performance/Describe Trends/Performance
Challenges/Root Causes
 Action Plans:
 Target Setting/Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
New UIP Changes/Updates
Data Narrative
• More Guidance Provided
• It serves as a repository for everything you do in the UIP
process.
Elements of the Data Narrative:
 Description of the School Setting and Process for Data Analysis
 Review Current Performance
 State & Federal Accountability Expectations
 Progress Towards Last Year’s Targets
 Trend Analysis
 Priority Performance Challenges
 Root Cause Analysis
 Progress Monitoring (Ongoing)
New UIP Changes/Updates
Pre-Populated Template & Data Analysis
• Adequate Yearly Progress (AYP) is no longer a
part of school or district accountability; metrics
removed from UIP
• CELApro growth, including median growth percentiles and median adequate growth percentiles, added to SPF/DPF and UIP Template.
• Disaggregated graduation rates added to
SPF/DPF and UIP Template.
New UIP Changes/Updates
Action Planning: Expectations/Target Setting
• English Language Proficiency –
• CELA median student growth percentiles and median adequate growth
percentiles in state SPF
• If this is a priority performance challenge, targets must be set in this
area and reflected in the UIP template.
• Disaggregated Graduation Rates –
• Grad rates for disaggregated groups in state SPF
• If this is a priority performance challenge for the school/district, targets
must be set in this area and reflected in the UIP template.
• Disaggregated Student Achievement – The state has established
guidelines for setting targets for disaggregated student group performance.
Starting in 2011-12, schools and districts should consider this guidance in
establishing targets for student academic achievement.
UIP Clarification
UIP is a 2-Year Plan
The plan and following elements should
cover 2 academic years:
 Targets
 Major Improvement Strategies
 Associated Action Steps
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
√ New Updates to the UIP Process
 Developing a UIP
 Data Narrative
 School Performance Frameworks and the UIP
 Data Analysis:
 Review Past Performance/Describe Trends/Performance
Challenges/Root Causes
 Action Plans:
 Target Setting/Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
Data Narrative as a Repository

School A has…

TCAP      2007  2008  2009  2010  2011
Reading    38    41    45    41    38
Writing    28    29    30    31    29
Math       18    19    40    40    39
Science    16    15    14    17    20
UIP Data and Information
• Description of School & Data Analysis Process
• Review Current Performance
• Trend Analysis: Worksheets 1 & 2
• Action Plans: Target Setting/Action Planning Forms
• Data Narrative
Purpose: The purpose of the data narrative is to describe the process
and results of the analysis of the data for school improvement. It serves
as a repository for everything you do in the UIP process.
Elements to Include in the Data Narrative:
 Description of the School Setting and Process for Data Analysis
 Review Current Performance
 State & Federal Accountability Expectations
 Progress Towards Last Year’s Targets
 Trend Analysis
 Priority Performance Challenges
 Root Cause Analysis
Throughout the school-year capture the following in the data narrative:
 Progress Monitoring (Ongoing)
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
√ New Updates to the UIP Process
 Developing a UIP
√ Data Narrative
 School Performance Frameworks and the UIP
 Data Analysis:
 Review Past Performance/Describe Trends/Performance
Challenges/Root Causes
 Action Plans:
 Target Setting/Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
Step 2.
Review Current Performance
State School Performance
Framework (SPF)
State SPF: Pre-Populated UIP
Templates
State: District & School
Performance Frameworks
Through the Colorado Educational Accountability
Act of 2009 (SB09-163)…
• CDE annually evaluates districts and schools based on
student performance outcomes.
• All districts receive a District Performance Framework
(DPF). This determines their accreditation rating.
• All schools receive a School Performance Framework (SPF).
This determines their school plan types.
• The frameworks provide a common structure for understanding
performance and focusing improvement efforts.
State Performance Indicators

Achievement – Percent proficient and advanced:
• Reading (CSAP, Lectura, and CSAPA)
• Writing (CSAP, Escritura, and CSAPA)
• Math (CSAP and CSAPA)
• Science (CSAP and CSAPA)

Growth – Normative and criterion-referenced growth:
• CSAP Reading, Writing and Math
• CELApro
• Median Student Growth Percentiles
• Adequate Median Student Growth Percentiles

Growth Gaps – Median Student Growth Percentiles and Median Adequate Growth Percentiles for disaggregated groups:
• Poverty
• Race/Ethnicity
• Disabilities
• English Language Learners
• Below proficient

Postsecondary and Workforce Readiness:
• Colorado ACT
• Graduation Rate (overall and for disaggregated groups)
• Dropout Rate
State: Performance Indicators
Weight
State: 1 year vs. 3 year data
• CDE provides two different versions of the School
Performance Framework Reports:
▫ The most recent year of data
(1-year SPF, 2011)
▫ The most recent three years
of data (3-year SPF, 2011)
• Only one report counts for official accountability
purposes:
▫ The one under which the school has ratings on a higher number
of the performance indicators, or
▫ If the school has ratings under an equal number of indicators, the
one under which it received a higher total number of points.
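The selection rule above amounts to a small decision procedure. As a purely illustrative sketch (not an official CDE tool; the function name and inputs here are invented for this example), it can be written as:

```python
# Hypothetical illustration of the rule for choosing which SPF report counts.
# Each report is summarized as (number of rated performance indicators, total points).
def official_spf(one_year, three_year):
    # The report with ratings on more performance indicators counts;
    # if both rate the same number of indicators, the one with more points counts.
    if one_year[0] != three_year[0]:
        return "1-year" if one_year[0] > three_year[0] else "3-year"
    return "1-year" if one_year[1] > three_year[1] else "3-year"

print(official_spf((4, 52.0), (3, 60.0)))  # 1-year (more rated indicators)
print(official_spf((4, 52.0), (4, 60.0)))  # 3-year (tie on indicators, more points)
```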
State: SPF Ratings
• Cut-points for measures are calculated using state
norms – all students in the state
• A rating is assigned to each of the performance
indicators:
▫ Exceeds
▫ Meets
▫ Approaching
▫ Does not meet
State: School Plan Types
The ratings roll up to an overall evaluation of the school/district’s
performance, which determines the school’s plan type assignment:
DPS School Performance
Framework (SPF)
What is DPS’ SPF?
• Comprehensive, annual review of school performance in terms of student
achievement and overall organizational strength using a variety of
measures
• Is the basis of school accreditation ratings required by statute
• Aligns district goals, state requirements, and federal mandates
• Provides information for teacher and principal compensation systems
• Made public for the Denver community and is a factor in enrollment
decisions
• Goals
 Improve overall learning and achievement by informing critical, instructional
decisions
 Provide a complete and comprehensive picture of how schools in DPS are
performing
2011 DPS SPF Overall Rating Categories

[Stacked bar chart of schools by overall rating – Distinguished, Meets Expectations, Accredited on Watch, Accredited on Priority Watch, Accredited on Probation – with segments of 11%, 42%, 31%, 7%, and 9%]

*Alternative Schools excluded
9/26/2011
DPS Sample Stoplight
Scorecard: Indicators
DPS 2011 SPF Indicator Weights
DPS Sample Detail Scorecard:
Measures
DPS SPF: Based on 2 Years of Data
Example: TCAP Median Growth Percentile
2011: Approaching
2012: Meets
2012 Measure Rating: Approaching
2012 SPF Indicators & Measures:
Growth and Status

Indicator 1. Student Progress over Time – Growth
1.1a-c  MGPs
1.2a-c  MGPs compared to similar schools (FRL + ELL + SpEd + Mobility)
1.3a-c  Catch up growth
1.4a-c  Keep up growth
1.5a-c  Continuously enrolled growth (ES & MS only)
1.6     COAlt growth
1.7a-c  CSAP/TCAP Subgroup Growth (MGPs for each subgroup)
1.8a-c  CSAP/TCAP Subgroup Growth Comparison (compare focus and reference group MGPs)
1.9     Students w/Disabilities Subgroup Growth Comparison
1.10    CELA MGPs

Indicator 2. Student Achievement Level – Status
2.1a-d  % CSAP/TCAP proficient or above
2.2a-d  % CSAP/TCAP proficient or above compared to similar schools (FRL + ELL + SpEd + Mobility)
2.3a-c  CSAP/TCAP Subgroup Status (comparison w/district standard)
2.4     CSAP/TCAP Students w/Disabilities Subgroup Status Comparison
2.5     % CSAP/TCAP Advanced
2.6     CELA % At Level 5
2.7     % DRA/EDL on grade level or above (current year data only)
2012 SPF Indicators & Measures:
Post-Secondary Growth and Status

Indicator 3. Post-Secondary Readiness Growth (high schools only)
3.1a-d  10th Grade CSAP/TCAP to COACT Growth
3.2     CDE best-of graduation rate change (“best of” from prior year vs. same year for current year)
3.3     DPS 4-year cohort graduation rate change (1st time 9th graders only)
3.4     On track to graduation change
3.5     Post-Secondary Credit/IB enrollment change (test taking or course enrollment)
3.6     AP/IB test taking rate change
3.7     AP/IB test passing count change
3.8     Post-Secondary passing count change
3.9     College remediation

Indicator 4. Post-Secondary Readiness Status (high schools only)
4.1a-d  Colorado ACT
4.2a-d  Colorado ACT compared to similar schools (FRL + ELL + SpEd + Mobility)
4.3     CDE best-of graduation rate change (“best of” from prior year vs. same year for current year)
4.4     CDE Graduation rate compared to similar schools (FRL + ELL + SpEd + Mobility)
4.5     On track to graduation
4.6     Post-Secondary Credit/IB enrollment (test taking or course enrollment)
4.7     AP and IB test taking rate
4.8     AP and IB test passing rate
4.9     Post-Secondary passing rate
4.10a-c College remediation
4.11a-c College remediation compared to similar schools (FRL + ELL + SpEd + Mobility)
2012 SPF Indicators & Measures: Student
& Parent Engagement and Re-Enrollment

Indicator 5. Student Engagement & Satisfaction
5.1  Attendance rate (capture attendance from prior year and carry forward)
5.2  Student satisfaction
5.3  Center-based program offerings**

Indicator 6. Re-Enrollment
6.1  Re-enrollment rate (% end of year who return in October)
6.2  % enrolled October to May (ES/MS only)
6.3  Dropout Rate (HS only)
6.4  Enrollment Change bonus points

Indicator 7. Parent Engagement
7.1  Parent satisfaction (more items removed)
7.2  Parent response rate
CDE vs. DPS SPF
DPS & CDE SPF Ratings Crosswalk

DPS SPF Rating                 CDE SPF Rating
Distinguished                  Performance
Meets Expectations             Performance
Accredited on Watch            Improvement
Accredited on Priority Watch   Priority Improvement
Accredited on Probation        Turnaround

Looking at the 2010-11 SPF:
• DPS’ SPF rating was higher than the state’s 17% of the time.
• The state’s SPF rating was higher than DPS’ 13% of the time.
DPS SPF vs. State SPF

DPS SPF Indicators                  CDE SPF Indicators
Academic Achievement (Status)       Academic Achievement
Academic Growth                     Academic Growth
Academic Growth Gaps                Academic Growth Gaps
Postsecondary Readiness (Status)    Postsecondary & Workforce Readiness
Postsecondary Readiness (Growth)    n/a
Student Engagement & Satisfaction   n/a
Re-Enrollment                       n/a
Parent Engagement                   n/a
DPS SPF: Process & Timeline
• August: Begin sharing 2012 DPS SPF changes
• Sept 10–14: DPS SPF Principal Review Period
• Sept 24–28: Official DPS SPF Release
• September: Pre-populated UIP template & share 2013 SPF measures
• December: State SPF Release
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
√ New Updates to the UIP Process
 Developing a UIP
√ Data Narrative
√ School Performance Frameworks and the UIP
 Data Analysis:
 Review Past Performance/Describe Trends/Performance
Challenges/Root Causes
 Action Plans:
 Target Setting/Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
Lunch Break
Lunch 1 hour
Please reflect and add any questions on
sticky notes to the parking lot.
Data Analysis (Worksheets #1 and #2)

Review Performance → Describe Notable Trends → Initial Root Cause Analysis → Prioritization of Performance Challenges → Root Cause Analysis

Begin by considering the prior year’s performance.
Review Prior Year’s Performance (Worksheet #1)
 List targets set for last year
 Denote the following:
 Whether the target was met or not
 How close the school/district was to meeting the
target; and
 To what degree current performance supports
continuing with current major improvement
strategies and action steps
Prior Year’s Performance DPS
Example
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
√ New Updates to the UIP Process
 Developing a UIP
√ Data Narrative
√ School Performance Frameworks and the UIP
 Data Analysis:
√ Review Past Performance/Describe Trends/Performance
Challenges/Root Causes
 Action Plans:
 Target Setting/Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
Data Locations

Indicator: Academic Achievement (Status)
Measure: TCAP, CoAlt, CELA
Location: Principal Portal; School Folders (CoAlt)

Indicator: Academic Growth
Measure: Median Student Growth Percentile
Location: Principal Portal – school level by content area and by grade

Indicator: Academic Growth Gaps
Measure: Median Student Growth Percentile
Location: CDE (SchoolView.org) – sub-groups

Indicator: Post Secondary and Workforce Readiness
• Graduation Rate – School Folders
• Disaggregated Graduation Rate – CDE (SchoolView.org), School Performance – Data Center
• Dropout Rate – School Folders
• Mean ACT Composite Score – School Folders
Data Locations for Additional
Analyses (Examples)
• Interims and Course Data
▫ SchoolNet, Principal Portal (August)
• Assessment Framework Reports
▫ School Folders
• SPF Measures
▫ Principal Portal – SPF Drill Down Tool
• School Satisfaction Survey
▫ School Folders
• Continuously Enrolled
▫ Principal Portal
Data Websites
• Principal Portal
http://principal.dpsk12.org/
• School Folders
https://secure2.dpsk12.org/schoolfolders/
• CDE
http://www.schoolview.org/
• SchoolNet
https://schoolnet.dpsk12.org/
Data Websites Practice

Practice Activity:
Locate the % of students at Level 5 on CELA Reading for
(3rd or 6th or 9th) grade in 2012.

Click Path:
Log onto the Principal Portal (http://principal.dpsk12.org/), then:
Reports → Assessments → CELA → School CELA by Content Area and Grade → Select Your “School” → Select Your “Measure Name” → View Report
Step 2: Identify Trends
• Include all performance indicator areas.
• Identify indicators* where the school did not at least
meet state and federal expectations.
• Include at least three years of data (ideally 5 years).
• Consider data beyond that included in the school
performance framework (grade-level, sub-group, local
data).
• Include positive and negative performance patterns.
• Include information about what makes the trend
notable.
* Indicators listed on pre-populated UIP template include: status, growth, growth gaps and postsecondary/workforce readiness
Writing Trend Statements:
1. Identify the measure/metrics.
2. Describe for which students (grade level and
disaggregated group).
3. Describe the time period.
4. Describe the trend (e.g. increasing, decreasing,
flat).
5. Identify for which performance indicator the
trend applies.
6. Determine if the trend is notable and describe
why.
Trends Could be:
• Stable
• Increasing
• Decreasing
• Increasing then decreasing
• Decreasing then increasing
• Flat then increasing
• Flat then decreasing

What other patterns could staff see in three years of data?
What makes a trend notable?
• In comparison to what . ..
• Criterion-based: How did we compare to a
specific expectation?
▫ Minimum state expectations
▫ Median adequate growth percentiles
• Normative: How did we compare to others?
▫ District or state trends for the same metric over the
same time period.
▫ For disaggregated groups, to the school over-all
▫ By standard to the content area over-all
Examples of Notable Trends
• The percent of 4th grade students who scored proficient or advanced on math TCAP/CSAP declined from 70% to 55% to 48% between 2009 and 2011, dropping well below the minimum state expectation of 71%.
• The median growth percentile of English Language Learners in writing increased from 28 to 35 to 45 between 2009 and 2011, meeting the minimum expectation of 45 and exceeding the district trend over the same time period.
• The dropout rate has remained relatively stable (15, 14, 16) and much higher than the state average between 2009 and 2011.
Data Analysis: DPS Example
Trends:
What did they get right?
What can be improved?
Data Analysis: DPS Example w/
Table & Graph
Disaggregating Data

Review current performance by sub-group. Example data to review:
• Status: Principal Portal – TCAP
• Growth: SchoolView.org – CDE Growth Summary
• Academic Growth Gaps: SchoolView.org – CDE Growth Gaps Report

Is there a difference among groups (example: more than a 10% difference)?

Yes → Analyze more data to understand the differences. Example data to review:
• CELA Trajectory
• Continuously Enrolled
• Frameworks
• District SPF Measures
Then follow the steps for Prioritizing Performance Challenges.

No → STOP!
• Step back and look for patterns in performance.
• Denote that the performance challenge spans across sub-groups.
• Follow the steps for Prioritizing Performance Challenges.
Analyzing Trends: Keep in Mind…
• Be patient and hang out in uncertainty.
• Don’t try to explain the data.
• Observe what the data actually shows.
• No “because” statements yet.
A path through the data…

1. Review the SPF/DPF Report to identify where performance did not at least meet expectations (federal/state/local).
2. Select one content area on which to focus.
3. Consider performance (achievement/growth) by grade level for 3+ years; within grade levels, consider achievement by standard/sub-content area.
4. Consider performance by disaggregated group by grade level for 3+ years; disaggregate groups further and look across groups.
5. Consider cross-content area performance (3+ years).
6. Consider PWR metrics over 3+ years.
How to Describe Trends
1. Start with a performance focus and relevant data report(s).
2. Make predictions about performance.
3. Interact with data (at least 3 years).
4. Look for things that pop out, with a focus on patterns over time (at least three years).
5. Capture a list of observations about the data in the Data Analysis worksheet (positive or negative).
6. Write trend statements.
7. Identify which trends are significant (narrow) and which require additional analysis.
Make Notes for Data Narrative
Guiding Questions
• In which performance indicators did school
performance not at least meet state
expectations?
• What data did the planning team review?
• Describe the process in which your team
engaged to analyze the school’s data.
• What were the results of the analysis (which
trends were identified as significant)?
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
√ New Updates to the UIP Process
 Developing a UIP
√ Data Narrative
√ School Performance Frameworks and the UIP
 Data Analysis:
√ Review Past Performance/√ Describe Trends/Performance
Challenges/Root Causes
 Action Plans:
 Target Setting/Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
Priority Performance Challenges

Priority performance challenges are…
• Specific statements (notable trend statements) about performance
• A strategic focus for the improvement efforts
• About the students

Priority performance challenges are NOT…
• What caused, or why we have, the performance challenge
• Action steps that need to be taken
• About the adults
• Concerns about budget, staffing, curriculum, or instruction
Priority Performance Challenges
Examples
• The percent of fifth grade students scoring proficient or
better in mathematics has declined from 45% three years
ago, to 38% two years ago, to 33% in the most recent
school year.
• For the past three years, English language learners
(making up 60% of the student population) have had
median growth percentiles below 30 in all content areas.
• Math achievement across all grade-levels and all
disaggregated groups over three years is persistently less
than 30% proficient or advanced.
Priority Performance Challenges Non-Examples
• To review student work and align proficiency levels to
the Reading Continuum and Co. Content Standards
• Provide staff training in explicit instruction and
adequate programming designed for intervention needs.
• Implement interventions for English Language Learners
in mathematics.
• Budgetary support for para-professionals to support
students with special needs in regular classrooms.
• No differentiation in mathematics instruction when
student learning needs are varied.
Data Analysis: DPS Example
Priority Performance
Challenge:
What can be improved?
Prioritizing Performance Challenges
1. Clarify indicator areas where performance challenges must be identified (where school performance did not at least meet state/federal expectations).
2. Start with one indicator area; consider all negative trends.
3. Focus the list by combining similar trends (e.g., if you have declining performance in ELLs, SPED…).
4. Do a reality check (preliminary prioritization by dot voting on the tree map).
5. Achieve consensus about top priorities (consider using the REAL criteria; see the UIP Handbook).
6. Record on the Data Analysis Worksheet.
Capture in Data Narrative

Guiding Questions
1. In which performance indicators did school performance not at least meet state expectations?
2. Who was involved in identifying trends and prioritizing performance challenges?
3. What data did the planning team review?
4. In what process did the planning team engage to analyze the school’s data?
5. What were the results of the analysis (which trends were identified as significant)?
6. How were performance challenges prioritized?
7. What were identified as priority performance challenges for the 2011-12 school year?
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
√ New Updates to the UIP Process
 Developing a UIP
√ Data Narrative
√ School Performance Frameworks and the UIP
 Data Analysis:
√ Review Past Performance/√ Describe Trends/ √ Performance
Challenges/Root Causes
 Action Plans:
 Target Setting/Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
What is a Root Cause?
• Root causes are statements that describe the deepest underlying cause, or causes, of performance challenges.
• They are the causes that, if dissolved, would result in elimination, or substantial reduction, of the performance challenge(s).
• Root causes describe WHY the performance challenges exist.
• They are things we can change and need to change.
• They are the focus of our major improvement strategies.
• They are about adult action.
Focus of Root Cause Analysis
• Root Cause analysis is always focused on student
performance.
• It answers the question: What adult actions
explain the student performance that we see?
• Root cause analysis can focus on positive or
negative trends.
• In this case, the focus is on “challenges.”
Steps in Root Cause Analysis
1. Focus on a performance challenge (or closely related performance challenges).
2. Consider External Review results (or categories).
3. Generate explanations (brainstorm).
4. Categorize/classify explanations.
5. Narrow (eliminate explanations over which you have no control) and prioritize.
6. Deepen thinking to get to a “root” cause.
7. Validate with other data.
Root Cause Activity
Priority Performance Challenge:
Little Johnny, who is in the fifth grade, habitually
gets up late in the morning for school. As a result,
Johnny is late for his 1st class period.
• Work with your table partners to determine the
potential root causes for Johnny’s behavior.
(7 minutes)
Levels of Root Causes

Systemic:
• Organizational Structure
• Allocation of Staff
• Mission, Vision, Values and Beliefs
• Collaboration
• School Culture
• Budget
• Planning
• Policies

Programmatic:
• Instructional Process
• Alignment
• Scheduling
• Training and Staff Development
• Materials
• Curriculum Assessment

Procedural:
• The student
• The test
• The incident
• The teacher
From Priority Performance Challenge to Root Cause……

Systemic Level
• Priority Performance Challenge: There has been an overall decline in academic achievement in reading, writing and math from 2007-2011 for all grades (K-8).
• Root Cause: The RTI process has not been implemented with fidelity.

Programmatic Level
• Priority Performance Challenge: There has been a decline in achievement in reading, math and writing for 4th grade from 2007-2011.
• Root Cause: Instruction has not been differentiated to meet the needs of the 4th grade student.

Incident or Procedural Level
• Priority Performance Challenge: 4th graders did not demonstrate the multiplication concept on the last Everyday Math exam.
• Root Cause: The teachers did not teach multiplication concepts in the last EDM unit.
Important to Verify Root Cause
Ask the key questions for identifying whether a cause is a root cause:
1. Would the problem have occurred if the cause had not been present?
2. Will the problem reoccur if the cause is corrected or dissolved?
3. Will correction or dissolution of the cause lead to similar events?
Make any final revisions to your root cause explanation as needed.

Preuss, P. (2003). Root Cause Analysis: School Leader’s Guide to Using Data to Dissolve Problems. Larchmont, NY: Eye on Education.
Verify Root Causes (example)

Priority Performance Challenge: The % of proficient/advanced students in reading has been substantially above state expectations in 3rd grade but substantially below, and stable (54%, 56%, 52%), in 4th and 5th for the past three years.

Possible Root Cause: K-3 is using new teaching strategies; 4-5 are not.
• Question to Explore: What strategies are primary vs. intermediate teachers using?
• Data Sources: Curriculum materials and instructional plans for each grade.
• Validation: K-3 strategies are different from 4-5.

Possible Root Cause: Less time is given to direct reading instruction in 4-5.
• Question to Explore: How much time is devoted to reading in primary vs. intermediate grades?
• Data Source: Daily schedule in each grade level.
• Validation: No evidence that less time is devoted to reading in 4-5.
Capture in Data Narrative
Guiding Questions
• What data and evidence was used to determine
root causes?
• What process was used to identify these? (e.g., 5
Whys?)
• What process and data was used to verify these?
(e.g., reviewed curriculum, teacher observations,
interim assessments)
Break/Parking Lot Time
Break: 10 minutes
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
√ New Updates to the UIP Process
 Developing a UIP
√ Data Narrative
√ School Performance Frameworks and the UIP
√ Data Analysis:
√ Review Past Performance/√ Describe Trends/ √Performance
Challenges/ √Root Causes
 Action Plans:
 Target Setting/Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
How to set annual performance targets. . .
1. Focus on a priority performance challenge.
2. Review state or local expectations.
3. Determine timeframe (max 5 years).
4. Determine progress needed in first two years.
5. Describe annual targets for two years.
Action Planning Tools, p. 9
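As a rough sketch of the arithmetic behind these steps, one common approach is to interpolate linearly from current performance toward the expectation over the chosen timeframe and take the first two points as the annual targets. The function name and the sample values below are illustrative, not a DPS/CDE formula:

```python
def annual_targets(current, expectation, years, n_targets=2):
    """Assume linear progress from `current` % proficient toward the
    `expectation` over `years`; return the first `n_targets` annual
    targets. Illustrative only -- schools may set steeper targets."""
    step = (expectation - current) / years
    return [round(current + step * (i + 1), 1) for i in range(n_targets)]

# e.g., 52% proficient now, a 72% expectation, a 4-year timeframe:
annual_targets(52.0, 72.0, 4)  # -> [57.0, 62.0]
```

A school expecting uneven progress (slower in year one while new strategies take hold) would set its year-one target below the linear value and make it up in later years.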
Minimum State Expectations
• The value for which a rating of “meets” would be assigned for
the state metric in each sub-indicator area.
• Academic Achievement: The 50th percentile of % Proficient
or Advanced for Colorado schools.
• Academic Growth and Academic Growth Gaps: A median
growth percentile (MGP) of 55 if MGP is less than the
Adequate Growth Percentile, and 45 otherwise.
• Postsecondary and Workforce Readiness: Graduation rate
at or above 80%, Drop-Out rate at or below the state
average, and Colorado ACT Composite Score at or above
the state average.
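The growth rule above is conditional on adequate growth, which the following sketch makes explicit. The function name is hypothetical; the 55/45 cuts come from the slide:

```python
def growth_meets_expectation(mgp, agp):
    """"Meets" cut for the growth sub-indicator as described above:
    if the school's median growth percentile (MGP) is below its
    Adequate Growth Percentile (AGP), an MGP of 55 is required;
    otherwise 45 suffices."""
    required = 55 if mgp < agp else 45
    return mgp >= required

growth_meets_expectation(50, 60)  # below AGP, needs 55 -> False
growth_meets_expectation(47, 45)  # at/above AGP, needs 45 -> True
```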
DPS District Accountability
 DPS’ goal is to close or significantly reduce the
academic achievement and postsecondary
readiness gaps between DPS and the state by 2015.
DPS’ SPF is used to evaluate school performance
as schools pursue the district’s goal.
 School-specific targets have been established that
represent each school’s share of the district’s goal
annually through 2015. Schools should strive to
achieve these targets at minimum; however, some
schools will need to set higher targets.
Minimum District Expectations
Setting Academic Achievement Targets
Practice Activity:
1. Review your school's performance for the 2011-2012 school year. Find one area where you are not meeting district or state expectations (Priority Performance Challenge).
2. How long would it take for your school to meet DPS or CDE expectations? (at most 5 years)
3. How much progress can you make in the next two years?
 This is the process you will use to set your Annual
Targets for the next two years.
Interim Measures
 Once annual performance targets are set for the next two
years, schools must identify interim measures, or what
they will measure during the year, to determine if
progress is being made towards each of the annual
performance targets.
 Interim measures should be based on local performance
data that will be available at least twice during the school
year.
 Across all interim measures, data should be available
that would allow schools to monitor progress at least
quarterly.
Interim Measures
• Examples of Interim Measures:
▫ District-level Assessments: Benchmarks/Interims,
STAR, SRI
▫ School-level Assessments: End of Unit Assessments,
DIBELS, NWEA MAP
• Measures, metrics and availability should be specified in
the School Goals Form.
• Remember that the Interim Measures need to align with
Priority Performance Challenges. Disaggregated groups
should be included as appropriate.
Examples of Interim Measures
• The percentage of all students scoring
Proficient/Advanced on the DPS Writing
Interim assessment will increase by a minimum
of 10 percentage points from the Fall
administration to the Spring administration.
• The median SRI score for 6th grade students will
increase by 50 lexile points for each of the three
administrations during the 2012-2013 school
year.
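The first example above reduces to a simple gain check between two administrations. A minimal sketch (the function name and sample percentages are hypothetical):

```python
def interim_target_met(fall_pct, spring_pct, min_gain=10.0):
    """Check whether % Proficient/Advanced rose by at least
    `min_gain` percentage points from the Fall administration
    to the Spring administration (illustrative values)."""
    return (spring_pct - fall_pct) >= min_gain

interim_target_met(38.0, 49.5)  # gained 11.5 points -> True
interim_target_met(38.0, 45.0)  # gained 7.0 points -> False
```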
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
√ New Updates to the UIP Process
 Developing a UIP
√ Data Narrative
√ School Performance Frameworks and the UIP
√ Data Analysis:
√ Review Past Performance/√ Describe Trends/ √Performance
Challenges/ √Root Causes
 Action Plans:
√ Target Setting/Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
Major Improvement Strategies
• Respond to root causes of the performance
problems the school/district is attempting
to remedy.
• Action steps are smaller activities that fit within
larger major improvement strategies.
• Improvement Strategies and Action Steps must be
associated with resources, people, and time.
Describe your Desired Future
• If root causes are eliminated . . .
• What will these different groups be doing differently?
▫ Students
▫ Staff members
▫ Leadership team
▫ Parents / Community
• Examples:
▫ All students monitor the progress of their learning towards grade level
expectations on a weekly basis and set personal learning goals.
▫ Teachers use formative data about learning daily to refocus instruction
on their students' needs.
▫ Staff members consistently implement identified practices in effective
literacy instruction.
Action Steps
1. Timeline
 • Should include the 2012-2013 and 2013-2014 school years.
 • Should include specific months.
2. Key Personnel
 • Consider who is leading each step and their capacity.
3. Resources
 • Include the amount of money, time, and source.
 • Consider resources other than money.
Action Steps
Action Steps | Timeline | Key Personnel | Resources | Implementation Benchmarks
Action steps must be defined for each
major improvement strategy.
Implementing all of the action steps =
implementing the major improvement
strategy.
External Vendors
• If the school/district will employ external
vendors, the plan should include:
▫ Major activity undertaken by the external vendor,
▫ Timeline for those activities,
▫ Resources that will pay for the external vendor, and
▫ Implementation benchmarks for the activities of the
external vendor.
Implementation Benchmarks
• Directly correspond to the action steps.
• Are something that a school/district leadership
team could review periodically.
• Should help teams adjust their plans – critical to
a cycle of continuous improvement.
Implementation Benchmarks
• Implementation Benchmarks are. . .
▫ how schools will know major improvement strategies
are being implemented;
▫ measures of the fidelity with which action steps are
implemented; and
▫ what will be monitored.
• Implementation Benchmarks are NOT:
▫ Performance measures (assessment results).
Implementation Benchmarks / Interim Measures Activity
• STAND UP if Example = Interim Measure
• SIT DOWN if Example = Implementation
Benchmark
Practice
ELL students increased their performance on Reading
Interim assessment in Round 2 by 5%.
Implementation Benchmarks or Interim
Measures?
Students increased STAR performance.
• STAND UP if Example = Interim Measure
• SIT DOWN if Example = Implementation Benchmark
Implementation Benchmarks or Interim
Measures?
Classroom walkthroughs weekly.
• STAND UP if Example = Interim Measure
• SIT DOWN if Example = Implementation Benchmark
Implementation Benchmarks or Interim
Measures?
Third grade students’ progress in reading will
be benchmarked three times through
AIMsWeb.
• STAND UP if Example = Interim Measure
• SIT DOWN if Example = Implementation Benchmark
Implementation Benchmarks or Interim
Measures?
Input of classroom teachers will be gathered
during the last week of October.
• STAND UP if Example = Interim Measure
• SIT DOWN if Example = Implementation Benchmark
Implementation Benchmarks or Interim
Measures?
High school English students will demonstrate
mastery using teacher developed writing
rubrics.
• STAND UP if Example = Interim Measure
• SIT DOWN if Example = Implementation Benchmark
Implementation Benchmarks or Interim
Measures?
Staff will participate in three Strategy Labs.
Teacher leaders and administration will gather
evidence and give feedback on the strategies
being implemented in classrooms. Teachers will
keep reflection journals on their
implementation of the strategies.
• STAND UP if Example = Interim Measure
• SIT DOWN if Example = Implementation Benchmark
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
√ New Updates to the UIP Process
 Developing a UIP
√ Data Narrative
√ School Performance Frameworks and the UIP
√ Data Analysis:
√ Review Past Performance/√ Describe Trends/ √Performance
Challenges/ √Root Causes
 Action Plans:
√ Target Setting/ √Action Planning/Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
Progress Monitoring
• Consider:
▫ What performance data will be used to monitor
progress towards annual targets? How will you check
throughout the year that your strategies and actions
are having an impact on student outcomes?
▫ What process data will be used to monitor progress
towards annual targets? How will you check
throughout the year that your strategies and actions
are having an impact on adult actions?
Minimum District Expectations
 DPS UIP Targets
 DPS’ goal is to close or significantly reduce the
achievement and PSR gaps between DPS and the
state by 2015.
 ARE has developed school-specific targets (located in
School Folders) to achieve the DPS goal.
 Schools should strive to achieve these targets at
minimum.
 Some schools will need to set higher targets in
order to increase their DPS SPF ratings.
UIP 101 Road Map
√ Overview of the UIP Process
√ Structure and Components of the UIP Template
√ New Updates to the UIP Process
 Developing a UIP
√ Data Narrative
√ School Performance Frameworks and the UIP
√ Data Analysis:
√ Review Past Performance/√ Describe Trends/ √Performance
Challenges/ √Root Causes
√ Action Plans:
√ Target Setting/ √Action Planning/ √Progress Monitoring
 Leadership Considerations & Resources
 School Type & Title I
 UIP Timeline & DPS/CDE Resources
Importance of School Type
 See CDE’s Quality Criteria (in the back of the
UIP handbook) to determine what you should
include based on school/plan type.
 School plan types are based on CDE's SPF (and,
in a few minor cases, DPS' SPF) and will be
officially released in Dec. 2012.
 ARE will communicate your preliminary CDE
SPF rating with your IS/DAP/SIP only if you are
a priority improvement or turnaround school.
Title I Requirements
 All Title I DPS schools follow Schoolwide plan
requirements
 Additional Title I requirements for:
 State SPF Priority Improvement Schools
 State SPF Turnaround Schools
 Focus Schools
 DPS Turnaround (TIG)
 ARE will communicate to you your status and
the appropriate Title addenda forms to attach to
your UIP in the fall.
Timelines/Deadlines/Resources
Websites | Contains

DPS UIP: http://testing.dpsk12.org/accountability/UIP/UIP.htm
 Training Materials & Tools
 Timeline
 Templates/Addenda Forms
 UIP Upload Tool

CDE UIP: http://www.cde.state.co.us/uip/index.asp
 Training Materials & Tools
 Templates/Addenda Forms

DPS SPF: http://communications.dpsk12.org/initiatives/school-performance-framework/
 Principal Portal:
  Drill-Down Tool
  Current and previous year's SPF
  Additional reports on TCAP and CELA

CDE SPF: http://www.schoolview.org/performance.asp
 School CDE SPF Results
 Resources
 CDE SPF Rubric

DPS Federal Programs: http://fedprograms.dpsk12.org/
 Title I Status
District & Network Contacts
District Contacts:
 UIP: Brandi Van Horn (ARE), [email protected], 720-423-3640
 SPF: Yen Chau (ARE), [email protected], 720-423-3734
 Federal Programs (Title I): Veronica Bradsby (Federal Programs), [email protected], 720-423-8157
 Assessments: Assessment Coordinators (ARE), http://testing.dpsk12.org/secure/sal_resources/SAL%20Role%20and%20Assessment%20Information.pdf

Network Contacts:
 Instructional Superintendents (IS)
 School Improvement Partner (SIP)
 Data Assessment Partner (DAP)
UIP 101 Principal Boot Camp Presenters

 SPF (ARE): Yen Chau, [email protected], 720-423-3734
 UIP (Elementary Education): Tammy Giessinger, [email protected]
 Data Portals (ARE): Danielle Johnson, [email protected], 720-423-3850
 UIP (ARE): Brandi Van Horn, [email protected], 720-423-3640
Data Websites
• Principal Portal
http://principal.dpsk12.org/
• School Folders
https://secure2.dpsk12.org/schoolfolders/
• CDE
http://www.schoolview.org/
• SchoolNet
https://schoolnet.dpsk12.org/