Denver Public Schools
Unified Improvement
Planning 102
August 15, 2012
Announcement:
Training about Inside & Edge
In Spring 2012, DPS adopted NEW materials for middle and high school ELLs:
• Inside materials for Middle School
• Edge materials for High School
Training: Saturday, August 25, 9am to
3:30pm at Del Pueblo.
Please ask Teachers to register via SchoolNet.
Why Our Work Matters
The following slides provide global, national
and local context for the work in which
we’re engaged.
[Chart: Percent of Adults with an Associate Degree or Higher by Age Group (25-34, 35-44, 45-54, 55-64), U.S. & Leading OECD Countries: Canada, Japan, Korea, New Zealand, Ireland, Belgium, Norway, France, Denmark, U.S.]
Source: OECD, Education at a Glance 2008
[Chart: Change in Population Age 25-44 by Race/Ethnicity, 2005-2025]
Source: U.S. Census Bureau
[Chart: Difference in Educational Attainment Between Whites and Hispanics (2006, Percent)]
Source: U.S. Census Bureau, 2006 American Community Survey (ACS) Public Use Microdata Sample (PUMS) File. Via NCHEMS
Status Gaps - Ethnicity: TCAP Reading (% Proficient or Above)
[Bar chart: 2011 vs. 2012 percent proficient or above for the District and by race/ethnicity (American Indian, Asian, Black, Hispanic, Pacific Islander, Two or More Races, White)]
• The Reading achievement gap between White students and American Indian, Asian, and Two or More Races students decreased slightly from 2011 to 2012. All other gaps remained relatively unchanged.
• Every subgroup within DPS experienced growth on TCAP Reading from 2011 to 2012.
Status Gaps - Ethnicity: TCAP Math (% Proficient or Above)
[Bar chart: 2011 vs. 2012 percent proficient or above for the District and by race/ethnicity (American Indian, Asian, Black, Hispanic, Pacific Islander, Two or More Races, White)]
• The Math achievement gap between White students and Black, Hispanic, and Pacific Islander students increased from 2011 to 2012. All other gaps remained relatively unchanged.
• 5 of the 7 subgroups experienced growth on TCAP Math from 2011 to 2012.
Status Gaps - Ethnicity: TCAP Writing (% Proficient or Above)
[Bar chart: 2011 vs. 2012 percent proficient or above for the District and by race/ethnicity (American Indian, Asian, Black, Hispanic, Pacific Islander, Two or More Races, White)]
• With the exception of Black students, the Writing achievement gap between White students and all other ethnic groups decreased slightly from 2011 to 2012.
• 6 of the 7 subgroups experienced growth on TCAP Writing from 2011 to 2012.
Status Gaps - Ethnicity: TCAP Science (% Proficient or Above)
[Bar chart: 2011 vs. 2012 percent proficient or above for the District and by race/ethnicity (American Indian, Asian, Black, Hispanic, Two or More Races, White)]
• The Science achievement gap increased between American Indian and White students, but decreased between White and Asian and Two or More Races students.
• 5 of the 6 subgroups experienced growth on TCAP Science between 2011 and 2012.
* The Pacific Islander population is <16 and is therefore not included.
Warming Up
UIP Scavenger Hunt
State Feedback on DPS UIPs
• The next 4 slides summarize the feedback received from CDE on the DPS priority improvement and turnaround school UIPs submitted in January 2012.
• All feedback listed in the slides was common among
most of the DPS UIPs.
• Feedback in bold font indicates that almost all priority
improvement and turnaround schools received this type
of feedback.
• Feedback NOT in bold font indicates that about half or
slightly less than half of the priority improvement and
turnaround schools received this type of feedback.
State Feedback on DPS UIPs
Data Analysis
• Trend/data analysis did not drill down far enough (i.e., disaggregation by grade level, subgroups, cohorts, etc.)
• Data are listed, but there is no description of what the data tell us
• Exclusion of strategic focus on subgroups when data show subgroups underperforming
• Deeper root cause analysis needed: ask more "why" questions
• Too many priority performance challenges (CDE recommends 3 to 4)
• Trends concentrated on one content area or performance indicator
• Too many root causes
State Feedback on DPS UIPs
Data Narrative
• Does not describe the processes used to
prioritize the challenges or to identify and
verify root causes
• Data analysis lacks coherence:
 Performance challenges do not relate directly to the data
 Contradictions b/n data narrative and data in trend
section
 No connection between the data and root cause
State Feedback on DPS UIPs
Target Setting
• Exclusion of strategic focus on subgroups
• Some schools did not mention how often they will examine interim measures
• Some schools did not list interim measures: "Smart Goal Success" is not an interim measure
State Feedback on DPS UIPs
Action Planning
• More detail needed in action steps (e.g., what does "individualized PD and support" mean?)
• Implementation benchmarks should include "how schools will measure completion and effectiveness," "when," and "by whom":
▫ Current implementation benchmarks:
 "Prior to each unit being taught."
 "Weekly via homework assignments and via parent conferences."
 "Teacher sign-in sheets."
▫ Example according to CDE critical quality criteria:
 "100% of teachers will attend training in Oct 2012. Attendance will be measured by a sign-in sheet at training. Principal walkthroughs with a rubric will provide evidence of use of new training skills Nov-Dec 2012."
• Plan only addresses 1 year
• Total budget does not provide an exact dollar amount
• Major improvement strategies and action steps listed are insufficient to increase student performance
New UIP State Requirements/Updates
Pre-Populated Template & Data Analysis
• Adequate Yearly Progress (AYP) is no longer a
part of school or district accountability; metrics
removed from UIP
• CELApro growth including median growth
percentiles and median adequate growth percentiles
added to SPF/DPF and UIP Template.
• Disaggregated graduation rates added to
SPF/DPF and UIP Template.
New UIP State Requirements/Updates
• English Language Proficiency –
• CELA median student growth percentiles and median adequate growth
percentiles in state SPF
• If this is a priority performance challenge, targets must be set in this area and reflected in the UIP template.
• Disaggregated Graduation Rates –
• Grad rates for disaggregated groups in state SPF
• If this is a priority performance challenge for the school/district, targets
must be set in this area and reflected in the UIP template.
• Disaggregated Student Achievement – The state has established
guidelines for setting targets for disaggregated student group performance.
Starting in 2011-12, schools and districts should consider this guidance in
establishing targets for student academic achievement.
These changes impact every section of the UIP.
Consider Prior Year’s
Performance
Review Prior Year’s Performance (Worksheet #1)
 List targets set for last year
 Denote the following:
 Whether the target was met or not
 How close the school/district was to meeting the
target; and
 To what degree does current performance support
continuing with current major improvement
strategies and action steps (NEW Addition)
New UIP State Requirements/Updates
Data Narrative:
A Repository
School A has ….
TCAP      2007  2008  2009  2010  2011
Reading     38    41    45    41    38
Writing     28    29    30    31    29
Math        18    19    40    40    39
Science     16    15    14    17    20
UIP Data and Information
• Description of School & Data Analysis Process
• Review Current Performance
• Trend Analysis: Worksheets 1 & 2
• Action Plans: Target Setting/Action Planning Forms
New UIP State Requirements/Updates
Data Narrative
• More Guidance Provided
• It serves as a repository for everything you do in the UIP
process.
Elements to Include in the Data Narrative:
Description of the School Setting and Process for Data Analysis
Review Current Performance
 State & Federal Accountability Expectations
 Progress Towards Last Year’s Targets
Trend Analysis
Priority Performance Challenges
Root Cause Analysis
Throughout the school year, capture the following in the data narrative:
Progress Monitoring (Ongoing)
New UIP State Requirements/Updates
UIP is a 2-Year Plan
The plan and following elements should
cover 2 academic years:
 Targets
 Major Improvement Strategies
 Associated Action Steps
Digging Into Your UIP
• Description of Notable Trends
• Priority Performance Challenges
• Root Cause Analysis
Locate The “Exemplar” Activity
• You have three different color sheets at
your table.
• Review each of them and determine which
one meets the critical quality criteria.
• Discuss your selection with your
colleagues.
• Take a few seconds to compare the
“exemplar” with your own UIP (status
reading data/trend statements).
Step 2: Identify Trends
• Include all performance indicator areas.
• Identify indicators* where the school did not at
least meet state and federal expectations.
• Consider data beyond that included in the school
performance framework (i.e., local data).
• Include positive and negative performance
patterns.
* Indicators listed on pre-populated UIP template include: status, growth, growth gaps and postsecondary/workforce readiness
Writing Trend Statements:
1. Identify the measure/metrics.
2. Describe for which students (grade level and
disaggregated group).
3. Include at least three years of data (ideally 5
years).
4. Describe the trend (e.g., increasing, decreasing,
flat).
5. Identify for which performance indicator the trend
applies.
6. Determine if the trend is notable and describe why.
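As a rough illustration of these steps, the mechanical part of a trend statement can be sketched in code. This is only a sketch: the 2-point "flat" band is an assumption, not a DPS or CDE rule, and the metric, subgroup, and expectation in the example are taken from the ELL writing illustration that follows.

```python
# Illustrative sketch of the trend-statement steps; the 2-point "flat"
# band and the example values are assumptions, not DPS/CDE rules.

def describe_trend(values, flat_band=2.0):
    """Label a series of yearly values as increasing, decreasing, or flat."""
    change = values[-1] - values[0]
    if abs(change) <= flat_band:
        return "flat"
    return "increasing" if change > 0 else "decreasing"

def trend_statement(metric, group, years, values, expectation):
    """Assemble a draft trend statement from steps 1-5."""
    trend = describe_trend(values)
    met = "meeting" if values[-1] >= expectation else "falling below"
    series = " to ".join(str(v) for v in values)
    return (f"The {metric} of {group} went from {series} between "
            f"{years[0]} and {years[-1]} ({trend}), {met} the minimum "
            f"expectation of {expectation}.")

print(trend_statement("median growth percentile in writing",
                      "English Language Learners",
                      [2009, 2010, 2011], [28, 35, 45], 45))
```

Step 6, deciding whether the trend is notable and why, still requires judgment and is left to the writer.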
Examples of Notable Trends
• The percent of 4th grade students who scored proficient
or advanced on math TCAP/CSAP declined from 70% to
55% to 48% between 2009 and 2011 dropping well below
the minimum state expectation of 71%.
• The median growth percentile of English Language Learners in writing increased from 28 to 35 to 45 between 2009 and 2011, meeting the minimum expectation of 45 and exceeding the district trend over the same time period.
• The dropout rate has remained relatively stable (15, 14,
16) and much higher than the state average between
2009 and 2011.
Disaggregating Data
Review current performance by school level & grade (example report: TCAP ALL Tests), then by sub-group (example comparison: ELL vs. Non-ELL TCAP Reading).

Is there a difference among groups? (Example: more than a 10% difference.)
• Yes: Analyze more data to understand the differences. Examples of data to review: CELA trajectory, continuously enrolled students, frameworks. Then follow the steps for Prioritizing Performance Challenges.
• No: STOP! Step back and look for patterns in performance, denote that the performance challenge spans across sub-groups, and follow the steps for Prioritizing Performance Challenges.
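The yes/no branch above amounts to a simple comparison. A minimal sketch in Python, assuming the "more than 10%" example rule; the subgroup names and scores are made up for illustration:

```python
# Sketch of the subgroup-gap decision rule; the 10-point threshold is the
# example trigger named above, and the scores below are invented.

def gap_flags(pct_proficient, threshold=10.0):
    """Compare each subgroup to every other; return gaps above the threshold."""
    flags = []
    groups = list(pct_proficient)
    for i, a in enumerate(groups):
        for b in groups[i + 1:]:
            gap = abs(pct_proficient[a] - pct_proficient[b])
            if gap > threshold:
                flags.append((a, b, gap))
    return flags

scores = {"ELL": 38.0, "Non-ELL": 53.0}
if gap_flags(scores):
    print("Yes: analyze more data to understand the differences.")
else:
    print("No: look for patterns spanning sub-groups.")
```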
List of Subgroups
Typical groupings:
• Grade Levels
• Race/Ethnicity (R/E)
• Free/Reduced Lunch (FRL) Status
• English Language Learner (ELL) Status
• Students with Disabilities (SPED)
• Gender
Analyzing Trends: Keep in Mind…
• Be patient and hang out in uncertainty
• Don't try to explain the data
• Observe what the data actually show
• No "because" statements yet
A path through the data…
1. Review the SPF/DPF Report to identify where performance did not at least meet expectations (federal/state/local).
2. Select one content area on which to focus.
3. Consider performance (achievement/growth) by grade level for 3+ years; within grade levels, consider achievement by standard/sub-content area.
4. Consider performance by disaggregated group by grade level for 3+ years; disaggregate groups further; look across groups.
5. Consider cross-content area performance (3+ years).
6. Consider PWR metrics over 3+ years.
Priority Performance Challenges

Priority performance challenges ARE…
• Specific statements (notable trend statements) about performance
• A strategic focus for the improvement efforts
• About the students

Priority performance challenges are NOT…
• What caused or why we have the performance challenge
• Action steps that need to be taken
• About the adults
• Concerns about budget, staffing, curriculum, or instruction
Priority Performance Challenges
Examples
• The percent of fifth grade students scoring proficient or
better in mathematics has declined from 45% three years
ago, to 38% two years ago, to 33% in the most recent
school year.
• For the past three years, English language learners
(making up 60% of the student population) have had
median growth percentiles below 30 in all content areas.
• Math achievement across all grade-levels and all
disaggregated groups over three years is persistently less
than 30% proficient or advanced.
Priority Performance Challenges: Non-Examples
• To review student work and align proficiency levels to
the Reading Continuum and Co. Content Standards
• Provide staff training in explicit instruction and
adequate programming designed for intervention needs.
• Implement interventions for English Language Learners
in mathematics.
• Budgetary support for para-professionals to support
students with special needs in regular classrooms.
• No differentiation in mathematics instruction when
student learning needs are varied.
What is a Root Cause?
• Root causes are statements that describe the deepest
underlying cause, or causes, of performance
challenges.
• They are the causes that, if dissolved, would result in
elimination, or substantial reduction, of the
performance challenge(s).
• Root causes describe WHY the performance
challenges exist.
• They are things we can change and need to change.
• They are the focus of our major improvement strategies.
• They are about adult actions.
Steps in Root Cause Analysis
1. Focus on a performance challenge (or closely related performance challenges).
2. Consider External Review results (or categories).
3. Generate explanations (brainstorm).
4. Categorize/classify explanations.
5. Narrow (eliminate explanations over which you have no control) and prioritize.
6. Deepen thinking to get to a "root" cause.
7. Validate with other data.
Root Cause Activity
Priority Performance Challenge:
The percent of English Language Learners (74% of
students) scoring proficient or above in
mathematics has declined from 45% three years
ago, to 38% two years ago, to 33% in the most
recent school year.
•Work with your table partners to determine the potential
root causes for the performance challenge.
•Follow steps on previous slide.
(7 minutes)
Levels of Root Causes
Incident or Procedural:
• The student, the test, the incident, the teacher
Programmatic:
• Instructional process, alignment, scheduling, training and staff development, materials, curriculum/assessment
Systemic:
• Organizational structure; allocation of staff; mission, vision, values and beliefs; collaboration; school culture; budget; planning; policies
From Priority Performance Challenge to Root Cause……

Systemic level:
• Priority Performance Challenge: There has been an overall decline in academic achievement in reading, writing and math from 2007-2011 for all grades (K-8).
• Root Cause: The RTI process has not been implemented with fidelity.

Programmatic level:
• Priority Performance Challenge: There has been a decline in achievement in reading, math and writing for 4th grade from 2007-2011.
• Root Cause: Instruction has not been differentiated to meet the needs of the 4th grade student.

Incident or Procedural level:
• Priority Performance Challenge: 4th graders did not demonstrate the multiplication concept on the last Everyday Math exam.
• Root Cause: The teachers did not teach multiplication concepts in the last EDM unit.
Important to Verify Root Cause
Ask the key questions for identifying whether a
cause is a root cause:
1. Would the problem have occurred if the cause had not been present?
2. Will the problem reoccur if the cause is corrected or dissolved?
Make any final revisions to your root cause
explanation as needed.
Preuss, P. (2003). Root Cause Analysis: School Leaders Guide to Using Data to
Dissolve Problems, Larchmont, NY: Eye on Education.
Verify Root Causes (Examples)
Priority Performance Challenge: The % of proficient/advanced students in reading has been substantially above state expectations in 3rd grade but substantially below and stable (54%, 56%, 52%) in 4th and 5th for the past three years.

Possible Root Cause: K-3 is using new teaching strategies; 4-5 are not.
• Questions to explore: What strategies are primary vs. intermediate teachers using?
• Data sources: Curriculum materials and instructional plans for each grade.
• Validation: K-3 strategies are different from 4-5.

Possible Root Cause: Less time is given to direct reading instruction in 4-5.
• Questions to explore: How much time is devoted to reading in primary vs. intermediate grades?
• Data sources: Daily schedule in each grade level.
• Validation: No evidence that less time is devoted to reading in 4-5.
Break & Reflection Time
Break Time: 10 minutes
How to set annual performance targets…
1. Focus on a priority performance challenge.
2. Review state or local expectations.
3. Determine the progress needed in the first two years.
4. Determine the timeframe (max 5 years).
5. Describe annual targets for two years.
(Action Planning Tools, p. 9)
Minimum Expectations for Target Setting (2011 Data)

CDE Expectations:
• Academic Achievement: Reading 72%, Math 71%, Writing 54%, Science 48% Proficient/Advanced
• Academic Growth and Academic Growth Gaps: A median growth percentile (MGP) of 55 if MGP is less than the Adequate Growth Percentile, and 45 otherwise.
• Postsecondary and Workforce Readiness: Graduation rate at or above 80%, drop-out rate at or below the state average, and Colorado ACT composite score at or above the state average.

DPS Expectations:
• Academic Achievement: Reading 50%, Math 50%, Writing 40%, Science 30% Proficient/Advanced
• Academic Growth and Academic Growth Gaps: A median growth percentile (MGP) at or above 50.
• Postsecondary and Workforce Readiness: Measures only apply to high schools.
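To make the achievement minimums concrete, here is a small sketch that checks a school's status data against them. The school's scores are invented for illustration; the thresholds are the 2011 figures listed above.

```python
# Sketch of comparing a school's % proficient/advanced to the CDE and DPS
# minimum achievement expectations; the school's scores are invented.

CDE_MIN = {"Reading": 72, "Math": 71, "Writing": 54, "Science": 48}
DPS_MIN = {"Reading": 50, "Math": 50, "Writing": 40, "Science": 30}

def unmet(pct_prof_adv, expectations):
    """Return content areas where % proficient/advanced is below the minimum."""
    return [area for area, floor in expectations.items()
            if pct_prof_adv.get(area, 0) < floor]

school = {"Reading": 52, "Math": 43, "Writing": 41, "Science": 31}
print("Below CDE minimums:", unmet(school, CDE_MIN))  # all four areas
print("Below DPS minimums:", unmet(school, DPS_MIN))  # Math only
```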
DPS District Accountability
 DPS’ goal is to close or significantly reduce the
academic achievement and postsecondary
readiness gaps between DPS and the state by 2015.
DPS’ SPF is used to evaluate school performance
as schools pursue the district’s goal.
 School specific targets have been established that
represent each school’s share of the district’s goal
annually through 2015. Schools should strive to
achieve these targets at minimum; however, some
schools will need to set higher targets.
Academic Achievement Targets
Activity:
Did You Meet Your DPS Targets?
1. Compare your current TCAP data to the 2012 targets that were set for your school in all 4 content areas.
▫ In which content area(s) did you meet the target?
▫ In which content area(s) did you not meet the target?
2. Which content areas need to be a focus for the ’12-’13
school year?
3. How will you use your current performance data to set
targets for the ’12-’13 school year?
Guiding Questions for Target Setting
• Is the target aligned to the Priority Performance
Challenge?
• How will subgroups be addressed within the target?
• What will be the amount of progress you expect to make
in the next 2 years?
• What implications will the current target have for
meeting the 5 year goal?
Interim Measures
 Once annual performance targets are set for the
next two years, schools must identify interim
measures, or what they will measure during the
year, to determine if progress is being made
towards each of the annual performance targets.
 Interim measures should be based on local
performance data that will be available at least
twice during the school year.
Interim Measures
• Examples of Interim Measures:
▫ District-level Assessments: Benchmarks/Interims,
STAR, SRI
▫ School-level Assessments: End of Unit Assessments,
DIBELS
• Measures, metrics and availability should be specified in
the School Goals Form.
• Remember that the Interim Measures need to align with
Priority Performance Challenges. Disaggregated groups
should be included as appropriate.
Examples of Interim Measures
• The percentage of all students scoring
Proficient/Advanced on the DPS Writing Interim
assessment will increase by a minimum of 10 percentage
points from the Fall administration to the Spring
administration.
• The percentage of students identified as
Proficient/Advanced on the “Pathway to Performance on
CSAP” report will increase by 5% from the October to
the December STAR assessment administration.
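The first interim-measure example above reduces to simple arithmetic: did the percentage rise by at least the required number of percentage points between administrations? A sketch, with hypothetical fall and spring scores:

```python
# Sketch of checking a fall-to-spring interim target; the 10-point minimum
# gain echoes the example above, and the scores are hypothetical.

def met_interim_target(fall_pct, spring_pct, min_gain=10.0):
    """True if the fall-to-spring gain meets the target in percentage points."""
    return (spring_pct - fall_pct) >= min_gain

print(met_interim_target(fall_pct=34.0, spring_pct=46.0))  # True: +12 points
print(met_interim_target(fall_pct=34.0, spring_pct=41.0))  # False: +7 points
```

Note the measure is stated in percentage points, not a relative percent increase; the code reflects that.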
Interim Measures: Non-Examples
• Evidence of academic language in classrooms.
• Data teams will analyze SMART goal results.
• Analyzing student writing samples.
• Vertical team and grade level meetings to assess
writing.
Major Improvement Strategies
• Respond to root causes of the performance
problems the school/district is attempting to
remedy.
• Action steps are smaller activities that fit within
larger major improvement strategies.
Examples of Major
Improvement Strategies
• Increase student achievement in the area of Literacy
with an emphasis on Hispanic and ELL students by
ensuring implementation of the Denver Literacy Plan
with fidelity, differentiating based on data, and use of
effective interventions.
• Teachers will identify gaps and next steps for students
reading below grade level and systematically provide
differentiated instruction to address needs and close
gaps in Reading achievement.
Action Steps
1. Timeline
• Should include the 2012-2013 and 2013-2014 school years.
• Should include specific months.
2. Key Personnel
• Consider who is leading each step and their capacity.
3. Resources
• Include the amount of money, time, and source.
• Consider resources other than money.
Implementation Benchmarks
• Directly correspond to the action steps.
• Are something that a school/district leadership
team could review periodically.
• Should help teams adjust their plans – critical to
a cycle of continuous improvement.
Implementation Benchmarks
• Implementation Benchmarks are. . .
▫ how schools will know major improvement strategies
are being implemented;
▫ measures of the fidelity with which action steps are
implemented; and
▫ what will be monitored.
• Implementation Benchmarks are NOT:
▫ Performance measures (assessment results).
Activity: Implementation Benchmarks
• For each action step below, brainstorm a possible Implementation Benchmark.

Action Step 1: Develop a data team structure with consistent expectations and data analysis.
Implementation Benchmark(s): ______

Action Step 2: Teachers will provide differentiated core instruction during the literacy block through implementation of the Workshop model.
Implementation Benchmark(s): ______
Progress Monitoring
• Consider:
▫ What performance data will be used to monitor
progress towards annual targets? How will you check
throughout the year that your strategies and actions
are having an impact on student outcomes?
▫ What process data will be used to monitor progress
towards annual targets? How will you check
throughout the year that your strategies and actions
are having an impact on adult actions?
Expectations for Tracking
Progress
• 3 times per year you will need to do a formal
data review of your UIP to assess your progress
and the impact you are having on students.
• You should have monthly check points to review
progress on your action steps – perhaps as part
of SLT and/or data team meetings.
• Peer Review will happen between Oct 1st and Oct 15th, with trios of principals meeting to review one another's plans.
Tracking Your Success
UIP Tracker 2012-2013
School: ______
For each Major Improvement Strategy (#1, #2, #3), list its Action Steps and Implementation Benchmark(s), then record the Evidence of Implementation and the data that demonstrate student progress at each review:
• 1st SLT Review by 11/3/12
• 2nd SLT Review by 1/18/13
• 3rd SLT Review by 5/24/13
Many Lenses for Review
PRIORITIES
• Help our educators grow
• Shift our teaching practices
with students
• Improve the outcomes of
linguistically diverse students
• Differentiate support to schools
VALUES
• Integrity
• Students First
• Collaboration
• Equity
• Accountability
• Fun
Another Lens
New Standards
MATH
• Focus
• Coherence
• Rigor
LITERACY
• content-rich nonfiction
and informational texts
• evidence from texts
• increasingly complex
texts and academic
vocabulary
ELA
• Social and Instructional
language
• The language of Language
Arts
• The language of Mathematics
• The language of Science
• The language of Social Studies
UIP Timeline Dates 2012-13

Cycle 1
• 10/1-10/12: UIP Peer Review
• 10/15: Initial 2012-13 UIP due
• 10/21-11/15: Formal Review with SLT & IS

Cycle 2
• Nov: District SIAC reviews
• 12/16: 2012-13 UIP due
• 1/8-2/11: Mid-Year Review with IS

Cycle 3
• Jan-Apr: Monthly Implementation Reviews
• Feb: Develop updated UIP for 2012-13 (update budget, scheduling, etc.)
• Late Feb: DPS to receive feedback from CDE on UIPs
• 4/3: FINAL DPS UIPs due (made public)
• 5/6-6/2: End of Year Review with IS

CDE Submissions*
• 1/15: Turnaround & Priority UIPs to CDE
• 4/15: All UIPs to CDE
*Priority improvement & turnaround schools will be submitted to CDE for review.
Upcoming UIP Timeline: Cycle 1 (Sept-Dec 2012)
• Sept: DPS SPF data available
• Early Oct: UIP Peer Review
• 10/15: Initial 2012-13 UIP due
• 10/21-11/15: Formal Review with SLT & IS
• Nov: District SIAC reviews
• 12/16: 2012-13 UIP due*
*Priority improvement & turnaround schools will be submitted to CDE for review.
Resources

DPS UIP (http://testing.dpsk12.org/accountability/UIP/UIP.htm)
Contains: Training Materials & Tools; Timeline; Templates/Addenda Forms; UIP Upload Tool

CDE UIP (http://www.cde.state.co.us/uip/index.asp)
Contains: Training Materials & Tools; Templates/Addenda Forms

DPS SPF (http://communications.dpsk12.org/initiatives/school-performance-framework/)
Contains: Principal Portal, with a Drill-Down Tool, current and previous year's SPF, and additional reports on TCAP and CELA

CDE SPF (http://www.schoolview.org/performance.asp)
Contains: School CDE SPF Results; Resources; CDE SPF Rubric

DPS Federal Programs (http://fedprograms.dpsk12.org/)
Contains: Title I Status