New Mexico Principal Support Network


Helping Leaders Use Accountability Data Effectively
Presented By:
Beata I. Thorstensen, NM Office of Education Accountability
Jan Sheinker, Sheinker Education Services
Judy Englehart, Aztec Municipal Schools
Tania Prokop, Aztec Municipal Schools
Presented At: CCSSO’s Education Leaders Conference, September 12-14, 2007, Hilton St. Louis at the Ballpark
NM Principal Support Network:
• Began in 2005 with 40 principals & assistant principals in 8 districts.
• Expanded in 2006 to superintendents, principals, assistant principals and district staff from 28 districts.
• Expanding again in 2007 to 43 districts.
• Purpose: to provide comprehensive assessment data, data analysis tools and professional development on data-based decision making.
• Funded through the generous support of The Wallace Foundation.
What the PSN Does:
• Provides a network of training and support for assistant principals, principals, superintendents and other school district leaders.
• Focuses on the analysis, interpretation and use of high-stakes accountability data.
• Goal: to help educational leaders use data for school improvement, communication and advocacy.
PSN Curriculum: Data Analysis and Use
• Analyzing high-stakes assessment data to pinpoint areas for school improvement.
• Communicating data to school boards, superintendents, teachers, students and the community.
• Using evidence in conjunction with best practices for data-based decision making.
PSN Curriculum: Data-Based Decision Making
• Providing best-practice information on the facets of data-based decision making.
• Informing comprehensive school improvement plans utilizing New Mexico’s Educational Plan for Student Success (EPSS).
PSN Curriculum: Connecting with Peers
Members work with peers both within their district and outside of their district to:
• Analyze student data
• Build comprehensive school improvement plans
• Share promising practices for interventions
Outcomes of PSN: Data Use
• Used to communicate data with district staff, teachers and the community.
• Used to facilitate visits with school improvement teams from the Public Education Department.
• Used to inform the development of the statewide data warehouse reporting system.
• Used to inform decisions about instructional/curriculum interventions.
PSN Membership & Format
• Superintendents, principals, assistant principals and other district data staff.
• Focus on Schools in Need of Improvement.
• 3-4 meetings per year.
Excel Pivot Tables for District & School Level Data Analysis: Assessing Student Performance
Beyond the Report Card: Allowing Drill-Down to Student Level
Example of Student Level Drill-Down
Example of Data Display: Simple Bar Chart
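
The slides above were live Excel demonstrations, and the screenshots are not reproduced in this transcript. As a rough analogue of the same pivot-and-drill-down workflow, here is a minimal Python/pandas sketch. All records, school names, and column names are invented for illustration and do not reflect the actual NMSBA file layout.

```python
import pandas as pd

# Hypothetical student-level assessment records (illustrative only).
scores = pd.DataFrame({
    "school":      ["Vista MS", "Vista MS", "Vista MS", "Mesa ES", "Mesa ES", "Mesa ES"],
    "grade":       [7, 7, 8, 4, 4, 5],
    "subgroup":    ["ELL", "Non-ELL", "ELL", "ELL", "Non-ELL", "Non-ELL"],
    "student_id":  [101, 102, 103, 201, 202, 203],
    "scale_score": [628, 671, 645, 612, 660, 655],
    "proficient":  [0, 1, 0, 0, 1, 1],
})

# District-level view: percent proficient by school and subgroup,
# the kind of summary an Excel pivot table produces.
summary = pd.pivot_table(
    scores,
    values="proficient",
    index="school",
    columns="subgroup",
    aggfunc="mean",
) * 100
print(summary.round(1))

# Drill-down: the individual students behind one cell of the pivot table.
detail = scores[(scores["school"] == "Vista MS") & (scores["subgroup"] == "ELL")]
print(detail[["student_id", "grade", "scale_score", "proficient"]])
```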
Logging into NM DBDM
Surfing NM DBDM
Start Anywhere
Purpose of DBDM for PSN
A tool for leaders (helping their schools):
• Multiple entry points for differentiated support
• Specific, real-time explanations, examples, links, school stories, and resources
• On-going job-embedded assistance
• Continuous monitoring, collaboration, and feedback
• Building a continuous data-based school improvement system into routine practice
Establish a school improvement team
What is a school improvement team?
How do we establish a school improvement team?
Who is on our team? What do we do?
How does the school improvement team make time to do its work?
Develop a hypothesis
What information does our school or district need to make decisions that will improve student achievement?
How is our school doing compared to the standard?
Develop a hypothesis
What information does our school or district need to make decisions that will improve student achievement?
Baseline – What do you already know about your school?
• What learning strengths and weaknesses are evident in the school data?
• Which subgroups of students are having difficulty learning?
• What instructional changes might improve student learning in the areas of weakness?
• What professional development is needed to improve student learning in the areas of weakness?
• What materials and equipment are needed to support changes in instruction?
How is our school doing compared to the standard?
Baseline – What do we think we know about how we are doing?
• Overall school results.
• Disaggregated by gender.
• Disaggregated by disability status.
• Disaggregated by ethnicity.
• Disaggregated by English proficiency.
• Disaggregated by income status.
• Disaggregated by migrant status.
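
As a quick illustration of the disaggregation step above, here is a hedged Python/pandas sketch that computes an overall proficiency rate and then the same rate broken out by each subgroup column. The records and the column names (gender, disability, ell, low_income) are hypothetical.

```python
import pandas as pd

# Hypothetical student records; the demographic columns mirror the
# subgroups listed above but the values are invented.
df = pd.DataFrame({
    "proficient": [1, 0, 1, 1, 0, 0, 1, 1],
    "gender":     ["F", "M", "F", "M", "F", "M", "F", "M"],
    "disability": ["N", "Y", "N", "N", "Y", "N", "N", "N"],
    "ell":        ["Y", "Y", "N", "N", "Y", "N", "N", "N"],
    "low_income": ["Y", "N", "Y", "N", "Y", "Y", "N", "N"],
})

# Overall result, then the same measure disaggregated by each subgroup.
print("Overall % proficient:", 100 * df["proficient"].mean())
for col in ["gender", "disability", "ell", "low_income"]:
    print(f"\nBy {col}:")
    print((100 * df.groupby(col)["proficient"].mean()).round(1))
```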
Gather data to assess needs
What are the most useful sources of student data?
Why use multiple measures?
What are the most useful sources of direct student achievement data?
What are the most useful sources of indirect student achievement data?
What are the most useful sources of subgroup student achievement data?
What are the most useful sources of demographic data?
How do context variables impact the validity of our interpretation?
What do we have? What do we need?
Gathering Data
Direct Measures:
• NMSBA – from pivot tables
• Other assessments – DIBELS, short-cycle, etc.
• Teacher-made assessments
Indirect Measures:
• Attendance and graduation rates
• Information about curriculum
Demographics:
• Ethnicity and race proportions
• Gender proportions
• Socio-economic percentages
• Language status proportions
• Disability status percentages
• Migrant status proportions
Other?
• Subgroups
• Class size
• Teacher training
• Student mobility
Use data
How do we organize the data to help us answer important questions?
What do different sources tell us?
What do different displays tell us?
How do we display the data?
What patterns exist in the data?
How do we present data to the school and examine it?
What are the tests designed to measure?
Is there confirmation across data?
How should we present data and conclusions to the school community?
How do we formulate data-based goals?
Does our interpretation raise new questions?
What is our level of confidence in our interpretation?
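
For the display questions above ("How do we display the data?", "What do different displays tell us?"), a simple bar chart like the one shown earlier in the deck can make subgroup gaps visible at a glance. A minimal matplotlib sketch; the percent-proficient figures and the target line are invented for illustration.

```python
import matplotlib.pyplot as plt

# Invented percent-proficient figures for one school (illustrative only).
subgroups = ["All", "ELL", "Students w/ disabilities", "Low income"]
pct_proficient = [54, 31, 28, 42]

fig, ax = plt.subplots()
ax.bar(subgroups, pct_proficient)
ax.axhline(50, linestyle="--", label="Target (hypothetical)")
ax.set_ylabel("% proficient")
ax.set_title("Reading proficiency by subgroup")
ax.legend()
plt.tight_layout()
plt.show()
```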
Formulating Data-Based Goals
When preparing to set goals based on the data, schools clarify the results to determine the area of greatest concern for setting one or two important goals for improvement. The school:
• uses the information about differences in achievement across content areas (reading, writing, mathematics) to pinpoint the goal for improvement
• uses the clarification of specific standards (basic reading, reading comprehension, math computation, geometry, math problem solving) or benchmarks to help plan the strategies and interventions
The goals for improvement are:
• specific to the content area of greatest concern (reading, writing, mathematics)
• sometimes related to strategies for improving specific standards (reading for comprehension, use of literature and media) or benchmarks (reading for information, reading strategies, literature)
Does our interpretation raise new questions?
• Which specific standards or benchmarks are students farthest from achieving? Impact of benchmarks on overall standard.
• Which specific subgroups of students are failing to achieve the standards? Impact of a benchmark on scores for all subgroups.
• How have past changes affected student performance? Data over time.
What is our level of confidence in our interpretation?
• Error of measurement: distance from cutpoints; reliability of scores/proficiency classification.
• Different scores on two reading tests: differences in what is measured, in how it is measured, and in degree of alignment with standards.
• Different results for state and local tests: differences between SBA and short-cycle results, in specificity, and in sample size.
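
To make the measurement-error point concrete, here is a small sketch of the kind of check involved: whether a proficiency cutpoint falls inside an approximate 95% band (roughly ±2 SEM) around an observed scale score. The score, SEM, and cut score below are invented; real values come from the assessment's technical documentation.

```python
# Hypothetical values for illustration only.
scale_score = 648
sem = 12           # standard error of measurement
cut_score = 655    # proficiency cutpoint

# An approximate 95% band around the observed score is about +/- 2 SEM.
low, high = scale_score - 2 * sem, scale_score + 2 * sem
print(f"95% band: {low}-{high}; cut score: {cut_score}")

if low <= cut_score <= high:
    print("Cutpoint falls inside the band: this student's "
          "proficiency classification is uncertain.")
else:
    print("Cutpoint is outside the band: the classification is more dependable.")
```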
Develop a data-based plan
What must be considered when setting data-based goals?
How do we set data-based goals?
How can additional data help us identify the interventions we need?
How do we select interventions?
How do we select interventions for targeted subgroups?
How do we plan to include parents in interventions?
What staff development and support are necessary?
How does our plan impact our budget?
What is our time line?
What assignments are necessary?
Develop a Plan
• Is the goal clear to everyone?
• Do strategies/activities address specific data-based needs?
• Are the activities specific and sequential?
• Are persons responsible stated (who will do what, when, how)?
• Are resources specifically stated?
• Are the selected interventions scientifically research-based/research-proven?
• Are task and evaluation deadlines clearly stated?
• Do specific professional development activities support full implementation of the strategies and interventions?
• Are activities to involve parents and community specific to the strategies and interventions to be implemented?
• Is evidence of completion specifically identified?
Monitor progress and document success
How do we monitor implementation of the plan?
How do we use data to monitor progress toward our goals?
How do we know if we made the right decisions?
How do we use data to document success in meeting goals?
What should we report to the public?
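
As one concrete way to "use data to monitor progress toward our goals," here is a short pandas sketch comparing short-cycle results across testing windows against a plan target. The windows, percentages, and the 50% goal are hypothetical.

```python
import pandas as pd

# Hypothetical short-cycle results across three testing windows.
windows = pd.DataFrame({
    "window":         ["Fall", "Winter", "Spring"],
    "pct_proficient": [38.0, 44.5, 51.0],
})
goal = 50.0  # illustrative end-of-year target from the improvement plan

# Gain from window to window, and whether each window meets the goal.
windows["gain"] = windows["pct_proficient"].diff()
windows["met_goal"] = windows["pct_proficient"] >= goal
print(windows)
```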
Other information about purpose and use
Explanations and search tools:
• overview
• “how to” guide for various audiences
• glossary of terms
• annotated bibliography of school improvement publications
• key word search
• on-line data and school improvement resources
• examples of school, district, and state data use
Links to State customized version:
• State school improvement documents
Password-protected school accounts:
• for tracking progress on improvement implementation
• for on-time interactions with state support teams, district support personnel, and technical assistance providers
Avoiding Pitfalls: Common Reasons Why Plans Fail
• The plan is never fully implemented.
• Timelines are not met for each activity.
• Interventions are not evident in all classrooms.
• Tasks are not all completed on time.
• Resources are not acquired or deployed in accordance with the plan.
• Next steps are not articulated.
New Mexico Principal Support Network
Data-Based Decision Making website found at: http://www.edvantia.org/dbdm
Office of Education Accountability
New Mexico Department of Finance & Administration
Contact Information: 505-476-1070
www.nmdfa.state.nm.us