Decision Making for Results

Part One: Objectives
• Develop a deeper understanding of the
Decision Making for Results: Data-Driven
Decision Making process
• Increase awareness of the relevance of
data and its impact on leadership,
teaching, and learning
• Reinforce the importance of collecting
both cause and effect data
Objectives
• Apply the Decision Making for Results:
Data-Driven Decision Making process
to monitor leadership, teaching, and
learning
• Implement the Decision Making for
Results: Data-Driven Decision Making
process to monitor school
improvement
Principles of Decision Making for Results
Antecedents
Accountability
Collaboration
Seminar Overview
• Introduction
• Building the foundation
• Process and application
• Action planning
Becoming Data Driven
How are you currently embracing a data-driven decision making process that leads to results?
Results-Driven Schools
• Where is the proof?
• 90/90/90 Schools, Reeves, 2003
• Education Trust, 2002
• NCREL, 2000
• Consortium for Policy Research in Education, 2000
• EdSource, 2005
• Northern Illinois University Center for Governmental Studies, 2004
Reflection
“The value of the data emerges only
when analysis provides insights that
direct decisions for students.”
S. White, 2005
Part Two
Building the Foundation
• Cause data and effect data
• Continuous improvement cycle
• Principles and processes of
Decision Making for Results:
Data-Driven Decision Making
“Only by evaluating both causes and
effects in a comprehensive
accountability system can leaders,
teachers, and policymakers understand
the complexities of student achievement
and the efficacy of teaching and
leadership practices.”
Reeves, 2006
Definitions and Examples
Effect data: outcomes or results
Cause data: professional practices that create specific effects or results
The Leadership & Learning Matrix
Vertical axis: Effects/Results (student outcomes); horizontal axis: Antecedents/Cause Data (adult actions)
• Lucky: high results, low understanding of antecedents; replication of success unlikely
• Leading: high results, high understanding of antecedents; replication of success likely
• Losing Ground: low results, low understanding of antecedents; replication of failure likely
• Learning: low results, high understanding of antecedents; replication of mistakes unlikely
PIM
• Planning: needs assessment, inquiry, goals
• Implementation: strategies, professional development, parental involvement
• Monitoring: frequency, evaluation
Part Three:
Process and Application
Ocean View Elementary School
A Look at Collaboration
The Process for Results
Inquiry: develop questions
Step 1: Conduct a Treasure Hunt
Step 2: Analyze data to prioritize needs
Step 3: Establish SMART goals
Step 4: Select specific strategies
Step 5: Determine results indicators
Step 6: Monitor and evaluate results
Inquiry
“Data-driven decision making begins
by asking fundamental questions.”
Doug Reeves
• What questions do you have about
teaching and learning in your
school?
• What data sources are you using to gather that information?
Step 1: Conduct a Treasure Hunt
• Why? To gather and organize data in order to gain insights about teaching and learning practices
• Considerations:
• Measures of data
• Disaggregation
• Triangulation
• Reflection
Measures of Data
• Student learning
• Demographics
• Perceptions
• School processes – behaviors within our control: instructional and leadership strategies, programs and resources, and organization
Disaggregation
• To separate something into its component parts, or break apart
• “Disaggregation is not a problem-solving strategy. It is a problem-finding strategy.”
Victoria Bernhardt, Data Analysis, 1998
Think, pair, share: What data do you disaggregate, and how do you use the information?
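To make the idea concrete, here is a minimal sketch of disaggregation, assuming a hypothetical file of assessment results (scores.csv) with illustrative column names (student_id, subgroup, score); the overall average is broken apart into subgroup averages, where problems can actually be found:

import pandas as pd

# Hypothetical file and column names, for illustration only.
scores = pd.read_csv("scores.csv")  # columns: student_id, subgroup, score

# Aggregate view: a single average can hide large subgroup differences.
print("Overall mean score:", scores["score"].mean())

# Disaggregated view: break the result into its component parts.
by_subgroup = scores.groupby("subgroup")["score"].agg(["count", "mean"])
print(by_subgroup.sort_values("mean"))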
Triangulation
A Look at Learning: DRA, Benchmark, Running Records
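A minimal sketch of triangulating the three reading measures named above, assuming hypothetical files and column names: each source reports a proficiency flag for the same students, and a conclusion is trusted where the measures agree.

import pandas as pd

# Hypothetical files; each holds student_id plus a proficiency flag from one measure.
dra = pd.read_csv("dra.csv")                  # columns: student_id, dra_proficient
benchmark = pd.read_csv("benchmark.csv")      # columns: student_id, benchmark_proficient
running = pd.read_csv("running_records.csv")  # columns: student_id, rr_proficient

# Join the three measures for the same students.
merged = dra.merge(benchmark, on="student_id").merge(running, on="student_id")

# Triangulate: flag students for whom all three measures tell the same story.
flags = merged[["dra_proficient", "benchmark_proficient", "rr_proficient"]]
merged["all_agree"] = flags.nunique(axis=1) == 1
print("Share of students where all three measures agree:", merged["all_agree"].mean())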
Case Study
• Read case study
• Part 1: How did they
categorize the different
data sets and record
their observations?
• Part 2: What did they
discover?
Conduct a Treasure Hunt
Application
1. Review inquiry questions
2. Conduct a “Treasure Hunt”
3. Organize data on templates
4. Use rubric to monitor and evaluate your work
Can You Identify with This?
“It is not so much a lack of data, but an
absence of analysis, and an even
greater absence of actions driven by
the data.”
White, 2005
Step 2
Analyze Data to Prioritize Needs
Data Analysis at Northside Middle School
Analyze Data to Prioritize Needs
• Why? To identify causes for celebration and to identify areas of concern
• Considerations: strengths, needs, behavior, rationale
Quality Prioritization
• Why? To take immediate action on the most
urgent needs
• Quality prioritization requires a thorough
understanding of:
• Student population
• Curriculum and Power/Priority Standards
(leverage, readiness)
• Antecedents affecting student achievement
• Quality of program implementation
White, 2005
Case Study
• Review case study
• What insights did you gain after reading the analysis of student performance?
• Make a recommendation: What is the
most urgent need?
Review, Analyze, and Prioritize
Application
1. Review data from Step 1
2. Conduct analysis using the guiding
questions
3. Prioritize urgent needs using the
suggested criteria
4. Record your work on the templates
5. Use rubric to monitor and evaluate
your work
Step 3
Establish SMART Goals
• Why? To identify our most critical
goals for student achievement based
on the challenges that were identified
through the inquiry process
• Specific, Measurable, Achievable,
Relevant, Timely
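Because a SMART goal is measurable by definition, it can be recorded as data rather than prose. Below is a minimal sketch with entirely illustrative numbers and names: a goal's baseline, target, and deadline are written down once, and any later measurement can be checked against the target.

from dataclasses import dataclass
from datetime import date

@dataclass
class SmartGoal:
    description: str  # Specific and Relevant: what improves, for whom
    baseline: float   # Measurable: starting value (e.g., percent proficient)
    target: float     # Achievable: the intended value
    deadline: date    # Timely: when the target is due

    def met(self, current: float) -> bool:
        # Has the current measurement reached the target?
        return current >= self.target

# Illustrative goal: raise proficiency from 52% to 70% by a June deadline.
goal = SmartGoal("Grade 3 reading proficiency", baseline=52.0,
                 target=70.0, deadline=date(2026, 6, 1))
print(goal.met(61.0))  # False: still short of the target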
Goals – Application
1. Review prioritized needs
2. Review Treasure Hunt baseline data
3. Apply SMART goal formula; use
templates to record your work
4. Use rubric to monitor and evaluate
your work
Share Your Findings with
Colleagues
• Meet in the middle of
the room
• Be prepared to share
your findings from
Steps 1-3
• Highlight one
celebration from a
colleague
Step 4
Select Specific Strategies
Let’s watch Lake Taylor High School as
they discuss strategies.
Select Specific Strategies
• Why? Adult actions will impact student achievement
• Strategies are action-oriented, measurable/accountable, specific, and research-based
• Considerations: instructional, organizational, leadership, programmatic
Research-Based Strategies
• Reeves, D.B. (2003). 90/90/90 schools.
Retrieved from www.LeadandLearn.com
• Reeves, D.B. (2006). Ten things high
schools can do right now to improve
student achievement.
• Learning 24/7 Observation Study (2005).
What’s happening in schools? Or not?
Additional Evidence in Support
of Research-Based Strategies
• Zemelman, S., Daniels, H., & Hyde, A. (2005).
Best practice. Portsmouth, NH: Heinemann.
• Marzano, R. (2007). The art & science of
teaching. Alexandria, VA: ASCD.
• Barr, R., & Parrett, W.H. (2007). The kids left
behind. Bloomington, IN: Solution Tree.
• Marzano, R., Waters, T., & McNulty, B. (2005).
School leadership that works. Alexandria, VA:
ASCD.
Let’s Do It!
Guided Practice
Case Study
• Revisit case study analysis
• What types of strategies (instructional,
organizational, leadership,
programmatic) did they select?
• How will the strategies help students
overcome the obstacles?
Select Your Specific Strategies
1. Revisit your prioritized needs
2. Research the best possible strategies
to meet the learner needs
3. Group by type of strategy:
Instructional, organizational,
programmatic, and leadership
4. Use rubric to monitor and evaluate
your work
Step 5
Determine Results Indicators
Why? To monitor the degree of
implementation and evaluate the
effectiveness of the strategies
Results Indicators
• Considerations:
• Serve as an interim measurement
• Used to determine effective implementation of a strategy
• Used to determine whether a strategy is having the desired impact
• Help to determine midcourse corrections
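A minimal sketch, again with illustrative numbers, of treating a results indicator as an interim measurement: the interim value is compared with the pace needed to reach the SMART goal on time, and a shortfall is flagged as a candidate for midcourse correction.

def needs_midcourse_correction(baseline: float, target: float,
                               months_elapsed: int, months_total: int,
                               interim_value: float) -> bool:
    """Flag a strategy for review when an interim results indicator falls short
    of the pace needed to reach the goal (simple linear-pace assumption)."""
    expected_now = baseline + (target - baseline) * months_elapsed / months_total
    return interim_value < expected_now

# Illustrative check: goal is 52% -> 70% proficient over a 10-month plan;
# four months in, the indicator reads 55%, below the ~59% expected pace.
print(needs_midcourse_correction(52.0, 70.0, months_elapsed=4,
                                 months_total=10, interim_value=55.0))  # True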
Case Study
• Review case study
• How will their results indicators serve
as an interim measurement?
• How clearly will the results indicators
help to monitor implementation and
impact?
Results Indicator Application
1. Revisit strategies (Step 4)
2. Develop results indicators
3. Use rubric to monitor and evaluate
your work
“Improvement cycles require
leadership follow-up and relentless
efforts to maintain the focus on data if
decisions are truly going to be driven
by informed data.”
White, 2005
Step 6
Monitor and Evaluate Results
Why? To engage in a continuous
improvement cycle that –
• Identifies midcourse corrections
where needed
• Adjusts strategies to ensure fidelity of implementation
Case Study
• Review the case study
• How did they monitor strategies?
• Was there any evidence of midcourse
corrections?
Develop Your Monitoring Plan
• Review your work from developing questions to determining results indicators, then determine how you will monitor the strategies. When you create your monitoring plan, consider:
• Teacher or administrator teams
• Monitoring cycles
• Goals
• Strategies
• Impact on student and adult behavior
• Ability to make midcourse corrections
Educators Matter
“Many people live their lives aspiring to make a
difference and lead a life that matters. There
need be no such uncertainty in the life of an
educator or school leader. Every decision we
make, from daily interactions with students to
the most consequential policies at every level
of government, will influence leadership and
learning…
… After all these words, statistical
analyses, and graphs,…
What we do matters.”
Reeves, 2006
Questions and Discussion
Your ideas and reflections are important to us.
Please take time to complete the short
evaluation form that we reviewed at the
beginning of this seminar.
The Leadership and Learning Center
866.399.6019
LeadandLearn.com