Data Driven Decision Making
Missouri PBS Summer Institute
June 28 & 29, 2006

Purpose
• Provide guidelines for using data for team planning
• Provide guidelines for using data for ongoing problem solving
• Apply guidelines to examples

Improving Decision Making
• From: Problem → Solution
• To: Problem → Problem Solving → Solution

Key features of data systems that work
• The data are accurate and valid
• The data are very easy to collect (1% of staff time)
• Data are presented in picture (graph) form
• Data are used for decision-making
  – The data must be available when decisions need to be made (weekly?)
  – Difference between data needs at a school building and data needs for a district
  – The people who collect the data must see the information used for decision-making.

Why collect discipline data?
• Decision making
• Professional accountability
• Decisions made with data (information) are more likely to be 1) implemented and 2) effective.

What data to collect for decision making?
Use what you have:
• Attendance
• Suspensions/Expulsions
• Vandalism
• Office discipline referrals/detentions
  – Measure of overall environment. Referrals are affected by 1) student behavior, 2) staff behavior, and 3) administrative context
  – An under-estimate of what is really happening
  – Office referrals per day per month

When should data be collected?
• Continuously
• Data collection should be an embedded part of the school cycle, not something “extra”
• Data should be summarized prior to meetings of decision-makers
• Data will be inaccurate and irrelevant unless the people who collect and summarize it see the data used for decision making.

Organizing Data for “active decision making”
• Counts are good, but not always useful
• To compare across months, use “average office discipline referrals per day per month” (a calculation sketched below)
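
For illustration, a minimal Python sketch of that calculation, assuming you already have a referral count and the number of school days for each month (the numbers and month labels below are hypothetical; SWIS generates this summary automatically):

    # Average office discipline referrals (ODRs) per school day, by month:
    # total referrals in the month divided by the number of school days.
    monthly_referrals = {"Sep": 48, "Oct": 95, "Nov": 82}  # hypothetical counts
    school_days = {"Sep": 20, "Oct": 22, "Nov": 18}        # hypothetical calendar

    for month, count in monthly_referrals.items():
        per_day = count / school_days[month]
        print(f"{month}: {per_day:.2f} ODRs per school day")

Dividing by school days removes the effect of short months (breaks, holidays), so months can be compared fairly.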

Using Data for On-going Problem Solving
• Start with the decision, not the data
• Use data in “decision layers” (Gilbert, 1978)
  – Is there a problem? (overall rate of ODR)
  – Localize the problem (location, problem behavior, students, time of day)
• Don’t drown in the data
• It’s “OK” to be doing well
• Be efficient

Interpreting Office Referral Data: Is there a problem?
• Absolute level (depending on size of school; see the sketch below)
  – Middle, High Schools (>1 per day per 100)
  – Elementary Schools (>1 per day per 250)
• Trends
  – Peaks before breaks?
  – Gradual increasing trend across year?
• Compare levels to last year
  – Improvement?
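
As a rough sketch of applying these cut-points (the function name and enrollment figure are hypothetical, and “per 100 / per 250” is read here as “per 100 / per 250 students enrolled”):

    def odr_above_benchmark(referrals_per_day, enrollment, level):
        # Slide benchmark: more than 1 ODR per day per 100 students (middle/high)
        # or per 250 students (elementary) suggests a school-wide problem.
        per_unit = 100 if level in ("middle", "high") else 250
        return referrals_per_day > enrollment / per_unit

    # Hypothetical 500-student elementary school averaging 2.4 ODRs per day
    print(odr_above_benchmark(2.4, enrollment=500, level="elementary"))  # True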

What systems are problematic?
• Referrals by problem behavior?
  – What problem behavior is most common?
• Referrals by location?
  – Are there specific problem locations?
• Referrals by student?
  – Are there many students receiving referrals or only a small number of students with many referrals?
• Referrals by time of day?
  – Are there specific times when problems occur?
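
A minimal sketch of these breakdowns, assuming referrals are stored as simple records (the field names and sample rows are hypothetical; SWIS provides these reports directly):

    from collections import Counter

    # Hypothetical referral records; real data would come from SWIS or a spreadsheet export.
    referrals = [
        {"behavior": "disrespect", "location": "cafeteria", "student": "S01", "time": "12:10"},
        {"behavior": "disruption", "location": "classroom", "student": "S02", "time": "10:05"},
        {"behavior": "disrespect", "location": "cafeteria", "student": "S01", "time": "12:15"},
    ]

    # Count referrals along each dimension to localize the problem.
    for field in ("behavior", "location", "student", "time"):
        counts = Counter(record[field] for record in referrals)
        print(field, counts.most_common(3))  # most frequent values first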

Designing Solutions
• If many students are making the same mistake, it typically is the system that needs to change, not the students.
• Teach, monitor, and reward before relying on punishment.

Application Exercise
• What is going well?
• Do you have a problem?
• Where?
• With whom?
• What other information might you want?
• Given what you know, what considerations would you have for possible action?

SWIS: School-Wide Information System
• http://www.swis.org
• SWIS Readiness Checklist
• SWIS Compatibility Checklist

Summary
• Transform data into “information” that is used for decision making
• Present data within a process of problem solving
  – Use the trouble-shooting tree logic
  – Big Five first (how much, who, what, where, why)
• Ensure the accuracy and timeliness of data