Overview of Positive Behavioral Interventions & Supports


Data-Based Decision Making: Evaluating the Impact of SWPBIS
Idaho SWPBIS Training Institute
Purpose
• Define the outcomes for SWPBIS
  o Is SWPBIS related to a reduction in problem behavior?
  o Is SWPBIS related to improved school safety?
  o Is SWPBIS related to improved academic performance?
• Define tools for measuring SWPBIS outcomes
• Examine a problem-solving approach for using ODR data for decision-making
• Provide strategies for using data for decision-making and action planning
Idaho’s Tiered Instructional and Positive Behavioral Interventions and Support (PBIS) Framework

Academic Systems:
• Intensive, Individual Interventions (1-5%): individual students; assessment-based; high intensity
• Targeted Group Interventions (5-10%): some students (at-risk); high efficiency; rapid response
• Universal Interventions (80-90%): all students; preventive, proactive

Behavioral Systems:
• Intensive, Individual Interventions (1-5%): individual students; assessment-based; intense, durable procedures
• Targeted Group Interventions (5-10%): some students (at-risk); high efficiency; rapid response
• Universal Interventions (80-90%): all settings, all students; preventive, proactive
To Improve Schools for Children
• Use evidence-based practices
• Always look for data of effectiveness
• Never stop doing what is working
• Implement the smallest change that will result in the largest improvement
Model of Continuous Improvement
• Measure → Compare → Improvement
• Plan → Do → Check → Act
Improving Decision-Making
• Without information: Problem → Solution
• With information: Problem → Problem-solving → Solution
Problem-solving Steps
1. Define the problem(s)
   a. Analyze the data
2. Define the outcomes and data sources for measuring the outcomes
3. Consider 2-3 options that might work
Problem-solving Steps
4. Evaluate each option
   a. Is it safe?
   b. Is it doable?
   c. Will it work?
5. Choose an option to try
Problem-solving Steps
6. Determine the timeframe to evaluate effectiveness
7. Evaluate the effectiveness by using the data
   a. Is it worth continuing?
   b. Try a different option?
   c. Redefine the problem?
Key Features of Effective Data Systems
• Data are accurate
• Data are very easy to collect
• Data are used for decision-making
• Data are available when decisions need to be made
• Data collectors must see the information used for decision-making
Key Features of Effective Data Systems
Activity: Reflect on your data. What data do you have? Is it:
• Accurate?
• Very easy to collect?
• Used for decision-making?
• Available when decisions need to be made?
• Collected to be used?
Guiding Considerations
• Use accessible data
• Handle data as few times as possible
• Build data collection into daily routines
• Establish and use data collection as a conditioned positive reinforcer
• Share data summaries with those who collect it
Questions to Consider When Collecting Data
1. Is the current approach achieving the intended outcomes?
   o Is the plan working as well as or better than it did last year?
   o Is a change in the plan needed?
   o Do students have the skills to do what is expected?
   o Are the behavioral needs of all students being adequately met?
2. What areas need improvement?
   o Which grade level(s) need additional skills training?
   o What physical areas of the school are perceived as less safe?
   o Which classroom routines do students need to be retaught?
3. Which students need additional support?
   o Which students received two or more ODRs in the first month of school?
   o Which students consistently show signs of emotional distress (e.g., anxiety, depression)?
Young et al., 2012
Collecting data without having a definite
question is like going shopping without a list.
Teams may spend a great deal of effort and time
collecting data and creating systems that do not
give them the answers they really need.
Young et al., 2012
Types of Questions
• Initial Assessment Questions:
o What type or which program do we need?
o Where should we focus our efforts?
• Ongoing Evaluation Questions:
o Is the program working?
o If no:
• Can it be changed?
• Should we end the program?
o If yes:
• Do we need this program anymore?
• What do we need to do to sustain success?
What Data Should Be Collected?
• Always start with the questions you want to answer.
• Collect data that will answer your questions.
• Balance reliability against accessibility.
   o Systems approach
• Consider logistics.
   o Who? When? Where? How?
• Two levels
   o What is readily accessible?
   o What requires extra resources?
When & By Whom Should Data Decisions Be Made?
• Natural cycles, meeting times
   o Weekly, monthly, quarterly, annually
• Level of system addressed
   o Individual: daily, weekly
   o Schoolwide: daily, weekly
   o District/Region
   o State-level
• Teacher, coach, support personnel, paras, secretary
Basic Evaluation Questions by School or Program
1. What does “it” look like now?
2. How would we know if we are successful?
3. Are we satisfied with how “it” looks?
   o YES: Celebrate
   o NO:
      - What do we want “it” to look like?
      - What do we need to do to make “it” look like that?
4. What can we do to keep “it” like that?
Basic SWPBIS Evaluation
Questions
Are our efforts making a difference?
1. Is our school adopting SWPBIS to criterion?
2. Is our school perceived as safe?
3. Are teachers delivering instructional lessons
with fidelity as planned?
4. Is SWPBIS improving student outcomes?
Is SWPBIS Having a Positive
Influence on School Culture?
Using Office Discipline Referral Data
Office Discipline Referrals
• Examine office discipline referral rates and
patterns
o Major Problem Events
o Minor Problem Events
Office Discipline Referrals
• Ask the “BIG 5” questions:
1. How often are problem behavior events occurring?
2. Where are they happening?
3. What types of problem behaviors are occurring?
4. When are the problems occurring?
5. Who is contributing?
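The “BIG 5” questions above are frequency counts over the same referral records, sliced along different dimensions. A minimal sketch, assuming a simple list of ODR records (the field names and sample values here are illustrative, not a real SWIS export):

```python
from collections import Counter

# Hypothetical ODR records; in practice these would come from the school's
# referral form or data system.
referrals = [
    {"student": "A", "behavior": "disruption", "location": "classroom", "time": "10:00"},
    {"student": "B", "behavior": "aggression", "location": "playground", "time": "12:15"},
    {"student": "A", "behavior": "disrespect", "location": "hallway", "time": "10:05"},
]

# How often? — total number of problem behavior events
print("Total events:", len(referrals))

# Where? What? Who? — frequency counts per Big 5 dimension
for field in ("location", "behavior", "student"):
    print(field, Counter(r[field] for r in referrals))

# When? — bucket events by hour of the school day
print("hour", Counter(r["time"].split(":")[0] for r in referrals))
```

The same `Counter` pattern answers each question; only the field being counted changes.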
Office Discipline Referral Caution
• Data reflect three factors:
   o Students
   o Staff members
   o Office personnel
• Data reflect overt rule violators
• Data are useful when implementation is consistent.
Do staff and administration agree on office-managed problem
behavior versus classroom-managed behavior?
Defining MINOR, MAJOR, & CRISIS events in your school

Staff Managed (minors) — consequences are determined by staff:
• Tardy
• Unprepared; no homework/materials
• Violation of classroom expectations
• Inappropriate language
• Classroom disruption
• Minor safety violation
• Lying/cheating

Office Managed (majors):
• Repeated minor behaviors
• Insubordination
• Blatant disrespect
• Abusive/inappropriate language
• Harassment/intimidation
• Fighting/physical aggression
• Safety violations that are potentially harmful to self, others, and/or property
• Vandalism/property destruction
• Plagiarism
• Theft
• Skipping classes
• Illegal behaviors: arson, weapons, tobacco, alcohol/drugs
General Procedure for Dealing with Problem Behaviors
1. Observe the problem behavior.
2. Find a place to talk with the student(s).
3. Is the behavior major?
   o NO (minor): Problem solve with the student, determine the consequence, and follow the documented procedure. File the necessary documentation. Does the student have 3 minors?
      - NO: Follow through with the consequences.
      - YES: Send a referral to the office.
   o YES (major): Ensure safety. Write a referral and escort the student to the office. Problem solve, determine the consequence, and follow the documented procedure. File the necessary documentation.
4. Follow up with the student within a week.
SWIS™ Compatibility Checklist
Procedure for Documenting Office Discipline Referrals
School ___________________________  Date ____________________

Record Yes or No for each compatibility question on each review date (two date columns):
1. Does a clear distinction exist between problem behaviors that are staff managed versus office managed, and is it available for staff reference? (Yes / No)
2. Does a form exist that is SWIS™ compatible for SWIS™ data entry and that includes the following categories? (Yes / No)
   a. Student name? (Yes / No)
   b. Date? (Yes / No)
   c. Time of incident? (Yes / No)
   d. Student’s grade level? (Yes / No)
   e. Referring staff member? (Yes / No)
   f. Location of incident? (Yes / No)
   g. Problem behavior? (Yes / No)
   h. Possible motivation? (Yes / No)
   i. Others involved? (Yes / No)
   j. Administrative decision? (Yes / No)
   k. Other comments? (Yes / No)
   l. No more than 3 extra info fields? (Yes / No)
3. Does a set of definitions exist that clearly defines all categories on the office discipline referral form? (Yes / No)
Next review date: _______________
Redesign your form until the answers to all questions are “Yes.”
Readiness requirements 4 and 5 are complete when you have all “Yes” responses.
Tables versus Graphs

Number of ODRs per Day and Month

Year   Month   Number of Days   Number of Referrals   Average Referrals per Day
2001   Aug     0                0                     0.00
2001   Sep     19               5                     0.26
2001   Oct     21               18                    0.86
2001   Nov     18               17                    0.94
2001   Dec     14               21                    1.50
2002   Jan     22               18                    0.82
2002   Feb     17               15                    0.88
2002   Mar     19               26                    1.37
2002   Apr     21               14                    0.67
2002   May     18               13                    0.72
2002   Jun     11               2                     0.18
2002   Jul     0                0                     0.00
Year Totals:   180              149                   0.83
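The table’s rate column is simply referrals divided by school days for each month, which corrects for months of different lengths. A minimal sketch of that arithmetic, using the month data from the table above:

```python
# (month, number of school days, number of referrals) — values from the table above
months = [
    ("Sep", 19, 5), ("Oct", 21, 18), ("Nov", 18, 17), ("Dec", 14, 21),
    ("Jan", 22, 18), ("Feb", 17, 15), ("Mar", 19, 26), ("Apr", 21, 14),
    ("May", 18, 13), ("Jun", 11, 2),
]

# Average referrals per day = referrals / school days (0 if no school days)
for name, days, referrals in months:
    rate = referrals / days if days else 0.0
    print(f"{name}: {rate:.2f} referrals per day")

# Year totals: 180 days, 149 referrals, 0.83 referrals per day
total_days = sum(d for _, d, _ in months)
total_refs = sum(r for _, _, r in months)
print(f"Year: {total_refs / total_days:.2f} referrals per day")
```

Note that the yearly average (0.83) is computed from the totals, not by averaging the monthly rates, so months with more school days carry more weight.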
Total versus Rate
[Graph: NV High School — total referrals per month, Aug–Jun (y-axis: referrals, 0–70).]
[Graph: NV High School — mean referrals per day per month, Aug–Jun (y-axis: mean referrals per day, 0–5).]
Priorities & Rationale
• Graphs (over tables)
• Rates (over totals)
Interpreting Office Referral Data:
Is there a problem?
• Absolute level (depending on size of school)
o Middle Schools (>5 per day)
o Elementary Schools (>1.5-2 per day)
• Compare levels to last year
o Improvement?
• Trends
o Peaks before breaks?
o Gradually increasing trend across year?
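The absolute-level rule of thumb above can be sketched as a simple check. This is only an illustration of the slide’s thresholds; the function name is ours, and the elementary cutoff uses the lower end (1.5) of the stated 1.5-2 range:

```python
# Rule-of-thumb daily ODR thresholds from the slide; elementary uses the
# conservative lower bound of the stated 1.5-2 range.
THRESHOLDS = {"middle": 5.0, "elementary": 1.5}

def odr_flag(avg_per_day: float, school_type: str) -> bool:
    """Return True if the average daily ODR rate exceeds the rule-of-thumb level."""
    return avg_per_day > THRESHOLDS[school_type]

print(odr_flag(0.83, "elementary"))  # below the 1.5-per-day threshold
print(odr_flag(6.2, "middle"))       # above the 5-per-day threshold
```

A level check alone is not sufficient: as the slide notes, comparisons to last year and within-year trends (peaks before breaks, gradual increases) matter as much as the absolute rate.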
Are Schools Adopting
SWPBIS to Criterion?
• Use the:
o Team Implementation Checklist (TIC 3.1)
o Schoolwide Evaluation Tool (SET)
o EBS Self-Assessment Survey (SAS – “Schoolwide” section)
• Measure and analyze annually
Today we will focus on the TIC 3.1
Team Implementation Checklist
(TIC 3.1)
• Characterizes the evolution of SWPBIS
implementation:
o “Achieved,” “In progress,” or “Not started”
• Assists in:
o Initial assessment
o Getting started on action plan
o Measuring progress of SWPBIS implementation
• Assesses team-based response
o Quarterly or monthly
TIC 3.1 Feature Areas
1. Establish commitment
2. Establish and maintain team
3. Conduct self-assessment
4. Define expectations
5. Teach expectations
6. Establish reward system
7. Establish violations system
8. Establish information system
9. Build capacity for function-based support
10. Ongoing activities
Scoring the TIC 3.1
• Implementation points
   o Achieved = 2
   o In progress = 1
   o Not started = 0
• Percentage of items implemented
   o Total: number of items scored as “Achieved” divided by 17 (items)
   o Subscale scores: number of items in each subscale area scored as “Achieved” divided by the number of items in that subscale area
• Percentage of points implemented
   o Total: total number of points divided by 34
   o Subscale scores: total number of points in each subscale divided by the number of items in that subscale multiplied by 2
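The total-score rules above can be sketched in a few lines. The point values (2/1/0) and the 17-item, 34-point totals come from the slide; the item statuses in the example are made up:

```python
# TIC 3.1 point values and item count, as defined on the slide
POINTS = {"Achieved": 2, "In progress": 1, "Not started": 0}
TOTAL_ITEMS = 17

def tic_totals(item_statuses):
    """Return (% of items implemented, % of points implemented) for a full TIC."""
    achieved = sum(1 for s in item_statuses if s == "Achieved")
    points = sum(POINTS[s] for s in item_statuses)
    pct_items = 100 * achieved / TOTAL_ITEMS          # "Achieved" items / 17
    pct_points = 100 * points / (TOTAL_ITEMS * 2)     # points / 34 possible
    return pct_items, pct_points

# Hypothetical team: 10 achieved, 4 in progress, 3 not started
statuses = ["Achieved"] * 10 + ["In progress"] * 4 + ["Not started"] * 3
items_pct, points_pct = tic_totals(statuses)
```

Note the two percentages can differ: here 10/17 items are fully achieved, but partial credit for “In progress” items raises the points percentage above the items percentage. Subscale scores follow the same pattern, dividing by each subscale’s own item count instead of 17.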
SWPBIS Main Messages
• Invest in prevention
• Create an effective environment
o Leadership, teams; hosts for effective practices
• Use different systems for different problems
o Individual student level alone will be insufficient
o Collaboration with Mental Health Professionals
• Build a culture of competence
o Define, teach, monitor, and reward appropriate behavior
• Build sustainable systems
o Resist person-dependent interventions
• Invest in gathering and using information for decision-making and problem-solving
Action Planning for SWPBIS
• Use your self-assessment information
   o Rally schoolwide commitment
   o Establish a PBIS team
   o Focus on prevention (define, teach, monitor, and reward appropriate behavior)
      - Ask kids tomorrow if they know the expectations
      - Ask kids if they are being acknowledged for appropriate behavior
   o Use the information system to guide implementation efforts
• Build an action plan
   o When will the team meet?
   o What will be reported to faculty?
   o What will be reported to families?
Action Planning for SWPBIS
• Which system are you going to work on?
• What are the specific outcomes?
   o When will they be completed?
   o What short-term activities are needed?
   o Who will be responsible?
• Reporting schedule
   o What information will be gathered and by whom?
   o When will information be reported?
Dirty Data