Is PBIS Evidence-based?
George Sugai, OSEP Center on PBIS, University of Oregon; Center for Behavioral Education & Research, University of Connecticut. August 5, 2008. www.cber.org | www.pbis.org | [email protected]
Purpose: Is PBIS an Evidence-based Practice?
• What is PBIS?
• How is evidence-based determined?
• What is PBIS evidence?
www.pbis.org
Horner, R., & Sugai, G. (2008). Is school-wide positive behavior support an evidence-based practice? OSEP Technical Assistance Center on Positive Behavioral Interventions and Support.
http://www.pbis.org/files/101007evidencebase4pbs.pdf
Evidence Basics
Why evidence-based?
• Maximize outcomes
• Minimize harm
• Increase accountability
• Increase efficiency
• Improve decision making
• Improve resource use
Basic Approach
• Start w/ what has the greatest likelihood (evidence-based) of addressing the confirmed problem/question
– Explained/supported conceptually/empirically
• Adapt to local context/culture/need
• Monitor regularly & adjust based on data
• Adapt for efficient & durable implementation
4 Evaluation Criteria
• Effectiveness – Has/will the practice produce(d) the desired outcome?
• Efficiency – What are the costs (time, resources, $) to implement the practice?
• Relevance – Are the practice & outcomes appropriate for the situation?
• Conceptual soundness – Is the practice based on theory?
Basic Practices Evaluation (decision flow)
1. Start: Review questions & data on a regular basis.
2. Does a problem exist? If yes, specify features of the need/problem; if no, continue regular review.
3. Identify a practice that addresses the need/problem.
4. Is the practice research based? If no, is evidence of effectiveness available? If neither, consider another practice.
5. Can the practice be adapted? If no, consider another practice.
6. Implement & monitor effects.
7. Is adequate progress observed? If yes, improve efficiency & sustainability of practice implementation; if no, return to regular review of questions & data.
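The decision flow above can also be read as a simple review loop. Below is a minimal sketch in Python, assuming a school team supplies a yes/no answer at each decision point; ask() and every other name here are hypothetical illustrations, not part of the original slides.

# Minimal sketch of the Basic Practices Evaluation decision flow above.
# ask() simply stands in for a team's yes/no judgment at each decision point.

def ask(question):
    return input(question + " (y/n) ").strip().lower().startswith("y")

def basic_practices_evaluation():
    while True:  # review questions & data on a regular basis
        if not ask("Does a problem exist?"):
            continue
        print("Specify features of the need/problem.")
        print("Identify a practice that addresses the need/problem.")
        if not ask("Is the practice research based?") and \
           not ask("Is evidence of effectiveness available?"):
            print("Consider another practice.")
            continue
        if not ask("Can the practice be adapted?"):
            print("Consider another practice.")
            continue
        print("Implement & monitor effects.")
        if ask("Is adequate progress observed?"):
            print("Improve efficiency & sustainability of implementation.")
        # either way, loop back to regular review of questions & data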
Design Questions
• Has functional or cause-effect relationship been demonstrated & replicated?
• Have alternative explanations been accounted & controlled for?
• Have threats or weaknesses of methodology been controlled for?
• Was the study implemented w/ fidelity/accuracy?
Research Designs
• Experimental – RCT & SSR (randomized controlled trial & single-subject research)
• Evaluation – Descriptive w/ baseline
• Case Study – Descriptive w/o baseline
• Testimonial – No/limited data
Results Questions
• Who were the subjects? – How much like my participants?
• Where was the study conducted? – How much like where I work?
• What measures were used? – Do I have similar data?
• What outcomes were achieved? – Are the expected outcomes similar?
Effectiveness Logic
• Significance ("believe") – Likelihood of the same effect by chance
• Effect Size ("strength") – Size of effect relative to business as usual
• Consequential Validity ("meaning") – Contextually meaningful
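To make the "strength" idea concrete, here is a minimal Python sketch computing one common standardized effect size (Cohen's d) for an intervention group versus business as usual; the sample numbers are invented purely for illustration and are not from the studies cited in this deck.

import statistics

def cohens_d(intervention, comparison):
    """Standardized mean difference: size of effect relative to business as usual."""
    m_i, m_c = statistics.mean(intervention), statistics.mean(comparison)
    s_i, s_c = statistics.stdev(intervention), statistics.stdev(comparison)
    n_i, n_c = len(intervention), len(comparison)
    # pooled standard deviation across the two groups
    s_pooled = (((n_i - 1) * s_i ** 2 + (n_c - 1) * s_c ** 2) / (n_i + n_c - 2)) ** 0.5
    return (m_i - m_c) / s_pooled

# Invented example: ODRs per 100 students per day in two sets of schools
pbis = [0.25, 0.31, 0.28, 0.35, 0.27]
usual = [0.36, 0.41, 0.33, 0.39, 0.44]
print(round(cohens_d(pbis, usual), 2))  # negative d here means fewer ODRs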
SWPBS/PBIS
CONTINUUM OF SCHOOL-WIDE INSTRUCTIONAL & POSITIVE BEHAVIOR SUPPORT
• Primary Prevention: school-/classroom-wide systems for all students, staff, & settings (~80% of students)
• Secondary Prevention: specialized group systems for students with at-risk behavior (~15%)
• Tertiary Prevention: specialized individualized systems for students with high-risk behavior (~5%)
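As a worked illustration of these proportions, the short Python sketch below assumes a hypothetical school of 500 students (the enrollment figure is invented; the tier percentages are the ones on the slide above).

# Rough tier sizes implied by the ~80% / ~15% / ~5% continuum
enrollment = 500                       # hypothetical school
primary = round(0.80 * enrollment)     # ~400 students: school-/classroom-wide supports
secondary = round(0.15 * enrollment)   # ~75 students: specialized group supports
tertiary = round(0.05 * enrollment)    # ~25 students: individualized supports
print(primary, secondary, tertiary)    # 400 75 25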
Basics: 4 PBS Elements
• Outcomes – Supporting social competence & academic achievement
• Practices – Supporting student behavior
• Data – Supporting decision making
• Systems – Supporting staff behavior
GENERAL IMPLEMENTATION PROCESS: Team, Agreements, Data-based Action Plan, Implementation, Evaluation
SWPBS Subsystems: School-wide, Classroom, Non-classroom, Family, Student
School-wide
1. Common purpose & approach to discipline
2. Clear set of positive expectations & behaviors
3. Procedures for teaching expected behavior
4. Continuum of procedures for encouraging expected behavior
5. Continuum of procedures for discouraging inappropriate behavior
6. Procedures for on-going monitoring & evaluation
Non-classroom
• Positive expectations & routines taught & encouraged
• Active supervision by all staff
– Scan, move, interact
• Precorrections & reminders
• Positive reinforcement
Classroom
• Classroom-wide positive expectations taught & encouraged
• Classroom routines & cues taught & encouraged
• Ratio of 6-8 positive to 1 negative adult-student interactions
• Active supervision
• Redirections for minor, infrequent behavior errors
• Frequent precorrections for chronic errors
• Effective academic instruction & curriculum
Individual Student
• Behavioral competence at school & district levels
• Function-based behavior support planning
• Team- & data-based decision making
• Comprehensive person-centered planning & wraparound processes
• Targeted social skills instruction & self-management
• Individualized instructional & curricular accommodations
Family
• Continuum of positive behavior support for all families
• Frequent, regular positive contacts, communications, & acknowledgements
• Formal & active participation & involvement as equal partner
• Access to system of integrated school & community resources
PBS Systems Implementation Logic
• Leadership team provides active & integrated coordination
• Supports: funding, visibility, political support
• Functions: training, coaching, evaluation
• Local school teams/demonstrations
PBIS Evidence Base
VIOLENCE PREVENTION?
Features of effective prevention:
• Positive, predictable school-wide climate
• High rates of academic & social success
• Formal social skills instruction
• Positive, active supervision & reinforcement
• Positive adult role models
• Multi-component, multi-year school-family-community effort
Sources:
• Surgeon General's Report on Youth Violence (2001)
• Coordinated social, emotional, & academic learning (Greenberg et al., 2003)
• Center for Study & Prevention of Violence (2006)
• White House Conference on School Violence (2006)
90-School RCT Study (Horner et al., in press)
• Schools that receive technical assistance from typical support personnel implement SWPBS with fidelity
• SWPBS implemented with fidelity is associated with
– Low levels of ODRs (.29/100/day vs. national mean of .34)
– Improved perception of safety
– Reduced risk factor of the school
– Increased proportion of 3rd graders who meet the state reading standard
RCT Project Target (Bradshaw & Leaf, in press)
• PBIS schools (21, vs. 16 comparison schools) reached & sustained high fidelity
• PBIS increased all aspects of organizational health
• Positive effects/trends for student outcomes
– Fewer ODRs (majors + minors)
– Fewer ODRs for truancy
– Fewer suspensions
– Increasing trend in % of students scoring in advanced & proficient range of state achievement test
[Two charts: "Elem With School-wide PBS" (13 schools) & "Elem Without School-wide PBS" (6 schools). 4J School District, Eugene, Oregon: change in the percentage of students meeting the state standard in reading at grade 3 from 97-98 to 01-02 for schools using PBIS all four years & those that did not.]
Central Illinois Elem, Middle Schools Triangle Summary 03-04
              Met SET (N = 23)   Not Met SET (N = 12)
0-1 ODRs      84%                58%
2-5 ODRs      11%                22%
6+ ODRs       5%                 20%
SWPBS schools are more preventive.
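The 0-1 / 2-5 / 6+ bands in this summary are simply counts of how many ODRs each student accrues. Below is a minimal Python sketch of how such a triangle summary could be tallied; the referral counts in the example are invented, and the function name is illustrative.

from collections import Counter

def triangle_summary(odrs_per_student):
    """Percent of students falling in the 0-1, 2-5, and 6+ ODR bands."""
    def band(n):
        return "0-1" if n <= 1 else "2-5" if n <= 5 else "6+"
    counts = Counter(band(n) for n in odrs_per_student)
    total = len(odrs_per_student)
    return {b: round(100 * counts[b] / total) for b in ("0-1", "2-5", "6+")}

# Invented example data: ODR counts for ten students
print(triangle_summary([0, 0, 1, 0, 3, 0, 2, 7, 0, 1]))  # {'0-1': 70, '2-5': 20, '6+': 10}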
National ODR/ISS/OSS
July 2008
Grade Range   # Schools   # Students    # ODRs
K-6           1,756       781,546       423,647
6-9           476         311,725       414,716
9-12          177         161,182       235,279
Total         2,409       1,254,453     1,073,642
(The original table also reports in-school & out-of-school suspension events, rates per 100 students, days, & expulsions by grade range.)
ODR rates vary by level (% of students, July 2, 2008)
School Level   0-1 ODRs   2-5 ODRs   6+ ODRs
K-6            89%        8%         3%
6-9            77%        15%        8%
9-12           74%        16%        9%
[Stacked bar chart, July 2, 2008: % of major ODRs by school level (K-6, 6-9, 9-12), broken out by students with 0-1, 2-5, & 6+ ODRs. A few kids get many ODRs.]
[Chart: Bethel School District office discipline referrals by grade level (K-12), 2001-02 through 2007-08.]
SWIS Summary 07-08 (July 2, 2008): 2,717 schools; 1,377,989 students; 1,232,826 major ODRs
Grade Range   # Schools   Mean Enrollment   Mean ODRs/100/school day (std dev)   Approx. rate
K-6           1,756       445               .35 (.45)                            ~1/300 students/day
6-9           476         654               .91 (1.40)                           ~1/100 students/day
9-12          177         910               1.05 (1.56)                          ~1/105 students/day
K-(8-12)      308         401               1.01 (1.88)                          ~1/100 students/day
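The "mean ODRs/100/school day" metric and the "about 1 per N students per day" reading in the table follow from simple arithmetic. A minimal Python sketch using the K-6 rate from the table above; the helper name is illustrative.

def odrs_per_100_per_day(total_odrs, enrollment, school_days):
    """Mean office discipline referrals per 100 students per school day."""
    return total_odrs / (enrollment / 100) / school_days

# Invert a reported rate into "about 1 ODR per N students per school day"
rate_k6 = 0.35                    # K-6 mean from the SWIS 07-08 summary above
students_per_odr = 100 / rate_k6  # ~286, i.e. roughly 1 per 300 students per day
print(round(students_per_odr))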