Rob Horner
University of Oregon
OSEP Center on PBIS
www.pbis.org
Build district capacity to support effective practices.
Classroom Supports for Students
School-wide Systems (curriculum, staff development, coaching, data)
District Capacity (Data Systems, Policies, Hiring, Orientation, Eval)
School-wide Positive Behavioral Interventions and Supports (PBIS)
Build a continuum of supports that begins with the whole school and extends to intensive, wraparound support for individual students and their families.
What is School-wide Positive Behavior Interventions and Supports?
School-wide PBIS is: A systems framework for establishing the social culture and behavioral supports needed for a school to be an effective learning environment for all students.
Horner, Sugai & Anderson (2010). Examining the evidence base for school-wide PBIS. Focus on Exceptional Children, 42(8), 1-14.
Evidence-based features of SW-PBIS
Prevention
Define and teach positive social expectations
Acknowledge positive behavior
Arrange consistent consequences for problem behavior
On-going collection and use of data for decision-making
Continuum of intensive, individual intervention supports
Implementation of the systems that support effective practices
Randomized control trials indicate that SWPBIS is linked to:
(a) Reduction in ODRs,
(b) Improved academic achievement,
(c) Perceived improvement in school safety,
(d) Perceived improvement in teacher efficacy
[Figure: Schools Adopting SWPBIS by Year, 2000-2011 — 14,325 schools adopting School-wide PBIS.]
[Figure: Schools using SWPBIS by state (Feb, 2011) — 11 states with over 500 schools; 3 states with over 1000 schools; labeled states include Illinois, Florida, Texas, and Oregon.]
[Figure: Percent of schools using SWPBIS by state — 3 states > 60%, 6 states > 40%, 10 states > 30%; Oregon highlighted.]
[Figure: Schools adopting SWPBIS by grade level — Elementary K-6, Middle 6-9, High 9-12, K (8-12).]
Findings
SWPBIS is possible (at all grade levels)
SWPBIS is associated with:
20-60% reduction in problem behavior (ODRs)
Increases in academic performance
Perception of school as a safe environment
Improved self-assessment of faculty effectiveness
[Figure: Triangle data by fidelity — Mean percentage of students statewide with majors (ODRs), 2009-10: Fully Implementing schools (n=272) vs. Partially Implementing schools (n=25); reported values include 87% and 81.78% of students with few or no majors, and 4%, 7%, 9%, and 12% at higher referral levels.]
Out of School Suspension per 100 Students Enrolled — National Medians:
Elementary Schools: .22
Middle Schools: .50
High Schools: .68
K (8-12): .42
Elementary School with 150 Students — Compare with National Median:
150 / 100 = 1.50
1.50 × .22 = .33
What is, What can be, What is needed, What is possible
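The comparison above is simple enough to script for any school. A minimal sketch, assuming the metric is a national median rate per 100 students as on the preceding slide; the function name is illustrative, not from the source:

```python
def expected_from_median(enrollment, median_rate_per_100):
    """Scale a national median rate (events per 100 students)
    to a school's enrollment: (enrollment / 100) * rate."""
    return (enrollment / 100) * median_rate_per_100

# Elementary school with 150 students vs. the national elementary median (.22):
# 150 / 100 = 1.50, and 1.50 x .22 = .33 expected events.
expected = expected_from_median(150, 0.22)
```

Comparing this expected value against a school's actual count is the "what is" vs. "what can be" contrast on the slide.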
[Figure: Average Major Discipline Referrals per 100 Students by Cohort — Cohort 1 (n=15), Cohort 2 (n=19), Cohort 3 (n=34), Cohort 4; school years 2004-2005 through 2007-2008.]
[Figure: Percent of Students meeting DIBELS Spring Benchmark for Cohorts 1-4 (Combined Grades), 2003-04 through 2008-09 — Spring '09: 62,608 students assessed in cohorts 1-4 (5,943; 8,330; 16,078; and 32,257 students assessed per cohort).]
[Figure: Percent of Students at DIBELS Intensive Intervention Level across years by Cohort, 2003-04 through 2008-09.]
Quality, Equity, Efficiency
Build capacity to implement effective practices
Focus on student outcomes
Focus on fidelity with which effective practices are used.
Avoid doing too many different things at one time
Stages of implementation
Alignment of district practices: what works, what fits
Are the strategies/practices in the district focused on core student outcomes?
Academic excellence
Behavioral competence
Attendance/graduation
Health and safety
Are the strategies/practices in the district a good fit with the students/families/faculty/staff of the district?
Does this build on what we already do well?
Do we actually know how to do this?
Are we comfortable doing this practice?
Stages of Implementation
Implementation occurs in stages:
Exploration
Installation
Initial Implementation
Full Implementation
Innovation
Sustainability
Implementation is a repeating process.
Fixsen, Naoom, Blase, Friedman, & Wallace, 2005
2 – 4 Years
[Diagram: Core Implementation Drivers (© Fixsen & Blase, 2008) — a Program/Initiative/Framework supported by Performance Assessment (Fidelity), Coaching, Training, Selection, Systems Intervention, Facilitative Administration, and a Decision Support Data System, with Adaptive and Technical Leadership at the base, leading to Successful Student Outcomes.]
Lessons Learned
Avoid “Initiative Overload” by aligning efforts for improvement
All initiatives tied to core outcomes
All initiatives are “evidence-based”
All initiatives have proven implementation effectiveness and efficiency (e.g. at least 50 schools in Oregon)
All initiatives define the “systems” needed for sustainability
All initiatives have efficient measures of fidelity
[Diagram: ALIGNMENT (© Dean Fixsen, Karen Blase, Robert Horner, George Sugai, 2008) — district initiatives (Literacy, Math, Early Math Intervention, Equity, Wraparound, Family Support, Behavior Support, Response to Intervention/Prevention, Primary Prevention, Early Intervention, Universal Screening, Multi-tiered Support, Progress Monitoring) aligned around Student Outcomes, with systems to support practices.]
Establishing a Universal System of Support
1. Effective and Efficient Foundation Practices
1. Effective Curriculum
2. Unambiguous Instruction
3. Adequate intensity
4. Reward System
5. Error Correction System
2. Universal Screening
6. Collect information on all students at least twice a year
7. Use data for decision-making
[Figure: Cumulative Mean ODRs Per Month for 325+ Elementary Schools, 2008-09 (Jennifer Frank, Kent McIntosh, Seth May) — cumulative mean ODRs from Aug through Jun for students ending the year with 0-1, 2-5, and 6+ ODRs; students with 2 or more ODRs can be identified early in the year. The SSBD is used in Illinois.]
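Universal screening with ODR data is often operationalized as simple count cutoffs like the 0-1 / 2-5 / 6+ groups in the figure above. A minimal sketch of such a decision rule, assuming per-student referral counts are available; the function name and tier assignments are illustrative, not a prescribed SWIS procedure:

```python
def screen_by_odrs(odr_counts):
    """Group students by office discipline referral (ODR) count,
    using 0-1 / 2-5 / 6+ cutoffs as rough support-tier indicators."""
    tiers = {"0-1": [], "2-5": [], "6+": []}
    for student, count in odr_counts.items():
        if count <= 1:
            tiers["0-1"].append(student)   # universal support
        elif count <= 5:
            tiers["2-5"].append(student)   # candidate for targeted support
        else:
            tiers["6+"].append(student)    # candidate for intensive support
    return tiers

# Example: year-to-date ODR counts for four students
groups = screen_by_odrs({"A": 0, "B": 2, "C": 7, "D": 1})
```

Running such a rule monthly, rather than once a year, is what lets teams identify the 2+ ODR group early.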
3. Continuum of Evidence-based Practices
8. Targeted interventions for students “at risk”
9. Intensive, individualized interventions for students with more significant needs
10. Early Intervention
4. Progress Monitoring
11. Collection of data on a monthly, weekly, or daily rate
12. Use of data for decision-making
5. Fidelity Monitoring
13. Assessing the extent to which we are implementing what we claim to implement
14. Use of the data for decision-making
[Figure: Iowa Checklist 2001-05, PK-6 — % fully and partially implemented on the Team Checklist across seven start-up cohorts (Start Up Full Implementation vs. Start Up Part Implementation), with start-up dates from 2002 through 2005.]
Team Initiated Problem Solving (TIPS) Model
Identify Problems → Develop Hypothesis → Discuss and Select Solutions → Develop and Implement Action Plan → Evaluate and Revise Action Plan, with Collect and Use Data at the center of the cycle, built on Problem Solving Meeting Foundations.
Newton, J. S., Todd, A. W., Algozzine, K., Horner, R. H., & Algozzine, B. (2009). The Team Initiated Problem Solving (TIPS) Training Manual. Educational and Community Supports, University of Oregon, unpublished training manual.
District policy
Clear statement of values, expectations, outcomes
Ability to conduct universal screening and progress monitoring assessments
District provides efficient options for universal screening and progress monitoring measures
Recruitment and hiring
Expectations defined in job announcements
Annual faculty orientation
Professional development
Focused strategies for staff development in core skills
Always train teams, not individuals
Match training with access to coaching support
Coaching Capacity
Training linked to on-site assistance to implement
Competent Implementation
OUTCOMES: % of participants who demonstrate knowledge, demonstrate new skills in a training setting, and use new skills in the classroom (Joyce and Showers, 2002):

TRAINING COMPONENTS                  Knowledge   Skill Demonstration   Use in the Classroom
Theory and Discussion                   10%              5%                    0%
…+ Demonstration in Training            30%             20%                    0%
…+ Practice & Feedback in Training      60%             60%                    5%
…+ Coaching in Classroom                95%             95%                   95%
Annual evaluations
Expectations assessed as part of annual evaluations
Recruitment of individuals with training, coaching, and implementation skills
Advanced skills in literacy supports
Advanced skills in behavior supports
Summary
Fiscal constraints create opportunities
Efficient improvement through integration and collaboration
Implement practices that are evidence-based
Implement practices with the systems needed for sustainability and impact.
Emphasize measuring for improvement, not just “accountability” or “compliance”
Are we doing what we said we would do?
Are practices benefiting students?