
Florida’s PS/RtI Project: Evaluation of Efforts to
Scale Up Implementation
Just Read, Florida! Leadership Conference
July 1, 2008
Jose Castillo, M.A.
Michael Curtis, Ph.D.
George Batsche, Ed.D.
Presentation Overview
• Florida PS/RtI Project Overview
• Evaluation Model Philosophy
• Evaluation Model Blueprint
• Examples of Data Collected
• Preliminary Outcomes
The Vision
• 95% of students at “proficient” level
• Students possess social and emotional
behaviors that support “active” learning
• A “unified” system of educational services
– One “ED”
State Regulations Require
• Evaluation of the effectiveness of core instruction
• Evidence-based interventions in general
education
• Repeated assessments in general education
measuring rate changes as a function of
intervention
• Determination of RtI
Response to Intervention
• RtI is the practice of (1) providing high-quality
instruction/intervention matched to student needs and
(2) using learning rate over time and level of
performance to (3) make important educational
decisions (Batsche et al., 2005).
• Problem-solving is the process that is used to develop
effective instruction/interventions.
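The "learning rate over time and level of performance" language maps onto the dual-discrepancy decision rule commonly described in the RtI literature: a student is flagged when both the level of performance and the rate of growth fall below expectations. Below is a minimal Python sketch of that logic; the benchmark level, expected weekly growth rate, and example scores are illustrative assumptions, not Project parameters.

```python
def dual_discrepancy(scores, weeks, benchmark_level=40, expected_rate=1.0):
    """Flag a student as discrepant on both level and rate of learning.

    scores: progress-monitoring scores (e.g., weekly CBM words correct)
    weeks:  week numbers on which each score was collected
    benchmark_level and expected_rate are illustrative cutoffs, not
    values taken from the FL PS/RtI Project.
    """
    n = len(scores)
    # Ordinary least-squares slope = observed rate of improvement per week
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    slope = (sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
             / sum((w - mean_w) ** 2 for w in weeks))
    low_level = scores[-1] < benchmark_level   # below expected level
    low_rate = slope < expected_rate           # growing slower than expected
    return low_level and low_rate              # discrepant on both criteria

# Example: a student monitored weekly for 8 weeks
print(dual_discrepancy([18, 20, 21, 23, 24, 24, 26, 27], list(range(1, 9))))
```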
What Does “Scaling Up”
Mean?
• What is the unit of analysis?
  – Building
  – District
  – State
  – Region
  – Nation
• Scaling up cannot be considered without considering “portability”
Portability
• Student mobility rate in the United States is significant
  – 33% of elementary students, 20% of middle school students (NAEP)
• Impact on Data
  – Different assessment systems/databases may limit portability
• Impact on Interventions
  – What if interventions used for 2-3 years are not “available” in the new district or state?
• Portability of systems MUST be considered in any realistic scaling-up process
Brief FL PS/RtI Project
Description
Two purposes of PS/RtI Project:
– Statewide training in PS/RtI
– Evaluate the impact of PS/RtI on
educator, student, and systemic
outcomes in pilot sites implementing the
model (FOCUS TODAY)
Scope of the Project
• PreK-12 (Current focus = K-5)
• Tiers 1-3
• Reading
• Math
• Behavior
FL PS/RtI Project: Where
Does It Fit?
• Districts must develop a plan to guide
implementation of their use of PS/RtI
• State Project can be one component of the
plan
• It cannot be THE plan for the district
• Districts must own their implementation process, integrating existing elements and initiating new ones
Pilot Site Overview
• Through competitive application process
– 8 school districts selected
• 40 demonstration schools
• 33 matched comparison schools
• Districts and schools vary in terms of
– Geographic location
– Student demographics
– Districts: 6,200 – 360,000 students
• School, district and Project personnel work
collaboratively to implement PS/RtI model
Demonstration Districts
Pilot Site Overview (cont’d)
• Training, technical assistance, and support provided to
schools
– Training provided by 3 Regional Coordinators using same
format as statewide training
– Regional Coordinators and PS/RtI coaches (one for every three pilot schools) provide additional guidance/support to districts and schools
• Purpose = program evaluation
– Comprehensive evaluation model developed
– Data collected from/on:
• Approximately 25-100 educators per school
• Approximately 300-1200 students per school
Year 1 Focus
Understanding the Model
Tier 1 Applications
Three-Tiered Model of School Supports - Tier I Focus

Academic Systems
• Tier 1: Core Curriculum (80-90% of students)
  – All students
  – Reading: Houghton Mifflin
  – Math: Harcourt
  – Writing: Six Traits of Writing; Learning Focus Strategies
• Tier 2: Strategic Interventions (5-10% of students)
  – Students who don’t respond to the core curriculum
  – Reading: Soar to Success, Leap Frog, CRISS strategies, CCC Lab
  – Math: Extended Day
  – Writing: Small Group, CRISS strategies, and “Just Write Narrative” by K. Robinson
• Tier 3: Comprehensive and Intensive Interventions (1-5% of students)
  – Individual students or small groups (2-3)
  – Reading: Scholastic Program, Reading Mastery, ALL, Soar to Success, Leap Track, Fundations

Behavioral Systems
• Tier 1: Universal Interventions (80-90% of students)
  – All settings, all students; preventive, proactive strategies
  – School-wide rules/expectations; positive reinforcement system (Tickets & 200 Club)
  – School-wide consequence system; school-wide social skills program
  – Data (discipline, surveys, etc.); professional development (behavior)
  – Classroom management techniques; parent training
• Tier 2: Targeted Group Interventions (5-10% of students)
  – Some students (at-risk)
  – Small group counseling; small group parent training (behavior & academic)
  – Bullying Prevention Program; FBA/BIP
  – Classroom management techniques; professional development; data
• Tier 3: Intensive Interventions (1-5% of students)
  – Individual counseling; FBA/BIP; Prevent, Teach, Reinforce (PTR)
  – Assessment-based; intense, durable procedures
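One way to make the tier percentages concrete: universal screening scores can suggest which tier of support each student likely needs. The Python sketch below uses hypothetical cutoff values; in the model itself, tier placement is a problem-solving team decision, not a single score.

```python
def suggest_tier(screening_score, benchmark=40, intensive_cutoff=20):
    """Suggest a starting tier from a universal screening score.

    benchmark and intensive_cutoff are hypothetical values for
    illustration; real cutoffs depend on the measure and grade level.
    """
    if screening_score >= benchmark:
        return 1  # core instruction is meeting this student's needs
    if screening_score >= intensive_cutoff:
        return 2  # supplemental, strategic intervention
    return 3      # comprehensive, intensive intervention

scores = [55, 42, 38, 18, 47, 25, 61, 12]
tiers = [suggest_tier(s) for s in scores]
for t in (1, 2, 3):
    pct = 100 * tiers.count(t) / len(tiers)
    print(f"Tier {t}: {pct:.0f}% of students")
```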
Change Model
Consensus
Infrastructure
Implementation
Stages of Implementing
Problem-Solving/RtI
• Consensus
  – Belief is shared
  – Vision is agreed upon
  – Implementation requirements understood
• Infrastructure Development
  – Problem-Solving Process
  – Data System
  – Policies/Procedures
  – Training
  – Tier I and II intervention systems (e.g., K-3 Academic Support Plan)
  – Technology support
  – Decision-making criteria established
• Implementation
Training Curriculum
• Year 1 training focus for schools
– Day 1 = Historical and legislative pushes toward implementing the
PSM/RtI Model
– Day 2 = Problem Identification
– Day 3 = Problem Analysis
– Day 4 = Intervention Development & Implementation
– Day 5 = Program Evaluation/RtI
• Considerable attention during Year 1 trainings is focused on
improving Tier I instruction
Evaluation Model
Difference Between Evaluation & Research
• Research: “Prove” (higher certainty, lower relevance)
• Evaluation: “Improve” (lower certainty, higher relevance)
Working Definition of
Evaluation
• The practice of evaluation involves the systematic collection of information about the activities, characteristics, and outcomes of programs, personnel, and products for use by specific people to reduce uncertainties, improve effectiveness, and make decisions with regard to what those programs, personnel, or products are doing and affecting (Patton).
Data Collection Philosophy
• Data elements selected that will best answer Project evaluation
questions
– Demonstration schools
– Comparison schools when applicable
• Data collected from
– Existing databases
• Building
• District
• State
– Instruments developed by the Project
• Data derived from multiple sources when possible
• Data used to drive decision-making
– Project
– Districts
– Schools
FL PS/RtI Evaluation
Process
• Program Goals
• Needs Assessment
• Program Planning
• Service Delivery
• Process Evaluation
• Formative Evaluation
• Summative or Outcome Evaluation
FL PS/RtI Evaluation Model
• Input-Process-Output (IPO) model used
• Variables included
– Levels
– Inputs
– Processes
– Outcomes
– Contextual factors
– External factors
– Goals & objectives
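As a concrete illustration, the variables above can be organized as one record per level of analysis. This Python sketch is illustrative only; the field names are assumptions, not the Project's actual database schema.

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationRecord:
    """One evaluation record in an Input-Process-Output (IPO) layout.

    Field names are illustrative, not the Project's actual schema.
    """
    level: str                      # "student", "educator", or "system"
    inputs: dict = field(default_factory=dict)     # what we don't control
    processes: dict = field(default_factory=dict)  # what we do
    outcomes: dict = field(default_factory=dict)   # what we hope to impact
    context: dict = field(default_factory=dict)    # leadership, climate, buy-in
    external: dict = field(default_factory=dict)   # legislation, policy

record = EvaluationRecord(
    level="educator",
    inputs={"role": "teacher", "prior_psrti_training": False},
    processes={"training_days_attended": 5},
    outcomes={"beliefs_post_mean": None},  # filled in after the post survey
)
```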
Levels
• Students
– Receiving Tiers I, II, & III
• Educators
– Teachers
– Administrators
– Coaches
– Student and instructional support personnel
• System
– District
– Building
– Grade levels
– Classrooms
Inputs (What We Don’t Control)
• Students
– Demographics
– Previous learning experiences & achievement
• Educators
– Roles
– Experience
– Previous PS/RtI training
– Previous beliefs about services
• System
– Previous consensus regarding PS/RtI
– Previous PS/RtI infrastructure
  • Assessments
  • Interventions
  • Procedures
  • Technology
[Figure: Real Elementary School, Self-Assessment of Problem Solving Implementation (SAPSI) – Consensus. Bar chart of status ratings (0-3) for items 1-5: district commitment, SBLT support, faculty involvement, SBLT present, and data used to assess commitment. A companion panel rates items 6-19: data are collected, presented to staff, and used to make decisions; data used to evaluate core academic and behavior programs; SBLT regularly evaluates Tier 1 and 2 data; CBM data used to identify students needing interventions; ODR data used to identify students needing behavior interventions; data used to evaluate Tier 2 interventions and to determine Tier 3 RtI; special education eligibility uses RtI for EBD and for SLD; evidence-based practices at Tiers 1, 2, and 3; SBLT has a regular meeting schedule, evaluates target students' RtI, and involves parents.]
[Figure: Real Elementary School SAPSI – Infrastructure Development. Bar chart of status ratings (0-4) for items 20-23, 21a-21f, and 22a-22i: SBLT meets at least twice per year; SBLT meets with the district team twice per year; the plan is changed based on data; feedback on the PS/RtI project is provided; a monitoring plan exists; clearly defined Tier 1 academic and behavior instruction; clearly defined Tier 2 academic and behavior support instruction; evidence-based Tier 3 academic and behavior strategies; problems defined as data-based discrepancies; SBLT defines replacement behaviors; SBLT conducts problem analysis with data; evidence-based strategies for interventions; support identified for interventions; intervention integrity is documented; RtI evaluated through data collection; interventions changed based on student RtI; parents involved in interventions.]
[Figure: Real Elementary School SAPSI – Implementation. Bar chart of status ratings for items 24-27.]
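SAPSI-style ratings lend themselves to simple section summaries. A minimal Python sketch follows; the section groupings, item names, and ratings are invented for illustration, using the 0-3 status scale shown on the charts.

```python
# Hypothetical SAPSI item ratings on the 0-3 status scale shown above.
ratings = {
    "Consensus": {"district_commitment": 3, "sblt_support": 2,
                  "faculty_involvement": 1, "sblt_present": 3,
                  "data_to_assess_commitment": 1},
    "Infrastructure": {"data_collected": 2, "data_presented_to_staff": 1,
                       "tier1_defined": 2, "tier2_defined": 1},
    "Implementation": {"problem_analysis_with_data": 1,
                       "intervention_integrity_documented": 0},
}

for section, items in ratings.items():
    mean = sum(items.values()) / len(items)
    print(f"{section}: mean status {mean:.2f} across {len(items)} items")
```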
[Figure: Real Elementary – Reading AYP. Percent of students scoring at or above grade level, by school year (02/03 through 06/07), for White, Economically Disadvantaged, and Students with Disabilities subgroups, plotted against the expected AYP trajectory. Y-axis: 0-100%.]
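Charts like this reduce to a per-subgroup, per-year proficiency rate. A minimal Python sketch of that computation follows; the student records are invented for illustration, not Real Elementary's data.

```python
from collections import defaultdict

# Each invented record: (school_year, subgroup, at_or_above_grade_level)
records = [
    ("05/06", "White", True), ("05/06", "White", False),
    ("05/06", "Students with Disabilities", False),
    ("06/07", "White", True), ("06/07", "Economically Disadvantaged", True),
    ("06/07", "Students with Disabilities", True),
]

counts = defaultdict(lambda: [0, 0])  # (year, group) -> [proficient, total]
for year, group, proficient in records:
    counts[(year, group)][1] += 1
    counts[(year, group)][0] += int(proficient)

for (year, group), (p, n) in sorted(counts.items()):
    print(f"{year} {group}: {100 * p / n:.0f}% at or above grade level")
```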
Processes (What We Do)
• Students
– Assessment participation (e.g., DIBELS screening)
– Instruction/intervention participation
• Educators
– Frequency and duration of participation in PS/RtI Project training
– Content of Project training in which they participated
• System
– Frequency & duration of professional development offered by the Project
– Content of professional development offered
– Stakeholders participating in professional development activities
– Communication between Project and districts/buildings
Implementation Integrity
Checklists
• Implementation integrity measures developed
• Measure
– Steps of problem solving
– Focus on Tiers I, II, & III
• Data come from:
– Permanent products (e.g., meeting notes, reports)
– Problem Solving Team meetings
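A hedged sketch of how such a checklist might be scored: each problem-solving step is marked as evident, partially evident, or absent in permanent products, and integrity is reported as the percent of possible points earned. The step names and scoring scheme below are plausible assumptions, not the Project's actual instrument.

```python
# Hypothetical checklist: problem-solving steps scored from permanent
# products (2 = fully evident, 1 = partially evident, 0 = absent).
checklist = {
    "problem_identification": 2,
    "problem_analysis": 1,
    "intervention_development": 2,
    "intervention_implementation": 0,
    "program_evaluation_rti": 1,
}

earned = sum(checklist.values())
possible = 2 * len(checklist)
print(f"Implementation integrity: {100 * earned / possible:.0f}% "
      f"({earned}/{possible} points)")
```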
Outcomes (What We Hope to Impact)
• Educators
– Consensus regarding PS/RtI
• Beliefs
• Satisfaction
– PS/RtI Skills
– PS/RtI Practices
[Figure: Pre/Post Beliefs Survey, 09/06/07 – Statewide Pilot Data, Selected Items. Bar chart of mean scores (1-5 scale) at pre- and post-training for items 7a, 7b, 8a, 8b, 11a, 11b, 12, 13, 14, 15, 16, 17, 20, 22, and 23.]
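The pre/post comparison behind a chart like this is a per-item mean difference. The sketch below is illustrative Python with fabricated ratings; it is not the pilot data.

```python
# Hypothetical beliefs-survey responses on a 1-5 agreement scale,
# keyed by item; values are (pre, post) ratings for the same educators.
responses = {
    "7a": [(2, 4), (3, 4), (2, 3)],
    "12": [(3, 3), (4, 5), (3, 4)],
}

for item, pairs in responses.items():
    pre = sum(p for p, _ in pairs) / len(pairs)
    post = sum(q for _, q in pairs) / len(pairs)
    print(f"Item {item}: pre {pre:.2f} -> post {post:.2f} "
          f"(change {post - pre:+.2f})")
```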
[Figure: Pasco County – Direct Skills Assessment. Mean scores versus points possible across question categories: core effectiveness, improvements to core, response to supplemental instruction, Tier 3 referral, literacy failure, and tier effectiveness.]
PS/RtI Model (academic and behavior)

Tier I: Universal Instruction (ALL students; 80-90% respond)
• School-wide systems; all students, all settings
• Implement core instruction
• Universal screening, benchmark assessment
• Preventive, proactive

Tier II: Supplemental Intervention (Tier I + Tier II; 10-15% more students)
• Targeted group interventions; problem solving to identify students at risk
• Implement standard treatment protocol
• High efficiency, rapid response
• Progress monitoring, rate of learning

Tier III: Comprehensive Intervention (Tier I + II + III; 5% of students)
• Students with intensive needs
• Problem solving and progress monitoring
• Specialized procedures of longer duration
• Frequent, assessment-based diagnostics; progress monitoring; rate of learning
Outcomes cont.
• System
– PS/RtI Infrastructure
• Assessments
• Interventions
• Procedures
• Technology
• Costs
– PS/RtI Implementation
Outcomes cont.
• Students
– Academic achievement
– Behavioral outcomes
• Systemic
– Discipline referrals
– Referrals for problem solving
– Referrals for SPED evaluations
– SPED placements
Student Data Elements
• Outcome Data
– Florida Comprehensive Assessment Test (FCAT)
• Grades 3-5
• Reading & Math
– Stanford Achievement Test - 10
• Grades 1 & 2 where applicable
• Reading & Math
• Formative Data
– DIBELS (targeted grade levels)
– District assessments where applicable
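Combining the outcome measures (FCAT, SAT-10) with formative DIBELS data implies a per-student join across sources. A minimal Python sketch follows; the records, scores, and field names are invented for illustration.

```python
# Invented records keyed by a student ID; field names are assumptions.
fcat = {"s01": {"grade": 3, "reading": 310, "math": 295}}
dibels = {"s01": [("fall", 22), ("winter", 35), ("spring", 48)]}

# Join outcome and formative data per student for analysis.
for sid, outcome in fcat.items():
    probes = dibels.get(sid, [])
    growth = probes[-1][1] - probes[0][1] if len(probes) >= 2 else None
    print(f"{sid}: FCAT reading {outcome['reading']}, "
          f"DIBELS growth {growth} across {len(probes)} benchmarks")
```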
Pilot School Example
Slides from Data Meeting
Following Winter Benchmarking
Window
Sources of Evidence — What do our readers know? What are they able to do?

• The student demonstrates knowledge of the concept of print and how it is organized and read.
  – Concepts of print test
  – Shared reading
  – Guided reading/Observation
  – Early Literacy Behaviors Checklist (Scott Foresman)
• The student demonstrates phonological awareness.
  – Phonological Awareness Test
  – Guided reading/Observation
  – Early Literacy Behaviors Checklist (Scott Foresman)
• The student demonstrates phonemic awareness.
  – Phonological Awareness Test
  – Guided reading/Observation
  – ISF
  – PSF
  – Early Literacy Behaviors Checklist (Scott Foresman)
Sources of Evidence — What do our readers know? What are they able to do?

• The student demonstrates knowledge of the alphabetic principle and applies grade level phonics skills to read text.
  – Running records with miscue analysis
  – Guided reading/Observation
  – Literacy centers
  – NWF
  – Writing samples
  – Early Literacy Behaviors Checklists (Scott Foresman)
  – Reading Strategy Assessment (Scott Foresman)
• The student uses multiple strategies to develop grade appropriate vocabulary.
  – Shared reading with distributed practice
  – Guided reading/Observation
  – Conferences
  – Literacy centers
  – Writing samples
  – Reading Strategy Assessment (Scott Foresman)
• The student uses a variety of strategies to comprehend grade level text.
  – Shared reading with distributed practice
  – Guided reading/Observation
  – Conferences
  – Retelling
  – Graphic organizers
  – Early Literacy Behaviors Checklists (Scott Foresman)
  – Reading Strategy Assessment (Scott Foresman)
[Figure: SAES Assessment 2, Kindergarten, 2006-2007. Percent of students at low, moderate, and high risk on each DIBELS measure (LNF, ISF, PSF, NWF).]
[Figure: SAES Assessment 2, Kindergarten. Percent of students at low, moderate, and high risk on each DIBELS measure (LNF, PSF, ISF, NWF).]
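Risk-category charts like these come from applying benchmark cutoffs to each student's score. The Python sketch below uses invented cutoff values; actual DIBELS cutoffs vary by measure and benchmark window.

```python
def risk_category(score, low_cut, high_cut):
    """Classify a DIBELS score; cutoffs here are invented examples."""
    if score >= low_cut:
        return "Low Risk"
    if score >= high_cut:
        return "Moderate Risk"
    return "High Risk"

# Hypothetical winter NWF scores for one kindergarten class.
scores = [5, 12, 18, 25, 30, 8, 22, 40, 15, 27]
cats = [risk_category(s, low_cut=25, high_cut=13) for s in scores]
for label in ("Low Risk", "Moderate Risk", "High Risk"):
    print(f"{label}: {100 * cats.count(label) / len(cats):.0f}%")
```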
[Figure: SAES Nonsense Word Fluency, Kindergarten by classroom. Percent of students at low, moderate, and high risk in each of six classrooms; low-risk percentages varied widely by classroom (roughly 54% to 87%).]
[Figure: SAES Phoneme Segmentation Fluency, Kindergarten by classroom. Percent of students at low, moderate, and high risk in each of six classrooms; low-risk percentages ranged from roughly 60% to 93%.]
[Figure: Systemic Outcomes - Office Discipline Referrals. Monthly counts of ODRs, August through May, comparing 2005-2006 and 2006-2007.]
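A year-over-year ODR comparison is a straightforward aggregation of referral records by month. The Python sketch below uses invented counts, not the school's data.

```python
# Invented monthly ODR counts for two school years.
months = ["Aug", "Sep", "Oct", "Nov", "Dec", "Jan",
          "Feb", "Mar", "Apr", "May"]
odrs = {
    "2005-2006": [40, 35, 38, 30, 28, 33, 29, 27, 25, 20],
    "2006-2007": [30, 28, 26, 22, 20, 19, 18, 16, 14, 12],
}

for i, month in enumerate(months):
    prior, current = odrs["2005-2006"][i], odrs["2006-2007"][i]
    print(f"{month}: {prior} -> {current} ({current - prior:+d})")
print("Totals:", {year: sum(v) for year, v in odrs.items()})
```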
Other Variables to Keep in
Mind
• Contextual factors
– Leadership
– School climate
– Stakeholder buy-in
• External factors
– Legislation
– Regulations
– Policy
Factors Noted So Far
• Legislative & Regulatory Factors
– NCLB reauthorization
– FL EBD rule change effective July 1, 2007
– Pending FL SLD rule change
• Leadership
– Level of involvement (school & district levels)
– Facilitative versus directive styles
School Goals & Objectives
• Content Area Targets
– Reading
– Math
– Behavior
• Majority focusing on reading
• Some selected math and/or behavior as well
• Grade levels targeted varied
– Some chose K or K-1
– Some chose K-5
Special Thanks
• We would like to offer our gratitude to the graduate assistants who make possible the intensive data collection and analysis we are attempting
– Decia Dixon, Amanda March, Kevin Stockslager,
Devon Minch, Susan Forde, J.C. Smith, Josh
Nadeau, Alana Lopez, Jason Hangauer, Leeza
Rooks, and Kristelle Malval
Project Website
• http://floridarti.usf.edu
• http://www.nasdse.org
• http://www.florida-rti.org
(Active Fall, 2008)