Improving Outcomes for ALL Students Through Problem Solving

Beyond Tier II: Assisting Students Who
Still Do Not Respond
Gary L. Cates, Ph.D.
[email protected]
Illinois School Psychology Association
January, 2011
Housekeeping
• Timeline (8:30 AM - 5:00 PM)
• 10-Minute Breaks (about 10:00 and 3:30)
• Lunch (1 hour 12:30-1:30)
Acknowledgments
• Cates, Blum, & Swerdlik (2010), authors of Effective
RTI Training and Practices: Helping School and
District Teams Improve Academic Performance and
Social Behavior, and of this PowerPoint presentation
Overview of Session
• Who am I?
• Quick Review of Three Tiered System of
Support – Emphasis on Tier III
• General framework for understanding
behavior problems
• General framework for understanding
learning problems
• Factors to consider when selecting,
developing, implementing and evaluating an
intervention plan
• Progress Monitoring/Plan Evaluation
• Intervention Integrity
Tier I: Universal Instruction (80%)
Tier II: Standard Protocol (15%)
Tier III: Individualized Intervention (5%)
RTI Steps
Step I:
Solid Universal behavioral expectations for all
students
Step II: Reliable, valid, and brief school-wide
screening of behavior 3 times per year.
Step III: Data review by Problem Solving Team.
Step IV: Targeted interventions and progress
monitoring for low responsiveness
Step V: Intense interventions and progress
monitoring for low responsiveness
Step VI: Entitlement to special education when
student demonstrates little or no response to
both targeted and intense interventions
Tier I Behavior Curriculum
3-5 School Wide Expectations
If we understand that
behavioral skills are
learned, it is necessary to
teach expected behaviors as
we would academic skills.
Example Mark Twain Behavior Matrix
Hallways
Cafeteria
Playground
Bathroom
RESPECT
Self
Walk at all
times.
Eat your food
only.
Walk carefully
to return trays.
Stay in assigned
area.
Get help when
it is needed.
Quietly wait
your turn
Keep to
yourself.
RESPECT
Others
Voices off and
arms folded.
Single file
lines.
Jaguar waves
only.
Stay in order
when in line.
Be polite and
use good
manners.
Use kind words
and quiet
voices.
Stay in order
when in line.
Play by the
rules.
Take turns and
share
equipment.
Use polite
language
Walk in and out
quietly.
Voices off.
Open stall doors
slowly.
RESPECT
the
Environment
Eyes only on
displays.
Be quiet after
ten minute
warning.
Clean up your
own space.
Line up when
signal is given.
Pick up litter
that you see.
Use toilets,
sinks, and
dryers correctly.
Keep bathroom
clean.
Average Referrals Per Day Per Month
[Chart: average referrals per day for each month, August through June.]
ODR Data by Behavior
[Chart: number of referrals by behavior type.]
ODR Data by Location
[Chart: number of referrals by location: hallway, bathroom, classroom, cafeteria, locker room, office, playground, bus, gym, music room, library, parking lot, unknown.]
ODR Data by Time of Day
[Chart: number of referrals by time of day, in 15-minute increments from 7:00 AM to 5:00 PM.]
ODR Data by Student
[Chart: number of referrals by student, listed by student ID number.]
Tier II Behavior
Check-In/Check-Out-Like Standard Protocols
Daily Report Card
Date _________  Teacher _________  Student _________
Scoring: 0 = No, 1 = Yes
Total Points Today ____%   Points Possible = 32   Goal ____%
Parent's signature _______________________________________________________

Each period is rated 0 or 1 on four expectations, with space for teacher initials and comments:
• Be Safe: Keep hands, feet, and objects to self
• Be Respectful: Use kind words and actions
• Be Ready: Follow directions
• Be Ready: Have needed materials
Periods rated: Reading, Recess, Math, Lunch, Social Studies, Recess, Language Arts, Science
Daily Behavior Report Card Monitoring - Patty Provenzano
[Chart: daily behavior level (0-100%) across 30 school days, with each point coded as Daily Performance at or Above Criterion, Daily Performance Below Criterion, No Data Collected, or Absent.]
Tier III
• Individualized Assessment
– Functional Behavior Assessment/Analysis
– Determine behavioral function (not cause!)
• Individualized Intervention
– Linked to “behavioral function”
– Based on basic principles of behavior
What are Scientifically Based
Interventions?
• Employs systematic, empirical methods
• Ensures that studies and methods are
presented in sufficient detail and clarity
• Obtains acceptance by a peer-reviewed
journal or approval by a panel of independent
experts through scientific review
• Uses research designs and methods
appropriate to the research question
Evidence-Based Interventions
• School-based professionals have a
responsibility for both promoting and
implementing interventions that are evidence-based
• AND objectively evaluating the effectiveness
of those interventions through the data-based
decision-making process (NCLB & IDEA 2004)
Be Good Consumers
• Does the intervention meet the
standards for Research-Based
Interventions (internal validity)?
• Has the published research been peer-reviewed?
• Have the results been replicated?
• Does this apply to my population
(External Validity)?
Selecting Research-Based
Strategies
• Maalox Approach: attempt high-probability
strategies that have demonstrated research
support and are likely to show quick and
effective results before conducting lengthy
evaluations that may not lead to beneficial
interventions.
Linking Behavioral Assessment to
Intervention Through Problem
Solving
To be successful in implementing
Effective Behavioral Interventions
you need to:
• Have a conceptual framework for what
behavior REALLY is
• Know what expectations are for the
student
• Know whether what you are doing is
helpful, detrimental, or having no impact
• Be focused on the variables that you
can immediately change
Group Example
Typical Hypotheses
• What are common hypotheses for:
– Not completing homework?
– Talking out during class?
– Out of Seat?
– Inappropriate Touching of others?
Behavioral Function as a Framework
for Understanding Behavior
• Determining what the antecedents and
consequences are for a given behavior
• Focuses on what is maintaining the
behavior not the cause!
What you must keep in mind
• Behavior has a function
• You are trying to identify the function
• You cannot be circular in your logic
(e.g., ADHD).
4 General Functions of Behavior: To Get
Something or To Get Out of Something
• Tangible Reinforcement: Food, Stickers,
Toys,
• Social Reinforcement: Teacher or Peer
• Escape/Avoidance: Get out of or
terminate something
• Sensory Reinforcement: Visual, auditory
primarily (Touch, smell)
Group Example
Typical Hypotheses
• What are common hypotheses for:
– Not completing homework?
– Talking out during class?
– Poor math test scores?
– Inappropriate touching of others?
Approaches to Functional
Assessment
• Questionnaire: Have others tell you what
happens
• Observational Assessment: Watch and
describe A-B-C’s
• Experimental Functional Analysis: Do a test
of hypothesis
• I usually do a bit of all three of the above
Example of Functional Analysis:
Talking out in class
Potential functions and their conditions:
• Tangible R+ (test condition): access to the tangible contingent upon talking out
• Attention (test condition): reprimand contingent upon talking out
• Escape (test condition): escape from the demand contingent upon talking out after a demand
• Self-stimulation (test condition): leave isolated in room
• Control condition: play with attention and no demands
[Chart: rate of talking-out behavior across 17 sessions under four conditions: Attention, Escape, Tangible R+, and Toy Play.]
What is the primary function of behavior?
A More Applied Classroom
Example
Teacher & Student Behavior affect
each other
[Chart (Tony): frequency of talk-outs, verbal aggression, teacher reprimands, and teacher praise across 5 sessions, alternating A (descriptive) and B (verification) phases.]
[Chart (Mike): frequency of talk-outs, out-of-seat behavior, teacher praise, redirection, reprimands, and card changes across 5 sessions, alternating A and B phases.]
Special Note
• An experimental functional analysis
should be conducted if the FBA does
not lend itself to an effective
intervention.
Selecting and Developing
Behavioral Intervention Plans
A focus on behavioral function
Phases of Problem-Solving
1. Problem Identification
2. Problem Analysis
3. Plan Development
4. Plan Implementation
5. Plan Evaluation
Phase 1: Problem Identification
What is the problem? & Is it a real
problem?
4 Steps of Phase 1:
Problem Identification
1. Operationally Define problem
2. Collect Baseline Data
3. State discrepancy between what is
expected (typical peer performance)
and what is occurring.
4. Identify a replacement behavior
Step 1 of Phase 1: Operationally
Defining the Problem
• Must be observable
• Must be measurable
• Must pass the dead man's test
• Cannot be circular (e.g., ADHD is why Johnny acts that way)
Step 2 of Phase 1:
Collect Baseline Data
• Data can be collected from a number of
sources:
– R = Record Review
– I = Interview
– O = Observation
– T = Testing
RIOT TIPS
• Collect only what you need to determine the discrepancy
between what is expected (peer performance) and what is
occurring (target student performance).
• Use existing data when possible:
– ODR
– Records (e.g., attendance, permanent products)
• Collect additional information when needed:
– Interview
– Observation (e.g., Frequency Count, On-task).
Step 3 of Phase 1:
State Discrepancy
• Calculate the Behavior Discrepancy Ratio
(BDR)
– Include statement of student’s current level
of performance.
– Include statement of the expected level of
performance (e.g., peer data, teacher
expectation).
– You are essentially specifying “the gap”
Behavior Discrepancy Ratios
Formula:
BDR = Target Student Behavior / Peer Behavior
– Example: For disruptive talking out during 7th-grade
math class, Jessica has engaged in the behavior on
average 17 times per class period, while the average
7th-grade math class peer engages in disruptive
talking out on average 4 times per class period.
Target Student Behavior / Peer Behavior = 17 / 4 ≈ 4x discrepant
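A minimal sketch of the BDR arithmetic in Python; the function name and error check are my own illustrative choices, not part of the presenter's materials.

```python
# Behavior Discrepancy Ratio = target student behavior / typical peer behavior,
# measured over the same time unit (here, talk-outs per class period).
def behavior_discrepancy_ratio(target_count, peer_count):
    if peer_count <= 0:
        raise ValueError("Peer behavior count must be positive to form a ratio.")
    return target_count / peer_count

# Slide example: Jessica talks out about 17 times per class period; peers average 4.
ratio = behavior_discrepancy_ratio(17, 4)
print(f"Jessica is {ratio:.1f}x discrepant from peers")  # prints 4.2x, roughly the 4x cited
```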
Discrepancy Ratios
Enables the team to make decisions about levels of support
and resources from the start.
Generally speaking…
– A student who is 2x discrepant from his/her peers is
appropriate for the problem-solving team.
– If a student is significantly discrepant from peers,
additional problem-solving and intervention resources
may be appropriate.
– Example: Jessica is 4x discrepant from peers
and MAY benefit from problem solving.
Provides a way to evaluate student outcomes
Name / Grade / Area / Initial Performance Discrepancy / Follow-Up Performance Discrepancy / Outcome Decision:
• Bill, Grade 3, Talking out: initial 3.5x, follow-up 1.9x – Satisfactory; maintain/fade intervention.
• Susie, Grade 2, Out of seat: initial 1.5x, follow-up NA – No severe problem.
• Rob, Grade 4, Homework completion: initial 4.2x, follow-up 3.8x – No progress; recycle through process.
Step 4: List Problem Behaviors
and Prioritize
• Teams should tackle one or two problems at
a time.
• Consider the following problems first:
– Dangerous/Severe behaviors
– High frequency behaviors
– Foundational/Keystone behaviors (e.g.,
reading)
– Chronic problem behaviors
Step 5 of Phase 1:
Identify a Replacement Behavior
• State specifically what you want the student to do
instead
• Example: Initiate compliance with 85% of requests within
5 seconds.
• Example: Raise hand 100% of time during independent
seatwork when the student requires attention from the
teacher.
• Example: Remain on-task for 7 minutes
• Example: Complete 3 digit by 2 digit multiplication
problems with 95% accuracy in 9 weeks.
Group Example
1. Using the information below, how would you prioritize
this 4th-grade team's list of generated concerns
regarding William?
– William is on-task during 40% of observed intervals
compared to peers' on-task of 90%.
– They think he may have ADHD.
– Makes inappropriate comments in class that disrupt
students.
– Inconsistent homework completion.
– Keeps a messy work area.
2. Based on the primary area of concern, how would you
define the behavior in observable and measurable
terms?
What can go wrong at Phase 1
(Problem Identification)?
• Cannot select one problem to focus on.
• Cannot empirically quantify the behavior/too
vague or general about the concern.
• Jumping to solutions
• Cannot establish ‘typical peer’ behavior and
discrepancy between what is expected and
what is occurring
• Problem Naming or “Admiring the problem”
• Lack quantitative baseline data (verbal report
only)
How to stay on track
during Problem Identification…
– Interview the teacher before the meeting to allow
for venting time and facilitate the description of the
problem.
– Proactively collect school-wide benchmark data.
– Collect baseline data before meeting.
– Prioritize keystone behaviors.
– State discrepancies before meeting
– Identify replacement behavior before meeting
Phase 2: Problem Analysis
What may be contributing to the
behavior?
2 Steps of Phase 2:
Problem Analysis
1. Collect enough of the right data.
2. Generate hypotheses of
controllable variables related to the
behavior.
Step 1: Collecting Enough of
the Right Data
• Verbal Reports (e.g. interviews)
• Rating Scales (e.g. BASC)
• Record Review (e.g. Cumulative file, homework –
permanent products)
• Observation Systems (e.g. BOSS)
• Direct Systematic Behavioral Observation (e.g.
Interval recording, frequency counts)
RIOT
• (R)ecord Review
• (I)nterviews
• (O)bservation
• (T)est
Verbal Reports
• Reliability is a concern
• Can be used to generate hypotheses
• Get direct data (i.e. independent
observation) to corroborate
Specific Questions to Teachers:
Behavior Problems
• What does the behavior look like?
• How often does it occur?
• What happens immediately before the
behavior?
• What happens immediately after?
• What have you tried so far?
• What behavior would you rather see?
Rating Scales
• More reliable than verbal report
• Used only as a “screener”
• DO NOT USE ALONE FOR
INTERVENTION OR DIAGNOSIS!
• Broadband versus Narrow band
Observations
• This is not an anecdotal
report of what someone
observed for a class period
General guidelines for observations:
– Don't be intrusive.
– Agree upon a clearly defined and observable behavior first.
– Observe across days/times/settings to increase reliability.
– Use with other forms of assessment to increase validity.
– Carefully consider the goal of the observation before selecting an observation tool.
– Always note the environmental context of the behavior.
– Observe students in their natural environments.
– Always observe peers for a comparison.
Observation “systems”
• Save your money
• Very limited
• Use direct behavioral systematic
observation methods
Direct Behavioral
Observations
• ABC Logs
• Frequency Tabulation Logs
• Systematic Interval Recording
Examples of Direct
Observations
ABC Recording
• Antecedents - what occurs right before the
behavior.
• Behavior - problem behavior (observable and
defined)
• Consequences - what happens right after the
behavior
Data-Based Decision Making Using
Antecedent-Behavior-Consequence
Logs
Practice Analyzing an ABC
Log
• See handout
• Why do you think the behavior is
occurring?
• What might you do for an intervention?
• What is an acceptable alternative
behavior?
• How would you monitor progress?
Behavior Recording Log
Directions: Please be as specific as possible.
Child's Name: Karyn E.   Grade: 2nd   Date: 4/30
Teacher: Mrs. Becker   Observer: Ryan M.
Setting: School: library, classroom, recess

Columns: Date | Time | Setting (Where did the behavior take place?) | Task (What should the student be doing?) | Behavior (What did the student do?) | Consequences (How did you and/or students react?) | Effect (What happened after these reactions?)

• 10/14, 9:15, Library; task: picking out a book; behavior: pushed a peer; consequence: I sent him to the office; effect: came back and was polite.
• 10/16, 10:05, Small-group art project; task: working with peers; behavior: threw glue bottle at peer; consequence: given a time-out in the hall; effect: came back in calm.
• 10/17, 9:45, Recess; task: free play; behavior: hit peer in face with small pebble; consequence: stood him against wall (peer cried); effect: went to class with bad attitude.
• 10/18, 9:00, Classroom; task: transitioning between reading and specials (today was computer skills); behavior: did not transition quietly; consequence: reminded him he must transition quietly; effect: he continued singing "don't you wish your girlfriend was hot like me" and asking a peer about American Idol; he even asked if I watched it.
• 10/19, 10:45, Classroom; task: working with peers on piñata; behavior: pushed peer's work materials on the floor; consequence: sent him to the office and called mother; effect: his mother picked him up and took him home.

Comments: As you can see, he is often rude, does not respond well to traditional discipline, and is aggressive towards peers.
1. What patterns do you see here? 2. What is the likely function of behavior?
Office Discipline Referral Form
Student: ____  Grade: ____  Date of Referral: ____  Referring Staff: ____  Time of Behavior: ____

Location
o Classroom # ____
o Hallway
o Cafeteria
o Library
o Bathroom ____
o Bus
o Open Yard
o In Front of School
o Parking Lot
o Other (please ID)

External Behavior
o Abusive language
o Physical contact
o Sexual language toward peer or adult
o Lying
o Cheating
o Vandalism
o Smoking
o Truant from class
o Other (please ID)

Internalizing Behavior
Note: For internalizing referrals send the form, but do not send the student to the office unless necessary.
o Does not talk with peers
o Excessively shy/withdrawn
o Avoids social interaction
o Appears fearful in non-threatening situations
o Fails to assert self
o Unresponsive to social situations
o Doesn't participate in social activities
o Other (please ID)

Antecedent: ____   Behavior: ____   Consequence: ____
Data-Based Decision Making Using
Frequency Counts
Examples of Direct
Observations
Frequency Count (RATE MEASURE!)
– A measure of how often a clearly defined behavior occurs within a given period of time.
– Examine the frequency of the behavior by tallying or counting the behavior as it occurs.
– Use this when the behavior is discrete (has an obvious beginning and ending) and does not occur at very high rates.
– This information is helpful at ALL steps of the problem-solving process.
– ALWAYS MEASURE AS RATE WHEN POSSIBLE!!!!
Practice Using A Frequency
Count/Rate Measure Log
• See Handout
• Determine the rate of behavior
• Determine Discrepancy Ratio
– The average child does this on average 1.8
times per day.
• Write a hypothesis: Remember ICEL
• Develop a method for hypothesis
testing: Remember RIOT
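A minimal sketch (not from the handout) of turning a frequency-count log into a rate and a discrepancy ratio; the daily tallies below are made-up illustration values, while the peer average of 1.8 times per day comes from the slide.

```python
# Hypothetical frequency-count tallies for one week of observation
daily_tallies = {"Mon": 6, "Tue": 4, "Wed": 7, "Thu": 3, "Fri": 5}

target_rate = sum(daily_tallies.values()) / len(daily_tallies)  # behaviors per day
peer_rate = 1.8                                                 # peer average per day (from the slide)

print(f"Target student rate: {target_rate:.1f} per day")        # 5.0 per day
print(f"Discrepancy ratio: {target_rate / peer_rate:.1f}x")     # about 2.8x discrepant
```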
1. What day does the behavior
most often occur? What day
is it least likely to occur?
2. What time of day does the
behavior most often occur?
Least often?
3. When should someone come
to visit if they wanted to
witness the behavior?
Note: It is just as important to look
at when the behavior occurs
as it is to look at when it doesn’t.
Data-Based Decision Making Using
Direct Behavioral Observations
Examples of Direct
Observations
Systematic Data Recording
– Examine percentage of target behavior by:
• Recording when the selected student is engaging in
target behavior during 10-second intervals for 15
minutes.
• Peers are observed in the same way as a
comparison.
– Requires more training than the other
observation tools.
– This information is helpful at all steps of the
problem solving process
Systematic Direct Behavioral
Observations: Interval Recording
• Partial Interval Recording: Occurs
anytime within interval
• Whole Interval Recording: Occurs
majority of Interval
• Momentary Time Sampling: Within 3
seconds
• Duration Recording: How long behavior
occurs
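To make the scoring rules concrete, here is a minimal sketch (my own illustration, with made-up data and 10-second intervals) of how one observation stream is scored under the different interval methods; treating the 3-second momentary-time-sampling window as the final 3 seconds of each interval is an assumption.

```python
# Each interval is 10 seconds; each set lists the seconds within the interval
# during which the behavior was occurring (hypothetical data).
intervals = [
    {1, 2, 3},            # brief burst at the start
    set(range(10)),       # behavior for the whole interval
    {8, 9},               # brief burst at the end
    set(),                # no behavior
]

def partial_interval(iv):           # scored if behavior occurs at any time in the interval
    return bool(iv)

def whole_interval(iv, length=10):  # scored if behavior fills the majority of the interval (per the slide)
    return len(iv) > length / 2

def momentary_time_sample(iv, window=3, length=10):   # scored if behavior occurs within the
    return any(s >= length - window for s in iv)      # final 3 seconds of the interval (assumed rule)

for name, rule in [("Partial interval", partial_interval),
                   ("Whole interval", whole_interval),
                   ("Momentary time sampling", momentary_time_sample)]:
    pct = 100 * sum(rule(iv) for iv in intervals) / len(intervals)
    print(f"{name}: behavior scored in {pct:.0f}% of intervals")
```

The same raw data yield 75%, 25%, and 50% of intervals, which is why the recording method must be reported along with the result.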
[Sample interval recording grid: Target Child and Composite Child each scored across 20 intervals for three behaviors (1 = A, 2 = TO, 3 = OT).]
Behavioral Observation Form
Target Student Name: Larry F.   Birth date: 4/1/1998   Date: 5/30
School: Metcalf   Teacher: Havey   Observer: Blake M.

Behavior(s) and Definitions
Behavior 1: Aggression (A) – physical or verbal actions toward another person that have potential for harm
Behavior 2: Talk-outs (TO) – verbalizations without permission
Behavior 3: On-task (OT) – oriented to academic task or appropriate engagement with materials

[Completed recording grid: Target Child and Composite Child each scored across 40 intervals for Behaviors 1-3.]

Summary, computed as (# Occurrences / # Observations) x 100:
Target Child: TCB1 = 4/40, TCB2 = 12/40, TCB3 = 22/40
Composite Child: CCB1 = 1/40, CCB2 = 5/40, CCB3 = 35/40
1. What can you get from this?
2. Are all of these behaviors
severe enough to warrant
individualized intervention?
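A minimal sketch of the summary arithmetic on the form above: percent of intervals = (# occurrences / # observations) x 100, compared for the target child and the composite (peer) child. The counts come from the form; the loop and labels are my own.

```python
# Interval counts from the completed observation form (40 observed intervals)
target = {"Aggression": 4, "Talk-outs": 12, "On-task": 22}
composite = {"Aggression": 1, "Talk-outs": 5, "On-task": 35}
observations = 40

for behavior in target:
    t_pct = 100 * target[behavior] / observations
    c_pct = 100 * composite[behavior] / observations
    print(f"{behavior}: target {t_pct:.1f}% vs. composite peer {c_pct:.1f}% of intervals")
```

This prints, for example, on-task at 55.0% of intervals for the target child versus 87.5% for the composite peer, which feeds directly into the discrepancy statement.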
Step 2: Writing a Hypothesis
• Provide the discrepancy statement
• Add because… at the end of the discrepancy
statement and insert your hypothesis.
• The hypothesis should be specific, observable, and
measurable.
– Example:
Beth is on-task for 35% of intervals while peers are on-task
87% of intervals during a 20-minute observation during direct
instruction in Math class, because she is escaping the Math
work which is above her instructional level.
Plan for the Collection of
Additional Data Needed to
Support Hypotheses
• Your hypotheses should be supported by at least 2
convergent sources of RIOT data with at least one piece
being objective.
• If you develop a hypothesis that you don't have enough
data to support, plan for the collection of the additional data
you need to validate or refute the hypothesis.
• Data collection should be planned not random!
Challenges/Barriers/What can go wrong
at Stage 2: Problem Analysis
Don’t consider appropriate variables
• Choosing variables you can’t change
• Get ‘stuck’ searching for the cause
The Filibuster
• Individual team members focused on their own
agenda.
Problem analysis is skipped altogether
Hypotheses selected are not supported by 2 forms
of Data
How to stay on track
during Problem Analysis (Step 2)…
– Focus on behavioral function
– Insist that a hypothesis needs at least two supporting pieces of
evidence (one must be quantitative).
– Enforce the agenda.
– Verbally redirect those who provide solutions before developing a
hypothesis.
Phase 3: Plan Development
Linking Assessment to
Intervention
2 Steps of Phase 3:
Plan Development
1. Set a Goal
2. Develop a plan based on the
hypothesized behavioral function
Step 1: Set a goal
• Goal should be to bring the student’s
behavior into acceptable levels relative to
peers.
• Discrepancy Ratios
• Criterion Based
Writing and Evaluating
Measurable Goals
Behavior Goals
Goals/Objectives
• Should state: Performance, condition, criteria and when
possible the date.
e.g. Given a verbal prompt to complete a mastered task,
the student will initiate compliance within 5 seconds 90%
of the time within 6 weeks.
e.g. The student will “respect others” within 30% of average
peer performance within 6 weeks.
Step 2: Develop Intervention
based on Behavioral Function
• Extinction is difficult to manage
• An attention function should not be addressed with ignoring alone
• Remember to focus on the alternative
replacement behavior
• Consider NCR, DRO, Response Effort, as
starting points for brainstorming.
Phase 4: Plan Implementation
Support & Integrity
3 Steps of Phase 4:
Plan Implementation
• Specify when & where (Steps 1 & 2)
– Not the entire day in the beginning.
– Not everywhere in the beginning.
• Specify who & what (Steps 3 & 4)
– Implementer(s)
– Integrity Monitor
– Data collector/analyzer
– Materials
• Specify how (Step 5)
– Articulation Form
Intervention Fidelity Checklist
Implementer: _____________   Intervention: _________________________
Observer: _________________   School: ______________________________
Student: __________________   Time/Location: _______________________
Grade: ____________________   Teacher: _____________________________

Columns: Step | Date | Date | Date | Date | Date
Rows: Steps 1-5 of the intervention, checked off for each date observed
Bottom row: Daily Fidelity Percentage ____% for each date
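A minimal sketch of one way the daily fidelity percentage could be computed (the scoring rule is assumed, not spelled out on the checklist): steps observed as implemented divided by the total number of intervention steps.

```python
# Fraction of intervention steps delivered as written on a given day, as a percentage
def daily_fidelity(steps_completed):
    return 100 * sum(steps_completed) / len(steps_completed)

# Example: on one observed day, steps 1-4 of a 5-step plan were implemented as written.
print(f"Daily fidelity: {daily_fidelity([True, True, True, True, False]):.0f}%")  # 80%
```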
Phase 5: Plan Evaluation
3 Steps of Phase 5: Plan
Evaluation
Answer the following Questions: Is the
intervention plan effective?
A. Is the student making progress toward the goal?
B. Is the student decreasing the discrepancy
between him/her and the general education
peers?
C. Is the plan able to be maintained in the general
education/current setting with current level of
support?
You cannot evaluate an intervention if integrity is not
maintained. No implementation, no evaluation, no change.
Questions
Tier I Academics
[Scatterplot: Relationship Between ORF in Fall of 2nd Grade and High-Stakes Testing in 3rd Grade. X-axis: words read correctly per minute (2nd grade); y-axis: student performance on the high-stakes test. Regions are labeled False Positives (further diagnostic assessment), Negatives for At-Risk, False Negatives (additional data currently available), and Positives for At-Risk.]
Instructional Analysis Form
Rows (skill areas): Phonemic Awareness, Phonics, Reading Fluency, Vocab, Comp
Columns: Teaching Strategy | Materials | Format | Allocated Time | Reward or Reinforcer | Method of Assessment
Tier II Academics
Standard Protocol: Scripted with
Monitoring of Progress
[Progress monitoring chart: words read correctly per minute across 36 weeks, showing the fall, winter, and spring benchmarks, the goal line, the student's baseline, and the student aim line.]
Tier III
• Individualized Assessment
– Curriculum Based Evaluation (CBE)
– Determine “instructional level” (CBA)
• Individualized Intervention
– Linked to “instructional level”
– Based on basic principles of learning
A word about "instructional level"
• Instructional level is not the 25th to 50th percentile (or something like that).
• Instructional level should be based on criterion levels of
accuracy and rate. See Shapiro (2004), Howell &
Nolet (2000).
• Instructional level can also be based on the
skill level (i.e., stage of learning).
Linking Academic Assessment to
Intervention Through Problem
Solving
To be successful in implementing
Effective Academic Interventions you
need to:
• Have a conceptual framework for what
learning REALLY is
• Know what expectations are for the
student
• Know whether what you are doing is
helpful, detrimental, or having no impact
• Be focused on the variables that you
can immediately change
Group Example
Typical Hypotheses
• What are common hypotheses for:
– Not completing homework?
– Reading “really choppy” out loud?
– Performing poorly on tests only?
– Inconsistent performance with subtraction?
Learning from an Instructional
Hierarchy Perspective
A framework for Linking
Assessment to Intervention
The ABC’s of Learning
• Antecedent
– Instructional pace/Materials/Methods
– Location, Demands, etc.
• Behavior
– Topography
– Rate/Accuracy/Level/Trend/Expectation
• Consequences
– Delayed versus immediate
– Feedback versus none/ R+/P
The Instructional Hierarchy
• 4 Stages of Learning Development
* Acquisition, Fluency, Generalization,
Adaptation
• Similar to other “Stage Theories” with
regard to pros and cons
Stage 1: Acquisition
• General Question: Acquisition
• General Variable: Percent Correct
• General Strategies:
1. Modeling
2. Demonstration
3. Prompting
* Often requires a task analysis
Modeling
• Presenting example of a skill
e.g. Mathematics
“here is a problem for you to look at”
Demonstration
• Active performance of a skill
e.g., Mathematics
“Watch me work this problem here”
Prompting
• Providing a cue to perform a target
response
e.g., Mathematics
“Don’t forget to carry the 1”
Example of Using Demonstration
Subtraction with regrouping: 38 - 19
• The bottom number in the right column (9) is bigger than the top number (8), so we must borrow from the left column.
• Cross out the top number in the left column (3) and write the next smallest number (2) above it.
• Now put a 1 in front of the top number in the right column, making it 18.
• Now subtract, starting in the right column: 18 - 9 = 9.
• Now subtract the left column: 2 - 1 = 1. The answer is 19.
Example of Demonstration, Prompting, and Modeling
Telling time to the nearest minute: What time is it?
• Write down the number that the small hand is pointing to: 11. (Hint: if it is in between two numbers, it is always the smaller number.)
• Now count by 5's, starting with the number 1, and write down the number that the big hand is on: 11:45.
Sometimes big hands are also between numbers. Let's tell time.
• Write down the number that the small hand is pointing to: 1. (Hint: if it is in between two numbers, it is always the smaller number.)
• Now count by 5's, starting with the number 1, and write down the smallest number that the big hand is in between, next to the clock: 1:15.
• Now count each little tick mark after that number (3) and add it to the number you wrote down: 15 + 3 = 18, so the time is 1:18.
Stage 2: Fluency
• General Question: Accurate response
rate
• General Variable: Behavior per minute
(e.g. wrcpm)
• General Procedures:
1. Drill: Active repeated responses
2. Overlearning (Maintenance)
Example of Drill
Basic Addition Facts
Flashcard Drill Procedure
• All possible combinations 0-12
• Start timer
• Present first stimulus (wait time)
• If correct, put in the correct pile with feedback
• If incorrect, put in the incorrect pile with corrective feedback
• Repeat the procedure with the incorrect pile until all cards are put into the correct pile
• Graph data and show the student
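A minimal sketch (my own illustration, not the presenter's script) of the drill loop above: cards answered incorrectly are recycled until every card ends up in the correct pile. In a real drill the response comes from the student; input() stands in here.

```python
import random

def flashcard_drill(facts):
    remaining = list(facts)
    random.shuffle(remaining)
    while remaining:                                   # keep drilling the incorrect pile
        incorrect = []
        for card in remaining:
            answer = input(f"{card} = ")               # present stimulus, wait for response
            if answer.strip() == str(facts[card]):
                print("Correct!")                      # feedback; card joins the correct pile
            else:
                print(f"Not quite; {card} = {facts[card]}")  # corrective feedback
                incorrect.append(card)                 # card goes back into the drill pile
        remaining = incorrect

# A small subset of the 0-12 addition-fact combinations
flashcard_drill({"3 + 4": 7, "6 + 6": 12, "9 + 5": 14})
```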
Stage 3: Generalization
• General Measurement:
Generalization/Transfer
• General Procedures: Practice (new response
with other responses).
• Discrimination Training: Behavior in presence
of one stimulus but not another.
• Differentiation: reinforce responses to
stimulus while slowly varying one essential
aspect of the stimulus
Example of Discrimination
Training
Letter reversal: b and d ("b or d?")
• Present a single stimulus to the student: "b"
• Ask: What letter is this?
• Correct response = praise
• Incorrect response = corrective feedback
• After 10 consecutive correct responses, fade in "d"
• After 10 consecutive correct responses, stop and start over with "d"
• After 10 consecutive correct responses, fade in "b"
• Alternate between the two letters; fade in other letters as needed
• Graph performance
Differentiation
• Learning to count money under the stimulus
"How much is this?" (multiple coins
placed in front of the child)
• Modify by placing coins heads up/tails up
• Modify by changing the prompt (is this more
or less than 30 cents?)
• Use in multiple environments
Stage 4: Adaptation
• Changing the form of a response, when
needed, very efficiently
• "What's up" versus "How are you"
• Making change
• Problem solving
• Multiple experiences in multiple
environments with heavy feedback
Important Variables in
understanding Instruction and
Learning
ABC’s and 123’s of learning and
instruction
Types of Academic Time
• Allocated Time:
- How much time in school we have
• Instructional Time
- How much time teacher spends providing
instruction
• Engaged Time
- How much time student spends engaged
* This is the best predictor of student
performance
Question 1
Should we focus on increasing
academic engaged time?
Yes and No
Yes if completing the ABC’s with
correct responses
No if not completing ABC’s
ABC’s of Learning
• Antecedents:
Instructional Directions
Stimulus to respond in the presence of
Pace of instruction
ABC’s Continued
• Behavior:
Topography: Written, verbal, typed
Response rate
Inter-trial interval
Wait times
ABC’s continued
• Consequences:
Feedback (negative/positive)
Immediate
Contingent
Change behavior
123’s
• Rate of accurate responding – This is
what you graph as often as possible
• GPA, Grade, Accuracy – This is what
you graph, report, measure as general
long term goal attainment.
Phases of Problem-Solving
1. Problem Identification
2. Problem Analysis
3. Plan Development
4. Plan Implementation
5. Plan Evaluation
Phase 1: Problem Identification
What is the problem? & Is it a real
problem?
4 Steps of Phase 1:
Problem Identification
1. Operationally Define problem
2. Collect Baseline Data
3. State discrepancy between what is
expected (typical peer performance)
and what is occurring.
4. Identify a replacement behavior
Phase 2: Problem Analysis
What may be contributing to the
behavior?
2 Steps of Phase 2:
Problem Analysis
1. Collect enough of the right data.
2. Generate hypotheses of controllable
variables related to the behavior.
Step 1: Collecting Enough of
the Right Data
• Verbal Reports (e.g. interviews)
• Rating Scales (e.g. SMALSI)
• Record Review (e.g. Cumulative file, homework –
permanent products)
• Direct Systematic Behavioral Observation (e.g.
Interval recording, frequency counts, IAF)
Interviews: Specific Questions to
Teachers related to Academics
• How are instructional assignments
presented?
• What is expected?
• Where is the student currently?
• How are opportunities for practice
presented?
• How is feedback provided?
• What has or has not worked?
Rating Scales
• In my opinion they are useless.
• Problem exists because data tell us so.
• Use them only to support hypothesis or
generate a hypothesis. They are not
“outcome measures”
Observations
• Direct observations of:
– Academic Engaged Time
– Instructional Analysis Form (as an integrity tool)
Testing
• This is NOT WJ, KTEA, WIAT etc.
• IQ NOT needed nor is it helpful.
• Curriculum Based Evaluation is the
process.
Hypothesis: Won’t do Versus
Can’t Do
• Provide reinforcer for reading accurately
(50% increase?)
Reading Hypothesis 1: Error not
important to meaning
• Tally errors and get percent of words
that violate meaning (i.e. would give you
a different sentence understanding).
• Shouldn’t be out of specified range
(~5%).
Reading Hypothesis 2: Code
Structure is the issue
• Read a passage and note errors.
• Errors related to pattern in words?
• Be sure to base this on opportunity for
error not just percentage of errors.
Reading Hypothesis 3: Word
Substitutions are?
• Related to phonics?
– Misses phonetically regular portions of
words
– Can’t read non-sense words
• Not related to phonics?
– Provide assisted self-monitoring
– Maybe not a problem (check whether it affects
meaning)
CBE: Comprehension
Let’s Change our Thinking
• Comprehension is a complex process
• Let’s talk about how a reader “reacts” to
their reading.
– Answering questions, retelling,
paraphrasing, cloze, maze, t/f etc.
9 Causes of Comprehension
Failure
• These are 9 things that a good reader
does that a poor reader doesn’t.
• If you want a cool round number (the
top “10” reasons) the 10th is Insufficient
reinforcement.
Strategies of Comprehension
• Monitor for meaning and self-correct
• Selective attention to text: Skimming,
going over closely
• Adjust for Text Difficulty: Change rate,
rereading, highlighting
• Connect with Prior Knowledge:
• Clarify: Figure it out in some way to
make it make sense (Ask for help?;
Google)
Enablers of Comprehension
• Decoding: 140 wcpm (after 3rd grade)
• Vocabulary (Semantics) – 70% of the
variability!
– Definitions
– Determining Word Meaning
• Grammar (Syntax): Rare, but could be ESL
• Prior Knowledge
CBE of Math
Mathematics Areas
• Computation: Accurately and quickly
responding with symbols of quantity
• Concepts: Rules
• Strategies: Need to be efficient
• Facts: Numerical statements
• Application: Using math
– Sub-domains: Tool use, content knowledge, and
Vocabulary
• Problem-Solving: Using both computation and
application.
Math Assessments
• Irrelevant standards
• Irrelevant formats
• Lack empirically validated sequencing
• Inadequate samples of student behavior
• Provide little insight into why errors are made
• Not aligned with instructional objectives
Interviewing & Error Analysis
• 2 ways of collecting information for the
development of a hypothesis
• Interviewing: See Instructional Analysis
Form, Previous questions to ask
teachers
• Error Analysis: Need a lot of problems
of the same type (Facts, operations,
applications)
Example of CBE: Tammy
• Fourth-grade student
• Did not make adequate progress with the Tier
II standard protocol intervention in winter
• School psychologist administered an
individual probe (i.e., diagnostic tool) and
observed Tammy’s completion of this probe
• An analysis of responding yielded a diagnosis
of the problem
• This diagnosis of the problem informs
intervention selection
1. What seems to be
the problem?
2. What should the
intervention target?
3. Describe something a
teacher could do to target
this problem.
4. Do you have to buy
an expensive program
just for Tammy?
Setting Goals with BMC Considered
• Consider the Basic Movement Cycle (BMC)
– Think of it as a "handicap"
• Task Mastery Rate (TMR) = 50/minute
• Current BMC = 75/minute
• Expected BMC = 100/minute
• Formula: (TMR * Current BMC) / Expected BMC
– (50 * 75) / 100 = 37.5
– With the current BMC, the student should be able to make 37.5 DCPM
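A minimal sketch of the BMC-adjusted goal formula from the slide; the function name is my own.

```python
# Adjusted goal = (Task Mastery Rate * Current BMC) / Expected BMC
def bmc_adjusted_goal(task_mastery_rate, current_bmc, expected_bmc):
    return (task_mastery_rate * current_bmc) / expected_bmc

# Slide example: TMR = 50/minute, current BMC = 75/minute, expected BMC = 100/minute
print(bmc_adjusted_goal(50, 75, 100), "digits correct per minute")  # 37.5
```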
Step 2: Writing a Hypothesis
• Provide the discrepancy statement
• Add because… at the end of the discrepancy
statement and insert your hypothesis.
• The hypothesis should be specific, observable, and
measurable.
– Example:
Beth is on-task for 35% of intervals while peers are on-task
87% of intervals during a 20-minute observation during direct
instruction in Math class, because she is escaping the Math
work which is above her instructional level.
Consider Multiple Domains: ICEL
Instruction
Curriculum
Environment
Learner
Phase 3: Plan Development
Linking Assessment to
Intervention
2 Steps of Phase 3:
Plan Development
1. Set a Goal
2. Develop a plan based on the
hypothesis (ICEL)
Step 1: Set a goal
• Goal should be to bring the student’s
performance into acceptable levels relative to
peers or Criterion.
Determining Long range Goal
• Multiply number of weeks that you will
be monitoring by the criterion (Expected
ROI).
• Add this number to the median baseline
point
• Example:
– Median baseline point = 35
– Number of weeks = 10
– Expected rate of growth (based on norms
or a suggested rate)
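A minimal sketch of the long-range-goal arithmetic described above. The median baseline (35) and the 10 weeks come from the slide's example; the 1.5 words-correct-per-week growth rate is an assumed stand-in for a norm-based or suggested rate.

```python
# Long-range goal = median baseline + (weeks of monitoring * expected weekly rate of improvement)
def long_range_goal(median_baseline, weeks, expected_roi_per_week):
    return median_baseline + weeks * expected_roi_per_week

print(long_range_goal(median_baseline=35, weeks=10, expected_roi_per_week=1.5))  # 50.0
```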
[Progress monitoring chart: WRCPM across 16 weeks, showing the baseline phase and the intervention phase.]
Writing IEP Goals
• Long range Goal
In ___ (total # of weeks), when presented
with math problems from ____ (curriculum
and grade level), ____ (student's name)
will perform ____ (long-range goal) with
_____ errors or fewer.
Writing IEP Goals
• Short term objective
Each successive week, when presented
with a random selection from _____
(curriculum and grade level), ____
(student's name) will perform at an
average increase of _____ DCPM and no
increase in errors.
Step 2: Develop Intervention
based on Instructional Level
• Rule out motivation deficits
• Consider multiple topographies
• Consider stage of learning
• Match learning-stage principles to instructional components
Phase 4: Plan Implementation
Support & Integrity
3 Steps of Phase 4:
Plan Implementation
• Specify when & where (Steps 1 & 2)
– Not the entire day in the beginning.
– Not everywhere in the beginning.
• Specify who & what (Steps 3 & 4)
– Implementer(s)
– Integrity Monitor
– Data collector/analyzer
– Materials
• Specify how (Step 5)
– Articulation Form
Instructional Analysis Form
Rows (skill areas): Phonemic Awareness, Phonics, Reading Fluency, Vocab, Comp
Columns: Teaching Strategy | Materials | Format | Allocated Time | Reward or Reinforcer | Method of Assessment
Intervention Fidelity Checklist
Implementer: _____________   Intervention: _________________________
Observer: _________________   School: ______________________________
Student: __________________   Time/Location: _______________________
Grade: ____________________   Teacher: _____________________________

Columns: Step | Date | Date | Date | Date | Date
Rows: Steps 1-5 of the intervention, checked off for each date observed
Bottom row: Daily Fidelity Percentage ____% for each date
Phase 5: Plan Evaluation
3 Steps of Phase 5: Plan
Evaluation
Answer the following Questions: Is the
intervention plan effective?
A. Is the student making progress toward the goal?
B. Is the student decreasing the discrepancy
between him/her and the general education
peers?
C. Is the plan able to be maintained in the general
education/current setting with current level of
support?
You cannot evaluate an intervention if integrity is not
maintained. No implementation, no evaluation, no change.
Questions