Transcript Slide 1

I-RtI Network
Aligning Tier 2: Decision
Rules, Screening and
Diagnostic Tools
Date
Facilitated/Presented by:
Insert name(s) here
The Illinois RtI Network is a State Personnel Development Grant (SPDG) project of the Illinois State Board of
Education. All funding (100%) is from federal sources.
The contents of this presentation were developed under a grant from the U.S. Department of Education, #H325A100005-12.
However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not
assume endorsement by the Federal Government. (OSEP Project Officer: Grace Zamora Durán)
Agenda graphic: Check-in, Making Connections, What, Applying, Review
"One of the best ways to remember something is to test yourself."
Review September Extension Activity Action Plans
Outcomes
Review Pre-Meeting Survey Results
I-RtI Network
TIER 2 MODELS
Definitions of 2 Approaches to Tier 2

Problem Solving
• Uses interventions, selected by a team, that target each student's individual needs. This approach has been used in schools for more than two decades.

Standard Protocol
• Uses one consistent intervention, selected by the school, that can address multiple students' needs. This approach is supported by a strong research base.

The IRIS Center, 2007
Comparison of RTI Approaches
The IRIS Center, 2007
The Two Models of RTI:
Standard Protocol and
Problem Solving
Edward S. Shapiro
Center for Promoting Research to
Practice
Comparison of Two Models

Standard Protocol
• ADVANTAGES
• DISADVANTAGES

Problem Solving
• ADVANTAGES
• DISADVANTAGES
Which model fits?
• Which model seems most efficient for my resources
and the needs of my students?
• Am I concerned that without problem solving on each
individual student, I will not meet their needs?
• Am I concerned that problem solving on each
individual student is too time consuming?
• How can I evaluate the success of the model chosen?
• How can systemic problem solving assist in the
adoption of a model that is effective and efficient?
EC Fidelity
Checklist
How to Create a Standard Protocol
• Existing Data Can be Used to Predict Intervention Needs
  – Look at your referral history
    • 85% of referrals are for the same 7-8 reasons (Batsche, 2008); a simple tallying sketch follows this list
  – Look at your previous intervention history
    • Predicts future intervention history by identifying the most common student needs
    • Indicates training needs
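The sketch below is one minimal, illustrative way to run that referral-history tally. The reason labels, the counts, and the "top 8" cutoff are assumptions for the example, not district data.

from collections import Counter

def summarize_referral_history(referral_reasons, top_n=8):
    # Tally historical referral reasons and report how much of the total
    # the most common top_n reasons cover.
    counts = Counter(referral_reasons)           # reason -> number of referrals
    total = sum(counts.values())
    top = counts.most_common(top_n)              # most frequent reasons first
    covered = sum(n for _, n in top) / total     # share explained by the top reasons
    return top, covered

# Illustrative referral records; real use would read a district's referral log.
reasons = ["decoding", "fluency", "decoding", "math facts", "behavior",
           "fluency", "comprehension", "decoding", "fluency"]
top, covered = summarize_referral_history(reasons, top_n=3)
print(top, round(covered, 2))   # top 3 reasons cover about 0.78 of these referrals

If the top handful of reasons cover most referrals, a standard protocol built around those needs will serve most referred students.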
Elem. Example: Heartland AEA, Johnston, Iowa

Step 1: Universal Screening
All students at a grade level are screened in Fall, Winter, and Spring.
Teacher will make sure:
1. All students have been given the DIBELS assessment
2. All data has been entered
3. A copy of the class-wide data is printed
Questions/concerns: Contact Building Principal

Step 2: Review Class-Wide Data
Expected distribution: roughly 80-90% at benchmark (C), 5-10% needing supplemental support (S), and 1-5% needing intensive support (I).
Teacher will:
1. Calculate what percent of the class is at benchmark (a short calculation sketch follows this example)
2. If below 80%, determine "core" instructional needs (beef up based on data)
Questions/Concerns: K-3 contact _____; 4-6 contact _____ (building teacher with reading background)

Step 3: Additional Diagnostic Assessment and Instruction
C: no additional diagnostic assessment; continue with core instruction.
S: group diagnostic; small-group instruction differentiated by skill.
I: individual diagnostic; individualized, intensive instruction.
Teacher will:
1. Place all non-proficient students into the 4-Boxes
2. Determine if there is a need for additional diagnostic assessment(s); see grade-level sheet
3. Ensure diagnostic assessments are given
4. Bring all data to grade-level meetings
Questions/Concerns: K-3 contact Reading Teachers; 4-6 contact _____ (building teacher with reading background)

Step 4: Results Monitoring
C: grades, classroom assessments, yearly ITBS/ITED. S: monitored 2 times/month. I: monitored weekly.
Grade-Level Data Meetings:
1. Discuss briefly additions/changes made to core
2. Share 4-Box data and other diagnostic data results
3. Group kids with similar instructional needs (compare to prior grouping, if available)
4. Complete the group intervention plan form (one per group)
   - Who, what, when, where of instruction
   - Who, what, when, where of monitoring
   - Who and when of parent notification
   NOTE: If any changes are made during the intervention period, document on the form.
5. Attach an implementation log and graphs
6. Set a date to meet back for check-in (4-6 weeks)
Questions/Concerns: District-Based Team & IDM Team, Content Specialist
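A minimal sketch of the Step 2 arithmetic above: the 80% rule comes from the flowchart, while the benchmark value, the scores, and the function name are illustrative assumptions.

def review_classwide_screening(scores, benchmark, core_threshold=0.80):
    # Step 2 arithmetic: percent of the class at or above the benchmark,
    # and whether core instruction needs to be strengthened first.
    if not scores:
        raise ValueError("no screening scores provided")
    at_benchmark = sum(1 for s in scores if s >= benchmark)
    pct = at_benchmark / len(scores)
    needs_core_work = pct < core_threshold      # "beef up" core based on data
    return pct, needs_core_work

# Illustrative class of ten screening scores with a benchmark cut of 40
pct, needs_core_work = review_classwide_screening(
    [52, 38, 61, 47, 29, 55, 44, 31, 58, 49], benchmark=40)
print(f"{pct:.0%} at benchmark; strengthen core first: {needs_core_work}")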
Combining Standard
Protocol and Problem
Solving
How is your understanding of these two models?
I am more confused now about these
different models than when I got here.
I have about the same understanding
of them as when I got here.
I have a better understanding of these
models than when I got here.
Questions/Comments
EC Fidelity Checklist #3
“Explanation of district
philosophy and approach to
providing Tier 2 support.”
•Who else will need to be
involved?
•Where in your manual will this
explanation be included?
•What is your plan to
communicate to stakeholders?
I-RtI Network
SCREENING AND DIAGNOSTIC
TOOLS
PAY ATTENTION TO TYPES OF ASSESSMENT
Screening: Assessment tools designed to collect data for the purpose of measuring the effectiveness of core instruction and identifying students needing more intensive interventions and support.

Diagnostic: Formal or informal assessment tools that measure skill strengths and weaknesses, identify skills in need of improvement, and assist in determining why a problem is occurring.

Progress Monitoring: Ongoing assessment conducted for the purposes of guiding instruction, monitoring student progress, and evaluating instruction/intervention effectiveness.

Formative: Ongoing assessment embedded within effective teaching to guide instructional decisions.

Summative: Typically administered near the end of the school year to give an overall perspective of the effectiveness of the instructional program.
Characteristics

Universal Screener
• Brief
• Repeatable
• Sensitive
• Indicative of risk
• Predictive

Diagnostic
• Requires more time
• Can be group or individual
• Identifies deficit skill
• Helps link skill deficit to intervention
Universal Screening

EXAMPLES
• DIBELS
• AIMSweb
• mClass
• STAR
• easyCBM

USE
Identify children who need more intense assessment to determine the potential for intervention.

PURPOSE
"First Alert"

QUESTIONS
Who is at risk?
Who may need additional assistance?
Who needs close monitoring?
Extreme Off Track: 2-3 years behind; no chance for graduation in a traditional school setting

Risk Factors (counted in the sketch after this slide):
1. Disengagement (20% absenteeism)
2. Behind in Credits (particularly core course failures)
3. GPA less than 2.0
4. Failed FCAT

High Off Track: 3 or more risk factors
Off Track: 2 of 4 risk factors indicated; students entering with 20% absenteeism and/or 2 or more F's in 8th Grade
At Risk for Off Track: 1 of 4 risk factors indicated
On Track: No risk factors indicated

Hendry County Schools
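Below is a minimal sketch of how a risk-factor count maps onto these categories. The four factors and the count thresholds come from the slide; the function names and example values are illustrative, and the "Extreme Off Track" judgment (years behind grade level) is not modeled here.

def count_risk_factors(absentee_rate, behind_in_credits, gpa, failed_fcat):
    # Count how many of the four risk factors apply to one student.
    factors = 0
    if absentee_rate >= 0.20:    # 1. Disengagement: 20% or more absenteeism
        factors += 1
    if behind_in_credits:        # 2. Behind in credits (core course failures)
        factors += 1
    if gpa < 2.0:                # 3. GPA less than 2.0
        factors += 1
    if failed_fcat:              # 4. Failed FCAT
        factors += 1
    return factors

def track_status(n_factors):
    # Map a risk-factor count onto the slide's categories.
    if n_factors == 0:
        return "On Track"
    if n_factors == 1:
        return "At Risk for Off Track"
    if n_factors == 2:
        return "Off Track"
    return "High Off Track"      # 3 or more risk factors

# Illustrative student: 25% absenteeism, behind in credits, GPA 1.8, passed FCAT
print(track_status(count_risk_factors(0.25, True, 1.8, False)))   # High Off Track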
Lower Performing Schools
Screening Plan
When a considerable number of
students fail to demonstrate strong
basic skills
Mark Shinn 2013
Methods for grouping students
IIRC
WORKSHEET FOR USING DISCOVERY EDUCATION TEST RESULTS
Teacher:___________________________________ Class:_________________________________
Circle One:
Test 1
Test 2
Test 3
Subject:_______________________________
Answers Report:
Did all the students finish the test?
Yes
No
Who needs to finish?_____________________________________________________________________
Student Sub Skill Report:
Look at the number of items per skill. Put the skills in order according to the weighting.
1
2
3
4
5
6
Class Summary Report:
In your class, which skills have the greatest amount of red (not green)?
1
2
3
Objective and Subskills Report:
Rank your test items:
Easy (Not Green) = 1, Moderate (Not Green)=2, Hard (Not Green)=3
Consider your non-green skills, the state weighting and your subskill rankings.
Identify your target skills:
Proficiency Range (Proficiency Tables on the Help Menu):
Below =
Meets =
Exceeds =
Student Report:
For your target skill, list the students in the red, yellow and green levels. Highlight students who are below expectations or on the “bubble.”
Red
Yellow
Green
Diagnostic Assessment

Reading
• QPS
• PAST
• GORT-IV
• PALS
• CTOPP

Math
• STAR Math
• TEAM
• TEMA-3
• KeyMath
• MRI
• iSTEEP
USE
Only when additional information is necessary for planning instruction and intervention.

PURPOSE
"In-depth View"

QUESTIONS
What are the strengths? The weaknesses?
Are other students exhibiting similar profiles?
Tier 2: Can't Do/Won't Do
• Can't Do/Won't Do Assessment
• Individually administered
• Treasure chest of rewards if score increases (see the comparison sketch below)
Adapted from Amanda VanDerHeyden
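The sketch below illustrates the underlying comparison, assuming one baseline probe and one repeat probe given under the reward condition. The 20% gain criterion and the function name are illustrative assumptions, not VanDerHeyden's published decision rule.

def cant_do_wont_do(baseline_score, incentive_score, min_gain=0.20):
    # Compare an individually administered baseline probe with a repeat probe
    # given under the reward condition. A clearly higher incentive score points
    # to a performance ("won't do") problem; little or no gain points to a
    # skill ("can't do") problem. The 20% gain criterion is illustrative only.
    if baseline_score <= 0:
        return "won't do (performance)" if incentive_score > 0 else "can't do (skill)"
    gain = (incentive_score - baseline_score) / baseline_score
    return "won't do (performance)" if gain >= min_gain else "can't do (skill)"

print(cant_do_wont_do(baseline_score=18, incentive_score=30))   # won't do (performance)
print(cant_do_wont_do(baseline_score=18, incentive_score=19))   # can't do (skill)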
What would you change?
Screening
• R-CBM

Diagnostic
• R-CBM Error Analysis
• CORE Phonics Survey
• RM and Horizon Placement Tests

Progress Monitoring
• R-CBM (weekly)
• MAZE (weekly)
• Word Reading List (weekly)

Formative
• Modified unit assessments (monthly)
• Teacher created assessment (monthly)

Summative
• MAP
• ISAT
How is your understanding of screening and diagnostic assessments?
I am more confused now about these
types of assessments than when I got
here.
I have about the same understanding
of them as when I got here.
I have a better understanding of these
types than when I got here.
Questions/Comments
EC Fidelity Checklist #4
“Description of assessments
for Tier 2 decision making.”
•Who else will need to be
involved?
•Where in your manual will this
explanation be included?
•What is your plan to
communicate to stakeholders?
Data Type: What We Have by Purpose of Assessment

INSERT AREA YOU WOULD LIKE TO AUDIT
• Screening
• Diagnostic
• Progress Monitoring
• Outcomes

INSERT AREA YOU WOULD LIKE TO AUDIT
For each purpose (Screening, Diagnostic, Progress Monitoring, Outcomes), note:
• Redundancies
• Gaps
• Fully utilized for decision making
• Communication to Stakeholders

Reading: Screening, Diagnostic
Math: Screening, Diagnostic
Teaming Reflection
I-RtI Network
DECISION RULES
Decision Rules for…
Addition of Tier 3
Effectiveness of Tier 2
Progress within Tier 2
Entry to/Exit from Tier 2
How are Decision Rules Derived?
• Research Norms
• National Norms
• Local Norms
• Criterion-Referenced
Benchmarks
Method: Research Norms
• Advantages: Easily available
• Disadvantages: May be a small sample size; unless large, may not be representative

Method: Local Norms
• Advantages: Readily available in many districts; accurate comparison to peers
• Disadvantages: Need a large norm base; variance across district

Method: Criterion-Referenced Benchmarks
• Advantages: Can be flexibly applied to a wide range of skills
• Disadvantages: Can be arbitrary
Percentile Rank Cut Scores
– Derived scores that indicate the percentage of people in the norming sample that scored at or below a given raw score.
– Percentile rank scores for referred students typically are derived from local norms (a short calculation sketch follows below).
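A minimal sketch of that calculation, assuming a local norm sample of raw scores is already on hand. The 25th-percentile cut, the sample values, and the function names are illustrative assumptions, not a recommended criterion.

def percentile_rank(raw_score, norm_scores):
    # Percentage of the norming sample scoring at or below the raw score.
    if not norm_scores:
        raise ValueError("empty norming sample")
    at_or_below = sum(1 for s in norm_scores if s <= raw_score)
    return 100.0 * at_or_below / len(norm_scores)

def below_cut(raw_score, norm_scores, cut_percentile=25):
    # Flag a referred student whose percentile rank, computed against local
    # norms, falls below the cut score (25th percentile here, for illustration).
    return percentile_rank(raw_score, norm_scores) < cut_percentile

# Small, made-up local norm sample of raw scores
local_norms = [22, 35, 41, 47, 50, 53, 58, 62, 67, 74]
print(percentile_rank(40, local_norms))   # 20.0
print(below_cut(40, local_norms))         # True -> candidate for Tier 2 support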
Percentile Rank Cut Scores

Advantages
• Easily Interpreted and Understood
• Readily Available at Multiple Levels (School, District, State, National)
• Gives a Clear Picture of Educational Need and Benefit

Disadvantages
• Needs a Large Norm Base
• Norms Typically Not Available Outside of Basic Skills and CBM/DIBELS
• Requires Commitment to Benchmark Assessment
Example of Percentile Rank Norms
Standards-Based Approaches

Advantages
• Allows Decisions to be Tied to High-Stakes Tests Like the ISAT
• Allows a Focus on Needs of Groups of Students Rather than Just Individual Students

Disadvantages
• Requires that the Linkage and "Cut-Scores" be Established Empirically
• Difficult to Implement if There is Not a Commitment to a 3-Tier Model of Intervention
EC Fidelity Checklist #5
“Decision making
criteria related to Tier 2”
•Who else will need to be
involved?
•Where in your manual will this
explanation be included?
•What is your plan to
communicate to stakeholders?
RESPONSE
TYPES
Positive
Questionable
Poor
Did things go according to
plan?
What’s your Decision?
Progress Monitoring Decision
Rules
Decision Rules
Positive Response:
• The gap is closing.
• Can extrapolate the point at which the target student will "come in range" of peers (a trend-line sketch follows below).

Intervention Decision
• Continue intervention until student reaches benchmark.
• Decide how long after benchmark the intervention will be continued.
• Fade intervention to determine if student has acquired functional independence.
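The sketch below shows one common way to do that extrapolation: fit a least-squares trend line to weekly progress-monitoring scores and project when the student reaches the benchmark. It assumes roughly linear growth and equally spaced weekly probes; the example scores and benchmark are illustrative.

def weeks_to_benchmark(weekly_scores, benchmark):
    # Fit a least-squares trend line to weekly progress-monitoring scores and
    # extrapolate how many more weeks until the student reaches the benchmark.
    # Returns 0 if already at benchmark, None if the trend is flat or falling.
    n = len(weekly_scores)
    if n < 3:
        raise ValueError("need at least three data points to trust a trend")
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(weekly_scores) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, weekly_scores))
             / sum((x - mean_x) ** 2 for x in xs))
    latest = weekly_scores[-1]
    if latest >= benchmark:
        return 0
    if slope <= 0:
        return None                       # gap is not closing on this trend
    return (benchmark - latest) / slope   # weeks of growth still needed

# Illustrative oral-reading-fluency scores over six weeks, benchmark of 90
print(weeks_to_benchmark([52, 55, 59, 62, 66, 70], benchmark=90))   # about 5.6 weeks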
Decision Rules
Questionable Response:
• Rate at which the gap is widening slows considerably, but the gap is still widening.
• Gap stops widening, but closure does not occur.

Intervention Decision
• Increase intensity of current intervention for a short period of time and assess impact.
• If rate improves, continue.
• If rate does not improve, return to problem solving.
Decision Rules
Poor Response:
• Gap continues to widen with no change in rate.

Intervention Decision
• Return to problem solving for new intervention.
Decision Making
What would you decide?
Data Indicates: Group median is below mastery
range and most students are gaining digits
correct per week.
Action: Consider implementing
intervention for an additional week and
then review progress again.
Adapted from VanDerHeyden 2008
Decision Making
What would you decide?
Data Indicates: Class median is below target and most students are NOT gaining digits correct per week.
Action: Check integrity FIRST, and address with training if needed. Consider implementing intervention for an additional week with incentives or easier task and then review progress again. (Both of these rules are sketched below.)
Adapted from VanDerHeyden, 2008
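A minimal sketch of both rules above, assuming each student has a short series of weekly digits-correct scores. The median-versus-target comparison and the two actions come from the slides; the "gaining" test (last score above first score), the target value, the data, and the function name are illustrative simplifications.

from statistics import median

def group_decision(weekly_digits_correct, target):
    # Apply the two example rules above. Each value is a student's list of
    # weekly digits-correct scores; "gaining" is simplified to last > first.
    latest = [scores[-1] for scores in weekly_digits_correct.values()]
    group_median = median(latest)
    gaining = sum(1 for s in weekly_digits_correct.values() if s[-1] > s[0])
    most_gaining = gaining > len(weekly_digits_correct) / 2
    if group_median < target and most_gaining:
        return "Continue the intervention one more week, then review progress again"
    if group_median < target:
        return ("Check intervention integrity first (retrain if needed), then run "
                "another week with incentives or an easier task and review again")
    return "Median at or above target: review individual students and consider fading"

# Illustrative three-student group with three weeks of digits-correct data
data = {"A": [18, 21, 25], "B": [12, 15, 17], "C": [20, 20, 19]}
print(group_decision(data, target=40))   # continue one more week, then review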
How are you feeling about your
Decision Rules Toolbox?
I have fewer tools in my decision rules
toolbox than when I got here.
I have about the same tools in my
decision rules toolbox as when I got
here.
I have more tools in my decision rules
toolbox than when I got here.
Questions/Comments
Manual
Work
“In theory, there is
no difference
between theory and
practice;
in practice, there is.”
Yogi Berra
Building Level, Day 2
I-RtI Network
SAPSI-D
Administration
SAPSI-D
DUE
OCTOBER 31,
2013
SAPSI-D Webinar
SAPSI-D Webinar
I-RtI Network
COACHING TIER 2 IMPROVEMENT
What can a coach do?
Model of
Supports
SAPSI-D
Teaming
Decision
Rules
Assessment
Partnership Principles
• Equality
• Choice
• Voice
• Reflection
• Dialogue
• Praxis
• Reciprocity
Knight, J. (2011). Unmistakable Impact
Components of Coaching
• Enroll
• Explore
• Identify
• Refine
• Explain/Mediate
• Model
• Observe
Knight, J. (2011). Unmistakable Impact
How are you feeling about coaching these processes?
I have less confidence in coaching these issues than when I got here.
I have about the same confidence in coaching these issues as when I got here.
I have more confidence in coaching these issues than when I got here.
Questions/Comments
Key Ideas in Aligning Tier 2
1. Pros and Cons of SP and PS
2. Screening and Diagnostic Tools
3. District Tools Audit
4. Decision Rules
5. Ongoing manual work
6. SAPSI-D Administration
Extension
Activity
Planning
Manual Work:
Development and refinement of
district and building level
decision rules.
Who needs to be involved in this
process?
When can this work be done?
How will you communicate the
final product?
SAPSI-D Administration
Closing
Activities
Technical
Assistance
Planning