MTSS Contact - Brevard Public Schools


Shelly Dickinson, MTSS Trainer
Charlie Eccleston, MTSS Trainer
What do we want you to Know?

The types of data used within the MTSS tiers
What do we want you to Understand?

How to use aim lines and trend lines to guide the student decision-making process
What do we want you to be Able to do?

Share your knowledge at your school
Analyze data and make decisions
How BIG is the GAP? How much TIME do we have to close it?
[Slide graphic: Instruction, Curriculum, Environment, Learner, and Data]

Capacity to Problem-Solve
Capacity to Collect Data and Make Sense of It
Capacity to Deliver Instruction at Different Intensities (Tiered Levels of Service)
Capacity to Display Data Over Time

Discuss with a partner: In which component(s) do you feel your school is doing well?





Analyze the Past: How did we do? What can we do better?
Plan for Today, Drive Our Instruction: What should we do differently?
Diagnose: What specifically is the issue?
Progress Monitor: Is what we are doing working?
Predict the Future: Trends, student outcomes
Four Purposes for Assessing within MTSS
Formative
1) Screening: identify students at risk for academic difficulty
2) Diagnostic: provide an in-depth, reliable assessment of
targeted skills
3) Progress Monitoring: determine whether the student is
responsive to given instruction
Summative
1) Outcome: student demonstrates accepted level of mastery



Materials:
Assessment Mat
Assessment Resources
At your table, discuss the types of assessments, looking at the different resources provided
Formulate an assessment guide to take back to your building
Three Types of CBMs (Curriculum-Based Measurements)

PRIMARY USES
• General Outcome Measures (GOMs): Screening; Survey-level testing; Progress monitoring
• Skills-Based Measures (SBMs): Screening; Survey-level testing; Progress monitoring; To target content areas of concern
• Mastery Measures (MMs): Diagnostic evaluation; Specific-level testing; To target different proficiency levels and response types

STRUCTURE
• GOMs: Uses global/interactive tasks; Separate skills are not isolated or marked; Skills are usually sampled across a whole year's curriculum; Targets long-term goals
• SBMs: Composed of mixed items drawn from a set of goals; Separate skills may be isolated or marked; Items are referenced to skills and/or proficiency levels; Items are often cross-referenced to goals
• MMs: May only test one specific skill or short-term instructional objective; A large sample of performance is collected on each skill; Often includes common classroom tasks; Some skills may be examined in isolation

Source: The ABCs of CBM by Hosp, Hosp, and Howell
Three Types of CBMs (Curriculum-Based Measurements), continued

ADVANTAGES
• GOMs: Provides perspective; Gives an overall impression of skill level; Useful for monitoring; Illustrates retention and generalization
• SBMs: Gives an overall impression of skill level; Provides brief measures; Useful for monitoring; Illustrates retention; Sensitive to growth over time
• MMs: Useful for double-checking a problem indicated on a GOM or SBM; Useful for checking hypotheses about missing skills or subskills; Provides focus

DISADVANTAGES
• GOMs: Provides little diagnostic information; Doesn't provide information about specific skills; Often includes a high proportion of items that are either above or below the student's skill level; Some content areas don't have convenient capstone tasks
• SBMs: Small sample for each goal limits diagnostic utility; Often includes a high proportion of items that are either above or below the student's skill level; May not require generalization or interactive use of the skill
• MMs: Doesn't provide the big picture (no generalization or application); Skill-subskill relationship may not be real; Should not be used for progress monitoring

Source: The ABCs of CBM by Hosp, Hosp, and Howell
Using Progress Monitoring within the MTSS Framework

Progress-monitoring measures are ongoing assessments conducted for the purposes of:
• Guiding Instruction
• Monitoring Student Progress
• Evaluating Instruction/Intervention Effectiveness

Progress Monitoring Data determines students' Response to Instruction using:
Tier 1 Data
• Universal Screenings (GOMs)
• Inventories
• District Assessments
• Tier 1 Unit/Weekly Assessments
Tier 2 Data
• Collecting intervention data at least every 2 to 3 weeks (IPST Form 7)
• ORF, MAZE, DIBELS Next, CBMs
• Teacher-Made Assessments (MM)
Tier 3 Data
• Weekly (IPST Form 7)
• Measuring Specific Targeted Skills (SBM & GOM)
• Continually adjusting instruction based on OPM data to meet the student's needs
Qualities of progress-monitoring measures: Brief & Easy, Frequent, Sensitive to Growth, Equivalent Measurements
Graph Components
[Slide graphic: a progress-monitoring graph showing the Baseline, Goal, Aim Line, Trend Line 1, Trend Line 2, and Instructional Change Line across Intervention #1 and Intervention #2 (group or individual); the vertical axis is the skill in equal increments and the horizontal axis is time in equal increments]
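To make these components concrete, here is a minimal Python sketch (not part of the training materials) of how such a progress-monitoring chart could be drawn with matplotlib; the weekly scores, baseline, goal, and the week of the instructional change are all invented for illustration.

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical weekly scores (words correct per minute); all values invented.
weeks = np.arange(1, 13)
scores = np.array([42, 44, 43, 46, 47, 45, 50, 52, 53, 55, 57, 58])
baseline, goal, goal_week = 40, 90, 26   # assumed baseline, year-end goal, goal week
change_week = 6                          # assumed start of Intervention #2

fig, ax = plt.subplots()
ax.plot(weeks, scores, "o-", label="OPM data")

# Aim line: connects the baseline to the year-end goal in a straight line.
ax.plot([0, goal_week], [baseline, goal], "--", label="Aim line")

# Trend line: least-squares fit through the observed scores.
slope, intercept = np.polyfit(weeks, scores, 1)
ax.plot(weeks, slope * weeks + intercept, label=f"Trend line ({slope:.1f}/week)")

# Instructional change line: marks where the intervention was changed.
ax.axvline(change_week, color="gray", linestyle=":", label="Instructional change")

ax.set_xlabel("Time - equal increments (weeks)")
ax.set_ylabel("Skill - equal increments (WCPM)")
ax.legend()
plt.show()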
Let's Practice: Creating a Graph with an Aim Line & a Trend Line
Hint: Use the Grades 3-6 Assessment Decision Tree to determine the year-end goal.
[Practice graph: Ana's Aim Line and Trend Line, with WCPM (60-130) on the vertical axis and Weeks (0-26) on the horizontal axis]
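As a worked example for this practice exercise, the short sketch below (with invented numbers; the real year-end goal should come from the Grades 3-6 Assessment Decision Tree) shows how the aim line is defined: it runs from the student's baseline score to the year-end goal, and its slope is the average weekly growth needed to stay on track.

# Hypothetical values for illustration only.
baseline_wcpm = 70      # Ana's median baseline score (assumed)
goal_wcpm = 120         # year-end goal (assumed)
weeks_to_goal = 26      # weeks between the baseline and the goal date

# Aim-line slope: the growth per week needed to reach the goal on time.
aim_slope = (goal_wcpm - baseline_wcpm) / weeks_to_goal
print(f"Aim line runs from {baseline_wcpm} to {goal_wcpm} WCPM; "
      f"expected growth is about {aim_slope:.1f} WCPM per week")   # about 1.9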
Making Decisions: Using Data to Move Between Tiers
[Slide graphic: Intensity of Intervention increasing from Core Instruction to Supplemental Instruction to Intensive Instruction, with decision rules governing movement between tiers]
Is the rate of progress acceptable?
If not, why, and what should we do about it?
◦ Frequency and amount of intervention
◦ Instructional strategy
◦ Opportunity for practice and application
◦ Attendance
◦ Fidelity of instruction/intervention implementation
◦ Group size
◦ Other factors?
Choices: try another intervention, modify the existing intervention, other?
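One common way to apply decision rules like these (a rule-of-thumb sketch, not an official Brevard procedure) is to compare the slope of the student's trend line with the slope of the aim line, as below; the tolerance cut point is an assumption, since districts set their own decision rules.

def response_to_intervention(trend_slope, aim_slope, tolerance=0.25):
    """Classify a student's response by comparing the trend and aim slopes.

    The tolerance (fraction of the aim-line slope) is a placeholder;
    replace it with your district's actual decision rule.
    """
    if trend_slope >= aim_slope:
        return "positive"       # gap is closing at or above the expected rate
    if trend_slope >= aim_slope * (1 - tolerance):
        return "questionable"   # growing, but probably not fast enough
    return "poor"               # gap is staying the same or widening

# Example: the aim line requires 1.9 WCPM/week, but the trend is 0.95 WCPM/week.
print(response_to_intervention(trend_slope=0.95, aim_slope=1.9))   # -> poor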
MTSS Procedural Overview Flowchart (pages 40-42)
Response to Intervention
[Slide graphic: three response patterns - Positive, Questionable, and Poor - shown as an Observed Trajectory versus the Expected Trajectory, with Performance on the vertical axis and Time on the horizontal axis]
Bart - OPM Reading Fluency
[Graph: Words Correct Per Minute (0-100) over School Weeks from Sept to Feb, showing the Baseline, Goal, Aimline, Intervention Cycle 1, and Intervention Cycle 2; weekly scores range from roughly 18 to 31 WCPM; Trendline = 0.95 words/week]
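A trendline value such as "0.95 words/week" is typically obtained by fitting a straight line through the student's weekly scores. The sketch below shows one common method (ordinary least squares via numpy); the scores are invented, and the slide's figure may have been computed from different points or with a different method (for example, the split-middle approach), so the result here will not match it exactly.

import numpy as np

# Invented weekly ORF scores for illustration.
weeks = np.arange(1, 11)
wcpm = np.array([22, 18, 21, 24, 25, 26, 28, 30, 31, 28])

# Least-squares trend line: the slope is the average observed growth per week.
slope, intercept = np.polyfit(weeks, wcpm, 1)
print(f"Trend line slope: {slope:.2f} words/week")   # about 1.23 for these scores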
How BIG is the GAP? How much TIME do we have to close it?
[Graph: WCPM (30-54) plotted over Weeks, from a 9/20 Baseline through 4-Oct, 8-Oct, and 1-Nov]
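Both questions on this slide reduce to simple arithmetic once the benchmark, the student's current level, and the time remaining are known; the numbers in the sketch below are invented to show the calculation.

# All numbers are invented for illustration.
current_wcpm = 34       # most recent median score
benchmark_wcpm = 52     # expected benchmark at the target date
weeks_remaining = 12    # time left to close the gap

gap = benchmark_wcpm - current_wcpm        # how BIG is the gap?  -> 18 WCPM
required_rate = gap / weeks_remaining      # growth needed per week -> 1.5 WCPM
print(f"Gap = {gap} WCPM; closing it requires {required_rate:.1f} WCPM of growth "
      f"per week for the next {weeks_remaining} weeks")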