When School Reform Works: Transcript

U.S. Department of Education's

RTI Training Series 2008
LEA Forum: Session 3
Monitoring Student Progress &
Fidelity of Implementation
May 8th-9th
Craig A. Albers, Ph.D.
Assistant Professor
University of Wisconsin – Madison
[email protected]
Speaker: Craig A. Albers, Ph.D.
Dr. Albers, Assistant Professor at the University of Wisconsin in Madison,
will provide the national perspective for Session 3 of this training series. He
has conducted research in the areas of universal screening and progress
monitoring, prevention and early intervention for students experiencing
academic and/or behavioral difficulties, language proficiency and academic
interventions for ELLs, and functional behavior assessment. He earned his
doctorate in educational psychology with a specialization in school psychology
at Arizona State University. Dr. Albers provides consultation and mentoring
services to numerous schools, districts and state educational agencies
regarding the implementation of RTI models. Dr. Albers has been identified as
an Early Career Scholar by the Society for the Study of School Psychology,
and recently guest-edited a Journal of School Psychology special issue relating
to universal screening and progress monitoring within RTI models and the
connection to improved educational and mental health outcomes.
2
Session 3 Goals
1. Identify key components for universal screening and
progress monitoring tools.
2. Examine academic and social-emotional functioning
instruments and procedures that can be utilized as
universal screening measures.
3. Develop specific criteria for decision-making to
determine which students are in need of additional
options.
4. Utilize frequent progress monitoring to determine
effectiveness of interventions and fidelity of
implementation.
3
RTI Conceptual Models
1. RTI models for special education eligibility
2. RTI models for school-wide reform and improved
outcomes for all students
4
Response-to-Intervention
 RTI is the practice of
 providing evidence-based (scientifically supported) instruction/intervention matched to student needs, and
 using rate and level of performance, as assessed over time, to make important psychoeducational decisions
5
Essential Components in RTI Models
 Universal screening
 Progress monitoring
 Selection of universal (core), selected
(supplemental) and targeted (intensive) options
that are likely to be effective
 Fidelity of implementation
 Special education eligibility decisions (decision
rules)
Screening and progress monitoring are the
foundation of data-based decision-making in RTI models!
6
Connecticut
RTI Model
7
The R in RTI
 Selecting At-Risk Students (“screening”)
 Monitoring of At-Risk Students (“progress monitoring”)
 Monitoring of Implementation Fidelity
8
The R in RTI
What Does Screening and Progress Monitoring Look Like Within Multi-Tiered Models?

Stage 1: Universal prevention activities, plus Stage 1 screening.
Does screening indicate at-risk status, or do behavioral/academic difficulties arise?
- No: continue universal prevention activities.
- Yes: move to Stage 2.

Stage 2: Eligible for selected intervention(s); conduct more in-depth screening/assessment.
Does the screening/assessment indicate a need for more intense intervention, or do behavioral/academic difficulties persist/intensify?
- No: continue the selected intervention or return to universal.
- Yes: move to Stage 3.

Stage 3: Eligible for targeted intervention(s); comprehensive assessment(?).
9
The I in RTI
 The Focus is Primarily on Reading (but other domains are
included)
 Interventions are Multi-tiered
 Problem Solving Model Used
 Standard Treatment Protocol Used
 Intervention-as-Test (Fuchs & Fuchs, 2006)
 Intervention Integrity Important
10
What is Universal Screening?
How do we do it?
11
The R in RTI
Educational Decisions Linked to RTI
 Screening for Resource Allocation
 Screening for Further Evaluation
 Assessment for Response-to-Intervention
 Assessment for Instructional Decisions
 Assessment for Program Eligibility (including special education)
12
The R In RTI
What is Universal Screening?
 Conducted with everyone within a population
(e.g., classroom, grade level, school, district, state,
certain age, etc.);
 Conducted to identify those at-risk of failure,
emotional/behavioral difficulties, health issues,
etc.;
 Goal is to identify difficulties (1) before overt
problems/symptoms are manifested, or (2) before
the difficulties become significant and lead to
impairment.
13
The R In RTI
What is Universal Screening?
 Emphasis should be on early identification to
prevent difficulties from further development or
escalation
 Two ways of looking at EARLY identification:
 Early as in an “early” age (e.g., pre-kindergarten, etc.)
 Early as in the “early” development stage of a difficulty
 This should be the focus in a response-to-intervention model
 Should be ONGOING
 This is the connection to progress monitoring
 Too often, only snapshots are taken of an individual, and these
may not accurately portray significant issues
14
The R In RTI
What is Universal Screening?
 A good screening program will also include appropriate
intervention options at each level
 Intent is to differentiate among:
 Typically-developing children/adolescents
 Those with elevated risk status
 Those with life-course persistent problems
15
The R in RTI
Legal Mandates and Recommendations for Universal
Screening
 NCLB
 Focus on reading
 IDEA 1997/2004
 15% of funds can be used for early intervention (IDEA, 2004)
 President's Commission on Special Education
 Recommends locally-driven universal screening of young children
 NRC
 Emphasizes early, universal screening for academic and behavioral issues
 Best practices? Ethical responsibilities?
16
A Comprehensive Data-Based Framework
[Diagram: the framework spans academic domains (math, reading, spelling, writing, science, social studies) and social-emotional/behavioral domains (anxiety, depression, antisocial behaviors, ADHD, substance abuse, social skills).]
17
The R in RTI
Screening-Related Variables (Glover & Albers, 2007)
 Content: academic; social/emotional; behavioral; mental health
 Intent: prediction; identification
 Frequency: single; multiple; ongoing
 Stages: single-gate; multi-gate
 Factors: risk; protective
 Informant: teacher/educator; parent/caregiver; self-report / direct skills
 Format: rating scales; observations; checklists; records review; interviews; direct skills; nominations
 Models: focus on person variables; focus on environmental variables; focus on the interaction between person and environment variables
 Timing of outcome measures
 Definition of at-risk
18
The R in RTI
Universal Screening Content Areas
 Academic
 Behavioral
 Emotional
 Mental Health
 Social
 Health
19
The R in RTI
Universal Screening Intent
 Prediction
 Will difficulties arise in the future? How likely are future
difficulties?
 Identification
 Is there evidence suggesting that difficulties currently
exist?
20
The R in RTI
Who is Universal Screening
Designed to Identify?
Protective Factors    Risk Factors    Risk Status
High                  Low             Minimal at-risk
High                  Medium          Some at-risk
High                  High            Some at-risk / at-risk
Medium                Medium          Some at-risk
Low                   Low             Minimal at-risk
Low                   Medium          Some at-risk
Low                   High            At-risk
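A minimal sketch of how this matrix could drive screening decisions, expressed as a lookup table in Python (the encoding and the fallback label are illustrative assumptions; combinations not shown on the slide are left unspecified):

    # Encode the slide's matrix: (protective factors, risk factors) -> risk status.
    RISK_STATUS = {
        ("high", "low"): "minimal at-risk",
        ("high", "medium"): "some at-risk",
        ("high", "high"): "some at-risk / at-risk",
        ("medium", "medium"): "some at-risk",
        ("low", "low"): "minimal at-risk",
        ("low", "medium"): "some at-risk",
        ("low", "high"): "at-risk",
    }

    def risk_status(protective: str, risk: str) -> str:
        """Look up risk status from protective-factor and risk-factor levels."""
        # Combinations the slide does not show fall back to "not specified".
        return RISK_STATUS.get((protective.lower(), risk.lower()), "not specified")

    print(risk_status("High", "High"))  # -> some at-risk / at-risk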
21
The R in RTI
Universal Screening Frequency
 Single time
 Multiple times
 Ongoing
22
The R in RTI
Universal Screening Stages / Approaches
 Single-gate
 Multi-gate
23
The R in RTI
What are Multi-Gate Approaches?
24
The R in RTI
Universal Screening Multi-Gate Approaches
Gate 1
 Expected to over-identify difficulties (false
positives)
 Goal is to eliminate those who clearly are not having
difficulties
 Do not allow for definitive statements; at best may
be a preliminary indication that something could
be wrong
 “Speculative screening”
25
The R in RTI
Universal Screening Multi-Gate Approaches
Gate 2
 Goal is to continue to remove students who clearly do not
have significant difficulties from future screening activities
Gate 3
 Goal is to clearly identify those with significant risk
factors/lack of protective factors who are in need of
intervention options
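To make the gate-by-gate funnel concrete, here is a minimal Python sketch of a multi-gate screen; the scores, cutoffs, and student records are all hypothetical, and a real implementation would use validated measures at each gate:

    # Hypothetical records: each gate uses a different (invented) measure.
    students = {
        "A": {"gate1_score": 12, "gate2_score": 30, "risk_factors": 4},
        "B": {"gate1_score": 45, "gate2_score": 55, "risk_factors": 1},
        "C": {"gate1_score": 15, "gate2_score": 22, "risk_factors": 5},
    }

    # Gate 1: deliberately liberal cutoff -- over-identification (false
    # positives) is expected; the goal is only to rule out students who
    # clearly are not having difficulties.
    gate1 = [s for s, d in students.items() if d["gate1_score"] < 25]

    # Gate 2: more in-depth measure on the smaller pool; continue removing
    # students who clearly do not have significant difficulties.
    gate2 = [s for s in gate1 if students[s]["gate2_score"] < 35]

    # Gate 3: identify those with significant risk factors who need
    # intervention options.
    identified = [s for s in gate2 if students[s]["risk_factors"] >= 3]

    print(identified)  # -> ['A', 'C']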
26
The R in RTI
Universal Screening Informant(s)
 Parent / caregiver
 Teacher / educator
 Self-report
 Direct skills
27
The R in RTI
Universal Screening Formats
 Direct skills / measures
 Rating scales
 Checklists
 Interviews
 Nominations
 Observations
 Record reviews
28
The R in RTI
Universal Screening Models
 Focus on person variables
 Focus on environmental variables
 Focus on interaction between person variables and
environmental variables
29
The R in RTI
Universal Screening Outcome Measures Timing
 Concurrent
 Predictive (at some point in the future)
30
The R in RTI
Definition of At-Risk
 Who or what determines the "degree of at-risk"?
 Related to:
1. Compatibility with service delivery availability (i.e., resource allocation)
2. Technical adequacy of instruments
3. Consequences of being identified
31
The R in RTI
Considerations and Sample Questions for Evaluating
Universal Screening Assessments (Glover & Albers, 2007)
Consideration: Appropriateness for the intended use
 Compatibility with the service delivery needs: Are the timing and frequency of administration appropriate? Are the identification outcomes relevant?
 Alignment with constructs of interest: Are the measured constructs relevant for determining an individual's risk status?
 Theoretical and empirical support: Have the format and content been validated in previous research?
 Population fit: Is the assessment contextually and developmentally appropriate?

Consideration: Technical adequacy
 Adequacy of norms: Is the normative sample representative, recent, and sufficiently large?
 Internal consistency reliability: Are items measuring the same construct? Are alternate forms comparable?
32
The R in RTI
Considerations and Sample Questions for Evaluating
Universal Screening Assessments (Glover & Albers, 2007)
Consideration: Technical adequacy (continued)
 Test-retest reliability: Is measurement consistent over time?
 Interscorer reliability: Is scoring consistent across scorers?
 Predictive validity:
 Sensitivity: Of those actually at risk, what proportion is correctly identified?
 Specificity: Of those actually not at risk, what proportion is correctly identified?
 Positive predictive value: Of those identified as at risk, what proportion is correctly identified?
 Negative predictive value: Of those identified as not at risk, what proportion is correctly identified?
 Hit rate: What proportion of the total sample was correctly identified?
 Concurrent validity: Is the assessment outcome consistent with a criterion measure?
 Construct validity: Does the assessment measure the construct for which it is designed?
33
The R in RTI
Considerations and Sample Questions for Evaluating
Universal Screening Assessments (Glover & Albers, 2007)
Consideration: Technical adequacy (continued)
 Content validity: Are the assessment format and items appropriate?

Consideration: Usability
 Balance of costs and benefits: Are the costs associated with the assessment reasonable?
 Feasibility of administration: Are personnel able to administer the assessment?
 Acceptability: Do stakeholders appreciate the benefits associated with the assessment?
 Infrastructure requirements: Are resources available to collect, manage, and interpret assessment data?
 Accommodation needs: Are accommodations available for those who need them?
 Utility of outcomes: Can stakeholders understand the implications associated with assessment outcomes? Are the outcomes useful for guiding instruction/intervention? Does universal screening improve student outcomes?
34
The R in RTI
Screening Assessment Outcomes – Predictive Validity
                                   Actually is a poor outcome    Actually is an adequate outcome
Screening indicates at-risk        (a) Valid positive (VP)       (c) False positive (FP)
Screening indicates not at-risk    (b) False negative (FN)       (d) Valid negative (VN)

Possible outcomes:
1. A student may be screened and identified as having an early learning problem and actually be experiencing the early stages of a learning problem (a).
2. A student may be screened and identified as having an early learning problem but not actually be experiencing the early stages of a learning problem (c).
3. A student may be screened and not identified as having an early learning problem but actually be experiencing the early stages of a learning problem (b).
4. A student may be screened and not identified as having an early learning problem and not be experiencing the early stages of a learning problem (d).

Sensitivity - of those actually at risk, what proportion is correctly identified? a / (a + b)
Specificity - of those actually not at risk, what proportion is correctly identified? d / (c + d)
Positive predictive value - of those identified as at risk, what proportion is correctly identified? a / (a + c)
Negative predictive value - of those identified as not at risk, what proportion is correctly identified? d / (b + d)
Hit rate - what proportion of the total sample was correctly identified? (a + d) / (a + b + c + d)
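These indices are straightforward to compute from the four cell counts. A minimal Python sketch (the example counts are hypothetical):

    def screening_metrics(a: int, b: int, c: int, d: int) -> dict:
        # a: valid positives, b: false negatives,
        # c: false positives, d: valid negatives
        return {
            "sensitivity": a / (a + b),
            "specificity": d / (c + d),
            "positive predictive value": a / (a + c),
            "negative predictive value": d / (b + d),
            "hit rate": (a + d) / (a + b + c + d),
        }

    # Hypothetical screening of 200 students:
    print(screening_metrics(a=40, b=10, c=20, d=130))
    # sensitivity 0.80, specificity ~0.87, PPV ~0.67, NPV ~0.93, hit rate 0.85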
35
The R in RTI
Universal Screening Approaches:
Academic Examples
Reading
 DIBELS
 CBM
 BACESS
 Other standardized
achievement measures
Mathematics
 CBM
 BACESS
 Other standardized
achievement measures
 Or, create approach
specific to your state –
see next slides for
example
36
Creation of Math Progress Monitoring Process for WI
[Slides 37-39: graphics illustrating the creation of a math progress monitoring process for Wisconsin; no extractable text.]
39
The R in RTI
Universal Screening Approaches: Social-Emotional / Behavioral Examples (Walker, Hope-Doolittle, Kratochwill, Severson, & Gresham, 2007)
 Systematic Screening for Behavior Disorders (SSBD; Walker & Severson, 1990)
 School Social Behavior Scale (SSBS; Merrell, 1993)
 The Revised Behavior Problem Checklist (Quay & Peterson, 1987)
 Drummond's Student Risk Screening Scale (SRSS; Drummond, 1993)
 Conners' Rating Scales-Revised (CRS-R; Conners, 1990)
 Eyberg Child Behavior Inventory (ECBI; Eyberg & Ross, 1978) and Sutter-Eyberg Student Behavior Inventory (SESBI; Sutter & Eyberg, 1999)
 SSIS Multi-Tiered Assessment and Intervention Model (Elliott & Gresham, 2007)
40
The R in RTI
Universal Screening Approaches: Examples
of Screening Instruments Relating to Mental
Health (Levitt, Hunter, & Hoagwood, 2007)
 Ages & Stages Questionnaire
(2002)
 Diagnostic Predictive Scales
(2001)
 Strengths & Difficulties
Questionnaire (1997)
 Conners' Rating Scales (1997)
 Children’s Depression
Inventory (1992)
 Multidimensional Anxiety
Scale for Children (1997)
 Beck Depression Inventory-II
(1996)
 Center for Epidemiologic
Studies Depression Scale
(1977)
 Columbia Depression Scale
(2005)
 Personal Experience Screening
Questionnaire (1991)
41
What is Progress Monitoring?
42
What is Progress Monitoring?
A systematic process by which student performance data are frequently and repeatedly collected and analyzed:
 Used to assess student performance and evaluate the effectiveness of instruction
 Used to measure rate of improvement in relation to identified benchmarks
 Frequent comparison of current to desired performance over a specified period of time
43
What is Progress Monitoring?
Continued.
 Typically used to analyze a student's response to a
particular option (a critical component of RTI models)
 Sensitive to small changes in student performance
 You do not have to wait for 6 months to know if
something is making a difference
 Relatively quick and simple to carry out
 Implemented with individual students or an entire
class
44
Progress Monitoring is not:
 In-depth assessment of a content domain
 An analysis of degree of curriculum alignment to
standards
 An analysis of student achievement in terms of
lesson or unit content coverage
 A diagnostic assessment of student strengths and
weaknesses
45
Progress Monitoring: Necessary Distinctions Between
Assessment for Intervention and Assessment of Intervention
Assessment for Intervention
 Identifying,
 Verifying,
 and Aiding….
…in the selection of an
appropriate intervention.
Assessment of Intervention
 Determine whether intervention
is having sufficient impact so
that the student can reach
defined goals
 Is the intervention being
implemented as intended?
Note – see: Albers, C. A., Elliott, S. N., Kettler, R. J., & Roach, A. T. (2005). Evaluating intervention outcomes. In R. Brown-Chidsey (Ed.), Problem-solving based assessment for educational intervention (pp. 329-351). New York: Guilford Publications, for more description of assessment for and assessment of intervention in a problem-solving model.
46
RTI Models – Progress Monitoring
Intervention           Intensity        Progress Monitoring Frequency
Universal options      Low              Quarterly, etc.
Selected options       (increasing)     (increasing)
Targeted options       High             More frequently
47
A Progress Monitoring System is
Designed to Identify...
 Rate of all students’ performance
 Effectiveness of instructional options
 Students who may need additional supports
48
On-Going Progress Monitoring Facilitates…
 Efficient selection and use of effective educational
options
 Appropriate resource allocation
 Focus on explicit outcome indicators
 Objective data-based decisions about options
 Clear expectations for students
 Continuous feedback
49
What are the Minimum Requirements for
Progress Monitoring?
 Technical adequacy (i.e., reliability & validity)
 Feasibility of administration (quick/brief)
 Sensitive to student growth
 Sufficient number of alternate forms
 Useful for instructional planning
From: www.studentprogress.org
50
What are the Minimum Requirements
for Progress Monitoring?
 Continued.
 Demonstrated to result in increased student growth
 Scientifically-based! FYI, NCLB references "scientifically-based"
122 times, so this should be the basis for all of our practices whenever feasible.
Source: www.studentprogress.org
51
What are the Minimum Requirements for
Progress Monitoring?
See the National Center on Student Progress Monitoring for
more information – www.studentprogress.org
52
How do We Evaluate Progress Monitoring Tools &
Procedures?
Note: Refer to the supporting
document entitled, NCPM
Standard Protocol for
Evaluating Progress
Monitoring Tools.
53
What Effect Does Progress Monitoring
have on Student Performance?
[Bar chart: effect sizes of progress monitoring on student performance by domain (Reading, Math, Spelling); the chart reports effect sizes of 1.2, 0.75, and 0.60 across the three domains (y-axis: effect size, 0 to 1.4).]
54
How Do We Do Progress Monitoring?
55
[Diagram repeated from slide 17: the comprehensive data-based framework spans academic domains (math, reading, spelling, writing, science, social studies) and social-emotional/behavioral domains (anxiety, depression, antisocial behaviors, ADHD, substance abuse, social skills).]
56
Are all Progress Monitoring Activities Equal?
 Traditional Assessment
 e.g., State/standardized test data, district data, etc.
 Mastery Measurement
 Performance Monitoring
 Curriculum-Based Assessment
 Curriculum-Based Measurement
 Numerous other procedures
57
Are all Progress Monitoring Activities Equal?
The Short Answer is NO!
Inaccurate data or
data without meaning
are worse than
NO DATA!
58
Are all Progress Monitoring Activities Equal? The
Short Answer is NO! Traditional Assessment
 Lack of sensitivity
 Are the scores able to indicate even small growth in skills?
 Too long a lapse between administrations / not administered on a regular basis (e.g., only annually)
 Too long a lapse between administration and receipt of scores (i.e., lack of immediate feedback)
 Too much time – results in a decrease in instructional opportunities
 Lack of instructional utility – scores do not necessarily relate to classroom achievement or desired social-emotional skills
59
Are all Progress Monitoring Activities Equal? The
Short Answer is NO! Mastery Measurement
 Mastery measurement tracks progress toward short-term instructional objectives.
 This is accomplished by:
 Determining the sequence of skills in a hierarchy
 For each skill, creating a criterion-referenced test
 What does this look like?
60
Example of Mastery Measurement - Reading
                     Words correct/min                Errors/min                       Median scores for level
Grade level/book     (Beginning / Middle / End)       (Beginning / Middle / End)       WC    Errors    % Correct    Learning level (M, I, F)
Pre-primer 1         46 / 44 / 47                     1 / 3 / 2                        46    2         100          I
Pre-primer 2         45 / 40 / 34                     1 / 3 / 3                        40    3         100          I
Pre-primer 3         19 / 23 / 34                     4 / 5 / 4                        23    4         80           F
1-1                  8 / 13 / 8                       6 / 2 / 4                        8     4         60           F
2-1                  9 / 11 / 9                       5 / 4 / 8                        9     8         20           F
61
Example of Mastery Measurement - Math
Probe #   Probe type             Digits correct/min (trials; median)   Digits incorrect/min   % problems correct   Learning level (M, I, F)
3         ADD: 1D & 1D, 11-19    1, 3; median 2                        0                      100                  F
2         SUB: 1D & 1D, to 10    2, 7.5; median 5.5                    0                      100                  F
1         ADD: 1D & 1D, to 10    7.5, 12; median 7.5                   0                      100                  I
62
Potential Difficulties with Mastery Measurement
 Hierarchy of skills
 Single-skill assessments
 Maintenance, extrapolation, and/or
generalization
 Psychometric properties
 Questionable predictive validity
63
Progress Monitoring System Components
 Standards
 Expectations and benchmarks
 Indicators (what data to collect)
 High quality instructional options (universal, selected, and
targeted)
 Screening & on-going progress monitoring tools
 Data compilation and analysis
 Fidelity check
 Decision rules linked to instructional options
64
Progress Monitoring Questions & Components
KEY QUESTIONS and PROGRESS MONITORING SYSTEM COMPONENTS

What do we expect all students to know and do?
 State standards
 Benchmarks
 Indicators

How do we know if they are meeting expectations?
 Instructional Options
 Screening & Ongoing Progress Monitoring

What do we do if they are not meeting expectations?
 Data Compilation & Analysis
 Fidelity Check
 Decision Rules
65
Progress Monitoring & Standards:
What Do We Expect Children to Do?
 Standards indicate what students should know or do and how well they must perform.
 Content standards - what students should know and be able to do
 Performance standards - how a student will show that they are meeting the standard
 Proficiency standards - how well students must perform
66
Progress Monitoring & Benchmarks:
What Do We Expect Children to Do?
 Benchmarks provide the foundation for standards
 Benchmarks serve as measurable guides for
teachers and parents on knowledge and
performance required to achieve the standards
67
Progress Monitoring & Indicators:
What Do We Expect Children to Do?
 Critical skills that are measured to determine if the
student is progressing toward the benchmarks and
standards
 And how well the student performs (accuracy,
amount, speed, etc.)
 Example: fluency and comprehension are
critical indicators for measuring reading
proficiency
68
Progress Monitoring & Instructional Options:
How Do We Know?
 High Quality Instruction = Instruction and
intervention that results in desired outcomes
 High Quality Options are Evidence-Based
 Evidence of positive student outcomes
 Aligned to standards and goals
 Implementation with integrity and fidelity
 Outcomes are sufficiently defined
 Various measures of success are defined
(Rathvon, 2003).
69
Progress Monitoring & Tools:
How Do We Know?
 Simple, efficient and effective
 Can be linked to instructional options
 Measures performance on critical indicators
 Can be used frequently
 Quick and easy to administer and “score”
 Produces data that can be easily translated and
analyzed (e.g., percentages, counts, Lexile scores)
 Sensitive to changes in performance
70
Progress Monitoring & Data Analysis:
What Do We Do?
 Charts and graphs easily produced (see
www.interventioncentral.org for examples)
 Comparison to specific benchmarks can easily be
made
 Can estimate rates of progress
 Results are electronically archived and easily
available for future analysis
71
Progress Monitoring & Fidelity Checks:
What do we do?
 Make sure high quality options are implemented at
the universal level
 Information about the implementation of
instruction/intervention exists
 Accuracy of assessment tools, data, analysis
 Assessment procedures defined and adhered to
 Procedures ensure accurate and reliable data
72
Progress Monitoring & Decision Rules:
What do we do?
 Decision rules link progress monitoring data to instructional decision-making
 Based on grade-level standards and benchmarks
 Applied consistently
 Identify students meeting or exceeding benchmarks
 Identify students at risk of not meeting benchmarks at their current rate of progress
 Easy for teachers and parents to understand
73
Progress Monitoring Purpose / Considerations
Universal options
 Students involved: all students
 Type of instructional options: core academic, social, emotional, & behavioral
 Focus of instructional options: support learning & prevent failure
 Progress monitoring purpose: universal screening; proactive curriculum adjustment, including differentiation
 Progress monitoring schedule: screening 1-2 times/year; progress monitoring 2-4 times/year

Selected options
 Students involved: small groups
 Type of instructional options: supplemental
 Focus of instructional options: rapid response – early intervention
 Progress monitoring purpose: prevention; RTI; intervention selection
 Progress monitoring schedule: universal schedule + 1-2 times/month

Targeted options
 Students involved: individual
 Type of instructional options: individualized, high intensity/frequency; longer duration
 Progress monitoring purpose: intervention; RTI; intervention selection; need for further assessment?
 Progress monitoring schedule: universal schedule + individualized/weekly
74
The Progress Monitoring Process
Step 1: Determine student baseline level (academic, social or behavioral)
related to curriculum benchmarks.
Step 2: Monitor progress on a regular basis (daily, weekly, quarterly or
monthly).
Step 3: Measure progress to compare expected and actual rates of learning
or behavior change.
Step 4: Based on these measurements, instruction or supports are adjusted or
continued as needed.
75
Step 1: Determine Student Baseline Level
 Graphic Display
 What is being measured? (e.g., word fluency, problems
correct, assignments completed, etc.)
 Length of time to be measured
 Baseline Data
 Collect data until baseline is stable (minimum of 3 data
points)
 Once baseline is obtained, begin intervention
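A minimal sketch of this baseline step in Python (the three-point minimum comes from the slide; the stability check shown here is an illustrative stand-in for whatever rule a team adopts):

    from statistics import median

    def baseline(scores, min_points=3, max_spread=5.0):
        """Return the median baseline once enough stable data points exist,
        or None to signal that more data should be collected."""
        if len(scores) < min_points:
            return None  # fewer than the minimum of 3 data points
        if max(scores) - min(scores) > max_spread:
            return None  # too variable -- keep collecting
        return median(scores)

    print(baseline([22, 25, 24]))  # -> 24; intervention can begin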
76
Identify intervention and set up progress
monitoring chart
Name ______________ District _________ School ______ Yr. ______ Gr. ________
Goal Statement __________________________________________________________
Expected Level of Performance _____ Interim #1____ #2 _____ #3 _____ #4 ________
Summary of Intervention ____________________________________________________
[Blank progress monitoring chart: y-axis scaled 10-100, x-axis labeled by month, with the baseline phase marked at the left.]
77
Step 2: Monitor Student Progress on a Consistent
Basis
 Frequency depends on intensity of instructional options,
degree of difficulty, and type of progress monitoring
instrument/process used
78
Step 3: Measure Progress to Compare Expected and
Actual Rates of Learning or Behavior Change
 Identify needed rate of growth to reach goal
 Monitor to determine if rate of growth is adequate
 Determine decision rules
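The needed rate of growth is simply the gap between baseline and goal spread over the time available. A minimal Python sketch (the numbers are hypothetical):

    def required_weekly_growth(baseline, goal, weeks):
        """Slope of the goal line: points the score must gain per week."""
        return (goal - baseline) / weeks

    def goal_line(baseline, goal, weeks):
        """Expected score at each weekly check, from baseline to goal."""
        slope = required_weekly_growth(baseline, goal, weeks)
        return [baseline + slope * week for week in range(weeks + 1)]

    # A student at 40 words correct/min aiming for 70 in 15 weeks
    # must gain 2 words correct/min per week.
    print(required_weekly_growth(40, 70, 15))  # -> 2.0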
79
Examples of Decision Rules
 For example:
 If 3 consecutive data points are below the goal line, make
an instructional change in the student’s program
 If 6 consecutive data points are above the goal line, adjust
instruction (make efficient use of resources!)
 If data points are variable (i.e., above and below goal line),
determine slope of growth; if heading towards goal,
continue with the student’s program and progress
monitoring.
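A minimal Python sketch of these example rules (the 3-below and 6-above thresholds come from the slide; the least-squares slope check for variable data is one reasonable way to implement "determine slope of growth"):

    def decision(scores, goal_line):
        """Compare observed scores to the goal line and apply the rules."""
        diffs = [s - g for s, g in zip(scores, goal_line)]
        if len(diffs) >= 3 and all(d < 0 for d in diffs[-3:]):
            return "make an instructional change"
        if len(diffs) >= 6 and all(d > 0 for d in diffs[-6:]):
            return "adjust instruction (raise the goal)"
        # Variable data: estimate the slope of growth by least squares.
        n = len(scores)
        x_mean, y_mean = (n - 1) / 2, sum(scores) / n
        slope = (sum((i - x_mean) * (s - y_mean) for i, s in enumerate(scores))
                 / sum((i - x_mean) ** 2 for i in range(n)))
        target = (goal_line[-1] - goal_line[0]) / (len(goal_line) - 1)
        return "continue program" if slope >= target else "review intervention"

    scores = [40, 43, 41, 46, 48, 47]   # hypothetical weekly scores
    goal = [40, 42, 44, 46, 48, 50]     # goal line over the same weeks
    print(decision(scores, goal))       # -> review intervention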
80
Step 4: Adjust / Continue Intervention Options
as Appropriate
 Use decision rules to answer:
 How will you know if intervention is effective?
What will you do if it is not? What will you do if
it is?
OPTIONS
 Continue
 Increase frequency / intensity
 Change intervention option(s)
81
The R in RTI
Example of Using Progress Monitoring Data to
Evaluate Intervention Effectiveness (i.e., progress
monitoring approach)
[Line graph: PRF (words read correctly per minute), 0-200, plotted across 24 weeks of instruction, showing the student's goal-line and the points where instructional changes were made.]
Selecting a Progress Monitoring System:
Academic Examples
Reading
 DIBELS
 CBM
 Accelerated Reader
 AIMSweb
 EdCheckup
 Monitoring Basic Skills
Progress
 PA Series
 STAR
 Test of Word Reading
Efficiency
 Test of Silent Word Reading
Fluency
 Yearly Progress Pro
 Running Records
Mathematics
 CBM
 AIMSweb
 Monitoring Basic Skills
Progress
 PA Series
 STAR
 Yearly Progress Pro
 End of chapter tests
83
Selecting a Progress Monitoring System:
Academic Examples
 See
www.studentprogress.org
for reviews of these
instruments
84
Selecting a Progress Monitoring System
Johnson, E., Mellard, D.F., Fuchs, D., & McKnight, M.A. (2006). Responsiveness to intervention (RTI): How to
do it. Lawrence, KS: National Research Center on Learning Disabilities.
85
Curriculum-Based Measurement
 Strong research support – based on 30 years of
research
 Strong psychometric (e.g., reliability, validity,
and instructional utility) properties and highly
correlated with high-stakes tests
 Used across the country
 Simple to use, sensitive to progress, and
inexpensive
86
What do CBM Probes Look Like?
 Typically have multiple skills/tasks relating to
end-of-year goals
 Avoids need to specify a skills hierarchy
 Avoids single-skill tests
 Automatically assesses maintenance /
generalization
 Permits standardized procedures for sampling the
curriculum, with known reliability and validity
 SO THAT: CBM scores relate well to
performance on high-stakes tests
[Sample CBM probe: one page of a 3-page CBM in math concepts and applications (24 total problems), with random numerals within problems and random placement of problem types on the page. Items on the sample page include: completing "1 week = ___ days"; matching marked figures to the labels line segment, line, point, and ray; identifying the digit in the hundredths place of 356.17; solving a word problem by estimating the sum to the nearest ten; writing numbers from expanded form (e.g., 3 ten thousands, 6 hundreds, 8 ones); answering questions about a bar graph of Summit School students' vacation plans; and choosing the appropriate unit (meters, centimeters, kilometers) for measuring a bus ride from school to home.]
How do we incorporate social /
emotional / behavioral data into
a comprehensive data-based
framework?
Defining Academic At-Risk
1. Strong academic skills, strong academic enablers
2. Strong academic skills, weak academic enablers
3. Weak academic skills, strong academic enablers
4. Weak academic skills, weak academic enablers
91
Examples of Social Skills
 COMMUNICATION: takes turns in conversations; makes eye contact when talking
 EMPATHY: forgives others; feels bad when others are sad
 COOPERATION: follows your directions; follows classroom rules
 ENGAGEMENT: makes friends easily; invites others to join in activities
 ASSERTION: asks for help from adults; questions rules that may be unfair
 RESPONSIBILITY: respects the property of others; takes responsibility for own actions
 SELF-CONTROL: makes a compromise during a conflict; stays calm when teased
92
Examples of Competing Problem Behaviors
 EXTERNALIZING: fights with others; talks back to adults
 BULLYING: bullies others; keeps others out of social circles
 HYPERACTIVITY/INATTENTION: fidgets or moves around too much; gets distracted easily
 INTERNALIZING: acts sad or depressed; acts anxious with others
 AUTISM SPECTRUM: has nonfunctional routines or rituals; becomes upset when routine is changed
93
SSIS = SSRS + SSIG + New Products
The Social Skills Improvement System (SSIS) is the family
name for products which include the revisions of the Social
Skills Rating System (SSRS) and the Social Skills Intervention
Guide (SSIG), along with new products:
 SSIS Rating Scales (revision of SSRS)
 SSIS Intervention Guide (revision of SSIG)
 SSIS Performance Screening Guides (New!)
 SSIS Class-wide Intervention Program (New!)
94
Are social skills deficits indicative of disabilities or
adjustment problems?
 Research indicates that social skills deficits in early
childhood, if untreated, are relatively stable over time,
related to poor academic performance, and may be predictive
of social adjustment problems in adolescence and adulthood.
 A number of investigators have documented that students
with disabilities exhibit significant deficits in social skills.
95
Social Skills:
The Foundation for Academic Success?
 Caprara, Barbaranelli, Pastorelli, Bandura, & Zimbardo
(2000) found that prosocial skills (cooperating, helping,
sharing, and consoling) in 3rd grade were a better
predictor of 8th grade academic achievement than 3rd
grade academic achievement.
 Malecki & Elliott (2002) reported similar findings for
social skills and problem behaviors in an elementary
sample, with social skills significantly predicting end-of-year achievement test performance on a high-stakes test.
96
Basic Assumptions About Classroom Learning:
A Very Social Event for Most Students & Teachers
1. Academic performance & classroom behavior are highly interrelated.
2. Children can teach each other important skills.
3. Learning is improved when opportunities to respond are increased.
4. Learning is improved when time-on-task is increased.
5. Learning is improved when feedback about effort & products is provided in a timely manner.
6. Learning is improved when reinforcement is provided.
97
The Relationships Among Social Skills, Problem
Behaviors, & Academic Functioning
Models for Thinking About Social Skills
[Diagram: the interrelationships among social skills, problem behaviors, and academic functioning.]
98
The Top 10 School Social Skills
 Listens to Others
 Follows Directions
 Follows Classroom Rules
 Ignores Peer Distractions
 Asks for Help
 Takes Turns in Conversations
 Cooperates With Others
 Controls Temper in Conflict Situations
 Acts Responsibly With Others
 Shows Kindness to Others
99
Comprehensive Model for Improving Student Social
Behavior with the SSIS
100
Convenient Step-down Booklet Design
 Skill definition appears next to its performance descriptor
 Each level described is color-coded for quick and easy reference
 A column for circling evaluations sits next to the performance descriptor, with matching color codes
 Student names are visible, with alternating grey bars to guide the evaluation process
101
Comprehensive Social Skills Assessment
 The SSIS Rating Scales (revision of
SSRS) are multi-rater scales (teacher,
parent, & student).
 Measures social skills, problem
behaviors, & academic competence.
 English and Spanish versions.
 National norms (ages 3-18).
 Widely used and technically sound
social skills assessment.
 Each rating takes 15-20 minutes.
 Results can be seamlessly integrated
with the SSIS Intervention Guide.
102
SSIS Class-Wide Intervention Program
 Designed for use by general education
teachers in mainstream classrooms.
 Provides teachers with an easy-to-use,
effective, and efficient way to teach 10
of the most important social skills.
 Blends instructional best practices and
proven intervention methods to teach
social skills.
 Three developmental levels:
 Preschool/Kindergarten
 Early Elementary
 Upper Elementary/Middle
103
Materials for Teaching Social Skills
104
Video Clips Provide Positive & Negative
Models of Social Behavior
26 video clips to
facilitate discussion
& demonstrate skills
105
Students Monitor Their Own Progress
Students evaluate
their progress
during each lesson
106
Social Skills Practice in Varied Settings
107
Resources to Monitor Student Progress
108
Resources to Monitor the Integrity of
Intervention Implementation
 Intervention Integrity Forms
help teachers adhere to the
instructional plan and provide a
record of the overall quality of
the intervention.
 There are 3 different formats
available on the SSIS Resource
Disc.
109
How do We Put a Progress Monitoring
System in Place?
110
Determine What is Currently in Place and
What Needs to be Added
PROGRESS MONITORING TOOLS
Grade Level: ___
Content areas: Reading | Writing | Mathematics | Behavior | Emotional
Levels:
Universal: Options provided
to all students through a core
curriculum, differentiated
instruction, ongoing screening
and progress monitoring, and
schoolwide pupil services and
behavioral supports such as
violence prevention. Universal
options are aimed at
enhancing success and
reducing barriers to learning.
Selected: Options provided to
remove barriers and enhance
success for students who
demonstrate certain risk
factors that make it likely they
will have increased difficulty
if concerns are not addressed.
Some examples might include
supplemental reading
instruction or short-term
tutoring, support from adult
mentors, family support or
training.
Targeted: Options for
students who have a high
likelihood of developing or
who already exhibit a pattern
of academic failure or high
levels of social or emotional
distress. Because of the
intensive nature of such
options, targeted interventions
are typically needed by very
few students. Examples
include individualized
supplemental or replacement
instruction, individual
behavior plans and wraparound services provided by
mental health professionals.
111
Example of Determining What is Currently in Place
and What Needs to be Added
Progress monitoring is a scientifically-based practice used to assess student performance and evaluate the effectiveness of instruction and interventions. Standardized assessments are used to determine whether students have achieved state and district academic standards. A variety of screening tools are used to identify students who are or may be at risk for academic and behavioral problems. Curriculum-based measures are administered more frequently to evaluate the effectiveness of instruction and interventions. The chart below can be used to explore where the school district is in the process of implementing a system for progress monitoring.
Level or Tier
Universal
Tier I
Model of Best Practice
 Teachers and support staff are trained in
progress monitoring strategies
 Conduct universal screening of all students
one time per year
 Conduct benchmark assessments in reading,
writing, and math 3 times per year (fall, winter,
spring)
 Collect data (i.e. indicators) to
 estimate performance rates
 evaluate the effectiveness of universal
options
 identify students needing additional
support
 design interventions
 monitor response to intervention
 Use curriculum-based measurement
 Letter Sound Fluency (Kgn) using lower
case letters – 1 minute
 Word Identification Fluency (grade 1 up
to 40 wpm) – 1 minute
 Passage Reading Fluency with difficulty
expected for year-end competence (thru
grade 4) – 1 minute
 Maze fluency with difficulty expected for
year-end competence – 2 ½ minutes
(grades 4-6)
 Computation: (addition, subtraction,
multiplication, division, fractions, and
decimals) number of correct digits
produced within a fixed time period
based on grade (2-8 minutes)
Current Practice
 Formal training has not taken place; some staff
members have attended workshops on progress monitoring
 Conduct benchmark assessments
 in reading 3 times per year
 in writing 2 times per year (fall and spring)
 Conduct common math assessments 2 times per
year (winter and spring) to determine if student
competence is increasing
 Conduct screenings
 Of incoming Kindergarten students using the
xxx, parent information, and observations
 Of students suspected of Speech/Language
problems using parent and teacher reports,
verbal interaction, and observations
 Of students suspected of ADHD using record
review, parent and teacher reports, and
observations
 To identify students who may need additional
support
 Reading specialists conduct screenings of
individual students
 Curriculum-based measurement not used
consistently
 Data not consistently displayed in visual format
 Data not consistently used to inform families of
student progress
 Curriculum-based assessments used to determine
instructional reading levels
 Mastery measurement used for unit tests, but may
not be consistent across classrooms and schools
112
Example of Determining What is Currently in Place
and What Needs to be Added
Level or Tier
Selected
Tier II
Targeted
Tier III
Model of Best Practice
 Collect data 1-2 times per month to
 evaluate the effectiveness of selected
options
 identify students needing additional
support
 identify students who no longer require
selected options
 monitor response to intervention
 Intervention plans include baseline and followup data to show evidence of student
progress

 Collect data at least weekly to
 evaluate the effectiveness of targeted
options
 identify students who may need
additional assessment
 identify students who no longer require
targeted options
 monitor response to intervention








Current Practice
Systematic data not collected
Baseline often includes running records and
Macmillan assessment results
Exit criteria not established
Interventions do not consistently supplement
universal instruction
Systematic data not collected
Baseline often includes running records and
Macmillan assessment results
Exit criteria not established
Interventions do not consistently supplement
universal instruction
113
Identify Instruments / Procedures to
Meet Needs
Considerations
 Area (academic, social, emotional, and/or behavioral)
 Age / grade level
 Short-term versus long-term progress monitoring
 Discrete skills versus general (global) skills
 Available resources (e.g., money, time, support staff, etc.)
versus required resources
 Decision rules regarding resource allocation
114
Example of Using Progress Monitoring Data to Identify Students in Need
of Additional Mathematics Options
[Graph: two cases flagged by progress monitoring data. Option 1 – a low-performing student; Option 2 – a student making insufficient progress.]
115
Example of Using Progress Monitoring Data to Monitor
Intervention Effectiveness
[Six progress monitoring graphs: digits correct plotted by administration date, each marking where the intervention began and the goal line; one panel also marks where the intervention was modified.]
116
Example of Using Progress Monitoring Data to Develop
Classroom, School, and/or District Norms (Reading)
                    Median words read        Median errors    Percentage of comprehension
                    correctly per minute     per minute       questions correct
Mean                96.7                     3.1              80
Median              103                      2.5              80
Minimum             41                       1                40
Maximum             143                      6                100
25th percentile     66                       2                70
50th percentile     103                      2.5              80
75th percentile     117                      4.5              100
Table 1: The results of the reading passages administered, including the number of words read
correctly and errors per minute, along with the mean percentage of comprehension questions
answered correctly relating to the second probe.
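Local norms like these are easy to derive from raw scores. A minimal Python sketch using a simple nearest-rank percentile (the score list is hypothetical):

    from statistics import mean, median

    def local_norms(scores):
        """Summary statistics for classroom/school/district norms."""
        ranked = sorted(scores)
        def pct(p):
            # Nearest-rank percentile over the sorted scores.
            k = round(p / 100 * (len(ranked) - 1))
            return ranked[k]
        return {
            "mean": mean(scores), "median": median(scores),
            "min": min(scores), "max": max(scores),
            "25th": pct(25), "50th": pct(50), "75th": pct(75),
        }

    wcpm = [41, 66, 88, 97, 103, 109, 117, 126, 135, 143]  # hypothetical WCPM
    print(local_norms(wcpm))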
117
Example of Using Progress Monitoring Data to Develop
Classroom, School, and/or District Norms (Math)
[Box plot "1a": distribution of scores (y-axis 0-160) with the 25th, 50th, and 75th percentiles marked; N = 10.]
118
Example of Using Progress Monitoring Data to Develop
Classroom, School, and/or District Norms (Math)
[Histogram: number of students (y-axis 0-30) by digits correct per minute (x-axis 0.0-65.0 in 5-point increments).]
119
Progress Monitoring Resources
 www.studentprogress.org
 http://www.nrcld.org/rti_manual/pages/RTIManualSection
2.pdf
 www.edcheckup.com
 www.edprogress.com
 www.aimsweb.com
 www.interventioncentral.org
120