
Supporting Students with Additional Needs in an RTI System

Jon Potter, Ph.D.

Lisa Bates, Ph.D.

David Putnam, Ph.D.

Oregon RTI Project OSPA Conference, Fall 2012

Afternoon Targets

• Tier 2/3: Using data to place students in interventions (literacy) & evaluating intervention effectiveness
• Tier 3: Individual Problem Solving

What is your role in ensuring the right students receive the right support at the right time?

School Psychologists’ Role

“RTI calls for early identification of learning and behavioral needs, close collaboration among classroom teachers and special education personnel and parents, and a systemic commitment to locating and employing the necessary resources to ensure that students make progress in the general education curriculum.”

- NASP School Psych Role and RTI Fact Sheet

Assessment • Consultation • Program Evaluation

Using screening data to match interventions to student need (Literacy)

Which students receive interventions?

• Schoolwide/districtwide decision rules should determine which students will receive additional support
  – Based on schoolwide screening data (DIBELS, easyCBM, AIMSWEB, etc.)
  – Based on available resources and system capacity
• Lowest 20%? 30%?
• All students well below benchmark?

Assessment

Decision rules guide placement in interventions

[Example: screening data for 60 2nd grade students]
• easyCBM: Lowest 25%? Lowest 20%? All “High Risk”?
• DIBELS Next: Lowest 20%? Lowest 25%? All below and well below benchmark?
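A “lowest 20%” rule is easy to operationalize once screening scores are in hand. Below is a minimal sketch in Python; the student names, scores, and the 20% cutoff are hypothetical stand-ins, not data from these slides:

```python
# Hypothetical example: flag the lowest 20% of students on a fall
# screening measure (e.g., ORF words correct per minute) for intervention.
import math

fall_orf_scores = {  # student -> words correct per minute (made-up data)
    "Ava": 62, "Ben": 18, "Cal": 45, "Dee": 30, "Eli": 71,
    "Fay": 25, "Gus": 90, "Hal": 54, "Ivy": 38, "Jo": 12,
}

cutoff_fraction = 0.20  # "lowest 20%" rule; districts may use 25-30%
n_to_serve = math.ceil(len(fall_orf_scores) * cutoff_fraction)

# Sort ascending so the lowest scorers come first.
ranked = sorted(fall_orf_scores.items(), key=lambda kv: kv[1])
intervention_group = ranked[:n_to_serve]

for student, wcpm in intervention_group:
    print(f"{student}: {wcpm} wcpm -> place in intervention")
```

The same structure works for any cutoff a district’s resources can support: 25%, 30%, or an absolute “well below benchmark” criterion.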

Screening Data

Linking Assessment to Intervention

Instructional need → Intervention Program
(Phonemic Awareness | Phonics (Alphabetic Principle) | Oral Reading Fluency & Accuracy | Vocabulary | Reading Comprehension)

Some will need more.

Logistics

• When do these types of discussions typically take place?

– Initial intervention placement meetings after schoolwide screenings (3x per year)
– May also be discussed every 6-8 weeks when reviewing student progress

Consultation

Ensuring an Instructional Match

Question 1: What is the skill deficit?

Question 2: How big is that deficit?

Question 3: What interventions address that deficit?

Question 4: How do we implement the program?

Question 1: What is the skill deficit?

The Big 5 of Reading: Reading Comprehension, Oral Reading Fluency & Accuracy, Phonics (Alphabetic Principle), Phonemic Awareness

Assessment

Common Screening Data Sources

Skill | DIBELS Next | easyCBM* | AIMSWEB
Reading Comprehension | RTF, Daze, ORF CWPM | MC Reading Comp | Maze, Reading CBM
Oral Reading Fluency & Accuracy | ORF CWPM, ORF Acc % | PRF, WRF | Reading CBM
Phonics (Alphabetic Principle) | NWF CLS, NWF WWR, ORF Acc % | Letter Sounds, PRF Acc % | NWF, LSF, R-CBM Acc %
Phonemic Awareness | FSF, PSF | Phoneme Segmenting | Phoneme Segmentation

*easyCBM includes a Vocabulary measure

CBM measures are linked to the Big 5 of Reading

Assessment

[DIBELS Next Class List Report (2nd Grade – Fall), with columns for Phonemic Awareness, Vocabulary, Phonics (Alphabetic Principle), Oral Reading Fluency & Accuracy, and Reading Comprehension]

[easyCBM Class List Report (2nd Grade – Fall), with the same skill columns]

The Big 5 of Reading: Phonemic Awareness → Phonics (Alphabetic Principle) → Oral Reading Fluency & Accuracy → Reading Comprehension

How skills build on each other

• Activity: Oral Reading Fluency Assessment
  – Find a partner
  – Partner 1 (person with the next birthday) – Reader
  – Partner 2 – Test Administrator
• Administer the reading assessment, and have the reader answer the questions


Phonics and accuracy are important

Words missed per page when accuracy is…

Accuracy | The Secret Life of Bees (7th grade) | My Brother Sam is Dead (5th-6th grade) | The Magic School Bus (2nd-3rd grade)
95% | 18.5 | 15 | 6
98% | 7.4 | 6 | 2.4
99% | 3.6 | 3 | 1.2
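The table values come from simple arithmetic: words missed per page = words on the page × (1 − accuracy). A quick check, assuming a hypothetical 300-word page (roughly consistent with the My Brother Sam is Dead column):

```python
# Words missed per page = words on the page x (1 - accuracy rate).
# The 300 words/page figure is an assumption for illustration.
words_per_page = 300

for accuracy in (0.95, 0.98, 0.99):
    missed = words_per_page * (1 - accuracy)
    print(f"{accuracy:.0%} accuracy -> {missed:.1f} words missed per page")
# 95% -> 15.0, 98% -> 6.0, 99% -> 3.0 (matches the middle column above)
```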

Accuracy is more important than fluency

Accurate at Skill → Fluent at Skill → Able to Apply Skill

• Accurate at skill? If no, teach the skill. If yes, move to fluency.
• Fluent at skill? If no, teach fluency/automaticity. If yes, move to application.
• Able to apply skill? If no, teach application. If yes, move to a higher-level skill/concept.

Adapted from

The Big 5 of Reading – Phonemic Awareness, Phonics (Alphabetic Principle), Oral Reading Fluency & Accuracy, Reading Comprehension – with each skill progressing through Accuracy → Fluency → Application

Phonics Example:

Nonsense Word Fluency

• Accurate at Skill: student knows all letter sounds and makes few, if any, mistakes
• Fluent at Skill: student knows all letter sounds AND provides letter sounds fluently
• Able to Apply Skill: student automatically blends letter sounds into whole words

[Three scored NWF probes, labeled Accuracy / Fluency / Application:
• Probe 1: 35/56 letter sounds correct = 63% accuracy, 0 whole words read → not yet accurate
• Probe 2: 35/36 letter sounds correct = 97% accuracy, but only 35 correct letter sounds in one minute, 0 whole words read → accurate, not yet fluent
• Probe 3: 54/54 letter sounds correct = 100% accuracy, 68 correct letter sounds, 24 whole words read → accurate, fluent, and applying]
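The accuracy → fluency → application sequence can be written as a small decision function. The sketch below uses illustrative cut points (95% accuracy, 50 correct letter sounds per minute, any whole words read); these are assumptions for demonstration, not official DIBELS benchmarks:

```python
def nwf_stage(cls_correct: int, cls_attempted: int, wwr: int) -> str:
    """Classify a one-minute Nonsense Word Fluency probe into the
    accuracy -> fluency -> application teaching sequence.
    Cut points here are illustrative, not official benchmarks."""
    accuracy = cls_correct / cls_attempted if cls_attempted else 0.0
    if accuracy < 0.95:
        return "teach the skill (letter sounds not yet accurate)"
    if cls_correct < 50:          # correct letter sounds in one minute
        return "build fluency/automaticity"
    if wwr == 0:                  # no whole words read = not blending yet
        return "teach application (blending into whole words)"
    return "move to higher-level skill (connected text)"

# The three probes from the slides (probe 3 treated as fully accurate):
print(nwf_stage(35, 56, 0))   # 63% accurate -> teach the skill
print(nwf_stage(35, 36, 0))   # 97% accurate, 35 CLS -> build fluency
print(nwf_stage(68, 68, 24))  # 100% accurate, 68 CLS, 24 WWR -> move on
```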

Validating the deficit

• CBM measures (DIBELS, easyCBM, AIMSWEB, etc.) are “indicators”
• What does your other data tell you?
  – In-curriculum assessments
  – Other CBM data
  – OAKS Assessment

Question 2: How big is that deficit?

Is the skill low or significantly low?

• You must define what is low and what is significantly low:

Examples:

 | Low | Significantly low
DIBELS Next | Below benchmark | Well below benchmark
easyCBM* | Between 11th and 25th percentile | ≤10th percentile
AIMSWEB** | Between 11th and 25th percentile | ≤10th percentile

*easyCBM default percentile rank settings
**AIMSWEB default percentile rank settings
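Encoding these cutoffs keeps the “low vs. significantly low” call consistent across teams. A minimal sketch using the default percentile ranks from the table (the function itself is illustrative):

```python
def risk_level(percentile: int) -> str:
    """Map a screening percentile rank to a risk category using the
    default easyCBM/AIMSWEB cutoffs shown above (illustrative encoding)."""
    if percentile <= 10:
        return "significantly low"   # candidates for most intensive support
    if percentile <= 25:
        return "low"                 # candidates for strategic support
    return "at or above benchmark"

for p in (5, 18, 40):
    print(p, "->", risk_level(p))
```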

Question 3: What interventions address that deficit?

Program Evaluation

What intervention programs does your school have that address the skill need(s)?


[Matrix: intervention programs (Triumphs, Phonics for Reading, Read Naturally, STARS, SFA Tutoring, Reading Mastery, Language for Thinking, Horizons) mapped by checkmarks to the skills they address: Phonemic Awareness, Phonics, Oral Reading Accuracy & Fluency, Vocabulary, Reading Comprehension]


Additional resources for evaluating interventions

• What Works Clearinghouse – http://ies.ed.gov/ncee/wwc/
• Florida Center for Reading Research – http://stage.fcrr.org/fcrrreports/CReportsCS.aspx?rep=supp
• Oregon Reading First – http://oregonreadingfirst.uoregon.edu/inst_curr_review_si.html
• Best Evidence Encyclopedia – http://www.bestevidence.org/

Question 4: How do we implement the program?

Consultation

Placement Tests

Once an intervention program that addresses the instructional need is identified, placement tests should be used to form instructional groups of students.

Other considerations

• Available resources (time, staff, materials) will guide how many groups are created.

• Consider the behavioral and social/emotional needs of the students

Additional Diagnostic Data

• Diagnostic assessment in critical area of need:
  – Quick Phonics Screener
  – Curriculum-Based Evaluation
  – CORE Multiple Measures
  – DIBELS booklets error patterns
  – Running Records
  – Other?


With your partner

• What other data sources do you currently use or are available to you, to help match interventions to student need?

– Reading
– Math
– Writing
– Behavior

Documentation

[Example documentation: Johnny – skill deficit: Phonics (in text); data sources: Oral Reading Fluency, Quick Phonics Screener; intervention: Reading Mastery 2]

Evaluating Interventions

What’s the Big Idea(s)!?

• Use appropriate progress monitoring tools
• Set goals
• Establish decision rules
• Analyze data, apply decision rules, and determine what to change

Progress Monitoring Tools
• Brief & easy
• Sensitive to growth
• Frequent
• Equivalent forms!!!

What are some commonly used progress monitoring tools?

Reading
• AIMSWEB: Reading CBM, Maze
• DIBELS Next: FSF, PSF, NWF, ORF, Daze
• easyCBM: PSF, LSF, WRF, PRF, MC Reading Comp, Vocab

Math
• AIMSWEB: M – Computation, M – Concepts & Applications, Early Numeracy CBM
• easyCBM: Numbers & Operations, Measurement, Geometry, Algebra

Written Language
• Writing – CBM (Total Words Written, Correct Writing Sequences, Words Spelled Correctly)
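Two of the written-language scores, Total Words Written and Words Spelled Correctly, are simple counts, which is part of what makes them practical for frequent monitoring. A minimal sketch, where the small word set stands in for a real spelling dictionary and Correct Writing Sequences are omitted because they require judging adjacent word pairs:

```python
# Score a writing-CBM sample for Total Words Written (TWW) and
# Words Spelled Correctly (WSC). The word set below is a stand-in
# for a real dictionary; Correct Writing Sequences would additionally
# require judging each adjacent word pair.
KNOWN_WORDS = {"the", "dog", "ran", "to", "his", "house", "fast"}

def score_writing(sample: str) -> tuple[int, int]:
    words = [w.strip(".,!?").lower() for w in sample.split()]
    tww = len(words)                                  # every word counts
    wsc = sum(1 for w in words if w in KNOWN_WORDS)   # spelled correctly
    return tww, wsc

tww, wsc = score_writing("The dog ran to his howse fast.")
print(f"TWW={tww}, WSC={wsc}")  # TWW=7, WSC=6 ("howse" is misspelled)
```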

What are NOT good progress monitoring tools?

Reading
• DRA
• Running Records
• Phonics screeners
• Report cards
• OAKS
• Reading curriculum weekly or monthly tests, or fluency passages*

Math
• Curriculum weekly tests
• Teacher-created math probes*
• OAKS

Written Language
• OAKS
• Writing rubrics*

*when not administered and scored in a standardized and reliable way, or checked for consistency of multiple probes

Do we have the right “indicators”?

• Oral Reading Fluency and Accuracy in reading connected text is one of the best indicators of overall reading comprehension (Fuchs, Fuchs, Hosp, & Jenkins, 2001)

Fluent & accurate reading is not the end goal… but a child who cannot read fluently and accurately cannot fully comprehend written text.

Additional Progress Monitoring Tools

For more info and a review of available tools, visit www.rti4success.org (Progress Monitoring Tools Chart)

Goal Setting: Things to Consider

1. What is the goal?
   – Criterion-based: research-based benchmarks/proficiency
   – Norm-based: minimum of 25th percentile (bottom limit of average); School, District, State, National

How do you define success?

Goal Setting: Things to Consider

2. By when will they get there?
   – Long-term goals are always at proficiency (i.e., grade placement benchmark)
   – Short-term goals may be an incremental step towards proficiency (i.e., instructional-level material)

Does your goal close the gap?

Goal Setting: Things to Consider

3. What does reasonable growth look like?
   – National growth rates (Fuchs, AIMSWEB, Hasbrouck & Tindal)
   – Local growth rates: District, School, Classroom, Intervention Group

What progress can we expect?

National Growth Rates: Reading

Grade | Average ORF Growth (WCPM)* | Ambitious ORF Growth (WCPM)* | Average Maze Growth (WCR)**
1 | 2 | 3 | 0.4
2 | 1.5 | 2 | 0.4
3 | 1 | 1.5 | 0.4
4 | 0.85 | 1.1 | 0.4
5 | 0.5 | 0.8 | 0.4
6 | 0.3 | 0.65 | 0.4

*Fuchs et al (1993), **Fuchs & Fuchs (2004)
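These weekly rates convert directly into goals: end-of-year goal = baseline + weeks of intervention × weekly growth rate. A minimal sketch using the 2nd grade rates from the table; the baseline of 30 wcpm and the 32-week window are made-up values:

```python
# End-of-year ORF goal = baseline + weeks of intervention x weekly growth.
# Baseline and weeks below are hypothetical; growth rates are from the
# table above (2nd grade: average 1.5, ambitious 2.0 wcpm/week).
baseline_wcpm = 30
weeks = 32

for label, rate in (("average", 1.5), ("ambitious", 2.0)):
    goal = baseline_wcpm + weeks * rate
    print(f"{label} growth ({rate} wcpm/week) -> goal of {goal:.0f} wcpm")
# average -> 78 wcpm, ambitious -> 94 wcpm
```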

“Using national normative samples allows comparisons to be made with the performance levels expected of typical performing students from across the country and equates more closely with data sets that are used in well developed, published, norm referenced tests.” Shapiro, 2008

Local Growth Rates

What does typical growth look like in… …your district?

…your school?

…your classroom?

…your intervention group?

“…use of the combination of local and national norms provides the user of these data with opportunities to evaluate how student performance compares with a national sample of same-grade peers, as well as against the local peers within the particular school.”

Shapiro, 2008

Setting Appropriate Goals Is Important

[Graph: the same student’s aimline drawn to two different goals – 18 WCPM vs. 36 WCPM – relative to benchmark]

Decision Rules

• Decision rules guide how we decide if our interventions are working and when to move on
• Decision rules create consistency across grade levels and schools
• Decision rules determine how to intensify and individualize interventions
• Decision rules standardize the process for eligibility decision making

Key features of decision rules

• Set the grade levels for the decision rules (K, 1-6)
• Number of points below the aimline (see the sketch below)
• Give direction if the data are highly variable – trendline analysis
• Duration of intervention / frequency of monitoring (length of time between meetings, e.g., 6 to 8 weeks)
• Define success
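Here is a minimal sketch of how the aimline and trendline pieces of a decision rule might be applied to progress monitoring data. The scores, goal, and “3 consecutive points below the aimline” rule are illustrative, not a specific district’s rules (statistics.linear_regression requires Python 3.10+):

```python
# Apply two common decision rules to progress monitoring data:
# (1) N consecutive points below the aimline, and (2) trendline slope
# vs. the aimline slope. All numbers below are hypothetical.
from statistics import linear_regression

weeks  = [0, 1, 2, 3, 4, 5, 6, 7]
scores = [28, 30, 29, 31, 30, 32, 31, 33]   # hypothetical wcpm
baseline, goal, total_weeks = 28, 60, 16    # hypothetical goal line

aimline_slope = (goal - baseline) / total_weeks
aimline = [baseline + aimline_slope * w for w in weeks]

below = [s < a for s, a in zip(scores, aimline)]
three_consecutive_below = any(all(below[i:i + 3])
                              for i in range(len(below) - 2))

trend_slope, _ = linear_regression(weeks, scores)

print(f"aimline slope: {aimline_slope:.2f}, trend: {trend_slope:.2f}")
if three_consecutive_below or trend_slope < aimline_slope:
    print("Rule triggered: meet and change the intervention")
else:
    print("Adequate progress: continue the intervention")
```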

Evaluating Interventions:

Is What We Are Doing Working?

AAA

• Apply Decision Rules: Is the student making adequate progress based on decision rules?
• Analyze: Is it an individual or a group problem?
• Action: Determine what to change

Apply: Is the Student Making Adequate Progress?

[Progress monitoring graph: Chase’s December-June scores plotted against the aimline (y-axis 10-60)]

Analyze: Is it an Individual or a Group Problem?

Cohort Group Analysis: students who have similar literacy programming:
– Grade level
– Intervention program
– Time
– ELD level

Cohort Data

[Progress monitoring graph: Chase’s December-June scores against the aimline, alongside cohort peers Isaiah, Mary, and Amy]

Cohort Data

[Progress monitoring graph (second view): December-June scores for Amy, Isaiah, Chase, and Mary against the aimline]

Action: Determine What to Change

• Listen to the data
• Gather additional data if necessary
• Focus on instructional variables that you can control!

Focus on what we can control

What do we change?
• Time
• Group size
• Time/Engagement
• Different program
• Individual problem solving

A Final Thought

It’s better to shoot for the stars and miss than aim at the gutter and hit it.

Anonymous

Break Time


Individual Problem Solving

OSPA Fall Conference, Oregon RTI Project, October 12th, 2012

Targets

• Provide a framework for individual problem solving for students with the most intensive needs

“It is better to know some of the questions than all of the answers.”

James Thurber

Problem-Solving Non-example

Who are students with the most intensive needs?

• Students with identified disabilities
• Students who may have a disability
• Students with significant literacy deficits

If there was a problem…


Why proactive problem solving?

“Problem solving assessment typically takes a more direct approach to the measurement of need than has been the case in historical special education practice”

- Reschly, Tilly, & Grimes (1999)

“Intervention studies that address the bottom 10-25% of the student population may reduce the number of at-risk students to rates that approximate 2-6%”

- Fletcher, Lyon, Fuchs, & Barnes (2007)

The Problem Solving Process

The cycle (center: Improved Student Achievement):
1. Problem Identification – What is the problem?
2. Problem Analysis – Why is the problem occurring?
3. Plan Development – What are we going to do about the problem?
4. Plan Implementation & Evaluation – How is it working?

Problem Solving Form


Step 1: Problem Identification

What is the problem?

Step 1: Problem Identification

A problem is defined as a discrepancy between expected performance and current performance.

Step 1: Problem Identification

• Expected performance is based on data:
  – Performance of typical/average peers
  – Research-based benchmarks
  – Proficiency scores
• Actual performance is based on current student data

Step 1: Problem Identification

• Calculating the magnitude of discrepancy

Absolute discrepancy:
Current performance (32 wcpm) − Expected performance (72 wcpm, Winter 2nd grade) = −40 wcpm

Discrepancy ratio:
Larger number ÷ Smaller number: 72 wcpm ÷ 32 wcpm = 2.25 times discrepant
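Both calculations are one-liners, which makes them easy to standardize on a problem-solving form. A minimal sketch using the numbers from the slide:

```python
# Quantify a discrepancy the two ways shown above.
expected_wcpm = 72   # Winter 2nd grade expectation (from the slide)
current_wcpm = 32

absolute = current_wcpm - expected_wcpm                          # -40 wcpm
ratio = max(expected_wcpm, current_wcpm) / min(expected_wcpm, current_wcpm)

print(f"absolute discrepancy: {absolute} wcpm")
print(f"discrepancy ratio: {ratio:.2f} times discrepant")        # 2.25
```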

Discrepancy between Current Performance & Expected Performance

Step 1: Problem Identification

Problem definitions should be:
1. Objective – observable and measurable
2. Clear – passes “the stranger test”
3. Complete – includes examples (and non-examples when necessary) and baseline data

Problem Definition: Example

Harry (2nd grader) is currently reading a median of 44 words correct per minute (wcpm) with 83% accuracy when given grade-level text. He also answers an average of 3/10 comprehension questions correct on weekly in-class tests. 2nd grade students in his school are reading an average of 85 wcpm with 97% accuracy on 2nd grade text and answering 9/10 comprehension questions correct.

Problem Definition: Non-Example

Harry struggles with being a fluent reader and is not meeting the 2nd grade reading benchmark. He makes a lot of mistakes and is currently reading at a 1st grade level. He also has difficulties answering comprehension questions at grade level and does poorly on his weekly reading tests.

Step 1: Problem Identification

• Replacement Skill or Target Behavior
  – What would it look like if this student were successful?
  – What would we prefer the student do, instead of the problem behavior?

Problem Definition & Target Skill


The Problem Solving Process

2. Problem Analysis – Why is the problem occurring?

Step 2: Problem Analysis

Problem → Analysis → Plan: the WHY should always drive the WHAT.

The Water…

• Focus on “the water”: Instruction, Curriculum, Environment

ICEL
I – Instruction
C – Curriculum
E – Environment
L – Learner
→ Student Learning

Instruction: How you teach
Curriculum: What you teach
Environment: Where you teach
Learner: Who you teach

We can control the how, what, and where. We don’t have much control over the who.


When it comes to problem analysis, just remember…

ICE, ICE baby: Instruction, Curriculum, Environment… then Learner

What impacts student achievement?

Effective teaching variables | Effect size
Formative Evaluation | +0.90
Comprehensive interventions for students with LD | +0.77
Teacher Clarity | +0.75
Reciprocal Teaching | +0.74
Feedback | +0.73
Teacher-Student Relationships | +0.72
Direct Instruction | +0.59

Other variables | Effect size
Socioeconomic Status | +0.57
Parental Involvement | +0.51
Computer based instruction* | +0.37
School Finances | +0.23
Aptitude by Treatment Interactions* | +0.19
Family Structure | +0.17
Retention | -0.16

John Hattie, Visible Learning, 2009

Hypothesis Development

[Hypothesis development form: Instruction? Curriculum? Environment? Learner?]

ICEL Assessment

Instruction, Curriculum, & Environment

• What should appropriate instruction, curriculum, and environment look like?
• Video: Early Reading Intervention
  – 3 students receiving direct instruction on phonemic awareness & phonics
  – Observe and note effective teaching practices with regard to instruction, curriculum, and environment

Instruction, Curriculum, Environment


Talk time

• What effective teaching practices did you see related to instruction, curriculum, & environment?

• What questions/concerns/suggestions might you have for this teacher?


Assessment ≠ Testing ≠ Evaluation

Testing* – “administering a particular set of questions to an individual to obtain a score”

Assessment* – “the process of collecting data for the purpose of making decisions about students”

Evaluation** – “procedures used to determine whether the child has a disability, and the nature and extent of the special education and related services that the child needs.”

*Salvia & Ysseldyke, 2004
**Oregon Administrative Rules, 581-015-2000

Assessment

Assessment: RIOT

R – Review
I – Interview
O – Observe
T – Test

Hypothesis Development

[Hypothesis development form: Instruction, Curriculum, Environment, Learner]

Instruction

• Thinking about RIOT procedures, what are some ways we can gather information about Instruction?

R – Review: Examine lesson plans, attendance, permanent products for instructional demands
I – Interview: Talk to teachers about expectations, instructional strategies used
O – Observe: Observe instruction in the classroom for effective instructional practices
T – Test: Aggregate test scores of the classroom

Who knows…?

Instruction: Examples

Targets for Intervention → desired practice:
• 1-2 OTRs/min → 8-12 OTRs/min
• <50% of errors corrected → 95-100% of errors corrected
• Explicit instruction model: I do, we do, y’all do, you do

Is this effective instruction?

When it comes to interventions…

“It is clear that the program is less important than how it is delivered, with the most impressive gains associated with more intensity and an explicit, systematic delivery” Fletcher & Colleagues, 2007


Instruction Resources

• Explicit Instruction – Archer & Hughes (2011), www.explicitinstruction.org
• Teaching Reading Sourcebook – CORE, http://www.corelearn.com/
• Classroom Instruction that Works: Research-Based Strategies for Increasing Student Achievement – Marzano et al. (2001)

Curriculum

• Thinking about RIOT procedures, what are some ways we can gather information about Curriculum?

R – Review: Examine permanent products for skills taught, scope & sequence, instructional match
I – Interview: Talk to teachers and administrators about philosophy of curriculum, coverage, etc.
O – Observe: Student success rate
T – Test: Readability of textbooks

Curriculum: Examples

Targets for Intervention | Desired
Not matched to need | Matched to need
Frustrational (<80%) | Instructional (80-90%)
Weak (<80%) | Strong (>80%)

Reading Skills Build on Each Other: Phonemic Awareness → Phonics (Alphabetic Principle) → Oral Reading Accuracy & Fluency → Reading Comprehension

Environment

• Thinking about RIOT procedures, what are some ways we can gather information about Environment?

R – Review: Examine school rules, attendance, class size
I – Interview: Talk to teachers about expectations, rules, behavior management system, classroom culture; talk to parents
O – Observe: Observe in the classroom
T – Test: Aggregate test scores of the classroom

Environment: Examples

Targets for Intervention | Desired
Expectations not defined | Explicitly taught & reinforced
Low rate of reinforcement | Mostly positive (4:1)
Chaotic & distracting | Organized & distraction-free

Academic Learning Time: Typical School

Hours
1170 School Year (6.5 hours x 180 days)
- 65 Absenteeism (1 day/month x 10 months)
= 1105 Attendance Time (time in school)
- 270 Non-instructional time (1.5 hrs/day for recess, lunch, etc.)
= 835 Allocated Time (time scheduled for teaching)
- 209 Administration, transition, discipline (25% of allocated time; 15 minutes/hour)
= 626 Instructional Time (time actually teaching)
- 157 Time off task (engaged 75% of time)
= 469 Engaged Time (on task)
- 94 Unsuccessful engaged time (success rate 80%)
= 375 Academic Learning Time

Education Resources Inc., 2005

Academic Learning Time: Effective School

Hours
1170 School Year (6.5 hours x 180 days)
- 65 Absenteeism (1 day/month x 10 months)
= 1105 Attendance Time (time in school)
- 270 Non-instructional time (1.5 hrs/day for recess, lunch, etc.)
= 835 Allocated Time (time scheduled for teaching)
- 125 Administration, transition, discipline (15% of allocated time; 9 minutes/hour)
= 710 Instructional Time (actually teaching: 710 vs. 626)
- 71 Time off task (engaged 90% of time)
= 639 Engaged Time (639 vs. 469 on task)
- 64 Unsuccessful engaged time (success rate 90%)
= 575 Academic Learning Time

Education Resources Inc., 2005

The Difference: Typical vs. Effective Schools

Variable | Typical School | Effective School | Time gained | How the time is gained
Allocated / non-instructional time | 25% (15 min/hr) | 15% (9 min/hr) | +84 more hours | Teaching expectations, teaching transitions, managing appropriate and inappropriate behavior efficiently
Engagement rate | 75% | 90% | +86 more hours | Better management of groups, pacing
Success rate | 80% | 90% | +30 more hours | Appropriate placement, effective teaching
Academic Learning Time | 375 hours | 575 hours | = 200 more hours (53% more) OR 95 more school days (4-5 months!) |
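The whole cascade is a chain of multiplications, so the two scenarios can be recomputed from their three levers (non-instructional share, engagement, success rate). A minimal sketch; small rounding differences from the slide totals are expected:

```python
# Recompute the Academic Learning Time cascade for the typical and
# effective school scenarios from the tables above.
def academic_learning_time(noninstructional_share, engagement, success):
    attendance = 6.5 * 180 - 65        # school year hours minus absences
    allocated = attendance - 270        # minus recess, lunch, etc. -> 835 h
    instructional = allocated * (1 - noninstructional_share)
    engaged = instructional * engagement
    return engaged * success            # time on task AND succeeding

typical = academic_learning_time(0.25, 0.75, 0.80)    # ~375 hours
effective = academic_learning_time(0.15, 0.90, 0.90)  # ~575 hours
print(f"typical: {typical:.0f} h, effective: {effective:.0f} h, "
      f"gained: {effective - typical:.0f} h")
```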

Learner

• Thinking about RIOT procedures, what are some ways we can gather information about the Learner?

R – Review: Examine cumulative file, health records, developmental history, etc.
I – Interview: Talk to teachers, parents, and the student about perceptions of the problem
O – Observe: Observe the student in the classroom
T – Test: Direct assessment

Learner: Examples

Targets for Intervention | Desired
Poor attendance | Great attendance
Well below benchmarks | At benchmarks
Off-task, disruptive, disengaged | Focused & attentive

Before considering additional testing

• Start with existing data:
  – Screening data
  – Progress monitoring data
  – State testing data (OAKS)
  – In-curriculum data
• Is additional data needed?
  – What additional questions do you have?
  – Which diagnostic assessments can answer those questions?

Assessment

Additional Resources

• Curriculum-Based Evaluation: Teaching & Decision Making – Howell & Nolet
• CORE Assessing Reading: Multiple Measures
• Quick Phonics Screener
• DIBELS Deep

Hypothesis Development

[Hypothesis development form: Instruction, Curriculum, Environment, Learner]

Hypothesis Development

• What can we do that will reduce the problem (decrease the gap between what is expected and what is occurring)?

Problem Hypothesis

• Why is the problem occurring?
• Example:
  – Harry’s reading fluency and comprehension problems occur because he lacks strategies for decoding silent-e words and vowel digraphs (oa, ea, ae, ou, etc.). His current instruction does not provide enough explicit modeling of these skills. He also currently has a low level of engagement and is highly distracted in both his classroom and intervention room.

Prediction Statement

• What will make the problem better?
• Example:
  – Harry will improve if he receives explicit instruction in his identified missing skills. He also needs instruction that utilizes high pacing and effective active engagement strategies to keep him highly engaged in instruction, and an environment that is quiet, without distraction from other students.

Problem Hypothesis & Prediction


Step 3: Plan Development

What are we going to do about the problem?

Consultation

Intervention Plan


Progress Monitoring Plan


Fidelity Monitoring Plan


Fidelity checklist


Importance of Feedback

• Wickstrom et al. studied 33 intervention cases. Teachers agreed to do an intervention and were then observed in class.
• 0/33 teachers had fidelity above 10%.
• 33/33 teachers indicated on a self-report measure that they had used the intervention as specified by the team.

Slide taken from a presentation by Joseph Witt

Consultation

Importance of Feedback

“Among the most powerful of interventions is feedback or formative evaluation – providing information to the teacher as to where he or she is going, how he or she is going there, and where he or she needs to go next” – Hattie, 2012 (Visible Learning for Teachers)

“Feedback is the breakfast of champions” – Kevin Feldman

Consultation

Step 4: Plan Implementation & Evaluation

How is it working?

Attendance


Fidelity Data


Progress Monitoring Data…


…as compared to peers/expected growth

Cohort Data

[Progress monitoring graph: Chase’s December-June scores against the aimline, alongside cohort peers Isaiah, Mary, and Amy]

Cohort Data

[Progress monitoring graph (second view): December-June scores for Amy, Isaiah, Chase, and Mary against the aimline]

Magnitude of Discrepancy


Next Steps: Based on Data & District Policies & Procedures

Final Thought: Data, Data, Data


Questions/Comments

Jon Potter [email protected]

Lisa Bates [email protected]

David Putnam [email protected]
