Transcript Slide 1


Summer 2009

Progress Monitoring

Strategies for Writing Individual Goals in General Curriculum and More Frequent Formative Evaluation

Mark Shinn, Ph.D.

Lisa A. Langell, M.A., S.Psy.S.

V. Scott Hooper, Ph.D.

Scott E. Linner, Ed.S., NCSP

Today’s Learning Objectives:

1. Brief review of benchmark data, identifying students “in need,” and how data connect to Instructional Decision-Making and Progress Monitoring
2. What is Progress Monitoring?
3. Progress monitor (PM) schedule setup (Part 1): Conducting Survey Level Assessment (SLA) to determine the student’s present levels of educational performance
4. Setting individualized student goals: a. Using norm-referenced data; b. Using rate of improvement (ROI) when progress monitoring (two common methods)
5. Progress monitor (PM) schedule setup (Part 2): a. Duration; b. Frequency; c. Intensity
6. Decisions regarding data collection/PM frequency and duration
7. Data interpretation, case studies, and practice exercises
8. Strategies for writing understandable, effective intervention descriptions

1. A Brief Review of Benchmark Data:
Identifying Students At-Risk for Academic Failure

*All data and identifying information presented are fictitious.

Michael Martin (fictitious):

A student with Benchmark data that indicates he is performing significantly behind peers and targets.


Fall Benchmark Data for Michael Martin

Martin, Michael: Grade 5


Performance Comparisons Across Groups

Grade 5: Michael’s School


Grade 5: Michael’s District

Grade 5: National Aggregate Norms


AIMSweb National Aggregate Norm Table, Grades 1-5: Michael Martin

Compare Michael Martin, a fall 5th grade student: 48 wrc / 12 errors

Big Ideas About Frequent Formative Evaluation Using General Outcome Measures and the Progress Monitoring Program

• One of the most powerful interventions schools can use is systematic and frequent formative evaluation.
• Benchmark Assessment is not enough for some students, because they may be in ineffective programs too long (3+ months).
• Progress Monitoring enables goal-oriented, frequent data collection in order to inform instruction and measure student achievement.

How often is monitoring needed?

• For some, Benchmark Data is sufficient to enable important and timely instructional decisions to be made.
• For others, closer attention is needed: more severe achievement problems and/or more resource intensive programs require more frequent progress monitoring.

More severe achievement problems and/or more resource intensive programs require more frequent progress monitoring.

Student Example 1: U. Boardman: Typically Achieving & Above Average Students

The consequences of some period of lack of progress, although undesirable, may be less serious. Awareness and action are still important to prevent further delays.

U. Boardman (Grade 3: Wilson Elementary):
• An above average 3rd grade student who has not progressed in reading from fall to winter.
• He already received an instructional program that may not have been working for him for at least 16-20 weeks.
• By the time his lack of progress has been documented and a change in his reading program is made, there is still a possibility that he could become an average or above average reader by spring!

Student Example 2: Melissa: Below Average Students

The consequences of some period of lack of progress increase the seriousness and urgency of improving outcomes.

Melissa (Grade 4: Wilson Elementary):
• Melissa is a very low-performing 4th grade student whose fall to winter benchmark shows no growth.
• She began the year as an at-risk reader and has not improved her reading performance.
• By winter, she is now a reader with a severe performance discrepancy.
• She needs frequent progress monitoring and significant instructional improvements.

Melissa

Melissa’s progress may need to be monitored more frequently (perhaps 1-2x per week) so that teachers can evaluate her progress and adjust her reading program every 4-6 weeks, or as needed.

2. What is Progress Monitoring?
The Progress Monitor Schedule Setup Process

Progress Monitoring is: Research-Based Best Practices

Systematic Formative Evaluation that requires the use of standard assessment tools:
1. That are the same difficulty, and
2. That are given the same way each time.

(AIMSweb® offers these features.)

Formative Assessment

Formative Assessment: The process of assessing student achievement during instruction to determine whether an instructional program is effective for individual students.

• When students are progressing, keep using your instructional programs.
• When tests show that students are not progressing, you can change your instructional programs in meaningful ways.

Progress Monitoring is:

…for programs that are resource intensive and therefore should involve frequent monitoring of student outcomes, such as:
• Title I
• English Language Learning (ELL)
• Special Education
• Programs that have a higher cost per student
• Programs with higher teacher-student ratios
• Experimental programs

Recommendation 1: We should monitor student outcomes more frequently than the Benchmark Testing schedule.

Recommendation 2: We should monitor students who are at risk for academic failure much like one would in a medical intensive care unit.

Formative Evaluation of Vital Signs Requires Quality Tools

• Technical adequacy (reliability and validity);
• Capacity to model growth (able to represent student achievement growth within and across academic years);
• Treatment sensitivity (scores should change when students are learning);
• Independence from specific instructional techniques (instructionally eclectic, so the system can be used with any type of instruction or curriculum);
• Capacity to inform teaching (should provide information to help teachers improve instruction);
• Feasibility (must be doable).

Fuchs and Fuchs (1999)

GROUPWORK:

Rehearse, Practice & Discuss:

• Break into groups of 2-4 participants and rehearse the slides and content presented.

Think about how you will present this material to future trainees.

• Try to “re-tell” the information on the slides to peers.

• Take notes – they will be your reference for when you need to train others within your organization.

• Bring forth any questions from the team to your CAT.

• Your CAT will walk about the room, observing and assisting as needed.

• Your CAT will then survey each group at the end of this break to discuss questions and comments that arose during the exercise.


3. Progress Monitor Schedule Setup (Part 1):
Conducting Survey Level Assessment (SLA) to determine the student’s present levels of educational performance

The PM Setup Process

Setting up a frequent progress monitoring (PM) schedule involves four (4) major steps:
1. Survey Level Assessment (SLA): Assess extent of academic deficit
2. Goal Setting Basics
3. Determining the Student’s Goal:
   • Norm-referenced method of goal-setting
   • Criterion-referenced method of goal-setting
4. Assessing PM Schedule Needs: Analyze the student’s needs and available resources, then use them to determine the frequency and duration of assessment.

Step 1: Survey Level Assessment (SLA)


SLA: Students are tested in successive levels of the general curriculum, beginning with their current expected grade placement, until a level at which they are successful is determined.

Conducting a Survey Level Assessment (SLA)

Grade 5 Median: 48/12
Grade 4 Median: 67/10
Grade 3 Median: 76/8
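A minimal sketch of this walk-down logic in Python, under stated assumptions: administer_probe is a hypothetical scoring hook, and apart from the Grade 5 band (which mirrors this deck’s 90-128 wrc/min middle-50% example) the NORMS bands are placeholders. Per the groupwork quiz below, best practice for R-CBM is the median of three probes per grade level, discontinuing once the score lands in the 25th-75th percentile band.

```python
# Sketch of Survey Level Assessment (SLA) for R-CBM.
# Assumptions: administer_probe() gives and scores one probe (hypothetical);
# NORMS holds 25th/75th percentile bands per grade for the current season.
from statistics import median

NORMS = {5: (90, 128), 4: (70, 110), 3: (50, 92), 2: (30, 70), 1: (10, 40)}

def administer_probe(student, grade):
    """Placeholder: administer one grade-level R-CBM probe, return WRC."""
    raise NotImplementedError

def survey_level_assessment(student, expected_grade):
    """Walk down grade levels until the median of three probes lands in
    the 25th-75th percentile band: the student's success level."""
    for grade in range(expected_grade, 0, -1):
        score = median(administer_probe(student, grade) for _ in range(3))
        low, high = NORMS[grade]
        if low <= score <= high:
            return grade, score  # success level found; discontinue SLA
    return None, None  # no success level found down to grade 1
```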

Survey Level Assessment (SLA):

Using National Aggregate Normative Data

[Box-plot chart: AIMSweb national aggregate norms by grade (K-8) and benchmark period (Fall/Winter/Spring)]

Survey Level Assessment (SLA)

Grade 5: 48/12

Survey Level Assessment (SLA)

Grade 4: 67/10
Grade 5: 48/12

Survey Level Assessment (SLA)

Grade 3: 76/8
Grade 4: 67/10
Grade 5: 48/12

Using SLA Data for Present Levels of Educational Performance Statements:

“Michael Martin currently reads about 48 words correctly, with 12 errors, from Grade 5 Standard Reading Assessment Passages. He reads Grade 3 reading passages successfully; 76 words correct per minute, with 8 errors, which is how well average beginning 3rd grade students read this material.”

GROUPWORK:

Part 1: Quiz / FAQ
Part 2: Groupwork

Groupwork Quiz

1. For R-CBM: When conducting SLA, how many probes per grade level does “best practice” recommend collecting?
a. Using the score from one R-CBM probe each, per grade level?
b. Using the median score of three probes each, per grade level?
c. Using other test data from non-AIMSweb® assessments that provide an estimated grade level that may be used to determine a student’s “success level?”
d. Any of the above are acceptable.

Groupwork Quiz: Answer

1. Answer: b. Using the median score of three probes each, per grade level.

Groupwork Quiz

2. For other AIMSweb® measures (not R-CBM): When conducting SLA, how many probes per grade level does “best practice” recommend?
a. Using the score from one probe each, per grade level?
b. Using the median score of three probes each, per grade level?
c. Using other test data from non-AIMSweb® assessments that provide an estimated grade level that may be used to determine a student’s “success level?”
d. Any of the above are acceptable.

Groupwork Quiz: Answer

2. Answer: a. Using the score from one probe each, per grade level.

Groupwork Quiz

3. If a student was “benchmark” tested, when could you use those benchmark scores as your grade-level SLA data?
a. Never use recent benchmark data for SLA. Always collect new data using probes from the “Progress Monitor” probe set.
b. Benchmark scores are valid for SLA use up to two weeks from the time the most recent benchmark data were collected.
c. You may use benchmark data for SLA anytime.
d. Within no more than 4-6 weeks from the time the most recent benchmark data were collected.
e. None of the above.

Groupwork Quiz: Answer

3. Answer: b. Benchmark scores are valid for SLA use up to two weeks from the time the most recent benchmark data were collected.

Quiz

3. When conducting SLA, how do you recognize the actual grade level at which a student who struggles on expected grade-level probes can perform successfully? (The level at which you discontinue further SLA.)
a. The student performs with 95% accuracy or greater.
b. Administer successively lower grade-level probes until the student’s score falls between the 10th-24th percentile compared to AIMSweb® National Aggregate Norms for the time of year (Fall, Winter, Spring) that most closely matches the date on which you conducted the SLA.
c. Administer successively lower grade-level probes until the student’s score falls between the 25th-75th percentile compared to AIMSweb® National Aggregate Norms for the time of year (Fall, Winter, Spring) that most closely matches the date on which you conducted the SLA.
d. Administer successively lower grade-level probes until the student’s score falls between the 76th-90th percentile compared to AIMSweb® National Aggregate Norms for the time of year (Fall, Winter, Spring) that most closely matches the date on which you conducted the SLA.
e. Test until the student (and you) are exhausted.

Quiz: Answer

3. Answer: c. Administer successively lower grade-level probes until the student’s score falls between the 25th-75th percentile compared to AIMSweb® National Aggregate Norms for the time of year (Fall, Winter, Spring) that most closely matches the date on which you conducted the SLA.

FAQ: Frequently Asked Questions about SLA

1. May we use other reading assessments that provide a “grade level equivalent” score in order to determine the SLA level, instead of AIMSweb® measures?

Answer: This practice is not recommended, as different measures may be measuring different constructs and thus may not yield accurate results.

2. If I have a student in 10th grade who I estimate is performing at around the 3rd or 4th grade level, do I have to conduct SLA at every grade level between 10th and 3rd grade?

Answer: No. For increased efficiency, you may skip a few grades that you clearly estimate to be too difficult for the student. Then consider initiating SLA at grade levels 4, 3, and 2 to see where the student falls. If needed, move up or drop back additional grade levels until the level at which the student is successful is determined.

3. How do I conduct SLA for Written Expression (WE-CBM), given that there is one master set of story starters that are grade-level independent?

Answer: Give the measure once and enter the total Correct Writing Sequences (CWS) and Total Words Written (TWW) scores as directed in AIMSweb®’s Progress Monitor software.

For WE-CBM, you may then plot CWS with TWW, or Words Spelled Correctly (WSC) with TWW.

[Chart: Total Words Written and Correct Writing Sequences plotted over time]

GROUPWORK: Rehearse, Practice & Discuss (same procedure as the earlier groupwork break: re-tell the slides in groups of 2-4, take notes, and bring questions to your CAT).

Step 2: Understanding Goal Setting “Basics”

[Chart: aimline from a starting score of 48 wrc/min to a goal of 115 wrc/min, Start to Finish]

The PM Setup Process

Review: Setting up a frequent progress monitoring (PM) schedule involves four (4) major steps:
1. Survey Level Assessment (SLA): Assess extent of academic deficit
2. Understanding Goal Setting “Basics”
3. Determine the Actual Goal:
   • Norm-referenced method of goal-setting
   • Criterion-referenced method of goal-setting
4. Assessing PM Schedule Needs: Analyze the student’s needs and available resources, then use them to determine the frequency and duration of assessment.

Step 2: Goal Setting Basics

Step 2a. Set a few, but important goals
• Avoid the “goal smorgasbord” (i.e., the “more goals are better” approach)
• Avoid “haphazard goals”

Step 2b. Ensure goals are measurable and linked to validated formative evaluation practices (AIMSweb® helps you accomplish this step more easily!)
• Use AIMSweb® Progress Monitor
• Use direct, continuous assessment using valid, reliable measures (AIMSweb®)

Step 2c. Base goal setting on logical educational practices
• Parents, students, and staff should all understand the goal
• Parents and students (age appropriate) should understand why and how the goal was set
• Know how long we have to attain the goal
• Know what the student is expected to do when the goal is met

Step 2a: Setting Few but Important Goals:
Current Goal Setting Practices Are Unsatisfying!

Do you like these IEPs?

I do not like these IEPs
I do not like them, Jeeze Louise
We test, we check
We plan, we meet
But nothing ever seems complete.

Would you, could you
Like the form?

I do not like the form I see
Not page 1, not 2, not 3
Another change
A brand new box
I think we all
Have lost our rocks!

– Original author unknown

Step 2a: Setting Few but Important Goals:
Current Goal Setting Practices Are Unsatisfying!

Often Ineffective Goal Smorgasbord!
• Student will perform spelling skills at a high 3rd grade level.
• Student will alphabetize words by the second letter with 80% accuracy.
• Student will read words from the Dolch Word List with 80% accuracy.
• Student will master basic multiplication facts with 80% accuracy.
• Student will increase reading skills by progressing through Scribner with 90% accuracy as determined by teacher-made fluency and comprehension probes by October 2006.
• To increase reading ability by 6 months to 1 year as measured by the Woodcock Johnson.
• Student will make one year’s growth in reading by October 2006 as measured by the Acme Reading Test.
• Student will be a better reader.
• Student will read aloud with 80% accuracy and 80% comprehension.
• Student will make one year’s gain in general reading from K-3.
• Students will read 1 story per week.

Step 2a: Setting Few but Important Goals:
Current Goal Setting Practices Are Unsatisfying!

There is little to no empirical evidence that suggests writing these kinds of goals will lead to:
– Systematic formative evaluation (i.e., frequent progress monitoring)
– Any evaluation at all
– Improved educational outcomes

In summary, we have no empirical evidence that these kinds of goals accomplish anything for students or teachers alike!

Step 2b: Setting Few but Important Goals:
Reduce the Number of Goals to a Few Critical Indicators:

Sample Goal Templates for Use With AIMSweb®

Reading: In (#) weeks, (Student name) will read (#) Words Correctly in 1 minute from randomly selected Grade (#) passages.

Spelling: In (#) weeks, (Student name) will write (#) Correct Letter Sequences and (#) Correct Words in 2 minutes from randomly selected Grade (#) spelling lists.

Math Computation: In (#) weeks, (Student name) will write (#) Correct Digits in 2 minutes from randomly selected Grade (#) math problems.

Written Expression: In (#) weeks, (Student name) will write (#) Total Words and (#) Correct Writing Sequences when presented with randomly selected Grade (#) story starters.
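Because these templates are fill-in-the-blank strings, they are easy to generate programmatically. A minimal sketch using the Reading template above (the function name and signature are illustrative, not part of AIMSweb®):

```python
# Filling the Reading goal template from parameters. The wording mirrors
# the slide; the function name and signature are illustrative only.
READING_TEMPLATE = (
    "In {weeks} weeks, {student} will read {wrc} Words Correctly in "
    "1 minute from randomly selected Grade {grade} passages."
)

def reading_goal(student: str, weeks: int, wrc: int, grade: int) -> str:
    return READING_TEMPLATE.format(student=student, weeks=weeks,
                                   wrc=wrc, grade=grade)

# Michael Martin's norm-referenced goal from this training:
print(reading_goal("Michael Martin", weeks=36, wrc=115, grade=5))
```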


Step 2b: Ensure the Goals are Measurable and Linked to Validated Formative Evaluation Practices

• Goals should be based on quality tests like CBM (AIMSweb® uses CBM).
• Goals should be based on validated practices, such as how often to test, how many samples to take, etc.
• The goal represents the outcome of many complex skills that the student must learn. Example: “Company A” may have as its annual goal to stockholders that it earn $3.25/share. Attainment of this outcome will represent the successful accomplishment of many complex activities (e.g., product development, advertising, sales, transportation, customer support, plant maintenance, etc.).
• Goals written within AIMSweb® Progress Monitor are designed to operate the same way as Earnings Per Share: to give a standard to aim for that is a critical and general indicator of overall achievement.


Step 2c: Base Goal Setting on Logical Educational Practices

• Parents, students (when age appropriate), and staff should all understand the goal.
• Parents and students (when age appropriate) should understand why and how the goal was set.
• Know how long we have to attain the goal.
• Know what the student is expected to do when the goal is met.

Quiz

3. Why would writing goals using CBM be a better practice in many cases than writing goals similar to those from the “goal smorgasbord?”
a. CBM goals are easier for parents, teachers, and students to understand.
b. CBM goals are based on validated assessment practices.
c. CBM goals are measurable.
d. There is virtually no empirical evidence that goals from the “goal smorgasbord” actually accomplish anything for teachers or students alike.
e. All of the above.

Quiz: Answer

3. Answer: e. All of the above.

Quiz

2. What are some possible expectations for the student after the student has met a CBM goal (check all that apply):
□ Student should be better able to access the general curriculum without intensive support
□ Student will generalize skills taught during intensive instruction to everyday learning
□ Student will continue learning at a growth rate similar to typical peers
□ Student will perform at a level similar to typical peers
□ Others?


GROUPWORK: Rehearse, Practice & Discuss (same procedure as the earlier groupwork breaks).

The PM Setup Process

Review: Setting up a frequent progress monitoring (PM) schedule involves four (4) major steps:
1. Survey Level Assessment (SLA): Assess extent of academic deficit
2. Understanding Goal Setting “Basics”
3. Determine the Actual Goal:
   • Norm-referenced method of goal-setting
   • Criterion-referenced method of goal-setting
4. Assessing PM Schedule Needs: Analyze the student’s needs and available resources, then use them to determine the frequency and duration of assessment.

4. Determine the Individual Goal: Two Common Methods

1. Norm-referenced
2. Rate of improvement (ROI)

4a. Norm-Referenced (NR) Method
Progress Monitoring Schedule Setup and Goal Setting

Learn Via Case Study: Michael Martin

50% of students in 5th grade at this school are performing between 90-128 wrc/min. These students represent the MIDDLE 50% of students in the comparison group (school).

“Core” Curriculum: “Core” instruction is often delivered in a way that meets “middle” students more than the students in the “whiskers.”

Implication? Benchmark data reflect the current status of performance, and also where the “core” must work to move middle students higher by the next benchmark period.

At-risk students: These are students typically below the 25th percentile within their comparison group. These are students for whom “core instruction” has not entirely met their needs (for any number of reasons).

Norm-Referenced Goal Setting: Using the norm group in a way that enables the educator to know the following:

1. PM Goal: The score at which the at-risk student’s potential performance would be considered “average” compared to peers (~25th+ percentile).

2. Moving the at-risk student up toward the “average range” compared to peers increases the chance that the student will access more of the core curriculum in a manner similar to peers.

Improving Core Programming…

If your school’s local norms (especially those within the range of average performance, the 25th-75th percentile) are at an unacceptable performance level:
• Improve the “core” instruction! Do not rely solely upon “tiered” or more restrictive/intensive programs.
• Use caution when considering placement of all students “below the target” into “tier 2-3” intervention groups. You may be working inefficiently.

Norm-Referenced Goal Setting & Outcomes:

1. Move the at-risk student by setting a goal for at least the 25th percentile of the comparison group.
2. Progress Monitor.
3. The student will then be performing similar to peers and within the average range.
4. If the local “average range” is at an unacceptable performance level, improve “core” instruction.

Norm-Referenced Goal Setting: Using the norm group in a way that enables the educator to know the following:

2. Moving the middle 50% of students up so that spring benchmark targets are met by the majority.
3. The student previously at risk reaches the target.

Establishing Goal-Level Material

• When progress monitoring, establishing the goal-level material is a logical task, based on a combination of:
  – Educational values
  – Student educational needs
  – Intensity of instructional programs
• Predicated on the premise that students with educational needs receive programs that produce learning at a faster rate than peers. Example: A fall 3rd grade student who is successful on 1st grade level passages may be expected to be successful on 3rd grade passages by the end of the school year.

Practice Example: Michael Martin

Grade 3: 76/8
Grade 4: 67/10
Grade 5: 48/12

Practice Example: Michael Martin

Sample: 36-week expectation for performance = GOAL. The goal for Michael is set at about the 25th percentile, 114 wrc/min, rounded to 115 wrc/min for simplicity. (Norm-referenced goal-setting method)

All Goal Setting FAQ: What “number” and grade level do I choose for the goal?

Answer: Set the goal at the grade level and score that you expect the student to perform at by the end of the instructional period (e.g., 9 weeks, 18 weeks, 36 weeks, 52 weeks, etc.).

GROUPWORK: Rehearse, Practice & Discuss (same procedure as the earlier groupwork breaks).

4b. Rate of Improvement (ROI) Method
Progress Monitoring Schedule Setup & Goal Setting

Learn Via Case Study: Michael Martin

What is ROI? Understanding AIMSweb®’s Aggregate Norms, Grades 1-5

Example: 50th percentile for Grade 5: (147 - 115) / 36 weeks = 0.9, the average number of words gained per week.

What is ROI? Understanding Michael Martin’s performance compared to Aggregate Norms, Grades 1-5

Compare Michael Martin, a fall 5th grade student: 48 wrc / 12 errors.

Rate of Improvement (ROI)

Before one can write progress monitoring goals using ROI, there are three things to keep in mind:
1. What research says is a “realistic” and “ambitious” growth rate
2. What norms indicate about “good” growth rates
3. Aggregate & Local Norms: the national growth rate vs. your grade’s/school’s growth rate during the first semester or last year

What research says about growth rates: Oral Reading

Grade   Realistic            Ambitious
1       2.0 WRC per week     3.0 WRC per week
2       1.5 WRC per week     2.0 WRC per week
3       1.0 WRC per week     1.5 WRC per week
4       .85 WRC per week     1.1 WRC per week
5       .5 WRC per week      .8 WRC per week
6-8     .3 WRC per week      .65 WRC per week

From: Progress Monitoring Strategies for Writing Individualized Goals in General Curriculum and More Frequent Formative Evaluation

What further research says about growth rates
[Chart not reproduced in transcript]

ROI: AIMSweb National Aggregate Norm Table: Grade 5

• Aggregate 10th percentile = 0.6 ROI
• Aggregate 25th percentile = 0.7 ROI
• Aggregate 50th percentile = 0.9 ROI

For our fifth grader, Michael Martin, what is the rate of growth of the average 5th grader above?

ROI: Local Norms: Grade 5

Using Local Norms: Determine what your class or school grade growth rate was last semester or last year.

The average growth rate for a 5th grade student at this school is 1.6 words per week.

Survey Level Assessment (SLA): Using National Aggregate Normative Data

Next, conduct SLA for Michael Martin and decide on goal-level material…

SLA: Michael Martin

Grade 3: 76/8
Grade 4: 67/10
Grade 5: 48/12

SLA, ROI, and Goals

• After completing the SLA and deciding on the goal-level material, look at the growth rate for a student at the 25th percentile in that goal-level material.

Example: SLA says Michael is a 5th grade student performing similar to an average student on 3rd grade level materials.
– Decide whether the goal-level material should be 4th grade or 5th grade:
  • If 4th grade level is the goal, the 25th percentile ROI from National Aggregate Norms is: 0.8
  • If 5th grade level is the goal, the 25th percentile ROI from National Aggregate Norms is: 0.7
– Either choice may be appropriate, depending on the intensity of instruction, previous growth rates seen with the selected instructional method, etc.

SLA, ROI, and Goals

1. Look at the ROI for a student at the 25th percentile in the goal-level material.
2. Consider “doubling” that amount (ROI).

Example: If the 25th percentile ROI for 5th grade is 0.7:
• Minimally, multiply: 0.7 x 2 = 1.4 growth rate.
• Next, multiply 1.4 times the number of weeks you plan to progress monitor. This gives you your expected gain score: 1.4 x 36 weeks = 50.4 wrc.
• Add that to the SLA score from the goal-level material to determine the final goal: 50.4 + 48 = 98.4.
• Consider rounding to an even number, or the closest “10”: 98.4 rounds to 100.

IMPORTANT: When planning a goal and providing intervention, the student must have an ROI greater than average if they are going to catch up!
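A compact sketch of this goal arithmetic, using Michael Martin’s numbers from the example above (the function name and rounding helper are illustrative):

```python
# "Double the 25th-percentile ROI" goal-setting method from this slide.
# Numbers mirror Michael Martin's example.
def roi_based_goal(sla_score: float, percentile_roi: float, weeks: int,
                   multiplier: float = 2.0) -> float:
    """Goal = SLA score in goal-level material + (multiplier * ROI * weeks)."""
    gain = multiplier * percentile_roi * weeks  # 2 x 0.7 x 36 = 50.4 wrc
    return sla_score + gain                     # 48 + 50.4 = 98.4 wrc

raw_goal = roi_based_goal(sla_score=48, percentile_roi=0.7, weeks=36)
print(round(raw_goal, 1))         # 98.4
print(round(raw_goal / 10) * 10)  # 100, rounded to the closest "10"
```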

Establishing Goal-Level Material

• When progress monitoring, establishing the goal-level material using norm-referenced or ROI-based data is a logical task, based on a combination of:
  – Educational values
  – Student educational needs
  – Intensity of instructional programs
  – Research & quality data
• Predicated on the premise that students with educational needs receive programs that produce learning at a faster rate than peers. Example: A fall 3rd grade student who is successful on 1st grade level passages may be expected to be successful on 3rd grade passages by the end of the school year.

GROUPWORK: Rehearse, Practice & Discuss (same procedure as the earlier groupwork breaks).

5. Progress Monitor Schedule Setup (Part 2):
Determining the frequency and duration of assessment based on needs and resources

Step 5: How much data should be collected?
Making Data-Based Decisions With Progress Monitor

• You typically need at least 7-10 data points (Shinn & Good, 1989) before making a programming decision, and you may need to collect more if uncertain.
• Christ & Silberglitt (2007) recommended 6-9 data points.
• As the number of data points increases, the effect of measurement error on the trend line decreases.
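One reason more points help: the trend line is typically a least-squares slope through the weekly scores, and slope error shrinks as points accumulate. A minimal sketch comparing an observed trend with an aimline (the weekly scores are fictitious; the aimline uses Michael Martin’s 48-to-115-in-36-weeks goal):

```python
# Ordinary least-squares trend line through weekly progress-monitoring
# scores, compared against an aimline. Scores below are fictitious.
def trend_slope(scores):
    """OLS slope of score vs. week number: an estimate of observed ROI."""
    n = len(scores)
    weeks = range(n)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    den = sum((w - mean_w) ** 2 for w in weeks)
    return num / den

scores = [48, 50, 49, 53, 55, 54, 58, 60]  # 8 weekly WRC scores (fictitious)
aimline_roi = (115 - 48) / 36              # aimline: 48 -> 115 in 36 weeks
print(round(trend_slope(scores), 2))       # observed ROI ~1.68 WRC/week
print(round(aimline_roi, 2))               # expected ROI ~1.86 WRC/week
```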

Step 5: How much data should be collected?
Four Criteria To Consider:

Criterion #1. Trend line meets (or is on target to meet) the aimline for the ultimate goal: Success! Once the goal is met, consider transitioning to a less intensive program or a new goal as needed.

Step 5: How much data should be collected?

Criterion #2. Trend line and aimline will intersect in the relatively near future: Keep the current intervention until the goal is reached.

Step 5: How much data should be collected?

Criterion #3a. Trend line exceeds the aimline?
a. Consider increasing the goal or difficulty level. Example: a Grade 5 student reading grade 4 passages; the goal was changed from 104 wrc/min to 125 wrc/min.

NOTE: When changing a goal to require a different grade level of material, start a new schedule. Do not use the same schedule, as the data are not comparable (i.e., 50 wrc/min on a 5th grade passage means something different than 50 wrc/min on a 3rd grade passage).

Step 5: How much data should be collected?

Criterion #3b. Trend line exceeds the aimline?
b. Or, retain the current intervention and close the gap even faster, if this goal is the final performance level the student is to reach while being progress monitored. Example: a Grade K student on Grade K PSF probes may reach the goal in mid-March, rather than the end of May, if progress continues at the same rate of improvement.

Step 5: How much data should be collected?

Criterion #4. Trend line will not likely intersect the aimline, and/or moves in the opposite direction of the aimline: Consider adding an additional intervention, changing a variable, and/or intensifying program changes. Note that four data points are already below the aimline.

Quiz: A brief review of leading research on progress monitoring

Group Question 1. Generally speaking, how many data points are recommended according to Shinn & Good (1989) before considering a decision?

Quiz: Answer

Answer 1: 7-10 data points (Shinn & Good, 1989).

Group Question 2. Generally speaking, how many data points are recommended according to Christ & Silberglitt (2007) before considering a decision?

Answer 2: 6-9. BUT… there are exceptions to the rules!

Quiz

1. For Tier 1 or Tier 2 programs, the data involved are typically less “high stakes” than those used for:
a. Tier 3
b. Special education entitlement
c. IEP goals

Quiz: Answer

For Tier 3, IEP goals, or special education entitlement decisions, we need to be especially “confident” about the rate of improvement (ROI), or progress, occurring.

GROUPWORK: Rehearse, Practice & Discuss (same procedure as the earlier groupwork breaks).

6a. Now…Building Your Confidence: Developing Good Judgment in Data Analysis
When 7-10 data points may be “too much” or “not enough”

Building Confidence in Decision-Making

2. Variability of the data:
a. The “more variable” the data, the larger the error in the slope. The larger the error in the slope, the more data points are needed to gain confidence in the trend/actual progress made.
b. The “tighter” the data, the fewer data points potentially needed to be “confident” in the developing trend.

Building Confidence in Decision-Making

3. The direction of the trend:
a. If all the data points are below the aimline and going strongly negative, you will not likely need 7-10 data points to confirm “uh-oh!”
b. In contrast, if all data points are above the line and heading in a strongly positive direction, the opposite applies: you won’t likely need 10 data points to say “wow” and increase the ambitiousness of your goal.

Building Confidence in Decision-Making

4. ROI & aimlines are important: Observe data against an “expected rate of progress,” or “aimline.” The absence of such makes for increased error. (AIMSweb® automatically displays these data, but other systems may not.)

[Charts: the same data shown without and with aimlines/trend lines]

6b. Further Building Your Confidence in Decision-Making
Data Collection: Balancing the ideal with the feasible

How Frequently to Assess?

Balance the IDEAL with the FEASIBLE: too little data, collected too infrequently, means students may stay in ineffective programs longer than necessary. See the example on the next slide.

Building Confidence in Decision-Making

Note that a student may potentially be in an ineffective program longer than needed when data collection is not done frequently enough: 5 data points over 15 weeks vs. 5 data points over 5 weeks.

Frequency of Assessment Directly Related to Student Achievement
Similar results found by Fuchs & Fuchs (1986).

For interpretive purposes: An effect size of .71 means that a student at the 50th percentile without formative evaluation would be expected to perform at the 76th percentile with formative evaluation. An effect size is a measure of the strength of the relationship between two variables in a statistical population; it is the average difference between two groups.
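The 76th percentile figure follows from the normal curve: shift a student at the 50th percentile up by 0.71 standard deviations and read off the cumulative probability. A quick check, assuming normally distributed scores:

```python
# Converting an effect size (in standard-deviation units) to an expected
# percentile under a normal distribution of scores.
from statistics import NormalDist

effect_size = 0.71
percentile = NormalDist().cdf(effect_size) * 100  # P(Z <= 0.71)
print(round(percentile))  # ~76
```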

General Guidelines Based on Best Practices & Research: Progress Monitor (PM) Testing Frequency

Probable strength of PM data’s ability to reliably inform instruction:

Frequency        After 4 wks   After 6 wks   After 8 wks   After 10+ wks   R-CBM Recommendation
2x/week          Good          Excellent     Excellent     Excellent       1 probe
1x/week          Fair          Fair          Good          Excellent       1 probe
Every ~10 days   Poor          Poor          Fair          Good            1 probe
Every 2 weeks    Poor          Poor          Poor          Fair            1 probe
Every 3 weeks    Poor          Poor          Poor          Poor            Median of 3 probes
Every 4+ weeks   Poor          Poor          Poor          Poor            Median of 3 probes

(Other measures need only one probe per session.)

Consider all recommendations and guidelines presented within this AIMSweb® training module, as well as other local factors that may apply.

Copyright © 2009 Pearson Education, Inc. or its affiliate(s). All rights reserved.

Summary & Review

Nuances may impact decision-making.

1. Consider: If ALL of the data points are below the aimline, and the trend is clearly heading “down,” then additional data may not be needed in order to make a decision.

2. Conversely, if ALL of the data points are above the aimline, and the trend is going “through the roof,” then you are able to see the effect of the intervention sooner than the matrix suggests, and this is good news!

In key instances, when the trend is either hovering close to the aimline or is flat, the question becomes, “How long do you want to keep a student in a potentially ineffective program at this rate before a change is made?” The matrix above offers a guideline for the frequency and duration of data collection needed before a reliable decision may be made under many common conditions.

Final Thoughts to Consider:
• The consequences of poor decisions are worse for kids not making progress.
• In contrast, if you err and judge the impact of a seemingly effective program “too soon,” it is likely that you are going to judge that program as one that is working, and thus maintain the program as is while continuing to collect data for future decision-making.

Quiz

1. When you are progress monitoring and notice a lot of variability in the data, what do you want to do?
a. Check rater reliability for data collection (ensure the measure was given properly)
b. Consider whether a sufficient number of data points have been collected
c. Increase the frequency of Progress Monitoring, especially when data are being collected less than 1x/week
d. Check fidelity and consistency of instruction
e. All of the above.

Quiz: Answer

1. Answer: e. All of the above.

Quiz

2. If you have a series of data points that are consistently above or below the aimline, how many points would you need in order to decide with reasonable confidence about the efficacy of the program?
a. About 3-4, if all clearly fell above, or clearly fell below, the aimline.
b. About 7-10, if all clearly fell above, or clearly fell below, the aimline.
c. About 2, which is the minimum data required to see the trend line develop and thus understand the program’s efficacy.
d. None of the above. I will use my crystal ball.

Quiz: Answer

2. Answer: a. About 3-4, if all clearly fell above, or clearly fell below, the aimline.

GROUPWORK: Rehearse, Practice & Discuss (same procedure as the earlier groupwork breaks).

7. Data Interpretation, Case Studies, & Practice Exercises

7a. Michael Martin: Norm-Referenced Progress Monitoring Schedule Setup

Practice Example: Michael Martin

“Michael Martin currently reads about 48 words correctly from Grade 5 Standard Reading Assessment Passages. He reads Grade 3 reading passages successfully; 76 words correct per minute, which is how well beginning 3rd grade students read this material.”

Practice Example: Michael Martin

Sample: 36-week expectation for performance = GOAL. The goal for Michael is set at about the 25th percentile, 114 wrc/min, rounded to 115 wrc/min for simplicity. (Norm-referenced goal-setting method)

Baseline Data

[Chart key: Aimline, Trend line, Corrects, Errors]

AIMSweb® Progress Monitor provides the new ROI after the entry of three (3) data points.

[Successive progress-monitoring charts as data points are entered; not reproduced in transcript]

The teacher referenced the research of Shinn & Good (1989) and Christ & Silberglitt (2007) and has collected eight (8) data points thus far. Is this enough data to evaluate the efficacy of the instructional program?

Sample questions to ask when reviewing data:

1. Has the instructional program been provided with fidelity? (Has this been observed directly?)
2. Has student attendance been acceptable?
3. Is core instruction also being provided in reading? Or is the student missing core instruction?
4. Does instruction address the student’s skill deficits?
5. What other factors could be impacting the student’s performance?

An “Intervention line” is added on the exact date the new intervention begins.



Intervention 3 added and performance observed.

Note: Skill regression & recoupment pattern during winter break, between December 22 and January 5.


Note: Gradual decrease in error rates and increase in words read correctly over time.

Quiz

1. How could you use Progress Monitor data for regression & recoupment analysis?
2. How was Michael Martin’s goal set: the ROI or the norm-referenced method? Speculate about the logic behind choosing the selected goal.

GROUPWORK: Rehearse, Practice & Discuss (same procedure as the earlier groupwork breaks).

7a. Progress Monitor Practice Exercises

1. Survey Level Assessment
2. Present Levels of Educational Performance
3. Writing quality instructional program / intervention descriptions
4. Goal Setting Practice Exercises (norm-based)

Exercise 1: Tabitha Gralish, 3rd grade student

DIRECTIONS:
Task 1: Using the graph below, determine Tabitha’s performance level for each median Survey Level Assessment score obtained (based on Fall Norms).
Task 2: Write down Tabitha’s performance level (e.g., “Average,” “Below Average,” etc.) in the table above.

Task 1: Answer
[Performance levels shown on the norms graph]

Task 2: Answer

Tabitha’s performance levels: Well Below Average; Average; Above Average.

DIRECTIONS: Complete the activity below.
ANSWER: [Completed activity shown as a graphic]

Write a Present Levels of Educational Performance statement for Tabitha.

“Currently, Tabitha reads ___ words correct per minute with ___ errors from Grade 3 AIMSweb® Standard Reading Assessment Passages. She reads grade ___ passages successfully, ___ words correct per minute with ___ errors, which is how well ___ grade students read this material in the fall of the year.”

ANSWER: Write a Present Levels of Educational Performance statement for Tabitha.

Currently Tabitha reads 22 words correct per minute with 7 errors from Grade 3 AIMSweb® Standard Reading Assessment Passages. She reads grade 2 passages successfully, 41 words correct per minute with 6 errors, which is how well 2nd grade students read this material in the fall of the year.

Summer 2009 *All data and identifying information presented are fictitious.

152
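Because the statement above is a fixed frame with blanks, it can be generated directly from the Survey Level Assessment results. A minimal sketch using Tabitha's numbers from this exercise; the function and parameter names are illustrative, not part of any AIMSweb software:

```python
# Minimal sketch: fill the Present Levels of Educational Performance frame
# from SLA results. Function and parameter names are illustrative only.

def plep_statement(name, enrolled_grade, wrc, errors,
                   success_grade, success_wrc, success_errors):
    return (
        f"Currently {name} reads {wrc} words correct per minute with {errors} "
        f"errors from Grade {enrolled_grade} AIMSweb® Standard Reading Assessment "
        f"Passages. She reads grade {success_grade} passages successfully, "
        f"{success_wrc} words correct per minute with {success_errors} errors, "
        f"which is how well grade {success_grade} students read this material "
        f"in the fall of the year."
    )

# Tabitha's numbers from the answer above:
print(plep_statement("Tabitha", 3, 22, 7, 2, 41, 6))
```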

Use norm-referenced goal-setting method:

Summer 2009 *All data and identifying information presented are fictitious.

153

ANSWER: Sample Goal for Tabitha:

In 36 weeks, Tabitha will read 89 words correctly, with 4 or fewer errors, from Grade 3 Standard Progress Monitor Reading Assessment Passages.

Summer 2009 *All data and identifying information presented are fictitious.

154
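The norm-referenced logic behind a goal like this can be sketched as a table lookup: choose a target percentile from the grade-level norms for the goal date and use that score as the criterion. A minimal sketch; the norm values below are placeholders chosen to match this exercise, not actual AIMSweb aggregate norms:

```python
# Minimal sketch: norm-referenced goal setting as a norm-table lookup.
# The values in this table are hypothetical placeholders, NOT AIMSweb norms.

SPRING_NORMS_WRC = {  # grade -> {percentile: words read correct per minute}
    3: {25: 72, 50: 89, 75: 107},
}

def norm_referenced_goal(grade, target_percentile):
    """Goal = norm-table score at the chosen percentile for the student's grade."""
    return SPRING_NORMS_WRC[grade][target_percentile]

# Aiming Tabitha at the (hypothetical) 50th percentile of grade 3 spring norms:
print(norm_referenced_goal(3, 50), "wrc/min in 36 weeks")  # -> 89
```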

Summer 2009

Bonus Question: Based on the above goal, along with Tabitha’s Present Levels of Educational Performance, calculate the Rate of Improvement (ROI) needed per week for Tabitha to reach her goal on time.

*All data and identifying information presented are fictitious.

155

Calculating expected weekly ROI toward goal

1. First, determine the “gain score” by subtracting the score obtained on goal-level SLA probes from the goal score.

2. Calculate expected weekly ROI = Gain Score / number of weeks allotted to reach the goal.

Summer 2009

Formula to calculate Gain Score:

Subtract the SLA Score on goal-level passages (22) from the Goal Score (89) = GAIN SCORE (67)

Goal for Tabitha: In 36 weeks, Tabitha will read 89 words correctly, with 4 or fewer errors, from Grade 3 Standard Progress Monitor Reading Assessment Passages.

*All data and identifying information presented are fictitious.

156

Bonus Question: Based on the above goal, along with Tabitha’s Present Levels of Educational Performance, calculate the Rate of Improvement (ROI) needed per week for Tabitha to reach her goal on time.

Summer 2009

ANSWER: 67 / 36 = 1.86 wrc/min per week (expected ROI)

*All data and identifying information presented are fictitious.

157
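The two steps above collapse into one line of arithmetic. A minimal sketch using Tabitha's numbers from this exercise:

```python
# Minimal sketch: expected weekly rate of improvement (ROI) toward a goal.

def expected_weekly_roi(goal_score, sla_score, weeks):
    """Gain score (goal minus goal-level SLA score), divided by weeks to goal."""
    return (goal_score - sla_score) / weeks

# Tabitha: goal 89 wrc/min, goal-level SLA score 22 wrc/min, 36 weeks.
print(f"{expected_weekly_roi(89, 22, 36):.2f} wrc/min per week")  # -> 1.86
# The same helper gives roughly 1.8 for Maya's later bonus question (104, 39, 36).
```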

Goal Criteria and Method

Consider your goal criteria and method:

• What student-related factors were involved in deciding the goal?

• How did the “discrepancy” between current performance and expected performance influence your decision about the goal?

• What other factors were involved in your goal-setting decision, if any?

Summer 2009 *All data and identifying information presented are fictitious.

158

8

Writing Effective Instructional Program / Intervention Descriptions

Summer 2009 159

Writing the Intervention Title & Description

Properly documenting instructional programs using Progress Monitor Software:

• Interventions must be written so that they may be thoroughly understood by anyone who has access to that student’s data or who is responsible for his/her progress.

• Interventions, combined with Progress Monitor data, show which programs worked and which did not, so that ineffective programs are not inadvertently re-administered.

Therefore, it is recommended that the conditions under which the student received instruction are clearly reported, as well as any adjustments to those conditions that are made along the way.

Summer 2009 160

Writing the Intervention Title & Description

Writing the Recipe for Student Success

• Describe your instructional programs as if they were recipes for your favorite dishes. The greater the detail in the recipe, the more success you will have in replicating it.

• The greater the detail in your “instructional recipes,” the greater your success will typically be in:
– Ensuring students receive consistent and appropriate instruction if they transfer to/from teachers, buildings, etc.
– Improving your ability to defend your decision-making when changing components of a student’s instructional program
– Enabling the teacher to “see” all of the variables that may impact the student’s learning, and thus improving the teacher’s ability to later “tweak” those variables to improve student outcomes
– Reducing the weakness of cases brought to litigation or other similar challenges when you have accurately and fully adhered to the “recipe” as written

Summer 2009 161

Writing the Intervention Title & Description

Compare the following recipes (Version 1 & 2). Which one is more easily replicated?

Chocolate Chip Nut Cake: Version 1

INGREDIENTS:
• Some flour
• A little baking powder
• Dash of baking soda
• Salt
• Butter or margarine
• Lots of sugar
• Eggs
• Chocolate chips

PREPARATION INSTRUCTIONS:
• Preheat oven
• Put cake in pan
• Bake for desired time
• Enjoy (?)

Summer 2009 Reference: 5/6/09: http://www.recipes4cakes.com/chocolatecakes/chocochip.htm

162

Writing the Intervention Title & Description

Chocolate Chip Nut Cake: Version 2

INGREDIENTS:
• 2 1/2 cups flour
• 3 tsp baking powder
• 1 tsp baking soda
• 1/4 tsp salt
• 1 cup butter or margarine
• 1 cup sugar
• 3 eggs
• 1 cup sour cream
• 2 tsp vanilla
• 1 cup chopped nuts
• 1 cup chocolate chips
• 1/2 cup brown sugar
• 2 tsp cinnamon

PREPARATION INSTRUCTIONS:
• Oven temp: ~350°; baking time: ~1 hour; pan type: 10-inch tube pan
• Preheat oven; grease pan.
• Sift together the first four ingredients. Set aside.
• Cream together the next three ingredients, beating in the eggs one at a time. Add the sour cream and vanilla to the creamed mixture.
• Add flour mixture to creamed butter mixture and mix together.
• Mix together the nuts, chocolate chips, brown sugar, and cinnamon.
• Pour 1/2 the batter in the pan, then sprinkle on 1/2 the chocolate chip mixture. Add the remainder of the batter, then sprinkle the remainder of the chocolate chip nut mix on top.
• Bake for desired time, remove from oven, and let cool for 5 minutes; loosen the sides of the cake with a butter knife.
• Invert cake onto serving plate and remove from pan.

Summer 2009 Reference: 5/6/09: http://www.recipes4cakes.com/chocolatecakes/chocochip.htm

163

Writing the Intervention Title & Description OPTIONAL

GROUP ACTIVITY:

(~5 minutes)

Step 1: Read the following sample case: “

Michael Martin’s Progress Monitor schedule was just transferred to your caseload today. He transferred to your school this week from another school in your district. Today you are reviewing his AIMSweb® Progress Monitor report and see that he has received two instructional programs (i.e., interventions) thus far. The first program did not work well. The 2nd program appears to be working successfully: (See report on next slide.)”

Summer 2009 *All data and identifying information presented are fictitious.

164

OPTIONAL

Summer 2009 *All data and identifying information presented are fictitious.

165

Writing the Intervention Title & Description OPTIONAL

Step 2: As a group, consider and discuss the following:

– Would you be successful in delivering the following instructional program / intervention?

Intervention 1: “Reading tutoring.”
Intervention 2: “Increase tutoring.”

– Do you have enough information, by reading the Progress Monitor report’s program description alone, to deliver the program / intervention successfully?

– What other information do you need to know?

Summer 2009 166

Writing the Intervention Title & Description

Writing effective program descriptions: Consider the following eight criteria and examples:

1. Who: General education teacher, reading specialist, special education teacher, social worker, psychologist, parent, etc.

2. What: Reading intervention program title & level, the “A+ Math Program, Level 2,” trade books, self-selected reading, repeated reading, behavior plan (describe), core curriculum, specific phonics skill instruction, etc.

3. Where: General education classroom, push-in instruction, pull-out instruction, resource room, social worker’s office, intervention specialist’s room, after-school program, etc.

4. When: After-school program, morning, during math instruction, afternoon, during core reading program, never during core reading program, etc.

Summer 2009

(Continued on next slide)

167

Writing the Intervention Title & Description

Writing effective program descriptions: Consider the eight criteria and examples (continued). A structural sketch follows this list.

5. Why: Level 2a of “A+ Math” addresses John’s recently identified deficit in 2-digit by 2-digit addition and subtraction with regrouping; “ABC Reading, Level 4” addresses John’s recently identified deficit in diphthongs and digraphs; etc.

6. How often: 3x per week, every Monday, daily, twice per day, etc.

7. How long: 20 minutes, 10 minutes, one hour, during 1st period, etc.

8. Other pertinent data: An attendance plan has also been added to ensure John receives the instruction offered (describe plan); a behavior plan has been added to increase the frequency with which John turns in assignments and homework by the due date.

Summer 2009

168
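One way to keep any of the eight criteria from being skipped is to treat them as required fields in a record and flag blanks before the description is saved. A minimal sketch; the field names are illustrative choices, not Progress Monitor software fields:

```python
# Minimal sketch: the eight criteria as required fields of a record.
# Field names are illustrative, not defined by any AIMSweb software.
from dataclasses import dataclass, fields

@dataclass
class InterventionDescription:
    who: str        # e.g., "Ms. Smith, reading specialist"
    what: str       # e.g., "ABC Reading Program, Level 2a"
    where: str      # e.g., "pull-out, resource room"
    when: str       # e.g., "mornings, never during core reading"
    why: str        # e.g., "addresses CVC/CVCE decoding deficit"
    how_often: str  # e.g., "3x per week"
    how_long: str   # e.g., "30 minutes"
    other: str      # e.g., "attendance plan added (describe)"

    def missing_fields(self):
        """Names of criteria left blank, so the author can fill them in."""
        return [f.name for f in fields(self) if not getattr(self, f.name).strip()]

desc = InterventionDescription(
    who="Ms. Smith, reading specialist", what="ABC Reading Program, Level 2a",
    where="pull-out, resource room", when="mornings, never during core reading",
    why="addresses CVC/CVCE decoding deficit", how_often="3x per week",
    how_long="30 minutes", other="",
)
print("Incomplete criteria:", desc.missing_fields())  # -> ['other']
```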

Writing the Intervention Title & Description

SUMMARY: Interventions must be written so that they may be understood, and replicated easily as needed, by others who read them.

Examples:

Program Label / Title
• Low quality: “Intervention”; “Intervention 1”; “Intervention 4”; “Tier 2: Int. 1”; “Best Reading Program”
• High quality: “ABC Reading”

Program / Intervention Descriptions
• Low quality: “Tutoring”; “Reading help”; “Tier 2”
• High quality: “Michael needs work on CVC and CVCE decoding skills, and skills to which those are prerequisite; therefore, Michael is receiving 30 minutes of pull-out reading instruction 3x per week, in the morning, using the ‘ABC Reading Program, Level 2a,’ with Ms. Smith, the reading specialist. He will not be pulled out during core reading instruction in the general education classroom.”

Summer 2009

*All data and identifying information presented are fictitious.

169

Writing the Intervention Title & Description

Summer 2009 *All data and identifying information presented are fictitious.

170

Summer 2009

Exercise 2

:

Maya Cloud

Grade 5 student

7c

171

DIRECTIONS:

Task 1: Using the graph below, determine Maya’s performance level for each median Survey Level Assessment score obtained (based on Fall Norms).

Task 2: Write down Maya’s performance level (e.g., “Average,” “Below Average,” etc.) in the table above.

Summer 2009

For this case, assume:
1. The Survey Level Assessment (SLA) was obtained in the Fall.
2. The goal date will be in 36 weeks (end of the school year).

*All data and identifying information presented are fictitious.

172

ANSWERS:

Task 1: Using the graph below, determine Maya’s performance level for each median Survey Level Assessment score obtained (based on Fall Norms).

Task 2: Write down Maya’s performance level (e.g., “Average,” “Below Average,” etc.) in the table above.

Summer 2009

Maya Cloud, Grade 5 student

Performance levels: Well Below Average; Well Below Average; Below Average; Average; Well Above Average

*All data and identifying information presented are fictitious.

173

Summer 2009 *All data and identifying information presented are fictitious.

174

ANSWER

Summer 2009 *All data and identifying information presented are fictitious.

175

Summer 2009 *All data and identifying information presented are fictitious.

176

ANSWERS:

5 24 5 2

Summer 2009

59 14 2

*All data and identifying information presented are fictitious.

177

Set Goal: Use Norm-referenced method

Summer 2009 *All data and identifying information presented are fictitious.

178

POSSIBLE ANSWER: Norm-referenced method

36 104* 5 4*

* Dependent upon ambitious, but feasible goal for individual student.

Summer 2009 *All data and identifying information presented are fictitious.

179

OPTIONAL

Bonus Question: Based on Maya’s new 36-week goal (104 wrc/min on Grade 4 Passages), along with Maya’s Present Levels of Educational Performance (below), calculate the weekly Rate of Improvement (ROI) needed for Maya to reach her goal on time.

Calculate Gain Score (Goal Score − SLA score for Grade 4): _______
Divide Gain Score by number of weeks: ____________ = weekly ROI

Summer 2009 180

OPTIONAL

Bonus Question: Based on Maya’s new 36-week goal (104 wrc/min on Grade 4 Passages), along with Maya’s Present Levels of Educational Performance (below), calculate the weekly Rate of Improvement (ROI) needed for Maya to reach her goal on time.

ANSWER:
Calculate Gain Score (104 − 39): 65
Divide 65 by number of weeks (36): 1.80 = weekly ROI expected for the norm-referenced goal-setting method.

Summer 2009

181

Now that we have calculated a goal for Maya using the norm-referenced method, let’s practice goal setting using the weekly ROI method…

Summer 2009 182

Use the ROI method to calculate the goal.

Summer 2009 *All data and identifying information presented are fictitious.

183

Use the following charts as needed to determine Maya’s goal using the ROI method.

Summer 2009 184

1.1 wrc/min to ~1.6 wrc/min (0.8 × 2) is the approximate expected weekly growth that we can use to formulate our goal: 1.6 × 36 weeks = 57.6.

Thus: 58 (rounded) + 39 = 97. GOAL: 97 wrc/min

Summer 2009 185
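The ROI-method arithmetic above, as a minimal sketch. The slides round the projected gain before adding it to the baseline; rounding the final total, as below, reproduces the same answers for this and the later exercises:

```python
# Minimal sketch: ROI-method goal = baseline + expected weekly growth * weeks.

def roi_method_goal(baseline_wrc, weekly_roi, weeks):
    """Project a goal by growing the baseline at the chosen weekly rate."""
    return round(baseline_wrc + weekly_roi * weeks)

print(roi_method_goal(39, 1.6, 36))  # Maya    -> 97
print(roi_method_goal(54, 1.4, 36))  # Zachary -> 104 (later exercise)
print(roi_method_goal(72, 1.6, 18))  # Alyssia -> 101 (later exercise)
```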

POSSIBLE ANSWER: ROI method

36 96* 5 4*

* Dependent upon ambitious, but feasible goal for individual student.

Summer 2009 *All data and identifying information presented are fictitious.

186

POSSIBLE ANSWER:

36 104* 5 4*

Bonus Question: Based on the above goal, along with Maya’s Present Levels of Educational Performance, calculate the Rate of Improvement (ROI) needed per week for Maya to reach her goal on time.

* Dependent upon ambitious, but feasible goal for individual student.

Summer 2009 *All data and identifying information presented are fictitious.

187

Summer 2009

Bonus Question: Based on the above goal, along with Maya’s Present Levels of Educational Performance, calculate the Rate of Improvement (ROI) needed per week for Maya to reach her goal on time.

*All data and identifying information presented are fictitious.

188

POSSIBLE ANSWER:

36 104* 5 4*

Bonus Question: Based on the above goal, along with Maya’s Present Levels of Educational Performance, calculate the Rate of Improvement (ROI) needed per week for Maya to reach her goal on time.

ANSWER: ~1.80 wrc/min per week

Summer 2009 *All data and identifying information presented are fictitious.

189

Goal Criteria and Method

Consider your goal criteria and method for Maya:

• What student-related factors were involved in deciding the goal?

• How did the “discrepancy” between current performance and expected performance influence your decision about the goal?

• What other factors were involved in your goal-setting decision, if any?

Summer 2009 *All data and identifying information presented are fictitious.

190

Summer 2009

Exercise 3:

Zachary Johnston

Grade 6 student

7d

*All data and identifying information presented are fictitious.

191

DIRECTIONS:

Task 1: Using the graph below, determine Zachary’s performance level for each median Survey Level Assessment score obtained (based on Fall Norms).

Task 2: Write down Zachary’s performance level (e.g., “Average,” “Below Average,” etc.) in the table above.

Summer 2009

For this case, assume:
1. The Survey Level Assessment (SLA) was obtained in the Fall.
2. The goal date will be in 36 weeks (end of the school year).

*All data and identifying information presented are fictitious.

192

ANSWERS:

Task 1: Using the graph below, determine Zachary’s performance level for each median Survey Level Assessment score obtained (based on Fall Norms).

Task 2: Write down Zachary’s performance level (e.g., “Average,” “Below Average,” etc.) in the table above.

Summer 2009

Zachary Johnston, a sixth grade student

Performance levels: Well Below Average; Well Below Average; Below Average; Average

*All data and identifying information presented are fictitious.

193

Summer 2009 *All data and identifying information presented are fictitious.

194

ANSWER

Summer 2009 *All data and identifying information presented are fictitious.

195

Summer 2009 *All data and identifying information presented are fictitious.

196

Summer 2009

ANSWERS:

49 6 3 8 10 88 3rd

*All data and identifying information presented are fictitious.

197

Summer 2009 *All data and identifying information presented are fictitious.

198

Set goal using Norm-referenced method

Summer 2009 *All data and identifying information presented are fictitious.

199

POSSIBLE ANSWER: Norm-referenced method

36 5* 114* 5

* Dependent upon ambitious, but feasible goal for individual student.

Summer 2009 *All data and identifying information presented are fictitious.

200

Set goal using ROI method

Summer 2009 *All data and identifying information presented are fictitious.

201

Use the following charts as needed to determine Zachary’s goal using the ROI method.

Summer 2009 202

0.8 wrc/min to ~1.4 wrc/min (0.7 × 2) is the approximate expected weekly growth that we can use to formulate our goal: 1.4 × 36 weeks = 50.4.

Thus: 50 (rounded) + 54 = 104. GOAL: 104 wrc/min

Summer 2009 203

POSSIBLE ANSWER: ROI method

36 5* 104* 5

* Dependent upon ambitious, but feasible goal for individual student.

Summer 2009 *All data and identifying information presented are fictitious.

204

Goal Criteria and Method

Consider your goal criteria and method for Zachary:

• What student-related factors were involved in deciding the goal?

• How did the “discrepancy” between current performance and expected performance influence your decision about the goal?

• What other factors were involved in your goal-setting decision, if any?

Summer 2009 *All data and identifying information presented are fictitious.

205

Summer 2009 Exercise 4:

Alyssia Erickson

Grade 4 student

7e

*All data and identifying information presented are fictitious.

206

DIRECTIONS:

Task 1: Using the graph below, determine Alyssia’s performance level for each median Survey Level Assessment score obtained (based on Winter Norms, per the answer slide; the goal date is 18 weeks out).

Task 2: Write down Alyssia’s performance level (e.g., “Average,” “Below Average,” etc.) in the table above.

Summer 2009 *All data and identifying information presented are fictitious.

207

ANSWERS:

Task 1: Using the graph below, determine Alyssia’s performance level for each median Survey Level Assessment score obtained (based on WINTER Norms).

Task 2: Write down Alyssia’s performance level (e.g., “Average,” “Below Average,” etc.) in the table above.

Summer 2009

Alyssia Erickson, a fourth grade student

Performance levels: Well Below Average; Below Average; Average

*All data and identifying information presented are fictitious.

208

Summer 2009 *All data and identifying information presented are fictitious.

209

ANSWER

Summer 2009 *All data and identifying information presented are fictitious.

210

Summer 2009 *All data and identifying information presented are fictitious.

211

Summer 2009

ANSWERS:

72 4 8 3 7 83 3rd

*All data and identifying information presented are fictitious.

212

6.

Summer 2009 *All data and identifying information presented are fictitious.

213

6. POSSIBLE ANSWER: Norm-referenced goal:

18 4* 100* 5

* Dependent upon ambitious, but feasible goal for individual student.

Summer 2009 *All data and identifying information presented are fictitious.

214

Use the following charts as needed to determine Alyssia’s goal using the ROI method.

Summer 2009 215

1.1 wrc/min to ~1.6 wrc/min (0.8 × 2) is the approximate expected weekly growth that we can use to formulate our goal: 1.6 × 18 weeks = 28.8.

Thus: 29 (rounded) + 72 = 101. GOAL: 101 wrc/min

Summer 2009 216

6. POSSIBLE GOAL: ROI Method

18 4 101 5

Summer 2009

*All data and identifying information presented are fictitious.

217

Goal Criteria and Method

Consider your goal criteria and method for Alyssia:

• What student-related factors were involved in deciding the goal?

• How did the “discrepancy” between current performance and expected performance influence your decision about the goal?

• What other factors were involved in your goal-setting decision, if any?

Summer 2009 *All data and identifying information presented are fictitious.

218

Support Options: A Review

Downloads Tab
• User Guides/Software Guides
• Admin/Scoring Manuals

Help Button

Support
• Phone: 866-313-6194
• Email: [email protected]

Summer 2009 219

SURVEY

www.aimsweb.com/training/survey

• Locate your Certified AIMSweb® Trainer’s name
• Enter the password provided to you by your trainer
• Complete the survey

Summer 2009 220

Summer 2009

Appendix

Supplemental materials for training (optional)

221

AIMSweb National Aggregate Norm Table: Grades 1–8

Grades 1–5 (Grades 6–8 continued on next slide)

Summer 2009

222

AIMSweb National Aggregate Norm Table

Grades 6–8 (Grades 1–5 continued on previous slide)

Summer 2009 *All data and identifying information presented are fictitious.

223

The End

AIMSweb® Progress Monitor Training: Summer 2009

Copyright © 2009 Pearson Education, Inc. or its affiliates. All rights reserved.

Summer 2009 224