Transcript Slide 1

Linking DIBELS Data to Differentiated
Instructional Support Plans
32nd Annual COSA Seaside
Conference
June 23, 2006
Hank Fien, Ph.D.
Center for Teaching and Learning
University of Oregon
[email protected]
1
Content Development
Content developed by:
Edward J. Kame’enui, Ph.D., University of Oregon
Deborah C. Simmons, Ph.D., Texas A&M University
Beth Harn, Ph.D., University of Oregon
Sarah McDonagh, Ph.D., University of Oregon
Hank Fien, Ph.D., University of Oregon
Prepared by:
Patrick Kennedy-Paine, University of Oregon
Katie Tate, University of Oregon
2
School-wide Reading Model
Foundational Features: Translating
Research into Practice
3
Focusing Comments
• “Never eat more than you can lift.” (Miss
Piggy, circa 1979)
4
Reading Assessment for
Different Purposes
An effective, comprehensive
reading program includes reading
assessments for four purposes:
– Screening
– Diagnostic
– Progress Monitoring
– Outcomes or Program Evaluation
5
Relation of DIBELS to
Purposes of Assessment
• Utility of DIBELS by purpose of assessment:
– Screening: Yes
– Progress Monitoring: Yes
– Diagnostic: Limited
– Outcome: Selected measures
6
DIBELS™ Assess the Big Ideas
Big Idea of Literacy / DIBELS™ Measure:
– Phonemic Awareness: Initial Sound Fluency; Phoneme Segmentation Fluency
– Alphabetic Principle: Nonsense Word Fluency
– Accuracy and Fluency with Connected Text: Oral Reading Fluency
– Comprehension: At least through grade 3, a combination of Oral Reading Fluency & Retell Fluency
– Vocabulary and Oral Language: Word Use Fluency
7
Using Data in an Outcomes-Driven Model: Decision-Making Steps
1. Identify Goals for Expected Performance
2. Identify and Validate Level of Support Needed to Meet Expected Performance Goals
3. Plan and Implement Level of Support
4. Evaluate and, if necessary, Modify Support Plan
5. Review Outcomes
8
ODM Step / Question(s) / Data:

1. Identify Need
Question: Are there students who may need support? How many? Which students?
Data: Benchmark data (histograms, box plots, Class List Report)

2. Validate Need
Question: Are we confident that the identified students need support?
Data: Benchmark data and additional information (repeat the assessment, use additional data, knowledge of/information about the student)

3. Plan Support
Question: What level of support for which students? How to group students? What goals, specific skills, curriculum/program, and instructional strategies?
Data: Benchmark data and additional information (individual student booklets, additional diagnostic information, knowledge of/information about the student)

4. Evaluate Support
Question: Is the support effective for individual students?
Data: Progress monitoring data (individual student progress graphs, class progress graphs)

5. Evaluate Outcomes
Question: As a school/district, how effective is our core (benchmark) support? How effective is our supplemental (strategic) support? How effective is our intervention (intensive) support?
Data: Benchmark data (histograms, Cross-Year Box Plots, Summary of Effectiveness Reports)
9
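The step/question/data table above is essentially a lookup from decision step to the reports that inform it. As a minimal sketch (in Python, which the presentation does not use), it could be encoded like this; the structure and abbreviated report lists are illustrative:

```python
# A minimal sketch (ours, not from the presentation) of slide 9's
# step/question/data table as a lookup from Outcomes-Driven Model step
# to the DIBELS reports that inform it. Report lists are abbreviated.
ODM_REPORTS = {
    "1. Identify Need": ["Histograms", "Box plots", "Class List Report"],
    "2. Validate Need": ["Benchmark data", "Repeated assessment",
                         "Knowledge of/information about the student"],
    "3. Plan Support": ["Individual student booklets",
                        "Additional diagnostic information"],
    "4. Evaluate Support": ["Individual student progress graphs",
                            "Class progress graphs"],
    "5. Evaluate Outcomes": ["Histograms", "Cross-Year Box Plots",
                             "Summary of Effectiveness Reports"],
}

for step, reports in ODM_REPORTS.items():
    print(f"{step}: {', '.join(reports)}")
```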
Getting Reports
from DIBELS™
Data System
From DIBELS Data System, University of Oregon, 2000-2005
10
Identify Goals for Expected
Performance:
Primary Goal: All children reading at grade-level
by the end of third grade
11
Identify Goals for Expected Performance:
Primary Goal: All children reading at grade-level by
the end of third grade
Measure / How Much? / By When?
– Initial Sound Fluency: 25 or more by the middle of kindergarten
– Phoneme Segmentation Fluency: 35 or more by the end of kindergarten
– Nonsense Word Fluency: 25 or more by the end of kindergarten; 50 or more by the middle of 1st grade
– Oral Reading Fluency: 40 or more by the end of 1st grade; 90 or more by the end of 2nd grade; 110 or more by the end of 3rd grade
12
DIBELS™ Benchmark Goals by
Grade
• Kindergarten
– Initial Sounds: 25 by winter
– Phoneme Segmentation: 35 by spring
– Nonsense Words: 25 by spring
• First Grade
– Nonsense Words: 50 by winter
– Oral Reading: 40 by spring
• Second Grade
– Oral Reading: 90 by spring
• Third Grade
– Oral Reading: 110 by spring
• Fourth Grade
– Oral Reading: 118 by spring
• Fifth Grade
– Oral Reading: 124 by spring
• Sixth Grade
– Oral Reading: 125 by spring
13
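Slides 12-13 define the benchmark goals as simple cutoff scores. Here is a minimal sketch of how those cutoffs could be encoded and checked, assuming the standard DIBELS measure abbreviations (ISF, PSF, NWF, ORF); the code itself is ours, not part of the DIBELS Data System:

```python
# A minimal sketch of the benchmark goals on slides 12-13 as cutoff
# scores, keyed by (measure, grade, period). The measure abbreviations
# (ISF, PSF, NWF, ORF) are standard DIBELS shorthand; the code is ours.
BENCHMARK_GOALS = {
    ("ISF", "K", "winter"): 25,   # Initial Sound Fluency
    ("PSF", "K", "spring"): 35,   # Phoneme Segmentation Fluency
    ("NWF", "K", "spring"): 25,   # Nonsense Word Fluency
    ("NWF", "1", "winter"): 50,
    ("ORF", "1", "spring"): 40,   # Oral Reading Fluency
    ("ORF", "2", "spring"): 90,
    ("ORF", "3", "spring"): 110,
    ("ORF", "4", "spring"): 118,
    ("ORF", "5", "spring"): 124,
    ("ORF", "6", "spring"): 125,
}

def meets_goal(measure: str, grade: str, period: str, score: float) -> bool:
    """True if the score meets or exceeds the benchmark goal."""
    return score >= BENCHMARK_GOALS[(measure, grade, period)]

# A first grader reading 38 correct words per minute in spring:
print(meets_goal("ORF", "1", "spring", 38))  # False -- below the goal of 40
```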
14
Identify and Validate Level of Support Needed to Meet Expected Performance Goals
15
Identify and Validate Level of Support Needed to Meet Expected Performance Goals
• Student Level: What level of instructional support will students need to meet expected reading goals?
Data Source: Grade List/Class List Report
• Grade Level: What percent of our students are going to need additional support to meet expected reading goals?
Data Source: Distribution Report (by class)
• School Level: What level of instructional support will grade levels need to meet expected reading goals? Are there certain grade levels that may need more support than other grade levels?
Data Source: Distribution Report (by class)
• District Level: What level of instructional support will schools need to meet expected reading goals? Are there certain schools that may need more support than other schools?
Data Source: Distribution Report (by school)
16
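The grade-, school-, and district-level questions above all reduce to the same computation: the percent of students in each group who will need additional support. A hypothetical sketch; the student records and field names are invented for illustration:

```python
# A hypothetical sketch of the percent-needing-support question at any
# level of aggregation. The student records and field names are invented;
# "needs support" is read as a strategic or intensive recommendation.
from collections import defaultdict

students = [
    {"school": "Adams", "grade": "1", "recommendation": "benchmark"},
    {"school": "Adams", "grade": "1", "recommendation": "strategic"},
    {"school": "Adams", "grade": "1", "recommendation": "intensive"},
    {"school": "Adams", "grade": "2", "recommendation": "benchmark"},
]

def percent_needing_support(records, level):
    """Percent of students per group (grade, school, ...) flagged for
    additional support."""
    totals, flagged = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[level]] += 1
        if r["recommendation"] in ("strategic", "intensive"):
            flagged[r[level]] += 1
    return {k: round(100 * flagged[k] / totals[k], 1) for k in totals}

print(percent_needing_support(students, "grade"))   # {'1': 66.7, '2': 0.0}
print(percent_needing_support(students, "school"))  # {'Adams': 50.0}
```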
Grade Level: What percent of our students are going to need additional support to meet expected reading goals?
DIBELS Reports that answer this
question
– Histogram Reports
– Distribution Reports by Class
17
18
What level of instructional support
will students need to meet
expected reading goals?
Class List Report
• The Class List and Grade List reports provide
information on individual students at a given assessment
period. The Class List report includes all the students
from one class.
• The Class List Report shows:
– The raw scores of each student's performance on
each measure.
– The status category (i.e., at risk, some risk, low risk; or deficit, emerging, established) for the student’s score on each measure.
– Percentile ranks for the student’s score on each
measure to show the student's performance in
relation to all participating students in the district.
– Instructional recommendations based on a summary
of each student's performance on all of the measures.
19
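One way to picture the percentile-rank column is as the percent of district scores that fall below the student's score. A minimal sketch under that assumption; the actual DIBELS Data System calculation may differ:

```python
# A minimal sketch of the percentile-rank idea: the percent of district
# scores falling strictly below the student's score. This formula is an
# assumption; the DIBELS Data System's exact calculation may differ.
from bisect import bisect_left

def percentile_rank(score, district_scores):
    ordered = sorted(district_scores)
    below = bisect_left(ordered, score)   # count of scores strictly below
    return 100.0 * below / len(ordered)

district = [12, 25, 31, 40, 44, 52, 60, 71, 85, 90]  # invented scores
print(percentile_rank(44, district))  # 40.0 -- four of ten scores are lower
```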
Fall Grade-Level Team Meetings: (Identification) What
level of support will students need to meet winter
benchmark goals?
20
Plan and Implement Levels of
Instructional Support
21
Plan Support
• What will benchmark support and instruction
look like?
• What will strategic support and instruction look
like?
• What will intensive support and instruction look
like?
– What SBRR programs will we use?
– What SBRR strategies will we use?
– Who will teach each group?
– What will the group size be?
– How often will we monitor progress?
– How often will we discuss student progress in grade-level team meetings?
22
Three Tier Model of Primary,
Secondary, and Tertiary Prevention
Tertiary Prevention: students with severe, sustained learning difficulty (5%)
– Progress Monitoring: 2-4x per month
– In-Program Assessments
– Diagnostic Assessment
– Screening & Outcome Assessment
Secondary Prevention: students at some risk or who make adequate progress with additional intervention (15%)
– Progress Monitoring: Monthly
– In-Program Assessments
– Screening & Outcome Assessment
Primary Prevention: students at low risk or who make adequate progress with modest support (80%)
– Progress Monitoring: Each term
– In-Program Assessments
– Screening & Outcome Assessment
Note. Adapted from Walker, H. M., Horner, R. H., Sugai, G., Bullis, M., Sprague, J. R., Bricker, D., & Kaufman, M. J. (1996). Integrated approaches to preventing antisocial behavior patterns among school-age children and youth. Journal of Emotional and Behavioral Disorders, 4, 194-209.
McDonagh © 2004
23
Three Tier Model of Prevention
and Intervention
Tier / DIBELS Instructional Recommendation / Instructional Placement / Assessment Plan:

Tertiary
– Recommendation: Intensive / At Risk / Deficit
– Placement: Part Core + Replacement Program
– Assessment Plan: Progress Monitoring 2-4x per month; In-Program Assessments; Diagnostic Assessment; Screening & Outcome Assessment

Secondary
– Recommendation: Strategic / Some Risk / Emerging
– Placement: Core Reading Program + Supplement
– Assessment Plan: Progress Monitoring 2-4x per month; In-Program Assessments; Screening & Outcome Assessment

Primary
– Recommendation: Benchmark / Low Risk / Established
– Placement: Core Reading Program
– Assessment Plan: Progress Monitoring each term; In-Program Assessments; Screening & Outcome Assessment

McDonagh © 2004
24
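The slide-24 table can be read as a lookup from a student's DIBELS instructional recommendation to tier, placement, and assessment plan. A sketch with illustrative field names; the content mirrors the slide:

```python
# A sketch of the slide-24 table as a lookup from a student's DIBELS
# instructional recommendation to tier, placement, and assessment plan.
# Field names are ours; the content mirrors the slide.
THREE_TIER_PLAN = {
    "intensive": {
        "tier": "Tertiary",
        "placement": "Part core + replacement program",
        "assessment": ["Progress monitoring 2-4x/month",
                       "In-program assessments", "Diagnostic assessment",
                       "Screening & outcome assessment"],
    },
    "strategic": {
        "tier": "Secondary",
        "placement": "Core reading program + supplement",
        "assessment": ["Progress monitoring 2-4x/month",
                       "In-program assessments",
                       "Screening & outcome assessment"],
    },
    "benchmark": {
        "tier": "Primary",
        "placement": "Core reading program",
        "assessment": ["Progress monitoring each term",
                       "In-program assessments",
                       "Screening & outcome assessment"],
    },
}

print(THREE_TIER_PLAN["strategic"]["placement"])
```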
Additional Information on Programs:
http://oregonreadingfirst.uoregon.edu
25
Three Levels of Instruction and Support:
Summary of CSI Map
The CSI map is a planning template completed for a given time period (here, fall to winter) and for each instructional recommendation (benchmark: participation in core; strategic; intensive). For each group, the map records:
– Core instruction: Who, When, Group Size
– Supplemental and Intervention Programs/Strategies
– Supplemental and Intervention Program Delivery: Who, When, Group Size, and whether delivery occurs within or outside the 90-minute reading block
– Frequency of DIBELS Progress Monitoring: Who, How Often
– Determining Instructional Effectiveness: Activities, Time, Criteria
26
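A hypothetical data structure for one row of the CSI map template; field names follow the template's labels, and the example values are invented:

```python
# A hypothetical data structure for one row of the CSI map template;
# field names follow the template's labels, example values are invented.
from dataclasses import dataclass

@dataclass
class CSIMapRow:
    recommendation: str               # benchmark, strategic, or intensive
    core_who: str = ""
    core_when: str = ""
    programs_strategies: str = ""     # supplemental/intervention programs
    delivery_who: str = ""
    delivery_when: str = ""
    within_90_min_block: bool = True  # vs. outside the 90-minute block
    group_size: int = 0
    monitoring_frequency: str = ""    # frequency of DIBELS progress monitoring
    effectiveness_criteria: str = ""  # activities, time, criteria

row = CSIMapRow(recommendation="strategic", group_size=5,
                monitoring_frequency="1x/month",
                effectiveness_criteria="on track for winter NWF goal of 50")
print(row.recommendation, row.monitoring_frequency)
```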
Evaluate and, if necessary, Modify
Support Plan
27
Evaluate and, if necessary, Modify Support Plan
Student Level:
• Did the student make adequate progress towards the winter benchmark goal? Is the student responding well to the intervention?
• Data Source: Progress Monitoring Report, CSI map, Student Intervention Profile
• Time Period: Ongoing between fall and winter benchmarking

Grade Level:
• Did all of the students in second grade make adequate progress towards the winter benchmark goal? Did the second-grade instructional support plans adequately support benchmark, strategic, and intensive students? If not, do we need to modify parts of the plan?
• Data Source: Fall to Winter Summary of Effectiveness Report (by grade), Fall to Winter CSI map, Winter to Spring CSI map
• Time Period: Immediately following winter benchmarking

Grade Level:
• How well is our ELL (SPED) population achieving compared to our non-ELL (non-SPED) population?
• Data Source: Distribution Report (by subgroups), CSI map
• Time Period: Immediately following winter benchmarking
28
DIBELS Summary of Effectiveness Reports
4 Ways to Achieve Adequate Progress
Adequate progress is one of four movements from Time 1 (e.g., fall) to Time 2 (e.g., winter):
1. Intensive (At-Risk at Time 1) → Some Risk at Time 2
2. Intensive (At-Risk at Time 1) → Low Risk at Time 2
3. Strategic (Some Risk at Time 1) → Low Risk at Time 2
4. Benchmark (Low Risk at Time 1) → Low Risk at Time 2
29
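The four ways to achieve adequate progress amount to a small set of allowed (Time 1 group, Time 2 status) transitions. A minimal sketch of that rule; the encoding is ours:

```python
# A minimal sketch of the four adequate-progress transitions above as a
# set of allowed (Time 1 group, Time 2 status) pairs; the encoding is ours.
ADEQUATE_TRANSITIONS = {
    ("intensive", "some risk"),   # 1. intensive moves up to some risk
    ("intensive", "low risk"),    # 2. intensive moves up to low risk
    ("strategic", "low risk"),    # 3. strategic moves up to low risk
    ("benchmark", "low risk"),    # 4. benchmark stays at low risk
}

def made_adequate_progress(time1_group, time2_status):
    return (time1_group, time2_status) in ADEQUATE_TRANSITIONS

print(made_adequate_progress("strategic", "some risk"))  # False
print(made_adequate_progress("intensive", "some risk"))  # True
```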
Winter Grade Level Team Meeting
What is the total percent of students that made
adequate progress towards the winter benchmark
goals?
Adams Elem
50% of the first grade students made adequate progress towards the winter DIBELS benchmark goal of 50 cspm (correct sounds per minute) on the NWF measure.
30
Winter Grade Level Team Meeting
What is the total percent of students that made
adequate progress towards the winter benchmark
goals?
Adams Elem
Benchmark Students: 65% made adequate progress towards the winter NWF
benchmark goal
Strategic Students: 31% made adequate progress towards the winter NWF
benchmark goal
Intensive Students: 53% made adequate progress towards the winter NWF
benchmark goal
31
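Percentages like those above come from applying the adequate-progress rule to each group separately. A hypothetical continuation of the sketch after slide 29 (it reuses made_adequate_progress from there); the sample data is invented:

```python
# A hypothetical continuation of the sketch after slide 29 (reuses
# made_adequate_progress from there) computing a Summary of Effectiveness
# from (fall group, winter status) pairs; the sample data is invented.
def summary_of_effectiveness(records):
    out = {}
    for group in ("benchmark", "strategic", "intensive"):
        subset = [r for r in records if r[0] == group]
        if subset:
            ok = sum(made_adequate_progress(g, s) for g, s in subset)
            out[group] = round(100 * ok / len(subset))
    return out

data = [("benchmark", "low risk"), ("benchmark", "some risk"),
        ("strategic", "low risk"), ("intensive", "some risk")]
print(summary_of_effectiveness(data))
# {'benchmark': 50, 'strategic': 100, 'intensive': 100}
```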
Winter Grade Level Team Meeting:
Which Kindergarten students are making
adequate progress towards winter DIBELS
benchmark goals?
32
Modifying Instructional Support at the Systems Level: Achieving a Healthy System
Were grade-level instructional maps effective in supporting adequate progress for students with benchmark, strategic, and intensive needs?
If not, what do we do about it?
33
Seven Elements To Evaluate:
I. Goals, Objectives, Priorities
II. Assessment
III. Instructional Programs and Materials
IV. Instructional Time
V. Differentiated Instruction, Grouping, Scheduling
VI. Administration, Organization, Communication
VII. Professional Development
34
Oregon Reading First -Schoolwide Beginning Reading Model
Elements of a Healthy System Checklist
School:
Grade:
Level of Support:
I. GOALS, OBJECTIVES, PRIORITIES
Were content-coverage goals and pacing guides for programs established so sufficient lessons/units would be mastered and children make adequate progress?
II. ASSESSMENT
Are DIBELS progress monitoring assessments administered frequently for students below grade level?
Are in-program assessments administered regularly?
Did grade-level teams regularly analyze student reading data (DIBELS and in-program assessments), plan instruction based on data, and regroup students based on the data?
III. INSTRUCTIONAL PROGRAMS AND MATERIALS
Are appropriate reading programs and materials being used to teach the full range of students (e.g., intervention programs in place for students significantly below grade level)?*
Are all necessary materials available in each classroom? For each small group?*
Have the grade-level teams worked together to systematically enhance the program as necessary (e.g., make instruction more systematic and explicit)?
Is the program implemented with fidelity? Are efforts to improve fidelity working?
IV. INSTRUCTIONAL TIME
Is a sufficient amount of time allocated (i.e., a 90-minute reading block with a minimum of 30 minutes of small-group teacher-directed reading instruction daily)?* Are teachers following the schedule?
Is additional instructional time scheduled for students who are struggling?*
Are important activities taught/stressed (e.g., red checks, targets, etc.)?
Are students spending an appropriate amount of time on independent activities? Are the independent activities directly linked to the reading instruction?
Are students meeting projections for lesson progress?
Are students being accelerated whenever possible to bring them closer to grade-level performance?
V. DIFFERENTIATED INSTRUCTION/GROUPING/SCHEDULING
Are students grouped homogeneously by performance level?*
Are students grouped based on program recommendations?*
Are group sizes for large- and small-group activities appropriate?*
Are cross-class and cross-grade grouping used when appropriate to maximize learning opportunities?
VI. ADMINISTRATION/ORGANIZATION/COMMUNICATION
Is a sufficient number of staff allocated?*
Have staff been assigned in a way such that reading instruction can be delivered to the full range of students each day?*
Are students participating in a reasonable number of programs so as to have an aligned, coherent program without conflicting information being presented?
Are Title and Special Education services coordinated with and complementary to general education reading instruction?
VII. PROFESSIONAL DEVELOPMENT
Is ongoing, high-quality training provided (i.e., staff received professional development on programs used in classrooms prior to implementation and at least twice after initial training)?
Are program-specific consultants brought in to observe in classrooms and provide ongoing support and training?
Are teachers receiving support from the RF coach in the classroom? Outside the classroom?
Are regular inservice sessions developed around implementation issues identified by the coach?
Do teachers have opportunities to observe model lessons from the coach? From peers? From other schools?
Are new teachers provided the necessary program training?
* = Structural element
35
Ongoing Grade Level Team Meetings: Did Robert make adequate progress towards the winter benchmark goal? Is Robert responding well to the intervention?
36
Evaluate and Modify at the Student Level
Evaluate Progress and Adjust Instruction
“What To Do When Students Aren’t Learning Enough”
STUDENT VARIABLES
1. Does the learner have a potential undiagnosed hearing or vision problem?
2. Is the learner frequently absent during reading instruction?
Decision Point: Do student variables potentially explain the learner’s lack of progress? YES / NO
If yes, specify a plan to address student factors:
– Check hearing and vision
– Develop a systematic plan with parents to increase attendance
– Other
OPPORTUNITIES TO LEARN
1. Student was present 95% or more of instructional days.
2. Instruction was delivered 5 days per week.
3. Small-group teacher-directed instruction was conducted a minimum of 30-45 minutes
daily.
4. Student had frequent opportunities to respond to tasks during teacher-directed
instruction.
Decision Point: Is Opportunity to Learn a potential factor explaining the learner’s lack of progress? YES / NO
If yes, specify a plan to increase Opportunity to Learn:
– Plan to increase attendance
– Add another instructional period daily
– Ensure instruction is provided daily
– Increase teacher-directed instruction (determine whether this is the appropriate group for the learner)
– Increase the number of opportunities for the learner to respond
37
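The Opportunity to Learn items are concrete, checkable criteria. A hypothetical encoding of the slide-37 decision point; the thresholds come from the checklist, while the record fields and the example student are invented:

```python
# A hypothetical encoding of the slide-37 Opportunity to Learn decision
# point. The thresholds (95% attendance, 5 days/week, 30+ minutes of
# small-group instruction) come from the checklist; the record fields
# and the example student are invented.
def opportunity_to_learn_ok(record):
    """True only if every Opportunity to Learn criterion is satisfied."""
    return (record["attendance_pct"] >= 95
            and record["instruction_days_per_week"] == 5
            and record["small_group_minutes_daily"] >= 30
            and record["frequent_response_opportunities"])

robert = {"attendance_pct": 91, "instruction_days_per_week": 5,
          "small_group_minutes_daily": 30,
          "frequent_response_opportunities": True}
# Attendance below 95% flags Opportunity to Learn as a potential factor.
print(opportunity_to_learn_ok(robert))  # False
```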
Review Outcomes
Key Questions:
• What percent of students are reaching end-of-year benchmark goals?
– Are we doing better over time?
• Did our levels of instructional support help benchmark, strategic, and intensive students meet end-of-year reading goals?
Data used to inform the decision:
• Histogram reports (compare to grade-level goals or the previous year’s histograms)
• Grade Level: Summary of Effectiveness Reports
38
Spring Grade Level Team Meeting
What is the total percent of students that made
adequate progress towards the spring benchmark
goal?
Adams Elem
90% (winter: 50%) of the first grade students made adequate progress towards the spring DIBELS benchmark goal of 40 cwpm (correct words per minute) on the ORF measure.
39
Spring Grade Level Team Meeting
What is the percent of benchmark, strategic and
intensive students that made adequate progress
towards the spring benchmark goal?
Adams Elem
Benchmark Students: 98% (winter: 65%) made adequate progress towards the spring ORF benchmark goal
Strategic Students: 71% (winter: 31%) made adequate progress towards the spring ORF benchmark goal
Intensive Students: 80% (winter: 53%) made adequate progress towards the spring ORF benchmark goal
40
41
42