Building Infrastructure - University of South Florida

Problem Analysis
A collaborative project between the Florida Department of Education and the University of South Florida
FloridaRtI.usf.edu
Advance Organizer
1) SBLT Data - Beliefs, Practices, Skills
2) Review of Problem Identification
3) Big Ideas/Concepts of Problem Analysis
4) Hypothesis/Prediction Statement
5) Assessment & Hypothesis Validation
6) Examples of Hypothesis Generation and Evaluation
SBLT Data
1. Did your building’s beliefs change from the first to the second administration? If yes, in what areas did the greatest change occur?
2. What do you think these changes mean in the context of implementing a PS/RtI model in your building?
3. What “practices” occurring in your building do you think are most consistent with the PS/RtI model, and which ones do you think might be a threat to the implementation of the model?
4. How consistent are the overall beliefs of your building with your building’s perceptions of the practices occurring? What does the level of consistency mean in terms of implementing a PS/RtI model in your building?
5. To what extent do you believe that your building possesses the skills to use school-based data to evaluate core (Tier 1) and supplemental (Tier 2) instruction? Based on what your building has learned about using data to make decisions, how consistent are the skills your building possesses with what you are doing in your building (i.e., to what degree does your building evaluate the effectiveness of core and supplemental instruction)?
Problem ID Review
Academic Systems
• Intensive, Individual Interventions (1-5%): individual students; assessment-based; high intensity; of longer duration
• Targeted Group Interventions (5-10%): some students (at-risk); high efficiency; rapid response
• Universal Interventions (80-90%): all students; preventive, proactive

Behavioral Systems
• Intensive, Individual Interventions (1-5%): individual students; assessment-based; intense, durable procedures
• Targeted Group Interventions (5-10%): some students (at-risk); high efficiency; rapid response
• Universal Interventions (80-90%): all settings, all students; preventive, proactive

(Horner & Sugai)
Problem ID Review
In order to identify a problem, you’ve got to start with three pieces of data:
• Benchmark level of performance
• Target Student level of performance
• Peer level of performance
Problem ID Review
Individual Student Data
[Line graph: Student, Peer, and Benchmark levels plotted over 12 time points (y-axis 0-140). Peers perform above the benchmark; the target student performs well below it.]
Problem ID Review
Individual Student Data
[Line graph: Student, Peer, and Benchmark levels plotted over 12 time points (y-axis 0-140). Both the peers and the target student perform below the benchmark.]
Problem ID Review
Individual Student Data
[Line graph: Student, Peer, and Benchmark levels plotted over 12 time points (y-axis 0-140). Both the peers and the target student perform below the benchmark.]
Problem ID Review
Building Level Data
[Graph: % Students at Benchmark over 12 time points (y-axis 0-100), with building-level performance plotted against the benchmark.]
Problem ID Review
Building Level Data
[Graph: % Students referred to office over 12 time points (y-axis 0-100), with building-level performance plotted against the benchmark.]
Data Required for Problem Identification
• Replacement Behavior
• Current Level of Functioning
• Benchmark/Desired Level
• Peer Performance
• GAP Analysis
Example: ORF
• Target Student’s Current Level of Performance: 40 WCPM
• Benchmark: 92 WCPM
• Peer Performance: 98 WCPM
• GAP Analysis:
 Benchmark/Target Student: 92/40 = 2+x difference → SIGNIFICANT GAP
 Benchmark/Peer: 92/98 = <1x difference → NO SIGNIFICANT GAP
• Is instruction effective? Yes, peer performance is at benchmark.
Problem ID
[Line graph illustrating the ORF example over 12 time points (y-axis 0-120).]
Example: Behavior
• Target Student’s Current Level of Performance: complies 35% of the time
• Benchmark (set by teacher): 75%
• Peer Performance: 40%
• GAP Analysis:
 Benchmark/Target Student: 75/35 = 2+x difference → SIGNIFICANT GAP
 Benchmark/Peer: 75/40 = 1.9x difference → SIGNIFICANT GAP
 Peer/Target Student: 40/35 = 1.1x difference → NO SIGNIFICANT GAP
• Is the behavior program effective? No, peers have a significant gap from benchmark as well.
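The GAP analysis arithmetic in the two examples above is simply a ratio of the higher level (benchmark or peer) to the lower level. A minimal Python sketch of the idea (the function names and the 1.5x cut point are illustrative assumptions only; teams set their own decision rules):

```python
def gap_ratio(benchmark: float, current: float) -> float:
    """How many times higher the benchmark is than the current level."""
    return benchmark / current

def significant(ratio: float, cut_point: float = 1.5) -> bool:
    # The cut point is a placeholder; the decision rule belongs to the team.
    return ratio >= cut_point

# ORF example: benchmark 92 WCPM, target student 40 WCPM, peers 98 WCPM
print(round(gap_ratio(92, 40), 1), significant(gap_ratio(92, 40)))  # 2.3 True
print(round(gap_ratio(92, 98), 1), significant(gap_ratio(92, 98)))  # 0.9 False

# Behavior example: benchmark 75%, target student 35%, peers 40%
print(round(gap_ratio(75, 35), 1), significant(gap_ratio(75, 35)))  # 2.1 True
print(round(gap_ratio(75, 40), 1), significant(gap_ratio(75, 40)))  # 1.9 True
print(round(gap_ratio(40, 35), 1), significant(gap_ratio(40, 35)))  # 1.1 False
```

The three comparisons (benchmark/target, benchmark/peer, peer/target) tell you whether the gap belongs to the student, the instruction, or both.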
Problem ID
[Line graph illustrating the behavior example over 12 time points (y-axis 0-100).]
Tier One Problem Identification
DIBELS ORF Data, Grades 1-5 (% of students at each risk level, by group and school year)

Risk Level       All Students       Economically Disadvantaged   SWD
                 05-06    06-07     05-06    06-07               05-06    06-07
High-Risk          6%       4%       13%       7%                 30%      27%
Moderate-Risk     11%      11%       21%      21%                 20%      33%
Low-Risk          83%      85%       66%      72%                 50%      40%
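One way to compare core-instruction effectiveness across the chart’s six group-year combinations is to rank them by the percentage of students at Low-Risk. A minimal Python sketch (the dictionary simply transcribes the Low-Risk percentages from the chart above):

```python
# % of students at Low-Risk, transcribed from the DIBELS ORF chart above;
# a higher Low-Risk percentage suggests more effective core instruction.
low_risk = {
    ("All Students", "2005-06"): 83,
    ("All Students", "2006-07"): 85,
    ("Economically Disadvantaged", "2005-06"): 66,
    ("Economically Disadvantaged", "2006-07"): 72,
    ("SWD", "2005-06"): 50,
    ("SWD", "2006-07"): 40,
}

# Rank the six group-year combinations from most to least effective.
ranking = sorted(low_risk.items(), key=lambda item: item[1], reverse=True)
for (group, year), pct in ranking:
    print(f"{pct:3d}%  {group} ({year})")
```

The same sorting works for any screening measure where “percent at benchmark” is the effectiveness proxy.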
Tier One Problem Identification
1. Rank from highest to lowest the groups and years for which core instruction is most effective. Be sure to include all 6 possibilities in your response.
2. Which group(s) of students should receive highest priority for monitoring while modifications to core instruction are being made? Justify your decision.
3. Which group(s) of students is most likely to be referred for additional intervention, regardless of any label they might have? Justify your decision.
4. Based on the data from the previous two school years, for which of the three groups of students depicted above, if any, will core instruction potentially be effective at the end of this school year (i.e., 2007-08)? Justify your decision.
5. Assume that modifications were made between the 05/06 and 06/07 school years for all groups of students at all levels of risk. Which group(s) of students at what level(s) of risk made the greatest improvement across the two years? Justify your decision.
Tier One Problem Identification
DIBELS ORF Data, Grades 1-5 (% of students at each risk level, by group and school year)

Risk Level       All Students       Economically Disadvantaged   SWD
                 05-06    06-07     05-06    06-07               05-06    06-07
High-Risk         20%      13%       19%      15%                 43%      38%
Moderate-Risk     16%      20%       22%      20%                 22%      34%
Low-Risk          64%      67%       59%      65%                 35%      28%
Tier One Problem Identification
1. Rank from highest to lowest the groups and years for which core instruction is most effective. Be sure to include all 6 possibilities in your response.
2. Which group(s) of students should receive highest priority for monitoring while modifications to core instruction are being made? Justify your decision.
3. Which group(s) of students is most likely to be referred for additional intervention, regardless of any label they might have? Justify your decision.
4. Based on the data from the previous two school years, for which of the three groups of students depicted above, if any, will core instruction potentially be effective at the end of this school year (i.e., 2007-08)? Justify your decision.
5. Assume that modifications were made between the 05/06 and 06/07 school years for all groups of students at all levels of risk. Which group(s) of students at what level(s) of risk made the greatest improvement across the two years? Justify your decision.
Tier One Problem
Identification Worksheet
Your project ID is:
• Last 4 digits of SS#
• Last 2 digits of year of birth
Steps in the Problem-Solving
Process
1. PROBLEM IDENTIFICATION
 • Identify replacement behavior
 • Data: current level of performance
 • Data: benchmark level(s)
 • Data: peer performance
 • Data: GAP analysis
2. PROBLEM ANALYSIS
 • Develop hypotheses
 • Develop predictions/assessment
3. INTERVENTION DEVELOPMENT
 • Develop interventions in those areas for which data are available and hypotheses verified
 • Proximal/distal
 • Implementation support
4. RESPONSE TO INTERVENTION (RtI)
 • Frequently collected data
 • Type of response: good, questionable, poor
Steps in the Problem-Solving
Process: Problem Analysis
2. PROBLEM ANALYSIS
• Develop hypotheses
• Develop predictions/assessment
Problem Analysis in Context
Identify the Problem → Analyze the Problem → Design Intervention → Implement Intervention → Monitor Progress → Evaluate Intervention Effectiveness (☺ / ☹), along a timeline
The Role of Assessment
in Problem Analysis
Completing Problem Analysis activities will
enable the team to answer:
• Why is there a difference between what is
expected and what is observed? That is, why
is the replacement behavior not occurring?
What is the most likely reason?
• How do you target the intervention that
would have the highest probability of being
successful?
Purpose of Assessment in
Problem Analysis
• Assessment should link to instruction for the
purpose of designing an educational intervention
• The focus should be on collecting information
that will lead us to decisions about:
what to teach (curriculum)
and
how to teach (instruction)
Purpose of Assessment in
Problem Analysis
• Focus only on gathering information that is directly linked to the defined problem and that will guide you to answering the question “Why is this problem occurring?”
• Do not collect information for the sake of collecting information.
• Do not collect what you already have.
REMEMBER: Our assessment must focus on gathering
information that will DIRECTLY impact student gains
in their classroom environment.
Determining What Data to
Collect
Educationally Relevant & Alterable
• Known information → Gather this existing information (classroom DIBELS data, ODRs)
• Unknown information → Conduct assessments to gather this information (behavior observations, specific skill assessments). These are assessment questions.

Less Educationally Relevant & Unalterable
• Known information → Disregarded or low priority (height, eye color)
• Unknown information → Don’t go here! (Cognitive processing?)
Here’s what we’re gonna do
• Look at the information we have
• Gather some more we want, but don’t have
• Make a few educated guesses (Why is the
replacement behavior not occurring?)
• If needed, gather more information to fine
tune
• Decide on the most likely reason(s) why.
Steps in Problem Analysis
• Fact finding
• Generate ideas about possible causes (hypotheses)
• Sort out which possible causes seem most viable and which don’t (validation)
• Link the things we’ve learned to intervention
Generate Hypotheses
Evidenced-Based Knowledge
of Content
+
Specific Knowledge of
Current Problem
=
Good Hypotheses
Domains for Hypotheses
HYPOTHESIS DOMAINS, with examples:
I (Instruction): frequency of interaction, reinforcement, presentation style
C (Curriculum): difficulty, presentation, length, format, relevance
E (Environment): peers (expectations, reinforcement, values, support); classroom (rules, distractions, seating, schedule, physical plant); home/family support
L (Learner): skills, motivation, health
Generate Hypotheses
Developing Assumed Causes
Developing evidence-based
statements about WHY a problem
is occurring.
Generate Hypotheses
Hypotheses…
• Are developed to determine reasons why the replacement behavior is not occurring
• Should be based on research relevant to the target skills
• Focus on alterable variables
• Should be specific, observable, and measurable
• Should lead to intervention
Generate Hypotheses
Hypotheses…
• Must consider both SKILL and PERFORMANCE deficits:
 Skill Deficit: the student does not have the skills to perform the task
  • Student lacks fluency skill for grade level
  • Student lacks private speech for self-control
 Performance Deficit: the student does not perform an existing skill, or performs it at a lower level
  • Student reads slowly because of fear of ridicule by peers for mistakes
  • Peers reinforce bad choices more than the teacher reinforces good choices
Writing A Hypothesis Statement
(What are possible causes?)

Identify known information about the identified problem.
Do you have enough information to identify possible causes?
 • NO → Gather unknown information with additional RIOT procedures.
 • YES → Discard irrelevant information, then make a hypothesis and prediction:
  The problem is occurring because _________.
  If ____________ would occur, then the problem would be reduced.
Hypothesis / Prediction
Statement
The Problem is occurring because
_________________________________.
If ___________________ would occur, then
the problem would be reduced.
Prediction Statements
• Developed to INFORM ASSESSMENT and
decision-making for hypotheses
• The purpose is to make explicit what we would
expect to see happen if:
 The hypothesis is valid and
 We intervened successfully to reduce or remove the
barrier to learning
• Written in if/then or when/then form
• Used to develop assessment questions to help
validate/not validate hypotheses
Hypotheses Validation
Why do Problem Solving Teams need to
Validate a Hypothesis?
If the hypothesis is inaccurate and the wrong intervention is implemented, valuable time could be wasted on an intervention that is not an appropriate instructional match for the student.
Assessment
Problem Analysis is the process of
gathering information in the domains
of instruction, curriculum,
environment and the learner (ICEL)
through the use of reviews,
interviews, observations, and tests
(RIOT) in order to evaluate the
underlying causes of the problem.
That is, to validate hypotheses.
Assessment
How Do We Validate Hypotheses?
• Review
• Interview
• Observe
• Test
Assessment procedures that are used:
R: Review
I: Interview
O: Observe
T: Test

Assessment domains are not limited to the student:
I: Instruction
C: Curriculum
E: Environment
L: Learner
Content Of Assessment
Domains
INSTRUCTION
• instructional decision-making regarding selection and use of materials, placement of students in materials
• frequency of interaction/reinforcement
• clarity of instructions
• communication of expectations and criteria for success (behavioral and academic)
• direct instruction with explanations and criteria for success (behavioral and academic)
• sequencing of lesson designs to promote success
• variety of practice activities (behavioral and academic)
Content Of Assessment
Domains
CURRICULUM
• long-range direction for instruction
• instructional materials
• intent
• arrangement of the content/instruction
• pace of the steps leading to the outcomes
• stated outcomes for the course of study
• general learner criteria as identified in the school improvement plan and state benchmarks (behavioral and academic)
Content of Assessment
Domains
ENVIRONMENT
• physical arrangement of the room
• furniture/equipment
• clear classroom expectations
• management plans
• peer interaction, expectations, reinforcement, support
• schedule
• task pressure
• home/family supports
Content Of Assessment
Domains
LEARNER
• skills
• motivation
• health
• prior knowledge
Domains for Assessment
Assessment domains (ICEL) by assessment procedures (RIOT):

I (Instruction)
• Review: permanent products, e.g., written pieces, tests, worksheets, projects
• Interview: teachers’ thoughts about their use of effective teaching and evaluation practices, e.g., checklists
• Observe: effective teaching practices, teacher expectations, antecedent conditions, consequences
• Test: classroom environment scales, checklists, and questionnaires; student opinions about instruction and the teacher

C (Curriculum)
• Review: permanent products, e.g., books, worksheets, materials, curriculum guides, scope & sequence
• Interview: teacher & relevant personnel regarding philosophy (e.g., generative vs. supplantive), district implementation and expectations
• Observe: classroom work, alignment of assignments (curriculum materials) with goals and objectives (curriculum); alignment of teacher talk with curriculum
• Test: level of assignment and curriculum material difficulty; opportunity to learn; a student’s opinions about what is taught

E (Environment)
• Review: school rules and policies
• Interview: ask relevant personnel, students & parents about behavior management plans, class rules, class routines
• Observe: student, peers, and instruction; interactions and causal relationships; distractions and health/safety violations
• Test: classroom environment scales, checklists, and questionnaires; student opinions about instruction, peers, and the teacher

L (Learner)
• Review: district records, health records, error analysis; records for educational history, onset & duration of problem, teacher perceptions of the problem, pattern of behavior problems, etc.
• Interview: relevant personnel, parents, peers & students (what do they think they are supposed to do? how do they perceive the problem?)
• Observe: target behaviors, i.e., dimensions and nature of the problem
• Test: student performance; find the discrepancy between setting demands (instruction, curriculum, environment) and student performance
Format for Hypothesis Validation
Hypothesis: Mary is noncompliant because she does not have the skills to complete the work successfully.
Prediction: If we reduce the academic demand or improve her skills, Mary will become more compliant.
Assessment Question(s): Is task difficulty appropriate for Mary’s skill level?
Where are the answers?: Review Learner records for evidence of skills; review Curriculum to understand expectation.
Answers: Review of records and review of curriculum indicates that Mary has the skills to complete the requested tasks.
Validated?: No
Format for Hypothesis Validation
Hypothesis: (a) Mary is not being positively reinforced for compliant behavior. (b) Mary is being reinforced for noncompliant behavior.
Prediction: If Mary is positively reinforced for compliant behavior / not reinforced for noncompliant behavior, her compliance will increase.
Assessment Question(s): Is Mary being positively reinforced for compliant behavior? Is Mary being reinforced for noncompliant behavior?
Where are the answers?: Observe the Environment in the situations where Mary displays noncompliance and compliance.
Answers: Observations indicate that Mary is not being consistently reinforced for compliance in large group settings outside of the homeroom, but is being consistently reinforced within the homeroom, where she displays compliant behavior. She is also avoiding assignments through noncompliance.
Validated?: Yes
Format for Hypothesis Validation
Hypothesis: 28% of the students in third grade have been referred to the office because different school staff have different rules.
Prediction: If all school staff target the same rules, fewer students will be referred to the office.
Assessment Question(s): Do different school staff enforce rules differently?
Where are the answers?: Observe the environment to determine staff consistency. Interview staff to determine common rules.
Answers: Different staff utilize different sets of rules and enforce them differently.
Validated?: Yes
Format for Hypothesis Validation
Hypothesis: John is unable to answer comprehension questions accurately because his fluency rate is too low for the time allocated for the material.
Prediction: If John’s fluency rate increases, then he will be able to answer comprehension questions accurately.
Assessment Question(s): Is John’s fluency rate too low for accurate comprehension?
Where are the answers?: Test Learner comprehension accuracy at different fluency levels.
Answers: Comprehension accuracy was appropriate for fluency-leveled material.
Validated?: No
Group Discussion
How is this process of
analyzing problems different
from how teams typically
address the problems of
struggling learners?
Let’s look at some hypotheses
Tier I
The problem is occurring:
 …because many of the children come from single parent families.
• Acceptable (A)
• Acceptable with modification (AM)
• Unacceptable (U)
If (A) or (AM):
• Data Collection Method (choose one): Review / Interview / Observe / Test
Let’s look at some hypotheses
Tier I
The problem is occurring:
 …because the first grade curriculum currently in place does not contain the targeted skills.
• Acceptable (A)
• Acceptable with modification (AM)
• Unacceptable (U)
If (A) or (AM):
• Data Collection Method (choose one): Review / Interview / Observe / Test
Let’s look at some hypotheses
Tier I
The problem is occurring:
 …because the parents don’t value promptness and get the students to school late.
• Acceptable (A)
• Acceptable with modification (AM)
• Unacceptable (U)
If (A) or (AM):
• Data Collection Method (choose one): Review / Interview / Observe / Test
Let’s look at some hypotheses
Tier III
The problem is occurring:
 …because his IQ is 82.
• Acceptable (A)
• Acceptable with modification (AM)
• Unacceptable (U)
If (A) or (AM):
• Data Collection Method (choose one): Review / Interview / Observe / Test
Let’s look at some hypotheses
Tier III
The problem is occurring:
 …because the ratio of positive to negative comments in the classroom is low.
• Acceptable (A)
• Acceptable with modification (AM)
• Unacceptable (U)
If (A) or (AM):
• Data Collection Method (choose one): Review / Interview / Observe / Test
Hypothesis Evaluation
Worksheet
Your project ID is:
• Last 4 digits of SS#
• Last 2 digits of year of birth
Reviewing the Process of Problem Analysis
1. Gather information about the problem
2. Generate hypotheses & prediction statements
3. Develop assessment questions and select assessment procedures to confirm/reject hypotheses
4. Conduct assessments and link confirmed hypotheses to interventions
Problem Analysis Practice
1. Write one hypothesis for each of the Domains
listed.
2. Identify the method that you would use to collect
data to confirm or reject the hypothesis.
3. Identify the specific type of data that you would
collect using your method (e.g., observe on-task
behavior, interview the teacher to find out
specific information, test fluency skills, review
work samples to find error patterns).
Problem Analysis Practice
When the school-based leadership team reviewed the results of the middle-of-the-year administration of a schoolwide math screening (administered three times a year), they became concerned about the performance in fourth grade. The data indicated that large numbers of students were not demonstrating competence in two-digit by two-digit multiplication. The data showed that 52% of fourth grade students are meeting the benchmark for two-digit by two-digit multiplication.
The team has identified one replacement behavior: improve performance on this benchmark such that 80% of fourth graders meet the benchmark.
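The same GAP-analysis ratio taught earlier applies to this Tier 1 problem: 52% of students currently meet the benchmark against a goal of 80%. A quick sketch (variable names are illustrative):

```python
current_pct = 52  # fourth graders currently meeting the multiplication benchmark
goal_pct = 80     # replacement-behavior goal set by the team

gap_ratio = goal_pct / current_pct   # how far current performance is from the goal
shortfall = goal_pct - current_pct   # percentage points of students who must move up
print(f"GAP: {gap_ratio:.2f}x; {shortfall} percentage points of students to move to benchmark")
```

Quantifying the Tier 1 gap this way keeps the team focused on core instruction rather than individual referrals.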
Problem Analysis Practice
Hypothesis: “Fourth grade students in this school are unable to
successfully complete two-digit by two-digit multiplication
because…….”
INSTRUCTION: ________________________________________
Method of Assessment: R I O T
Specific Data to be Collected: ___________________________
CURRICULUM: ________________________________________
Method of Assessment: R I O T
Specific Data to be Collected: ___________________________
ENVIRONMENT: ________________________________________
Method of Assessment: R I O T
Specific Data to be Collected: ___________________________
LEARNER: _____________________________________________
Method of Assessment: R I O T
Specific Data to be Collected: ___________________________
Problem Analysis Practice
Jill often seems to be off task. She gazes out the window,
talks to the other students at her table, and frequently
gets out of her seat to sharpen her pencil.
Her work is generally accurate in all areas except spelling
and written expression. On essays, she makes frequent
errors of punctuation, grammar, and capitalization.
The replacement behavior identified by her teacher is: Jill
will be engaged in activities relevant to her assignments
75% of the time.
Problem Analysis Practice
Hypothesis: “Jill is unable to remain academically engaged 75% of the time
because…….”
INSTRUCTION: ________________________________________
Method of Assessment: R I O T
Specific Data to be Collected: ___________________________
CURRICULUM: ________________________________________
Method of Assessment: R I O T
Specific Data to be Collected: ___________________________
ENVIRONMENT: ________________________________________
Method of Assessment: R I O T
Specific Data to be Collected: ___________________________
LEARNER: _____________________________________________
Method of Assessment: R I O T
Specific Data to be Collected: ___________________________
Problem Analysis
Worksheet
One completed per table
No project ID necessary
Training Evaluation
No project ID needed
Important Questions
• Is this just another way to do child study?
• Have we focused primarily on Tier 3?
• Is our first focus on improving Tier 1?
• Does level and type of instruction vary across grade levels based on student need and performance (e.g., 90 minutes vs. 120 minutes of reading/language arts)?
• Do we use data to make decisions all the time?
• Do we have regular data meetings to evaluate student performance?
Important Questions
• Do teachers think that we need to do this stuff and “then
we can test the student?”
• Do parents believe that this is a “delay tactic?”
• Do we have expectations for Tier 2 (e.g., Title 1)
intervention effectiveness--do we evaluate it?
• Do we monitor students receiving special education
services more frequently than all other students?
• Do we really believe that almost all students can achieve
state-approved grade level benchmarks?
• Is our continuum of services fully integrated?
Final Thoughts
Problem Solving &
Response to Intervention
Resources
• http://www.aea11.k12.ia.us/spedresources/ModuleFour.pdf (Chapter 5)
• Curriculum-Based Evaluation: Teaching and Decision Making, 3rd Edition, Kenneth W. Howell & Victor Nolet, 2000 (Chapters 2, 6)