Instructional Systems - University of Washington


ASSESSMENT
Special Education

• The term ‘special education’ means specially designed instruction, at no cost to parents, to meet the unique needs of a child with a disability (Sec. 1400)
• Services are provided in response to a child’s needs, not categorically
• Special education is problem-solving


“Special education exists because all general education programs fail to educate effectively some portion of students assigned to those classrooms” (Deno, 1989).

Special education seeks to solve the problem of students who fail to succeed in the mainstream.
Special Education: Underlying Assumptions

Special education programs are a problem-solving
component of the school system whose function is to
identify and serve individuals whose performance is
significantly discrepant from their peers. (Stan Deno)
Disabilities as Performance Discrepancies

One way to define disabilities is to specify the
difference between the performance required of
the individual in a given situation and the
performance actually achieved.
Disabilities as Performance Discrepancies

Performance discrepancies are the disabilities that
must be overcome if an individual is to be
perceived as successful.
Defining Assessment

Within the context of the problem-solving model, assessment becomes “a tool for improving educational outcomes for children” because it provides us with the information to modify instruction and set appropriate goals.
Importance of Valid, Reliable Assessment and Results

• Historical overrepresentation of minority groups in SpEd

| Race/Ethnicity | % of Student Population | % of Ethnic Group Qualifying for SpEd (ages 6-21) | Most Common Disability (after SLD) |
| --- | --- | --- | --- |
| American Indian/Alaska Native | 1.2 | 14.1 | SL |
| Asian/Pacific Islander | 4.5 | 4.6 | SL |
| Hispanic | 19.2 | 8.4 | SL |
| Black | 17.3 | 12.6 | SL, MR, ED |
| White | 57.9 | 8.8 | SL |

Note: All statistics from the National Center for Education Statistics, 2004.
Importance of Valid, Reliable Assessment and Results

• Biggest factor in qualification for services is referral for assessment
  – What factors might make some students qualify at higher rates?
  – Which students who need extra support might not be referred?
• New approaches to qualification (e.g., RTI or multi-tiered approaches) attempt to reduce overrepresentation by providing support to students early, prior to assessment for SpEd
Need for Purposeful Assessment

In the context of the school, assessment should…
• Be linked to a purpose
• Address specific questions about student knowledge and skills
• Provide data that support instructional design and modification
• Support increased student outcomes
• Utilize a balanced approach (i.e., multiple methods)

Assessment review

Formative (today’s focus)
• Occurs throughout instruction (e.g., screening, diagnostic tests, progress monitoring)
• Provides information about student performance relative to instructional goals
• Allows the teacher to determine whether instruction is effective and make changes to improve outcomes

Summative
• Measures the result of instruction
• Occurs at the end of a unit or year (e.g., unit/chapter test, state assessment)
• Provides a picture of whether students met instructional goals
Formative Assessment Activities in Math

• Initial Math Assessment: determining placement and appropriate instruction
• Progress Monitoring: determining growth toward goals
• Mastery: determining mastery of skills as students move through the scope and sequence
• Instructional Error Analysis: determining error patterns during instruction and remediating them
Curriculum-Based Assessment (CBA)

CBA is a type of formative assessment:
• Integrally linked to the curriculum (as an alternative to standardized testing)
• Based on a student’s ongoing performance
• Supports teachers in making data-based instructional decisions

CBA & Curriculum-Based Measurement (CBM):
• CBM is a more specific category of CBA
• CBM generally refers to tools with established technical adequacy and standardized administration, and usually uses norms
Types of CBA

• Survey CBA
• Focused CBA:
  – Untimed
  – Timed
Survey CBA

• Test that measures a wide span of concepts, knowledge, and skills
• Assessment focuses on several mathematics standards
• Tests students for placement in the math curriculum and instructional group
Untimed focused CBA

• Measures a narrower span of skills than a survey CBA
• Assesses a narrow skill in greater depth
• No time limit
• Test for placement
• Evaluate mastery of lesson objectives
• Check for maintenance of a skill
Timed focused CBA (a.k.a. probe)

• Measures a narrower span of skills than a survey CBA
• Set time limit
• Test for placement
• Evaluate mastery of lesson objectives
• Check for maintenance of a skill
• Used to examine fluency with a focused skill
• Score is sensitive to small changes in performance
CBA Application

• Which type of CBA would you use to check whether a student met an IEP goal?
• Why is it useful to know the different types of assessment?
• How does knowing the different types of CBA impact your teaching?
Initial Math Assessment

Referenced to a typical or specific curriculum
• Survey CBA tests
  – Determine approximate developmental level of skills
  – Conduct initial error analysis
• Diagnostic or specific-level tests
  – Focus on determining placement into the scope and sequence
• Fact pretests
  – Focus on determining specific fact weaknesses and placement into a fact program
Survey-Level Tests

• Use the placement test from the program, design your own based on grade level, or use placement tests from the course website.
• Administer the test to the group.
Survey CBA (K-1)

Beginning Math Assessment K-1
• Counting (rote to 100; skip by 10s, 5s, 2s; from a number other than 1)
• Numeral ID (1-99)
• Numeral writing (1-99)
• Number sense (quantity comparison, rational counting)
• Operations: add/subtract with no renaming
• Add/subtract story problems
Survey CBA (Intermediate)

Grades 2 and up
• Counting (by 1s and several skip-counting series)
• Numeral ID (to millions)
• Numeral writing (to millions)
• Operations: add/subtract/multiply/divide
• Problem solving: add/subtract/multiply/divide
Survey CBA (Upper Level)

For students working with fractions, decimals, and percents (roughly, grades 6 and up)
• Reading and writing numbers
• Rounding decimals
• Identifying and manipulating fractions
• Identifying and finding percent values
• Decimal, fraction, and percent conversions
• Operations with decimals and fractions
• Single and multi-step story problems using fractions and decimals
Not sure of what level?

Administer the last three addition/subtraction problems of the lower-level assessment:

   24      63      57
  +32      +5     -35

Correct strategy and procedure (fact errors OK)? Administer the higher-level assessment.

Note. Have both assessments ready just in case.
Administration: Things to Consider

Rapport
• Take a few moments to introduce yourself
• Briefly explain why you’re working with them

Materials
• Be prepared (extra pencils, calculator, manipulatives, number line, etc.)

Reward
• Ask the teacher what type of reward might be OK
• Suggested rewards: schoolwide behavior cards, stickers, high-fives, etc.
Administration

Counting
• Read directions
• Test each item

Number Identification / Writing
• Use student worksheet
• Use stopping criteria (5 consecutive errors; a code sketch of this rule follows below)
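
The stopping rule can be stated precisely in a few lines of code. A minimal sketch in Python; the function and parameter names are illustrative, not part of the assessment materials.

```python
# Sketch of the stopping rule: present items in order and stop once
# the student makes 5 errors in a row. Items never presented stay
# unscored (left blank on the record sheet).
def administer_with_stopping_rule(items, is_correct, max_run=5):
    """`is_correct` is any callable that scores one item as True/False."""
    results = {}
    error_run = 0
    for item in items:
        correct = is_correct(item)
        results[item] = correct
        error_run = 0 if correct else error_run + 1
        if error_run >= max_run:
            break  # criterion reached; remaining items are left untested
    return results
```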
Administration - operations

Beginning: administer all
• Stopping criteria

Intermediate and Upper Level: administer each operation
• Stopping criteria for each operation
Modifications for operations

• Counters (concrete or representational)
• Prompt renaming if needed
  – Note prompts used during the assessment
• Calculator (after first working the problems)

Note. In your report, indicate when the student used a calculator to work each problem.
Administration - problem solving

Beginning
• Read each problem to the student

Intermediate and Upper Level
• Check that the student can read each problem, or read the problem to the student
• Calculators or fact tables are OK, as you’re testing problem solving, not computation
Scoring

Scoring Counting
• Record the highest number correctly counted

Number ID and Writing
• Indicate correct items (+)
• For incorrect responses, record the error (what the student said, for error analysis) or NR for No Response
• Use stopping criteria for number identification and writing
• Leave items not tested blank

Highlighting
• For all parts: highlight missed items on the record sheet
• Use the highlighted items after the test for easy visual analysis
Scoring operations and story problems

On the student’s work
• Mark each correct problem (+ or C)
• Circle the problem (or the part of the problem) that is incorrect and write in the correct answer
• Code types of errors based on your error analysis

On the summary sheet
• Note strategies used
• For the Problem Solving section, complete the score summary on the administration guide
Survey Level Tests

• Summarize group performance using the data sheet
• Evaluate errors and identify skill areas where students are having trouble
• If necessary, administer another survey-level test
Diagnostic Tests

• Using data from the survey-level test, determine the student’s current functioning across several skills
• Use the following decision rules for deciding which items to put on the Diagnostic Assessment
Decision Rules

• Did the student do her/his best work on the level test? Were there distractions in the testing environment, or was the student unwilling to try hard for you (i.e., are the errors on the test "can't" or "won't" errors)?
• If you believe the results of the level test represent the student's best effort, then identify the error type (i.e., fact, component, or strategy).
Decision Rules

• If the student made component or strategy errors on a problem type, plan on including that problem type on your diagnostic test.
• For each problem type you decide to put on the diagnostic test:
  – go to the scope and sequence chart in the DI Math text and select at least two previous skills that students should have mastered, and
  – select two later skills you believe the student has not mastered (for goal setting).
Decision Rules

• Identify any unique preskills that you believe the student may not have mastered and include these on the diagnostic assessment.
• Design three questions for each of the skills you have decided to test. You may select questions directly from the DI Math text.
• Write the questions on the math summary chart.
Diagnostic Assessment

• Design your diagnostic assessment using the math summary chart.
  – If students are young, most of your questions will be oral.
  – If questions are oral, you will need to design a data recording sheet.
• Administer; record data; conduct an error analysis.
Fact Pretests

• Students may start at various sets. Students who know few facts would start at Set A; students who know more facts would begin at later points.
• To determine the set at which a student might begin, administer a written pretest that includes the 100 basic facts.
  – Available online at http://depts.washington.edu/facts/
Fact Pretests

• Allow students 2 minutes, instructing them to work as many problems as they can. Use the following guidelines to place students into the sequence (a code sketch of these rules follows the list):
  – 20 or more facts answered correctly: start at Set G
  – 30 or more facts answered correctly: start at Set M
  – 45 or more facts answered correctly: start at Set R
  – 60 or more facts answered correctly in the 2-minute pretest: the student probably need not be placed in a fact program for that type of fact
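
Because the thresholds overlap, they are naturally applied highest-first. A minimal sketch in Python; the function name is illustrative, and the Set A default for scores below 20 follows the earlier note that students who know few facts start at Set A.

```python
# Sketch of the fact-pretest placement guidelines (2-minute pretest).
def place_in_fact_sequence(facts_correct: int) -> str:
    """Map a pretest score (facts answered correctly) to a starting set."""
    if facts_correct >= 60:
        return "no fact program needed for this fact type"
    if facts_correct >= 45:
        return "Set R"
    if facts_correct >= 30:
        return "Set M"
    if facts_correct >= 20:
        return "Set G"
    return "Set A"  # students who know few facts start at Set A

print(place_in_fact_sequence(37))  # -> Set M
```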
Suggestions for administration

Day One
• Administer Survey Level tests

Day Two
• Analyze Survey Level tests and develop the Diagnostic test
• Administer the Fact Pretest

Day Three
• Administer the Diagnostic test
Guidelines for a Structured Assessment Situation

• Have materials organized and ready to use.
• Ask the child to sit next to you: on your right if right-handed, on your left if left-handed.
• Put the student(s) at ease before testing.
• Provide motivation for working hard (free time, stickers, stars, etc.).
Guidelines for a Structured Assessment Situation

• Describe the purpose of testing (to determine what the student knows and what they need to learn).
• Give clear directions, then give the child the test.
• Record student responses so that the student doesn't see.
• Follow the testing procedures accurately.
• Reinforce good effort, even when the student is performing poorly.
Guidelines for a Structured Assessment Situation

• Do not allow facial gestures or verbal comments that will tell the student he/she gave a wrong answer.
• Do not tell answers or give hints; you are testing, not teaching.
• If the student is unable to read the story problems, you may read the words to her/him.
Guidelines for a Structured Assessment Situation

• You may give prompts after recording the student's initial response to get more information about the conditions under which the student can perform the task.
• Record as much information as possible; record data accurately.
Guidelines for a Structured Assessment Situation

• Stop when the student becomes obviously frustrated.
• Thank the student for working with you and give the student a sticker, verbal praise, or whatever you set up earlier.

Questions about the assignment?
CBA, Progress Monitoring, & Benchmarking

Benchmarking:
• Using a timed focused CBA three times a year (fall, winter, and spring) to gauge student progress
• Can examine student performance relative to benchmark goals or peer performance
• Usually consists of giving three probes and taking the median score, depending on the measure (see the sketch after this list)
• Acts as a more reliable measure of student performance
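
Taking the median of the three probes is a one-line computation; a minimal sketch, with illustrative digits-correct scores:

```python
# Sketch of benchmark scoring: the median of three probes damps the
# effect of one unusually high or low administration.
from statistics import median

fall_probes = [31, 28, 34]  # illustrative digits-correct scores
print(median(fall_probes))  # -> 31, the benchmark score
```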
CBA, Progress Monitoring, & Benchmarking cont’d.

Progress Monitoring:
• Using a timed focused CBA on a pre-determined schedule (e.g., weekly, bi-weekly) to monitor increases or decreases in student performance
• Consists of giving one probe at each point in the schedule
• Score acts as an indicator of student performance
• Consistent administration allows the teacher to monitor changes in student performance
• Scores can be used to identify the need for instructional change or intervention (see the sketch after this list)
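
One way to turn monitoring scores into a decision is to compare the most recent scores against the expected (aimline) value for that point in the schedule. The specific rule below, flagging a change when the last four scores all fall below the aimline, is a common convention offered here as an assumption; the lecture does not prescribe a particular rule.

```python
# Illustrative decision-rule sketch (assumed convention): flag an
# instructional change when the last `n_recent` scores all fall
# below the expected aimline value for that week.
def needs_instructional_change(scores, aimline_value, n_recent=4):
    if len(scores) < n_recent:
        return False  # not enough data points yet
    return all(score < aimline_value for score in scores[-n_recent:])

weekly_scores = [18, 20, 19, 21, 20, 19]  # hypothetical digits correct
print(needs_instructional_change(weekly_scores, aimline_value=24))  # -> True
```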

Progress Monitoring Example
Web-based data management systems for mathematics

AIMSweb
• Provides access to probes for grades K-8
• Lists norms for grades K-8 (most valid for K-6)
• Records score history
• Differentiates between benchmarking and progress monitoring
• Graphs scores and integrates intervention lines
• Creates classroom and individual reports
• Option to monitor RTI cases

Others?
• EasyCBM and DIBELS in development
Sample probe
Scoring

Multiple methods of scoring probes:
• Traditional method: number of problems answered correctly
• Digits correct:
  – Can score digits correct in just the answer
  – Can score digits correct in the answer and the work shown by the student (i.e., process and answer)
• Scoring with digits correct provides a more sensitive picture of student performance (a sketch of answer-only digit scoring follows below)
  – Small changes from week to week can be seen as students increase the number of digits correct on each probe
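
A minimal sketch of answer-only digits-correct scoring; scoring the student's shown work as well would require their intermediate steps. Aligning digits from the ones place is an assumption about how partially correct answers are matched:

```python
# Sketch of answer-only digits-correct scoring: count digits in the
# student's answer that match the correct answer, aligned from the
# right (ones place) so a short answer still earns partial credit.
def digits_correct(student_answer: str, correct_answer: str) -> int:
    return sum(
        s == c
        for s, c in zip(reversed(student_answer), reversed(correct_answer))
    )

print(digits_correct("328", "398"))  # -> 2 (the "3" and the "8" match)
```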
Scored probe
Measuring Student Performance

From student performance data, we can look at measurement in three ways (percentage and rate are sketched in code below; the third, error analysis, follows on the next slide):

Percentage
• Most common method of measuring student performance
• Usually represented relative to some criterion

Rate
• Provides information on accuracy and fluency with a skill
• Usually demonstrated on a line graph to display changes in rate over time
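
A worked sketch of both measures from a single scored probe; the scores and the 2-minute probe length are illustrative:

```python
# Percentage (accuracy relative to a criterion) and rate (fluency)
# from one hypothetical 2-minute probe.
digits_correct_total = 42
digits_attempted = 48
probe_minutes = 2

percentage = 100 * digits_correct_total / digits_attempted
rate = digits_correct_total / probe_minutes

print(f"{percentage:.1f}% correct")      # -> 87.5% correct
print(f"{rate:.1f} digits correct/min")  # -> 21.0 digits correct/min
```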
Error Analysis

Examining types of errors to support instructional decisions