Transcript MARCES 2007

National Center on Educational Outcomes
The long and winding road
of alternate assessments
Where we started, where we are
now, and the road ahead!
Rachel F. Quenemoen,
Senior Research Fellow, NCEO
NCEO STATE SURVEY REPORTS
• 2005 State Special Education Outcomes:
Steps Forward in a Decade of Change
• 2003 State Special Education Outcomes:
Marching On
• 2001 State Special Education Outcomes: A
Report on State Activities at the Beginning of
a New Decade
• 1999 State Special Education Outcomes: A
Report on State Activities at the End of the
Century
Thompson & Thurlow (1999, 2001, 2003)
Thompson, Johnstone, Thurlow, & Altman (2005)
Survey topics across years
• Stakeholder expectations
• Content coverage (linkage to content
standards)
• Approaches (test format)
• Scoring criteria and procedures
• Performance/achievement descriptors
and achievement standard setting
• Reporting and accountability
Other NCEO reports referenced;
also Pre-IDEA 97 Reports
• Other NCEO syntheses of State status,
slides 5, 6, 10, 11
• “Devil in the Details” NCEO studies,
slides 25, 26
• Archived NCEO State Reports
State Special Education Outcomes 1991-1997
Pioneers: Kentucky and Maryland
Maryland IMAP
Kentucky Alternate Portfolio assessment
system.
BOTH were in response to external
demands for accountability (legislature,
courts)
Ysseldyke, J., Thurlow, M., Erickson, R., Gabrys, R., Haigh, J., Trimble, S., & Gong, B. (1996). A comparison of state assessment systems in Maryland and Kentucky with a focus on the participation of students with disabilities (Maryland-Kentucky Report 1).
Ysseldyke, J. E., & Olsen, K. R. (1997).*
1. Alternate assessments should focus on authentic skills and on assessing experiences in community and other real-life environments.
2. Alternate assessments should measure integrated skills
across domains.
3. If at all possible, alternate assessment systems should
use continuous documentation methods.
4. Alternate assessment systems should include as critical
criteria the extent to which the system provides the
needed supports and adaptations, and trains the student
to use them.
* Putting alternate assessments into practice: What to measure
and possible sources of data (Synthesis Report No. 28).
IDEA 1997
• First Federal requirement of alternate assessments,
LEA and SEA
• IDEA Amendments of 1997 – Preamble
(4) … the implementation of this Act has been impeded by low expectations, and an insufficient focus on applying replicable research on proven methods of teaching and learning for children with disabilities.
(5) Over 20 years of research and experience has demonstrated that the education of children with disabilities can be made more effective by -
(A) having high expectations for such children and ensuring their access in the general curriculum to the maximum extent possible; [Access AND progress]
POST IDEA 1997
Where did we start? Part 1
• Stakeholders – expectations, principles
• Content coverage – generic "standards" language throughout; understanding of and focus on content standards linkage came later, and later yet, achievement standards were differentiated from content standards (with great difficulty!)
• Approaches – portfolios, checklists, performance assessments, IEP-driven, other… (Some evidence in survey responses/verification of confusion about what the terms meant)
1999 - Stakeholder estimates of students
who cannot take regular assessment
<1 – 1%: Delaware*, Kansas, Kentucky, Maryland, Minnesota, Nebraska, Vermont
>1 – 2%: California, Colorado, Hawaii, Idaho, Indiana, Florida*, Louisiana, Nevada, Oregon, Rhode Island, Virginia
>2 – 4%: Arkansas*, Connecticut, Massachusetts, Missouri, New Hampshire, New Mexico, Utah, Washington, Wisconsin
>4%: Mississippi, Ohio, South Dakota, Tennessee, Texas*, West Virginia
*State-provided percentage of students with disabilities was transformed to a percentage of all students using the special education rate.
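(To illustrate that transformation with hypothetical numbers: if a state estimated that 10% of its students with disabilities could not take the regular assessment, and 12% of all its students received special education services, the all-students estimate would be 0.10 × 0.12 = 1.2%.)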
Examples of principles
Thompson & Thurlow, 2000*
State #1
• Expectations for all students should be high, regardless of the
existence of any disability
• The goals for an educated student must be applicable to all
students, regardless of disability.
• Special education programs must be an extension and adaptation of
general education programs rather than an alternate or separate
system.
State #2
• Meet the law.
• Nonabusive to students, staff, parents.
• Inexpensive.
• Easy to do and takes little time.
*State alternate assessments: Status as IDEA alternate assessment requirements take effect (Synthesis Report No. 35).
Thompson & Thurlow (2000).
• Who was involved: many states included general and special education representatives; a small number saw it as a special education initiative.
• Nine states plan to base their alternate
assessment on separate standards or skill sets
that are not linked to general education standards.
• Most common approach: collection of a body of
evidence that assesses functional indicators of
progress toward state standards using a variety of
performance-based assessment strategies.
• Areas of greatest need for development are
scoring procedures and how data will be reported.
Content Addressed by Alternate Assessments:
Change Over Time
Year | Fnctl skills, no link to state stnds | Fnctl skills linked to state stnds | Exp/ext state stnds | State stnds* | State stnds plus fnctl skills | Grade-level stnds** | IEP team determined | Other | Revising content
1999 | 16 | --- | 1 | 19 | --- | --- | 24 | --- | ---
2000 | 9 | 3 | 7 | 28 | --- | --- | 3 | --- | ---
2001 | 4 | 15 | 9 | 19 | --- | --- | 3 | --- | ---
2003 | 2 | --- | 4 | 36 | --- | --- | 3 | 3 | 2
2005 | --- | --- | 1 | 21 | --- | 10 | 1 | 7 | 10
*Category possibly included grade-level standards prior to 2005
**Category introduced in 2005
Pioneer: Massachusetts
• Wiener, D. (2005). One state's story: Access
and alignment to the GRADE-LEVEL content
for students with significant cognitive
disabilities (Synthesis Report 57).
Changing Curricular Context for Students
with Significant Cognitive Disabilities
• Early 1970s
– Adapting infant/early childhood curriculum for students of all ages with the most significant disabilities
• 1980s
– "Developmental model" rejected
– Functional, life skills curriculum emerged
• 1990s
– Also: social inclusion focus
– Also: self-determination focus
– Assistive technology
• 2000s
– General curriculum access (academic content)
– Plus earlier priorities (functional, social, self-determination)
– Digitally accessible materials
Alternate Assessment Approaches
2000-2005 (from 2005 Survey)
Year | Portfolio or Body of Evidence | Rating Scale or Checklist | IEP Analysis | Other | In Development/Revision
Regular States
1999 | 28 (56%) | 4 (8%) | 5 (10%) | 6 (12%) | 7 (14%)
2001 | 24 (48%) | 9 (18%) | 3 (6%) | 12 (24%) | 2 (4%)
2003 | 23 (46%) | 15 (30%) | 4 (8%) | 5 (10%) | 3 (6%)
2005 | 25 (50%)** | 7 (14%)*** | 2 (4%) | 7 (14%) | 8 (16%)
Unique States
2003 | 4 (44%) | 0 (0%) | 1 (11%) | 1 (11%) | 3 (33%)
2005 | 1 (11%) | 1 (11%) | 1 (11%) | 0 (0%) | 1 (11%)
**Of these 25 states, 13 use a standardized set of performance/events/tasks/skills.
***Of these 7 states, three require the submission of student work.
Where did we start? Part 2
• Scoring criteria and procedures - 2001 and on
• Performance/achievement descriptors and
achievement standard setting – 2001 and on
• Reporting and Accountability – 2001 and on
(In addition to confusion about terms, there is some
evidence in survey responses/verification of a
tendency to give the “right” answer)
2001 - Student Performance
Measures
[Bar chart - Number of States (0-50) measuring: Skill/competence; Independence; Progress; Ability to generalize; Other]
2001 - System Performance
Measures
[Bar chart - Number of States (0-50) measuring: Variety of settings; Staff support; Appropriateness; General education participation; Parent satisfaction; No system measures]
2005 - Outcomes Measured by
Rubrics on Alternate Assessments
(Number of regular states; numbers in parentheses from 2001)
• Skill/Competence: 25 (40)
• Level of Assistance: 25 (32, as "Independence")
• Degree of Progress: 23 (23)
• Number/Variety of Settings: 20 (21)
• Alignment with Academic Content Standards: 18
• Ability to Generalize: 15 (18)
• Appropriateness: 13 (20)
• Staff Support: 10 (20)
• Social Relationships: 10
• Self-Determination: 9
• Participation in General Education Settings: 7 (12)
• Support: 7
2001, 2003 - Alternate
Assessment Scorers
(Percent of states; 2003 figures first, 2001 figures in parentheses)
• Student's teacher/IEP team member: 36% (44%)
• Teachers in other districts: 32% (26%)
• Test contractor: 24% (24%)
• State education agency: 12% (NA)
• Teachers within district: 6% (12%)
• Developing/revising: 8% (6%)
• Other: 16% (20%)
2003 - Alternate Assessment
Achievement Level Descriptors
Year | Same as general assessment | Different from general assessment | Currently developing/revising
Regular States
2001 | 18 (36%) | 19 (38%) | 13 (26%)
2003 | 31 (62%) | 16 (32%) | 3 (6%)
Unique States
2003 | 3 (27%) | 2 (18%) | 3 (27%)
2003 - States with standard setting
process
Regular States (pie chart): Yes, 52%; No, 14%; Informal process, 8%; Other, 16%; Don't know, 10%
PIONEERS: Arkansas,
Washington, Massachusetts
• Early standard-setting approaches
• Commitment to “real” assessment
methodology
• “Tell me - how will we set standards on this
test?” Arkansas Assessment Director
• “What the h… does proficiency mean for
these kids?” Washington Chief State
School Officer
Devil in the Details
• Quenemoen, R. F., Lehr, C. A., Thurlow, M. L., & Massanari, C. B. (2001). Students with disabilities in standards-based assessment and accountability systems: Emerging issues, strategies, and recommendations (Synthesis Report 37; CCSSO alternate assessment presession report).
• Bechard, S. (2001). Models for reporting the
results of alternate assessments within state
accountability systems (Synthesis Report 39).
• Roeber, E. (2002). Setting standards on alternate
assessments (Synthesis Report 42).
• Quenemoen, R., & Thurlow, M. (2002). Including
alternate assessment results in accountability
decisions (Policy Directions No. 13).
Devil in the Details, continued
• Quenemoen, R., Rigney, S., & Thurlow, M. (2002).
Use of alternate assessment results in reporting
and accountability systems: Conditions for use
based on research and practice (Synthesis
Report 43).
• Quenemoen, R., Thompson, S. & Thurlow, M.
(2003). Measuring academic achievement of
students with significant cognitive disabilities:
Building understanding of alternate assessment
scoring criteria (Synthesis Report 50).
• Gong, B., & Marion, S. (2006). Dealing with
flexibility in assessments for students with
significant cognitive disabilities (Synthesis
Report 60).
Flexibility and Standardization
• Nominal categories are NOT often useful for
characterizing the technical aspects of the
assessment (see Gong & Marion, 2006).
• The evaluation of technical adequacy interacts with
the types of alternate assessments (i.e., choices/
degree of flexibility-standardization) being employed.
• This does NOT mean that standardization is good
and flexibility is bad—it all depends on purposes!
Alternate Assessment Approaches
2000-2005 (from 2005 Survey)
Year | Portfolio or Body of Evidence | Rating Scale or Checklist | IEP Analysis | Other | In Development/Revision
Regular States
1999 | 28 (56%) | 4 (8%) | 5 (10%) | 6 (12%) | 7 (14%)
2001 | 24 (48%) | 9 (18%) | 3 (6%) | 12 (24%) | 2 (4%)
2003 | 23 (46%) | 15 (30%) | 4 (8%) | 5 (10%) | 3 (6%)
2005 | 25 (50%)** | 7 (14%)*** | 2 (4%) | 7 (14%) | 8 (16%)
Unique States
2003 | 4 (44%) | 0 (0%) | 1 (11%) | 1 (11%) | 3 (33%)
2005 | 1 (11%) | 1 (11%) | 1 (11%) | 0 (0%) | 1 (11%)
**Of these 25 states, 13 use a standardized set of performance/events/tasks/skills.
***Of these 7 states, three require the submission of student work.
2005 - Development or revision
Area | Number of Regular States
Approach | 8
Content | 10
Standard-setting | 13
Scoring criteria | 17
Survey topics: Where are we
now?
• Stakeholder expectations
• Content coverage (linkage to content
standards)
• Approaches (test format)
• Scoring criteria and procedures
• Performance/achievement descriptors
and achievement standard setting
• Reporting and accountability
Where are we now? Part 1
• Stakeholder expectations – stakeholder estimates ranged from less than 1% to more than 4% of all students in 1999 (see slide 8). In 2007, with the 2% regulation, we have seen data from under 1% to as high as 9% of all students in alternates.
• Content coverage – National Alternate Assessment Center work – University of Kentucky: Is it reading? Is it math? Is it science?; University of North Carolina: Links for Academic Learning; other methodologies for alignment. Peer Review suggests great variability, near and far linkages, but a steady trend toward academic content.
• Approach – degree and logic of flexibility and standardization choices… Nominal categories are not particularly useful descriptors. Unfortunately, "…the naked eye is drawn to test format," not educational soundness (Baker, 2007).
Where are we now? Part 2
• Scoring criteria and procedures – What does student
performance look like? Student vs. system? How do we
measure “independence?” Who scores? Who checks?
Trust but verify? Flexibility vs. standardization issue.
Peer Review suggests great variability on this.
• Performance/achievement descriptors and standard
setting – Achievement on the content? Is the content
clearly referenced? How good is good enough?
What should these students know and be able to do?
How well? Needs careful monitoring over time,
consequential validity studies.
• Reporting and accountability – NCLB and IDEA
define that for now… stay tuned.
Reporting remains a challenge in some states.
More or less than meets the eye?
BECAUSE of the number of uncertainties still
in play, we need:
• Transparency
• Integrity
• Consequential validity studies
• Planned improvement over time
What is the road ahead?
Knowing What Students Know: The science and design of educational assessment (NRC, 2001) synthesized a tremendous body of learning and measurement research and set an ambitious direction for the development of more valid assessments.
The New Hampshire Enhanced Assessment Initiative (NHEAI) and the National Alternate Assessment Center (NAAC), with their research/partner states, are developing a validity framework to apply to alternate assessment.
Pioneers: Connecticut and Georgia
• Connecticut Technical Manual
http://www.education.umn.edu/NCEO/TopicAreas/StateForum/CMTCAPTTechnicalManual2.pdf
• Georgia Technical Manual
• Through NHEAI/NAAC Expert Panel review:
New Hampshire, Massachusetts, Colorado,
Connecticut; Georgia, Iowa, Kentucky,
Maryland, Rhode Island, South Carolina
Visit: www.nceo.info
[email protected]