DeSSA Assessment Accommodations

Office of Assessment
February 18, 2015
Updates
 Documents, manuals, and resources
 DeSSA training by
 Role
 Grade
Three-Phase Approach
 Phase I – Pre-Hand Scoring (December)
 Focused on developing background knowledge and
preparing for the administration of the Smarter Interim
Assessments to be released in January
 Phase II – Hand-Scoring Training (January)
 Focused on the hand-scoring process to promote
integrity, validity, and reliability of scoring and
alignment with summative scoring
 Phase III – Post-Hand Scoring (January–Summer)
 Focuses on the entry, interpretation, and use of scores while planning ahead to enhance the process
Hand-Scoring Training for
Interim Assessments
 Two night training sessions
 Table facilitators
 Educators from across the state
 Numbers
 Educator groups range from 71 to 93 participants
 Approximately 150 schools and 2 organizations represented
 Feedback highlights include
 Collaboration
 Calibration
 Review of responses
 Review of annotations from Smarter and participants
Hand-Scoring Training Participation

Group                     Session 1   Session 2
Elementary ELA            93          87
Secondary ELA             71          66
Elementary Mathematics    86          84
Secondary Mathematics     73          73
Hand-Scoring Training for
Interim Assessments
 Interim background
 Focused on developing background knowledge and
preparing for the scoring of Interim Comprehensive
Assessments (ICAs) and Interim Assessment Blocks
(IABs)
 Hand-scoring process
 Item types, rubrics, scoring guide, scoring protocol,
scoring process, calibration, alignment with Smarter
 Debuted the Teacher Hand-Scoring System (THSS)
 Follow-up
 Facilitator’s guide, presentations, handouts, scoring
guide, and THSS training for technical aspects
Hand-Scoring Process
 Student responses will be scored using the THSS
via the DeSSA Portal
 THSS training module was released on
February 12, 2015, to assist with this process and
entry of scores
THSS Overview
 Students complete open-ended items and submit them
 Non-machine-scored responses are automatically sent to the test administrator (TA) in THSS
 Online resources and system interface provided for
Scorers
 Student responses can be viewed and scores
entered in the system
Interim Assessment Windows

Assessment                                Grades       Dates
Interim Comprehensive Assessment (ICA)    3–8 and 11   January 5–June 4, 2015
Interim Assessment Blocks (IABs)          3–8 and 11   January 27–June 4, 2015
Hand-Scoring Security Policy
“Not Secure but Not Public”
 Interim security for students remains the same while
actively taking the assessment
 Non-machine-scored responses are transmitted to the TA through the THSS
 Teachers can view items, rubrics, exemplars, and
training guides in a closed environment
 Refer to Section 3.0, Ensuring Interim Test Security and Security Procedures
Hand-Scoring Security Policy
“Not Secure but Not Public”
 Student responses can be viewed collaboratively
with other Scorers to advance instructional best
practices and promote student success
 Within a closed environment
 Display items live—such as on a Smart Board
 No recording or storing of student items or student
responses
 No physical retention of interim-associated items; hard copies must be stored in a secure location and destroyed after use
Two Types of Interim Assessments
 Interim Comprehensive Assessments (ICAs)
 Same design as summative tests
 Assess the same claims and standards
 Yield overall scale scores, performance-level designations, and claim score information
 Interim Assessment Blocks (IABs)
 Assess smaller sets of targets
 Address specific content areas
 Shorter and more flexible
 Reported as Below Standard, At/Near Standard, and
Above Standard
Interim Assessment Response Types

Content Area    Response Types
ELA/Literacy    Short text; Constructed response; Essay
Mathematics     Short text; Constructed response; Short text (fill-in tables)
Teacher Hand-Scoring System (THSS)
THSS Response List Page
Scoring Response Screen
Available on this screen:
 Description of Item
 Rubric
 Exemplars or Anchor Set
 Training Guides
 Navigation Buttons
 Score Response Area (shows question and student answer)
Scoring Student Responses – Submit Score
Click [Submit Score] at the bottom of the page when all Scores and/or Condition Codes have been entered for a response.
Warning:
If you have entered a score, you must click the [Submit Score] button to save the score for the specific item response. Clicking [Back] will result in the score not being saved.
Remember to click [Submit Score]!
ICA and IAB Training for Test Administrators – Overview
 Security policies, as well as the handling and retention of secure test materials, are modified for ICAs and IABs relative to the Smarter summative assessment
 Not secure but not public
 See the security protocol specific to interim assessments: Section 3.0, Ensuring Interim Test Security and Security Procedures, of the ICA and IAB Test Administration Manual (TAM)
Delaware Technical Advisory Committee
(TAC) – Role
 The Delaware Technical Advisory Committee (TAC) advises the Delaware Department of Education (DDOE) to ensure that the state assessment system yields valid and reliable test scores for all Delaware students
 Meets the federal requirements
 Improves teaching and learning
 The TAC is responsible to and reports directly to the
DDOE
Delaware TAC – Responsibilities
 The TAC serves as a consulting committee to the
DDOE
 Technical quality of the statewide assessment system
 TAC members
 Experts in educational measurement
 Deep understanding of the psychometric issues in the
design, development, and implementation of the state
assessments
 Members must demonstrate current knowledge and skills through their research interests, projects, and publications
Delaware TAC – Responsibilities
 TAC meets twice a year
 Reviews the contractors’ work, such as
 Design for scaling and equating
 Technical reports
 DDOE’s reports and analysis results
 Discusses technical issues in state assessments
and makes recommendations for improvement
Delaware TAC – Members
 Seven-member TAC includes
 Dr. Suzanne Lane (Chair), University of Pittsburgh
 Dr. Tim Davey, Educational Testing Service (ETS)
 Dr. Claudia Flowers, University of North Carolina
 Dr. Ronald Hambleton, University of Massachusetts
 Dr. Brian Gong, Center for Assessment
 Dr. Martha Thurlow, University of Minnesota
 Dr. Richard Patz, ACT, Inc.
January 2015 TAC Meeting
 January TAC meeting focused on the technical
issues in the implementation of the Smarter
summative assessments
 Presentations were organized into three sessions
 Updated design of the Smarter summative assessments and future considerations, by Smarter Balanced
 Status and design for Delaware implementation by
contractors
 Preparation for implementation by DDOE
January 2015 TAC Meeting –
Smarter Balanced
 Dr. Marty McCall, Lead Psychometrician of the Smarter Balanced Assessment Consortium, presented the updated design for the Smarter summative assessments and future considerations, such as
 Final blueprint
 Constraints for item selection and item exposure
control
 Scoring specification
 Validation study
 Recalibration
January 2015 TAC Meeting –
Contractors
 Contractors’ presentations included
 American Institutes for Research (AIR) – presented the design of the item-selection algorithm for the adaptive component and the sampling plan for the performance task component
 Data Recognition Corporation – presented the process and quality control for hand-scoring, including paper/pencil version scoring and considerations for quality control and comparability of scores across settings
 AIR – demonstrated the report design
January 2015 TAC Meeting – DDOE
 DDOE staff provided an update on preparations for the Smarter assessment implementation
 Test Security Manual for online testing and the monitoring system
 Accessibility and accommodations with Smarter for students with disabilities (SWD) were discussed
 The preliminary proposal for linking DCAS to Smarter and for reporting confidence intervals using the conditional standard error of measurement, with a simulation
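For context on that last item, a confidence interval built from the conditional standard error of measurement (CSEM) commonly takes the form below; this is a standard textbook illustration, not necessarily the exact method in DDOE’s proposal:

\[
\hat{\theta} \pm z_{1-\alpha/2} \cdot \mathrm{CSEM}(\hat{\theta})
\]

For example, a hypothetical scale score of 2500 with a CSEM of 20 would be reported as approximately 2461 to 2539 at the 95% level (z ≈ 1.96).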
January 2015 TAC Meeting –
Other Topics
 Analysis results of student growth patterns and
trends for SWD
 The purpose of and design considerations for the portfolio assessment for students with limited communication skills were proposed
 An analysis of the comparability of content coverage for individual tests in science was presented and discussed, followed by simulation results for the 2015 science tests with modifications to the algorithm for improvement
Smarter Score Reporting – Online
 Things we know:
 Scores will not be available until after testing is
completed
 Score reports to schools – date to be determined in
summer
 Full score reporting presentation at December 2014
DTC meeting
 Paper score reports to families a little later this year
than for DCAS—late July 2015
Parent Reports Comparison Activity
Proposed DeSSA Version for ELA/Literacy
DCAS 2014 Version for ELA/Literacy
Smarter Initial Draft
Parent Reports Comparison Activity
Proposed DeSSA Version for Mathematics
DCAS 2014 Version for Mathematics
Smarter Initial Draft
Parent Reports Discussion
Things We Can Change
Things We Cannot Change
DeSSA Information
Cover Letter
Resource Links
Barrel Chart
Smarter ELA/Literacy and Mathematics Summative Assessments
Parent Reports Feedback – February 2015
Initial Information
Additional Information
(include sample language, details, …)
Interim Assessment Blocks Blueprints
 Updates
 Composition
 Hand-scoring information
For more information regarding the Blueprints for IABs,
visit the Smarter website:
http://www.smarterbalanced.org/interim-assessments/
Human Interpreter
 States requested offering a human interpreter (i.e., visual support)
 Non-embedded accommodation
 Mathematics test
 ELA claim 3 (listening)
 Offered on a student-by-student basis (unique accommodation)
Human Interpreter – Visual
Communication
 Must ensure high quality
 Adult with certification via professional organization
(e.g., Registry of Interpreters for the Deaf, RID) or
equivalent
 Adult must be familiar with content vocabulary
 Adult must be familiar with the visual support
 Process to ensure certification equivalence
 Documentation would be shared with Smarter
Balanced after testing is completed
Usability, Accessibility, and Accommodations Guidelines

Resource: Human Signer
Description: A human signer (i.e., human visual support for English) may be available to students on math tests and ELA claim 3 stimuli and items with a documented need in an IEP or 504 plan. The adult must be familiar with content vocabulary and the visual support the student uses during everyday instruction. States are responsible for creating, implementing, and documenting the process on a student-by-student basis. States will share documentation with Smarter Balanced.
Human Translator – English Learners
 Must ensure high quality
 Adult with certification via professional organization or
equivalent
 Adult must be familiar with content vocabulary
 Adult must be fluent in the student’s native language
 Process to ensure certification equivalence
 Documentation would be shared with Smarter
Balanced after testing is completed
Usability, Accessibility, and Accommodations Guidelines

Resource: Human Translator
Description: Human translation may take place on math tests as long as the adult translator is certified in English-to-home-language translation with experience in K-12 education. Certification should be completed by an accredited organization (e.g., American Translators Association). If the language does not have a certification process, states are responsible for creating, implementing, and documenting the translation process on a student-by-student basis. The adult must be familiar with content vocabulary and the language support the student uses during everyday instruction. States will share documentation with Smarter Balanced.