Distributed Scoring of Regents Exams:
NYC 2012 Pilots
Background
• In October 2011, the NYS Board of Regents voted that beginning in school year 2012-13, teachers can no longer score their own students' state assessments.
• New York City already meets the new SED requirement for the grades 3-8 ELA and math tests and the NYSAA, using a Regional scoring model.
• Regents exams are currently administered and scored locally (i.e., within the testing school).
New York City High Schools
• Over 923,000 Regents exams are administered annually* to students in NYC, making it the largest assessment program in the city.
• Approx. 250,000 students are enrolled in grades 9-12.
• 460 high schools:
  • 386 high schools (grades 9-12)
  • 71 secondary schools (grades 6-12)
  • 3 K-12 schools (all grades)
• High schools exist in both stand-alone and campus settings:
  • Stand-alone: 194 high schools
  • Campus: 206 high schools are located in buildings with 3 or more other high schools (max campus: 7 HSs in one building)
• Approx. 350 middle schools also administer Regents exams in June; over 800 schools administer Regents exams in total.
*Includes the January, June, and August administrations.
Scoring Model Options Considered
• Electronic Scoring: Scanned images of student responses are electronically distributed to scorers. Scoring may be completed at central scoring sites and/or at individual schools.
• Regional Scoring: Staff members from multiple schools gather at a central scoring site to collectively score exams.
• Campus Scoring: Staff members located in one building with multiple high schools remain in their building to score exams.
• Exam (Staff) Trading: Exams (or staff) are traded among different schools for scoring.
• In-School Scoring: Multiple scoring committees are formed within a school. Exams are distributed to scoring committees appropriately to ensure teachers do not score their own students' papers.
Non-Electronic Distributed Scoring Pilots: January 2012
• Campus model: 4 sites, 17 schools
• Regional model: 3 sites, 10 schools
• Totals: 2 models, 7 sites, 27 schools*, 3 exams (Comprehensive English, Integrated Algebra, Living Environment)
*Excludes scoring of alternate language exams.
• Only schools serving grades 9-12 were selected for the January 2012 pilot.
• Approx. 4,000 Comprehensive English, Integrated Algebra, and Living Environment exams were scored across all 7 sites.
January 2012 Regents Exam Schedule
• Exams were administered in 9:15 a.m. and 1:15 p.m. sessions from Tues., Jan. 24 through Fri., Jan. 27; Mon., Jan. 30 was Ratings Day, and the spring term began Tues., Jan. 31.
• 9:15 a.m. sessions: Integrated Algebra; Living Environment; U.S. History & Gov't.; Global History & Geography; RCT in Global Studies; RCT in U.S. History & Gov't.; Geometry; RCT in Writing; RCT in Science
• 1:15 p.m. sessions: Comprehensive English; Algebra 2/Trigonometry; Physics; RCT in Mathematics; Earth Science; Chemistry; RCT in Reading
Scoring Site Staff Structure
• Site Supervisor
  • Scoring Content Leaders (4 per site)
    • Scorers
  • Organizational Team Leader (1 per site)
    • Organizational Team Members
Distributed Scoring Pilots: January 2012
Implementation
Successes:
• Collaboration across schools to plan for scoring and problem-solve
• Site Supervisor role and leadership
• Deeper scorer training and norming
• All scoring completed on or ahead of schedule
Challenges:
• School selection and recruitment
• Logistics and distribution of exams
• Norming training across sites and models
• Scorer identification and assignment
• Proctoring vs. scoring needs
• Specialized expertise (e.g., scoring alternate language exams and upper-level science and math exams)
Scope and Plans for Scale-Up
• January 2012*: 27 schools, 7 sites, 3 exams
• June 2012*: 164 schools, 26 sites, 3 or 4 exams**
• August 2012: schools, sites, and exams TBD
• January 2013: 460 schools***, sites TBD, 10 exams
• June 2013: 800+ schools****, sites TBD, 10 exams
*In January and June 2012, only English language versions of exams were included.
**Depends on whether the school is participating in an electronic or non-electronic model.
***Includes all 9-12, 6-12, and K-12 schools.
****Includes all schools administering Regents exams.
Goals of an Electronic Scoring Model
Projected benefits of electronic scoring include:
• Increased accuracy and consistency in scoring student responses.
• A faster scoring rate (compared to a paper-and-pencil model), which is expected to reduce the impact on schools.
• Data on the scoring rate for each Regents exam.
Regents Exam Scoring
[Chart: "Regents Exam Scoring Method 2011-13" — Regents exams scored (in thousands, 0 to 1,200) by calendar year, 2011 through 2014*, broken out by scoring method: Traditional School-Based, Non-Electronic Distributed, and Electronic Distributed.]
An increasing percentage of Regents exam scoring will use a distributed (electronic or non-electronic) method ahead of a planned move to computer-based testing beginning in the 2014-15 school year.
* Refers to calendar year, not school year.
Distributed Regents Scoring
Discussion Questions
• How has Regents scoring typically been organized in your districts?
• How are your districts planning to meet the new SED requirement? For Regents exams? NYSESLAT? Science?
• What types of assistance are you providing to districts?
• How are you helping districts balance the simultaneous need for proctors and scorers?
• How do your districts handle the scoring of alternate language or higher-level science exams (e.g., Physics)?
• What scoring rates do your districts use for each exam for non-electronic and electronic scoring methods?
• Will you be monitoring scoring? What level of oversight are you planning to use?
• How has the new Regents calendar affected your approach to developing a solution?
APPENDIX
Scoring Site Management Responsibilities
• Site Supervisor
  • Eligible staff: An appointed supervisor or Education Administrator (e.g., an assistant principal)
  • Selection process: Selected by a committee of the participating schools' principals at an information session
  • Responsibilities: Manage the scoring site and supervise all activities related to the scoring facility; be on-site for the duration of scoring
• Scoring Content Leader (1 per subject)
  • Eligible staff: A content expert (e.g., subject-area AP, department chair); supervisory license preferred
  • Selection process: The providing school is determined by a committee of the participating schools' principals at an information session
  • Responsibilities: Train scorers and oversee scoring; be on-site for the duration of scoring for the particular subject
• Organizational Team (size equivalent to the number of schools in the site): Organizational Team Lead (1 per site) and Organizational Team Members
  • Eligible staff: Staff with ATS* access; experience with Regents scanning preferred
  • Selection process: The providing school is determined by a committee of the participating schools' principals at an information session
  • Team Lead responsibilities: Scan all answer documents for the scoring site; manage error correction with oversight from the site supervisor; provide additional site logistics support
  • Team Member responsibilities: Assist the scoring site supervisor and content leaders as needed to check in, distribute, and check out test materials
*ATS is the information system used to capture and process results from scanned Regents exams.
School Selection: January 2012
Factors that contributed to school selection and matching for January 2012 distributed Regents scoring included:
• Selection
  • Serves only grades 9-12
  • Administers the exam titles included in the pilot, in sufficient quantity
  • Interest in piloting a distributed scoring model
• Matching
  • Located in close proximity to other high schools administering the same exams
  • Expected to administer a quantity of exams in January 2012 (based on order data for that administration) roughly equivalent to other nearby candidate schools (see the sketch after this list)
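To make the matching criteria concrete, here is a minimal sketch of how they could be operationalized in code, assuming hypothetical school records with a neighborhood field standing in for "close proximity" and exam-order counts for the piloted titles; the school names, data, and 20% volume tolerance are illustrative assumptions, not details from the pilot.

```python
# Illustrative sketch only: pairs schools that are near each other and that
# ordered a roughly equivalent quantity of exams, per the pilot's matching
# criteria. All names, numbers, and the tolerance are hypothetical.

from itertools import combinations

schools = [
    # (name, neighborhood, January exam orders for piloted titles)
    ("HS A", "Bronx-Fordham", 420),
    ("HS B", "Bronx-Fordham", 390),
    ("HS C", "Queens-Flushing", 510),
    ("HS D", "Queens-Flushing", 480),
]

def roughly_equivalent(a, b, tolerance=0.20):
    """True if two exam volumes differ by no more than `tolerance` (assumed 20%)."""
    return abs(a - b) <= tolerance * max(a, b)

# Keep pairs in the same neighborhood (a stand-in for proximity) whose
# exam volumes are roughly equivalent.
matches = [
    (s1, s2)
    for (s1, n1, v1), (s2, n2, v2) in combinations(schools, 2)
    if n1 == n2 and roughly_equivalent(v1, v2)
]

print(matches)  # [('HS A', 'HS B'), ('HS C', 'HS D')]
```

In practice the pilot's matching was done by people using order data, not by a script; the sketch just shows how the two stated criteria (proximity and comparable volume) jointly narrow the candidate pairings.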