DAC Academy 2012
August 23, 2012
Colorado Department of Education
Assessment Unit
Welcome!
• Introductions
• Assessment News and Reviews
• Introduction to Partnership for Assessment of
Readiness for College and Careers (PARCC)
- 15 Minute Break
• Introduction of New Science and Social Studies
Assessments
• Update on the Colorado Content Collaboratives
- Lunch
• Afternoon Break-Out Sessions
Meet the Assessment Leadership and Support Team
• Joyce Zurkowski – Executive Director of Assessment
• Christina Wirth-Hawkins – Assistant Director of Assessment
• Margo Allen – Business Process Manager
Meet the Assessment Team
• Christine Deines – CO ACT
• Glen Sirakavit – New Assessments
• Jason Clymer – TCAP
• Pam A. Sandoval – NAEP Coordinator
• Angela Norlander – Content Collaboratives
• Mira Monroe – Special Ed.
• Heather Villalobos Pavia – ELL
Meet the Assessment Data Team
• Jessica Allen – Data
• Jasmine Carey – Psychometrician
Additional CDE Staff
• Bill Bonk – Accountability
• Linda Lamirande – Special Ed.
Jason Clymer
TRANSITIONAL COLORADO
ASSESSMENT PROGRAM
(TCAP)
TCAP and Summative Assessment
Timeline
2013
• TCAP and CoAlt Continue
• Field test new social studies and computer-based science items
2014
• TCAP and CoAlt Reading, Writing, and
Math will continue
• First year of new social studies and
science assessments will be operational
2015
• New Reading, Writing, and Math assessments (PARCC)
• Second year of new social studies and
science assessments will be operational
TCAP 2012
• Congratulations on a successful first year
of TCAP!
The State of Reading
CSAP/TCAP Reading Percent Proficient and Advanced, 2005–2012
• Grades 3, 4, 6 and 7 demonstrate upward trends in reading proficiency
[Chart: percent proficient & advanced by grade (3–10), one series per year, 2005–2012]
The State of Writing
• Grades 5, 7 and 8 have higher proficiency levels than 2005 levels.
CSAP/TCAP Writing Percent Proficient and Advanced, 2005–2012
[Chart: percent proficient & advanced by grade (3–10), one series per year, 2005–2012]
The State of Mathematics
CSAP/TCAP Mathematics Percent Proficient and Advanced, 2005–2012
• All grade levels have higher proficiency levels than 2005 levels
[Chart: percent proficient & advanced by grade (3–10), one series per year, 2005–2012]
The State of Science
• All grades show improvement in proficiency compared to the 2008 scores
CSAP/TCAP Science Percent Proficient and Advanced, 2008–2012
[Chart: percent proficient & advanced for grades 5, 8 and 10, one series per year, 2008–2012]
TCAP 2012 Issues
• Parent “Opt Out”
– All students must test
– Letter addressing opt out will be updated and
re-released in 2013
• Multiple Misadministrations
– Wrong session given: Reading/Writing
– Using old items for test preparation
– Students discussing items
Clarifying Procedures
• New test security: forthcoming
• Procedures Manual update: coming in
September
• Frequently Asked Questions (FAQ)
• Reading after the test: updated script in
Test Proctor’s Manual
• Supplementary training: Oral Scripts,
Teacher Read Directions, and other topics
TCAP 2013
• Will still assess the same content as noted
in the TCAP Frameworks
• Will still be administered at the same time
and in the same manner
• Training schedule will remain the same
• Scoring and reporting will remain on the
same timeline
Mira Monroe
ACCOMMODATIONS
Accommodations
What’s New?
• Format:
– Includes instructions on all
accommodations
– Tables appear to have more restricted
accommodations – but…
• iPad: Not allowed
• Verification of Removal Form
• ACCESS for ELLs®
Accommodations
• September: Statewide training
• 10/31 – 11/23: Order Special TCAP (via Navigator)
• November 15: Extra Special TCAP due (Mira)
• December 15: Non-standard Accommodations (Mira)
Accommodations
State Monitoring Visits
• Coordinating visits with Title programs
• Mira and Heather will be doing visits
Mira Monroe and Linda Lamirande
COLORADO ALTERNATE
(CoAlt)
Colorado Alternate Assessment
CoAlt
• October 10 – 24: Order materials
• November: Administration training
• February 6 – March 22: Test window
• March 26: Schedule pick-up
Alternate Standards and Assessment
Eligibility Criteria Worksheet
Eligibility is determined
by the IEP team
1st Determine Academic Standard
2nd Determine Assessment
Questions? Contact:
Linda Lamirande
Exceptional Student Services Unit
303.866.6863
[email protected]
Christine Deines
COLORADO ACT (CO ACT)
Colorado ACT
College Entrance Exam
• Accepted by U.S. Colleges, Universities, Military Academies and NCAA
Colorado ACT (CO ACT)
• State test date: April 23, 2013
• Make-up test date: May 7, 2013
• The 2013 test date falls on a TUESDAY
• Accommodations testing dates: April 23 – May 7, 2013
• All 11th grade students test, by law
• 11th Grade Alternate for students eligible to take CoAlt
– Managed by the Exceptional Student Services Unit (ESSU)
– Testing window: April 1 – April 26, 2013
– Contact Linda Lamirande, ESSU, 303-866-6863
[email protected]
Colorado ACT (CO ACT): Communication
[Diagram: two-way communication between the DAC and the Test Supervisor (TS)]
DACs may need to develop a communication process with Test Supervisors for Accountability*.
*Test Supervisors must develop a communication plan with Back-up Test Supervisors and Test Accommodations Coordinators.
CO ACT UPDATES
ONLINE Schools
• New for online schools: two national test date options, for online students ONLY:
– February 9, 2013 & April 13, 2013
• Students can choose either option
• Students who registered for, but missed, the Feb. 9, 2013 test can pay $20 to take the April 13, 2013 test
ACT Graduating Class Report
http://www.cde.state.co.us/assessment/documents/coact/data/DifCOACTProfileReport_GradClassReport.pdf
• ACT Profile Report
– Results of all Colorado public schools' spring testing population
– Results for State Mandated Test
• ACT Graduating Class Report
– Most recent test date for each student in the most recent graduating class in a Colorado HS (both private & public)
ACT Graduating Class Report
State | % of Graduates Tested | Avg. Composite Score | % Meeting English Benchmark | % Meeting Reading Benchmark | % Meeting Math Benchmark | % Meeting Science Benchmark
Illinois | 100 | 20.9 | 65 | 47 | 44 | 30
North Dakota | 100 | 20.7 | 64 | 49 | 45 | 30
Utah | 97 | 20.7 | 64 | 54 | 40 | 29
Colorado | 100 | 20.6 | 62 | 47 | 41 | 31
Louisiana | 100 | 20.3 | 68 | 46 | 35 | 22
Wyoming | 100 | 20.3 | 60 | 46 | 38 | 28
Michigan | 100 | 20.1 | 59 | 45 | 36 | 26
Kentucky | 100 | 19.8 | 59 | 44 | 31 | 22
Tennessee | 100 | 19.7 | 59 | 43 | 29 | 21
Mississippi | 100 | 18.7 | 53 | 34 | 21 | 14
National | 52 | 21.1 | 67 | 52 | 46 | 31
Pam A. Sandoval
NATIONAL ASSESSMENT OF
EDUCATIONAL PROGRESS
(NAEP)
National Assessment of Educational Progress
NAEP
2013: national and state samples of grades 4 & 8, and a national sample of grade 12
(close to 17,000 schools and 795,000 students nationwide)
NAEP 2013 Assessment Window
*Each selected student is tested in one subject only.
National Assessment of Educational Progress
NAEP
Participating schools are selected by national NAEP statisticians
– Most schools were identified last May
– 99% of participating NAEP districts have received initial notification from the NAEP State Coordinator
– Districts will receive state & national results for grades 4 and 8 in reading and math in fall 2013. A few districts will also take the TEL (technology & engineering literacy) test, a computer-based field test. NAEP does not provide disaggregated results for districts or individual schools; it is not designed for this.
National Assessment of Educational Progress
NAEP Roles
• National NAEP Office
• NAEP State Coordinator (NSC)
• Contracted Assessment Team: Westat
• School & Community
National Assessment of Educational Progress
NAEP: Relationship between NAEP & the School Community
Chain of contact: NAEP State Coordinator → District Contact → NAEP School Coordinator → School & Community
Responsibilities across these roles include:
• Assists the NSC in school communications
• Works with the NSC and the Supervisor to oversee the process
• Confirms the assessment date
• Provides schools with info for parental notification
• Responds to questions
• Works with district/school personnel to ensure a smooth process
• Reports the results
[Map: 2011 Math Grade 4 average scale score by state – higher / not significantly different / lower]
[Map: 2011 Reading Grade 4 – *significantly different (p < .05) from 2011]
[Map: 2011 Math Grade 8 average scale score by state – higher / not significantly different / lower]
[Map: 2011 Reading Grade 8 – *significantly different (p < .05) from 2011]
NAEP-The Nation’s Report Card® public web site:
http://nces.ed.gov/nationsreportcard/
NAEP-The Questions Tool
http://nces.ed.gov/nationsreportcard/itmrls/
Jessica Allen
DATA OPERATIONS
DATA OPERATIONS
• CDE’s role is to support districts in data
collection activities for TCAP/CoAlt, CO ACT
and ACCESS for ELLs.
• This presentation will provide a brief overview of
major data and logistic activities.
• A handout with dates and resources specific to each activity and assessment is posted on the website.
Data Operations
Essentials for DACs
• Materials Ordering
• Automatic Data Exchange (ADE) Collections
– Collect accurate data for test book labels (Pre-Coded Labels).
– Verify student biographical data after testing (Student Biographical Data (SBD)).
• Logistics
– Handling of testing materials before, during, and after
testing
• Final Assessment Results
Data Operations
Ordering Materials
• TCAP
– October Count is used for the initial order; the December/January Pre-Coded Labels collection is used to adjust the list.
• CoAlt
– Online enrollments – CTB Navigator
• ACCESS for ELLs
– Online – MetriTech's website
• CO ACT
– Order online
– Email sent via ACT
Data Operations
Pre-Coded Labels (PCL)
• A label applied to the test booklet that links student information (e.g., name, gender) and eliminates having to 'bubble' this information on the test booklet.
• TCAP/CoAlt and CO ACT student data sources
– October Student Count and the December/January PCL Collection.
• ACCESS for ELLs
– October Student Count information.
Data Operations: Logistics
Receiving, processing, and shipping test materials
• Training posted on the assessment website in
November.
• Topics will include
– Recording test invalidations and accommodations at time
of testing.
– Creating the School Group List (SGL)
– Tracking the number of tests returned by content area,
grade, and school
– Procedures for Home Schooled students
– N Count on Navigator
Data Operations
Student Biographical Data (SBD) Review
• Opportunity to review and verify student biographical data.
• Training will be online in late February 2013.
• Optional, but necessary for any accountability appeals that use assessment data. (A toy file pre-check sketch follows.)
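For illustration only: Stage 1 of the SBD collection is a download-edit-upload cycle, so a district may want to sanity-check its edited file before uploading. The Python sketch below is hypothetical; the column names, the 10-digit SASID rule, and the file name are stand-ins, and the authoritative layout and edit rules live in CDE's ADE documentation.

import csv

REQUIRED = ["SASID", "LAST_NAME", "FIRST_NAME", "GRADE"]  # hypothetical column names

def precheck(path):
    # Return a list of human-readable problems found in the edited file.
    problems = []
    with open(path, newline="") as f:
        # start=2: row 1 of the file is the header line.
        for line_no, row in enumerate(csv.DictReader(f), start=2):
            for col in REQUIRED:
                if not (row.get(col) or "").strip():
                    problems.append(f"line {line_no}: missing {col}")
            sasid = (row.get("SASID") or "").strip()
            if sasid and not (sasid.isdigit() and len(sasid) == 10):
                problems.append(f"line {line_no}: SASID is not 10 digits")
    return problems

for problem in precheck("sbd_export.csv"):  # hypothetical file name
    print(problem)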
Data Operations
N Counts – TCAP and CoAlt Only
• Review and verification of the number of test booklets submitted to CTB.
• Districts work directly with CTB.
Data Operations
Final Assessment Results
• Districts receive information directly from testing
companies.
• All results are embargoed from public distribution until
a specific date
Data Operations: ADE
• CDE system
• Each collection requires registration
• Assessment Collections
- Pre-Coded Labels
- Student Biographical Data Collection (two stages)
- Stage 1: Download, edit, and upload an approved file.
- Stage 2: Review of approved file.
Data Operations: ADE
Link to system.
Requires password.
Data only available during review window.
Link to support documents.
Separate section for each ADE collection.
https://cdeapps.cde.state.co.us/
Data Operations: ADE Documentation
https://cdeapps.cde.state.co.us/doc_toc.htm
Data Operations: Final Remarks
• DAC emails provide information about
upcoming data activities, availability of
support documents and other information
as needed.
• Each assessment is unique.
Heather Villalobos Pavia
NEW ENGLISH LANGUAGE
LEARNER ASSESSMENTS
W-APT™
ACCESS FOR ELLS®
Placement test: W-APT
Purpose of W-APT
• Identify students who may be candidates for ELL
programming
• Administer upon enrollment to determine the English
language proficiency level of students new to the school
or school system in order to provide ELL programming
• The W-APT is NOT used for program exit decisions
Characteristics of the W-APT
• Aligned to the Colorado English Language Proficiency (CELP) Standards
• 5 grade level cluster forms: K, 1-2, 3-5, 6-8, 9-12
• Results in scores from proficiency levels 1-6
• Speaking is individually administered; Listening, Reading and Writing are administered individually or in small groups.
– First-semester kindergarten assesses only speaking and listening.
Annual measure:
ACCESS for ELLs
Purpose of ACCESS for ELLs
• To monitor students' progress in acquiring
(academic) English
• One component in the body of evidence used
when making program exit decisions
Characteristics of ACCESS for
ELLs
ACCESS for ELLs test items are written from the model
performance indicators of the five English Language
Proficiency (CELP) standards:
• Social & Instructional Language
• Language of Language Arts
• Language of Mathematics
• Language of Science
• Language of Social Studies
Characteristics (continued)
• Test forms are available in three overlapping tiers for each grade level cluster (a tiny sketch of the overlap follows this slide):
– Tier A: Proficiency levels 1-3
– Tier B: Proficiency levels 2-4
– Tier C: Proficiency levels 3-5
• Test administrator scripts are different for each test form
• Administered in groups of up to 22 students
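Because the tiers overlap, a single proficiency level can be covered by more than one form. Here is a tiny Python sketch of that overlap, using only the ranges listed above; how a district actually chooses among the valid tiers for a given student is a local, guidance-driven decision that is not modeled here.

# Tier ranges taken directly from the slide above: tier -> (lowest, highest).
TIERS = {"A": (1, 3), "B": (2, 4), "C": (3, 5)}

def tiers_covering(level):
    # Every tier whose proficiency range includes this level.
    return [tier for tier, (lo, hi) in TIERS.items() if lo <= level <= hi]

for level in range(1, 6):
    print(level, tiers_covering(level))
# Level 3 prints ['A', 'B', 'C']: the overlap is why form selection is a
# judgment call rather than a simple lookup.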
Notable Differences with
ACCESS for ELLs
• Kindergartners are tested 1-on-1
• Test administrators must be district employees
• Listening is not on CD
• Do NOT order overage; MetriTech calculates an automatic 5% overage
Christina Wirth-Hawkins
PARTNERSHIP FOR
ASSESSMENT OF READINESS
FOR COLLEGE AND CAREERS
(PARCC)
Reading, Writing and Mathematics
• Recent legislation
– Requires Colorado to participate as a Governing
Board member in a consortium of states that focuses
on the readiness of students for college and careers.
– Requires the Board to rely upon the assessments
developed by the consortium expected to be ready for
spring 2015.
– Encourages the Board to conduct a fiscal and student
achievement benefit analysis of Colorado remaining
a Governing Board member starting on or before
January 1, 2014.
PARCC
• Colorado joined PARCC as a governing
member in August 2012.
• English Language Arts and Mathematics
in grades 3-11
• Computer-based/Paper-Pencil
• First operational assessment: spring 2015
PARCC States
PARCC Governing States
• Approve test specifications, priorities for content
assessed on each component, and
recommended scoring model
• Develop long-term sustainability plans for the
consortium and assessment system, including
through design of the tests and ability to refresh
over time
• Approve solicitations and select vendors for
PARCC procurements
• Determine highest priority model instructional
tools for PARCC to develop
PARCC Governing States
(Continued)
• Build and expand cadres of K-12 educators and
postsecondary faculty leading CCSS
implementation and PARCC assessment
development
• Ensure the assessment results provide the data
needed to support state accountability
mechanisms and educator evaluation model
– Participate in technical & policy working groups on accountability
to help identify solutions to pressing accountability transition
challenges and new approaches to accountability through ESEA
waivers
PARCC Assessments
• In English Language Arts/Literacy,
whether students:
– Can read and comprehend complex literary and
informational text
– Can write effectively when analyzing text
– Have attained overall proficiency in ELA/literacy
• In Mathematics, whether students:
– Have mastered fundamental mathematical concepts
– Can apply that knowledge and those skills in novel situations
PARCC Assessment Design
Two optional assessments / flexible administration:
• Diagnostic Assessment – early indicator of student knowledge and skills to inform instruction, supports, and PD; non-summative
• Mid-Year Assessment – performance-based; emphasis on hard-to-measure standards; potentially summative
Required components:
• Performance-Based Assessment (PBA) – extended tasks; applications of concepts and skills; required
• Speaking and Listening Assessment – locally scored; non-summative, required
• End-of-Year Assessment – innovative, computer-based items; required
PARCC Goal: Build a Pathway to College
and Career Readiness for All Students
• K-2: formative assessment aligned to the PARCC system
• Grades 3-8: timely student achievement data showing students, parents and educators whether ALL students are on track to college and career readiness
• High school: college readiness score to identify who is ready for college-level coursework
• Ongoing student supports/interventions throughout
• Targeted interventions & supports: 12th-grade bridge courses; PD for educators
• End goal: success in first-year, credit-bearing, postsecondary coursework
Tools & Resources: Model Content Frameworks, Model Instructional Units, and Draft Policy and Performance Level Descriptors
Model Content Frameworks
• Purpose: Support implementation of the CCSS – support development of assessment blueprints; provide guidance to state-, district- and school-level curriculum leaders in the development of aligned instructional materials
• Audience: State and local curriculum directors (primary audience); teachers
• URL: http://www.parcconline.org/parcc-model-content-frameworks
Draft Policies and Performance Level Descriptors (PLDs)
• Purpose: Public review of two draft policies and PLDs
• Audience: Broad audience: teachers, schools, districts, states (for CCSS implementation and PARCC assessment preparation)
• Timeline: Feedback by September 21, 2012
• URL: http://www.parcconline.org/crd-pld-survey
Launching Item Development
• Item Development Contracts
– Contracts with 2 consortia of vendors
– Item development officially launched in June 2012
– After 50% of the work is complete, PARCC will evaluate quality, rigor and innovation and re-award the contract to the vendor(s) who meet the threshold for completing the work
• Item Review Process
– Core Leadership Review Teams from across PARCC
states/Operational Working Groups
– Bias & Sensitivity Review Team
– Local Educator Review Teams
PARCC Sample Items
English Language Arts
PARCC Sample Items:
ELA
• Grade 3
• Grade 7
• Grade 10
PARCC Sample Items
Mathematics
PARCC Sample Items:
Mathematics
• Grade 3 : The Field
• Grade 6: Cake Weighing (Dana Center)
• High School – Golf Balls in Water
PARCC Timeline Through 2011-12 (PARCC Tools & Resources → PARCC Assessment Implementation)
• Fall 2011: Model Content Frameworks released (Nov 2011)
• Winter–Spring 2012: Educator Leader Cadres launched; item development begins
• Summer 2012: Sample summative assessment items released
• Fall 2012: Updated Model Content Frameworks released
Timeline Through First PARCC Administration in 2014-2015 (PARCC Tools & Resources → PARCC Assessment Implementation)
Tools & resources released along the way: Partnership Resource Center launched; professional development modules, diagnostic tools, college-ready tools, and K-2 formative assessments released
• Spring 2013: Pilot/field testing begins
• Winter 2014: Expanded field testing of diagnostic assessment
• Summer 2014: Expanded field testing
• Fall 2014: Optional Diagnostic and Mid-Year PARCC Assessments
• Spring 2015: Summative PARCC Assessments (2014-15 SY)
• Summer 2015: Standard Setting
Glen Sirakavit
NEW COLORADO SCIENCE AND
SOCIAL STUDIES ASSESSMENTS
Science and Social Studies
Assessments
• Based on the Colorado Academic Standards
• Grades:
– Science: grades 5, 8 and 11
– Social Studies: grades 4, 7 and 11
• Timeline
– Field test administration in spring 2013
– Operational administration in spring 2014
Science and Social Studies
Assessments
• Attain balance:
– Innovation with technical soundness and
feasibility
– Breadth with depth
• Take advantage of technology:
– Development: item type
• Selected response (SR), constructed response (CR), simulation/performance-based
– Administration: computer-based
– Scoring: automated and artificial intelligence
Science and Social Studies
Assessments - CoAlt
• Attain balance:
– Innovation with technical soundness and feasibility
– Appropriate for students with significant cognitive
disabilities
• Take advantage of technology:
– Development: item type
• SR and supported performance tasks
– Administration: test examiner
– Scoring: test examiner scoring with score input online
Science and Social Studies
Assessments
Examples of Technology-Enhanced Items
Opportunities for District
Involvement in Development
• Item writing (for 2013 FT) – Fall 2012
• Item review – Late Fall 2012
• Cognitive labs – Early Spring 2013
• Field testing – Late Spring 2013
• Anchor paper selection – Early Summer 2013
• Data review – Early Summer 2013
• Item writing 1 (for 2014 FT) – Spring 2013
• Item writing 2 (for 2014 FT) – Early Summer 2013
Opportunities for District
Involvement in CoAlt Development
• Item writing (for 2013 FT) – August 2012
• Item review – November 2012
• Field testing – April 2013
• Data review – Early Summer 2013
• Item writing 1 (for 2014 FT) – Spring 2013
• Item writing 2 (for 2014 FT) – Early Summer 2013
Paper to Online Assessments
• Recurrent theme in next-generation assessment strategies
• Leveraging advances in technology for greater efficiency, flexibility, and potential cost savings
• Benefits increasingly apparent:
– Opportunities for more effectively assessing student understanding and performance
– Faster turnaround of scores
– Improved security model
– More efficient method of test delivery
– Student motivation
• Moving online offers greater opportunity to integrate/align instruction and assessment
• But… how to make such a large, complex transition?
From “Considerations for Next-Generation Assessments: A Roadmap to 2014”, Pearson.
Three Levels of “Readiness” for Online Testing
• School
– Students
• Training, practice, familiarity
– Teachers, administrators & technology staff
• Close partnerships, training, policy administration
– Network & Infrastructure
• Setup, computer/lab logistics & load planning
• District
– Coordination, especially between assessment &
technology organizations
– Network-wide capacity planning
• State
– Policies, transition planning, & decision making
From “Considerations for Next-Generation Assessments: A Roadmap to 2014”, Pearson.
Example: Managing Assessment Data Load
When testing begins, multiple streams of identical, redundant data can clog up and overwhelm the district or school network. When properly used, caching or proxy solutions can reduce the load of data traffic that online assessments place on the network. A sketch of the caching idea follows.
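A minimal sketch of the caching idea in Python. This is not Pearson's proctor-caching product or any vendor's actual protocol (the origin URL is hypothetical); it only shows the mechanism: the first request for a test asset is fetched once over the district's internet link, and every later request for the same path is served from the local copy.

import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

ORIGIN = "http://assessment-content.example.com"  # hypothetical origin server
CACHE = {}  # path -> cached response body, shared across all requests

class CachingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path not in CACHE:
            # Cache miss: one fetch over the WAN link.
            with urllib.request.urlopen(ORIGIN + self.path) as resp:
                CACHE[self.path] = resp.read()
        body = CACHE[self.path]  # cache hit: served from the LAN
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# Lab machines point at this LAN address instead of the origin, so 30
# students loading the same item cost one upstream download, not 30.
HTTPServer(("", 8080), CachingProxy).serve_forever()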
Five Step Roadmap for Transitioning to
Online Assessments
1. Conduct a Needs Analysis
2. Develop a Realistic Transition
Strategy & Plan
3. Ensure Interoperability
4. Communicate Proactively
5. Plan for Ongoing Change
The full roadmap and additional resources are available online
at: www.PearsonAssessments.com/NextGenRoadmap
From “Considerations for Next-Generation Assessments: A Roadmap to 2014”, Pearson.
Highlights from the Roadmap – Steps 1 & 2
• Step 1 – Conduct a Needs Analysis
– Start with content & assessment design
requirements
– Conduct a “readiness” survey of district
and school technology
• Step 2 – Develop a Realistic
Transition Plan
– Focus on a multi-year, graduated strategy
and schedule
– Success is built on providing
districts/schools with online testing
experience prior to ramping up testing for
all students in all content areas
Highlights from the Roadmap – Steps 3 - 5
• Step 3 – Ensure Interoperability
– Standards are jointly agreed-upon limitations and constraints
– Very important to have engagement from both technology and
assessment community
• Step 4 – Communicate Proactively
– Find or create a forum for engaging personnel across districts, both
within and across states
– Build training for both assessment and tech staff
• Step 5 – Plan for Ongoing Change
– Unlike paper assessments, technology running online assessments
will continue to change
– Plan for recurring readiness checks, & build state-district
communication into planning
The Technology Readiness Tool
• Pearson contracted to develop
• Both national assessment consortia will
provide the tool to the states to deploy in six
data collection windows between 2012 and
2014
• Will collect local data to determine technology
readiness for online assessments, and provide
gap analysis
• Will use data to support local/state/national
planning for the transition
Measuring Local Readiness
Readiness for online assessments has multiple dimensions:
1. Computers & other devices
– Minimum system requirements
2. Ratio of devices to test-takers
– Including testing window and session scheduling
3. Network and infrastructure
– Bandwidth, network utilization, size of content
4. Personnel (staffing & training)
A back-of-the-envelope example of how these dimensions interact appears below.
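To make these dimensions concrete, here is a rough Python sketch with invented numbers; they are planning placeholders, not consortium requirements.

import math

students = 600        # test-takers to schedule (hypothetical)
devices = 90          # machines meeting minimum requirements (hypothetical)
sessions_per_day = 3  # slots the schedule allows (hypothetical)

# Dimension 2: the device-to-test-taker ratio drives session scheduling.
sessions = math.ceil(students / devices)       # 7 sessions
days = math.ceil(sessions / sessions_per_day)  # 3 testing days

# Dimension 3: peak bandwidth, assuming ~100 kbps per active test-taker.
peak_mbps = devices * 100 / 1000               # 9.0 Mbps

print(f"{sessions} sessions over {days} days; plan for ~{peak_mbps} Mbps "
      "at peak (caching/proxying can lower this).")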
Measuring Local Readiness
• Mid-July through August
– Superintendents appoint District Technology Coordinators (DTC)
• August through December
– Pearson/CDE notify districts of survey window, provide web-based
training and provide access to the Survey & Readiness Tool
– Pearson/CDE identify/confirm field test participants
• Field test districts that have technology challenges will receive extra support to complete an action plan
• November through December
– Pearson/CDE conduct trainings and open training centers
• January through March
– Districts install proctor caching, configure PearsonAccess & TestNav
– Districts complete certification checklist
• May
– Feedback from districts on the test administration
Online Testing District Readiness Process
1. Identify DTCs
• Distribute letter to district
• Identify District Technology Coordinators
2. Execute Survey & Readiness Tool
• Provide training and access to the Survey & Readiness Tool
• Execute the Survey & Readiness Tool and capture results
3. Determine Field Test Districts
• Recruit field test districts
• Provide training sessions for participating District Technology Coordinators
4. Open Online Training Center
• Verify district's online test environment
• Teachers & students have access to a "sandbox" online test system
Special Notes for Technology Staff
• Technology leadership & support
is critical
– Providing readiness data for
statewide transition and gap analysis
– Guidance & support for local, state
and national planning
• Assessment and technology issues are intertwined –
solutions require cross-disciplinary understanding
– Statewide equity and comparability issues
– Security of test content
– Importance of cross-training local personnel, and evaluating
support for entire assessment solution
– Forward planning for the future
Lessons Learned from Other States about
Transitioning to Online Assessments
• Phased approach
– Start small and build online capacity
– Initially there may be problems, difficulties, challenges before
reaching stability
• Communicate and promote the advantages/benefits
– Students are engaged
– Interactivity of technology-enhanced items
– Testing interface is user-friendly and accessible
– Reduced administrative burden
• No need to inventory test materials and risk losing them
– Scores are returned sooner
– Built-in Online Accommodations
• Oral Scripts would not require additional proctors/testing environments
Lessons Learned from Other States about
Transitioning to Online Assessments
• Challenges
– Might need more proctors/administrators in each test
environment
• 2 test proctors walking around and observing students directly and 1 test
administrator watching the computer monitor to ensure students are on task
– Distinction between instructional technology and assessment
technology
• With the abundance of high tech consumer products, our constituencies may
expect the transition to be much quicker
– Setting testing windows
• Tension between longer testing windows to better manage load and the
timeline that most schools/districts prefer
– Collaboration between testing personnel and technology
personnel
• At the school, district, and state level, know who is responsible for what and
who to contact when there is a problem
Lessons Learned from Other States about
Transitioning to Online Assessments
• Recommendations
– Certification
• Require schools to self-certify that they have met the guidelines
• Require schools to validate what was reported with self-certification by using
software or an outside company to provide independent certification
– Test the system
– Train all new users
– Develop an emergency plan
• Direct access to key staff of the vendor
– Conduct surveys and special studies to get feedback from district
administrators, teachers, students
• Perceptions of test administrators vs. what students actually thought
Colorado Content Collaboratives
Overview and Update
August 2012
Angela Norlander
The Right Question
What does mastery look like?
• For the student
• For the teacher
How Colorado Will Determine
Student Learning
• Quality Criteria for One Measure
• Multiple Measure Design Principles for Combinations of Measures
• Growth Measure Development
Content Collaboratives – Cohorts
Cohort One (February–May 2012)
• Dance
• Drama & Theatre Arts
• Music
• Reading, Writing & Communicating
• Social Studies
• Visual Arts
Cohort Two (June–December 2012)
• Comprehensive Health
• Mathematics
• Physical Education
• Science
• World Languages
• Career and Technical Education
2012 Purpose
• The objective is to identify an initial bank of high-quality student academic measures which can be used to determine, in part, the effectiveness of an educator
• Sample measures in each grade for each subject will establish the beginning of an ongoing "build out" of the bank
• Over time, the Content Collaboratives will focus on developing instructional resources, creating performance tasks, and continuing to populate the bank with multiple measures that represent both student learning and educator effectiveness
What goes in the bank?
• Identification of assessments districts can use
• Multiple modes of actual assessments
• Future tasks and items which may become eligible
• A protocol for eligibility
Cohort I & II: Flow Chart of Work
1. National Researchers (Cohort I: Jan–Mar 2012; Cohort II: Jun–Aug 2012)
Researchers gather existing fair, valid and reliable measures for consideration.
2. Colorado Content Collaboratives (I: Feb–May 2012; II: July–Nov 2012)
Collaboratives use a protocol to review the researchers' measures for feasibility, utility and gaps; prepare to fill gaps; and provide recommendations to the Technical Steering Committee.
3. Technical Steering Committee (I & II: Feb–Dec 2012)
The committee creates frameworks and design principles for the collaboratives to use in reviewing and creating measures, and reviews the collaboratives' recommendations.
4. Pilot, then peer review (Cohort I: Aug 2012–Aug 2013; Cohort II: January 2013–Aug 2013)
Piloting and peer review of measures.
5. Bank (I & II: Nov 2012–Aug 2013)
Measures placed in the online Education Effectiveness Resource Bank for voluntary use.
6. Future work.
Who is helping us?
• Researchers
• Technical Steering Committee
• Center for Assessment (NCIEA)
• Pilot districts
• Peer reviewers
• Other states and districts
High Quality Assessment Content
Validity Review Tool
• A high quality assessment should be... Aligned
• A high quality assessment should be... Scored using Clear Guidelines and Criteria
• A high quality assessment should be... FAIR and UNBIASED
• A high quality assessment should... increase OPPORTUNITIES TO LEARN
High Quality Assessment Content
Validity Review Tool
• Training modules
• Definitions
• Examples
• Release in November 2012
Reading, Writing & Communicating
August 15, 2012
Cohort I: Next Steps
• Continue to review assessments
• Connect & collaborate with national partners to fill gaps
• Advanced assessment literacy training
• State Model Curriculum development
• Performance task development
Cohort II
• Work began in Pueblo on July 23rd-24th
• Will meet with researchers on September 19th in Loveland
• Review work will continue on September 20th in Loveland, October 24th & 25th in Aurora, and November 14th in Golden
Technical Steering Committee
How Colorado Will Determine
Student Learning
• Quality Criteria for One Measure
• Multiple Measure Design Principles for Combinations of Measures
• Growth Measure Development
Technical Steering Committee
• Met August 2nd
• Committee members discussed combination and growth strategies
• Representatives from 5 districts participated as respondents
• Agenda and notes posted on the Content Collaboratives website
Technical Steering Committee
Next Steps
• React to drafts of:
– Practical guidelines for districts regarding the combination of multiple measures (a toy illustration of combining measures appears after this slide)
– Glossary of terms for use in the guidelines
– Examples of how to plot growth
– Approaches currently being used in Colorado districts
• Next in-person meeting: Wednesday, December 12, 2012, in Denver
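For flavor while the guidelines are still in draft: a district's combination of multiple measures could be as simple as a weighted average. The measure names, scores, and weights in the Python sketch below are invented for illustration and are not CDE policy.

# Invented example of combining multiple measures into one score.
measures = {                      # each measure already on a 0-100 scale
    "state_assessment_growth": 72,
    "district_interim": 65,
    "performance_task": 80,
}
weights = {                       # hypothetical district weights; sum to 1.0
    "state_assessment_growth": 0.50,
    "district_interim": 0.25,
    "performance_task": 0.25,
}
combined = sum(weights[name] * score for name, score in measures.items())
print(f"Combined measure of student learning: {combined:.2f}")  # 72.25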
2012-2015 Work of Content Collaboratives
2012
• Researchers offer assessments for consideration to the Content Collaboratives
• Cohorts I & II of the Content Collaboratives review/recommend assessments for piloting
• Cohort I assessments begin piloting in Fall 2012 to determine their utility within educator effectiveness evaluations
• Guiding principles and criteria posted on the website for designing and vetting assessments to be used in educator effectiveness evaluations
• Begin populating the Resource Bank with Cohort I assessments in November 2012
2013
• Continue piloting of Cohort I assessments & begin peer review of how the assessments function for the purposes of educator effectiveness evaluation
• Begin piloting of Cohort II assessments in January 2013
• Begin populating the Resource Bank with Cohort II assessments in Winter 2013
• Content Collaboratives, using identified measures, begin working on curriculum and instructional designs aligned to the Colorado Academic Standards
2014
• Continue to refine and build the Resource Bank
• Build out sophisticated instructional lessons that respond to gaps in student learning
2015
• Continue to refine and build the Resource Bank
• Continue to build statewide capacity
• Continue build-out of the bank in regard to instructional practices
Colorado Content Collaboratives
Contact:
Angela Norlander
Office of Assessment, Research & Evaluation
[email protected]
303-866-6931
Website:
http://www.cde.state.co.us/contentcollaboratives/