Transcript Slide 1

MAC Common
Assessment Training
Modules
Session F5
Michigan School Testing Conference
February 21, 2013
Ann Arbor, MI
Michigan Assessment
Consortium Vision
…is to improve student learning and achievement
through a system of coherent curriculum, balanced
assessment and effective instruction. We do this by
collaboratively:
 Promoting assessment knowledge and practice.
 Providing professional development.
 Providing and sharing assessment tools and products.
Michigan Assessment
Consortium Beliefs
We believe…
 Collaboratives and consortia advance the
work
 Balanced assessment is a valued enterprise
 Students are the most important users of
classroom assessment data
 Teachers and administrators must engage
students in the assessment process
We Believe….
 All educators can learn to implement a balanced
assessment system
 Teachers, principals, and central office staff must
be assessment literate
 Development and use of a coherent system of
curriculum, instruction, and assessment (CIA)
ensures quality for each student
 An effective assessment system includes a
balance of school, district, and state measures
and uses a variety of methods
Comprehensive Balanced
Assessment System
Aligned to Content Standards
MEAP/MME/MI-Access/End of Course
Summative – Assessment of Learning
Are students proficient?
Interim Assessments/Unit/Chapter
Short-Cycle Summative Assessments
Are students on track for proficiency?
Classroom Assessment Practices
Formative – Assessment for Learning
Did the student learn what I just taught them?
How can I help students learn even more?
The trick….
…create a balanced assessment system
MAISA Instructional Services
Committee
2012-13 Assessment Goal
"In cooperation with the MI Assessment Consortium and
other statewide related projects, contribute to a high
quality, comprehensive assessment system in the state of
MI."
Vision of Excellence in Assessment:
Balanced Assessment System
 There is a balance of formative and
summative assessments.
 The assessments are of high quality.
 Students are actively engaged in the
assessment process.
Elbow Partner
You know you have a high-quality
common assessment when…
Accurate assessments +
Appropriate uses resulting in
productive reactions
STUDENT SUCCESS
Common Assessment Module Content
1. Introduction and Overview of the MAC
CADS Series
2. What Are Common Assessments?
3. Determining the Outcome of
Assessment
4. Determining the Targets of Assessment
5. Matching the Assessment Methods to the
Learning Targets
6. Assessing Students with Special Needs
7. Writing the Test Blueprint
8. Writing the Selected-Response Items
9. Writing Constructed-Response Items
10. Writing Performance Assessment Items
11. Using Portfolios to Assess Students
12. Developing and Using Scoring Guides
and Rubrics
13. Editing the Draft Assessment Items
14. Detecting and Eliminating Bias and
Distortion
15. Assembling the Assessment Instrument
16. Field Testing
17. Looking at Field Test Data
18. Reliability
19. Test Validity
20. Assembling the Final Common
Assessment
21. Assessment Administration, Scoring
and Reporting
22. Standard Setting
23. Presenting the Results
24. Using Data to Improve Instruction
“Other” Elbow Partner Time
When reviewing and thinking about what you
currently know about the assessment
development process…
Which module would you want to know more
about?
For which module topic might you already
have a level of confidence?
 Website
www.michiganassessmentconsortium.org
 Follow MAC
St. Joseph County
Common
Assessment Project
Beginning at the beginning
 Why did we begin this journey?
 LEAs’ desire to:
 conduct Professional Learning Communities
(PLCs)
 implement a multi-tiered system of support (RtI)
 provide effective feedback to students around
learning goals
 fully implement a standards-based model of
instruction/assessment
Beginning at the beginning
 Our challenges:
 “Faulty” data at PLCs
 Ineffective data to target the interventions for
RtI
 Ineffective feedback practices around
data/grading and reporting
 Assessments not designed for a standards-based
system
The “vision” began
simply…
 Just pick the items from the CD with our text
and use that!
 Just find the best one and use it countywide!
 Someone has to have one to buy!
 There’s lots of software that has items we
can use! Let’s just get that.
 Teachers can just go online and find one to
use!
 Let’s just wait for Smarter Balanced!
Critical First Step
 Do you have clear and appropriate
purpose(s) for your assessment?
 How will the assessment be used?
 Who will use the results?
 Which partners will help you?
 What learning targets will you measure?
 Local?
 State?
 National?
 Other?
What Makes an Assessment
“Common?”
 It is more than an assessment given by
more than one teacher
 That alone is an insufficient definition
 It is a method for creating a
community of shared practice in a
school, district, across districts, even
across the state
The Power of the Common
Assessment
 The use of common assessment results
by two or more teachers in a PLC…
 Provides data to inform interventions
 Allows teachers to see how changes in
instructional practice can lead to higher
achievement
 Lets teachers look deeply at their own and
others’ practice to ultimately improve student
achievement
Common Assessments
 Common Assessments….
 are built on the same learning targets/goals,
whether they were developed at the school,
district, state, or national levels
 These targets needed to be those contained
within the mathematics Common Core
standards for Algebra I and Algebra II
This was the first challenge: could they
agree?
The decisions
 Create benchmark assessments, 25-30
per course
 Use the traditional Algebra I and II course
“outline” defined in the Common Core (CC)
 Build the assessments by “standards
cluster” – explicitly key each item
 “Unpack” each standard within a cluster
to determine appropriate cognitive
demand and ensure alignment
Alignment
CONTENT
INSTRUCTION
ASSESSMENT
Unpacked document
 Identification of key vocabulary within
each standard
 Determination of learning goals in the
format of:
 “I know” statements
 “I can” statements
 These statements determined the types
of items to include on each assessment
 Direct link to standards
 Prerequisite skills yet to be filled in
“Vetting” of content
 Teams of 2-3 teachers unpacked the standards
 Another team reviewed
 An outside math consultant
reviewed/edited
 Final unpacking document was used to
develop assessment items
 Assessments assess only one cluster
and no more than 2-3 standards
Alignment
CONTENT
INSTRUCTION
ASSESSMENT
Cognitive Demand
 It is critical to be sure that assessments
measure the correct cognitive demand of
the learning goals represented in the
standards.
 It is critical that whoever is ‘building
assessments’ understands this concept
and applies it in their work
 The majority of classroom teachers have
not received sufficient training in
assessment design
Target/Method Match
The outcomes of quality
assessments…
 Clear, concise data being used in
collaborative groups
 Powerful data to provide feedback to
teachers, students, and parents
 The ability to accurately inform
curriculum and instructional changes
 Closer alignment between grades/scores
and actual proficiency levels of the
students
The pilot test…2012/2013
Agreements for Pilot Teachers
Assessments are to remain “intact”
Assessments assess only one cluster and no
more than 2-3 standards and will be given
whenever that content has been taught
Data are not to be used to determine grades or
for teacher evaluation. These are drafts!
Item-level data are being collected and analyzed
to determine edits needed at follow-up sessions
(a sketch of what such an item analysis might look like follows)
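The presentation does not show how the item-level field-test data were analyzed. As one hypothetical illustration only, the Python sketch below computes two classical item statistics, difficulty (proportion correct) and corrected point-biserial discrimination, from a made-up table of 0/1 item scores. The function name item_statistics and the item_01-style column names are invented for this example, not taken from the St. Joseph County project.

# Illustrative sketch of a classical item analysis on field-test data.
# Assumes one row per student and one 0/1 column per item.
import pandas as pd

def item_statistics(scores: pd.DataFrame) -> pd.DataFrame:
    """Compute difficulty and discrimination for dichotomously scored items."""
    total = scores.sum(axis=1)  # each student's total score across all items
    rows = []
    for item in scores.columns:
        rest = total - scores[item]  # total score with this item removed
        rows.append({
            "item": item,
            "difficulty": scores[item].mean(),          # proportion correct
            "discrimination": scores[item].corr(rest),  # corrected point-biserial
        })
    return pd.DataFrame(rows)

# Made-up responses for five students on three items, for illustration only.
responses = pd.DataFrame({
    "item_01": [1, 1, 0, 1, 0],
    "item_02": [1, 0, 0, 1, 0],
    "item_03": [1, 1, 1, 0, 0],
})
print(item_statistics(responses).round(2))

Statistics like these are one way a review team could flag draft items for editing: a very low proportion correct or a near-zero discrimination value would mark an item for discussion at a follow-up session.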
Validity Checklist…..
How did we do?
What do we “tweak” as we move
forward?
Project Timeline
ALG I: Assessment Writing (2011-2012), Assessment Field Testing (2012-2013), Assessment Implementation (2013-2014)
ALG II: Assessment Writing (2011-2012), Assessment Field Testing (2012-2013), Assessment Implementation (2013-2014)
Geometry: Assessment Writing (2012-2013), Assessment Field Testing (2013-2014), Assessment Implementation (2014-2015)
8th Grade: Assessment Writing (2013-2014), Assessment Field Testing (2014-2015)
6-7 Grade?: Assessment Writing (2014-2015)
Lessons learned
 This work takes a team!
 Content area specialists
 Assessment specialists
 Data specialists
 This work takes time!
 This work takes commitment!
 This work takes patience!
 This work takes trust!
 This work will help increase achievement
if done well.
“One of the most powerful, high-leverage
strategies for improving student learning
available to schools is the creation of frequent,
common, high-quality formative assessments by
teachers who are working collaboratively to help
a group of students develop agreed-upon
knowledge and skills.”
Fullan (2005), Hargreaves & Fink (2006), Reeves (2004), Schmoker
(2003), Stiggins (2005)
What questions might you
have?
Contact Information
 Dodie Raycraft – St. Joseph County ISD
[email protected]
269.467.5452
 Keith Barnes – St. Joseph County ISD
[email protected]
269.467.5461
 Kimberly Young – MDE/BAA
[email protected]
517.373.0988
 Kathy Dewsbury-White – Michigan Assessment
Consortium
[email protected]
517.927.7640