The Thinking Behind PACT
Performance Assessment for California Teachers

Raymond Pecheone
Stanford University
April 16, 2008
The PACT Assessment System

• A performance assessment for teacher candidates, created in response to SB 2042 along with new subject matter standards, new program standards, and new assessment standards
• Alternate assessments are permitted but must meet California quality standards for reliability/validity (i.e., the AERA/APA test standards)
• Aligned with the California Teaching Performance Expectations (standards) and the California Content Standards
• A high-stakes assessment designed to initially license beginning teachers
PACT Institutions

‣ UC Berkeley
‣ UC Davis
‣ UC Irvine
‣ UCLA
‣ UC Riverside
‣ UC San Diego
‣ UC Santa Barbara
‣ UC Santa Cruz
‣ Cal Poly — SLO
‣ CSU Channel Islands
‣ CSU Chico
‣ CSU Dominguez Hills
‣ CSU Monterey Bay
‣ CSU Northridge
‣ Humboldt State
‣ Sacramento State
‣ San Diego State
‣ San Francisco State
‣ San Jose State
‣ Sonoma State
‣ Stanford
‣ Holy Names University
‣ Mills College
‣ Notre Dame de Namur University
‣ Pepperdine University
‣ St. Mary’s College of California
‣ University of the Pacific
‣ University of San Diego
‣ Antioch University
‣ USC
‣ San Diego Intern
The PACT Assessment System

Assessments Embedded in Local Programs — examples —
• Child Case Studies
• Analyses of Student Learning
• Curriculum/Teaching Analyses
• Observation/Supervisory Evaluation & Feedback

The Capstone: Teaching Event
The Teaching Event demonstrates:
‣ Planning
‣ Instruction
‣ Assessing
‣ Reflecting
‣ Academic Language
Teaching Event Records of Practice*
Instructional and Social Context · 3 to 5 Days

Planning
• Lesson Plans
• Handouts, overheads, student work
• Lesson Commentary

Instruction
• Video clip(s)
• Teaching Commentary

Assessment
• Analysis of Whole Class Assessment
• Analysis of learning of 2 students

Reflection
• Daily Reflections
• Reflective Commentary

Evidence of Academic Language

* 24 Teaching Events in 13 credential areas
Teaching Event Subject Areas

• Multiple Subjects
  ‣ Literacy
  ‣ Mathematics
• Single Subject
  ‣ Agriculture
  ‣ English language arts
  ‣ History/social science
  ‣ Mathematics
  ‣ Science
  ‣ Art
  ‣ Music
  ‣ Physical Education
  ‣ World languages
Guiding Questions and Analytic Rubrics

• PLANNING
  ‣ Establishing a Balanced Instructional Focus
  ‣ Making Content Accessible
  ‣ Designing Assessments
• INSTRUCTION
  ‣ Engaging Students in Learning
  ‣ Monitoring Student Learning During Instruction
• ASSESSMENT
  ‣ Analyzing Student Work From an Assessment
  ‣ Using Assessment to Inform Teaching
• REFLECTION
  ‣ Monitoring Student Progress
  ‣ Reflecting on Teaching
• ACADEMIC LANGUAGE
  ‣ Understanding Language Demands
  ‣ Supporting Academic Language Development
PACT Rubrics (one example)

ELEMENTARY LITERACY TEACHING EVENT (2004-05 PILOT)
GUIDING QUESTION: How does the candidate use analysis of student learning to propose next steps in instruction?

Level 1
• Next steps are not described in sufficient detail to understand them.
— OR —
• Next steps are vaguely related to or not aligned with the analysis of student misunderstandings and needs.
— OR —
• Next steps are based on inaccurate …

Level 2
• Next steps focus on improving student performance through support that addresses student misunderstandings or needs.
• Next steps are based on broad patterns of performance on the assessment.

Level 3
• Next steps focus on improving student performance through targeted support to individuals and groups to address specific misunderstandings or needs.
• Next steps are based on analysis of whole class patterns of performance, some patterns for individuals and/or subgroups, and general knowledge of individual students and/or subgroups.

Level 4
All components of Level 3 plus:
• Next steps demonstrate a strong understanding of both the identified content and language standards and of individual students and/or subgroups.
2-Day Subject Specific Scorer Training

• DAY 1
  ‣ Overview of PACT Teaching Event and scoring process
  ‣ Discussion on bias
  ‣ Note taking and documentation
  ‣ Understanding Level “2”
• DAY 2
  ‣ Understanding Level “1”
  ‣ Understanding Level “3”
  ‣ Independently score a Calibration Teaching Event & debrief
PACT Scores · Inter-rater Reliability

Level of Agreement · Percent
Exact match · 46%
± 1 point · 34%
± 2 points or greater · 10%

Sample Size · 2,580
Spearman-Brown Reliability Estimate · 0.88
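The statistics above can be reproduced mechanically: agreement levels are just the distribution of absolute score differences between two raters, and the Spearman-Brown formula steps a single-rater correlation up to the reliability of averaged raters. A minimal sketch, with invented score pairs and an invented single-rater correlation of 0.79 (not PACT data; 0.79 is chosen only because, assuming the 0.88 reflects two raters, it steps up to roughly that value):

```python
# Illustrative sketch of rater agreement and Spearman-Brown reliability.
# All score data below is hypothetical, not drawn from PACT.

def agreement_breakdown(rater_a, rater_b):
    """Fractions of score pairs matching exactly, off by 1, or off by 2+."""
    diffs = [abs(a - b) for a, b in zip(rater_a, rater_b)]
    n = len(diffs)
    exact = sum(d == 0 for d in diffs) / n
    within_one = sum(d == 1 for d in diffs) / n
    two_or_more = sum(d >= 2 for d in diffs) / n
    return exact, within_one, two_or_more

def spearman_brown(r, k=2):
    """Step a single-rater correlation r up to the reliability of k averaged raters."""
    return k * r / (1 + (k - 1) * r)

# Hypothetical rubric scores (1-4) assigned by two raters to eight events.
a = [2, 3, 3, 1, 4, 2, 3, 2]
b = [2, 3, 2, 1, 4, 3, 3, 4]
exact, within_one, far = agreement_breakdown(a, b)
print(f"exact {exact:.0%}, ±1 point {within_one:.0%}, ±2 or more {far:.0%}")
print(f"Spearman-Brown for r=0.79, 2 raters: {spearman_brown(0.79):.2f}")
```
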
PACT Validity Studies

• Content validity
  ‣ Bias and fairness review
  ‣ TPE alignment study
• Construct validity
  ‣ Factor Analysis (2002-03 Pilot Year): • Planning • Instruction • Reflection & Assessment
• Concurrent validity
  ‣ Decision Consistency
  ‣ Holistic vs. Analytic ratings
• Evaluation of score validity
  ‣ Development teams, Program directors, Program faculty, & Leadership team
• Predictive Validity (Carnegie/CT Study)
What We Learn from the PACT Analyses

• How our candidates do:
  ‣ On different aspects of teaching
  ‣ In different subject areas
  ‣ In comparison to other institutions
  ‣ Over time
  ‣ With different kinds of supports
Data Charts · 2003-04 Campus/Task Scores

[Chart: Task Mean Item Scores by Campus. Mean item scores (roughly 1.75 to 3.25) for campuses A through K across Total MIS, Planning, Instruction, Assessment, Reflection, and Academic Language.]
Data Charts · 2003-04 Content Area/Task Scores

[Chart: Task Mean Item Scores by Content Area. Mean item scores (roughly 1.75 to 3.00) for ELEM, ELA, MTH, HSS, and SCI across Total MIS, Planning MIS, Instruction MIS, Assessment MIS, Reflection MIS, and Academic Language MIS.]
PACT Scores · Assessment of Student Learning (2003-2005)

[Charts: frequency histograms of Student Learning rubric scores (1 to 4) for 2003, 2004, and 2005.]
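The histograms summarized above are simple frequency counts of rubric scores per year. A small sketch of the same tabulation (the scores below are invented for illustration, not PACT data):

```python
# Tabulate rubric-score frequencies by year, as in the Student Learning
# Score frequency charts. Scores are hypothetical.
from collections import Counter

scores_by_year = {
    2003: [2, 2, 3, 1, 2, 3, 4, 2, 3, 2],
    2004: [3, 2, 2, 3, 1, 2, 2, 3, 4, 4],
    2005: [3, 3, 2, 3, 2, 3, 4, 2, 3, 3],
}

for year, scores in scores_by_year.items():
    counts = Counter(scores)
    # Report a count for every rubric level 1-4, including levels nobody earned.
    freq = {level: counts.get(level, 0) for level in range(1, 5)}
    print(year, freq)
```
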
Faculty Learning & Program Improvement

• Increased articulation across courses, structures, and roles
• Changes in content of some courses
• Structural changes in the Teacher Education Program
PACT Teaching Event · DNA

• Documents teaching of a learning segment (3-5 lessons or hours of instruction)
• Subject specific
• Standardized tasks & core questions across programs
• Scored with common rubrics and a common passing standard
• Completed during student teaching
For More Information...

• See Teaching Event Handbooks and Rubrics at www.pacttpa.org.
Actions

• Program Meetings
• PACT Advisor
• Professional Development
• Analysis of Candidate Work
• Scoring the TE
• Collaborative planning across University & K-12 schools
Total Mean Item Scores and Task Mean Item Scores by Campus (2003-04 Pilot Year)

[Chart: mean item scores (roughly 1.50 to 3.30) for campuses A through K across Total MIS, Planning, Instruction, Assessment, Reflection, and Academic Language.]
The Research Base for Teacher Licensing Tests

• Weak relationship between traditional licensing tests and teacher effectiveness (NRC, 2001)
  ‣ Strauss & Sawyer (1986)
  ‣ Ferguson (1991, 1998)
  ‣ Ferguson & Ladd (1996)
  ‣ Clotfelter, Ladd, & Vigdor (forthcoming)
  ‣ Goldhaber (2005, 2006)
• Effect sizes quite small in recent value-added research (.01-.06)
Educative Assessment

• Teachers Matter
• Subject Matter Matters
• Preparation (support) Matters
• Authenticity Matters
• Integration of Practice Matters
Total Mean Item Score and Task Mean Item Score by Content Area (2003-04 Pilot Year)

[Chart: mean item scores (roughly 1.5 to 3.1) for ELEM, ELA, MTH, HSS, and SCI across Total MIS, Planning MIS, Instruction MIS, Assessment MIS, Reflection MIS, and Academic Language MIS.]
California Teaching Performance Expectations

‣ TPE 1 · Specific Pedagogical Skills for Subject Matter Instruction
‣ TPE 2 · Monitoring Student Learning During Instruction
‣ TPE 3 · Interpretation and Use of Assessments
‣ TPE 4 · Making Content Accessible
‣ TPE 5 · Student Engagement
‣ TPE 6 · Developmentally Appropriate Teaching Practices
‣ TPE 7 · Teaching English Learners
‣ TPE 8 · Learning about Students
‣ TPE 9 · Instructional Planning
‣ TPE 10 · Instructional Time
‣ TPE 11 · Social Environment
‣ TPE 12 · Professional, Legal, and Ethical Obligations
‣ TPE 13 · Professional Growth
What is Subject Specific about the Teaching Event?

• Focus of the learning segment, aligned to California content standards
• Teaching/learning tasks on video clip(s)
• Additional prompts in some content areas (e.g., misconceptions in science, dispositions in mathematics, description of text in English/language arts)
• Common and subject-specific rubrics
• Benchmarks within subject areas
Scoring

• Trained and calibrated subject-specific assessors
• Campus based, with central audits & regional scoring
• Rubric-based scoring in real time (web-based platforms)
• Organized around dimensions of teaching (PIARA) and guiding questions
• Sequentially scored by PIARA tasks
PACT Scores · Assessment of Student Learning (2003-2005)