
How Well Do Our Assessment Programs Report to Schools?
Jocelyn Cook
Manager, Educational Measurement
Department of Education and Training
Western Australia
Programs reported to schools
• State and Territory Literacy and Numeracy population testing programs
• Random sample testing programs
– State (Monitoring Standards in Education)
– National (Primary Science Assessment Program, Civics and Citizenship National Assessment Program)
– International (PISA, TIMSS)
The aim of assessment programs
To contribute to the improvement of student learning …
How well do our population testing programs and sample programs contribute to the improvement of student learning?
Improving student learning
Three steps
1. Agree on and explicate what students should know and be able to do
2. Measure the extent to which this is being achieved
3. Direct the efforts of the educational enterprise to ensuring students reach those goals
Quality of monitoring programs
Margaret Forster, Research Director,
Assessment and Reporting
(2002 Seventh Roundtable on Assessment in
Canberra)
Framework for judging quality of
system-wide monitoring programs.
Margaret Forster’s Checklist
• Planning the program (clarity of purpose, resourcing and sustainability)
• Collecting the data (validity and reliability)
• Using the data (informing policy and reform)
Planning the program: Resourcing
All jurisdictions sustain
• international
• national
• State/Territory programs
by providing technical, logistical and financial resources.
Planning the Program: Clarity of Purpose
PISA:
PISA was designed to help governments …
enhance the effectiveness of their
educational systems
VCAA:
… The results provide information used to
plan new programs and a useful source of
feedback and guidance to students, parents
and teachers.
Tasmania’s Dept of Education:
… student learning outcomes will be improved
by assessment, monitoring and reporting
practices that:
• inform decision-making about teaching and
learning;
• provide useful and timely feedback to
students, parents and teachers; and
• enable accountability requirements to be
met at student, school, Department and
government levels.
Collecting the data (validity & reliability)
State and Territory programs:
Intense scrutiny of test quality and technical processes since the inception of National Benchmark Reporting
Collecting the data (validity & reliability)
For international and national assessment programs, national committees oversee and endorse:
• assessment content
• psychometric and technical
processes and procedures
If accountability mechanisms do not positively affect the quality of:
• classroom teaching
• school practice
• public policy
then the accountability mechanisms themselves are failing.
Using the Data
Good reporting guides teachers to intelligent interpretation of data that is useful to their work: teaching students effectively.
Using the Data
Harnessing the efforts of the educational enterprise to improve student learning
• technical – agree goals; measure; respond
• symbolic – motivating force (Joan Herman 2005)
Symbolic Function
Motivation
• stimulates purposeful reform
• provides incentives and/or
sanctions.
Technical Function
• measures performance and provides data to support improvement
Using the data
Sample programs: PISA, TIMSS, PSAP, CCNAP, MSE
Limited technical and symbolic function:
• No valid individual student-level information
• Limited school-level information
• Time lag between testing and reporting
• Few resources dedicated to maximising use of data at classroom and school level
Working with data
• Teachers trained for decades to
mistrust test data
• Limited training in educational
measurement
• Some extreme reactions
Data Club (1999)
• Target group – school leaders
• Purpose – support them to judge their school’s performance based on its Western Australian Literacy and Numeracy Assessment (WALNA) data
Assessment for Improvement (2002)
Target group – classroom teachers
Purpose:
• build confidence in interpreting their class data derived from WALNA
• build teachers’ ability to blend their classroom monitoring with WALNA results
• plan for future teaching and learning
Teachers’ views
• Prior use
• The power of talk and work
• Diagnostic use
• Thinking about improvement
Help schools provide better literacy & numeracy teaching
[Bar chart: percentages of parents and teachers agreeing and disagreeing, 1999 and 2002]
Evaluation Findings
Question: The report gave me additional information not available in the regular school report
Question: The test results provided me with valuable diagnostic information about my students
[Bar chart: percentages of parents and teachers agreeing and disagreeing with each question, 1999 and 2002]
Evaluation Findings
Question: The data on individual students is useful for diagnostic purposes
Question: The test results provided me with valuable diagnostic information about my students
[Bar chart: percentages of principals and teachers agreeing and disagreeing]
Evaluation Findings
Principals who had not participated in Data Club were less likely to:
• provide the data to teachers or the school community
• judge their staff as confident users of data
• use it when reviewing curriculum plans
• find the data useful for diagnostic purposes
• use it to track student performance.
Sample Programs
What has been learnt from the population testing experience is instructive for improving reporting to schools on sample programs.
Not about ‘their kids’ in the immediate sense … but the information is intrinsically interesting.
• All jurisdictions are committed to deep reform that positively alters learning outcomes for students
BUT
accountability mechanisms will fail if teachers are expected to work it out all by themselves.