Exploring Assessment for Learning


Transcript: Exploring Assessment for Learning

Consider the Evidence
Evidence-driven decision making for secondary schools
A resource to assist schools to review their use of data and other evidence
Module 6: Getting to Information
Page 1
www.minedu.govt.nz
© New Zealand Ministry of Education 2009 - copying restricted to use by New Zealand education sector.
Evidence-driven decision making
This module is part of a resource about how we use data and other evidence to improve teaching, learning and student achievement.
Today we are looking at the second stage of this process: collecting and analysing evidence, and getting to information we can use.
Page 2
Evidence
Any facts, circumstances or perceptions that can be used as an input for an analysis or decision:
• how classes are compiled, how classes are allocated to teachers, test results, teachers’ observations, attendance data, portfolios of work, student opinions …
Data are one form of evidence
Page 3
Data
Known facts or measurements, probably expressed in some systematic or symbolic way (eg as numbers):
• assessment results, gender, attendance, ethnicity …
Data are one form of evidence
Page 4
What evidence does a school have?
• Demographics
• Student achievement
• Perceptions
• School processes
• Other practice
Page 5
The evidence-driven decision making cycle
Trigger: Data indicate a possible issue that could impact on student achievement
Speculate: A teacher has a hunch about a problem or a possible action
Explore: Check data and evidence to explore the issue
Question: Clarify the issue and ask a question
Assemble: Decide what data and evidence might be useful
Analyse: Analyse data and evidence
Interpret: Insights that answer your question
Intervene: Plan an action aimed at improving student achievement
Act: Carry out the intervention
Evaluate: Evaluate the impact of the intervention
Reflect: Reflect on what has been learned and how practice will change
Page 6
The evidence-driven decision making cycle: a worked example
Trigger: Significant numbers not achieving well in writing
Speculate: A teacher has a hunch that poor writers might spend little time on homework
Explore data: A survey of students shows that this is only partially true
Question: What are the characteristics of students who are poor at writing?
Assemble more data and other evidence: asTTle reading, homework, extracurricular activities, attendance, etc
Analyse: NQF/NCEA results by standard, plus non-NQF/NCEA data and evidence
Interpret information: Poor writers are likely to play sport, speak well, read less and do little homework
Intervene: Create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing; PD for staff
Evaluate: Has writing improved?
Reflect: How will we teach writing in the future?
Page 7
Evidence-driven decision making
Getting to information
Trigger: Clues found in data, hunches
Explore: Is there really an issue?
Question: What do you want to know?
Assemble: Get all useful evidence together
Analyse: Process data and other evidence
Interpret: What information do you have?
Intervene: Design and carry out action
Evaluate: What was the impact?
Reflect: What will we change?
Page 8
The evidence-driven decision making cycle
Trigger: Clues found in data, hunches
Explore: Is there really an issue?
Question: What do you want to know?
> Assemble: Get all useful evidence together
Analyse: Process data and other evidence
Interpret: What information do you have?
Intervene: Design and carry out action
Evaluate: What was the impact?
Reflect: What will we change?
Page 9
Assembling the evidence
• We want to know if our senior students are doing better in one area of NCEA Biology than another.
So … we need NCEA results for our cohort.
• It could be that all Biology students do better in this area than others.
So … we also need data about national differences across the two areas. (A sketch of this comparison follows below.)
Page 10
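As an illustration only, here is a minimal Python sketch of the comparison described above; the area labels, entry counts and national rates are invented for the example.

```python
# Compare our cohort's achievement rate in two Biology areas with the
# national rate for each. All figures below are invented for illustration.
school = {
    "Area A": {"entries": 48, "achieved": 39},
    "Area B": {"entries": 48, "achieved": 28},
}
national_rate = {"Area A": 0.74, "Area B": 0.69}  # hypothetical national rates

for area, counts in school.items():
    rate = counts["achieved"] / counts["entries"]
    gap = rate - national_rate[area]
    print(f"{area}: school {rate:.0%}, national {national_rate[area]:.0%}, "
          f"difference {gap:+.0%}")
```

The second dictionary is the national baseline the slide says we also need; without it, a gap between the two areas in our own results could simply mirror a nationwide pattern.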
Are our data any good?
A school found that a set of asTTle scores indicated that almost all students were achieving at lower levels than earlier in the year.
Then they discovered that the first test had been conducted in the morning, but the later test was in the afternoon and soon after the students had sat a two-hour exam.
Page 11
Think critically about data
• Was the assessment that created this data assessing exactly what we are looking for?
• Was the assessment set at an appropriate level for this group of students?
• Was the assessment properly administered?
• Are we comparing data for matched groups?
Page 12
Cautionary tale 1
You want to look at changes in a cohort’s asTTle writing levels over 12 months.
Was the assessment conducted at the same time both years?
Was it administered under the same conditions?
Has there been high turnover in the cohort? If so, will it be valid to compare results? (A sketch of one way to check this follows below.)
Page 13
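One way to guard against the turnover problem is to compare only the students who have a score in both years (a matched cohort). A minimal pandas sketch, with invented scores and column names:

```python
import pandas as pd

# Invented asTTle writing scores, 12 months apart. Students who arrived
# or left during the year appear in only one of the two tables.
year1 = pd.DataFrame({"student_id": [1, 2, 3, 4, 5],
                      "writing_y1": [510, 545, 480, 600, 530]})
year2 = pd.DataFrame({"student_id": [2, 3, 4, 6, 7],
                      "writing_y2": [560, 470, 640, 520, 505]})

# Keep only students present in both years (a matched cohort) before comparing.
matched = year1.merge(year2, on="student_id", how="inner")
print(f"{len(matched)} of {len(year1)} original students remain in the cohort")
print("Mean change for matched students:",
      (matched["writing_y2"] - matched["writing_y1"]).mean())
```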
Cautionary tale 2
You have data that show two classes have comparable mathematics ability. But end-of-year assessments show one class achieved far better than the other.
What could have caused this?
Was the original data flawed? How did teaching methods differ? Was the timetable a factor? Did you survey student views? Are the classes comparable in terms of attendance, etc?
Page 14
The evidence-driven decision making cycle
Trigger: Clues found in data, hunches
Explore: Is there really an issue?
Question: What do you want to know?
Assemble: Get all useful evidence together
> Analyse: Process data and other evidence
Interpret: What information do you have?
Intervene: Design and carry out action
Evaluate: What was the impact?
Reflect: What will we change?
Page 15
Analysing data and other evidence
• Schools need some staff members who are responsible for leading data analysis
• Schools have access to electronic tools to process data into graphs and tables
• All teachers do data analysis
• Data is not an end in itself; it’s one of the many stages along the way to evidence-driven decision making
Page 16
Types of analysis
We can compare achievement data by subject or across subjects for
• an individual student
• groups of students
• whole cohorts
The type of analysis we use depends on the question we want to answer.
Page 17
Inter-subject analysis
• Have my students not achieved a particular history standard because they have poor formal writing skills, rather than poor history knowledge? (A sketch of this kind of check follows below.)
Page 18
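A small pandas sketch of this kind of inter-subject check, using invented grades and writing levels (the level notation is assumed, not taken from the slides):

```python
import pandas as pd

# Invented data: each student's result in one history standard (N/A/M/E)
# alongside their curriculum writing level.
df = pd.DataFrame({
    "history": ["N", "N", "A", "N", "M", "A", "N", "E", "A", "N"],
    "writing": ["3A", "4B", "4A", "3P", "5B", "4P", "3B", "5P", "4A", "3A"],
})

# If the Not-achieved results cluster in the lower writing levels, weak formal
# writing (rather than weak history knowledge) becomes a plausible explanation.
print(pd.crosstab(df["history"], df["writing"]))
```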
Intra-subject analysis
• What are the areas of strength and weakness in my own teaching of this class?
Page 19
Longitudinal analysis
• Are we producing better results over time in year 11 biology?
Page 20
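A minimal sketch of a longitudinal comparison, with invented year 11 Biology figures:

```python
# Invented figures: (achieved, entries) for year 11 Biology, by year.
by_year = {2007: (62, 90), 2008: (71, 95), 2009: (69, 88)}

for year, (achieved, entries) in sorted(by_year.items()):
    print(f"{year}: {achieved / entries:.0%} of {entries} entries achieved")
```

Whether an apparent trend in a table like this is significant, and whether the cohorts are comparable, are exactly the questions the following slides raise.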
Basic analysis
[Table: NCEA level 2 results for a class of 17 students. For each student the table lists a grade (N, A, M or E: Not achieved, Achieved, Merit, Excellence) for standards 2.1 to 2.6, together with the student's number of absences (ABS) and detentions (DET). Standards 2.4, 2.5 and 2.6 are marked with an asterisk, which appears to denote the externally assessed standards referred to on the next slide.]
Page 21
Basic analysis
• Divide the class into three groups on the basis of overall achievement
• Identify students who are doing so well at level 2 that they could be working at a higher level
• Find trends for males and females, for those who are often absent, and for those with many detentions
• Compare this group’s external assessment success rate with the national cohort.
(A sketch of this kind of analysis follows below.)
Page 22
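A sketch of how a table like the one on page 21 might be processed, assuming pandas is available; the grade-to-points mapping and the mini data set are invented for illustration and are not the actual class results.

```python
import pandas as pd

# Invented mini data set in the shape of the slide's table:
# one row per student, grades for several standards, plus absences and detentions.
points = {"N": 0, "A": 2, "M": 3, "E": 4}   # illustrative mapping only
df = pd.DataFrame({
    "student": ["Pamela", "Lee", "Manu", "Keisha", "Bron", "Deane"],
    "2.1": ["A", "E", "N", "E", "M", "N"],
    "2.2": ["A", "A", "E", "A", "M", "M"],
    "2.3": ["N", "A", "E", "N", "M", "E"],
    "ABS": [6, 0, 4, 8, 0, 1],
    "DET": [20, 12, 18, 7, 3, 2],
})

standards = ["2.1", "2.2", "2.3"]
df["total"] = df[standards].apply(lambda col: col.map(points)).sum(axis=1)

# Divide the class into three groups by overall achievement.
df["group"] = pd.qcut(df["total"], 3, labels=["lower", "middle", "upper"])

# Look for trends: do the groups differ in absences or detentions?
print(df.groupby("group", observed=True)[["ABS", "DET"]].mean())
```

The grouping and the absence/detention comparison correspond to the first and third bullet points above; mapping N, A, M and E to points is only one possible way to summarise grades.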
Reading levels – terms 1 and 4
[Table or chart of the cohort's reading levels in terms 1 and 4; discussed on the following slides.]
Page 23
Making sense of the results
Think about significance and confidence
How significant are any apparent trends?
How much confidence can we have in the information?
Page 24
Making sense of the results
This table shows that reading levels overall were higher in term 4 than in term 1.
Scores improved for most students.
20% of students moved into Level 5.
But the median score is still 4A.
Is this information? Can we act on it?
Page 25
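To make this kind of claim checkable, here is a hedged sketch that puts the sub-levels on an ordered scale (the 3B … 5A notation is assumed) and reports the median and the share of students at Level 5 in each term; all scores are invented.

```python
import pandas as pd

# Assumed ordering of curriculum sub-levels, lowest to highest.
levels = ["3B", "3P", "3A", "4B", "4P", "4A", "5B", "5P", "5A"]
rank = {lvl: i for i, lvl in enumerate(levels)}

# Invented reading levels for the same ten students in terms 1 and 4.
term1 = ["3A", "4B", "4B", "4P", "4P", "4A", "4A", "4A", "4A", "5B"]
term4 = ["4B", "4B", "4P", "4A", "4A", "4A", "4A", "5B", "5B", "5P"]

for name, scores in [("Term 1", term1), ("Term 4", term4)]:
    s = pd.Series(scores).map(rank)
    median_level = levels[int(s.median())]
    at_level5 = (s >= rank["5B"]).mean()
    print(f"{name}: median {median_level}, {at_level5:.0%} at Level 5 or above")
```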
The evidence-driven decision making cycle
Trigger: Clues found in data, hunches
Explore: Is there really an issue?
Question: What do you want to know?
Assemble: Get all useful evidence together
Analyse: Process data and other evidence
> Interpret: What information do we have?
Intervene: Design and carry out action
Evaluate: What was the impact?
Reflect: What will we change?
Page 26
Making sense of information
Data becomes information when it is categorised, analysed, summarised and placed in context. Information therefore is data endowed with relevance and purpose.
Information is developed into knowledge when it is used to make comparisons, assess consequences, establish connections and engage in dialogue.
Knowledge … can be seen as information that comes laden with experience, judgment, intuition and values.
Empson (1999) cited in Mason (2003)
Page 27
Information
Knowledge gained from analysing data and making meaning from evidence.
Information is knowledge (or understanding) that can inform your decisions.
How certain you will be about this knowledge depends on a number of factors: where your data came from, how reliable it was, how rigorous your analysis was.
So the information you get from analysing data could be a conclusion, a trend, a possibility.
Page 28
Information
Summative information is useful for reporting against targets and as general feedback to teachers.
Formative information is information we can act on – it informs decision-making that can improve learning.
Page 29
Reading levels – terms 1 and 4
Page 30
Questions to elicit information
• Did the more able students make significant progress, but not the lower quartile?
• How have the scores of individual students changed?
• How many remain on the same level?
• How much have our teaching approaches contributed to this result?
• How much of this shift in scores is due to students’ predictable progress? Is there any data that will enable us to compare our students with a national cohort?
• How does this shift compare with previous Year 9 cohorts?
(A sketch answering the first three questions follows below.)
Page 31
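An illustrative sketch that answers the first three questions with invented data, assuming the reading levels have already been converted to a numeric scale as in the earlier sketch:

```python
import pandas as pd

# Invented numeric reading levels (rank on an ordered level scale).
df = pd.DataFrame({"term1": [2, 3, 3, 4, 4, 5, 5, 5, 5, 6],
                   "term4": [2, 3, 4, 5, 5, 5, 5, 6, 6, 7]})
df["change"] = df["term4"] - df["term1"]

print("Students who stayed on the same level:", (df["change"] == 0).sum())

# Compare progress for the lower quartile (by term 1 level) with the rest.
lower_quartile = df["term1"] <= df["term1"].quantile(0.25)
print("Mean change, lower quartile:", df.loc[lower_quartile, "change"].mean())
print("Mean change, rest of class: ", df.loc[~lower_quartile, "change"].mean())
```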
Words, words, words …
Information can … establish, indicate, confirm, reinforce, back up, stress, highlight, state, imply, suggest, hint at, cast doubt on, refute …
• Does this confirm that …?
• What does this suggest?
• What are the implications of …?
• How confident are we about this conclusion?
Page 32
Interrogate the information
• Is this the sort of result we envisaged? If not, why not?
• How does this information compare with the results of other research or the experiences of other schools?
• Are there other variables that could account for this result?
• Should we set this information alongside other data or evidence to give us richer information?
• What new questions arise from this information?
Page 33
Interrogate the information
• Does this relate to student achievement, or does it actually tell us something about our teaching practices?
• Does this information suggest that the school’s strategic goals and targets are realistic and achievable? If not, how should they change, or should we change?
• Does the information suggest we need to modify programmes or design different programmes?
• Does the information suggest changes need to be made to school systems?
Page 34
Interrogate the information
What effect is the new 6-day x 50-min period structure having on student engagement levels?
Page 35
Interrogate the information
What effect is the new 6-day x 50-min period structure having on student engagement levels?
Do student views align with staff views?
Do positive effects outweigh negative effects?
Is there justification for reviewing the policy?
Does the information imply changes need to be made to teaching practices or techniques?
Does the information offer any hint about what sort of changes might work?
Page 36
The evidence-driven decision making cycle
Trigger: Clues found in data, hunches
Explore: Is there really an issue?
Question: What do you want to know?
Assemble: Get all useful evidence together
Analyse: Process data and other evidence
Interpret: What information do you have?
Intervene: Design and carry out action
Evaluate: What was the impact?
Reflect: What will we change?
Page 37