Exploring Assessment for Learning


Transcript: Exploring Assessment for Learning

Consider the Evidence
Evidence-driven decision making
for secondary schools
A resource to assist schools
to review their use of data and other evidence
Page 1
www.minedu.govt.nz
© New Zealand Ministry of Education 2009 - copying restricted to use by New Zealand education sector.
Evidence-driven decision making
Today we aim to
• think about how we use data and other evidence to improve
teaching, learning and student achievement
• improve our understanding, confidence and capability in using data
to improve practice
• discuss how we make decisions
• think about our needs and start to plan our own evidence-based
projects
Page 2
Evidence-driven eating
You need to buy lunch. Before you decide what to buy you
consider a number of factors:
• how much money do you have?
• what do you feel like eating?
• what will you be having for dinner?
• how far do you need to go to buy food?
• how much time do you have?
• where are you going to eat it?
Page 3
Evidence-driven teaching
I had a hunch that Ana wasn’t doing as well as she could in her
research assignments, a major part of the history course. What
made me think this?
Ana’s general work (especially her writing) was fine. She made
perceptive comments in class, contributed well in groups and had
good results overall last year, especially in English.
How did I decide what to do about it?
I looked more closely at her other work. I watched her working in
the library one day to see if it was her reading, her use of resources,
her note taking, her planning, or what. At morning tea I asked one
of Ana’s other teachers about Ana’s approach to similar tasks. I
asked Ana if she knew why her research results weren’t as good as
her other results, and what her plans were for the next assignment.
I thought about all of this and planned a course of action. I gave her
help with using indexes, searching, note taking and planning and
linking the various stages of her research.
Page 4
Consider the Evidence
A resource to assist schools
to review their use of data and other evidence
What is meant by ‘data and other evidence’?
Page 5
Evidence
Any facts, circumstances or perceptions that can be used as an
input for an analysis or decision
• how classes are compiled, how classes are allocated to
teachers, test results, teachers’ observations, attendance data,
portfolios of work, student opinions …
Data are one form of evidence
Page 6
Data
Known facts or measurements, probably expressed in some
systematic or symbolic way (e.g. as numbers)
• assessment results, gender, attendance, ethnicity …
Data are one form of evidence
Page 7
Which factors are data?
Evidence to consider before buying lunch
• how much money you have
• what you feel like eating
• what you’ll be having for dinner
• how far you need to go to buy food
• how much time you have
• where you’re going to eat
• what your diet allows
Page 8
Evidence-driven decision making
We have more evidence about what students know and can do
than ever before – their achievements, behaviours, environmental
factors that influence learning
We should
• draw on all our knowledge about the learning environment to
improve student achievement
• explore what lies behind patterns of achievement
• decide what changes will make a difference
Page 9
What evidence does a school have?
• Demographics
• Student achievement
• Perceptions
• School processes
• Other practice
Page 10
Demographics
What data do we have now to provide a profile of our school?
What other data could we create?
• School
• Students
• Staff
• Parents/caregivers and community
Page 11
Demographics
Data that provides a profile of our school
• School – decile, roll size, urban/rural, single sex or coeducational, teaching spaces …
• Students – ethnicity, gender, age, year level, attendance,
lateness, suspension and other disciplinary data, previous
school, part-time employment …
• Staff – gender, age, years of experience, qualifications,
teaching areas, involvement in national curriculum and
assessment, turnover rate …
• Parents/caregivers and community – socio-economic factors,
breadth of school catchment, occupations …
Page 12
Student achievement
What evidence do we have now about student achievement?
What other evidence could we collect?
• National assessment results
• Standardised assessment results administered internally
• Other in-school assessments
• Student work
Page 13
Student achievement
Evidence about student achievement
• National assessment results - NCEA, NZ Scholarship - details like
credits above and below year levels, breadth of subjects
entered…
• Standardised assessment results administered internally - PAT,
asTTle …
• Other in-school assessments - most non-standardised, but some, especially within departments, will be consistent across classes; includes data from previous schools (primary/intermediate)
• Student work - work completion rates, internal assessment completion patterns, exercise books, notes, drafts of material – these can provide useful supplementary evidence
Page 14
Perceptions
What evidence do we have now about what students, staff and others
think about the school?
Are there other potential sources?
• Self appraisal
• Formal and informal observations made by teachers
• Structured interactions
• Externally generated reports
• Student voice
• Other informal sources
Page 15
Perceptions
Evidence about what students, staff, parents and the community
think about the school
• Self appraisal - student perceptions of their own abilities,
potential, achievements, attitudes …
• Formal and informal observations made by teachers - peer
interactions, behaviour, attitudes, engagement, student-teacher
relationships, learning styles, classroom dynamics …
• Structured interactions - records from student interviews, parent
interviews, staff conferences on students …
• Externally generated reports - from ERO and NZQA (these
contain data but also perceptions) …
• Student voice - student surveys, student council submissions …
• Other informal sources – views about the school environment,
staff and student morale, board perceptions, conversations
among teachers …
Page 16
School processes
What evidence do we have about how our school is organised and
operates?
• Timetable
• Classes
• Resources
• Finance
• Staffing
Page 17
School processes
Evidence about how our school is organised and operates
• School processes - evidence and data about how your school is
organised and operates, including:
• Timetable – structure, period length, placement of breaks,
subjects offered, student choices, tertiary and workforce factors,
etc
• Classes - how they are compiled, their characteristics, effect of
timetable choices, etc
• Resources - access to libraries, text books, ICT, special
equipment, etc
• Finance - how the school budget is allocated, how funds are
used within departments, expenditure on professional
development
• Staffing - policies and procedures for employing staff, allocating
responsibility, special roles, workload, subjects and classes
Page 18
Other practice
How can we find out about what has worked (or not) in other
schools?
Page 19
Other practice
How can we find out about what has worked in other schools?
• Documented research – university and other publications,
Ministry of Education’s Best Evidence Syntheses, NZCER,
NZARE, overseas equivalents …
• Experiences of other schools – informal contacts, local clusters,
advisory services, TKI LeadSpace …
Page 20
What can we do with evidence?
Shane’s story
A history HOD wants to see whether history students are
performing to their potential.
She prints the latest internally assessed NCEA records for history
students across all of their subjects. As a group, history students
seem to be doing as well in history as they are in other subjects.
Then she notices that Shane is doing very well in English and only
reasonably well in history. She wonders why, especially as both are
language-rich subjects with many similarities.
The HOD speaks with the history teacher, who says Shane is
attentive, catches on quickly and usually does all work required. He
mentions that Shane is regularly late for class, especially on
Monday and Thursday. So he often misses important information or
takes time to settle in. He has heard there are ‘problems at home’
so has overlooked it, especially as the student is doing reasonably
well in history.
Page 21
Shane’s story (cont...)
The HOD looks at the timetable and discovers that history is Period
1 on Monday and Thursday. She speaks to Shane’s form teacher
who says that she suspects Shane is actually late to school virtually
every day. They look at centralised records. Shane has excellent
attendance but frequent lateness to period 1 classes.
The HOD speaks to the dean who explains that Shane has to take
his younger sister to school each morning. He had raised the issue
with Shane but he said this was helping the household get over a
difficult period and claimed he could handle it.
The staff involved agree that Shane’s regular lateness is having a
demonstrable impact on his achievement, probably beyond history
too, though less obviously.
The dean undertakes to speak to the student, history teacher, and
possibly the parents to find a remedy for the situation.
Page 22
Thinking about Shane’s story
What were the key factors in the scenario about Shane?
What types of data and other evidence were used?
What questions did the HOD ask?
What happened in this case that wouldn’t necessarily happen in
some schools?
Page 23
Shane’s story - keys to success
The history HOD looked at achievement data in English and
history.
She looked for something significant across the two data sets, not
just low achievement.
Then she asked a simple question: Why is there such a disparity
between these two subjects for that student?
She sought information and comments (perceptions evidence
and data) from all relevant staff.
The school had centralised attendance and punctuality records
(demographic data) that the form teacher could access easily.
The action was based on all available evidence and designed to
achieve a clear aim.
Page 24
Evidence-driven strategic planning
If we use evidence-driven decision making to improve student
achievement and enhance teaching practice …
… it follows that strategic planning across the school should also be
evidence-driven.
Page 25
Evidence-driven strategic planning
[Diagram linking the school charter, self review and PD to the planning chain below]

INDICATORS FROM DATA
• asTTle scores show a high proportion of year 9 achieving below curriculum level
• NCEA results show high non-achievement in transactional writing
• Poor results in other language NCEA standards

STRATEGIC GOAL
• To raise the levels of writing across the school
• Strategic action: development plan to be based on an analysis of all available data and to include a range of shared strategies, etc.

ANNUAL PLAN
• Develop and implement a plan to raise levels of writing at year 9
• Develop a writing development plan which addresses writing across subjects and levels, including targets, professional development and other resourcing needs, etc.

YEAR TARGET
• Raise writing asTTle results for year 9 boys from 3B to 3A

EVALUATION DATA
• Appraisal
• asTTle writing results improve by …
• Perception data from Yr 9 staff indicates …
• Evaluation of effectiveness of the range of shared strategies, barriers and enablers …

Page 26
The evidence-driven decision making cycle
Trigger
Explore
Question
Assemble
Analyse
Interpret
Intervene
Evaluate
Reflect
Page 27
The evidence-driven decision making cycle
Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?
Page 28
The evidence-driven decision making cycle
Trigger – Data indicate a possible issue that could impact on student achievement
Speculate – A teacher has a hunch about a problem or a possible action
Explore – Check data and evidence to explore the issue
Question – Clarify the issue and ask a question
Assemble – Decide what data and evidence might be useful
Analyse – Analyse data and evidence
Interpret – Insights that answer your question
Intervene – Plan an action aimed at improving student achievement
Act – Carry out the intervention
Evaluate – Evaluate the impact of the intervention
Reflect – Reflect on what has been learned, how practice will change
Page 29
The evidence-driven decision making cycle
[Cycle diagram: TRIGGER / SPECULATE → EXPLORE → QUESTION → ASSEMBLE → ANALYSE → INTERPRET → INTERVENE → ACT → EVALUATE → REFLECT]
Page 30
The evidence-driven decision making cycle
Trigger – Significant numbers not achieving well in writing
Speculate – A teacher has a hunch: poor writers might spend little time on homework
Explore data – Survey of students shows that this is only partially true
Question – What are the characteristics of students who are poor at writing?
Assemble more data and other evidence – asTTle reading, homework, extracurricular activities, attendance, etc.
Analyse – NQF/NCEA results by standard; analyse non-NQF/NCEA data and evidence
Interpret information – Poor writers likely to play sport, speak well, read less, do little homework
Intervene – Create multiple opportunities for writing; include topics that can use sport as context; connect speaking and writing. PD for staff.
Evaluate – Has writing improved?
Reflect – How will we teach writing in the future?
Page 31
Evaluate and reflect
• Summative evaluation – assess how successful the intervention
was; decide how our practice will change; report to board
• Formative evaluation – at every stage in the cycle we reflect
and evaluate
Are we on the right track?
Do we need to fine-tune?
Do we actually need to complete this?
Page 32
Types of analysis
We can compare achievement data by subject or across subjects
for
• an individual student
• groups of students
• whole cohorts
The type of analysis we use depends on the question we want to
answer
Page 33
Inter-subject analysis
• Have my students not achieved a particular history standard
because they have poor formal writing skills, rather than poor
history knowledge?
Page 34
Intra-subject analysis
• What are the areas of strength and weakness in my own
teaching of this class?
Page 35
Longitudinal analysis
• Are we producing better results over time in year 11 biology?
Page 36
The evidence-driven decision making cycle
> Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?
Page 37
Asking questions
Evidence-driven decision making starts with asking good questions
You can tell whether a man is clever by his answers. You can
tell whether he is wise by his questions.
Nobel Prize winner, Naguib Mahfouz
Page 38
Trigger questions
• How good/poor is …?
• What aspects of … are good/poor?
• Is … actually changing?
• How is … changing?
• Is … better than last year?
• How can … be improved?
• Why is … good/poor?
• What targets are reasonable for …?
• What factors influence the situation for …?
• What would happen if we …?
Formative or summative?
Page 39
Summative questions
A target in the school’s annual plan is for all year 10 boys to
improve their writing level by at least one level using asTTle (e.g.
from 4B to 4A).
Have all year 10 boys improved by at least one asTTle level in
writing?
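
As an illustration only (not part of the original resource), the sketch below shows one way this summative question could be checked against a spreadsheet of term 1 and term 4 asTTle writing results. The student names, column names and sub-level ordering used here are assumptions.

```python
import pandas as pd

# asTTle sub-levels in ascending order (B = basic, P = proficient, A = advanced);
# position in the list gives an ordinal rank we can compare
SUBLEVELS = ["3B", "3P", "3A", "4B", "4P", "4A", "5B", "5P", "5A"]
RANK = {level: i for i, level in enumerate(SUBLEVELS)}

# Hypothetical export of year 10 boys' writing results
boys = pd.DataFrame({
    "student": ["Student 1", "Student 2", "Student 3"],
    "term1":   ["4B", "3A", "4P"],
    "term4":   ["4A", "4B", "4P"],
})

# Did each student move up by at least one sub-level between assessments?
boys["improved"] = boys["term4"].map(RANK) - boys["term1"].map(RANK) >= 1

print(boys)
print("All year 10 boys improved:", bool(boys["improved"].all()))
```

Run over the real cohort, the same flag also lists exactly which boys did not make the target shift, which feeds the formative questions that follow.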
Page 40
Questions about policy
We have been running 60-minute periods for 5 years now.
What effect has the change had?
Page 41
Formative questions from data
The data suggest our students are achieving well in A, but less well
in B.
What can we do about that?
Page 42
Formative questions from data
A significant proportion of our school leavers enrol in vocational
programmes at polytechnic or on-job.
How well do our school programmes prepare those students?
Page 43
Questions from hunches
• I suspect this poor performance is being caused by …
Is this true?
• We reckon results will improve if we put more effort into ...
Is this likely?
• I think we’d get better results from this module if we added …
Is there any evidence to support this idea?
Page 44
Hunches from raw data
     Student    2.1   2.2   2.3   2.4*  2.5*  2.6*  ABS   DET
 1   Pamela     N     A     N     N     N     N     20    6
 2   Lee        A     A     A     N     N     A     12    0
 3   Manu       E     E     E     E     N     E     18    4
 4   Keisha     N     A     N     N     N     N     7     8
 5   Bron       E     M     M     N     N     A     3     0
 6   Deane      M     M     E     M     N     A     2     1
 7   Slane      N     A     N     N     N     N     22    8
 8   Sam        A     A     N     A     A     A     12    8
 9   Sione      M     M     N     N     N     N     2     2
10   Oran       A     A     A     A     A     A     7     0
11   Shirin     E     E     E     E     A     E     6     0
12   Hanna      E     E     M     M     A     M     0     1
13   Val        E     E     E     E     N     E     0     0
14   Liam       N     A     M     M     N     M     10    2
15   Morgan     M     M     M     M     N     M     15    0
16   Hone       N     A     N     N     N     N     17    4
17   Mahi       A     A     N     A     A     A     10    0
Page 45
Hunches from raw data
• Is the class as a whole doing better in internally assessed
standards than in externally assessed standards? If so, why?
• Are the better students (with many Excellence results) not doing
as well in external assessments as in internal? If so, why?
• Is there any relationship between absences and achievement
levels? It seems not, but it’s worth analysing the data to be
sure.
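
The last hunch can be tested directly. Below is a minimal sketch (not part of the original resource) using a spreadsheet export of the table above; the file name, column names and the point values given to N/A/M/E (Not achieved, Achieved, Merit, Excellence) are assumptions.

```python
import pandas as pd

# Assumed point values for NCEA results
GRADE_POINTS = {"N": 0, "A": 1, "M": 2, "E": 3}
STANDARDS = ["2.1", "2.2", "2.3", "2.4", "2.5", "2.6"]

# Hypothetical export of the class table shown on the previous page
df = pd.read_csv("level2_results.csv")

# Average grade points per student across the six standards
points = df[STANDARDS].apply(lambda col: col.map(GRADE_POINTS))
df["mean_grade"] = points.mean(axis=1)

# A correlation near zero suggests no clear relationship between absences and
# achievement; a clearly negative value suggests more absence, lower results
print(df[["ABS", "mean_grade"]].corr())
```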
Page 46
The evidence-driven decision making cycle
Trigger – Clues found in data, hunches
> Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?
Page 47
Question – Explore – Question
It looks like our students are doing well in A but not in B. What can
we do about it?
EXPLORE … what else should we be asking?
Is this actually the case?
Is there anything in the data to suggest what we could do about it?
Page 48
Question – Explore – Question
We have been running 60-minute periods for a year now. Did the
change achieve the desired effects?
EXPLORE … what else should we be asking?
How has the change impacted on student achievement?
Has the change had other effects?
Is there more truancy?
Is more time being spent in class on assignments, rather than as
homework?
Page 49
The evidence-driven decision making cycle
Trigger – Clues found in data, hunches
Explore – Is there really an issue?
> Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?
Page 50
A very good question
• Specific and with a clear purpose
• Able to be investigated through looking at data and other
evidence
• Likely to lead to information on which we can act
Page 51
Questions with purpose
What do we know about reported bullying incidents for year 10
students?
MAY BE BETTER AS
Who has been bullying whom? Where?
What are students telling us?
What does pastoral care data tell us? Were some interventions
more effective with some groups of students than others?
Page 52
Write more purposeful questions
1. What are the attendance rates for year 11 students?
2. What has been the effect of the new 6-day x 50-min period
structure?
3. How well are boys performing in formal writing in year 9?
4. What has been the effect of shifting the lunch break to after
period 4?
Page 53
More purposeful questions
1. How do year 11 attendance rates compare with other year
levels? Do any identifiable groups of year 11 students attend
less regularly than average?
2. Is the new 6-day x 50-min period structure having any positive
effect on student engagement levels? Is it influencing
attendance patterns? What do students say?
3. Should we be concerned about boys’ writing? If so, what action
should we be taking to improve the writing of boys in terms of
the literacy requirements for NCEA Level 1?
4. The new timing of the lunch break was intended to improve
student engagement levels after lunch. Did it achieve this? If
so, did improvements in student engagement improve student
achievement? Do the benefits outweigh any disadvantages?
Page 54
The evidence-driven decision making cycle
Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
> Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?
Page 55
Assembling the evidence
• We want to know if our senior students are doing better in one
area of NCEA biology than another.
So … we need NCEA results for our cohort.
• It could be that all biology students do better in this area than
others.
So … we also need data about national differences across the two
areas.
Page 56
Are our data any good?
A school found that a set of asTTle scores indicated that almost all
students were achieving at lower levels than earlier in the year.
Then they discovered that the first test had been conducted in the
morning, but the later test was in the afternoon and soon after the
students had sat a two-hour exam.
Page 57
Think critically about data
• Was the assessment that created this data assessing exactly
what we are looking for?
• Was the assessment set at an appropriate level for this group of
students?
• Was the assessment properly administered?
• Are we comparing data for matched groups?
Page 58
Cautionary tale 1
You want to look at changes in a cohort’s asTTle writing levels over
12 months.
Was the assessment conducted at the same time both years?
Was it administered under the same conditions?
Has there been high turnover in the cohort?
If so, will it be valid to compare results?
Page 59
Cautionary tale 2
You have data that show two classes have comparable
mathematics ability. But end-of-year assessments show one class
achieved far better than the other.
What could have caused this?
Was the original data flawed? How did teaching methods differ?
Was the timetable a factor? Did you survey student views? Are the
classes comparable in terms of attendance, etc?
Page 60
The evidence-driven decision making cycle
Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
> Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?
Page 61
Analysing data and other evidence
• Schools need some staff members who are responsible for
leading data analysis
• Schools have access to electronic tools to process data into
graphs and tables
• All teachers do data analysis
• Data is not an end in itself – it’s one of the many stages along
the way to evidence-driven decision making
Page 62
Basic analysis
     Student    2.1   2.2   2.3   2.4*  2.5*  2.6*  ABS   DET
 1   Pamela     N     A     N     N     N     N     20    6
 2   Lee        A     A     A     N     N     A     12    0
 3   Manu       E     E     E     E     N     E     18    4
 4   Keisha     N     A     N     N     N     N     7     8
 5   Bron       E     M     M     N     N     A     3     0
 6   Deane      M     M     E     M     N     A     2     1
 7   Slane      N     A     N     N     N     N     22    8
 8   Sam        A     A     N     A     A     A     12    8
 9   Sione      M     M     N     N     N     N     2     2
10   Oran       A     A     A     A     A     A     7     0
11   Shirin     E     E     E     E     A     E     6     0
12   Hanna      E     E     M     M     A     M     0     1
13   Val        E     E     E     E     N     E     0     0
14   Liam       N     A     M     M     N     M     10    2
15   Morgan     M     M     M     M     N     M     15    0
16   Hone       N     A     N     N     N     N     17    4
17   Mahi       A     A     N     A     A     A     10    0
Page 63
Basic analysis
• Divide the class into three groups on the basis of overall
achievement
• Identify students who are doing so well at level 2 that they
could be working at a higher level
• Find trends for males and females, those who are absent often,
or have many detentions
• Compare this group’s external assessment success rate with the
national cohort.
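
A minimal sketch (not part of the original resource) of this basic analysis over a spreadsheet export of the class table. The file and column names, the point values, the "working above level" threshold and the treatment of 2.4–2.6 as the externally assessed standards (marked * above) are assumptions; the comparison with the national cohort would need NZQA data alongside this.

```python
import pandas as pd

GRADE_POINTS = {"N": 0, "A": 1, "M": 2, "E": 3}    # assumed point values
INTERNAL = ["2.1", "2.2", "2.3"]                    # assumed internally assessed
EXTERNAL = ["2.4", "2.5", "2.6"]                    # assumed externally assessed (*)

df = pd.read_csv("level2_results.csv")              # hypothetical export
points = df[INTERNAL + EXTERNAL].apply(lambda col: col.map(GRADE_POINTS))
df["overall"] = points.mean(axis=1)

# 1. Divide the class into three groups on overall achievement
df["band"] = pd.qcut(df["overall"].rank(method="first"), 3,
                     labels=["lower", "middle", "upper"])

# 2. Students doing so well they could be working at a higher level
#    (an average of 2.5, roughly Merit/Excellence, is an assumed threshold)
print(df.loc[df["overall"] >= 2.5, "student"])

# 3. Trends for students who are absent often or have many detentions
print(df.groupby(df["ABS"] >= 10)["overall"].mean())
print(df.groupby(df["DET"] >= 4)["overall"].mean())

# 4. Internal versus external results for this class (averages); set the
#    external figure beside national cohort data when it is available
print(points[INTERNAL].mean().mean(), points[EXTERNAL].mean().mean())
```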
Page 64
Reading levels – terms 1 and 4
[Chart: reading levels in terms 1 and 4]
Page 65
Making sense of the results
Think about significance and confidence
How significant are any apparent trends?
How much confidence can we have in the information?
Page 66
Making sense of the results
This table shows that reading levels overall were higher in term 4 than in term 1.
Scores improved for most students.
20% of students moved into level 5.
But the median score is still 4A.
Is this information? Can we act on it?
Page 67
Information
Knowledge gained from analysing data and making
meaning from evidence.
Information is knowledge (or understanding) that can inform your
decisions.
How certain you will be about this knowledge depends on a
number of factors: where your data came from, how reliable it
was, how rigorous your analysis was.
So the information you get from analysing data could be a
conclusion, a trend, a possibility.
Page 68
Information
Summative information is useful for reporting against targets and
as general feedback to teachers.
Formative information is information we can act on – it informs
decision-making that can improve learning.
Page 69
Questions to elicit information
• Did the more able students make significant progress, but not
the lower quartile?
• How have the scores of individual students changed?
• How many remain on the same level?
• How much have our teaching approaches contributed to this
result?
• How much of this shift in scores is due to students’ predictable
progress? Is there any data that will enable us to compare our
students with a national cohort?
• How does this shift compare with previous Year 9 cohorts?
Page 70
Reading levels – terms 1 and 4
Page 71
Words, words, words …
Information can … establish, indicate, confirm, reinforce, back up,
stress, highlight, state, imply, suggest, hint at, cast doubt on,
refute …
• Does this confirm that …?
• What does this suggest?
• What are the implications of …?
• How confident are we about this conclusion?
Page 72
The evidence-driven decision making cycle
Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
> Interpret – What information do we have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?
Page 73
Making sense of information
Data becomes information when it is categorised, analysed,
summarised and placed in context.
Information therefore is data endowed with relevance and purpose.
Information is developed into knowledge when it is used to make
comparisons, assess consequences, establish connections and
engage in dialogue.
Knowledge … can be seen as information that comes laden with
experience, judgment, intuition and values.
Empson (1999) cited in Mason (2003)
Page 74
Interrogate the information
• Is this the sort of result we envisaged? If not, why?
• How does this information compare with the results of other
research or the experiences of other schools?
• Are there other variables that could account for this result?
• Should we set this information alongside other data or evidence
to give us richer information?
• What new questions arise from this information?
Page 75
Interrogate the information
• Does this relate to student achievement - or does it actually tell
us something about our teaching practices?
• Does this information suggest that the school’s strategic goals
and targets are realistic and achievable? If not, how should they
change, or should we change?
• Does the information suggest we need to modify programmes
or design different programmes?
• Does the information suggest changes need to be made to
school systems?
Page 76
Interrogate the information
What effect is the new 6-day x 50-min period structure having on
student engagement levels?
Page 77
Interrogate the information
What effect is the new 6-day x 50-min period structure having on
student engagement levels?
Do student views align with staff views?
Do positive effects outweigh negative effects?
Is there justification for reviewing the policy?
Does the information imply changes need to be made to teaching
practices or techniques?
Does the information offer any hint about what sort of changes
might work?
Page 78
The evidence-driven decision making cycle
Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
> Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?
Page 79
Professionals making decisions
How do we decide what action to take as a result of the information
we get from the analysis?
We use our professional judgment.
Page 80
Professional decision making
We have evidence-based information that we see as reliable and
valid
What do we do about it?
If the information indicates a need for action, we use our collective
experience to make a professional decision
Page 81
Professionals making decisions
Have my students not achieved a particular history standard
because they have poor formal writing skills, rather than poor
history knowledge?
The answer was ‘Yes’ ... so I need to think about how to improve
their writing skills. How will I do that?
Page 82
Professionals making decisions
Do any particular groups of year 11 students attend less regularly
than average for the whole cohort?
The analysis identified two groups – so I need to think about how
to deal with irregular attendance for each group.
How will I do that?
Page 83
Professionals making decisions
You asked what factors are related to poor student
performance in formal writing.
The analysis suggested that poor homework habits have a
significant impact on student writing.
You make some professional judgements and decide
• Students who do little homework don’t write enough
• You could take action to improve homework habits - but you’ve
tried that before and the success rate is low
• You have more control over other factors – like how much time
you give students to write in class
So you conclude – the real need is to get students to write
more often
Page 84
Deciding on an action
Information will often suggest a number of options for action. How
do we decide which action to choose?
We need to consider
• what control we have over the action
• the likely impact of the action
• the resources needed
Page 85
Planning for action
• Is this a major change to policy or processes?
• What other changes are being proposed?
• How soon can you make this change?
• How will you achieve wide buy-in?
• What time and resources will you need?
• Who will co-ordinate and monitor implementation?
Page 86
Planning for action
• Is this an incremental change? Or are you just tweaking how
you do things?
• How will you fit the change into your regular work?
• When can you start the intervention?
• Will you need extra resources?
• How will this change affect other things you do?
• How will you monitor implementation?
Page 87
Timing is all
• How long should we run the intervention before we evaluate it?
• When is the best time of the year to start (and finish) in terms
of measuring changes in student achievement?
• How much preparation time will we need to get maximum
benefit?
Page 88
Planning for evaluation
We are carrying out this action to see what impact it has on
student achievement
We need to decide exactly how we’ll know how successful the
intervention has been
To do this we will need good baseline data
Page 89
Planning for evaluation
• What evidence do we need to collect before we start?
• Do we need to collect evidence along the way, or just at the
end?
• How can we be sure that any assessment at the end of the
process will be comparable with assessment at the outset?
• How will we monitor any unintended effects?
Don’t forget evidence such as timetables, student opinions, teacher
observations …
Page 90
The evidence-driven decision making cycle
Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
> Evaluate – What was the impact?
Reflect – What will we change?
Page 91
Evaluate the impact of our action
Did the intervention improve the situation that triggered the
process?
If the aim was to improve student achievement, did that happen?
Page 92
Evaluate the impact of our action
Was any change in student achievement significant?
What else happened that we didn’t expect?
How do our results compare with other similar studies we can find?
Does the result give us the confidence to make the change
permanent?
Page 93
Evaluate the impact of our action
A school created a new year 13 art programme. In the past
students had been offered standard design and painting
programmes, internally and externally assessed against the full
range of achievement standards. Some students had to produce
two folios for assessment and were unsure of where to take their
art after leaving school.
The new programme blended drawing, design and painting
concepts and focused on electronic media. Assessment was against
internally assessed standards only.
Page 94
Evaluate the impact of our action
• Did students complete more assessments?
• Did students gain more national assessment credits?
• How did student perceptions of workload and satisfaction
compare with teacher perceptions from the previous year?
• Did students leave school with clearer intentions about where to
go next with their art than the previous cohort?
• How did teachers and parents feel about the change?
Page 95
Evaluate the intervention
How well did we design and carry out the intervention? Would we
do anything differently if we did it again?
Were our results affected by anything that happened during the
intervention period - within or beyond our control?
Did we ask the right question in the first place? How useful was
our question?
How adequate were our evaluation data?
Page 96
Think about the process
• Did we ask the right question in the first place? How useful was
our question?
• Did we select the right data? Could we have used other
evidence?
• Did the intervention work well? Could we have done anything
differently?
• Did we interpret the data-based information correctly?
• How adequate were our evaluation data?
• Did the outcome justify the effort we put into it?
Page 97
The evidence-driven decision making cycle
Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
> Reflect – What will we change?
Page 98
Future practice
• What aspects of the intervention will we embed in future
practice?
• What aspects of the intervention will have the greatest impact?
• What aspects of the intervention can we maintain over time?
• What changes can we build into the way we do things in our
school?
• Would there be any side-effects?
Page 99
Future directions
• What professional learning is needed? Who would most benefit
from it?
• Do we have the expertise we need in-house or do we need
external help?
• What other resources do we need?
• What disadvantages could there be?
• When will we evaluate this change again?
Page 100
Consider the Evidence
Terminology
Page 101
Terminology
Terminology used in the
evidence-driven decision making cycle
Trigger – Clues found in data, hunches
Explore – Is there really an issue?
Question – What do you want to know?
Assemble – Get all useful evidence together
Analyse – Process data and other evidence
Interpret – What information do you have?
Intervene – Design and carry out action
Evaluate – What was the impact?
Reflect – What will we change?
Page 102
Trigger
Data, ideas, hunches, etc that set a process in action.
The trigger is whatever it is that makes you think there could be an
opportunity to improve student achievement. You can routinely
scan available data looking for inconsistencies, etc. It can be useful
to speculate about possible causes or effects - and then explore
data and other evidence to see if there are any grounds for the
speculation.
Page 103
Explore
Initial data, ideas or hunches usually need some preliminary
exploration to pinpoint the issue and suggest good questions to
ask.
Page 104
Question
This is the key point: what question/s do you want answered.
Questions can raise an issue and/or propose a possible solution.
Page 105
Assemble
Get together all the data and evidence you might need – some will
already exist and some will have to be generated for the occasion.
Page 106
Analyse
Process sets of data and relate them to other evidence.
You are looking for trends and results that will answer your
questions (but watch out for unexpected results that might suggest
a new question).
Page 107
Interpret
Think about the results of the analysis and clarify the knowledge
and insights you think you have gained.
Interrogate the information. It’s important to look at the
information critically. Was the data valid and reliable enough to
lead you to firm conclusions? Do the results really mean what they
seem to mean? How sure are you about the outcome? What
aspects of the information lead to possible action?
Page 108
Intervene
Design and implement a plan of action intended to change the
situation you started with.
Be sure that your actions are manageable and look at the
resourcing needed. Consider how you’ll know what has been
achieved.
Page 109
Evaluate
Using measures you decided in advance, assess how successful the
intervention has been.
Has the situation that triggered the process been improved? What
else happened that you maybe didn’t expect?
Page 110
Reflect
Think about what has been learned and discovered – and what
practices you will change as a consequence.
What did we do that worked? Did this process suggest anything
that we need to investigate further? What aspects of the
intervention can be maintained? What support will we need?
Page 111
Terminology
Other terms used in Consider the Evidence
Page 112
Terminology
Analysis
A detailed examination of data and evidence intended to
answer a question or reveal something.
This simplistic definition is intended to point out that data analysis
is not just about crunching numbers - it’s about looking at data and
other evidence in a purposeful way, applying logic, creativity and
critical thinking to see if you can find answers to your questions or
reveal a need. For example, you can carry out a statistical analysis
of national assessment results in the various strands of English
across all classes at the same level. You could compare those
results with attendance patterns. But you might also think about
those results in relation to more subjective evidence - such as how
each teacher rates his/her strengths in teaching the various
strands.
Page 113
Terminology
Aggregation
A number of measures made into one.
This is a common and important concept in dealing with data. A
single score for a test that contains more than one question is an
aggregation - two or more results have been added to get a single
result. Aggregation is useful when you have too few data to create
a robust measure or you want to gain an overview of a situation.
But aggregation can blur distinctions that could be informative. So
you will often want to disaggregate some data – to take data apart
to see what you can discover from the component parts. For
example, a student may do moderately well across a whole
subject, but you need to disaggregate the year’s result to see
where her weaknesses lie.
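
A minimal sketch (not part of the original resource, using invented marks) of the same idea: per-question marks are aggregated into one test score, then disaggregated by strand to show where the weakness lies.

```python
import pandas as pd

# Hypothetical per-question marks for one student's mathematics test
marks = pd.DataFrame({
    "question": [1, 2, 3, 4, 5, 6],
    "strand":   ["number", "number", "algebra", "algebra", "geometry", "geometry"],
    "mark":     [4, 5, 1, 2, 4, 5],
    "out_of":   [5, 5, 5, 5, 5, 5],
})

# Aggregation: a single overall score for the test
print(marks["mark"].sum() / marks["out_of"].sum())      # 0.70 – looks moderately good

# Disaggregation: the same marks broken down by strand
by_strand = marks.groupby("strand")[["mark", "out_of"]].sum()
print(by_strand["mark"] / by_strand["out_of"])          # algebra (0.30) stands out as weak
```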
Page 114
Terminology
Data
Known facts or measurements, probably expressed in
some systematic or symbolic way (eg as numbers).
Data are codified evidence. (The word is used as a plural noun in
this kit.) The concepts of validity and reliability apply to data. It
helps to know where particular data came from; how data were
collected and maybe processed before you received them. Some
data (eg attendance figures) will come from a known source that
you have control of and feel you understand and can rely on. Other
data (eg standardised test results) come from a source you might
not really understand; they may be subject to manipulation and
predetermined criteria or processes (like standards or scaling).
Some data (eg personality profiles) may be presented as if they are
sourced in an objective way but their reliability might be variable.
Page 115
Terminology
Demographics
Data relating to characteristics of groups within the
school’s population. Data that provides a profile of people
at your school.
You will have the usual data relating to your students (gender,
ethnicity, etc) and your staff (gender, ethnicity, years of
experience, etc). Some schools collect other data, such as the
residential distribution of students and parental occupations.
Page 116
Terminology
Disaggregation
See aggregation
When you disaggregate data, you take aggregated data apart to
see what you can discover from the component parts. For example,
a student may do moderately well across a whole subject, but you
need to disaggregate the year’s result to see where her
weaknesses lie.
Page 117
Terminology
Evaluation
Any process of reviewing or making a judgement about a
process or situation.
In this resource, evaluation is used in two different but related
ways. After you have analysed data and taken action to change a
situation, you will carry out an evaluation to see how successful
you have been - this is summative evaluation. But you are also
encouraged to evaluate at every step of the way - when you select
data, when you decide on questions, when you consider the results
of data analysis, when you decide what actions to take on the basis
of the data - this is called formative evaluation.
Page 118
Terminology
Evidence
Any facts, circumstances or perceptions that can be used
as an input for an analysis or decision.
For example, the way classes are compiled, how a timetable is
structured, how classes are allocated to teachers, student portfolios
of work, student opinions. These are not data, because they are
not coded as numbers, but they can be factors in shaping teaching
and learning and should be taken into account whenever you
analyse data and when you decide on action that could improve
student achievement.
Page 119
Terminology
Information
Knowledge gained from analysing data and making
meaning from evidence.
Information is knowledge (or understanding) that can inform your
decisions. How certain you will be about this knowledge depends
on a number of factors: where your data came from, how reliable it
was, how rigorous your analysis was. So the information you get
from analysing data could be a conclusion, a trend, a possibility.
Page 120
Terminology
Inter-subject analysis
A detailed examination of data and evidence gathered from
more than one learning area.
Inter subject analysis can answer questions or reveal trends about
students or teaching practices that are common to more than one
learning area. For example, analysing the results of students taking
mathematics and physics subjects can indicate the extent to which
achievements in physics are aided or impeded by the students’
mathematical skills.
Page 121
Terminology
Intervention
Any action that you take to change a situation, generally
following an analysis of data and evidence.
This term is useful as it emphasises that to change students’
achievement, you will have to change something about the
situation that lies behind achievement or non-achievement. You
will take action to interrupt the status quo.
Page 122
Terminology
Intra-subject analysis
A detailed examination of data and other evidence
gathered from within a specific learning area.
Intra subject analysis can answer questions or reveal trends about
student achievement or teaching within a subject or learning area.
For example, an analysis of assessment results for all students
studying a particular subject in a school can reveal areas of
strength and weakness in student achievement and/or in teaching
practices, etc. Comparison of a school’s results in a subject with
results in that subject in other schools is also intra subject analysis.
Page 123
Terminology
Longitudinal analysis
A detailed examination of data and evidence to reveal
trends over time.
Longitudinal analysis in education is generally used to reveal
patterns in student achievement, behaviour, etc over a number of
years. Results can reveal the relative impact of different learning
environments, for example. In this resource, it is suggested that
longitudinal analysis can be applied to teaching practice and school
processes. For example, the impact of modified teaching practices
in a subject over a number of years can be evaluated by analysing
the achievements of successive cohorts of students.
Page 124