
Performance tables - what are they good for (absolutely nothing?)
Professor Steve Strand
Institute of Education
University of Warwick
Lambeth Raising Achievement: Making Use
of Data and Good Practice Annual
Conference, International House, 3
November 2011
Objectives of the session
• Review the development of school performance data in England: what are their objectives, and do they meet them?
• How are performance tables (PTs) changing under the coalition government (White Paper, Nov 2010)? What is planned and what effects might the plans have?
• What are the conditions that can maximise the effective use of data for school improvement?
25 years of school performance data
• What data was routinely available in the mid-1980s to evaluate school performance?
• All secondary schools required to publish their
examination results from 1982, but no format
specified, wide inconsistencies
• Widespread testing but by LA or individual
school choice, rarely public
• LEA had option for inspection service but more
often advisory services, no reports published,
HMI typically restricted to thematic reports
The 1980s
• 1982: All secondary schools required to publish their examination results, although the Regulations did not specify the precise form.
• 1987: Nationwide extension of TVEI for 14-18 year olds. The central curricular innovation was the introduction of work experience. Split into local projects, carefully monitored to establish good practice – many performance indicators.
• 1988: Education Reform Act; introduction of LMS; national testing at ages 7-14; governors' detailed annual reports to parents; CIPFA & DES aide memoire, a list of '100 PIs in schools'.
Sample of TVEI / CIPFA / DES PIs
• % 5th year with 4+ GCE 'O' level / CSE Grade 1 passes
• % attendance
• Total days exclusion
• % Y11 who transfer to FTE post-16
• PTR / class size by year group
• % staff qualified to degree level in their subject
• % staff attendance
• % school periods not taught by designated teachers
• % staff involved in significant CPD
• Number of formal parental complaints
• % pupils whose parents attend parent consultation sessions
• Capitation per pupil on books / computers & IT / resources
• Are incidents of internal vandalism increasing?
• Are the school's objectives for community links being achieved?
Making sense of PIs
“Mere inspection of a list of indicators
will not typically reveal all the intricacies
of their inter-relationships … without a
detailed knowledge of the trade off
between inputs and outputs … policy
making may simply be confused by
these additional data”.
Mayston & Jesson (1988).
Timelines (1990s)
• May-1991: First statutory KS1 tests; results published at LA level (not intended)
• Nov-1992: First secondary performance tables published in England; Government loses interest in PIs
• May-1993: First statutory KS3 tests
• Sep-1993: Start of secondary OFSTED inspections (reports on web, summary to parents); primary school OFSTED inspections from 1994
• May-1995: First statutory KS2 tests
• Nov-1996: TIMSS international study (age 13) – England 16/25 in Maths (widely cited), 6/25 in Science (not widely cited!)
• Mar-1997: First primary performance tables, using 1996 results (England)
• Sep-1998: NLS introduced
• Sep-1998: First statutory Baseline Assessment in Reception (age 4+)
• Jan-1999: First OFSTED LEA inspections
• Sep-1999: NNS introduced
• Nov-1999: First 'Autumn Package' published
Crucial thread of VA/CVA
"Without a value-added dimension, the obvious
basis for judgement is that 'higher' scores
represent better practice and 'lower' scores
worse. This could lead to unwarranted
complacency on the part of some schools whose
pupil population comprise more able pupils and,
conversely, to despair on the part of others, who,
however hard they try can never expect to raise
the absolute level of their pupils’ scores to those
obtained in schools with more able pupils."
(SEAC 1993 Dearing interim report, Annex. 5, par. 3).
Willms (1992): analysis for 20 Scottish secondary schools (p. 60) [chart]
Above/Below Floor Target 2010

Demographics                                      Group 1   Group 2   Odds ratio
% FSM (primary)                                   30%       16%       2.0
% FSM (secondary)                                 27%       14%       2.3
% primary in top third deprived (IDACI) 2009      77%       <33%      1.7
% secondary in top third deprived (IDACI) 2009    70%       <33%      1.8
% SEN KS2                                         28%       21%       1.5
% SEN KS4                                         33%       21%       1.9
Looked after KS2 & KS4                            0.7%      0.4%      1.7
Persistent absence secondary (not FSM)            6%        3%        2.1
Mobility - % with >4% joining during Y6           42%       27%       2.0
BME secondary                                     21%       22%       1.0
CVA                                               ?         ?         ?

Source: DFE (2011). Underperforming schools and deprivation (RR141). London: DFE.
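To make the right-hand column concrete: if the odds ratios are computed directly from the two group percentages (an assumption here; the DFE report may derive them from a model), the figure is simply the ratio of the two groups' odds. A minimal sketch in Python:

```python
def odds_ratio(p1: float, p2: float) -> float:
    """Ratio of the odds of a characteristic in group 1 to the odds in group 2."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Secondary FSM row (hypothetical reading of the table): 27% vs 14%
print(round(odds_ratio(0.27, 0.14), 1))  # -> 2.3
```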
Timelines (2000s)
• Sep-2001: KS3 Strategy introduced (pilot of Science & TLF); first national targets for KS3 in 2004 (En, Ma, Sc, ICT)
• Jan-2002: First full PLASC
• Nov-2002: First KS3 published tables; first VA reported in secondary KS4 tables
• Sep-2003: Interactive AP becomes the PAT
• Jan-2004: David Miliband's North of England speech announces NRwS
• May-2005: Election of Labour for a third term
• Sep-2005: NRwS (SEF, School Profile, SIA, 'single conversation'); FFT exceptions reports
• Nov-2006: First CVA reported in secondary A&A tables
• Jun-2009: School Report Card proposed
• May-2010: Coalition Government formed
• Nov-2010: Schools White Paper
• Nov-2011: CVA to be removed from tables
Winning the argument, but…
• Value-added included in secondary performance tables from 2002, but:
  • 'Median line' methodology: a ceiling effect means level 3->5 cannot show VA (similarly 5->7 at KS3), which introduced a systematic bias against high-baseline schools (Tymms, 2004) – see the sketch below
  • Prior attainment only: no other pupil / school context factors
  • Narrow focus, still primarily on a single threshold measure (5+ A*-C)
  • Absence of confidence intervals around school value-added estimates
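To make the ceiling-effect point concrete, here is a minimal sketch of the median-line idea (illustrative only, not the DfES implementation, which fitted national median lines over a points scale): each pupil's VA is their outcome minus the median outcome of pupils with the same baseline, so a pupil whose baseline group already has the maximum score as its median can never show positive VA.

```python
import statistics
from collections import defaultdict

def median_line_va(pupils):
    """Median-line VA sketch: `pupils` is a list of (baseline, outcome) pairs.

    Each pupil's VA is their outcome minus the median outcome of all pupils
    sharing the same baseline score."""
    by_baseline = defaultdict(list)
    for baseline, outcome in pupils:
        by_baseline[baseline].append(outcome)
    medians = {b: statistics.median(o) for b, o in by_baseline.items()}
    return [outcome - medians[baseline] for baseline, outcome in pupils]

# Ceiling effect: if level 5 is both the test maximum and the median outcome
# for pupils starting from level 3, a level 3 -> 5 pupil scores VA = 0 at best.
```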
However, CVA arrived in 2006
• Differentiated PA, regression methodology (see the sketch below):
  • KS1 average points score and divergence
  • KS2 fine grades and divergence
  • Levels 345 = 48% chance of 5+ A*-C (1.4% of cohort); Levels 543 = 75% chance of 5+ A*-C (0.3% of cohort)
• Pupil factors: FSM, deprivation (IDACI), SEN, gender, age within year, mobility, in-care, ethnic group, EAL
• School composition:
  • School mean and SD of prior attainment
• With 95% confidence intervals
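By way of illustration only (this is not the official DCSF specification, which was a multilevel model with many more terms), a single-level sketch of the CVA idea: regress the KS4 outcome on prior attainment and pupil context, then report each school's mean residual with a 95% confidence interval. All names, variables and coefficients below are invented for the example.

```python
# Hypothetical single-level stand-in for a CVA-style model (the real CVA
# used multilevel modelling and a much richer set of terms).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "school":  rng.integers(0, 50, n),    # 50 invented schools
    "ks2_aps": rng.normal(27, 4, n),      # prior attainment (fine grades)
    "fsm":     rng.integers(0, 2, n),     # free school meals flag
    "idaci":   rng.uniform(0, 1, n),      # deprivation index
    "eal":     rng.integers(0, 2, n),     # English as an additional language
})
# Invented KS4 outcome driven by prior attainment and context, plus noise
df["ks4_score"] = (12 * df["ks2_aps"] - 15 * df["fsm"] - 20 * df["idaci"]
                   + rng.normal(0, 60, n))

# Contextualised model: prior attainment plus pupil background factors
model = smf.ols("ks4_score ~ ks2_aps + fsm + idaci + eal", data=df).fit()
df["residual"] = model.resid

# School 'CVA' = mean pupil residual, with a 95% confidence interval
cva = df.groupby("school")["residual"].agg(["mean", "sem"])
cva["ci_low"] = cva["mean"] - 1.96 * cva["sem"]
cva["ci_high"] = cva["mean"] + 1.96 * cva["sem"]
print(cva.round(1).head())
```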
The aims of Performance tables
1. Accountability, as publicly funded institutions, to government & the public
2. To support parental choice of schools
(market driven) aligned with open
enrolment
3. To raise standards / support school
improvement
1. Accountability
• Strong parental support
• Parents should be able to compare one school’s
performance against another (86%)
• Tests and exam results are one important
measure of a school’s performance (87%)
• The performance of each school in tests and exams should be published and publicly available (87%)

Nationally representative sample of 1,624 adults (including 550 parents of children aged 0-18) in England, November 2008. DCSF (2009). School accountability and school report card omnibus survey (DCSF-RR107). London: DCSF.
And among the media
• All the major newspapers publish school (and
university) league tables (Times, Guardian,
Telegraph etc).
• "Education is a perfect media topic. It has heat and light. Heat because everybody cares about it, and light because they all think that they understand it." (Journalist, quoted in Earl 2001, p6).
• With such an alliance of parents and media, any
Government is going to listen.
2. Parental Choice
Factors that influenced choice of secondary school    %
A visit to the school / open day                      82
Information from the school                           74
Other parents & carers                                58
Teachers                                              56
School league/performance tables                      55
Advice from previous school attended                  51
OFSTED / inspectors' report on the school             49
Information from the LEA                              36
Reports in local media                                36
Government written information                        26
Government online information                         18
Telephone survey of 3,005 parents in summer 2004. Wiseman et al (2005). London
Challenge: Second survey of parents & carers 2004 (RR624). London: DfES.
Social gradient in choice
• 'Academic outcomes' was the most common reason (43%) offered by parents for wanting a place in their favoured school. But the likelihood of citing 'academic outcomes' was significantly higher for:
  • Owner-occupiers (2:1 relative to parents renting)
  • Mothers in Social Class I & II (1.7:1 against manual)
  • BME mothers (1.7:1 against White)
  • Parents residing in London boroughs (2.5 times more likely not to apply to their nearest school than parents in Shire LEAs)
• Does this increase social segregation, by indirectly informing parents which schools have high concentrations of high-SES students, or does it democratise information that high-SES families already hold through their social networks?

2,170 parents of Y6 children in 2000. Flatley, J., et al. (2001). Parents' experiences of the process of choosing a secondary school (RR278).
3. School Improvement
• Meta-studies from the US using independent measures of attainment (NAEP, NELS) suggest a modest positive impact on average (ES = 0.24), but studies provide mixed findings and tend to polarise between the extremes (Lee, 2008)
• Difficulty of disentangling school performance tables from other policies adopted at the same time (teacher certification; rewards/sanctions for schools, e.g. teacher performance pay, school vouchers, school takeover threats, etc.)
• But a recent UK study comparing Wales and England has had a substantial impact
The Wales ‘experiment’
• Broad context is a substantial decline in Wales in
Programme for International Student Assessment
(PISA) scores for maths, reading and science
Burgess, Wilson & Worth (2010)
• Welsh Assembly announced it would cease publishing
secondary school performance tables in July 2001
(after exams taken). Natural experiment – otherwise
similar to England in inspection regimes, exams, etc.
• Any change in school results in Wales vs. England from
2002 onwards?
• Controls for:
  • Prior attainment at KS3, FSM entitlement, cohort size
  • Pupil funding, population density, Church/maintained status
  • Local competition (number of schools within 5 km)
Results
Burgess conclusions
• Decline for Wales seen for APS as well as 5+ A*-C, and congruent with PISA results – not gaming.
• Top 25% of Welsh schools (by high PA or low poverty) not affected. The effect is concentrated particularly in the lowest-performing schools, which fell furthest behind.
• Public scrutiny through PTs puts low-performing schools in England under great pressure to improve; similar schools in Wales may have 'coasted'.
• Removal of tables did not reduce school segregation and sorting (by FSM or by KS2 score) in Wales.
Unintended consequences
• Excellent review by Smith (1995) of eight unintended consequences of PIs
• In school performance tables these include:
  • Gaming: C/D boundary, GNVQ equivalents, switching exam boards, SEN School Action, etc.
  • Depressing baseline scores
  • Teaching to the test
  • Selective student admissions / removing "difficult" students
• Always happens in the school up the road! Little information on whether these work in the long term.
Should they stay or should they go?
• One of the least evidence-based areas of school policy,
but what evidence there is suggests PTs may be a
driver/energiser of improvement
• Both political left and right support them, but for different reasons (markets; FOI/openness/empowerment)
• Better than the alternatives? (remember the 1980s)
• Have achieved a level of consensus among school
leaders based on ‘moral’ argument around CVA and
fairness: “Every school, regardless of the SES circumstances in
which it operates, must have a fair opportunity of achieving a good
score” DFE (2009) A school report card, par.47.
• Tamper with this at your peril
How are the PTs changing?
• Addressing some of the gaming issues: in 2004, when vocational qualifications were first recognised as equivalent to GCSEs, 1,882 students gained level 2 passes through them; by 2010 this had risen to 462,182 students (see the Wolf review). Also, SEN to be counted at School Action Plus (SAP) or above only.
• EBacc: will it drive out vocational qualifications? 22% eligible in 2011, 33% in 2012 and 47% in 2013 (DFE survey of 692 schools)
• VA for low/mid/high prior attainment and for each EBacc subject (though are PTs the place for this? See later)
• Widen the range of indicators (average PA, %FSM, %EAL, etc.). Broadly good, but remember the fate of PI schemes. Some (e.g. university destinations) largely outside schools' control?
• New website ('GoCompare' style): http://www.education.gov.uk/researchandstatistics/statssearch
Removal of CVA
• It took from 1992 to 2006 to get CVA; now it is to be discarded. Why?
“The CVA measure is difficult for the public to
understand... It is morally wrong to have an
attainment measure which entrenches low
aspirations for children because of their
background. For example, we do not think it right to
expect pupils eligible for FSM to make less
progress from the same starting point as pupils who
are not eligible for FSM”. (DFE, 2010, Schools
White Paper, p68).
Fundamental misunderstanding
• Confuses student-level expectations with school-level accountability. Pupils on FSM not only have lower attainment but DO make less progress at school.
• However, this does not mean lower expectations for students: target setting and progress measures are not adjusted (and never were) for pupil background.
• The point is that factors outside a school's control (gender, deprivation, EAL, ethnicity, SEN, mobility) need to be included in a school accountability indicator.
Suggested replacements
• Two levels of progress – but highly subject to threshold effects
• Alternative approach to contextualisation through "families of 10 to 15 schools with similar intakes for all regions of the country" (DFE, 2010, p76), but:
“CVA, because it is based on individual pupil characteristics
and their attainment, is not prone to the biases that can be
created by comparisons based on school-level similarities.
We therefore believe that some form of CVA is the best
means of contextualising pupil progress.” (DFE 2009,
School Report Cards, par.77).
Options for schools?
• Does data speak for itself, like dials on the
dashboard of a car? Does feedback always lead to
improvement? Is providing data in PTs the best way
of securing school improvement?
• There is a wide range of school performance data services: PIPs/MidYIS/ALIS, FFT, NFERPASS, RAISE online, LA services like Lambeth's
• Their data is inevitably more detailed than that needed for performance tables (see the following examples), and they also offer the training & support to use the data effectively
Differential effectiveness by subject: VA against CAT
[Chart: subject residuals with 90% confidence intervals for Account & Finance, Administration, Art & Design, Biology, Business Manag., Chemistry, Computing, Craft & Design, Drama, English, French, Geography, German, Graphic Comm., History, Home Economics, Maths, Modern Studies, Music, PE, Physics, Religious Studies, Science, Social & Voc., Spanish, Technical]
Value added for different groups of pupils
• Are the pupils who make significantly more, or significantly less, progress:
  • boys/girls, SEN, EAL?
  • of different ability levels?
  • pupils who joined the school recently?
• Alternatively, are they pupils:
  • who missed particular classes for long periods?
  • with a particular teacher?
  • in a particular set?
  • who had extra support / intervention?
  • who followed a particular scheme of work?
  • whose teachers used different teaching practices?
Conclusions
• Accountability and PTs are here to stay
• Wales is reintroducing: Last week all secondary
schools were told which of 5 “bands” they had been
placed into as part of a new accountability regime
(based on raw scores, VA, attendance). Parents and
the media will get the information in December 2011. A
primary school model is being developed and will follow
next year. (TES, 23/09/11).
• We will see more and more data published and publicly available (on the web, 'GoCompare' style), but it will be harder and harder to make sense of it
Conclusions (Continued)
• The credibility of the data is key to users (Saunders, 2000); removal of CVA breaks that trust, and we should publicly oppose this change
• Resist the shift of the entire burden from the State to schools; keep a focus on the policy issues (like EMA and equity in university entry)
• PTs can provide the incentive (if seen as fair) for both low-attaining and (if CVA is included) high-attaining schools (avoiding complacency)
• But for SI we need much more detailed data, training & support: it should be a bright future for school performance data services!
End of Session – Thank you
Professor Steve Strand
Institute of Education
University of Warwick
[email protected]
Tel. (024) 7652 2197