Implementing Building the Curriculum 5
Improving Assessment, Improving Learning
Ken Greer, Executive Director (Education), Fife Council
CEM Conference, Glasgow
24th March 2011
How the world’s best-performing school systems come out on top
• “All of the top-performing and rapidly improving systems have curriculum standards which set clear and high expectations for what students should achieve.”
• “High performance requires every child to succeed.”
• “The only way to improve outcomes is to improve instruction.”
• “All of the top-performing systems also recognise that they cannot improve what they do not measure.”
The McKinsey Report, September 2007
Assessment: It’s about asking questions
1. What’s assessment for?
2. What system are we working with?
3. How are we doing it?
4. How do we make sure we are all talking about the same standard?
5. Which unintended consequences do we want to avoid?
6. How do we put all this together and make it work in Fife? (the unashamedly chauvinistic question)
1. What’s assessment for?
– to support learning;
– to give assurance to parents and others about learners’ progress; and
– to provide a summary of what learners have achieved, including through qualifications and awards, and to inform future improvements.
– Building the Curriculum 5
1. What’s assessment for? (2)
The Assessment Reform Group (ARG)
The use of assessment:
• to help build pupils’ understanding, within day-to-day lessons
• to provide information on pupils’ achievements to those on the outside of the pupil–teacher relationship: to parents (on the basis of in-class judgements by teachers and test and examination results) and to further and higher education institutions and employers (through test and examination results)
• data to hold individuals and institutions to account, including through the publication of results which encourage outsiders to make judgments on the quality of those being held to account.
2. What system are we working with?
• Building the Curriculum 3 (June 2008)
• Curriculum for Excellence: Es and Os
• CfE BtC 5 A framework for assessment: recognising achievement, profiling and reporting (December 2010)
• CfE BtC 5 A framework for assessment: understanding, applying and sharing standards in assessment for CfE: quality assurance and moderation (October 2010)
• The NAR: http://www.ltscotland.org.uk/nationalassessmentresource/
• 51,000+ teachers (4,000 in Fife)

Arrangements for
• Assessment
• Qualifications
• Self-evaluation and accountability
• Professional development
to support the purposes of learning
3. How are we doing it?
• CfE aspirations delivered (SLCIRCEC: successful learners, confident individuals, responsible citizens, effective contributors)
• CPD; support; challenge; reporting
• Knowing the limitations of various approaches to assessment
• Measuring what we value
• Working together to moderate/define standards, led by expert practitioners
• Monitoring progress, monitoring value-added
• Motivating: defining the bar
• Analysing; benchmarking; supporting
• Giving account and holding to account
• Finding a manageable way: economy, efficiency, effectiveness
• Milestones, not millstones
4. How do we make sure we are all talking about the same standard?
• Trust/professionalism
• The primacy of individual teachers’ judgements is at the heart of the assessment system in Scotland, supported by moderation at local authority level and across authorities
• A National Assessment Resource (NAR) to support teachers as they come to judgements about learners’ progress
• Outcomes and experiences, but not performance criteria
• Prior performance/other information?
• The car with no speedometer, no odometer and no petrol gauge
A view from another country
• “Progress also relies on the need to retain clear accountability through testing. This means at the end of primary school just as much as at the end of secondary.”
Gordon Brown, quoted in TES, 30/10/2009
• “In the less successful secondary schools, the limited use of assessment data on pupils on transfer to Year 7 led to insufficiently challenging targets for some pupils.”
• “In raising the attainment of learners in literacy who are most at risk of not gaining the skills they need for successful lives, the factors identified from visits on this survey included sharp assessment of progress in order to determine the most appropriate programme or support.”
Removing Barriers to Literacy, Ofsted 2011
5. Which unintended consequences do we want to avoid?
What we want to avoid:
• assessment which does not support learning, directly or indirectly
• de-motivation of any learner
• self-fulfilling prophecies
• ‘high stakes’ testing
• league tables
• false comparisons
What we want to promote:
• Improvements in learning, teaching and performance (SLCIRCEC)
6. What are we doing in Fife?
Fife’s performance culture
• Strategy for improvement
• Better understanding
• Appropriate information
• Collegiate approach
• The Fife way

Importance of the …
• right strategy
• right culture
• right information
• right interpretation
• right results, i.e. positive impact on performance
Themes
• Concentration on impact: relentless focus on outcomes
• Culture: need to develop a strong performance culture
• Context: need to understand underlying performance issues in an appropriate context
• Clarity: need to use appropriate information to identify performance issues
Context
What does national data tell us?
[Charts: percentage of school leavers entering training and percentage entering HE, each plotted by SIMD decile from most deprived (1) to most affluent (10).]
Challenging a deterministic view
• “…the PISA scores of the top-performing countries show a low correlation between outcomes and the home background of the individual student.”
The McKinsey Report, September 2007
• However, in Fife (Scotland?), social context has a strong relationship with attainment and other educational outcomes, including destinations
• Educational outcomes vary with social context across the social spectrum
• We need to understand this in order to address it.
Raising attainment … for all
[Charts: Fife — average SQA tariff points by the end of S4, by SIMD decile from most deprived to least deprived; Scotland — percentage of leavers entering HE, by SIMD decile.]
Benchmarking
Example: comparator authorities
• Authority 1 is a comparator authority for Authority 2 – rated as “very close” by HMIE
• In 2010, 9.4% of Authority 1 secondary pupils were FME, compared to 9.7% of Authority 2’s secondary pupils
• 2.2% of children in Authority 1 live in the SIMD 15% most deprived areas in Scotland, compared with 3.3% in Authority 2
Example: comparator authorities (2)
• In 2010, 38% of Authority 1 pupils achieved 5+ at level 5 by the end of S4, as compared with 65% in Authority 2
• What accounts for this difference?
Raising attainment … for all
• Performance management needs to account for the impact of social context and other relevant factors
• But … current school performance measures focus attention on the most deprived (e.g. FME, SIMD 15%)
• Need the best data to fit the issue, not the best fit to the available data
Some of the best data …
• CEM assessments from the University of Durham (PIPS, INCAS, MidYIS, SOSCA)
• Standardised to a national level of performance and comparable across stages
• Provides a coherent view of performance at local authority, establishment, curriculum area and class levels
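As a rough, hypothetical illustration of what “standardised to a national level” means in practice (CEM’s actual scaling method is not described in these slides), the sketch below rescales raw scores against a national mean and standard deviation so that results from different stages sit on one common scale. The mean-50, SD-10 convention used here is only an assumption, prompted by the PIPS P7 axis shown later; the numbers are invented.

```python
# Hypothetical sketch: putting raw scores on a common national scale.
# The mean-50 / SD-10 convention is an assumption, not CEM's published method.
from statistics import mean, stdev

def national_scale(national_raw, target_mean=50.0, target_sd=10.0):
    """Return a function that maps a raw score onto the national scale."""
    m, s = mean(national_raw), stdev(national_raw)
    return lambda raw: target_mean + target_sd * (raw - m) / s

# Illustrative national raw-score sample for one stage/test
to_scale = national_scale([12, 18, 22, 25, 28, 31, 35, 39])

# A class's raw scores, converted and averaged: directly comparable with
# the national mean of 50, and with standardised scores from other stages.
class_raw = [20, 26, 30, 33]
class_avg = mean(to_scale(x) for x in class_raw)
print(round(class_avg, 1))
```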
CEM assessment data
• Performance information: school, curriculum area and class level
• Tracking: individual level
Assessment points: PIPS (P1, P3, P5, P7) → SOSCA (S2) → SQA (S4, S5, S6)
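The framework above uses the same assessment points for two purposes: tracking at individual level and performance information at school, curriculum-area and class level. A minimal sketch of how such a record might be held and aggregated, with hypothetical field names and data (none taken from a real Fife or CEM system):

```python
# Minimal sketch of an individual tracking record across the assessment
# points named on the slide (PIPS P1/P3/P5/P7, SOSCA S2, SQA S4).
# Field names and values are hypothetical.
from dataclasses import dataclass, field
from statistics import mean
from typing import Optional

@dataclass
class PupilRecord:
    pupil_id: str
    school: str
    simd_decile: int                          # 1 = most deprived, 10 = least
    pips: dict = field(default_factory=dict)  # stage -> standardised score, e.g. {"P7": 53.0}
    sosca_s2: Optional[float] = None
    sqa_s4_tariff: Optional[float] = None

def school_average(records, attr="sqa_s4_tariff"):
    """Aggregate individual records into school-level performance information."""
    values = [getattr(r, attr) for r in records if getattr(r, attr) is not None]
    return mean(values) if values else None

cohort = [
    PupilRecord("p001", "School A", 3, {"P7": 47.0}, sosca_s2=49.0, sqa_s4_tariff=160),
    PupilRecord("p002", "School A", 8, {"P7": 55.0}, sosca_s2=56.0, sqa_s4_tariff=210),
]
print(school_average(cohort))                 # school-level SQA S4 average
```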
Looking across a cohort: Fife
[Chart: average SQA S4 tariff points by SIMD decile, from most deprived to least deprived deciles; R² = 0.99.]
The same cohort at P7: Fife
[Chart: average PIPS P7 score by SIMD decile, from most deprived to least deprived deciles; R² = 0.98.]
Continuous improvement … “value added” measures
At Fife level there is a strong correlation between performance in PIPS (at stage P7) and SQA (by S4) when viewed by social context (R² = 0.98). This is related to outcomes …
[Chart: SQA S4 average tariff points plotted against PIPS P7 average, by SIMD decile, from more deprived to more affluent.]
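A hedged sketch, in Python with invented data, of the kind of calculation behind the R² figures quoted above: group pupils by SIMD decile, average their PIPS P7 scores and SQA S4 tariff points, and fit a line through the decile averages. The per-pupil records and the school averages at the end are illustrative only; a school’s “value added” can then be read as its deviation from the fitted line.

```python
# Illustrative sketch (invented data): correlation between PIPS P7 and
# SQA S4 averages when pupils are grouped by SIMD decile.
import numpy as np

# Hypothetical per-pupil records: (simd_decile, pips_p7, sqa_s4_tariff)
pupils = np.array([
    (1, 44, 115), (1, 46, 130), (2, 45, 125), (3, 47, 140),
    (4, 48, 150), (5, 50, 160), (6, 51, 170), (7, 53, 185),
    (8, 54, 195), (9, 56, 210), (10, 57, 225), (10, 58, 235),
])

deciles = np.unique(pupils[:, 0])
pips_avg = np.array([pupils[pupils[:, 0] == d, 1].mean() for d in deciles])
sqa_avg = np.array([pupils[pupils[:, 0] == d, 2].mean() for d in deciles])

# Linear fit through the decile averages and its R^2
slope, intercept = np.polyfit(pips_avg, sqa_avg, 1)
r_squared = np.corrcoef(pips_avg, sqa_avg)[0, 1] ** 2
print(f"R^2 across deciles: {r_squared:.2f}")

# A school's "value added" at a given PIPS level is its deviation from the line
school_pips, school_sqa = 50.0, 175.0        # hypothetical school averages
expected = slope * school_pips + intercept
print(f"Value added: {school_sqa - expected:+.1f} tariff points")
```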
Continuous improvement … “value added” measures (2)
The red lines separate the levels of attainment most common amongst those who go on to HE, to FE or employment, and to unemployment.
[Chart: as above, SQA S4 average against PIPS P7 average by SIMD decile (R² = 0.98), with lines marking the attainment bands most associated with each destination.]
Conclusion: outcomes and social context
• There is substantial evidence that educational outcomes vary across the social spectrum
• Current approaches to measuring school performance do not adequately reflect this relationship
Conclusion: local data sources
• Local sources of information (e.g. CEM data) can give valuable additional insight
• This can help to understand year-on-year variations in performance
• This can provide “added value” measures:
– across the social spectrum
– within a given cohort
– by subject area
Conclusion: national data sources
• There is a lack of relevant data across the social spectrum
• E.g. there is no national data available on the SIMD profile of each education authority (EA) or school
• Relevant national data are vital for real understanding and improvement of:
– EA and school performance
– outcomes for young people
Conclusions
• Strong performance management can make a difference
• It requires the development of a performance culture
• It requires engagement by managers and leaders at all levels
• It needs to be based on the intelligent use of appropriate evidence