Research-Led Approaches to Increasing Pupil Learning (ppt)


Research-Led Approaches to Increasing Pupil Learning
Robert Coe
Capita Conference: Implementing the Pupil Premium
Newcastle, 8 July 2013
Outline
 How can we use school resources to get the biggest increases in learning?
 What can research tell us about the likely impact of different strategies?
 How do we implement these strategies?
 What else do we need to do to make it likely that attainment will rise?
Improving Education: A triumph of hope over experience
http://www.cem.org/attachments/publications/ImprovingEducation2013.pdf
Evidence about the effectiveness of different strategies
Toolkit of Strategies to Improve Learning
The Sutton Trust-EEF Teaching and Learning Toolkit
http://www.educationendowmentfoundation.org.uk/toolkit/
Impact vs cost
[Chart: Toolkit strategies plotted by effect size (months gain, roughly 0 to 8) against cost per pupil (£0 to £1000+), with regions labelled ‘Promising’, ‘May be worth it’ and ‘Not worth it’. Strategies plotted include Feedback, Meta-cognitive, Peer tutoring, Homework (Secondary), Collaborative, Early Years, 1-1 tuition, Behaviour, Small group tuition, Phonics, Parental involvement, ICT, Social and emotional learning, Individualised learning, Summer schools, Mentoring, Homework (Primary), Performance pay, Aspirations, Ability grouping, Smaller classes, After school and Teaching assistants.]
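The chart's vertical axis reports impact as approximate months of additional progress rather than as a raw effect size. As a hedged illustration of how that kind of figure can be produced, here is a minimal Python sketch: the scores are invented, and the conversion rate of roughly 12 months of progress per 1.0 of effect size is an illustrative assumption, not the Toolkit's published mapping.

```python
from statistics import mean, stdev

def effect_size(treatment, control):
    """Standardised mean difference (Cohen's d with a pooled standard deviation)."""
    n_t, n_c = len(treatment), len(control)
    pooled_sd = (((n_t - 1) * stdev(treatment) ** 2 +
                  (n_c - 1) * stdev(control) ** 2) /
                 (n_t + n_c - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

def months_gain(d, months_per_unit_effect_size=12):
    """Convert an effect size to approximate months of extra progress.

    The 12-months-per-1.0 rate is an illustrative assumption only.
    """
    return d * months_per_unit_effect_size

# Hypothetical end-of-year scores for an intervention class and a comparison class.
intervention = [54, 61, 48, 65, 59, 52, 57, 60]
comparison = [52, 55, 49, 58, 41, 56, 50, 53]

d = effect_size(intervention, comparison)
print(f"Effect size {d:.2f} = roughly {months_gain(d):.0f} months' additional progress")
```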
Key messages
 Some things that are popular or widely thought to be effective are probably not worth doing
– Ability grouping (setting); After-school clubs; Teaching assistants; Smaller classes; Performance pay; Raising aspirations
 Some things look ‘promising’
– Effective feedback; Meta-cognitive and self-regulation strategies; Peer tutoring/peer-assisted learning strategies; Homework
Clear, simple advice:
 Choose from the top left
 Go back to school and do it

For every complex problem there is an answer that is clear, simple, and wrong.
H.L. Mencken
Why not?
 We have been doing some of these things for a long time, but have generally not seen improvement
 Research evidence is problematic
– Sometimes the existing evidence is thin
– Research studies may not reflect real life
– Context and ‘support factors’ may matter
 Implementation is problematic
– We may think we are doing it, but are we doing it right?
– We do not know how to get large groups of teachers and schools to implement these interventions in ways that are faithful, effective and sustainable
So what should we do?
Four steps to improvement
 Think hard about learning
 Invest in good professional development
 Evaluate teaching quality
 Evaluate impact of changes
1. Think hard about learning
www.educationendowmentfoundation.org.uk/toolkit
Impact vs cost
[The Toolkit ‘Impact vs cost’ chart shown earlier (effect size in months gain against cost per pupil, with ‘Promising’, ‘May be worth it’ and ‘Not worth it’ regions) appears again here.]
1. Which strategies/interventions are very surprising (you really don’t believe it)?
2. Which strategies/interventions can you explain why they do (or don’t) improve attainment?
3. Which strategies/interventions do you want to know more about?
Poor Proxies for Learning
 Students are busy: lots of work is done (especially written work)
 Students are engaged, interested, motivated
 Students are getting attention: feedback, explanations
 Classroom is ordered, calm, under control
 Curriculum has been ‘covered’ (i.e. presented to students in some form)
 (At least some) students have supplied correct answers (whether or not they really understood them or could reproduce them independently)
A simple theory of learning
Learning happens when people have to think hard
Hard questions about your school
 How many minutes does an average pupil on an average day spend really thinking hard?
 Do you really want pupils to be ‘stuck’ in your lessons?
 If they knew the right answer but didn’t know why, how many pupils would care?
2. Invest in effective CPD
How do we get students to learn hard things?
Eg
 Place value
 Persuasive writing
 Music composition
 Balancing chemical equations
• Explain what they should do
• Demonstrate it
• Get them to do it (with gradually reducing support)
• Provide feedback
• Get them to practise until it is secure
• Assess their skill/understanding
How do we get teachers to learn hard things?
Eg
 Using formative assessment
 Assertive discipline
 How to teach algebra
• Explain what they should do
What CPD helps learners?
 Intense: at least 15 contact hours, preferably 50
 Sustained: over at least two terms
 Content focused: on teachers’ knowledge of subject content & how students learn it
 Active: opportunities to try it out & discuss
 Supported: external feedback and networks to improve and sustain
 Evidence based: promotes strategies supported by robust evaluation evidence
3. Evaluate teaching quality
Every teacher needs to improve, not because they are not good enough, but because they can be even better.
Dylan Wiliam
Identifying the best teachers
Sources of evidence:
1. Colleagues (peers, SLT) observing lessons
2. Trained outsiders observing lessons
3. Pupils’ test score gains
4. Progress in NC levels (from teacher assessment)
5. Pupils’ ratings of teacher/lesson quality
6. Teacher qualifications
7. Tests of teachers’ content knowledge
8. Parents’ ratings
9. Ofsted ratings
10. Colleagues’ (including SLT) perceptions
11. Teachers’ self-evaluation
Next generation of CEM systems …
 Assessments that are
– Comprehensive, across the full range of curriculum areas, levels, ages, topics and educationally relevant abilities
– Diagnostic, with evidence-based follow-up
– Interpretable, calibrated against norms and criteria
– High psychometric quality
 Feedback that is
– Bespoke to individual teacher, for their students and classes
– Multi-component, incorporating learning gains, pupil ratings, peer feedback, self-evaluation, …
– Diagnostic, with evidence-based follow-up
 Constant experimenting
4. Evaluate impact of changes
Mistaking School Improvement (1)
(Coe, 2009)
1. Wait for a bad year or choose underperforming schools to start with. Most things self-correct or revert to expectations (you can claim the credit for this; see the simulation sketch after these two slides).
2. Take on any initiative, and ask everyone who put effort into it whether they feel it worked. No-one wants to feel their effort was wasted.
3. Define ‘improvement’ in terms of perceptions and ratings of teachers. DO NOT conduct any proper assessments – they may disappoint.
4. Only study schools or teachers that recognise a problem and are prepared to take on an initiative. They’ll probably improve whatever you do.
Mistaking School Improvement (2)
(Coe, 2009)
5. Conduct some kind of evaluation, but don’t let the design be too good – poor quality evaluations are much more likely to show positive results.
6. If any improvement occurs in any aspect of performance, focus attention on that rather than on any areas or schools that have not improved or got worse (don’t mention them!).
7. Put some effort into marketing and presentation of the school. Once you start to recruit better students, things will improve.
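Point 1 above is essentially regression to the mean: schools selected because they just had a bad year tend to score closer to their usual level the following year even if nothing is done. A minimal simulation sketch of that effect, with all numbers invented purely for illustration:

```python
import random

random.seed(1)

# Each school has a stable 'true' level plus year-to-year noise; no intervention anywhere.
true_level = [random.gauss(50, 5) for _ in range(1000)]
year1 = [t + random.gauss(0, 5) for t in true_level]
year2 = [t + random.gauss(0, 5) for t in true_level]

# Select the schools that looked worst in year 1 (roughly the bottom 10%).
cutoff = sorted(year1)[len(year1) // 10]
chosen = [i for i, score in enumerate(year1) if score <= cutoff]

before = sum(year1[i] for i in chosen) / len(chosen)
after = sum(year2[i] for i in chosen) / len(chosen)
print(f"Chosen schools: year 1 mean {before:.1f}, year 2 mean {after:.1f}")
# Year 2 looks better with no intervention at all: the apparent improvement is just noise.
```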
Key elements of good evaluation
 Clear, well-defined, replicable intervention
 Good assessment of appropriate outcomes
 Well-matched comparison group
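Read together, these three elements suggest a basic design: give both the intervention group and a well-matched comparison group the same assessment before and after the change, then compare the two groups' gains. A minimal sketch under those assumptions, using hypothetical scores and the same illustrative effect-size calculation as earlier:

```python
from statistics import mean, stdev

def gain_effect_size(pre_t, post_t, pre_c, post_c):
    """Effect size of the intervention group's gains relative to the comparison group's."""
    gains_t = [post - pre for pre, post in zip(pre_t, post_t)]
    gains_c = [post - pre for pre, post in zip(pre_c, post_c)]
    pooled_sd = (((len(gains_t) - 1) * stdev(gains_t) ** 2 +
                  (len(gains_c) - 1) * stdev(gains_c) ** 2) /
                 (len(gains_t) + len(gains_c) - 2)) ** 0.5
    return (mean(gains_t) - mean(gains_c)) / pooled_sd

# Hypothetical pre/post assessment scores for the two well-matched groups.
pre_int = [40, 45, 38, 50, 42, 47, 36, 44]
post_int = [46, 49, 47, 55, 45, 55, 41, 50]
pre_comp = [41, 44, 39, 49, 43, 46, 37, 45]
post_comp = [44, 50, 42, 52, 48, 49, 40, 51]

d = gain_effect_size(pre_int, post_int, pre_comp, post_comp)
print(f"Effect size on gains: {d:.2f}")
```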
Summary …
1. Think hard about learning
2. Invest in good CPD
3. Evaluate teaching quality
4. Evaluate impact of changes
[email protected]
www.cem.org
@ProfCoe