The challenge of leadership - Durham University Community


@ProfCoe
www.twitter.com/ProfCoe
Evidence-based use of the
pupil premium
Robert Coe
Durham Leadership Conference, 26 June 2014
Outline
 What can research tell us about the likely impacts and costs of different strategies?
 How do we implement these strategies to …
– Focus on what matters
– Change classroom practice
– Target areas of need
– Produce demonstrable benefits
Improving Education: A triumph of hope over experience
http://www.cem.org/attachments/publications/ImprovingEducation2013.pdf
Evidence about the
effectiveness of different
strategies
Toolkit of Strategies to Improve Learning
The Sutton Trust-EEF Teaching and Learning Toolkit
http://www.educationendowmentfoundation.org.uk/toolkit/
[Chart: Impact vs cost — effect size (months gain, 0 to 8) plotted against cost per pupil (£0 to £1000+). Source: www.educationendowmentfoundation.org.uk/toolkit. Quadrants labelled ‘Most promising for raising attainment’ (top left: Feedback, Meta-cognitive, Peer tutoring, Homework (Secondary)), ‘May be worth it’, and ‘Small effects / high cost’ (bottom right: Smaller classes, After school). Also plotted: Collaborative, Early Years, 1-1 tuition, Behaviour, Small group tuition, Phonics, Parental involvement, ICT, Social and emotional learning, Individualised learning, Summer schools, Mentoring, Teaching assistants, Homework (Primary), Performance pay, Aspirations, Setting.]
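One way to act on this chart is to filter for strategies that sit in the top-left quadrant (several months of gain at low cost per pupil) and rank them. A minimal Python sketch of that reading is below; the numbers in `STRATEGIES` are placeholders chosen for illustration, not the Toolkit's published estimates.

```python
# Illustrative reading of the 'impact vs cost' chart.
# NOTE: months-gain and cost values are made-up placeholders,
# not the Sutton Trust-EEF Toolkit's published figures.
STRATEGIES = [
    # (name, months gain, cost per pupil in £)
    ("Feedback", 8, 80),
    ("Meta-cognitive", 8, 100),
    ("Peer tutoring", 6, 150),
    ("Homework (Secondary)", 5, 20),
    ("Teaching assistants", 1, 600),
    ("Smaller classes", 1, 800),
    ("Setting", 0, 30),
]

def top_left(strategies, min_months=4, max_cost=300):
    """Keep high-impact, low-cost strategies and rank by gain per pound."""
    picks = [s for s in strategies if s[1] >= min_months and s[2] <= max_cost]
    return sorted(picks, key=lambda s: s[1] / max(s[2], 1), reverse=True)

for name, months, cost in top_left(STRATEGIES):
    print(f"{name}: ~{months} months gain at ~£{cost} per pupil")
```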
Key messages
 Some things that are popular or widely thought to be effective are probably not worth doing
– Ability grouping (setting); After-school clubs; Teaching assistants; Smaller classes; Performance pay; Raising aspirations
 Some things look ‘promising’
– Effective feedback; Meta-cognitive and self-regulation strategies; Peer tutoring/peer-assisted learning strategies; Homework
Clear, simple advice:
 Choose from the top left
 Go back to school and do it
For every complex problem
there is an answer that is
clear, simple, and wrong
H.L. Mencken
Why not?
 We have been doing some of these things for a
long time, but have generally not seen
improvement
 Research evidence is problematic
– Sometimes the existing evidence is thin
– Research studies may not reflect real life
– Context and ‘support factors’ may matter
 Implementation is problematic
– We may think we are doing it, but are we doing it right?
– We do not know how to get large groups of teachers
and schools to implement these interventions in ways
that are faithful, effective and sustainable
So what should we do?
Four steps to improvement
 Think hard about learning
 Invest in good professional development
 Evaluate teaching quality
 Evaluate impact of changes
1. Think hard about
learning
[Chart repeated: the Toolkit ‘Impact vs cost’ chart shown earlier — effect size (months gain) against cost per pupil. www.educationendowmentfoundation.org.uk/toolkit]
1. Which strategies/interventions are very surprising (you really don’t believe it)?
2. Which strategies/interventions can you explain why they do (or don’t) improve attainment?
3. Which strategies/interventions do you want to know more about?
Poor Proxies for Learning
 Students are busy: lots of work is done (especially written work)
 Students are engaged, interested, motivated
 Students are getting attention: feedback, explanations
 Classroom is ordered, calm, under control
 Curriculum has been ‘covered’ (ie presented to students in some form)
 (At least some) students have supplied correct answers, even if they
– Have not really understood them
– Could not reproduce them independently
– Will have forgotten it by next week (tomorrow?)
– Already knew how to do this anyway
A better proxy for learning?
Learning happens
when people have
to think hard
Hard questions about your school
 How many minutes does an average
pupil on an average day spend really
thinking hard?
 Do you really want pupils to be ‘stuck’
in your lessons?
 If they knew the right answer but didn’t
know why, how many pupils would
care?
True or false?
1. Reducing class size is one of the most
effective ways to increase learning [evidence]
2. Differentiation and ‘personalised learning’
resources maximise learning [evidence]
3. Praise encourages learners and helps them persist with hard tasks [evidence]
4. Technology supports learning by engaging
and motivating learners [evidence]
5. The best way to raise attainment is to
enhance motivation and interest [evidence]
2. Invest in effective CPD
How do we get students to learn hard things?
Eg
 Place value
 Persuasive writing
 Music composition
 Balancing chemical equations
• Explain what they should do
• Demonstrate it
• Get them to do it (with gradually reducing support)
• Provide feedback
• Get them to practise until it is secure
• Assess their skill/understanding
How do we get teachers to learn hard things?
Eg
 Using formative assessment
 Assertive discipline
 How to teach algebra
• Explain what they should do
What CPD helps learners?
 Intense: at least 15 contact hours, preferably 50
 Sustained: over at least two terms
 Content focused: on teachers’ knowledge of
subject content & how students learn it
 Active: opportunities to try it out & discuss
 Supported: external feedback and networks to
improve and sustain
 Evidence based: promotes strategies
supported by robust evaluation evidence
3. Evaluate teaching
quality
Improving Teaching
 Teacher quality is what matters
 We need to focus on teacher learning
 Teachers learn just like other people
– Be clear what you want them to learn
– Get good information about where
they are at
– Give good feedback
Why monitor teaching quality?
 Good evidence of (potential) benefit from
– Performance feedback (Coe, 2002)
– Target setting (Locke & Latham, 2006)
– Accountability (Coe & Sahlgren, 2014)
 Individual teachers matter most
 Teachers typically stop improving after 3-5 years
 Everyone can improve
 Judging real quality/effectiveness is very hard
– Multidimensional
– Not easily visible
– Confounded
Monitoring the quality of teaching
 Progress in assessments
– Quality of assessment matters (cem.org/blog)
– Regular, high quality assessment across curriculum (InCAS, INSIGHT)
 Classroom observation
– Much harder than you think! (cem.org/blog)
– Multiple observations/observers, trained and QA’d
 Student ratings
– Extremely valuable, if done properly
(http://www.cem.org/latest/student-evaluation-of-teaching-canit-raise-attainment-in-secondary-schools)
 Other
– Parent ratings/feedback
– Student work scrutiny
– Colleague perceptions (360)
– Self assessment
– Pedagogical content knowledge
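On the ‘much harder than you think’ point about classroom observation: a single observation is a noisy measure of a teacher, which is why the slide asks for multiple observations and observers. The sketch below simulates that idea, with an arbitrary noise level chosen purely for illustration (it is not an estimate from the observation-reliability literature).

```python
import random

def observe(true_quality, noise_sd=1.0):
    """One observer rating one lesson: the teacher's true quality plus noise."""
    return random.gauss(true_quality, noise_sd)

def rating(true_quality, n_observations):
    """Average the ratings from several observations/observers."""
    scores = [observe(true_quality) for _ in range(n_observations)]
    return sum(scores) / len(scores)

random.seed(1)
TRUE_QUALITY = 0.0  # arbitrary scale
for n in (1, 2, 4, 6):
    # How far, on average, the final judgement lands from the true quality
    errors = [abs(rating(TRUE_QUALITY, n) - TRUE_QUALITY) for _ in range(10_000)]
    print(f"{n} observation(s): mean absolute error = {sum(errors) / len(errors):.2f}")
```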
Teacher Assessment
 How do you know that it has captured
understanding of key concepts?
– vs ‘check-list’ (eg ‘;’=L5, 3 tenses=L7)
 How do you know standards are comparable?
– Across teachers, schools, subjects
– Is progress good?
 How have you resolved tensions from teacher
judgments being used to judge teachers?
– Summative assessment includes teacher feedback
Evidence-Based Lesson Observation
 Behaviour and organisation
– Maximise time on task, engagement, rules &
consequences
 Classroom climate
– Respect, quality of interactions, failure OK, high expectations, growth mindset
 Learning
– What made students think hard?
– Quality of: exposition, demonstration, scaffolding,
feedback, practice, assessment
– What provided evidence of students’ understanding?
– How was this responded to? (Feedback)
4. Evaluate impact of
changes
School ‘improvement’ often isn’t
 School would have improved anyway
– Volunteers/enthusiasts improve: misattributed to intervention
– Chance variation (esp. if start low)
 Poor outcome measures
– Perceptions of those who worked hard at it
– No robust assessment of pupil learning
 Poor evaluation designs
– Weak evaluations more likely to show positive results
– Improved intake mistaken for impact of intervention
 Selective reporting
– Dredging for anything positive (within a study)
– Only success is publicised
(Coe, 2009, 2013)
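The ‘chance variation (esp. if start low)’ point is regression to the mean: schools selected because of a low score will, on average, score better the following year even with no intervention at all. A minimal simulation of that effect, with arbitrary score and noise values chosen only for illustration:

```python
import random

random.seed(0)
N_SCHOOLS = 1000

# Each school has a stable 'true' level plus year-to-year noise.
true_level = [random.gauss(50, 5) for _ in range(N_SCHOOLS)]
year1 = [t + random.gauss(0, 5) for t in true_level]
year2 = [t + random.gauss(0, 5) for t in true_level]

# Pick out the schools that 'start low' (bottom 10% in year 1)...
cutoff = sorted(year1)[N_SCHOOLS // 10]
low_starters = [i for i in range(N_SCHOOLS) if year1[i] <= cutoff]

# ...and look at their average change the next year, with no intervention.
mean_change = sum(year2[i] - year1[i] for i in low_starters) / len(low_starters)
print(f"Average 'improvement' of low starters, no intervention: {mean_change:+.1f} points")
```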
Key elements of good evaluation
 Clear, well-defined, replicable intervention
 Good assessment of appropriate outcomes
 Well-matched comparison group
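The arithmetic behind a Toolkit-style impact estimate follows directly from these elements: an outcome assessment given to both the intervention group and a well-matched comparison group, summarised as a standardised effect size. A minimal sketch with invented scores (the group data here are made up for illustration):

```python
from statistics import mean, stdev

def effect_size(intervention, comparison):
    """Standardised mean difference (Cohen's d with a pooled SD)."""
    n1, n2 = len(intervention), len(comparison)
    s1, s2 = stdev(intervention), stdev(comparison)
    pooled_sd = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(intervention) - mean(comparison)) / pooled_sd

# Invented end-of-year test scores for two well-matched groups of pupils.
intervention = [54, 61, 58, 66, 59, 63, 57, 60]
comparison = [52, 55, 50, 58, 54, 56, 49, 53]

d = effect_size(intervention, comparison)
print(f"Effect size: {d:.2f} standard deviations")
# Toolkit-style summaries then express effect sizes of this kind as
# 'months of additional progress' (the exact conversion is not shown here).
```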
RISE: Research-leads Improving Students’ Education
 With Alex Quigley, John Tomsett, Stuart Kime
 Based around York
 RCT: 20 school leaders trained in research, 20 controls
 Contact: [email protected]
Summary …
1. Think hard about learning
2. Invest in good CPD
3. Evaluate teaching quality
4. Evaluate impact of changes
www.cem.org
@ProfCoe
[email protected]