
Enhancing the Education Environment at Queen's
First Annual Conference of the Centre for Educational Development
Queen's University Belfast, 18-19 Sept 2006

Enhancing Learning and Teaching: What Role Can Research Evidence Play?

Dai Hounsell

University of Edinburgh

www.ed.ac.uk/etl

BACKGROUND AND INTRODUCTION

Evidence and Practice

- the lure of evidence-based and evidence-informed practices and policies
- evidence on teaching-learning and assessment practices
  – how far can we generalise, and about what?
  – the challenge of contingency

Feedback and Its Discontents

pervasive evidence of variable feedback

(e.g. National Student Survey, 2006; QAA Learning from Subject Review, 2003; Krause et al. 2005; Hounsell, 2003; Hounsell et al. 2005; Carless, 2006)

RESEARCH AND FEEDBACK TO STUDENTS


compelling evidence of the role of feedback and formative assessment in facilitating high-quality learning

(see e.g. Black et al. 2003; Nicol and Macfarlane-Dick, 2006)

evolving conceptions of feedback

(Sadler, 1998)

- what makes for effective feedback
  – knowledge of results
  – support and encouragement
  – grasp of what high-quality achievement entails
- closing the loop (waxing and waning): action taken to close the gap between desired goal and actual performance

KEYNOTE FOCUS AND AIMS

- research findings on guidance and feedback to students
- draws on biosciences data from the ETL Project
- aims to review:
  – findings from the 1st round of data-gathering, and subsequent action by the course team
  – findings from the 2nd round of data-gathering, on the impact of measures taken
  – outcomes of subsequent analysis
  – implications for evidence-informed efforts to enhance learning and teaching

Enhancing Teaching-Learning Environments in Undergraduate Courses (ETL Project)

RESEARCH DESIGN

Aims

to investigate ways of enhancing the quality of undergraduate learning and teaching, in a range of subject areas & settings

Samples and settings

first- & final-year modules in three departments

Data-gathering

student questionnaires and interviews with students & staff

‘Enhancement’ focus

- collection, analysis & joint review of baseline data
- evidence-based collaborative initiatives

BIOSCIENCE STUDENTS’ PERCEPTIONS OF GUIDANCE AND FEEDBACK

- The students' overall perceptions of their courses were broadly positive across all six bioscience course units surveyed
- Their experiences of the provision of guidance and feedback on assessed work, however, were much more variable
- In some units, students reported favourably; in others, there were significant student concerns

BIOSCIENCE STUDENTS' PERCEPTIONS OF GUIDANCE AND FEEDBACK (Questionnaires)

[insert figure 2]

STUDENTS' CONCERNS ABOUT GUIDANCE AND FEEDBACK (Interviews)

- Where guidance and feedback was a significant student concern, it could take various forms:
  – uncertainty about what staff expected from students in set [i.e. formally required] work
  – dissatisfaction with the variable quantity and helpfulness of feedback comments from staff
  – frustration with delays in receiving feedback
  – (in a small number of instances) uncertainty about the ground rules for “buttonholing” tutors

STUDENTS' CONCERNS ABOUT GUIDANCE AND FEEDBACK (Interviews)

S5: I got 8 out of 20, and I've got nothing written on my [feedback] sheet at all.

S3: Mine's the same. I got 10, and it's got no comments on it whatsoever.

S5: And they tell you to do it in double-spacing, so they can write things in, but they never do.

S3: I mean, if we're getting half marks, it must have a lot wrong with it… [S5: Exactly.] But it's not telling us anything.

--------------------------------------------------------------------

S1: Sometimes they say ‘Be more concise’ but then another time I thought ‘Well, I’ll try being more concise this time’ and actually I got less for doing that! So then the next time I thought ‘I’ll go back to my other way’ and it worked better! So it’s been confusing.

STUDENTS' CONCERNS ABOUT GUIDANCE AND FEEDBACK (Interviews)

S1: We write the thing, hand it in [S: Yeah] and we get it back with a few comments on… Mainly spelling mistakes. [Laughter] […]

S3: It's postgrads [who mark the work], and it's quite, sometimes inconsistent. […]

S2: It's very inconsistent. [S: Yeah]. And also, I don't think that they are marked for us. They are marked for them. […] I don't think they are writing in the margins so we will know not to do it again. They're writing it in the margins so they will remember that we've done it wrong when they add up the marks, I think. It isn't done as feedback.

TWO CASE STUDIES

Case 1: A Large First-year Course Unit

Case 2: A Small Final-year Honours Module

Case One: A LARGE FIRST-YEAR COURSE UNIT

- Over 600 students and 25+ staff in varied roles
- 50% of overall grade from coursework, incl. a debate, a group poster, an advisory letter to a GP (the ‘pertussis enigma’ exercise)
- Findings from initial questionnaires and interviews:
  – low questionnaire scores on clarity about assessment and feedback
  – general concern about limitations of pre-assignment guidance and post-assignment feedback
  – particular concern with the ‘pertussis enigma’ exercise

Case One: A LARGE FIRST-YEAR COURSE UNIT

The collaborative initiative agreed with the course team to address the concerns identified:
- strengthened guidance to lab demonstrators about assignments and assessments (incl. the ‘pertussis enigma’ exercise)
- adoption of a structured marking and feedback proforma for the ‘pertussis enigma’ exercise

Case One: FINDINGS ON IMPACT

Pre-Collaborative Initiative

S1: We didn't actually get much feedback on the actual marking of [the pertussis exercise]. Mine had no written comments on it at all and had 10 out of 20 or something, which I wasn't too happy with.

I: So you didn't understand why you'd got that mark?

S1: Yeah, well no comments were on it at all […]

Collaborative Initiative

S: Yeah. [...] I thought [the feedback on the pertussis assignment] was good because it had written comments and how you'd done in each bit. So it wasn't just a mark out of nowhere, you knew where you'd let yourself down, whether it was the presentation, or whether it was the content, or what.

Case One: FINDINGS ON IMPACT

[With apparently highly similar student cohorts]

- More positive perceptions of advance guidance and feedback about the ‘pertussis enigma’ exercise in every interview following the introduction of the initiative
- No evidence in the questionnaire data of impact across the module
- Suggests the difficulty of achieving change across multiple assignments when many staff are involved

Case Two: A SMALL FINAL-YEAR HONOURS MODULE

- A total of 14-15 students and two staff
- Took the form of student-led seminars, assessed by oral presentations and essays
- Findings from initial questionnaires and interviews:
  – questionnaire scores low on two feedback items
  – interviews indicated, for presentations and essays:
    – uncertainty about assessment criteria
    – relative paucity of feedback

Case Two: A SMALL FINAL-YEAR HONOURS MODULE

The collaborative initiative agreed with the course team to address the concerns identified:
- more guidance about assessment criteria in the introductory class briefing
- handout on assessment criteria for presentations
- anonymous written peer feedback on presentations
- private feedback meeting between staff and student-presenters

Case Two: FINDINGS ON IMPACT

[With apparently highly similar student cohorts]

- improvement in questionnaire scores on all the ‘teaching-learning environment’ scales
- largest change on scales relevant to the collaborative initiative
- similarly very positive comments in the student interviews

Case Two: FINDINGS ON IMPACT

[Bar chart: percentage of students responding ‘agree’ or ‘agree somewhat’ on five questionnaire items (clear expectations, how to tackle it, feedback for learning, staff support, feedback to clarify), comparing the B3L 2002/03 cohort with the B3L 2003/04 cohort]

Case Two: FINDINGS ON IMPACT

Pre-Collaborative Initiative

No, they’re really weird [essay] titles and I’ve just been like, Whoah, where do you start? Like, they’re really bizarre.

Collaborative Initiative

S4: They have given us good guidance [about the essays] […]

S2: Yeah, they did didn’t they? […]

S3: Yeah, one of them particularly, it’s not really anything we can find references for […] So, it’s something we’ve really got to kind of think about, and draw on our knowledge of what we already know […]

Case Two: FINDINGS ON IMPACT

Collaborative Initiative

I: So do you think having feedback from other students [on your presentation] is worthwhile?

S1: I think it is, ‘cause then you realise what you did wrong and how you can improve it. It is actually really useful.

S2: Especially from people that, you know, if we do something blatantly stupid they’ll tell us. It’s quite good to get opinions from people who’ve been listening to you but not marking.

REVIEW OF CASE FINDINGS

(Bearing in mind the need for caution about the scale and limitations of the research)

These research findings would seem to indicate that:

1. students’ concerns about the effectiveness of guidance and feedback took various forms
2. areas of particular concern could be pinpointed, and steps taken to try to address these
3. there was follow-up evidence of impact in interviews (in both cases) and in questionnaires (in Case 2)
4. findings from these and other cases suggest that enhancing the quality of feedback and guidance may be harder to achieve in larger team-taught courses

A CODA: MODELLING GUIDANCE AND FEEDBACK

- “Unfinished business”
  – analysis and writing-up of research evidence as ongoing and recursive
- Remodelling guidance and feedback as an integrated loop

The guidance and feedback loop

[Diagram: a six-stage cycle, with transitions labelled ‘embark on assignment’, ‘submit assignment’, ‘review feedback’ and ‘feed-forward into next assignment/assessment’]

1. STUDENTS’ PRIOR EXPERIENCES OF ASSESSMENTS IN THE SUBJECT/IN THE UNIT
2. PRELIMINARY GUIDANCE ABOUT EXPECTATIONS & REQUIREMENTS
3. ONGOING CLARIFICATION OF EXPECTATIONS
4. FEEDBACK ON PERFORMANCE/ACHIEVEMENT
5. SUPPLEMENTARY SUPPORT
6. FEED-FORWARD, i.e. DEPLOYMENT OF ENHANCED UNDERSTANDING AND/OR SKILLS IN SUBSEQUENT ASSESSMENTS

1. STUDENTS’ PRIOR EXPERIENCES OF ASSESSMENTS IN THE SUBJECT/IN THE UNIT

2. PRELIMINARY GUIDANCE ABOUT EXPECTATIONS & REQUIREMENTS
   coursework: e.g. written/oral guidelines about assignment requirements, access to past examples of completed assignments
   exams & tests: e.g. written/oral guidelines about exam/test requirements, access to model answers/past exam questions

3. ONGOING CLARIFICATION OF EXPECTATIONS
   coursework: e.g. specific queries addressed in tutorials/practicals/by email
   exams & tests: e.g. opportunities to gain practice in tackling tasks of the kind on which assessments will be based

4. FEEDBACK ON PERFORMANCE/ACHIEVEMENT
   coursework: e.g. individualised written comments/breakdown of marks linked to the assessment criteria or specific components of the set task
   exams & tests: e.g. whole-class oral feedback on own and other small groups’ answers to the problems set and addressed in class

5. SUPPLEMENTARY SUPPORT
   coursework: e.g. follow-up referral to remedial resource materials, and/or individualised guidance on areas of difficulty
   exams & tests: e.g. anticipatory feedback, i.e. access to past exam questions with lecturer’s commentary (on, for instance, ‘traps for the unwary’)

6. FEED-FORWARD, i.e. DEPLOYMENT OF ENHANCED UNDERSTANDING AND/OR SKILLS IN SUBSEQUENT ASSESSMENTS


REMODELLING GUIDANCE AND FEEDBACK

- [Re]modelling guidance and feedback as an integrated whole: the guidance-and-feedback loop
  – takes in both coursework and exams
  – illuminates potential troublespots
  – shows how steps can be inter-related
- findings as data and evidence, in tandem with findings as tools for diagnosis & enhancement

KEY REFERENCES

Black, P., Harrison, C., Lee, C., Marshall, B. and Wiliam, D. (2003). Assessment for Learning: Putting It into Practice. Maidenhead: Open University Press.

Carless, D. (2006). 'Differing perceptions in the feedback process'. Studies in Higher Education, 31(2), 219-233.

Hounsell, D. (2003). 'Student feedback, learning and development'. In: Slowey, M. and Watson, D., eds. Higher Education and the Lifecourse. Maidenhead: SRHE & Open University Press/McGraw-Hill, pp. 67-78.

Hounsell, D. (in press). 'Towards more sustainable feedback to students'. In: Boud, D. and Falchikov, N., eds. Rethinking Assessment for Future Learning. London: Routledge.

Hounsell, D. et al. (2005). Enhancing Teaching-Learning Environments in Undergraduate Courses: End-of-Award Report to ESRC on Project L139251099. Universities of Edinburgh, Durham and Coventry: ETL Project. http://www.ed.ac.uk/etl/publications

KEY REFERENCES

Hounsell, D., McCune, V., Hounsell, J. and Litjens, J. 'The quality of guidance and feedback to students'. [Submitted for journal publication, Sept 2006]

Krause, K., Hartley, R., James, R. and McInnis, C. (2005). The First Year Experience in Australian Universities: Findings from a Decade of National Studies. Final Report to DEST. Melbourne: University of Melbourne, Centre for the Study of Higher Education. http://www.cshe.unimelb.edu.au/

McCune, V. and Hounsell, D. (2005). 'The development of students' ways of thinking and practising in three final-year biology courses'. Higher Education, 49(2), 255-289.

Nicol, D. and Macfarlane-Dick, D. (2006). 'Formative assessment and self-regulated learning: a model and seven principles of good feedback practice'. Studies in Higher Education, 31(2), 199-218.

QAA (2003). Learning from Subject Review, 1993-2001: Sharing Good Practice. Gloucester: Quality Assurance Agency for Higher Education. http://www.qaa.ac.uk

Sadler, D. R. (1998). 'Formative assessment: revisiting the territory'. Assessment in Education, 5(1), 77-84.