Transcript Slide 1

Best Evidence Medical Education & Evaluating the Evidence
Workshop Aim
The aim of this workshop is to explore how critical appraisal is carried out for non-experimental research studies, especially in the field of educational evaluation. It will help you to:

• Gain an overview of approaches to critical appraisal and an appreciation of its role in evidence informed practice and policy making.
• Identify the challenges educators face in judging evaluation designs from a variety of research paradigms, using both quantitative and qualitative data collection methods.
• Increase your knowledge of the purposes and process of systematic review research in professional education.
• Increase your awareness of the work of the Best Evidence Medical Education Collaboration.
• Consider whether to submit a proposal to do a BEME systematic review or a rapid review.
09:30 – 09:45   Welcome and introductions
09:45 – 10:15   Evidence informed education: national, international and professional perspectives (Marilyn Hammick)
10:15 – 11:00   Evidence informed practice in education: argument and evidence (Small group activity 1)
11:00 – 11:15   Refreshments
11:15 – 11:30   Plenary feedback from activity 1 (All)
11:30 – 12:00   Appraising and using education research papers in systematic review work (Marilyn Hammick)
12:00 – 12:45   The reality of critical appraisal – part A (Small group activity 2)
12:45 – 13:00   Plenary feedback from activity 2
13:00 – 13:45   Lunch
13:45 – 14:30   The reality of critical appraisal – part B (Small group activity 3)
14:30 – 14:45   Plenary feedback from activity 3
14:45 – 15:00   Refreshments
15:00 – 15:30   Identifying the need for and using evidence for practice and policy decisions (Small group activity 4)
15:30 – 15:45   Plenary feedback from activity 4 (All)
15:45 – 16:00   Take home messages and close (All)
Evaluating the Evidence

• Evidence informed practice and policy
• Critical appraisal of the evidence
Evidence informed education: national, international and professional perspectives

• International: Campbell Collaboration
• National: UK EPPI-Centre (Evidence for Policy and Practice Information)
• Professional: Best Evidence Medical Education
C2 Coordinating Groups

• Crime and Justice
• Education
• Social Welfare
• Methods
• Communication and Internationalisation
Time for evidence based medical education

• Stewart Petersen (Professor of Medical Education, Faculty of Medicine and Biological Sciences, University of Leicester). Tomorrow's doctors need informed educators, not amateur tutors. 1999.
• John Bligh and M Brownell Anderson. Editorial: Medical teachers and evidence. Medical Education (2000) 34, 162-163.
• Philip Davies. Approaches to evidence-based teaching. Medical Teacher (2000) 22(1), 14-21.
• Fredric M Wolf. Lessons to be learned from Evidence-based Medicine: practice and promise of Evidence-based Medicine and Evidence-based Education. Medical Teacher (2000) 22(3), 251-259.
• C P M van der Vleuten et al. The need for evidence in education. Medical Teacher (2000) 22(3), 246-250.
Best Evidence Medical Education (2001)

• Appropriate systematic reviews of medical education
• Dissemination of information
• Culture of best evidence medical education
Taking a BEME approach to educational decisions

• Comprehensively critically appraising the literature that already exists
• Categorizing the power of the evidence available
  - realism
  - epistemological openness
• Identifying the gaps and flaws in the existing literature
  - systematic
  - transparent
  - published
  - grey
  - hand searching
• Suggesting and carrying out appropriately planned studies
  - optimize the evidence
  - make the education intervention more evidence based
BEME (2008)

• 7 published reviews, 2 in press
• Rapid reviews, 3 in press
• BEME Spotlights
• Medical Teacher, BEME Guide, Website
• Partnership with University of Warwick
• UK Autumn workshop/Spring Conference
• Widening the community of practice
Published reviews i

• Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning – a BEME systematic review. Med Teach 2005; 27(1): 10-28.
• Hamdy H, Prasad M, Anderson MB, Scherpbier A, Williams R, Zwierstra R, Cuddihy H. BEME systematic review: predictive values of measurements obtained in medical schools and future performance in medical practice. Med Teach 2006; 28(2): 103-116.
• Dornan T, Littlewood S, Margolis A, Scherpbier A, Spencer J, Ypinazar V. How can experience in clinical and community settings contribute to early medical education? A BEME systematic review. Med Teach 2006; 28(1): 3-18.
Published reviews ii

• Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians' clinical performance. Med Teach 2006; 28(2): 117-128.
• Steinert Y, Mann K, Centeno A, Dolmans D, Spencer J, Gelula M, Prideaux D. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide no. 8. Med Teach 2006; 28(6): 497-526.
• Hammick M, Freeth D, Koppel I, Reeves S, Barr H. A best evidence systematic review of interprofessional education: BEME Guide no. 9. Med Teach 2007; 29(8): 735-751.
• Colthart I, Bagnall G, Evans A, Allbut H, Haig A, Illing J, McKinstry B. The effectiveness of self-assessment on the identification of learner needs, learner activity, and impact on clinical practice: BEME Guide no. 10. Med Teach 2008; 30(2): 124-145.
Reviews in progress …

• To what extent is the OSCE a valid, reliable, and feasible method of assessing the different learning outcomes in undergraduate medical education?
• A systematic review of the literature on the effects of portfolios on student learning in undergraduate medical education (peer review)
• A systematic review on the use of portfolios in postgraduate assessment (peer review)
• A systematic review of the evidence base around clinical and professional final year assessment in veterinary education
• Assessing the effectiveness and impact of a patient safety curriculum, led by David Mayer, University of Chicago College of Medicine
• Effectiveness of journal clubs, led by Karen Kearley, Oxford University, UK
• Skills loss after resuscitation courses, led by Ben Shaw, Liverpool Women's Hospital, UK
• Work-based assessment in health professional education, led by Jill Thistlethwaite, University of Queensland, Australia
• Educational games for students of health care professions, led by Elie Akl, Dept of Medicine, State University of New York, Buffalo, USA
• A review of the evidence linking conditions, processes and outcomes of clinical workplace learning, led by Tim Dornan et al., Manchester Medical School, UK
Activity 1: Evidence informed practice in professional education: argument and evidence (45 mins)

• Task: Critically analyse the following two papers, identifying the strengths and weaknesses of the arguments being made.
• Paper 1: Hammersley M (2005) Is the evidence-based policy movement doing more good than harm? Reflections on Iain Chalmers' case for research-based policy making and practice. Evidence & Policy 1(1), 85-100.
• Paper 2: Davies P (2000) The relevance of systematic reviews to educational policy and practice. Oxford Review of Education 26(3&4), 365-378.
• Feedback: 3 key points from your discussion
The systematic review examined
(Hammick M. A BEME Review: a little illumination. Med Teach 2005; 27(1): 1-4.)
Systematic review of the effectiveness of interprofessional education (JET)

Databases searched:
• Medline 1966-2003
• CINAHL 1982-2001
• BEI 1964-2001
• ASSIA 1990-2003

Screening: 10,495 abstracts → 884 papers → 353 studies → 107 'robust' evaluations → 21 best evidence studies.

Studies were mainly North American (60%); UK = 33%.
Other BEME Review examples

Example 1:
• 1992 – 2001
• BEI, ERIC, Medline, CINAHL & EMBASE
• 6,981 abstracts → 699 papers → 73 studies in review

Example 2:
• Up to 2001
• Medline, Embase, EPOC, ERIC, BEI
• 20,000 'hits' (titles scanned) → 560 papers (+44 on update, 2001-4) → 33 studies in review
Abstract Filter, applying inclusion criteria

• Mapping the field: directions for travel; key requirements; challenges and barriers; equipment for the journey; who should travel this way.
• Descriptive review: paints a picture, develops theory, tells a story (local, national, international); covers macro issues such as the setting and context of the intervention and learners' views on the intervention.
• Evaluation filter – what, how, when, who, where? A broad systematic review is inclusive: useful but limited, with general theory supported by some evidence.
• Quality filter – characteristics of effectiveness? A focussed systematic review is exclusive: robust, powerful and transferable, with specific theory supported by strong evidence.
Four guiding principles – that an enquiry should be:

• contributory: advances wider knowledge and/or understanding;
• defensible in design: provides a research strategy which can address the questions posed;
• rigorous in conduct: through the systematic and transparent collection, analysis and interpretation of data;
• credible in claim: offers well-founded and plausible arguments about the significance of the data generated.

Ref: UK HM Government Strategy Unit
Quality judgement

• Contribution
• Design
• Conduct
• Claims
Contribution

• Assessment of current knowledge
• Identified need for knowledge
• Takes organisational context into account
• Transferability assessed
Defensible design

• Theoretical richness
• Evaluation question(s)
• Clarity of aims and purpose
• Criteria for outcomes and impact
• Resources
• Chronology
Conducted rigorously

• Ethics and governance
• Clarity and logic of:
  - sampling
  - data collection
  - analysis
  - synthesis
  - judgements
Makes credible claims

• Collection
• Interpretation
• Judgement
Activity 2: Critically appraise the strengths and weaknesses of primary research (45 mins)

• Task: to evaluate two reports of educational research and discuss their value for evidence informed decision making in professional education, using the BMJ guidelines for evaluating papers on educational interventions.
• Paper 3: Crutcher et al. (2004) Multi-professional education in diabetes. Medical Teacher 26(5), 435-443.
• Paper 4: Boehler et al. (2006) An investigation of medical student reactions to feedback: a randomised controlled trial. Medical Education 40, 746-749.
• Feedback: value of the 2 studies & utility of the tool
CASP Tools

• Study design: systematic reviews, randomised controlled trials, qualitative research studies, cohort studies, case control studies, diagnostic test studies, economic evaluation studies
• http://www.phru.nhs.uk/Pages/PHD/CASP.htm
Activity 3: Critically appraise the strengths and weaknesses of primary research (45 mins)

• Task: to evaluate two reports of educational research and discuss their value for evidence informed decision making in professional education, using the UK Government's framework for appraising the quality of qualitative evaluations (pp 11-17) and/or the CASP tool.
• Paper 5: Alderson et al. (2002) Examining ethics in practice: health service professionals' evaluations of in-hospital ethics seminars. Nursing Ethics 9(5), 508-521.
• Paper 6: Bing-You et al. (1997) Feedback falling on deaf ears: residents' receptivity to feedback tempered by sender credibility. Medical Teacher 19(1), 40-44.
• Feedback: issues involved in making judgements about reported research – 3 key points
Activity 4: The need for and use of evidence (30 mins)

• Task: to discuss and identify the need for and use of evidence by education practitioners and policy makers in local and national contexts.
• Plenary session:
  - 1 practice and 1 policy area that could be informed by evidence, and why
  - Challenges in using evidence to shape practice and policy
To conclude…

• BEME seminar in your workplace
• Support & guidance for review groups
• AMEE 2009, Malaga – BEME sessions
• Warwick, May 2009 – Portfolio Conference
• Contact [email protected]
• Collect certificates and information
• Medev evaluation sheet
Finally …

• Have a safe journey home

Thank you