Essential skills in pedagogic research


Transcript: Essential skills in pedagogic research

Health Sciences and Practice Subject Centre
Essential Skills in Pedagogic Research Workshop
The Practice Teacher: Innovations and Best Practice
16-17/11/2010
Professor Marilyn Hammick
Consultant on Pedagogic Research
for Health Sciences and Practice Subject Centre
www.health.heacademy.ac.uk
Features
• Balanced and broad approach to educational enquiry, research and evaluation
• Two days about skills, not theory-led
• Interactive learning, small groups, shared tasks
• Suitable for new researchers and those with limited experience
Learning Outcomes (or what you will/should know more about by Wednesday 4 pm)
• formulate a focused research question and/or hypothesis, and establish appropriate aims and objectives related to the question/hypothesis
• select research methodologies and data collection methods that are appropriate to different types of research questions
• discuss essential elements of ethics and governance in relation to educational research
• discuss resourcing, training and competence in relation to educational research
• plan the timely delivery of research studies
• formulate research outcomes and products that make a contribution to evidence-informed practice and policy in health care professional education
• take a critical view of reported research and discuss its contribution to knowledge
Educational research
• Craft
• Intellectual engagement
• Open-mindedness
• Reflexivity
• Writing to find the sense
• Detective work
• Clarifying
• Making meaning
• Deeper understanding
• Effectiveness
• Improvement
The research question - session 1
• Different types of questions, e.g. clarification, proving, improving effectiveness, outcome-based, process orientated, cost effectiveness, theory development, explaining theory, realist perspective, etc.
• Defining and refining the research question
Research design - session 2
• Choices available, role of paradigms, methodologies and methods
• Role of the researcher and the researched
• Anticipated evidence
• Data characteristics
• Analytical approach
• Discourses of enquiry, the lens that is chosen
• Decision making, logical steps towards a robust design
Operationalisation of the design - session 3
• Ethics and governance
• Delivering the design
• Critical review of background literature
• Pilot studies
• Analytical methods
• Training and expertise, supervision, involving stakeholders, use of a steering group
• Resources, money and time
Research product and utility - session 4
• Evidence-informed educational practice and policy
• Research product and the consumer
• Identifying the audience
• Dissemination, reportage
• Timeliness
Two-day plan
• Tuesday am: Introduction & overview; Session 1: the research question
• Tuesday pm: Session 2: designing the study
• Wednesday am: Session 3: operationalising the study
• Wednesday pm: Session 4: research product and utility
Ways of working
• Participants in small groups of 5-6, working in these groups throughout the two days.
• Each session aims to cover one of the four aspects.
• Within each session there will be:
– a brief opening plenary to discuss the small group work for that session
– small group work with support
– a closing plenary for groups to report the outcomes of their work in that session.
Previous comments ....
• by the end of the course I was confident that going through the issues of the other people had made me consider all my own issues too
• appropriate design for different types of research did come through in the group work
• (liked) the practical element of designing and exploring a project
• helped me to begin to question research methods differently; working with people from different disciplines was also useful
Not so good ...
• the final hour, because so many people had not bothered to stay on
• would have liked more about specific approaches to research and analysis
• more "input" on how to go about putting together a research proposal
• more about theoretical frameworks
• discuss own pedagogic research questions and ways of addressing these
Three purposes of enquiry
[Diagram: Development, Capacity, Accountability, Knowledge]
Four eeee's
• Evidence
• Ethics
• Enquiry
• Effectiveness
Three choices
[Diagram: "Rigorous & robust"]
Research compass
• Appropriate direction
• Question-led approach
• Paradigm
• Methodology
• Methods
Enquiry 'logic loop'
[Diagram: enquiry as a purposeful activity links evidence (believable, transferable) to utility]
Four guiding principles – that an enquiry should be:
• contributory: advances wider knowledge and/or
understanding;
• defensible in design: provides a research strategy which
can address the questions posed;
• rigorous in conduct: through the systematic and
transparent collection, analysis and interpretation of data;
• credible in claim: offers well-founded and plausible arguments about the significance of the data generated.
Ref: UK HM Government Strategy Unit
Finding the questions
• What don't I know about the effectiveness of my teaching practice?
• What improvements would I like to make in my teaching practice?
• What aspects of the curriculum are not evidence based?
• How could student learning be enhanced?
Purpose: identify a question that needs an answer, related to a pedagogic activity (intervention) that requires some evidence to support its effectiveness.
Planning the research
• Contribution
• Design
• Conduct
• Claims
Contribution
• Assessment of current knowledge
• Identified need for knowledge
• Takes organisational context into account
• Transferability assessed
Defensible design
• Logical choice of approach
• Logical choice of data collection
• Theoretical richness
• Clarity of aims and purpose
• Criteria for outcomes and impact
• Resources
• Chronology
Research/evaluation question
Intervention (in place or new)
• Aims: what is planned
• Objectives: endpoints/products/outcomes of the evaluation/research/enquiry
1. What are our Reasons and Purposes for evaluation?
2. What will be the Uses of our evaluation?
3. What will be the Foci for our evaluations?
4. What will be our Data and Evidence for our evaluations?
5. Who will be the Audience for our evaluations?
6. What will be the Timing for our evaluations?
7. Who should be the Agency conducting the evaluations?
Evaluation design
[Diagram: before → the box → after, where the box = the intervention to be evaluated]
Four possibilities (Ovretveit)
• Descriptive
• Audit: what is v. what should be
• Before and after
• Comparative/experimental: A v B
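As a concrete illustration of the before-and-after option, the sketch below compares paired test scores for the same learners before and after an intervention. It is a minimal sketch only: the scores are invented, and it assumes the Python scipy library is available.

# Minimal sketch of a before-and-after evaluation: paired scores for the
# same learners, pre and post intervention ("the box"). Data are invented.
from scipy import stats

pre = [52, 61, 48, 70, 66, 55, 59, 63]    # scores before the intervention
post = [64, 70, 55, 78, 71, 60, 72, 69]   # the same learners afterwards

t, p = stats.ttest_rel(post, pre)         # paired t-test on the change
mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"mean gain = {mean_gain:.1f} points, t = {t:.2f}, p = {p:.3f}")

Note that with a single group and no comparator, any gain cannot be attributed to the intervention alone; that is the case for the comparative/experimental (A v B) option.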
Research groups
• Choices available: (paradigms), methodologies and methods
• Choice of lens
• Role of the researcher and the researched
• Anticipated evidence: purpose
• Data characteristics
• Analytical approach
• Decision making, logical steps towards a robust design
• Dissemination: draft plans
Weds 17th November
• First day follow up
• Issues related to putting your research design into practice
Using theory ....
International Journal of Integrated Care, 16 November 2010 - ISSN 1568-4156
Research and Theory
The organisation of interagency training to safeguard children in England: a
case study using realistic evaluation
Patsios and Carpenter
http://www.ijic.org/index.php/ijic/index
Abstract
Background: Joint training for interagency working is carried out by Local Safeguarding
Children Boards in England to promote effective local working to safeguard and promote
the welfare of children.
Purpose: This paper reports on the findings of the outputs and outcomes of interagency
training to safeguard children in eight Local Safeguarding Children Boards.
Methods: A review of Local Safeguarding Children Board documentation, observations
of Local Safeguarding Children Board training sub-group meetings and a series of
interviews with training key stakeholders in each Local Safeguarding Children Board
were used to assess how partner agencies in the Local Safeguarding Children Boards
carried out their statutory responsibilities to organise interagency training. 'Realistic Evaluation' was used to evaluate the mechanisms by which a central government mandate produced particular interagency training outputs (number of courses, training days) and joint working outcomes (effective partnerships), within particular Local Safeguarding Children Board contexts.
Results: The ‘mandated partnership’ imposed on Local Safeguarding Children Boards
by central government left little choice but for partner agencies to work together to
deliver joint training, which in turn affected the dynamics of working partnerships across
the various sites. The effectiveness of the training sub-group determined the success of
the organisation and delivery of training for joint working. Despite having a central
mandate, Local Safeguarding Children Boards had heterogeneous funding and training
arrangements. These resulted in significant variations in the outputs in
terms of the number of courses per ‘children in need’ in the locality and in the cost per
course.
Conclusions: Interagency training which takes account of the context of the Local
Safeguarding Children Board is more likely to produce better trained staff, effective
partnership working, and lead to better integrated safeguarding children services.
Kirkpatrick outcomes model
• Satisfaction
• Learning
– Knowledge (cognition)
– Skills (psychomotor)
– Attitudes (affective)
• Behavioural change
– Self-reported
– Reported by A. N. Other
• Impact on
– Service user/community
– Service delivery
Level 1: Reaction
• What is measured: how the delegates felt about the training or learning experience.
• Tools and methods: 'happy sheets', feedback forms; verbal reaction; post-training surveys or questionnaires.
• Relevance and practicability: quick and very easy to obtain; not expensive to gather or to analyse.

Level 2: Learning
• What is measured: the increase in knowledge, before and after.
• Tools and methods: typically assessments or tests before and after the training; interview or observation can also be used.
• Relevance and practicability: relatively simple to set up; clear-cut for quantifiable skills; less easy for complex learning.

Level 3: Behaviour
• What is measured: the extent of applied learning back on the job (implementation).
• Tools and methods: observation and interview over time are required to assess change, relevance of change, and sustainability of change.
• Relevance and practicability: measurement of behaviour change typically requires the cooperation and skill of line managers.

Level 4: Results
• What is measured: the effect on the business or environment by the trainee.
• Tools and methods: measures are already in place via normal management systems and reporting; the challenge is to relate them to the trainee.
• Relevance and practicability: individually not difficult, unlike for the whole organisation; the process must attribute clear accountabilities.
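To make level 1 concrete, here is a minimal sketch that summarises happy-sheet ratings after a session; the ratings are invented and only the Python standard library is used.

# Minimal sketch of a level 1 (reaction) summary from happy-sheet ratings.
from collections import Counter

ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]   # invented ratings, 1 = poor ... 5 = excellent
mean = sum(ratings) / len(ratings)
print(f"n = {len(ratings)}, mean rating = {mean:.1f}")
print("distribution:", dict(sorted(Counter(ratings).items())))  # frequency per rating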
Phases of pedagogic research
• Interventions
• Students as guinea pigs
• 'Best thing since sliced bread' phenomenon
• Need to 'test' interventions within an ethical framework
• Consider how interventions in other disciplines are investigated
• Pre-classroom, series of phases (not harmful, local, multi-centre)
Mixed methods research
Toolkit 11: Practical considerations of leading and working on a mixed methods project
Vanessa May and Hazel Burke, Morgan Centre, University of Manchester, July 2010
Summary
• The aim of this toolkit is to highlight key issues that might arise out of leading or working on a mixed methods research project.
• The focus is on the practical aspects of such work:
– the importance of teamwork;
– the need to allow for extra time;
– issues around data analysis and integration;
– publishing from mixed methods projects.
http://www.methodspace.com/
Mixed methods research is a research design with philosophical
assumptions as well as methods of inquiry.
As a methodology, it involves philosophical assumptions that guide the
direction of the collection and analysis of data and the mixture of qualitative
and quantitative approaches in many phases in the research process.
As a method, it focuses on collecting, analyzing, and mixing both quantitative
and qualitative data in a single study or series of studies.
Its central premise is that the use of quantitative and qualitative approaches in
combination provides a better understanding of research problems than either
approach alone.
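As a minimal sketch of what this 'mixing' can look like in analysis, the snippet below pairs quantitative questionnaire scores with qualitative interview themes for the same participants; all names and data are hypothetical.

# Hypothetical data: questionnaire means (quantitative) and coded interview
# themes (qualitative) for the same participants.
satisfaction = {"P1": 4.2, "P2": 2.8, "P3": 3.9}   # mean scores on a 1-5 scale
themes = {
    "P1": ["peer support"],
    "P2": ["time pressure", "unclear aims"],
    "P3": ["peer support", "feedback"],
}

# Integration step: read each participant's themes alongside their score,
# e.g. to ask whether low scorers raise different issues from high scorers.
for pid, score in sorted(satisfaction.items(), key=lambda kv: kv[1]):
    print(f"{pid}: score {score:.1f}; themes: {', '.join(themes[pid])}")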
• A methodology refers to the philosophical framework and the fundamental assumptions of research (van Manen, 1990). Because the philosophical framework one uses influences the procedures of research, we define methodology as the framework that relates to the entire process of research.
• Research design refers to the plan of action that links the philosophical assumptions to specific methods (Creswell, 2003; Crotty, 1998). Experimental research, survey research and mixed methods are all research designs.
• Methods, on the other hand, are more specific. They are techniques of data collection and analysis, such as a quantitative standardized instrument or a qualitative theme analysis of text data (Creswell, 2003; van Manen, 1990).
Design theory into practice (from Gaunt)
Two axes: aims and objectives (clear or unclear; wobbly interventions with fuzzy boundaries) and tasks & techniques (predetermined or undecided).
• Cell 1 (aims clear, tasks predetermined): Execution. Know what to do and why it needs to be done.
• Cell 2 (aims clear, tasks undecided): Eliciting. Working with defined questions and choice about design.
• Cell 3 (aims unclear, tasks predetermined): Exploring. Using appropriate design to provide insight and understanding.
• Cell 4 (aims unclear, tasks undecided): Envisioning. Finding direction, vision, in the long term.
Conducted rigorously
• Ethics and governance
• Clarity and logic
– sampling
– data collection
– analysis
– synthesis
– judgements
Evaluator as learner and trader
• Resistance to enquiry
• Understanding the context/values
• Balance between doing & investigating
• Being mindful of the politics
• Awareness of the 'Hawthorne' effect and the impact of the evaluator's presence
Research Governance in Educational Evaluation
Overview of governance in theory (selective)
• Principles of research involving people
• Duties of the investigator
• Enquiry outcomes
• Practicalities of the process
Principles of research involving people
• Scientific basis of the evaluation
• Add to knowledge and understanding of the topic
• Equal respect for everyone involved
• Respecting the autonomy of participants
Duties of the investigator (deontology)
• Telling the truth
• Informed consent
• Confidentiality and anonymity
• Risk v benefit assessment
Enquiry outcomes (consequentialism)
• Results/consequences: short and long term
• Hazards
• Non-participants
• Achievable aims
Practicalities of the process
• Taking account of external codes of practice and the law
• Capability & competence
• Resources
• Open to external scrutiny
Selection of practical matters
Source: Freeth D, Hammick M, Reeves S, Koppel I & Barr H (2005) Effective Interprofessional Education: Development, Delivery and Evaluation. Blackwell Publishing.
• What does informed consent mean in this context?
• How are participants going to be kept informed of the ongoing evaluation processes and emergent findings?
• How will initial consent to participate in the evaluation be recorded, and how will ongoing consent be checked?
• Might any aspects of the evaluation plan feel coercive?
• What level of confidentiality and anonymity can be assured, and how will this be achieved?
• Does the evaluation team have sufficient expertise to conduct this evaluation competently and sensitively?
• Is this evaluation designed to reveal new things that have practical significance?
Makes credible claims
• Collection
• Interpretation
• Judgement
Practice to Product
• Evaluator as mirror
• Challenge of difficult messages
• Different audiences
Choosing an audience
1. For your chosen topic, list all the possible audiences you might write about it for.
2. Choose one and say why.
3. For your chosen audience, decide on a possible publication outlet, and alternatives in case of rejection!
Audience
• Different audiences demand different languages/styles/word counts
• Policy makers, practitioners, funders
• Local, national or international
• Reading time
Overview
• Writing is a creative process
– Telling a story
– Communicating what you know
– Time/material/inspiration/confidence
Overview
• Publishing is a technical process
– Reading (and reviewing) published work
– Patience and organisation
– Focus/skills/audience/realism
Publishing & Writing
Publwrishying
The two r's ...
• Writers and reading
• Readers and writing
The Writing Team
• Ownership of the work
• The idiosyncratic nature of individual writing practices
• Leadership responsibilities
• Sharing the work and sharing out the work
How much to write?
More and more, & then less and less.
• Report: 1/3/25 pages
– key messages / executive summary / full report
• Paper or chapter: 3K words
– abstract / introduction / main text / conclusion
– 150 / 300 / 2,400 / 150
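A quick check of the paper budget: 150 + 300 + 2,400 + 150 = 3,000 words, i.e. the 3K total.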
Tools to help(?)
• Publishers' guidance
• Software potential
– Google Docs
– Track changes
• Choice of format for the final product
– consider your competitors
– electronic and web-based resources, other print media, posters
You already know this but …
• Sentences: are best when they say one thing.
• Paragraphs: are a collection of sentences about that one thing.
• Sections and subsections: are devices to make it easier to introduce different aspects of the one thing.
• Chapters: allow a focus on multiple aspects of that one thing. They need an introduction, a substantial filling and a summary; it's helpful to link each chapter to the previous one (at the beginning) and to the next (at the end), and to use eye-catching headings that indicate their content and keep the reader's interest.
Important
• Your work needs a title
– Is the content really reflected in the title? Is it short enough to be read at a glance?
– Is a running title required by the publisher?
• What are the best sub-titles to use to guide the reader through your work?
Remember
• it's easier to put the full reference in as you cite the work of others than to do this for every citation at the end
• direct quotes need page numbers (and draft versions with page numbers for all citations are useful if these need checking at any stage!)
• electronic sources should have an access date
• et al. is written like this!
Describing your enquiry
• 6 word story
• Proposal abstract (500 words)
One life, 6 words: what's yours?
www.smithmag.net
• Not quite what I was planning
• Running away, best decision I made
• I believe in life before death
• Next time, better parents, better hair
• A brilliant pen but invisible ink
Describing your enquiry
• 6 word story
• Proposal abstract (500 words)
Contact Us
Health Sciences and Practice Subject Centre
[email protected]
www.health.heacademy.ac.uk