Transcript: UWE presentation

Public involvement in research: assessing impact through a realist evaluation
invoNET, 21 February 2012
David Evans, Vito Laterza & Rosie Davies
on behalf of the UWE/Coventry team
Acknowledgements
This project was funded by the National Institute for Health Research (NIHR) Health Services & Delivery Research programme (project number 10/2001/41). The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health.
UWE/Coventry team
• Professor David Evans (principal investigator), UWE
• Professor Jane Coad, Coventry
• Dr Jane Dalrymple, UWE
• Ms Rosie Davies, research partner
• Ms Chris Donald, research partner
• Professor Sarah Hewlett, UWE
• Mr Vito Laterza, UWE
• Dr Amanda Longley, UWE
• Professor Pam Moule, UWE
• Dr Katherine Pollard, UWE
• Dr Jane Powell, UWE
• Ms Ruth Sayers, research partner
• Ms Cathy Rice, research partner
Focus of this session
• Initial thoughts on reflexivity
• Background to our project, design and conceptual framework
• Reflections on early learning
– Theory development and testing
– Objectivity and contamination
– Working with research partners
Reflexivity and reflective practice
• “Reflexivity” – not the same as reflective practice:
• Thinking about ourselves (i.e. reflective practice) and learning from that (i.e. doing things better through reflective practice)
• But also linking ourselves to our research participants (not just what we can do better, but thinking about others through our own experience)
• Thinking about thinking (i.e. epistemology): reflecting on the basic structures and processes behind the taken-for-granted everyday reality we live in and study in our research contexts
Background
• Team based at the University of the West of England (UWE) and Coventry University
• Grew out of the existing UWE Service User and Carer Involvement in Research initiative
• Nine academic researchers and four research partners (users) as co-applicants/co-researchers
Design
• Realist evaluation framework
• 18-month project
• Eight case studies
• Mainly qualitative methods
– Semi-structured interviews (c. 5 participants per case study x 3 interviews over one year)
– Observation
– Documentary analysis
– Consensus workshops
• Economic analysis
Realist evaluation
• Policy driven by an underlying theory of how an initiative is supposed to work
• Role of the evaluator to compare theory and practice
• “What works for whom in what circumstances and in what respects?”
• Look for regularities of context, mechanism and outcome (CMO)
(Pawson 2006; Pawson & Tilley 1997; 2008)
Levels of public involvement in research theory
• Policy level – what do DH, NIHR and other senior R&D stakeholders think are effective mechanisms leading to desired policy outcomes?
• Programme/project level – what do stakeholders (e.g. PIs, research teams, research partners) think involvement contributes to their desired outcomes?
• Academic level – what are the dominant academic theories in the literature about public involvement mechanisms in research and whether/how they work?
• Our research team – what do we think are the context-specific and generalisable mechanisms leading to desired policy outcomes?
Our CMO theory – to date
Context:
• Leadership on involvement by the PI or other senior member of the research team
• Attitudes of trust and respect towards the public involved
• Culture of valuing and support for involvement
• Infrastructure that supports involvement, e.g. policy on payment and expenses
Mechanism:
• Involvement throughout a research project
• Long-term involvement
• Training and support
• Linking involvement to decision making
• Budget for involvement
• Defined roles
Outcome – impact on research design and delivery:
• Project design
• Research tools
• Recruitment
• Data collection
• Analysis
• Writing up
• Dissemination
Our task – articulate and test public involvement in research theory
• Articulate the policy-level programme theory from policy documents, actions, etc. – it’s about research quality, not empowerment
• Synthesise what we know about contexts, mechanisms and outcomes in practice from the literature – recognising complexity and uncertainty
• Simplify and express the theory in a form that can be tested in the case studies
• Collect and analyse case study data
• Revise the theory and repeat the process
Reflections on theory
development and testing
• Difficulty of categorising factors as context or mechanism
• Multiple contextual factors and mechanisms, making causal attribution difficult
• Time period for data collection shorter than project timescales
• Impact may be diffuse rather than specific
• Outcomes may be quite limited
‘Objectivity’ and ‘contamination’
At this stage, three related reflexive effects emerged:
• Leading members of the team are also experts who provide advice and training on public involvement in research to researchers and PIs (i.e. “now that you study us, can we still ask for advice?”)
• The questions we are asking are triggering processes of reflection that are likely to have an impact on the ongoing shaping of public involvement structures and mechanisms (i.e. “Mmm, that’s a good question, I haven’t thought about that” or “I will certainly consider these issues further”)
• Varying responses on potential issues of overlap (or “contamination”) from the participants themselves (i.e. some do not seem particularly concerned, while others are more concerned about keeping our study process from influencing the ongoing process of public involvement under study)
Reflections on ‘objectivity’ and
‘contamination’
Implications for questions of ‘objectivity’ in qualitative health research:
• Reflexive effects need to be taken into account and productively explored as data, rather than discarded as “unwanted bias”: this is why we decided to keep reflection going, rather than adopt a unilateral policy (i.e. we will continue to give advice, and we are aware that participants might change their behaviour because of the questions and reflections emerging from data collection; these effects will be followed up where possible)
• We will respect research participants’ wishes on the matter: if they want us to put specific measures in place to reduce any possible influence, we aim to accommodate that
• We will protect research participants’ confidentiality and anonymity in all cases, and this will always take precedence over questions of reflexivity and objectivity.
Involving research partners
• Contributed from the design stage onwards
• Each research partner works with one academic researcher and Vito on two of the eight case studies
• Research partners meet together as a group in addition to attending full team meetings
• Involvement in all aspects of the project, including theory building, conducting interviews and analysis
• Reflection on the team’s process of involvement, issues and outcomes included as data
• Support from Vito and a named researcher on the team
Reflections on involving
research partners
• Role development within the institution at UWE
– Extending formal arrangements because of our need for research passports
• Difficult to keep track of impact!
• Differences between case study teams: levels of research experience, life experiences, kinds of contribution ...
• Complex roles and relationships
Contact details
• David Evans, Professor in Health Services Research (Public Involvement)
[email protected]
• Vito Laterza, Research Fellow
[email protected]
• Rosemary Davies, Research Partner
[email protected]