
Issues and Challenges around Appraising
Qualitative Research
ESQUIRE Sheffield 4 September 2014 11:50-12:30
Ruth Garside
Senior Lecturer in Evidence Synthesis
Talk structure
Quality appraisal:
• Should we do it?
• How can we do it?
• What are the challenges?
• Next steps?
Should we?
• Do we need to distinguish between high-quality and poor-quality research?
• Standards for systematic reviews generally.
• Precedent?
Review of published reviews of qualitative
research
• Of 42 studies:
– 21 did not describe appraisal of studies
– 6 explicitly mentioned not conducting formal appraisal
of studies
– 5 papers did a critical appraisal, but did not use a formal
checklist
– 7 described modifying existing instruments
– 1 used an existing instrument without modification
Dixon-Woods M, et al. Synthesizing qualitative research: a review of published reports. Qual Res 2007; 7:375
How? Breakout groups
• What makes qualitative research “good quality”?
Not a new issue for qualitative researchers!
Suggested validity criteria, by author:
Altheide & Johnson 1994: Plausibility, relevance, credibility, importance of topic.
Eisenhart & Howe 1992: Completeness, appropriateness, comprehensiveness, credibility, significance.
Leininger 1994: Credibility, confirmability, meaning in context, recurrent patterning, saturation, transferability.
Lincoln 1995: Positionality, community as arbiter, voice, critical subjectivity, reciprocity, sacredness, sharing perquisites of privilege.
Lincoln & Guba 1985, 1989: Truth value, applicability, consistency, neutrality.
Marshall 1990: Goodness, canons of evidence.
Maxwell 1992, 1996: Descriptive validity, interpretive validity, theoretical validity, evaluative validity, generalizability.
Sandelowski 1986, 1993: Credibility, fittingness, auditability, confirmability, creativity, artfulness.
Smith 1990: Moral and ethical component.
Thorne 1997: Methodological integrity, representative credibility, analytic logic, interpretive authority.
Whittemore et al. Validity in Qualitative Research. Qual Health Res 2001; 11(4): 522-537
Example checklists
CASP Qualitative research checklist
1. Was there a clear statement of the aims of the research?
Hint: consider
• What was the goal of the research?
• Why it was thought important?
• Its relevance.
2. Is a qualitative methodology appropriate?
Hint: consider
• If the research seeks to interpret or illuminate the actions and/or subjective experiences of research participants.
• Is qualitative research the right methodology for addressing the research goal?
Is it worth continuing?
Critical Appraisal Skills Programme
http://media.wix.com/ugd/dded87_951541699e9edc71ce66c9bac4734c69.pdf
(Each question is answered Yes / No / Can't tell.)
3. Was the research design appropriate to address the aims of the research?
Hint: consider
• If the researcher has justified the research design (e.g. have they discussed how they decided which method to use)?
4. Was the recruitment strategy appropriate to the aims of the research?
Hint: consider
• If the researcher has explained how the participants were selected.
• If they explained why the participants they selected were the most appropriate to provide access to the type of knowledge sought by the study.
• If there are any discussions around recruitment (e.g. why some people chose not to take part).
5. Was the data collected in a way that addressed the research issue?
Hint: consider
• If the setting for data collection was justified.
• If it is clear how data were collected (e.g. focus group, semi-structured interview etc.).
• If the researcher has justified the methods chosen.
• If the researcher has made the methods explicit (e.g. for interview method, is there an indication of how interviews were conducted, or did they use a topic guide)?
• If methods were modified during the study. If so, has the researcher explained how and why?
• If the form of data is clear (e.g. tape recordings, video material, notes etc.).
• If the researcher has discussed saturation of data.
6. Has the relationship between researcher and participants been adequately considered?
Hint: consider
• If the researcher critically examined their own role, potential bias and influence during (a) formulation of the research questions and (b) data collection, including sample recruitment and choice of location.
• How the researcher responded to events during the study and whether they considered the implications of any changes in the research design.
7. Have ethical issues been taken into consideration?
Hint: consider
• If there are sufficient details of how the research was explained to participants for the reader to assess whether ethical standards were maintained.
• If the researcher has discussed issues raised by the study (e.g. issues around informed consent or confidentiality, or how they have handled the effects of the study on the participants during and after the study).
• If approval has been sought from the ethics committee.
8. Was the data analysis sufficiently rigorous?
Hint: consider
• If there is an in-depth description of the analysis process.
• If thematic analysis is used. If so, is it clear how the categories/themes were derived from the data?
• Whether the researcher explains how the data presented were selected from the original sample to demonstrate the analysis process.
• If sufficient data are presented to support the findings.
• To what extent contradictory data are taken into account.
• Whether the researcher critically examined their own role, potential bias and influence during analysis and selection of data for presentation.
9. Is there a clear statement of findings?
Hint: consider
• If the findings are explicit.
• If there is adequate discussion of the evidence both for and against the researcher's arguments.
• If the researcher has discussed the credibility of their findings (e.g. triangulation, respondent validation, more than one analyst).
• If the findings are discussed in relation to the original research question.
10. How valuable is the research?
Hint: consider
• If the researcher discusses the contribution the study makes to existing knowledge or understanding (e.g. do they consider the findings in relation to current practice or policy, or to relevant research-based literature?).
• If they identify new areas where research is necessary.
• If the researchers have discussed whether or how the findings can be transferred to other populations, or considered other ways the research may be used.
In small groups discuss:
• Are there any challenges to using these criteria?
• Do they assess “quality”?
– Why? / Why not?
Example checklists
Wallace A, et al. Meeting the challenge: developing systematic reviewing in social policy. Policy and Politics 2004; 32(4): 455-470
Challenges
1). Research community agreement
Standards for qualitative research have variously
emphasized literary and scientific criteria, methodological
rigor and conformity, the real-world significance of the
questions asked, the practical value of the findings, and
the extent of involvement with, and personal benefit to,
research participants.
Sandelowski M, Barroso J. Handbook for Synthesizing Qualitative Research. New York: Springer; 2007
Challenges
2). Lack of fit between systematic review and qualitative
researcher priorities
Challenges
3). What are we actually appraising?
– Lack of distinction between reporting standards and conduct.
– Applying one standard to a discipline with different standards.
– Different purposes: theory generation vs pragmatic questions.
– Many checklists give multiple sample “guidance” points for each question but dichotomous scores.
Challenges
4). Interpretation required
Comparing 3 checklists:
Agreement in categorizing papers was slight….Structured
approaches did not appear to yield higher agreement than
unprompted judgement.
Dixon-Woods M, et al. J Health Serv Res Policy 2007; 12(1): 42-47
Challenges
5). What do we do with “poor quality” studies?
Variously:
• Exclude
• “Weight”
• Test through contribution to the synthesis
A proposal:
• Technical aspects
• Trustworthiness
• Theoretical considerations
• Practical considerations
Garside R. Should we appraise the quality of qualitative research reports for systematic reviews, and if so, how? Innovation: The European Journal of Social Science Research 2014; 27(1): 67-79
1. Technical aspects:
(Each item is answered Y/P/N, with comments.)
1. Is the research question(s) clear?
2. Is the research question(s) suited to qualitative enquiry?
Are the following clearly described?
3. Context
4. Sampling
5. Data collection
6. Analysis
Adapted from:
Dixon-Woods et al. The problem of appraising qualitative research. Qual Saf Health Care 2004; 13: 223-225
& Popay J. Using Qualitative Research to Inform Policy and Practice. ONS, Cardiff: April 2008
2. Trustworthiness
For example:
• Are the design and execution appropriate to the research
question?
• What evidence of reflexivity is there?
• Do the voices of the participants come through?
• Are alternative interpretations, theories, etc. explored?
• How well supported by the data are any conclusions?
• Are ethical considerations given appropriate thought?
• etc.
3. Theoretical considerations
For example:
• Does the report connect to a wider body of knowledge or an existing theoretical framework and, if so,
– is this appropriate (e.g. not uncritical verification)?
• Does the paper develop explanatory concepts for the findings?
• etc.
4. Practical considerations
Not “is this research valid?” but rather “what is this
research valid for?”
For example:
• Does this study usefully contribute to the policy
question?
• Does this study provide evidence relevant to the policy
setting?
• Does this study usefully contribute to the review?
Adapted from: Aguinaldo JP. Rethinking Validity in Qualitative Research from a Social Constructionist Perspective:
From "Is this valid research?" to "What is this research valid for?". The Qualitative Report 2004; 9(1):127-136.
Thank you!
[email protected]
www.ecehh.org