Transcript Slide 1

BIO45003. APPLIED EPIDEMIOLOGY
UNDERSTANDING SCIENTIFIC LITERATURE AND
INTRODUCTION TO SYSTEMATIC LITERATURE
REVIEWS
This lecture will help you become familiar with how
scientific research is disseminated and will give you the
tools to read and interpret scientific papers.
The second part of the lecture will be devoted to
systematic literature reviews and meta-analysis.
UNDERSTANDING SCIENTIFIC LITERATURE
• Scientific literature is the most formal way to report and
disseminate scientific activity and its findings
• Other forms of scientific reporting: communications at
conferences, technical meetings, expert groups,
scientific committees, etc.
MOST COMMON FORMS OF SCIENTIFIC LITERATURE
1. The Research Paper
2. The Review Article (and/or meta-analysis)
3. The Case Report
4. The Opinion Paper
1. THE RESEARCH PAPER
1.1. Definition and general principles
- Articles reporting original research studies
- Target audience: the professional and scientific community (not a general audience)
- Subject to peer review
- Length subject to journal limits [currently max 3,500 words]
- Conventional format: IMRAD (Introduction, Materials and Methods, Results and Discussion)
- Recommended by the International Committee of Medical Journal Editors
- Especially advisable for experimental and observational studies
- What does the IMRAD structure answer? (Hill AB, 1965)
Why did you start? (Introduction)
What did you do? (Materials and Methods)
What answers did you get? (Results)
What does it mean anyway? (Discussion)
1.2. Structure and contents
a) INTRODUCTION
- Provides the background information and the aim (and hypothesis)
- Four rules to decide the contents:
- Tell the reader why the research was started
- Do not explain what can be found in textbooks of the field
- Do not elaborate on terms already in the title
- Make clear what question the study was designed to answer
- Subliminal message to the reader: “This topic is important” and “I know what I’m talking about”
- Text fully based on references. Not a single sentence without a reference!
- RQ, aims [and hypothesis] at the very end of the section.
- No graphical aids
b) MATERIALS AND METHODS
- Answers the question “What did you do?”
- High level of detail
- Logical sequence:
- Definition of the design [in relation to the RQs]
- Definition of the state, condition or intervention to be studied [case definition]
- Definition of the subjects
- Definition of the methods to select subjects
- Definition of the intervention(s), if any
- Definition of all observations to be made (detail how they were made)
- Definition of the plan of data analysis
- Subliminal message to the reader: “It is well structured, comprehensive, specific, detailed and follows standard methods”
- References when needed [about the method(s) used].
c) RESULTS or FINDINGS
- Answers the RQs
- Reports on the new evidence generated exclusively in this study
- Increases efficiency by using tables, graphs and figures
- Text should report summary results and critically examine important results and data
- Sequence depends on the design. Generally:
- 1st “section”: descriptive analysis (e.g. characteristics of the sample)
- 2nd “section”: results of the analyses aiming to answer the RQs
- Subliminal message to the reader: “Strong results. This study generates new evidence”
- The narrative content often “helps” the reader interpret the significance of the findings (e.g. “53%” is not the same as “over half of the…”; “6” is not the same as “only 6”)
- No references
d) DISCUSSION
- Opening sentence: narrative answer to the research question(s)
- Puts the results against previous evidence (supporting or contradicting the current findings)
- Provides thorough self-criticism of the study (general to the design used and specific to the limits of this study)
- Closing sentences:
- Implications: should your findings alter existing treatments? Existing preventive interventions? Modify health policies? etc.
- Suggestions: what should happen next? Next research questions? Next studies to be designed and implemented?
- Subliminal message to the reader: “The location of my ‘strong’ results within previous knowledge is important”, “I am an expert in the topic”, “In spite of its limits, this is a study worth taking into account among the evidence in the field”
- References when needed
OTHER ELEMENTS
TITLE
- Types:
- Indicative: tells the reader what the paper is about
- Informative: tells the reader what the paper is about in more detail
ABSTRACT
• Indicative: tells the reader what the manuscript is about (reviews…)
• Informative: Summarizes what the paper actually says (for research
papers, case reports, systematic literature reviews…)
• Structure: IMRAD structure or similar (with or without headings)
• OTHERS:
• Authors’ names and affiliations, key words, acknowledgements, sources of support, declaration of conflict of interest (if any), and references
2. THE CASE REPORT
• Papers presenting and describing individual cases
Main types:
• The unique (or nearly unique) case that appears to
represent a previously undescribed syndrome or
disease
• The case with an unexpected association of two or
more diseases or disorders that may represent a
previously unsuspected causal relationship
• The case representing a new and clinically important
variation from an expected pattern. The “outlier” case.
• The case with an unexpected evolution that suggests a
possible therapeutic or an important adverse drug
effect.
2. THE CASE REPORT [cont.]
Structure:
• A statement of why the case is worth reading about
• An account of the case, with all the relevant data
• Discussion of the evidence that the case is unique or unexpected
• Possible alternative explanations for the case’s features
• Conclusion, with implications
A subtype, the case-series analysis: a paper based on a retrospective study of case records.
3. THE OPINION PAPER
- Papers with no scientific demands (but logical, argumentative and theoretical…)
- Not aimed at contributing to evidence
- Main types:
- The editorial
- The position paper
- The book review
- The letter to the editor
4. THE REVIEW PAPER (and META-ANALYSIS)
Type of paper that reports on the state of the art of a specific topic
Types:
• Descriptive (narrative reviews) and Systematic reviews
Very important hint!:
A systematic literature review does not summarize the findings from
other studies but EXTRACTS DATA FROM STUDIES to create a
NEW STUDY
Structure:
• Adaptation of the IMRAD format specific of systematic reviews:
1. Objective
2. Data source
3. Study selection
4. Data extraction
5. Data synthesis
6. Conclusions
A. Systematic literature reviews. Structure and contents
1. Introduction
• Background (it doesn’t include findings from papers that will be included in the review itself)
• Aims or objectives: to find out/review/assess the available evidence regarding… x
Review questions, aims and objectives [and hypothesis]
- Aims (general statement that goes at the end of the introduction section) and objectives (detailed description that goes in the methods section)
• The aim is always to explore/analyse/investigate the available evidence regarding x, y, z.
• Research/review question(s) must be clearly stated. It is always something like “What is the available evidence regarding x, y and z?” (“what is known”*)
*But don’t accept this wording: I use it for teaching purposes only, and students seem to forget it.
E.g.: TOPIC: HIV AMONG PREGNANT WOMEN IN SOUTH AFRICA.
• RQ1: What is the evidence regarding prevalence of HIV among
pregnant women in South Africa in the last decade? (screening
studies: ad hoc prevalence studies, data from monitoring systems,
“cross-sectional studies”…)
• RQ2: What is the knowledge regarding the local prevalence of HIV in
pregnant women in the x and z South African districts among health
workers? (surveys)
• RQ3: What is the perception of the burden of the disease among
women attending women’s facilities for ART provision? (qualitative
studies!)
2. Methods. Definitions
• The units of analysis of syst lit reviews: THE STUDIES
INCLUDED.
• The outcome variable(s) are those that answer the RQ [e.g. prevalence, incidence, effectiveness…]
• Each outcome variable is measured by a single coefficient
• Some examples of such coefficients:
• OR
• RR
• % [prevalence is an example]
• $
• cases/person-time
• survival time
• units distributed
• vaccines delivered
• % of potential cases reached [for coverage]
2. Methods [cont.]
• Databases [data sources]
• Search strategies
• Limits [study selection]: temporal, geographical, subsets of the specific topic, and so on
• Exclusion/inclusion criteria (for papers or units of analysis) not linked to content (e.g. designs, recruitment settings…)
• Data extraction/ data synthesis
• Assessment criteria
• Software used [for storage and management of
bibliographic references, for graph production, for
data extraction…]
Sources of information and search strategies
(a) Sources of information (databases, websites… used)
(b) Search strategies used, including the key words and combinations of
words in each search strategy applied (at least two search strategies).
Include a good explanation of how the search strategies were developed
until the final number of units of analysis was reached. They should cover
not only formal search strategies but further searches too (e.g. in websites,
manual searches of the reference lists of papers screened…)
E.g. of search strategy in PubMed:
#1 Search: HPV. Limits: none. RESULTS = 14,543
#2 Search: HPV . Limits: Clinical Trial, English. RESULTS = 462
#3 Search: HPV AND attitudes. Limits: Clinical Trial, English.
RESULTS = 10
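As an illustration of how such a documented strategy could be reproduced programmatically, the sketch below runs the three slide queries against PubMed with Biopython's Entrez module. The field tags ([pt] for publication type, [la] for language) and the contact email are my own assumptions, and the counts returned today will differ from the slide's figures.

```python
# Minimal sketch: re-running a documented PubMed search strategy with Biopython.
from Bio import Entrez

Entrez.email = "your.name@example.org"  # NCBI asks for a contact address

# Queries mirror the slide's #1-#3 example; field tags are assumed.
queries = {
    "#1": "HPV",
    "#2": "HPV AND clinical trial[pt] AND english[la]",
    "#3": "HPV AND attitudes AND clinical trial[pt] AND english[la]",
}

for label, term in queries.items():
    handle = Entrez.esearch(db="pubmed", term=term, retmax=0)  # counts only
    record = Entrez.read(handle)
    handle.close()
    print(f"{label} {term}: {record['Count']} records")
```

Recording the query string and the number of hits at each step gives exactly the kind of audit trail the methods section of a systematic review asks for.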
WHAT TO EXTRACT FROM EACH UNIT OF ANALYSIS?
WHERE DOES EACH EXTRACTED BIT GO?
UNIT OF ANALYSIS → SYSTEMATIC LITERATURE REVIEW
- Introduction → Introduction
- Methods → Methods
- Results → Results
- Discussion → Discussion
Legend on the slide: “= It might” / “= Always”
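To make “data extraction” concrete, here is a minimal sketch of an extraction record kept for each unit of analysis; the field names and the example values are invented for illustration and are not part of the lecture.

```python
from dataclasses import dataclass

@dataclass
class ExtractionRecord:
    study_id: str          # e.g. first author and year
    design: str            # cross-sectional, cohort, RCT...
    setting: str           # country / recruitment setting
    sample_size: int
    outcome_name: str      # the coefficient that answers the RQ (e.g. prevalence)
    outcome_value: float   # e.g. prevalence in %, OR, RR...
    ci_95: tuple           # (lower, upper) if reported
    quality_score: str     # result of the critical appraisal

# A purely invented example record for one included study
record = ExtractionRecord(
    study_id="Author A, 2010",
    design="cross-sectional study",
    setting="antenatal clinics",
    sample_size=1200,
    outcome_name="HIV prevalence",
    outcome_value=24.5,
    ci_95=(22.1, 27.0),
    quality_score="medium",
)
print(record)
```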
3. Results
a. Availability of information
• What was found applying search strategy/ies?
b. Findings
• Findings include: findings from data extraction, from quality
assessment [or critical appraisal], and from heterogeneity
and sensitivity analyses [everything done!]
• Open with a general description of the findings
[taking all units of analysis together] that answers the
main RQs.
• Then give a specific description of the findings [according
to the limits defined, paying attention to outliers, in terms
of the strength of evidence provided by the units of
analysis…]
4. Discussion (conclusion)
• What do all these data mean?
• What do we know?
• What do we not know?
• What do we do next?
CONDUCTING SYSTEMATIC REVIEWS. MAIN STEPS
1. Formulate a research question
2. Choose search strategy and inclusion/exclusion criteria (protocol)
4. Conduct a computerized search
5. Conduct a supplementary search
6. Assess methodological quality of papers found
7. Synthesize findings
8. Formulate recommendations
Source: Adapted from Sim J and Wright C. Research in Health Care: Concepts, Designs and Methods. 2000
B. META-ANALYSIS
• Papers that use statistical methods to combine
pooled datasets (results from other studies) and
analyse them to reach a single observation for the
aggregated data (Bowling A, 2002).
• Can be done with or without a systematic lit. review
• Especially important because:
a) They overcome the effects of sample size (not by
simply adding the samples: n+n+n…? No!)
b) They overcome the effects of specific
treatment/intervention settings (e.g. community,
hospitals, diagnostic units, drug treatment
centres…)
META-ANALYSIS, A TWO-STAGE PROCESS
First stage:
Extraction of data from each individual study and the calculation (if
not given in the document) of the result for that study (the ‘point
estimate’ or ‘summary statistic’) (e.g.: RR, OR…), with an estimate
of the chance variation we would expect with studies like that (the
CI).
Second stage:
Deciding whether it is appropriate to calculate a pooled average
result across studies and, if so, calculating and presenting such a
result [heterogeneity analysis].
BUT! The decision needs to be made bearing in mind that a
meta-analysis is not about adding together the results from quite
different studies and calculating a summary statistic. A meta-analysis
looks at the results within each study and then calculates a
weighted average.
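A minimal numerical sketch of that second stage, assuming a fixed-effect (inverse-variance) model and invented study results, might look like this:

```python
# Fixed-effect (inverse-variance) pooled estimate from study-level RRs and CIs.
import math

# (RR, lower 95% CI, upper 95% CI) extracted from each hypothetical study
studies = [(1.8, 1.1, 2.9), (2.2, 1.4, 3.5), (1.5, 0.9, 2.5)]

log_rr = [math.log(rr) for rr, lo, hi in studies]
# Standard error recovered from the reported 95% CI on the log scale
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for rr, lo, hi in studies]
weights = [1 / s**2 for s in se]                      # inverse-variance weights

pooled_log = sum(w * y for w, y in zip(weights, log_rr)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled RR = {math.exp(pooled_log):.2f} "
      f"(95% CI {math.exp(pooled_log - 1.96 * pooled_se):.2f}"
      f"-{math.exp(pooled_log + 1.96 * pooled_se):.2f})")
```

Each study contributes in proportion to its precision, which is exactly the “weighted average” the text describes rather than a simple addition of results.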
Heterogeneity and Sensitivity analysis
(a) Heterogeneity analysis (HA) is a pre-meta-analysis
step to decide whether findings from different studies
can be pooled to produce the meta-analysis. Graph:
FOREST PLOT
(b) Sensitivity analysis (SA) is a general label to describe
efforts made to decide whether the criteria used to
select the units of analysis were correct or, in fact,
introduced a bias. Graph: FUNNEL PLOT
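As an illustration of the funnel plot mentioned above, the following sketch (with invented effect sizes; the pooled value drawn is arbitrary) plots each study's effect estimate against its standard error. A roughly symmetrical inverted funnel is reassuring, while asymmetry suggests the selection of units of analysis may have introduced bias.

```python
# Minimal funnel plot sketch: effect estimate vs standard error per study.
import matplotlib.pyplot as plt

# Hypothetical (log OR, standard error) pairs for the included studies
effects = [0.10, 0.25, 0.40, 0.55, 0.15, 0.70, 0.30]
std_err = [0.08, 0.15, 0.22, 0.30, 0.12, 0.35, 0.18]

plt.scatter(effects, std_err)
plt.axvline(x=0.3, linestyle="--")   # illustrative pooled estimate
plt.gca().invert_yaxis()             # most precise studies at the top, by convention
plt.xlabel("Effect estimate (log OR)")
plt.ylabel("Standard error")
plt.title("Funnel plot (sketch)")
plt.show()
```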
Clinical heterogeneity
Do all the studies found really address the same question (so that
an average of their results would be sensible)? There might be
differences in the participants, interventions or outcomes that
make the researcher conclude that the studies are too different
and that, therefore, it is not appropriate to pool their results.
E.g.:
Study A tests the effectiveness of a treatment “X” to help pregnant
women to give up smoking
Study B tests the effectiveness of a treatment “X” to help teenagers
to give up smoking
In spite of the same treatment “X” being tested in the two studies,
the populations studied are too different. Therefore, pooling
results from the two studies is not appropriate.
Statistical heterogeneity
Are the results from different studies consistent?
To what extent?
Attention should be paid to studies whose results
don’t seem to fit (they are too different); this is what
heterogeneity is about.
It is usually inappropriate to calculate an average effect (that
is, to perform a meta-analysis) if there is a large amount of
heterogeneity.
IDENTIFYING STATISTICAL HETEROGENEITY
Two main ways:
1. By looking at a forest plot to see how well the confidence
intervals overlap. If the confidence intervals of two studies
don’t overlap at all, there is likely to be more variation between
the study results than one would expect by chance (unless
there are lots of studies), and heterogeneity should be suspected.
A visual inspection of the confidence intervals will help you get an
idea of the amount of statistical heterogeneity and guide you in
thinking about whether it is reasonable to combine the
results of these studies. When trials are ‘too different’ (e.g. the CIs
don’t overlap, or overlap little), the reviewer should conclude
that the strength of the evidence is in doubt (at least!). The
other conclusion is, of course, that a meta-analysis is not
advisable.
IDENTIFYING STATISTICAL HETEROGENEITY [cont]
2. By performing a statistical test:
a. A useful way to identify heterogeneity is to perform a χ²
(“chi-square”) test. We compare the chi-square statistic with its
degrees of freedom: if the statistic is larger than its degrees of
freedom, there is evidence of heterogeneity. This test has
sensitivity problems [heterogeneity goes under-detected].
b. As an alternative, the I² statistic was developed. It scores
heterogeneity between 0% and 100%. A rule of thumb: 25% =
low, 50% = moderate, and 75% = high heterogeneity. It was later
found that this statistic also fails to detect part of
the heterogeneity.
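Both checks can be computed directly from the study-level estimates. The sketch below (with invented log relative risks and standard errors, reusing the inverse-variance weighting from the earlier sketch) calculates Cochran's Q, compares it with its degrees of freedom, and derives I²:

```python
# Cochran's Q and I² from study-level estimates (values are illustrative only).
log_rr = [0.59, 0.79, 0.41, 1.25]   # log relative risks from 4 hypothetical studies
se = [0.25, 0.23, 0.26, 0.22]       # their standard errors
w = [1 / s**2 for s in se]          # inverse-variance weights

pooled = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)

# Cochran's Q: weighted squared deviations of each study from the pooled estimate
Q = sum(wi * (yi - pooled) ** 2 for wi, yi in zip(w, log_rr))
df = len(log_rr) - 1

# I²: proportion of variability beyond what chance (df) would explain, floored at 0
I2 = max(0.0, (Q - df) / Q) * 100

print(f"Q = {Q:.2f} on {df} df  ->  heterogeneity suspected if Q > df")
print(f"I2 = {I2:.0f}%  (~25% low, 50% moderate, 75% high)")
```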
Heterogeneity analysis. Identifying statistical heterogeneity by looking
at the forest plot
[Forest plot: effectiveness of treatment “X” to give up smoking; one row per
included study showing the relative risk (95% CI), with a χ² test for
heterogeneity; axis labelled from “treatment worse” to “treatment better”.]
FOREST PLOT, AS PRESENTED IN A META-ANALYSIS
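A sketch of how a forest plot of this kind could be drawn is given below; the study names, relative risks and confidence intervals are invented placeholders, not the values from the figure above.

```python
# Minimal forest plot sketch: one row per study, point estimate with 95% CI.
import matplotlib.pyplot as plt

studies = ["Study A", "Study B", "Study C", "Study D"]
rr      = [1.8, 2.4, 0.9, 3.1]
ci_low  = [1.1, 1.5, 0.5, 1.9]
ci_high = [2.9, 3.8, 1.6, 5.0]

y = list(range(len(studies)))
xerr = [[r - lo for r, lo in zip(rr, ci_low)],     # distance to lower CI bound
        [hi - r for r, hi in zip(rr, ci_high)]]    # distance to upper CI bound

plt.errorbar(rr, y, xerr=xerr, fmt="s", capsize=3)
plt.axvline(x=1, linestyle="--")                   # line of no effect
plt.yticks(y, studies)
plt.xscale("log")
plt.xlabel("Relative risk (95% CI)")
plt.title("Forest plot (sketch)")
plt.gca().invert_yaxis()                           # first study at the top
plt.show()
```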
WEEKLY READING
• Petticrew M and Roberts H (2006). Systematic reviews in the social sciences: a
practical guide. Malden, MA; Oxford: Blackwell Publishing.
From the Systematic reviews journal. www.systematicreviewsjournal.com/
• Machingaidze et al. (2013). Understanding interventions for improving
routine immunization coverage in children in low- and middle-income countries:
a systematic review protocol. Systematic Reviews, 2013, 2:106. At:
www.systematicreviewsjournal.com/content/2/1/106
• Hoytema van Konijnenburg et al. (2013). Insufficient evidence for the use of a
physical examination to detect maltreatment in children without prior suspicion: a
systematic review. Systematic Reviews, 2:109. At:
www.systematicreviewsjournal.com/content/2/1/109
See next slide!
ALSO
• Sand-Jensen K (2007). How to write consistently boring scientific literature.
Oikos 116: 723-727.
• Crombie IK and Davies HTO (2009). What is meta-analysis? Hayward
Medical Communications, Hayward Group Ltd. “What is…?” series. [Worth
checking out: http://www.whatisseries.co.uk/whatis/]
• Lapadula, G. et al. (2007) Dideoxynucleoside HIV reverse transcriptase
inhibitors and drug-related hepatotoxicity: a case report. Journal of Medical
Case Reports , 1:19
• The Cochrane Collaboration (2002). Open learning material for reviewers
Version 1.1. November 2002.
• Julian PT Higgins and Sally Green, Eds. Cochrane Handbook for
Systematic Reviews of Interventions Version 5.0.2 [updated September
2009]. At: http://www.cochrane-handbook.org/
• CRD (2008). Guidance for undertaking reviews in health care.
• Hymes KB et al. (1981). Kaposi's sarcoma in homosexual men: a report of eight
cases. Lancet 1981 Sep;2(8247):598-600.
• Brown P (2006). How to formulate research recommendations. BMJ, Vol. 333,
14 Oct 2006, pp. 804-806.
• The Lancet (2007). Medicines for children: safety as an afterthought. The
Lancet, Vol. 370, 6 October 2007, p. 1190.