Health services based trials: introductory course


Transcript: Health services based trials: introductory course

Health services based trials (HSB):
introductory course
Elina Hemminki
THL (National Institute for Health and Welfare), Universities of Helsinki and Tampere
Helsinki, May 11, 12 and 18, 2009
Department of Public Health, Helsinki
Structure of the course
Monday 11, Elina Hemminki
• Introduction of participants and interests
• Introductory lecture
• Assignment of group/ individual work
• Start of group/ individual work
Tuesday 12, Diana Elbourne, Nea Malila,
Minna Kaila
Monday 18, student presentations
• presentations + discussion
• what next
Volume and credits
Guided work 12 hours
Independent work 10 hours
• reading and commenting on an article
• project plan (shared work)
• course diary (a brief description of what you learnt or wondered about from the lectures or the topic)
Credits (to be clarified): if all of the above:
..credits.
If 60-90%: ..credits
Outline of the introductory lecture
• Trials: Experimental vs non-experimental
• Various kinds of trials
• HSB: why, features
• Examples
Material:
• 3 papers for exercise 1
• Others which may be of interest
• List of further reading (later)
• Copies of presentations
Trials: Experimental vs nonexperimental
1. Introduction
2. Doing a trial
3. Difficulties
4. Additional issues
5. Impact of trials
6. Exercises
What does a trial mean?
• Intervention to group(s), created by the
researcher (><natural trials)
• Usually, two or more (>< uncontrolled
trial)
• Good trials: comparable groups (same
calendar time)
• Most common allocation method: randomization (why: it balances also unknown predictors); a minimal allocation sketch follows this list
• Different kinds of trials
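As a concrete illustration of the randomized allocation mentioned above, here is a minimal Python sketch (not part of the lecture material; the participant identifiers are made up): shuffle the participants and split them into two equally sized groups, so that known and unknown predictors are balanced on average.

```python
import random

# Hypothetical participant identifiers, for illustration only
participants = [f"person_{i}" for i in range(1, 9)]

random.shuffle(participants)              # chance alone decides the order
half = len(participants) // 2
groups = {
    "intervention": participants[:half],  # first half receives the intervention
    "control": participants[half:],       # second half serves as control
}
print(groups)
```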
History
• First modern clinical trial 1940s (TB and
streptomycin, MRC in UK)
• blood letting, scurvy and vitamin C
• later non-clinical health trials
• agriculture etc earlier
• www.jameslindlibrary.org
Why trials?
• Unbiased information
• Especially important in case of therapy
and other intentional activities (selection
bias)
• “All new interventions in health care (and those old ones not previously studied) should be introduced via trials”
• Simple in theory, difficult in practice
Advantages
• With an example
• postmenopausal hormone therapy
• used for decades based on nonexperimental data
• millions of women exposed to a preventive therapy; health (and monetary) costs large
HT
• (Female) hormone drug therapy during
and after menopause, HRT
• Estrogens (+ progestins + other), extracts, DES,
other synthetic
• used for 80+ years for symptoms
• for 40+ years against aging
• for 30+ years for preventing diseases (all major chronic diseases)
• North America (USA) --> Western &
Northern Europe --> whole world
[Figure: HT sales in Finland 1980-2000, DDD/1000 inhabitants]
The Nurses’ Health Study
121,700 females aged 30-55, followed 1976-1994; risk of cardiovascular deaths
OR (95% CI), crude / adjusted:
• Never used HRT: 1.0 / 1.0
• Current HRT use: 0.35 (0.25-0.49) / 0.47 (0.32-0.69)
• Past HRT use: 0.84 (0.67-1.05) / 0.99 (0.75-1.30)
NEJM 1997;336:1769-75
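To make the "crude OR" column concrete, here is a minimal Python sketch of how an odds ratio and a Woolf (log-based) 95% CI are computed from a 2x2 table. The counts below are invented for illustration and are not the Nurses' Health Study data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Woolf (log-based) 95% CI from a 2x2 table:
                   event   no event
        exposed      a        b
        unexposed    c        d
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts chosen only to give an OR of roughly 0.35
print(odds_ratio_ci(30, 9970, 85, 9915))
```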
WHI, Women’s health initiative
• 16 000 ”healthy” women; RCT, 8 yr, prevention of cardiovascular diseases
• stopped after 5 yr: ineffective, harms
• 7 extra heart infarcts per year per 10 000 women (arithmetic sketch below)
• 8 extra breast cancers
• 8 extra strokes (brain insults)
• 18 extra deep vein thromboses
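A brief sketch of the arithmetic behind "extra events per year per 10 000 women": the excess is simply the difference between the two arms' event rates, scaled to 10 000 women. The rates below are hypothetical and only illustrate the conversion.

```python
# Hypothetical event rates per woman-year, for illustration only
rate_intervention = 37 / 10_000
rate_control = 30 / 10_000

excess_per_10000_per_year = (rate_intervention - rate_control) * 10_000
print(excess_per_10000_per_year)   # -> 7.0 extra events per 10 000 women per year
```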
WHI, Women’s health initiative
• fewer fractures
• fewer colon cancers
• (no major effect on metabolic diseases,
dementia, well-being)
Other trials
• Nachtigall et al. 1979: +
• Hemminki et al. 1997: ?
• HERS 1998:
• Interim report from WHI 2000: 0
• (WHI 2002: -)
• WHI-2 2004: -
Small secondary prevention trials:
• EVTET 2000:
• WEST 2001: 0
• PAPWORTH 2002
• ESPRIT 2002
• etc.
(Michels et al. 2003)
Why a difference between observational and experimental studies?
• several papers have tried to identify the cause
• one explanation (of general relevance): bias
Bias in observational studies of
Oestrogen and Heart Disease
Source of bias --> effect on OR:
• Prescribing bias: decrease
• More educated: decrease
• Higher social class: decrease
• Osteoporosis: decrease
• No disease*: decrease
• Healthier: decrease
• Compliance: decrease
* diabetes, hypertension, cardiovascular disease
After Barrett-Connor, BMJ 1998;317:457-461
CDP RCT; secondary prevention of CHD, clofibrate; males 30-60 (n=2789)
5-yr mortality in the placebo group (crude / adjusted for 40 baseline characteristics):
• 67% “good” adherers*: 15% / 16%
• 33% “poor” adherers: 28% / 26%
Adjusted OR for death attributable to compliance = 0.64
* took >80% of the prescribed dose
NEJM 1980;303:1038-1041
Doing a trial
Doing a trial: Simple in theory,
difficult in practice
• Dependent on others (service providers, customers, politicians, press etc) / laboratory, archives, interviews
• On-line research: things are changing while the research is going on; a moving train
• Ideological opposition, “guinea pigs”
• Costs, bureaucracy
• Researcher skills/ administrative skills
General principles
• Forward in time, like a cohort study
• Define the intervention
• Create the groups
• Expose them to intervention(s)/ control
• Define the outcomes, measure them
• Compare the groups
Forward in time
• like a cohort study
• starting from exposure (intervention)
• parallel in time; cross-over-design
______________
• uncontrolled trials: the “comparison” group may be the situation before or after; I do not recommend this
Define the intervention
• Important to be explicit
• how feasible? Original and compromised
• how applicable after the trial?
• together with those who implement
• think of all elements, including the soft ones
• describe the intervention in detail
• monitor whether the intervention is as planned; report changes; correct if possible
Create the groups
• Target population (different for mechanistic
and practical trials)
• Allocation: randomization if possible (takes care also of unknown confounders)
• Unit: an individual, a department, a village
• Power calculations: how many ---> impact
on all aspects of the study
• Power calculations differ for individual and cluster trials (a minimal sketch follows this list)
• What is possible?
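To make the power-calculation point concrete, here is a minimal Python sketch (not from the lecture): an approximate per-arm sample size for comparing two proportions, and the inflation needed for a cluster-randomized design via the design effect 1 + (m - 1) x ICC. The event rates, cluster size and ICC below are assumptions chosen only for illustration.

```python
import math
from statistics import NormalDist  # standard-normal quantiles from the stdlib

def n_per_arm(p1, p2, alpha=0.05, power=0.80):
    """Approximate sample size per arm for detecting a difference between
    two proportions (two-sided test, normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

def inflate_for_clusters(n_individual, cluster_size, icc):
    """Cluster trials need more subjects: multiply by the design effect."""
    design_effect = 1 + (cluster_size - 1) * icc
    return math.ceil(n_individual * design_effect)

# Hypothetical example: hoping to reduce an event rate from 20% to 15%
n = n_per_arm(0.20, 0.15)
print(n)                                                    # about 900 per arm
print(inflate_for_clusters(n, cluster_size=50, icc=0.02))   # roughly doubled
```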
Expose to intervention(s)/ control
• Craftsman skills: often the weak link
• Method and skills needed: depends on the
intervention and context
• Contamination (control group)
Define outcomes, measure them
• Mechanistic vs. practical: to understand
what happens/ is the intervention useful
• Often: too much intermediary, laboratory,
process etc. outcomes
• Too little health, unanticipated outcomes
• Costs of intervention (important for
decision)
• Costs of data collection
• An issue: how much is determined by the data source
Compare the groups
• Simple if groups similar (randomized
and large numbers): “cross-tabulations”
• Analysis by intention to treat (compliance, drop-outs); a minimal sketch follows this list
• Main outcomes such that data from
everyone possible to get
• Sub-analysis by compliance etc
• Overall outcome from all dimensions
(for decision making)
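A minimal Python sketch of the intention-to-treat logic described above: every randomized participant is analysed in the arm to which they were allocated, regardless of compliance, and compliance-based results are reported only as a sub-analysis. The data rows and field layout are made up for illustration.

```python
# Each row: (allocated arm, complied with allocation, outcome event occurred)
participants = [
    ("intervention", True, False),
    ("intervention", False, True),   # non-complier still counted as intervention
    ("intervention", True, False),
    ("control", True, True),
    ("control", True, False),
    ("control", False, True),
]

def event_rate(rows):
    return sum(1 for _, _, event in rows if event) / len(rows) if rows else float("nan")

# Intention to treat: grouping by allocation only
itt = {arm: event_rate([r for r in participants if r[0] == arm])
       for arm in ("intervention", "control")}

# Sub-analysis by compliance, reported separately
per_protocol = {arm: event_rate([r for r in participants if r[0] == arm and r[1]])
                for arm in ("intervention", "control")}

print("ITT event rates:", itt)
print("Per-protocol event rates:", per_protocol)
```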
Typical challenges or problems
• providers/ others responsible
• balance between feasibility and generalizability
• changes in environment, time (too early/late)
• compliance, dilution
• drop-outs, loss to follow-up
• costs and dependence on financiers
Persuasion of service providers/
others responsible
• Skills not often trained, nor typical for
researchers
• Conflicting interests; knowing the politics and the pecking order
• Routines help, others’ experiences
• Unconventional approaches or challenging
current practices: ?
• Shopping around (Researcher X moved to
town Y because it had good chief
physician)
Balance between feasibility and generalizability
• Generalizability: can the results of the trial be transferred to practice/ other contexts?
• Often a compromise between what would be best and what can be done
• Stepwise
• Using other type of evidence to fill the gap
between own trial and the practice
Changes in environment, time
• “It is too early to study an intervention
until it is too late”
• “The results of trials tell about history”
• Environment may change (political,
administrative, health, services, public
perception, funding…, CHIMACA)
• Competing groups may publish their
results (WHI and Wisdom)
• A problem especially with long-term
exposures/ follow-ups
Compliance, dilution
• Trial participants do not do as planned: the groups start resembling each other (the control group wants the intervention; the intervention group does not); a crude numerical sketch follows this list
• Especially important for trials when
- intervention based on behavior changes
- long term intervention
- hot topic
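A crude numerical sketch (my own illustration, not from the lecture) of how non-compliance in the intervention arm and crossover in the control arm dilute the observed intention-to-treat effect: under simple all-or-nothing compliance, the observed risk difference shrinks roughly in proportion to the gap in actual exposure between the arms.

```python
def diluted_effect(true_effect, noncompliance_intervention, crossover_control):
    """Approximate observed intention-to-treat effect when only part of the
    intervention arm is actually exposed and part of the control arm obtains
    the intervention anyway (all-or-nothing compliance assumed)."""
    exposure_gap = (1 - noncompliance_intervention) - crossover_control
    return true_effect * exposure_gap

# Hypothetical: a true 10 %-point risk reduction, 20% non-compliance, 10% crossover
print(diluted_effect(0.10, 0.20, 0.10))   # -> about 0.07, a visibly smaller effect
```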
Drop-outs, loss to follow-up
• How serious a threat: depends on the context and the intervention; long follow-up
• Anticipation: is there a way to get at
least some information from all
• Compromises in terms of other aspects
(e.g. comprehensive information)
• Routine record keeping / special data
collection
• Drop from intervention/ from data
collection
Costs and dependence on financiers
• Cost-efficiency of the trial
• Costs of the study an aspect in all
methodological decisions
• Bias in topics by the interest of potential
financiers
• Service providers should be responsible:
trials a part of routine services (see
Chalmers papers)
Practical issues
• Permissions (ethics, data protection etc)
• Insurance, liability
• Intervention = drug: international
detailed rules
• Data collection: simple/ many aspects,
administrative data (register based trials)
• Financing/ time (which first: field or money?)
Difficulties
• Depends on the intervention and the
design
• Mechanistic vs. practical vs HSB
• Imagination in the design may help
• Teaching of clinicians/ others
responsible for activities studied
• Public relations, press
• Will it be easier in the future?
• EU and drug trials
What have I learnt?
• Have several research questions (vs Peto)
• Visit the field frequently; make space for finding problems (as it is vs. as it should be)
• Be ambitious, but do not expect success
• Have imagination
• Look for interest groups’ motives
• Need for PR and alliances
• Find practical tips from previous research
Impact of trials
• Valued in medical literature
• Cochrane collaboration
• Other meta-analyses
• Trial registration, trial reporting rules --> all trials, regardless of results, should be published
• “Negative” vs. “positive” results
Impact on practice
• ?
Like any research:
• “if winds are favorable “
• if key-people are involved
• if (financial) interests/ incentives
Optimistic: with time the impact will come
_______
Example of HT trial
HT sales in the USA after publication of the WHI-I trial in 2002
• 40 % reduction in sales of all
estrogen-progestin pills July - October
(prescriptions)
• 56 % in sales of Prempro®
• competing drugs: a small increase
Source: The Associated Press, November 2002 (web)
[Figure: HT sales in Finland (HT myynti Suomessa) 1980-2004, DDD/1000 inhabitants]
HT use in Estonia and in Finland
Comments on the WHI-I trial in Finnish medical journals
Not so enthusiastic:
• Important: 7 / 13
• One of many: 4
• Not for preventing CVD: 5
Possible reasons for Finnish
physicians’ defensive position
• fear of losing personal credibility (experts)
• fear of losing professional credibility (among patients and lay-persons)
• too much at stake in clinical work
• too much at stake research wise
• education and push from competing
firms
Pragmatic vs explanatory trials,
mechanistic vs practical
• Originally by Schwartz and Lellouch 1960s
• Criteria have changed but the basic idea is the same: two types of trials at the ends of a continuum
• Impact on all phases
• Simplified: practical brings answers for
practice; main problem in generalization
Outline for HSB-trials
• What is HSB and why is it important?
• Do HSB trials need promoting actions?
• Is Finland the promised land of HSB?
• Examples
What does "HSB trial" mean?
• The purpose is to find, through
experimental research, the best form of
action in health services*
• It aims for practice
* in a broad sense, including activities
outside health services aiming at
promoting and preserving health, treating
diseases, rehabilitation and (medical)
nursing
Terms in Finnish
• kokeelliset tutkimukset terveydenhuollossa (KTT) [experimental studies in health care]
• terveydenhuoltoon upotetut kokeet [trials embedded in health care]
• terveydenhuollon interventiot [health care interventions]
• kokeelliset tutkimukset terveydenhuollon käytännössä [experimental studies in health care practice]
• kokeelliset tutkimukset osana terveydenhuoltoa [experimental studies as part of health care]
• etc.
Terms in English
• health services based trials (HSB)
• complex interventions (Elbourne)
• community trials
• interventions in health care
• social experiments
• cluster (randomized) controlled trials
• randomized field tests + quasi-experiments (methods other than randomization to allocate groups)
• randomized social experiments
• field experiments
• trials for policy research
Typical features 1
• Experimental research = health/ health
care intervention is given/ received in such
a way that different groups can be reliably
compared. The researcher decides
grouping.
• HSB-trial definition is overlapping with
other experimental trials and evaluation
research.
• Classifications are continuums.
Typical features 2
• Explanatory/ mechanistic trials: individual
randomization, informed consent, researchers
responsible for intervention, new exposures
• HSB-trials: cluster randomization/ other
allocation, no informed consent, "system"
responsible for intervention, exposures already
in practice
• Natural experiment: allocation independently
from research, no permission, "system"
responsible for intervention, exposures old or
new
• Practice change: no group allocation (usually
before-after), no permission, "system"
responsible for intervention, exposures old or
new
Typical features 3
Explanatory trial vs. HSB-trial:
• Permission: from target group vs. more general
• Intervention: researcher vs. health care system
• Payer: research financer vs. health care
• Target group: feasibility, informativeness vs. (future) target
• Results: specific (placebo a problem) vs. all effects (trial effect a problem)
• Applicability: effect in ideal circumstances vs. effect in practice
Typical features 4
• Allocation: often other than (individual)
randomization
• Unit: often collective (institution, group,
community) rather than a person or her
part (ear, eye, leg, arm)
• applicability important (generalizability is certain only for the population studied)
Why are HSB-trials important?
An addition to traditional trials
The key tool for evidence based medicine is the RCT
In their current form, RCTs (individuals as units) are best suited for studying (single) technologies in selected populations
Currently, information is lacking on:
1. impact of technologies in practice
2. systems, packages, treatment strategies
Why should HSB be facilitated?
So far, few HSB-trials
Likely important for evidence based health care (judging from the impact of RCTs)
Some hindering factors which could be
changed
Facilitating factors
• Need for evidence based health care
• “Coverage with evidence development”/ “only in research” policies (big health care payers, USA, UK)
• Increasing health care costs
• Finnish public health care system
• Research valued by service providers
• Well-educated population
• Good health registers
Hindering factors 1
• Researchers’ visions, know-how, attitudes
• costs
• research norms and laws
Visions and know-how: only few trials this far --> not an established research tradition (“Have not thought of it”, “Is it real research?”): teaching (this course).
Costs may be lowered by
1. Integrating research into normal services
2. collecting outcome data from routine records,
e.g. registers
Hindering factors 2
Research regulation needs change to allow
HSB-trials and proper ethical evaluation,
both within Finland and internationally.
A key issue: informed consent (from whom, and should it be asked at all)
E.g.: giving health education, offering
screening, organizing services, changing
environment
Finnish HSB trials
• Cancer screening trials (Hakama et al) Malila
• EBMeDS (supported decision making in health centers) Kaila
• LATE (quality in health centers)
• Yhteispeli (well-being of school-children)
• Isolation in psychiatric care ?
• Mentally ill parents' children (Solantaus)
• Young men at risk of marginalization (Stakes)
• Hospital discharge practices (Marja-Leena Perälä)
• Vaccination trials (KTL)
• Anti-smoking intervention among youth (KTL)
• Ergonomics in kitchens (TTL)
• Employment paths (TTL)
• Promotion of sexual health (KTL)
Register based trial on birth
education
Elina Hemminki, Kaija Heikkilä,
Tiina Sevon, Päivikki Koponen
Hemminki et al 2008, BMC Health Services
Research 8:126 (tabled)
Childbirth classes 1
• 20 maternity centers in Helsinki matched in pairs and randomly allocated to intervention and control (a sketch of such pair-matched allocation follows this slide)
• Intervention: further education to nurses
and leaflets to mothers
• Outcome: mode of delivery
• Data collected from registers
• Childbirth classes: an existing program
• Responsibility of municipalities to organize
and finance; content not defined
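A minimal Python sketch of pair-matched cluster allocation of the kind described above (the center names, the matching variable and its values are invented; the trial's actual matching criteria are not stated here): centers are sorted on a matching variable, taken in adjacent pairs, and one member of each pair is randomized to the intervention.

```python
import random

# Hypothetical centers with a made-up matching variable (annual clients)
centers = [(f"center_{i:02d}", random.randint(200, 800)) for i in range(1, 21)]

centers.sort(key=lambda c: c[1])                  # order by the matching variable
pairs = [centers[i:i + 2] for i in range(0, len(centers), 2)]

allocation = {}
for a, b in pairs:
    chosen, other = random.sample([a, b], 2)      # random order within the pair
    allocation[chosen[0]] = "intervention"
    allocation[other[0]] = "control"

for name, arm in sorted(allocation.items()):
    print(name, arm)
```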
Childbirth classes 2
• Called research originally; by the advice of
service providers (administration) divided into
program (intervention) and research (data
collection).
• Design: to guarantee data with little money (routine registers)
• Maternity centers invited by service
administrators and researchers
• Invitation to the head nurse of each district with
letters and phone calls; personal contacts by a
midwife researcher
• sent for handling to ethics committees; did not succeed (the rules did not apply)
Childbirth classes 3
For
• researchers received administrative backup from the city health administrators
• no press coverage
Against
• lack of research administration rules
• reluctant service providers (public health
nurses) >< competition between midwife
and nurse professions
• No request from women
Childbirth classes 4
• No protests about only half of the centers being exposed
• Reluctance of the field (not felt important)
• Professional battle
Which rules apply?
Medical research or developing health care?
Is approval from ethics committee needed?
Meetings and discussions with City health
planners, people responsible for developing
maternity health centres
The body coordinating research activities within
Helsinki city health services (TUTKA)
Helsinki University coordinating ethics committee
Gynaecology and obstetrics committee
TUTKA: “Development project, no ethics
committee statement is required”
KIVA school trial
• Financed by the Ministry of Education, Turku
University, Cristina Salmivalli
• Developing a program to prevent bullying at
school grades 1-3, 4-6, 7-9, evaluating it, 2006
• "HSB": Program first introduced to 117 schools;
how allocated to intervention and control? What
are outcome measures?
• All Finnish schools can get the intervention
autumn 2009
• KiVa Koulu web-pages: the trial is difficult to
identify (hidden), not in English
Problems in law and rules
Sir Bradford Hill 1963: "there is no one way
of doing clinical trials ethically, and giving
detailed advice as if there were, will harm
both research and ethics". BMJ 1963,
5337: 1043-9
Key actors 1: ethics committees
• one of the gate-keepers: what can and cannot be done
• starting point: participant protection
• time and cost implications on research; no
responsibility to research financers
• human rights --> scientific content
• ethics --> surveillance of law and rules
• committees over-do (protection) and
under-do (motives for research)
• no "formal decision" --> no easy appeal
Relation between research and
service
• different criteria for research and routine
care (accepted technology)
• easier to use without research than to start
with research (time, money, know-how,
rewards) --> care experiments
• "I need permission to give a drug to half of
my patients, but not to give it to all of
them" (Richard Smithells)
Exercise 1
Read the 3 indicated papers.
Select one and write 1 page (single spacing) on what (new) you learnt and what you agree with, disagree with, or wonder about.
Send it to me by e-mail, at the latest on Sunday 17 in the evening.
Exercise 2
• Two people together (if the number is uneven, one group of 3 people)
• Select one topic from the list or choose your own; let me know on Tuesday your topic + who forms your pair.
• Prepare a presentation for next Monday, either a
power-point or a paper to be distributed, 10 min
• Divide the work among yourselves, discuss the plan, present it.
• Focus: population, unit, allocation,
implementation, outcome measures, data
collection, costs, payer, actors, permissions
• Be prepared to comment on other people’s presentations.
Exercise 2 topics
1. Can health education given to school-children by ordinary health services reduce the use of sweets (candy, soft drinks, cookies..)?
2. Does it matter whether out-of-hours services (päivystys) in health centers are provided by regular physicians or by external staffing firms (reppufirmat)?
3. Is it better to run maternity centers combined with child welfare centers or with family planning?
4. Can the use of psychotropic drugs among the old in institutions be prevented by adding social activities?