
Colleagues as a Defense Against Bad Science
The Walter C. Randall Lecture on Biomedical Ethics
April 12, 2011
Gerald P. Koocher, Ph.D., ABPP
Simmons College
www.ethicsresearch.com

Proximal Cause & Scientific Dishonesty

Researchers are most likely to intentionally engage in dishonest acts if:
- their commitment to discovering the truth (to patient care, or to other core values) fails or becomes compromised through rationalization;
- the potential for reward (or a perceived urgent need) exists; and
- they regard the chances of detection as low.


For example, an investigator might justify falsifying data as acceptable because:
- they believe the actual results would have turned out as expected anyway;
- taking a shortcut seems necessary to meet an important deadline; and
- the chance of anyone uncovering the forged data seems nil.
What to do?

Situational constraints stand as the primary barrier to intentionally committing a dishonest act.

Colleagues in a position to observe or learn about the misbehavior constitute the principal source of such constraint. These same colleagues also provide the most readily available resource for preventing and correcting unintentional errors.

What do you mean by “Bad Science”?

The big three are FF&P: Fabrication, Falsification, and Plagiarism.
- Fabrication usually takes the form of “dry lab” data that are simply invented.
- Falsification can take several forms:
  - “smoothing” or “cooking” actual data to bring them closer to the desired or expected outcome;
  - dropping collected data points (“trimming”) to delete unwanted information.

Can we agree on plagiarism?

Use of another person’s words, ideas, or other intellectual and creative work without attribution, representing it as one’s own.

But wait…
- How about “self-plagiarism”?
- How about cultural justifications?
- “I wanted to get it right…”
- “I was just being efficient.”
- “My professor is lazy.”

The Bozo Factor

Sometimes ineptitude or incompetence can result in inappropriate design, poor or biased sampling procedures, misused or wrongly applied statistical tests, inadequate record-keeping, and just plain carelessness. Even though there may be no intent to deceive, inaccurate information can also seriously damage the research record.

Won’t the scientific record self-correct over time?

One might assume (or hope) that such inaccuracies, purposeful or not, will be discovered. But one cannot count on it.
- The odds of correcting errors through replication have declined, as funding sources do not support replication research.
- Most scholarly journals do not normally publish replication studies.
- As a result, researchers have little incentive to repeat projects, especially expensive and complex ones.

Difficulties in Detection

Most highly publicized data scandals have occurred in biomedical research laboratories. No one knows for sure whether the incidence is higher in biomedical science than in the social and behavioral sciences, or whether it is simply easier to detect fraud in biomedicine.

- Most social and behavioral research does not involve chemical analyses, tissue cultures, changes in physical symptoms, invasive procedures, or similar “hard” documentation.
- Social science data often take the form of scores from questionnaires, psychological assessments, performance measures, or qualitative data based on interviews or behavioral observations.
- By the time data analysis takes place, the participants have long since gone, taking their identities with them. Such data become relatively easy to generate, fudge, or trim.

COLLEAGUES AS A DEFENSE AGAINST BAD SCIENCE
NIH Grant No. R01 NS049573 [NINDS/ORI]
Gerald P. Koocher, Principal Investigator
Patricia Keith-Spiegel and Joan Sieber, Co-Investigators
See: Koocher, G. P., & Keith-Spiegel, P. (2010). Opinion: Peers nip misconduct in the bud. Nature, 466, 438-440.

NIH focuses on FF&P, but there’s more…
- We surveyed more than 5,000 names in the NIH CRISP database.
- 2,599 respondents reported 3,393 accounts of suspected wrongdoing and other errors related to the conduct of research.
- Only 406 of those responding stated that they had no incidents to share.

Types of reported incidents (number, percentage of total*):
- Fabrication/falsification: 608 (17.3%)
- Questionable publication practices (e.g., disputed authorship credits): 601 (17.0%)
- Plagiarism: 462 (13.1%)
- Difficult or stressful work environment (e.g., mistreatment of subordinates, sexual harassment, or other forms of exploitation): 432 (12.3%)
- Incompetence (e.g., poor research design or inappropriate analysis, insufficient skills relative to the study technique): 420 (11.9%)
- Carelessness (e.g., cutting corners, sloppy record keeping): 334 (9.5%)
- Intentional bias (e.g., rigging a sample or method to favor a certain outcome): 176 (5.0%)
- Failure to follow the rules of science (e.g., violating human research participant requirements, sidestepping or ignoring IRB directives): 169 (4.8%)
- Inadequate supervision of research assistants: 136 (3.9%)
Total: 3,525 (100%)
*Respondents could report more than one type of incident.

What Risks Materialized and Who Got Hurt?
- In 1,169 (42%) of the incidents, participants experienced no negative consequences as a result of their intervention.
- Another 296 participants (11%) reported an elevation in status.
- Almost half of our interveners reported suffering to some degree; most recounted only emotional distress, as opposed to damage to their careers or social standing.
- Some reported serious consequences, such as feeling shunned, being forced to leave a job, or losing previously close friends or allies. A few feared lawsuits, although none ever materialized.

Glass 2/3rds Full
- Despite personal risks, two-thirds of participants claimed to have attempted to prevent or correct a wrong in progress, or to minimize damage that had already occurred.
- Very few participants initially reported their concerns to another entity, opting instead to attempt informal corrective steps or damage control on their own or in partnership with other colleagues.
- The most common reasons offered for acting included a commitment to research integrity, a desire to avoid damaging the reputation of oneself, one’s lab, or one’s institution, and a wish to prevent an associate from making a mistake.
- Almost all respondents took direct action if the questionable act was perpetrated by their own postdocs or assistants.

Who Takes Action, and Does It Work?

A binary logistic regression analysis profiled the characteristics of researchers who intervene. Most likely to take action were those who:
- held a higher professional or employment status than the suspected wrongdoer;
- had less regular interaction or involvement with the suspected wrongdoer;
- based their suspicions on strong evidence (i.e., direct observation or direct disclosure by the transgressor, rather than second-hand accounts or hearsay);
- perceived the transgression as unintentional; and
- held a belief that individuals have a primary responsibility to become actively involved in maintaining scientific integrity.

The vast majority of those who felt victimized, or who believed that they might suffer blame, also proved likely to intervene individually or by reporting the matter, suggesting that acts involving a direct threat to oneself will likely lead to some type of action. The highest rates of intervention occurred for projects described as taking place in a context of high stress that compromised research quality. (A minimal sketch of what this kind of analysis looks like appears below.)
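
For readers curious about the method itself, here is a minimal sketch of a binary logistic regression of this kind, written in Python with statsmodels. Everything in it is hypothetical: the variable names, the simulated data, and the effect sizes merely illustrate the direction of the findings summarized above and do not reproduce the study’s actual model or results.

```python
# A minimal sketch (not the study's actual analysis or data) of a binary
# logistic regression like the one described above: modeling whether a
# researcher intervened (1) or not (0) from situational characteristics.
# All variable names, simulated data, and effect sizes are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500  # hypothetical number of reported incidents

# Hypothetical binary predictors, one row per incident.
df = pd.DataFrame({
    "higher_status":      rng.integers(0, 2, n),  # outranked the wrongdoer
    "close_interaction":  rng.integers(0, 2, n),  # worked closely with them
    "strong_evidence":    rng.integers(0, 2, n),  # direct observation/disclosure
    "seen_unintentional": rng.integers(0, 2, n),  # viewed the act as unintentional
})

# Simulate the outcome with effects in the directions the lecture reports:
# higher status, strong evidence, and perceived unintentionality raise the
# odds of intervening; close interaction with the wrongdoer lowers them.
log_odds = (-0.5
            + 1.0 * df["higher_status"]
            - 0.8 * df["close_interaction"]
            + 1.2 * df["strong_evidence"]
            + 0.6 * df["seen_unintentional"])
df["intervened"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

# Fit the logistic model; exponentiated coefficients are odds ratios.
model = smf.logit(
    "intervened ~ higher_status + close_interaction"
    " + strong_evidence + seen_unintentional",
    data=df,
).fit()
print(np.exp(model.params))  # odds ratio for each characteristic
```

Reading the exponentiated coefficients as odds ratios is how a profile like the one above is extracted from such a model: values above 1 mark characteristics associated with a higher chance of intervening, values below 1 with staying silent.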

Those Who Did Not Act

About a third of participants did not take action regarding any incident they shared with us.
- The largest group revealed that they felt too remotely involved or knew that others were already taking action.
- Another third claimed they simply did not know what to do.
- Reluctance to deal with a suspected offender perceived as a difficult person, or one who was the participant’s superior, was another common reason for inaction, as was an unwillingness to act when evidence seemed insufficient.

Social relationships, job security, and status become more salient in close working conditions. So, perhaps understandably but also disappointingly, we found that those who worked closely with suspected wrongdoers were less likely to take any action.
- Thus, those with the best opportunity to observe wrongs and to stop or correct them appear also to be the least likely to act.
- We asked whether those who took no action on their suspicions experienced lingering reservations. Forty percent of those who did not get involved, even though they had direct evidence of wrongdoing, still felt misgivings, sometimes even after many years had passed.

Rogues’ Gallery

Dr. Eric Poehlman (University of Vermont) became the first academic scientist in the United States to serve prison time for research misconduct (not involving fatalities) and received a lifetime ban on federal research funding. Poehlman published articles containing bogus data and submitted falsified grant applications that, beginning in the early 1990s, brought in almost 3 million dollars in federal grant money.

Eric Poehlman
- Pleaded guilty to acting alone in falsifying and fabricating research data and filing false grant applications, for which the National Institutes of Health paid $542,000.
- Agreed to pay $180,000 to settle a civil complaint related to numerous false grant applications he filed while at UVM.
- Paid $16,000 in attorney’s fees to counsel for Walter F. DeNino, a research assistant whose complaint of scientific misconduct spurred an investigation by UVM.
- Agreed to a lifetime bar from seeking or receiving funding from any federal agency, and to submit numerous letters of retraction and correction to scientific journals related to his misconduct.
- Agreed to a permanent exclusion from participation in all Federal health care programs.

Paul Kornak

Paul Kornak pled guilty to criminally negligent homicide for falsely representing results of blood chemical analyses in a chemotherapy study. One participant who should have been excluded from the study died as a result.

Paul H. Kornak

In August 2000, he applied for employment with the VA, submitting a false Declaration for Federal Employment form that denied any conviction or probation history, despite his having been convicted and sentenced to three years’ probation for mail fraud in 1992. By October 2000, Kornak was responsible for organizing, coordinating, implementing, and directing all research elements in the Stratton VA Medical Center oncology research program.

Paul H. Kornak

From May 14, 1999, to July 10, 2002, Kornak defrauded the sponsors of the clinical studies by repeatedly submitting false documentation regarding study participants and by enrolling people who did not qualify under the study protocols. He caused the death of a participant when he falsely represented the results of the patient’s blood chemistry analysis to suggest that the participant met the criteria for participation in one study, when the actual results showed impaired kidney and liver function. The patient was administered the chemotherapeutic drugs docetaxel, cisplatin, and 5-FU in connection with the study protocol on or about May 31, 2001, and died as a result on or about June 11, 2001.

Case Summary - Paul H. Kornak
[Federal Register: February 24, 2006 (Volume 71, Number 37)]

Paul H. Kornak, Stratton VA Medical Center, Albany, New York, pled guilty to:
- making and using a materially false statement,
- mail fraud, and
- criminally negligent homicide.
(See United States of America v. Paul H. Kornak, Criminal Action No. 03-CR-436 (FJS), U.S. District Court (N.D.N.Y.) (January 18, 2005).)

In addition to a 71-month prison term, he was directed to pay restitution to two pharmaceutical companies and the VA in the amount of approximately $639,000.

Too Good to Be True: Stephen Breuning

By the age of 30 he had produced an influential body of work on treatment of the mentally retarded. But there was something odd about the work of Stephen Breuning, then an assistant professor of child psychiatry at the University of Pittsburgh. His data seemed almost too orderly, too pat, and collected with remarkable speed. The doubts came to a head in 1983, when his supervisor, Robert Sprague, then director of the Institute for Child Behavior and Development at the University of Illinois, reported his suspicions to the NIMH.

Too Good to Be True: Stephen Breuning

Between 1979 and 1984, said Sprague, Breuning “produced one-third of the literature in the psychopharmacology of the mentally retarded.”
http://www.time.com/time/magazine/article/0,9171,964485,00.html#ixzz1Im1gTTtM

Dr. Stephen Breuning was convicted of “academic fraud-related charges” in the United States District Court for the District of Maryland on November 10, 1988.

Harvard Finds Scientist Guilty of Misconduct
By Nicholas Wade, published August 20, 2010, The New York Times

Harvard University said Friday that it had found a prominent researcher, Marc Hauser, “solely responsible” for eight instances of scientific misconduct. Hours later, Dr. Hauser, a rising star for his explorations into cognition and morality, made his first public statement since news of the inquiry emerged last week, telling The New York Times, “I acknowledge that I made some significant mistakes,” and saying he was “deeply sorry for the problems this case had caused to my students, my colleagues and my university.”

Hauser’s Graduate Student

What can we do?

Witnesses of Research Wrong-Doing
Joan E. Sieber and Ann Meeker O’Connell
- We interviewed 135 of our participants; 125 provided firsthand accounts of witnessing and responding to research wrongdoing.
- Participants reported a variety of responses, including formal notification of institutional officials, peer shaming, and one-on-one discussions with peers, to address wrongdoing that ranged from improper attribution of authorship to falsification, fabrication, and plagiarism.
- Unexpectedly, administrative incompetence in handling allegations emerged as the prevalent theme, exceeding all forms of wrongdoing that were coded. Institutions may have neither effective nor efficient processes for managing even the most egregious cases of research wrongdoing.

Witnesses of Research Wrong-Doing
Joan E. Sieber and Ann Meeker O’Connell

“Our research team recounted to our dean the bullying and dishonesty we had experienced. It seemed to us that the dean didn’t know how to confront the powerful perpetrator and didn’t confront the wrongdoing. He instructed all of us to relinquish any right to our data.”

“My university admitted many very wealthy foreign students, most of whom plagiarized rampantly. When I spoke out about it, colleagues explained that if they cracked down on plagiarism, the students would simply leave and go to another expensive private university where they could submit plagiarized work.”

Witnesses of Research Wrong-Doing
Joan E. Sieber and Ann Meeker O’Connell

“The department had recently hired a nationally known superstar with tenure, who brought many grants with him. He was close to retirement and suffering from severe emotional problems, which manifested themselves as paranoia, self-aggrandizement, and extreme cruelty and ruinous abuse of the post-docs and students in his lab, who were driven out of the laboratory or left science entirely.”

“I had just become a postdoc for a PI who gave me data on 50 subjects to work with. However, the research coordinator, who was resigning, told me that fMRI scans had only been done on 6 of the 50 subjects and that the results did not support the PI’s hypotheses. I felt like I had just been handed a smoking gun, and wanted out immediately. But how?”

Culture shifting

Actively engaging colleagues with gentle alternatives to whistleblowing:
- Offering help
- Expressing concern
- The “Bullwinkle” approach
- Encouraging reporting of near-miss situations
- Apologizing when appropriate

Create a culture that encourages reporting of human error
- Near-miss recognition and reporting systems (near-miss, zero cost)
- Creating a safe climate for sharing concerns in a professional manner and context (engaged colleagues)