Colleagues as a Defense Against Bad Science

Why Scientists Cheat (and what am I supposed to do about it?)
University of Washington and
Fred Hutchinson Cancer Center
August 25, 2011
Gerald P. Koocher, Ph.D., ABPP
Simmons College
www.ethicsresearch.com
Proximal Cause & Scientific Dishonesty

Researchers' likelihood of intentionally engaging in dishonest acts increases if:

their commitment to discovering the truth (to patient care, or to other core values) fails or becomes compromised through rationalization;

the potential for reward (or a perceived urgent need) exists; and

they regard the chances of detection as low.
Investigators may justify falsifying data as acceptable because…

They believe the actual results would turn out as they expected in any case.

Taking a shortcut seems necessary to meet an important deadline.

The chance of uncovering forged data seems nil.
What to do?
Colleagues as a defense against bad science

Create a situational constraint as the primary barrier to intentional dishonest acts.

Colleagues in a position to observe or learn about the misbehavior constitute the principal source of such constraint. These same colleagues also provide the most readily available resource for preventing and correcting unintentional errors.
What constitutes "Bad Science"?

The big three are FF&P: Fabrication, Falsification, and Plagiarism.

Fabrication is usually in the form of "dry lab" data that are simply invented.

Falsification can take several forms: "smoothing" or "cooking" actual data to more closely approach the desired or expected outcome, or dropping collected data points ("trimming") to delete unwanted information.
Can we agree on plagiarism?

Use of another person's words, ideas, or other intellectual and creative work without attribution, and representing it as one's own.

But wait…

How about "self-plagiarism"?
How about cultural justifications?

"I wanted to get it right…don't want to lose face."
"I was just being efficient."
"My professor is lazy."
Sohrabi, B., Gholipour, A., & Mohammadesmaeili, N. (2011). Effects of Personality and Information Technology on Plagiarism: An Iranian Perspective. Ethics & Behavior, 21(5), xxx-xxx.

Students must do absurd assignments assigned by their professors.

Discovering new things makes me worried so I prefer to copy others’ work.

If professors know that I don’t understand, I feel ashamed.

Cheating culture is accepted in universities.

In our college, establishing a good relationship with professors is better
than strong academic work.

Nowadays, graduating is more important than achieving scientific
knowledge. Internet plagiarism helps me to graduate more easily.

I retaliate for professors' cheating; professors plagiarize PowerPoint files found on Google.

Internet plagiarism is the students’ right, not immoral.

In college the quantity outweighs quality; by internet plagiarism I increase
my works’ quantity.
Stupidity & Bad habits

Incompetence: Examples include poor research design, methodology, or statistical procedure; inappropriate selection or use of a study technique due to insufficient skills or training.

Careless work habits: Examples include sloppy record-keeping; haphazard data collection; cutting corners; inadequate monitoring of the project's progress.

The Bozo Factor

Sometimes ineptitude or incompetence can result in:

inappropriate experimental design,
poor or biased sampling procedures,
misused or wrongly applied statistical tests,
inadequate record-keeping, or
just plain carelessness.

Even without any intent to deceive, inaccurate information can seriously damage the research record.
Other issues not tracked by the Feds

Intentional bias: Examples include rigging a sample to maximize support for hypotheses; withholding methodology details; deceptive or misleading reporting of data or its interpretation.
Other issues not tracked by the Feds

Questionable publication practices/authorship: Examples include publishing a paper or parts of the same study in different publication outlets without informing the readers; undeserved "gift" authorships; coerced authorship; omitting someone who deserved authorship or another form of credit.
Other issues not tracked by the Feds

Inadequate supervision of research assistants: Examples include giving assistants more responsibility than they are able or willing to handle; insufficient supervision of assistants' work.
Other issues not tracked by the Feds

Failure to follow the regulations of science: Examples include sidestepping or ignoring the IRB or its directives; circumventing or ignoring human participant requirements regarding informed consent, confidentiality, or risk assessment; inadequate care of research animals; violating federal research policy.
Other issues not tracked by the Feds

Contributing to difficult or stressful work environments that could adversely influence the research process: Examples include mistreatment or disrespectful treatment of subordinates; sexual harassment or other forms of exploitation; playing favorites and other factors that create poor morale or acting out by subordinates; conflicts with the administration or administrative policies.
Other issues not necessarily tracked by the Feds

A dishonest act indirectly related to the researcher role: Examples include unreported conflicts of interest, such as a financial interest in the outcome of an experiment; misuse or misappropriation of grant funds; inflating, distorting, or including bogus accomplishments on a resume.
Won't the scientific record self-correct over time?

One might assume (or hope) that such inaccuracies, purposeful or not, will be discovered. But don't count on it.

The odds of correcting errors through replication have declined because funding sources do not support replication research.

Most scholarly journals do not normally publish replication studies.

As a result, researchers have little incentive to repeat projects, especially expensive and complex ones.
Difficulties in Detection

Most highly publicized data scandals have occurred in biomedical research laboratories.

No one knows for sure whether the incidence is higher in biomedical than in the social and behavioral sciences, or whether it is simply easier to detect fraud in biomedical research.
Difficulties in Detection

Most social and behavioral research does not involve chemical analyses, tissue cultures, changes in physical symptoms, invasive procedures, or similar "hard" documentation.

Social science data often take the form of scores from questionnaires, psychological assessments, performance measures, or qualitative data based on interviews or behavioral observations.

By the time data analysis takes place, the participants have long since gone, taking their identities with them. Such data become relatively easy to generate, fudge, or trim.
COLLEAGUES AS A DEFENSE AGAINST BAD SCIENCE
NIH Grant No. R01 NS049573 [NINDS/ORI]
Gerald P. Koocher, Principal Investigator
Patricia Keith-Spiegel and Joan Sieber, Co-Investigators

See: Koocher, G. P., & Keith-Spiegel, P. (2010). Opinion: Peers nip misconduct in the bud. Nature, 466, 438-440.
NIH focuses on FF&P, but there's more…

We surveyed more than 5,000 names in the NIH CRISP database.

2,599 respondents reported 3,393 accounts of suspected wrongdoing and other errors related to the conduct of research.

Only 406 of those responding stated that they had no incidents to share.
Type of incident: Number of incidents (percentage)*

Fabrication/falsification: 608 (17.3%)
Questionable publication practices (e.g., disputed authorship credits): 601 (17.0%)
Plagiarism: 462 (13.1%)
Difficult or stressful work environment (e.g., mistreatment of subordinates, sexual harassment or other forms of exploitation): 432 (12.3%)
Incompetence (e.g., poor research design or inappropriate analysis, insufficient skills relative to the study technique): 420 (11.9%)
Carelessness (e.g., cutting corners, sloppy record keeping): 334 (9.5%)
Intentional bias (e.g., rigging a sample or method to favor a certain outcome): 176 (5.0%)
Failure to follow the rules of science (e.g., violating human research participant requirements, sidestepping or ignoring IRB directives): 169 (4.8%)
Inadequate supervision of research assistants: 136 (3.9%)

Total: 3,525 (100%)

*Respondents could report more than one type of incident.
What Risks Materialized and Who Got Hurt?

In 1,169 (42%) of the incidents, participants experienced no negative consequences as a result of their intervention.

Another 296 participants (11%) reported an elevation in status.

Almost half of our interveners reported suffering to some degree; most recounted only emotional distress rather than damage to their careers or social standing.

Some reported serious consequences, such as feeling shunned, being forced to leave a job, or losing previously close friends or allies. A few feared lawsuits, although none ever materialized.
Glass 2/3rds full

Despite personal risks, two-thirds of participants claimed to have attempted to prevent or correct a wrong in progress, or to minimize damage that had already occurred.

Very few participants initially reported their concerns to another entity, opting instead to attempt informal corrective steps or damage control on their own or in partnership with other colleagues.

The most common reasons offered for acting included a commitment to research integrity, a desire to avoid damaging the reputation of oneself, one's lab, or one's institution, and the wish to prevent an associate from making a mistake.

Almost all respondents took direct action if the questionable act was perpetrated by their own postdocs or assistants.
Who Takes Action, and Does It Work?

A binary logistic regression analysis profiled the characteristics of researchers who intervene (an illustrative sketch of this kind of analysis follows after this slide). Most likely to take action were those who:

held a higher professional or employment status than the suspected wrongdoer;

had less regular interaction or involvement with the suspected wrongdoer;

based their suspicions on strong evidence (i.e., direct observation or direct disclosure by the transgressor rather than second-hand accounts or hearsay);

perceived the transgression as unintentional; and

held a belief that individuals have a primary responsibility to become actively involved in maintaining scientific integrity.

The vast majority of those who felt victimized, or who believed that they might suffer blame, also proved likely to intervene individually or by reporting the matter, suggesting that acts involving a direct threat to oneself will likely lead to some type of action.

The highest rates of intervention occurred for projects described as taking place in a context of high stress that compromised research quality.
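
The slide above reports results from a binary logistic regression but does not show the model itself. Below is a minimal, hypothetical sketch of how such a profiling analysis could be set up in Python with statsmodels; the variable names, coding, and simulated data are invented for illustration and do not reproduce the study's actual dataset, model, or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200  # hypothetical number of reported incidents

# Invented binary predictors, one row per incident (1 = yes, 0 = no).
df = pd.DataFrame({
    "higher_status": rng.integers(0, 2, n),          # outranked the suspected wrongdoer
    "works_closely": rng.integers(0, 2, n),          # regular interaction with the wrongdoer
    "strong_evidence": rng.integers(0, 2, n),        # direct observation or disclosure
    "seen_as_unintentional": rng.integers(0, 2, n),  # transgression viewed as unintentional
})

# Simulate the binary outcome (1 = intervened) with assumed effect directions:
# higher status, strong evidence, and perceived lack of intent raise the odds
# of acting; close working ties lower them.
true_logit = (-0.5
              + 1.0 * df["higher_status"]
              - 0.8 * df["works_closely"]
              + 1.2 * df["strong_evidence"]
              + 0.6 * df["seen_as_unintentional"])
df["took_action"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

# Fit the binary logistic regression: log-odds of intervening modeled as a
# linear function of the profiling variables.
model = smf.logit(
    "took_action ~ higher_status + works_closely"
    " + strong_evidence + seen_as_unintentional",
    data=df,
).fit(disp=False)

print(model.summary())  # positive coefficients indicate higher odds of acting
```

In a real analysis each fitted coefficient is a change in the log-odds of intervening; exponentiating it (np.exp(model.params)) gives the corresponding odds ratio, which is how a profile like the one described on this slide would typically be reported.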
Those Who Did Not Act

About a third of participants did not take action regarding any incident they shared with us.

The largest group revealed that they felt too remotely involved or knew that others were already taking action.

Another third claimed they simply did not know what to do.

Reluctance to deal with a suspected offender perceived as a difficult person, or one who was their superior, was another common reason for inaction, as was an unwillingness to act when evidence seemed insufficient.
Those Who Did Not Act

Social relationships, job security, and status become more salient in close working conditions. So it is perhaps understandable, but also disappointing, that we found those who worked closely with suspected wrongdoers were less likely to take any action.

Thus, those with the best opportunity to observe wrongs and stop or correct them appear to be the least likely to use it.
Those Who Did Not Act

We asked whether those who took no action on their suspicions experienced lingering reservations.

Forty percent of those who did not get involved, even though they had direct evidence of wrongdoing, still felt misgivings, sometimes even after many years had passed.
ROGUES GALLERY
Paul Kornak
Clinical trials in oncology and homicide

Paul Kornak pled guilty to criminally negligent homicide for falsely representing results of blood chemistry analyses in a chemotherapy study.

One participant who should have been excluded from the study died as a result.
Paul Kornak

In August 2000, he applied for employment with the VA, submitting a false Declaration for Federal Employment form that denied any criminal history, despite a prior conviction and three years of probation for mail fraud in 1992.

By October 2000, Kornak held responsibility for organizing, coordinating, implementing, and directing all research elements in the Stratton VA Medical Center oncology research program.
Paul Kornak

From May 1999 to July 2002, Kornak defrauded the sponsors of clinical studies by repeatedly submitting false documentation to enroll participants who did not qualify under study protocols.

He caused the death of a participant by falsely representing the results of the patient's blood chemistry analysis to suggest the participant met criteria for study participation, when the actual results showed impaired kidney and liver function.

The patient was administered docetaxel, cisplatin, and 5-FU in connection with the study protocol on or about May 31, 2001, and died as a result on or about June 11, 2001.
Case Summary - Paul H. Kornak
[Federal Register: February 24, 2006 (Volume 71, Number 37)]

Paul H. Kornak, Stratton VA Medical Center, Albany, New York, pled guilty to:

making and using a materially false statement,
mail fraud, and
criminally negligent homicide.

(See United States of America v. Paul H. Kornak, Criminal Action No. 03-CR-436 (FJS), U.S. District Court (N.D.N.Y.) (January 18, 2005).)

In addition to a 71-month prison term, he was directed to pay restitution to two pharmaceutical companies and the VA in the amount of approximately $639,000.
Andrew Wakefield:
Heavyweight Champion of Scientific Fraud

British physician/researcher who (with 12 co-authors) published a paper in The Lancet (February 1998) about 12 autistic children, describing a new syndrome, "autistic enterocolitis," and raising the possibility of a link between a novel form of bowel disease, autism, and the MMR vaccine.
Andrew Wakefield

The paper was a complete fraud, describing invented patients and bogus data.

The scare frightened parents and pushed down vaccination rates, causing outbreaks of measles in the UK.

The "theory" has persisted among many parents of autistic children, reducing support for other urgently needed genuine research.
Andrew Wakefield

A follow-up article published by the BMJ in January 2011 revealed that Wakefield, in partnership with the father of one of the boys in the study, had planned to launch a venture linked to the MMR vaccination scare that would profit from new medical tests and "litigation driven testing."

Wakefield predicted he "could make more than $43 million a year from diagnostic kits" for autistic enterocolitis.

Wakefield is no longer licensed as a physician and reportedly now resides in Texas.

He still has a following, including luminaries such as former Playboy playmate Jenny McCarthy.
Too Good to be true: Stephen Breuning

By the age of 30 he had produced an influential body of work on treatment of the mentally retarded. But there was something odd about the work of Stephen Breuning, then an assistant professor of child psychiatry at the University of Pittsburgh.

His data seemed almost too orderly, too pat, and collected with remarkable speed. The doubts came to a head in 1983, when his supervisor, Robert Sprague, then director of the Institute for Child Behavior and Development at the University of Illinois, reported his suspicions to the NIMH.
Too Good to be true: Stephen Breuning

Between 1979 and 1984, said Sprague, Breuning "produced one-third of the literature in the psychopharmacology of the mentally retarded."
http://www.time.com/time/magazine/article/0,9171,964485,00.html#ixzz1Im1gTTtM

Dr. Stephen Breuning was convicted of "academic fraud-related charges" in the United States District Court for the District of Maryland on November 10, 1988.
Harvard Finds Scientist Guilty of Misconduct
By Nicholas Wade, The New York Times, August 20, 2010

Harvard University found a prominent researcher, Marc Hauser, "solely responsible" for eight instances of scientific misconduct. Hours later, Dr. Hauser, a rising star for his explorations into cognition and morality, made his first public statement since news of the inquiry emerged last week, telling The New York Times, "I acknowledge that I made some significant mistakes," and saying he was "deeply sorry for the problems this case had caused to my students, my colleagues and my university."
Hauser's Graduate Student

Marc Hauser resigned his position as a faculty member, effective August 1, 2011.

A large majority of the Harvard psychology faculty had voted not to allow him to teach in the department this year, and the dean of the Faculty of Arts and Sciences supported the decision.

"While on leave over the past year, I have begun doing some extremely interesting and rewarding work focusing on the educational needs of at-risk teenagers. I have also been offered some exciting opportunities in the private sector," Hauser wrote in a resignation letter to the dean, dated July 7. "While I may return to teaching and research in the years to come, I look forward to focusing my energies in the coming year on these new and interesting challenges."

http://www.boston.com/Boston/whitecoatnotes/2011/07/embattled-harvardpsychology-professor-resigns/Yb6hnLhdPuBkPf4f0rTXpO/index.html
Witnesses of Research Wrong-Doing
Joan E. Sieber and Ann Meeker O'Connell

Interviewed 135 of our participants; 125 provided first-hand accounts of witnessing and responding to research wrongdoing.

Participants reported a variety of responses, including formal notification of institutional officials, peer shaming, and one-on-one discussions with peers to address wrongdoing that ranged from improper attribution of authorship to falsification, fabrication, and plagiarism.
Witnesses of Research Wrong-Doing
Joan E. Sieber and Ann Meeker O'Connell

Unexpectedly, administrative incompetence in handling allegations emerged as the most prevalent theme, exceeding all forms of wrongdoing that were coded. Institutions may have neither effective nor efficient processes for managing even the most egregious cases of research wrongdoing.
Witnesses of Research Wrong-Doing
Joan E. Sieber and Ann Meeker O'Connell

"Our research team recounted to our dean the bullying and dishonesty we had experienced. It seemed to us that the dean didn't know how to confront the powerful perpetrator and didn't confront the wrongdoing. He instructed all of us to relinquish any right to our data."

"My university admitted many very wealthy foreign students, most of whom plagiarized rampantly. When I spoke out about it, colleagues explained that if they cracked down on plagiarism, the students would simply leave and go to another expensive private university where they could submit plagiarized work."
Witnesses of Research Wrong-Doing
Joan E. Sieber and Ann Meeker O'Connell

"The department had recently hired a nationally known superstar with tenure, who brought many grants with him. He was close to retirement and suffering from severe emotional problems, which manifested themselves as paranoia, self-aggrandizement, and extreme cruelty and ruinous abuse of post-docs and students in his lab, who were driven out of the laboratory or left science entirely."

"I had just become a postdoc for a PI who gave me data on 50 subjects to work with. However, the research coordinator, who was resigning, told me that fMRI scans had only been done on 6 of the 50 subjects and that the results did not support the PI's hypotheses. I felt like I had just been handed a smoking gun, and wanted out immediately. But how?"
Culture shifting

Actively engaging colleagues with gentle alternatives to whistleblowing:

Offering help
Expressing concern
The "Bullwinkle" approach
Encouraging reporting of near-miss situations
Apologizing when appropriate
Create a culture that encourages reporting of human error

Near-miss recognition and reporting systems (near-miss, zero cost)

Creating a safe climate for sharing concerns in a professional manner and context (engaged colleagues)
www.ethicsresearch.com

COPIES OF THIS PRESENTATION AND OTHER RESOURCES, INCLUDING THE RRW MANUAL, ARE AVAILABLE FOR DOWNLOADING FREE OF CHARGE.