
Options for Evaluating Research:
Inputs, Outputs, Outcomes
Michele Garfinkel
Manager, Science Policy Programme
ICSTI ITOC Workshop
19 January 2015, Berlin
Today’s talk
• About EMBO
• A policy view of research assessment
• Stakeholder roles
About EMBO
• European Molecular Biology Organization (Maria Leptin, Director)
• Founded 1964, Heidelberg, DE
• Funded by the European Molecular Biology Conference
– 27 Member States
– 3 cooperation agreements
• Advancing policies for a world-class European research environment
Science Policy Programme
• Governance
• Three main areas: biotechnology, responsible conduct of research, scientific publishing
– Technology assessment
• Scientific publishing
– Open access
– Data
– Responsibilities of editors, administrators, authors
Scientific publishing
The publication of scientific information is intended to move science forward. More specifically, the act of publishing is a quid pro quo in which authors receive credit and acknowledgment in exchange for disclosure of their scientific findings.
Journal name as proxy for quality
• Journal Impact Factor: a librarian’s number
• The concern is not use, but misuse
– Research assessment
– “JIF 38.597: a subscription for the price of the IF”
• Why has this been adopted for research assessment?
– Cross-disciplinary
– Intuitive and reflective
– Prospective
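For reference, a sketch of how the JIF is conventionally computed (the standard two-year definition; this is background, not stated on the slide):

\[
\mathrm{JIF}_y = \frac{C_y(y-1) + C_y(y-2)}{N_{y-1} + N_{y-2}}
\]

where \(C_y(t)\) is the number of citations received in year \(y\) by items the journal published in year \(t\), and \(N_t\) is the number of citable items (research articles and reviews) published in year \(t\).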
Research assessment is an ecosystem
[Diagram: Funders, Journals, Researchers, Other assessors?]
What DORA sets out
• Main recommendation: do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions
• Implementation?
What DORA sets out
• Research institutions and funding agencies: be clear on evaluation criteria and consider all contributions
• Publishers: do not use the JIF as a marketing tool, make more article-level metrics available, make all reference lists open, remove limits on reference-list length
What DORA sets out
• Metrics suppliers: provide methodology and data in a useful form; account for variation in article types (reviews vs. research articles)
• Researchers: as assessors, review for scientific content; as authors, cite the appropriate (primary) literature; challenge bad practices
What DORA does not say
• Metrics-based research assessment is wrong
• The JIF is flawed for assessing journals
• Citations are a flawed metric
• There is a simple alternative
• Publishers are to blame
• Thomson Reuters is to blame
• X is to blame
Incremental advances
• More institutions and funders are emphasizing biosketches and “select your 5 best papers” strategies over the IF
• Constructive discussions with Thomson Reuters: more interest in dialogue and a willingness to improve the JIF as a metric
• Competition is good for everyone
Incremental advances
• Engagement with funders
• Engaging additional research communities
• Study national/regional variations
• Editorials forthcoming
– Key point: better analyses needed
• Policy analysis
– Implementation and governance issues, metrics, stakeholders
It’s the system (?)
• This is not (just) about overworked or lazy promotion committees and rapacious journals
• The reward system in science is (becoming) warped
• Resources for thorough evaluation are not available
• Journal articles have become the currency of rewards rather than a contribution to knowledge
Research Assessment: Stakeholders
• Researchers
• Publishers
• Research administrators
• Funders
• Metrics researchers
• Metrics providers
• Decision-makers
What should we be assessing?
• We are great at measuring inputs (funding, numbers of students)
• We are good at measuring outputs (numbers of papers, some impact measures)
• Outcomes measurements are a problem
What should we be assessing?
• Papers
– And how they are discovered?
• Data
– And how they are discovered?
• Reviewing?
• Teaching?
• Committee work?
• Responsible conduct?
Ongoing work
• Workshops
– Governance issues
– Stakeholders
• Engagement with funders