Research evaluation - euroCRIS | Current Research

THE ROLE OF CITATION ANALYSIS
IN RESEARCH EVALUATION
Philip Purnell
September 2010
HOW DO WE EVALUATE RESEARCH?
• Research grants
– Number and value
• Prestigious awards
– Nobel Prizes
• Patents
– Demonstrating innovative research
• Faculty
– Number of post-graduate researchers
• Citation analysis
– Publication and citation counts
– Normalised by benchmarks
• Peer Evaluation
– Expensive, time consuming and subjective
A BRIEF HISTORY OF THE CITATION INDEX
• Concept first developed by Dr Eugene Garfield
– Science, 1955
• The Science Citation Index (1963)
– SCI print (1960s)
– Online with SciSearch in the 1970s
– CD-ROM in the 1980s
– Web interface (1997): Web of Science
• Content enhanced:
– Social Sciences Citation Index (SSCI)
– Arts & Humanities Citation Index (AHCI)
• The Citation Index
– Primarily developed for purposes of information retrieval
– The development of electronic media and powerful search tools has increased its use and popularity for research evaluation
WEB OF SCIENCE
JOURNAL SELECTION POLICY
• Why do we select journals?
WHY NOT INDEX ALL JOURNALS?
[Chart: cumulative % of database (articles and citations) against number of journals indexed, 0–6,000]
• 40% of the journals account for:
– 80% of the publications
– 92% of cited papers
• 4% of the journals account for:
– 30% of the publications
– 51% of cited papers
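The concentration the chart describes can be sketched numerically: a small core of journals captures most citations. The journal citation counts below are hypothetical illustration data, not Web of Science figures; only the Bradford-style skew they exhibit reflects the slide.

```python
# Sketch: why indexing a core set of journals captures most citations.
# The citation counts are hypothetical, chosen to show a long-tailed
# distribution like the one in the chart above.

def coverage(citations, top_fraction):
    """Share of all citations captured by the top `top_fraction` of journals."""
    ranked = sorted(citations, reverse=True)
    top_n = max(1, int(len(ranked) * top_fraction))
    return sum(ranked[:top_n]) / sum(ranked)

# Toy data: a few highly cited journals, a long tail of rarely cited ones.
journal_citations = [5000, 3000, 2000, 800, 400, 200, 100, 50, 20, 10]

print(round(coverage(journal_citations, 0.4), 2))  # → 0.93
```

With these toy numbers, the top 40% of journals already account for about 93% of all citations, mirroring the concentration the selection policy exploits.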
HOW TO DECIDE WHICH JOURNALS
TO INDEX
• Approx. 2000 journals evaluated annually
– 10-12% accepted
• Thomson Reuters editors
– Information professionals
– Librarians
– Experts in the literature of their subject area
[Diagram: journals under evaluation are assessed for journal ‘quality’ before entering Web of Science]
THOMSON REUTERS
JOURNAL SELECTION POLICY
• Publishing Standards
– Peer review, Editorial conventions
• Editorial content
– Addition to knowledge in specific subject field
• Diversity
– International, regional influence of authors, editors, advisors
• Citation analysis
– Editors and authors’ prior work
GLOBAL RESEARCH REPRESENTATION
WEB OF SCIENCE COVERAGE
Region               Journals in Web of Science   Share
Europe                                    5,573     49%
North America                             4,251     38%
Asia-Pacific                                965      9%
Latin America                               272      2%
Middle East/Africa                          200      1%

Language             Journals in Web of Science   Share
English                                   9,114     81%
Other                                     2,147     19%
SUMMARY
CONSISTENCY IS THE KEY TO VALIDITY
• Analyses based on authoritative, consistent data from the world’s leading provider of Research Evaluation solutions
• Thomson Reuters has developed a selection policy over the last 50 years, designed to hand-pick the relevant journals containing the core content across the full range of scholarly disciplines
• This has created a large set of journals containing comparable papers and citations
• Thomson Reuters has always had one consistent editorial policy: index all journals cover-to-cover, index all authors, and index all addresses. This unique consistency makes Web of Science the only suitable data source for citation analysis
GOVERNMENTS AND INSTITUTIONS
USING TR DATA FOR EVALUATION (INCL.)
• Germany: IFQ, Max Planck Society, DKFZ, MDCUS
• Netherlands: NWO & KNAW
• France: Min. de la Recherche, OST - Paris, CNRS
• United Kingdom: King’s College London; HEFCE
• European Union: EC’s DG XII (Research Directorate)
• US: NSF: biennial Science & Engineering Indicators report (since 1974)
• Canada: NSERC, FRSQ (Quebec), Alberta Research Council
• Australian Academy of Science, gov’t lab CSIRO
• Japan: Ministry of Education, Ministry of Economy, Trade & Industry
• People’s Republic of China: Chinese Academy of Science
• Times Higher Education: World University Rankings (from 2010)
EVALUATING COUNTRIES
SCIENTIFIC RESEARCH IMPACT
IN CENTRAL EUROPE
Thomson Reuters InCites
OUTPUT AND PRODUCTIVITY
BULGARIAN RESEARCH 1998 - 2008
COMPARATIVE IMPACT IN SELECTED FIELDS
BETWEEN COUNTRIES
Source: Thomson Reuters InCites
BULGARIAN RESEARCH
RELATIVE PRODUCTIVITY BY FIELD
22% of Bulgarian papers are in Chemistry
<1% of Bulgarian papers are in Psychiatry
Source: Thomson Reuters InCites
EVALUATING INSTITUTIONS
EVALUATING INSTITUTIONS
Source: Thomson Reuters
North America University Science Indicators
CITATIONS PER PAPER
MATHEMATICS
Source: Thomson Reuters InCites
COMPARISON OF TOP MATHEMATICS
INSTITUTES AROUND THE WORLD
Source: Thomson Reuters InCites
WITH WHOM DOES OUR FACULTY
COLLABORATE?
Source: Thomson Reuters InCites
WHICH COLLABORATIONS
ARE THE MOST VALUABLE?
Collaborations with
these institutions have
produced highly cited
papers within their
subject fields
Source: Thomson Reuters InCites
EVALUATING JOURNALS
CALCULATING 2009 IMPACT FACTOR
- JOURNAL OF CONTAMINANT HYDROLOGY
Citations in 2009:
– to items published in 2008 = 153
– to items published in 2007 = 239
– sum = 392

Number of citable items:
– published in 2008 = 97
– published in 2007 = 98
– sum = 195

Impact Factor = 392 / 195 = 2.01
JOURNAL IMPACT FACTOR
SELECTED CHEMISTRY JOURNALS
Thomson Reuters Journal Citation Reports
USING THE IMPACT FACTOR
EVALUATING JOURNALS
• Appropriate use
– To evaluate journals within a subject field
• Misuse
– Comparison of journals from different fields
– Evaluation of individual articles
– Evaluation of institution or researcher
USING THE IMPACT FACTOR
MISUSE: EVALUATING INDIVIDUAL PAPERS
30% of articles in Food Policy were not cited at all
Journal Impact Factor = 2.01
BENCHMARK YOUR PAPERS AGAINST
GLOBAL AVERAGES – IS THIS A HIGHLY CITED PAPER?
• Articles published in ‘Blood’ from 2004 have been cited 34.30 times on average
– This paper (40 citations) has received 40 / 34.30 = 1.17 times the expected citations for this journal
• Hematology articles from this year have been cited 18.83 times on average
– This paper has received 40 / 18.83 = 2.12 times the expected citations for this subject category
• This article is ranked in the 12.92nd percentile in its field by citations
Source: Thomson Reuters InCites
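The two normalised ratios on the slide follow the same pattern: actual citations divided by an expected baseline. All numbers below come from the slide; the helper is illustrative, not an InCites API.

```python
# Sketch of journal- and field-normalised citation impact:
# a ratio of 1.0 means the paper performs exactly at the baseline average.

def normalised_impact(citations, baseline):
    """Ratio of actual citations to the expected (average) citation count."""
    return citations / baseline

paper_citations = 40
journal_baseline = 34.30   # average citations, 'Blood' articles from 2004
field_baseline = 18.83     # average citations, Hematology articles that year

print(round(normalised_impact(paper_citations, journal_baseline), 2))  # → 1.17
print(round(normalised_impact(paper_citations, field_baseline), 2))    # → 2.12
```

Using both baselines matters: the same paper looks only slightly above average for its journal but clearly above average for its field.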
EVALUATING INDIVIDUALS
HOW CAN WE COMPARE RESEARCHERS?
Author A: 60 papers
Author B: 117 papers
Source: Thomson Reuters InCites
OBTAIN MULTIPLE MEASURES
RECOGNIZE THE SKEWED NATURE OF
CITATION DATA
• Citation distribution is always skewed
– Few highly cited papers
– Majority cited little or not at all
• Distribution type
– Always skewed, never normal
– Shaped by human decisions about whether and what to cite
• E.g. criticality: a citation may be critical rather than approving
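The skew described above is easy to see with Python’s statistics module. The citation counts below are hypothetical: a couple of highly cited papers pull the mean far above the median, which is why single averages can mislead.

```python
# Sketch: mean vs. median on a skewed, hypothetical citation distribution.
# A few highly cited papers dominate; most papers are cited little or not at all.

from statistics import mean, median

citations = [0, 0, 0, 1, 1, 2, 3, 5, 40, 120]  # hypothetical per-paper counts

print(mean(citations))    # → 17.2
print(median(citations))  # → 1.5
```

The mean (17.2) says little about a typical paper here; the median (1.5) is closer to what most papers actually achieve, which is why the slide urges multiple measures.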
SUMMARY (I):
TREAT AS A SCIENTIFIC STUDY
• Ask whether the results are reasonable
• Follow scientific process for evaluating data
• Apply scientific skepticism
SUMMARY (II):
HOW DO WE EVALUATE RESEARCH?
• Research grants
– Number and value
• Prestigious awards
– Nobel Prizes
• Patents
– Demonstrating innovative research
• Faculty
– Number of post-graduate researchers
• Citation analysis
– Publication and citation counts
– Normalised by benchmarks
• Peer Evaluation
– Expensive, time consuming and subjective
THANK YOU
Philip Purnell
September 2010