Using InCites Benchmark and Analytics to Evaluate Research Performance
[email protected]
http://incites.thomsonreuters.com
Objectives:
• During this session we will explore scenarios and questions related to evaluating research performance at three different levels:
– Institution
– Individual researcher
– Custom groupings (peer groups/competitors)
• We will discuss each scenario/question and provide evidence-based responses using metrics and reports taken from the various modules of InCites.
• The aim of this workshop is to show, with practical examples, how InCites data can be applied to provide citation-based evidence to support decisions for the purpose of research evaluation.
• This is an interactive workshop: participants are encouraged to raise questions they have regarding research evaluation at their institutions, and the group will discuss how InCites can be used to provide a solution.
• Upcoming enhancements to InCites
InCites Benchmark and Analytics Data Overview

Source Edition: Web of Science Core Collection
• SCIE, SSCI, A&HCI, CPCI-S, CPCI-SSH, BKCI-S, BKCI-SSH
Citing Edition:
• ALL
Document Types:
• ALL (Articles, Reviews, Letters, Editorials, Conference Proceedings, Books, Book Chapters, etc.)
Organizations:
• 4,856 unified organizations
Regions:
• 241 countries
Subject Schemas:
• 11 discipline schemas (including WoS & ESI categories and regional schemas)
Journals/Books/Proceedings:
• 153,930+ publications from the Web of Science Core Collection
Time Period:
• 1 year, cumulative
Data and Benchmarks:
• Updated bimonthly (WoS CC 1 October 2014, ESI July/August 2014)
Source and Citing Years:
• 1980-2014
Using bibliographic data to evaluate Research Performance
• Who are our authors?
• Who do they collaborate with?
• Where do we publish our research?
• In which journals is the performance exceeding the expected impact?
• Who do we collaborate with?
• Which are the best performing collaborations?
• What is our overall influence?
• How does our average influence compare to peers?
• Which are our best performing papers?
• Do we produce Highly Cited/Hot research?
Using bibliographic data to evaluate Research Performance
• Who are our authors? Using ResearcherIDs (RIDs) to identify papers.
• How are they performing?
• In which areas of research do we publish?
• Which are our best performing areas?
• What types of documents do we publish?
InCites B&A Key Functionality
• Explore the thematic modules (people, institutions, countries, research areas, sources)
• Create custom reports (assign research areas, thresholds, time period…)
• Explore the absolute and normalised indicators
• Create a range of visualizations from the graphs available
• Save reports (tiles) and edit saved reports
• Edit your dashboard
• Export source articles
• Export reports (tabular reports)
• Share your dashboard
• Create custom groups using the pinning feature
• Evaluate performance using baselines (pinned list baselines, country baselines, global baselines); see the sketch after this list
• Refocus reports to explore thematic details of research output
• Run preconfigured system reports (evaluate institutional performance)
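As a loose illustration of the baseline idea mentioned above, the sketch below computes an average normalised impact for a pinned group of institutions and compares each member against that pinned baseline and against the global average. The institution names and CNCI values are hypothetical, not InCites output.

```python
# Hypothetical category-normalised citation impact (CNCI) per institution.
cnci = {
    "Institution A": 1.45,
    "Institution B": 0.92,
    "Institution C": 1.10,
    "Institution D": 1.33,
}

pinned = ["Institution A", "Institution C", "Institution D"]  # a custom peer group
pinned_baseline = sum(cnci[i] for i in pinned) / len(pinned)
global_baseline = 1.0  # the world average of a normalised indicator is 1 by construction

for inst in pinned:
    print(f"{inst}: CNCI {cnci[inst]:.2f} "
          f"({cnci[inst] / pinned_baseline:.2f}x pinned baseline, "
          f"{cnci[inst] / global_baseline:.2f}x global)")
```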
GUIDELINES FOR CITATION ANALYSIS
• Compare like with like – the Golden Rule
• Use relative measures, not just absolute counts
• More applicable to the hard sciences than to the arts/humanities
• Know your data parameters:
– journal categories
– author names
– author addresses
– time periods
– document types
• Obtain multiple measures
• Recognize the skewed nature of citation data (see the sketch below)
• Ask whether the results are reasonable
• Understand your data source
For further guidance on citation metrics, download the white papers at:
http://science.thomsonreuters.com/info/bibliometrics/
http://isiwebofknowledge.com/media/pdf/UsingBibliometricsinEval_WP.pdf
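To see why the skewed nature of citation data matters, the short sketch below contrasts the mean and median of a hypothetical citation distribution: a single highly cited paper pulls the mean far above what a typical paper in the set receives, which is why multiple and relative measures are recommended. The counts are invented for illustration.

```python
import statistics

# Hypothetical citation counts for one journal's papers: a few highly cited
# papers dominate, which is typical of citation distributions.
cites = [0, 0, 1, 1, 2, 2, 3, 4, 6, 9, 15, 150]

print("mean:", statistics.mean(cites))      # ~16.1, pulled up by one outlier
print("median:", statistics.median(cites))  # 2.5, closer to a 'typical' paper
```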
Absolute and Normalised Metrics
Normalisation takes place at these levels:
• Document type
• Publication year
• Journal
• Category (Web of Science categories assigned to journals)
This paper's impact is:
• 9 times (900%) the expected journal performance
• 48 times (4,800%) the expected category performance
• Ranked in the top 1% of its field
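As a rough illustration of how such normalised indicators are derived, the sketch below treats normalised impact as the ratio of a paper's actual citations to the expected (average) citations of papers sharing its document type, publication year, and journal. The records and field layout are hypothetical; this is not InCites' internal implementation.

```python
from collections import defaultdict

# Hypothetical paper records: (doc_type, year, journal, category, times_cited)
papers = [
    ("Article", 2012, "J. Foo", "Chemistry", 45),
    ("Article", 2012, "J. Foo", "Chemistry", 3),
    ("Article", 2012, "J. Bar", "Chemistry", 9),
    ("Review",  2012, "J. Bar", "Chemistry", 60),
]

def expected_citations(papers, key):
    """Average citations over papers sharing (doc_type, year, journal-or-category)."""
    totals = defaultdict(lambda: [0, 0])  # group -> [citation_sum, paper_count]
    for doc_type, year, journal, category, cited in papers:
        group = (doc_type, year, journal if key == "journal" else category)
        totals[group][0] += cited
        totals[group][1] += 1
    return {g: s / n for g, (s, n) in totals.items()}

journal_baseline = expected_citations(papers, "journal")
for doc_type, year, journal, category, cited in papers:
    expected = journal_baseline[(doc_type, year, journal)]
    nci = cited / expected if expected else 0.0  # journal-normalised citation impact
    print(f"{journal} {year}: cited={cited}, expected={expected:.1f}, JNCI={nci:.2f}")
```

Swapping the grouping key from "journal" to "category" yields the category-normalised figure; a JNCI of 9.0 corresponds to the "9 times the expected journal performance" reading above.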
Organisations: what do we want to measure/analyse/evaluate?
Evaluate Performance of Organisations
1) What is our total output and how does that compare to peer institutions?
2) What is our output trend over time? How does this compare to peers?
3) How does our normalised impact (at journal and category level) compare to our peers?
4) Do we produce Highly Cited and Hot Papers? How does that compare to peers?
5) What % of output is a collaboration with an international organisation?
6) What % of output is a collaboration with industry?
7) Which are our top performing papers?
8) Which are our strongest fields of research in terms of output?
9) Which is our strongest field of research in terms of impact?
10) How do we perform in an area of research? How does that compare to peers?
11) Who do we collaborate with most frequently, within our country and outside?
12) Which are our best performing collaborations?
13) Is our research performance exceeding the country/global average?
14) Has the number of publications in open access journals increased?
15) In which journals/proceedings/books do we publish?
Can you think of other questions related to evaluating institutional research performance?
Using InCites to evaluate research performance
1. What is our total output and how does that compare to peer institutions?
2. What is our output trend over time? How does this compare to peers?
3. How does our normalised impact (at journal and category level) compare to our peers?
Using InCites to evaluate research performance
4. Do we produce Highly Cited and Hot Papers? How does that compare to peers?
5. What % of output is a collaboration with an international organisation?
6. What % of output is a collaboration with industry?
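Questions 5 and 6 reduce to simple percentages over affiliation data. A minimal sketch, assuming hypothetical records that list each paper's affiliation countries and sectors:

```python
# Hypothetical records: each paper lists the country and sector of every affiliation.
papers = [
    {"countries": {"GB"},       "sectors": {"academic"}},
    {"countries": {"GB", "DE"}, "sectors": {"academic"}},
    {"countries": {"GB", "US"}, "sectors": {"academic", "corporate"}},
]

def pct(papers, predicate):
    """Percentage of papers satisfying the predicate."""
    return 100.0 * sum(predicate(p) for p in papers) / len(papers)

intl = pct(papers, lambda p: len(p["countries"]) > 1)          # >1 country on the paper
industry = pct(papers, lambda p: "corporate" in p["sectors"])  # any corporate affiliation
print(f"International collaboration: {intl:.1f}%")
print(f"Industry collaboration: {industry:.1f}%")
```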
Article level performance
7. Which are our top performing papers? Use Times Cited, Category NCI, or Percentile.
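The three metrics can rank the same papers differently. A minimal sketch using invented article-level values (note that for percentiles, lower is better):

```python
# Hypothetical article-level metrics as exported from an analytics tool.
articles = [
    {"title": "Paper A", "times_cited": 120, "category_nci": 4.8, "percentile": 0.5},
    {"title": "Paper B", "times_cited": 35,  "category_nci": 1.2, "percentile": 18.0},
    {"title": "Paper C", "times_cited": 60,  "category_nci": 2.9, "percentile": 4.0},
]

# A percentile of 0.5 means the paper sits in the top 0.5% of its field.
by_cites = sorted(articles, key=lambda a: a["times_cited"], reverse=True)
by_nci   = sorted(articles, key=lambda a: a["category_nci"], reverse=True)
by_pct   = sorted(articles, key=lambda a: a["percentile"])

for label, ranking in [("Times Cited", by_cites), ("Category NCI", by_nci), ("Percentile", by_pct)]:
    print(label, "->", [a["title"] for a in ranking])
```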
Using InCites to evaluate research performance
8. Which are our strongest fields of research in terms of output?
9. Which is our strongest field of research in terms of impact?
Using InCites to evaluate research performance
10. How do we perform in an area of research? How does that compare to peers?
Using InCites to evaluate research performance
11. Who do we collaborate with most frequently, within our country and outside?
12. Which are our best performing collaborations?
Using InCites to evaluate research performance
13. Is our research performance exceeding the country/global average?
Using InCites to evaluate research performance
14. Has the number of publications in open access journals increased?
‘Refocus’ a report to explore other themes of research performance
Where do we publish?
15. In which journals/proceedings/books do we publish?
Export Tabular Reports
Save Report
Save a report to:
• Dashboard
• Existing folder
• New folder
Edit Reports
Access My Folders and Run Saved Reports
Export Article-Level Metrics
Export up to 50,000 records.
Individual Researcher Evaluation: what do you want to measure/analyse/identify?
Author Performance
1) What is my total output?
2) What is my overall citation impact?
3) What is my Citation Impact (average cites per paper)?
4) Which are my best performing papers?
5) What is my overall performance in the fields of research I publish in?
6) Which are my strongest fields of research in terms of output?
7) Which are my strongest fields of research in terms of impact?
8) Who do I collaborate with, within my institution and outside?
9) Which are my best performing collaborations?
10) In which journals do I publish? Which papers are performing better than the journal expected impact?
11) How do I compare to other researchers publishing in the same field? Which papers are performing better than the category expected impact?
12) How many papers do I have in the top 1% or top 10% of their field? (See the sketch after this list.)
Can you think of any more questions related to evaluating researcher performance?
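For question 12, counting papers in the top 1% or top 10% is a simple threshold check over each paper's percentile rank in its field (where lower is better). A minimal sketch with invented values:

```python
# Hypothetical export: each paper's percentile rank within its field
# (lower is better, so 0.8 means top 0.8% of the field).
percentiles = [0.8, 4.5, 9.9, 12.0, 55.0, 73.2]

top1 = sum(p <= 1.0 for p in percentiles)
top10 = sum(p <= 10.0 for p in percentiles)
print(f"Papers in top 1% of field: {top1}")    # 1
print(f"Papers in top 10% of field: {top10}")  # 3
```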
Create a Peer/Competitor Group
Who do you want to compare? What metrics are appropriate to evaluate the research performance of this custom group?
Upcoming Enhancements
• Export results from the Web of Science Core Collection to InCites
• More JCR metrics
• WoS Profiles to replace Author Datasets in IC1
Thank You!
[email protected]
http://researchanalytics.thomsonreuters.com/incites/