Publication output patterns


The storyline
The programme evaluation (PE) literature has – over the past forty years
– developed a fairly standard classification of evaluation types. There are
certain distinctions that are (at least within realist approaches) relatively
widely accepted. I begin by discussing some key notions in programme
evaluation, including the Logic Model framework, and give an
example of how it has been used to construct a monitoring
framework for research at the systems/institutional level.
Part Two is devoted to a case study in research impact assessment. In this
discussion I reconstruct the processes and events that led to high levels of
uptake and impact.
In the final part of the paper, I make some general comments and
observations about the notion of research impact.
Part One: Impact assessment within programme evaluation studies
The logic of interventions
[Diagram: the logic of an intervention, situated within its context]
• Intervention management: human resources, project administration, M&E system, stakeholders
• Intervention structure: programme resources; activities/outputs (e.g. workshops, courses, manuals)
• Goals: objectives; target group
• Measurable outcomes: more knowledgeable, increased competence, higher productivity
Applying the logic model to interventions
• Problem → programme aims and objectives
• Resources: what you need to carry out the activities you have planned – people, money, materials, infrastructure
• Activities: what you do with the resources you have
• Outputs: what the activities produce, e.g. products, deliverables, goods, services
• Outcomes/effects: what happens to the target group as a result of the delivery of the programme
• Indicators: concrete and measurable “signs” of “occurrence”, often quantified and aggregated into composite measures (indices) – see the sketch below
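To make the chain concrete, here is a minimal Python sketch of a logic-model row and of indicator aggregation. Every name and number in it is hypothetical, and the weighted average is only one common way of building a composite index:

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """One logic-model row: each step feeds the next, from the
    problem being addressed through to outcomes for the target group."""
    problem: str
    resources: list[str]   # people, money, materials, infrastructure
    activities: list[str]  # what you do with the resources
    outputs: list[str]     # products, deliverables, goods, services
    outcomes: list[str]    # changes in the target group

def composite_index(indicators: dict[str, float],
                    weights: dict[str, float]) -> float:
    """Aggregate quantified indicator scores into one composite
    measure (index) as a weighted average."""
    total = sum(weights.values())
    return sum(score * weights[name]
               for name, score in indicators.items()) / total

# Hypothetical training intervention
course = LogicModel(
    problem="low research competence among junior staff",
    resources=["trainers", "funding", "venues"],
    activities=["workshops", "courses"],
    outputs=["manuals", "trained participants"],
    outcomes=["increased competence", "higher productivity"],
)

# Indicator scores normalised to 0-1, with illustrative weights
print(composite_index(
    indicators={"knowledge_test": 0.8, "output_per_staff": 0.6},
    weights={"knowledge_test": 0.5, "output_per_staff": 0.5},
))  # 0.70
```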
Types of programme monitoring
• Process or implementation evaluation
− Verifies what the programme is and whether it is delivered as intended to the targeted recipients
− Addresses the effectiveness of programme operations and service delivery, and whether the programme reaches the target group as planned (coverage)
• Routine programme monitoring and management information systems
− Gathers information on programme outputs, e.g. number of clients served, quality of service provided, number of workshops delivered
− Continuously monitors indicators of selected aspects of programme processes or activities as a tool for effective management
• Performance measurement and monitoring
− Accountability demands require that programmes demonstrate that they accomplish something worthwhile, often against set standards or benchmarks
− Orientated towards the assessment of outcomes, i.e. the results of services
Impact assessment in programme evaluation
The basic aim of impact assessment is to produce an estimate of the net effects of an intervention – i.e. an estimate of the impact of the intervention uncontaminated by the influence of other events or processes that may also affect the behaviour or changes that a programme is directed at achieving (Freeman & Rossi)
Prerequisites for assessing impact
• The programme’s objectives must be sufficiently well articulated
to make it possible to specify credible measures of the expected
outcomes
• The programme must have been sufficiently well implemented – there must be no question that its critical elements have been delivered to the appropriate targets
Gross versus net outcomes
• Establishing a programme’s impact, i.e. its combined and accumulative effects (outcomes), is identical to establishing that the programme is a cause of a specific effect or effects.
• It is important that causality not be confused with lawlikeness. “A is a cause of B” usually means that if we introduce A, B is more likely to result than if we do not introduce A. This statement does not imply that B always results, nor does it mean that B occurs only if A is introduced.

Gross outcome = effects of the intervention (net effect) + effects of other processes (extraneous factors) + design effects
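Rearranged, the identity reads: net effect = gross outcome − effects of other processes − design effects. The last two terms are not observed directly; one common move, not prescribed here, is to use a comparison group whose change over the same period stands in for them. A minimal sketch with purely illustrative numbers:

```python
def net_effect(treated_before: float, treated_after: float,
               control_before: float, control_after: float) -> float:
    """Difference-in-differences estimate of the net effect: the gross
    change in the treated group minus the change in a comparison group,
    which proxies for extraneous factors and design effects."""
    gross_change = treated_after - treated_before
    other_change = control_after - control_before
    return gross_change - other_change

# Illustrative: publications per researcher before and after a
# hypothetical capacity-building programme.
print(net_effect(treated_before=1.2, treated_after=1.8,
                 control_before=1.1, control_after=1.3))  # 0.4
```

The subtraction is only as good as the comparability of the two groups: randomised assignment is what licenses reading the control group’s change as the extraneous component.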
An example at the systems/institutional level
• The National Plan on Higher Education (2002)
has set the following five systemic goals for
Higher Education research
− To redress the inequities of the past as far as race
and gender are concerned (Equity)
− To ensure that HE research is in line with national
goals (Responsiveness)
− To increase the volume of research output (Quantity)
− To improve the quality of research produced in the
system (Quality)
− To improve the efficiency (throughput) of the system
(Efficiency)
Domains by system goals
Goals by domain (research / students / staff):
1. Equity
− Research: 1.1 more women, black and young researchers participating
− Students: 1.2 more women, black and foreign students participating
− Staff: 1.3 more women, black and young staff participating at all levels
2. Responsiveness
− Research: 2.1 responsive to national goals, niches, industry partnerships
− Students: 2.2 fields of study, levels, graduateness
− Staff: 2.3 programmes, fields of study, mix
3. Quantity/volume
− Research: 3.1 higher volume of research outputs
− Students: 3.2 increased participation and output
− Staff: 3.3 more staff
4. Quality/excellence
− Research: 4.1 better quality research
− Students: 4.2 better skilled students
− Staff: 4.3 higher qualified staff, better programmes
5. Efficiency
− Research: 5.1 improved output-cost ratio
− Students: 5.2 throughput, completion rate
− Staff: 5.3 staff-student and staff-research ratios
An example of indicators of research (inputs, outputs, effects) at the systems/institutional level

Indicator tags: [EQ] equity, [P] participation, [R] responsiveness, [Q] quality, [EF] efficiency

Research inputs (researchers, funding, research process):
− Number of black and female researchers getting NRF funding [EQ]
− R&D expenditure [P]
− Contract income [R]
− THRIP funding [R]
− Number of rated researchers [Q]
− NRF grants in focus areas [R]

Research outputs:
− Number of black, female and young PhDs [EQ]
− Number of research publications [P]
− Number of ISI publications [Q]
− Publication trends in line with national goals [R]
− Research co-authorship [R]

Effects (internal efficiency and effectiveness):
− Publications per FTE researcher [EF]
− Publications per R&D expenditure [EF]
− Extent of research utilisation [R]
Part Two: A case study in research impact assessment
CREST’s research programme on the demographics of SA
scientists
The CREST research on the demographics of the SA S&T workforce: the origins

[Timeline, 1997-2001: NRTA (R&D Survey) → survey database → development of SA Knowledgebase → public presentation; DACST not interested in its further development]
The production of scientific knowledge in SA:
Race trends
[Figure: shares of scientific publication output by race (White, African, Indian, Coloured), 1990-2002]
The production of scientific knowledge in SA:
Gender trends
[Figure: shares of scientific publication output by gender (male, female), 1990-2002]
The production of scientific knowledge in SA:
Age trends
[Figure: shares of scientific publication output by age group (under 30, 30-39, 40-49, 50-59, 60+), 1990-2002]
Initial dissemination of the results (2001 – 2002)
• Public presentations
− Science in Africa symposium (October 2001)
− SARIMA founding meeting (February 2002)
− NACI Discussion Forum in Pretoria (April 2002)
• Publications
− (with A Bawa) “Research” chapter in Transformation in higher education in SA (submitted July 2001; published in 2002)
− NACI Facts and Figures – 2002
− “South African science in transition”, Science, Technology and Society, Vol 8 (2003)
Accelerated uptake and utilisation of the CREST
results
• NACI commission on South African Science: Facts and figures
(November 2001/ Finished April 2002)
• COHORT request (June 2002) for a think piece on “A new
generation of scientists” (report submitted in December 2002)
• All three slides on the “frozen demographics” - Department of
Science and Technology – Final version of the new R&D Strategy
(August 2002)
• Request by the Academy of Science of South Africa for a document
on: Promoting worldwide S&T capabilities for the 21st century
(August 2002 report submitted/ symposium held October 2002)
• Reference in numerous press statements by ministers, directors-general and senior officials of the DST, DoE, NRF, SAUVCA and others
• Requests from a number of universities and professional societies
(Physics/ Marine Science) for the data.
Mapping the effects of the CREST research
programme
[Diagram: effects tree of the CREST research programme, with intermediate effects (IE) and utilisation effects (UE) accumulating over time]
− IE1 (better understanding) → IE11 (expanded research programme) → IE111 (journals), IE112 (language)
− IE2 (dbase development) → IE21 (expanded dbase) → IE222 (OSTIA)
− UE1 (inform R&D strategy) → leverage funds for R&D
− UE2 (inform strategic planning) → UE21 (DST), UE22 (NRF)
− UE3 (inform HR policy) → UE31 (UCT), UE32 (US)
ACCUMULATIVE EFFECTS
Some observations on the process
• Researcher-directed and -driven dissemination was soon complemented by user/demand-driven dissemination
• Differentiated effects: the research produced a range of very different kinds of effects across a wide range of actors
• The majority of the effects were unforeseen
• One effect often spun off multiple subsequent effects (see the sketch below)
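One way to see the multiplier at work is to model the mapping above as a tree of effects and collect everything that accrues from a single proximate effect. In this minimal Python sketch the node labels come from the mapping, but the parent-child edges are an assumption read off the numbering:

```python
from dataclasses import dataclass, field

@dataclass
class Effect:
    """A node in the effects tree: one effect plus the subsequent
    effects it spun off."""
    label: str
    spin_offs: list["Effect"] = field(default_factory=list)

def accumulate(effect: Effect) -> list[str]:
    """Collect an effect and every effect that accrues from it:
    the accumulative effects of one proximate effect."""
    collected = [effect.label]
    for child in effect.spin_offs:
        collected.extend(accumulate(child))
    return collected

# Assumed edges, read off the IE numbering in the mapping above
ie1 = Effect("IE1 (better understanding)", [
    Effect("IE11 (expanded research programme)", [
        Effect("IE111 (journals)"),
        Effect("IE112 (language)"),
    ]),
])
print(accumulate(ie1))  # one proximate effect, four accumulated effects
```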
Part Three
Concluding observations
Where programme evaluation tools are useful
• Outputs and outcomes (effects) are not to be confused: research outputs are the immediate (epistemic) results (findings/data) that flow from the research process. Outcomes (effects) imply some change, some recognition of the value of these results and subsequent utilisation or uptake thereof: other scientists who cite my results, users who apply the newly acquired information in various ways through technology development or improvement, policy formation, improvement of practice, and so on.
• Research outputs can generate multiple effects: one discovery or research finding can produce many and diverse outcomes (some immediate, which we will call the proximate effects; others accruing over a longer time frame, which we will call accumulative effects)
• Effects in turn often spin off other effects (the multiplier effect), and new effects (often unintended) emerge over time
Knowledge production, uptake or utilisation and impact
[Diagram: from modes of research, via research(-based) outputs/results, to uptake and impact]
• Codified outputs: new knowledge (new facts, theories, models) and knowledge applications/technologies (policies, legislation, practices; process technologies; product technologies; tests, scenarios, systems)
• Embodied outputs: students
• Uptake: by the scientific community, society, government and industry
• Impact
But there are also fundamental differences between intervention and research programmes!
• An intervention programme = a structured set of goal-driven activities
that are aimed at producing certain (measurable) positive
outcomes/effects for a specific target group
• A research programme = an assemblage of loosely interrelated,
sometimes converging but also diverging, activities (research lines)
intended to produce credible results that have epistemic (knowledge
generating) and non-epistemic (symbolic/social/ technological/ economic)
value
• Proper implementation of an intervention programme (a training
programme or poverty alleviation programme) implies that there are
expected outcomes that can be “predicted” – that are reasonably
determinate
• Good execution of a research programme still does not imply that the
outcomes are determinate – research essentially remains an open-ended
process of unforeseen discoveries and findings
Concluding observations
The impact of intervention AND research programmes is usually…
• Only evident after some time has elapsed (the notion of emergence)
• The combined result of various effects or outcomes that together produce the benefits to the users (the notion of accumulation)
• Made up of very different (in nature) kinds of mutually reinforcing effects (the notion of differentiation)
BUT whereas intervention programmes have both intended AND unintended effects (the notion of goal-free vs. goal-driven evaluations), research programmes are more likely to include unforeseen than foreseen effects (the notion of indeterminacy)
Conclusions
Measuring the impact of scientific research (programmes) means observing/estimating the accumulated, differentiated, proximate and emergent effects, some of which will of necessity be unforeseen
Thank you