Janet E. Halliwell
Presentation overview
The big picture - measuring the return on S&T
investments (ROI)
Context, challenges and issues, variables,
methodologies, trends
Innovation and what our understanding means for
assessing ROI
The data challenge
Wrap-up
The ROI context
Increasing pressure to measure impacts of public S&T
investments (nothing new here!)
What do researchers tend to think about (especially for P&T)?
Inputs (e.g., funding raised)
Research productivity
Numbers of HQP (highly qualified personnel)
Perhaps commercialization
What are institutions most interested in?
Competitiveness for students, prestige and funds
Costs - impacts on their bottom line
What is the larger public interest?
Quality of the PSE system
Larger social and economic impacts from R&D and service
ROI measurement challenges
Very diverse language and expectations about what is meant by ROI
No universal framework or universally applied methodologies
Measurement of ROI needs to encompass diverse dimensions of
impact:
Economic (e.g. jobs, new products and services, spin-off companies,
business process change)
Social and health (e.g. changes in policy and practice, improved
outcomes, costs avoided)
Environmental (e.g. reduced footprint and environmental impact,
branding Canada green)
In addition to the practical stumbling blocks of measuring these,
the interpretation of any measures is non-trivial
And even all of the above does not necessarily capture the full impact
on innovation or the innovation system
Some issues
ROI measurement requires us to think about what happens
down the value chain as a result of the research and
research-related activities, beyond the quality, volume and
influence of research on other research: what difference
did this S&T investment make in the real world?
ROI measurement is NOT a classical economic I/O study
(which measures the flow of monies resulting from an
activity regardless of what that activity is)
A theoretically sound ROI method is of little use if key
stakeholders are not consulted or don’t understand it
Variables to think about
Your audience/target
Scope and level of aggregation
Distance down the value chain of outputs, outcomes
and impacts
Time scale (how far back)
Methodologies
Desired detail; how to communicate (e.g. visualize)
Balancing accuracy, longevity, comparability and ease
of collection of metrics
Downstream measurement …
Categories relevant to innovation include at least the
following (tagged in the sketch after this list):
Direct – Using research findings for better ideas, products,
processes, policies, practice etc.
Indirect – From participation in the research, including HQP
training, KT, tacit knowledge, better management, etc.
Spin-off – Using findings in unexpected ways and fields
Knock-on – Arising far after the research is done
Also very important – outcomes that foster an
environment in which innovation flourishes
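To make these categories concrete, here is a minimal Python sketch that tags recorded outcomes with the four categories above. The Outcome record and the example entries are illustrative assumptions, not part of the presentation.

```python
from dataclasses import dataclass
from enum import Enum

class ImpactCategory(Enum):
    DIRECT = "direct"      # findings used directly for better ideas, products, policies
    INDIRECT = "indirect"  # from participation: HQP training, KT, tacit knowledge
    SPIN_OFF = "spin-off"  # findings used in unexpected ways and fields
    KNOCK_ON = "knock-on"  # arising long after the research is done

@dataclass
class Outcome:
    description: str
    category: ImpactCategory

# Hypothetical outcomes, tagged for downstream reporting
outcomes = [
    Outcome("Municipal policy revised using project findings", ImpactCategory.DIRECT),
    Outcome("Graduate student trained on the project hired by a partner firm", ImpactCategory.INDIRECT),
]

for o in outcomes:
    print(f"[{o.category.value}] {o.description}")
```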
Example methodologies
Quantitative
Surveys
Bibliometrics, including publication counts, citation analysis, data
mining, international collaboration analysis and social network
analysis (see the h-index sketch after this list)
Technometrics, including hot-spot patent-publication linkages
Economic rate of return – micro and macro levels
Sociometrics
Benchmarking
Qualitative
Peer Review, Merit/Expert Review
Case study method – exploratory, descriptive, explanatory
Retrospective documentary analysis
Interviews
Mixed models (e.g. Payback, OMS, CAHS)
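As one concrete example from the quantitative side, here is a minimal sketch of a common citation-analysis metric, the h-index, computed from per-publication citation counts. The sample counts are hypothetical.

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h publications have >= h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

citations = [42, 17, 11, 8, 3, 1, 0]  # hypothetical per-paper citation counts
print("publications:", len(citations))  # 7
print("h-index:", h_index(citations))   # 4
```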
Trends
Mixed methods
Increasing attention to networks, linkages,
collaborations
Global frame of reference
Involvement of stakeholders inside and external to
the R&D unit
External focus – e.g. short-term external impacts
for industry or government, rather than immediate
outcomes for the R&D organization
What is innovation?
“Doing new things in new ways” (Tom Jenkins)
“Innovation” is (or should be) a very broad term, BUT …
Many studies focus only on the metrics that are easiest to measure,
not on innovation-relevant issues
Or, they focus exclusively on industrial impacts such as sales
of new products
So ….
Encourage other routes and types of impacts
Attempt to measure them, including cost savings
Encourage “end-goal” thinking
And remember
Innovation comes in many different guises, e.g.
Incremental innovation
Integrated innovation
Open innovation
Informal innovation
Social innovation
Design as innovation
Consider measurement in the innovation context
The macro level messages
Measurement is important – but NOT just any
measurement
Need to ground measurement in a strong conceptual
framework connecting activities, ultimate goals,
intended uses, and targeted users (logic models; a
minimal sketch follows this list)
Then look at the relationships of outcomes and impacts with
innovation in your sector or sphere of activity
Measurement is BOTH qualitative and quantitative
Proper measurement often takes deliberate effort and time
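Here is a minimal sketch of what such a logic model might look like as a data structure, assuming a simple linear chain from activities to ultimate goals. The stage names follow standard logic-model usage; the example entries are illustrative, not from the presentation.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    activities: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)
    impacts: list[str] = field(default_factory=list)  # the ultimate goals

    def describe(self) -> str:
        stages = ("activities", "outputs", "outcomes", "impacts")
        return " -> ".join(f"{s}: {'; '.join(getattr(self, s))}" for s in stages)

model = LogicModel(
    activities=["fund collaborative R&D projects"],
    outputs=["publications", "trained HQP"],
    outcomes=["partner firm adopts improved process"],
    impacts=["productivity gains in the sector"],
)
print(model.describe())
```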
Why quantitative and qualitative
Quantitative for understanding reach, scope, &
importance of impacts
Qualitative for the how and why of impacts, barriers &
solutions, incrementality and attribution, governmental,
societal and environmental effects, etc.
There is no such thing as a purely quantitative system
that measures full impacts of S&T
Reporting SE impacts
Impact type: investigated and described in:
Qualitative impacts: narrative fashion
Quantitative impacts: quantitative terms
Economic impacts: dollar terms
And …
Consider “outcome mapping” à la IDRC, where the
focus is on people and organizations.
Go beyond simply assessing the products of a
project/program to focus on changes in behaviours,
relationships, actions, and/or activities of the people
and organizations with whom a program works
This is a learning-based and use-driven approach
Recognizes the importance of active engagement of
players in adaptation and innovation – “productive
interactions”
The data challenge (1)
Need a good MIS at the levels of researcher and
project/activity, one that connects with your
Network/Centre goals
Need to integrate into the MIS your reporting
requirements, accountability plans and Centre/Network
self-monitoring/self-learning needs
Tie the MIS to the performance measurement system by
having automatic reports produced, red flags raised, etc.
(a minimal sketch follows this list)
Need a foundation of data standards
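A minimal sketch of the “automatic reports, red flags” idea: a periodic check run over MIS records that flags projects missing targets. The record fields (milestones_due, report_overdue_days) and the thresholds are illustrative assumptions, not a real MIS schema.

```python
# Hypothetical MIS project records
projects = [
    {"name": "Project A", "milestones_due": 4, "milestones_met": 4, "report_overdue_days": 0},
    {"name": "Project B", "milestones_due": 5, "milestones_met": 2, "report_overdue_days": 45},
]

def red_flags(p: dict) -> list[str]:
    """Return a list of flag messages for one project record."""
    flags = []
    if p["milestones_met"] < p["milestones_due"]:
        flags.append(f"{p['milestones_due'] - p['milestones_met']} milestone(s) missed")
    if p["report_overdue_days"] > 30:  # illustrative threshold
        flags.append(f"progress report {p['report_overdue_days']} days overdue")
    return flags

for p in projects:
    flags = red_flags(p)
    status = "; ".join(flags) if flags else "on track"
    print(f"{p['name']}: {status}")
```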
The data challenge (2)
Standards can help:
Reduce the burden on researchers
Facilitate the interface with funders
Provide access to cost-effective software solutions
Enable comparisons with yourself over time
Enable comparisons with other institutions
Support international benchmarking
CASRAI is a large part of the standards picture
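A minimal sketch of why a shared data standard helps: if every centre stores research outputs in one agreed shape, exchange and aggregation become straightforward. The field names below are hypothetical, not the actual CASRAI dictionary.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ResearchOutput:
    # Hypothetical common-vocabulary fields; a real standard would define these
    output_type: str          # e.g. "journal-article", "patent"
    title: str
    year: int
    contributors: list[str]
    funder: str

record = ResearchOutput(
    output_type="journal-article",
    title="Example study",
    year=2011,
    contributors=["A. Researcher"],
    funder="Example Council",
)
# Same shape everywhere means easy comparison and benchmarking
print(json.dumps(asdict(record), indent=2))
```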
Customized and flexible methodologies
There are plenty of metrics and methods available
No need to invent any more (although you will likely need to
intensify your data collection)
It’s how metrics and narrative are used and combined that
makes the difference
No “one size fits all” methods or metrics work for all types of
S&T, for all types of organizations, or for all uses and users
All methods have substantial strengths and substantial
weaknesses
Involve key stakeholders
Remember that innovation requires many players, not just the
R&D team
Measurement can make a difference
Accountability and advocacy - Making the case on the
basis of outcomes and impacts:
For overall program funding
For the nature and dynamics of the staff complement
Self awareness and understanding:
Internal - Strengths, weaknesses, gaps
External – Threats, opportunities
Forward directions/areas needing attention
Fine tuning the strategic vision
Fostering sustainable relationships
Finally …
To achieve these objectives, you need:
Good (and visionary) governance
Good management - capable staff complement
Robust database with in-house expertise
Active engagement of players in using the outcomes
measures for adaptation and innovation
Measurement is “quantum”: it changes the
system; you tend to get what you ask people to
measure
Thank you
[email protected]