
Use of Impact Evaluation for Organizational Learning and Policy Influence: The Case of International Agricultural Research

AfrEA/NONIE/3ie Conference Perspectives on Impact Evaluation March-April 2009

Overview/Introduction

• Use and non-use of impact evaluation: the CGIAR case
  Douglas Horton & Ronald Mackay, Independent evaluation consultants
• Towards a broader range of impact evaluation methods for collaborative research: report on a work in progress
  Patricia Rogers, Royal Melbourne Institute of Technology & Jamie Watts, CGIAR Institutional Learning and Change Initiative
• Role of Impact Evaluation in Moving from Research into Use
  Sheelagh O’Reilly, Team Leader, Impact Evaluation, Research into Use Programme

• Combined presentation
• Reaction from Robert Chambers, Discussant
• Q&A and Discussion

Use and Non-Use of Impact Evaluation: the CGIAR Case

Douglas Horton & Ronald Mackay

Overview

• CGIAR has a long history of producing high-quality impact evaluations
• However, there has been limited use of findings:
  – To influence donor / investor decisions & resource allocations
  – To promote learning & program improvement
• Use may be enhanced somewhat through better planning and communication, but there remain some inherent problems with all disciplinary-oriented evaluation approaches
• Other ways of evaluating and fostering learning are needed for social / institutional learning and for policy and program improvement

History of IE in the CGIAR

• High estimated returns to investment in agricultural research were key to establishing the CGIAR
• Hundreds of economic impact assessments report high rates of return
• CGIAR economists have contributed significantly to improving IA theory & methods

From the Studies …

“CGI [crop genetic improvement] programmes have been outstanding investments. Few investments can come close to achieving the poverty reduction per dollar expended that the CGI programmes evaluated in this volume have realized… Any reduction in support to agricultural projects, in particular to projects designed to improve productivity, will seriously limit and hamper efforts to reduce mass poverty.”

(Evenson & Rosegrant, 2003: 496)

The Emerging Paradox

“Concern is growing within the donor community relating to the effectiveness of existing impact assessment research in guiding international agricultural research... donor support for agricultural research is declining, despite the credible assessments showing that investment in this area indeed has had high return.”

(Gregersen & Morris, 2003: vii)

“There is little apparent relationship between impact assessment findings and the subsequent allocation patterns of donors… those areas of research with the highest levels of assessed benefits often suffer from declining funding, while unproven areas of research and non-research investment receive rising funding shares”

(Raitzer & Winkel, 2005: ix)

Funding to International Agricultural Research (Source: ASTI Initiative)

[Figure: line chart of funding to international agricultural research by year, 1961–2001, showing Unrestricted and Total funding (vertical axis 0–500)]

What is Going On Here?

• Good (impact evaluation) research does not necessarily lead to policy / programme support.

• Many factors may affect policy & management decisions more than (evaluation) information.

• For any kind of evaluation to have an impact, use needs to be cultivated from the beginning.

• One type of IE may not meet all needs.

Some factors influencing use

1. Engagement of intended users
2. The four “I’s”
3. Types and levels of use
4. Attention to use

Engagement of Potential/Intended Users

• Donors & development agencies
• Policymakers
• Center / program managers
• Researchers
• Peers
• Constituents / intended beneficiaries

Why engage users?

[Diagram: engagement → use of findings and “process use” → influence on decision making]

Four “Is”

• Interests
• Ideologies
• Institutions
• Information

(Weiss, 1998)

Types and Levels of Use

Each type of use (direct/instrumental, indirect/conceptual, and symbolic) can occur at each decision level (strategic, structural, and operational).

[Original slide presented this as a matrix of decision levels against types of use]

Attention to Communication

• Multiple forms of communication
• Match format to audience
• Long-term involvement
• Integrate evaluation into program
• Guard against standardization
• Involve stakeholders
• Create context for dialogue

Suggestions

View and manage IE as “evaluation,” not as “research”:

1. Plan and manage evaluations to foster specific uses.

2. Target specific policy- and program-related issues.

3. Explain how programmes or projects attain results in their context.

4. Use mixed methods from various disciplines as needed to respond to evaluation questions.

5. Judge evaluations for usefulness, practicality, propriety, and accuracy of data and results.

Towards a broader range of impact evaluation methods…Why?

• Agricultural research has expanded into a broader range of areas
  – From crop improvement to higher-level development goals
• The role of the researcher in the agricultural innovation system is changing
  – From center of excellence to collaborative and capacity-building approaches
  – From transfer of technology to demand-driven, locally relevant solutions
• Traditional evaluation designs may not always be feasible or appropriate

Increasingly diverse portfolio

• Well represented: impact assessment of genetic improvement of major crops
• Somewhat represented: biological control of pests
• Under-represented in the IA portfolio:
  – crop and integrated pest management
  – livestock
  – natural resources management
  – post-harvest technologies
  – policy and gender research

Increasingly collaborative research

[Figures: (i) 1996 and (ii) 2003. Source: Douthwaite 2004.]

Increasing demand to engage intended end users:

• Increase researchers’ understanding of local issues to improve the relevance of research to local conditions
• Increase uptake and appropriate adaptation
• Incorporate local knowledge into research
• Co-production of knowledge by researchers and community members
• Develop end-users’ capacity to build and use knowledge for adaptive management

Spectrum of participation Scope of this work

• Conventional research: scientists make the decisions alone, without organized participation by end-users.
• Contractual: scientists contract with end-users to participate.
• Consultative: scientists make decisions, but with organized communication with end-users.
• Collaborative: decision-making authority is shared between end-users and scientists. Neither party can revoke or override a joint decision.
• Collegial: end-users make decisions collectively, either in a group process or through individual end-users who are in organized communication with scientists.
• End-user experimentation: end-users make the decisions without organized communication with scientists.

(adapted from Lilja and Ashby)

Nature June 2008 Special issue on translational research

Conceptualising translational research

[Nobel laureate Sydney] Brenner is one of many scientists challenging the idea that translational research is just about carrying results from bench to bedside, arguing that the importance of reversing that polarity has been overlooked. “I’m advocating it go the other way,” Brenner said.

Simple: Following a Recipe
• The recipe is essential
• Recipes are tested to assure replicability of later efforts
• No particular expertise required; knowing how to cook increases success
• Recipes produce standard products
• Certainty of same results every time

Complicated: A Rocket to the Moon
• Formulae are critical and necessary
• Sending one rocket increases assurance that the next will be OK
• High level of expertise in many specialized fields, plus coordination
• Rockets are similar in critical ways
• High degree of certainty of outcome

Complex: Raising a Child
• Formulae have only a limited application
• Raising one child gives no assurance of success with the next
• Expertise can help but is not sufficient; relationships are key
• Every child is unique
• Uncertainty of outcome

(Diagram from Zimmerman 2003)

Deciding impacts
• Simple: likely to be agreed
• Complicated: likely to differ, reflecting different agendas
• Complex: may be emergent

Describing impacts
• Simple: more likely to have standardised measures developed
• Complicated: evidence needed about multiple components
• Complex: harder to plan for, given emergence

Analysing cause
• Simple: likely to be a clear counterfactual
• Complicated: causal packages and non-linearity
• Complex: unique, highly contingent causality

Reporting
• Simple: clear messages
• Complicated: complicated messages
• Complex: uptake requires further adaptation

The need for broader range of methods

• Complement existing methods for impact evaluation (raising issues of multidisciplinarity and mixed methods)
• Identify, describe, measure and value impacts
• Assess causal inference in collaborative and/or participatory projects
• Support the use of impact evaluation for learning and adaptive management

Entry Points for learning and change

• Knowledge, skills & attitudes
  – People need to want to learn and know how to engage partners in co-creation of knowledge
• Management systems & practices
  – Leaders learn, value learning, and promote learning in concrete ways
  – Communication channels facilitate easy access to information and knowledge sharing
  – Systems and structures facilitate learning
• Organizational culture
  – Supports and rewards reflection & learning and the application of lessons
• External environment
  – Is conducive to reflection and learning from experience

Visualising the connection between laboratory research and practice research

Tabak, 2005. National Institute of Dental and Craniofacial Research, National Institutes of Health

Capacity for organizational learning

• Systematically gathering information
• Making sense of information
• Sharing knowledge and learning
• Drawing conclusions and developing guidelines for action
• Implementing action plans
• Institutionalizing lessons learned and applying them to new and ongoing work

Research Into Use Programme

How can innovation-system approaches promote and facilitate greater use of research-based knowledge?
– Maximise the poverty-reducing impact of previous research on natural resources
– Develop understanding of how innovation-system approaches contribute to reducing poverty while ensuring effective and efficient management of natural resources

• Challenges to impact evaluation
  – Need to identify critical success factors
  – Coherent approaches for spotting ‘potential winners’ among research outputs, in the move from research into innovation
  – Mainstream use of new technologies that contribute to poverty reduction and economic growth