
Challenges to Institutionalizing Impact Evaluation
Discussion Points* presented at the International Conference on Impact Evaluation
Cairo, Egypt
March 28 – April 2, 2009
*By Getahun Tafesse, CIDA-ECCO M&E Advisor
Strengths and Challenges to Institutionalizing Impact Evaluation
Opportunities
• Enormous development challenges in Ethiopia: widespread poverty, chronic food insecurity, HIV/AIDS, illiteracy, environmental degradation, etc.
• This provides fertile ground for evaluation, as intervention resources are limited and different stakeholders are under pressure to demonstrate results
• The Government of Ethiopia (GoE) policy statements express commitment to poverty reduction and sustainable development.
Strengths and Challenges to Institutionalizing Impact Evaluation
Opportunities
• Ethiopia has also formally adopted the MDGs as an overarching development guiding framework
• The existence of well-established sectoral annual review mechanisms
• The pool of M&E professionals in the country is gradually increasing, as more and more organizations are recruiting experts specifically assigned to M&E
Strengths and Challenges to Institutionalizing Impact Evaluation
Opportunities
• Recent trends show growing recognition and
institutionalization of monitoring and evaluation
practices:
• A growing appreciation of evaluation across different
stakeholders and particularly encouragement of
participatory evaluation at different levels
• Institutionalization of M&E in non-government circles
• Increasing conduct of M&E training by organizations
• Incorporation of M&E courses in some training programs
Actors Involved in M&E
The Government of Ethiopia (GoE)
– National – MOFED, PMO, NBE
– Sector ministries
– Regional and Woreda administrations
Donors (through projects/ programs)
– Bilateral
– Multilateral
INGOs/ NGOs (through projects/ programs)
– Relief and Development
– Advocacy
– Professional associations
Academic and Research Institutions
The Private Sector/ business community
Communities
[Diagram: institutional map of actors in M&E, linking the Ministry of Finance & Economic Development, the Central Statistical Authority (with its branch offices, surveys/censuses and participatory poverty assessments), and the Welfare Monitoring Unit to sector ministries, donors, regional bureaus and programs, woreda desks, facilities and projects, grassroots communities, and INGOs/NGOs with their branch offices and projects.]
Welfare Monitoring System in Ethiopia
[Diagram: structure of the Welfare Monitoring System. A National Welfare Monitoring Steering Committee is chaired by the Ministry of Finance and Economic Development, with the Ministry of Agriculture and Rural Development, Ministry of Health, Ministry of Labour and Social Affairs, Ministry of Women Affairs, Ministry of Education and the Ethiopian Road Authority as members; the National Welfare Monitoring Technical Advisory Committee has the same composition ("same as above"). The Welfare Monitoring Unit draws on sector reports, the HHICES, PPAs and the Welfare Monitoring Survey from the Central Statistical Authority, as well as INGOs/NGOs, and serves data users. A note on the diagram reads "RCBP interest could be reflected here".]
The Status Quo in Impact Evaluation
• Impact Evaluation is not a common practice
• Some forms of evaluation are practiced, mostly to fulfil donor requirements
Can Impact Evaluation meet the sense of urgency that characterizes the need for development interventions in low-income countries?
• Development needs frequently require urgent/immediate assistance
• Programming in developing countries is geared towards fulfilment of basic needs
• The choice of area of intervention is hardly questionable
* It is difficult to make experimentation the main thrust of development programming
The Status Quo in Impact Evaluation
Is there national ownership of Impact Evaluation?
• Impact Evaluation implies there is a preconceived desired state
• Whose definition of 'development' matters?
• If the desired state is imposed, ownership of IE is lost
The Status Quo in Impact Evaluation
Is the Cure for development problems known?
• Good governance – participation, accountability,
transparency…
• Investment on education, health, agriculture,
road…
• Equity, peace, security…
• Ownership, partnership, harmonization…
*But the degree of effectiveness of specific programming in these areas is not easily known
The Status Quo in Impact Evaluation
The Impact Evaluation Dilemma
• There is a sense of urgency for development programming
• The appropriate cure depends on impact evaluation/experimentation
So, the focus of Impact Evaluation in LDCs should be
– not on identification of areas of investment
– rather on methods of delivery
The cure is known (the vaccine is identified). The main question is how best to deliver it.
The Status Quo in Impact Evaluation
Methodological Challenges
How to make the case for IE strong?
• Demonstrating practical benefits of Impact Evaluation
• Resources are scarce – how much to spend on something that is intuitively known to be good?
• Flexible and easy IE techniques/methods that are less costly in terms of time and resources
• IE aimed at guiding/improving decision making
– Show possible alternative uses of resources
The Status Quo in Impact Evaluation
Recommendations
• Make Impact Evaluation a parallel endeavour, not a major thrust of development programming – experimentation on a small scale
• Impact evaluation of key programs without disturbing regular programming, i.e., without creating discontinuity in the programs
• Impact Evaluation as a second-stage experimentation, with the focus on the approach to delivery, exploring practical alternatives
The Status Quo in Impact Evaluation
Recommendations
• Impact Evaluation in developing countries should have a strong element of comparison among known strategies, rather than a sole and narrow focus on the specific program that is the subject of evaluation
• IE should be designed to aid decision making on 'so what'
• From the supply side, effort should be made to expand the availability of easy and flexible IE tools that are less costly in terms of both time and resources
Strengths and Challenges to Institutionalizing Impact Evaluation
Strengths
• GoE national reports have greatly improved in quality (depth of analyses) and coverage (sectors, sub-sectors, themes) due to:
 Growing demand for and use of such reports
 DAG financial and technical support extended to MoFED and the use of professionals (consultants) for data analyses and report production
 Inclusion of governance, foreign aid, MDGs, environment, gender and other themes in the reports, although not to a sufficient degree
 Implementation of regular household surveys (HHICE, WMS, DHS)
 Improved capacity and performance on the part of the CSA
 Improved capacity and performance on the part of selected sectoral line ministries
• Great improvements in sectoral reports, especially Health and Education, aided by:
 Sector Management Information Systems
 Annual review mechanisms
 High political commitment (Health, for example)
 Improved standardization, rationalization and harmonization of indicators, data collection and reporting procedures
Strengths and Challenges to Institutionalizing Impact Evaluation
Progress
• Strong and growing capacity in statistical data collection
 CSA's impressive data collection program
 Annual and periodic regular surveys
 Long experience and institutional capacity
• Civil service reform, including business process reengineering (BPR)
 Streamlined tasks and responsibilities
 Result-oriented work planning
• Good practice of contracting out data collection and analyses
 Supported by the growing number and capacity of private consultants
Strengths and Challenges to Institutionalizing Impact Evaluation
Opportunities
• A strong culture of collaboration among beneficiaries in responding to studies, and gradual development in their level of active participation and articulate responses
Strengths and Challenges to Institutionalizing Impact Evaluation
Opportunities
• Availability of administrative data from sector ministries (e.g., education, health) and survey data from the Central Statistical Agency
• Cooperation among stakeholders (government and non-government alike) to share available data
• Structured societal organizations established down to the small community level
• Some level of established practice in using evaluation for planning purposes…
Strengths and Challenges to Institutionalizing Impact Evaluation
Challenges
• Lack of informed debate on local development perspectives and relevant evaluation conceptual frameworks and approaches, which leads to:
• Lack of consensus on the concept of development and its measurement criteria
• Lack of consensus on the concept of evaluation and its criteria
• Uniform application of evaluation techniques and lack of adaptation to specific cultural and behavioral contexts
Limitations/ Challenges
• Widespread traditional management practice that focuses on counting activities and outputs rather than on assessing higher-level results, i.e., poor results-based management practice. This is particularly reflected in:
• Absence of baseline data
• Poor feedback mechanisms
• Lack of informed decision making, or poor linkage between assessment and decision making
Strengths and Challenges to Institutionalizing Impact Evaluation
Challenges
• Data collection, analyses and reporting aimed at demonstrating achievements, with less focus given to analyses of constraints and challenges
• Significant discrepancies between administrative and survey data
Strengths and Challenges to Institutionalizing Impact Evaluation
Limitations
• Lack of linkage/integration across different sectoral M&E systems
• Different timing
• Different levels of reporting
• Duplication
• Sectors at varying stages of capacity and performance in evaluation practice
• Poor practice of verification methods/triangulation of data from different sources
Limitations
• Generating compelling and evidence-based results attributable to programming
• Large number and varying quality of indicators used in GoE reports, and challenges in discerning overall progress
• Missing reference comparisons or lack of standard reference points
• Missing indicators w.r.t. gender and disadvantaged groups (disabled, destitute, etc.)
Limitations/ Challenges
 Routine data collection at lower levels is cumbersome and unsystematic
 Simplifying data collection and ensuring timely use of data
 The progressive aggregation of data at each higher level is not necessarily conducive to data analyses
 No systematic integration of national data collection activities
 The need to support the national strategy for the development of statistics
 Integration of data collection systems (within and outside sectors)
Limitations/ Challenges
 Poor feedback and linkage with planning and decision making
• Weak common forums and linkages between data producers and users
Limitations/ Challenges
• Lack of standardization of survey methods, definitions of indicators and measurement tools
• Poor recognition of the importance of evaluation, as reflected by:
• Poor integration of an evaluation approach in program/project design
• Lack of earmarked budget for M&E
• Lack of earmarked human resources for M&E in established structures
• M&E usually undertaken as an add-on task
Limitations
 Generation of data disaggregated at the woreda level
– Different levels of reporting across different sectors
 No regular complementary qualitative information (PPA, citizen card, etc.)/public opinion
Limitations/ Challenges
 Poor maximization of benefits from data analyses:
– Gender-disaggregated data
– Thorough/in-depth analyses of data sets
– Timely analyses of data
• Inadequate practice and capacity to review and enforce good ethical standards in the undertaking of evaluations
• Teaching of evaluation not well recognized or integrated in the curricula of different disciplines
Limitations/ Challenges
 Poor institutional capacity, especially at lower levels:
 Fragile and overloaded
 No earmarked human and financial resources – M&E is usually an add-on task
 High staff turnover
Limitations/ Challenges
• Lack of an agency/home and networks for:
• developing and disseminating knowledge on evaluation approaches, tools and best practices
• adoption and popularization of internationally set goals and commitments (e.g., MDGs, conventions, declarations)
• adoption and popularization of established methodologies
• sharing experiences and exchanging ideas
Food Security Monitoring & Evaluation
GoE Food Security M&E System
• The FSCB has overall responsibility for programme M&E
• Food Security Programs M&E Plan
• Different stakeholders were involved in the process, and many had the opportunity to comment
• Simple and practical
• Four principles were applied:
– Simplicity vs. utility
– Process vs. product/outcome
– Decentralization vs. accountability
– Participation vs. rigour
GoE Food Security M&E System
• Food Security Programs M&E Plan
– Result Frameworks (recently revised based on two 2-day workshops involving GoE and donors)
– Monitoring Formats
• Activity reports (from community up to federal level)
• Quarterly financial and procurement reports developed by the FSCB in accordance with GoE accounting procedures and PSNP Procurement Guidelines
– Focal persons responsible at different levels
– Training given to focal persons on the Monitoring Formats
GoE Food Security M&E System
• Food Security Programs M&E Plan
– Programme Description
– FSP Logical Framework
– M&E System: Objectives and Approaches
– Institutional Roles and Information Flow
– Monitoring Guidelines and Methods
– Evaluation Guidelines and Methods
– Human Resource Needs and Training Plan
– Reporting Formats
GoE Food Security M&E System
• Monitoring and Evaluation Technical Task Force
– Comprises members from GoE and the donors financing food security programs
– Meets every two weeks
– Oversees the implementation of the M&E plan
– Reviews study designs and mobilizes the necessary technical assistance and capacity building resources
GoE Food Security M&E System
• JCC – meetings every fortnight to discuss and decide on various issues related to the implementation of the program (resource flows to the beneficiaries, targeting issues, capacity building and other pertinent issues as they arise).
• Rapid Response Mechanism field monitoring (usually with both FSCB and donor representatives) is undertaken on a monthly basis to examine ad hoc issues and constraints as they arise. Team reports identifying issues and recommendations are presented to the JCC for consideration.
GoE Food Security M&E System
• An Information Centre has been set up and staffed in the FSCB to regularly follow up on problems at woreda and regional levels (particularly with respect to the flow of funds to woredas and payments to beneficiaries).
• Joint Implementation Support Missions are conducted twice each year (May and October) to review progress with program implementation per se and with related capacity building actions (e.g., financial management, public works, woreda planning and quality control, etc.). The FSCB presents financial and technical progress reports during the missions, as well as reports prepared by the RRM Teams. Additional reports may be produced as required, including reviews of procurement.
GoE Food Security M&E System
• Surveys and Studies
– Baseline Survey
– Annual Survey
– Public Work Reviews
– Food Aid Assessments
– Other specific studies
• Targeting
• Institutional Assessment
• Impact
GoE Food Security M&E System
Other data sources:
• Annual Agricultural Statistics
– Crop forecast, actual crop assessment
– Woreda-level Ethiopian Agricultural Sample Enumeration (EASE)
• Agricultural Census
• DPPA Early Warning System
• Vulnerability Profiles