Presented by CIDA on behalf of the Task Team on Multilateral Effectiveness.

2009
◦ A large Task Team on Multilateral Effectiveness was established to explore and further develop a proposal to strengthen information on the development effectiveness (DE) of Multilateral Organizations (MOs)
  ◦ A smaller Management Group (MG) (US, UK, SADEV, CIDA, WB, UNEG & MOPAN) was created to carry out the work more easily

2010
◦ The methodology and approach were pilot tested (ADB and WHO) under the guidance of the MG/Task Team
◦ A draft report was submitted to the Network in November
  ◦ The Network provided guidance to the Task Team on completing the pilot test phase

The Task Team was requested to:
1) Finalize the pilot test (ADB and WHO)
2) Build stronger links with MOPAN and examine the complementarity of results with the MOPAN 2010 assessments
3) Refine the methodology and guidelines
4) Further engage with MOs and with all stakeholders
5) Hold a Management/Steering Group meeting in spring 2011 to take stock and prepare a response to the Network on lessons and future steps, taking into account developments by MOPAN

◦ The Technical Team discussed strategies for complementarity and convergence with the MOPAN Technical Working Group
◦ Received MOPAN reports on ADB and WHO and prepared a draft paper on complementarity
◦ Held a workshop of the Technical Team (CIDA, SADEV, and DFID, with a MOPAN representative) to discuss possible revisions to the pilot test report, the paper on complementarity with MOPAN, and the methodology and guidelines
◦ ECG and UNEG were invited to participate in the MG; however, they reconsidered their participation and provided their regrets
◦ The MG (US, UK, SADEV, CIDA, MOPAN) met and agreed on revisions to the pilot test report, the paper on complementarity with MOPAN, and the methodology and guidelines

1. Focused more directly on the implications of the results for the validity and utility of the meta-evaluation methodology
2. Clarified that the results reported are illustrative and used only to test the approach and methodology
3. Classified the validity of the findings for each criterion as illustrated by each pilot case

Assessment Criteria | Highly Satisfactory | Satisfactory | Unsatisfactory | Highly Unsatisfactory | N of 25
1.1 Evaluation Systems Effective | 0% | 44% | 25% | 31% | 16
1.3 RBM Systems Effective | 0% | 0% | 100% | 0% | 3
2.1 Programs Suit Needs of Target Group | 44% | 44% | 11% | 0% | 18
3.1 Programs Achieve Stated Objectives | 0% | 71% | 29% | 0% | 21
3.2 Effective in Gender Equality | - | - | - | - | 0
3.3 Environmentally Sustainable | - | - | 100% | - | 1
4.2 Objectives Achieved On Time | 0% | 20% | 80% | 0% | 5
5.1 Positive Impacts | 7% | 57% | 36% | 0% | 14
5.2 Unintended Negative Changes | 0% | 78% | 22% | 0% | 9
6.1 Benefits Likely Sustainable | 18% | 55% | 18% | 9% | 11

◦ The proposed approach is workable and can be implemented within the estimated time (6-8 months) and resource requirements (USD 125,000) with little burden on the MO
◦ Where the MO produces an adequate volume of evaluation reports over a 3-4 year period covering a significant portion of investments:
  ◦ The approach works well and results can be generalized about the MO's DE
◦ Where an adequate number of evaluation reports is not available and coverage of activities cannot be estimated:
  ◦ Results on DE are interesting but harder to generalize
◦ At the completion of the pilot test, there were opportunities to improve the methodology by refining the process and instruments

◦ Continued participation by MOPAN in the MG and dialogue between the Technical Team and the MOPAN Technical Working Group
◦ Identified MOPAN Key Performance Indicators (KPIs) and Micro-Indicators (MIs) that can be compared to the criteria of the proposed approach
◦ Compared results of the pilot test to the survey results reported by MOPAN

◦ Where pilot test and MOPAN Survey criteria are comparable (for seven of the 19 tested criteria):
  ◦ Results are often in agreement (e.g. on the strength of the evaluation function)
◦ Where results are not in agreement:
  ◦ This can be explained by the different time frames and organizational levels examined by each approach (e.g. on the strength of RBM systems)

◦ The two approaches focus on different aspects of MO effectiveness and rely on different information sources
◦ Results are complementary rather than in conflict
◦ Together, they can provide a more complete picture of an MO's overall performance

◦ Reorganized and focused the criteria more directly on development effectiveness
◦ Clarified the process for selecting options for strengthening development effectiveness information
◦ Refined the quality assurance criteria

1. The achievement of development objectives and expected results (including policy impacts)
2. Cross-cutting issues: inclusive development that is gender-sensitive and environmentally sustainable
3. The sustainability of benefits and results achieved
4. The relevance of MO activities and supported projects and programs
5. The efficiency of MO operations in support of projects and programs
6. The use of monitoring and evaluation to improve development effectiveness

Preliminary Review: review of essential documentation

Scenario A: MO reporting on DE is adequate
  ◦ Option 1: Rely on MO reporting systems
Scenario B: MO reporting on DE is not adequate, but the evaluation function is
  ◦ Option 2: Conduct a systematic synthesis of information from available evaluations, applying the meta-synthesis of evaluation results methodology
Scenario C: MO effectiveness reporting and available evaluations are inadequate for reporting on DE
  ◦ Option 3: Implement actions aimed at strengthening the MO evaluation system and DE reporting

Evaluation Quality Assurance Criteria
A. Evaluation purpose clearly stated
B. Report organized, transparently structured, well written
C. Evaluation objectives stated
D. Evaluation subject clearly described
E. Scope of the evaluation clearly defined
F. Evaluation criteria clearly identified
G. Multiple lines of evidence used
H. Evaluation well designed: methods appropriate to the criteria used
I. Evaluation findings and conclusions relevant and evidence-based
J. Evaluation report indicates limitations of the methodologies
K. Evaluation report includes specific recommendations

◦ Recognize that the pilot test has been successful and acknowledge that the methodology is available for bilaterals, multilaterals, or others to apply, as desired
◦ In light of the point above, recognize that others can build on the methodology and guidelines, with greater participation by members of MOPAN and individual MOs, as further work to assess the effectiveness of MOs is undertaken
◦ Acknowledge that some donors and groups of donors may move forward to apply the methodology (jointly or individually; sequentially or concurrently with MOPAN)
◦ Explore over time how to institutionalize or formalize the capacity and responsibility for assessing the DE of MOs with MOPAN or other organizations
◦ Explore further engagement with the MOs on approaches and vehicles for assessing their DE

◦ Are the approach and methodology acceptable to the Network?
◦ Can agencies using the approach and methodology indicate that it has been endorsed by the Network?
◦ What is the level of interest among member agencies of the Network in leading or participating in reviews of MOs using the approach and methodology (as revised)?