Applying Quality Standards in Impact Evaluation: Case of CARE Program Quality Framework and Evaluation Policy

Ahmed Ag Aboubacrine
Josephine Kainessie
Bockarie Sesay
Dr. Moses Lahai
Patrick Robin
DME Unit – CARE Sierra Leone

5th AFREA/NONIE/3IE Conference – Cairo, 31st March – 2nd April 2009
CARE International Program Quality Framework (http://pqdl.care.org)
• Our Vision, Mission & Values
• Programming Principles
• Evaluation Policy
• Program Quality Standards
• Core Guidelines (HLS, RBA, UF, DME)
• Sector/Technical Guidelines
EVALUATION POLICY

CARE International Principles
• Relevance (focus on what is important)
• Participation (of community representatives)
• Focus on impact on the lives of people (significance)
• Credibility (objective and reliable methods)
• Integrity (ethical standards)
• Transparency (willingness to share findings)
• Independence (of evaluators)
DAC Principles
• Purpose of Evaluation
• Impartiality & Independence
• Credibility
• Usefulness
• Participation of Donors and Recipients
• Donor Co-operation
• Evaluation Programming
• Design and Implementation of Evaluations
• Reporting, Dissemination and Feedback

AFREA Guidelines
• Utility
• Feasibility
• Propriety
• Accuracy
• Evaluation Accountability
EVALUATION POLICY

CARE Evaluation Policy Lines
1. Responsibility of COs
2. Consistent with CI Principles (3 & 6) and Standards (10)
3. Test the relationship with CI's Vision and Mission and the MDGs
4. Analysis of the degree and consequences of implementation of the CI PQF (SP, UF)
5. Follow professional interagency standards ("speak a common language")
6. Significant participation and a high level of influence of participants and stakeholders
7. Evaluation completeness
8. Conducted openly and in a transparent manner
9. Follow-up and accountability
10. Evaluation is a priority → CB + Rigor + Use
11. Generating the resources required by the EP
DAC Standards
1. Rationale, purpose and objectives of an evaluation
2. Evaluation scope
3. Context
4. Evaluation methodology
5. Information sources
6. Independence
7. Evaluation ethics
8. Quality assurance
9. Relevance of the evaluation results
10. Completeness

And …?
Other Evaluation Standards & Guidelines (DFID, Sphere, etc.)
Lessons Learnt in Practice
• The Evaluation Policy as the main guide when designing both the intervention and its evaluation (ToRs)
• Operationalize the policy requirements in the evaluation design (in the technical offer) – use a checklist (see the sketch after this list)
• Adherence to the standards (staff, consultants, donors)
• Mixed methods (no single method!) applied by separate experts working as a team (a quantitative study followed by an in-depth qualitative assessment)
• Impact measurement vs. the participation principle
• Evidence vs. ownership / sustainability
• Seeking impact vs. inventing impact
• Independence (internal / external)?
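As an aside on the "use a checklist" lesson above, one way to operationalize policy requirements against a draft ToR is to encode them as a simple checklist and flag the gaps. The sketch below is purely illustrative: the policy items, field names, and the review_design helper are invented for this example and are not part of the CARE Evaluation Policy or PQF.

```python
# Illustrative sketch only: encoding a few evaluation-policy requirements as a
# checklist and reviewing a draft evaluation design (ToR) against it.
POLICY_CHECKLIST = [
    ("participation", "Do participants and stakeholders have significant influence?"),
    ("transparency", "Will findings be shared openly?"),
    ("independence", "Is the evaluator sufficiently independent?"),
    ("completeness", "Does the scope cover all policy-relevant questions?"),
]

def review_design(design):
    """Return the checklist questions the draft design does not yet address."""
    return [question for key, question in POLICY_CHECKLIST if not design.get(key)]

# Example: a draft ToR that so far addresses participation and transparency only.
draft_tor = {"participation": True, "transparency": True}
for gap in review_design(draft_tor):
    print("Gap:", gap)
```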
Influencing Factors
• Capacity Constraints
• Human Factor (agendas, skills, competencies, etc.)
• Data Collection and Analysis Methods
  – Analysis of priorities (felt / normative / relative needs)
  – Analysis of impact (measured vs. perceived)
• Ad-hoc external evaluation vs. action research throughout the lifetime
Rethinking the Standards
New Challenges
• Evolution of thinking (IE)
• Strategic Impact Inquiries
• Project-to-Program Shift (P2P)
• Choosing appropriate impact measurement methods
• Review of the Program Quality Framework?
Use of Evaluation Standards in a Post-Conflict Context

Opportunities
• Alignment is still possible (evaluation policies, the Paris Declaration and the Accra Agenda for Action on aid effectiveness)
• Emerging trend of evaluation and accountability by aid agencies

Constraints
• Contextual limits to evaluation utilization (using evaluation to influence decision makers)
• Capacity development
• Persistence of an emergency culture (dependency)
• Not process-oriented
"Do" and "Don't" in Using Evaluation Standards

DO
• Question your design through the lens of your EPs
• Select what would be mandatory
• Contextualize (set a level of compliance for each principle / standard; see the sketch after this list)
• Promote attitudes (thinking evaluatively)
• Work with qualified academic / research people and/or institutions
• Allocate resources and time
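To make the "select what would be mandatory" and "contextualize" points concrete, here is a minimal, purely illustrative sketch of a per-principle compliance plan. The level names and principles are assumptions for this example, not CARE or DAC terminology.

```python
# Illustrative sketch only: assigning each principle a context-specific
# compliance level and flagging the mandatory gaps first.
from enum import Enum

class Level(Enum):
    MANDATORY = 1
    RECOMMENDED = 2
    OPTIONAL = 3

# Example plan, e.g. adjusted for a post-conflict setting.
compliance_plan = {
    "credibility": Level.MANDATORY,
    "independence": Level.MANDATORY,
    "participation": Level.RECOMMENDED,
    "donor_cooperation": Level.OPTIONAL,
}

def mandatory_gaps(plan, met):
    """List mandatory principles that the current design has not yet met."""
    return [p for p, level in plan.items() if level is Level.MANDATORY and p not in met]

print(mandatory_gaps(compliance_plan, met={"credibility"}))  # -> ['independence']
```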
DON'T
• Think that everything is feasible
• Wait for the evaluator to apply the EP
• "Oh!... that's the job of the M&E Officer"
• Think that your evaluation should always be perfect (there are always limits!)
For more resources, visit: http://pqdl.care.org
THANKS!
Questions?
© 2005, CARE USA. All rights reserved.