AGENCY-LEVEL MEASUREMENT:
Can it work, and does it matter?
ACDI/VOCA
M&E Community of Practice
ACDI/VOCA Vision
• Vision
A world in which people are empowered to succeed in the global
economy.
• Mission
To promote economic opportunities for cooperatives, enterprises
and communities through the innovative application of sound
business practice.
• Our Work
– Financial Access
– Enterprise Development
– Community Development
– Food Security
– Agribusiness
Results-Based Orientation
• M&E systems at ACDI/VOCA have two primary purposes:
accountability to stakeholders and learning to facilitate
results-oriented program design and knowledge-based
management.
– Accountability refers to measuring the efficiency, effectiveness, relevance, and sustainability of projects, transparently disclosing evaluation findings to stakeholders, and using those findings to inform resource allocation and other decisions.
– Learning refers to the systematic generation of knowledge that
can be used to refine designs and introduce improvements into
future efforts with the goal of achieving meaningful and
sustainable results. M&E systems should ultimately demonstrate
that ACDI/VOCA’s activities are contributing to its vision and
producing sustainable and equitable development results for the
communities in which it works.
Results Based Guiding Principles
• Integrated into program management and technical approaches;
• Systematic, so as to produce reliable, accurate and useful information, while also
• Flexible to the complex environments in which ACDI/VOCA operates;
• Participatory, engaging stakeholders and beneficiaries;
• Ethical in the collection and reporting of information about and to beneficiaries, partners and stakeholders (confidentiality and anonymity; do no harm);
• Supportive of building the local capacity of staff, partners and beneficiaries in M&E;
• Fostering inclusivity and equity of program interventions for both men and women and traditionally marginalized groups; and
• Results-oriented: in addition to focusing on implementation processes, ACDI/VOCA M&E systems will emphasize results by asking the "so what" questions.
Capturing Performance Indicators on a
Statistical Canvas (CaPISC)
• CaPISC is an online database that captures the performance monitoring plans for all of ACDI/VOCA's 70 active projects. It helps ACDI/VOCA measure progress toward goals at the project and practice-area levels, supporting our attention to accountability and learning. The system will provide ACDI/VOCA with agency-wide performance measures for:
• Reporting to stakeholders and donors
• Communicating with the development community
• Managing regional and country programs through comparative
performance measures
• Measuring beneficiaries reached by AV intervention
• Measuring integration of activities
• Reporting on company-wide common indicators
• Managing practice areas and individual projects
• Developing new business (marketing)
Why Global Performance Indicators?
• To assess the performance of its projects, A/V would
like to use a comprehensive performance
measurement framework.
• ACDI/VOCA will track the progression of its performance through results reporting against indicators from the project level (predominantly outputs) to the broader program level (predominantly outcomes).
• The indicators will be standardized to the greatest
degree possible, thus allowing ACDI/VOCA and its
partners to compare performance across the
portfolio of projects.
A Starting Point
• Global Level Indicators
– Per capita expenditures (as a proxy for income) of USG
targeted beneficiaries
– Number of jobs attributed to A/V implementation
– Number of hectares of natural resources showing
improved biophysical conditions as a result of USG
assistance
– Percent of beneficiaries adopting xx number of improved
practices/technologies
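To make the roll-up concrete, here is a minimal Python sketch of how standardized indicator values reported by individual projects could be aggregated to an agency-wide figure. The record fields and the rollup helper are illustrative assumptions, not CaPISC's actual schema.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class IndicatorRecord:
    # Hypothetical fields for illustration; not CaPISC's actual schema.
    project: str
    indicator: str   # e.g. "jobs_attributed" or "hectares_improved"
    period: str      # reporting period, e.g. "FY2012"
    value: float

def rollup(records, indicator, period):
    """Sum one standardized indicator across projects for a reporting period."""
    by_project = defaultdict(float)
    for r in records:
        if r.indicator == indicator and r.period == period:
            by_project[r.project] += r.value
    return sum(by_project.values()), dict(by_project)

# Example: agency-wide total plus a per-project breakdown for comparison.
records = [
    IndicatorRecord("Project A", "jobs_attributed", "FY2012", 1200),
    IndicatorRecord("Project B", "jobs_attributed", "FY2012", 450),
]
agency_total, per_project = rollup(records, "jobs_attributed", "FY2012")
```

Standardizing the indicator names is what makes this kind of cross-portfolio comparison possible.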
Goals of CaPISC
• CaPISC allows for current project reporting, enabling more responsive management and program adaptation.
• Reporting with CaPISC will allow for enhanced program and institutional learning.
Uses of CaPISC as a Results-Based Management Tool
• Visually depict progress on project goals and indicators
• Determine progress on key project indicators (COP dashboards)
• Provide tools to help you analyze your data
• Create reports for donors
Use CaPISC to Help in New Business
Development
• Illustrate results over time and across projects
• Show past experience and results
• Demonstrate capability using indicator data
• Model PMPs after similar ones in the system
– Uniform terminology
– Global indicators
• Indicator library
Challenges/Lessons Learned
• It has been a long path to persuade management to invest in this database.
• It is taking some time to train project-level M&E specialists on how to use the system and enter data.
• Data quality (especially data that comes from the COs).
• Resources (human and financial).
InterAction Forum 2012
Measuring Agency-Level Results
CRS
Beneficiary and Service Delivery Indicators (BSDI)
May 1, 2012
Beneficiary and Service Delivery Indicators (BSDI)
Output-level indicators that measure agency-wide performance on CRS fundamental program outputs:
• Services Delivered
• Beneficiaries Served
Illustrative Program Area and Services Catalogue:
Water and Sanitation
BSDI Data Flow Map
• Point of Service Delivery: beneficiary and service delivery data are registered; typically data are recorded on paper for each individual beneficiary.
• Regional/Partner Office: beneficiary and service delivery data are recorded and records are updated; typically data are transcribed to Excel files.
• Project Management Office: typically data are stored in a project database.
• On-line Agency Database (PIMS): beneficiary and service delivery data are stored, maintained, analyzed and reported.
• Donor: receives agency-level reports.
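As an illustration of the flow above, the sketch below models a service record being handed from one stage to the next. The stage names come from the map; the record structure and the advance function are assumptions made for the example.

```python
from dataclasses import dataclass, field
from typing import List

# Stage names taken from the data flow map above; everything else is illustrative.
STAGES = [
    "Point of Service Delivery",        # paper records per beneficiary
    "Regional/Partner Office",          # transcribed to Excel files
    "Project Management Office",        # stored in a project database
    "On-line Agency Database (PIMS)",   # stored, maintained, analyzed, reported
    "Donor",                            # receives reports
]

@dataclass
class ServiceRecord:
    beneficiary_id: str
    service: str
    trail: List[str] = field(default_factory=list)  # stages the record has passed through

def advance(record: ServiceRecord) -> None:
    """Hand the record to the next stage in the BSDI flow, keeping an audit trail."""
    if len(record.trail) < len(STAGES):
        record.trail.append(STAGES[len(record.trail)])

rec = ServiceRecord("HH-0042", "water point rehabilitation")
while len(rec.trail) < len(STAGES):
    advance(rec)
# rec.trail now lists every hand-off from point of service through donor reporting.
```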
BSDI Performance Products
BSDI allows accurate, annual agency-wide tracking of CRS performance: the "number" and "% of targeted" direct beneficiaries served, by
– sub-program area service categories
– geographic location (GPS) of the point of service delivery
– gender and other beneficiary demographic characteristics
BSDI allows integrated technical and financial performance analysis.
BSDI will track "double counted" beneficiaries in multi-sector projects.
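One way to read the double-counting point: if every beneficiary carries a stable ID across sectors, counting unique IDs yields an unduplicated total alongside per-sector counts. The Python sketch below assumes such an ID scheme; the data layout is hypothetical.

```python
# Hypothetical per-sector lists of served beneficiaries, keyed by a stable beneficiary ID.
services_by_sector = {
    "Water and Sanitation": ["B001", "B002", "B003"],
    "Agriculture":          ["B002", "B004"],
}

per_sector_counts = {sector: len(set(ids)) for sector, ids in services_by_sector.items()}
unique_ids = set().union(*services_by_sector.values())

unduplicated_total = len(unique_ids)                                   # each person counted once
double_counted = sum(per_sector_counts.values()) - unduplicated_total  # overlap across sectors
```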
Mission Metrics
[Cycle of Learning diagram: Incoming Data → Take Action]
Measuring Agency Level Results
InterAction Forum 2012
May 01, 2012
Save the Children
Purpose, key drivers and elements of Agency-Level Measurement

Purpose
• Measure progress towards the multiyear strategic plan, specifically to support an overarching objective of "accountability for results"
• Communicate with staff and board on performance against our intended results for children within the context of our "Theory of Change"

Key Elements
• Operational metrics (e.g., budget performance, revenue generation, brand recognition, media rating, etc.)
• Programmatic and policy metrics (e.g., Reach, Global Indicators around our Theory of Change)

Key Drivers
• Accountability to ourselves and to others (specifically children)
• The need for a tool for management purposes for the senior leadership of the organization (including the board)

Audience and Use
• Senior Management Team/Board – a dashboard (accountability)
• Staff at all levels (global, regional, CO staff) – learning, new business development
• Public/Donors – a public report (accountability)
THEORY OF CHANGE: how we work to create impact for children
We will…
… be the voice: advocate and campaign for better practices and policies to fulfill children's rights and to ensure that children's voices are heard (particularly those children most marginalized or living in poverty)
… be the innovator: develop and prove evidence-based, replicable breakthrough solutions to problems facing children
… build partnerships: collaborate with children, civil society organizations, communities, governments and the private sector to share knowledge to ensure children's rights are met
… achieve results at scale: support effective implementation of best practices, programs and policies for children, leveraging our knowledge to ensure sustainable impact at scale
Understanding Results

Results: the output, outcome or impact (intended or unintended, positive and/or negative) of a development intervention.

Input
– Definition: the financial, human, and material resources used for the development intervention.
– Examples: $ spent, staff time

Output
– Definition: the products, capital goods and services which result from a development intervention; may also include changes resulting from the intervention which are relevant to the achievement of outcomes.
– Examples: # health workers trained, teachers trained

Outcome
– Definition: the likely or achieved short-term and medium-term effects of one or more interventions' outputs.
– Examples: children accessing life-saving interventions; policy change implemented

Impact
– Definition: positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended.
– Examples: children's lives saved; under-5 mortality rate over district, region or country

Related measurement components: Reach, Global Indicators, Evaluations
Components of our program/policy results monitoring
system (towards our strategic goals)
Reach
• Measures the number of children and adults who have received program inputs and/or services, directly or indirectly, from Save the Children and its partners in the fiscal year, disaggregated by theme and sub-theme area (illustrated in the sketch after this list).
Global Indicators
• A set of outcome indicators linked to Save the Children’s global
outcome statements which measure achievement of outcomes.
Advocacy Measurement
• A tool to measure progress against an advocacy objective aimed at
policy change and/or implementation
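A small sketch of the Reach tally described above, counting beneficiary records by theme, sub-theme, and direct/indirect status. The field names and sample records are illustrative assumptions, not Save the Children's actual data model.

```python
from collections import Counter

# Hypothetical beneficiary records for one fiscal year.
beneficiaries = [
    {"theme": "Education", "sub_theme": "Basic Education", "direct": True},
    {"theme": "Education", "sub_theme": "ECCD",            "direct": False},
    {"theme": "Health",    "sub_theme": "Newborn Health",  "direct": True},
]

reach = Counter(
    (b["theme"], b["sub_theme"], "direct" if b["direct"] else "indirect")
    for b in beneficiaries
)
# e.g. reach[("Education", "Basic Education", "direct")] == 1
```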
Example of Global Indicators
Intended Impact: Children learn and develop with age appropriate
care and education
Outcome Statement: In support of MDG2, by 2015, Save the Children
will have significantly contributed to getting 2 million of the hardest to
reach children into school, raising the Quality of Learning
Environments (QLE), and improving the learning outcomes of more
children
Indicators:
• Quality of LE: % of Basic Education and Early Childhood Care
and Development schools / sites supported by SC that achieve 4
guiding principles
Lessons learned and key challenges
• Buy-in and support from senior leadership at all levels (global and country) is critical for the success of such a system.
• The agency dashboard is used to discuss results quarterly, informing decision making on a regular basis.
• Data quality and management can be overwhelming; capacity (at all levels) and resource constraints make it difficult to institute a strong system at all levels.
• Shared global objectives have helped to establish shared metrics across the Save the Children Global Movement; however, determining attribution is a challenge.
• Keeping the balance between a concise dashboard (with few metrics) and a comprehensive picture of our results for children and the complex nature of our work.
• Reach figures do not tell us about our program impact, but measuring outcome- and impact-level indicators is resource intensive.
Contact
Maby Palmisano
Senior Director, Monitoring and Evaluation, ACDI/VOCA
[email protected]
Harry "Hap" Carr
Senior Technical Advisor for Monitoring and Evaluation, CRS
[email protected]
Barbara Willett
Senior Technical Advisor – DM&E, Mercy Corps
[email protected]
Muluemebet Chekol
Senior Director, Monitoring & Evaluation, Save the Children
[email protected]