Performance Expectations of Programmatic/M&E Experts

Performance Expectations of LFA Programmatic/M&E Experts
LFA M&E Training
February 2014

Session objectives
• Clarity on the role of the LFA programmatic health/M&E expert in collaboration with the Global Fund Country Team;
• A common understanding of performance expectations;
• Concrete examples of best practices.

Session structure
1. Small Group Discussion: LFA Self-Assessment (45 min)
2. Clarifying roles and expectations (30 min)
3. Revised PET (5 min)
4. Scenario Case Studies and Discussion (25 min)

LFA self-assessment: background
• The LFA M&E/Programmatic expert plays a key role by verifying PR results, identifying potential risks to the grant or program, and translating key contextual information to improve decision-making.
• This information is most frequently transmitted to the CT through reporting documents, including the OSDV, RSQA, PU/DR, and LFA assessments.

LFA self-assessment: directions
We will present four Guiding Questions. Please take the next 25 minutes to discuss them in small groups (tables). Then we will reconvene to share our answers (20 min).

LFA self-assessment: guiding questions
• Describe a time when you felt that you had added value to a particular deliverable or outcome. Why do you think you added value?
• Describe a time when you felt that you had missed an opportunity to add value. Why do you think you were not able to add value?
• What could you have done differently in order to add value?
• What could the Secretariat Country Team have done differently to have helped you add value?

Structure of Secretariat CT vs. LFA team

SECRETARIAT CT: FPM; PO; Finance Specialist; Legal Officer; PSM Specialist; Public Health/M&E Specialist

LFA TEAM: Team Leader; Finance Expert; PSM Expert; LFA Programmatic/M&E expert

Roles and responsibilities

Global Fund Public Health/M&E Specialist:
• Advise on programmatic aspects of grants
• Negotiate performance targets and establish/follow up on management actions/conditions
• Assess progress to make funding decisions and identify areas for improvement
• Enable M&E systems strengthening and technical assistance provision
• Identify and respond to M&E and programmatic risks in the portfolio

LFA Programmatic/M&E expert:
• Provide information about PR capacity in M&E and program implementation
• Verify programmatic results for disbursement and renewals recommendations
• Identify potential M&E/programmatic risks, including data and service quality issues
• Verify progress against management actions/conditions
• Identify relevant changes in country context (e.g. epidemiological, policy or political changes)

LFA provides essential information for decision-making

[Diagram] The LFA Programmatic/M&E expert draws on site data, site visits, the community and local media, and transmits inputs for improving the grant/program to the Global Fund Public Health/M&E Specialist through the LFA assessment, RSQA, OSDV, PU/DR and other communications.

Information sources for decision-making

Global Fund Public Health/M&E Specialist:
• National Strategic Plan
• M&E plan
• Evaluations and program reviews
• Technical partners
• 0-4 country visits/year

Imagine being a PHME Specialist at the Secretariat. Could you fulfill your responsibilities based on these information sources alone? If not, what would you need?

Common information gaps
• What is “on paper” versus actually being implemented?
• What works well within the system and what are the gaps?
• Why are there gaps in service delivery coverage or quality?
• Are there any current or potential changes in the country context (political, economic, social) that may affect the program?
• Are there key groups or populations within the country that are not being reached? What are the barriers to reaching them? Does the policy and program framework address these barriers?
• What are the relationships like between PR and SRs, and how can those be strengthened to improve the program?

What are effective LFA practices?

LFA Deliverable: Program Design
Examples of Best Practices:
• Proactively identify programmatic gaps and advise the Secretariat
• Note when practice differs from guidelines/protocols; the LFA is best placed to provide guidance on realities in country
• Possess practical knowledge of the M&E system in-country and be able to advise the PH/M&E officer on its actual functioning, e.g. data completeness (does reporting cover public/private sector sites? community?), data integration (e.g. disease-specific with HMIS?), systems (e.g. computerization? at what levels?)
• Possess knowledge of in-country stakeholders beyond PRs and be able to facilitate meetings for the PH/M&E officer if necessary
• Proactively monitor implementation of studies, surveys and evaluations in-country and share reports with the Secretariat

What are effective LFA practices?

LFA Deliverable: Grant-Making
Examples of Best Practices:
• Actively participate in grant-making and provide the Secretariat with advice on issues that may affect implementation
• Advise on indicator selection based on difficulties noted in PU/DRs (e.g. data sources, indicator definitions, definition of the package of services, reporting flows)

LFA Deliverable: Grant implementation
Examples of Best Practices:
• Provide detailed descriptions of the M&E system, including examples of recording and reporting forms
• Make recommendations within the grant context of the upcoming review
• Identify the needed strengthening measures and/or the budget item which could provide the resources
• Triangulate information with the LFA PSM and LFA Finance experts

What are effective LFA practices?

LFA Deliverable: PU/DR
Examples of Best Practices:
• Identify issues that are not strictly related to the task at hand but are nevertheless important
• Make the link between performance and funding, in particular in case of large discrepancies between the grant expenditure rate and target achievement
• Proactively identify issues with indicator definitions or the reporting system, and propose sound recommendations so that these can be corrected for the next reporting period
• Relate overall performance to progress towards outcome/impact
• Identify M&E risks and mitigation measures

LFA Deliverable: OSDV/RSQA
Examples of Best Practices:
• Good selection of indicators and a well-documented methodology to be used, including the source documents and any proxy indicators
• Link the OSDV proposal (indicator/site selection) with issues identified in past OSDVs and PU/DRs

What are less effective LFA practices?

LFA Deliverable: Grant-Making
Examples of Less Effective Practices:
• LFA attempts PF negotiation (rather than the PH/M&E officer)
• LFA “plays the role” of the PR
• LFA is unprepared to provide the necessary support to the PH/M&E officer

LFA Deliverable: Grant implementation
Examples of Less Effective Practices:
• Use of external consultants for short-term assignments which require specific knowledge of the country disease program
• Findings based on limited experience (e.g. one site) generalized into an overall finding; findings need to be contextualised
• Lack of prioritization of recommendations, presenting a “laundry list” instead

What are less effective LFA practices?

LFA Deliverable: PU/DR
Examples of Less Effective Practices:
• Lack of specificity/consistency, e.g. “target in PF does not match PU/DR target”
• Vague, non-committal language which does not give confidence in the findings, e.g. “the LFA considers the PR's explanations plausible”
• Lack of comprehensive investigation into the issues identified and their underlying factors
• Repeating what has been noted by the PR (in PU/DRs), or repetition across several findings; findings should be grouped/classified, as this facilitates analysis and the formulation of recommendations [relevant findings should be cited]
• Numerators/denominators are not provided (or not verified) for percentage targets
• Where the PF is quarterly and the PU/DR covers the semester, the LFA does not ensure that both quarters are reported in the PU/DR
• For issues with indicator definitions or reporting, the LFA makes recommendations to the PR without checking beforehand with the Secretariat

What are less effective LFA practices?

LFA Deliverable: OSDV/RSQA
Examples of Less Effective Practices:
• Vague, non-actionable recommendations (e.g. “Supervisions should be strengthened”)
• Deviation from what was agreed in the OSDV Proposal without informing the Secretariat
• Unrealistic timelines

Plenary discussion: information gaps and communication practices
• What are other important communication gaps?
• What are good practices in communication between the LFA and the Secretariat, and between the LFA and the PR?
• Which communication practices are less effective?

LFA communication
Communication skills are critical to building and managing relationships with stakeholders:
• Build constructive and professional relationships based on mutual respect and transparency
• Manage the expectations of other parties
• Maintain sufficient professional distance to make judgments in the best interest of the GF
• Clarify your role and responsibilities.

LFA PET
• The LFA Performance Evaluation Tool (PET) is a management tool for providing regular, structured and meaningful feedback to LFAs on their performance.
• Objective: LFAs deliver consistent, high-quality, tailored-to-risk, timely, relevant and best-value services to the Secretariat, in line with the Secretariat's expectations and requirements.

LFA PET – key changes

Current PET:
• PETs are completed for certain key LFA services.
• Only the 6 main LFA services were evaluated (PU/DR, PR assessment, Grant Renewal, OSDV, Audit report-related services and Grant Closure).

Revised PET:
• Mandatory: 2 PETs per year (1 PET per year for countries with an LFA budget of ≤$350k), covering all services provided by the LFA during that period.
• Voluntary: a separate PET for a specific service, or more frequent overall feedback.
• Feedback covers overall performance and all services provided by the LFA during a specific period.

LFA PET – key changes (cont’d)

No major changes in the areas of performance to be assessed:
• Completeness/accuracy/clarity
• Analysis and consistency
• Practicality of recommendations
• Timeliness/responsiveness/proactivity/communication
• Other

Revised PET:
• The average rating is calculated automatically.
• The FPM will have the possibility to adjust the automatically calculated rating, if necessary.

LFA PET – key changes (cont’d)

Current PET:
• LFA response/comments are not included in the form.

Revised PET:
• LFA response, comments and proposed action plan will be included in the form;
• If no LFA comments are received within 15 days of submission to the LFA, the PET is closed automatically;
• Final approval from the FPM will be added to reflect on the LFA comments/response;
• The FPM/CT will have a chance to manually change the rating and add comments, if considered necessary;
• Key submission dates will be saved automatically in the form.

Case study exercise
• Please find on your table four short scenarios.
• Please review and discuss the scenario which is highlighted. Each table will only discuss one of the four scenarios.
• Following your review of the case study, please discuss this question: “What should the LFA do, and what should the LFA not do in this scenario?”

Final comments or questions?
Thank you!