iCCM Embedded Research


Transcript: iCCM Embedded Research

Improving data to improve iCCM programs:
Implementation of a data quality and use
package in Malawi
Presenter: Emmanuel Chimbalanga, Save the Children Malawi & IMCI Unit,
MOH Malawi
On behalf of the Malawi CCM-IDIP working group: Jennifer Bryce, Emmanuel Chimbalanga, Tiyese
Chimuna, Kate Gilroy, Tanya Guenther, Elizabeth Hazel, Angella Mtimuni, Humphreys Nsona
University Research Co., LLC
June 12, 2014
Overview of CCM-IDIP
• Embedded implementation research project focused on
improving program monitoring and evaluation for iCCM
• Funding provided by USAID through URC; partners include JHU-IIP,
Save the Children, and the Ministry of Health IMCI Unit
• 4 Focus countries: Malawi, Mali, Ethiopia, Mozambique
• Malawi program activities:
• Desk review of M&E system for iCCM and stakeholder
consultations
• Data quality assessments
• Implementation and evaluation of 2 innovative approaches
This presentation focuses on one of the innovative approaches
– a package to improve data quality and use
iCCM program in Malawi
• Health Surveillance Assistants (HSAs) started providing
iCCM for malaria, pneumonia and diarrhea in 2008
• HSAs provide iCCM services through village clinics that
they operate several days each week
• As of February 2014, about 4000 HSAs have been
trained and deployed for CCM across the 29 districts of
Malawi
• A 2012 data quality assessment (DQA) identified data
quality issues and low levels of data use
iCCM reporting system
Figure: iCCM reporting flow. HSAs record cases in the HSA register and report monthly to the health center on the HSA report (Form 1A); health centers summarize Form 1A reports into Form 1B for the district; districts compile Form 1C for the national level.
Development of a data quality and use
improvement package
• Developed the package with district health staff
and partners
• The data quality and use improvement (DI) package included:
– general training on data management, use and
interpretation;
– refresher training on the routine reporting forms;
– simple templates for displaying CCM implementation
strength data;
– provision of calculators to assist with completing
monitoring forms; and
– working with district staff to identify reporting
benchmarks and action thresholds.
Templates for HSAs
Set of 5 graphs to summarize:
• Background data and supervision visits
• # cases treated and referred
• Total cases and days the village clinic (VC) operated
Templates for Health Facilities
Set of 6 graphs to summarize:
 HSAs residing in the catchment area
 HSA reporting
 Stock-outs lasting more than 7 days
 Supervision (routine)
 Mentoring
 Cases treated by HSAs
District templates
• Electronic data entry
• Generates dashboard with indicators and time-trend graphs
• Excel format; potential to link to DHIS2 in future
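To make the dashboard idea concrete, here is a minimal sketch, in Python rather than the actual Excel workbook, of the kind of aggregation the district template performs: monthly facility rows rolled up into district totals with a time-trend indicator. The column names and values are illustrative assumptions, not the template's actual fields.

```python
import pandas as pd

# Hypothetical monthly rows, one per health facility, mimicking what district
# staff enter from Form 1B summaries; column names are illustrative only.
rows = pd.DataFrame([
    {"month": "2013-04", "facility": "Msakambewa", "cases_treated": 120, "hsas_reporting": 5, "hsas_expected": 6},
    {"month": "2013-04", "facility": "Dzoole",     "cases_treated": 95,  "hsas_reporting": 4, "hsas_expected": 4},
    {"month": "2013-05", "facility": "Msakambewa", "cases_treated": 140, "hsas_reporting": 6, "hsas_expected": 6},
    {"month": "2013-05", "facility": "Dzoole",     "cases_treated": 88,  "hsas_reporting": 3, "hsas_expected": 4},
])

# District dashboard: one row per month with totals and a reporting-rate indicator.
dashboard = (
    rows.groupby("month")
        .agg(cases_treated=("cases_treated", "sum"),
             hsas_reporting=("hsas_reporting", "sum"),
             hsas_expected=("hsas_expected", "sum"))
)
dashboard["reporting_rate_pct"] = 100 * dashboard["hsas_reporting"] / dashboard["hsas_expected"]
print(dashboard)
```

A table like this is also the kind of data that could eventually be pushed into DHIS2, which is the linkage the team notes as a future possibility.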
Pilot Implementation
• All relevant district staff, HSA supervisors
and HSAs implementing iCCM (n=426)
trained in Dowa and Kasungu districts
• Feb 2013: training of trainers (TOT) with IMCI/deputy coordinators, HMIS, pharmacy technicians, and others
• April 2013: District staff conducted
trainings for HSAs and senior HSAs at
health facilities
– Half-day trainings
– Trainings supervised by MOH and
SC staff
Evaluation of the package
• Sample: 5 health facilities and 3-4 HSAs per facility were
randomly selected at baseline in each district. The same
facilities and HSAs were followed up at endline
• Data collection: Baseline data collection in June 2012. Endline
data collection in July/August 2013 after 3+ months of
implementation.
• Data analysis:
1. Measured changes in reporting:
• Consistency: measured through results verification ratio (RVR:
verified/reported; 1.0 = perfect consistency)
• Availability and completeness: forms were submitted and
complete for the previous month
2. Assessed data use (display of templates; how used)
3. Documented package costs
Results Verification Ratio Calculation: HSA level example
RVR = count from HSA register (e.g. # of fever cases treated) ÷ count from HSA Form 1A report (e.g. # of fever cases reported)
• RVR of 1 means a perfect match
• RVR of less than 1 indicates over-reporting
• RVR of more than 1 indicates under-reporting
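For illustration, a minimal sketch in Python of the RVR calculation and its interpretation; the function names and example counts are hypothetical, not part of the CCM-IDIP tools.

```python
def results_verification_ratio(register_count, reported_count):
    """RVR = verified count (HSA register) / reported count (Form 1A)."""
    if reported_count == 0:
        return None  # RVR is undefined when nothing was reported
    return register_count / reported_count

def interpret_rvr(rvr):
    """Map an RVR value to the interpretation used in the presentation."""
    if rvr is None:
        return "undefined (no cases reported)"
    if rvr == 1:
        return "perfect match between register and report"
    if rvr < 1:
        return "over-reporting (report exceeds register)"
    return "under-reporting (report is below register)"

# Example: an HSA recorded 42 fever cases in the register but reported 50.
rvr = results_verification_ratio(register_count=42, reported_count=50)
print(round(rvr, 2), "->", interpret_rvr(rvr))  # 0.84 -> over-reporting
```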
Consistency: Improvements in HSA
reporting
• After introduction of the
package, the monthly data
reported by HSAs for cases
treated was more consistent
with what they recorded in
their registers:
– Average reporting consistency
for cases treated improved
– There was less variation in reporting consistency after the package was introduced (shown by the smaller boxes in the figure).
Figure: Comparison of reporting consistency levels for cases treated between baseline and endline
Consistency: Health facility level
• Baseline showed good consistency between HSA
reports (Form 1A) and HF summary (Form 1B) for
cases treated; consistency sustained at endline
• Some over-reporting of HSA stock-outs of ACT,
AB, ORS at baseline (RVR less than 1); minor
improvements in consistency for stock-out
reporting at endline (RVRs closer to 1)
• Small sample sizes (less than 10 facilities) limit
conclusions
Availability and completeness: district
differences
• At baseline, Dowa was the stronger district with 95% of
HSA and HF forms available and complete compared with
just 74% in Kasungu district
• At endline, Kasungu showed some improvements in
availability and completeness for HSA forms and 100% of
facility level forms were available and complete
• In Dowa, availability and completeness dropped (63% of HSA forms and just 16% of HF forms available and complete)
Data use: data display templates
• All HSAs and nearly all health facilities were using the templates (one HF was not using them)
• Almost all (97%) said the templates
were easy to use and not time
intensive (most spent less than an
hour to complete each month)
• Most HSAs (89%) had completed the templates for each month since January 2013; completeness was lower at HFs (78%)
• About 40% of HSAs could not
display their templates because
they lacked a permanent structure
Data use: examples

 Most HSAs mentioned using data to inform their community health education activities
“The display of data makes it easy for the community to see which cases are common
which helps in choosing targeted interventions to address the situation” – HSA, Dowa
“The community was told that not any cough is fast-breathing; the community
perception of demanding cotrim for any cough is gradually changing” – HSA, Dowa
 Senior HSAs reported using data to make staffing decisions (deploy
HSAs to vacant areas, ask district to allocate more HSAs) and to
respond to stock-outs
“Our percentage of CCM-trained HSAs with stock-outs >7days in February, March and
April was above action threshold so we took action to order drugs on time” –
Msakambewa HF
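The stock-out quote above illustrates a simple decision rule: compute the percentage of CCM-trained HSAs with a stock-out lasting more than 7 days and compare it with the agreed action threshold. Here is a minimal sketch of that check; the 20% threshold and the data are made up for illustration (districts set their own thresholds).

```python
# Each record is one CCM-trained HSA for one month; stockout_days is the
# longest stock-out of a tracked commodity (ACT, antibiotics, ORS) that month.
monthly_reports = [
    {"hsa": "HSA-01", "stockout_days": 0},
    {"hsa": "HSA-02", "stockout_days": 12},
    {"hsa": "HSA-03", "stockout_days": 3},
    {"hsa": "HSA-04", "stockout_days": 9},
]

ACTION_THRESHOLD_PCT = 20  # illustrative only; not a value from the package

with_stockout = sum(1 for r in monthly_reports if r["stockout_days"] > 7)
pct = 100 * with_stockout / len(monthly_reports)

if pct > ACTION_THRESHOLD_PCT:
    print(f"{pct:.0f}% of HSAs had stock-outs >7 days: above threshold, take action (e.g. order drugs)")
else:
    print(f"{pct:.0f}% of HSAs had stock-outs >7 days: below action threshold")
```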
Costing of Package
247 USD per health facility; 27 USD per HSA*
*Includes cost of printing and calculators
Summary of findings
• Package helped to improve consistency between # cases
recorded in HSA register and # cases reported in monthly
report
• Routine data on iCCM treatments aggregated at the HF level may be of better quality than commonly assumed
• A strength is that now "everyone can see the data".
• HSAs and HF staff do use these data to improve the iCCM
program at the grassroots level. The benchmarks and action
thresholds were seen as helpful guidance.
• Turnover and other management/health systems issues at the district level limit the package's potential effects
• Package is acceptable and feasible to implement at national
level
Package Expansion
• Package is being scaled up to 23 other districts (with
support from MOH, Save the Children’s RAcE, MICS and
SSDI Services)
• Modifications:
– Indicator wording, definitions and targets were clarified
– Templates have been modified to reflect the number of cases HSAs are currently seeing
• The Ministry of Health, through the IMCI unit, has taken a lead role in ensuring that the package is scaled up
Lessons from Expansion
What is working well?
 The package has created interest amongst stakeholders including HSAs, SHSAs, and DHMTs (need to sustain the momentum)
 Reporting levels and quality of data have improved
 Improved community participation
 Aids supervision
 The package is simple and is not seen as additional work
What are the challenges?
• Inadequate resources seem to be affecting scale-up (in some districts only the TOT was conducted)
• Delays in distributing necessary materials (e.g. templates and calculators) have resulted in some HSAs forgetting what they learnt
• Some supervisors are not using the designed implementation strength indicators summary sheet
Next Steps
• Exploring opportunities to:
– Integrate successful elements of the DI package within iCCM
training for HSAs and HSA supervisors
– Include dashboards at district and national levels within the
DHIS 2
– Improve tracking of actions and problem solving
– Include displays for other services by HSAs (newborn, family
planning)
– Include changes in iCCM service provision e.g. mRDTs
– Disseminate and advocate for package uptake at district level
Thank you
Acknowledgements:
This study was supported by the American people
through the United States Agency for International
Development (USAID) and its Translating Research into Action (TRAction) project. TRAction is managed by University Research Co., LLC (URC) under Cooperative Agreement Number GHS-A00-09-00015-00.
For more information on TRAction's work, please
visit http://www.tractionproject.org/.
More information and reports from
TRAction in Malawi:
http://www.jhsph.edu/departments/international-health/centers-and-institutes/institute-for-international-programs/projects/traction/index.html
Baseline data quality assessment
Assessment Objectives:
 To assess data availability, completeness and
quality
 To explore the use of iCCM data in program
management and decision making
• Conducted May 2012 in 2 districts – Dowa and
Kasungu
• Random selection of 4 health centers + the
hospital in each district
• Random selection of 4 HSAs per facility
• District staff involved in data collection and
interpretation of findings
Major Strengths & Weaknesses
Strengths:
 Well-defined structure for reporting with clear deadlines & expectations
 Reporting forms easy to use
 System of quality checks in place
 Good levels of reporting and completeness
 Reasonable levels of consistency with a few exceptions
 HSAs meet regularly with health center staff/community leaders
Weaknesses:
• iCCM data not kept at health center or HSA level
• Concerns with data quality reported by participants
• Limited to no training on data use, processing and interpretation
• Data use in decision making is low, mostly top-down approach
• Supervision checklist was not yet in use in Kasungu and use was low in Dowa; use of the mentoring checklist is low
• Staffing barriers and high turnover