Contribution of Survey to Health Systems Performance Monitoring: experience with the World Health Survey THE MALAYSIAN EXPERIENCE 28-29 September 2006 Montreux, Switzerland.


Contribution of Survey to Health Systems Performance Monitoring:
experience with the World Health Survey
THE MALAYSIAN EXPERIENCE
28-29 September 2006, Montreux, Switzerland
Introduction
• World Health Survey 2002
– Nationwide community survey
– Multistage stratified sampling, representative of the population
– Stratified by state & urban/rural location
– Estimates at national level & by rural/urban location
– Where possible, estimates across various sociodemographic variables
– Institutionalised population excluded (<3%)
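In a multistage stratified design like this, the total sample is typically split across strata in proportion to their populations before households are drawn within each stratum. A rough sketch of proportional allocation (the strata and population figures below are hypothetical, for illustration only; only the total of 7528 selected households comes from the survey):

```python
# Proportional allocation: each stratum's sample is proportional to its
# population share. Stratum populations here are hypothetical.
population = {
    ("Selangor", "urban"): 3_500_000,
    ("Selangor", "rural"): 700_000,
    ("Kelantan", "urban"): 400_000,
    ("Kelantan", "rural"): 900_000,
}
total_sample = 7528  # total households selected in WHS 2002 Malaysia

total_pop = sum(population.values())
allocation = {
    stratum: round(total_sample * pop / total_pop)
    for stratum, pop in population.items()
}
print(allocation)
```

With these illustrative figures the four strata receive 4791, 958, 547 and 1232 households, which sum back to 7528.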
Introduction
• World Health Survey 2002
– Data collection: early March – mid April 2003
– 200 personnel of various categories, including temporary research assistants
– MOH facilities & vehicles
– Nationwide publicity
Field Preparation
– Organisational structure
• Advisory/Steering committee
• Central Research Team
• Field Data Collection team
• Data Entry Team
ORGANISATIONAL STRUCTURE
• National Advisory Committee
• Central Research Team
• State Liaison Officers (1 per state)
• State Teams (12 + 2 states; 32 teams in total)
– 2 teams per state in Peninsular Malaysia
– 4 teams per state in Sabah & Sarawak
• Per team: 1 field supervisor (32 supervisors), 4 interviewers (3 temporary staff + 1 nurse; 128 data collectors), 1 driver (32 drivers)
• Per state: 1 chief scout (14 chief scouts); 1 scout (PHO) per district, part time
Implementation strategy
• Mobilization of survey teams across districts & states
(Diagram: teams moved between less densely populated areas and less rural areas; larger sample sizes in more rural high-density areas and in very high-density areas)
Survey implementation schedule
– Budget proposal (Oct 02)
– Translating & pre-testing instruments (Oct–Nov 02)
– Road shows (16–20 Dec 02)
– Recruitment of research assistants (Jan–Feb 03)
– Field preparation (sampling & procurement) (Jan–Feb 03)
– Identification of EBs and tagging exercise (Jan–Feb 03)
– Training (17 Feb – 15 Mar 03)
– “Launching” (28 Feb 03)
– Data collection (Mar–Apr 03)
– Publicity in various media (Feb–Apr 03)
– Data entry (Mar–Apr 03)
Survey implementation schedule
– Presentation of preliminary findings (July 03)
• Programme heads and service providers
• Shared contents of WHS 2002
• Identified additional questions relevant for programme needs
– Further assistance with analysis from WHO (July 05)
– Mini-conference (September 2005)
• Invited resource person from WHO
• Senior officers from programmes and various operational levels
– Clinicians, public health specialists, public health engineers, nutritionists, human resource personnel
Survey implementation schedule
– Report writing (October 2005 – June 2006)
• 5 volumes
• 4 drafts
– 3 volumes already with the printers (August 2006)
– Proposed presentation to senior management (Nov – Dec 2006)
WHS 2002: Snapshot of Data Quality
WHS 2002
• Sample size = 7528
• Response rate = 80.2%
• Analysis (as per WHO) done in 2005
Sampling: Response Rate

                        Urban    Rural    Total
Household interviews
  Selected               4654     2874     7528
  Interviewed            3610     2516     6126
  HH RR (%)              77.6     87.5     81.4
Individual interviews
  Selected               4654     2874     7528
  Interviewed            3554     2484     6038
  Individual RR (%)      76.4     86.4     80.2
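The response rates above are simply the share of selected units that completed an interview. A minimal check, using the counts from the table:

```python
# Response rate = interviewed / selected, as a percentage.
# Counts are taken from the WHS 2002 Malaysia sampling table.
selected = {"urban": 4654, "rural": 2874}
hh_interviewed = {"urban": 3610, "rural": 2516}
ind_interviewed = {"urban": 3554, "rural": 2484}

def response_rate(interviewed, selected):
    """Overall percentage of selected units that completed the interview."""
    return round(100 * sum(interviewed.values()) / sum(selected.values()), 1)

print(response_rate(hh_interviewed, selected))   # household RR: 81.4
print(response_rate(ind_interviewed, selected))  # individual RR: 80.2
```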
HH Sample Deviation Index
Individual-level Sample Deviation Index
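A sample deviation index of this kind is commonly computed as the ratio of each age-sex group's share in the achieved sample to its share in a reference population, with values near 1.0 indicating proportional representation. A minimal sketch under that assumption (the group shares below are hypothetical, not survey results):

```python
# Sample deviation index per age-sex group: sample share / population share.
# A value near 1.0 means the group is represented proportionally;
# shares below are hypothetical, for illustration only.
sample_share = {"M 18-29": 0.10, "F 18-29": 0.12, "M 60+": 0.06, "F 60+": 0.09}
population_share = {"M 18-29": 0.12, "F 18-29": 0.12, "M 60+": 0.05, "F 60+": 0.07}

deviation_index = {
    group: round(sample_share[group] / population_share[group], 2)
    for group in sample_share
}
print(deviation_index)
```

In this illustration younger men are under-represented (index 0.83) while older respondents of both sexes are over-represented (1.2 and 1.29).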
Missing Data
Reliability
HH Level: Sociodemographic Profile (weighted)
• Mean household size = 4.2
• Male : female ratio = 0.98
• Household size by wealth quintile:
  Q1 (poorest)   4.00
  Q2             4.11
  Q3             4.16
  Q4             4.29
  Q5 (richest)   4.89
  Missing        3.57
Distribution of population by Location
• Geographical location: majority in urban areas
(Pie chart across Metropolitan, Urban Large, Urban Small and Rural; segment values 32%, 51%, 5% and 12%)
HH Level: Sociodemographic Profile (weighted)
Income by Location (HH data, n = 6126)
(Stacked bar chart: share of households in income quintiles Q1 (poorest) to Q5 (richest), plus missing, for All, Rural, Urban Small, Urban Large and Metropolitan; horizontal axis 0–100%)
Presentation & Utilisation of findings
• To date, results have yet to be formally presented to top management
• General impression of findings
– Value added, as it provides a better perspective (new dimensions) on country HS performance
– Better performance in some aspects but an “eye-opener” in others!
• Applications found downstream at various levels
Utilisation of findings
• National level
– Key input into development of the National Health Financing Mechanism
• Volume 5 (Responsiveness section)
– Areas where respondents were not satisfied
– State of hospitals
– Utilisation patterns
– Cost of care
• Volume 4 (Health Expenditure)
– OOP payments, perceptions of risk pooling
• Volume 3 (Coverage)
– Understanding of the current situation of service provision
• Volume 2 (Risk factors)
– What should go into the basic benefit package
– ANC, HIV transmission amongst mothers, condom use for prevention
Utilisation of findings
• Programme level
– Responsiveness component used in developing “soft skills” training modules for health workers & evaluating front-line customer services
– Input into development of patient/people-centred services
– Verify effectiveness of current programmes and related activities
– Support (evidence-based) the justification of newly introduced activities
– Recommend development of new strategies/activities for specific risk groups
– Identification of new research to look into impact
Potential application
• Evaluation of performance for the mid-term review of the 9th Malaysia Plan
What we have learnt…..
• Objectives have been achieved
– Contribution to development of cross-country measures
– Assessment of country HS performance
– Transfer of technology (to some extent!)
What we have learnt…..
• Costly affair
– Human & financial resource intensive
• Need for “buy-in” from all sectors
– To ensure successful survey implementation
– To ensure usage of findings
What we have learnt…..
• Need for advance planning (at least 1 year!!)
– Negotiation with operational managers for manpower assistance & other logistics
– Budget proposal and approval
What we have learnt…..
• Instrument itself
– Translation into the local language poses a challenge!
– Complicated & lengthy (2.5 hrs mean completion time)
– Some aspects politically sensitive (the 8000 series was omitted)
– Definitions of certain variables differ from our country’s own
• Uneasiness & defensiveness about country performance among various programme heads
– E.g. HIV & human resources
• In retrospect
– Should allow for 2 sets of definitions
What we have learnt…..
• Analysis & interpretation
– Stata software
• Limited expertise
– Complex analysis
• CHOPIT (done by the WHO team, Geneva)
• Some analysis still pending
– E.g. Y-star results for adjustments for cross-country comparison (responsiveness section still pending)
• Duration of the whole activity (data collection to analysis to report writing)
– Too long
What we have learnt…..
• Lack of expertise to translate research
findings into action
– e.g. interpreting findings, writing policy briefs
Our conclusion…
• In general,
– WHS 2002 is a useful tool for management
– Provides good info/added value about our HS performance not found in routine M&E
• But….
– A costly affair
– A painful exercise (blood, sweat & tears!) of negotiations, personal sacrifices, sapped energy, etc.
• And…..
– Success requires careful planning
Our recommendations….
• Have a sufficient budget for implementation
• Ensure top-management commitment
• Allow countries to adopt & adapt the sections applicable to them
• Need to simplify
– vignettes
– health status
• We have the mean score, but as there is no benchmark, the results are not really meaningful
• Need to make the instrument brief
• Assist countries without the capacity to undertake national community surveys
Our recommendations….
• Assist countries to market findings to policy makers
– Translation into policies
– Help to see the implications of findings for current policies
• Need to build greater “in-country” capacity from beginning to end
Thank you