
Feedback
Integrated Disease Surveillance Programme (IDSP)
District Surveillance Officers (DSO) course
Preliminary questions to the group
• Were you already involved in feedback of
surveillance data?
• If yes, what difficulties did you face?
• What would you like to learn about feedback
of surveillance data?
Warm-up exercise
• Diphtheria persists in Delhi
• One hospital used as a sentinel centre
• Data analyzed from 1954-1997
Diphtheria incidence and case fatality,
sentinel unit, New Delhi, India, 1954-97
[Chart: number of cases (left axis) and case fatality, % (right axis), by year, 1954-97]
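A dual-axis trend graph like the one above is a staple of feedback bulletins. Below is a minimal Python/matplotlib sketch of how such a figure can be drawn; the year, case-count and case-fatality values are illustrative placeholders, not the actual sentinel-unit data.

# Sketch of a dual-axis trend graph (cases vs. case fatality) for a feedback bulletin.
# The values below are illustrative placeholders, NOT the New Delhi sentinel-unit data.
import matplotlib.pyplot as plt
from matplotlib.ticker import PercentFormatter

years = [1954, 1964, 1974, 1984, 1994, 1997]
cases = [1600, 1200, 750, 400, 200, 150]
case_fatality = [0.05, 0.08, 0.12, 0.20, 0.28, 0.35]   # expressed as proportions

fig, ax_cases = plt.subplots()
ax_cfr = ax_cases.twinx()                 # second y-axis sharing the same x-axis

ax_cases.plot(years, cases, color="tab:blue", label="Cases")
ax_cfr.plot(years, case_fatality, color="tab:red", label="Case fatality")

ax_cases.set_xlabel("Year")
ax_cases.set_ylabel("Number of cases")
ax_cfr.set_ylabel("Case fatality")
ax_cfr.yaxis.set_major_formatter(PercentFormatter(xmax=1.0))   # 0.35 -> "35%"

ax_cases.set_title("Diphtheria incidence and case fatality, sentinel unit, 1954-97")
fig.legend(loc="upper left")
plt.show()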
Diphtheria incidence by month, sentinel
unit, New Delhi, India, 1997
[Chart: number of cases and deaths by month, January to December 1997]
Characteristics of diphtheria cases,
sentinel unit, New Delhi, India, 1997
Characteristic         Cases   % Total   Deaths   CFR (%)
Age       <1              20        14       10        50
          1-4             73        51       22        30
          5-9             38        27       13        34
          10+             12         8        0         0
Sex       Male            90        63       27        30
          Female          53        37       18        34
Religion  Hindu           89        62       25        28
          Muslim *        54        38       20        37
Vaccine   Yes             15        10        1         7
          No             114        80       43        38
          ?               14        10        1         7
Total                    143       100       45        31

* Muslims account for 9%, 17% and 5% of the population in Delhi, UP and Haryana respectively
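The derived columns in the table are simple ratios of the case and death counts. The short sketch below shows how "% Total" and "CFR (%)" can be recomputed, using the age rows from the table above.

# Recompute "% Total" and "CFR (%)" from the case and death counts
# in the age rows of the table above.
age_groups = {            # age group: (cases, deaths)
    "<1":  (20, 10),
    "1-4": (73, 22),
    "5-9": (38, 13),
    "10+": (12, 0),
}

total_cases = sum(cases for cases, _ in age_groups.values())   # 143

for group, (cases, deaths) in age_groups.items():
    pct_total = 100 * cases / total_cases           # share of all cases
    cfr = 100 * deaths / cases if cases else 0.0    # case-fatality ratio
    print(f"{group:>4}: {cases:3d} cases ({pct_total:2.0f}% of total), "
          f"{deaths:2d} deaths, CFR {cfr:2.0f}%")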
Questions for the group:
Propose messages for feedback
• Describe what you see
• Recommend action to report in your
feedback bulletin
Key findings messages re:
diphtheria in Delhi, India, 1997
• Incidence decreased
• Case fatality increased
• Seasonality:
 - August to October
• Most cases among the unvaccinated, at an age
when they should have been protected by primary
vaccination
Key recommendations re:
diphtheria in Delhi, India, 1997
• Increase vaccine coverage and reach
unvaccinated pockets
• Store and use anti-toxin early
Outline of the session
1. Rationale for feedback
2. Content of feedback
3. Feedback mechanisms
Group exercise to conclude
Difficulties with a surveillance system with
no feedback
• Lack of motivation
 - Data disappear in a black hole
• Unreliability
 - Mistakes are not corrected
• Sluggishness
 - The various levels do not communicate
• Data falsification
 - The data are opaque
• Weak human resources
 - The actors do not see the system in action
Rationale for feedback
of surveillance data
• Motivation
 - Everyone sees how their data fit in the bigger picture
• Reliability
 - Identifies errors
• Reactivity
 - Places everyone on the same page
• Quality
 - Increases transparency
• Education
 - Demonstrates how the system works
A dynamic vision of surveillance
[Cycle diagram: collect and transmit data → analyze data → feedback information → make decisions; all levels use information to make decisions]
Data flow and feedback: Level by level
[Diagram: data flows upward from Community → Primary / community health centre → District → State → Centre, with feedback flowing back down through the same levels]
Content of feedback
• Information on diseases under surveillance
• Information on quality of data collected
Content of feedback
• Information on diseases under surveillance
 - Summary data tables
 - Analyzed epidemiological information (see the tabulation sketch below)
   • Time (graphs with trends)
   • Place (maps)
   • Person (tables)
• Information on quality of data collected
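A minimal sketch of the time / place / person tabulation mentioned above, computed from a case line list, follows; the field names (onset_month, district, age_group) and the example records are hypothetical and only illustrate the idea.

# Minimal time / place / person tabulation from a case line list.
# Field names and records are hypothetical, for illustration only.
from collections import Counter

line_list = [
    {"onset_month": "Aug", "district": "North", "age_group": "1-4"},
    {"onset_month": "Sep", "district": "North", "age_group": "5-9"},
    {"onset_month": "Sep", "district": "East",  "age_group": "1-4"},
    # ...one record per reported case
]

by_time   = Counter(case["onset_month"] for case in line_list)  # time: cases by month
by_place  = Counter(case["district"] for case in line_list)     # place: cases by district
by_person = Counter(case["age_group"] for case in line_list)    # person: cases by age group

print("Cases by month:    ", dict(by_time))
print("Cases by district: ", dict(by_place))
print("Cases by age group:", dict(by_person))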
Content of feedback
• Information on diseases under surveillance
• Information on quality of data collected (see the sketch below)
 - Regularity of reporting
 - Timeliness of reporting
 - Completeness of reporting
 - Responses initiated by the unit
 - Validity of data
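Most of these indicators reduce to simple proportions over the expected reports. The sketch below computes completeness and timeliness for one reporting period; the reporting-unit names, dates and deadline are hypothetical examples.

# Completeness and timeliness of reporting for one reporting period.
# Unit names, dates and the deadline are hypothetical examples.
from datetime import date

expected_units = {"PHC A", "PHC B", "PHC C", "CHC D"}   # units expected to report
received = {                                            # unit -> date the report arrived
    "PHC A": date(2024, 7, 3),
    "PHC B": date(2024, 7, 9),
    "CHC D": date(2024, 7, 2),
}
deadline = date(2024, 7, 5)                             # reports due by this date

completeness = 100 * len(received) / len(expected_units)
on_time = [unit for unit, day in received.items() if day <= deadline]
timeliness = 100 * len(on_time) / len(expected_units)

print(f"Completeness: {completeness:.0f}% of expected reports received")
print(f"Timeliness:   {timeliness:.0f}% of expected reports received on time")
print("Missing reports:", sorted(expected_units - set(received)))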
Feedback methods
• Newsletters, bulletins
• Monthly review meetings
• Outbreak investigation reports
• Informal feedback
• Electronic communication
Newsletter
• Regular epidemiological bulletin
• Educational tool
• Contains
 - Summary tables and graphs
 - Commentary on diseases or topics
Monthly review meetings
• District / block monthly meeting
• Presentation of data during meetings
• Generates comments from peers
• Need to stress positive aspects
 - Public negative comments may de-motivate
Outbreak investigation reports
• Excellent for feedback and learning
• Allow sharing of experiences that may be encountered in other places
• Content
 - Information about the epidemiological characteristics of the disease
 - Lessons learned in the investigation process
Informal feedback
• Oral feedback
• Useful for pointing out mistakes
• Does not suffice by itself
Electronic methods
• Through email, websites
• Fast and efficient
• May be updated rapidly
• Allows
 - Dynamic data presentation
 - Queries
Take home messages
1. Feedback closes the surveillance loop
2. Feedback
 • Epidemiological information
   - Time
   - Place
   - Person
 • Information on data quality
3. Use all possible mechanisms of feedback to
get the information across
Exercise
• Read the article on the analysis of measles
surveillance data in Uttar Pradesh in 1996
 - Singh J, et al. Widespread outbreaks of measles in rural Uttar Pradesh, India, 1996: high risk areas and groups. Indian Pediatrics 1999; 36: 249-255.
• Imagine you need to prepare a feedback
meeting with health officials in Uttar
Pradesh
• You need to prepare a presentation
Group work
• Sit down in groups of 4 or 5
• Extract information from the article to
structure your feedback
• Use a table format to prepare your
presentation
Empty table shell to organize feedback information

                         Findings    Interpretation    Recommendations
Epi data
  Time                   …           …                 …
  Place                  …           …                 …
  Person                 …           …                 …
Data quality issues
  Surveillance issues    …           …                 …
Break in groups
Take 15 minutes
Key elements of feedback for measles in Uttar Pradesh, India, 1996

Epi data: Time
  Findings:        Most cases in low transmission months
  Interpretation:  Measles all year long
  Recommendations: Be mobilized all year long

Epi data: Place
  Findings:        Deaths concentrated in 10 districts
  Interpretation:  Higher incidence? Better reporting?
  Recommendations: Compare reporting practices in the 10 districts with the others

Epi data: Person
  Findings:        85% of cases not vaccinated
  Interpretation:  Measles persists because of low vaccine coverage
  Recommendations: Increase coverage

  Findings:        Efficacy 92%
  Interpretation:  Vaccine works in UP
  Recommendations: Increase coverage

Data quality issues: Surveillance
  Findings:        1% of estimated cases reported
  Interpretation:  Measles surveillance is weak
  Recommendations: Improve reporting
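The "Efficacy 92%" entry is a field estimate of vaccine efficacy. One standard way to compute such an estimate, sketched below, uses attack rates among unvaccinated and vaccinated children, VE = (ARU - ARV) / ARU; the case and population figures in the example are made up for illustration and are not taken from the Singh et al. article.

# Field vaccine efficacy from attack rates:  VE = (ARU - ARV) / ARU
# The case and population figures below are hypothetical, for illustration only.
def vaccine_efficacy(cases_unvac, pop_unvac, cases_vac, pop_vac):
    aru = cases_unvac / pop_unvac   # attack rate among the unvaccinated
    arv = cases_vac / pop_vac       # attack rate among the vaccinated
    return (aru - arv) / aru        # proportion of cases prevented by vaccination

ve = vaccine_efficacy(cases_unvac=170, pop_unvac=1000, cases_vac=14, pop_vac=1000)
print(f"Vaccine efficacy: {ve:.0%}")   # -> Vaccine efficacy: 92%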
Presenting your feedback
• Present the background
• Explain how you collected the data
• Display the key results presented in the summary table
 - Back up data with tables, graphs and maps
• Interpret the data
• Summarize the recommendations that can be deduced from the data
Additional reading
• Section 4 of IDSP operations manual (Report
2-7, page 57-64)
• Section 10 of IDSP operations manual
• Module 10 of training manual