Using Data to Improve Quality
Finding the Link
Using Data Reports to Initiate Quality
Improvement Projects (QIPs)
Julia Hidalgo, ScD, MSW, MPH
George Washington University and
Positive Outcomes, Inc.
Today we will…
Identify data sources to be used for quality
improvement (QI) and analyze data points and data
trends to identify key findings
Prioritize data results to implement action steps to
improve HIV care and services
Know how to access benchmarking reports and best
practices on how to share data and findings
Learn best practices from other grantees
QI Data Sources: HAB Required Data Reports
Report | Parts | Units of Analysis
Ryan White HIV/AIDS Program Services Report (RSR) | A, B, C, D, F | Grantee, provider, and client-level data
Ryan White HIV/AIDS Program Data Report (RDR) | A, B, C, D, F | Grantee, provider, and aggregate client-level data
ADAP Quarterly and Annual Reports | B | ADAP program and aggregate client-level data
Community-Based Dental Partnership Program and Dental Reimbursement Program Dental Services Report | F | CBDPP grantee and DRP applicant pre- and post-doctoral dental education and dental hygiene programs
AETC Event Record and Participant Information Form | F | AETC regional centers, local associated sites, and national centers
Fiscal Status Report (FSR) | A, B, C, D, F | Grantee
Allocation and Expenditure Reports | A, B, C, D | Grantee
Other Common Sources of QI Data
Administrative data gathered in the course of
service delivery, grants management, and QM
Fee-for-service reimbursement claims
Client/patient health records
Client/patient satisfaction surveys
Geoanalysis of HIV epidemiologic, service
utilization, process, and outcome data
Review of secondary data
Assessment of QM plans
Special studies
Types of Data Commonly Used In HIV QI
Data Type | Characteristics | Examples
Quantitative | From the same root word as quantity; based on measurable information | Laboratory test values; patient height and weight; number of patients receiving a screening test; number of patients with undetectable viral load
Qualitative | From the same root word as quality; descriptive information; observed but not measured | Observation of patient flow; client satisfaction surveys; focus groups; key informant interviews
Types of Data Commonly Used In HIV QI
Data Type | Characteristics | Examples
Nominal | Items differentiated by a naming system that has numbers assigned to names of groups; commonly organized by categories (also known as categorical data) | Gender; racial group; ZIP code; health insurer
Ordinal | Items ordered by their relative position on a numeric scale or that represent some type of hierarchy; characterize the sequence of categories of data | Federal Poverty Level; Likert scale items
Interval | Parametric data measured on a scale with equal distance between integers on the scale; quantitative data that have units of measurement | Fahrenheit and Celsius temperature scales
Ratio | Also known as scale data; parametric data in which numbers can be compared as multiples of one another; quantitative data that have units of measurement; true zeros can exist | Income; age in years
Discrete | Fixed values, commonly on an arbitrary scale | Household size; number of clients and medical visits
Continuous | Scale data that allow us to determine differences between groups and over time | CD4 count; viral load

Different data types require different approaches to statistical analysis and graphic data presentation.
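To illustrate how the data type drives the choice of summary statistic, the sketch below (hypothetical values, Python standard library only) summarizes a nominal variable with frequency counts and a ratio variable with a mean and median; a mean of a nominal variable would be meaningless:

```python
from collections import Counter
from statistics import mean, median

# Hypothetical client records: insurer is nominal, age in years is ratio.
insurers = ["Medicaid", "Medicare", "Private", "Medicaid", "None", "Medicaid"]
ages = [34, 51, 42, 29, 47, 38]

# Nominal data: category counts and the mode are meaningful summaries.
insurer_counts = Counter(insurers)
print(insurer_counts.most_common(1))  # [('Medicaid', 3)]

# Ratio data: mean, median, and ratios between values are all meaningful.
print(mean(ages), median(ages))
```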
Levels of Analysis
Systemwide
Program
Provider
Client
Using RSR Client-Level
Data in QI Projects:
Moving Beyond Rates
Application of Clinical Data QI Methods
Fort Lauderdale/Broward County EMA assessed Part A
outpatient/ambulatory medical care (OAMC)
3,414 clients received OAMC in 2009
Assessment objectives
Quality of OAMC
Quality of OAMC data collected by the Provide Enterprise
(PE) Care Management Software System
Access to OAMC, engagement and retention in care, and
utilization patterns
Enrollment of OAMC patients in other health insurance to
reduce the ADAP waiting list
Data sources: PE client-level data, RDR aggregate data, chart
review data, and staff interviews
Use of Client-Level Data in QI: Broward EMA Part A
Grantee Performance Measures
Performance Measure | IHI Goal | HIVQUAL Mean Performance Score | Clinic 1 (n=61) | Clinic 2 (n=87) | Clinic 3 (n=106) | Clinic 4 (n=92) | Clinic 5 (n=76) | Total (n=422)
% clients with 2+ medical visits | NA | NA | 90% | 88% | 93% | 92% | 92% | 91%
% clients with clinical visit every 4 months | NA | 83% | 64% | 71% | 75% | 82% | 75% | 74%
% clients with 2+ CD4 counts performed at least 3 months apart | 90% | 90% | 47% | 85% | 81% | 88% | 84% | 79%
% clients with an undetectable viral load | NA | NA | 29% | 81% | 68% | 52% | 62% | 60%
% clients with AIDS prescribed HAART | 90% | NA | 61% | 82% | 89% | 90% | 88% | 83%
% female clients with a Pap test | 90% | 69% | 26% | 62% | 55% | 39% | 61% | 51%
Use of Client-Level Data in QI: Broward EMA Part A
Grantee Performance Measures
Performance Measure | IHI Goal | HIVQUAL Mean Performance Score | Clinic 1 (n=61) | Clinic 2 (n=87) | Clinic 3 (n=106) | Clinic 4 (n=92) | Clinic 5 (n=76) | Total (n=422)
% clients screened for Hep B virus infection | NA | NA | 84% | 21% | 75% | 39% | 68% | 47%
% clients screened for Hep C since HIV dx | 95% | 90% | 80% | 21% | 62% | 42% | 63% | 43%
% clients receiving latent TB test | NA | 70% | 38% | 62% | 62% | 57% | 54% | 56%
% clients receiving syphilis test | 90% | 80% | 79% | 82% | 79% | 75% | 68% | 61%
% clients screened for depression | NA | 73% | 38% | 32% | 25% | 98% | 91% | 56%
% clients screened for substance use | 90% | 80% | 29% | 87% | 60% | 92% | 76% | 71%
% clients receiving influenza vaccine | NA | NA | 2% | 63% | 41% | 47% | 50% | 43%
Use of Client-Level Data in QI: Broward EMA Part A
Grantee Performance Measures
Performance Measure | Goal | Clinic 1 (n=61) | Clinic 2 (n=87) | Clinic 3 (n=106) | Clinic 4 (n=92) | Clinic 5 (n=76) | Total (n=422)
% clients with CD4 below 350 cells/mm3 prescribed HAART | 80% | 56% | 88% | 92% | 86% | 86% | 81%
% clients screened for diabetes | NA | 95% | 99% | 96% | 100% | 92% | 97%
% clients screened for hypertension | NA | 98% | 91% | 94% | 97% | 82% | 92%
Mean number of quarters that HIV+ clients were retained in care (Jan 2009 to June 2010) | NA | 2.98 | 4.35 | 3.95 | 4.17 | 4.04 | 3.96
Median number of quarters that HIV+ clients were retained in care (Jan 2009 to June 2010) | NA | 3.00 | 5.00 | 4.00 | 4.00 | 4.00 | 4.00
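The retention measures above are simple summary statistics over per-client counts of quarters with at least one visit. A minimal sketch with hypothetical client data (the real computation ran against PE client-level records):

```python
from statistics import mean, median

# Hypothetical example: for each client, the number of quarters (of the
# six from January 2009 to June 2010) with at least one medical visit.
quarters_retained = [6, 5, 4, 4, 3, 2, 6, 1, 4, 5]

print(mean(quarters_retained))    # 4.0
print(median(quarters_retained))  # 4.0
```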
Moving Beyond The Rates
Querying provider staff (clinicians and clinic managers)
about their clinical processes helped us interpret
differences in the rates
We had to address the question of whether the rates reflected poor
data or poor performance
Statistical analyses are necessary to address these
questions:
Were differences between the benchmarks and the providers’ rates
statistically significant?
How do we address small cell values in determining differences?
What are the trends in rates? Were rates improving, worsening, or
flat over time?
What systems, program, provider, and client characteristics were
associated with differences in rates?
What systemwide issues are likely to impact the ability to improve
performance?
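One way to check whether a clinic's rate differs significantly from a benchmark is a one-proportion z-test; when cell counts are small, an exact (binomial or Fisher) test is more appropriate. A minimal, standard-library-only sketch with illustrative numbers resembling the Pap-test measure (26% of n=61 at one clinic versus the 69% HIVQUAL mean), not the assessment's record-level data:

```python
from math import sqrt, erf

def one_proportion_ztest(successes, n, p0):
    """Test whether an observed rate differs from a benchmark rate p0.
    Returns (z, two-sided p-value). Valid when n*p0 and n*(1-p0) are both
    at least ~5; with smaller cells use an exact binomial test instead."""
    p_hat = successes / n
    z = (p_hat - p0) / sqrt(p0 * (1 - p0) / n)
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative: a clinic's Pap-test rate (16 of 61 clients, about 26%)
# against the 69% HIVQUAL mean performance score.
z, p = one_proportion_ztest(successes=16, n=61, p0=0.69)
print(round(z, 2), p)  # large negative z, p far below 0.05
```

Trend questions (improving, worsening, or flat over time) call for different tools, such as run charts or regression on quarterly rates.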
Next Steps in Launching QIPs
Compute Rates → Identify & Address Data Quality Deficiencies → Statistical Analysis: Identify Factors Associated With Rates → Discuss Findings With Providers → Prioritize, Plan, & Conduct QIPs → Re-measure & Refine Processes
Considerations in Prioritizing Processes for QIPs
Do your performance rates reflect poor data, poor
performance, or both?
Does the indicator have a great impact on the programs or
clients in your program?
What priorities for improvement are set by your funders?
Do you have the resources needed to undertake the QIPs?
Do you have commitment from leaders and front line staff?
Do you have the expertise to measure the impact of changes
through process or outcome measures?
Do you have benchmarks or comparators with which to
assess your improvement?
Do you have the resources to sustain gains achieved by the
QIPs?
What external barriers impede the QIP?
Applying QI Processes in
Data Improvement
Steps in Validating and Improving Data
Identify Data Needs → Refine Forms Design → Data Extraction From Charts or EHRs → Database Design & Management → Data Entry & Cleaning → Data Analysis & Reporting
Each step offers an opportunity for data quality improvement.
Assessing Forms Used to Gather RSR Client
Report Data
In Northern Virginia, nine Part A-funded providers’
intake and treatment forms were assessed to
determine the extent to which they accurately
gathered RSR Client Report variables
Applying QI processes, areas of improvement were
identified
Some intake forms were not updated to gather RSR
demographic and screening variables
Screening items did not accurately screen for HIV risk,
mental health, and substance abuse
Forms were written at high reading levels
The same variables were redundantly gathered more than once
Data Flow Analysis
Fort Lauderdale/Broward County EMA assessed Part A OAMC, including the
quality of OAMC data in the PE Care Management Software System
Clinic (client intake forms; medical & MCM notes; Rx list) → PE (manual data entry by HIV program managers, clerks, and case managers; electronic transfer of labs) → RSR (subset of Provide data exported to an XML file; transmitted to HAB)
Application of Data QI Cycle
Reviewed RSR Data to Compute Missing Data Rates → Used PE, Identified OAMC Patients, Drew Sample → Adapted eHIVQUAL Form to Review Charts, Identified Data Flow Issues → Computed Error Rates, Identified Areas of Improvement, Launched QIPs
Accuracy, completeness, and timeliness of OAMC data were assessed.
How complete was the 2009 RSR submission to HAB?
% Clients With Unknown Screening or Test
Measure | Min | Max | Mean
TB | 34% | 100% | 73%
Syphilis | 35% | 100% | 74%
Hep B | 48% | 100% | 90%
Hep C | 48% | 100% | 79%
Substance Abuse | 30% | 100% | 72%
Mental Health | 28% | 100% | 76%
Female Pap Smear | 46% | 100% | 86%
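Provider-level "unknown" rates like those above roll up into the min/max/mean columns with a simple summary. A standard-library sketch over hypothetical per-provider rates for one measure (the names and values are illustrative, not the EMA's data):

```python
from statistics import mean

# Hypothetical per-provider rates of clients with an unknown TB screening result.
unknown_tb_rates = {"Provider A": 0.34, "Provider B": 0.62, "Provider C": 0.85,
                    "Provider D": 1.00, "Provider E": 0.84}

rates = list(unknown_tb_rates.values())
summary = {"min": min(rates), "max": max(rates), "mean": round(mean(rates), 2)}
print(summary)  # {'min': 0.34, 'max': 1.0, 'mean': 0.73}
```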
How accurate was the 2009 RSR submission to HAB?
% Clients With Unknown Screening or Test
Measure | RSR Min | RSR Max | RSR Mean | Chart Mean
TB | 34% | 100% | 73% | 94%
Syphilis | 35% | 100% | 74% | 96%
Hep B | 48% | 100% | 90% | 95%
Hep C | 48% | 100% | 79% | 95%
Substance Abuse | 30% | 100% | 72% | 90%
Mental Health | 28% | 100% | 76% | 90-92%
Female Pap Smear | 46% | 100% | 86% | 25%
About 25% of the OAMC patients identified as having no
insurance had evidence of health insurance enrollment
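The error-rate computation behind comparisons like these can be sketched as a field-by-field match of the RSR export against the chart abstraction, with the chart treated as the gold standard. Hypothetical paired records only:

```python
# Hypothetical paired records: one screening field per client, as it appears
# in the RSR export versus the chart abstraction (the gold standard).
rsr =   ["unknown", "screened", "unknown", "screened", "unknown", "unknown"]
chart = ["screened", "screened", "screened", "screened", "unknown", "screened"]

matches = sum(r == c for r, c in zip(rsr, chart))
error_rate = 1 - matches / len(rsr)
print(f"error rate: {error_rate:.0%}")  # error rate: 50%
```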
What factors were associated with inaccurate data
collection?
Incomplete ascertainment of data through medical
and MCM interviews
Inadequate screening for mental illness and
substance abuse
Inadequate chart documentation and illegibility
Progress notes were standardized and not tailored to the
patient visit
Difficulty in assessing if some procedures were
completed
Inaccurate transfer of data from charts to PE
Inability to obtain electronic lab records from lab
vendors
Data Items With Relatively Low Accuracy Rates
Items collected by medical case managers at intake
were not updated
ZIP code, health insurance, housing stability, HIV status,
HIV risk factor
Items for which clients were not forthcoming
Household size and income (used for FPL computation)
Items that were assessed subjectively
HIV risk, substance and mental health assessment
Items for which there is no clinical agreement about
the criteria for screening
Mental health, substance abuse
Items that were re-entered into PE data systems from
lab reports
Redundant items that providers reported as presenting an
unnecessary burden (quality measures)
What steps were undertaken in improving data
quality?
Site-specific reports were generated to identify areas
of improvement
Exit interviews were conducted
Lists of clients enrolled in or likely to be eligible for
other payers were produced to reduce ADAP waiting
list
A summary report to the grantee and the QM Committee of
the HIV Planning Council was prepared
Data QIPs were prioritized, planned, and conducted
System-wide and clinic-specific QIPs are being conducted
Ongoing analysis of PE data and chart review will be
conducted to ensure improvements are sustained
Using Data to Improve Quality of
HIV Services:
Experiences of the Virginia Cross-Part Collaborative
Adam Thompson &
the Virginia Cross-Part Collaborative
Start at the very beginning . . .
Who should be involved in the decisions?
What data should be collected and reviewed?
What is happening locally and nationally?
What are the emerging trends or priorities?
Where is the passion?
Collect Data
2010 Virginia Ryan White All Grantee Meeting
2008 and 2009 Virginia RDRs
Virginia Part B QM Advisory Committee (QMAC)
Virginia Epidemiology - HIV/STD Incidence and
Prevalence Trends
Virginia Department of Health Division of Disease
Prevention HIV/TB/STD Prevention Priorities
Virginia Community Planning Group
National SPNS Initiatives
Identify Relevant Data
Provider Poll indicated priorities were:
– 1. Hepatitis C Screening
– 2. Syphilis Screening
RDR showed a 40% screening rate for syphilis
QMAC identified through chart abstractions, patient
interviews, and data reports that system reports were
incomplete
Incidence and prevalence of HIV and syphilis were
high in MSM (particularly young black MSM)
33% of total early syphilis (TES) cases were in PLWH
Syphilis elimination funds were added to MSM
prevention grants
SPNS projects were focusing on Hep C
Analyze Findings
With SPNS initiatives focused on Hepatitis C,
waiting on those funded outcomes was advisable
QMAC findings indicated a systems issue with data
reporting
Rates of TES aligned with incidence and prevalence
rates for HIV (MSM)
Syphilis is a treatable disease when identified
Syphilis elimination and lost to care priorities were
young black MSM which aligned with syphilis and
HIV incidence
Make Decisions
2011 statewide project focus on syphilis screening
– Yearly screening offered an opportunity to examine systems
throughout the year to improve data reporting mechanisms
– Overlap of HIV and Syphilis meant one-third of all TES cases
could potentially be treated through screening current HIV
clients
– Care and prevention providers will be working in tandem to
achieve lower syphilis rates through prevention, screening,
treatment in HIV high-risk populations
– The IHI standard for syphilis screening is 80%, which is an attainable goal
– Individual providers will be responsible for measuring clinic
performance, comparing it with the system report, identifying systems
versus care issues, submitting QIP plans, measuring performance,
and reporting un-blinded data by the third quarter