Transcript Slide 1

The Role of Data in Quality Improvement
Julia Hidalgo, ScD, MSW, MPH
July 10, 2008
888-NQC-QI-TA
NationalQualityCenter.org
Funded by HRSA
HIV/AIDS Bureau
Today We Will Discuss
• Review the basics of measurement and analysis
• Identify data sources used for quality
improvement (QI) and analyze data to identify
key findings
• Prioritize results to implement action steps to
improve HIV care and services
• Learn how to access benchmark reports and best practices
for sharing data and findings
Basics of Performance Measurement
• Why measure?
• What to measure?
• When to measure?
• How to measure?
Why is measurement important in
quality management (QM)?
• Measurement differentiates what we think is
happening from what really is happening
• Establishes a baseline
 It is ok to start out with low scores!
• Determines whether changes in processes actually lead to
improvements
• Avoids slippage in performance
 Sustaining the gain made through QI
• Ongoing and periodic monitoring identifies problems as
they emerge
Why measure? (cont.)
• Measurement allows comparison of grantees,
subgrantees, program sites within an agency, individual
providers, and networks
• The Ryan White HIV/AIDS Treatment Modernization Act of
2006 requires performance measurement
• The HIV/AIDS Bureau (HAB) places strong emphasis on
QM
 Grantees are responsible for ensuring that QI systems are
established by subgrantees and can require regular reporting
Important Definitions From the National
Quality Measures Clearinghouse
• Measure
 A tool to assign a quantity to an attribute by comparing it with a
criterion
• Quality measure
 A tool to assign a quantity to quality of care by comparing it with a
criterion
• Clinical performance
 The degree of accomplishment of desired health objectives by a
clinician or health care organization
• Clinical performance measure
 A tool for assessing the degree to which a provider competently and
safely delivers clinical services that are appropriate for the patient in
the optimal time period
Important Definitions From the National
Quality Measures Clearinghouse (cont.)
• Process measure
 Evidence that the measured clinical process has led to improved
health outcomes
• Outcome measure
 Evidence that the outcome measure has been used to detect the
impact of one or more clinical interventions
• Access measure
 Evidence that an association exists between the result of the access
measure and the outcomes of or satisfaction with care
• Patient experience measure
 Evidence that an association exists between the measure of patient
experience of health care and the values and preferences of
patients/consumers
Much of What We Measure in HIV Is
Processes of Care
• Clinical
• Case management
• Inter- or intra-clinic processes
• Patient utilization of services (sometimes
referred to in HIV planning as “accessing
services”)
• Underutilization
• Overutilization
• Misutilization
• Coordination of care
Outcome Measures Might Include
• Patient Health Status
 Intermediate outcomes like immune and virological status
 Survival
 Symptoms
 Disease progression
 Disability
 Self-reported health status (e.g., pain scale)
 Hospital and ER visits
• Patient Satisfaction
Considerations in Measurement Selection
• What question are you trying to answer?
• Is the service or process you are measuring well
established in the clinical or human services fields?
 If so, measures are likely to be already well defined and field tested
using rigorous research methods, and benchmark data are likely to be
available
• If you “customize” a measure, you may lose the ability to benchmark
performance against other providers or networks using earlier or
ongoing studies
 Does the indicator affect a lot of people or programs?
 Does the indicator have a great impact on the
programs or patients or clients in your program?
Other Considerations in
Measurement Selection
 Is there empirical evidence upon which to base your
measure?
• Is the indicator either based on an accepted guideline or developed
through formal group decision-making methods?
• Is there consensus among providers about the measure’s relevance?
 Does the measure directly relate to the process or
outcome you are measuring?
• Is this indicator within our control?
 Can the indicator realistically and efficiently be
measured given finite resources?
 Can the performance rate realistically be improved
given the limitations of service systems and the
population’s health care utilization behaviors?
Methods Considerations in Selecting
Measures
• Can providers consistently, accurately, and reliably
gather data to populate the measure?
• Do they agree with and understand the measure?
• Have chart abstraction instruments already been
designed, field tested, and used routinely?
• For measurement using electronic medical records (EMRs), have
coding algorithms been designed and field tested?
• Does the measure specify the exact inclusion and exclusion criteria and
time frames for assessment?
• Patient characteristics (e.g., age, gender, clinical parameters, treatment
status, etc.) and time period (hours, days, months, etc.)
What is the quality of the data collection
processes in which measurement is embedded?
• It is important to assess the quality of routine data
recording by providers in charts and EMRs to
determine if improvement is necessary BEFORE you
apply new quality measures
 Are providers reliably, accurately, and completely charting the
processes which you wish to measure?
• If not, data QI projects need to be undertaken before measurement
begins
• Technical assistance (TA) is available
• You do not want to inadvertently measure the quality of the data instead of
the process or outcome of interest
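As a rough illustration of such a data QI check, a minimal completeness audit might look like the following sketch; it is not from the original slides, and the record layout and field names are hypothetical.

```python
# Minimal sketch (not from the original slides): auditing charting
# completeness before applying a new quality measure. The record
# layout and field names are hypothetical.
records = [
    {"cd4_count": 150, "haart_prescribed": "Y", "last_visit": "2008-03-14"},
    {"cd4_count": None, "haart_prescribed": "Y", "last_visit": None},
    {"cd4_count": 420, "haart_prescribed": None, "last_visit": "2008-05-02"},
]

for field in ("cd4_count", "haart_prescribed", "last_visit"):
    complete = sum(1 for r in records if r.get(field) is not None)
    print(f"{field}: {100.0 * complete / len(records):.0f}% complete")
```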
Example of a Measure’s Specification
• Performance Measure: HAART-Related OPR Measure
No. 12a and Group 1 Measure
 The measure was developed by a panel of HIV clinical experts
• Percentage of clients with AIDS who are prescribed
HAART
 Numerator = number of clients with AIDS who were prescribed a
HAART regimen within the measurement year
 Denominator = Number of clients who have a diagnosis of AIDS
(history of a CD4 T-cell count below 200 cells/mm3 or other AIDS-defining
condition) and had at least one medical visit with a provider with
prescribing privileges (i.e., MD, PA, NP) in the measurement year
(see the computational sketch below)
• Patient Exclusions = Patients newly enrolled in care during the last
three months of the measurement year
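To make the numerator/denominator arithmetic concrete, here is a minimal sketch of how the measure might be computed from abstracted records; the record layout and field names are hypothetical, and a real implementation would apply the specification's inclusion, exclusion, and time-frame rules exactly as written.

```python
# Hypothetical sketch of computing the HAART measure; field names are
# illustrative, not from the measure specification.
clients = [
    {"has_aids": True, "visit_with_prescriber": True,
     "prescribed_haart": True, "newly_enrolled": False},
    {"has_aids": True, "visit_with_prescriber": True,
     "prescribed_haart": False, "newly_enrolled": False},
    {"has_aids": True, "visit_with_prescriber": True,
     "prescribed_haart": True, "newly_enrolled": True},  # excluded
]

# Denominator: clients with AIDS and at least one visit with a
# prescribing provider, excluding those newly enrolled in care during
# the last three months of the measurement year.
denominator = [c for c in clients
               if c["has_aids"] and c["visit_with_prescriber"]
               and not c["newly_enrolled"]]

# Numerator: denominator clients prescribed HAART in the year.
numerator = [c for c in denominator if c["prescribed_haart"]]

rate = 100.0 * len(numerator) / len(denominator) if denominator else 0.0
print(f"Clients with AIDS prescribed HAART: {rate:.1f}%")  # 50.0%
```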
Example of a Measure’s Specification (Cont’d)
• Data Element
 Is the client diagnosed with CDC-defined AIDS? (Y/N)
 If yes, was the client prescribed HAART during the reporting period?
(Y/N)
• Data Sources
 Program Data Report, Section 2, Items 26 and 31 may provide data
useful in establishing a baseline for this performance measure
 Electronic Medical Record/Electronic Health Record
 CAREWare, Lab Tracker, or other electronic database
 HIVQUAL reports on this measure for the grantee under review
 Medical record data abstraction by the grantee of a sample of records
Example of a Measure’s Specification (Cont’d)
• Basis for Selection and Placement in Group 1
 Randomized clinical trials provide strong evidence of improved
survival and reduced disease progression by treating symptomatic
patients and patients with CD4 T-cell counts <200 cells/mm3
 Measure reflects an important aspect of care that significantly
impacts survival and mortality and hinders transmission
 Data collection is currently feasible, and the measure has a strong
evidence base supporting its use
 US Public Health Service Guidelines: "Antiretroviral therapy is
recommended for all patients with a history of an AIDS-defining illness
or severe symptoms of HIV infection regardless of CD4 T-cell count"
 Peer-reviewed clinical studies cited
Collect “Just Enough” Data
• The goal is to improve care, not prove a new
theorem
• Data from 100% of client/patient charts do not
need to be abstracted, automated, and analyzed
• Maximal statistical power is not needed
• In most cases, a straightforward sample will
provide sufficient statistical power to assess
performance
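One way to see why a modest sample suffices is the standard sample-size formula for estimating a proportion, applied with a finite population correction; this sketch is illustrative and not taken from the slides.

```python
# Illustrative sketch (not from the slides): standard sample size for
# estimating a proportion, with a finite population correction.
import math

def sample_size(population, p=0.5, margin=0.05, z=1.96):
    """Charts needed to estimate a rate within +/- margin at ~95% confidence."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite correction

# For a clinic of 600 patients, roughly 235 charts suffice:
print(sample_size(600))
```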
Random Sampling to Collect Data
• Use a random sample if the entire population
cannot easily be measured
• “Random selection” means that each record has an
equal chance of being included in the sample
• The easiest way to select records randomly is to
use a random number table and pull each paper
record or EMR in the random sequence
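In place of a printed random number table, simple random selection can be sketched as follows, assuming records are identified by chart number (the numbers and sample size here are hypothetical).

```python
# Minimal sketch: a simple random sample of chart numbers, so each
# record has an equal chance of inclusion. Chart numbers and the
# sample size are hypothetical.
import random

chart_numbers = list(range(1, 601))          # e.g., charts 1..600
random.seed(2008)                            # fixed seed: reproducible pull list
sample = random.sample(chart_numbers, k=25)  # simple random sample of 25
print(sorted(sample))                        # pull these charts
```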
Resources to Create
Random Samples
• “Measuring Clinical
Performance: A Guide for HIV
Health Care Providers”
(includes random number
tables)
• A useful website for the
generation of random
numbers is
www.randomizer.org
• Common spreadsheet
programs, such as MS Excel
[Image: Sampling Records]
Frequency of Measurement
• You do not have to measure everything all the time
 Plan-Do-Study-Act (PDSA) cycles can be used to sample a short
period of time and extrapolate the results
• Balance the frequency of measurement against the
costs
• If resources are limited, measure areas of concern
more frequently and others less frequently
• Balance the frequency of measurement against
usefulness in producing change
• Consider the audience
 How will frequency best assist in setting priorities and generating
change?
Chart Abstraction Tools and Process
• Must be designed to reflect the measure accurately
 New tools should be field tested using sound methods
• Abstractors should be trained to use the tools
 Inter-rater reliability should be assessed when new tools or measures
are introduced, to identify areas of imprecise measures or instructions
(see the sketch below)
• Abstractors (including clinicians and other
professionals) can provide valuable TA in identifying
areas of weakness in chart documentation
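The slides do not prescribe a reliability statistic; Cohen's kappa is one common choice. A minimal sketch for two abstractors coding the same charts Y/N:

```python
# Illustrative sketch: Cohen's kappa for two abstractors coding the
# same charts Y/N. The slides do not prescribe a statistic; kappa is
# one common choice for inter-rater reliability.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

a = ["Y", "Y", "N", "Y", "N", "Y", "Y", "N"]
b = ["Y", "N", "N", "Y", "N", "Y", "Y", "Y"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.47
```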
Measurement Analysis
• Tabulating results by agency is only a first step
• Benchmarks should be set in advance to which the
results can be compared
• Assess differences between the benchmark and
individual and group performance
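A minimal sketch of such a comparison follows; the agencies, rates, and benchmark value are hypothetical.

```python
# Minimal sketch (agencies, rates, and the benchmark are hypothetical):
# comparing measured performance against a benchmark set in advance.
benchmark = 80.0  # e.g., % of patients on HAART

agency_rates = {"Agency A": 85.2, "Agency B": 74.0, "Agency C": 79.5}

for agency, rate in sorted(agency_rates.items()):
    gap = rate - benchmark
    status = "meets" if gap >= 0 else "below"
    print(f"{agency}: {rate:.1f}% ({status} benchmark by {abs(gap):.1f} points)")
```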
Strategies Depend on Resources
• Data systems enhance capability
 More indicators can be measured
 Indicators can be measured more often
 Entire populations can be measured
 Outcome as well as process indicators can be measured
 Alerts, custom reports help manage care
• Personnel resources
 Person power is needed for chart reviews, logs, and other means of
measurement
 Expertise in electronic/manual measurement
 Ideally, individuals trained in statistics analyze the data
Examples of Measures Used by
Ryan White Program Grantees
in Quality Management
Collaboratives
Funded by HRSA
HIV/AIDS Bureau
Part C & D Collaborative: Measures
(81 Part C&D Grantees, 2000-2001)
Access and Retention
 % of patients with visit(s) in last 3 months
Viral Load
 % of patients with undetectable viral load (below 50
copies/ml)
CD4 Count
 % of patients with CD4 count < 200 cells/mm3
Clinical Care
 % of patients on HAART
Self-Management and Adherence
 % of HAART patients with adherence
counseling/intervention at last visit
Part C & D Collaborative:
% with 3-Month Visit
[Run chart: "HIV Collaborative - Patients with 3-Month Visits," N = all collaborative teams; y-axis: percent (70-86); x-axis: reporting month, May 2000 through December 2001]
Part C & D Collaborative:
% with Adherence Intervention
[Run chart: "HIV Collaborative - Patients with Adherence Intervention (last visit)," N = all collaborative teams; y-axis: percent (60-90); x-axis: reporting month, May 2000 through December 2001]
Part A Collaborative Pilot: Key Measures
(5 Part A Grantees, 2002-2003)
Viral and CD4
 % of patients with CD4 count > 350
 % of patients with viral load < 10,000
Access and Retention
 % of patients entering primary care HIV positive and
asymptomatic
 % of patients with primary care visit(s) in last 3 months
Case Management
 % of patients whose service plan is current
Self-Management
 % of patients with self-management goal setting
Part A Collaborative: Lessons Learned
• Strong leadership for quality improvement at the Part A
level is essential to sustained change
• Data are very hard to obtain due to the complexity of the system
and the lack of integrated information structures
 The value of data in helping improve the system, however, outweighs
the difficulty of obtaining them
• Building information technology system infrastructure is
vital to integration and coordination of services
Part B Collaborative Pilot: Measures
(8 Part B Grantees, 2004-2006)
 % of ADAP applicants approved/denied for ADAP enrollment within two
weeks of receiving a complete application
 % of ADAP enrollees recertified for ADAP eligibility criteria annually
 % of individuals newly reported with HIV infection who also have AIDS
diagnosis
 % of individuals newly reported with HIV infection who progress to
AIDS diagnosis within 12 months of HIV diagnosis
 Ratio of individuals who die within 12 months of HIV diagnosis to the
number of individuals newly reported with HIV infection
 % of individuals with at least two general HIV medical care visits in the
last 12 months
 % of individuals with either a CD4 or viral load in the last six months
Low Incidence Initiative: Key Measures
(17 Part B Grantees, 2007-2008)
• % of Ryan White Program-funded clients who have a CD4
test done at least every six months
• % of applying state ADAP clients approved or denied for
ADAP services within two weeks of ADAP receiving a
complete application
• % of clients with at least two general HIV medical care visits
in the last 12 months who are enrolled in case management
Low Incidence Initiative: Lessons Learned
• Use standardized data collection tools to obtain
reliable and valid data.
• Provide education on data collection methods, while
keeping the concepts as simple as possible.
• Report findings back to stakeholders, especially to
those who collect data.
• Use data to improve systems and the care provided.
• A solid data management system is needed.
Other Data and Measurement Resources
• HIVQUAL www.HIVQUAL.org
• National Quality Center www.NationalQualityCenter.org
• HAB Performance Measures
 http://hab.hrsa.gov/special/habmeasures.htm#draft1
• dataCHATT: JSI Research and Training Institute Cooperative
Agreement
 http://www.datachatt.jsi.com/
• National Quality Measures Clearinghouse
 http://www.qualitymeasures.ahrq.gov/
National Quality Center (NQC)
888-NQC-QI-TA
NationalQualityCenter.org
[email protected]
Funded by HRSA
HIV/AIDS Bureau