Transcript Slide 1

Welcome to the NQC TA Call on
The Basics of Performance Measurement
for Quality Improvement
May 8, 2008
Nanette Brey Magnani, EdD, NQC Consultant
Genevive Meredith, Maine Part B
Jack Rustico, Connecticut Children, Youth and Family Network
Hollie Malamud-Price, Michigan Part B & D
Funded by HRSA
HIV/AIDS Bureau
Linking Performance Measurement
and Quality Improvement
Infrastructure
How to Go in Circles
Trends in QM
• From monitoring (QA) to improvement projects (QM)
• From QA by administrators to QM by teams
• From core medical indicators to expanded scope of process indicators
• From 100% goals to goals by benchmarking (internal, external)
• From data by hand to data by computer
• From process to outcome indicators
• From accountability to consumers to inclusion of consumers
• From program to regional QM
Example: Maine Part B Program
Participant of NQC/HAB Low Incidence Initiative
– Realized the difference between Quality Assurance and Quality Management
– Developed tools to feed data back to agencies for QI; little agency response
– Now QM is part of yearly subgrantee contracts, which include QI Projects/PDSA cycles
Basics of Performance Measurement
• Why measure?
• What to measure?
• When to measure?
• How to measure?
• Strategic planning for measurement
Why Measure?
Reasons to Measure
• Separates what you think is happening from what really is happening
• Establishes a baseline: It’s ok to start out with low scores!
• Determines whether changes actually lead to improvements
• Avoids slippage
Reasons to Measure (cont.)
• Ongoing / periodic monitoring identifies problems as they emerge
• Measurement allows for comparison across sites, programs, EMAs, TGAs and states, and across years
• The Ryan White Treatment Modernization Act of 2006 mandates performance measurement
• The HIV/AIDS Bureau places strong emphasis on quality management
What to Measure
What is a Quality Indicator?
• A quality of care indicator is an aspect of patient care that is measured to evaluate the extent to which a facility provides or achieves a particular element of care.
• Generally, based on specific standards of care derived from guidelines issued by a professional society and/or government agency.
Process Indicator Topic Areas
• Medical processes
• Case management processes
• Clinic / agency / EMA / state processes
• Patient utilization of care
  – underutilization
  – overutilization
  – misutilization
• State, EMA, TGA common processes
• Coordination of care processes
Example: Clinical HAB Core Measures
http://hab.hrsa.gov/special/habmeasures.htm
• % of clients with HIV infection who had 2 or more CD4 T-cell counts performed in the measurement year (sketched below)
• % of clients with AIDS who are prescribed HAART
• % of clients with HIV infection who had 2 or more medical visits in an HIV care setting in the measurement year
• % of clients with HIV infection and a CD4 T-cell count below 200 cells/mm3 who were prescribed PCP prophylaxis
• % of women with HIV infection who are prescribed ARV therapy
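
To make the numerator/denominator arithmetic concrete, here is a minimal sketch in Python of the first measure above. The record layout and names are illustrative assumptions, not part of the HAB specification.

    # Hypothetical layout: one (client_id, test_date) pair per CD4 T-cell count.
    from datetime import date

    clients = {"A", "B", "C"}  # all HIV-infected clients in the measurement year
    cd4_tests = [
        ("A", date(2007, 2, 1)), ("A", date(2007, 9, 15)),
        ("B", date(2007, 5, 3)),
        ("C", date(2007, 1, 20)), ("C", date(2007, 7, 11)),
    ]
    year_start, year_end = date(2007, 1, 1), date(2007, 12, 31)

    # Count each client's CD4 tests that fall inside the measurement year.
    counts = {}
    for client_id, test_date in cd4_tests:
        if year_start <= test_date <= year_end:
            counts[client_id] = counts.get(client_id, 0) + 1

    # Numerator: clients with 2 or more CD4 counts; denominator: all clients.
    numerator = sum(1 for c in clients if counts.get(c, 0) >= 2)
    print(f"{numerator}/{len(clients)} = {100 * numerator / len(clients):.0f}%")  # 2/3 = 67%

The same pattern (define the denominator population, then count who meets the numerator condition) applies to each of the measures listed.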
Example: HIVQUAL Measures
www.hivqual.org
• Clinical visits
• HIV specialist care
• ARV Therapy Management
• Adherence assessment
• HIV monitoring
• Lipid screening
• Gynecology care
• STD management
• Hepatitis C Screening
• Mental Health
• Prevention education
• Health literacy screening (pilot)
• Baseline resistance testing (pilot)
Example: Coordination of Care – Michigan Department of Community Health – Part D
• Initial chart reviews to establish baseline data
  – Referrals documented in care plans
  – Referrals documented between agencies
• Use referral field in CAREWare
  – Case management – documentation of any referral
  – Medical – colposcopy, dental and ophthalmology (new for Part D, Part C)
• Performance measure: Clients with identified needs will have documentation of referrals. 75% of charts will have documentation of referrals. Number of charts documenting referrals / number of clients with documented needs.
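
A minimal sketch of that referral measure in Python; the chart fields here are hypothetical stand-ins for however the data are actually abstracted from CAREWare or paper charts.

    # Numerator: charts documenting referrals; denominator: clients with documented needs.
    charts = [
        {"client": 1, "documented_need": True,  "referral_documented": True},
        {"client": 2, "documented_need": True,  "referral_documented": False},
        {"client": 3, "documented_need": False, "referral_documented": False},
    ]

    denominator = [c for c in charts if c["documented_need"]]
    numerator = [c for c in denominator if c["referral_documented"]]
    rate = 100 * len(numerator) / len(denominator)
    print(f"Referral documentation rate: {rate:.0f}% (target: 75%)")  # 50%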
Outcome Indicator Topic Areas
• Patient Health Status
• Intermediate outcomes like immune and virological status
• Survival
• Symptoms
• Disease progression
• Disability
• Subjective health status
• Hospital and ER visits
Example: Outcomes – Achieving Undetectable Viral Loads
Snapshot of Antiretroviral Treatment and Success at the Grand Junction Western Slope HIV Collaborative Clinic
• Chart review on 4/23/08
• 137 GJ patients with at least one medical visit in the period 7/1/07-12/31/07
• 113 (82%) of GJ patients were on ART
• Of the 113, 97 (87%) and 77 (68%) had an HIV VL < 400 and < 50, respectively
• 75% of the patients not on ART had either a CD4 > 350 or had declined therapy
[Bar chart: "ART Snapshot 4/23/08" – % on ART, % on ART with VL < 400, % on ART with VL < 50; y-axis 0%–90%]
Source: Merilou Johnson and Lucy Graham
Example: Maine Part B Program
Choosing Process and Outcome Measures
• Every agency participating in reporting data is at the table, as well as consumer reps
• First agreed on outcomes of medical case management
• Then determined outputs and the process to achieve outputs
Example cont’d: Maine Part B
• Outcome of medical case management: Improved quality of life
• Outputs: Achievement of short-term client-defined goals
• Processes:
  – Comprehensive annual assessment
  – Quarterly care plans
  – Care plans need a minimum of two client-defined goals as the focus of intervention during that quarter
Example: Maine Statewide All Parts Indicators
Outcomes and Process
3 Outcomes:
– Receive adequate medical monitoring
  • 2 or more medical visits/year
  • 2 or more CD4/year
  • 2 or more VLs/year
– Receive adequate medical care
  • CD4 < 200 receive HAART
  • Pregnant women receiving ARV therapy
  • CD4 < 200 receiving PCP prophylaxis
– Comprehensive Care
  • Case management measures
What is a Good Indicator?
• Relevance – How important is the indicator?
  – Does the indicator affect a lot of people or programs?
  – Does the indicator have a great impact on the programs or patients/clients in your EMA, TGA or state?
• Measurability
  – Can the indicator realistically and efficiently be measured given finite resources?
What is a Good Indicator? (Cont’d)
• Accuracy
  – Is the indicator based on accepted guidelines or developed through formal group decision-making methods?
• Improvability
  – Can the performance rate associated with the indicator realistically be improved given the limitations of services and population?
Specify criteria to define your measurement population
• Location: all sites, or only some?
• Gender: men, women, or both?
• Age: any limits?
• Client conditions: all HIV-infected clients, or only those with a specific diagnosis?
• Treatment status?
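
Written down explicitly, such criteria become an eligibility test applied to every record before sampling. Below is a minimal sketch with hypothetical record fields chosen only to mirror the questions above.

    # Hypothetical chart records.
    records = [
        {"id": 1, "site": "Clinic A", "gender": "F", "age": 34, "on_art": True},
        {"id": 2, "site": "Clinic B", "gender": "M", "age": 17, "on_art": False},
        {"id": 3, "site": "Clinic A", "gender": "M", "age": 52, "on_art": True},
    ]

    def eligible(record):
        return (record["site"] == "Clinic A"   # location: one site only
                and record["age"] >= 18        # age: adults only
                and record["on_art"])          # treatment status: on ART

    measurement_population = [r for r in records if eligible(r)]
    print([r["id"] for r in measurement_population])  # [1, 3]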
Portland TGA Performance Measures: Using the Chronic Care Model Framework
(Columns in the original table: Purpose/Objective; Data Source; Measurable Items; Timeframe)

• Decision Support – Outcomes Reports (annually):
  – At least one service-specific outcome
  – Client access to primary HIV medical care (% of clients with an HIV medical visit in last 6 mos.)
• Decision Support / Clinical Outcomes – Client Information Systems (annually):
  – Stable/improved CD4 counts
  – Care provided according to clinical standards (% clients with PPDs; HAART; PCP/MAC prophylaxis; syphilis & Hep C screening; Pap test)
  – Maintenance of participation in medical care
• Delivery System Design – Client Information Systems, Monthly Service Utilization Reports (monthly), with service-category-specific items:
  – Unduplicated number of clients served YTD
  – Number of clients served monthly
  – Number of service units provided
• Delivery System Design – Client Information Systems, Quarterly Service Utilization Reports (quarterly), separated by service category:
  – Unduplicated number of clients served
  – Number of service units provided
  – Total dollar amount spent
• Annual Client Services Data Review (annually):
  – The proportion of clients served and the proportion of services provided equals or exceeds racial/ethnic minority representation in the epidemic
  – The proportion of clients served and the proportion of services provided equals or exceeds female representation in the epidemic
  – Clients have demonstrated need (as defined by other client characteristics: %FPL, permanent housing status, insurance status, etc.)
  – New clients are served in core service areas
  – An increased proportion of clients are receiving medical care services in the EMA (Unmet Need estimate)
  – Continuity of client care across the care continuum
• Client Satisfaction Survey (annually):
  – Increase/decrease in mean scores across 10 client satisfaction items
  – Themes for suggested service changes
• Contractor Narrative Reports:
  – At least one quality service improvement (QSI) measure per contractor
  – Progress in implementation of CCM within each local contractor is reported
• Patient Self Management – Workshop evaluations:
  – Number of PSMP workshops held
  – Increases in self-efficacy
Process for Developing Relevant and
Accurate Indicators
An example from the Children Youth and Family Network (CYFAN, CT)
Connecticut Statewide Medical Case Management Indicators
• HRSA facilitator assigned through RW Part A and B Project Officer at beginning of process; a very important neutral party for buy-in
• Statewide group representing RW Parts A, B, C and D (mostly providers and administrators at the table) began meeting Summer 2007
• Opportunity was good for informal collaboration across the RW CARE Act, generally missing in Connecticut for some time
• A combination of face-to-face meetings (5 total to date) and email exchange
• Individual administrators and providers across Connecticut were assigned the task of discussing Medical Case Management concepts and indicators with front-line staff and consumers. In some cases this approach worked well; in other cases, not so well.
• Group worked off of the existing HRSA definition of Medical Case Management for the most part
Examples of Medical Case Management Standards in Connecticut
• Client records/care plans will have a medical assessment performed every 3 months and an eligibility & support services assessment performed every 6 months.
• Referral linkages will be tracked.
• Agencies document referrals in the appropriate database and/or progress notes.
• Care Plans should be signed by the case manager developing the plan and by the client. The client’s signature confirms that the client understands the plan (if the client does not sign the Care Plan, document the reason in the client’s Progress Note).
• Progress note entries must include the full legal name, title, and credentials of the person making the entry, must be dated and timed, and must be completed within five (5) days after an interaction with the client.
• Client records will have progress notes updated monthly.
• Medical case managers must meet the minimum training requirements established by Parts A, B, C, D.
Connecticut Statewide Medical Case Management Indicators: Lessons Learned
• Major Advantages
  – More consistent expectations for service
  – Maximization of resources
  – Better client information available
• Major Challenges
  – Higher demand on professional development
  – Information access
  – Adaptation to current client encounters
Indicator Definition Tips
1. Base the indicator on guidelines and standards of care when possible
2. Be inclusive (of staff and consumers) when developing an indicator to create ownership
3. Be clear in terms of patient / program characteristics (gender, age, patient condition, provider type, etc.)
4. Set specific time-frames in indicator definitions
Lessons Learned – Portland TGA; Michigan Part D; Maine Part B
Programmatic lessons:
• Keep clients at the center – performance measures are aimed at improving the quality of care and evaluation programs, not at proving a thesis
• Don’t let measurement guide the program – the results should guide the program, not the measurability
• Work with providers to establish measures – provide technical assistance and make compromises; include consumers in the process as well
• Remain flexible with what’s happening at the program level, e.g., changes in staff
• Learn together
• The reality of baseline data could be a shock, i.e., a lot of work to do!
Lessons Learned (Cont’d)
Technical lessons:
• Be realistic about the time involved in collecting data for certain measures
• It is critical to have a client-level database
• Provide technical assistance – subcontractors have different capacities
• Providers are experts at providing care – we need to provide expertise in data analysis
• Flexibility in terms of software, i.e., CAREWare catching up with what is going on in chart reviews
How to Measure
Create a Plan
• Decide on a sampling plan (sample size, eligible records, draw a random sample)
• Develop data collection tools and instructions
• Train data abstractors
• Run a pilot test (adjust after a few records)
• Inform other staff of the measurement process
• Check for data accuracy
• Remain available for guidance
• Make a plan for display and distribution of data
Using a Random Sample
• Use a random sample if the entire population can’t easily be measured
• “Random selection” means that each record has an equal chance of being included in the sample.
• The easiest way to select records randomly is to find a random number table and pull each record in the random sequence (see the sketch below).
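
The same draw can be scripted rather than done from a printed table; here is a minimal sketch in Python, with the record count and sample size chosen only for illustration.

    import random

    eligible_record_ids = list(range(1, 138))  # e.g., 137 eligible charts

    random.seed(2008)  # fixing the seed makes the draw reproducible for audit
    sample = random.sample(eligible_record_ids, 30)  # each record has an equal chance
    print(sorted(sample))

random.sample draws without replacement, so no chart is selected twice.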
Resources to Randomize
• “Measuring Clinical Performance: A Guide for HIV Health Care Providers” (includes random number tables)
• A useful website for the generation of random numbers is www.randomizer.org
• Common spreadsheet programs, such as MS Excel
Collect “Just enough” Data
• The goal is to improve care, not prove a new theorem
• 100% is not needed
• Maximal power is not needed
• In most cases, a straightforward sample will do just fine
Strategies Depend on Resources
• Data systems enhance capability
  – More indicators can be measured
  – Indicators can be measured more often
  – Entire populations can be measured
  – Outcome as well as process indicators can be measured
  – Alerts and custom reports help manage care
• Personnel resources
  – Person power for chart reviews, logs, and other means of measurement is needed
  – Expertise in electronic / manual measurement
When to Measure
Frequency
• You don’t need to measure everything all of the time. You can sample a short period of time and extrapolate the results
• Balance the frequency of measurement against the cost in resources
• If resources are limited, measure areas of concern more frequently and others less frequently
• Balance the frequency of measurement against usefulness in producing change
• Consider the audience. How will frequency best assist in setting priorities and generating change?
National HIVQUAL Data Reports
• Shows national trends based on self-reported data by participating HIVQUAL grantees
• Provides an opportunity to compare program performance with national data to highlight areas of improvement opportunity
The HIVQUAL Project 2006 Performance Data
Part C and D Programs
Questions for Data Follow-up
• What are the results for key indicators?
• What are the major findings based on the generated data reports and your data analysis?
  – What is the frequency of patients / programs not getting care?
  – What is the impact of not getting the care?
  – How does the performance compare with benchmark data?
  – What is the feasibility of improving the care?
Key Questions for Data Follow Up (Cont’d)
Example: Maine Part B, MDCH, CYFAN
• How can you best share the data results with your key stakeholders (Part A/B QI committees, HIV providers, consumers, etc.)?
• How do you generate ownership among providers and consumers?
• How will you assist in initiating/implementing QI projects to address the data findings? Who will be responsible and what are the next steps?
On Our Way to…
QI Heaven
Contact Information
Genevive Meredith, [email protected], 207-287-4846
State of Maine Center for Disease Control, Ryan White Part B Program

Hollie Malamud-Price, [email protected], 313-456 436
Michigan Dept of Community Health, Detroit

Lucy Graham, [email protected], 970-255-1735
St. Mary’s Family Practice, Grand Junction, CO

Jack Rustico, [email protected], 860-667-7820
Connecticut Children, Youth and Family Network

Margy Robinson, [email protected], 503-988-3030
Portland TGA
National Quality Center (NQC)
NYSDOH AIDS Institute
90 Church Street—13th Floor
New York, NY 10007-2919
888-NQC-QI-TA
[email protected]
NationalQualityCenter.org