
California Pay for Performance
Dolores Yanagihara, MPH
Integrated Healthcare Association
Mendocino Health Information Exchange
June 18, 2008
Agenda
• California P4P Program information
• P4P Results
– Performance
– Public Reporting
– Payment
– Stakeholder Feedback
• Overcoming Program Challenges
– Technical
– Political / Legal
2
Integrated Healthcare Association (IHA)
• Statewide leadership group that promotes
quality improvement, accountability, and
affordability of health care in California
• Mission: to create breakthrough improvements
in health care services for Californians through
collaboration among key stakeholders
• Principal projects:
  – pay for performance
  – medical technology assessment and purchasing
  – measurement and reward of efficiency in health care
  – prevention programs directed at obesity
3
Background
Institute of Medicine (IOM) reports issued a call to action
to improve the quality and safety of U.S. healthcare,
with specific recommendations including:
• Quality measurement and reporting
• Public Transparency
• Incentives for quality improvement
(Pay for Performance)
4
California P4P: History
• 2000: Stakeholder discussions started
• 2002: Testing year
– IHA received CHCF Rewarding Results Grant
• 2003: First measurement year
• 2004: First reporting and payment year
• 2008: Sixth measurement year;
fifth reporting and payment year
5
The California P4P Players
• 8 health plans
  – Aetna, Blue Cross, Blue Shield, Cigna, Health Net, Kaiser, PacifiCare, Western Health Advantage
• 40,000 physicians in 235 physician groups
• HMO commercial members
  – Payout: 5.5 million
  – Public reporting: 11 million*
* Kaiser medical groups participate in public reporting only, starting in 2005
6
Program Governance
• Steering Committee – determine strategy, set policy
• Planning Committee – overall program direction
• Technical Committees – develop measure set
• Payment Committee – recommend payment method
• IHA – facilitates governance / project management
• Sub-contractors
  – NCQA/DDD – data collection and aggregation
  – NCQA/PBGH – technical support
  – Thomson – efficiency measurement
Multi-stakeholders “own” the program
7
Goal of California P4P
To create a compelling set of incentives that will drive breakthrough improvements in clinical quality and the patient experience through:
√ Common set of measures
√ Data aggregation
√ A public report card
√ Health plan payments
8
Organizing Principles
• Measures must be valid, accurate, meaningful to
consumers, important to public health in CA, economical
to collect (admin data), stable, and get harder over time
• New measures are tested and put out for stakeholder
comment prior to adoption
• Data collection is electronic only (no chart review)
• Data from all participating health plans is aggregated to
create a total patient population for each physician group
• Reporting and payment at physician group level
• Financial incentives are paid directly by health plans to
physician groups
9
The California P4P Process
Development Year → Public Comment → Testing Year → Public Comment → Measurement Year → Reporting Year → Data Aggregation and Payments
10
MY 2008 Clinical Measures
• Preventive Care
  – Breast Cancer Screening
  – Cervical Cancer Screening
  – Childhood Immunizations
  – Chlamydia Screening
  – Colorectal Cancer Screening
• Acute Care
  – Treatment for Children with Upper Respiratory Infection
  – Appropriate Testing for Children with Pharyngitis
  – Avoidance of Antibiotic Treatment in Adults with Acute Bronchitis
  – Use of Imaging Studies for Low Back Pain
• Chronic Disease Care
  – Appropriate Meds for Persons with Asthma
  – Cholesterol Mgmt: LDL Screening & Control <100
  – Monitoring of Patients on Persistent Medication
11
MY 2008 Patient Experience Measures
• Specialty Care
• Timely Care and Service composite
• Doctor-Patient Interaction composite
• Care Coordination composite
• Overall Ratings of Care
• Office Staff composite
• Health Promotion composite
12
MY 2008 IT-Enabled “Systemness” Domain
1. Data Integration for Population Management
2. Electronic Clinical Decision Support at the
Point of Care
3. Care Management
   • Coordination with practitioners
   • Chronic care management processes
   • Continuity of care after hospitalization
4. Access and Communication Standards
5. Physician Measurement and Reporting
13
New Domain for MY 2008
Coordinated Diabetes Care Domain
– Diabetes Clinical Measures
• HbA1c screening, poor control >9, good control <7
• LDL screening, control <100
• Nephropathy Monitoring
– Diabetes Population Management Activities
• Diabetes Registry (including blood pressure)
• Actionable Reports on Diabetes care
• Individual Physician Reporting on Diabetes measures
– Diabetes Care Management
14
New Measures for “Testing” in 2008
• Test in 2008 for potential inclusion in MY 2009
• Clinical
– Depression Screening and Assessment of High Risk Patients
– Inpatient Readmissions within 30 Days
– Asthma Medication Ratio
– Evidence-based Cervical Cancer Screening (re-test)
– Potentially Avoidable Hospitalization (re-specify and re-test)
15
Efficiency Measurement
• Purchasers and Health Plans are demanding
that cost be included in the equation
Quality + Cost = Value
• Use both population-based and episode-based
methodologies
• Use both standardized costs and actual costs to
account for utilization and pricing
16
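As context for how standardized and actual costs separate utilization from pricing, here is a minimal sketch in Python; the fee schedule and service records are made up for illustration and this is not the Thomson/Medstat methodology:

```python
# Sketch: standardized vs. actual cost for one physician group's services.
# The standard fee schedule below is hypothetical, not the program's.
STANDARD_FEE = {"office_visit": 80.0, "mri": 600.0, "generic_rx": 12.0}

services = [
    # (service type, actual paid amount)
    ("office_visit", 95.0),
    ("office_visit", 70.0),
    ("mri", 900.0),
    ("generic_rx", 9.0),
]

actual_cost = sum(paid for _, paid in services)                 # utilization x actual prices
standardized_cost = sum(STANDARD_FEE[s] for s, _ in services)   # utilization x standard prices

# A price index > 1.0 means the group's actual prices run above the standard
# schedule, so cost differences are not driven by utilization alone.
price_index = actual_cost / standardized_cost
print(actual_cost, standardized_cost, round(price_index, 2))
```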
Efficiency Measures
1. Generic prescribing (MY 2007)
   • Calculated by cost and by number of scripts
2. Overall Group Efficiency (MY 2009)
   • Episode- and population-based methodologies
   • Calculated using both standardized and actual costs
3. Efficiency by Clinical Area (MY 2009)
   • Calculated using standardized costs
4. Actual to Standardized Pricing Indices (MY 2009)
17
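A minimal sketch of the two ways generic prescribing is calculated, by number of scripts and by cost; the pharmacy claims below are made up for illustration:

```python
# Sketch: generic prescribing rate computed two ways for one physician group.
# (type, paid amount) -- illustrative pharmacy claims, not program data.
rx_claims = [("generic", 8.0), ("generic", 15.0), ("brand", 120.0),
             ("generic", 11.0), ("brand", 95.0)]

generic_scripts = sum(1 for kind, _ in rx_claims if kind == "generic")
generic_cost = sum(paid for kind, paid in rx_claims if kind == "generic")
total_scripts = len(rx_claims)
total_cost = sum(paid for _, paid in rx_claims)

rate_by_scripts = generic_scripts / total_scripts   # 3/5 = 60% of scripts are generic
rate_by_cost = generic_cost / total_cost            # 34/249 ~ 14% of pharmacy spend is generic
```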
CA P4P Data Collection & Aggregation
[Flow diagram: clinical measures arrive as audited rates from administrative data, submitted by either the physician group or the health plans; patient experience measures come via CCHRI as PAS scores; IT-enabled systemness measures come from group survey tools and documentation; efficiency measures are built from plan claims/encounter data files. The data aggregator (NCQA/DDD) produces one set of scores per group, and the efficiency vendor/partner Thomson (Medstat) produces one set of efficiency scores per group. Results feed the physician group report for QI, the health plan report for payment, and the report card vendor for public reporting.]
18
Aggregating Data
Benefits:
• Increase sample size
– More reportable data
– More robust and reliable results
• Measure total patient population
• Produce standardized, consistent performance
information
Requirements:
• Consistent unit of measurement
• Standard, specified measures
19
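A minimal sketch of the aggregation step itself, assuming each plan submits a numerator and denominator per physician group for a standardized measure; field names and figures are illustrative, not the NCQA/DDD implementation:

```python
# Sketch: aggregate one standard measure across plans to get a single rate per group.
from collections import defaultdict

# (plan, physician_group, numerator, denominator) -- illustrative submissions
submissions = [
    ("Plan A", "Group 1", 40, 60),
    ("Plan B", "Group 1", 25, 45),
    ("Plan C", "Group 1", 10, 20),
    ("Plan A", "Group 2", 30, 80),
]

totals = defaultdict(lambda: [0, 0])
for _, group, num, den in submissions:
    totals[group][0] += num
    totals[group][1] += den

# One rate per group over the total patient population across all plans
rates = {g: num / den for g, (num, den) in totals.items()}
print(rates)  # {'Group 1': 0.6, 'Group 2': 0.375}
```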
The Power of Data Aggregation
Aggregating data across plans creates a larger denominator and allows valid reporting and payment for more groups.

                                          % of physician groups with sufficient sample size
                                          to report all clinical measures using:
Health Plan Size      # of Health Plans   Plan Data Only          Aggregated Dataset
< 500K members        3                   16%                     70%
> 1M members          4                   30%                     65%
20
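The gain in the table comes from measures being reportable only when a group's denominator reaches a minimum sample size. A minimal sketch of that check, using a hypothetical threshold of 30 eligible members (the program's actual minimum may differ):

```python
# Sketch: a measure is reportable for a group only if the denominator is large enough.
MIN_DENOMINATOR = 30  # hypothetical threshold for illustration

def reportable(denominators):
    """True when every clinical measure meets the minimum sample size."""
    return all(d >= MIN_DENOMINATOR for d in denominators)

single_plan = [12, 8, 25]        # one plan's members attributed to the group
aggregated = [46, 31, 78]        # the same group pooled across all participating plans

print(reportable(single_plan))   # False -> group could not be scored on plan data alone
print(reportable(aggregated))    # True  -> aggregation makes the group reportable
```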
P4P Results
Overview of P4P Program Results
• Year over year improvement across all measure
domains and measures
• Single public report card through state agency
(Office of the Patient Advocate)
• Incentive payments totaling over $210 million for
measurement years (MY) 2003-2006
• Physician groups highly engaged and generally
supportive
22
Clinical Results MY 2003-2006
[Bar chart: rates for MY 2003 through MY 2006 for Breast Cancer Screening, Cervical Cancer Screening, HbA1c Screening, Chlamydia Screening, and Childhood Immunizations]
23
Regional Variation in Clinical Performance
[Bar chart: MY 2006 results by region for top-performing groups – Inland Empire, Los Angeles, Central Coast, Central Valley, San Diego, Orange County, Bay Area, Sacramento/North – compared with the statewide result]
24
IT Measure 1:
Population Management Activities
[Bar chart: percentage of groups, MY 2003 through MY 2006, by population management activity: Patient Registry, Actionable Reports, HEDIS Results]
25
IT Measure 2:
Point-of-Care Activities
[Bar chart: percentage of groups, MY 2003 through MY 2006, performing point-of-care activities: Electronic Prescribing, Electronic Check of Prescription Interaction, Electronic Retrieval of Lab Results, Electronic Access of Clinical Notes, Electronic Retrieval of Patient Reminders, Accessing Clinical Findings, Electronic Messaging]
26
Correlation Between IT Adoption and
Clinical Performance
[Bar chart: average clinical score for groups with No IT Adoption versus groups earning Full IT Credit]
27
Public Report Card
http://opa.ca.gov/report_card/medicalgroupcounty.aspx
28
Health Plan Payment Results
• Each health plan determines its own reward methodology and payment amount (http://www.iha.org/ftransp.htm)
• Most plans pay on relative performance, after meeting thresholds
• Payouts:
  – $38 M paid out in 2004
  – $54 M paid out in 2005
  – $55 M paid out in 2006
  – $65 M paid out in 2007
  (about 1.5–2% of base pay on average)
29
Paying for Performance & Improvement
Earning Quality Points Example
Measure: Pneumococcal Vaccination
• Attainment threshold: .47; benchmark: .87
• Hospital I: baseline score .21, performance score .70
• The attainment range (threshold to benchmark) and the improvement range (baseline to benchmark) are each divided into scoring intervals (points 1–9)
• Hospital I earns 6 points for attainment and 7 points for improvement
• Hospital I score: maximum of attainment or improvement = 7 points on this measure
Excerpt from CMS Hospital Value-Based Purchasing Listening Session #2, April 12, 2007
30
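A minimal sketch of the attainment/improvement scoring in the example, assuming the linear point-interpolation formulas discussed in the CMS hospital value-based purchasing proposal (attainment interpolated over threshold-to-benchmark on a 0–10 scale, improvement over baseline-to-benchmark capped at 9); the exact CMS rounding rules may differ, but this version reproduces the 6-point and 7-point results above:

```python
def attainment_points(score, threshold, benchmark):
    """Points for where performance falls in the attainment range (threshold..benchmark)."""
    if score >= benchmark:
        return 10
    if score < threshold:
        return 0
    return round(9 * (score - threshold) / (benchmark - threshold) + 0.5)

def improvement_points(score, baseline, benchmark):
    """Points for how far performance moved from baseline toward the benchmark."""
    if score <= baseline:
        return 0
    return min(9, round(10 * (score - baseline) / (benchmark - baseline) - 0.5))

# Slide example: pneumococcal vaccination, Hospital I
attain = attainment_points(score=0.70, threshold=0.47, benchmark=0.87)   # -> 6
improve = improvement_points(score=0.70, baseline=0.21, benchmark=0.87)  # -> 7
measure_score = max(attain, improve)                                     # -> 7 points
```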
Physician Group Engagement
Program Strengths
– Physician groups are highly engaged
– 74% believe the measures are reasonable
– Widespread support for increased incentives
– Belief the program has increased the focus on quality improvement and IT capabilities
Program Weaknesses
– Lack of consumer interest in public reporting
– Concern about the potential for too many measures
Overall Rating
– Mean score of 3.86 for importance (on a 1 to 5 scale)
31
Health Plan Engagement
Program Strengths
– Increased collaboration
– Push toward QI
– Investments in IT
– Greater accountability and transparency
Program Weaknesses
– Improvements viewed as marginal
– Concerns about “teaching to the test”
– Lack of a positive ROI
– Failure of clinical data feed to raise HEDIS scores
Overall Rating
– Mean score of 2.5 (on a 1 to 5 scale)
32
Overcoming Program Challenges
The Data Problem
The data you want:               Claims Data   Paper Medical Record   Electronic Medical Record
• Easy to collect                Y             N                      Y?
• Clinically rich                N             Y                      Y
• Complete and consistent        Y?            N                      Y
• Across product lines/payors    N             Y                      Y
• Whole eligible population      Y             N                      Y
34
Addressing the Data Problem
Enhancing claims data
• Identify and address data gaps
• Encourage use of CPT-II codes
• Develop supplemental clinical data
– Lab results
– Preventive care / chronic disease registries
– Exclusion databases
• Push EMR adoption
35
Addressing the Data Problem
Example: Blood pressure control
– Previously a chart review measure
– Creation of CPT-II codes allows administrative measurement
– Incentivize inclusion in registry
  → Create system for routinely collecting information
36
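A minimal sketch of how a CPT Category II code on a claim can stand in for a chart-reviewed blood pressure value; the code-to-range mapping is illustrative and should be verified against the current CPT II code set, and this is not the P4P measure specification:

```python
# Minimal sketch (not the P4P specification): deriving a blood-pressure-control
# flag from CPT Category II codes on administrative claims. The code-to-range
# mapping below is illustrative -- verify against the current CPT II code set.
SYSTOLIC_CODES = {"3074F": "<130", "3075F": "130-139", "3077F": ">=140"}
DIASTOLIC_CODES = {"3078F": "<80", "3079F": "80-89", "3080F": ">=90"}

def bp_controlled(claim_codes):
    """Return True if the most recent systolic and diastolic CPT II codes
    indicate BP below 140/90, False if not, None if codes are missing."""
    systolic = next((c for c in reversed(claim_codes) if c in SYSTOLIC_CODES), None)
    diastolic = next((c for c in reversed(claim_codes) if c in DIASTOLIC_CODES), None)
    if systolic is None or diastolic is None:
        return None  # no administrative evidence; would previously require chart review
    return systolic in ("3074F", "3075F") and diastolic in ("3078F", "3079F")

# Example: codes reported on a member's claims, oldest to newest
print(bp_controlled(["99213", "3075F", "3078F"]))  # True -> counts as controlled
```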
Data Exchange
• Standard format and data definitions
• Defined data flow process
• Enhanced member matching
• Adequate documentation
37
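Enhanced member matching is one of the items above; a minimal sketch of a deterministic match across plan files, assuming a simplified key of normalized name, date of birth, and sex (not the aggregator's actual matching logic):

```python
# Minimal sketch of deterministic member matching across health plan files.
# The match key (normalized name + DOB + sex) is a simplifying assumption,
# not the actual matching algorithm used by the P4P data aggregator.
from collections import defaultdict

def match_key(member):
    """Build a normalized key so the same person matches across plan files."""
    return (
        member["last_name"].strip().upper(),
        member["first_name"].strip().upper()[:1],   # first initial tolerates nicknames
        member["dob"],                              # ISO date string, e.g. "1961-07-04"
        member["sex"].upper(),
    )

def link_members(plan_files):
    """Group member records from multiple plans under one key per person."""
    linked = defaultdict(list)
    for plan_name, members in plan_files.items():
        for m in members:
            linked[match_key(m)].append((plan_name, m))
    return linked
```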
Data Exchange Issues
LDL<130 Rates – Diabetes Population

                                   N      Admin-Only Mean   All-Data Mean
National HEDIS Rates, MY 2003      313    25                59.8
P4P Plan HEDIS Rates, MY 2003      7      8.4               60

P4P Plan-Specific Rates, MY 2004
Plan 1 (not used in aggregation)    0.0
Plan 2 (not used in aggregation)    0.5
Plan 3 (not used in aggregation)    1.0
Plan 4 (not used in aggregation)    6.3
Plan 5                             21.4
Plan 6                             25.9
Plan 7                             26.3
Self-Report Average                51.0
38
Facilitating Data Exchange
[Diagram: a third-party lab data repository acts as an intermediary for lab results among the lab, the health plan, and physician groups]
39
Legal and Political Issues
• Complying with HIPAA regulations
• Overcoming Non-Disclosure Agreements
• Addressing Data Ownership
40
Addressing Legal and Political Issues
Example #1: Lab results
– Code of Conduct for bi-directional data
exchange
– Lab authorization form
– Disease Management Coordination initiative
41
Addressing Legal and Political Issues
Example #2: Efficiency measurement
– Business Associate Agreements (BAAs)
– Antitrust Counsel
– Consent to Disclosure Agreements
– No group-specific results shared first two
years
– Publicly available sources of data
42
Conclusions on Data Issues
• Data is a limiting factor in performance
measurement
• Administrative data can be enhanced by
supplemental sources
• Data transfer of supplemental sources
needs to be standardized
• Aggregation can make results more robust
• Legal and political issues carry as much
weight as technical issues
43
Summary
• Initial process goals achieved
• “Breakthrough” outcome goal not achieved
• Strong collaborative “platform” established
• Fundamental changes in direction and implementation required to address emerging affordability goal
44
California Pay for Performance
For more information:
www.iha.org
(510) 208-1740
Initial support for IHA Pay for Performance provided
by California Health Care Foundation
45