Overview of 2004-2005 External Quality Review (EQR) Activities

Performance Improvement Projects Technical Assistance – PIP 101
Monday, June 18, 2007
1:30 p.m. – 3:00 p.m.
David Mabb, MS, CHCA
Sr. Director of Statistical Evaluation
Presentation Outline
• Balanced Budget Act (BBA) of 1997
• Who is HSAG?
• Overview of the PIP process
• PIP Summary Form Review
– MCO demographics
– CMS rationale
– HSAG evaluation elements
• PIP Scoring Methodology
• HSAG Contact Information
• Questions and Answers
Balanced Budget Act (BBA) of 1997
The BBA requires states with Medicaid
managed care programs to implement certain
standards and business practices pertaining
to:
– Enrollee Rights and Responsibilities
– Quality Assessment and Performance
Improvement
– Grievance and Appeals System
Who is HSAG?
• We are an External Quality Review
Organization.
• To date, we have validated over 300 PIP
studies.
• We validate Managed Care Organization and
Behavioral Health Organization PIPs.
Overview of PIPs
What is the purpose of a PIP?
• To assess and improve processes and,
subsequently, outcomes of care. A PIP typically
consists of a baseline measurement, intervention
period(s), and remeasurement(s).
What is a PIP?
• It is a quality improvement project.
Primary Objective of a PIP
• Measurement of performance using objective
quality indicators.
• Implementation of system interventions to
achieve improvement in quality.
• Evaluation of the effectiveness of the
interventions.
• Planning and initiation of activities for
increasing or sustaining improvement.
PIP Validation
Ensures that:
• PIPs are designed, implemented, and
reported in a methodologically sound manner.
• Real improvement in the quality of care can
be achieved.
• Documentation is compliant with CMS
protocols for conducting PIPs.
• Stakeholders can have confidence in the
reported improvements.
Overview of PIPs (cont.)
The PIP process provides an opportunity to:
– Identify and measure a targeted area (clinical or nonclinical)
– Analyze the results
– Implement interventions for improvement
Overview of PIPs (cont.)
HSAG’s role:
• Validates PIPs using CMS’ protocol, Validating
Performance Improvement Projects, A Protocol for
Use in Conducting Medicaid External Quality Review
Activities, Final Protocol, Version 1.0.
• Conducts PIP validation as a desk audit evaluation.
• Assesses the likely validity and reliability of the
study’s findings.
• Provides PIP Validation Reports to AHCA and the
MCOs.
• Identifies best practices.
PIP Review Process
• The PIP review team includes clinicians and
statisticians.
• Each PIP study is assigned one clinician and one
statistician, who each read the study in its
entirety and score it independently.
• If there are scoring discrepancies, the reviewers
meet to reconcile them.
PIP Review Process (cont.)
• Once scored, the PIP Validation Reports are
sent to AHCA.
• The MCOs have an opportunity to comment
on any miscalculations or errors noted in the
PIP Validation Report.
• The final PIP Validation Reports are then
released to the MCOs.
PIP Summary Form Review
• Health plan demographics (first page of the
submission form)
• Discuss the 10 PIP Activities
– CMS Rationale
– HSAG evaluation elements
A. Activity One: Choose the Selected
Study Topic
CMS Rationale
• Impacts a significant portion of the members.
• Reflects Medicaid enrollment in terms of
demographic characteristics, prevalence of
disease, and the potential consequences
(risks) of the disease.
A. Activity One: Choose the Selected
Study Topic
CMS Rationale
• Addresses the need for a specific service.
• Goal should be to improve processes and
outcomes of health care.
• The study topic may be specified by the State
Medicaid agency or selected on the basis of
Medicaid enrollee input.
A. Activity One: Choose the Selected
Study Topic
HSAG Evaluation Elements
• Reflects high-volume or high-risk conditions
(or was selected by the State).
• Is selected following collection and analysis
of data (or was selected by the State).
• Addresses a broad spectrum of care and
services (or was selected by the State).
A. Activity One: Choose the Selected
Study Topic
HSAG Evaluation Elements (cont.)
• Includes all Medicaid eligible populations that
meet the study criteria.
• Includes members with special health care
needs.
• Has the potential to affect member health,
functional status, or satisfaction.
A. Activity One: Choose the Selected
Study Topic
Example Study Topics:
• Cervical Cancer Screening
• HbA1c testing
• Flu Vaccinations
• Timeliness of Case Management
• Discharge Planning
• Readmission to Inpatient Psychiatric Care
within 30, 90, and 120 days
B. Activity Two: The Study Question
CMS Rationale
• Stating the question(s) helps maintain the
focus of the PIP and sets the framework for
data collection, analysis, and interpretation.
B. Activity Two: The Study Question
HSAG Evaluation Elements
• States the problem to be studied in simple
terms.
• Is answerable.
In general, the question should take the form:
Does doing X result in Y?
Example: Will increased planning and attention to the importance
of follow-up after inpatient discharge improve the rate of
members receiving follow-up services?
C. Activity Three: Selected Study Indicators
CMS Rationale
• Quantitative or qualitative characteristic.
• Discrete event (member has or has not had
XX).
• Appropriate for the study topic.
• Objective, clearly and unambiguously
defined.
C. Activity Three: Selected Study Indicators
HSAG Evaluation Elements
The study indicator(s):
• Is well defined, objective, and measurable.
• Is based on practice guidelines, with sources
identified.
C. Activity Three: Selected Study Indicators
HSAG Evaluation Elements (cont.)
The study indicator(s):
• Allows for the study question to be answered.
• Measures changes (outcomes) in health or
functional status, member satisfaction, or
valid process alternatives.
C. Activity Three: Selected Study Indicators
HSAG Evaluation Elements (cont.)
The study indicator(s):
• Has available data that can be collected on
each indicator.
• Is a nationally recognized measure such as
HEDIS®, when appropriate.
• Includes the basis on which each indicator
was adopted, if internally developed.
HEDIS® is a registered trademark of the National Committee for Quality Assurance (NCQA).
D. Activity Four: Identified Study Population
CMS Rationale
• Represents the entire Medicaid eligible
enrolled population.
• Allows systemwide measurement.
• Implements improvement efforts to which the
study indicators apply.
D. Activity Four: Identified Study Population
HSAG Evaluation Elements
The method for identifying the eligible
population:
• Is accurately and completely defined.
• Includes requirements for the length
of a member’s enrollment in the
managed care plan.
• Captures all members to whom the
study question applies.
D. Activity Four: Identified Study Population
Example of Study Population:
All Medicaid children with at least 11 months of
continuous enrollment in the health plan (12
months of enrollment with no more than one
30-day gap), who were born on or between
January 1, 2001, and December 31, 2003.
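As a rough illustration only (not part of the CMS protocol or of HSAG's validation tool), the Python sketch below shows one way a health plan might check this continuous-enrollment criterion from a member's coverage spans. The function name, the span format, and the sample dates are assumptions made up for the example.

    from datetime import date, timedelta

    def meets_continuous_enrollment(spans, period_start, period_end, max_gap_days=30):
        """True if the member is covered for the whole measurement period
        except for at most one gap of max_gap_days or fewer.

        spans: list of (start, end) coverage spans with inclusive dates,
        assumed non-overlapping.
        """
        gaps = []
        next_uncovered = period_start  # first day not yet known to be covered
        for start, end in sorted(spans):
            if end < period_start or start > period_end:
                continue  # span falls entirely outside the measurement period
            if start > next_uncovered:
                gaps.append((start - next_uncovered).days)  # uncovered days
            next_uncovered = max(next_uncovered, min(end, period_end) + timedelta(days=1))
        if next_uncovered <= period_end:
            gaps.append((period_end - next_uncovered).days + 1)
        # Allow at most one gap, and that gap may be no longer than 30 days.
        return len(gaps) <= 1 and all(g <= max_gap_days for g in gaps)

    # A single 20-day gap (July 1-20) in an otherwise fully covered year: eligible.
    spans = [(date(2006, 1, 1), date(2006, 6, 30)),
             (date(2006, 7, 21), date(2006, 12, 31))]
    print(meets_continuous_enrollment(spans, date(2006, 1, 1), date(2006, 12, 31)))  # True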
E. Activity Five: Valid Sampling Techniques
CMS Rationale
• Sample size impacts the level of statistical
confidence in the study.
– Statistical confidence is a numerical statement
of the probable degree of certainty or
accuracy of an estimate.
• Reflects improvement efforts to which the study
indicators apply.
• Reflects the entire population or a sample of that
population.
E. Activity Five: Valid Sampling Techniques
HSAG Evaluation Elements
• Consider and specify the true or
estimated frequency of occurrence.
• Identify the sample size.
• Specify the confidence level to be
used.
E. Activity Five: Valid Sampling Techniques
HSAG Evaluation Elements (cont.)
• Specify the acceptable margin of error.
• Ensure a representative sample of the
eligible population.
• Ensure that the sampling techniques
are in accordance with generally
accepted principles of research design
and statistical analysis.
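The frequency of occurrence, confidence level, and margin of error listed above are the inputs to a standard sample-size calculation for a proportion. The sketch below uses the usual normal-approximation formula with a finite-population correction; it is a generic illustration, not a formula HSAG prescribes, and the 95 percent confidence, 5 percent margin of error, and population of 4,000 are assumed values.

    import math

    # z-scores for common confidence levels (normal approximation)
    Z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}

    def sample_size(expected_rate, margin_of_error, confidence=0.95, population=None):
        """Sample size needed to estimate a proportion within +/- margin_of_error.

        expected_rate: true or estimated frequency of occurrence (0-1)
        population: eligible population size; if given, apply the
        finite-population correction.
        """
        z = Z[confidence]
        n = (z ** 2) * expected_rate * (1 - expected_rate) / (margin_of_error ** 2)
        if population is not None:
            n = n / (1 + (n - 1) / population)  # finite-population correction
        return math.ceil(n)

    # Example: an indicator expected to occur about 50% of the time, estimated to
    # within +/- 5 percentage points at 95% confidence, from 4,000 eligible members.
    print(sample_size(0.50, 0.05, confidence=0.95, population=4000))  # -> 351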
F. Activity Six: Data Collection Procedures,
Data Collection Cycle, and Data Analysis
CMS Rationale
• Administrative data collection.
• Manual data collection.
• Inter-rater reliability.
• Frequency of collection and analysis
cycle.
F. Activity Six: Data Collection Procedures,
Data Collection Cycle, and Data Analysis
HSAG Evaluation Elements
The data collection techniques:
• Provide clearly defined data elements
to be collected.
• Clearly specify sources of data.
• Provide for a clearly defined and
systematic process for collecting data
that includes how baseline and
remeasurement data will be collected.
F. Activity Six: Data Collection Procedures,
Data Collection Cycle, and Data Analysis
HSAG Evaluation Elements (cont.)
The data collection techniques:
• Provide for a timeline for the collection
of baseline and remeasurement data.
• Provide for qualified staff and
personnel to collect manual data.
F. Activity Six: Data Collection Procedures,
Data Collection Cycle, and Data Analysis
HSAG Evaluation Elements (cont.)
The manual data collection tool:
• Ensures consistent and accurate
collection of data according to
indicator specifications.
• Supports inter-rater reliability.
• Has clear and concise written
instructions for completion.
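The presentation does not name a specific inter-rater reliability statistic, but Cohen's kappa is a common choice when two abstractors independently review the same records. The sketch below is a generic illustration with hypothetical ratings; it is not drawn from the HSAG tool.

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters assigning categorical labels to the
        same set of abstracted records (equal-length lists of labels)."""
        n = len(rater_a)
        labels = set(rater_a) | set(rater_b)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        expected = sum(
            (rater_a.count(label) / n) * (rater_b.count(label) / n) for label in labels
        )
        return (observed - expected) / (1 - expected)

    # Hypothetical re-abstraction of 10 records by two reviewers.
    a = ["met", "met", "not met", "met", "met", "not met", "met", "met", "met", "not met"]
    b = ["met", "met", "not met", "met", "not met", "not met", "met", "met", "met", "met"]
    print(round(cohens_kappa(a, b), 2))  # 0.52 (moderate agreement)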
F. Activity Six: Data Collection Procedures,
Data Collection Cycle, and Data Analysis
HSAG Evaluation Elements (cont.)
• An overview of the study in the written
manual data collection tool instructions.
• Administrative data collection algorithms
that show steps in the production of
indicators.
• An estimated degree of automated
data completeness (important if using
the administrative method).
G. Activity Seven: Improvement Strategies
CMS Rationale
• An intervention designed to change
behavior at all levels of the care
delivery system, including the
members.
• Changing performance, according to
predefined quality indicators.
• Appropriate interventions.
• Likelihood of effecting measurable
change.
G. Activity Seven: Improvement Strategies
HSAG Evaluation Elements
Planned/implemented strategies for improvement
are:
• Related to causes/barriers identified through data
analysis and Quality Improvement (QI) processes.
• System changes that are likely to induce
permanent change.
• Revised if original interventions are not
successful.
• Standardized and monitored if interventions are
successful.
G. Activity Seven: Improvement Strategies
HSAG Evaluation Elements (cont.)
Planned/implemented strategies for improvement:
• May be at the health plan, provider, or
member level
• Should be realistic, feasible, and
clearly defined
• Need a reasonable amount of time to
be effective
H. Activity Eight: Data Analysis and
Interpretation of Study Results
CMS Rationale
• Initiated using statistical analysis
techniques.
• Included an interpretation of the
extent to which the study was
successful.
H. Activity Eight: Data Analysis and
Interpretation of Study Results
HSAG Evaluation Elements
The data analysis:
• Is conducted according to the data
analysis plan in the study design.
• Allows for generalization of the results
to the study population if a sample
was selected.
• Identifies factors that threaten internal
or external validity of findings.
• Includes an interpretation of findings.
H. Activity Eight: Data Analysis and
Interpretation of Study Results
HSAG Evaluation Elements (cont.)
The data analysis:
• Is presented in a way that provides accurate,
clear, and easily understood information.
• Identifies initial measurement and
remeasurement of study indicators.
• Identifies statistical differences between initial
measurement and remeasurement.
• Identifies factors that affect the ability to compare
initial measurement with remeasurement.
• Includes the extent to which the study was
successful.
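The slides do not mandate a particular statistical test for the difference between baseline and remeasurement rates; a two-proportion z-test is one common choice when the indicator is a rate. The sketch below (two-sided test with a pooled standard error) and its counts are illustrative only.

    import math

    def two_proportion_z(successes_1, n_1, successes_2, n_2):
        """Two-sided two-proportion z-test using a pooled standard error.
        Returns the z statistic and an approximate p-value via the normal CDF."""
        p1, p2 = successes_1 / n_1, successes_2 / n_2
        pooled = (successes_1 + successes_2) / (n_1 + n_2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n_1 + 1 / n_2))
        z = (p2 - p1) / se
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Hypothetical rates: 52% at baseline (n=400) vs. 61% at remeasurement (n=400).
    z, p = two_proportion_z(208, 400, 244, 400)
    print(f"z = {z:.2f}, p = {p:.4f}")  # difference is significant if p < 0.05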
I. Activity Nine: Study Results and Summary
Improvement
CMS Rationale
• Probability that improvement is true
improvement.
• Included an interpretation of the extent to
which any changes in performance are
statistically significant.
I. Activity Nine: Study Results and Summary
Improvement
HSAG Evaluation Elements
• The remeasurement methodology is the
same as the baseline methodology.
• There is documented improvement in
processes or outcomes of care.
• The improvement appears to be the result of
intervention(s).
• There is statistical evidence that observed
improvement is true improvement.
J. Activity Ten: Sustained Improvement
CMS Rationale
• Change results from modifications in the
processes of health care delivery.
• If real change has occurred, the project
should be able to achieve sustained
improvement.
J. Activity Ten: Sustained Improvement
HSAG Evaluation Elements
• Repeated measurements over comparable time
periods demonstrate sustained improvement, or
that a decline in improvement is not statistically
significant.
PIP Scoring Methodology
HSAG Evaluation Tool
• 13 Critical Elements
• 53 Evaluation Elements
(including the Critical Elements)
PIP Scoring Methodology
Overall PIP Score
Percentage Score for all Evaluation Elements:
Calculated by dividing the total Met (includes critical
elements) by the sum of the total Met, Partially Met,
and Not Met.
Percentage Score for Critical Elements:
Calculated by dividing the total critical elements Met by
the sum of the critical elements Met, Partially Met,
and Not Met.
Validation Status: Met, Partially Met, Not Met
PIP Scoring Methodology
Met
(1) All critical elements were Met, and
(2) 80%–100% of all elements were Met
across all activities.
PIP Scoring Methodology
Partially Met
(1) All critical elements were Met,
and 60% to 79% of all elements were Met
across all activities;
or
(2) One or more critical element(s) were
Partially Met.
PIP Scoring Methodology
Not Met
(1) All critical elements were Met and <60% of
all elements were Met across all activities;
or
(2) One or more critical element(s) were Not
Met.
PIP Scoring Methodology
Not Applicable (NA)
NA elements (including critical elements)
were removed from all scoring.
Not Assessed
Not Assessed elements (including critical
elements) were removed from all scoring.
PIP Scoring Methodology
Example 1
• Met = 43, Partially Met = 2, Not Met = 0, NA
= 8, and all critical elements were Met.
• The MCO receives an overall Met status,
indicating the PIP is valid.
• The score for the MCO is calculated as
43/45 = 95.6 percent.
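Using the pip_validation sketch above, Example 1 reproduces as follows; the 8 NA elements are simply excluded from the counts.

    # Example 1: Met = 43, Partially Met = 2, Not Met = 0, NA = 8,
    # and no critical elements were Partially Met or Not Met.
    score, status = pip_validation(met=43, partially_met=2, not_met=0,
                                   critical_partially_met=0, critical_not_met=0)
    print(f"{score:.1%} -> {status}")  # 95.6% -> Met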
PIP Tips
1. Complete the demographic page before submission.
2. Notify HSAG when the PIP documents are uploaded
to the secure FTP site and state the number of documents
uploaded.
3. Label ALL attachments and reference them
in the body of the PIP study.
4. HSAG does not require personal health
information to be submitted. Submit only aggregate
results.
5. Document, document, and document!!
6. Look for the CMS Protocols on myfloridaeqro.com. If you
have additional questions, contact HSAG.
HSAG Contacts
For questions contact:
• Cheryl Neel
– [email protected]
– 602.745.6201
• Denise Driscoll
– [email protected]
– 602.745.6260
Questions and Answers