Overview of 2004-2005 External Quality Review (EQR) Activities


Transcript: Overview of 2004-2005 External Quality Review (EQR) Activities

WELCOME
to the PIP Technical Assistance Training for Florida NHDPs
We will begin shortly. Please place your phone on mute unless you are speaking. Thank you.
Performance Improvement Projects (PIPs)
Technical Assistance for Florida Medicaid NHDPs
August 22, 2007
Christi Melendez, RN
PIP Review Team Project Leader
Presentation Outline
• Purpose
• Review of PIP Activities II through X
• Review of the PIP submission process for the 2007-2008 validation cycle
• Questions and Answers
PURPOSE
• To provide technical assistance with examples for Activities receiving an overall score of Partially Met or Not Met for the 2006-2007 validation cycle.
• To review the PIP submission process for the 2007-2008 validation cycle.
Activity Two: The Study Question
HSAG Evaluation Criteria
• The study question stated the problem to be studied in simple terms.
• The study question was answerable.
*In general, the question should illustrate the point: Does doing X result in Y?
Examples of Study Questions
• Do targeted interventions increase the number of members completing an Advance Directive within the first 30 days of enrollment?
• Will member interventions increase the rate of members who receive a flu vaccine?
• Will targeted interventions decrease the rate of missed personal care aide visits?
Activity Three: Selected Study Indicators
HSAG Evaluation Criteria
• Was well defined, objective, and measurable.
• Was based on practice guidelines, with sources identified. If no practice guidelines were available for the topic, please specify.
• Aligned with the study question(s) and allowed for the study question(s) to be answered.
Activity Three: Selected Study Indicators
HSAG Evaluation Criteria (cont.)
• Measured changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives.
• There were data available to be collected on each indicator.
• Included the basis for how each indicator was developed.
Activity Three: Study Indicator
EXAMPLE
Study Indicator 1 (rationale): According to the Diversion Program policy, members must have an Advance Directive completed within 30 days of enrollment into the Diversion Program.
Numerator: The number of members who completed an Advance Directive within the first 30 days of enrollment.
Denominator: The total number of members continuously enrolled for at least 30 days during the study period.
First Measurement Period Dates (Baseline): 1/1/2005-12/31/2005
Benchmark: 65%
Source of Benchmark: 2005 Diversion Program Policy/Procedure
Baseline Goal: 80%
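To make the arithmetic behind such an indicator concrete, here is a minimal Python sketch of one way to represent a study indicator and compute its rate against the benchmark and goal. The class and field names are hypothetical, not part of the HSAG Summary Form; the counts come from the baseline results shown later in Activity IX.

```python
from dataclasses import dataclass

@dataclass
class StudyIndicator:
    """Hypothetical container for one PIP study indicator."""
    name: str
    numerator: int     # e.g., members completing an Advance Directive within 30 days
    denominator: int   # e.g., members continuously enrolled at least 30 days
    benchmark: float   # e.g., 0.65 per the 2005 Diversion Program Policy/Procedure
    goal: float        # e.g., 0.80 baseline goal

    @property
    def rate(self) -> float:
        """Indicator rate = numerator / denominator."""
        return self.numerator / self.denominator

    def status(self) -> str:
        """Describe where the measured rate falls relative to the benchmark and goal."""
        if self.rate >= self.goal:
            return "goal met"
        if self.rate >= self.benchmark:
            return "at or above benchmark, below goal"
        return "below benchmark"

# Baseline counts reported in Activity IX: 62 of 440 members.
indicator = StudyIndicator("Advance Directive within 30 days", 62, 440, 0.65, 0.80)
print(f"{indicator.rate:.1%} -> {indicator.status()}")  # 14.1% -> below benchmark
```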
Activity Four: The Study Population
HSAG Evaluation Criteria
• Was accurately and completely defined.
• Included requirements for the length of a member's enrollment in the NHDP. If enrollment length is not applicable, this should be stated in the PIP text.
• Captured all members to whom the study question applies.
• Included ICD-9 codes and procedure codes (if applicable).
Activity Four: The Study Population
Example:
All members 65 years of age or older who
were continuously enrolled in the NHDP for at
least 30 days during 1/1/06-12/31/06.
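A brief sketch of how a population definition like this could be applied to an enrollment extract, assuming Python with pandas; the table and column names are illustrative, not a required file layout.

```python
import pandas as pd

# Hypothetical enrollment extract; column names are illustrative only.
enrollment = pd.DataFrame({
    "member_id":    [101, 102, 103],
    "birth_date":   pd.to_datetime(["1935-04-02", "1950-07-19", "1938-11-30"]),
    "enroll_start": pd.to_datetime(["2006-01-15", "2006-03-01", "2006-12-20"]),
    "enroll_end":   pd.to_datetime(["2006-12-31", "2006-12-31", "2006-12-31"]),
})

study_start, study_end = pd.Timestamp("2006-01-01"), pd.Timestamp("2006-12-31")

# 65 years of age or older as of the start of the study period.
age_ok = (study_start - enrollment["birth_date"]).dt.days / 365.25 >= 65

# Continuously enrolled at least 30 days within 1/1/06-12/31/06.
days_enrolled = (
    enrollment["enroll_end"].clip(upper=study_end)
    - enrollment["enroll_start"].clip(lower=study_start)
).dt.days + 1
enrolled_ok = days_enrolled >= 30

study_population = enrollment[age_ok & enrolled_ok]
print(study_population["member_id"].tolist())  # members meeting both criteria
```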
Activity Five: Sampling Techniques
HSAG Evaluation Criteria
• The true or estimated frequency of occurrence was provided and considered in the sampling technique.
• Sample size was specified.
• Confidence level was specified.
• Acceptable margin of error was specified.
Activity Five: Sampling Techniques
HSAG Evaluation Criteria (cont.)
• The sampling technique ensured a representative sample of the eligible population.
• Sampling techniques were in accordance with generally accepted principles of research design and statistical analysis. Valid sampling techniques, which can be replicated using the reported sampling parameters, should be used for all study indicators (see the sketch that follows).
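One simple way to make a random sample replicable from reported parameters is to document a random seed along with the sample size. A minimal Python sketch, with an illustrative seed value:

```python
import random

def draw_sample(member_ids, sample_size, seed=20070822):
    """Simple random sample that can be reproduced from the reported parameters.
    The seed value here is illustrative; documenting it in the PIP lets reviewers
    replicate the exact draw."""
    rng = random.Random(seed)  # fixed seed -> identical sample on every run
    return sorted(rng.sample(list(member_ids), sample_size))

eligible = range(1, 66)            # e.g., 65 eligible members at baseline
print(draw_sample(eligible, 56))   # the same 56 members every time with this seed
```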
Activity Five: Sampling Technique
EXAMPLE
E. Activity 5: Use sound sampling methods. If sampling is to be used to select consumers of the study, proper sampling
techniques are necessary to provide valid and reliable information on the quality of care provided. The true prevalence or
incidence rate for the event in the population may not be known the first time a topic is studied.
Measure: Advance Directive (Baseline: 1/1/05-12/31/05; First Remeasurement: 1/1/06-12/31/06; Second Remeasurement: 1/1/07-12/31/07)
Sampling Method (describe): No sampling; the entire population was used
Study Implementation Phase
Activity Five: Sampling Technique
Example
E. Activity 5: Use sound sampling methods. If sampling is to be used to select consumers of the study, proper sampling
techniques are necessary to provide valid and reliable information on the quality of care provided. The true prevalence or
incidence rate for the event in the population may not be known the first time a topic is studied.
Measure: Advanced Directive, Baseline: 1/1/05 – 12/31/05
Sample Error and Confidence Level: 95% confidence level, +/- 5% margin of error
Sample Size: 56
Population: 65
Method for Determining Size (describe): Online sample size calculator: www.surveysystem.com
Sampling Method (describe): Simple random sampling

Measure: Advanced Directive, Remeasurement 1: 1/1/06 – 12/31/06
Sample Error and Confidence Level: 95% confidence level, +/- 5% margin of error
Sample Size: 135
Population: 200
Method for Determining Size (describe): Online sample size calculator: www.surveysystem.com
Sampling Method (describe): Simple random sampling

Measure: Advanced Directive, Remeasurement 2: 1/1/07 – 12/31/07
Sample Error and Confidence Level: 95% confidence level, +/- 5% margin of error
Sample Size: 99
Population: 130
Method for Determining Size (describe): Online sample size calculator: www.surveysystem.com
Sampling Method (describe): Simple random sampling
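The sample sizes above follow the standard formula for estimating a proportion at a 95% confidence level with a +/- 5% margin of error, adjusted for a finite population. A Python sketch of that calculation (an online calculator such as the one cited may round slightly differently):

```python
import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Sample size for a proportion at the given confidence level and margin of error,
    with the finite population correction applied."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population sample size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # adjust for the small population

for period, population in [("Baseline", 65), ("Remeasurement 1", 200), ("Remeasurement 2", 130)]:
    print(f"{period}: population {population} -> sample {sample_size(population)}")
# Baseline matches the table exactly (56 of 65); the remeasurement figures land within
# a few members of the reported 135 and 99, depending on rounding.
```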
Activity Six: Data Collection
HSAG Evaluation Criteria
• Data elements to be collected were clearly identified.
• The data sources were clearly identified.
• A systematic method for data collection was outlined in the PIP documentation.
• A timeline included both starting and ending dates for all measurement periods.
Activity Six: Data Collection
For Manual Data Collection
• The relevant education, experience, and training of all manual data collection staff were described in the PIP text.
• The manual data collection tool was included with the PIP submission.
• A discussion of the interrater reliability process was documented in the PIP text.
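Interrater reliability can be quantified in several ways; percent agreement on a re-abstracted subset of records is a simple one (Cohen's kappa is another common choice). A small Python sketch with made-up abstraction results:

```python
# Hypothetical re-abstraction of 10 records by two reviewers
# (1 = Advance Directive found in the record, 0 = not found).
abstractor_a = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
abstractor_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

matches = sum(a == b for a, b in zip(abstractor_a, abstractor_b))
percent_agreement = matches / len(abstractor_a)
print(f"Interrater agreement: {percent_agreement:.0%} on {len(abstractor_a)} records")
# 80% here. The PIP text would document how many records were re-abstracted,
# the agreement threshold used, and how disagreements were resolved.
```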
Activity Six: Data Collection
HSAG Evaluation Criteria (cont.)
• Written instructions for the manual data collection tool were clearly and succinctly written and included in the PIP documentation.
• A brief statement about the purpose of the study (overview) was included in the written instructions for the manual data collection tool.
Activity Six: Data Collection
For Administrative Data Collection
• Documentation should include a systematic process of the steps used to collect data. This can be defined in narrative format or with algorithms/flow charts.
• The estimated degree of administrative data completeness should be included, along with an explanation of how the percentage of completeness was calculated.
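One common way to estimate administrative data completeness is to compare the claims/encounters actually received for the measurement period against the volume expected for that period. A Python sketch with illustrative figures (the 90 percent result mirrors the example entry in the Activity 6c form later in this transcript):

```python
# Hypothetical completeness check: claims received vs. claims expected for the period.
expected_claims = 12_400   # e.g., projected from prior-year monthly volume
received_claims = 11_160   # claims/encounters loaded for the measurement period

completeness = received_claims / expected_claims
print(f"Estimated administrative data completeness: {completeness:.0%}")  # 90%

# The PIP text should explain how the expected count was projected (for example,
# prior-year volume adjusted for enrollment changes) so the percentage can be verified.
```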
Activity Six: Data Collection
Data Sources
[ x ] Hybrid (medical/treatment records and administrative)
[ x ] Administrative Data
Data Source
[ x] Programmed pull from claims/encounters
[ ] Complaint/appeal
[ ] Pharmacy data
[ ] Telephone service data /call center data
[ ] Appointment/access data
[ ] Delegated entity/vendor data ____________________________
[ ] Other _______________________
[ x ] Medical/Treatment Record Abstraction
Record Type
[ x ] Outpatient
[ ] Inpatient
[ ] Other ____________________________
Other Requirements
[ x ] Data collection tool attached
[ x ] Data collection instructions attached
[ x ] Summary of data collection training attached
[ x ] IRR process and results attached
[ ] Other data
Description of data collection staff (include training,
experience and qualifications):
3 RNs with BSN degrees and 5 years of clinical and quality improvement experience will perform medical record abstraction. All 3 RNs have attended 8 hours of training on how to perform data abstraction and use the manual data collection tool.
Other Requirements
[ ] Data completeness assessment attached
[ ] Coding verification process attached
[ ] Survey Data
Fielding Method
[ ] Personal interview
[ ] Mail
[ ] Phone with CATI script
[ ] Phone with IVR
[ ] Internet
[ ] Other ____________________________
Other Requirements
[ ] Number of waves _____________________________
[ ] Response rate _____________________________
[ ] Incentives used _____________________________
Activity Six: Data Collection
F. Activity 6b: Determine the data collection cycle. Determine the data analysis cycle.

Data collection cycle:
[ ] Once a year
[ ] Twice a year
[ ] Once a season
[x] Once a quarter
[ ] Once a month
[ ] Once a week
[ ] Once a day
[ ] Continuous
[ ] Other (list and describe):

Data analysis cycle:
[ ] Once a year
[ ] Once a season
[x] Once a quarter
[ ] Once a month
[ ] Continuous
[ ] Other (list and describe):
F. Activity 6c. Data analysis plan and other pertinent methodological features. Complete only if needed.
Estimated percentage degree of administrative data completeness: __90__ percent.
The explanation of how the estimated degree of administrative completeness was calculated should be entered here.
Example of Analysis Plan:
The study will be a non-randomized time series design. Measurements will be made in the baseline period and at one-year intervals. A chi-square analysis will be performed to assess the significance of any observed change, reported at the 95% confidence level.
Activity Seven: Improvement Strategies
HSAG Evaluation Criteria
• A completed causal/barrier analysis, with an explanation of how the intervention(s) relate to the causes/barriers identified through data analysis and quality improvement processes, should be included in the PIP documentation.
• System interventions that will have a permanent effect on the outcomes of the PIP should be documented in the text.
Activity Seven: Improvement Strategies
HSAG Evaluation Criteria (cont.)
• If repeat measures do not yield statistically significant improvements, there should be an explanation of how problem solving and data analysis were performed to identify possible causes.
• If quality improvement interventions were successful, it should be documented that the interventions were standardized and monitored.
How to perform a Causal/Barrier Analysis
Determine why an event or condition occurs.
1. What's the problem?
   – Define what the problem is and why it is a concern.
2. Determine the significance of the problem.
   – Look at data and see how the problem impacts your members and/or health plan.
3. Identify the causes/barriers.
   – Conduct analysis of chart review data, surveys, and focus groups.
   – Brainstorm at a quality improvement committee meeting.
   – Review the literature.
4. Develop/implement interventions based on the barriers identified.
Causal/Barrier Analysis Methods and Tools
• Methods:
   – Quality improvement committee
   – Develop an internal task force
• Tools:
   – Fishbone
   – Process mapping
   – Barrier/intervention table
Barrier/Intervention Table
EXAMPLE
Interventions Taken for Improvement as a Result of Analysis. List chronologically the interventions that
have had the most impact on improving the measure. Describe only the interventions and provide quantitative
details whenever possible (e.g., “hired 4 customer service reps” as opposed to “hired customer service reps”).
Do not include intervention planning activities.
Date Implemented: March 2006
Check if ongoing: X
Interventions: Member education (newsletter/article) regarding the importance of completing the Advanced Directive.
Barriers that Interventions Address: Members are not completing the Advanced Directive.

Date Implemented: April 2006
Check if ongoing: X
Interventions: Implement case manager assistance program to assist all members in completing the Advanced Directive.
Barriers that Interventions Address: Members are not completing the Advanced Directive.
Activity Eight: Data Analysis and
Interpretation of Study Results
HSAG Evaluation Criteria
The data analysis:
• Was conducted according to the data analysis plan in the study design.
• Allowed for generalization of the results to the study population if a sample was selected.
• Identified factors that threaten the internal or external validity of findings (change in demographic population, acquiring another health plan's members, change in the IS system, change in health plan).
Activity Eight: Data Analysis and
Interpretation of Study Results
HSAG Evaluation Criteria (cont.)
• Included an interpretation of findings.
• Was presented in a way that provides accurate, clear, and easily understood information.
• Identified initial measurement and remeasurement of study indicators.
• Identified statistical differences between initial measurement and remeasurement.
Activity Eight: Data Analysis and
Interpretation of Study Results
HSAG Evaluation Criteria (cont.)
• Identified factors that affect the ability to compare initial measurement with remeasurement (changes to the methodology, change in time periods, seasonality, or a change in vendors).
• Included the extent to which the study was successful.
Activity Eight: Data Analysis and
Interpretation of Study Results
Example: Baseline Interpretation
The baseline data collection yielded a rate of 14.1 percent of members completing an Advanced Directive during the baseline timeframe of 1/1/05-12/31/05.
Activity Eight: Data Analysis and
Interpretation of Study Results
Example: Remeasurement 1
The rate of members completing an Advanced Directive increased from 14.1 percent at baseline to 21.7 percent in the first remeasurement (1/1/06 – 12/31/06). This represents a statistically significant increase (p = 0.00167).
Activity Eight: Data Analysis and
Interpretation of Study Results
Example: Remeasurement 2
The rate increased from 21.7 percent in the first remeasurement to 27.6 percent in the second remeasurement (1/1/07 – 12/31/07). This increase was not statistically significant (p = 0.0699).
Activity Eight: Data Analysis and
Interpretation of Study Results
Overall Analysis:
The rate of members completing an Advanced Directive increased each year, with a statistically significant increase in the first remeasurement and no decline in performance.
The study has been successful in increasing the rate of members completing an Advanced Directive and will be continued until the goal is met.
Activity Nine: Assessing for Real
Improvement
HSAG Evaluation Criteria
• The use of the same methodology for baseline and remeasurement should be documented.
• If there was a change in methodology, the issue should be discussed in the PIP text, with justification for the needed changes.
• Documentation should include how the intervention(s) were successful in affecting system-wide processes or health care outcomes.
Activity Nine: Assessing for Real
Improvement
HSAG Evaluation Criteria (cont.)
• The improvement in performance as a result of the intervention(s) should be documented in the text of the PIP.
• The PIP documentation should include calculations and reports on the degree to which the improvement associated with the intervention(s) was statistically significant.
• The table in Activity IX should be completely filled out for each measurement period. The actual p values should be documented, along with whether or not each value was statistically significant.
Activity Nine: Assessing For Real
Improvement
Example: Completed Table
Quantifiable Measure No. 1: Advanced Directive

Project Indicator Measurement | Time Period Measurement Covers | Numerator | Denominator | Rate or Results | Industry Benchmark | Statistical Test and Significance* (test statistic and p value)
Baseline | 1/1/05 – 12/31/05 | 62 | 440 | 14.1% | 38.7% | N/A
Remeasurement 1 | 1/1/06 – 12/31/06 | 136 | 627 | 21.7% | 40.3% | Chi-square = 9.88, p = 0.00167: statistically significant increase
Remeasurement 2 | 1/1/07 – 12/31/07 | 227 | 822 | 27.6% | 39.2% | Chi-square = 3.28, p = 0.0699: not a statistically significant increase
Remeasurement 3 | | | | | |
Remeasurement 4 | | | | | |
Remeasurement 5 | | | | | |
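A sketch of the chi-square comparison described in the analysis plan, assuming Python with scipy available. Using the baseline and Remeasurement 1 counts from the table above, and no Yates continuity correction, it reproduces the reported chi-square of 9.88 and p = 0.00167:

```python
from scipy.stats import chi2_contingency

# Baseline vs. Remeasurement 1 counts from the Activity IX table:
# members who completed an Advance Directive vs. those who did not.
baseline       = [62, 440 - 62]     # 14.1% of 440
remeasurement1 = [136, 627 - 136]   # 21.7% of 627

chi2, p, dof, expected = chi2_contingency([baseline, remeasurement1], correction=False)
print(f"chi-square = {chi2:.2f}, p = {p:.5f}")  # approximately 9.88 and 0.00167

# A p value below 0.05 supports reporting the change as statistically significant
# at the 95% confidence level, as shown in the table.
```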
Activity Ten: Assessing for Sustained
Improvement
HSAG Evaluation Criteria
* This activity is not assessed until baseline and a minimum of two annual measurements have been completed.
• Demonstrated improvement in all the study indicators should be explained in the text of the PIP.
• If there was a decline in results, the PIP text should have an explanation of this decline and what follow-up activities are planned.
New PIP submissions
• New PIPs are those not submitted for the 2006-2007 validation cycle.
• For new PIP submissions, it is important to contact HSAG to obtain the most current PIP Summary Form.
How to submit continuing PIPs
• On-going PIPs are those submitted to HSAG for the 2006-2007 validation cycle.
• Highlight, bold, or add text in a different color, and date any new information that is added to the existing PIP Summary Form.
• Strike through and date any information that no longer applies to the PIP study.
• Ensure all Partially Met and Not Met evaluation elements from the previous validation cycle have been addressed in the documentation.
Resources
• Frequently asked questions (FAQs) and PIP information – myfloridaeqro.com
• NCQA Quality Profiles – http://www.qualityprofiles.org/index.asp
• Institute for Healthcare Improvement – www.ihi.org
• Center for Healthcare Strategies – www.chcs.org
• Health Care Quality Improvement Studies in Managed Care Settings – A Guide for State Medicaid Agencies – www.ncqa.org/publications
• National Guideline Clearinghouse – www.guidelines.gov
• Agency for Healthcare Research and Quality – www.ahrq.gov
Deliverables
September 7th: NHDPs notified electronically of submission date with instructions
October 5th: Submit PIP studies to HSAG
* HSAG will be validating two PIPs per NHDP: one clinical and one nonclinical. If the collaborative PIP is clinical, the other PIP chosen for validation will be nonclinical.
PIP Tips
1. Complete the demographic page before submission.
2. Notify HSAG when the PIP documents are uploaded to the secure FTP site and state the number of documents uploaded.
3. Label ALL attachments and reference them in the body of the PIP study.
4. HSAG does not require personal health information to be submitted. Submit only aggregate results.
5. Document, document, and document!
6. Go to myfloridaeqro.com for FAQs, or contact Cheryl Neel at [email protected] with any questions.
HSAG Contacts
For questions contact:
• Cheryl Neel – [email protected] – 602.745.6201
• Denise Driscoll – [email protected] – 602.745.6260
Questions and Answers