Overview of 2004-2005 External Quality Review (EQR) Activities

Transcript: Overview of 2004-2005 External Quality Review (EQR) Activities

Performance Improvement Projects
Technical Assistance
Nursing Home Diversion Programs
Thursday, March 29, 2007
8:30 a.m. – 10:30 a.m.
Cheryl L. Neel, RN, MPH, CPHQ
Manager, Performance Improvement Projects
David Mabb, MS
Sr. Director, Statistical Evaluation
Presentation Outline
• PIP Overall Comments
• Aggregate MCO PIP Findings
• Aggregate NHDP Specific Findings
• Technical Assistance with Group Activities
  – Study Design
  – Study Implementation
  – Quality Outcomes Achieved
• Questions and Answers
Key PIP Strategies
1. Conduct outcome-oriented projects
2. Achieve demonstrable improvement
3. Sustain improvement
4. Correct systemic problems
Validity and Reliability of PIP Results
• Activity 3 of the CMS validating protocol: evaluating the overall validity and reliability of PIP results:
  – Met = Confidence/high confidence in reported PIP results
  – Partially Met = Low confidence in reported PIP results
  – Not Met = Reported PIP results not credible
Summary of PIP Validation Scores
Percentage Score of
Evaluation Elements Met    HMO   NHDP   PMHP   Total
90% to 100%                  2      0      0       2
80% to 89%                  13      0      6      19
70% to 79%                  13      1      0      14
60% to 69%                   8      3      1      12
Less than 60%               17     16      1      34
Total                       53     20      8      81
Proportion of PIPs Meeting the Requirements
for Each Activity
[Bar chart: aggregate valid percent Met for each of evaluation elements 1-53, spanning Activities I through X.]
NHDP Specific Findings
• 20 PIPs submitted
• Scores ranged from 17% to 75%
• Average score was 40%
• Assessed evaluation elements were scored as Met 40% of the time
Summary of NHDP Validation Score
Percentage Score of
Evaluation Elements Met    NHDP
90% to 100%                   0
80% to 89%                    0
70% to 79%                    1
60% to 69%                    3
Less than 60%                16
Total                        20
Study Design
Four Components:
1. Activity I. Selecting an Appropriate Study
Topic
2. Activity II. Presenting Clearly Defined,
Answerable Study Question(s)
3. Activity III. Documenting Clearly Defined
Study Indicator(s)
4. Activity IV. Stating a Correctly Identified
Study Population
Activity I. Selecting an Appropriate Study
Topic - NHDP Overall Score
[Bar chart: percent Met for each of evaluation elements 1-6; values shown: 95%, 95%, 85%, 70%, 55%, 25%.]
Activity I. Selecting an Appropriate
Study Topic
Results:
• 71 percent of the six evaluation elements
were Met
• 29 percent were Partially Met or Not Met
• None of the evaluation elements were Not
Applicable or Not Assessed
Activity I: Review the Selected Study Topic
HSAG Evaluation Elements:
• Reflects high-volume or high-risk conditions (or was
selected by the State).
• Is selected following collection and analysis of data (or was
selected by the State).
• Addresses a broad spectrum of care and services (or was
selected by the State).
• Includes all eligible populations that meet the study criteria.
• Does not exclude members with special health care needs.
• Has the potential to affect member health, functional status,
or satisfaction.
Bolded evaluation elements show areas for improvement
Activity II. Presenting Clearly Defined,
Answerable Study Question(s) - NHDP Overall
Score
[Bar chart: percent Met for evaluation elements 1 and 2; values shown: 30%, 25%.]
Activity II. Presenting Clearly Defined,
Answerable Study Question(s)
Results:
• 28 percent of the two evaluation elements
were Met
• 73 percent were Partially Met or Not Met
• None of the evaluation elements were Not
Applicable or Not Assessed
Activity II: Review the Study Question(s)
HSAG Evaluation Elements:
• States the problem to be studied in simple
terms.
• Is answerable.
Bolded evaluation elements show areas for improvement
Activity III. Documenting Clearly Defined
Study Indicator(s) - NHDP Overall Score
[Bar chart: percent Met for each of evaluation elements 1-7; values shown: 60%, 35%, 30%, 25%, 25%, 23%, 0%.]
Activity III. Documenting Clearly Defined
Study Indicator(s)
Results:
• 27 percent of the seven evaluation elements
were Met
• 54 percent were Partially Met or Not Met
• 19 percent were Not Applicable or Not
Assessed
Activity III: Review Selected Study
Indicator(s)
HSAG Evaluation Elements:
• Is well defined, objective, and measurable.
• Is based on practice guidelines, with sources identified.
• Allows for the study question to be answered.
• Measures changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives.
• Has available data that can be collected on each indicator.
• Is a nationally recognized measure such as HEDIS®, when appropriate.
• Includes the basis on which each indicator was adopted, if internally developed.
Bolded evaluation elements show areas for improvement
Activity IV. Stating a Correctly Identified Study
Population - NHDP Overall Score
[Bar chart: percent Met for each of evaluation elements 1-3; values shown: 60%, 55%, 30%.]
Activity IV. Stating a Correctly Identified
Study Population
Results:
• 48 percent of the three evaluation elements
were Met
• 52 percent were Partially Met or Not Met
• None of the evaluation elements were Not
Applicable or Not Assessed
Activity IV: Review the Identified Study
Population
HSAG Evaluation Elements:
• Is accurately and completely defined.
• Includes requirements for the length of a
member’s enrollment in the managed care
plan.
• Captures all members to whom the study
question applies.
Bolded evaluation elements show areas for improvement
Group Activity
Study Implementation
Three Components:
1. Activity V. Valid Sampling Techniques
2. Activity VI. Accurate/Complete Data
Collection
3. Activity VII. Appropriate Improvement
Strategies
Activity V. Presenting a Valid Sampling
Technique - NHDP Overall Score
[Bar chart: percent Met for each of evaluation elements 1-6; values shown: 100%, 40%, 40%, 40%, 0%, 0%.]
Activity V. Presenting a Valid Sampling
Technique
Results:
• 5 out of the 20 PIP studies used sampling.
• 9 percent of the six evaluation elements were
Met.
• 16 percent were Partially Met or Not Met.
• 75 percent of the evaluation elements were
Not Applicable or Not Assessed.
Activity V: Review Sampling Methods
* This section is only validated if sampling is used.
HSAG Evaluation Elements:
• Consider and specify the true or estimated frequency of
occurrence. (N=5)
• Identify the sample size. (N=5)
• Specify the confidence level to be used. (N=5)
• Specify the acceptable margin of error. (N=5)
• Ensure a representative sample of the eligible population.
(N=5)
• Ensure that the sampling techniques are in accordance
with generally accepted principles of research design and
statistical analysis. (N=5)
Bolded evaluation elements show areas for improvement
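The sampling elements above (confidence level, margin of error, representative sample) fit together in the standard sample-size formula for estimating a proportion. The sketch below is illustrative only and is not part of the HSAG protocol; the conservative proportion of 0.5 and the eligible-population figure of 411 are assumptions chosen for the example.

```python
import math

def required_sample_size(p, z, margin_of_error, population=None):
    """Minimum sample size to estimate a proportion p within the given
    margin of error at the confidence level implied by z (1.96 = 95%)."""
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    if population is not None:
        # Finite population correction: a small eligible population
        # needs a smaller sample for the same precision.
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# Conservative estimate (p = 0.5), 95% confidence, +/-5% margin of error
print(required_sample_size(0.5, 1.96, 0.05))                  # 385
# Same precision drawn from an eligible population of 411 members
print(required_sample_size(0.5, 1.96, 0.05, population=411))  # 199
```

When the eligible population is small, as in many NHDP studies, the finite population correction shows why reviewing the full population is often more practical than sampling.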
Populations or Samples?
Generally,
– Administrative data uses populations
– Hybrid (chart abstraction) method uses
samples identified through
administrative data
Activity VI. Specifying Accurate/Complete
Data Collection - NHDP Overall Score
[Bar chart: percent Met for each of evaluation elements 1-11; values shown: 85%, 55%, 45%, 30%, 21%, 21%, 8%, 5%, 5%, 0%, 0%.]
Activity VI. Specifying Accurate/Complete
Data Collection
Results:
• 25 percent of the eleven evaluation elements
were Met
• 66 percent were Partially Met or Not Met
• 10 percent of the evaluation elements were
Not Applicable or Not Assessed
Activity VI: Review Data Collection
Procedures
HSAG Evaluation Elements:
• Clearly defined data elements to be collected.
• Clearly identified sources of data.
• A clearly defined and systematic process for
collecting data that includes how baseline and
remeasurement data will be collected.
• A timeline for the collection of baseline and
remeasurement data.
• Qualified staff and personnel to collect manual data.
• A manual data collection tool that ensures consistent
and accurate collection of data according to indicator
specifications.
Bolded evaluation elements show areas for improvement
Activity VI: Review Data Collection
Procedures (cont.)
HSAG Evaluation Elements:
• A manual data collection tool that supports interrater
reliability.
• Clear and concise written instructions for completing
the manual data collection tool.
• An overview of the study in the written instructions.
• Administrative data collection algorithms that show
steps in the production of indicators.
• An estimated degree of automated data completeness
(important if using the administrative method).
Where do we look for our sources of data?
Baseline Data Sources
• Medical Records
• Administrative claims/encounter data
• Hybrid
• HEDIS
• Survey Data
• MCO program data
• Other
Activity VII. Documenting the Appropriate
Improvement Strategies - NHDP Overall Score
[Bar chart: percent Met for each of evaluation elements 1-4; values shown: 100%, 100%, 67%, 17%.]
Activity VII. Documenting the Appropriate
Improvement Strategies
Results:
• 15 percent of the four evaluation elements
were Met
• 18 percent were Partially Met or Not Met
• 68 percent of the evaluation elements were
Not Applicable or Not Assessed
Activity VII: Assess Improvement Strategies
HSAG Evaluation Elements:
• Related to causes/barriers identified through data
analysis and Quality Improvement (QI) processes.
• System changes that are likely to induce
permanent change.
• Revised if original interventions are not successful.
• Standardized and monitored if interventions are
successful.
Bolded evaluation elements show areas for improvement
Determining Interventions
Once you know how you are doing at
baseline, what interventions will produce
meaningful improvement in the target
population?
First Do A Barrier Analysis
What did an analysis of baseline results show?
How can we relate it to system
improvement?
• Opportunities for improvement
• Determine intervention(s)
• Identify barriers to reaching improvement
How were the intervention(s) chosen?
• By reviewing the literature
– Evidence-based
– Pros & cons
– Benefits & costs
• Develop a list of potential interventions: which is most effective?
Types of Interventions
• Education
• Provider performance feedback
• Reminders & tracking systems
• Organizational changes
• Community-level interventions
• Mass media
Choosing Interventions
• Balance
– potential for success with ease of use
– acceptability to providers & collaborators
– cost considerations (direct and indirect)
• Feasibility
– adequate resources
– adequate staff and training to ensure a
sustainable effort
Physician Interventions: Multifaceted
Most Effective
• Most effective:
  – real-time reminders
  – outreach/detailing
  – opinion leaders
  – provider profiles
• Less effective:
  – educational materials (alone)
  – formal CME programs without enabling or reinforcing strategies
Patient Interventions
• Educational programs
– Disease-specific education booklets
– Lists of questions to ask your physician
– Organizing materials: flowsheets, charts,
reminder cards
– Screening instruments to detect
complications
– Direct mailing, media ads, websites
Evaluating Interventions
• Does it target a specific quality indicator?
• Is it aimed at appropriate stakeholders?
• Is it directed at a specific process/outcome of
care or service?
• Did the intervention begin after the baseline measurement period?
Interventions Checklist
✓ Analyze barriers (root causes)
✓ Choose & understand target audience
✓ Select interventions based on cost-benefit
✓ Track intermediate results
✓ Evaluate effectiveness
✓ Modify interventions as needed
✓ Re-measure
Group Activity
Quality Outcomes Achieved
Three Components:
1. Activity VIII. Presentation of Sufficient Data
Analysis and Interpretation
2. Activity IX. Evidence of Real Improvement
Achieved
3. Activity X. Data Supporting Sustained
Improvement Achieved
Activity VIII. Presentation of Sufficient Data
Analysis and Interpretation - NHDP Overall
Score
[Bar chart: percent Met for each of evaluation elements 1-9; values shown: 100%, 56%, 56%, 50%, 33%, 22%, 0%, 0%, 0%.]
Activity VIII. Presentation of Sufficient Data
Analysis and Interpretation
Results:
• 10 percent of the nine evaluation elements
were Met
• 18 percent of the evaluation elements were Partially Met or Not Met
• 72 percent of the evaluation elements were
Not Applicable or Not Assessed
Activity VIII: Review Data Analysis and
Interpretation of Study Results
HSAG Evaluation Elements:
• Is conducted according to the data analysis
plan in the study design.
• Allows for generalization of the results to the
study population if a sample was selected.
• Identifies factors that threaten internal or
external validity of findings.
• Includes an interpretation of findings.
• Is presented in a way that provides accurate,
clear, and easily understood information.
Activity VIII: Review Data Analysis and
Interpretation of Study Results (cont.)
HSAG Evaluation Elements:
• Identifies initial measurement and remeasurement of
study indicators.
• Identifies statistical differences between initial
measurement and remeasurement.
• Identifies factors that affect the ability to compare
initial measurement with remeasurement.
• Includes the extent to which the study was
successful.
Bolded evaluation elements show areas for improvement
Changes in Study Design?
Study design should be the same as at baseline:
✓ Data source
✓ Data collection methods
✓ Data analysis
✓ Target population or sample size
✓ Sampling methodology
If there is a change, the rationale must be specified and appropriate.
Activity IX. Evidence of Real Improvement - NHDP Overall Score
[Bar chart: percent Met for each of evaluation elements 1-4; values shown: 100%, 50%, 50%, 0%.]
Activity IX. Evidence of Real Improvement
Results:
• 5 percent of the four evaluation elements
were Met
• 5 percent were Partially Met or Not Met
• 90 percent of the evaluation elements were
Not Applicable or Not Assessed
Activity IX: Assess the Likelihood that
Reported Improvement is “Real” Improvement
HSAG Evaluation Elements:
• The remeasurement methodology is the same as the
baseline methodology.
• There is documented improvement in processes
or outcomes of care.
• The improvement appears to be the result of
intervention(s).
• There is statistical evidence that observed
improvement is true improvement.
Bolded evaluation elements show areas for improvement
Statistical Significance Testing
Time      Measurement        Numerator  Denominator  Rate or  Industry   Statistical Testing
Periods   Periods                                    Results  Benchmark  and Significance
CY 2003   Baseline           201        411          48.9%    60%        N/A
CY 2004   Re-measurement 1   225        411          54.7%    60%        Chi-square = 2.8
                                                                         P-value = 0.09387
                                                                         NOT SIGNIFICANT AT THE
                                                                         95% CONFIDENCE LEVEL
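The chi-square result above can be reproduced from the numerators and denominators alone. The sketch below is illustrative and not part of the presentation materials; it applies Pearson's chi-square test (df = 1, no continuity correction) to the 2x2 baseline/remeasurement table.

```python
import math

def chi_square_2x2(base_num, base_den, remeas_num, remeas_den):
    """Pearson chi-square test (df = 1, no continuity correction)
    comparing a baseline rate against a remeasurement rate."""
    table = [
        [base_num, base_den - base_num],       # baseline: hits / misses
        [remeas_num, remeas_den - remeas_num], # remeasurement: hits / misses
    ]
    total = base_den + remeas_den
    row_totals = [base_den, remeas_den]
    col_totals = [table[0][0] + table[1][0], table[0][1] + table[1][1]]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    # For df = 1 the p-value equals the chi-square survival function,
    # which reduces to the complementary error function:
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p_value

chi2, p = chi_square_2x2(201, 411, 225, 411)   # figures from the table above
print(round(chi2, 1), round(p, 4))  # chi-square ~2.8, p ~0.094
```

Because p exceeds 0.05, the improvement from 48.9% to 54.7% is not statistically significant at the 95% confidence level, matching the table.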
Activity X. Data Supporting Sustained
Improvement Achieved - NHDP Overall Score
[Bar chart: evaluation element 1 scored 0% Met; no Met evaluation elements for this Activity.]
Activity X. Data Supporting Sustained
Improvement Achieved
Results:
• 0 percent of the one evaluation element was
Met
• 10 percent was Partially Met or Not Met
• 90 percent of the evaluation element was Not
Applicable or Not Assessed
Activity X: Assess Whether Improvement is
Sustained
HSAG Evaluation Elements:
• Repeated measurements over comparable
time periods demonstrate sustained
improvement, or that a decline in
improvement is not statistically significant.
Quality Outcomes Achieved
[Timeline: Baseline → 1st Year (Demonstrable Improvement) → Sustained Improvement]
Sustained Improvement
• Modifications in interventions
• Changes in study design
• Improvement sustained for 1 year
HSAG Contact Information
Cheryl Neel, RN, MPH, CPHQ
Manager, Performance Improvement Projects
[email protected]
602.745.6201
Denise Driscoll
Administrative Assistant
[email protected]
602.745.6260
Questions and Answers