EMS Quality Improvement - National, State, & Local Indicators – John New

EMS Quality Improvement: National, State, & Local Indicators
John New, Director of MIEMSS Quality Management
Prepared For: QA Officer Training
Day 2, Session 3 – 10:30 to 12:00

Today’s Objectives
• Creating an indicator-friendly environment
• Understanding an indicator’s purpose(s)
• Reviewing national, state, and local indicators
Copyright 2007 All Rights Reserved
Something To Consider
“People seldom improve when they have no other model but themselves to copy after.”
– Oliver Goldsmith, Irish author, 1758
The “Indicator Threat”
Evaluate a “good” neighbor by establishing indicators:
• Returns trash cans within 1 hour after pickup
• Height of grass never exceeds 4 inches
• Recognition of your family’s birthdays
Then comes annual report time.
Indicator Focus Areas
An indicator can be a sign of an organization’s:
• Strength
• Weakness
• Opportunity
• Threat (vulnerability)
Horse and Cart Order
What organizations struggle with:
• Quality drives indicators
vs.
• Indicators drive quality
Blended Results
Chrysler Building (77 stories, 405 Lexington Ave., New York, NY)
• Safety baseline of the era: 1 death per story above the 15th story – 62 deaths expected
• Actual construction mortality: 0
• Credited to adoption of and adherence to professional safety standards
Characteristics of Quality Organizations
[Diagram: the “Total Quality” model rests on the Organizational Mission, Vision, & Principles and is built from:]
• Leadership
• Customer Focus & Satisfaction
• Total Involvement
• Measurement
• Process Management
• Continual Improvement
… yielding EMS System Results.
Quality Culture
A quality culture shifts the organization:
• From authoritarian to participation
• From status quo to continuous improvement
• From a few statistical experts to all trained in basic tools
• From focus on the job to focus on customers
• From “beat on suppliers” to partnership with suppliers
Dr. Bass’s 5 C’s
• Consensus
• Coordination
• Cooperation
• Confidentiality
• Communication
Developing Quality Indicators
To clearly determine whether we are:
• doing a quality job,
• improving our performance, or
• satisfying our customers,
we must develop and use measurable & meaningful quality indicators.
Developing Meaningful Indicators
Indicators are used for several reasons:
• They help determine the performance baseline, by establishing “What is the current performance?”
• They help determine the relationship to standards, by determining “What is needed?”
Developing Meaningful Indicators (continued)
• They help in setting goals & objectives, by determining “What is wanted & possible?”
• They alert us to problems, by providing clear information
Indicators and Our Daily Work
Our work can be divided into three very large, general areas:
• Things that need to be fixed,
• Things that need to be maintained (kept fixed), &
• Things that need to be improved.
[Diagram: the “OK level” model. (1) Performance below the OK level (the standard) calls for problem solving – “Fix It.” (2) Holding performance at the standard calls for standard setting – “Prevent It.” (3) Raising performance above the standard calls for goal/objective development – “Improve It.” Indicators mark the baseline (“what it is”), the standard (“what is needed”), and the goal (“what is wanted”).]
Quality Improvement Framework
TOOLS: A recognized tool to facilitate the QI process is the FOCUS-PDCA cycle:
F – Find a process to improve.
O – Organize an effort to work on improvement.
C – Clarify current knowledge of the process.
U – Understand process variation and capability.
S – Select a strategy for further improvement.
Quality Improvement Framework
P – Plan a change or test aimed at improvement.
D – Do: carry out the change or the test.
C – Check the results – what was learned, what went wrong.
A – Act: adopt the change, abandon it, or run through the cycle again.
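The PDCA steps above can be sketched as a minimal control loop. This is an illustrative sketch, not part of the presentation; the `pdca` function name and the toy turnout-time numbers are hypothetical.

```python
# Minimal PDCA sketch: each pass applies a planned change to a toy metric,
# checks the result against a goal, and acts (adopt, or cycle again).

def pdca(metric, goal, change, max_cycles=5):
    """Run Plan-Do-Check-Act until the goal is met or cycles run out."""
    for cycle in range(1, max_cycles + 1):
        trial = change(metric)    # Plan & Do: carry out the change or test
        if trial <= goal:         # Check: compare the result to the goal
            return trial, cycle   # Act: adopt the change
        metric = trial            # Act: run through the cycle again
    return metric, max_cycles

# Hypothetical example: shave 10 seconds of turnout time per cycle, goal 60 s.
final, cycles = pdca(metric=95.0, goal=60.0, change=lambda t: t - 10.0)
# → goal reached at 55.0 seconds on the 4th cycle
```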
QI Indicator Considerations
Selection of jurisdictional indicators:
• High volume
• High risk, low volume
• Benchmark standards
• Provider interest
Characteristics of Good Performance
When you think of good performance, what characteristics or traits come to mind?
Good Performance
1. Timely
2. Accurate
3. Productive
4. Good Service
5. Cost Effective
6. Customers Satisfied
The “HOW” Technique
Indicators are essentially developed by taking the “how” question and applying it to one or more of the characteristics of good performance.
Using the “how” technique, we can ask:
• How timely?
• How accurate?
• How many?
• How costly?
• How courteous/friendly?
• How thorough?
• How satisfied is the customer?
National - NFPA 1710
Indicator 2.1: Call Processing
Measure: Total time from call intake by the unit-dispatching agency to response unit notification.
Goal: 95% of EMS calls processed in less than 90 seconds.

Indicator 2.2: Turnout Time
Measure: Total time from response unit notification to wheels rolling toward the incident.
Goal: 90% of EMS calls turned out in less than 60 seconds.

Indicator 2.3: Travel Time
Measure: Time elapsed from vehicle wheels turning to arrival at the response address/incident location.
Goal:
a) First responder with minimum of BLS capability – 90% in 4 minutes.
b) Transport-capable vehicle – 90% in 8 minutes.
c) ALS capability – 90% in 8 minutes.
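Note that NFPA 1710 states its time goals as fractiles (“90% of calls in X seconds”), not averages. A minimal sketch of checking such a goal, with made-up sample times and a hypothetical `fractile_compliance` helper:

```python
# Check a set of turnout times against the 60-second / 90% goal
# from Indicator 2.2. The sample data are hypothetical.

def fractile_compliance(times_s, threshold_s):
    """Fraction of calls at or under the threshold."""
    return sum(t <= threshold_s for t in times_s) / len(times_s)

turnout_times = [45, 52, 58, 61, 40, 55, 75, 59, 50, 66]  # seconds, made up
rate = fractile_compliance(turnout_times, 60)   # → 0.7 (7 of 10 calls)
meets_goal = rate >= 0.90                       # → False
```

A fractile goal is stricter than an average-based one: a handful of very short calls cannot mask long outliers, which is why the standard is framed this way.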
National - NFPA 1710
Indicator 2.4: Staffing
Measure: Staffing pattern for all ALS-level responses.
Goal: Compliance with state regulations for staffing ALS transport units, and with NFPA 1710 standards for staffing ALS response units.

Indicator 2.5: Deployment
Measure: Percentage of calls in which units are not available to respond immediately, causing a deviation from standard deployment procedures.
Goal: 0% of calls without resources immediately available.

Indicator 2.6: Road Structure Coverage Capability
Measure: Determination of whether the department has optimized the location of fixed assets from which mobile assets are deployed.
Goal: 90% jurisdiction coverage within the travel times designated in measure 2.3.
National - NFPA 1710
Indicator 2.7: Patient Care Protocol Compliance
Measure: Compliance with established patient care protocols.
Goal: 90% patient care protocol compliance.

Indicator 2.8: Patient Outcome
Measure: Patient’s status following the EMS encounter relative to patient status upon initial contact by EMS personnel.
Goal: 80% positive (improved or no change).

Indicator 2.9: Defibrillation Availability
Measure: Percentage of first shocks delivered within 5 minutes of collapse.
Goal: 50% of first shocks delivered in 5 minutes, 0 seconds or less.
National - NFPA 1710
Indicator 2.10: Extrication Capability
Measure: Percentage of calls requiring an extrication tool that have one delivered to the scene within 8 minutes of call dispatch.
Goal: Delivery of an extrication tool to the scene of 90% of calls requiring the device within 8 minutes, 0 seconds or less.

Indicator 2.11: Employee Illness and Injury
Measure: Percentage of employees acquiring an illness or injury as a result of participating in an EMS call.
Goal: 0% of employees becoming ill or injured as a result of participating in an EMS call.

Indicator 2.12: Employee Turnover
Measure: Percentage turnover of EMS-trained employees per year.
Goal: Less than 5% employee turnover (per time period measured, e.g., per year).
National - NFPA 1710
Indicator 2.13: Quality Program
Measure: All fire departments with ALS services shall have a medical director responsible for overseeing and ensuring quality medical care in accordance with state or provincial laws or regulations.
Goal: System evaluation program in place with a focus on quality.

Indicator 2.14: System User Opinion
Measure: Mail/phone survey to assess the satisfaction of system users with the system’s performance.
Goal: 90% or greater satisfaction.

Indicator 2.15: Multi-Casualty Event Response Plan
Measure: An established plan to mitigate a multiple-casualty disaster while maintaining sufficient resources to respond to the normal volume of emergency calls within the jurisdiction.
Goal: Plan in place and practiced at least biannually.
2005 National Consensus Meeting on EMS Clinical Performance Indicators
• http://emsoutcomes.ncemsi.org/
• http://www.nasemsd.org/Projects/PerformanceMeasures/
Objectives:
• Simple – consider as an “EMS Starter Kit”
• For everyone – target the least common denominator
• Ease – use the NEMSIS set
• Useful – to local, regional, state, and national levels
2005 National Consensus Meeting on EMS Clinical Performance Indicators
• Time
  – Symptom onset to 911 access
• Respiratory
  – % of patients requiring support who got it
  – Time taken to provide support
• Accuracy
  – PCR
• Needed ALS & got it
• BLS time to defibrillation
2006 EMS Performance Measures Project Steering Committee
• Standardized format
• 18 question areas
• 35 indicators or attributes
• 7 performance categories:
  – System Design and Structure
  – Human Resources (culture, training, safety, credentialing)
  – Clinical Care and Outcome
  – Response
  – Finance/Funding
  – Quality Management
  – Community Demographics
2006 EMS Performance Measures Project Steering Committee
• Emergency Medical Dispatch
• Emergency Medical Dispatch Impact on Response
• Annual Turnover Rate
• Defibrillation Time Mean, 90th Percentile
• Initial Rhythm Analysis Time Mean, 90th Percentile
• Major Trauma Triage to Trauma Center
• Pain Rates - Relief, Worsened, Unchanged
• Pain Intervention Rate
• 12-Lead Performance Rate
• Aspirin Administration for Chest Pain/Discomfort
• Cardiac Ischemia Triage to Specialty Center
• Emergency Patient Response Interval Mean, 90th Percentile
• Emergency Scene Interval Mean, 90th Percentile
• Emergency Transport Interval Mean, 90th Percentile
2006 EMS Performance Measures Project Steering Committee
• Per Capita Agency Operating Expense
• Patient Care Satisfaction Rate
• Appropriate Oxygen Use Rate
• Undetected Esophageal Intubation Rate
• Delay-Causing Crash Rate per 1,000 EMS Responses
• EMS Crash Rate per 100,000 Fleet Miles
• Crash Injury Rate per 100,000 Fleet Miles
• Crash Death Rate per 100,000 Fleet Miles
• Call Complaint Distribution
• Call Complaint Rate
• EMS Cardiac Arrest Survival Rate to ED Discharge
• EMS Cardiac Arrest Survival Rate to Hospital Discharge
Six Sigma EMS
• Data-driven management – very statistical
• Motorola standard – 3.4 defective parts per million
• High accountability
• DMAIC:
  – Define
  – Measure
  – Analyze
  – Improve
  – Control
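Six Sigma’s 3.4-per-million figure is defects per million opportunities (DPMO) at the 6-sigma level, including the conventional 1.5-sigma shift. A sketch of applying the same arithmetic to an EMS indicator; the deviation counts are hypothetical:

```python
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit=1):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    """Short-term sigma level, with the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + 1.5

# Hypothetical: 12 protocol deviations in 8,000 patient contacts.
rate = dpmo(defects=12, units=8_000)   # → 1500.0 DPMO
level = sigma_level(rate)              # roughly 4.5 sigma
```

Framing an indicator as DPMO makes very different processes (dispatch errors, chart omissions, missed medications) comparable on one scale.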
[Control chart: Average Turnaround Days for EMT Card Processing, June 1994 to October 1998 (n=53). Pink data represent EMT-A; blue data represent EMT-B. Two marked improvements step the process down and tighten its variation: from s²=11.66 (CL=5.55, UCL=12.38, LCL=-1.28), to s²=2.15 (CL=3.59, UCL=6.52, LCL=0.65), to s²=0.66 (CL=2.62, UCL=4.24, LCL=1.00).]
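The turnaround-days chart above is an individuals (X) control chart. One common way to set its limits (assumed here; the slides do not show the calculation) is from the average moving range, using the d2 constant 1.128 for subgroups of size 2. The sample data below are made up:

```python
# Center line and +/- 3-sigma limits for an individuals control chart,
# estimated from the average moving range (d2 = 1.128 for n = 2).

def control_limits(values):
    """Return (LCL, CL, UCL) for an individuals chart."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma = mr_bar / 1.128          # d2 constant for subgroup size 2
    return mean - 3 * sigma, mean, mean + 3 * sigma

days = [5.0, 6.2, 5.5, 4.8, 6.0, 5.3, 5.7, 5.1]  # hypothetical turnaround days
lcl, cl, ucl = control_limits(days)
```

Points outside the limits signal special-cause variation worth investigating; after a genuine process change (like the two improvements on the chart), the limits are recalculated from the new data.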
Quality Improvement Initiatives
Examples of jurisdictional indicators:
• ICAM/PEIP
• First responder training
• Patient needs
• Intubation success rates
• Data completeness
• On-scene times
• Outcome linkage
Quality Improvement Initiatives
PCR information linked to hospital ED/discharge information
• Workgroup to examine:
  – Current rates of completeness
  – Best EMS/hospital practices
  – Impact of electronic PCR use
  – Future of patient tracking (triage tag)
Managing For Results: An EMS Road Map for QI
Measurement of Program Performance: Continually assess program performance to improve quality and effectiveness of services.
• Inputs
• Outputs
• Efficiency
• Quality
• Outcome
Performance Model for Moving Maryland Forward
[Diagram: a customer-focused performance pyramid building from Mission, Vision, and Guiding Principles through Goals, Objectives, and Process to Customer Results, with quality (Q) and cost ($) dimensions, customers on all sides, and a Feedback + Learning loop closing the model.]
Managing For Results: An EMS Road Map for QI
• Definition: the Governor’s phased-in initiative to assure a connection between the budget for State services and the desired results from those services.
Managing For Results: An EMS Road Map for QI
Phase I - Strategic Planning: Set direction to achieve desired results over time.
• Internal/external assessment (SWOT)
• Reflects customer and stakeholder needs – EMS Plan ’94, ’00, ’02; EMS Agenda for the Future
• Reflected in Mission, Vision, Goals, Objectives, and Performance Measures
Managing For Results: An EMS Road Map for QI (continued)
Phase II - Measurement of Program Performance: Continually assess program performance to improve quality and effectiveness of services.
• Inputs and Outputs – traditional government indicators
• Efficiency – “How well did we use our resources?”
• Quality – “How well did we meet the expectations of our customers?”
• Outcome – “What results did we achieve?”
MIEMSS MFR Key Components
• 2 Goals:
  – Provide high-quality medical care to individuals receiving emergency medical services
  – Maintain a well-functioning EMS system
• 6 Objectives
• 6 Performance Measurements:
  – 2 Outcome
  – 4 Quality
Outcome #1
• Goal: Provide high-quality medical care
• Target: Trauma patient (FY 2000)
• Objective: Maintain a >95% statistical level of confidence that Maryland performs above the national norm.
• Source Data: Maryland Trauma Registry
• Tools: TRISS analysis, MTOS, Z score statistic
• Actions: Monitor outcome quarterly (Trauma QIC)
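The TRISS/MTOS “Z score” compares observed survivors (O) with the number expected (E) from each patient’s TRISS probability of survival Ps. One commonly cited formulation is Z = (O - E) / sqrt(sum of Ps(1 - Ps)); the Ps values below are hypothetical, not registry data:

```python
# Z statistic for observed vs. TRISS-expected survival.
from math import sqrt

def survival_z(ps_values, observed_survivors):
    """Z > ~1.96 suggests survival significantly above the reference norm."""
    expected = sum(ps_values)                       # E: expected survivors
    variance = sum(p * (1 - p) for p in ps_values)  # sum of Ps * (1 - Ps)
    return (observed_survivors - expected) / sqrt(variance)

ps = [0.99, 0.95, 0.90, 0.60, 0.35]       # hypothetical TRISS Ps per patient
z = survival_z(ps, observed_survivors=5)  # all five survived → z ≈ 1.54
```

Because the statistic is signed, the same calculation supports the objective as stated: a sustained positive Z at the >95% confidence level indicates Maryland performing above the national norm.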
Outcome #2
• Goal: Provide high-quality medical care
• Target: Critically injured patient (FY 2007)
• Objective: Reduce the overall inpatient complication rate by 10% or greater in Maryland trauma centers.
• Source Data: Maryland Trauma Registry
• Tools: MTR Complication Report (ICD-9-CM / ACS)
• Actions: Utilize new “Outcomes” software to identify cases and present/follow at monthly M&M meetings and Trauma QIC
Quality #1
• Goal: Maintain a well-functioning EMS system
• Target: Patients receiving EMS services (FY 2004)
• Objective: Establish a baseline of 100% of jurisdictions achieving >99% protocol compliance.
• Source Data: Jurisdictional MRC reports
• Tools: Standardized reporting means (form/database)
• Actions: Implement reporting process
Quality #2
• Goal: Maintain a well-functioning EMS system
• Target: EMS providers (FY 2000)
• Objective: Maintain a 90% successful completion rate statewide for EMS radio communications with base stations.
• Source Data: MAIS
• Tools: EMS Communications Master Plan
• Actions: Proceed with communication upgrades; continue to monitor
Quality #3
• Goal: Maintain a well-functioning EMS system
• Target: Trauma patient population (FY 2001)
• Objective: Maintain an 85% triage rate of seriously injured patients transported to a designated trauma center.
• Source Data: MAIS
• Tools: American College of Surgeons Committee on Trauma criteria
• Actions: Continue to monitor
Quality #4
• Goal: Maintain a well-functioning EMS system
• Target: Prehospital patient population (FY 2007)
• Objective: Have two-thirds of the jurisdictions utilizing EMAIS.
• Source Data: EMAIS
• Tools: EMAIS software, educational curriculum, report functions
• Actions: Educate the EMS community on its virtues; continue enhancements
Where Are We Today?
MFR FY ’06 System Report Card

Performance Measures/Performance Indicators            2006    2007    2008      2009
                                                       Actual  Actual  Estimate  Estimate
Outcome:
  Maintaining >95% statistical level of confidence     Yes     Yes     Yes       Yes
  Statewide trauma center complication rate < 10%      17.8    18.3    *         *
Quality:
  100% jurisdictions with >99% protocol compliance     100     100     100       100
  % EMS radio communications successfully completed    98      98      98        98
  % seriously injured patients transported to
    designated centers                                 90      90      90        90
  % jurisdictions utilizing EMAIS                      50.0    60.0    66.7      80.0

* Questionable measurement; forwarded to the Trauma QIC for review.
Checking The National QI Pulse
National EMS Managers Association
• NEMSMA.org [email protected]

CY 2004 topics:
• Capnography
• Staffing
• ALS Billing
• Definition Standards
• Who’s in Charge
• EMT-P Coverage
CY 2005 – CY 2006 topics:
• ALS Effectiveness
• Attendance
• HIPAA
• Control Sub. Sec.
• E vs P Report
• EMT Dumb Down
• Benchmarking
• Auto Pulse
• CPAP
• Psyche Transfers
• “Various requests”
• Nasal Intubation
Conclusion
Department of Quality Management
• Questions
• Comments
• Thank you!