Measuring Quality

Information Standards Implementation
Dr Mark Davies, Health and Social Care Information Centre
High performing clinical teams:
• Have good clinical leadership
• Align clinical and management goals
• Use measurement to drive improvement
• Value peer group comparisons
• Value professional development
• Share information with patients
Measuring quality – where are we now?
• Some excellent work on measuring quality in some places and sectors
• Much less in others – important areas have little effective measurement of quality (e.g. elderly patients with multiple conditions)
• Outcomes and safety data are patchy
• Little data available on quality by clinical pathways
• Disparate data from multiple sources, displayed in different locations
• Questions about the quality of some of the data themselves
• Little ability to compare English NHS quality data with other countries or the private sector
Measuring Quality: where do we want to be?
• An intrinsic part of normal care
• Universal in the NHS, covering complete clinical pathways
• The measures of quality we use should be robustly developed and meaningful to health professionals, to patients and to the public – without being “dumbed down”
• We should make the best use of nationally comparable data
• We should be able to compare the quality of care offered in the English NHS with that of the private sector, our UK neighbours and our international peers
What’s the difference between an indicator and a metric?
• A metric is a measure of a known attribute
• An example in a car dashboard would be a speedometer
• Examples within clinical care include generic or condition / procedure specific scales, e.g. the Oxford Hip Score (see the scoring sketch after this list)
• Metrics, whether based on physical instruments or questionnaires, need rigorous testing and calibration, plus precision in use
• A website (http://phi.uhce.ox.ac.uk), based on work previously funded by the NHS IC, contains a bibliographic database, guidance on how to choose appropriate metrics, and topic specific reviews, e.g. mental health
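To make the metric example concrete, here is a minimal Python sketch of scoring a condition-specific scale. It assumes the revised Oxford Hip Score convention of 12 items each scored 0-4 (total 0-48, higher meaning better hip function); the function name and validation are illustrative, not taken from the slides.

def oxford_hip_score(item_responses):
    """Sum a completed Oxford Hip Score questionnaire.

    Assumes the revised scoring convention: 12 items, each scored
    0 (worst) to 4 (best), giving a total of 0-48 where higher
    scores indicate better hip function.
    """
    if len(item_responses) != 12:
        raise ValueError("The Oxford Hip Score has exactly 12 items")
    if any(not 0 <= r <= 4 for r in item_responses):
        raise ValueError("Each item is scored 0-4")
    return sum(item_responses)

# Example: a patient scoring 3 on every item
print(oxford_hip_score([3] * 12))  # 36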
What’s the difference between an indicator and a metric?
• An indicator describes how a measure is expected to be used to judge quality, including clear statements about the intended goal / objective; whether it is expected to be used in isolation or in concert with other measures or indicators; and any thresholds or standards which are expected to be applied
• An example in a car dashboard would be a gauge showing a low level of engine oil
• An example within clinical care is the proportion of patients with controlled high blood pressure (see the sketch after this list)
• An indicator may act as an alert to an issue that needs further investigation
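As a concrete illustration of an indicator rather than a metric, the following Python sketch computes the proportion of patients with controlled high blood pressure and raises an alert when it falls below a threshold, in the spirit of the low-oil gauge. The record fields, the 140/90 mmHg cut-off and the 70% standard are all illustrative assumptions, not figures from the presentation.

def controlled_bp_indicator(patients, threshold=0.70):
    """Proportion of hypertensive patients whose last recorded blood
    pressure is controlled (here taken as < 140/90 mmHg).

    Returns (value, alert): the indicator value plus a flag that marks
    values below an illustrative 70% standard for further investigation.
    """
    denominator = len(patients)  # e.g. patients on a hypertension register
    numerator = sum(1 for p in patients
                    if p["systolic"] < 140 and p["diastolic"] < 90)
    value = numerator / denominator if denominator else None
    alert = value is not None and value < threshold  # an alert, not a verdict
    return value, alert

patients = [
    {"systolic": 132, "diastolic": 84},
    {"systolic": 150, "diastolic": 95},
    {"systolic": 128, "diastolic": 78},
]
print(controlled_bp_indicator(patients))  # (0.666..., True)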
The NHS IC’s website lists over 3,000 indicators, alongside other products, yet covers only a part of the full range of clinical care. There are many more indicators in use locally. This illustrates the challenges we face in monitoring clinical quality.
[Bar chart: emergency admissions to hospital for acute conditions usually managed in primary care, all ages, England 2007/08 (Source: NHS IC Compendium, Crown Copyright). Vertical axis: indirectly age standardised rate per 100,000 with 95% confidence intervals. Horizontal axis: ONS Area Group / Local Authority.]
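For readers unfamiliar with the construction behind a chart like this, the sketch below shows one conventional way to compute an indirectly age-standardised rate per 100,000 with a 95% confidence interval: the observed/expected ratio scaled by the reference crude rate, with Byar’s approximation for the limits. The age bands and counts are invented, and this is an illustrative method rather than the Compendium’s published specification.

def indirectly_standardised_rate(local_obs, local_pop, ref_rates, ref_crude_rate):
    """Indirectly age-standardised rate per 100,000 with a 95% CI.

    local_obs      : observed events in the local area (all ages)
    local_pop      : local populations by age band
    ref_rates      : reference (e.g. England) rates per person by age band
    ref_crude_rate : reference all-ages rate per 100,000

    Uses the standardised ratio O/E scaled by the reference crude rate,
    with Byar's approximation for the confidence limits.
    """
    expected = sum(n * r for n, r in zip(local_pop, ref_rates))
    o = local_obs
    smr = o / expected
    lower = o * (1 - 1 / (9 * o) - 1.96 / (3 * o ** 0.5)) ** 3 / expected
    upper = ((o + 1) * (1 - 1 / (9 * (o + 1))
             + 1.96 / (3 * (o + 1) ** 0.5)) ** 3 / expected)
    return smr * ref_crude_rate, lower * ref_crude_rate, upper * ref_crude_rate

# Illustrative figures only: three age bands
rate, lo, hi = indirectly_standardised_rate(
    local_obs=420,
    local_pop=[50_000, 30_000, 10_000],
    ref_rates=[0.002, 0.006, 0.020],  # reference rates per person
    ref_crude_rate=500.0,             # reference rate per 100,000
)
print(f"{rate:.0f} per 100,000 (95% CI {lo:.0f}-{hi:.0f})")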
Indicator Assurance Process
• A “Pipeline Process” for guiding the development and adoption of national quality indicators
• Supporting guidance in the form of the “Indicator Development Framework”
• A register of indicators being developed and a library of indicators in use, incorporating the data, metadata and associated technical information (a minimal sketch of such a record follows this list)
• A database which will hold the collected data
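The following Python dataclass is a minimal sketch of what one record in such a register or library might hold: the indicator definition plus the metadata and technical information needed to reproduce it. The field names are assumptions for illustration, not the NHS IC’s actual schema.

from dataclasses import dataclass, field

@dataclass
class IndicatorLibraryEntry:
    """Illustrative shape of one library record. Field names are
    assumptions, not the NHS IC's published schema."""
    indicator_id: str
    title: str
    purpose: str               # what judgement the indicator supports
    numerator: str             # definition, not the value
    denominator: str
    data_source: str           # e.g. "HES" or a clinical audit
    construction: str          # e.g. "indirectly age standardised rate"
    thresholds: dict = field(default_factory=dict)
    status: str = "in development"  # register vs library

entry = IndicatorLibraryEntry(
    indicator_id="IND-0001",
    title="Emergency admissions for conditions usually managed in primary care",
    purpose="Flag areas for further investigation of primary care access",
    numerator="Emergency admissions for the defined condition basket",
    denominator="Resident population, all ages",
    data_source="HES",
    construction="Indirectly age standardised rate per 100,000",
    thresholds={"control limits": "95% vs England rate"},
    status="in use",
)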
Indicator Assurance Process
National Quality Board’s Quality Information Committee
• Purpose of indicator
• Rationale – evidence-based standard
• What is measured – numerator, denominator, construction, source of data, completeness of counts, quality of data
• How data are aggregated – type of analysis (direct / indirect standardisation), risk adjustment
• Scientific validity – face, content, construct, criterion, predictive; validity for public, clinicians, performance
• Interpretation – identifying outliers, explaining observations (see the outlier sketch after this list)
• Use – timeliness, gaming, costs, access, credibility, feasibility, usefulness
• Investigation and action – statistical variation, artefacts (e.g. data quality), quality of care
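One common technique behind “identifying outliers” is the funnel plot: control limits around the national value that widen as denominators shrink. The sketch below applies a normal-approximation three-sigma limit to a proportion-type indicator; the figures and the helper name are illustrative, not part of the assurance process itself.

from math import sqrt

def funnel_outlier(obs, n, national_p, z=3.0):
    """Flag a unit whose proportion falls outside z-sigma funnel
    limits around the national proportion (normal approximation).

    obs, n     : the unit's numerator and denominator
    national_p : national proportion used as the funnel centre
    z          : 3.0 ~ 99.8% limits, the usual 'alarm' line
    """
    p = obs / n
    se = sqrt(national_p * (1 - national_p) / n)
    lower, upper = national_p - z * se, national_p + z * se
    return p < lower or p > upper, (lower, upper)

# A unit with 62/100 against a national proportion of 0.75:
# flagged for investigation, not judgement
print(funnel_outlier(62, 100, 0.75))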
Implications of the Outcomes Framework
• Quality Standards developed by NICE (150)
• QOF measures reflecting quality outcomes and informing GP payments
• COF measures reflecting quality commissioning and informing Consortia incentives
• Outcome Measures produced to hold the Commissioning Board to account
• Summary Measures to hold the Secretary of State (SoS) to account
• These should all link together and be informed by professional practice, i.e. the Quality Standard drives the process
Current Challenges for Indicator Production
• Consistent definitions
• Underlying data standards
• Clear purpose and scope – i.e. is the right question being asked?
• Utilisation of appropriate data sources
  – level of granularity (e.g. classification vs terminology)
  – different data sources (e.g. HES vs clinical audit)
• Interpretation – appropriate context needed
• Presentation for different audiences
• Appropriate use
Demonstrating improvement and benefits realisation
• We can only demonstrate improvements if we have clear standards, transparent methodologies etc.
• Problems of snapshot indicators – need for comparison over time
• Statistical process control (see the VLAD sketch after this list)

[Chart: VLAD plot, © extracted from the VLAD CM system produced by OPUS 5K]
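The VLAD (variable life-adjusted display) chart referenced above plots a running total of predicted risk minus observed outcome, so the line drifts upwards when there are fewer deaths than risk-adjusted expectation. The sketch below computes that cumulative sum for invented cases; it illustrates the general VLAD idea rather than the specific OPUS 5K implementation.

def vlad_series(cases):
    """Variable life-adjusted display: running sum of
    (predicted risk of death - observed death) per case.

    An upward drift means fewer deaths than risk-adjusted
    expectation; a sustained downward drift is a signal worth
    investigating, not a verdict on quality of care.
    """
    total, series = 0.0, []
    for predicted_risk, died in cases:
        total += predicted_risk - (1 if died else 0)
        series.append(total)
    return series

# Invented cases: (predicted risk of death, died?)
cases = [(0.05, False), (0.20, False), (0.10, True), (0.08, False)]
print(vlad_series(cases))  # [0.05, 0.25, -0.65, -0.57]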
Conclusions
• The ultimate choice of metrics / indicators depends on whether they meet the criteria for good indicators: purpose, use, whether found useful, investigation, follow-up and impact in terms of continuously improving quality
• This requires an agreed process
• Documentation on the NHS IC’s indicator assurance process (pipeline) is available in the briefing pack
• Technical aspects and complexity require organisations to develop capacity
We all know that we get what we measure, so we must be careful what we ask for.

[Image: http://en.wikipedia.org/wiki/Image:Rembrandt_Harmensz._van_Rijn_079.jpg]