Boards, Dashboards, and Data


Boards and Dashboards

Better Health: The Politician's Role in Quality Development. Vejle, Denmark, April 2010

James L. Reinertsen, M.D.

307.353.2294

www.reinertsengroup.com

www.orboardworks.com

Boards see a lot of information, e.g. a Primary Care Trust (PCT) in England: board papers from March 2008 to August 2008.

Monthly (every month): minutes; financial monitoring report; audit committee minutes; performance report; PEC minutes; community provider board minutes.

Annually / ad hoc: workforce annual report; knowledge and skills framework; PCT stop-smoking service; use of the Common Seal; gift, hospitality and sponsorship register; mental health inpatient review; mental health provider minutes; cervical screening annual report; PCT child measurement team annual report; audit committee annual report; community health provider minutes; disciplinary procedure; home-working policy; Our Healthier Nation report; mental health provider board terms of reference; equality and diversity report; the PCT's Asian health and lifestyle survey 2006-7; NHS complaints procedure end-of-year report; annual review of supply strategy; outline business case for development of a palliative care centre; establishment agreements for SCGs and LCCBs; Aiming High for Disabled Children; The Royal XX Hospitals NHS Trust; childhood obesity national support team visit; Healthcare Commission core standards 2007-8; assurance framework and statement of internal control 2007-8; Corporate Manslaughter and Corporate Homicide Act 2008; revenue budget overview 2008-9; standing orders, reservation and delegation of powers and financial instructions; report from the mental health provider committee; HR policies (substance misuse; harassment and bullying; recruitment and selection; pay and banding for training; accessing criminal records information for recruitment).

Strategic plans: Children's Plan 2008-2013; strategic development plan; corporate objectives 2008-9; VFM plans 2008-9; joint core strategy for the Black Country; Walsall tPCT healthcare strategy; estates strategy 2008-12; improvement towards excellence plan; local operating plan 2008-9; capital programme 2008-11; SSDP.

Typical papers are 20-30 pages long. (Source: McKinsey & Company)


So… We Create "Dashboards" (Also Called "Scorecards," "Performance Reports," etc.)

One Region’s “Dashboard” on National Indicators

A Regional Report Comparing 6 Hospitals to the 90% "Standard": issuing of discharge letters (a discharge letter must be sent to the patient's own general practitioner no later than 3 working days after the patient is discharged from the hospital)

[Chart comparing Hospitals A through F against the 90% standard]

Presence of pressure ulcers: studies of prevalence at different hospitals. On average, one third of all hospitalized patients have a pressure ulcer.

[Table of pressure ulcer prevalence at Hospitals X, Y, Z, and V, reported both as grade 0-4 and as grade 1-4 of total; reported values include 32.5%, 17.3%, 41.5%, 35.4%, 42.9%, and 22.7%] (www.regioner.dk)

HSMR at hospital level (3rd quarter 2009)

[Chart of HSMR for Hospitals A through G; values at the high end: 116, 123, 121, 113; values at the low end: 89, 88, 94] (www.regioner.dk)

HSMR at regional level

[Table of HSMR by region for the 1st, 2nd, and 3rd quarters of 2009, covering the Capital Region of Denmark, Region Zealand, Region of Southern Denmark, Central Denmark Region, and North Denmark Region; reported values: 95, 96, 98, 107, 113, 118, 105, 92, 120, 100, 93, 103, 105, 92, 111] (www.regioner.dk)
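(For reference: HSMR, the Hospital Standardised Mortality Ratio, is the number of observed deaths divided by the number of statistically expected deaths, multiplied by 100. A value of 116 therefore means roughly 16% more deaths than expected for that case mix, while 92 means roughly 8% fewer.)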

Hospital Infections (HAI)

Between 7% and 10% of all hospitalized patients have an HAI. Hospitals get their own data from the two annual national HAI prevalence studies (but these data are not public).

HAI prevalence by region, 2008 and 2009:

Capital Region of Denmark: 10.6% (2008), 9.6% (2009)
Central Denmark Region: 8.0% (2008), 5.6% (2009)
North Denmark Region: 8.1% (2008), 8.9% (2009)
Region Zealand: 6.5% (2008), 7.3% (2009)
Region of Southern Denmark: 4.8% (2008), 7.7% (2009)
Total: 8.7% (2008), 8.3% (2009)

(www.regioner.dk)

Types of hospital infection (share at national level):

Urinary tract infection: 28.6% (2008), 27.5% (2009)
Lower respiratory infections: 25.0% (2008), 26.2% (2009)
Post-operative wound infection: 38.0% (2008), 34.0% (2009)
Bacteremia/sepsis: 8.3% (2008), 12.3% (2009)

(www.regioner.dk)

Exercise

• Discuss your Region's "Quality Reports" at the Board:
  ─ Are your aims for quality as clear as your aims for financial balance? (Test: ask a non-executive director to explain the quality and finance aims in one or two sentences.)
  ─ How timely are the measures on the dashboard? Why does it take so long to get the quality-of-care data?
  ─ For safety measures, does the dashboard answer the question "How many patients were harmed?"
  ─ Where does the review of your quality report come in your meeting agenda? First? Close to last?

A Health Care System’s Core Work

Inputs (patients, staff, supplies, equipment, buildings) → Care Processes (diagnosing, treating, explaining, teaching, monitoring, documenting) → Outputs (care outcomes, harm rate, customer satisfaction, cost per case)

How This Looks to Many Board Members

[The same Inputs → Care Processes → Outputs diagram]

How This Looks to Many Clinicians

[The same diagram from the clinicians' perspective: Outputs (care outcomes, harm rate, customer satisfaction; cost per case) and Inputs (patients, staff, supplies, equipment, facilities)]

Many non-executive directors have trouble with the "Quality and Safety Dashboard." One main reason is that the dashboard contains many detailed "process of care" measures that doctors understand but bankers don't.

The other main reason is that many dashboards mix together the answers to two questions that boards should ask about quality:

"How does our quality compare to standards and requirements?"
"Are we on track to achieve our quality aims?"

The Comparison Dashboard

• How do we compare to…
  – Other hospitals and regions?
  – Regulatory standards?
  – Targets?
  – Pay-for-performance thresholds?
• Hundreds of measures
  – Processes
• Measures are typically
  – Externally defined
  – Risk-adjusted
  – Apples to apples (e.g., rates per procedure)
  – Slow
  – Tinged with fear

The Strategic Dashboard

• Are we on track to achieve our aims?

  – Reduce harm
  – Improve outcomes
  – Improve satisfaction
  – Reduce costs
• A few key measures
  – Outcomes, drivers
• Measures are typically
  – Internally defined
  – Close to real time
  – "Good enough"

“Findings…dashboards are generally used to create general awareness rather than used to guide operations and performance management… Greater hospital quality was linked to shorter, more focused dashboards, active use of dashboards for operations management, and strong influence of board quality committees in dashboard content and implementation.” Kroch et al., Journal of Patient Safety 2 (1) 10-19, March 2006

Examples: Strategic Safety Aims for Acute Hospitals

• “We will achieve a 50% reduction in hospital acquired infections within 12 months, as measured by the sum of Central Line Bloodstream Infections, Ventilator-Acquired Pneumonias, and Catheter-Associated Urinary Tract Infections.”

WellStar Health System

• “We will reduce Harm by 80%, as measured by Serious Safety Events, within 3 years.”

Cincinnati Children’s

The lead item on the strategic safety dashboard answers the question: “Are we on track to achieve our aim?”

229 Infections Avoided Thus Far!


HAI Reduction, July 2008 to April 2009

[Run chart of monthly HAI counts across FY08 and FY09 (through April), plotted against the FY08 and FY09 targets]

Serious Safety Events per 10,000 Adjusted Patient Days, Rolling 12-Month Average

[Run chart, FY2005 through FY2010, y-axis 0.0 to 1.8. Annotation: aSSERT began July 2006; arrow indicates the desired direction of change. Legend: SSEs per 10,000 adjusted patient days; fiscal-year goals (FY07 = 0.75, FY08 = 0.50, FY09 = 0.20); baseline 1.0 (FY05-06); threshold for significant change. Chart updated through 31 Aug 09 by Art Wheeler, Legal Dept.]

** Each point reflects the previous 12 months. The threshold line denotes a significant difference from baseline for those 12 months (p = 0.05).

** The narrowing thresholds in FY2005-FY2007 reflect increasing census; adjusted patient days for FY07 were 27% higher than for FY05.

Source: Legal Dept.
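For readers who want to see the arithmetic behind such a rolling measure, here is a minimal sketch (the function name, monthly counts, and census figures are hypothetical illustrations, not Cincinnati Children's actual method):

```python
# Minimal sketch: rolling 12-month rate of serious safety events (SSEs)
# per 10,000 adjusted patient days. Data and names are hypothetical.
from collections import deque

def rolling_sse_rate(monthly_events, monthly_adj_patient_days, window=12):
    """Yield the trailing 12-month SSE rate per 10,000 adjusted patient days."""
    events, days = deque(), deque()
    for e, d in zip(monthly_events, monthly_adj_patient_days):
        events.append(e)
        days.append(d)
        if len(events) > window:          # keep only the last 12 months
            events.popleft()
            days.popleft()
        yield 10_000 * sum(events) / sum(days)

# Example with made-up monthly counts and census figures
sse_counts = [3, 2, 4, 1, 2, 3, 2, 1, 2, 1, 1, 0, 1]
adj_patient_days = [30_000] * 13
rates = list(rolling_sse_rate(sse_counts, adj_patient_days))
print(f"Latest rolling 12-month rate: {rates[-1]:.2f} per 10,000 adj. patient days")
```

Each plotted point in a chart like the one above corresponds to one such trailing-12-month value, which is why the line moves slowly and smooths out month-to-month noise.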

Not-So-Specific Aims

• "Our region strives to achieve the highest levels of quality in service of our community."
• "We aim to be in the top tier of hospitals for quality and safety."

As measured by…?

By when…?


Murky aims beget murky dashboards.

Murky dashboards beget murky accountability.

Exercise

• What are your region’s “strategic aims” for quality and safety?

─ How good?
─ By when?
─ As measured by?

• How do you measure and track them?

• What is your “strategic theory” of how you will achieve these aims?

• Who is responsible for achieving them?

Your Strategic Theory Drives the Creation of the Board "Strategic Quality Dashboard"

Big Dots (Pillars, BSC…):
  What are your key strategic aims? How good must we be, by when?
  What are the system-level measures of those aims?

Drivers (Core Theory of Strategy):
  Down deep, what really has to be changed, or put in place, in order to achieve each of these goals?
  What are you tracking to know whether these drivers are changing?

Projects (Ops Plan):
  What set of projects will move the drivers far enough, fast enough, to achieve your aims?
  How will we know if the projects are being executed?

Example for an Acute Hospital: A Strategic Theory for the Aim "Reduce Mortality Rate"

Big Dot Aim: Reduce mortality rate by 20% in 24 months, as measured by the Hospital Standardized Mortality Ratio (from 105 to 85).

Drivers (Core Theory of Strategy):
• Culture of teamwork, as measured by a monthly survey of key nursing units
• Reliable evidence-based medicine delivery, as measured by the number of CORE + SCIP measures > 95%
• Improved end-of-life care, as measured by the percentage of deaths in home care or hospice

Projects (Ops Plan): What set of projects will move the drivers far enough, fast enough, to achieve your aims? How will we know if the projects are being executed?
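As a rough illustration of how this cascade could sit behind a strategic dashboard, here is a minimal sketch that reuses the elements of the example above (the class names, fields, and the numeric driver values are assumptions for illustration, not part of the original presentation):

```python
# Minimal sketch of the Big Dot -> Drivers -> Projects cascade behind a
# strategic dashboard. Class names, fields, and driver values are illustrative.
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str
    goal: float
    current: float
    lower_is_better: bool = False  # True for measures such as HSMR or harm counts

    def at_goal(self) -> bool:
        # Simplistic point-in-time check; a real dashboard would look at the trend.
        return self.current <= self.goal if self.lower_is_better else self.current >= self.goal

@dataclass
class Driver:
    name: str
    measure: Measure

@dataclass
class BigDotAim:
    statement: str
    measure: Measure
    drivers: list = field(default_factory=list)

# The mortality example from the slide above (driver goal/current values are made up)
mortality_aim = BigDotAim(
    statement="Reduce mortality rate by 20% in 24 months",
    measure=Measure("HSMR", goal=85, current=105, lower_is_better=True),
    drivers=[
        Driver("Culture of teamwork", Measure("Monthly teamwork survey score", 90, 72)),
        Driver("Reliable evidence-based medicine", Measure("CORE + SCIP bundle compliance (%)", 95, 80)),
        Driver("Improved end-of-life care", Measure("% of deaths in home care or hospice", 60, 45)),
    ],
)
print(mortality_aim.measure.at_goal())  # False: HSMR is still well above the goal of 85
```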

The Ideal Strategic Dashboard Parallels the Strategic Theory

[Dashboard mock-up: run-chart panels for the Mortality Rate (with goal line) and the driver measures, including Teamwork]

The Ideal Strategic Dashboard Parallels the Strategic Theory

[The same dashboard panels]

Q: Are we executing our strategy?

A: For End-of-Life Care, and perhaps for Teamwork, Yes!

For Evidence-Based Medicine, No!

The Ideal Strategic Dashboard Parallels the Strategic Theory

[The same dashboard panels: Mortality Rate (with goal line), Teamwork, and CORE + SCIP > 95%]

Q: What is your diagnosis of this situation?

It's not enough to have a dashboard that tracks your system-level aims and drivers. If you are to achieve your goals, the board and senior management must review the key data on big dots and drivers and, if needed, respond quickly with changes in strategy or improvements in execution.

Summary: The Strategic Dashboard Answers the Questions "Are We on Track to Achieve Our Aims?" and "Is Our Strategy Working?"

• To answer these questions…
  ─ The Board Dashboard should parallel the organization's aims and strategic theory.
  ─ The measures should be weekly or monthly, real time, and displayed as run charts (see the sketch after this list).
  ─ Measures do not necessarily need to be risk-adjusted or displayed as rates; in many instances you can eliminate the denominator (for example, report the raw count of patients harmed each month rather than a rate).

─ Management and the board should review the key system-level measures at every meeting.
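As an illustration of what such a run-chart display might look like, here is a minimal sketch (the monthly harm counts and labels are made up for illustration, not taken from any dashboard above):

```python
# Minimal run-chart sketch for a board-level harm measure.
# Monthly counts below are hypothetical, used only to illustrate the display.
import statistics
import matplotlib.pyplot as plt

months = ["Jul", "Aug", "Sep", "Oct", "Nov", "Dec",
          "Jan", "Feb", "Mar", "Apr", "May", "Jun"]
patients_harmed = [14, 12, 13, 11, 9, 10, 8, 9, 7, 6, 7, 5]  # raw counts, no denominator

plt.plot(months, patients_harmed, marker="o", label="Patients harmed per month")
plt.axhline(statistics.median(patients_harmed), linestyle="--", label="Median")
plt.title("Harm Run Chart (illustrative data)")
plt.ylabel("Number of patients harmed")
plt.legend()
plt.show()
```

Plotting the raw count over time keeps the board's question concrete ("How many patients was that?") while still showing whether the trend is moving in the desired direction.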

What About the Other Important Type of Quality Question?

• How does our quality and safety measure up…
  ─ To other health care regions and hospitals?
  ─ To standards and regulatory requirements?
  ─ To industry "benchmarks"?
  ─ …etc.

One US Hospital’s Example

A Danish Hospital’s Example

What Boards Should Know About Data on "How Good Are We, and How Do We Compare to Others and/or to Regulatory Standards?"

Upside
• Often risk-adjusted
• Apples to apples
• Source of pride
• Source of energy for improvement
• Necessary "staying in business" requirement (licensure, deemed status…)

Downside
• Time lag (months)
• Static (no data over time)
• If you look bad, energy is wasted on "the data must be wrong"
• If you look good, you become complacent
• How you look depends on how others perform
• Standards and benchmarks are full of defects ("the cream of the crap")

Recommendations for Board Use of "How Do We Compare to Others?" Dashboards

1. Don't use comparative reports to oversee and guide improvement at each meeting.
2. Do ask for an "exception report" for any measures that are "off the regulatory and compliance rails."
3. Create a separate dashboard with all your publicly reported "compared to others" data and review it annually.
4. Compare to the best, not the 50th percentile.
5. Always make sure you know how "green" is determined.

Summary: Good Board Practices for Dashboards

• Separate the "comparison" and "strategic" questions into two dashboards.

• Use the “comparison” dashboard to take stock from time to time, not to steer by.

• Set a few system-level, specific aims, and develop a Strategic Dashboard with timely, "good enough" data that is based on your theory of what needs to happen to achieve the aims.

• Spend time on your strategic dashboard: If you’re not on track to achieve your aims, start asking hard questions.

Do you have any opportunities to improve your eyesight?

Oversight: Questions that all Boards should ask, regularly:

• Are we on track to achieve our aim?

• Are we executing our strategy to achieve our aim?

• Are we “off the rails” on any regulatory or compliance issues?

• Does this set of re-credentialing recommendations fully support our mission, aims, and strategies?

• How many patients is that?

• Who is the best in the world?

• Were patients and families involved?

Dashboard Workshop

• Assess your own region's quality dashboard.
  ─ Are major aims crystal clear on the dashboard? (How good, by when, as measured by…)
  ─ Which measures belong on the "How do we compare to others/standards?" dashboard, and which belong on the "Are we on track to achieve our aims?" dashboard?
  ─ How timely are the measures? How could you reduce the delay in getting feedback on performance?
  ─ For harm-related measures, does the dashboard answer the question "How many patients was that?"

• List three specific improvements you intend to make in your board’s quality dashboard.