Collecting & Providing Patient Feedback for Quality


Transcript: Collecting & Providing Patient Feedback for Quality

Introduction to Improving the
Patient Experience Series
Part 2 – April 7, 2010
Measuring the Patient Experience
Tammy Fisher, MPH
Director, Quality & Performance Improvement
San Francisco Health Plan
Agenda
• Purposes of Measurement
• Measurement to identify areas for improvement
  – Tools, methodologies, frequency
• Measurement to evaluate impact of changes
  – Data collection strategies, tools, and methodologies
• Measurement to spread and sustain improvements
  – Tools, methodologies, frequency
• Case Study
  – San Francisco Health Plan
• Providing feedback
  – Strategies
2
Purposes for Measurement

Aspect | Improvement | Accountability | Research
Aim | Improvement of care | Comparison, choice, reassurance | New knowledge
Test observability | Test observable | Evaluate current performance; no test | Test blinded
Bias & sample size | Consistent bias; just enough data | Measure and adjust to reduce bias; 100% of data | Design to eliminate bias; "just in case" data
Flexibility of hypothesis | Flexible hypothesis, changes as learning occurs | No hypothesis | Fixed hypothesis
Testing strategy | Sequential tests | No tests | One test
Is change an improvement? | Run or control charts | No change focus | Hypothesis tests (F-test, t-test, chi-squared, p-value)
Confidentiality of data | Used only by those involved in improvement | Available for public consumption | Identities protected
3
Applying it to Patient Experience
1. Improvement
  • Understand impact of changes
  • Provide rapid feedback – engagement strategy
  • Convince others to try changes
2. Accountability
  • Diagnostic – identify high leverage areas and people for targeted improvements
  • Sustainability – public reporting, pay for performance
3. Research – borrow methods
  • Build a compelling business case to Leadership
4
Measurement Continuum
Identify Areas and Providers for Improvement → Evaluate Impact of Changes → Spreading & Sustaining Improvements
5
Identify Areas and People for Improvement
• Robust surveys
• Robust measurement methodologies
• Measure annually
• Data at the organization and individual provider level
• Look at composites strongly correlated with overall ratings of experience
6
Validated Surveys
• Clinician Group CAHPS Survey
  https://www.cahps.ahrq.gov/content/products/CG/PROD_CG_CG40Products.asp?p=1021&s=213
• Clinician Group CAHPS Visit Survey
  https://www.cahps.ahrq.gov/content/products/CG/PROD_CG_CG40Products.asp?p=1021&s=213
• PBGH Short PAS Survey
  PAS website: http://www.cchri.org/programs/programs_pas.html
  Short PAS survey: http://www.calquality.org/programs/patientexp/documents/Short_Form_Survey_PCP_feb2010.doc
7
Survey Options

MTC (ph 800-295-9681, ask for Guy Swenson)
• Method of administration: Telephonic
• Cost: $5-10 per completed survey
• Considerations: + can customize the survey; development costs are low and turnaround is quick; + rapid feedback (usually within two weeks of survey completion); - reporting is limited, so internal resources are needed to manipulate data for reporting purposes

Sullivan/Luallin (ph 619.283.8988 or www.sullivanluallin.com)
• Method of administration: Mailed survey
• Cost: Variable
• Considerations: + recognized by CAPG; + good reporting capabilities; + in wide use by multiple groups; + option for customization
• Groups using it: Many CA groups (Beaver, Sharp)

Press Ganey (www.pressganey.com)
• Method of administration: Mailed survey
• Cost: Call for a quote
• Considerations: + robust survey, good reputation; + excellent reporting capability; - especially good in hospitals/home care, less so in outpatient
• Groups using it: UCSF

PBGH doctor-level survey (Ted VonGlahn, ph 415-615-6318)
• Method of administration: Mailed survey once a year
• Cost: $185 per doctor
• Considerations: + very robust reporting, including a detailed, actionable physician-level report; + robust algorithms for selecting random samples; - limited for QI purposes
• Groups using it: 40 groups in CA

AMGA (http://www.amga.org/QMR/PSAT/index_psat.asp)
• Method of administration: Point of service survey
• Cost: Check costs on their website; a little complicated
• Considerations: + in wide use; + provides feedback regularly; + analytic and reporting capabilities; + good benchmarks; + includes methodologies for assuring a random sample; - once data are forwarded, reports follow 5-6 weeks later
• Groups using it: A large number of national and CA groups

Avatar (www.avatar-intl.com)
• Method of administration: Mailed survey
• Cost: Ask for a quote
• Considerations: + in wide use nationally; + provides feedback regularly; + includes methodologies for assuring a random sample; + good benchmarks; + analytic and reporting capabilities
• Groups using it: St. Joseph Heritage Medical Group, John Muir MG, Physician Associates, Camino Medical Group, CQC doctors in first Collaborative
8
Robust Methodologies
• Mail administration
– 3 waves of mailing (initial mail,
postcard reminder, second mail)
• Telephone administration
– At least 6 attempts across different
days of the week and times of day
• Mixed mail and telephone
administration
– Boost mail survey response by adding
telephone administration
9
Tips
• Survey
  – Include questions that matter most to consumers
  – Questions that ask about care experience
  – Applicability across heterogeneous populations
  – Demonstrates strong psychometric properties
• Reporting
  – Includes internal and external benchmarks
• Methodology
  – Appropriate sampling (reduce bias, large samples)
  – Standardized protocols
  – Timeframe: in the last 12 months
• Frequency
  – Annually
10
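The "appropriate sampling" bullet above is the piece most teams end up scripting. Here is a minimal sketch of drawing a simple random sample from a patient roster in Python; the roster contents, sample size, and fixed seed are illustrative assumptions, not part of the presentation.

```python
# Minimal sketch: draw a simple random sample of patients for a survey mailing.
# The roster (list of patient IDs), sample size, and seed are assumptions.
import random

def draw_survey_sample(patient_ids, n_sample, seed=2010):
    """Return n_sample patient IDs chosen uniformly at random, without replacement."""
    rng = random.Random(seed)  # fixed seed so the draw can be reproduced for audit
    return rng.sample(list(patient_ids), n_sample)

# Example usage with a hypothetical roster of patients seen in the last 12 months
roster = [f"PT{i:05d}" for i in range(1, 2501)]
sample = draw_survey_sample(roster, n_sample=300)
print(len(sample), sample[:3])
```

Drawing more names than the target number of completed surveys leaves room for non-response while keeping the selection unbiased.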
Evaluate Impact of Changes
• Data collection tool specific to
changes tested
• Methodologies that allow for
sequential testing – small samples,
less standardization
• Data given to individuals testing
changes
• Frequent feedback – daily, weekly,
monthly
• Inexpensive methods
11
Data Collection Tools
• Point of service surveys
• Telephonic surveys
• Comment cards
• Patient exit surveys
• Focus groups
• Kiosks, via web
12
Point of Service
• Good for measuring the effect of changes tested
• Focus on meaningful measures
• Have 4-6 response choices
• Include 8-20 measures
• Document collection methodology; train staff
collecting information
• Collect “just enough” data
• Have at least 15 completed surveys and 15
measurement points
• Easy to develop reports
• Data collection is burdensome!
13
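Tying to the "easy to develop reports" point above, here is a minimal sketch of tallying point-of-service responses into a per-period percent score, similar in spirit to the SFHP tables later in this deck; the periods, response labels, and counts are illustrative assumptions, not actual clinic data.

```python
# Minimal sketch: turn point-of-service responses into per-period "top-box" percents.
# Periods, response labels, and counts are illustrative, not actual clinic data.
from collections import Counter

responses = {
    "Period 1": ["Yes, definitely"] * 21 + ["Yes, somewhat"] * 2,
    "Period 2": ["Yes, definitely"] * 15 + ["Yes, somewhat"] * 2 + ["No"],
}

for period, answers in responses.items():
    counts = Counter(answers)
    pct_top = 100 * counts["Yes, definitely"] / len(answers)
    print(f"{period}: n={len(answers)}, top-box = {pct_top:.0f}%")
```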
Telephonic Surveys
• More rapid feedback than mailed surveys
• Typically less expensive
• Outside vendors do it and provide reports
• Easy to manipulate data for reporting
• Less frequent – monthly data at best
• Literature suggests more bias than mailed surveys
14
Sample Comment Card
Comment Card
We would like to know what you think about your visit with Doctor X.
Response options: □ Yes, Definitely  □ Yes, Somewhat  □ No
Did Dr. X listen carefully to you?
Did Dr. X explain things in a way that was easy to understand?
Is there anything you would like to comment on further?
Thank you. We are committed to improving the care and services we provide our patients.
15
Patient Exit Interviews
• Rapid feedback on changes tested
• Not burdensome to collect data
• Uncover new issues which may go
unreported in surveys
• Requires translation of information into
actionable behaviors
• Providers “see” the feedback
• Include 3-5 questions, mix of specific
measures and open ended questions
• Receptionist or non-clinic member obtains
feedback (HP or IPA staff)
16
Spreading & Sustaining Improvements
• Survey
  – Include questions that matter most to consumers
  – Questions that ask about care experience
  – Applicability across heterogeneous populations
  – Demonstrates strong psychometric properties
• Reporting
  – Comparisons within peer group
• Methodology
  – Appropriate sampling (reduce bias, large samples)
  – Standardized protocols
  – Risk adjustment
  – Timeframe: most recent visit
• Frequency
  – Quarterly
17
CASE STUDY: SFHP
18
Areas for Improvement
• Provider-patient communication, office staff, & access to care
  – Performed in the lowest quartile
  – PPC and access strongly correlated with overall ratings of care
  – Office staff support provider-patient communication – team approach
19
Start Small, then Scale Up
• 3-10 practices, 6-8 months
  – Learn about getting results at your practices
  – Develop physician and staff champions
  – Understand what it takes from the group to support practice changes
• 6-12 months
  – Design systems and tools to support changes across many sites
• Network rollout
Thanks to Chuck Kilo, MD
20
Improvement Project
• AIM: To improve CAHPS scores by
achieving the 50th percentile in the
following composites by MY 2012:
– Access to care
– Provider-patient communication
• APPROACH
– Begin with 10 pilots
– Spread to most providers by MY 2011
21
Purposes for Measurement
1. For Leadership to know if changes
have an impact and to build a
compelling case to spread
changes to other clinics
2. For Clinics to get rapid feedback
on tests of change to understand
their progress towards their own
aims
22
Purpose 1 (for Leadership): Measures & Approach

Measure: Patients' ratings of their care
• Methodology: Point-of-care survey, about 25 questions, using a nationally recognized tool
• Frequency: Quarterly
• Reports: At provider level with roll-up to clinic; risk-adjusted data, delineating statistical significance; showing data over time

Measure: Clinic site satisfaction
• Methodology: Anonymous online survey instrument
• Frequency: Quarterly
• Reports: Data over time
23
Patient Ratings of their Care
• Standardized survey instrument based on the
Clinician-Group CAHPS visit survey, about 30
questions
• Administered at the point of care by clinic
– SFHP provides surveys in 3 languages (English, Spanish,
Chinese) and picks up surveys on Friday of each week
• Defined methodology – all patients, given after the
visit
• Five fielding periods: April 2010, July 2010, Oct
2010, Jan 2011, April 2011
• Each fielding period is 3 weeks
• Risk adjusted results at the provider level with roll up
at clinic level
• Extra incentives – up to $500 per clinic
24
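The slide above calls for risk-adjusted results at the provider level with roll-up to clinic. Below is a minimal sketch of one common observed-minus-expected adjustment, assuming pandas and scikit-learn plus illustrative column names (provider_id, score, age, self_rated_health); this is a generic approach, not SFHP's actual adjustment model.

```python
# Minimal sketch: observed-vs-expected risk adjustment of provider-level scores.
# Column names and the adjuster set (age, self-rated health) are assumptions.
import pandas as pd
from sklearn.linear_model import LinearRegression

def risk_adjusted_scores(df: pd.DataFrame) -> pd.Series:
    """df has one row per completed survey with columns:
    provider_id, score (0-100), age, self_rated_health (1-5)."""
    X = df[["age", "self_rated_health"]]
    expected = LinearRegression().fit(X, df["score"]).predict(X)  # score predicted from patient mix alone
    grouped = df.assign(expected=expected).groupby("provider_id")
    # adjusted = observed provider mean - expected provider mean + overall mean
    return grouped["score"].mean() - grouped["expected"].mean() + df["score"].mean()
```

The adjustment removes differences explained by patient mix, so providers serving older or sicker panels are not penalized in the comparison.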
Clinic/Practice Site Satisfaction
• Survey instrument based on Dartmouth and Tantau & Associates tools, about 20 questions
• Administered online by SFHP
– SFHP sends a link to complete the survey
online
– Anonymous, results can be aggregated by
role
• Five fielding periods: March 2010, June 2010,
Sept 2010, Dec 2010, March 2011
• Each fielding period is 2 weeks
• Results at the clinic level 2 weeks following the
close of the measurement period
25
Purpose 2 (for Clinics): Measures & Approach

Measure: Patients' ratings of their care – select 5-7 measures based on the AIM statement
• Methodology options:
  1. Point of service survey
  2. Telephonic survey
  3. Comment cards
  4. Web-based survey
  5. Patient exit interviews
• Frequency: Weekly to monthly
• Reports: Clinics document experience and results in a narrative
26
PROVIDING FEEDBACK
27
Tips
• Provide supportive feedback (nonjudgmental)
• Include peer comparisons, targets,
explanation of measures, show trended
data over 2-3 years, identify “actionable
behaviors”
• Meet 1:1, use peer/clinic group meetings,
dashboards, distribute via mail/email/web
• Include testimonials from providers and
patients – “stories”
• Encourage peer-to-peer interactions to follow up with providers
28
How Data is Displayed is
Important
• Pre/Post data collection
+ larger samples, can test for statistical significance
+ easy to interpret data
- may miss an opportunity to intervene – results masked by
natural variation
- can’t measure sustainability
• Run charts
- hard to interpret
- need enough data to establish trends
+ analyze variation and pinpoint when improvement occurred
+ measures process and ability to act on “slippage”
+ frequent feedback over time
+ evaluate sustainability
• Narrative
+ hear the patient’s voice – see their comments
+ get data quickly
- hard to identify trends and pinpoint areas for improvement
29
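As a companion to the pre/post bullet above ("larger samples, can test for statistical significance"), here is a minimal sketch of a chi-squared test on top-box counts before and after a change, using scipy; the counts are illustrative assumptions, not actual survey data.

```python
# Minimal sketch: pre/post significance check on top-box ("Yes, definitely") counts.
# The counts below are illustrative, not actual clinic data.
from scipy.stats import chi2_contingency

#              top-box, not top-box
pre_counts  = [180, 120]   # 300 surveys collected before the change
post_counts = [230,  90]   # 320 surveys collected after the change

chi2, p_value, dof, _ = chi2_contingency([pre_counts, post_counts])
print(f"chi-squared = {chi2:.2f}, p = {p_value:.3f}")
# A small p-value (e.g., < 0.05) suggests the shift is unlikely to be noise alone.
```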
Total number of completed responses per question by measurement period

Measurement period | Spend enough time | Warm greeting | Explains things well | Receptionist helpful | Receptionist respectful
9/22/09-9/24/09 | 23 | 23 | 23 | 22 | 23
10/5/09-10/16/09 | 17 | 17 | 17 | 18 | 17
11/4/09-11/16/09 | 41 | 41 | 41 | 41 | 41
12/1/09-12/4/09 | 34 | 34 | 34 | 34 | 34

Patient comments
9/22/09-9/24/09: Liked the questionnaire handed to me at my visit. The doctor remembered that I went on a trip - what a great memory!

Opportunities for Improvement
• Explains things well
• Spends enough time
• Print an after-visit summary (see attached)
• Use "ask before telling" technique and use short summaries technique.
32
Run Chart
Spend enough time – percent score by fielding period

Measurement period | Excellent | Good | Fair | Poor
9/22/09-9/24/09 | 91.0% | 8.0% | 0.0% | 0.0%
10/5/09-10/16/09 | 100.0% | 0.0% | 0.0% | 0.0%
11/4/09-11/16/09 | 75.0% | 25.0% | 0.0% | 0.0%
11/16/09-11/25/09 | 92.0% | 7.0% | 0.0% | 0.0%
12/1/09-12/4/09 | 95.0% | 5.0% | 0.0% | 0.0%
33
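A minimal sketch of how the run chart above could be drawn with matplotlib, using the "Excellent" percentages from the table; the median center line is a common run-chart convention and an assumption here, not something specified on the slide.

```python
# Minimal sketch: run chart of the "Spend enough time" percent-Excellent scores
# from the table above, with a median center line (a common run-chart convention).
import statistics
import matplotlib.pyplot as plt

periods = ["9/22-9/24", "10/5-10/16", "11/4-11/16", "11/16-11/25", "12/1-12/4"]
pct_excellent = [91.0, 100.0, 75.0, 92.0, 95.0]

fig, ax = plt.subplots()
ax.plot(periods, pct_excellent, marker="o", label='% "Excellent"')
ax.axhline(statistics.median(pct_excellent), linestyle="--", color="gray", label="Median")
ax.set_ylim(0, 100)
ax.set_ylabel("Percent score")
ax.set_title("Spend enough time - run chart by fielding period")
ax.legend()
plt.tight_layout()
plt.show()
```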
Run Chart
Overall rating of care
[Run chart: "Patient's Experiences with their Care," plotted monthly from Jan-09 through Mar-10; y-axis 0-12]
34