Quality Improvement Learning in the Education Centered Medical Home:
Student Questions and Self-Evaluation of QI Skills
Kristen M Unti1; Adrian Nicholas Gaty1; Lindsay DiMarco, MPH2; Daniel B Evans, MD3; Donna Woods, EdM, PhD2

1Medical Student, Northwestern Feinberg School of Medicine; 2Center for Healthcare Studies, Northwestern Feinberg School of Medicine; 3General Internal Medicine, Northwestern Feinberg School of Medicine
Background
The quality of the healthcare delivered in the United States is a
growing concern, and accurate assessment of the quality of care is
becoming a required competency for the next generation of
physicians. Yet most current medical school curricula devote little
time or effort to educating medical students in assessing the quality
of the care they provide.
Methods
In a pioneering endeavor to bring continuity of care to students’
medical education, Northwestern University’s Feinberg School of
Medicine began an IRB-approved project in September 2011 called
the Education-Centered Medical Home (ECMH), combining the ideas
behind a “patient-centered medical home” with an emphasis on
comprehensive education including continuity, team-based care, and
patient safety and quality improvement. There are presently 213
students in 13 ECMH clinic practices.
Table of Common Themes in ECMH Students' Questions and Concerns
Regarding Quality Assessment and Improvement (N=188)

Question or Concern                                                Number (%)
Use/what actually comes of quality data                            18 (13.0)
Barriers measuring quality data                                    17 (12.3)
Effect of EMR complexity and accuracy on quality data              17 (12.3)
Making data collecting more efficient                              15 (10.9)
How to know/who decides best quality metrics to measure            13 (9.4)
Specific questions about a given metric                            13 (9.4)
How to account for patient's choice/refusal of care                12 (8.7)
Not applicable to all patients                                     10 (7.3)
Impact of communication between students/providers about
  patient's quality metrics                                        10 (7.3)
How to know what to improve and how to improve it                   7 (5.1)
Effects on quality data of other factors (resources of clinic,
  SES of patient, access, etc.)                                     7 (5.1)
Negative impact of collecting data on individual health             6 (4.4)
How to code if a patient meets enough of a metric                   6 (4.4)
Cost-effectiveness of gathering data                                4 (2.9)
Person completing quality metrics                                   3 (2.2)
Accounting for changing diagnoses                                   3 (2.2)
Results
Table 1. Education Centered Medical Home (ECMH) Medical Student
Quality Measurement and Improvement Self-Assessment Results

Skill                                                    Not at all  Slightly  Moderately  Extremely
Writing a clear problem statement (goal, aim)               11.54%    36.15%     47.69%      4.62%
Applying the best professional knowledge                    15.38%    46.15%     34.62%      3.85%
Using measurement to improve your skills                    13.08%    47.69%     33.85%      5.38%
Studying the process                                        13.85%    50.77%     32.31%      3.07%
Making changes in a system                                  28.46%    42.31%     27.69%      1.54%
Identifying whether a change leads to
  improvement in your skills                                15.38%    52.31%     30.00%      2.31%
Using small cycles of change                                26.92%    49.23%     20.77%      3.08%
Identifying best practices and comparing
  these to your local practice/skills                       20.77%    45.38%     31.54%      2.31%
Implementing a structured plan to test a change             23.08%    45.38%     28.46%      3.08%
Using the PDSA model as a systematic
  framework for trial and learning                          59.23%    28.46%     11.54%      0.77%
Identifying how data is linked to specific practices        28.46%    43.85%     26.15%      1.54%
Building your next improvement upon prior
  success or failure                                        20.00%    43.08%     33.85%      3.07%
Summary Score (%)                                           23.02%    44.23%     29.87%      2.88%
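
The Summary Score row appears to be the unweighted column mean of the
twelve skill rows; a minimal Python sketch (values copied from the table
above) reproduces the reported figures to within item-level rounding:

    # Likert response distributions (% of students) for the 12 self-assessed
    # QI skills; columns: not at all, slightly, moderately, extremely.
    rows = [
        (11.54, 36.15, 47.69, 4.62),  # Writing a clear problem statement
        (15.38, 46.15, 34.62, 3.85),  # Applying the best professional knowledge
        (13.08, 47.69, 33.85, 5.38),  # Using measurement to improve your skills
        (13.85, 50.77, 32.31, 3.07),  # Studying the process
        (28.46, 42.31, 27.69, 1.54),  # Making changes in a system
        (15.38, 52.31, 30.00, 2.31),  # Identifying whether a change leads to improvement
        (26.92, 49.23, 20.77, 3.08),  # Using small cycles of change
        (20.77, 45.38, 31.54, 2.31),  # Identifying best practices
        (23.08, 45.38, 28.46, 3.08),  # Implementing a structured plan to test a change
        (59.23, 28.46, 11.54, 0.77),  # Using the PDSA model
        (28.46, 43.85, 26.15, 1.54),  # Identifying how data is linked to practices
        (20.00, 43.08, 33.85, 3.07),  # Building on prior success or failure
    ]

    # Unweighted column means; each lands within ~0.01 of the reported
    # summary row of 23.02 / 44.23 / 29.87 / 2.88.
    summary = [sum(col) / len(rows) for col in zip(*rows)]
    for label, mean in zip(("Not at all", "Slightly", "Moderately", "Extremely"), summary):
        print(f"{label}: {mean:.2f}%")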
Across all of the assessed quality measurement and improvement skills
in the Likert-scale questions, 32% of students rated themselves
moderately to extremely comfortable with their QI skills, while 23%
rated themselves not at all comfortable. Over half of the students felt
moderately to extremely comfortable with "Writing a clear problem
statement (goal, aim)," and approximately a third felt moderately to
extremely comfortable with "Applying the best professional knowledge"
(38%), "Using measurement to improve your skills" (39%), "Identifying
best practices and comparing these to your local practice/skills" (34%),
"Implementing a structured plan to test a change" (32%), and "Building
your next improvement upon prior success or failure" (37%). The most
common response was only slightly comfortable for "Making changes in a
system" (42%) and "Using small cycles of change" (49%), and a majority
of students (59%) felt not at all comfortable with "Using the PDSA model
as a systematic framework for trial and learning."
After coding and analyzing common themes in the students' free
responses and notecards, 22% of the ECMH students reported needing
more training and practice in how to record quality metrics.
Additionally, 14% of the students wanted to learn more about how to
apply the findings uncovered through their quality data. Regarding
the validity of the data, 7% of students were concerned about the
comprehensiveness and accuracy of the standards. Notably, 17% of the
students reported having no further questions or concerns, though this
may reflect not yet knowing enough about quality measurement to know
what to ask.
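
As an illustration of how coded free-text themes can be tallied into
counts and percentages like those reported above, here is a minimal
Python sketch; the theme labels and data are hypothetical, not the
study's actual coding scheme:

    from collections import Counter

    # Hypothetical coded responses: one theme label per student comment,
    # loosely modeled on the theme labels used in this analysis.
    coded = [
        "more training/practice recording quality metrics",
        "how to apply quality data findings",
        "more training/practice recording quality metrics",
        "comprehensiveness/accuracy of standards",
        "no further questions or concerns",
    ]

    # Tally each theme and report its count and share of responding students.
    # The denominator (students rather than total comments) is an assumption
    # about how the poster's percentages were computed.
    n_students = len(coded)
    for theme, count in Counter(coded).most_common():
        print(f"{theme}: {count} ({100 * count / n_students:.1f}%)")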
Conclusions
An initial experience of reporting quality metrics for patients seen
in the students' ECMH is a strong vehicle for learning the nuances of
abstracting data for quality metrics and of constructing quality
measures. The exercise raised questions for students about how to
incorporate quality assessment into clinical practice that they might
not otherwise have considered. Student self-assessment results suggest
that asking students to construct quality measures themselves is an
effective introduction to quality reporting.