Performance Improvement in a Medical School: Defining Baseline Metrics – Pursuing Benchmark Targets
Diane Hills, Ph.D. Associate Dean for Academic Affairs College of Osteopathic Medicine
Mary Pat Wohlford-Wessels, Ph.D. Assistant Dean for Academic Quality and Medical Education Research College of Osteopathic Medicine
Introduction
Improving medical education requires systematic processes and assessment that support the review of the work we do. The outcome of careful review supports effective strategic planning, resource allocation, resource utilization, faculty development, curricular change, research development, and much more.
This presentation builds upon last year’s AACOM presentation, in which DMU introduced its new performance improvement plan and processes. We presented our intent to implement a system of review framed within the Baldrige Quality Criteria. Since then, we have adopted the Baldrige criteria, and we now collect and format our annual Performance Improvement (PI) report within the criteria.
Last year, we introduced session participants to:
Our committee structure
A Gantt chart of PI activities
Proposed data utilization
How we classified data sources into meaningful categories
Performance Improvement Report
Developed annually
Distributed to College and University stakeholders
2004 – represented initial efforts
2005 – formatted using the Baldrige criteria and represented early benchmark development
2006 – will focus on clinical education and postgraduate perceptions (PGY1 residents and residency directors)
Baldrige Values
Visionary leadership
Learning-centered education
Organizational and personal learning
Valuing faculty, staff, and partners
Agility
Focus on the future
Managing for innovation
Management by fact
Social responsibility
Focus on results
Systems perspective
Baldrige Criteria
Leadership
Strategic Planning
Student, Stakeholder, and Market Focus
Measurement, Analysis, and Knowledge Management
Faculty and Staff Focus
Process Management
Results
What we have learned about culture and leadership
The Baldrige “Are We Making Progress?” survey compared faculty responses to those of 228 individuals from organizations engaged in the Baldrige process.
DMU COM faculty responses were statistically significantly higher than the national average on 1 question, lower on 8 questions, and equal on 30 questions.
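The slide does not say which significance test produced the higher/lower/equal classification; as an illustration only, a per-question comparison of DMU responses against the national sample could be classified with a two-sample z-test. A minimal stdlib sketch, assuming 1–5 Likert-style responses (all data hypothetical):

```python
import math

def z_test_classify(dmu_scores, national_scores, alpha=0.05):
    """Classify one survey question as 'higher', 'lower', or 'equal'
    relative to the comparison group, using a two-sample z-test.
    Illustrative only; the actual test DMU used is not stated."""
    n1, n2 = len(dmu_scores), len(national_scores)
    m1 = sum(dmu_scores) / n1
    m2 = sum(national_scores) / n2
    # sample variances (Bessel-corrected)
    v1 = sum((x - m1) ** 2 for x in dmu_scores) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in national_scores) / (n2 - 1)
    se = math.sqrt(v1 / n1 + v2 / n2)
    z = (m1 - m2) / se
    # two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    if p >= alpha:
        return "equal"
    return "higher" if z > 0 else "lower"
```

Running this per question and tallying the three labels would yield counts like the 1 / 8 / 30 breakdown reported above.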
What we have learned about faculty and workload
Faculty workload is quite variable, even after controlling for discipline.
Approximately 45% of basic science teaching effort supports other University programs.
25% of the total teaching effort is lecture and 44% is scheduled laboratory time; the remainder is dedicated to small-group learning.
Research growth has been dramatic due to the efforts of a core group of basic science faculty.
What we have learned about student outcomes and the curriculum
Students perform well on COMLEX 1 in terms of both pass rate and average score.
The pass rate and average score are lower on COMLEX Level 2 CE and lower still on COMLEX Level 3.
The curriculum for years 1 & 2 is well managed and faculty are responsive to needed improvement. Years 3 & 4 have received less review. New staff along with an enhanced focus will result in significant changes in the clinical portion of the curriculum.
What we have learned about OMM
A survey of third-year and graduating fourth-year students (n=192) regarding their OMM training revealed:
83.2% are confident in their OMM training.
84% said only a small percentage (0–25%) of their DO preceptors used OMM in their practice.
67.5% said they rarely or never had an opportunity to use OMM during the clinical portion of their training.
What does this mean for our curriculum? What should this mean for the profession?
What we have learned regarding research growth
Research/scholarship productivity continues to grow.
[Chart: Research and Scholarly Activity Totals, COM 2001–2004, by category: editorial/peer review positions; other scientific/professional publications; submitted abstracts; published abstracts; submitted manuscripts; manuscripts in press; manuscripts in peer-reviewed journals; books and book chapters.]
National Center for Higher Education Management Systems (NCHEMS) data indicate that DMU-COM funding is competitive with peer private osteopathic colleges.
What we have learned about our mission and vision
The College Mission statement needed to be revised.
The Vision statement needed to be revised.
Values statements needed to be written.
Data Development and Growth
DMU-COM data (03-04)
NCHEMS (03-04)
AAMC allopathic medical school data (04-05)
AACOM osteopathic medical school data (03-04)
Residency directors (05-06)
NBOME (03-04)
Where does DMU rank?
DMU tracks public information on the following schools:
Arizona College of Osteopathic Medicine
College of Osteopathic Medicine of the Pacific
Touro University College of Osteopathic Medicine
Nova Southeastern University College of Osteopathic Medicine
Chicago College of Osteopathic Medicine
Des Moines University
Pikeville College School of Osteopathic Medicine
University of New England College of Osteopathic Medicine
Michigan State University College of Osteopathic Medicine
Kirksville College of Osteopathic Medicine
UMDNJ School of Osteopathic Medicine
New York College of Osteopathic Medicine
Ohio University College of Osteopathic Medicine
Oklahoma State University College of Osteopathic Medicine
Philadelphia College of Osteopathic Medicine
University of North Texas Health Science Center
West Virginia School of Osteopathic Medicine
Where does DMU rank?
Descriptive Statistics for the COM Peer Group (2002)

Data Element | Schools Reporting | Minimum | Maximum | Peer Group Mean | DMU | DMU vs. Peers | DMU Rank
In-state tuition | 17 | $7,802 | $30,433 | $23,115.11 | $25,475 | Higher | 9th of 17
Out-of-state tuition | 17 | $20,902 | $40,550 | $28,527.52 | $25,475 | Lower | —
Room/board/expenses | 14 | $5,067 | $13,190 | $9,916.14 | $12,112 | Higher | 3rd of 14
% students receiving grants | 17 | 2% | 81% | 34% | 23% | Lower | 9th of 17
Average student indebtedness | 17 | $81,303 | $153,966 | $122,666 | $142,997 | Higher | 4th of 17
Enrollment total | 17 | 109 | 1,135 | 561 | 802 | Higher | 3rd of 17
% men | 17 | 39% | 71% | 55% | 56% | Equal | 6th of 17
% women | 17 | 29% | 61% | 44% | 44% | Equal | 10th of 17
% minority | 17 | 6% | 48% | 24% | 10% | Lower | 7th of 17
% underrepresented minority | 16 | 1% | 22% | 8% | 6% | Lower | 8th of 16
Acceptance rate | 17 | 6% | 34% | 17% | 26% | Higher | 8th of 17
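The peer-group columns (number reporting, minimum, maximum, mean, and DMU's rank) are mechanical to regenerate once a value is collected for each school. A minimal sketch, using made-up tuition figures for a hypothetical four-school group rather than the actual 2002 data:

```python
from statistics import mean

def peer_summary(values, dmu="DMU"):
    """Summarize one data element across a peer group and locate DMU.
    `values` maps school name -> value (illustrative numbers only)."""
    vals = list(values.values())
    # rank 1 = largest value, matching "3rd out of 17"-style reporting
    ranked = sorted(values, key=values.get, reverse=True)
    return {
        "n": len(vals),
        "min": min(vals),
        "max": max(vals),
        "mean": round(mean(vals), 2),
        "dmu_rank": ranked.index(dmu) + 1,
    }

# hypothetical in-state tuition figures for a small peer group
tuition = {"School A": 7802, "School B": 30433, "DMU": 25475, "School C": 21000}
summary = peer_summary(tuition)
```

Running `peer_summary` over each data element in turn would reproduce one row of the table above.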
Where does DMU rank?
Characteristics of the First Two Years of Medical Education (percent satisfied)

Characteristic | DMU 01-02 | All Seniors 01-02 | DMU 02-03 | All Seniors 02-03 | +/- | DMU 03-04 | All Seniors 03-04 | +/- | DMU Position
Basic and clinical science course objectives were made clear to students | 82% | 81% | 88% | 81% | +7 | 86% | 88% | -2 | Higher
Basic science courses were sufficiently integrated | 74% | 73% | 79% | 74% | +5 | 100% | 83% | +17 | Higher
Course objectives and examination content matched closely | 75% | 72% | 80% | 71% | +9 | 88% | 82% | +6 | Higher
Course work adequately prepared students for clerkships | 67% | 70% | 77% | 70% | +7 | 92% | 82% | +10 | Higher
The first two years of medical school were well organized | 69% | 70% | 76% | 65% | +11 | 82% | 75% | +7 | Higher
Students were provided with timely feedback on performance | 69% | 71% | 77% | 70% | +7 | 74% | 83% | -9 | Equal
There was adequate exposure to patient care during the first two years | 48% | 49% | 47% | 50% | -3 | 90% | 66% | +24 | Higher
There was adequate preparation for COMLEX Level I | 79% | 64% | 79% | 63% | +16 | 48% | 72% | -24 | Lower
Where does DMU rank?
Academic Service Area | DMU 02-03 | All Seniors 02-03 | +/- | DMU 03-04 | All Seniors 03-04 | +/- | Allopathic Grad Satisfaction 2004 | DMU Position
Academic counseling | 49% | 47% | +2 | 66% | 63% | +3 | 64.3 | Higher
Accessibility to administration | 58% | 66% | -8 | 72% | 76% | -4 | — | Lower
Awareness of student problems by administration | 33% | 49% | -16 | 46% | 60% | -14 | — | Lower
Career counseling | 30% | 36% | -6 | 40% | 48% | -8 | 49.9 | Lower
Computer resource center | 80% | 76% | +4 | 86% | 87% | -1 | 86.6 | Equal
Disability insurance | 31% | 38% | -7 | 36% | 48% | -12 | — | Lower
Electronic communications (e-mail, Internet) | 82% | 84% | -2 | 92% | 89% | +3 | — | Higher
Faculty mentoring | 38% | 48% | -10 | 46% | 61% | -15 | 67.8 | Lower
Financial aid administration services | 60% | 68% | -8 | 76% | 83% | -7 | — | Lower
Library | 86% | 82% | +4 | 88% | 89% | -1 | 85 | Equal
Next Steps
Begin to develop correlations between clinical experiences and student clinical outcomes.
Further collect and analyze graduate feedback (performance perceptions from graduates and residency directors).
Begin to develop assessment research methods to determine the effectiveness of utilizing patient simulators.
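For the first of these steps, relating a clinical-experience measure to a clinical outcome reduces, in its simplest form, to a correlation per measure pair. A stdlib sketch with invented data (the variables and values below are hypothetical, not DMU data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical per-student data: OMM encounters during clerkships vs.
# a clinical skills score (illustrative numbers only)
omm_encounters = [2, 5, 1, 8, 4, 7, 3, 6]
skills_score = [61, 74, 58, 90, 70, 85, 66, 80]

r = pearson(omm_encounters, skills_score)
```

A strong positive `r` would suggest the experience measure is worth tracking formally; the real analysis would of course need many more students and controls.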
Next Steps
Continue to refine the faculty adequacy (workload) model
Use existing information about research productivity to develop research-related targets
Investigate the use of faculty e-portfolios
Investigate the use of student e-portfolios
Continue to develop the Lecture Level Database (LLDB) to better manage the assessment of objectives and competencies
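As an illustration of what a lecture-level database enables, mapping each lecture to its objectives and their competencies lets coverage questions be answered by a simple query. The schema and field names below are assumptions for the sketch, not the actual LLDB design:

```python
import sqlite3

# minimal in-memory sketch of a lecture-to-competency mapping
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE lecture (
    id INTEGER PRIMARY KEY,
    course TEXT,
    title TEXT
);
CREATE TABLE objective (
    id INTEGER PRIMARY KEY,
    lecture_id INTEGER REFERENCES lecture(id),
    text TEXT,
    competency TEXT
);
""")
conn.execute("INSERT INTO lecture VALUES (1, 'Anatomy', 'Brachial plexus')")
conn.executemany(
    "INSERT INTO objective VALUES (?, ?, ?, ?)",
    [(1, 1, "Trace the roots, trunks, and cords", "Medical knowledge"),
     (2, 1, "Relate lesions to clinical deficits", "Patient care")],
)

# the coverage question the LLDB is meant to answer: which competencies
# does each lecture address, and how many objectives support each?
rows = conn.execute("""
    SELECT l.title, o.competency, COUNT(*) AS n_objectives
    FROM lecture l JOIN objective o ON o.lecture_id = l.id
    GROUP BY l.title, o.competency
    ORDER BY o.competency
""").fetchall()
```

Aggregating the same query across a whole course catalog would expose competencies with thin or missing lecture support.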
Summary
The process adopted several years ago and refined over the past two years has resulted in the College knowing more about its outcomes and operations.
We have become more sophisticated in our collection and use of data.
We increasingly use data to make decisions.