Introduction to Evaluation


A Practical Approach to Evaluation in
the Ambulatory Setting in the Era of the
New ACGME General Competencies
Eric S. Holmboe
Stephen Huot
Yale University School of Medicine
Yale Primary Care Residency Program
ACGME Core Competencies

• Medical knowledge
• Patient care
• Practice-based learning and improvement
• Interpersonal and communication skills
• Professionalism
• Systems-based practice
Workshop Objectives

• Understand the importance of the outpatient setting for assessment of clinical skills
• Appreciate the importance of directly observing residents interacting with patients
• Discuss practical strategies for focused direct observation
Workshop Elements

• Mini-Lectures:
– Basic Premises
– Ambulatory clinical skills
– Faculty rating accuracy

• Direct observation exercises
– Performance dimension exercise
– Videotape evaluation exercises
Basic Premises

• Accurate resident evaluation – important
– Decision-making – “summative”
– Feedback – “formative”
– Professional obligation

• Resident observation
– Traditional and vital
Ambulatory Clinical Skills

• History taking
• Focused physical examinations
• Counseling and education
• Reflective practice
Importance of Sound Clinical Skills

• Physician behaviors and communication affect:
– Accuracy / completeness of data gathering
– Patient satisfaction and compliance
– Clinical outcomes
– Legal implications

• Contribution of History & PE to decision-making
– 80 to 90% of diagnoses made by H & P
– Cost-effective use of health care resources
Clinical Skills

• Stillman (1990)
– Wide variability in MS4 clinical skills

• Sachdeva (1995)
– Wide variability in intern skills

• Wray (1983) / Johnson (1986)
– High frequency of errors

• Mangione (1997)
– Deficient cardiac auscultatory skills
Clinical Skills

• Suchman (1997)
– Poor communication / humanistic skills

• Ramsey (1998)
– Incomplete history-taking / preventive health screening

• Braddock (1999)
– Of > 1000 patient visits, less than 15% fulfilled core elements of informed decision making
Resident Clinical Skills: Themes

• Deficiencies exist across the continuum
• Specific skills are more “error-prone”
• Not detected by other evaluation methods
– Basic clinical skills don’t correlate with other competence dimensions

• Residents are aware of the importance and the under-emphasis of these skills
• Without detection, deficiencies cannot be corrected
ACGME and Direct Observation
Direct observation is crucial to evaluate:

• Patient care
– History taking, physical exam, counseling

• Interpersonal and communication skills
– Patient/peer/colleague interactions

• Professionalism
Faculty Observation / Rating Skills

• Thompson (1990) / Haber (1994)
– Significant “halo effect” with ratings
– Ratings based mostly on perceived knowledge and personality

• Kalet (1992)
– Poor reliability – interpersonal skills
– Poor validity and predictive value
– Rater training ineffective
Faculty Observation / Rating Skills

• Herbers (1989) / Noel (1992)
– Structured > open-ended form
– Brief training video not effective
– Increased accuracy and discriminative ability

• Kroboth (1992)
– Poor inter-rater reliability
– Rater training ineffective
Faculty as Raters – Key Issues
• Faculty do not observe actual performance
• Faculty ratings lack:
– Reliability
– Accuracy
• Content specificity
Faculty as Raters - Solutions

• Step 1: Getting faculty to observe
– Required by the ACGME
– Focused observations are logistically possible
  • 5 to 10 minute observations are valuable
  • Build into existing clinic schedule
– Build on faculty “epiphany”
  • The “You will not believe what I saw today” experience
Mini-CEX Tool

• “Structured” approach to direct observation
• Direct assessment of actual patient care
• Incorporation of CEX into daily activities
• High satisfaction among housestaff

Logistics: GIMC

• One mini-CEX per intern per clinic day per week
– One attending observes a portion of the first visit of the day
  • Interview, physical exam, counseling
– Minimizes disruption of resident clinic
– Perform over the course of the academic year
– Easy to obtain 6-8 mini-CEXs per year per intern
Faculty as Raters - Solutions

• Step 2: Improving reliability
– Multiple brief observations
– Perform over time: outpatient setting allows for longitudinal observation
– Involve multiple faculty
– Mini-CEX: sufficient reliability for pass/fail determinations after just 4 observations
Direct Observation: Yale PGY-2 Resident

[Grid: sample mini-CEX observations for one PGY-2 resident over the year – history (H), physical exam (PE), and counseling (C) components sampled across Ward, Onc, ER, ID, Amb, GI, ICU, and Card rotations.]
Videotape

Watch the following videotape and
then complete a Mini-CEX evaluation
on the clinical skills of this resident
Faculty as Raters - Solutions

• Step 3: Improve accuracy and validity
– Most difficult step
– Improved with structured rating forms
– Can be improved with rater training, but
  • Brief training interventions do not work
Can You Train Faculty?
Performance Appraisal Literature:
• Can reduce rating errors
• Can improve discriminative ability
• Can improve accuracy
Summary of Rater Training

• Performance Dimension Training
• Frame of Reference Training
• Behavioral Observation Training
Performance Dimension Training

• Involves familiarizing faculty with the specific dimensions of competence
• Should involve discussion of the “qualifications” required for each dimension
• Use the ACGME competencies and the ABIM portfolio to “calibrate” faculty
Frame of Reference Training
Goal is to improve “judgment” and accuracy
Steps in FOR training:
1. Raters are given descriptions of each dimension and discuss the “qualifications” needed for each dimension (PDT)
2. Review of clinical vignettes describing critical incidents of performance: unsatisfactory to average to superior

Frame of Reference Training
3. Raters use the vignettes to provide ratings on a behaviorally anchored rating scale (BARS) – think of the ABIM evaluation form
4. The session trainer provides feedback on what the “true” ratings should be, along with the rationale
5. Discussion ensues about discrepancies between the trainer’s ratings and the participants’ ratings
Frame of Reference Training

• Most difficult aspect of FOR:
– Setting the actual performance standards
– Reaching agreement and consensus among teaching faculty
Behavioral Observation Training
Two main strategies:
1. Increase the amount of “sampling”
- More observations lead to more accurate evaluations.
2. Use of observational “aids”
- A behavioral diary to record observed performance.
Structuring the Observation

• Prepare for the observation
• Minimize intrusiveness – correct positioning
• Minimize interference with the resident-patient interaction
• Avoid distractions
• Possible solution
– Allow for habituation by consistent observation
Direct Observation: Challenges

• Like all skills, requires training and practice
• Faculty “calibration” important
– Agreeing on “metrics” of performance
– Faculty comfort with own skills

• Faculty training
– How, when, who, what, where
Observation Summary

• Sample “parts” of the visit:
– History-taking
– Physical examination
– Counseling

• Perform longitudinally
– No need to do it all at once

• Agree on performance metrics with ambulatory faculty