Objective Structured Clinical Examination
What is OSCE?

series of stations with tasks
planned
marking form
examiner
patient : SP (simulated/standardized patient)
organization > examination
Why OSCE?

before OSCE (1975) : viva (oral), long case, short case
valid? tests "knows how", NOT "shows how"
reliable? different patients, different examiners
Why OSCE?

more valid : tests "shows how"
more reliable : same task | patient, same examiner or same structured marking sheet
Basic Structure

[diagrams: 8 stations arranged in a circuit; each candidate starts at a different station and rotates through all 8]
Basic Structure : Parallel

[diagram: parallel circuits of stations (1-8, A-D) running at the same time]
Basic Structure : Double

[diagram: a circuit in which one station (station 2) appears twice, i.e. a double station]
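
As a rough sketch of how one such circuit runs (a minimal example assuming 8 stations, 8 candidates, one candidate per station per time slot; the numbers are illustrative, not taken from the slides), the rotation can be written as a simple timetable in Python:

# Sketch: rotation timetable for a single 8-station OSCE circuit.
# Assumes one candidate per station and equal-length time slots.
N_STATIONS = 8
candidates = [f"cand_{i + 1}" for i in range(N_STATIONS)]

def rotation_schedule(candidates, n_stations):
    """Return one dict per time slot, mapping station number -> candidate."""
    schedule = []
    for slot in range(n_stations):
        # The candidate who started at station s is at station (s + slot) mod n.
        plan = {(start + slot) % n_stations + 1: cand
                for start, cand in enumerate(candidates)}
        schedule.append(plan)
    return schedule

for slot, plan in enumerate(rotation_schedule(candidates, N_STATIONS), start=1):
    print(f"slot {slot}: " + ", ".join(f"st{s}={c}" for s, c in sorted(plan.items())))
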
How to start?

blueprint of the whole OSCE
design the station
design the mark sheet
Blueprint

reversed table of classification
[table: rows = systems (CVS, RES, KUB, GI); columns = skills (Hx, PE, Ix, Com.); cells mark the planned stations, e.g. chest pain, dyspnea, Cr/BUN, PU]
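
A blueprint of this kind is just a small coverage grid; as a minimal sketch (the systems, skills, and station names below are illustrative placeholders, not the actual blueprint from this talk), it can be recorded and checked for gaps like this:

# Sketch: an OSCE blueprint as a system-by-skill coverage grid.
# All entries are illustrative placeholders.
systems = ["CVS", "RES", "KUB", "GI"]
skills = ["Hx", "PE", "Ix", "Com."]

# (system, skill) -> planned station/task
blueprint = {
    ("CVS", "Hx"): "chest pain history",
    ("RES", "PE"): "dyspnea examination",
    ("KUB", "Ix"): "interpret Cr/BUN results",
    ("GI", "Com."): "explain peptic ulcer management",
}

# List uncovered cells so gaps are visible before stations are written.
for system in systems:
    missing = [skill for skill in skills if (system, skill) not in blueprint]
    print(f"{system}: not yet covered -> {missing}")
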
Station & Marking

learning by doing
8 groups : name list
8 stations
2(p) - 4(x) - 4(d)
signal
materials & ID
task (flexible)
lunch : 3rd floor
Station Design

station time 4-15 min.
total time < 2 hrs
focus task
examiner used : Y | N, who?
pilot
Type of Stations

static | written
practical : technique
clinical
Marking Sheet Design

checklist
rating scale
score
Checklist

dichotomous : Yes | No
Pros
high objectivity
high reliability
easy to give feedback
Cons
checks quantity only
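
Scoring a dichotomous checklist is simply counting the items marked 'Yes'; a minimal sketch (the item names are invented for illustration):

# Sketch: scoring a Yes/No OSCE checklist.
# Item names are invented for illustration.
checklist = {
    "introduces self and confirms patient identity": True,
    "washes hands": True,
    "asks about onset and character of pain": False,
    "examines the relevant system": True,
    "explains findings to the patient": False,
}

done = sum(checklist.values())      # each 'Yes' scores 1
total = len(checklist)
print(f"checklist score: {done}/{total} ({100 * done / total:.0f}%)")
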
Checklist : How to improve?

stem : clear, observable, not too long
overall : not too long
Rating Scale

rating addresses the quality concern
lower objectivity
lower reliability
Rating Scale : How to improve?

3-7 point scale
clearer description of each scale point
more raters
rater training
know the common errors of rating scales
Rating Scale : common errors

leniency error
central tendency error
halo effect
logical error
proximity error
contrast error
Examiner

station developer | non-station developer
teacher | non-teacher | other staff | SP
participation => reliability
Observation

direct
indirect : one-way mirror, monitor, video
Getting Feedback: How?

verbal
marked checklist & be the subject
marked checklist & watch video
printed answer
relevant papers
Getting Feedback : When?

during the exam
intra-station
in another station
stress? NB: too much information!
after the exam
end of all stations
Setting an OSCE

learning by doing | 8 groups
structured task | medical student V
time
7 min. test (without feedback)
5 min. test + 2 min. with feedback
available tools : please ask
draft of test and marking sheet : ~ 4 p.m.
preparation 8 - 9 a.m.
Minimal Passing Score

criterion-referenced : holistic, modified Angoff
norm-referenced : borderline method, relative method
Holistic Method

the medical school’s faculty-wide pass mark, e.g. 60%
Modified Angoff Method

group of experts
get the OSCE
“Think of a group of borderline candidates”
decide the passing score
expert discussion is acceptable in the original Angoff
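
As a rough numeric sketch of the Angoff idea (the expert estimates below are invented; the slides give no numbers), each expert estimates, item by item, the chance that a borderline candidate would score the point, and the passing score is the mean of the experts' expected totals:

# Sketch: Angoff-style passing score from expert judgements.
# Each inner list = one expert's estimated probability that a *borderline*
# candidate scores each of 5 checklist items (values are invented).
expert_estimates = [
    [0.8, 0.6, 0.5, 0.9, 0.7],   # expert 1
    [0.7, 0.5, 0.6, 0.8, 0.6],   # expert 2
    [0.9, 0.6, 0.4, 0.9, 0.8],   # expert 3
]

# Expected borderline total per expert = sum of that expert's item estimates.
per_expert = [sum(est) for est in expert_estimates]

# Passing score = mean of the experts' expected totals, out of 5 items.
passing_score = sum(per_expert) / len(per_expert)
print(f"passing score: {passing_score:.2f} / 5")
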
Minimal Passing Score

criterion-referenced : holistic, modified Angoff
norm-referenced : borderline method, relative method
Borderline Method

marking form : checklist + global rating
identify all students rated ‘borderline’ on the global rating
passing score = mean checklist score of the ‘borderline’ group
Borderline Method

Stu     1    2    3    4    5    6    7    8
score   80   72   90   65   70   80   50   90
glo     S    BS   SS   U    BU   BS   SU   SS

Mean borderline score = (72+70+80) / 3 = 74
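
A minimal sketch of this computation, using the numbers in the example above (the global ratings are taken as given, with BS and BU treated as the borderline categories):

# Sketch: borderline-group method with the example data above.
scores  = [80, 72, 90, 65, 70, 80, 50, 90]                  # checklist scores, students 1-8
ratings = ["S", "BS", "SS", "U", "BU", "BS", "SU", "SS"]    # global ratings

# Checklist scores of students whose global rating is borderline.
borderline = [s for s, r in zip(scores, ratings) if r in ("BS", "BU")]

passing_score = sum(borderline) / len(borderline)
print("borderline scores:", borderline)        # [72, 70, 80]
print("mean borderline score:", passing_score) # 74.0
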
Relative Method

1st method : Wijnen method
passing mark = mean - 1.96 SE
2nd method
60% of the 95th percentile rank score
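
A small sketch of both cut-off rules (the cohort scores are invented, and SE is read here as the standard error of the mean, which is one common interpretation of the slide's formula):

# Sketch: two norm-referenced cut-offs for a cohort of OSCE scores.
# The score list is invented for illustration.
import statistics

scores = [52, 58, 60, 63, 65, 67, 70, 72, 74, 75, 78, 80, 83, 85, 90]

# 1st method (Wijnen): passing mark = mean - 1.96 * SE,
# with SE taken as the standard error of the mean.
mean = statistics.mean(scores)
se = statistics.stdev(scores) / len(scores) ** 0.5
wijnen_cutoff = mean - 1.96 * se

# 2nd method: 60% of the score at the 95th percentile rank.
p95 = statistics.quantiles(scores, n=100)[94]   # 95th percentile cut point
percentile_cutoff = 0.60 * p95

print(f"Wijnen cut-off:   {wijnen_cutoff:.1f}")
print(f"60% of 95th pct.: {percentile_cutoff:.1f}")
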
Minimal Passing Stations

criterion-referenced
Staff & OSCE : Like

emotional comfort
valid assessment
consistent
Staff & OSCE : Dislike

too compartmentalized
no opportunity to observe the student’s complete evaluation of a patient
repetitive nature => boring
Students & OSCE

fairer than other methods
less stressful
unsure whether the important aspects are tested
Limitations of OSCE

lengthy preparation
demands more observational skill from staff
costly
low inter-station correlation
test security?
What’s next?

evaluation => learning
summative => formative
Innovation

senior student as SP and examiner in OSCE
study sheet listing Dx that might appear on the OSCEs
add structured oral exam into OSCE
GOSCE
GOSCE : Group OSCE

Pros
economy
mutual teaching
mutual support
opportunity to examine social skills
Cons
lack of individual assessment
different participants do different tasks
Potential Use of GOSCE

formative assessment
end-of-course assessment
exploring interpersonal relationships
teaching method for short courses
Re-using OSCE stations

across rotations in the same academic year : statistically OK
from year to year : statistically not OK
Conclusion : OSCE

What? stations + tasks + checklist
Why? more valid, more reliable
How? how to organize? how to analyze?