Online Course Evaluations: Lessons Learned


Online Course Evaluations: Lessons Learned
With a cast of thousands, including: Susan Monsen, W. Ken Woo, Carrie Mahan Groce, and Wayne Miller
Online Course Evaluations: Lessons Learned
Susan Monsen
Yale Law Experience
Course evaluations were run by student representatives
Introduced the first online system in 2001
Changed the system twice and introduced incentives
For Spring 2005, we have a 90% response rate

YLS OCE Version 1
First online course evaluation (OCE), Fall 2001–Spring 2003
Homegrown web application with 18 questions
System did not scale for in-class completion
General email reminders sent to all students
No incentives
Response rate less than 20%
Back to Paper
Returned to paper after 3 semesters of use
Reasons:
Low response rate
Wanted an easier-to-use interface for completing and viewing results
Wanted the ability to add incentives
OCE Version 2—Design
Designed with input from student representatives and faculty
Modeled after the Yale College system
Reduced the number of questions to 8
Added a comment question
Students with evaluations to complete received a weekly email reminder (sketched below)
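The weekly reminder step is easy to picture in code. A minimal sketch, assuming a simple list of (student email, course) enrollments, a set of completed (student, course) pairs, and a hypothetical SMTP host and sender address; none of this is the actual YLS code:

```python
# Minimal sketch (not the YLS implementation): find students with evaluations
# still open and send each one a single weekly reminder email.
import smtplib
from email.message import EmailMessage

def pending_evaluations(enrollments, completed):
    """Map each student email to the courses they have not yet evaluated."""
    pending = {}
    for student_email, course in enrollments:
        if (student_email, course) not in completed:
            pending.setdefault(student_email, []).append(course)
    return pending

def send_weekly_reminders(enrollments, completed, smtp_host="smtp.example.edu"):
    # smtp_host and the From address below are assumptions for illustration.
    with smtplib.SMTP(smtp_host) as smtp:
        for student_email, courses in pending_evaluations(enrollments, completed).items():
            msg = EmailMessage()
            msg["From"] = "registrar@law.example.edu"
            msg["To"] = student_email
            msg["Subject"] = "Course evaluations still open"
            msg.set_content("You have not yet evaluated: " + ", ".join(courses))
            smtp.send_message(msg)
```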

Incentives

Tested class time for completion:
Worked for small to midsize classes
Response rate about 90%
Load testing indicated the system could handle up to 75 simultaneous users

Introduced grade blocking:
Students see an “*” instead of a grade for those classes not evaluated (sketched below)
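Grade blocking reduces to a display rule: show the grade only if the class has been evaluated. A minimal sketch with invented names, not the actual YLS code:

```python
# Minimal sketch of grade blocking (illustrative names only): a grade is shown
# as "*" for any class the student has not yet evaluated.
def grades_for_display(grades, evaluated_courses):
    """grades: {course: letter_grade}; evaluated_courses: set of course ids."""
    return {
        course: (grade if course in evaluated_courses else "*")
        for course, grade in grades.items()
    }

# Example: the Torts grade stays hidden until its evaluation is submitted.
print(grades_for_display({"Contracts": "A-", "Torts": "B+"}, {"Contracts"}))
# -> {'Contracts': 'A-', 'Torts': '*'}
```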
OCE 2 Results View
Response Rate by System
[Chart: percent completed for each system: OCE 1 (Fall 2001–Fall 2002), Paper (Spring 2003–Fall 2003), OCE 2 (Spring 2004–Fall 2004), OCE 2 with in-class time for 2 classes (Fall 2004), OCE 2 with grade blocking (Spring 2005)]
OCE 2 Response Rates
[Chart: percent completed by week (Week 1 through Week 4, then Final) for Spring 03, Fall 04, and Spring 05]
What did we learn?
Don't:
Too many questions
No automated reminders
No incentives
Do:
Incentives work!
Reminders help
Load test the system (a quick sketch follows)
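Load testing does not require special tooling. A minimal sketch using only the Python standard library, with a hypothetical form URL (this is not the tool YLS used), is enough to see how a system behaves with roughly 75 simultaneous users:

```python
# Minimal load-test sketch (hypothetical URL): hit the evaluation form with N
# simultaneous requests and report response timings.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def fetch(url):
    start = time.monotonic()
    with urlopen(url) as resp:
        resp.read()
    return time.monotonic() - start

def load_test(url="https://evals.example.edu/form", users=75):
    with ThreadPoolExecutor(max_workers=users) as pool:
        timings = list(pool.map(fetch, [url] * users))
    print(f"{users} simultaneous users: "
          f"avg {sum(timings) / len(timings):.2f}s, max {max(timings):.2f}s")

if __name__ == "__main__":
    load_test()
```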
CTEs Online
Presented by:
Ken Woo
Director, Law School Computing
Northwestern University School of Law
When?
1st semester: Spring 2004
2nd semester: Fall 2005
3rd semester: Spring 2005
Only 1.5 years into online use
When? (continued)
Paper system, Fall 2003: 80%
Paper system, Spring 2003: 77%
Paper system, Fall 2004: 70%
1st semester online, Spring 2004: N/A
2nd semester online, Fall 2005: 70%
3rd semester online, Spring 2005: 67.8%
Why?
Wanted to push everything onto the Web
Everyone had some sort of web access
No more loose papers; go paperless
Centralized storage location:
On a centralized server
No data steward available
Access by Registrar and Registrar team only
Professors can view their own results
Why? (continued)
Perceived as easier to manage
Changes were easier for the Registrar
3 types of forms:
Standard (19 questions)
CLR (23 questions)
Clinic (18 questions)
Legibility was a small issue
Lessons Learned
Very similar to the paper questions, with some questions added for clarity
Participation rate is falling
Some ideas to increase participation:
Withhold transcripts – no
Withhold final grades – no
Let students know they will see no results if they do not participate – next semester, Fall 06
Q&A
CTEs Online Presented by:
Ken Woo
Director, Law School Computing
Northwestern University School of Law
Online Course Evaluations: Lessons Learned
Carrie Mahan Groce
University of Denver
Sturm COL Experience
Why Online Evaluations
The Academic Dean was the instigator. Wanted better, more timely access to evaluations, particularly the comments.
Hoped to get more meaningful written comments, both good and bad.
Our school has a culture of use of written comments by students and search committees.
University of Denver
Sturm COL Experience
The Web Manager built a homegrown ColdFusion application, using the current evaluation form and procedures as a start.
Data pulled from the administrative (Banner) system.
Course and student data stored in one database, results in a separate database (for anonymity).
Questions are generated dynamically. (See the sketch below.)
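That database split is the piece worth sketching. A minimal illustration with invented table names and SQLite stand-ins for the two databases (the real application is ColdFusion against Banner data, and the tables are assumed to already exist): completion is recorded against the student in one database, while the answers land in a second database that carries no student identifier.

```python
# Illustrative sketch only: record *who completed* in the enrollment DB, and
# store *what they said* in a separate results DB with no student identifier.
import sqlite3

enroll_db = sqlite3.connect("enrollment.db")   # courses, students, completion flags
results_db = sqlite3.connect("results.db")     # anonymous answers only

def submit_evaluation(student_id, course_id, answers):
    """answers: {question_id: response_text}"""
    # 1. Mark the student as done (no answer data goes here).
    enroll_db.execute(
        "UPDATE enrollment SET completed = 1 WHERE student_id = ? AND course_id = ?",
        (student_id, course_id),
    )
    # 2. Store the responses keyed only by course and question.
    results_db.executemany(
        "INSERT INTO responses (course_id, question_id, response) VALUES (?, ?, ?)",
        [(course_id, qid, text) for qid, text in answers.items()],
    )
    enroll_db.commit()
    results_db.commit()
```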
University of Denver
Sturm COL Experience
Initial concerns taken into account:
Faculty: only registered students, one evaluation per student, no evaluation after the exam.
Students: retain anonymity, no faculty access before grades.
Additional student concern:
Complained this format would be too time consuming. Not addressed; later feedback suggests students appreciate freeing up class time.
University of Denver
Sturm COL Experience
Additional faculty concerns, and how they were addressed:
Lower response rates: a pilot was conducted to get a feel for response rates before faculty approval of online evals.
Concern that comments would be too accessible, leaving “less popular” professors vulnerable: agreed that the Academic Dean could remove very negative comments from public view.
Not all courses followed the standard exam schedule: handled case by case.
University of Denver
Sturm COL Experience
The Assoc. Dean wanted data to take to faculty, and came to Ed. Tech.
Started with a pilot group in Fall 02: 7 profs, 10 courses participated.
Spring 03: all adjuncts and a handful of appointed faculty, 80 courses in all.
Summer 03: all courses participated.
University of Denver
Sturm COL Experience
Evaluation Procedures
Evaluation goes online 2 weeks prior to semester end and is available through the day prior to exams beginning. Originally only the last two weeks of class; extended during the 1st pilot.
Students receive emails with links to all their course evaluations and detailed instructions.
Reminder emails sent every other day or so to those who have not completed.
University of Denver
Sturm COL Experience
Results from pilots encouraging. Response rates good (higher than paper), though inflated due to incentives and babysitting.
Summer low, but a very short evaluation period.
Dean took data to faculty for approval to move all courses online. Approval given beginning Fall 2003.
Average response: Fall 02: 83%, Spring 03: 81%, Summer 03: 64%
University of Denver
Sturm COL Experience
Response rates in a real-use setting:
Spring 05: 72%
Fall 04: 72%
Summer 04: 67%
Spring 04: 77%
Fall 03: 83%
University of Denver
Sturm COL Experience
Reasons for drop in response rates (speculation):
Change of Academic Dean. Current dean not invested; less hands-on encouragement.
Novelty wearing off. This year we had our first incoming class who never did a paper evaluation. No novelty factor; just another chore.
University of Denver
Sturm COL Experience
What should we do?
Nothing? The Assessment department is happy with 70%, and we are getting better rates than other divisions.
My preference: get the new dean back on board, even more reminders, advertisement.
Better communication to faculty about timing so they can tell students what to expect.
University of Denver
Sturm COL Experience
Next steps
More sophisticated results generation. Advanced searching: ability to compare profs side by side, show all evals for a professor or a course.
Streamline the course list interaction. Build direct access to the Banner system rather than pulling data out of the admin system. Not likely to happen.
Move from an Access back-end to SQL Server.
University of Denver
Sturm COL Experience
Potholes to watch out for
Difficult to know how good the data is. We realized late that the person pulling lists didn’t have permissions to get non-law students enrolled in law classes. No way to know that from looking at such large amounts of data: 150 courses, nearly 5,000 individual evaluations.
Different schedules for different courses can cause headaches. 1st-year Legal Writing wanted complete control over timing. Some courses finish early. Hard to keep those in institutional memory. Any time an individual eval has a different schedule, response is lower.
University of Denver
Sturm COL Experience
Potholes (cont.)
Complete anonymity made a few instances of students filling out one evaluation as though it were for a different professor tedious. Mostly resolved by adding the professor’s name throughout the text of the eval, in as many places as possible.
Students want to retract an evaluation (usually a negative one). This semester was the first time we heard this request. The Academic Dean turned down all requests and shut the door on additional requests.
University of Denver
Sturm COL Experience
And a sinkhole…
A more pervasive problem: with any ed tech project, once we do something it becomes “ours.”
Problematic because we don’t have the staff to take on administrative functions, nor have we been given the power to handle issues with those functions.
University of Denver
Sturm COL Experience
Remedies?
Be proactive: never take too much control of a project. Build in as much administrative functionality as possible at the beginning.
If you’ve taken on too much, give it back; if it was their job before it was online, it should still be their job.
Easier said than done.
University of Denver
Sturm COL Experience
Final words of wisdom
Don’t try to reinvent the wheel. We found we had better buy-in when we agreed to keep the system as close to the original as possible.
Contact information
Carrie Mahan Groce
Web Manager
University of Denver Sturm College of Law
[email protected]
303.871.6098
Online Course Evaluations: Lessons Learned
Wayne Miller
The Duke Law Experience
Introduced Summer 2003 without much planning, when the Scantron equipment failed and replacement was deemed too expensive
My motivation was to provide a service to the law school that would benefit all: more efficient for staff and students; unmediated access for faculty; better community access to public information (summaries)
The Duke Law Experience
Homegrown, PHP-based survey software was employed
The Student Information System provided rosters
The local email system provided authentication (through LDAP) for both students and faculty (see the sketch below)
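A minimal sketch of that kind of directory-backed login, using the ldap3 Python library with an invented host and DN layout (Duke's actual system is PHP; this is only illustrative):

```python
# Minimal LDAP-login sketch (invented host and DN pattern, not Duke's code):
# a submitted netid/password pair is checked by attempting a bind against the
# campus directory.
from ldap3 import Server, Connection
from ldap3.core.exceptions import LDAPException

def authenticate(netid: str, password: str) -> bool:
    """Return True only if the directory accepts a bind as this user."""
    if not password:
        return False                   # guard against anonymous-bind "success"
    server = Server("ldap.example.edu", use_ssl=True)
    user_dn = f"uid={netid},ou=people,dc=example,dc=edu"   # assumed DN layout
    try:
        conn = Connection(server, user=user_dn, password=password)
        ok = conn.bind()               # False on bad credentials
        conn.unbind()
        return ok
    except LDAPException:
        return False                   # directory unreachable or other error
```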
Shortsightedness…
The paper form was copied without reevaluation
The 10 minutes for in-class completion of paper evaluations was “given back” to faculty
Incentives for students were not thought through
“Click the radio button” is awkward at best
Scale changes are very problematic
Things we designed right
Registrar has direct control over which classes are included; which faculty are associated with each class; etc.
Things we designed right
Students can submit “conditional evaluations” when they fail to log in correctly or are not in our roster (see the sketch below)
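One way to picture the conditional-evaluation path, with invented names (this is not Duke's PHP implementation): the submission is accepted but flagged for the Registrar to resolve before it counts toward results.

```python
# Minimal sketch (invented names): when login fails or the student is missing
# from the roster, accept the submission but flag it as "conditional" so the
# Registrar can resolve it later; conditional submissions are held back from
# the published results.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Submission:
    course_id: str
    answers: dict
    student_netid: Optional[str] = None   # None if login failed
    conditional: bool = False
    notes: list = field(default_factory=list)

def accept(submission: Submission, roster: set) -> Submission:
    """roster: set of netids officially enrolled in the course."""
    if submission.student_netid is None:
        submission.conditional = True
        submission.notes.append("login failed; identity unverified")
    elif submission.student_netid not in roster:
        submission.conditional = True
        submission.notes.append("not on roster; needs Registrar review")
    return submission
```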
Things people want
Students want to be able to edit, save, and come back to evaluations
The Registrar and some faculty members want individualized time windows for certain classes
Student Response Rate
A 70% response rate is required to share course eval summaries with the community
Students need constant cajoling, or we need to provide a better incentive
Some faculty are apprehensive about including students who would not have been in attendance on the day of paper evaluations, and uneasy about cajoled students
Student Response Rate
Semester      Total Response Rate               Percentage of Class/Instr Making Cutoff
Fall 2003     66% (extended into exam period)   24/82 = 29%
Spring 2004   60%                               36/119 = 30%
Fall 2004     52%                               8/93 = 8%
Spring 2005   67% (dropped non-law students)    48/117 = 41%
Student Response Rate
[Chart: number of submissions per day, 4/11/2005 through 4/25/2005]
Student Response Rate
[Same chart, annotated: time scheduled for evals in large classes]
Student Response Rate
[Same chart, annotated: automated and person-specific email from Associate Dean]
Student Response Rate
[Same chart, annotated: second automatic email from Associate Dean and cajoling email from Registrar]
Incentives under “consideration”
Withhold registration for the following semester
Withhold grades
Withhold free printing
Withhold firstborn…
Issues
Security: not discussed much, but was a big part of planning
Privacy: a deal breaker for some students; responses are anonymized before release
Accuracy: faculty are suspicious of mix-ups; varying scales have confused students
Urban legends: stories abound among faculty about how Prof X saw everyone’s evaluations, etc.
Future
The evaluation form is being reworked: easier to fill out, less confusing
Incentives are being considered
Scantron on/off-line solutions are being weighed
Support could at any point be withdrawn, and probably would have been, were another solution easy to implement…
Contact information
Wayne Miller
Director of Educational Technologies
Duke University School of Law
[email protected]
919-613-7243
http://edtech.law.duke.edu/