
Web Enhanced Course Evaluation at Columbia University
Jack McGourty, Columbia University
Overview
- A little history
- How does course assessment fit into the “big picture”?
- Why use web technology?
- How is it being done?
- Does it work?
History
- Columbia’s Fu Foundation School of Engineering and Applied Science began using the web for course assessment about four years ago, starting with a student-administered web site for results
- Designed and developed a state-of-the-art system using student teams
- Now building on the current infrastructure to include on-line tutorials and increased flexibility for administration
Student Web Site
- Search by course or faculty
- Current and past results
- No comments
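A site like this needs little more than a keyword search over stored numerical results. The sketch below shows one way such a search might work, assuming a hypothetical SQLite table named evaluations with course, instructor, term, mean_score, and responses columns; none of these names come from the actual WCES implementation, which the slides do not show.

    import sqlite3

    def search_results(db_path, course=None, instructor=None):
        """Return numerical results matching a course or faculty name.

        Hypothetical schema: evaluations(course, instructor, term,
        mean_score, responses). Comments are deliberately not stored
        here, matching the "No comments" policy of the student site.
        """
        query = ("SELECT course, instructor, term, mean_score, responses "
                 "FROM evaluations WHERE 1=1")
        params = []
        if course:
            query += " AND course LIKE ?"
            params.append("%" + course + "%")
        if instructor:
            query += " AND instructor LIKE ?"
            params.append("%" + instructor + "%")
        with sqlite3.connect(db_path) as conn:
            return conn.execute(query, params).fetchall()

A call such as search_results("wces.db", instructor="Smith") would return that instructor's current and past numerical results across terms.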
The Big Picture
Why are we assessing courses and programs?
- Continuous improvement of the education process: what are we doing right, and what can we do better?
- An integral part of our ABET EC2000 compliance:
  - Develop a process
  - Collect and evaluate data
  - Close the loop
  - Document/archive results
- Course evaluation is one of several outcome assessment measures, alongside senior exit surveys, enrolled student surveys, and alumni surveys
How WCES Fits in
SEAS assessment processes timeline:
- Pre-1997: Initiate course evaluation process
- 1998: Conduct first alumni survey (all alumni); start academic review cycle
- 1999: Create web-based course evaluation process
- 2000: Conduct second alumni survey (1989 and 1994 graduates); benchmarking; senior surveys (Class of 2000); initiate freshman pre-attitude survey
- 2001: Senior surveys (Class of 2001); alumni survey (1996 graduates)
Using Technology
Pro:
- Students have the time to consider their responses
- Timely feedback
- Responses are easily analyzed, archived, and distributed
- Less paper
- Lower cost, more efficient administration

Con:
- You lose the “captive audience”
- You can’t guarantee a diversity of opinions (motivated vs. nonmotivated students; those who like the course vs. those who dislike it)
- Not necessarily less effort
Course Assessment Details
- 10 core items: course quality and instructor quality
- Relevant ABET EC2000 items, pre-selected by the faculty member
- Customized questions for specific course objectives
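One way to picture this three-part structure is as a survey assembled from three question pools. The sketch below is illustrative only; the class and field names are assumptions, not the WCES data model.

    from dataclasses import dataclass, field

    @dataclass
    class Question:
        text: str
        source: str  # "core", "ec2000", or "custom" (assumed labels)

    @dataclass
    class CourseSurvey:
        course: str
        questions: list = field(default_factory=list)

        def build(self, core, ec2000_selected, custom):
            # The ten core items are always present; EC2000 items are
            # pre-selected by the faculty member; custom items target
            # specific course objectives.
            self.questions = (
                [Question(q, "core") for q in core]
                + [Question(q, "ec2000") for q in ec2000_selected]
                + [Question(q, "custom") for q in custom]
            )
            return self

Tagging each question with its source would also make it straightforward to compute the faculty-usage metric reported later in this deck (the share of faculty adding EC2000 and custom questions).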
Selecting EC2000 Questions
Monitoring Faculty Usage
One of our culture-change metrics is the percentage of faculty who are capitalizing on the system and adding custom and EC2000 questions; it is currently around 15%.
Course Evaluation Results
Web page access:
- Current term’s assessment: limited time window, limited access, secure site
- Previous terms’ results: open access to numerical results, but not to comments
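These access rules can be summarized in a single decision function. The sketch below is a guess at the policy implied by the bullets above, with hypothetical parameter names; in particular, exactly what an authorized viewer sees inside the current-term window is an assumption, not something the slides specify.

    def visible_fields(term, current_term, is_authorized, window_open):
        """Decide which parts of an evaluation a requester may see."""
        if term == current_term:
            # Current term: secure site, limited access, limited time
            # window. (Assumption: authorized viewers inside the window
            # see everything; everyone else sees nothing yet.)
            if is_authorized and window_open:
                return {"numerical", "comments"}
            return set()
        # Previous terms: numerical results are open; comments are not.
        return {"numerical"}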
Email Results
- Individual faculty
- Aggregate data to department chairs
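A minimal sketch of how these two mailings might be generated with Python’s standard smtplib, assuming hypothetical data shapes (a dict of per-course results and a dict of chair addresses by department) and a hypothetical sender address:

    import smtplib
    from email.message import EmailMessage
    from statistics import mean

    def send_result_emails(results_by_course, chairs_by_dept,
                           smtp_host="localhost"):
        with smtplib.SMTP(smtp_host) as smtp:
            # Individual faculty: each instructor receives the results
            # for their own course.
            for course, r in results_by_course.items():
                msg = EmailMessage()
                msg["From"] = "evaluations@example.edu"  # hypothetical
                msg["To"] = r["instructor_email"]
                msg["Subject"] = f"Course evaluation results: {course}"
                msg.set_content(f"Mean score: {r['mean_score']:.2f}")
                smtp.send_message(msg)
            # Department chairs: aggregate data across the department.
            for dept, chair_email in chairs_by_dept.items():
                scores = [r["mean_score"]
                          for r in results_by_course.values()
                          if r["dept"] == dept]
                if not scores:
                    continue
                msg = EmailMessage()
                msg["From"] = "evaluations@example.edu"  # hypothetical
                msg["To"] = chair_email
                msg["Subject"] = f"Aggregate evaluation results: {dept}"
                msg.set_content(
                    f"Department mean across {len(scores)} courses: "
                    f"{mean(scores):.2f}")
                smtp.send_message(msg)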
Reporting
Promoting Responses
- Student-driven results website
- Multiple targeted emails to students and faculty from the Dean
- Announcements in classes
- Posters all over the school
- Random prize drawing
Closing the Loop
Does it Work?
- Student response rates have steadily increased over the past two years, from 72% to 85%
- More detail in students’ written comments in course assessments
- Data are available that we have never had before
- Faculty use of the ABET EC2000 and customized-question features is increasing but still limited (15%)