Effect on Faculty Ratings of Paper-Based versus Web-Based Evaluations


Dr. Juliana Lancaster
Director of Institutional Effectiveness
Dr. Michael Furick
Assistant Professor of Business






The 35th member of the University System of
Georgia
Opened its doors to an inaugural junior class of
120 students in fall 2006
The first freshman class of 337 was admitted in
fall 2007, bringing total enrollment to 787.
First graduation was held on June 28, 2008
Enrollment has grown to over 3000 students
Accredited by SACS in June 2009


GGC’s mission supports access to
baccalaureate degrees that meet the economic
development needs of the growing and diverse
population of the northeast Atlanta
metropolitan region.
GGC offers majors in:
Biology
Education
Mathematics
Exercise Science
Information Technology
Business Administration
English
History
Political Science
Psychology
Criminal Justice/Criminology



Formative Assessment: Used by faculty to
improve and shape the quality of our own
teaching
Summative Assessment: Used to determine
overall performance; may be used in personnel
decisions
Programmatic Assessment: Used to evaluate
the role of a course within a degree program



Do you do paper or web?
Who gets to see results?
Who should see results?
Paper: Execution in class; a passive system for the student.
Web: Delivery via email or portal link; requires active participation by the student.
Paper: Requires a short time window to complete data collection.
Web: Requires a longer time window to complete data collection.
Paper: Familiar to faculty and students.
Web: Unfamiliar to faculty and students.
Paper: Requires extensive time and cost to prepare and process.
Web: Reduces time and cost to prepare and process.
Paper: Results returned to faculty within weeks.
Web: Results returned to faculty within days.



Will most students actually do a web based
evaluation?
Will the students who do a web based
evaluation be different from the overall student
population?
Will the evaluation ratings be affected by
changes in which students complete
evaluations?
Semester                  Modality   Distribution of Form/Access Information
Fall 2007                 Paper      In-class
Spring 2008               On-line    Passwords distributed by faculty in class
Summer 2008–Summer 2009   On-line    Passwords distributed by course system to student email
Fall 2009                 On-line    Active links to surveys in student portal



Compared Fall 2007 (paper), Spring 2009 (web
with email delivery) and Fall 2009 (web with
portal delivery)
Sampled all evaluations of faculty who have
been at GGC continuously since Fall 2007
(N=78)
Used only items that have remained (nearly)
constant on the evaluation instrument (N=11)

Did overall response rates change?

Did profile of respondents change?

Did ratings change in a consistent way?
                                 Fall 2007   Spring 2009   Fall 2009
Response Rate                    77.4%       38.7%         48.8%
% Class sections with RR < 15%   0.0%        9.2%          1.9%
Response rates for the web-based semesters are
lower than for the paper-based semester.
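The response-rate figures above come down to simple arithmetic over per-section counts. A minimal sketch in Python, using entirely made-up section data (the study's raw counts are not given in this transcript):

```python
# Hypothetical illustration (invented numbers, not the study's data):
# per-section response rates and the share of sections below 15%.

sections = {
    # section_id: (responses_received, students_enrolled)
    "BUS-101-A": (18, 25),
    "BUS-101-B": (3, 28),
    "MATH-200":  (20, 22),
}

def response_rate(responses, enrolled):
    """Fraction of enrolled students who completed the evaluation."""
    return responses / enrolled

# Rate for each class section
rates = {sec: response_rate(r, n) for sec, (r, n) in sections.items()}

# Overall rate pools all responses over all enrollments
overall = sum(r for r, _ in sections.values()) / sum(n for _, n in sections.values())

# Share of sections whose individual rate falls under the 15% threshold
low = sum(1 for rate in rates.values() if rate < 0.15) / len(rates)

print(f"overall response rate: {overall:.1%}")
print(f"sections under 15%:    {low:.1%}")
```

The two summary numbers mirror the two rows of the table: a pooled response rate, and the fraction of class sections with too few responses to interpret.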
[Chart: distribution of expected grades (A–F) reported by respondents, by semester (F07, S09, F09)]


Looked at expected grade as reported by
respondents.
The overall pattern is highly similar across
semesters, suggesting the web format did not
disproportionately attract unhappy students.
[Chart: mean ratings (scale shown 3.1–3.8) by evaluation item (SC, SO, TO, PI, IP, IE, DI, IC, FC, CC) for F07, S09, and F09]



Overall, ratings have shown a steady upward
trend for this set of faculty.
For 10 of the 11 items analyzed, Fall 09 mean
ratings were significantly higher than Fall 07
ratings.
For one item, ratings dropped in the Spring 09
term but rose again in Fall 09.
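A "significantly higher" finding of this kind is typically established with a paired comparison of each faculty member's mean rating in the two terms. A sketch of one such test, a paired t statistic, with invented numbers (the transcript does not state which test the authors actually used):

```python
# Hypothetical paired comparison (invented ratings, not the study's data):
# each list holds one faculty member's mean rating on a single item.
import statistics

fall07 = [3.2, 3.5, 3.1, 3.6, 3.4]
fall09 = [3.5, 3.7, 3.4, 3.8, 3.6]

# Per-faculty change between the two terms
diffs = [b - a for a, b in zip(fall07, fall09)]

mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)      # sample standard deviation (n - 1)
n = len(diffs)

# Paired t statistic; compare against the t distribution with n - 1
# degrees of freedom to judge significance.
t = mean_d / (sd_d / n ** 0.5)

print(f"mean difference = {mean_d:.2f}, t = {t:.2f}, df = {n - 1}")
```

Pairing by faculty member matters here: the study tracked the same 78 instructors across terms, so within-faculty differences remove instructor-to-instructor variation from the comparison.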




Shifting from paper to online evaluations does
cause a reduction in response rates.
The reduction is spread across expected grades
and across the spectrum of ratings.
Overall, our mean evaluations have risen over
time.
Next step: replicate the study with some paper
and some online surveys in Spring 2011.