Technology-Mediated Assessment
Jack McGourty, Columbia University
John Merrill, Ohio State University
Mary Besterfield-Sacre & Larry Shuman, University of Pittsburgh
Gateway Engineering Education Coalition
Technology-Mediated Assessment
- Introduction
- Applications
- Your Expectations
- Drexel and Columbia's Course Evaluation
- Ohio State's Activities
- Team Evaluator
- Your Experiences
- Enablers and Barriers (Break-out Groups)
- Conclusions
Introduction
- Reasons for On-Line Assessment
- Common Applications
- Design and Development
- Things to Think About
Reasons for On-Line Assessment
- Customized development
- Targeted communication
- Ease of distribution/no boundaries
- Automatic data collection and analyses
- Real time response monitoring
- Timely feedback
Common Applications
- Attitude surveys
- Multisource assessment and feedback
- Course evaluations
- Portfolios
- Technology-mediated interviews
- Tests
Design and Development
- Item/question development
- Adaptive testing/expert systems
- Multimedia tutorials
- Dialogue boxes
- Reporting wizards
Things to Think About
- Confidentiality/Privacy
- Response rates
- Reliability/Validity
- Ease of use
  - Administrators, end users
- System growth
  - Can it easily be upgraded?
  - Adding modules
- System flexibility
  - Survey/test construction
  - Data flexibility: item databases, reporting wizards, data storage
- Platforms
  - Specific vs. combination
- Reporting
  - Various levels
  - Dissemination mechanisms
  - Real time vs. delayed
Technology in Education
Technology-Enabled Assessment: The Wave of the Future
Dr. John Merrill
The Ohio State University, Introduction to Engineering Program
Objectives
- Explanation of web-based assessment tools
- Uses of assessment tools
- Virtual run-through of student actions
- Lessons learned
- Q&A
Web-Based Assessment Tools
- Course Sorcerer (through WebCT)
  - Online journal entries
  - Course evaluations
- Team Evaluator
  - Peer evaluations
WebCT
- WebCT is a commercial web-based course management tool.
- IE Program uses/capabilities:
  - Electronic grade book, chat rooms, bulletin boards, calendars
  - Links to course material, Course Sorcerer, and team evaluations (Team Evaluator)
Course Sorcerer
- A simple web-based evaluation tool created by Scott Cantor at University Technology Services
- Technical specifications:
  - Written in ColdFusion
  - Runs on Windows NT with a Netscape Enterprise Web Server
  - Uses an MS SQL Server database with 15 tables
  - Server machine: PII-450 with 512 MB of RAM
  - Accesses Sybase running on Solaris 2.6 as a warehouse for roster data (see the sketch after this list)
- Used for journal entries and course evaluations
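
Course Sorcerer itself is written in ColdFusion; purely as an illustration of the roster-warehouse idea above, here is a minimal Perl/DBI sketch that copies roster rows from the Sybase warehouse into the local survey database. The server names, credentials, and table schema are invented, not the actual configuration.

#!/usr/bin/perl
# Hypothetical roster sync: pull enrollment rows from the Sybase warehouse
# and mirror them into the local survey database. All names are illustrative.
use strict;
use warnings;
use DBI;

my $warehouse = DBI->connect('dbi:Sybase:server=REGISTRAR', 'reader', 'secret',
                             { RaiseError => 1 });
my $local     = DBI->connect('dbi:ODBC:CourseSorcerer', 'sorcerer', 'secret',
                             { RaiseError => 1, AutoCommit => 0 });

my $fetch = $warehouse->prepare(
    'SELECT student_id, last_name, section FROM roster WHERE course = ?');
my $store = $local->prepare(
    'INSERT INTO roster (student_id, last_name, section) VALUES (?, ?, ?)');

$fetch->execute('ENG181');
while (my @row = $fetch->fetchrow_array) {
    $store->execute(@row);
}
$local->commit;

Keeping enrollment in a separate warehouse means the survey database never becomes the authority on who is registered.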
Team Evaluator (Peer Evaluation)
- Used by team members to provide confidential assessments (see the sketch after the requirements list)
- System requirements:
  - Operating system: Windows 2000 with ActivePerl, or UNIX with Perl 5.004 or higher
  - Perl modules: CGI, DBI (plus SQL drivers), POSIX
  - SQL server: MySQL 3.23 or higher
  - Web server: IIS (Windows) or Apache 1.3 (UNIX)
  - CPU: Pentium II 400 or better recommended
  - Memory: 128 MB or higher recommended
  - Disk space: 100 MB for adequate database space
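
The requirements above are Team Evaluator's stated stack (Perl with CGI and DBI against MySQL), so a sketch in that stack is natural. The handler below is still only a hypothetical illustration of receiving one confidential peer rating; its field, table, and column names are invented rather than taken from Team Evaluator's source.

#!/usr/bin/perl
# Hypothetical peer-evaluation handler: store one rating; confidentiality
# comes from only ever reporting aggregates back to the team.
use strict;
use warnings;
use CGI qw(:standard);
use DBI;

my $dbh = DBI->connect('dbi:mysql:database=teameval', 'evaluser', 'secret',
                       { RaiseError => 1 });

my $rater  = param('rater_id');   # authenticated team member
my $ratee  = param('ratee_id');   # teammate being evaluated
my $rating = param('rating');     # e.g. 1-5 contribution score

$dbh->do('INSERT INTO peer_rating (rater_id, ratee_id, rating) VALUES (?, ?, ?)',
         undef, $rater, $ratee, $rating);

print header('text/html'),
      start_html('Team Evaluator'),
      p('Thank you. Your evaluation has been recorded confidentially.'),
      end_html;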
Journal Entries
- Students complete journal entries online every two weeks.
- Submissions are anonymous (see the sketch after this list).
- All entries are read and summarized by a staff member and shared with the instructional team.
- Instructional team members share the summaries with their classes.
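
One way to make that anonymity structural rather than merely procedural is to store no student identifier with the entry at all. The production tool here is Course Sorcerer (ColdFusion); the Perl sketch below only illustrates the idea, with an invented schema.

#!/usr/bin/perl
# Hypothetical anonymous journal handler: the entry table has no student
# id column, so entries cannot be linked back to their authors.
use strict;
use warnings;
use CGI qw(:standard);
use DBI;

my $dbh = DBI->connect('dbi:mysql:database=journals', 'journal', 'secret',
                       { RaiseError => 1 });

my $section = param('section');   # e.g. 'Dickinson'
my $week    = param('week');      # journal entry number
my $text    = param('entry');

$dbh->do('INSERT INTO journal_entry (section, week, entry_text) VALUES (?, ?, ?)',
         undef, $section, $week, $text);

print header('text/plain'), "Entry received. Responses are anonymous.\n";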
Course Evaluations
- Students in Engineering 181 & 182 complete online course evaluations at the end of each quarter.
- Questions are designed to evaluate courses against items a-k of Criterion 3, Program Outcomes & Assessment, in the ABET Engineering Criteria 2000.
Short-Term Uses: Journal Entries & Course Evaluations
- Address immediate student concerns/questions about class, labs, or projects.
- Inquire about student problems with specific topics and labs.
- Discover general information about students' interests, influences, and attitudes.
Example: Addressing Immediate Student Concerns
- "How are the figures supposed to be done? Strictly isometric or just drawn so you can see everything? What pieces need to be labeled?"
- "What are we doing in labs 6 & 7? I know it says in the syllabus that we are incorporating the sorting mechanism, but is that going to take two weeks?"
Long-Term Uses: Journal Entries & Course Evaluations
- Improve program content
- Improve course materials
- Modify teaching styles
- Evaluate course based on ABET criteria
Example: Improving Course Content
"Positive: I...
- Gained knowledge about circuits in general
- Learned how to read schematics
- Learned how to use breadboards
- Further developed team working skills
Negative:
- The circuits did not work the first time.
- Time ran short for both labs, but we did finish each circuit."
How It Works
Start: WebCT site:
http://courses2.telr.ohio-state.edu
Completion Tracking
Engineering 182 Journal Completion Rate (percent complete per entry, by section)

Section     Entry #1   Entry #2   Entry #3   Entry #4   Entry #5   All Entries Avg.
Dickinson   87.2%      76.2%      73.8%      78.0%      75.5%      78.1%
Hastings    92.7%      85.5%      80.1%      78.4%      73.0%      81.9%
Chubb       93.1%      86.1%      79.2%      80.6%      80.6%      83.9%
Herrera     93.0%      71.7%      81.8%      74.6%      74.6%      79.1%
Gustafson   71.9%      65.6%      70.3%      68.8%      68.8%      69.1%
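
Completion percentages like those above can coexist with anonymous entry text if credit is recorded in a separate completion table keyed by student, while the text is stored without identifiers. A minimal Perl/DBI sketch, assuming hypothetical roster and completion tables:

#!/usr/bin/perl
# Hypothetical completion report: percent of each section's roster that
# submitted each journal entry.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:database=journals', 'journal', 'secret',
                       { RaiseError => 1 });

# Section sizes, keyed by section name.
my $sizes = $dbh->selectall_hashref(
    'SELECT section, COUNT(*) AS n FROM roster GROUP BY section', 'section');

# Submissions per section per entry.
my $done = $dbh->selectall_arrayref(q{
    SELECT r.section, c.week, COUNT(*)
    FROM completion c JOIN roster r ON r.student_id = c.student_id
    GROUP BY r.section, c.week
    ORDER BY r.section, c.week
});

for my $row (@$done) {
    my ($section, $week, $count) = @$row;
    printf "%-10s entry #%d: %5.1f%%\n",
           $section, $week, 100 * $count / $sizes->{$section}{n};
}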
Lessons Learned: Journal Entries & Course Evaluations
- Students are more likely to complete entries if given credit.
- Students are extremely responsive to the anonymity of the online survey.
- Students respond positively when asked for suggestions/solutions to problems in the class.
Web-Enhanced Course Evaluation at Columbia University
Jack McGourty
Columbia University
Overview
- A little history
- How does course assessment fit into the "big picture"?
- Why use web technology?
- How is it being done?
- Does it work?
History
- Columbia's Fu Foundation School of Engineering and Applied Science began using the web for course assessment about four years ago, starting with a student-administered web site for results.
- Designed and developed a state-of-the-art system using student teams.
- Now building on the current infrastructure to include online tutorials and increased flexibility for administration.
Student Web Site
- Search by course or faculty
- Current and past results
- No comments
The Big Picture
Why are we assessing courses and programs?
- Continuous improvement of the education process
  - What are we doing right, and what can we do better?
- Integral part of our ABET EC2000 compliance
  - Develop a process
  - Collect and evaluate data
  - Close the loop
  - Document/archive results
- Course evaluation is one of several outcome assessment measures, alongside senior exit surveys, enrolled student surveys, and alumni surveys.
How WCES Fits in: SEAS Assessment Processes
- Pre-1997: initiate course evaluation process
- 1998: conduct first alumni survey (all alumni); start academic review cycle
- 1999: create web-based course evaluation process
- 2000: conduct second alumni survey (1989 & 1994 graduates); benchmarking; senior surveys (Class of 2000)
- 2001: initiate freshman pre-attitude survey; senior surveys (Class of 2001); alumni survey (1996 graduates)
Using Technology
Pro:
- Students have the time to consider their responses
- Timely feedback
- Responses are easily analyzed, archived, and distributed
- Less paper
- Lower cost/more efficient administration

Con:
- You lose the "captive audience"
- You can't guarantee a diversity of opinions (motivated vs. non-motivated students; those who like the course vs. those who dislike it)
- Not necessarily less effort
Course Assessment Details
- 10 core items
  - Course quality
  - Instructor quality
- Relevant ABET EC2000 items
  - Pre-selected by faculty member
- Customized questions for specific course objectives
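
A form like this can be assembled from three pools: the ten fixed core items, whichever EC2000 items the instructor pre-selected, and any custom questions. The Perl sketch below is illustrative only; the item text, outcome keys, and structure are invented, not WCES internals.

#!/usr/bin/perl
# Hypothetical assembly of one course's evaluation form.
use strict;
use warnings;

my @core = (
    'Overall quality of the course',
    'Overall quality of the instructor',
    # ...eight more fixed core items
);

# Instructor-selected items, keyed by ABET EC2000 Criterion 3 outcome a-k.
my %ec2000 = (
    a => 'This course improved my ability to apply math and science.',
    d => 'This course improved my ability to function on a team.',
);

my @custom = ('The design project deepened my understanding of circuits.');

my @form = (@core, (map { $ec2000{$_} } sort keys %ec2000), @custom);
print "$_\n" for @form;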
Selecting EC2000 Questions
Monitoring Faculty Usage
One of our culture-change metrics is the percentage of faculty who are capitalizing on the system and adding custom and EC2000 questions. Currently around 15%.
Course Evaluation Results
- Web page access
  - Current term's assessment: limited time window, limited access, secure site
  - Previous terms' results: open access to numerical results, not comments
- Email results (see the sketch after this list)
  - Individual faculty
  - Aggregate data to department chairs
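
A hedged sketch of that reporting step: per-course means are computed from the numerical responses (comments stay out of open reports) and mailed to each instructor. The schema, department code, and every address below are invented; mail goes through Perl's core Net::SMTP module.

#!/usr/bin/perl
# Hypothetical emailed report: one message per course with its mean rating.
use strict;
use warnings;
use DBI;
use Net::SMTP;

my $dbh = DBI->connect('dbi:mysql:database=wces', 'wces', 'secret',
                       { RaiseError => 1 });

# Numerical results only; free-text comments are never aggregated here.
my $rows = $dbh->selectall_arrayref(q{
    SELECT course, instructor_email, AVG(rating)
    FROM response
    WHERE department = ?
    GROUP BY course, instructor_email
}, undef, 'EE');

my $smtp = Net::SMTP->new('smtp.example.edu') or die "no SMTP server";
for my $r (@$rows) {
    my ($course, $email, $mean) = @$r;
    $smtp->mail('[email protected]');
    $smtp->to($email);
    $smtp->data;
    $smtp->datasend("Subject: Course evaluation results for $course\n\n");
    $smtp->datasend(sprintf "Mean rating: %.2f\n", $mean);
    $smtp->dataend;
}
$smtp->quit;

Department-chair aggregates would be one more grouped query over the same table, addressed to the chair instead of the instructor.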
Reporting
Promoting Responses
- Student-driven results website
- Multiple targeted emails to students and faculty from the Dean
- Announcements in classes
- Posters all over the school
- Random prize drawing
Closing the Loop
Does it Work?
- Student response rates have steadily increased over the past two years, from 72% to 85%.
- More detail in students' written comments in course assessments.
- Data is available that we have never had before.
- Faculty use of the ABET EC2000 and customized-question features is increasing but still limited (15%).
Cross-Institutional Assessment with a Customized Web-Based Survey System
Mary Besterfield-Sacre & Larry Shuman
University of Pittsburgh

This work is sponsored by two grants: Engineering Information Foundation EiF 98-01, Perception versus Performance: The Effects of Gender and Ethnicity Across Engineering Programs, and National Science Foundation Action Agenda EEC-9872498, Engineering Education: Assessment Methodologies and Curricula Innovations.
Why a Web-Based Survey System for Assessment?
- Need for a mechanism to routinely:
  - Elicit student self-assessments and evaluations
  - Facilitate both tracking and benchmarking
- Most engineering schools lack sufficient resources (expertise, time, funds) to conduct the requisite program assessments.
- Triangulation of multiple measures
Pitt On-Line Student Survey System (Pitt-OS3)
- Allows multiple engineering schools to conduct routine program evaluations using EC2000-related web-based survey instruments.
- Assesses and tracks students at appropriate points in their academic careers via questionnaires.
- Surveys students throughout their undergraduate careers:
  - Freshman (pre and post)
  - Sophomore
  - Junior
  - Senior
  - Alumni
- Freshman orientation expanded to include:
  - Math placement examinations
  - Mathematics inventory self-assessment
Student-Focused Model
[Diagram: a student-focused model relating preparation, work experience, and a welcoming environment to attitudes and valuing, confidence, and competence in EC outcomes and an application area; students accept ambiguity, develop comfort, take on complexity, synthesize multiple areas, and gain knowledge-based opportunity and application.]
System-Focused Model
[Diagram: core processes (who: the student; what: curriculum; how: in-class instruction and learning through experience) yield outcomes (knowledge, skills, attitudes) and student growth, supported by enablers and enhancers: School of Engineering services, engineering management, advising/counseling, university services, and culture.]
Pitt OS3
- Conducts routine program evaluation via surveys through the web:
  - Data collection
  - Report generation (under development)
- Web versus paper surveys:
  - Pros: ease of administration, minimized obtrusiveness, "cleaner" data
  - Cons: lower response rates than paper-and-pencil surveys; user/technical issues
Pitt OS3: System Components
[Diagram: a global administrator maintains the On-Line Student Survey System (OS3); local administrators control their schools' surveys ("A", "B"); students take their surveys over the Internet.]
Pitt OS3: Local Administrator
- An individual at the school where the surveys are being conducted
- Responsible for administering the surveys through a web interface
- Controls the appearance of the survey:
  - Selects school colors
  - Uploads school emblem/logo
- Selects survey beginning and ending dates
- Composes initial and reminder email letters to students
- Cuts and pastes user login names and email addresses (see the sketch after this list)
- Manages surveys in progress
- Extends surveys beyond their original dates
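
A minimal Perl sketch of that roster-upload step: parse pasted "username, email" lines and give each student a one-time password. The username-plus-digits password pattern mirrors what the sample email below shows, but the scheme, table, and survey id are assumptions, not the actual OS3 code.

#!/usr/bin/perl
# Hypothetical participant load: one "username, email" pair per line on stdin.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:database=os3', 'os3', 'secret',
                       { RaiseError => 1 });

while (my $line = <STDIN>) {
    chomp $line;
    my ($user, $email) = split /\s*,\s*/, $line;
    next unless $user && $email;
    my $password = $user . int(rand(900) + 100);   # e.g. "Mary715"
    $dbh->do('INSERT INTO participant (username, email, password, survey_id)
              VALUES (?, ?, ?, ?)',
             undef, $user, $email, $password, 4);
}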
[Screenshots: the local administrator web interface]
Pitt OS3: Student
- A Java applet running in a web browser
- One question per screen minimizes scroll-bar confusion
- Once the student submits the questionnaire, the results are compressed and sent to the OS3 server (see the sketch after this list)
- Results are stored and the student's password is invalidated
- A confirmation screen thanks the student for taking the survey
- Can accommodate users who do not have email accounts
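
A Perl stand-in for the server side of that submit flow: store the compressed answers, invalidate the student's password so the survey can be taken only once, and confirm. OS3's actual server code is not reproduced here; the tables and fields are hypothetical.

#!/usr/bin/perl
# Hypothetical submission endpoint for the survey applet.
use strict;
use warnings;
use CGI qw(:standard);
use DBI;

my $dbh = DBI->connect('dbi:mysql:database=os3', 'os3', 'secret',
                       { RaiseError => 1 });

my $user = param('username');
my $blob = param('answers');   # compressed questionnaire from the applet

$dbh->do('INSERT INTO result (username, answers) VALUES (?, ?)',
         undef, $user, $blob);

# One response per student: clearing the password invalidates the login.
$dbh->do('UPDATE participant SET password = NULL WHERE username = ?',
         undef, $user);

print header('text/html'),
      start_html('OS3'),
      p('Thank you for taking the survey.'),
      end_html;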
Pitt OS3: Sample Student Email

Subject: Freshman Engineering Attitudes Pre-Survey
To: [email protected]

Hello and Welcome to the Colorado School of Mines!

You are invited to participate in a research study designed to study students' attitudes about engineering, mathematics, and science. This information will help CSM to design more effective courses and programs to enhance your undergraduate education.

The survey is called the Freshman Engineering Attitudes Pre-Survey. If you decide to participate, you will be asked to complete this survey twice: once at the beginning of the semester and again at the end of the academic year. The questionnaire, which takes less than 15 minutes to complete, can be taken any time at your leisure; however, the pre-survey will only be available until 2000-09-22.

Please remember that there are no right or wrong answers, so be honest with your responses. Your responses will remain confidential. If you have questions about this study, please contact Dr. Barbara Olds [ext. 3991 or [email protected]] or Dr. Ron Miller [ext. 3892 or [email protected]].

Your decision to participate in this study is voluntary and there is no penalty if you decide not to participate.

For your convenience, the University of Pittsburgh has made it possible to take the survey online:

Web location: http://136.142.87.142/os3/SurveyClient.html?=4
Your username is: Mary
Your password is: Mary715

If you experience technical problems taking the survey, please contact Dr. Ray Hoare via email at [email protected].

Your participation in this project is important to us. Once you have completed the survey, please stop by the McBride Honors Program office to pick up a small token of our appreciation. Thank you for your help with this important project.

Barbara M. Olds, Professor of Liberal Arts & International Studies
Ronald L. Miller, Professor of Chemical Engineering
[Screenshots: Pitt OS3 student welcome, student instructions, and questionnaire screens]
Pitt OS3: How it Works
- Every day, OS3 summarizes all active surveys for each school (see the sketch after this list):
  - Summary reports give the number of students who have and have not taken the survey
  - Specific students can also be viewed from the local administrator's account
- Upon completion of the survey dates:
  - Email addresses are stripped from the system
  - Only login names remain with the results
- The only time the OS3 system holds student email addresses is while the local administrator is receiving daily updates about active surveys
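
A hedged Perl sketch of that daily cycle: count who has and has not responded, mail the summary to the local administrator, and strip email addresses once a survey's end date passes. The schema, survey id, and addresses are invented; the final statement uses MySQL's multi-table UPDATE syntax.

#!/usr/bin/perl
# Hypothetical nightly job for one survey.
use strict;
use warnings;
use DBI;
use Net::SMTP;

my $dbh = DBI->connect('dbi:mysql:database=os3', 'os3', 'secret',
                       { RaiseError => 1 });

# A cleared password marks a completed survey (see the submit sketch above).
my ($taken) = $dbh->selectrow_array(
    'SELECT COUNT(*) FROM participant WHERE survey_id = ? AND password IS NULL',
    undef, 4);
my ($pending) = $dbh->selectrow_array(
    'SELECT COUNT(*) FROM participant WHERE survey_id = ? AND password IS NOT NULL',
    undef, 4);

my $smtp = Net::SMTP->new('smtp.example.edu') or die "no SMTP server";
$smtp->mail('[email protected]');
$smtp->to('[email protected]');
$smtp->data;
$smtp->datasend("Subject: Daily Update\n\n");
$smtp->datasend("$taken have taken the survey.\n");
$smtp->datasend("$pending have not yet taken the survey.\n");
$smtp->dataend;
$smtp->quit;

# After the end date, only login names remain with the results.
$dbh->do(q{UPDATE participant p JOIN survey s ON s.survey_id = p.survey_id
           SET p.email = NULL
           WHERE s.end_date < CURDATE()});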
Pitt OS3: Sample Daily Report

Date: Mon, 18 Jun 2001 13:10:21 -0500 (EST)
Date-warning: Date header was inserted by pitt.edu
From: [email protected]
Subject: Math Inventory Daily Update
To: [email protected]

The Math Inventory survey for University of Pittsburgh Freshman was started on 2001-05-18. The last day for the survey is 2001-08-20.

227 have taken the survey.
3 have not yet taken the survey.

The survey system is online at http://166.153.77.154/os3/Student.html?=99,local. You can check the status of individual students as well as change other options such as the color scheme through your local administrator account:

Username: local
Password: xxx1234
Pitt OS3: Evaluation of the System
- Piloted at five schools
  - Multiple surveys run concurrently at each school
  - Multiple schools at one time
- Response rates vary (30-70% on average)
- Example: University of Pittsburgh, April 2001
  - One initial email plus two reminder emails over 2.5 weeks
  - Responses: Freshmen 70%, Sophomores 44%, Juniors 48%
  - Varied by department
  - Some usernames had "+" (see the encoding sketch after this list)
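
The "+" note above is a classic URL and form-encoding hazard: an unescaped "+" in a username decodes as a space on the server. One plausible remedy (an assumption, not necessarily what OS3 did) is to percent-encode any username embedded in the emailed survey link:

#!/usr/bin/perl
# Percent-encode a username before placing it in a URL; "+" becomes "%2B".
use strict;
use warnings;
use URI::Escape qw(uri_escape);

my $username = 'smith+jr';
my $url = 'http://survey.example.edu/os3/SurveyClient.html?user='
        . uri_escape($username);
print "$url\n";   # ...?user=smith%2Bjr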
Pitt OS3: System Trace of One School
- Freshman post-survey
- Survey available for two weeks, with one reminder message
- 57% overall response rate
- Increased server traffic 2 to 24 hours after each email
- Design concerns:
  - 63% of students had to log in more than once
  - Multiple logins were due to case-sensitive passwords (see the sketch after this list)
  - 14% never finished: browser problems, or didn't want to finish
  - 10% gave up: just didn't complete the login
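
One plausible remedy for the case-sensitivity concern (again an assumption, not OS3's documented fix): since these passwords are machine-generated and emailed in the clear anyway, comparing them case-insensitively gives up little security and removes the most common cause of repeated logins.

#!/usr/bin/perl
# Case-insensitive credential check for low-stakes, emailed passwords.
use strict;
use warnings;

sub password_ok {
    my ($supplied, $stored) = @_;
    return lc($supplied) eq lc($stored);
}

print password_ok('mary715', 'Mary715') ? "ok\n" : "denied\n";   # prints "ok"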
Pitt OS3: Issues to Consider
- Consent for human subjects
  - Discuss with the institution's Institutional Review Board
  - Surveys are often exempt
- Java applets are not supported by very old browsers
  - HTML as an alternate
- Firewalls established by other organizations