2. Web-Based Quizzes/Surveys for Meaningful Learner Assessment
Dr. Curtis J. Bonk
Indiana University and CourseShare
http://php.indiana.edu/~cjbonk
[email protected]
Online Student Assessment

Assessment Takes Center Stage in Online Learning
(Dan Carnevale, April 13, 2001, Chronicle of Higher Education)
“One difference between assessment in classrooms and in distance education is that distance-education programs are largely geared toward students who are already in the workforce, which often involves learning by doing.”
Focus of Assessment?
1. Basic Knowledge, Concepts, Ideas
2. Higher-Order Thinking Skills, Problem Solving, Communication, Teamwork
3. Both of Above!!!
4. Other…
Assessments Possible
- Online Portfolios of Work
- Discussion/Forum Participation
- Online Mentoring
- Weekly Reflections
- Tasks Attempted or Completed, Usage, etc.
More Possible Assessments
- Quizzes and Tests
- Peer Feedback and Responsiveness
- Cases and Problems
- Group Work
- Web Resource Explorations & Evaluations
Measures of Student Success
(Focus groups, interviews, observations, surveys, exams, records)
- Positive Feedback, Recommendations
- Increased Comprehension, Achievement
- High Retention in Program
- Completion Rates or Course Attrition
- Jobs Obtained, Internships
- Enrollment Trends for Next Semester
Student Basic Quantitative Measures
- Grades, Achievement
- Number of Posts
- Participation
- Computer Log Activity: peak usage, messages/day, time on task or in system (see the sketch after this list)
- Attitude Surveys
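The computer-log metrics above (peak usage, messages per day, posts per student) can be computed directly from raw timestamp data; a minimal sketch in Python, assuming a simple list of (student, timestamp) post records rather than any particular LMS export format.

```python
from collections import Counter
from datetime import datetime

# Hypothetical post log: (student_id, ISO timestamp) pairs.
post_log = [
    ("s01", "2003-02-03T09:15:00"),
    ("s02", "2003-02-03T09:40:00"),
    ("s01", "2003-02-04T20:05:00"),
]

times = [datetime.fromisoformat(ts) for _, ts in post_log]

per_day = Counter(t.date() for t in times)                 # messages per day
peak_hour = Counter(t.hour for t in times).most_common(1)[0][0]  # peak usage hour
per_student = Counter(sid for sid, _ in post_log)          # posts per student

print(per_day, peak_hour, per_student)
```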
Student High-End Success
- Message complexity, depth, interactivity, questioning
- Collaboration skills
- Problem finding/solving and critical thinking
- Challenging and debating others
- Case-based reasoning, critical thinking measures
- Portfolios, performances, PBL activities
Sample Portfolio Scoring Dimensions (10 pts each)
(see: http://php.indiana.edu/~cjbonk/p250syla.htm)
1. Richness
2. Coherence
3. Elaboration
4. Relevancy
5. Timeliness
6. Completeness
7. Persuasiveness
8. Originality

1. Insightful
2. Clear/Logical
3. Original
4. Learning
5. Feedback/Responsive
6. Format
7. Thorough
8. Reflective
9. Overall Holistic
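Since each of the eight portfolio dimensions above is worth 10 points, the total is just a capped sum (80 points maximum); a minimal scoring sketch using those dimension names, with the example ratings invented for illustration.

```python
# Portfolio rubric: eight dimensions, 10 points each (80 max).
DIMENSIONS = ["Richness", "Coherence", "Elaboration", "Relevancy",
              "Timeliness", "Completeness", "Persuasiveness", "Originality"]

def score_portfolio(ratings: dict) -> int:
    """Sum per-dimension ratings, each capped at 10 points."""
    return sum(min(ratings.get(d, 0), 10) for d in DIMENSIONS)

# Example: one hypothetical student's ratings.
print(score_portfolio({"Richness": 9, "Coherence": 8, "Elaboration": 7,
                       "Relevancy": 10, "Timeliness": 10, "Completeness": 9,
                       "Persuasiveness": 8, "Originality": 7}))  # 68 of 80
```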
E-Peer Evaluation Form
Peer Evaluation. Name: ____________________
Rate on a scale of 1 (low) to 5 (high):
___ 1. Insight: creative, offers analogies/examples, relationships drawn, useful ideas and connections, fosters growth.
___ 2. Helpful/Positive: prompt feedback, encouraging, informative, makes suggestions and gives advice, finds and shares info.
___ 3. Valuable Team Member: dependable, links group members, there…
E-Case Analysis Evaluation
Peer Feedback Criteria (1 pt per item; 5 pts per peer feedback)
(a) Provides additional points that may have been missed.
(b) Corrects a concept, asks for clarification where needed, debates issues, disagrees and explains why.
(c) Ties concepts to another situation or refers to the text or coursepack.
(d) Offers valuable insight based on personal experience.
(e) Overall constructive feedback.
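The case-analysis scheme above awards one point per criterion met, up to five points per peer review; a minimal sketch with the criteria abbreviated to short labels (the labels are paraphrases for illustration, not official wording).

```python
# One point per criterion met, five points possible per peer review.
CRITERIA = ["adds missed points", "corrects or debates", "ties to text",
            "personal insight", "overall constructive"]

def score_peer_feedback(criteria_met: set) -> int:
    """Count how many of the five criteria this peer review satisfied."""
    return sum(1 for c in CRITERIA if c in criteria_met)

print(score_peer_feedback({"adds missed points", "ties to text"}))  # 2 of 5
```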
Issues to Consider…
1. Bonus points for participation?
2. Peer evaluation of work?
3. Assess improvement?
4. Is it timed? Allow retakes if the connection is lost? How many retakes?
5. Give unlimited time to complete?
Issues to Consider…
6. Cheating? Is it really that student?
7. Authenticity?
8. Negotiating tasks and criteria?
9. How to measure competency?
10. How do you demonstrate learning online?
Online Testing Tools

Pause… What tools do you use or know about?
What can Online Tests Do?
- Assess student progress
- Allow for self-assessment
- Provide standards for success
- Timed testing and retesting (see the sketch after this list)
- Opportunity for instructor commenting
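Timed testing and retesting, which also surfaces under "Issues to Consider", reduces to two checks per submission attempt; a minimal policy sketch with illustrative limits (a 30-minute window and two retakes) that are assumptions, not recommendations.

```python
from datetime import datetime, timedelta

TIME_LIMIT = timedelta(minutes=30)   # illustrative time limit
MAX_ATTEMPTS = 3                     # original attempt plus two retakes

def attempt_allowed(prior_attempts: int, started: datetime, now: datetime) -> bool:
    """Accept a submission only if attempts remain and the clock has not run out."""
    return prior_attempts < MAX_ATTEMPTS and (now - started) <= TIME_LIMIT
```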
Using WebCT Quizzes in a High-Demand Environment
(Brothen & Wambach, Technology Source, May/June 2003)
“For several years we have taught our psychology courses with the Personalized System of Instruction (PSI; Keller, 1968), a highly researched version of the mastery learning teaching method. Students in PSI work at their own pace to read a textbook with direction from a study guide and, when they are ready, take chapter quizzes; after they master one chapter, they move on to the next.”
Using WebCT Quizzes in a High-Demand Environment
(Brothen & Wambach, Technology Source, May/June 2003)
“Several reviews and meta-analyses (Keller, 1974; Kulik, Kulik, & Bangert-Drowns, 1990; Kulik, Kulik, & Cohen, 1979; Robin, 1976; Ryan, 1974) have found superior student learning in PSI compared to traditional lecture/discussion methods. For example, in the 26 college psychology courses analyzed by Kulik, Kulik, and Bangert-Drowns, the effect size in favor of PSI over traditional instruction was 0.71.”
Test Selection Criteria (Hezel, 1999)
- Easy to Configure Items and Test
- Handles Symbols
- Scheduling of Feedback (immediate?)
- Provides Clear Input of Dates for Exam
- Easy to Pick Items for Randomizing (see the sketch after this list)
- Randomize Answers Within a Question
- Weighting of Answer Options
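Two of Hezel's criteria, randomizing item order and randomizing answers within a question, are easy to sketch; the item bank below is invented for illustration.

```python
import random

# Hypothetical item bank: each item has a stem and a list of answer options.
items = [
    {"stem": "PSI stands for...",
     "options": ["Personalized System of Instruction", "Public Survey Index", "Peer Scoring Inventory"]},
    {"stem": "An effect size of 0.71 favors...",
     "options": ["PSI", "Traditional lecture", "Neither"]},
]

def randomized_exam(item_bank, seed=None):
    """Shuffle item order and, within each item, the answer options."""
    rng = random.Random(seed)
    exam = [dict(item) for item in rng.sample(item_bank, k=len(item_bank))]
    for item in exam:
        item["options"] = rng.sample(item["options"], k=len(item["options"]))
    return exam

print(randomized_exam(items, seed=1))
```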
More Test Selection Criteria
- Recording of Multiple Submissions
- Timed Tests
- Comprehensive Statistics
- Summarize in Portfolio and/or Gradebook
- Confirmation of Test Submission
More Test Selection Criteria (Perry & Colon, 2001)
- Supports multiple item types: multiple choice, true-false, essay, keyword
- Can easily modify or delete items
- Incorporates graphic or audio elements?
- Control over the number of times students can submit an activity or test
- Provides feedback for each response
Learner-Content Interactions
Online Fun and Games
(see Thiagi.com or deepfun.com)
1. Puzzle games
2. Solve puzzle against timer
3. Learn concepts
4. Compete
5. Get points

Students Play Online Jeopardy Game
www.km-solutions.biz/caa/quiz.zip
More Test Selection Criteria (Perry & Colon, 2001)
- Flexible scoring: score first, last, or average submission (see the sketch after this list)
- Flexible reporting: by individual or by item, and cross tabulations
- Outputs data for further analysis
- Provides item analysis statistics (e.g., Test Item Frequency Distributions)
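Flexible scoring (first, last, or average submission) and a basic item frequency distribution can both be sketched in a few lines; the data shapes here are assumptions, not any tool's actual export format.

```python
from collections import Counter
from statistics import mean

def flexible_score(submissions, policy="last"):
    """Score a student's attempts by the chosen policy: 'first', 'last', or 'average'."""
    if policy == "first":
        return submissions[0]
    if policy == "last":
        return submissions[-1]
    return mean(submissions)

# Item analysis: frequency distribution of chosen options for one item.
responses = ["A", "C", "A", "B", "A", "C"]          # hypothetical answers to one item
print(Counter(responses))                           # Counter({'A': 3, 'C': 2, 'B': 1})
print(flexible_score([6, 8, 9], policy="average"))  # 7.67 (approximately)
```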
Web Resources on Assessment
1. http://www.indiana.edu/~best/
2. http://www.indiana.edu/~best/best_suggested_links.shtml
3. http://www.indiana.edu/~best/samsung/
Rubric for evaluating technology projects:
http://www.indiana.edu/~tickit/learningcenter/rubric.htm
Online Survey Tools for Assessment

Perhaps you have heard of my survey techniques?
Sample Survey Tools
- Zoomerang (http://www.zoomerang.com)
- IOTA Solutions (http://www.iotasolutions.com)
- QuestionMark (http://www.questionmark.com/home.html)
- Survey Solutions from Perseus (http://www.perseusdevelopment.com/fromsurv.htm)
- Infopoll (http://www.infopoll.com)
Sample Survey Tools
- Active Feedback (http://www.activefeedback.com/af)
- SurveyKey (http://www.surveykey.com)
- EZSurvey from Raosoft (http://www.raosoft.com/)
- SurveyShare (http://SurveyShare.com; from Courseshare.com)
Electronic Voting and Polling
1. Ask students to vote on an issue before class (anonymously or sent directly to the instructor)
2. Instructor pulls out the minority point of view
3. Discuss it alongside the majority point of view
4. Repoll students after class
(Note: Delphi or Timed Disclosure Technique: anonymous input until a due date, then post results and reconsider until consensus; Rick Kulp, IBM, 1999)
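The vote/discuss/repoll pattern above amounts to tallying two rounds of anonymous votes and comparing them; a minimal sketch with invented vote data.

```python
from collections import Counter

# Hypothetical anonymous votes on one issue, before and after class discussion.
pre_class  = ["agree", "agree", "disagree", "agree", "unsure"]
post_class = ["agree", "disagree", "disagree", "agree", "agree"]

pre, post = Counter(pre_class), Counter(post_class)
for option in sorted(set(pre) | set(post)):
    print(f"{option:10s} before={pre[option]}  after={post[option]}  shift={post[option] - pre[option]}")
```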
Poll Students Asynchronously with Web Conferencing Tools (e.g., Sitescape Forum)
Poll Students Synchronously with Web Conferencing Tools (e.g., HorizonLive)
Survey Student Opinions (e.g., InfoPoll, SurveySolutions, Zoomerang, SurveyShare.com)
Web-Based Survey Advantages
- Faster collection of data
- Standardized collection format
- Computer graphics may reduce fatigue
- Computer-controlled branching and skip sections
- Easy to answer by clicking
- Wider distribution of respondents
Why Conduct Online Surveys
- Formative assessment of class
- Increase student voice and ownership in class
- Involve students from other locations
- Quickly gather answers to questions
Web-Based Survey Problems: Why Lower Response Rates?
- Low response rate
- Lack of time
- Unclear instructions
- Too lengthy
- Too many steps
- Can’t find URL
Survey Tool Features
- Supports different types of items (Likert, multiple choice, forced ranking, paired comparisons, etc.)
- Maintains email lists and email invitations
- Conducts polls
- Adaptive branching and cross tabulations (see the sketch after this list)
- Modifiable templates & library of past surveys
- Publishes reports
- Different types of accounts: hosted, corporate, professional, etc.
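Adaptive branching, one of the features listed above, just maps each answer to the next question to display; a minimal sketch with an invented two-branch survey.

```python
# Hypothetical branching survey: each answer maps to the id of the next question.
survey = {
    "q1": {"text": "Have you taken an online course before?",
           "branch": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "Which tools did it use?", "branch": {}},
    "q3": {"text": "What has kept you from trying one?", "branch": {}},
}

def next_question(current_id, answer):
    """Return the id of the next question to show, or None when the path ends."""
    return survey[current_id]["branch"].get(answer)

print(next_question("q1", "yes"))  # q2
```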
Web-Based Survey Solutions: Some Tips…
- Send a second request
- Make the URL link prominent
- Offer incentives near the top of the request
- Shorten the survey; make it attractive and easy to read
- Credible sponsorship (e.g., a university)
- Disclose purpose, use, and privacy
- E-mail cover letters
- Prenotify of intent to survey
Plagiarism

Increasing Cheating Online
($7-$30/page; Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?, http://www.syllabus.com/, January 2002)
- http://www.academictermpapers.com/
- http://www.termpapers-on-file.com/
- http://www.nocheaters.com/
- http://www.cheathouse.com/uk/index.html
- http://www.realpapers.com/
- http://www.pinkmonkey.com/ (“you’ll never buy Cliffnotes again”)
Reducing Cheating Online
- Ask yourself, why are they cheating?
- Do they value the assignment?
- Are tasks relevant and challenging?
- What happens to the task after it is submitted: reused, woven in, posted?
- Due at end of term? Real audience?
- Look at pedagogy before calling the plagiarism police!
Reducing Cheating Online
- Proctored exams
- Vary items in the exam
- Have timed exams
- Make the course too hard to cheat
- Try Plagiarism.com ($300)
- Random selection of items from an item pool (see the sketch after this list)
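Random selection of items from a pool, the last tactic above, can be seeded per student so each person sees a different but reproducible exam; a small sketch with a hypothetical 40-item pool.

```python
import random

ITEM_POOL = [f"item_{n:02d}" for n in range(1, 41)]   # hypothetical 40-item pool
ITEMS_PER_EXAM = 10

def exam_for(student_id: str):
    """Draw a reproducible per-student subset of the item pool."""
    rng = random.Random(student_id)                   # seed by student id
    return rng.sample(ITEM_POOL, ITEMS_PER_EXAM)

print(exam_for("s042"))
```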
Reducing Cheating Online
- Use test passwords
- Rely on IP# screening
- Use mastery learning for some tasks
- Assign collaborative tasks
- Emphasize consequences (e.g., stories of past offenders)
- Have students make a vow of no cheating (e.g., University of Virginia)
Reducing Cheating Online
($7-$30/page; Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?, http://www.syllabus.com/, January 2002)
- http://www.plagiarism.org/ (resource)
- http://www.turnitin.com/ (software, $100, free 30-day demo/trial)
- http://www.canexus.com/ (software; essay verification engine, $19.95)
- http://www.plagiserve.com/ (free database of 70,000 student term papers & Cliff notes)
- http://www.academicintegrity.org/ (association)
- http://sja.ucdavis.edu/avoid.htm (guide)
- http://www.georgetown.edu/honor/plagiarism.html
Turnitin Testimonials
“Many of my students believe that if they do not submit their essays, I will not discover their plagiarism. I will often type a paragraph or two of their work in myself if I suspect plagiarism. Every time, there was a ‘hit.’ Many students were successful plagiarists in high school. A service like this is needed to teach them that such practices are no longer acceptable and certainly not ethical!”
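Detection services such as turnitin.com match submissions against large databases of existing papers; the core matching idea can be illustrated with a naive word n-gram overlap check (a toy sketch, not how any of the listed services actually works).

```python
def ngrams(text: str, n: int = 5) -> set:
    """Lower-case word n-grams of a document."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(submission: str, source: str, n: int = 5) -> float:
    """Fraction of the submission's n-grams that also appear in the source."""
    sub, src = ngrams(submission, n), ngrams(source, n)
    return len(sub & src) / len(sub) if sub else 0.0
```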
Tips on Authentication
- Check e-mail access against class list
- Use password access
- Provide keycode, PIN, or ID # (see the sketch after this list)
- (Futuristic other: palm print, fingerprint, voice recognition, iris scanning, facial scanning, handwriting recognition, picture ID)
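Checking a login against the class list plus a per-student keycode or PIN, as suggested above, takes only a few lines; the roster and code below are invented examples, and a real system would store only hashed credentials as sketched here.

```python
import hashlib
import hmac

# Hypothetical roster: e-mail -> SHA-256 hash of the student's keycode/PIN.
ROSTER = {"[email protected]": hashlib.sha256(b"4821").hexdigest()}

def authenticate(email: str, keycode: str) -> bool:
    """Accept only enrolled e-mail addresses whose keycode matches the stored hash."""
    expected = ROSTER.get(email)
    given = hashlib.sha256(keycode.encode()).hexdigest()
    return expected is not None and hmac.compare_digest(expected, given)

print(authenticate("[email protected]", "4821"))  # True
```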
Final advice…whatever you do…