Pedagogy and Instructional Design Part III: Courseware


Research Related to the Effectiveness of E-Learning and Collaborative Tools
Dr. Curtis J. Bonk
Associate Professor, Indiana University
President, CourseShare.com
http://php.indiana.edu/~cjbonk,
[email protected]
Are you ready???
A Vision of E-learning for America’s Workforce, Report of the Commission on Technology and Adult Learning (2001, June)

“A remarkable 84 percent of two- and four-year colleges in the United States expect to offer distance learning courses in 2002” (only 58% did in 1998) (U.S. Dept. of Education report, 2000).

“Web-based training is expected to increase 900 percent between 1999 and 2003.” (ASTD, State of the Industry Report, 2001)
Brains Before and After E-learning
(Before and after images; and when using synchronous and asynchronous tools.)
Tons of Recent Research
Not much of it is any good...
Problems and Solutions
(Bonk, Wisher, & Lee, in review)
Problems:
1. Tasks Overwhelm
2. Confused on Web
3. Too Nice Due to Limited Share History
4. Lack Justification
5. Hard not to preach
6. Too much data
7. Communities not easy to form

Solutions:
1. Train and be clear
2. Structure time/dates due
3. Develop roles and controversies
4. Train to back up claims
5. Students take lead role
6. Use Email Pals
7. Embed Informal/Social
Benefits and Implications
(Bonk, Wisher, & Lee, in review)
Benefits:
1. Shy open up online
2. Minimal off task
3. Delayed collab more rich than real time
4. Students can generate lots of info
5. Minimal disruptions
6. Extensive E-Advice
7. Excited to Publish

Implications:
1. Use async conferencing
2. Create social tasks
3. Use Async for debates; Sync for help, office hours
4. Structure generation and force reflection/comment
5. Foster debates/critique
6. Find Experts or Prac.
7. Ask Permission
Basic Distance Learning Finding?
• Research since 1928 shows that DL
students perform as well as their
counterparts in a traditional
classroom setting.
Per: Russell, 1999, The No Significant Difference
Phenomenon (5th Edition), NCSU, based on
355 research reports.
http://cuda.teleeducation.nb.ca/nosignificantdifference/
Question: Why is there no
learning in e-learning???
A. Poor pedagogy?
B. Inferior online tools?
C. Unmotivated students and instructors?
D. Poor research and measurement?
E. Too new?
F. Vendor and administrator visions do
not match reality?
Online Learning Research Problems
(National Center for Education Statistics, 1999; Phipps & Merisotis, 1999; Wisher et al., 1999)
• Anecdotal evidence; minimal theory.
• Questionable validity of tests.
• Lack of control groups.
• Hard to compare given different assessment tools and domains.
Online Learning Research Problems
(National Center for Education Statistics, 1999; Phipps & Merisotis, 1999; Wisher et al., 1999)
• Fails to explain why the drop-out rates of distance learners are higher.
• Does not relate learning styles to different technologies or focus on interaction of multiple technologies.
Online Learning Research Problems
(Bonk & Wisher, 2000)
• For different purposes or domains: in our
study, 13% concern training, 87%
education
• Flaws in research designs
- Only 36% have objective learning
measures
- Only 45% have comparison groups
• When effective, it is difficult to know why
- Course design?
- Instructional methods?
- Technology?
Ten Primary Experiments
Adaptations from Education to Training
(Bonk & Wisher, 2000)
1) Variations in Instructor Moderation
2) Online Debating
3) Student Perceptions of e-Learning Envir.
4) Devel of Online Learning Communities
5) Time Logging
6) Critical Thinking and Problem Solving Applications in Sync/Asynchronous Envir.
7) Peer Tutoring and Online Mentoring
8) Student Retention: E-learning and Attrition
9) Conceptual Referencing
10) Online Collaboration
Evaluating Web-Based Instruction:
Methods and Findings (41 studies)
(Olson & Wisher, in review)
(Bar chart: number of studies by year of publication, 1996–2001; 2001 projected.)
Wisher’s Wish List
• Effect size of .5 or higher in comparison to traditional classroom instruction. But reality:

                          Average Effect Size    Number of Studies
Web-Based Instruction            .11                    31
CBI (Kulik [8])                  .32                    97
CBI (Liao [18])                  .41                    46
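For reference, the effect sizes above are standardized mean differences between the web/computer-delivered group and the classroom group. A minimal sketch of the standard computation (a Cohen's d style index with a pooled standard deviation), assuming each study reports group means, standard deviations, and sample sizes:

```latex
d = \frac{\bar{X}_{\mathrm{web}} - \bar{X}_{\mathrm{classroom}}}{s_{\mathrm{pooled}}},
\qquad
s_{\mathrm{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
```

By this yardstick, the wished-for .5 would mean the average web-trained learner outscores the average classroom learner by half a standard deviation.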
Evaluating Web-Based Instruction:
Methods and Findings
(Olson & Wisher, in review)
“…there is little consensus as to
what variables should be
examined and what measures of learning are most
appropriate, making
comparisons between studies
difficult and inconclusive.”
Evaluating Web-Based Instruction:
Methods and Findings
(Olson & Wisher, in review)
What to Measure?
• demographics (age, gender, etc.),
• previous experience,
• course design,
• instructor effectiveness or feedback,
• technical issues,
• levels of participation and collaboration,
• student and instructor interactions,
• student recommendation of course,
• student desire to take add’l online courses.
Evaluating Web-Based Instruction:
Methods and Findings
(Olson & Wisher, in review)
Variables Studied:
1. Type of Course: Graduate (18%) vs. undergraduate courses (81%)
2. Level of Web Use: All-online (64%) vs.
blended/mixed courses (34%)
3. Content area (e.g., math/engineering
(27%), science/medicine (24%),
distance ed (15%), social science/educ
(12%), business (10%), etc.)
Other data:
a. Attrition data collected (34%)
b. Comparison Group (59%)
Different Goals…
• Making connections
• Appreciating different perspectives
• Students as teachers
• Greater depth of discussion
• Fostering critical thinking online
• Interactivity online
Learning Improved
(Maki & Maki, 2002, Journal of Experimental Psychology: Applied, 8(2), 85-98)
• Intro to Psych: Lecture vs. Online
• Web-based course had more advantages as comprehension skill increased
• Still, students preferred the face-to-face over online
• Why? More guidance, feedback, & enthusiasm, and fewer deadlines.
Learning Improved…
(Maki, Maki, Patterson, & Whittaker, 2000)
• Intro to Psych: Lecture vs. Online
• Online consistently higher exam scores
• Online learned more as indicated by higher scores on psych graduate record exams during semester
Learning Improved…
(Maki et al., 2000)
• Intro to Psych: Lecture vs. Online
• Online performed better on midterms.
• Web-based course students scored higher since they had weekly activities due
• Lecture students could put off reading until night before exam.
Learning Worse
(Wang & Newlin, 2000)
• Stat Methods: Lecture vs. Online
• No diffs at midterm
• Lecture 87 on final, Web a 72
• Course relatively unstructured
• Web students encouraged to collab
• Lecture students could not collab
• All exams but final were open book
Learning Worse
(Waschull, 2001)
• Psych: Lecture vs. Online
• No diffs at midterm
• Self-selected sections: Lecture 86 on final, Web a 77
• Random Assignment sections: No differences
• Self-selected students more likely to fail the online course
• Web course higher student satisfaction
Learning Improved or Not…
(Hiltz, 1993)
• Web may be suited to some and lecture to others…
• Students who find Web convenient for them score better.
• Ratings of course involvement and ease of access to instructor also important.
Learning Improved or Not…
(Sankaran et al., 2000)
• Students with a positive attitude toward Web format learned more in Web course than in lecture course.
• Students with positive attitude toward lecture format learned more in lecture format.
Electronic Conferencing:
Quantitative Analyses
• Usage patterns, # of messages, cases, responses
• Length of case, thread, response
• Average number of responses
• Timing of cases, commenting, responses, etc.
• Types of interactions (1:1; 1:many)
• Data mining (logins, peak usage, location, session length, paths taken, messages/day/week), Time-Series Analyses (trends)
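Most of these quantitative indices can be pulled straight from a conference's message log. A minimal sketch in Python, assuming a hypothetical log of (message id, thread, author, timestamp, parent) records; the field layout and sample values are illustrative only, not any particular conferencing system's format:

```python
from collections import Counter
from datetime import datetime

# Hypothetical message log: (message_id, thread_id, author, timestamp, parent_id or None)
LOG = [
    (1, "case-1", "pat", "2002-03-04 09:15", None),
    (2, "case-1", "lee", "2002-03-04 21:40", 1),
    (3, "case-2", "kim", "2002-03-05 10:05", None),
    (4, "case-1", "pat", "2002-03-06 08:30", 2),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Usage pattern: messages posted per day
per_day = Counter(parse(ts).date() for _, _, _, ts, _ in LOG)

# Average number of responses per case/thread (a response is any message with a parent)
responses = Counter(thread for _, thread, _, _, parent in LOG if parent is not None)
threads = {thread for _, thread, _, _, _ in LOG}
avg_responses = sum(responses.values()) / len(threads)

# Simple data-mining style summary: the peak posting hour
peak_hour = Counter(parse(ts).hour for _, _, _, ts, _ in LOG).most_common(1)[0][0]

print(per_day, avg_responses, peak_hour)
```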
Electronic Conferencing:
Qualitative Analyses
• General: Observation Logs, Reflective interviews, Retrospective Analyses, Focus Groups
• Specific: Semantic Trace Analyses, Talk/Dialogue Categories (Content talk, questioning, peer feedback, social acknowledgments, off task)
• Emergent: Forms of Learning Assistance, Levels of Questioning, Degree of Perspective Taking, Case Quality, Participant Categories
AC3-DL Course Tools
(Orvis, Wisher, Bonk, & Olson)
• Asynchronous:
  - Learning Management System
  - E-mail
• Synchronous: Virtual Tactical Operations Center (VTOC) (7 rooms; 15 people/extension)
  - Avatar
  - Audio conference by extension/room (voice over IP)
  - Text Chat Windows—global and private
  - Special tools for collaboration
Overall frequency of interactions across chat categories (6,601 chats):
On-Task 55%, Social 30%, Mechanics 15%.

(Chart: percentage of chats that were On-Task, Social, and Mechanics across Months 1–2, 3–4, and 5–6.)
Research on Instructors Online
• If teacher-centered, less explore, engage, interact (Peck & Laycock, 1992)
• Informal, exploratory conversation fosters risk-taking & knowledge sharing (Weedman, 1999)
• Four Key Acts of Instructors: pedagogical, managerial, technical, social (Ashton, Roberts, & Teles, 1999)
• Instructors Tend to Rely on Simple Tools (Peffers & Bloom, 1999)
• Job Varies--Plan, Interaction, Admin, Teaching (McIsaac, Blocher, Mahes, & Vrasidas, 1999)
Study of Four Classes
(Bonk, Kirkley, Hara, & Dennen, 2001)
• Technical—Train, early tasks, be flexible, orientation task
• Managerial—Initial meeting, FAQs, detailed syllabus, calendar, post administrivia, assign e-mail pals, gradebooks, email updates
• Pedagogical—Peer feedback, debates, PBL, cases, structured controversy, field reflections, portfolios, teams, inquiry
• Social—Café, humor, interactivity, profiles, foreign guests, digital pics, conversations, guests
Network Conferencing Interactivity
(Rafaeli & Sudweeks, 1997)
1. > 50 percent of messages were reactive.
2. Only around 10 percent were truly interactive.
3. Most messages factual stmts or opinions
4. Many also contained questions or requests.
5. Frequent participators more reactive than low.
6. Interactive messages more opinions & humor.
7. More self-disclosure, involvement, &
belonging.
8. Attracted to fun, open, frank, helpful,
supportive environments.
(Diagrams, Week 4: starter-centered interaction vs. scattered interaction with no starter.)
Collaborative Behaviors
(Curtis & Lawson, 1997)
• Most common were: (1) Planning, (2) Contributing, and (3) Seeking Input.
• Other common events were: (4) Initiating activities, (5) Providing feedback, (6) Sharing knowledge
• Few students challenge others or attempt to explain or elaborate
• Recommend: using debates and modeling appropriate ways to challenge others
Online Collaboration Behaviors by Categories (US and Finland)
Conferences (%):

Behavior Category        Finland    U.S.    Average
Planning                    0.0      0.0       0.0
Contributing               80.8     76.6      78.7
Seeking Input              12.7     21.0      16.8
Reflection/Monitoring       6.1      2.2       4.2
Social Interaction          0.4      0.2       0.3
Total                     100.0    100.0     100.0
Dimensions of Learning Process
(Henri, 1992)
1. Participation (rate, timing, duration of
messages)
2. Interactivity (explicit interaction, implicit
interaction, & independent comment)
3. Social Events (stmts unrelated to content)
4. Cognitive Events (e.g., clarifications,
inferencing, judgment, and strategies)
5. Metacognitive Events (e.g., both metacognitive knowledge—person, task, and strategy—and metacognitive skill—evaluation, planning, regulation, and self-awareness)
Some Findings:
Cognitive Skills Displayed in Online Conferencing
(see Hara, Bonk, & Angeli, 2000)
• Social (in 26.7% of units coded)
  - social cues decreased as semester progressed
  - messages gradually became less formal
  - became more embedded within statements
• Cognitive (in 81.7% of units)
  - More inferences & judgments than elementary clarifications and in-depth clarifications
• Metacognitive (in 56% of units)
  - More reflections on experience & self-awareness
  - Some planning, evaluation, & regulation & self-questioning
(Chart: percent of coded units by cognitive skill: elementary clarification, in-depth clarification, inferencing, application, judgment, strategies.)
Surface vs. Deep Posts
(Henri, 1992)
Surface Processing
• making judgments without justification,
• stating that one shares ideas or opinions already stated,
• repeating what has been said,
• asking irrelevant questions,
• i.e., fragmented, narrow, and somewhat trite.

In-depth Processing
• linked facts and ideas,
• offered new elements of information,
• discussed advantages and disadvantages of a situation,
• made judgments that were supported by examples and/or justification,
• i.e., more integrated, weighty, and refreshing.
Level of Cognitive Processing:
All Posts
(Pie chart: Surface 33%, Deep 55%, Both 12%.)
Critical Thinking
(Newman, Johnson, Webb & Cochrane, 1997)
• Used Garrison’s five-stage critical thinking model
• Critical thinking in both CMC and FTF environments
• Depth of critical thinking higher in CMC environment:
  - More likely to bring in outside information
  - Link ideas and offer interpretations
  - Generate important ideas and solutions
• FTF settings were better for generating new ideas and creatively exploring problems.
Unjustified Statements (US)
24. Author: Katherine
Date: Apr. 27 3:12 AM 1998
I agree with you that technology is definitely taking a large
part in the classroom and will more so in the future…
25. Author: Jason
Date: Apr. 28 1:47 PM 1998
I feel technology will never over take the role of the teacher...I
feel however, this is just help us teachers...
26. Author: Daniel
Date: Apr. 30 0:11 AM 1998
I believe that the role of the teacher is being changed by
computers, but the computer will never totally replace the teacher...
I believe that the computers will eventually make teaching easier for
us and that most of the children's work will be done on computers.
But I believe that there…
Study #3. Fall, 1997
(Chart legend: Unsupported, Social, Justified, Extension.)
Indicators for the Quality of Students’ Dialogue
(Angeli, Valanides, & Bonk, in review)
1. Social acknowledgement/Sharing/Feedback (e.g., “Hello, good to hear from you”; “I agree, good point, great idea”)
2. Unsupported statements (advice) (e.g., “I think you should try this…”; “This is what I would do…”)
3. Questioning for clarification and to extend dialogue (e.g., “Could you give us more info?”; “…explain what you mean by…?”)
4. Critical thinking, Reasoned thinking/judgment (e.g., “I disagree with X, because in class we discussed…”; “I see the following disadvantages to this approach…”)
Social Construction of Knowledge
(Gunawardena, Lowe, & Anderson, 1997)
• Five Stage Model:
  1. Share ideas
  2. Discovery of Idea Inconsistencies
  3. Negotiate Meaning/Areas Agree
  4. Test and Modify
  5. Phrase Agreements
• In global debate, very task driven.
• Dialogue remained at Phase I: sharing info
Social Constructivism and Learning
Communities Online (SCALCO) Scale.
(Bonk & Wisher, 2000)
___ 1. The topics discussed online had real world
relevance.
___ 2. The online environment encouraged me to
question ideas and perspectives.
___ 3. I received useful feedback and mentoring
from others.
___ 4. There was a sense of membership in the
learning here.
___ 5. Instructors provided useful advice and
feedback online.
___ 6. I had some personal control over course
activities and discussion.
Evaluation…
Kirkpatrick’s 4 Levels:
• Reaction
• Learning
• Behavior
• Results
Figure 26. How Respondent Organizations Measure Success of Web-Based Learning According to the Kirkpatrick Model
(Bar chart: percent of respondents measuring learner satisfaction; change in knowledge, skill, attitude; job performance; and ROI.)
My Evaluation Plan…
Considerations in Evaluation Plan:
1. Student
2. Instructor
3. Training
4. Task
5. Tech Tool
6. Course
7. Program
8. University or Organization
1. Measures of Student Success
(Focus groups, interviews, observations, surveys, exams, records)
• Positive Feedback, Recommendations
• Increased Comprehension, Achievement
• High Retention in Program
• Completion Rates or Course Attrition
• Jobs Obtained, Internships
• Enrollment Trends for Next Semester
1. Student Basic Quantitative
• Grades, Achievement
• Number of Posts
• Participation
• Computer Log Activity—peak usage, messages/day, time of task or in system
• Attitude Surveys
1. Student High-End Success
• Message complexity, depth, interactivity, q’ing
• Collaboration skills
• Problem finding/solving and critical thinking
• Challenging and debating others
• Case-based reasoning, critical thinking measures
• Portfolios, performances, PBL activities
2. Instructor Success
• High student evals; more signing up
• High student completion rates
• Utilize Web to share teaching
• Course recognized in tenure decisions
• Varies online feedback and assistance techniques
3. Training: Outside Support
• Training (FacultyTraining.net)
• Courses & Certificates (JIU, e-education)
• Reports, Newsletters, & Pubs
• Aggregators of Info (CourseShare, Merlot)
• Global Forums (FacultyOnline.com; GEN)
• Resources, Guides/Tips, Link Collections, Online Journals, Library Resources
3. Training: Inside Support…
• Instructional Consulting
• Mentoring (strategic planning $)
• Small Pots of Funding
• Facilities
• Summer and Year Round Workshops
• Office of Distributed Learning
• Colloquiums, Tech Showcases, Guest Speakers
• Newsletters, guides, active learning grants, annual reports, faculty development, brown bags
RIDIC5-ULO3US Model of Technology Use
4. Tasks (RIDIC):
• Relevance
• Individualization
• Depth of Discussion
• Interactivity
• Collaboration-Control-Choice-Constructivistic-Community
RIDIC5-ULO3US Model of Technology Use
5. Tech Tools (ULOUS):
• Utility/Usable
• Learner-Centeredness
• Opportunities with Outsiders Online
• Ultra Friendly
• Supportive
6. Course Success
• Few technological glitches/bugs
• Adequate online support
• Increasing enrollment trends
• Course quality (interactivity rating)
• Monies paid
• Accepted by other programs
7. Online Program or Course Budget
(i.e., how pay, how large is course, tech fees charged, # of courses, tuition rate, etc.)
• Indirect Costs: learner disk space, phone, accreditation, integration with existing technology, library resources, on-site orientation & tech training, faculty training, office space
• Direct Costs: courseware, instructor, help desk, books, seat time, bandwidth and data communications, server, server back-up, course developers, postage
8. Institutional Success
• E-Enrollments from new students, alumni, existing students
• Additional grants
• Press, publication, partners, attention
• Orientations, training, support materials
• Faculty attitudes
• Acceptable policies (ADA compliant)
Online Student
Assessment
Assessment Takes Center
Stage in Online Learning
(Dan Carnevale, April 13, 2001, Chronicle of Higher Education)
“One difference between assessment in classrooms and in distance education is that distance-education programs are largely geared toward students who are already in the workforce, which often involves learning by doing.”
Focus of Assessment?
1. Basic Knowledge, Concepts, Ideas
2. Higher-Order Thinking Skills, Problem Solving, Communication, Teamwork
3. Both of Above!!!
4. Other…
Assessments Possible
• Online Portfolios of Work
• Discussion/Forum Participation
• Online Mentoring
• Weekly Reflections
• Tasks Attempted or Completed, Usage, etc.
More Possible Assessments
• Quizzes and Tests
• Peer Feedback and Responsiveness
• Cases and Problems
• Group Work
• Web Resource Explorations & Evaluations
Sample Portfolio Scoring Dimensions
(10 pts each)
(see: http://php.indiana.edu/~cjbonk/p250syla.htm)
1. Richness
2. Coherence
3. Elaboration
4. Relevancy
5. Timeliness
6. Completeness
7. Persuasiveness
8. Originality

1. Insightful
2. Clear/Logical
3. Original
4. Learning
5. Fdback/Responsive
6. Format
7. Thorough
8. Reflective
9. Overall Holistic
E-Peer Evaluation Form
Peer Evaluation. Name:
____________________
Rate on Scale of 1 (low) to 5 (high):
___ 1. Insight: creative, offers
analogies/examples, relationships
drawn, useful ideas and connections,
fosters growth.
___ 2. Helpful/Positive: prompt
feedback, encouraging, informative,
makes suggestions & advice, finds,
shares info.
___ 3. Valuable Team Member:
dependable, links group members, there
Issues to Consider…
1. Bonus pts for participation?
2. Peer evaluation of work?
3. Assess improvement?
4. Is it timed? Allow retakes if lose connection? How many retakes?
5. Give unlimited time to complete?
Issues to Consider…
6. Cheating? Is it really that student?
7. Authenticity?
8. Negotiating tasks and criteria?
9. How measure competency?
10. How do you demonstrate learning online?
Increasing Cheating Online
($7-$30/page; http://www.syllabus.com/, January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?)
• http://www.academictermpapers.com/
• http://www.termpapers-on-file.com/
• http://www.nocheaters.com/
• http://www.cheathouse.com/uk/index.html
• http://www.realpapers.com/
• http://www.pinkmonkey.com/ (“you’ll never buy Cliffnotes again”)
Reducing Cheating Online
• Ask yourself, why are they cheating?
• Do they value the assignment?
• Are tasks relevant and challenging?
• What happens to the task after submitted—reused, woven in, posted?
• Due at end of term? Real audience?
• Look at pedagogy b4 calling plagiarism police!
Reducing Cheating Online
• Proctored exams
• Vary items in exam
• Make course too hard to cheat
• Try Plagiarism.com ($300)
• Use mastery learning for some tasks
• Random selection of items from item pool
• Use test passwords, rely on IP# screening
• Assign collaborative tasks
Reducing Cheating Online
($7-$30/page; http://www.syllabus.com/, January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?)
• http://www.plagiarism.org/ (resource)
• http://www.turnitin.com/ (software, $100, free 30-day demo/trial)
• http://www.canexus.com/ (software; essay verification engine, $19.95)
• http://www.plagiserve.com/ (free database of 70,000 student term papers & cliff notes)
• http://www.academicintegrity.org/ (assoc.)
• http://sja.ucdavis.edu/avoid.htm (guide)
• http://www.georgetown.edu/honor/plagiarism.html
Turnitin Testimonials
"Many of my students believe that if they do not
submit their essays, I will not discover their
plagiarism. I will often type a paragraph or two
of their work in myself if I suspect plagiarism.
Every time, there was a "hit." Many students
were successful plagiarists in high school. A
service like this is needed to teach them that such
practices are no longer acceptable and certainly
not ethical!”
New Zealand Universities Consider Lawsuit Against Sites Selling Diplomas in Their Names
• The Web sites, which already offer fake diplomas in the names of hundreds of colleges in the United States and abroad, recently added New Zealand’s Universities of Auckland, Canterbury, and Otago to their lineup. The degrees sell for up to $250 each.
Feb 11, 2002, David Cohen, Chronicle of Higher Education
Online Testing and
Survey Tools
Test Selection Criteria
(Hezel, 1999)
• Easy to Configure Items and Test
• Handle Symbols
• Scheduling of Feedback (immediate?)
• Easy to Pick Items for Randomizing
• Randomize Answers Within a Question
• Weighting of Answer Options
More Test Selection Criteria
• Recording of Multiple Submissions; control # of submissions
• Timed Tests
• Comprehensive Statistics
• Summarize in Portfolio and/or Gradebook
• Confirmation of Test Submission
More Test Selection Criteria
(Perry & Colon, 2001; see: http://www.indiana.edu/~best/)
• Flexible scoring—score first, last, or average submission (see the sketch below)
• Flexible reporting—by individual or by item and cross tabulations
• Outputs data for further analysis
• Provides item analysis statistics (e.g., Test Item Frequency Distributions)
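Several of these criteria (picking items at random from a pool, randomizing answers within a question, and first/last/average scoring of multiple submissions) amount to a few lines of logic. A minimal sketch in Python; the item pool, field names, and scoring policies are hypothetical illustrations, not taken from any particular testing product:

```python
import random
from statistics import mean

# Hypothetical item pool: each item has a stem, answer options, and the correct option.
ITEM_POOL = [
    {"stem": "2 + 2 = ?", "options": ["3", "4", "5", "22"], "answer": "4"},
    {"stem": "H2O is commonly called?", "options": ["water", "salt", "air"], "answer": "water"},
]

def draw_test(pool, n_items, seed=None):
    """Randomly pick n_items from the pool and shuffle the options within each question."""
    rng = random.Random(seed)
    test = []
    for item in rng.sample(pool, n_items):
        options = item["options"][:]
        rng.shuffle(options)  # randomize answers within a question
        test.append({"stem": item["stem"], "options": options, "answer": item["answer"]})
    return test

def score(test, responses):
    """Percent correct for one submission; responses is a list of chosen options."""
    correct = sum(1 for item, resp in zip(test, responses) if resp == item["answer"])
    return 100.0 * correct / len(test)

def flexible_score(submission_scores, policy="last"):
    """Score the first, last, or average submission, per the 'flexible scoring' criterion."""
    if policy == "first":
        return submission_scores[0]
    if policy == "last":
        return submission_scores[-1]
    return mean(submission_scores)

# Example: two attempts at a two-item test, graded on the average submission.
test = draw_test(ITEM_POOL, 2, seed=42)
attempts = [
    score(test, [test[0]["answer"], "wrong"]),            # 50.0 on the first attempt
    score(test, [test[0]["answer"], test[1]["answer"]]),  # 100.0 on the second attempt
]
print(flexible_score(attempts, policy="average"))  # 75.0
```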
Sample Survey Tools
• Zoomerang (http://www.zoomerang.com)
• IOTA Solutions (http://www.iotasolutions.com)
• QuestionMark (http://www.questionmark.com/home.html)
• SurveyShare (http://SurveyShare.com; from Courseshare.com)
• Survey Solutions from Perseus (http://www.perseusdevelopment.com/fromsurv.htm)
• Infopoll (http://www.infopoll.com)
Web-Based Survey Advantages
• Faster collection of data
• Standardized collection format
• Computer controlled branching and skip sections (see the sketch below)
• Easy to answer clicking
• Wider distribution of respondents
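Computer-controlled branching and skip sections are essentially a lookup from each answer to the next question. A minimal sketch in Python, with a hypothetical three-question survey; the question ids and wording are made up for illustration:

```python
# Each question either branches on the answer or names a single next question (None = end).
SURVEY = {
    "q1": {"text": "Have you taken an online course?", "branch": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "How many online courses have you taken?", "next": "q3"},
    "q3": {"text": "Would you recommend online courses to others?", "next": None},
}

def run_survey(survey, answers, start="q1"):
    """Walk the survey, skipping sections according to each recorded answer."""
    qid, asked = start, []
    while qid is not None:
        question = survey[qid]
        asked.append(question["text"])
        if "branch" in question:
            qid = question["branch"].get(answers.get(qid))  # skip logic driven by the answer
        else:
            qid = question["next"]
    return asked

# A "no" on q1 skips q2 entirely; a "yes" would route through it.
print(run_survey(SURVEY, {"q1": "no", "q3": "yes"}))
```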
Any questions?