Research & Evaluation
Research Agenda for Open Content
Dr Patrick McAndrew, IET
Dr Simon Buckingham Shum, KMi
Open Content
• Background
– Open Content and other open education initiatives
• Sensemaking tools
– What we are trying that is different
• Evaluation
– Aims and initial work
• Questions
Open Content Initiative
• $9.9m two-year programme
– supported by William and Flora Hewlett Foundation
• Strands
– Academic
– Technical: Production
– Technical: Tools
– Research and Evaluation
Open Courseware (MIT)
• Developed from campus-based courses
– “giving their educational materials to the world”
• 1280 MIT courses
• Many users: ~250,000 visitors per month (2004 data)
• Accessed a section at a time
• No feedback mechanism
• What do learners do?
– http://ocw.mit.edu/
Connexions
• Open to all to contribute
• 150 courses / 3,161 modules
• Variable quality
• Limited feedback
– http://cnx.org/
Open Content Repositories to Open Sensemaking Communities
• Going beyond content
• Tools that will give the learners presence
• Bringing research into practice
• Open Content sites as social learning spaces
Giving ‘content’ a ‘social life’
From raw learning resources…
(what we push to the learner)
…to layers of tools for sensemaking
(what the learners construct for themselves)
…creating a web of ideas, open and evolving
…creating a sensemaking community
A missing layer in the Open Content infrastructure
• Sensemaking: community discourse
• Domain ontologies – richer formalisation of consensus: minimise inconsistency, ambiguity, controversy
• Metadata – generally uncontroversial: minimise inconsistency, ambiguity, controversy
• Open content – documents, learning objects, etc…
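The layered model above can be sketched as a simple data structure. This is purely illustrative; every field name below is hypothetical and not drawn from any actual OCI system.

```python
# Illustrative sketch of the three-layer stack described above:
# open content -> metadata -> sensemaking discourse.
# All field names are invented for this example.

content_item = {
    "type": "learning_object",
    "title": "Introduction to Open Content",
    "body": "…course text…",
}

metadata = {
    # Generally uncontroversial descriptors
    "author": "The Open University",
    "language": "en",
    "subject": "education",
}

sensemaking_layer = {
    # Community discourse built on top of the content
    "annotations": [
        {"user": "learner1", "note": "How does this relate to OpenCourseWare?"},
    ],
    "links": [],  # connections out into a wider web of ideas
}

def bundle(item, meta, discourse):
    """Combine the three layers into a single record."""
    return {"content": item, "metadata": meta, "sensemaking": discourse}

record = bundle(content_item, metadata, sensemaking_layer)
print(len(record["sensemaking"]["annotations"]))  # one annotation so far
```

The point of the sketch is that the sensemaking layer is additive: the content and metadata layers stay unchanged while community discourse accumulates on top.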
Tools for sensemaking
• BuddySpace – http://www.buddyspace.org/
• Flashmeeting – http://www.flashmeeting.com/
• Compendium – http://www.compendiuminstitute.org/
Two environments
• Playspace: Moodle + KMi tools (educators and others)
• Showcase: Moodle + tested tools (learners and others)
• OU courses feed both via a depository and repository
Evaluation Aims
• Enhanced knowledge and understanding of open
content delivery, how it can be effective, and the
contribution it can make to the further development of
e-learning.
• Enhanced understanding of sustainable and scalable
models of open content delivery.
Strands
• Educators / Students / Passers-by
• Learner experience / Content / Tools / Provider experience
• OCI / OU / World
Strategy
• Both provider and learner views of Open Content
• Involvement in requirements, planning and prototyping
• Researching options and feeding back
• Large-scale data through online systems
• Targeted, detailed evaluation studies
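As an illustration of gathering large-scale data through online systems, here is a minimal sketch of counting visits per resource from server-style log lines. The log format and resource paths are invented for the example, not taken from any OU system.

```python
# Minimal sketch: aggregate per-resource visit counts from log lines.
# Log format and paths are hypothetical.
from collections import Counter

log_lines = [
    "2006-01-10 GET /openlearn/unit1",
    "2006-01-10 GET /openlearn/unit2",
    "2006-01-11 GET /openlearn/unit1",
]

def visits_per_resource(lines):
    """Return a Counter mapping resource path -> number of visits."""
    counts = Counter()
    for line in lines:
        path = line.split()[-1]  # last field is the requested path
        counts[path] += 1
    return counts

counts = visits_per_resource(log_lines)
print(counts["/openlearn/unit1"])  # 2
```

At scale the same aggregation would run over real access logs or activity records from the online systems, feeding the evaluation strand with usage data.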
Approach
• Build on studying informal and mobile learning
• Activity theory – work out models of how people interact
• Study the situations as well as the technology
• Evaluate to explain, rather than judge
• Find the interesting questions
• Don’t forget the user
• Evaluation into use
Early action
• Create the evaluation team
• Review other initiatives
• Define and share evaluation methodology
• Define and share research issues
• Develop wider team
• …
Early results
• Emergent issues from UNESCO discussion
• PRESTO survey of OU students
UNESCO
• 110 Research questions
• 3 connected main areas
– Open Education Resources (OERs)
– Open Source Software
– Open Access
My selection of the issues
• Appropriate use (and re-use) of OERs.
• Evaluate the role of Web 2.0 technologies.
• What are the barriers for learners to learn?
• Supporting communities of interest.
• Research on pedagogical patterns.
• A set of “guiding principles”.
Edited from http://oerwiki.iiep-unesco.org/index.php?title=OER_research_questions_longlist
Student survey (Alan Woodley, IET)
1436 OU students
Autumn 2005
[Bar chart: percentage of students choosing each response, from “An excellent idea” through “In between/mixed” to “A terrible idea”; vertical axis 0–35%]
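A distribution like the one charted above can be tabulated from raw response counts. The figures below are invented for illustration only; they are not the actual results from the 1436-student survey.

```python
# Convert raw survey counts into the percentages a bar chart would show.
# These counts are invented for illustration, not the real survey data.
responses = {
    "An excellent idea": 30,
    "A very good idea": 25,
    "Quite a good idea": 20,
    "In between/mixed": 15,
    "Quite a bad idea": 5,
    "A very bad idea": 3,
    "A terrible idea": 2,
}

def percentages(counts):
    """Convert raw counts to percentages of the total responses."""
    total = sum(counts.values())
    return {label: 100.0 * n / total for label, n in counts.items()}

pct = percentages(responses)
print(round(pct["An excellent idea"]))  # 30
```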
OU students: Mixed messages
• 64% need tutorials
• 89% want qualification
• 90% need assessment
• 75% would look ahead
• 37% costs not justified
• 7% would leave OU
• 33% free education not valued
• 57% hard to work online
• 70% think people would be interested in registering for the OU once they could see the quality of material
Outline Research Agenda
• How can we produce “good” open content?
– Study what we do and how it is used
• What does it mean to learn from open content?
– Ask the users, gather good stories, look for change
• How can the environment support learners?
– Go beyond content, sensemaking tools in action
• What will be the impact of open content?
– Monitor impact on OU, work with the world
Where we are now
• Setting targets
– What will count as success?
• Opportunities for experiments
– Is there a role for Mobile learning?
– Can we carry out mass “Grid” data gathering?
– What will users reveal about themselves?
• How to measure?
– What happens to content in the wild?
IET
The Open University
Walton Hall
Milton Keynes
MK7 6AA
www.open.ac.uk