
Standards, quality assurance, best practice and benchmarking in e-learning
Professor Paul Bacsich
Matic Media Ltd, and
Middlesex University, UK
1
The Menu
• Standards (technical)
• Quality Assurance
– Standards (content)
– Standards (pedagogy and process)
• Best Practice
– Excellence?
• Benchmarking
• Conclusions
2
Standards (technical)
• UK follows mainly IMS
• Agency called CETIS set up by JISC to
advise universities and colleges on IMS
• A few mega universities (OU, Ufi, etc) are
direct members of IMS
• IMS Learning Design gaining influence
• Also e-portfolios
3
Standards (content)
• Quality Assurance Agency has set up
“subject benchmarks”
• More about generalised competences than
detailed syllabi
• See www.qaa.ac.uk/academicinfrastructure/benchmark/
4
Standards (pedagogy and process)
• Quality Assurance Agency (QAA)
• “Code of practice for the assurance of academic
quality and standards in higher education”
• See www.qaa.ac.uk/academicinfrastructure/codeOfPractice/
• Not much on pedagogy – this is left to the
discretion of the professor
5
QAA in e-learning
• Little has been done specifically on e-learning – but see…
• “Collaborative provision and flexible and
distributed learning (including e-learning)”
• Recent – September 2004
• Some feel it says too little, others do not
want to be restricted
6
Digression on Pedagogy
• Higher Education Academy
• “works with universities and colleges, discipline
groups, individual staff and organisations to help
them deliver the best possible learning
experience for all students”
• Runs Subject Centres for each subject
• Beginning to advise on e-learning
7
Best practice in e-learning
• Not much studied in the UK yet
• OU a major source of advice
• UKeU set up to crystallise best practice into an
operational business
• It failed – but its legacy may help
– Committee for Academic Quality
• US much more active – see e.g. “Quality on the
Line” (IHEP, 2000)
8
In the UK, universities compete
- and now in e-learning
• Universities want to judge how well they are
doing in e-learning
• And funding agencies also want to know
• But universities don’t want to tell anyone – not
the public, not the funding agencies – if they are
doing badly!
• And universities (like people) are not good at
judging themselves.
9
Benchmarking
• Like Activity Based Costing, it has been
around for many years
• Unlike ABC – but like BPR, quality,
excellence, etc. – no one is now sure what it
means…
10
Back to Basics (Xerox)
a process of self-evaluation and self-improvement
through the systematic and collaborative
comparison of practice [process]
and performance [metrics, KPIs]
with competitors [or comparators]
in order to identify own strengths and weaknesses,
and learn how to adapt and improve
as conditions change.
11
Benchmarking Dichotomies
• Implicit vs Explicit
• Independent vs Collaborative [clubs]
• Internal vs External
• Vertical vs Horizontal
• Inputs or Processes vs Outputs
• Metric vs Qualitative
(After Jackson)
12
Focus of my work
• Focussed purely on e-learning
• But not tied to any particular style (e.g. DL)
• Oriented to institutions past the “a few projects”
stage
• Suitable for desk research as well as invasive
studies
• Suitable for single- and multi-institution studies
13
Benchmarking (in Universities)
• There are several reports that will tell you
how to do benchmarking in general
– Higher Education Academy (UK)
– Learning and Skills Development Agency
(UK)
– Department of Education Training and Youth
Affairs (Australia)
14
Benchmarking (in Universities)
• And some agencies can help:
– European Benchmarking Programme on
University Management (ESMU, Brussels)
– English Universities Benchmarking Club
15
Benchmarking in e-Learning
• There are very few reports
– National Learning Network (UK) – not for
universities, but for colleges
– E-Learning Maturity Model (NZ) – brand new!
16
Quality/Best Practice in e-Learning
• There are a few reports (US):
– APQC/SHEEO Study 1998 (US)
– IHEP “Quality on the Line” 2000 (US)
• And several projects (EU):
– BENVIC
– SEEQUEL
– Swiss Virtual Campus @ Lugano: MINE
17
Excellence (?) in e-Learning
• New project:
E-xcellence (EADTU and others)
• Outside e-learning, several projects:
– Consortium for Excellence in Higher
Education (UK)
18
Benchmarking e-learning
A “synthesis”
19
Processes or Outputs?
• Outputs first (can be done by desk
research)
• Processes later (best done in clubs or
invasive studies)
• Inputs are not of interest to students, but of
course of interest to funders
20
Metrics or Bureaucratic
• Use a 6-point scale
– 5 from Likert plus 1 more for “excellence”
• Backed up by metrics where possible
• Also contextualised by narrative – see the
sketch below
• Remember the problems of judging “best
practice”; judging “better practice” is easier
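A minimal sketch of how one such score might be held, assuming a simple Python record (the class name, fields and validation are illustrative only, not part of any published scheme):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BenchmarkScore:
    """One judgement on one criterion: a level on the 6-point scale
    (1-5 Likert-style, 6 reserved for excellence), backed where
    possible by a metric and contextualised by narrative."""
    criterion: str
    level: int
    metric: Optional[float] = None   # supporting number, if one exists
    narrative: str = ""              # context for the judgement

    def __post_init__(self) -> None:
        if not 1 <= self.level <= 6:
            raise ValueError("level must be on the 6-point scale, 1-6")

# e.g. a "better practice" judgement backed by a metric
score = BenchmarkScore(
    criterion="Training",
    level=4,
    metric=0.85,   # say, the fraction of staff through VLE training
    narrative="University-wide programme, monitored and incentivised",
)
```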
21
Other Decisions
• Explicit (otherwise you are not trying)
• Independent or collaborative
• Internal or external
• Horizontal: focus on processes across the
whole institution; do not be seduced into
individual projects
22
How Many Benchmarks?
• It is like ABC: how many activities?
• Answer: Not 5, not 500.
• Better answer: Well under 100.
– Combine some criteria together
– Remove any not specific to e-learning
– Be careful about any which are not provably
critical success factors (see the sketch below).
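A hedged sketch of that pruning logic, assuming hypothetical flags on each candidate criterion (none of these names come from the methodology itself):

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A candidate benchmark criterion (illustrative fields only)."""
    name: str
    elearning_specific: bool   # remove any not specific to e-learning
    proven_csf: bool           # be careful about unproven ones

def composite(name: str, parts: list[Candidate]) -> Candidate:
    """Fold several overlapping criteria into one composite criterion."""
    return Candidate(
        name=name,
        elearning_specific=any(p.elearning_specific for p in parts),
        proven_csf=all(p.proven_csf for p in parts),
    )

def prune(candidates: list[Candidate]) -> list[Candidate]:
    """Aim for well under 100: keep only e-learning-specific criteria
    and flag any whose status as a critical success factor is unproven."""
    kept = [c for c in candidates if c.elearning_specific]
    for c in kept:
        if not c.proven_csf:
            print(f"Check: '{c.name}' is not a proven critical success factor")
    return kept
```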
23
How Many do Others Have?
• LSDA (UK) has 14
• IHEP (US) has 24
• APQC/SHEEO (US) had 14
• (Breaking news) EMM (NZ) has 43
24
Pick and Mix System
• 25 criteria (liable to grow to around 30)
• 6 levels, backed up by qualitative and numeric
information
• Student-oriented
• Focussed on critical success factors
• Requires no long training course to understand,
if you know about e-learning
• Methodology-agnostic
25
“Adoption phase” (Rogers)
1. Innovators only
2. Early adopters taking it up
3. Early adopters adopted; early majority taking it up
4. Early majority adopted; late majority taking it up
5. All taken up except laggards, who are now taking
it up (or retiring or leaving)
6. First wave embedded, second wave under way
(e.g. m-learning after e-learning)
26
“Training”
1. No systematic training for e-learning
2. Some systematic training, e.g. in some projects and
departments
3. U-wide training programme but little monitoring of
attendance or encouragement to go
4. U-wide training programme, monitored and incentivised
5. All staff trained in VLE use, training appropriate to job
type – and retrained when needed
6. Staff increasingly keep themselves up to date in a “just
in time, just for me” fashion except in situations of
discontinuous change
(This criterion is modelled in the sketch below.)
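As a sketch only – the data structure is my own for illustration, with the level texts abridged from the slide above – a Pick and Mix criterion and its six level descriptors could be held like this:

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    """A Pick and Mix criterion with a descriptor for each of the 6 levels."""
    name: str
    levels: dict[int, str]

    def describe(self, level: int) -> str:
        return self.levels[level]

# The "Training" criterion, with level texts abridged from above
TRAINING = Criterion(
    name="Training",
    levels={
        1: "No systematic training for e-learning",
        2: "Some systematic training, e.g. in some projects and departments",
        3: "U-wide training programme but little monitoring of attendance",
        4: "U-wide training programme, monitored and incentivised",
        5: "All staff trained in VLE use, appropriate to job type",
        6: "Staff keep themselves up to date, 'just in time, just for me'",
    },
)

print(TRAINING.describe(4))  # -> "U-wide training programme, monitored and incentivised"
```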
27
What’s next?
28
Next Steps
• Correlate with “quality” and “excellence” projects in EU
• Publish a review report on the UK Committee for Academic
Quality (in e-Learning) in August
• Review underpinning methodologies (CMM etc)
• Literature search outside Europe, US and Commonwealth
• Series of workshops
– at ALT-C 2005 Manchester September
– at ACODE Australia November
– at Online Educa Berlin December
29
Thank you for listening
Any questions?
Professor Paul Bacsich
Global Campus, Middlesex University
[email protected]
www.cs.mdx.ac.uk/staff/profiles/p_bacsich.html
30