The Best of Both Worlds: Combining Open Educational Resources with Credit by Examination
November 7, 2012
Jeff Davidson and Mika Hoffman
Types of OER
 Lecture notes
 Videos of classroom lectures
 Lessons designed for OER
 Courses
 Discussion groups
Examples: OpenStudy, Khan Academy, Open University
(Physics lecture video)
Who Creates OER?
Professors from accredited colleges create the vast majority of OER for higher education:
 MIT OpenCourseWare: 2,000+ resources shared
 Washington State Open Course Library: 42 complete, full-length courses
 Rice University Connexions: 20,000+ learning modules
 The Saylor Foundation
(Saylor video)
Sample Uses for OER
 Material for courses (either as a blueprint for course structure or as individual readings, lectures, exercises, etc.)
 Courses for students pursuing independent study
 Additional/remedial skill development
OER in Action
 Dr. Joanna Smithback - Terra State C.C. Ohio
 Dr. Concepcion Saenz-Cambra – NYU
 Dr. Jason Gainous – University of Louisville
 Dr. Leslie Wallace – U. of Pittsburgh
 Dr. Amy Thompson - Hopkinsville C.C.
 Dr. Benjamin Schwantes – Widener University
Is OER Sustainable?
 Private foundations such as Gates, Hewlett, Lumina
and Saylor are investing significant dollars in OER
 Governments are starting to invest in OER as well
 Imagine if every U.S. public college built just one open course: 4,400+ courses
 Low-cost assessments could recoup costs and pay
instructor fees
What is OER’s value?
 Learning
 Certificates
But…
 OER by itself does not typically
award formal educational credit
Why not?
Academic credit
 What is credit?
 Assurance that someone knows
something
 The something must be appropriate for
the particular academic program
 To provide that assurance, both the
someone and the something must be
verified
OER and assessment
“What you know is more important than
where or how you learned it.”
Credit should be based on knowledge, not
attendance
Focus here is on decoupled assessment
Validity
 Interpretation and use of results/credit is
supported by (good) arguments
 Part of making the argument is identifying
threats to validity and countering the threats
Aspects of validity for OER assessments
 Identity verification
 Assessment quality
 Appropriateness of knowledge tested
for a particular degree program
 Scalability
Identity verification
(Photos: Ryan Ruppe, Jeffery Turner, Steve Winton)
Threats to validity: courses
 Did the person actually go through the
course?
 Did the person do his/her own work?
 Is the person who took the course the
same person who is presenting the
credential?
Threats to validity: assessments
 Is the person taking the assessment the
same person who is claiming the
knowledge?
 Is the person claiming the knowledge the
same person who is presenting the
credential?
Assessment quality
A good assessment
 Measures knowledge of the subject
 Does not measure irrelevant characteristics
 Gives a person the same score regardless of
which form is taken
 Gives people of the same ability the same
score
Threats to assessment quality
 Assessments may not measure quite the
same content as the OER addressed
 Not all things in the world labeled “Sociology”
cover the same topics!
 Assessments may not cover material in
enough depth
 A 10-question quiz is unlikely to cover the
content of OER covering the equivalent of a
college semester
Threats to assessment quality
 The assessment might not be scored
consistently enough
 Different instructors have different standards
 Assessments might measure irrelevant
characteristics
 Unnecessarily complicated questions
 Questions with “giveaway” answers
 Trick questions
Appropriateness of content
 Match of assessment to OER
 How close are the assessment specifications
to the learning objectives of the OER
material?
 Match of assessment to credit-granting
body
 How close are the assessment
specifications/learning objectives to what is
taught at the institution where credit is
sought?
Scalability – Generalizability (from less to more generalizable)
 Course final exams and homework
 Third-party assessments designed for a specific course
 Competency/proficiency assessments
Scalability – Large groups (from less to more scalable)
 Individual assessments
  Portfolios
  Research papers
  Oral examinations
 Human-scored group assessments
  Short-answer questions
  Essay questions
 Machine-scored assessments
  Multiple-choice exams
  Machine-scored constructed-response exams
Efficient assessment
 Machine-scored competency/proficiency
exams can handle large numbers of
examinees and be used for multiple OER
sources
 Machine-scored exams need not measure
simply mindless regurgitation of facts
 For some types of OER, more specific
assessments may be needed
 Portfolios for highly specialized content
Excelsior College Exam-Based Degree Paths
 The Supported Independent Study model is based on open courseware options that are reviewed and recommended for all exams
 Initial degree templates: BS and BA in Liberal Studies; AS and BS in General Business
Sample ASB Degree Template

(ETC ASB example)
Transportability
 Even with a valid assessment, the credit
decision is up to the institution where the
student wants credit
 Not many mechanisms are in place to
help!
 Many institutions are not interested in mobility
Infrastructure Needs
 Common definitions of assessment-based
learning among the Regional Accreditors
 Access to Federal Financial Aid
 National Database of existing credential
and training assessments
A model for low-cost education
OER University
 Daysha Geleta: http://www.youtube.com/watch?v=LaXTiQaPL3c&feature=youtu.be
 Chelsea Mansfield: http://www.youtube.com/watch?v=srainl1c_vU&feature=youtu.be