Transcript PPT Slides

Engaging learning outcomes across a discipline and in institutions
Brian Frank
Queen’s University
Learning outcomes are not new…
e.g. Ontario's college sector, professional programs in Canada, accreditation requirements in the US
…but closing the loop is: using evidence from learning outcomes to improve student learning and inform curriculum.
Survey: only 6% of 146 profiles of good practice submitted contained evidence that student learning had improved (Banta & Blaich, 2011).

Baker, G. R., Jankowski, N. A., Provezis, S., & Kinzie, J. (2012). Using Assessment Results: Promising Practices of Institutions That Do It Well. Retrieved from http://www.tamiu.edu/adminis/iep/documents/NILOA-Promising-Practices-Report-July-2012.pdf
[Bar chart: effect sizes (performance gain in σ, axis 0–1.4) from 800 meta-analyses covering 50,000+ studies and 200+ million students. Interventions plotted include: computer-assisted instruction, time on task, teaching quality, problem-solving teaching, professional development, self-questioning, creativity programs, metacognitive strategies, spaced vs. massed practice, feedback, reciprocal teaching, explicit objectives and assessment, formative evaluation to the instructor, and student self-assessment.]

Hattie, J. (2009). The Black Box of Tertiary Assessment: An Impending Revolution. In Tertiary Assessment & Higher Education Student Outcomes: Policy, Practice & Research (pp. 259–275). Wellington, New Zealand: Ako Aotearoa.
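The "performance gain in σ" on the chart's axis is an effect size in standard-deviation units (Cohen's d). A minimal sketch of the underlying calculation, using invented exam scores rather than data from any of the cited studies:

```python
import math

def cohens_d(treatment, control):
    """Effect size in pooled-standard-deviation units (Cohen's d)."""
    mt = sum(treatment) / len(treatment)
    mc = sum(control) / len(control)

    def sample_var(xs, m):
        # unbiased sample variance (n - 1 denominator)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # pooled standard deviation across both groups
    sp = math.sqrt(((len(treatment) - 1) * sample_var(treatment, mt) +
                    (len(control) - 1) * sample_var(control, mc)) /
                   (len(treatment) + len(control) - 2))
    return (mt - mc) / sp

# Hypothetical scores before/after an intervention (illustration only)
control = [60, 65, 70, 55, 62]
treatment = [70, 75, 72, 68, 80]
print(round(cohens_d(treatment, control), 2))
```

An effect of 0.4σ or more is the threshold Hattie treats as educationally meaningful, which is why the chart's interventions cluster in the 0.4–1.4 range.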
Role of learning outcomes in delivery

Levels: course, program, faculty, institution

Alignment triangle: learning outcomes ↔ assessment ↔ learning & teaching activities

• Curriculum and assessment planning
• University learning space planning
• University-wide student services and academic support planning
• Potentially: competency-based credentials

Scope: disciplinary, national; institutional, provincial
Canadian Engineering Accreditation Board:
3.1: Demonstrate that graduates of a program possess 12 graduate attributes.
3.2: Continual program improvement processes using results of graduate attribute assessment.
Engineering Graduate Attribute Development (EGAD) Project
WHO: engineering educators and educational developers across Canada
MANDATE: collect and develop resources and training; run annual national workshops and customized institutional workshops
EGAD Workshops
1. Introduction to Continuous Program Improvement Processes
2. Graduate Attribute Assessment as a Course Instructor
3. Creating Useful Learning Outcomes
4. What to Look for in an Outcomes-Based Process
5. Leading a Program Improvement Process
Example process (continuous improvement cycle):
1. Program objectives and outcomes
2. Mapping curriculum and assessment planning
3. Collecting evidence
4. Analyze and interpret
5. Curriculum & process improvement (feeding back into 1)
Driving question at the centre of the cycle: what do you want to know about the program?
Developing or adapting outcomes — tool: learning outcomes collection
[Table: outcomes (knowledge, critical thinking, writing, interpersonal) by credential level (diploma, bachelor, masters)]

Aligning outcomes and curriculum — tool: curriculum map
[Table: program outcomes (Outcome 1–3) by course (Course 1–3); cells mark where each outcome is developed, assessed, or mastered, e.g. Develop, Assess, Master/assess, Develop/assess]

Aligning outcomes within a course — tool: course planning table
PHYS101 course outcomes: Students will:
1. Describe motion of…
2. Predict the behaviour…
[Table: course outcomes (Outcome 1–3) by week (Week 1–3), linking each week's teaching activity to the outcomes it addresses]

Scoring performance — tool: rubrics
[Rubric with performance levels Marginal, Meets, Exceeds for each outcome]
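A curriculum map like the one above can be checked mechanically once it is written down as data. A minimal sketch (the course and outcome names are hypothetical, not from any real program) that flags program outcomes never assessed in any course:

```python
# Curriculum map as a dict: course -> {outcome: role in that course}
curriculum_map = {
    "Course 1": {"Outcome 1": "Develop"},
    "Course 2": {"Outcome 1": "Assess", "Outcome 2": "Develop"},
    "Course 3": {"Outcome 2": "Master/assess", "Outcome 3": "Develop"},
}

program_outcomes = {"Outcome 1", "Outcome 2", "Outcome 3"}

# An outcome counts as assessed if any course's role mentions "assess"
assessed = {outcome
            for roles in curriculum_map.values()
            for outcome, role in roles.items()
            if "assess" in role.lower()}

# Outcomes that are developed somewhere but never assessed anywhere
gaps = program_outcomes - assessed
print(sorted(gaps))
```

Gap checks like this are one reason the mapping step precedes evidence collection in the process above: an outcome with no assessment point produces no evidence at all.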
Software tools
HEQCO Learning Outcomes Consortium
Issue: no one has effectively closed the loop in Ontario.
Consortium goal: develop useful learning outcomes assessment techniques and drive their wide-scale implementation in consortium institutions.
Focus on generic learning outcomes and cognitive skills (critical thinking, communications, lifelong learning, etc.)
Learning outcomes consortium
(This slide repeats the four tools shown earlier: learning outcomes collection, curriculum map, course planning table, and rubrics.)
College sector
Durham: is the Student Success ePortfolio effective for assessing Essential Employability Skills (EES)?
George Brown: tools & rubrics to assess EES (communication and problem solving)
Humber: reliable instrument for reading, writing, critical thinking, and problem solving across the curriculum
University sector
Guelph: process and tools for mapping & assessment of university-wide learning outcomes using VALUE rubrics
Toronto: analytic rubrics for communications, application of knowledge, and teamwork in engineering
Queen's: mixed-methods assessment of generic learning outcomes across four fields
Approaches to direct assessment of learning outcomes across a program:
① Course-specific criterion-referenced scoring using course deliverables
② Stand-alone standardized instruments
③ General criterion-referenced scoring using course deliverables
Approaches to direct assessment of learning outcomes across a program:
① Course-specific criterion-referenced scoring using course deliverables
• Provides clear guidance to students
• Useful for course improvement
• Limited ability to assess development over multiple years
② Stand-alone standardized instruments
③ General criterion-referenced scoring using course deliverables
Example: levelled outcomes for each year
Theme: communications. Sub-themes (rows): process, written, oral, graphical. Levels (columns): first year, second year, third year, graduating year.

Process:
• Describes typical expectations of engineers to communicate effectively.
• Generates a traceable and defensible record of a technical project using an appropriate project records system.
• Writes and revises documents using appropriate discipline-specific conventions.

Written:
• First year: summarizes and paraphrases written work accurately with appropriate citations.
• Second year: composes documents in styles including progress reports, professional career documents (cover letters, CV, RFP), and design reports.
• Third year: demonstrates conciseness, precision, and clarity of language in technical writing.
• Graduating year: writes concise, coherent, and grammatically correct materials that reflect critical analysis and synthesis, appropriate to audience needs.

Oral:
• First year: delivers a clear and organized formal presentation following established guidelines.
• Second year: delivers effective formal oral presentations including appropriate facial gestures, natural body posture, and movement.
• Third year: demonstrates formal oral presentations with appropriate language, style, timing, and flow.
• Graduating year: demonstrates confidence in formal and informal oral communications.

Graphical:
• Creates effective figures, tables, and drawings employing standard conventions to complement text.
• Creates accurate and complete technical graphics.
• Uses graphics to explain, interpret, and assess information.
E.g. course-specific outcomes assessed using course deliverables

Rubric levels: 7–8 outstanding; 5–6 meets expectation; 3–4 marginal; 0–2 below expectation. Each criterion is graded /8:
• Info. summary — "Summarizes and assesses credibility..."
• Self-reflection — "Analysis identifies limitations..."
• Arguments — "Claims supported by data…"
• Written comm. — "Clearly formatted with…"
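Criterion-referenced scoring of this kind reduces to mapping each /8 score into a performance band. A minimal sketch using the band edges from the rubric above (the student scores are invented for illustration):

```python
# Band edges follow the rubric shown: 0-2 below expectation, 3-4 marginal,
# 5-6 meets expectation, 7-8 outstanding.
BANDS = [
    (0, 2, "below expectation"),
    (3, 4, "marginal"),
    (5, 6, "meets expectation"),
    (7, 8, "outstanding"),
]

def level(score: int) -> str:
    """Map a /8 criterion score to its performance band."""
    for lo, hi, name in BANDS:
        if lo <= score <= hi:
            return name
    raise ValueError(f"score out of range: {score}")

# Hypothetical scores on the four criteria for one student
scores = {"Info. summary": 6, "Self-reflection": 4,
          "Arguments": 7, "Written comm.": 2}
report = {criterion: level(s) for criterion, s in scores.items()}
print(report)
```

Reporting band counts rather than a single averaged grade is what makes the per-outcome performance charts on the following slides possible.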
Performance by outcome within a course

[Bar chart: percentage of students (0–60%) at each performance level (1 – not demonstrated, 2 – marginal, 3 – meets expectations, 4 – outstanding) for four indicators: FEAS 3.12-FY1, -FY2, -FY5, -FY6.]

3.12-FY1: Uses information effectively, ethically, and legally to accomplish a specific purpose, including clear attribution of information sources.
3.12-FY2: Identifies a specific learning need or knowledge gap.
3.12-FY5: Identifies appropriate technical literature and other information sources to meet a need.
3.12-FY6: Critically evaluates the procured information for authority, currency, and objectivity.
Approaches to direct assessment of learning outcomes across a program:
① Course-specific criterion-referenced scoring using course deliverables
② Stand-alone standardized instruments (CLA, etc.)
– Measure development over multiple years and allow institutional comparison; validity & reliability data available
– Can be expensive; measure a limited set of skills
– Low completion rates and poor motivation, particularly among fourth-year students, make results suspect
③ General criterion-referenced scoring using course deliverables
Approaches to direct assessment of learning outcomes across a program:
① Course-specific criterion-referenced scoring using course deliverables
② Stand-alone standardized instruments
③ General criterion-referenced scoring using course deliverables
• Can assess development over multiple years
• No additional student work, so no problem with motivation or completion rates
• Encourages alignment between program and course outcomes and course delivery
• Requires some additional grading time
• Limited availability of validated rubrics
Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics
• Meta-rubrics that synthesize the common criteria and performance levels gleaned from numerous individual campus rubrics for 14 Essential Learning Outcomes
• Can be used to mimic the approach taken by some critical thinking tests that allow programs to provide their own "artifact", scored against a common set of criteria
Rhodes, T. (Ed.). (2010). Assessing Outcomes and Improving Achievement: Tips and Tools for Using Rubrics. Washington, DC: Association of American Colleges and Universities.
Assessing development using VALUE rubrics

Greenhoot, A., & Benstein, D. Using VALUE Rubrics to Evaluate Collaborative Course Design. Peer Review, 13(4), AAC&U.
Queen's approach: piloting general outcomes in 4 fields
Physical science, social science, engineering, humanities
Outcomes assessment plan over three years

Outcome           | 1. Course-specific scoring | 2. Standardized tool (limited cohort) | 3. General scoring (VALUE) | 4. Think-aloud
Critical thinking | If available               | CLA or CAT                            | Critical thinking          | Local
Problem solving   | If available               | CLA or CAT                            | Problem solving            | Local
Written comm.     | If available               | CLA or CAT                            | Written comm.              |
Lifelong learning | If available               | LASSI, MSLQ                           | Info lit/lifelong learning | SRLIS
Engaging learning outcomes across a discipline and in institutions

"If you don't know where you're going, you'll probably end up somewhere else."

Brian Frank
Queen's University

OTHER SLIDES (USED ONLY IF THERE ARE QUESTIONS)
CEAB requirements include:
a) Identified learning outcomes that describe specific abilities expected of students
b) A mapping of where attributes are developed and assessed within the program
c) A description of assessment tools used to measure student performance (reports, exams, oral presentations, …)
d) An evaluation of measured student performance relative to program expectations
e) A description of the program improvements resulting from the process
Performance by student in a course

[Histogram: number of students (0–400) vs. number of indicators on which they fell below target / below threshold; bins 0, 1, 2, 3, 4, 5, 6–10, 11–15, 16–20, 21–25, 26–30, 31–35, 36–40, 41–50. Counts are largest at low indicator counts (e.g. 344, 228, 187 students) and tail off to near zero beyond 20 indicators.]
4 approaches to facilitating change
Axes: individual ↔ environmental; prescribed ↔ emergent.
• Disseminating (prescribed, individual) — good for knowledge, poor for long-term change
• Enacting policy (prescribed, environmental)
• Supporting individual innovators (emergent, individual)
• Developing shared vision (emergent, environmental)

Effective strategies are aligned with or seek to change beliefs, use long-term interventions, understand the university as a complex system, and are honest about issues and problems.

Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984. doi:10.1002/tea.20439
Adopting change
Software packages evaluated
• Canvas
• Desire2Learn
• eLumen
• LiveText
• Moodle
• Waypoint Outcomes
• (No response from Blackboard)

Want to merge into one tool!
Assessment Analytics
Software summary
• Desire2Learn is the closest to a complete package for managing courses, learning outcomes, rubrics, and reporting; its analytics tool is in early stages
• eLumen is outstanding at analysis, but integrates poorly with a general LMS
• Waypoint Outcomes/LiveText are outstanding at managing outcomes, rubrics, and feedback
Norm-referenced evaluation
Grades: "Student: you are here! (67%)"
Used for large-scale evaluation to compare students against each other.

Criterion-referenced evaluation
"Student has marginally met expectations because the submitted work mentions social, environmental, and legal factors in the design process, but there is no clear evidence that these factors impacted decision making."
Used to evaluate students against stated criteria.
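The distinction can be sketched in code: norm-referencing asks where a grade sits relative to the cohort, while criterion-referencing checks the submitted evidence against stated criteria. A minimal illustration, with hypothetical cohort grades and criteria:

```python
from bisect import bisect_left

cohort = sorted([55, 60, 62, 67, 70, 74, 81, 88])  # made-up class grades

def percentile_rank(grade, cohort):
    """Norm-referenced: percentage of the cohort scoring below this grade."""
    return 100 * bisect_left(cohort, grade) / len(cohort)

def meets_criteria(evidence):
    """Criterion-referenced: judged against stated criteria, not peers."""
    required = {"mentions factors", "factors impact decisions"}
    return required <= evidence  # all required evidence must be present

print(percentile_rank(67, cohort))          # position relative to peers
print(meets_criteria({"mentions factors"})) # marginal: evidence incomplete
```

Note that the percentile says nothing about whether the student can actually do the work, and the criterion check says nothing about their standing in the class; the two answers are independent, which is the point of the slide.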