Module 6: Assessment
IITE Professional Development Course
Lucknow University (6/4/2010)
Professor Tim Keirn
[email protected]
A Review: Standards-Based Approaches and Learning Outcomes
• Programme learning outcomes
• Course learning outcomes
• Programme curricular map with sequenced papers for two certifications
  • Physical Science: Teacher Ed, Chemistry, Mathematics and Physics
  • Biology/Life Science: Teacher Ed, Chemistry, Botany and Zoology
Review (Cont.)
• Learning outcomes for each paper
• Design an example of a lesson within a paper that:
  • Is inquiry-based
  • Is aligned to a paper-specific learning outcome
  • Engages students with materials from the web
Review (Cont.)
• Design an assessment that is aligned to the inquiry-based lesson and the specified paper learning outcome
• Design a rubric for the aforementioned specific assessment
• Publish materials to the portal on the web
General Introduction to Assessment
• Do students learn what faculty believe they are teaching? How do you know? On what evidence do you substantiate your claims?
• As the employer of a candidate with an upper-second B.Sc. from Lucknow in, e.g., Botany: what do I know ‘they know’ and what do I know ‘they can do’?
  • What more do they know and what more can they do than someone with a ‘lower second’, or compared to someone with a ‘first’?
Introduction to Assessment
• Can I assume that someone who did the same paper with Vivek ‘knows and can do’ the same as a student of Nalini?
  • If so, how can you substantiate these claims?
• Think-Pair-Share Strategy: Identify and discuss the origins of three weaknesses in the current means by which students are assessed at Lucknow University
  • This may not be an exhaustive list!
Three Weaknesses of Current Assessment

Weakness 1: Exams test factual knowledge, and students are asked to reproduce that knowledge.
Impact: The exams are the same each year; students respond without a deeper understanding of the concepts; skills are not trained or developed.

Weakness 2: Evaluation of the exam is affected by the reader’s mood and by the quality of other papers.
Impact: Unreliable evaluation.

Weakness 3: Evaluation is not continuous and comprehensive, reliable and valid.
Impact: Students and employers cannot have reliable confidence in what a student can actually do.
Traditional Assessment
• Traditional assessment is inseparable from traditional modes of teaching and learning:
  • The Ph.D. provides discretion as to what is taught
  • Stand and deliver
  • Design assessment to measure knowledge retention
  • Assign marks based on the ‘volume’ of knowledge retained
  • The Ph.D. provides discretionary authority to assess the ‘volume’ itself
Problems with Traditional Assessment
• Serves to discriminate between students as opposed to demonstrating competencies
• Almost always measures the reproduction of factual knowledge
• Little if any variance in either the method of assessment or the modality of learning
• Assessment is never deployed as a learning tool
• The secret handshake
• Blame the learner, not the teacher
Problems with Traditional Assessment (Cont.)
• Assessment is infrequent and heavily weighted (high stakes)
• Summative over formative assessment
• Limited measurement of teaching efficacy:
  • Did the instructor get the content ‘across’?
  • Did the students read and ‘remember’ the book?
Alternative Forms of Assessment
• Standards-, disciplinary- and inquiry-based approaches to teaching and learning require a different approach to assessment
• Seek to measure:
  • Thinking and skill over factual retention
  • Production and application of knowledge over reproduction of knowledge
  • What is learned (aligned to the SLO) over what is taught
Alternative Assessment (Cont.)
• Standards-based assessments:
  • Are designed to measure task competence and degrees of proficiency rather than to rank and discriminate between students
  • Are done in multiple forms to measure multiple modalities of learning
  • Are learning tools in support of instruction and are transparent to students
  • Are ongoing and used to support reflection and improvement in teaching practice
Alternative Assessment Practicum
• In disciplinary groups, design a draft of both a formative and a summative assessment aligned to a specific student learning outcome from a paper in the programme
  • Specify the SLO
  • Discuss what dimensions of a task are specifically measured in your standards-based assessments
(Slide worksheet with columns: SLO | Demonstrate)
Different Forms of Assessment and Methodologies
• Formative assessments:
  • Aligned to the learning outcome and to the summative assessment
  • Should provide appropriate feedback to the student in preparation for the summative assessment
  • Provide appropriate feedback to the instructor about the efficacy of the pedagogic methodology
• Monitoring for comprehension in lecture:
  • Think-pair-share
  • Short prompts
Other Types of Formative Assessment
• Multiple-choice quizzes
• Short exercises and prompts
• Meeting the challenge of marking:
  • Be specific about the nature of feedback and the limits of time
  • Peer evaluation
  • Rubrics
Multiple-Choice Questions
• Design questions that assess thinking and skill over factual content
  • Bloom’s taxonomy
• Develop ‘justified’ multiple-choice questions that demonstrate thinking and process
• Develop distracters that demonstrate and identify student (mis)understandings
• Write questions that ask students to substantiate or challenge claims
Bloom’s Taxonomy
• Bloom’s pyramid and active verbs:
  • Recall (list)
  • Application (show)
  • Analysis (compare)
  • Synthesis (predict)
  • Evaluation (dispute and/or substantiate)
Authentic Assessment
• Performance assessments tied to authentic disciplinary tasks: students produce knowledge as opposed to reproducing knowledge
  • Laboratory practicum
  • Research projects
  • Assessment constructed as a problem
  • Evaluating the validity of different interpretations and conclusions and their evidentiary basis
  • Counterfactual questions and prompts
Rubrics: A Scoring Guide that Provides Criteria to Describe Levels of Student Performance
• The advantages of using rubrics:
  • Instructors mark more accurately, reliably and quickly
  • They require greater accuracy about the criteria of student performance
  • They serve as a learning tool, provide better feedback to students and make the standard of performance explicit
  • They create better reliability across sections
Challenges to Using Rubrics
• Initially time-consuming (but saves time in the long run)
• Difficult to find exact language that distinguishes between levels of performance and establishes criteria
• May require revision during initial implementation
Rubric Practicum
• Identify the dimensions of competence in the task that can be both delineated and demonstrated in the student performance (aligned with the SLO)
  • Holistic versus analytic rubrics (and the advantages of the latter, within limits)
• Weight and scale the dimensions within the task
Rubric Practicum (Cont.)
• Establish criteria for competent performance of each specified dimension of the task
• Establish a scale of performance against the criteria
  • How many clearly identifiable scales? E.g.:
    • Competent and Not Competent
    • Not Proficient, Proficient, Excellent
    • Not Proficient, Developing, Proficient, Beyond Proficient, Exemplary
  • The number of scales needs to be justified by clearly delineated performances for each dimension of the task
Rubric Practicum (Cont.)
• The ideal process:
  • Create a draft of the rubric
  • Implement, and refine by evaluating samples of student work
  • Calibrate with other faculty
  • Mark!
Rubric Exercise
• In disciplinary groups, create a draft rubric for a laboratory practicum with three scales of performance for each dimension
  • Teacher education faculty: do the same, but for a pre-service teacher’s design of a laboratory practicum
SLO: Laboratory Practicum

Dimensions: Lab preparation; Execution of methodology
Criteria (scales): Not Proficient | Proficient/Baseline Skills | Exemplary
(Template: each cell of the rubric holds a DESCRIPTION of performance for that dimension at that level)