Assessment: A Collective Commitment
Developing and Assessing General
Education Learning Outcomes: A
Collaborative Commitment across
the Institution
Workshop at Miami Dade College
November 21, 2005
Peggy Maki
[email protected]
1
Workshop Foci
Building a Culture of Evidence across the
Institution
Grounding Assessment of GE in Teaching
and Learning
Collaboratively Developing Learning
Outcome Statements: Claims about Student
Learning
2
Validating Learning Outcome Statements
through Maps and Inventories of Educational
Practice
Designing or Selecting Valid Assessment
Methods that Align with Students’ Educational
Experiences
Developing Standards and Criteria of
Judgment
3
Analyzing and Interpreting Results of Student
Work
Closing the Inquiry Loop
4
[Diagram: the assessment cycle. Mission/Purposes → Learning
Outcome Statements → Gather Evidence → Interpret Evidence →
Enhance teaching/learning; inform institutional decision-making,
planning, and budgeting → back to Mission/Purposes. Guiding
question: How well do students achieve our outcomes?]
5
Your Learning Outcomes
Articulate some GE learning outcome
statements that align with what and how
students learn in your programs and services
Map GE outcome statements to assure
students have diverse and multiple
opportunities to learn
Identify some direct and indirect assessment
methods to capture student learning
6
Develop some standards and criteria of
judgment to score student work
Identify when and where to assess and how
to collect evidence of student learning
Identify who will assess evidence of student
learning, and when
7
Identify possible times across the institution
when colleagues can come together to
interpret results and reach consensus about
ways to improve student learning
After implementing changes, identify when
you will reassess the efficacy of changes.
8
Building a Culture of Evidence
[Diagram: "Learning" at the center, surrounded by the contributors
to a culture of evidence: Faculty; Peers; Tutors; Student Services
Staff; Spiritual Leaders; Lab Assistants; Mentors; Graduate
Students and Teaching Assistants; Intern Advisors; Support
Services Staff; Academic Advisers; Librarians and Resource Staff;
Advisory Board Members; Athletic Coaches]
9
R.W. Emerson, “Intellect,” Essays
(1841)
“How can we speak of the action of the mind
under any divisions, as of its knowledge, of
its ethics, of its works, and so forth, since it
melts will into perception, knowledge into
act? Each becomes the other. Itself alone is.
Its vision is not like the vision of the eye, but
is union with the things known.”
10
How do you learn?
List several strategies you use to learn:
____________________________________
____________________________________
____________________________________
____________________________________
_________________________
11
Grounding Assessment of GE in
Teaching and Learning
Learning is a complex process of
interpretation, not a linear process
Learners create meaning as opposed to
receiving meaning
Knowledge is socially constructed
(importance of peer-to-peer interaction)
National Research Council. Knowing What Students Know, 2001.
12
Learning involves creating relationships
between short-term and long-term memory
Transfer of new knowledge into different
contexts is important to deepen understanding
Practice in various contexts creates expertise
13
People learn differently—prefer certain
ways of learning
Deep learning occurs over time—transference
Meta-cognitive processes are a significant
means of reinforcing learning (thinking about
one’s thinking and ways of knowing)
14
Integration of learning and development over
time….
Cognitive
Psychomotor
Affective
15
Specific Questions that Guide
Assessment
What do you expect your students to know and
be able to do by the end of their education at
your institution?
What do the curricula and other educational
experiences “add up to?”
What do you do in your classes or in your
programs or services to promote the kinds of
learning or development that the institution
seeks?
16
Questions (cont'd)
Which students benefit from various classroom
teaching strategies or educational experiences?
What educational processes are responsible for
the intended student outcomes the institution
seeks?
How can you help students make connections
between classroom learning and experiences
outside of the classroom?
17
Questions (cont'd):
What pedagogies/educational experiences
develop knowledge, abilities, habits of mind,
ways of knowing/problem solving, and
dispositions?
How are the curriculum and co-curriculum
designed to develop knowledge, abilities,
habits of mind, ways of knowing, and
dispositions?
18
How do you intentionally build upon what
each of you teaches or fosters to achieve
programmatic and institutional objectives—
contexts for learning?
What methods of assessment capture
desired student learning--methods that align
with pedagogy, content, curricular and
instructional design?
19
Common Categories of GE
Learning
Writing
Speaking
Quantitative Reasoning
Problem solving, critical thinking
20
Leadership
Lifelong learning
Ethical awareness; social responsibility
Team work
Global perspectives; multiple perspectives
21
Mesa Community College (AZ)
Categories
Written and oral communication
Critical thinking/problem solving
Numeracy
Arts and humanities
Scientific inquiry
Information literacy
Cultural diversity
22
Inventory from MDC’s Student
Services Last Friday:
Writing
Speaking
Reading Comprehension
Critical thinking/problem solving
Quantitative reasoning/problem solving
Technology
Application of knowledge
Proficiency in a chosen field
23
Cultural literacy
Globalism
Teamwork/solo work
Self-initiative/independence
Social responsibility
Ethical awareness
Leadership
Ability to adapt to environments/changes
24
Categories under which
students learn and develop
List several categories under which you
believe students learn or develop as a result
of MDC's GE program:
____________________________________
____________________________________
____________________________________
____________________________________
____________________________________
____________
25
Inventory Based on Nov. 21
Cross-Campus Group Work
Writing
Speaking
Listening
Quantitative reasoning, including ability to
assess and evaluate
Critical thinking
Ethical awareness; personal/social
responsibility, including cultural dimensions
26
Environmental ethics
Computer/information literacy
Cultural Literacies
Problem solving
Problem posing
Financial responsibility
Workforce skills
Knowledge about self, others, community,
world
27
Leadership
Active learning (self)
Teamwork
Ability to link across the curriculum and
experiences
Time management
Global perspectives/diversity
Cultural sensitivity
Interpersonal skills
Adaptability
28
Scientific thinking/methods
Appreciation of the arts, including a global
perspective
Life skills
29
Collaboratively Developing
Learning Outcome Statements
Learning outcome statements describe what
students should be able to demonstrate,
represent, or produce based on how and what
they learn at the institution through multiple,
varied, and intentional learning opportunities.
30
Rely on active verbs, such as create,
compose, calculate, develop, build, evaluate,
translate, etc., that target what we expect
students to be able to demonstrate
Emerge from what we value and how we
teach or how students learn; that is, they emerge
from our educational practices and are
developed through consensus
Are written for a course, program, service, or
the institution
31
Can be mapped to the curriculum and co-curriculum
Can be assessed quantitatively or
qualitatively
32
Levels of Learning Outcome
Statements
Institution-level Outcome Statements, including GE
Department-, Program-, and Certificate-level Outcome Statements
Course/Service/Educational Experience Outcome Statements
33
Distinguishing between
Objectives and Outcomes
Objectives state overarching expectations,
such as:
Students will develop effective oral
communication skills.
OR
Students will understand different
economic principles.
34
Mesa Outcomes under Arts and
Humanities
Demonstrate knowledge of human creations
Demonstrate an awareness that different
contexts or world views produce different
human creations
Demonstrate an understanding and
awareness of the impact that a piece has on
the relationship and perspective of the
audience
Demonstrate an ability to evaluate human
creations
35
Capital Community College
(CT)
Communicate effectively
Reason scientifically and/or quantitatively
Think critically
Develop a global perspective
(See handout)
36
Ethics—Students should be able
to…
Identify and analyze real world ethical
problems or dilemmas, and identify those
affected by the dilemma.
Describe and analyze the complexity and
importance of choices that are available to the
decision-makers concerned with this dilemma.
37
Articulate and acknowledge their own
deeply held beliefs and assumptions
as part of a conscious value system
Describe and analyze their own and others’
perceptions and ethical frameworks for
decision-making
Consider and use multiple choices, beliefs,
and diverse ethical frameworks when making
decisions to respond to ethical dilemmas or
problems.
California State University Monterey Bay: University Learning Requirements, 2002
38
Example from ACRL
The information literate student evaluates
information and its sources critically and
incorporates selected information into his or
her knowledge base and value system.
ONE OUTCOME: The student examines and
compares information from various sources in
order to evaluate reliability, validity, accuracy,
timeliness, and point of view or bias.
39
Quantitatively Literate Graduates, According
to the MAA, Should Be Able to:
1. Interpret mathematical models such as
formulas, graphs, tables, and schematics, and
draw inferences from them.
2. Represent mathematical information
symbolically, visually, numerically, and verbally.
3. Use arithmetical, algebraic, geometric, and
statistical methods to solve problems.
40
4. Estimate and check answers to mathematical
problems in order to determine reasonableness,
identify alternatives, and select optimal results.
5. Recognize that mathematical and statistical
methods have limits.
(http://www.ma.org/pubs/books/qrs.html)
The Mathematical Association of America (Quantitative
Reasoning for College Graduates: A Complement to
the Standards, 1996). See also AMATYC draft, 2006.
41
Writing
See NCTA Guidelines
See WPA Outcomes in attachments for
outcomes at the end of the first year of writing
42
Ways to Articulate Outcomes
Adapt from professional organizations
Derive from mission of
institution/program/department/service
Derive from students’ work
43
Derive from ethnographic process
Derive from exercise focused on listing one
or two outcomes “you attend to”
Consult taxonomies
44
Taxonomies That May Help You
Develop Outcome Statements
Bloom’s Taxonomy—cognitive, psychomotor,
affective
Webb’s Taxonomy—depth of knowledge
Shulman’s Taxonomy—table of learning
45
Depth of Knowledge (Webb)
Recall and recognition
Processing skills and concepts
Strategic thinking
Extended thinking (complex reasoning,
planning, design)
46
Dimensions of Knowledge
Facts
Procedures—series of step-by-step actions
and decisions that result in the achievement
of a task
Processes—flow of events or activities that
describe the big picture
47
Concepts—class of items, words, or ideas
known by a common name
Principles—guidelines, rules, parameters
Metacognitive—knowledge of one’s own
cognition
48
Shulman’s Taxonomy
Engagement (active learning)
Knowledge and understanding
Performance, practice, or action (act in and
on the world)
Reflection and critique (cease action to
discover or “make progress”)
49
Judgment and design: consider context,
even constraints
Commitment and Identity—move inward and
connect outward
http://www.carnegiefoundation.org/elibrary/docs/printable/
making_differences.htm
50
Exercise: Write one or two GE
learning outcome statements
under a category of learning
__________________________________
__________________________________
___________________________________
51
Exercise:
How well do your learning outcome
statements meet the criteria for well-written
outcome statements (see handout)?
52
Validating Learning Outcome
Statements through Maps and
Inventories of Practice
Reveal how we translate outcomes into
educational practices offering students
multiple and diverse opportunities to learn
Help us to identify appropriate times to
assess those outcomes
Identify gaps in learning or opportunities to
practice
53
Help students understand our
expectations of them
Place ownership of learning on students
Enable them to develop their own maps or
learning chronologies
54
Collaborative Development of A
Curricular-Co-Curricular Map
(Cells are marked I, R, or E)

             | Course | Course | Educational experience
Outcome 1:   |        |        |
Outcome 2:   |        |        |
Outcome 3:   |        |        |
Outcome 4:   |        |        |
55
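A map like the one on this slide is, at bottom, a small table of outcomes by courses and experiences, with each cell recording a level such as I, R, or E (often read as Introduced, Reinforced, Emphasized, though the slide does not define the letters). As a minimal sketch, with hypothetical course names, outcome labels, and entries, such a map can be stored and queried for gaps:

```python
# Sketch of a curricular-co-curricular map as a dictionary of
# outcome -> {course/experience: level}. All names and I/R/E
# entries below are hypothetical illustrations.

ge_map = {
    "Outcome 1: Written communication": {
        "ENG 101": "I", "ENG 102": "R", "Capstone": "E",
    },
    "Outcome 2: Quantitative reasoning": {
        "MAT 110": "I", "Capstone": "R",
    },
    "Outcome 3: Ethical awareness": {},  # no entries yet -> a gap
}

def gaps(mapping):
    """Outcomes with no mapped opportunity to learn or practice."""
    return [outcome for outcome, cells in mapping.items() if not cells]

def opportunities(mapping, outcome):
    """Where, and at what level, students encounter an outcome."""
    return sorted(mapping[outcome].items())

print("Gaps:", gaps(ge_map))
print("Outcome 2:", opportunities(ge_map, "Outcome 2: Quantitative reasoning"))
```

Stored this way, the earlier use of maps to identify gaps in learning becomes a one-line query rather than a visual scan of the grid.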
Inventories of Educational
Practice
Provide in-depth information about how
students learn along the continuum of their
studies
Identify the range of educational practices
and assessment experiences that contribute
to learning outcomes (See handouts)
56
Exercise: How will you use
maps and inventories?
Discuss how you will go about the process of
developing a curricular or curricular-co-curricular
map and how you will label people's entries
Discuss how you might use inventories of
educational practices
57
Designing or Selecting Valid Assessment
Methods that Align with Students'
Educational Experiences
“Every assessment is also based on a set of
beliefs about the kinds of tasks or situations
that will prompt students to say, do, or create
something that demonstrates important
knowledge and skills. The tasks to which
students are asked to respond on an
assessment are not arbitrary.”
National Research Council. Knowing what students know: The science and design of educational
assessment . Washington, D.C.: National Academy Press, 2001, p. 47.
58
Design or Select Assessment
Methods that Prompt Students to:
Transfer, integrate, apply, synthesize
Value interdependence among courses and
experiences
Re-use and re-configure what they have
learned (even to re-position their
understanding)
Self-reflect on their emerging learning
59
For example, do students
Apply business principles to a student-run
organization?
Apply principles of effective writing to a
proposal for an independent study or project?
Explore multiple perspectives in solving a
campus issue or problem?
Self-reflect on principles underlying their
actions or decisions?
60
[Diagram: Assumptions Underlying Teaching ↔ Actual Practices,
and Assumptions Underlying Assessment Tasks ↔ Actual Tasks]
61
[Diagram: Inference Drawing → Validity of the Method]
62
What Tasks Elicit Learning You
Desire?
Tasks that require students to select among
possible answers (multiple choice test)?
Tasks that require students to construct
answers (students’ problem-solving and
thinking abilities)?
Question: Consider the contexts for each of these kinds
of tasks in your work
63
When Do You Seek Evidence?
Formative—along the way?
For example, to ascertain progress
or development
Summative—at the end?
For example, to ascertain mastery level of
achievement
64
Direct Methods of Assessment
Focus on how students represent or
demonstrate their learning (meaning making)
Align with students’ learning and assessment
experiences
Align with curricular-and co-curricular design
verified through mapping
65
Invite collaboration in design (faculty, students,
tutors?)
66
Standardized Instruments
Psychometric approach—values quantitative
methods of interpretation
History of validity and reliability
Quick and easy adoption and efficient scoring
One possible source of evidence of learning
67
Do Not Usually Provide
Evidence of strategies, processes, ways of
knowing, understanding, and behaving that
students draw upon to represent learning
Evidence of complex and diverse ways in which
humans construct and generate meaning
Highly useful results that relate to pedagogy,
curricular design, sets of educational practices
68
Authentic, Performance-based
Methods
Focus on integrated learning
Directly align with students’ learning and
previous assessment experiences
Provide opportunity for students to generate
responses as opposed to selecting responses
Provide opportunity for students to reflect on
their performance
69
Do Not Provide
Immediate reliability and validity (unless there
has been a history of use)
Usually do not provide easy scoring unless
closed-ended questions are used.
70
Direct Methods across Students’
Learning Chronology
On-line tools
Critical events
Assemblage of learning objects
Virtual learning environments or situations
(including chatrooms and resource rooms)
71
Scenarios
Storyboards
Self-directed group projects
Magic box
Personal and annotated websites
72
Log book or journal tasks that explore an
issue over time
Event analysis
Video clips
Case studies over time as students move
through courses and educational experiences
73
Externally or internally juried reviewed
projects
Oral defense
E-portfolio
Aristotle’s finger exercises
Interpreting visual material or data
74
Representation: concept mapping or
problem solving (3-D)
Practice of artists' maquettes
Mining data
Students' drawings and models (the
perceptual enhances understanding, analysis,
and analytical ability)
75
Chronological tasks that prompt students
to stretch over time
Draw on knowledge/understanding to solve
problem in a different context
Problems with solutions: Are there other
solutions?
Team-based projects
Self-reflections
76
Magnify or reduce to seek wider implications
and relationships (a la Lewis Thomas)
Professional/disciplinary practices
Embedded assignments
77
Performance on national licensure
examinations
Locally developed tests
78
Indirect Methods of Assessment
That Can Be Combined with Direct
Methods
Programs or Courses selected by students
Focus groups (representative of the
population)
Interviews (representative of the population)
Surveys
79
Other Sources of Information that May
Be Useful in Your Interpretation
CSSE results
Grades
Participation rates or persistence in support
services
80
Course-taking patterns
Students' majors
Transcript analyses or audits (co-curricular
transcript?)
81
Exercise:
Using the handout, determine the degree of
alignment of the direct and indirect methods
you may use to assess your outcome
statements.
82
Developing Standards and Criteria of
Judgment
A set of criteria that identifies the expected
characteristics of a text and the levels of
achievement along those characteristics.
Scoring rubrics are criterion-referenced,
providing a means to assess the multiple
dimensions of student learning.
Are collaboratively designed based on how and
what students learn (based on curricular-co-curricular coherence)
83
Are aligned with ways in which students
have received feedback
(students’ learning histories)
Students use them to develop work and to
understand how their work meets standards
(can provide a running record of
achievement).
84
Raters use them to derive patterns of student
achievement to identify strengths and
weaknesses
Analytic
Holistic
85
Interpretation through Scoring
Rubrics
Criteria descriptors (ways of thinking, knowing
or behaving represented in work)
Creativity
Self-reflection
Originality
Integration
Analysis
Disciplinary logic
86
Criteria descriptors (traits of the performance,
work, text)
Coherence
Accuracy or precision
Clarity
Structure
87
Performance descriptors (describe how well
students execute each criterion or trait along
a continuum of score levels):
Exemplary / Commendable / Satisfactory / Unsatisfactory
Excellent / Good / Needs Improvement / Unacceptable
Expert / Practitioner / Apprentice / Novice
88
Development of Scoring Rubrics
Emerging work in professional and
disciplinary organizations
Research on learning (from novice to expert)
Student work
89
Interviews with students
Experience observing students’ development
90
Consider the following guidelines as you
develop a scoring rubric for one or more
of your outcomes
Identify the purpose of the rubric: for student
feedback, for justifying a grade, for program-level
understanding about student learning
Identify the overall format—analytic or
holistic?
91
Identify the full range of criteria you will
assess with indicators for these criteria
Identify the performance descriptors; within
each cell identify leveled performance
92
Pilot-testing the Scoring
Rubric
Apply to student work to assure you have
identified all the dimensions with no overlap
Schedule inter-rater reliability times:
- independent scoring
- comparison of scoring
- reconciliation of responses
- repeat cycle
93
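The "comparison of scoring" step above can be made concrete with an agreement statistic. The workshop does not prescribe one; as a minimal sketch with hypothetical raters and scores, percent agreement and Cohen's kappa (a common chance-corrected measure) for two raters might look like:

```python
# Sketch: quantifying inter-rater agreement on rubric scores.
# The two score lists are hypothetical examples.

from collections import Counter

def percent_agreement(scores_a, scores_b):
    """Fraction of pieces of student work scored identically by both raters."""
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

def cohens_kappa(scores_a, scores_b):
    """Agreement corrected for the agreement expected by chance alone."""
    n = len(scores_a)
    observed = percent_agreement(scores_a, scores_b)
    count_a = Counter(scores_a)
    count_b = Counter(scores_b)
    expected = sum(
        (count_a[level] / n) * (count_b[level] / n)
        for level in set(scores_a) | set(scores_b)
    )
    return (observed - expected) / (1 - expected)

# Two raters independently score ten essays on a 4-level rubric
# (1 = Novice ... 4 = Expert).
rater_1 = [4, 3, 3, 2, 4, 1, 2, 3, 4, 2]
rater_2 = [4, 3, 2, 2, 4, 1, 2, 3, 3, 2]

print(f"Percent agreement: {percent_agreement(rater_1, rater_2):.2f}")
print(f"Cohen's kappa:     {cohens_kappa(rater_1, rater_2):.2f}")
```

Low values at the comparison step signal that the rubric's performance descriptors need reconciliation before another scoring cycle.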
Analyzing and Interpreting Results
Seek patterns against criteria and cohorts
Build in institution-level and program-level
discourse
Tell the story that explains the results—
triangulate with other data, such as
CSSE or participation rates
94
Be able to aggregate and disaggregate data
to guide focused interpretation
Collectively determine what you wish to
change
95
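Aggregating and disaggregating data, as the slide above recommends, can be sketched concretely. Assuming hypothetical cohort labels and rubric scores, the same result set is summarized once for the whole population and again by cohort:

```python
# Sketch: aggregating and disaggregating rubric results so patterns
# can be read by cohort. Cohort labels and scores are hypothetical.

from collections import defaultdict
from statistics import mean

# (cohort, score) pairs for one outcome, scored on a 4-level rubric.
results = [
    ("full-time", 3), ("full-time", 4), ("full-time", 2),
    ("part-time", 2), ("part-time", 2), ("part-time", 3),
]

# Aggregate: one number for the whole population.
overall = mean(score for _, score in results)

# Disaggregate: the same data regrouped by cohort.
by_cohort = defaultdict(list)
for cohort, score in results:
    by_cohort[cohort].append(score)

print(f"Overall mean: {overall:.2f}")
for cohort, scores in sorted(by_cohort.items()):
    print(f"{cohort}: mean {mean(scores):.2f}")
```

The aggregate alone can mask a cohort that is underperforming; the disaggregated view is what guides focused interpretation.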
Examples of Changes:
Increased attention to weaving experiences
across the institution, a program, or a
department to improve student achievement
Changes in advising based on assessment
results
Closer monitoring of student achievement: tracking
96
Faculty and staff development to learn how
to integrate experiences that contribute to
improved student learning
Changes in pedagogy and curricular and co-curricular design
Development of modules to assist learning;
use of technology; self-paced learning,
supplemental learning
97
Closing the Inquiry Loop to Learn
Implement agreed upon changes
Re-assess to determine efficacy of changes
Focus on collective effort—what we do and
how we do it
98
Structures
Assessment Committees at the institution
and department or program levels
Development of task forces to assume
responsibilities
99
[Diagram: the Assessment Committee at the center of a cycle of
collective work:
Working group that identifies collective expectations for learning →
Working group that develops outcome statements to guide cycles of
inquiry →
Working group that selects or designs assessment methods to
capture learning over time →
Working group that collects and analyzes students' work or
responses →
Collective community interpretations of results →
Collective community contributions about ways to adapt, revise, or
innovate practices to improve student learning →
Collective community dialogue designed to build on institutional
learning]
100
Communication: Collaborative
Interpretation
Disciplinary work groups
Cross-disciplinary work groups
Formal opportunities to share program-level
findings at the institution-level; opportunities
to share institution-level findings at the
program-level
101
Communication: Decision-making
Bodies
Planning (short- and long-term planning)
Budgeting
Decision-making
Allocation of Resources
102
Human, Financial, Technological
Support
Grad students or part-time support to assist
with development of methods or research on
methods, collection or analysis
Analysis of results
Faculty and staff development or resources to
support efforts
Development of technology to house results or
to draw from existing data
103
Exercise:
Describe the structures, processes,
decisions, and channels and forms of
communication that currently exist at
MDC, as well as your ideas for
deepening the commitment to
assessment (see handout).
104
“What and how students learn depends to a
major extent on how they think they will be
assessed.”
John Biggs, Teaching for Quality Learning at
University: What The Student Does. Society for
Research into Higher Education & Open University
Press, 1999, p. 141.
105
Works Cited
Biggs, J. (1999). Teaching for Quality Learning at University:
What The Student Does. Society for Research into Higher
Education & Open University Press, p. 141.
Maki, P. (2004). Assessing for Learning: Building a
Sustainable Commitment Across the Institution. Sterling,
VA: Stylus Publishing, LLC, and the American Association
for Higher Education.
National Research Council. (2001). Knowing What Students
Know: The Science and Design of Educational Assessment.
Washington, D.C.: National Academy Press.
106