Transcript Slide 1

University of Wisconsin-Madison
Establishing Institution-wide Expectations for Student Learning

2008 T&L Symposium Session Overview
- Introduction
- Challenges of arriving at common expectations
- What is your sphere of influence?
- Where have we been?
  - 2003 Assessment Plan
  - 2007 Assessment Audit
  - 2008 Preface to the 2003 Assessment Plan
- Where are we now?
  - Essential Learning Outcomes (and LEAP)
  - Wisconsin Experience
- Where are we going?
  - Convergence and integration
- What do you think?
  - Discussion about your roles and how expectations for student learning influence your work as an educator
  - How do we know how we are doing?

Spheres of Influence: Where do you fit?
[Diagram: spheres of influence, from high-resolution to more aggregated (Courses; Co-Curricular Activities; Academic Programs; Institutional-level perspective; External stakeholders)]

Presenters
Mo Noonan Bischof, Assistant to the Provost and Co-Chair of the Assessment Council
[email protected]
Jocelyn Milner, Director of Academic Planning and Analysis, past Co-Chair of the Assessment Council
[email protected]

About Assessment at UW-Madison:
http://www.provost.wisc.edu/assessment/
About the Wisconsin Experience:
http://www.provost.wisc.edu/teach.html
About the Essential Learning Outcomes:
http://www.ls.wisc.edu/gened/LEAP/default.htm
Where have we been?
Early 1990s: General Education Assessment
1995: University Assessment Plan
2003: University Assessment Plan
- Year-long process
- Specified roles and responsibilities across levels
- Evaluation is part of academic life; assessment is the system for it
- Every academic program must have a plan and do something (anything!) annually to evaluate student learning
- Use assessment findings *locally*
- Intentionally laid aside the specification of university-wide expectations for student learning
Where have we been?
2007 Assessment Audit
- Explicit institution-level learning outcomes are necessary: the basis for knowing how we are doing, and for doing better
- What expectations for student learning had been stated and agreed upon by the university community?
- Examined expectations for student learning in existing documents
- Our finding: existing statements aligned with the Essential Learning Outcomes (ELOs)
- Recommendation: adopt the ELOs as university-wide expectations for student learning
- To: Provost, VPT&L, UAC, UGEC, LEAPers
A number of campus groups have been
independently discerning the distinctive
nature of a UW-Madison education. These
groups, units, and individuals have started to
coordinate efforts in more intentional ways,
and are converging on a shared understanding
of the educational experience.
Two intersecting sets of ideas have found
strong resonance. The Wisconsin Experience
at UW-Madison (WI-X) provides a description
of the distinctive nature of the educational
experience. The Essential Learning Outcomes
(ELOs) align to cross-cutting values and
expectations for student learning that are
present in existing statements and
descriptions of the curriculum.
Together, WI-X and the ELOs provide a
framework for talking and writing about the
educational experience. They are not
prescriptive. Rather, they give us language to
describe, to ourselves and to others, goals for
student learning. And they provide a
reference point for planning and evaluation.
About the Wisconsin Experience:
http://www.provost.wisc.edu/teach.html
Leadership from Aaron Brower, Vice-Provost for
Teaching and Learning, and Lori Berquam, Dean of
Students.
About the Essential Learning Outcomes:
http://www.ls.wisc.edu/gened/LEAP/default.htm
A multi-year national research project led by AAC&U, involving universities, students, business leaders, and the professions. UW-Madison leadership from
Jolanda Vanderwal Taylor, Chair of German; Nancy
Westphal-Johnson, L&S Associate Dean; Elaine Klein,
L&S Assistant Dean.
University of Wisconsin-Madison
CONVERGENCE around the Wisconsin Experience and the Essential Learning Outcomes
[Diagram: campus groups converging on the Essential Learning Outcomes: Provost; Wisconsin LEAP Project; Dean of Students; VP-Teaching and Learning; WAA; University General Education Committee; First-Year Experience/Office of New Student Programs; L&S Student Academic Affairs; Reaccreditation Project; ASM; University Assessment Council; Academic Planning and Analysis; L&S TA Training; School of Pharmacy; University Communications; Study Abroad Office; Office of the Registrar; Libraries; Council of Associate Deans; Morgridge Center; DoIT Academic Technologies; University Academic Planning Council]
Faculty, staff, and students are represented across the various units.
Where have we been?
2008 Preface to the Assessment Plan
- Recognizes that expectations for student learning are implicit in many existing documents and policies
- Recommends the language of the Essential Learning Outcomes as university-wide expectations for student learning
- Discussed at three University Assessment Council meetings; adopted February 26, 2008
- {March 12 Gen Ed Breadth Event}
- Presented to the University Academic Planning Council, March 28, 2008
Where are we now?
Essential Learning Outcomes (ELOs)
Wisconsin Experience (WI-X)
Together these two sets of ideas provide a framework:
• for explicitly articulating students' educational experiences and goals for student learning, curricularly and programmatically
• for integrating efforts more intentionally across campus
Where are we now?
Institutional-level Perspective
• Serves as a reference point for planning and collecting evidence at the local and campus levels
• Provides a template for understanding where the "gaps" are and what improvements are needed
• Sets a foundation for building a campus-level assessment report
Institutional-level Challenges
• Documentation and communication: how do we communicate to internal and external audiences what we value, what we do, what we've learned, and what we hope to improve?
Question 1.
Thinking about your role in teaching and the educational experience, where do you see resonance with the Essential Learning Outcomes?
To the extent that the ELOs resonate with you, how do they influence your work as an educator?
Question 2.
How do we know if expectations for student learning are being met?
The following pages provide more information about doing assessment in your unit...
Figure 2. Examples of Assessment Strategies Implemented at UW-Madison
1. After administering prelims, a faculty committee uses a scale to rate each student’s performance in
each of the identified learning goals. Those ratings are summarized annually as an indication of the
program’s effectiveness in conveying information students need to meet program expectations.
2. A capstone course requires upper-level students to complete a final project. A faculty committee
reviews these projects and rates the extent to which they reflect identified learning goals. Results are
presented at a faculty meeting in a discussion of the program’s effectiveness in conveying information
students need to meet program goals.
3. All students completing a course required for admission to the major take a final exam containing
one or more questions targeting one of the learning goals. Results are compiled to assess students’
“before” scores; later, when students complete their final requirements for the major, they are asked to
respond to the same question to evaluate their attainment of information related to that goal.
4. A department asks the quantitative assessment project to develop an examination to assess the math preparation of students taking a course as a prerequisite for entry into the major; results are used to improve communication with students about necessary quantitative skills, and online tutorials are developed to convey those skills.
5. Course directors meet with TAs and instructors on a regular basis to discuss various components of an introductory course sequence. Specific outcomes are identified for each stage of students' progress through the curriculum; the directors design a project used both for evaluating individual student achievement (individual grades assigned by instructors) and for program evaluation (a sample of papers rated by all instructors using a common rubric).
6. A department publishes a list of problems that students should be able to solve on entering a course and another list of problems that students should be able to solve on completing the course. From time to time, an instructor in the course reports to the department on how the students are measuring up to these expectations.
7. The department’s curriculum committee establishes a regular sequence of course offerings to ensure
that majors can fulfill degree requirements in a timely way; this sequence is consulted when the
timetable is built, when sabbaticals are considered, and when other decisions are made that influence
the regular scheduling of offerings. The arrival or departure of faculty may provoke a review of the
course array or of the requirements for the major.
Source: 2003 University Assessment Plan
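As a minimal illustration of strategy 3 above (the "before/after" exam question), the Python sketch below compares average scores on the same goal-targeted question asked in the gateway course and again at completion of the major. The scores, cohort labels, and 0-10 scale are hypothetical; this is not part of the 2003 plan, just one way such a comparison could be tabulated.

```python
# Minimal sketch of the "before/after" comparison in strategy 3.
# All scores, labels, and the 0-10 scale are hypothetical illustrations.
from statistics import mean

before_scores = [4, 6, 5, 7, 5, 6]   # gateway-course cohort, goal-targeted question
after_scores = [7, 8, 6, 9, 8, 7]    # graduating cohort, same question

def summarize(label, scores):
    # Report cohort size and mean score for one administration of the question.
    print(f"{label}: n={len(scores)}, mean={mean(scores):.1f}")

summarize("Before (gateway course)", before_scores)
summarize("After (completion of major)", after_scores)
print(f"Change in mean score: {mean(after_scores) - mean(before_scores):+.1f}")
```

Where the same students can be identified in both administrations, pairing scores by student before comparing gives a more direct picture of individual gains.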
Figure 3. Practices that Contribute to Successful Academic Assessment
1. Do not assess every learning goal every year. For example, a major with five discrete learning goals might
evaluate each goal in turn. Or one distinct methodology may be applied at any one time. Break the task into
achievable units to maintain a manageable assessment program.
2. Use both direct and indirect measures to evaluate student achievement of learning goals. For example, evaluate a
group of papers by graduating seniors against standardized expectations (a direct measure of student performance) or
survey students about their learning behaviors and perceptions of learning (an indirect measure of student
experience).
3. Use both formative and summative elements. For example, student performance on a goal might be evaluated
upon conclusion of a course required for admission to the major (as an early formative measure) and the same goal
might be evaluated when those students complete their degree requirements (a final summative measure).
4. Employ the highest research standards possible within the limits posed by resources and expertise. The value of a measurement increases when it is taken repeatedly over time, especially over a period that spans a change in the program, since such trend analyses are likely to be sensitive to change over time.
5. Collect, retain, and summarize data in ways that facilitate its use. Use data to support academic judgment.
6. Collect data when it becomes available, even if the analysis of the data will take place later. Examples: course closeout information; course evaluation forms; collections of capstone papers; faculty evaluations of preliminary exams. Collect the same data at the same time each semester and/or year, since time series data are essential to high-quality assessment.
7. When possible make use of standard reports and tabulations of student curricular and budgetary data that are
produced regularly for campus use. Examples: enrollment and degree reports; grade distributions reports;
enrollment statistics by minority group and gender; Departmental Planning Profiles; Data Digest; Graduate Program
Profiles. Department or program records need not replicate all of this information if the historical data can be
retrieved from campus data resources (the UW Data Warehouse).
8. Those who undertake assessment projects that involve interaction with individuals should seek advice on whether human subjects review is necessary, based on the most recent regulations and legislation (see the appropriate Graduate School web site).
9. Students who participate in assessment activities need to understand their role in the assessment activity, its
purpose, and how results will be used. Students may come to the task with greater commitment if they understand
that the goal is to improve the program.
Source: 2003 University Assessment Plan
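Practices 4 and 6 above stress repeated, comparable measurements over time. As a rough sketch of what such a trend summary might look like (Python, with hypothetical years and rubric ratings; not a campus-mandated tool), the snippet below averages the same capstone rating for each year and computes a least-squares slope as a simple indicator of direction.

```python
# Minimal sketch of a trend summary over repeated measurements.
# Years and rubric ratings (1-5 scale) are hypothetical illustrations.
from statistics import mean

# Mean capstone-paper rubric ratings collected each spring.
ratings_by_year = {
    2004: [3.1, 3.4, 2.9, 3.6],
    2005: [3.3, 3.5, 3.2, 3.4],
    2006: [3.6, 3.8, 3.4, 3.7],
    2007: [3.7, 3.9, 3.6, 4.0],
}

years = sorted(ratings_by_year)
yearly_means = [mean(ratings_by_year[year]) for year in years]

# Ordinary least-squares slope: average change in the mean rating per year.
x_bar, y_bar = mean(years), mean(yearly_means)
numerator = sum((x - x_bar) * (y - y_bar) for x, y in zip(years, yearly_means))
denominator = sum((x - x_bar) ** 2 for x in years)
slope = numerator / denominator

for year, yearly_mean in zip(years, yearly_means):
    print(f"{year}: mean rating {yearly_mean:.2f}")
print(f"Trend: {slope:+.2f} rating points per year")
```

Noting the year a program change took effect alongside such a series makes the trend easier to interpret, as practice 4 suggests.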
The Basic Assessment Plan
1. Specify expectations for student learning (goals).
When students finish this (assignment, course, program) we expect them to
___________________________
Not too many; stick with what you agree on.
2. One direct measure:
Program faculty review senior work for progress toward goals, for example selected projects from a capstone course. This also works for graduate theses.
Subjective evaluations are acceptable.
3. One indirect measure:
A survey or focus group that asks students:
i) How well did they meet expectations for student learning?
ii) What aspects of this (assignment, course, program) helped their learning? Why? How?
iii) What might be done differently to help them learn more effectively? Why? How?
Placement rates and alumni surveys may also be useful.
4. Annual meeting to review results and findings, identify a few actionable improvements, and decide on measures for the upcoming year.
Modified from Assessment Clear and Simple: A Practical Guide, Barbara Walvoord, 2004.
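For step 3, a simple tabulation of survey responses is often enough to support the annual review in step 4. The sketch below is a hypothetical Python illustration (the response categories, wording, and answers are invented for the example, not drawn from any campus survey): it tallies the closed-ended question about meeting expectations and gathers the open-ended suggestions for discussion.

```python
# Minimal sketch for tabulating an indirect measure (student survey).
# Response categories and answers are hypothetical illustrations.
from collections import Counter

responses = [
    {"met_expectations": "mostly", "suggestion": "More feedback on drafts"},
    {"met_expectations": "fully", "suggestion": "Keep the capstone project"},
    {"met_expectations": "partially", "suggestion": "Earlier exposure to research methods"},
    {"met_expectations": "mostly", "suggestion": "More peer review sessions"},
]

# Question (i): how well did students feel they met expectations for learning?
tally = Counter(r["met_expectations"] for r in responses)
for category, count in tally.most_common():
    print(f"{category}: {count} of {len(responses)}")

# Questions (ii) and (iii): free-text answers, collected for the annual meeting.
print("\nSuggestions for improvement:")
for r in responses:
    print(f"- {r['suggestion']}")
```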
The Basic Assessment Plan: Use Scoring Schemes to Organize Judgment
(Rubrics, Primary Trait Analysis)

Expectation for learning | Rating (1-5)
Knowledge of human cultures and the physical and natural world | 1 2 3 4 5
Intellectual and practical skills | 1 2 3 4 5
Personal and social responsibility | 1 2 3 4 5
Integrative learning | 1 2 3 4 5

Rating scale: 1 = Well below expectations; 2 = Somewhat below expectations; 3 = About meets expectations; 4 = Exceeds expectations; 5 = Substantially exceeds expectations (or a scale that works for the given purpose).
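One way to summarize ratings gathered with a scheme like this is to average the raters' scores for each expectation and note the share of work rated at or above "about meets expectations" (3). The Python sketch below is a minimal, hypothetical illustration of that tally; the ratings are invented, and the summary choices (mean plus share at 3 or above) are one option among many.

```python
# Minimal sketch for summarizing rubric ratings (1-5 scale) per expectation.
# The ratings are hypothetical; each list holds one rating per piece of
# reviewed student work.
from statistics import mean

ratings = {
    "Knowledge of human cultures and the physical and natural world": [3, 4, 3, 5, 2],
    "Intellectual and practical skills": [4, 4, 3, 4, 5],
    "Personal and social responsibility": [3, 3, 2, 4, 3],
    "Integrative learning": [2, 3, 3, 4, 3],
}

for expectation, scores in ratings.items():
    share_meeting = sum(score >= 3 for score in scores) / len(scores)
    print(f"{expectation}:")
    print(f"  mean {mean(scores):.1f}; {share_meeting:.0%} at or above 'about meets expectations'")
```

The same tally works with a different scale, as the slide allows, provided the threshold is adjusted to match.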
Figure 2b. Sample Liberal Education Scorecard: "Scientific Reasoning" Emphasis
Source: Liberal Education Scorecard, Wick and Phillips, Liberal Education 94(1): 22-29.
http://www.aacu.org/liberaleducation/le-wi08/documents/wi_08_Scorecard.pdf