We Could Use Your Help - Higher Education Authority


Measuring What Matters: Forging the Right Tools to Assess Student Learning Outcomes
George D. Kuh

Rankings and the Visibility of Quality Outcomes in the EHEA
Dublin, Ireland
January 31, 2013
Overview
• The U.S. context
• A word about NILOA
• Assessment: purposes and approaches
• The kind of learning we need today
• The measurement tools we need
• Concluding thoughts
The U.S. Context
• Unprecedented numbers of increasingly diverse students matriculating
• Many underprepared students
• Rising college costs
• Continuing shift of cost from government to students
• Increasing numbers of part-time instructors
• Worries about university quality, global competitiveness
NILOA
NILOA’s mission is to discover and disseminate effective use of assessment data to strengthen undergraduate education and support institutions in their assessment efforts.
SURVEYS ● WEB SCANS ● CASE STUDIES ● FOCUS GROUPS ● OCCASIONAL PAPERS ● WEBSITE ● RESOURCES ● NEWSLETTER ● LISTSERV ● PRESENTATIONS ● TRANSPARENCY FRAMEWORK ● FEATURED WEBSITES ● ACCREDITATION RESOURCES ● ASSESSMENT EVENT CALENDAR ● ASSESSMENT NEWS ● MEASURING QUALITY INVENTORY ● POLICY ANALYSIS ● ENVIRONMENTAL SCAN ● DEGREE QUALIFICATIONS PROFILE
www.learningoutcomesassessment.org
Assessment 2013
• Greater emphasis on student learning outcomes and evidence that student performance measures up
• Demands for comparative measures
• Increased calls for transparency: public disclosure of student and institutional performance
• Assessment “technology” has improved markedly, but is still insufficient to document the learning outcomes most institutions claim
Measuring Quality in Higher Education
(Vic Borden & Brandi Kernel, 2010)
A web-based inventory of assessment resources hosted by AIR. Keywords can be used to search the four categories:
• instruments (examinations, surveys, questionnaires, etc.);
• software tools and platforms;
• benchmarking systems and data resources;
• projects, initiatives, and services.
http://applications.airweb.org/surveys/Default.aspx
Assessment Purposes
• Improvement
• Accountability
Two Paradigms of Assessment (Continuous Improvement vs. Accountability)

Strategic dimensions
• Purpose: Formative (improvement) vs. Summative (judgment)
• Orientation: Internal vs. External
• Motivation: Engagement vs. Compliance

Implementation
• Instrumentation: Multiple/triangulation vs. Standardized
• Nature of evidence: Quantitative and qualitative vs. Quantitative
• Reference points: Over time, comparative, established goal vs. Comparative or fixed standard
• Communication of results: Multiple internal channels vs. Public communication, media
• Use of results: Multiple feedback loops vs. Reporting

Ewell, Peter T. (2007). Assessment and Accountability in America Today: Background and Context. In Victor M. H. Borden and Gary R. Pike (Eds.), Assessing and Accounting for Student Learning: Beyond the Spellings Commission. San Francisco: Jossey-Bass.
Assessment Tools
• Direct (outcomes) measures: evidence of what students have learned or can do
• Indirect (process) measures: evidence of effective educational activity by students and institutions
Direct Measures
• ETS Proficiency Profile & Major Field Tests
• ACT Collegiate Assessment of Academic Proficiency (CAAP)
• Collegiate Learning Assessment (CLA) – the AHELO measure of general skills
• Competency tests (e.g., nursing, education)
• Portfolios (authentic student work such as writing samples)
• Performances, demonstrations
Indirect Measures
• National Surveys of Student Engagement (NSSE/CCSSE/AUSSE/SASSE)
• Beginning College Survey of Student Engagement (BCSSE)
• Faculty Survey of Student Engagement (FSSE)
• Cooperative Institutional Research Program (CIRP)
• Your First College Year (YFCY)
• College Student Experiences Questionnaire (CSEQ)
• Noel-Levitz Student Satisfaction Inventory
Institution-level assessments of learning outcomes for all institutions
Program-level assessments of learning outcomes for all institutions
What Really Matters in University: Student Engagement

“Because individual effort and involvement are the critical determinants of college impact, institutions should focus on the ways they can shape their academic, interpersonal, and extracurricular offerings to encourage student engagement.”
Pascarella & Terenzini, 2005, p. 602
The quality of student experiences varies more within than between institutions.
[Chart: Supportive Campus Environment scores for 4th-year students at 14 master's institutions, showing the 10th, 50th, and 90th percentile student at each institution.]

[Chart: % of variance between institutions, for first-year and senior students, on NSSE measures: Academic Challenge, Active/Collaborative Learning, Student-Faculty Interaction, Enriching Experiences, Supportive Campus, Deep Learning, Higher-Order Thinking, Integrative Learning, Reflective Learning, Satisfaction, Practical Competence, Personal/Social Development, General Education (scale 0–100%).]
Commensurate Complexity (Thorngate, 1976; Weick, 1985)
[Diagram: the trade-off among being generally applicable, simple, and accurate.]
US Economy Defined by Greater Workplace Challenges and Dynamism
• More than 1/3 of the entire US labor force changes jobs ANNUALLY.
• Today's students will have 10-14 jobs by age 38.
• Half of workers have been with their company less than 5 years.
• Every year, more than 30 million Americans are working in jobs that did not exist in the previous year.
DOL-BLS
The World Wants More From Us and Our Graduates
…more college-educated workers.
…more educated workers with higher levels of learning and knowledge.
Employer expectations of employees have increased
(% who agree with each statement)
• Our company is asking employees to take on more responsibilities and to use a broader set of skills than in the past: 91%
• Employees are expected to work harder to coordinate with other departments than in the past: 90%
• The challenges employees face within our company are more complex today than they were in the past: 88%
• To succeed in our company, employees need higher levels of learning and knowledge today than they did in the past: 88%
Raising The Bar – October/November 2009 – Hart Research for
Key Capabilities Open the Door for Career Success and Earnings

“Irrespective of college major or institutional selectivity, what matters to career success is students’ development of a broad set of cross-cutting capacities…”
Anthony Carnevale, Georgetown University Center on Education and the Workforce
Narrow Learning is Not Enough: The Essential Learning Outcomes
• Knowledge of Human Cultures and the Physical & Natural World
• Intellectual and Practical Skills
• Personal and Social Responsibility
• “Deep” Integrative Learning
Deep, Integrative Learning
• Attend to the underlying meaning of information as well as content
• Integrate and synthesize different ideas, sources of information
• Discern patterns in evidence or phenomena
• Apply knowledge in different situations
• View issues from multiple perspectives
Do we measure what we value? Or do we value what we measure?
Wise decisions are needed about what and how to measure the proficiencies demanded by the 21st century.
We need – and are poised for – a “sea change” in what counts as meaningful evidence of student progress and accomplishment.
Evidence of College Graduates’ Skills and Knowledge
(Very effective / Fairly effective)
• Supervised internship/community-based project: 83% / 69%
• Senior project (e.g., thesis, project): 79% / 46%
• Essay tests: 60% / 35%
• Electronic portfolio & faculty assessments: 56% / 33%
• Multiple-choice tests: 7% / 32%
Employers On Accountability Challenge – December 2007 – Hart Research for
To Get the Right Kind of Evidence…
We need assessment approaches sensitive to a wide variety of knowledge, abilities, proficiencies, and dispositions.
Promising Experiments
Shift the national conversation from what is taught to what is learned by providing HEIs with a template of widely agreed-upon competencies required for the award of degrees.
http://www.learningoutcomesassessment.org/DQPCorner.html
3 levels: Associate, Bachelor’s, Master’s
5 areas:
• Broad, Integrative Knowledge
• Specialized Knowledge
• Intellectual Skills
• Applied Learning
• Civic Learning
Occasional Paper #16
The Degree Qualifications Profile: Implications for Assessment
Peter T. Ewell & Carol Geary Schneider

This paper offers guidance on how to gather evidence about the extent to which the competencies described in the DQP are mastered at the levels claimed. It also outlines the challenges associated with assessing DQP proficiencies.
www.learningoutcomeassessment.org/OccasionalPapers.htm
Promising Experiments
A dozen two- and four-year HEIs are using the VALUE Rubrics in a “proof of concept at scale” with an eye toward building a national vehicle for using common rubrics.

Valid Assessment of Learning in Undergraduate Education (VALUE) Rubrics
AAC&U VALUE Project – 15 Rubrics:
• Inquiry and analysis
• Critical thinking
• Creative thinking
• Written communication
• Oral communication
• Reading
• Quantitative literacy
• Information literacy
• Teamwork
• Problem solving
• Civic knowledge and engagement
• Intercultural knowledge and competence
• Ethical reasoning and action
• Foundations and skills for lifelong learning
• Integrative learning
Promising Experiments
A Massachusetts effort led by the Commissioner of Higher Education is enlisting additional states to use assessments of authentic student work to compare performance and monitor progress.
Moving Quality Assurance Forward
• Shift the motivation for assessment work from a compliance mentality to institutional responsibility
• Experiment with ways to “roll up” program-level results into meaningful institution-level profiles of student accomplishment
• Reconcile or ameliorate the tensions between the accountability and improvement purposes and uses of assessment
Moving Quality Assurance Forward
• Show how assessment results are being used to modify curriculum and teaching and learning approaches and enhance student learning
Transparency
Voluntary System of Accountability (APLU/AASCU)
VSA Student Learning Outcomes Pilot
• Four-year experiment
• Value-added approach
• Institution-level evidence
• Administer and publicly post results:
  – Collegiate Assessment of Academic Proficiency
  – Collegiate Learning Assessment
  – ETS Proficiency Profile
Templates
• Voluntary System of Accountability (APLU/AASCU)
• U-CAN / Building Blocks for 2020 (NAICU)
• College Navigator (NCES)
• Transparency by Design / College Choices for Adults (WCET)
• Voluntary Framework of Accountability (AACC)
• Transparency Framework (NILOA)
Transparency Reports
Providing Evidence of Student Learning: A Framework for Transparency

Based on an examination of about 1,000 institutional websites, the Transparency Framework provides guidance to institutions for effectively presenting learning outcomes assessment information on their websites.

Transparency Framework
http://planning.iupui.edu/assessment/
“The things we have to learn before we do them, we learn by doing them.”
Aristotle, Nicomachean Ethics
What Matters in University: A Data-Based Narrative…
We need a campaign led by university leaders, staff, students, and quality assurance agencies that features students’ best work, buttressed by evidence that all students meet established proficiencies in response to staff-designed assignments that require students to show they can use and apply what they are learning to concrete situations on and off the campus.
Questions
&
Discussion