Assessing the Mission of Doctoral Research Universities
J. Joseph Hoey, Georgia Tech
Lorne Kuffel, College of William and Mary
North Carolina State University Workshop
October 30-31, 2003
Guidelines for This Presentation
Please turn off or silence your cell phones.
Please feel free to raise questions at any time during the presentation; we will also leave time at the end for general discussion.
We are very interested in your participation.
Agenda
Introduction and Objectives
Reasons for Graduate Assessment
Comparative Data Sources
Developing Faculty Expectations for Graduate
Students
Principles of Graduate Assessment
Physics Case Study
Taking Assessment Online
Summary and Discussion
Objectives
Articulate motivations for undertaking graduate assessment
Increase awareness of comparative data sources
Explore program linkages for graduate assessment
Hands-on: develop faculty expectations for student competence; utilize diverse data sources to evaluate a graduate program’s first assessment efforts; etc.
Why Assess Graduate Programs?
We are all interested in the quality and
improvement of graduate education
To help satisfy calls for accountability
Accreditation requirements: SACS
accreditation imperatives
“To change or improve an invisible
system, one must first make it visible”
– Schilling and Schilling, 1993, p. 172.
Common Internal Reasons for
Graduate Assessment
Program marketing
Meet short-term (tactical) objectives or
targets
Meet long-term (strategic)
institutional/departmental goals
Funded project evaluation (GAANN, IGERT)
Understand sources of retention/attrition
among students and faculty
SACS Principles of Accreditation
Core requirement #5: “The institution
engages in ongoing, integrated, and
institution-wide research-based
planning and evaluation processes that
incorporate a systematic review of
programs and services that (a) results
in continuing improvement and (b)
demonstrates that the institution is
effectively accomplishing its mission.”
SACS Principles of Accreditation
Section 3 – Comprehensive Standards:
Institution Mission, Governance, And
Institutional Effectiveness

“16. The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”
SACS Principles of Accreditation
Section 3 – Comprehensive Standards:
Standards for All Educational Programs

“12. The institution places primary responsibility for the content, quality, and effectiveness of its curriculum with the faculty.”
“18. The institution ensures that its graduate instruction and resources foster independent learning, enabling the graduate to contribute to a profession or field of study.”
SACS Accreditation
The intent of the SACS procedures is to
stimulate institutions to create an
environment of planned change for
improving the educational process.
Language
Much of the assessment literature employs a fair amount of industrial or business-speak
Feel free to develop and use your own terminology
Keep it consistent across the institution
Produce and maintain a glossary of terms
So What Do We Need to Do?
Do our departments have a clear mission statement?
Do we have departmental plans to evaluate the effectiveness of our degree programs?
Do our degree programs have clearly defined faculty expectations for students?
Are they published, and are they measurable or observable?
Do we obtain data to assess the achievement of faculty expectations for students?
Do we document that assessment results are used to change or sustain the excellence of program activities and further student gains in professional and attitudinal skills and experiences?
So What Do We Need to Do?
(Cont.)
Based on assessment results, do we reevaluate the
appropriateness of departmental missions as well as
the expectations we hold for student competence?
The amount of work needed to satisfy accreditation
requirements is proportional to the number of ‘No’
responses to the above questions.
IE Chart
Process of Institutional Effectiveness (PIE)
1.) Overarching Mission of the Institution
2.) Purpose or Primary Function of the Unit or Program
3a.) Faculty Expectations state the desired student learning results associated with the unit's or program's purpose.
3b.) Operational Objectives state the desired operational result of the unit's purpose or function.
4.) Practices or experiences that are performed relative to the 'faculty expectations' or 'operational' objectives.
5.) Standards or processes for measuring obtainment of desired expectations and/or objectives.
6.) Collection of data to measure obtainment of desired expectations and/or objectives.
7.) Evaluation of findings and recommendations for change when necessary, or actions to sustain excellence.
Needed to Succeed
The department should want to engage in this process
The department must use the information collected
The institution must use the information collected
Use participation in the process as part of faculty reviews
Focusing Efforts
It is important to achieve a strategic focus for the program: decide what knowledge, skills, abilities, and experiences should characterize students who graduate from our program…
What is Important to Measure?
To decide this, it is first vital to ask:
What are our strong areas?
What are our limitations?
What do we want to accomplish in:
  Education of students?
  Research?
  Service?
Purpose Statement (sample)
The Anthropology Department serves the institution
by offering courses and scholarly experiences that
contribute to the liberal education of
undergraduates and the scholarly accomplishments
of graduate students. Program faculty members
offer courses, seminars, directed readings, and
directed research studies that promote social
scientific understandings of human cultures. The
Department offers a bachelor’s degree major and
minor, an M.A. degree, and a Ph.D.
Developing a Plan to Evaluate
Degree Programs
How to start a departmental plan: top down or bottom up (Palomba and Palomba, 2001)
  Top Down – As a group of scholars, decide what are the important goals or objectives for the program.
  Bottom Up – Identify the primary faculty expectations for student competence in core courses in the program and use this list to develop overarching expectations for student competence.
Develop an Assessment Plan
Desirable characteristics for assessment plans (Palomba and Palomba, 1999):
  Identify assessment procedures to address faculty expectations for student competence;
  Use procedures such as sampling student work and drawing on institutional data where appropriate;
  Include multiple measures;
  Describe the people, committees, and processes involved; and
  Contain plans for using assessment information.
Words to Remember When
Starting an Assessment Plan
It may be best to tackle the modest objectives first.
Assessment plans should recognize that students are
active participants and share responsibility for their
learning experience along with the faculty and
administration.
It takes a long time to do assessment well. So be
patient and be flexible.
The overriding goal is to improve educational
programs, not to fill out reports or demonstrate
accountability.
Use a Program Profile to get
Started
Related to Operational Objectives
Data for Profiles
Admissions: applications, acceptance rates, and yield rates
Standardized test scores
  Graduate Record Examination (GRE): http://www.gre.org/edindex.html
  Graduate Management Admission Test (GMAT): http://www.gmac.com/
  Law School Admission Test (LSAT): http://www.lsac.org/
Undergraduate GPA
Headcount or Major Enrollments (Full/Part-Time)
Degrees Awarded
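To make the rate arithmetic concrete: acceptance rate is accepted/applicants and yield rate is matriculated/accepted. A minimal Python sketch (the function is our own illustration, checked against the five-year History averages shown later in this presentation):

def admissions_profile(applicants, accepted, matriculated):
    """Return the acceptance and yield rates for a program profile."""
    return {
        "acceptance_rate": accepted / applicants,  # share of applicants offered admission
        "yield_rate": matriculated / accepted,     # share of admits who enroll
    }

# Check against the History five-year averages (107 applicants, 32 accepted,
# 18 matriculated): 32/107 is roughly 30%, 18/32 is roughly 56%.
print(admissions_profile(applicants=107, accepted=32, matriculated=18))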
Profiles (Cont.)
Formula Funding Elements when appropriate
Time-to-Degree and/or Graduation/Retention Rates
Support for Students (Type of Assistance)
Faculty Headcount (Full/Part, Tenure Status)
Faculty Salaries
Faculty Productivity or Workload Compliance
Research Proposals Submitted/Awarded
Research Award/Expenditure Dollars
Instructional and Research Facility Space
Comparative Data
Survey of Earned Doctorates (SED)
National Center for Educational Statistics (NCES)
Integrated Postsecondary Education Data System (IPEDS)
National Research Council (NRC) Reports
Higher Education Data Sharing Consortium (HEDS)
Graduate Student Survey (GSS)
American Association of University Professors (AAUP)
or College and University Professional Association
(CUPA) Faculty Salary Surveys
SED Data
Administered annually, with a very high response rate
Doctoral degrees awarded by broad field and
subfield by gender, racial/ethnic group, and
citizenship.
Institutional ranking by number of doctorate awards
(top 20) by broad field and by racial/ethnic group
Time-to-Degree (three measures) by broad field,
gender, racial/ethnic group, and citizenship
SED Data (Cont.)
Financial resources for student support by broad
field, gender, racial/ethnic group, and citizenship
Postgraduate plans, employment, and location by
broad field, gender, racial/ethnic group, and
citizenship
Reports are available at
http://www.norc.uchicago.edu/issues/docdata.htm
IPEDS Data
Fall enrollments by major field (2-digit CIP code) of study, race/ethnicity and citizenship, gender, attendance status (full/part-time), and level of student (undergraduate, graduate, and first professional)
  The discipline field data is reported in even years only.
Annual degrees conferred by program (6-digit CIP code) or major discipline (2-digit CIP code), award level (associate degree, baccalaureate, Master’s, doctoral, and first professional), race/ethnicity and citizenship, and gender
  Reported annually
IPEDS Data (Cont.)
Useful for identifying peer institutions
Available at the IPEDS Peer Analysis System
http://nces.ed.gov/Ipeds/
These data are also published in the National Center
for Education Statistics (NCES), Digest of Education
Statistics
National Research Council
Research-Doctorate Programs in the United
States
This information is dated (1982 and 1993) with a
new study scheduled for 2004 (?).
Benefit is rankings of programs. But some critics
suggest “reputational rankings cannot accurately
reflect the quality of graduate programs.” (Graham
& Diamond, 1999)
The National Survey of Graduate Faculty:
1. Scholarly quality of program faculty
2. Effectiveness of program in educating research scholars/scientists
3. Change in program quality in last five years
Profile Comparison for History
and Physics – NRC Ranking
History department ranked 46.5
Physics department ranked 63
(Goldberger, Maher, and Flattau, 1995)
Profile Comparison for History
and Physics - Faculty
Five Year Average Faculty Counts

Faculty Status               History   Physics
Tenured                         19        23
Tenure-Track                     7         4
Full-Time Non Tenure-Track       4         0
Part-Time Non Tenure-Track       1         1
Total                           31        28
Profile Comparison for History
and Physics - Admissions
Five Year Average Admissions Data

                   History          Physics
                   #      Rate      #      Rate
Applicants         107               87
Accepted            32    30%        37    43%
Matriculated        18    56%        12    32%

Mean GRE           W&M    Natl      W&M    Natl
Verbal             628    528       502    539
Quantitative       596    530       728    745
Analytical         663    575       643    668
UG GPA             3.60   na        3.37   na
Profile Comparison for History
and Physics - Students
Five Year Average Enrollments & Degrees

                  History               Physics
                  Students   Degrees    Students   Degrees
Doctoral              43         5          32         7
Master's              15        12          16        10
Undergraduates       193       116          34        19
Total                251       133          82        36

Mean Time-to-Degree (Registered Time), Doctoral

          History           Physics
          W&M    Natl       W&M    Natl
          7.2    7.0        8.8    9.0
Profile Comparison for History
and Physics - Productivity
Five Year Average Student Credit Hours

                  History                Physics
                  SCH       % of Dept    SCH      % of Dept
Doctoral           1,002       9%           971      16%
Master's             622       6%           519       9%
UG Majors          3,037      28%           703      12%
UG Non Majors      6,277      57%         3,851      64%
Total             10,938                  6,044

Five Year Average Research Expenditures

                  History                Physics
                  Amount    % of Inst    Amount       % of Inst
Awards             36,083      0.1%      3,182,671      11.4%
Expenditures       57,048      0.2%      2,807,756      11.5%
Describing Faculty Expectations
for Students
Why Describe Faculty Expectations
for Students?
To sustain program excellence and
productivity
To give faculty feedback and the ability
to make modifications based on
measurable indicators, not anecdotes
To inform and motivate students
To meet external standards for
accountability
What Are Our Real Expectations?
Read each question thoroughly. Answer all
questions. Time limit: four hours. Begin
immediately.
MUSIC: Write a piano concerto. Orchestrate it and perform it
with flute and drum. You will find a piano under your seat.
MATHEMATICS: Give today's date, in metric.
CHEMISTRY: Transform lead into gold. You will find a beaker
and three lead sinkers under your seat. Show all work
including Feynman diagrams and quantum functions for all
steps.
ECONOMICS: Develop a realistic plan for refinancing the
national debt. Run for Congress. Build a political power base.
Successfully pass your plan and implement it.
Steps to Describing
Expectations - 1
Write down the result or desired end
state as it relates to the program.
Jot down, in words and phrases, the
performances that, if achieved, would
cause us to agree that the expectation
has been met.
Phrase these in terms of results
achieved rather than activities
undertaken.
Steps to Describing
Expectations - 2
Sort out the words and phrases. Delete
duplications and unwanted items.
Repeat first two steps for any remaining
abstractions (unobservable results) considered
important.
Write a complete statement for each
performance, describing the nature, quality, or
amount we consider acceptable.
Consider the point in the program where it would
make the most sense for students to demonstrate
this performance.
Steps to Describing
Expectations - 3
Again, remember to distinguish results
from activities.
Test the statements by asking: If
someone achieved or demonstrated
each of these performances, would we
be willing to say the student has met
the expectation?
When we can answer yes, the analysis
is finished.
Steps to Describing
Expectations - 4
Decide how to measure the meeting of
an expectation: can we measure it
directly? Indirectly through indicators?
In general, the more direct the measurement, the more content-valid it is.
For more complex, higher order
expectations: may need to use
indicators of an unobservable result.
Steps to Describing
Expectations - 5
Decide upon a preferred measurement
tool or student task.
Describe the expectation in terms that
measure student competence and yield
useful feedback.
Try it!
What Faculty Expectation? Our sample
is this: Graduates will be lifelong
learners
Decide: Under what condition? When
and where will students demonstrate
skills?
Decide: How well? What will we use as
criteria?
Try it!
Under what condition?
Condition: Students will give evidence
of having the ability and the propensity
to engage in lifelong learning prior to
graduation from the program.
Try it!
How well? Specify performance criteria for the extent to which students:
  Display a knowledge of current disciplinary professional journals and can critique them
  Are able to access sources of disciplinary knowledge
  Seek opportunities to engage in further professional development activities
  Other?
Principles of Graduate
Assessment
Clearly differentiate master’s and doctoral level expectations
Assessment must be responsive to the more individualized nature of graduate programs
Assessment of real student work is preferable
Students already create the products we can use for assessment!
Principles of Graduate
Assessment (continued)
Use assessment both as a self-reflection
tool and an evaluative tool
Build in feedback to the student and
checkpoints
Use natural points of contact with
administrative processes
Common Faculty Expectations at
the Graduate Level
Students will demonstrate professional and attitudinal skills, including:
  Oral, written and mathematical communication skills;
  Knowledge of concepts in the discipline;
  Critical and reflective thinking skills;
  Knowledge of the social, cultural, and economic contexts of the discipline;
  Ability to apply theory to professional practice;
  Ability to conduct independent research;
Common Faculty Expectations at
the Graduate Level (continued)
Students will demonstrate professional and attitudinal skills, including:
  Ability to use appropriate technologies;
  Ability to work with others, especially in teams;
  Ability to teach others; and
  Demonstration of professional attitudes and values such as workplace ethics and lifelong learning.
Areas and Linkage Points to
Consider in Graduate Assessment
Deciding on what is important to measure
Pre-program assessment
In-program assessment
Assessment at program completion
Long-term assessment
Educational process assessment
Comprehensive assessment (program
review)
Use Natural Linkage Points
Admission: use diagnostic exam or GRE subject test
Annual: advising appointment/progress check
Qualifying/Comprehensive exams: embed items relevant to program objectives
Thesis and dissertation: develop rubrics to rate multiple areas relevant to program objectives (see the sketch after this list)
Exit: exit interview; exit survey at thesis appointment, check-out, or commencement
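A minimal Python sketch of the thesis/dissertation rubric idea above. The rating areas (drawn from the common faculty expectations listed earlier) and the 1-5 scale are illustrative assumptions, not a prescribed instrument:

RUBRIC_AREAS = [
    "Knowledge of concepts in the discipline",
    "Ability to conduct independent research",
    "Written communication",
    "Oral defense",
]

def rate_defense(ratings):
    """Average one committee member's 1-5 ratings across all rubric areas."""
    missing = [area for area in RUBRIC_AREAS if area not in ratings]
    if missing:
        raise ValueError("Unrated areas: %s" % missing)
    return sum(ratings[area] for area in RUBRIC_AREAS) / len(RUBRIC_AREAS)

# One member's ratings for one defense; a program would aggregate these
# across raters and students to assess each faculty expectation.
print(rate_defense({
    "Knowledge of concepts in the discipline": 4,
    "Ability to conduct independent research": 5,
    "Written communication": 4,
    "Oral defense": 3,
}))  # 4.0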
Pre-Program Assessment
Re-Thinking Admissions Criteria (Hagedorn and Nora, 1997):
  Problem: Graduate persistence.
  GRE is only designed to predict first-year performance.
  UG GPA and GRE are not measures of professional and attitudinal competency.
  A variety of skills, talents, and experiences is necessary for success but not usually included in admissions criteria.
Evaluating the fit between the program and the student is important.
Other Pre-Program
Assessment Tools
Portfolio and/or structured interviews featuring:
  Research interests and previous products
  Critique of a report or research paper
  Plan for a research project
  Prior out-of-class experiences
Inventories to assess motivation, personality, fit to program
In-Program Assessment of
Student Learning
Based on faculty expectations
Methods may include assessment of:
  Case studies, term papers, projects
  Oral seminar presentations
  Preliminary exams, knowledge in field
  Research and grant proposals
  Portfolios
  Problem-Based Learning or team projects
  Input from advisors, graduate internship director
Assessment at Program
Completion
Allows demonstration of synthesis of knowledge, skills and attitudes learned
Ideal comprehensive assessment point, but a sense of where the student began is desirable to assess change, growth, and value added
  Qualitative analysis may be appropriate
  Portfolio of research, scholarly products
Assessment at Program
Completion (continued)
Methods may include assessment of:
  Thesis/dissertation; oral defense
  Professional registration or licensure exam
  Published works, conference papers
  Portfolio
  Exit interview
  Exit survey
Long-Term Assessment
Common sentiment: graduates can
adequately self-assess the outcomes of their
program only after they have been applying
their skills for several years following
graduation.
Pursuing long-term assessment, based on
identified learning objectives, is an important
component of a graduate assessment
program.
Long-Term Assessment
(continued)
AAU (1998): important to track graduates of post-baccalaureate programs:
  to gain information on expectations vs. learning experiences;
  to gain data on outcomes and placement.
Other reasons: to get them involved in the life of the school; to bring them back as speakers, mentors, advisory board members…and donors.
Long-Term Assessment
(continued)
May include assessment of:
  Job placement and linkage to degree
  Career success
  Production of scholarly work
  Evidence of lifelong learning
  Awards and recognition gained
  Participation in professional societies
  Satisfaction with knowledge gained
Long-Term Assessment
(continued)
Common Assessment Methods:
  Follow-up interviews, surveys or focus groups
  Journal publications
  Citation indices
  Membership lists and papers presented in professional/disciplinary associations
Value of Assessing the
Educational Process
Widely viewed as key to graduate retention
Helps understand the strengths and needs for
improvement of graduate coursework,
research experience, teaching experience,
advising, and support services.
Environment and process assessment: see
Golde and Dore (2001) survey for Pew
Charitable Trusts.
Ways of Assessing the
Educational Process (continued)
Graduate student advisory groups
Surveys of students, focus groups
Peer review of teaching
Institutional data: time to degree,
graduation rate
Advising process
Mentoring process
Assessing the Mentoring Process
A primary graduate learning and professional
enculturation process
Mentoring at UC Berkeley (Nerad and Miller, 1996):
  All faculty advise individuals, but mentoring is the shared responsibility of all members of the dept.
  Individual faculty mentors to students
  Departmental seminars and workshops
Comprehensive Assessment:
Program Review
The combination of an internal self-study and an external review of the program by qualified faculty peers forms a very powerful and comprehensive assessment device.
Program review encompasses an examination of resources, processes, and student learning outcomes.
Program Review:
Examples of Areas to Evaluate
Achievement of Faculty Expectations
  communication skills appropriate to the discipline, professional and attitudinal competency, ability to conduct independent research, etc.
Processes
  coursework, research opportunities, teaching, internships, comprehensive exams, theses, and time in residence
Resources (Profile)
  faculty, students, library, instructional and lab space, financial support, extramural support, etc.
Putting the Pieces Together
Adapted from Baird (1996): matrix of
faculty expectations, linkage points to
use in conducting assessment, and
some possible methods to use.
Adapt for use by each department by
inserting appropriate faculty
expectations for each program.
Case Study
See case study handout
Doctoral program in Physics at Muggy
Research University (MRU)
First time through their assessment
process
Data in hand: What now?
You are the consultants!
Case Study:
Debriefing Questions
What do you see in the results?
What do you recommend?
What actions do they need to take?
In light of their mission, what should
they do next time?
Taking Assessment Online
Georgia Tech’s Approach: Online
Assessment Tracking System
(OATS)
OATS-Purpose
Annual Assessment Updates are a key piece in Tech’s
efforts to demonstrate compliance with SACS
Principles of Accreditation.
The Annual Assessment Updates concept was generated
by GT unit coordinators in 1998 as a way of
documenting Tech’s responsiveness to SACS
recommendations re: assessment practices.
Many people have requested that the process be
moved to an online environment.
The online process provides structure, formalizes best
practices in assessment of student learning, and thus
facilitates demonstration of compliance.
SACS 2005 will be an electronic remote review.
Annual Assessment Update
Previous Method                New Method
What Did You Look At?          OBJECTIVES
How Did You Look At It?        METHODS
What Did You Find?             RESULTS
What Did You Do?               ACTIONS
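For illustration only, an annual update in the Objectives/Methods/Results/Actions format could be modeled as a simple record. This is a hypothetical Python sketch of the format, not the actual OATS schema:

from dataclasses import dataclass, field

@dataclass
class AssessmentUpdate:
    program: str   # e.g., "Physics PhD" (hypothetical)
    year: int
    objectives: list = field(default_factory=list)  # what did you look at?
    methods: list = field(default_factory=list)     # how did you look at it?
    results: list = field(default_factory=list)     # what did you find?
    actions: list = field(default_factory=list)     # what did you do?

update = AssessmentUpdate(program="Physics PhD", year=2003)
update.objectives.append("Graduates demonstrate the ability to conduct independent research")
update.methods.append("Rubric ratings of dissertations and oral defenses")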
Feature Comparison
Old System:
  Many different formats
  Hard copy only
  Difficult to track progress over time
  Flexibility (but no consistency across Institute)
  Difficult to provide feedback internally and to facilitate institutional sharing of good practices
OATS:
  Consistent format
  Database storage
  Ability to track progress over time
  Flexibility maintained
  Process facilitates accreditation e-review
  Easier to provide feedback; facilitates institutional sharing
OATS Application
Includes user id/password logon
Web accessible from any location
Defined format structure: Objectives, Methods, Results, and Actions/Impact
  Allows posting of formatted text (tables, charts, etc.)
  Allows notes and written feedback
Review at School/Unit and College level keeps everyone in the loop
OATS Production Date: October 1
Assessment Updates due: December 1 this year
[Screenshot: OATS Main Menu, Current Year and History]
[Screenshot example, College Level: Ivan Allen College (status "Sent to College")]
[Screenshot example, School Level: History, Technology & Society (status "Sent to College")]
[Screenshot example, Degree Program Level: BS in HTS]
Summary
SACS requires assessment of graduate programs, research, and public service
Make it relevant to the program
Keep it simple and focused
Consider different assessments for each stage of student progress
Start now: it takes several years to fine-tune
References:
See references in back of handout
Session Evaluation
What one aspect was the most useful to
you?
What one aspect most needs
improvement, and what kind of
improvement?
Other suggestions?
Thank You!
Questions? Contact us!
[email protected]
[email protected]