A Basic Toolbox for Assessing Institutional Effectiveness

Michael F. Middaugh
Assistant Vice President for Institutional Research and Planning
University of Delaware
Commissioner and Vice Chair
Middle States Commission on Higher Education
Workshop Objectives
• Identify context for why assessment of institutional effectiveness is
important
• Identify key variables in developing measures for assessing
institutional effectiveness
• Identify appropriate data collection strategies for measuring those
variables
• Identify appropriate strategies for communicating information (NOTE
THAT I DID NOT SAY DATA!) on institutional effectiveness
Context
Robert Zemsky and William Massy - 1990
“[The academic ratchet] is a term to describe the steady, irreversible shift
of faculty allegiance away from the goals of a given institution, toward
those of an academic specialty. The ratchet denotes the advance of an
entrepreneurial spirit among faculty nationwide, leading to increased
emphasis on research and publication, and on teaching one’s specialty in
favor of general introduction courses, often at the expense of coherence
in an academic curriculum. Institutions seeking to enhance their own
prestige may contribute to the ratchet by reducing faculty teaching and
advising responsibilities across the board, enabling faculty to pursue their
individual research and publication with fewer distractions. The academic
ratchet raises an institution’s costs, and it results in undergraduates paying
more to attend institutions in which they receive less attention than in
previous decades.”
(Zemsky and Massy, 1990, p. 22)
Boyer Commission on Educating Undergraduates - 1998
“To an overwhelming degree, they [research universities] have furnished
the cultural, intellectual, economic, and political leadership of the nation.
Nevertheless, the research universities have too often failed, and continue
to fail, their undergraduate populations…Again and again, universities are
guilty of advertising practices they would condemn in the commercial
world. Recruitment materials display proudly the world-famous
professors, the splendid facilities and ground breaking research that goes
on within them, but thousands of students graduate without ever seeing
the world-famous professors or tasting genuine research. Some of their
instructors are likely to be badly trained or untrained teaching assistants
who are groping their way toward a teaching technique; some others may
be tenured drones who deliver set lectures from yellowed notes, making
no effort to engage the bored minds of the students in front of them.”
(Boyer Commission, pp. 5-6)
U.S. News “America’s Best Colleges” - 1996
“The trouble is that higher education remains a labor-intensive service
industry made up of thousands of stubbornly independent and mutually
jealous units that support expensive and vastly underused facilities. It is a
more than $200 billion-a-year economic enterprise – many of whose
leaders oddly disdain economic enterprise, and often regard efficiency,
productivity, and commercial opportunity with the same hauteur with
which Victorian aristocrats viewed those ‘in trade’… The net result is a
hideously inefficient system that, for all its tax advantages and public and
private subsidies, still extracts a larger share of family income than almost
anywhere else on the planet…”
(America’s Best Colleges, p. 91)
National Commission on the Cost of Higher Education - 1998
• “…because academic institutions do not account differently for time spent
directly in the classroom and time spent on other teaching and research
activities, it is almost impossible to explain to the public how individuals
employed in higher education use their time. Consequently, the public
and public officials find it hard to be confident that academic leaders
allocate resources effectively and well. Questions about costs and their
allocation to research, service, and teaching are hard to discuss in simple,
straightforward ways and the connection between these activities and
student learning is difficult to draw. In responding to this growing
concern, academic leaders have been hampered by poor information and
sometimes inclined to take issue with those who asked for better data.
Academic institutions need much better definitions and measures of how
faculty members, administrators, and students use their time.”
(National Commission on Cost of Higher
Education, p. 20)
Spellings Commission on the Future of Higher Education
2006
“We believe that improved accountability is vital to ensuring the success
of all of the other reforms we propose. Colleges and universities must
become more transparent about cost, price, and student success
outcomes, and must willingly share this information with students and
families. Student achievement, which is inextricably connected to
institutional success, must be measured by institutions on a “value-added”
basis that takes into account students’ academic baseline when assessing
their results. This information should be available to students, and
reported publicly in aggregate form to provide consumers and
policymakers an accessible, understandable way to measure the relative
effectiveness of different colleges and universities.”
(Spellings Commission, p.4)
Middle States Accreditation Standards
Expectations: Assessment & Planning
It is the Commission’s intent, through the self-study
process, to prompt institutions to reflect on those
assessment activities currently in place (both for
institutional effectiveness and student learning), to
consider how these assessment activities inform
institutional planning, and to determine how to improve
the effectiveness and integration of planning and
assessment.
MSCHE Linked Accreditation Standards:
Standard 14: Student Learning Outcomes
Assessment of student learning demonstrates that, at
graduation, or other appropriate points, the institution’s
students have knowledge, skills, and competencies
consistent with institutional and appropriate higher
education goals.
Selected Fundamental Elements for
MSCHE Standard 14
• Articulated expectations for student learning (at
institutional, degree/program, and course levels)
• Documented, organized, and sustained assessment
processes (that may include a formal assessment plan)
• Evidence that student learning assessment information is
shared and used to improve teaching and learning
• Documented use of student learning assessment
information as part of institutional assessment
MSCHE Linked Accreditation Standards:
Standard 7: Institutional Assessment
The institution has developed and implemented an
assessment process that evaluates its overall
effectiveness in achieving its mission and goals and its
compliance with accreditation standards.
Selected Fundamental Elements for
MSCHE Standard 7
• Documented, organized, and sustained assessment
processes to evaluate the total range of programs and
services, achievement of mission, and compliance with
accreditation standards
• Evidence that assessment results are shared and used in
institutional planning, resource allocation and renewal.
• Written institutional strategic plan(s) that reflect(s)
consideration of assessment results
MSCHE Linked Accreditation Standards:
Standard 2: Planning, Resource Allocation
and Institutional Renewal
An institution conducts ongoing planning and resource
allocation based on its mission and goals, develops
objectives to achieve them, and utilizes the results of its
assessment activities for institutional renewal.
Implementation and subsequent evaluation of the
success of the strategic plan and resource allocation
support the development and change necessary to
improve and to maintain quality.
Selected Fundamental Elements for
MSCHE Standard 2
• Clearly stated goals and objectives that reflect conclusions
drawn from assessments that are used for planning and
resource allocation at the institutional and unit levels
• Planning and improvement processes that are clearly
communicated, provide for constituent participation, and
incorporate the use of assessment results
• Assignment of responsibility for improvement and
assurance of accountability
Variables
While we will discuss several variables today that
contribute to assessment of institutional effectiveness,
keep in mind that you don’t have to measure everything.
PRIORITIZE within the context of your institution’s culture
and needs.
Students
• Admitted
• Entering
• Continuing
• Non-Returning
• Graduating
• Alumni
Environmental Issues
• Student and Faculty Engagement
• Student and Staff Satisfaction
• Employee Productivity
• Compensation
  - Market
  - Equity
• Campus Climate
• Economic Impact
STUDENTS
Admitted Students
• What can we learn from monitoring admissions cycles?
• What additional drill down is needed to fully understand
student admissions behavior?
A Typical Admissions Monitoring Report
Eastern Seaboard State University
Weekly Admissions Monitoring Report
Campus Summary: First-Time Freshman Applicants, Their SAT Scores and Predicted Grade Index, by Admission Status and by Residency Status for the Entering Classes in the Fall of 2005 as of 09/15/05; Fall 2006 as of 09/18/06; and Fall 2007 as of 09/13/07.
(Each cell shows Fall 2005 / Fall 2006 / Fall 2007.)

Counts
  All Applicants:      Resident 2,340 / 2,332 / 2,088;   Nonresident 18,984 / 19,209 / 20,133;   Total 21,324 / 21,541 / 22,221
  Admission Denied:    Resident 109 / 148 / 172;         Nonresident 7,506 / 5,871 / 5,838;      Total 7,615 / 6,019 / 6,010
  Admission Offered:   Resident 1,940 / 1,877 / 7,147;   Nonresident 7,295 / 8,101 / 8,489;      Total 9,235 / 9,978 / 15,636
  Admission Accepted:  Resident 1,362 / 1,255 / 1,174;   Nonresident 2,078 / 2,201 / 2,348;      Total 3,440 / 3,456 / 3,522

SAT Verbal
  All Applicants:      Resident 557 / 562 / 569;   Nonresident 572 / 579 / 584;   Total 571 / 577 / 582
  Admission Denied:    Resident 456 / 444 / 445;   Nonresident 541 / 535 / 536;   Total 540 / 532 / 533
  Admission Offered:   Resident 567 / 576 / 584;   Nonresident 609 / 615 / 620;   Total 600 / 607 / 614
  Admission Accepted:  Resident 559 / 564 / 575;   Nonresident 592 / 600 / 604;   Total 579 / 587 / 594

SAT Math
  All Applicants:      Resident 563 / 567 / 574;   Nonresident 596 / 598 / 605;   Total 592 / 595 / 602
  Admission Denied:    Resident 454 / 446 / 452;   Nonresident 561 / 551 / 554;   Total 560 / 549 / 551
  Admission Offered:   Resident 547 / 582 / 589;   Nonresident 636 / 635 / 643;   Total 623 / 625 / 634
  Admission Accepted:  Resident 566 / 570 / 580;   Nonresident 619 / 620 / 627;   Total 598 / 602 / 611

Predicted Grade Index
  All Applicants:      Resident 2.67 / 2.75 / 2.83;   Nonresident 2.90 / 2.94 / 2.94;   Total 2.82 / 2.88 / 2.88
  Admission Denied:    Resident 1.66 / 1.60 / 1.71;   Nonresident 2.40 / 2.36 / 2.41;   Total 2.22 / 2.30 / 2.37
  Admission Offered:   Resident 2.75 / 2.78 / 2.85;   Nonresident 2.98 / 3.04 / 3.05;   Total 2.90 / 2.95 / 2.98
  Admission Accepted:  Resident 2.75 / 2.78 / 2.85;   Nonresident 2.98 / 3.05 / 3.04;   Total 2.89 / 2.95 / 2.98

Ratio of Offers to Total Applications:   Resident 0.83 / 0.80 / 3.42;   Nonresident 0.38 / 0.42 / 0.42;   Total 0.43 / 0.46 / 0.70
Ratio of Accepts to Offers (Yield):      Resident 0.70 / 0.67 / 0.61;   Nonresident 0.28 / 0.27 / 0.28;   Total 0.37 / 0.35 / 0.23
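The offer and yield ratios in the last two panels are simple arithmetic on the counts. A minimal sketch follows, using the 2005 resident and nonresident counts from the report above; the dictionary layout is just an assumed way of holding the figures.

```python
# Minimal sketch: computing the offer-rate and yield ratios that appear in a
# weekly admissions monitoring report (2005 counts from the report above).

counts = {
    # residency: (applicants, denied, offered, accepted)
    "Resident":    (2_340, 109, 1_940, 1_362),
    "Nonresident": (18_984, 7_506, 7_295, 2_078),
}

for residency, (applied, denied, offered, accepted) in counts.items():
    offer_rate = offered / applied    # ratio of offers to total applications
    yield_rate = accepted / offered   # ratio of accepts to offers ("yield")
    print(f"{residency}: offer rate {offer_rate:.2f}, yield {yield_rate:.2f}")
```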
Drilling Down
• Why do some students to whom we extend an offer of
admission choose to attend our institution?
• Why do other students to whom we extend an offer of
admission choose to attend a different school?
• How is our institution perceived by prospective students
within the admissions marketplace?
• What sources of information do students draw upon in
shaping those perceptions?
• What is the role of financial aid in shaping the college
selection decision?
Survey Research is Useful in Addressing
These Questions
• “Home-Grown” College Student Selection Survey
• Commercially Prepared
- College Board Admitted Student Questionnaire
- College Board Admitted Student Questionnaire-Plus
• Commercially prepared allows for benchmarking
Academic Preparedness of Respondent Pool
Average High School Grades of ASQ Respondents
Average High School Grades       All Admitted     Enrolling      Non-Enrolling
                                 Students         Students       Students
N (%)                            1664 (65%)       1011 (66%)     653 (64%)
A (90 to 100)                    76%              72%            82%
B (80 to 89)                     24%              27%            18%
C (70 to 79)                     *%               *%             *%
D (69 or Below)                  0%               0%             0%
Academic Preparedness of Respondent Pool
Average SAT Critical Reading Scores of ASQ Respondents
SAT Critical Reading             All Admitted     Enrolling      Non-Enrolling
                                 Students         Students       Students
N (%)                            1489 (59%)       897 (59%)      592 (58%)
Mean Score                       610              594            634
Median Score                     610              600            630
700 and Above                    13%              9%             20%
650 to 690                       19%              16%            24%
600 to 640                       26%              26%            27%
550 to 590                       24%              27%            19%
500 to 540                       12%              14%            8%
450 to 490                       4%               6%             1%
400 to 440                       2%               2%             1%
350 to 390                       *%               0%             *%
300 to 340                       *%               *%             *%
Below 300                        *%               *%             *%
Academic Preparedness of Respondent Pool
Average SAT Mathematical Scores of ASQ Respondents
SAT Mathematical                 All Admitted     Enrolling      Non-Enrolling
                                 Students         Students       Students
N (%)                            1490 (59%)       897 (59%)      593 (58%)
Mean Score                       630              617            651
Median Score                     640              620            660
700 and Above                    19%              14%            26%
650 to 690                       27%              24%            31%
600 to 640                       24%              25%            23%
550 to 590                       19%              22%            13%
500 to 540                       8%               9%             6%
450 to 490                       3%               4%             1%
400 to 440                       1%               2%             *%
350 to 390                       *%               *%             *%
300 to 340                       *%               *%             *%
Below 300                        0%               0%             0%
The survey allows respondents to rate 16 items with
respect to their influence on the college selection decision.
In choosing which institution to attend, the top three
considerations for both enrolling and non-enrolling
students are availability of majors, academic reputation of
the institution, and commitment to teaching
undergraduates. These are followed closely by educational
value for price paid.
What Admitted Students are Seeking
Importance of Selected College Characteristics to ASQ Respondents
Characteristic                              All Admitted     Enrolling      Non-Enrolling
                                            Students         Students       Students
Availability of Majors                      89%              90%            88%
Academic Reputation                         81%              82%            80%
Undergraduate Teaching Commitment           78%              80%            74%
Value for the Price                         77%              79%            73%
Quality of Academic Facilities              69%              72%            64%
Personal Attention                          64%              65%            63%
Quality of Social Life                      64%              67%            60%
Attractiveness of Campus                    61%              66%            54%
Cost of Attendance                          60%              62%            56%
Surroundings                                60%              62%            56%
Quality of Campus Housing                   58%              60%            55%
Extracurricular Opportunities               55%              56%            54%
Availability of Recreational Facilities     55%              58%            50%
Access to Off-Campus Activities             35%              37%            33%
Special Academic Programs                   33%              32%            35%
Prominent Athletics                         27%              27%            26%
Survey respondents are then asked to rate the focal institution
on the 16 dimensions, compared with other institutions to
which they applied and were accepted.
NOTE: Student perceptions don’t have to be accurate to be
real. It is the reality of student perceptions that must
be addressed.
Important Perceptions about University of Delaware
Importance and Rating of Selected Institutional Characteristics
A. Less Important and the University Rated Higher
   Special Academic Programs

B. Very Important and the University Rated Higher
   Attractiveness of Campus
   Cost of Attendance
   Value for the Price
   Quality of Social Life
   Availability of Recreational Facilities
   Extracurricular Opportunities

C. Less Important and the University Rated Lower
   Access to Off-Campus Activities
   Prominent Athletics

D. Very Important and the University Rated Lower
   Academic Reputation
   Personal Attention
   Quality of On-Campus Housing
   Undergraduate Teaching Commitment
   Surroundings
   Quality of Academic Facilities
   Availability of Majors

"Characteristics Considered Very Important" were those rated "Very Important" by at least 50 percent of respondents. "Characteristics for Which the University was Rated High" were those for which the mean rating of the University was higher than the mean rating for all other institutions. The characteristics are listed in decreasing order of the difference between the mean rating for the University and the mean rating for all other institutions.
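A minimal sketch of how the four-quadrant grouping above can be produced from survey summaries follows. The 50 percent threshold comes from the slide; the example ratings are illustrative assumptions, not figures from the ASQ report.

```python
# Minimal sketch (hypothetical ratings): assigning each ASQ characteristic to one
# of the four importance/rating quadrants used on the slide above.

def quadrant(pct_very_important: float, ud_mean: float, other_mean: float) -> str:
    important = pct_very_important >= 50.0   # "very important" threshold from the slide
    rated_higher = ud_mean > other_mean      # UD mean vs. mean for all other institutions
    if not important and rated_higher:
        return "A. Less Important / Rated Higher"
    if important and rated_higher:
        return "B. Very Important / Rated Higher"
    if not important and not rated_higher:
        return "C. Less Important / Rated Lower"
    return "D. Very Important / Rated Lower"

# characteristic: (% rating it very important, UD mean rating, mean for other institutions)
example = {
    "Special Academic Programs": (33.0, 4.1, 3.9),   # illustrative ratings only
    "Academic Reputation":       (81.0, 4.0, 4.2),
}
for name, values in example.items():
    print(name, "->", quadrant(*values))
```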
Impressions of the University Among ASQ Respondents
COLLEGE IMAGES
                                 All Admitted     Enrolling      Non-Enrolling
                                 Students         Students       Students
N (%)                            2545 (100%)      1530 (100%)    1015 (100%)
Fun                              61%              68%            50%
Friendly                         57%              63%            47%
Comfortable                      53%              61%            41%
Large                            41%              43%            39%
Highly Respected                 41%              50%            27%
Partying                         40%              41%            38%
Intellectual                     36%              45%            23%
Challenging                      34%              43%            21%
Athletics                        30%              34%            26%
Career-Oriented                  30%              38%            17%
Selective                        29%              33%            23%
Prestigious                      28%              35%            16%
Diverse                          26%              29%            22%
Personal                         19%              23%            13%
Average                          13%              7%             22%
Back-Up School                   12%              5%             21%
Difficult                        9%               12%            4%
Isolated                         6%               2%             11%
Not Well Known                   6%               4%             8%
Financial Aid as a Factor in College Selection
• Survey allows respondents to report financial aid awards from “The
college you plan to attend.”
• Work study awards at UD and at competitors are virtually identical,
while the average loan award at competitors is about $1,000 higher
than at UD.
• The average need-based grant at competitors is double that for UD.
• The average merit grant at competitors is about $5,000 higher than at
UD.
• Total financial aid packages awarded by competitors are about double
that awarded by UD.
A Potential Competitive Disadvantage for UD
                                   Average Aid Awarded by U of D      Average Aid Awarded by School Attended
                                   (Enrolling Students)               (Non-Enrolling Students)
                                   N        Average Award             N        Average Award
Students Reporting
  Work-Study Awarded               89       $2,045                    79       $2,107
  Loan Awarded                     316      $3,948                    193      $4,915
  Need-Based Grant Awarded         143      $4,310                    104      $9,354
  Merit Grant Awarded              301      $7,180                    218      $12,024
  Total Aid Awarded                482      $8,281                    297      $16,011
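A minimal sketch that summarizes the aid gap shown above; the dollar figures are the averages from the table, and the dictionary layout is simply an assumed way of holding them.

```python
# Minimal sketch: summarizing the average-award gap between the focal institution
# and the schools that non-enrolling admitted students chose instead.

aid = {
    # aid type: (UD average award to enrolling students, competitor average to non-enrolling)
    "Work-Study":       (2_045, 2_107),
    "Loan":             (3_948, 4_915),
    "Need-Based Grant": (4_310, 9_354),
    "Merit Grant":      (7_180, 12_024),
    "Total Aid":        (8_281, 16_011),
}

for aid_type, (ud_avg, competitor_avg) in aid.items():
    gap = competitor_avg - ud_avg
    print(f"{aid_type}: UD ${ud_avg:,}, competitors ${competitor_avg:,} (gap ${gap:,})")
```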
Entering Students
• ACT College Student Needs Assessment Survey: Asks
respondents to identify skills areas – academic and social –
where they feel they will need assistance in the coming
year.
• College Student Expectations Questionnaire: Asks
respondents to assess their level of expectations with
respect to intellectual, social, and cultural engagement with
faculty and other students in the coming year.
Continuing/Returning Students
• Student Satisfaction Research
  - ACT Survey of Student Opinions
  - Noel-Levitz Student Satisfaction Inventory
• ACT Survey of Student Opinions
  - Student use of, and satisfaction with, 21 programs and services typically found at a college or university (e.g., academic advising, library, computing, residence life, food services, etc.)
  - Student satisfaction with 43 dimensions of campus environment (e.g., out-of-classroom availability of faculty, availability of required courses, quality of advisement information, facilities, admissions and registration procedures, etc.)
  - Self-estimated intellectual, personal, and social growth; overall impressions of the college experience
NOTE: Survey is available in four-year and two-year college versions.
What About Non-Returning Student Research?
TABLE 1: ENROLLMENT, DROPOUT RATES AND GRADUATION RATES
FOR FIRST-TIME FRESHMEN ON THE NEWARK CAMPUS (Total)
Enrollment is shown as N (% still enrolled / cumulative % dropout) for the 2nd through 6th Fall after entry; graduation is shown as N (% of the entering cohort). "--" indicates a rate not yet available.

Entering Fall 1995 (N = 3154): 2nd Fall 2673 (84.7 / 15.3); 3rd Fall 2439 (77.3 / 22.7); 4th Fall 2355 (75.3 / 24.7); 5th Fall 599 (19.0 / 26.4); 6th Fall 113 (3.6 / 26.1). Graduated within 3 yrs: 21 (0.7%); within 4 yrs: 1721 (54.6%); within 5 yrs: 2219 (70.4%); total: 2344 (74.3%).

Entering Fall 1996 (N = 3290): 2nd Fall 2804 (85.2 / 14.8); 3rd Fall 2585 (78.6 / 21.4); 4th Fall 2489 (76.3 / 23.7); 5th Fall 606 (18.4 / 26.1); 6th Fall 108 (3.3 / 26.7). Graduated within 3 yrs: 22 (0.7%); within 4 yrs: 1825 (55.5%); within 5 yrs: 2302 (70.0%); total: 2427 (73.8%).

Entering Fall 1997 (N = 3180): 2nd Fall 2766 (87.0 / 13.0); 3rd Fall 2523 (79.3 / 20.7); 4th Fall 2436 (77.5 / 22.5); 5th Fall 581 (18.3 / 24.3); 6th Fall 117 (3.7 / 24.5). Graduated within 3 yrs: 27 (0.8%); within 4 yrs: 1827 (57.5%); within 5 yrs: 2284 (71.8%); total: 2401 (75.5%).

Entering Fall 1998 (N = 3545): 2nd Fall 3080 (86.9 / 13.1); 3rd Fall 2830 (79.8 / 20.2); 4th Fall 2762 (78.5 / 21.5); 5th Fall 653 (18.4 / 22.9); 6th Fall 118 (3.3 / 22.7). Graduated within 3 yrs: 22 (0.6%); within 4 yrs: 2079 (58.6%); within 5 yrs: 2621 (73.9%); total: 2727 (76.9%).

Entering Fall 1999 (N = 3513): 2nd Fall 3126 (89.0 / 11.0); 3rd Fall 2871 (81.7 / 18.3); 4th Fall 2757 (79.4 / 20.6); 5th Fall 526 (15.0 / 22.6); 6th Fall 83 (2.4 / 22.7). Graduated within 3 yrs: 31 (0.9%); within 4 yrs: 2193 (62.4%); within 5 yrs: 2632 (74.9%); total: 2684 (76.4%).

Entering Fall 2000 (N = 3128): 2nd Fall 2738 (87.5 / 12.5); 3rd Fall 2524 (80.7 / 19.3); 4th Fall 2453 (79.2 / 20.8); 5th Fall 496 (15.9 / 23.9); 6th Fall 85 (2.7 / 23.8). Graduated within 3 yrs: 24 (0.8%); within 4 yrs: 1884 (60.2%); within 5 yrs: 2297 (73.4%); total: --.

Entering Fall 2001 (N = 3358): 2nd Fall 2976 (88.6 / 11.4); 3rd Fall 2746 (81.8 / 18.2); 4th Fall 2674 (80.6 / 19.4); 5th Fall 472 (14.1 / 22.3). Graduated within 3 yrs: 31 (0.9%); within 4 yrs: 2138 (63.7%); within 5 yrs: --; total: --.

Entering Fall 2002 (N = 3399): 2nd Fall 3055 (89.9 / 10.1); 3rd Fall 2866 (84.3 / 15.7); 4th Fall 2787 (83.2 / 16.8). Graduated within 3 yrs: 42 (1.2%); later rates not yet available.

Entering Fall 2003 (N = 3433): 2nd Fall 3035 (88.4 / 11.6); 3rd Fall 2808 (81.8 / 18.2). Graduation rates not yet available.

Entering Fall 2004 (N = 3442): 2nd Fall 3064 (89.0 / 11.0). Graduation rates not yet available.
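A minimal sketch of how cohort retention and graduation rates of the kind shown in Table 1 can be derived from student-level records; the Student record layout and the demo data are hypothetical.

```python
# Minimal sketch (hypothetical student records): deriving cohort retention and
# six-year graduation rates of the kind shown in Table 1. Requires Python 3.10+.

from dataclasses import dataclass

@dataclass
class Student:
    cohort_year: int          # entering Fall term
    falls_enrolled: set[int]  # Fall terms in which the student registered
    grad_year: int | None     # calendar year of graduation, if any

def cohort_rates(students: list[Student], cohort_year: int) -> dict[str, float]:
    cohort = [s for s in students if s.cohort_year == cohort_year]
    n = len(cohort)
    second_fall = sum(1 for s in cohort if cohort_year + 1 in s.falls_enrolled)
    grad_6yr = sum(1 for s in cohort
                   if s.grad_year is not None and s.grad_year <= cohort_year + 6)
    return {
        "freshman_to_sophomore_retention": second_fall / n,
        "six_year_graduation_rate": grad_6yr / n,
    }

# Example with two made-up records from a 1995 cohort:
demo = [
    Student(1995, {1995, 1996, 1997, 1998}, 1999),
    Student(1995, {1995}, None),
]
print(cohort_rates(demo, 1995))
```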
What About Non-Returning Student Research?
Drilling Deeper…..
• Commercial instruments exist, but response rates tend to be low, and the reported reasons for leaving tend to be the politically correct ones – personal or financial reasons.
• For the last several years, we have administered the Survey of Student
Opinions during the Spring term to a robust sample of students across
freshman, sophomore, junior, and senior classes.
• The following Fall, the respondent pool is disaggregated into those
who took the Survey and returned in the Fall, and those who took the
Survey, did not return in the Fall, and did not graduate.
• Test for statistically significant differences in response patterns
between the two groups.
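A minimal sketch of the last step, testing returning versus non-returning respondents for different response patterns; the item, categories, and counts are hypothetical, and the chi-square test stands in for whatever test is appropriate to the item type.

```python
# Minimal sketch (hypothetical counts): testing whether returning and non-returning
# respondents answered a satisfaction item differently, using a chi-square test.

from scipy.stats import chi2_contingency

# Rows: returned the following Fall vs. did not return (and did not graduate)
# Columns: satisfied / neutral / dissatisfied with a given program or service
table = [
    [412, 180, 88],   # returning respondents
    [61,  40,  39],   # non-returning respondents
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Response patterns differ significantly between the two groups.")
```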
Campus Pulse Surveys
• Based upon information gleaned from the Survey of Student Opinions, we annually develop five or six short, web-based, focused Campus Pulse Surveys directed at specific issues that surfaced. Among recent Campus Pulse Surveys:
  - Registration Procedures Within a PeopleSoft Environment
  - Quality of Academic Advising at the University
  - Personal Security on Campus
  - Issues Related to Diversity within the Undergraduate Student Body
Note: While there are a number of instruments that allow
for assessment of student satisfaction among
undergraduate students, there is very little in the way of
instrumentation for assessing graduate student research.
Graduate students are virtually forgotten when it comes to
any facet of student research, and data collection
instruments are generally locally developed, if they exist at
all.
We have just developed a Graduate Student Satisfaction
Survey and will be happy to share it with interested parties.
Student Engagement
• College Student Expectations Questionnaire (CSXQ)
• College Student Experiences Questionnaire (CSEQ)
• National Survey of Student Engagement (NSSE)
Benchmarks of Effective Educational Practice
(NSSE)
• Level of academic challenge
  - Course prep, quantity of readings and papers, course emphasis, campus environment emphasis
• Student interactions with faculty members
  - Discuss assignments/grades, career plans, and readings outside of class; prompt feedback; student-faculty research
• Supportive campus environment
  - Resources to succeed academically, cope with non-academic issues, social aspect, foster relationships with students, faculty, staff
• Active and collaborative learning
  - Ask questions and contribute in class, class presentations, group work, tutor peers, community-based projects, discuss course-related ideas outside class
• Enriching educational experiences
  - Interact with students of a different race or ethnicity, with different religious and political beliefs, different opinions and values; campus environment encourages contact among students of different economic, social, and racial or ethnic backgrounds; use of technology; participate in a wide range of activities (internships, community service, study abroad, independent study, senior capstone, co-curricular activities, learning communities)
IV.A.1. TO WHAT EXTENT HAS YOUR EXPERIENCE AT THIS INSTITUTION CONTRIBUTED TO ACQUIRING A BROAD GENERAL EDUCATION?
[Bar charts: UD Freshmen and UD Seniors, 2001 vs. 2005; and 2005 Freshmen and Seniors, UD vs. Research Universities vs. Peer Universities. Scale: 1 = Very Little; 2 = Some; 3 = Quite a Bit; 4 = Very Much]
Both 2005 University of Delaware freshmen and seniors estimate greater gains in general education than in 2001. The 2005 University of Delaware responses are not statistically significantly different from those at national research and peer universities.
V.B.2. HOW WOULD YOU BEST CHARACTERIZE YOUR RELATIONSHIPS WITH FACULTY AT THIS INSTITUTION?
[Bar charts: UD Freshmen and UD Seniors, 2001 vs. 2005; and 2005 Freshmen and Seniors, UD vs. Research Universities vs. Peer Universities. Scale: 1 = Unavailable, Unhelpful, Unsympathetic; 7 = Available, Helpful, Sympathetic]
2005 University of Delaware freshmen and seniors both report a decline in the quality of their relationships with faculty. That said, the University of Delaware responses from both groups are not statistically significantly different from those at national research and peer universities.
Graduating Students
EMPLOYMENT AND EDUCATIONAL STATUS OF
BACCALAUREATES BY CURRICULUM GROUP
CLASS OF 2005
Curriculum Group                            Number      Respondents        Full-Time     Part-Time     Pursuing       Still Seeking
                                            in Class    N         %        Employment    Employment    Further Ed.    Employment
Agriculture & Natural Resources               175         63     36.0%        58.7%         12.7%         19.0%           7.9%
Arts & Sciences
  Humanities                                  417        145     34.8%        67.6%         10.3%         13.1%           9.0%
  Social Sciences                             904        295     32.6%        71.2%          4.4%         14.2%           8.5%
  Life & Health Sciences                      158         63     39.9%        73.0%           --          25.4%            --
  Physical Sciences                           166         61     36.7%        72.1%           --          16.4%           9.8%
Business & Economics                          571        227     39.8%        89.9%          3.1%          2.2%           4.4%
Engineering                                   189        103     54.5%        85.4%           --          11.7%           1.9%
Health Sciences                               390        144     36.9%        72.9%          2.8%         21.5%           2.8%
Human Services, Educ. & Public Policy         557        213     38.2%        85.9%          2.8%          4.2%           5.6%
2005 University Total                       3,527      1,314     37.3%        77.2%          4.0%         11.9%           5.9%
NOTE: Percentages may not total to 100.0 because of rounding
Alumni Research
• Commercially prepared instruments exist. Decide if they
meet your needs or if you have to develop your own.
• Decide early on why you are doing this research: Are you
assessing the continuing relevance of the college
experience? Are you cultivating prospects for the
Development Office? Both? Be up front if fund raising is a
component.
• Decide which classes you need to survey; don’t go after
every living alumnus unless you are a very young
institution.
Assessing Student Learning Outcomes
• I’ll provide only a brief overview, as there are others (Linda
Suskie, Trudy Banta, Jeff Seybert) who are far better versed
than I am.
• That said, understand that assessment of student learning
is at the core of demonstrating overall institutional
effectiveness.
• Assessment of student learning is a direct response to the
inadequacy of student grades for describing general
student learning outcomes.
According to Paul Dressel of Michigan State University
(1983), Grades Are:
“An inadequate report of an inaccurate judgment by a biased and variable judge of the extent to which a student has attained an undefined level of mastery of an unknown proportion of an indefinite amount of material.”
There is no “one size fits all” approach to
assessment of learning across the disciplines
None of these should be applied to evaluation of individual student performance for
purposes of grading and completion/graduation status.
1. Standardized Tests
   • General Education or Discipline Specific
   • State, Regional, or National Licensure Exams
2. Locally Produced Tests/Items
   • “Stand Alone” or Imbedded
3. Portfolios/Student Artifacts
   • Collections of Students’ Work
   • Can Be Time Consuming, Labor Intensive, and Expensive
4. Final Projects
   • Demonstrate Mastery of Discipline and/or General Education
5. Capstone Experiences/Courses
   • Entire Course, Portion of a Course, or a Related Experience (Internship, Work Placement, etc.)
Non-Student Measures of
Institutional Effectiveness:
Teaching Productivity, Instructional Costs,
and Externally Funded Scholarship
Budget Support Metrics
In 1988, the University of Delaware…
• Was transitioning from a highly centralized, “closed”
management style with respect to sharing of information.
• Had grown from a total enrollment of 7,900 students in the late
1960’s to 20,000+ students in the mid-1980’s.
• When financial data were examined, found itself with $9 million
in recurring expenses on non-recurring revenue sources.
• Needed to eliminate 240+ full time positions from the basic
budget to achieve a balanced budget.
Ground Rules in Making Budgetary Decisions
• Decisions would be rooted in quantitative and qualitative data
that would be collegially developed and broadly shared.
• Savings would initially be achieved by eliminating vacant
positions, outsourcing non-essential functions, and taking
advantage of technology.
• In eliminating human and fiscal resources, to the largest extent
possible the academic core of the University would be insulated.
However, over the long term, resources would need to be
reallocated between and among academic units.
How Best to Make Resource Reallocation Decisions Within
Academic Units?
• A series of budget support metrics would be developed for
measuring instructional productivity and costs within academic
departments and programs.
• Input as to which variables should be used was sought from
deans and department chairs. These variables were
supplemented by those identified by the Office of Institutional
Research and Planning, and a final set of productivity/cost
variables was achieved through consensus.
• The resulting product became known as “Budget Support Notebooks.”
Looking at a Budget Support Notebook Page from a
Humanities Department Within the College of Arts
and Science
BUDGET SUPPORT DATA
College of Arts and Science
1996-97 Through 1998-99
Department X
A. TEACHING WORKLOAD DATA
                                                Fall      Fall      Fall      Spring    Spring    Spring
                                                1996      1997      1998      1997      1998      1999
FTE MAJORS
  Undergraduate                                   38        31        39         38        40        39
  Graduate                                         0         0         0          0         0         0
  Total                                           38        31        39         38        40        39
DEGREES GRANTED
  Bachelor's                                    ----      ----      ----         20        19        19
  Master's                                      ----      ----      ----          0         0         0
  Doctorate                                     ----      ----      ----          0         0         0
  Total                                         ----      ----      ----         20        19        19
STUDENT CREDIT HOURS
  Lower Division                               6,246     5,472     5,448      4,518     6,156     5,478
  Upper Division                                 726       638       869      1,159       951       966
  Graduate                                       183       153       129        195       276       135
  Total                                        7,155     6,263     6,446      5,872     7,383     6,579
% Credit Hours Taught by Faculty on Appointment  77%       81%       77%        82%       91%       82%
% Credit Hours Taught by Supplemental Faculty    23%       19%       23%        18%        9%       18%
% Credit Hours Consumed by Non-Majors            98%       97%       98%        96%       98%       97%
% Credit Hours Consum ed by Non-Majors
BUDGET SUPPORT DATA
College of Arts and Science
1996-97 Through 1998-99
Department X
A. TEACHING WORKLOAD DATA (continued)
                                                Fall      Fall      Fall      Spring    Spring    Spring
                                                1996      1997      1998      1997      1998      1999
FTE STUDENTS TAUGHT
  Lower Division                                 416       365       363        301       410       365
  Upper Division                                  48        43        58         77        63        64
  Graduate                                        20        17        14         22        31        15
  Total                                          485       424       435        400       504       445
FTE FACULTY
  Department Chair                               1.0       1.0       1.0        1.0       1.0       1.0
  Faculty on Appointment                        15.0      16.0      15.0       15.0      15.0      15.0
  Supplemental Faculty                           1.5       1.0       1.3        1.0       0.8       1.5
  Total                                         17.5      18.0      17.3       17.0      16.8      17.5
WORKLOAD RATIOS
  Student Credit Hrs./FTE Faculty              408.9     347.9     373.7      345.4     440.8     375.9
  FTE Students Taught/FTE Faculty               27.7      23.6      25.2       23.5      30.1      25.4
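A minimal sketch of the workload ratio calculations, using the Fall 1996 figures from the page above.

```python
# Minimal sketch (Fall 1996 figures from the Budget Support Notebook page above):
# computing the two departmental workload ratios.

student_credit_hours = 7_155    # total SCH taught by the department
fte_students_taught = 485       # FTE students taught
fte_faculty = 17.5              # chair + faculty on appointment + supplemental faculty

sch_per_fte_faculty = student_credit_hours / fte_faculty
students_per_fte_faculty = fte_students_taught / fte_faculty

print(f"Student credit hours per FTE faculty: {sch_per_fte_faculty:.1f}")    # ~408.9
print(f"FTE students taught per FTE faculty:  {students_per_fte_faculty:.1f}")  # ~27.7
```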
BUDGET SUPPORT DATA
College of Arts and Science
1996-97 Through 1998-99
Department X
B. FISCAL DATA
                                                  FY 1996 ($)     FY 1997 ($)     FY 1998 ($)
RESEARCH AND SERVICE
  Research Expenditures                                     0           5,151             499
  Public Service Expenditures                               0               0               0
  Total Sponsored Research/Service                          0           5,151             499
  Sponsored Funds/FTE Fac. on Appointment                   0             312              31
COST OF INSTRUCTION
  Direct Instructional Expenditures                 1,068,946       1,141,927       1,144,585
  Direct Expense/Student Credit Hour                       81              84              88
  Direct Expense/FTE Student Taught                     1,198           1,229           1,301
REVENUE MEASURES
  Earned Income from Instruction                    3,960,208       4,366,720       4,311,275
  Earned Income/Direct Instructional Expense             3.73            3.82            3.77
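A minimal sketch of the cost and revenue ratios above, using the FY 1998 figures; treating the academic-year denominators as Fall 1998 plus Spring 1999 is an assumption about how the notebook figures were built.

```python
# Minimal sketch (FY 1998 figures from the Budget Support Notebook pages above):
# computing the cost and revenue ratios.

direct_instructional_expenditures = 1_144_585
annual_student_credit_hours = 6_446 + 6_579   # assumed: Fall 1998 + Spring 1999 SCH
fte_students_taught = 435 + 445               # assumed: Fall 1998 + Spring 1999 FTE students
earned_income_from_instruction = 4_311_275

cost_per_sch = direct_instructional_expenditures / annual_student_credit_hours
cost_per_fte_student = direct_instructional_expenditures / fte_students_taught
income_to_expense = earned_income_from_instruction / direct_instructional_expenditures

print(f"Direct expense per student credit hour: ${cost_per_sch:.0f}")          # ~$88
print(f"Direct expense per FTE student taught:  ${cost_per_fte_student:.0f}")  # ~$1,301
print(f"Earned income / direct instructional expense: {income_to_expense:.2f}")  # ~3.77
```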
In the Initial Stages, Summary Data Such as Budget
Support Data May be Challenged, and Must be Supported
by Solid Background Analysis
Who Is Teaching What to Whom?
Table 1: Departmental Workload Verification Data
(Columns: Name; Rank, Tenure Status, and Home Department; Course; Number of Organized Sections; Credits; Course Type; % Load; Students Enrolled; Student Credit Hours; Teaching Credits; S-Contract?)

Thomas Jones – Chairperson, Tenured, Sociology
  SOC 454: 1 organized section, 3 Hrs., Lecture, 100% load, 27 students enrolled, 3 teaching credits, S-contract: No
  SOC 964: 0 organized sections, 3-12 Hrs., Supv. Study, 100% load, 3 students enrolled, 1 teaching credit, S-contract: No
  Total: 30 students enrolled, 4 teaching credits

William Davis – Professor, Tenured, Sociology
  SOC 201: 1 organized section, 3 credits, Lecture, 100% load, 246 students enrolled, 738 student credit hours, 3 teaching credits, S-contract: No
  SOC 203: 1 organized section, 3 credits, Lecture, 100% load, 100 students enrolled, 300 student credit hours, 3 teaching credits, S-contract: No
  Total: 346 students enrolled, 1,038 student credit hours, 6 teaching credits

Roger Brown – Professor, Tenured, Sociology
  CSC 311 (cross listed with SOC 311): 1 organized section, 3 Hrs., Lecture, 100% load, 8 students enrolled, 24 student credit hours, S-contract: No
  SOC 311 (cross listed with CSC 311): 1 organized section, 3 Hrs., Lecture, 100% load, 38 students enrolled, 114 student credit hours, 3 teaching credits, S-contract: No
  SOC 327: 1 organized section, 3 Hrs., Lecture, 100% load, 13 students enrolled, 39 student credit hours, 3 teaching credits, S-contract: No
  SOC 366: 0 organized sections, 1-3 Hrs., Supv. Study, 100% load, 1 student enrolled, 1 student credit hour, 1 teaching credit, S-contract: No
  SOC 866: 0 organized sections, 1-6 Hrs., Supv. Study, 100% load, 1 student enrolled, 3 student credit hours, 1 teaching credit, S-contract: No

Mary Smith – Assistant Professor, Tenure Track, Sociology
  SOC 467 (400 level meets with 600 level): 1 organized section, 3 Hrs., Lecture, 100% load, 7 students enrolled, 21 student credit hours, S-contract: No
  SOC 667 (600 level meets with 400 level): 1 organized section, 3 Hrs., Lecture, 100% load, 1 student enrolled, 3 student credit hours, S-contract: No
  SOC 213 (cross listed with WOMS 213): 1 organized section, 3 Hrs., Lecture, 100% load, 77 students enrolled, 231 student credit hours, 3 teaching credits, S-contract: No
  WOMS 213 (cross listed with SOC 213): 1 organized section, 3 Hrs., Lecture, 100% load, 21 students enrolled, 63 student credit hours, S-contract: No
  Total: 106 students enrolled, 318 student credit hours, 6 teaching credits
At What Cost?
Departmental Expenditures, by Object and by Function: Fiscal Year 1999
Undergraduate Department in Humanities
(Columns, by function: Instruction (01-08) | Departmental Research (09) | Org. Activity Educ. Depts. (10) | Research (21-39) | Public Service (41-43) | Academic Support (51-56))

Salaries
  Professionals                                  27,570 |     0 | 0 |   0 | 0 | 0
  Faculty, Full-Time (Including Dept. Chair)    993,612 |     0 | 0 |   0 | 0 | 0
  Faculty, Part-Time (Including Overload)        23,985 |     0 | 0 |   0 | 0 | 0
  Graduate Students                                   0 |     0 | 0 |   0 | 0 | 0
  Post Doctoral Fellows                               0 |     0 | 0 |   0 | 0 | 0
  Tuition/Scholarship                                 0 |     0 | 0 |   0 | 0 | 0
  Salaried/Hourly Staff                          59,601 |     0 | 0 |   0 | 0 | 0
  Fringe Benefits                                     0 |     0 | 0 |   0 | 0 | 0
  Subtotal                                    1,104,768 |     0 | 0 |   0 | 0 | 0
Support
  Miscellaneous Wages                             1,884 |     0 | 0 | 499 | 0 | 0
  Travel                                         12,275 | 6,510 | 0 |   0 | 0 | 0
  Supplies and Expenses                          10,496 | 2,925 | 0 |   0 | 0 | 0
  Occupancy and Maintenance                       1,270 |     0 | 0 |   0 | 0 | 0
  Equipment                                           0 |     0 | 0 |   0 | 0 | 0
  Other Expenses                                 13,932 |   230 | 0 |   0 | 0 | 0
  Credits and Transfers                               0 |     0 | 0 |   0 | 0 | 0
  Subtotal                                       39,817 | 9,665 | 0 | 499 | 0 | 0
TOTAL EXPENDITURES                            1,144,585 | 9,665 | 0 | 499 | 0 | 0
Ground Rules For Using Budget Support Data
• Decisions are never made on the basis of a single year of
data. Trend information is the goal.
• Data are not used to reward or penalize, but rather as tools
of inquiry as to why productivity/cost patterns are as they
appear.
• It is understood that there are qualitative dimensions to
productivity and cost that are not captured in this analysis.
Budget Support Data
• Gave academic units a sense of buy-in and ownership of the
data being used for academic management and planning.
• Helped transform academic departments from 54 loosely
confederated fiefdoms into a coherent university, with each unit
contributing in meaningful ways to realization of the University’s
mission areas of teaching, research, and service.
Extending Budget Support Analysis….
• As useful as appropriate comparisons are between and among
like departments within the University, comparisons would be
substantially enhanced if, for example, the History Department
at the University of Delaware could be compared with History
Departments at actual peer universities, and at universities with
History Departments to which the University of Delaware
aspires.
• Out of this need for external comparative information, the
Delaware Study of Instructional Costs and Productivity was born.
Delaware Study of Instructional Costs and
Productivity
• Over the past decade, the Delaware Study of Instructional Costs and
Productivity has emerged as the tool of choice for benchmarking data
on faculty teaching loads, direct instructional costs, and externally
funded faculty scholarship, all at the academic discipline level of
analysis.
• The emphasis on disciplinary analysis is non-trivial. Over 80 percent of
the variance in instructional expenditures across four-year
postsecondary institutions is accounted for by the disciplines that
comprise a college’s or university’s curriculum.
Delaware Study:
Teaching Load/Cost Data Collection Form
Using Delaware Study Data
• We provide the Provost with data from multiple
years of the Delaware Study, looking at the
University indicators as a percentage of the national
benchmark for research universities.
• The Provost receives a single sheet for each
academic department, with graphs reflecting the
following indicators: Undergraduate Fall SCRH/FTE
Faculty, Total Fall SCRH/FTE Faculty; Total AY
SCRH/FTE Faculty (All); Fall Class Sections/FTE
Faculty; Direct Cost/SCRH; External Funds/FTE
faculty
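A minimal sketch of expressing departmental indicators as a percentage of the national research-university benchmark, as described above; the indicator values shown are hypothetical, not actual Delaware Study figures.

```python
# Minimal sketch (hypothetical department figures): expressing Delaware Study
# indicators as a percentage of the national research-university benchmark.

ud_values = {
    "Total Fall SCRH per FTE Faculty": 238.0,
    "Direct Cost per SCRH": 195.0,
    "External Funds per FTE Faculty": 88_000.0,
}
national_benchmarks = {
    "Total Fall SCRH per FTE Faculty": 251.0,
    "Direct Cost per SCRH": 172.0,
    "External Funds per FTE Faculty": 102_000.0,
}

for indicator, ud in ud_values.items():
    pct_of_benchmark = 100.0 * ud / national_benchmarks[indicator]
    print(f"{indicator}: {pct_of_benchmark:.0f}% of national benchmark")
```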
Science Department
[Six trend graphs comparing UD with the national research-university benchmark ("Nat'l"), Fall 1994 through Fall 1999 (FY95 through FY00 for the expenditure panels): Undergraduate Student Credit Hours Taught per FTE T/TT Faculty; Total Student Credit Hours Taught per FTE T/TT Faculty; Total Class Sections Taught per FTE T/TT Faculty; Total Student Credit Hours Taught per FTE Faculty (All Categories); Direct Instructional Expenditures per Student Credit Hour; Separately Budgeted Research and Service Expenditures per FTE T/TT Faculty.]
The Delaware Study – Next Steps
• As useful as Delaware Study teaching load/cost benchmarks
are, they do not address the non-classroom dimensions of
faculty activity in an institution and its academic programs.
• It is possible that quantitative productivity and cost
indicators for a given program/discipline may differ
significantly from other institutional, peer, and national
benchmarks for wholly justifiable reasons of quality that
can be reflected in what faculty do outside of the
classroom.
• However, this cannot be determined unless measurable,
proxy indicators of quality are collected.
The Delaware Study – Faculty Activity Study
• In Fall of 2001, the University of Delaware was awarded a second
three-year FIPSE grant.
• This grant underwrote the cost of developing data collection
instruments and protocols for assessing out-of-classroom facets of
faculty activity.
• Once again, the grant supported an Advisory Committee charged with
responsibility for refining and enhancing data collection
instrumentation, data definitions, and study methodology.
• Faculty Activity Study collects data on 43 discrete variables related to
instruction, scholarship, service to the institution, service to the
profession, and public service.
Delaware Study:
Faculty Activity Study Data Collection Form
Transparency in Assessment
www.udel.edu/ir
Other Issues Related to Institutional Effectiveness
• In order to attract and retain the most capable faculty and
staff, you have to compensate them.
  - AAUP Academe (March/April issue) annually publishes average faculty salary, by rank, for most institutions in the country, 2-year and 4-year.
  - CUPA-HR and the Oklahoma Salary Study annually publish average faculty salary, by rank, by discipline, by Carnegie institution type. They also publish average salary for newly hired Assistant Professors.
  - Salary equity and salary compression studies
Other Issues Related to Institutional Effectiveness
• In order to attract and retain the most capable faculty and
staff, you have to provide a hospitable workplace.
  - Employee Satisfaction Studies
  - Campus Climate Studies
Economic Impact Studies
• While not a direct measure of institutional effectiveness, economic
impact studies can be a powerful tool in shaping college/government
relations for public institutions in particular. The methodology is
straightforward, and we will share it if you contact our office.
“In 2007, the University, its employees, and students spent approximately $410 million
in Delaware. Using U.S. Department of Commerce Bureau of Economic Analysis
multipliers, University of Delaware expenditures have an overall impact on the local
economy of approximately $751 million. In addition to the intellectual capital that the
University provides to the State of Delaware, there is significant financial capital as well.
University expenditures total more than three times the State’s appropriation to the
institution, and the overall economic impact of those expenditures is over six times the
State appropriation.”
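A minimal sketch of the economic impact arithmetic described above; the multiplier shown is simply the ratio implied by the quoted figures, not an actual BEA RIMS II value, and actual multipliers vary by industry and region.

```python
# Minimal sketch (figures from the passage above): estimating overall economic
# impact from direct spending with an output multiplier.

direct_spending = 410_000_000     # University, employee, and student spending in Delaware (2007)
output_multiplier = 751 / 410     # implied overall multiplier (~1.83); illustrative only,
                                  # not an actual BEA RIMS II table value

overall_impact = direct_spending * output_multiplier
print(f"Estimated overall economic impact: ${overall_impact:,.0f}")   # ~$751 million
```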
Institutional “Dashboards” that report on key success indicators can be a succinct means of reporting on basic measures of institutional effectiveness.
www.udel.edu/ir/UDashboard
Claims of institutional effectiveness are stronger when the focal institution’s standing on important measures is compared with that of peer institutions.
Choosing Peer Groupings
• Scientific – Cluster Analysis or Other Multivariate Tool (see the sketch after this list)
• Pragmatic - e.g. Compensation Peers
• “Whatever!” - e.g. Admissions Peers
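A minimal sketch of the “scientific” option above, grouping institutions by cluster analysis on a few standardized characteristics; the institutions, variables, and figures are hypothetical.

```python
# Minimal sketch (hypothetical institutional data): a "scientific" peer grouping
# via k-means cluster analysis on standardized institutional characteristics.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

institutions = ["Focal U", "Univ A", "Univ B", "Univ C", "Univ D", "Univ E"]
# columns: total enrollment, % undergraduate, R&D expenditures ($M), endowment ($M)
X = np.array([
    [20_000, 0.80, 150, 1_200],
    [22_000, 0.78, 140, 1_000],
    [35_000, 0.65, 600, 4_500],
    [19_500, 0.82, 130,   900],
    [ 8_000, 0.95,  10,   300],
    [33_000, 0.60, 550, 5_000],
])

# Standardize the variables, then group institutions into clusters.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X)
)
focal_cluster = labels[institutions.index("Focal U")]
peers = [name for name, lab in zip(institutions, labels)
         if lab == focal_cluster and name != "Focal U"]
print("Candidate peers:", peers)
```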
We are extending the Dashboard concept to
include key variables related to the University’s
new Strategic Plan that enable us to compare our
position vis-à-vis actual peers and aspirational
peers.
[Paired bar charts comparing the University of Delaware with each institution in its actual peer group and its aspirational peer group on three of the dashboard indicators: Student Faculty Ratio, Freshman to Sophomore Retention Rate, and Six Year Graduation Rate.]
Dashboard Variables Being Collected
STUDENT CHARACTERISTICS
Percent of Accepted Applicants Matriculated
Total Minority Percentage in Undergraduate Student Body
Freshman to Sophomore Retention Rate
Four Year Graduation Rate
Six Year Graduation Rate
RESEARCH ACTIVITY
Total R&D Expenditures per Full Time Faculty
Total Service Expenditures per Full Time Faculty
FINANCE
University Endowment as of June 30
Alumni Annual Giving Rate
INSTRUCTION
Full Time Students per Full Time Faculty
Total Degrees Granted
Total Doctoral Degrees Granted
Percent of Faculty With Tenure
Percent of Faculty That Are Full Time
Percent of Women Among Full Time Faculty
Percent of Minorities Among Full Time Faculty
Percent of Minorities Among Full Time Staff
Total Faculty Who Are National Academy Members
A Possible Format for the Strategic Dashboard
Another Potential Format for Strategic Dashboard
That’s All, Folks!
• What have I missed that you would like covered?
• Other questions????
• [email protected]