MISSION POSSIBLE:
ASSESSING GRADUATE
AND PROFESSIONAL
PROGRAMS
Dr. Timothy S. Brophy
Director of Institutional Assessment
University of Florida
Gainesville, FL
TODAY’S GOALS
 Part 1: To introduce, describe, and explain the basic elements
of student learning outcomes and program goals, their
development, and measurement
 Part 2: To share a structure for assessment planning and
reporting for graduate and professional programs and review
an example
 Part 3: Review and discuss graduate sample academic
assessment data reports
COMMON CHALLENGES
Size and scope
•Multiple colleges/units
•Undergraduate, graduate, professional, and certificate programs
Institutional consistency
•Outcomes
•Assessment reporting
•Cycles
Institutional Culture
Management and Tools
Honoring unit autonomy, disciplinary distinctions, and institutional requirements
Faculty comportment
HOW ACCREDITORS DEFINE
EDUCATIONAL PROGRAMS
An educational program is a coherent
set of courses leading to a credential
(degree, diploma, or certificate)
awarded by the institution.
(SACSCOC, 2011)
EXPECTATIONS FOR ACADEMIC
ASSESSMENT FOR ACCREDITATION
There is a clear expectation that an institution be able to
demonstrate institutional effectiveness for all its diplomas,
certificates, and undergraduate and graduate educational
degree programs.
The expectation is that the institution will engage in ongoing
planning and assessment to ensure that for each academic
program, the institution develops and assesses student
learning outcomes.
Program and learning outcomes specify the knowledge,
skills, values, and attitudes students are expected to attain in
courses or in a program.
EXPECTATIONS FOR ACADEMIC
ASSESSMENT
Methods for assessing the extent to which students achieve
these outcomes are appropriate to the nature of the discipline,
and consistent over time to enable the institution to evaluate
cohorts of students who complete courses or a program.
Shared widely within and across programs, the results of this
assessment can affirm the institution’s success at achieving its
mission and can be used to inform decisions about curricular
and programmatic revisions.
At appropriate intervals, program and learning outcomes and
assessment methods are evaluated and revised.
PART 1:
STUDENT LEARNING
OUTCOMES, PROGRAM
GOALS, AND OUTPUTS
DEFINE AND DISSEMINATE THE TERMS
Student Learning Outcomes (SLOs) are
defined generally as “what students are
expected to know and be able to do by
completion of their degree program”
Define this for your faculty and ensure that
this definition is consistent across campus
and clearly posted
CONSIDER A CATEGORICAL ORGANIZING
FRAMEWORK FOR SLOS
Undergraduate: Content Knowledge, Critical Thinking, Communication
Graduate: Content Knowledge, Skills, Professional Behavior
THREE CHARACTERISTICS OF SLOS:
RECENCY, RELEVANCE, AND RIGOR
Student Learning Outcomes reflect the curriculum,
the discipline, and faculty expectations; as these
elements evolve, learning outcomes change.
Recency has to do with the degree to which the
outcome reflects current knowledge and practice
in the discipline.
Relevance is the degree to which the outcome
relates logically and significantly to the discipline
and the degree.
Rigor has to do with the degree of academic
precision and thoroughness that the outcome
requires to be met successfully.
DISTINGUISH OUTPUTS FROM OUTCOMES
Outputs describe and count what we do and whom we reach, and
represent products or services we produce. Processes deliver
outputs; what is produced at the end of a process is an output.
An outcome is a level of performance or achievement. It may be
associated with a process or its output. Outcomes imply
measurement - quantification - of performance.
OUTCOMES AND OUTPUTS: WHAT IS THE
DIFFERENCE?
We seek to measure outcomes as well as their associated
outputs; however, SLOs focus on outcomes.
For example, while we produce a number of new graduates (the
output), it is critical that we have a measure of the quality of the
graduates as defined by the college or discipline (the outcome).
Outcomes describe, in measurable terms, these quality
characteristics by defining our expectations for students.
EXERCISE 1: ARE THESE
RESULTS STATEMENTS
OUTPUTS OR
OUTCOMES?
OUR PROGRAM GRADUATED 25
STUDENTS IN SPRING 2013.
A. Output
B. Outcome
75% OF OUR STUDENTS ACHIEVED LEVEL 4 (OUT OF
5) ON OUR PRESENTATION ASSESSMENT RUBRIC.
A. Output
B. Outcome
WE RECRUITED 10 ADDITIONAL
STUDENTS IN 2013-14.
A. Output
B. Outcome
IN 2013, OUR DOCTORAL STUDENTS PUBLISHED 10
PAPERS IN THE INTERNATIONAL JOURNAL OF
PSYCHOLOGY.
A. Output
B. Outcome
DISTINGUISH
SLOS AND PROGRAM GOALS
Student Learning Outcomes (SLOs) describe what students should
know and be able to do as a result of completing an academic
program. Program faculty set targets for their SLOs.
Program Goals describe the unit's expectations for programmatic
elements, such as admission criteria, acceptance and graduation
rates, etc.
EXERCISE 2: STUDENT
LEARNING OUTCOMES
OR PROGRAM GOALS?
WE WILL LOWER OUR ATTRITION RATE
TO 10%.
A. Program Goal
B. Student Learning Outcome
MUSIC EDUCATION STUDENTS DISCRIMINATE
MUSICAL QUALITY BASED ON SOUND MUSICAL
REASONING.
A. Program Goal
B. Student Learning Outcome
WE WILL REDUCE THE AVERAGE TIME TO DEGREE IN
OUR GRADUATE PROGRAM FROM THE 2012-13 RATE
OF 6.5 YEARS TO 5 YEARS IN 2014-15.
A. Program Goal
B. Student Learning Outcome
STUDENTS ANALYZE EXPERIMENTAL DATA AND
INTERPRET RESULTS IN THE CELLULAR AND
MOLECULAR SCIENCES.
A. Program Goal
B. Student Learning Outcome
ENSURE THE OUTCOME IS MEASURABLE
EFFECTIVE SLOS:
 Focus on what students will know and be able to do.
 All disciplines have a body of core knowledge that students must
learn to be successful as well as a core set of applications of that
knowledge in professional settings.
 Describe observable and measurable actions or behaviors.
 Effective SLOs present a core set of observable, measurable
behaviors. Measurement tools vary from exams to complex tasks
graded by rubrics.
 The key to measurability: an active verb that describes an
observable behavior, process, or product
 A framework for developing SLOs: Bloom’s Taxonomy
VERBS AND PHRASES THAT COMPLICATE
MEASURABILITY
Understand
•An internal process that is indicated by demonstrated behaviors – OK for learning goals but not
recommended for program or course SLOs
Appreciate; value
•Internal processes that are indicated by demonstrated behaviors closely tied to personal choice or
preference; OK if the appreciation or valuing is supported by discipline-specific knowledge
Become familiar with
•Focuses assessment on “becoming familiar,” not familiarity
Learn about, think about
•Not observable; demonstrable through communication or other demonstration of learning
Become aware of, gain an awareness of
•Focuses assessment on becoming and/or gaining – not actual awareness
Demonstrate the ability to
•Focuses assessment on ability, not achievement or demonstration of a skill
DEVELOPING MEASURABLE SLOS:
A THREE-LEVEL MODEL (CARRIVEAU, 2010)
Program Learning Goal Level – programs establish learning goals for the degree;
these goals require multiple actions over time to measure
Program-level Student Learning Outcome – these describe what students will do
to demonstrate they have met the learning goals
Course-level Student Learning Outcome – these are determined by the faculty
and specify course-level, observable products or demonstrations
This model connects course-level and program-level SLOs directly to the program learning goals.
EXERCISE 3: ARE THE
FOLLOWING OUTCOME
STATEMENTS
MEASURABLE?
STUDENTS UNDERSTAND GOOD WRITING STYLE.
A. Yes
B. No
STUDENTS SIGHT-SING A 16-MEASURE
MELODY WITH NO ERRORS.
A. Yes
B. No
STUDENTS EXPLORE AND LEARN ABOUT
GEOLOGICAL FORMATIONS.
A. Yes
B. No
STUDENTS DEFINE THE ETHICAL RESPONSIBILITIES OF
BUSINESS ORGANIZATIONS AND IDENTIFY RELEVANT
ETHICAL ISSUES.
A. Yes
B. No
BALANCE DIRECT AND INDIRECT
ASSESSMENTS
Direct assessments of student learning are those that provide for
direct examination or observation of student knowledge or skills
against measurable performance indicators.
Indirect assessments are those that ascertain the opinion or
self-report of the extent or value of learning experiences
(Rogers, 2011).
EXERCISE 4: DIRECT OR
INDIRECT
ASSESSMENTS?
COURSE FINAL EXAM
A. Direct
B. Indirect
SERU OR NSSE SURVEY DATA.
A. Direct
B. Indirect
FINAL PAPER, PERFORMANCE, OR PRESENTATION
GRADED BY A FACULTY-DEVELOPED RUBRIC.
A. Direct
B. Indirect
SENIOR EXIT INTERVIEW.
A. Direct
B. Indirect
PART 2: PLANNING AND
REPORTING
ACADEMIC ASSESSMENT PLANNING
Academic Assessment Plans provide a
common framework for units to plan
how they assess and measure student
achievement of the SLOs
Plans also present the process for how
the data from these assessments are
used to enhance the quality of student
learning
WHY PLAN?
Provides faculty a focal point for the discussion
of the assessment of student learning in the
degree programs.
Planning discussions provide an opportunity to
revisit the curriculum and its relationship to the
SLOs.
Provides a consistent reference resource when
faculty and leadership change.
GRADUATE AND PROFESSIONAL
PROGRAM ASSESSMENT PLAN
The Graduate/Professional Program Assessment Plan comprises:
•Mission Alignment
•Student Learning Outcomes
•Assessment Oversight
•Methods and Procedures
•Assessment Cycle
•Assessment Timeline
•Research
ACADEMIC ASSESSMENT PLAN
APPROVAL PROCESS
1. Call issued by the Office of Institutional Assessment
2. SACS coordinators collect and review plans
3. Plans are submitted to the online approval system
4. Director of Institutional Assessment reviews plans
5. Approved plans go to the Academic Assessment Committee (AAC)
6. Plans approved by the AAC go to the University Curriculum
Committee (UCC)
7. UCC approves plans and forwards to the Student Academic
Support System or Graduate School
8. These are uploaded to the catalog
SECTIONS OF THE PLAN
Template: Figure 4, pp. 3-4 in your handout
Rubric: Figure 1, p. 2
MISSION
Describe briefly the program’s mission, and how
the program meets the department, college, and
university missions. For example:
The mission of the <enter name> program is to
<enter text>. This aligns with the department
mission by <enter text>. It also supports the
college mission to <enter text>. It supports the
university mission by <enter text>.
STUDENT LEARNING OUTCOMES
 The complete file of graduate and professional
program SLOs is on the Institutional Assessment
website – http://assessment.aa.ufl.edu
 UF’s Graduate SLO categories are Content, Skills,
and Professional Behaviors
 Online resources at the UF Institutional Assessment
website:
 “Writing Measurable Student Learning Outcomes”
PowerPoint
 “Guide to Writing Student Learning Outcomes”
RESEARCH
What are the research expectations for students in your program?
Briefly describe the research expectations for students in the degree
program.
How does your program prepare the students to become researchers in the
discipline?
If the degree is NOT a research degree, briefly state this, and include a
brief description of any research-related activities that students complete
in the program.
ASSESSMENT TIMELINE
The Assessment Timeline is a matrix that shows when
the SLOs are assessed and measured in the program
It should be clear to a student when an assessment
occurs, and the type of assessment that is planned
(assignment, project, paper, performance,
presentation, etc.)
ASSESSMENT TIMELINE
Program_____________________________________ College ____________________________________
SLOs                     Assessment 1   Assessment 2   Assessment 3   [Enter more as needed]
Content Knowledge
  #1
  #2
Skills
  #3
  #4
Professional Behavior
  #5
  #6
ASSESSMENT CYCLE
The Assessment Cycle is a matrix that graphically organizes the
frequency of SLO assessment
The Assessment Cycle is a multi-year process that is
completed in three years
It should be clear to a student when an assessment occurs, and
the type of assessment that is planned (assignment, project,
paper, performance, presentation, etc.)
ASSESSMENT CYCLE
Program _____________________________ College _____________________________
Analysis and Interpretation: [Enter date or time frame here]
Improvement Actions: Completed by [Enter date here]
Dissemination: Completed by [Enter date here]

SLOs                                     Year: 11-12  12-13  13-14  14-15  15-16
Content Knowledge
  #1
  #2
Critical Thinking (Undergrad) / Skills (Grad/Prof)
  #3
  #4
Communication (Undergrad) / Professional Behavior (Grad/Prof)
  #5
  #6
METHODS AND PROCEDURES
Each unit employs various methods and procedures to assess and
collect data on student learning.
In this section of the plan, units provide information on their specific
methods and procedures for the SLO assessments they identify in
Assessment Timelines.
They must include a sample grading rubric.
It’s OK to sample from the total student population – see the
recommended sample size chart on the Institutional Assessment
website
ASSESSMENT OVERSIGHT
These are the individuals who are responsible for managing the
assessment work in your unit.
•List everyone and their contact information
•The first person on the list should be the lead contact
•Purpose: ongoing communication
DEVELOP A SYSTEM OR CYCLE OF
PLANNING AND REPORTING
The cycle: Establish Mission, Goals, and Outcomes → Assessment
Planning → Implement the Plan and Gather Data → Interpret and
Evaluate the Data → Modify and Improve
Planning (Spring): Assessment Plans submitted for the next AY
Reporting (Fall): Assessment data, results, and use of results
for the previous AY reported
ASSESSMENT PLANNING EXERCISE:
PHD IN ENGLISH
The review of Academic Assessment Plans is a constructive exercise
designed to guide faculty to consider their assessment work thoughtfully.
The 2012-13 PhD in English Academic Assessment Plan is on pp. 8-11 of
your handout.
Using the rubric on p. 2, discuss in your group the degree to which this plan
meets the guidelines.
Does the plan meet or not meet the criteria?
What comments would you provide?
REPORTING ASSESSMENT RESULTS –
STUDENT LEARNING OUTCOMES (P. 5)
Assessment Method:
• List the assignment, exam, project, etc.
• If this is a sample, describe the sampling procedure used
Results:
• Enter the criterion for success. The “criterion for success” is the minimum percentage of
students who pass the assessment that you consider to be acceptable for your program. If
the criterion is less than 70%, provide a rationale.
• State: “X number of students passed the assessment out of a total of Y students, for a
percentage of Z%”.
• State: This meets/does not meet the criterion for success.
• Attach the data you shared with your faculty (student names redacted).
Use of Results:
• State who reviewed the results.
• Refer to the results that were reviewed.
• State actions taken in past tense. For example:
• “Based on our review, we decided to…”
• “We modified our SLO #1 because the data indicated that…”
• “We changed the course content for ABCXXXX effective fall 20XX because the data
revealed that…”
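The "X of Y students, for Z%" arithmetic above can be sketched in code. This is a hypothetical helper (not part of the presentation's materials); the function name, parameters, and the default 70% criterion mirror the guidance in this section but are illustrative assumptions only.

```python
# Hypothetical sketch of the criterion-for-success check described above.
# The 70% default reflects the guideline that criteria below 70% need a rationale.
def meets_criterion(passed: int, total: int, criterion_pct: float = 70.0):
    """Return the pass percentage and whether it meets the criterion."""
    pct = 100.0 * passed / total
    return pct, pct >= criterion_pct

pct, met = meets_criterion(passed=18, total=24)
print(f"18 of 24 students passed the assessment, for {pct:.1f}%; "
      f"this {'meets' if met else 'does not meet'} the criterion for success.")
```

A program would substitute its own counts and its stated criterion; the printed sentence follows the report wording recommended above.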
REPORTING ASSESSMENT RESULTS –
ACADEMIC PROGRAM GOALS
Assessment Method:
• State the measurement method
Results:
• Briefly state your results
• Include or attach the data you collected in summary form
Use of Results:
• State who reviewed the results.
• Refer to the results that were reviewed.
• State actions taken in past tense. For example:
• “Based on our review, we decided to…”
• “Our results led us to modify our goal to…”
• “We developed a new measure for this long term goal based on our
review…”
DEVELOP A QUALITY
ASSURANCE PROCESS
ELEMENTS OF QUALITY ASSURANCE
•Multi-step, institutional review and approval process
•Cross-reference plans with data reported annually
•Templates and rubrics for guiding faculty through the process
•Review and evaluate faculty submissions
•Develop and provide professional development
•Model the process: modify and improve quality assurance
processes based on the data you collect
2012-13 ACADEMIC ASSESSMENT DATA
REPORTING REVIEW RESULTS
In early 2014 all of the 2012-13 graduate and professional
academic assessment data reports were reviewed.
76% were returned for revisions due to failure to follow our
guidelines
The most requested revision was in the Use of Results field.
Specifically, programs did not state who reviewed the results.
Programs were most likely cited for Program Goal Use of
Results statements.
Many programs were cited for not stating results and use of
results in the past tense
PART 3: EXAMPLES
DATA REPORTING REVIEW EXERCISE –
PHD IN ENGLISH
The 2011-12 Data Analysis Report for the PhD in English is on
pp. 6-7 of your handout (Figures 7 and 8).
The 2012-13 Data Analysis Report for this program is on pp. 12-13
of your handout (Figure 10). This report is based on the
2012-13 AAP you have just reviewed. Compare the two data
reports. What differences and similarities do you find?
Using the Data Entry Guides for Faculty in Figure 5 (p. 5 in your
handout), discuss the degree to which the 2012-13 data report
complies with these guidelines. Note: the guidelines were
provided beginning with the 2012-13 report; the 2011-12 report
was completed without these guidelines.
OTHER EXAMPLES OF DATA REPORTS
We will review Figure 11 together,
on pp. 14-16 of your handout
A SUMMARY
1. Define the terms and disseminate them
2. Consider an institutional categorical organizing
framework for SLOs
3. Recency, Relevance, and Rigor
4. Distinguish Outputs from Outcomes
5. Distinguish SLOs from Program Goals
6. Ensure the outcome is measurable
7. Balance direct and indirect assessments
8. Planning Timeline/Cycle
9. Templates and Rubrics
10. Approval and Management Process
11. A system or cycle of planning and reporting
12. Quality Assurance Process
QUESTIONS
Timothy S. Brophy, Ph.D.
Director, Institutional Assessment
235 Tigert Hall
Office of the Provost
Email: [email protected]
Phone: 352-273-4476