
University of Central Florida
Planning for Student Success:
The Role of Institutional
Effectiveness
Dr. Ron Atwell
Ms. Pam Rea
Dr. Mark Allen Poisel
© 2006 Krist, Atwell, & Poisel
International Assessment and
Retention Conference
June 8, 2007
Agenda
 Introduction and Premises
 Development and Purpose of Bi-Level Processes
 Overview of Institutional Effectiveness Process
 Identifying and developing:
 Mission
 Operational Objectives
 Student Learning Outcomes
 Direct and Indirect Measures
 Conclusion and Questions
University of Central Florida
Fast Facts
 Established in 1963 (first classes in 1968),
Metropolitan Research University
 Grown from 1,948 to 46,848 students in 38 years
 39,661 undergrads and 7,187 grads
 12 colleges and 12 regional campus sites
 89% of lower division and 67% of upper division students
are full-time
 Fall 2006 FTICs Enrolled: 4,131; Transfers: 4,006
 Summer 2006 FTICs Enrolled Fall 2006: 2,545
 Average SAT Total: 1201; Average H.S. GPA: 3.7
 Fall 2006 FT FTIC Retention Rate: 83%
Introduction and Premises
 Nichols’ assessment model (1995)
 Assessment:
 formative: focus on continuous quality improvement
 summative: one time, evaluative
 addresses academic and student support areas
 Tinto (1993); Pascarella & Terenzini (2005)
 success = total college experience
 Upcraft and Schuh (1995)
 Student:
 use and demand
 needs
 satisfaction
 campus environment and cultures
 outcomes
 Institution:
 benchmarking
 nationally acceptable standards
CAS Standards
Council for the Advancement of Standards
In Higher Education
“… purpose of developing and promulgating standards
of professional practice to guide higher education
practitioners and their institutions, especially in
regard to work with college students.” (Terrence
Strayhorn, 2006)
Identifies 16 domains of student learning and
development. FALDOs (Frameworks for Assessing
Learning and Development Outcomes) focus on
indirect evidence of student learning.
URL: http://www.cas.edu/
What is Assessment?
 Minute paper:
 On the top of a piece of paper, write the
components of good assessment.
 On the bottom of the page, write what
assessment should not include.
Program Assessment Is
 formative: designed to collect information that can be
used for improvement
 ongoing
OR
 summative: takes a picture of where you are today
 contributes to resource allocation
 infused into regular operations
 clear and understandable
 comprehensive
 measures your primary functions or activities
 cost effective
 time
 money
Program Assessment Is Not
 used to evaluate individual staff or faculty
 used to evaluate individual students
 a solution
 It is a fact-finding mission.
 a replacement for good management and
leadership
 an analysis of operations or processes
 could indicate a need for this kind of analysis
 something done by one person
Issues in Effective Assessment
 High level administrative support
 Mission driven
 Resource allocation
 Assessment support: SDES, OEAS, IR
 Culture of assessment:
 motivation
 use of assessment results
 experience
UCF Annual Assessment Process
 All academic, administrative and student affairs areas
develop outcomes: operational/process and student
learning outcomes
 two measures for each outcome
 each area has an assessment coordinator
 Divisional Review Committees review Results
(evidence collected) and Use of Results from
previous cycle and Plans (outcomes and measures)
for current cycle
 Divisional Review Committee Chairs report
assessment results to University Assessment
Committee
UCF Annual Assessment Process
(Reporting hierarchy, bottom to top: Staff and Faculty →
Assessment Coordinators → Divisional Review Committee →
University Assessment Committee → Deans & Vice Presidents →
President)
Mission Statement
 Brief (75 words or less) and comprehensive
 It should make sense to someone who knows
little or nothing about your unit.
 It should rarely need revision and be able to
endure changes in leadership.
 It should lead to the development of goals,
outcomes or objectives and performance
measures for those outcomes.
Mission Statement
 Who are we?
 Name of the office, department, unit
 What do we do?
 Your unit’s primary purpose
 How do we accomplish the purpose?
 Your unit’s primary functions
 For whom do we do it?
 The stakeholders or customers of your unit
Mission Structure
“The mission of (your office name) is to (your
primary purpose) by providing (your primary
functions or activities) to (your stakeholders).”
(You may add additional clarifying statements)
*NOTE: the order of the pieces of the mission statement
may vary from the above structure
Mission Statement Office Example
The mission of the UCF Office of Residence Life is to
provide quality housing facilities and related services
that are reasonably priced, safe, comfortable, well-maintained
and staffed by friendly, caring, and efficient
people to undergraduate students. The department
develops and promotes programs and staff interactions
that are conducive to student learning, support the
University’s academic mission, and encourage individual
responsibility within a community setting.
(Slide callouts label the pieces: Name — UCF Office of
Residence Life; Primary Purpose — provide quality housing
facilities and related services; Stakeholders — undergraduate
students; Primary Functions — programs and staff interactions)
Defining Operational Objectives and
Student Learning Outcomes
 Objective or outcome
 A specific, measurable statement that describes desired
performance
 Operational objective: a type of objective that addresses
operational or procedural tasks, such as satisfaction
 Student learning outcome: a type of outcome that
describes what students should know or be able to do
as a result of program(s) or service(s)
 More precise, specific, and measurable than a goal
 Can be more than one outcome related to each goal
 An operational objective or student learning outcome can
support more than one goal
Writing Objectives or Outcomes:
Think SMART
Specific
 clear and definite terms describing the abilities, knowledge,
values, attitudes, and performance
Measurable
 it is feasible to get the data; data are accurate and reliable; it
can be assessed in more than one way
Aggressive and Attainable
 the outcome has the potential to improve the program or unit
Results-oriented
 describe the standards expected of students or of the
aspect of the functional area being assessed
Time-bound
 describe a specified time period for accomplishing the
outcome
From: Drucker, 1954
Developing
Operational Objectives
Registrar
Objective: Transcripts will be processed more
efficiently in 2006-2007 compared to 2005-2006.
Developing
Operational Objectives
Student Disability Services
Objective: Students with disabilities approved
for testing accommodations and faculty who
sent tests to SDS for proctoring will indicate 85%
satisfaction with the testing accommodations
procedures.
Program/Operational Objectives:
Rate the Examples
 Orientation Services will increase efficiency of online registration for transfer students.
 The English B.A. program will hire qualified faculty to
develop the American literature track.
 Financial Aid processing function will reduce the
time required to process refunds to students.
 Students will be satisfied with the response of
advisors to e-mail.
 The Student Union will provide high quality services.
 To increase the number of workshops we provide.
Developing
Student Learning Outcomes
Student Activities
Outcome: Students will demonstrate
competencies in effective leadership.
Developing
Student Learning Outcomes
Student Disability Services
Outcome: Students will demonstrate an
increased understanding of SDS test
accommodation procedures for timely test
request submissions.
Student Learning Outcomes:
Rate the Examples
 Students will understand how to get around
campus.
 Student Scholars will earn a rating of at least
satisfactory on their tutoring interaction skills. A
rubric will be used to rate their responses to
hypothetical situations.
 Students will successfully navigate the on-line
registration process.
 After completing SLS 1520, students will show an
increase in their ability to use technological
resources to conduct research.
Objective or Outcome
Assessment Measures
 direct measures: direct measurement or
observation of something
 For student learning outcomes, it is the direct
examination or observation of student knowledge,
skills, attitudes or behaviors to provide evidence of
learning outcomes.
 indirect measures: perception of efficiencies (e.g.,
timeliness); perceived extent or value of learning
experiences
MATURE: Measuring
Objectives and Outcomes
Matches
 directly related to the outcome it is trying to measure
Appropriate methods
 uses appropriate direct and indirect measures
Targets
 indicates desired level of performance
Useful
 measures help identify what to improve
Reliable
 based on tested, known methods
Effective and Efficient
 characterize the outcome concisely
Assessment Measures for
Operational Outcomes
direct measures
 staff time
 cost
 materials
 equipment
 other resources
 cost per unit output
 reliability
 accuracy
 courtesy
 competence
 reduction in errors
 audit, external evaluator
indirect measures
 written surveys and
questionnaires:
 stakeholder perception
 students
 administration and staff
 faculty
 interviews
 focus groups
Assessment Measures for
Student Learning Outcomes
direct measures
 pre-, post-test
 locally developed exams
 embedded questions
 external judge
 oral exams
 portfolios (with rubrics)
 behavioral observations
 simulations
 project evaluations
 performance appraisals
 minute papers
indirect measures
 written surveys and
questionnaires:
 student perception
 alumni perception
 employer perception of
program
 exit and other interviews
 focus groups
 student records
Linking Operational Objectives
and Measures
Registrar
Objective: Transcripts will be processed more
efficiently in 2006-2007 compared to 2005-2006.
Measure 1: There will be a decrease in the number of
days for processing transcripts in 2006-2007 from 5.8
in 2005-2006.
Measure 2: At least 80% of college contacts
responding to the annual Registrar Survey will rate
transcript processing “good” or “very good.” (2005-2006 survey results: 65%).
Linking Operational Objectives
and Measures
Student Disability Services
Objective: Students with disabilities approved for
testing accommodations and faculty who sent tests to
SDS for proctoring will indicate 85% satisfaction with
the testing accommodations procedures.
Measure 1: During Spring 2007 Semester, 85% of
enrolled students with testing accommodations will
'agree' or 'strongly agree' that they are satisfied with
the SDS testing accommodation procedures.
Measure 2: During Spring 2007 Semester, 85% of
faculty who send tests to be proctored in SDS for
students with disabilities will indicate that they 'agree'
or 'strongly agree' with the SDS testing
accommodation procedures.
Measures for Operational Objectives:
Rate the Examples
 80% of students responding to the survey in the Fall will
say they are satisfied.
 Records kept through the Fall semester of time of
request and time of response will show that all requests
are responded to within 48 hours.
 90% or more of student customers answering the survey
in the Spring and Summer terms will agree or strongly
agree that they are satisfied with the services of the
Cashier’s office.
 Students who participate in athletics in 2006-2007 will be
retained at a higher rate than those who do not.
 Advisors in the Student Success Center will respond in a
timely manner.
Linking Student Learning
Outcomes and Measures
Student Activities
Outcome: Students will develop competencies for
effective leadership.
Measure 1: Students who participate in leadership
development activities in Spring 2007 will score at
least satisfactory on the Leadership Observation rubric
completed by coordinators of targeted student activities
who have been trained in the use of the rubric.
Measure 2: In 2006-2007 students who participate in
leadership programs will improve at least 15% from
pre-test to post-test on the test of leadership
competencies.
Linking Learning Outcomes
and Measures
Student Disability Services
Outcome: Students will demonstrate an increased
understanding of SDS test accommodation
procedures for timely test request submissions.
Measure 1: During the Fall 2007 and Spring 2008
semesters 85% of the students will submit their test
requests at least four business days prior to their test
dates.
Measure 2: Late test request data from Fall 2007
and Spring 2008 will indicate a 5% decrease from
the late test request data from Fall 2006 and Spring
2007 semesters.
Measures for Student Learning
Outcomes: Rate the Examples
 An increased number of students will participate in the events
planned for 05-06 compared with 04-05.
 Students who complete the 05-06 Advising workshop will
score at least 80% on the quiz about majors.
 At least 85% of students surveyed will agree or strongly agree
that membership in a Greek organization helped them adapt
to college successfully.
 Students who participate in academic organizations will be
retained at a higher rate than those who do not.
 Following training, RAs will demonstrate effective counseling
skills in mock student interviews. They will be scored using a
rubric.
Closing the Loop
(Cycle diagram: Operational or Learning Outcomes [SMART] →
Determine evidence needed → 2+ Direct Measures [MATURE] →
Collect data → Report Results → CHANGE: procedures, resources,
outcomes, measures → What is next? Who, what, when? → back to
Outcomes)
Organizations to Assure
Quality of Process
Institutional Effectiveness Committees
 University Assessment Committee
 Provides overall guidance and leadership for the university
assessment effort
 SACS compliance certification response
 Divisional Review Committees
 Conduct reviews of previous cycle results
 Conduct reviews of current and future cycle plans
 report to UAC
Organizations to Support
Quality of Process
Support offices
 Operational Excellence and Assessment Support
 assessment process training: DRC, coordinators
 survey, data analysis, & interpretation support
 website support, templates
 Faculty Center for Teaching and Learning
 assessment training with OEAS
 department and team sessions
 Institutional Research
 provide data
 SDES Assessment and Planning Office
 guidance on assessment processes
 survey, data collection and analysis support (some with
OEAS)
QUESTIONS?
Continue the Conversation
Dr. Ron Atwell, Director
Assessment and Planning
Student Development and Enrollment Services
[email protected]
Ms. Pam Rea, Assistant Director
Student Disability Services
Student Development and Enrollment Services
[email protected]
Dr. Mark Allen Poisel, Associate Vice President
Academic Development and Retention
Student Development and Enrollment Services
[email protected]