Transcript Document

Beth Lesen, Associate Vice President for Student Affairs
California State University, Sacramento
March 7, 2014
ASSESSING OUR EFFORTS IN STUDENT SERVICES
Today’s Outcomes
 Distinguish between program objectives and student
learning outcomes
 Differentiate between direct and indirect assessment
 Identify useful assessment tools
 Practice interpreting assessment results
 Plan for evidence-based program improvement
Why engage in assessment?
 Because we care about our impact on students
 To use our resources and energy wisely
 To create a transparent culture of evidence
 To engage in an ongoing cycle of inquiry
 For re-accreditation
 Because we care about our impact on students
What does good assessment look like?
 Flows from the department’s strategic plan and priorities
 Aligns with the strategic plans of the division and the college
 Articulates measurable program objectives and student learning outcomes
 Systematically gathers, analyzes, and interprets relevant evidence
 Uses resulting data to improve programs and services
 Shares out findings
What will assessment allow us to do?
 Better serve our students
 Identify areas where we can improve
 Showcase our strengths
 Advocate for additional resources
 Meet re-accreditation expectations
 Respond to public calls for increased accountability
Why do we avoid assessment?
 No time
 Intimidating
 No buy-in
 Feels forced
 Under-resourced
Avoid Avoiding Assessment: Lean in
 Schedule time for assessment like we do for meetings
 Include in departmental meeting & retreat agendas
 Establish a timeline with several milestones
 Make assessment more interactive
 Designate an expert & get them ongoing training
 Hire a consultant
 Publish reports on departmental webpages
What should I assess?
 What do you want to know?
 What do you suspect is happening?
 What weighs heavily on your mind?
 What is your charge?
Six Step Assessment Model
1. Mission = Purpose (2-3 sentences)
2. Goals = Aspirations (3-5 top priorities)
3. Objectives = Intentions (measurable program objectives or student learning outcomes)
4. Measures = Methods (how)
5. Results = Evidence
6. Conclusions = Interpretation and Decision (now what?)
Step 1 – Mission Statement
The mission statement should focus on one or more priorities delineated in the
university’s mission, vision, or strategic plan.
Excerpt from University’s Mission Statement:
We are committed to providing an excellent education to all eligible
applicants who aspire to expand their knowledge and prepare
themselves for meaningful lives, careers, and service to their community.
Student Engagement and Success Unit Mission
A unit mission statement identifies:
 Office name
 Primary purpose
 Primary activities
 Target audiences
Student Engagement and Success, as a unit within the Division
of Planning, Enrollment Management and Student Affairs at
the California State University, Sacramento, facilitates student
development and success by guiding and supporting students’
academic, professional and personal pursuits and by promoting
active engagement with the campus and surrounding
community.
Step 2 – Goals
General planning statements. The starting point
for the development of objectives.
Example: Improve support toward student
academic success.
Goals are usually not measurable and need to be further
developed into measurable objectives.
Example: By July 2014, SES staff (outside the
advising center) will evidence increased fluency in
academic requirements for graduation.
Departmental Goals Align with College Goals
Write down one example of a departmental goal that
aligns with the college’s goals.
Step 3 – Objectives and Outcomes
Specific statements that describe desired outcomes
associated with broad goals of the unit.
Objectives and outcomes are measurable.
Typically one of two types:
Program objectives are about program improvements (e.g.
timeliness, efficiency and participant satisfaction)
Student learning outcomes reveal changes in attitudes or
behaviors that students demonstrate after utilizing a service or
program
Program Objectives
A good program objective will indicate:
Target population
Measurable result
Timeline
When formulating a program objective:
Align with a specific goal
Limit to one result per objective
Limit target population to a discrete group
Set a deadline
Examples of Program Objectives
By November 2014, 95% of first time freshmen will have
had at least one appointment with an academic advisor.
At least 90% of all students served by the Financial Aid
Office in the Spring 2014 semester will report being
satisfied or highly satisfied with the service they received.
The number of students involved with student
organizations on campus will increase by at least 10%
from AY 2012-2013 to AY 2013-2014.
Program Objectives
Aligning to University Priorities
University Priority = The University commits itself to increasing
students’ retention and graduation rates and decreasing their time to
degree.
Departmental Goal = Intervene earlier for students who are struggling academically.
Departmental Program Objective = By March 2014, every first
time freshman in the residence halls who enters second semester on
academic probation will have had at least two one-hour
appointments with an advisor.
Student Learning Outcomes
Direct Student Learning Outcomes
and
Indirect Student Learning Outcomes
With the emphasis on learning, most college campuses and
their various accrediting boards are interested in seeing
data related to direct student learning outcomes.
Direct Student Learning Outcomes
After participating in a program or utilizing a service,
students demonstrate:
 Abilities
 Information retention
 Knowledge acquisition
 Attitudinal or behavioral change
Example: At least 85% of student residents participating in a
time management workshop will identify, on a post-test, three
new strategies they plan to implement.
Methods to Assess Direct Learning Outcomes
 Any type of test, including pre- and post-tests, standardized tests, licensure examinations, and workshop quizzes
 Any type of portfolio, including e-portfolios, art portfolios, and multi-media portfolios
 Evaluated performances such as role plays
 Competency observations
 Common assignments
 Narratives with reflection
 Juried art exhibits
 Work/writing samples
Indirect Learning Outcomes
Self-report indicating perceived increase in
understanding
 Perception is not verified
 No measured demonstration of knowledge acquisition
 No observed behavioral/attitudinal change
Example: Ninety-five percent of students and parents who
attend orientation will indicate on the evaluation distributed
at the program’s conclusion that they “agree” or “strongly
agree” that they learned what will be required to earn a
degree at Sacramento State.
Methods to Assess Indirect Learning Outcomes
 Satisfaction surveys
 Program evaluation surveys
 Questionnaires
 Inventories
 Facebook or other social-networking site responses
 Informal peer-to-peer conversations
(e.g. with RAs, orientation leaders)
 Focus groups and interviews
Differentiating Direct and
Indirect Learning Outcomes
Direct or Indirect Learning?
1) T / F: This orientation session has helped me understand the foreign language requirement at Sacramento State.
2) Which of the following examples fulfill the foreign language requirement necessary to graduate from Sacramento State?
a) Demonstrated fluency in a language other than English
b) Passed the AP foreign language exam with a score of 3 or higher
c) Successfully completed, with C- or better, 3 years of high school foreign language
d) All of the above
Write an example of a program objective or
student learning outcome
Write one example of a program objective or student
learning outcome that supports the goal you wrote
earlier.
Is it measurable? Your objective or outcome should
indicate:
 By when?
 Who?
 Will show what?
Step 4 - Measures
“The process of quantifying observations
[or descriptions] about a quality or attribute
of a thing or person”
Thorndike, R., & Hagen, E.
Measurement and evaluation in psychology and education (4th ed.). New York: Wiley.
Step 4 – Measures (continued)
The Components of Measurable Outcomes
 Audience: At whom is the program aimed?
 Behavior: What do you expect the audience to know/be
able to do?
 Conditions: Under what conditions or circumstances
will the learning occur?
 Degree: How much will be accomplished, how well will
the behavior need to be performed, and at what level?
Examples of Measurable Outcomes
 All participants who complete the post-test on academic
requirements and campus resources at the closing session of
orientation will score 85% or better.
 Orientation leaders observed in a twenty-minute structured role
play exercise will earn ‘4’ or better in each area of the advising
competencies rubric.
 At least 95% of students and parents will indicate on the
evaluation distributed at the closing session that they were
‘satisfied’ or ‘very satisfied’ with all aspects of their orientation.
Some Tools for Useful Data Collection
Direct Learning Outcomes
 Tests
 Portfolios
 Performances
 Competency observations
 Common assignments
 Narratives
 Juried art exhibits
 Work/writing samples
Indirect Learning Outcomes
 Surveys
 Questionnaires
 Inventories
 Facebook responses
 Conversations
 Interviews
Some Tools for Useful Data Collection
Program Objectives
 Frequency data (see the tally sketch after this list)
 Satisfaction surveys
 Program evaluation surveys
 Social networking stats/analytics
 Decreased wait times
 Increased registration/participation
 Fiscal savings
 Productivity measures
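Several of these tools reduce to simple counting. Below is a minimal sketch, in Python, of how frequency data and satisfaction percentages might be tallied; the records, field names, and response labels are hypothetical, not drawn from any campus system.

from collections import Counter

# Hypothetical records; in practice they would come from a sign-in sheet,
# a program evaluation survey, or an exported spreadsheet.
responses = [
    {"service": "advising", "satisfaction": "very satisfied"},
    {"service": "advising", "satisfaction": "satisfied"},
    {"service": "financial aid", "satisfaction": "neutral"},
    {"service": "advising", "satisfaction": "satisfied"},
]

# Frequency data: how many contacts each service logged.
usage = Counter(r["service"] for r in responses)
print(usage)  # Counter({'advising': 3, 'financial aid': 1})

# Satisfaction: share of respondents who were satisfied or very satisfied.
satisfied = sum(r["satisfaction"] in ("satisfied", "very satisfied") for r in responses)
print(f"{100 * satisfied / len(responses):.0f}% satisfied or very satisfied")  # 75%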
Choose a Measure
Specify an example of a measure or tool that could
be used to collect evidence pertaining to the
objective or outcome you wrote earlier.
Example
Departmental Goal
Intervene earlier for students who are struggling academically.
Departmental Program Objective
By January 15, 2014, all Residence Life Coordinators (RLCs) will be
appropriately trained as fully-functioning first year advisors.
Measure
All RLCs will complete the First Year Advising Training program
and will be certified as prepared to advise by the Director of Academic
Advising.
Example (cont’d)
Departmental Goal
Intervene earlier for students who are struggling academically.
Departmental Program Objective
By March 2014, every first time freshman (FTF) in the residence halls
who enters second semester on academic probation will have had at least
two one-hour appointments with an RLC advisor.
Measure
Residence Life Coordinators will report out how many of their
assigned FTF on academic probation completed the two one-hour
advising appointments with them by March 2014.
Example (cont’d)
Departmental (direct) student outcome
By June 2014, 80% of the FTF in the residence halls who entered second
semester on academic probation will be in good academic standing.
Measure
A grade report run out of CMS after second semester grades are reported
will indicate that at least 80% of the identified FTF on second semester
academic probation achieved good standing by the end of the academic
year.
Example (cont’d)
Departmental (direct) student learning outcome
At least 90% of the FTF in the residence halls who entered second semester on
academic probation will identify at least two adaptive academic strategies
they learned from their RLC advising sessions.
Measure
Students will answer 5 questions after the advising sessions. One of the
questions will ask them to identify two useful academic strategies they
learned from their RLC advisor.
Example (cont’d)
Departmental (indirect) program objective
At least 90% of the FTF in the residence halls who entered second semester on
academic probation will report that their RLC advisor was knowledgeable and
helpful.
Measure
On the 5-question survey, students who see an RLC advisor will either
“agree” or “strongly agree” with items asking how knowledgeable and how
helpful their advisor was.
Example (cont’d)
Departmental (indirect) student learning outcome
At least 90% of the FTF in the residence halls who entered second semester on
academic probation will report that they learned skills and strategies they will
use to perform better academically going forward.
Measure
On the 5-question survey, students who see an RLC advisor will either
“agree” or “strongly agree” with an item that asks whether they learned skills
and strategies they will use to perform better academically going forward.
Step 5 – Results
Results should highlight all significant findings and indicate
the extent to which the program/service reached its intended
outcomes.
Example: Freshman Orientation pre-/post-tests showed
that 87% of participants evidenced understanding of college
resources and graduation requirements.
However, only 75% of participants evidenced
understanding of general education requirements on the 3
related pre-/post-test items, falling short of the goal of 85%.
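One way to surface a weak area like the general education result is to break post-test scores out by topic. The following is a minimal Python sketch of that kind of item-level breakdown; the item tags and percent-correct figures are invented for illustration, not the actual orientation data.

# Hypothetical post-test items tagged by topic, with the share answering correctly.
items = {
    "q1": ("campus resources", 0.90),
    "q2": ("graduation requirements", 0.88),
    "q3": ("general education", 0.74),
    "q4": ("general education", 0.72),
    "q5": ("general education", 0.79),
}

# Group items by topic and average the percent-correct within each group.
by_topic = {}
for topic, correct in items.values():
    by_topic.setdefault(topic, []).append(correct)

for topic, scores in sorted(by_topic.items()):
    print(f"{topic}: {100 * sum(scores) / len(scores):.0f}% correct")
# campus resources: 90% correct
# general education: 75% correct   (below an 85% goal even when overall results look fine)
# graduation requirements: 88% correct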
Triangulation
Denzin (1978) identified four basic types of
triangulation
 Data triangulation
 Investigator triangulation
 Theory triangulation
 Methodological triangulation: involves using more
than one method to gather data, such as interviews,
observations, questionnaires, etc.
Results: From Good to Great
 Integrate information from multiple sources.
 Investigate from multiple perspectives.
 Report out all results – not just positive.
 Go deep in analysis – cut up the data.
 If something’s missing, go find it.
 Try again or try differently.
Presenting Results
Methods to Present Results
 Descriptive text
 Tables
 Graphs
 Charts
 Videos
 Portfolios
Results example
 200 FTF in the residence halls entered second semester on academic probation
 The RLCs all completed training and were certified ready by the stated deadline
 RLCs confirmed that 75% of the identified students (150) completed the two advising appointments
 10% (N=20) completed one appointment and 15% (N=30) completed none
 75% of the 200 students (150) ended in good standing
 Of the students who completed appointments:
 80% said the advisors were knowledgeable
 85% said the advisors were helpful
 92% successfully identified 2 strategies
 95% said they had learned at least two strategies
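For anyone who wants to check the arithmetic, the small Python sketch below recomputes the reported percentages from the headcounts in this example; the variable names and the pct helper are illustrative only.

def pct(part, whole):
    """Share of `whole` represented by `part`, as a whole-number percentage."""
    return round(100 * part / whole)

total_ftf_on_probation = 200   # FTF in the residence halls entering second semester on probation
completed_both = 150           # completed both one-hour advising appointments
completed_one = 20             # completed only one appointment
completed_none = 30            # completed neither appointment
good_standing = 150            # in good standing at the end of the year

print(pct(completed_both, total_ftf_on_probation))   # 75 -> the 75% completion figure
print(pct(completed_one, total_ftf_on_probation))    # 10
print(pct(completed_none, total_ftf_on_probation))   # 15
print(pct(good_standing, total_ftf_on_probation))    # 75 -> short of the 80% outcome stated earlier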
Step 6 – Conclusions
The conclusions should explain how the findings from the data
will be used to improve the program and/or increase student
learning.
Given the results, what would you do differently?
Why?
What’s the new goal?
And the process begins again.
Step 6 – Conclusions (continued)
Use of Findings
When Objectives or Outcomes Were Not Achieved
 Modify program or service
 Modify policies or procedures
 Improve collaboration
 Improve communication
 Institute or improve training
 Modify program objective or learning outcome
 Modify measurement tools
 Modify methodology
Step 6 – Conclusions (continued)
Use of Findings
When Objectives or Outcomes Were Achieved
 Develop new objectives or outcomes
 Conduct a longitudinal study with current objectives or
outcomes
 Raise the criteria for achievement
 Develop more stringent measures
Step 6 – Conclusions (continued)
Did the Process Provide Information To…
 Improve programs or services that are aligned
with the university’s priorities?
 Understand and eventually increase student
learning?
 Make better planning or budgeting decisions?
Concluding Comments
 Assessment is an ongoing and evolving process.
 Every step of the process is the process.
 Good assessment never ends; it’s cyclical in nature.
 Even when the process goes well, we should continue
to examine it and contemplate next steps.
Questions?