
Getting Started in Evaluating Student Learning in Student Affairs/Services

Rebecca A. Sanderson, PhD
Director, Student Affairs Research and Evaluation, Oregon State University

Evaluating Institutional Learning Centeredness Conference
San Diego State University
July 12-14, 2007


Assessment

Definition

". . . a rich conversation about students and student learning informed by data." (adapted from Ted Marchese, AAHE)

". . . the systematic collection, review, and use of information about programs/services undertaken for the purpose of improving student learning and development." (adapted from Palomba & Banta, 1999)


What is Assessment?

A systematic process of gathering information upon which to make decisions

Using information to determine what is working and what is not

An evaluation of effectiveness

Systematic feedback

An integrated and circular process


Assessment

Why?

To improve student learning in essential areas (e.g., diversity/multiculturalism)

To improve programs and services

To ensure that students are learning that which we intend

To focus efforts

To inform decision-making and clarify intentions


Assessment

And then there is this. . . .

Accreditation

Administrative mandates

Accountability

Spellings Commission report

Expectations from professional organizations

Another thing to add to an already busy schedule


Benefits

Keeps the focus on students

Provides systematic information to guide program development

Can transform collaborations between Student Affairs and Academic Affairs

Fosters improvement

Limitations

Outcomes being measured may not reflect the true impact of a program/service

Probably will not prove that your program alone caused the outcome

Will not answer every question you may have

May take longer and need more resources than you expect

Basic Assessment Questions

1. What are we doing and why are we doing it?

2. What do we expect a student to know, to think, or to do as a result of our program?

3. How well are we doing it?

4. How do we know?

5. How do we use the information to improve?

6. Does that work? (Bresciani, 2002)


Getting Started

Questions to ask yourself

Why do assessment?

How will it be used? What are the politics?

How do you ensure broad input and active participation?

How do you engage in reflective conversation about data that informs decision making?

With whom, how, and when do you share?

How do you sustain the effort?


Assessment Planning

Successful assessment efforts find that a written assessment plan is essential

To think through assessment needs and capture agreement about what matters

To give direction for action: to know who, when, where, and what

To provide a means to determine if progress is being made


Structural Support Issues

Assessment is best done inclusively

An Assessment Council?

A Coordinator or Director position?

A common language

Training and education over time

Feedback on assessment efforts

Tie to improvement rather than to accountability to start

Visible and vocal support from the top

Assessment Accountability Structure

Vice Provost for Student Affairs

Student Affairs Assessment Council

Director, Research and Evaluation, OSU Division of Student Affairs

Student Affairs Departments/Units

Programs/Services


OSU Assessment Council

Anyone can join

Everyone agrees:

to learn,

to help others learn, and

to share the work

The work goes on even if a person misses a meeting

We discuss and come to consensus around important issues

We strive for excellence and also for joy

Creating Buy-In

Begin small, but meaningful

Data doesn't need to be collected annually; create a cycle

Show your work to the department

Get input

Include students

Show how data are being used

A successful experience can go a long way

Elements of Assessment Planning

Department or Program Mission

Goals

Intended Student Outcomes

Methodology

Implementation

Results

Decisions/Recommendations


Assessment Planning

Mission

Describes the purpose of the organization and the constituents served

You should be able to state the mission in fewer than 25 words


Assessment Planning

Goals

Broad, general statements of what a program wants its constituents to know or to do. Goals generally describe what the program is trying to accomplish.

A goal is not a "to do" list.

Goals are aligned with university goals and the departmental mission.

Goals provide departmental focus.


Logic Model

MISSION: Describes the purpose of the organization and the constituents served. It clearly relates to the Oregon State University and Division of Student Affairs missions.

GOALS: Broad, general statements of what a department wants its constituents to know or do. Goals generally describe what the program is trying to accomplish. Typically only 4-6 goals for a department.

PROGRAMS: Sets of related activities and outcomes that consume a meaningful portion of the departmental resources (persons, dollars, time, etc.) and that are designed to support the department's goals.

INPUTS: Resources dedicated to the program, e.g., money, staff, time, equipment. Constraints on the program, e.g., laws, regulations, policies.

ACTIVITIES: Activities done to deliver the program, e.g., provide workshops, advise students, distribute brochures, develop handbooks, teach classes, provide training, give tests.

SERVICE OUTCOMES: Products from the activities, e.g., number of workshops, number of people advised, types of brochures produced, % served, % satisfied, amount of money collected.

LEARNING OUTCOMES: Benefits for participants, e.g., gained new knowledge, increased skill, modified behavior, improved their condition, positively altered their status.

Questions to consider with the logic model:

What influences other than your program are at work?

What level of outcome do you have the resources to measure?

What level of influence do you believe your program will have?

How is this linked to what we know about student learning?

Assessment Planning

Learning Outcomes

Detailed and specific statements derived from the goals. They are specifically about the intended end results of your program efforts and typically use active verbs such as: arrange, define, explain, demonstrate, etc.

Levels of learning or mastery (Bloom's Taxonomy)


Writing Learning Outcomes

A learning outcome has three parts:

Target group

Targeted learning

Level of mastery (Bloom's Taxonomy)

Learning Outcome template: (Target group) will be able to (Bloom's Taxonomy word) (targeted learning).

For example: (Resident assistants who complete fall training) will be able to (describe) (the steps for responding to a roommate conflict).

Assessment Planning

Methods

The criteria, process, and tools used to collect evidence and to determine the degree to which the intended outcomes were reached.

Assessment methods include:

the target audience,

the methods and tools for data collection,

criteria or targets that tell you when the outcome has been met, and

how the data will be analyzed.


Considerations

What method(s) will get the data to answer the question you are asking?

What level of reliability and validity are needed?

Reliability: consistency of measurement (a quick check is sketched below)

Validity: measures what it purports to measure

Does it make sense and look like it measures what we want it to measure?

What about timeliness, cost, and motivation?

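The slides define reliability but do not show how to check it. One common check for an internally developed survey is Cronbach's alpha, a measure of internal consistency. The Python sketch below is not from the presentation: the items and scores are hypothetical, and it assumes pandas is available.

```python
# A minimal sketch: estimating reliability (internal consistency) with
# Cronbach's alpha. The survey items and scores here are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: one row per respondent, one numeric column per survey item."""
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Five Likert-type items (scored 1-5) from a hypothetical workshop survey.
responses = pd.DataFrame({
    "q1": [4, 5, 3, 4, 2],
    "q2": [4, 4, 3, 5, 2],
    "q3": [5, 5, 2, 4, 3],
    "q4": [3, 4, 3, 4, 2],
    "q5": [4, 5, 3, 5, 1],
})
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```

Values near 0.8 or above are conventionally read as adequate consistency for a locally developed instrument.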

Assessment Planning

Methods

Types

Survey

Tests

Performance-based Measures

Checklists

Interviews & Focus Groups

Rubrics

Institutional Data


Methods

Survey

Self-reported information: Demographic/descriptive data, attitudes, opinions, values, experiences, behaviors, expectations, goals, needs

Dependent on accurate and honest recall

Can be commercial and/or standardized

Internally developed


What, no multiple choice?! H-m-m-m??

Methods

Tests

Cognitive or thinking information

Can include written and oral presentations of material

Can be commercial and/or standardized

Internally developed


Methods

Performance-based Measures

Direct evidence of learning through performance

e.g., projects, work samples, capstone experiences, direct observation

Must develop criteria for evaluating the performance


Methods

Checklists

Direct evidence of the presence, absence, or frequency of a behavior.

Often used with direct observation; can also be used for content knowledge.


Methods

Interviews

Perceptions of experiences, stories, and opinions; can also be used to assess individual knowledge

Focus Groups


Perceptions of experiences, opinions, feedback on new product/service, etc.

Considerations: content, data and analysis, external credibility, time for analysis, transcription, selection of group members and facilitator(s)

Methods

Rubrics

Used to score subjective measures of performance

Involves prior determination of how performance will be rated (see the sketch below)

Answers the question: What does a satisfactory rating look like?

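The presentation does not show what a rubric looks like in practice. As one illustration, the Python sketch below represents a two-criterion rubric as a plain data structure, fixing the rating levels before any performance is scored; the criteria, level descriptors, and the score_performance helper are all hypothetical.

```python
# A minimal sketch of a rubric as a data structure; criteria, levels, and
# descriptors are hypothetical, invented for illustration only.
RUBRIC = {
    "organization": {
        1: "Ideas are presented in no discernible order",
        2: "Some logical ordering of ideas",
        3: "Ideas follow a clear, logical sequence",
    },
    "evidence": {
        1: "Claims are unsupported",
        2: "Some claims are supported with evidence",
        3: "All claims are supported with relevant evidence",
    },
}

def score_performance(ratings: dict) -> int:
    """Sum per-criterion ratings after validating them against the rubric."""
    for criterion, level in ratings.items():
        if level not in RUBRIC.get(criterion, {}):
            raise ValueError(f"Level {level} is not defined for {criterion!r}")
    return sum(ratings.values())

# One rater scores one student presentation.
print(score_performance({"organization": 3, "evidence": 2}))  # prints 5
```

Writing the levels down first is the "prior determination" the slide calls for: raters agree on what a satisfactory rating looks like before they see any student work.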

Methods

Institutional Data

Demographic information

Enrollment

Retention

Migration

Ethnicity/race

Graduation

Majors

Post-graduation success

Success in subsequent courses, etc.


Assessment Planning

Implementation

Who does what? When? Where? How?

Often a matrix is used for this, for example:

Goal     Outcome   Method              Who             When
Goal 1   1.B       Survey              Janice          Nov. 15, 2007
Goal 1   1.B       Focus group         Rose and Henry  May 12-15, 2008
Goal 1   1.D       Performance rubric  Dean and Joe    March 1, 2008

Assessment Planning

Results

Data Analysis

Dictated mostly by the type of data you are collecting

Frequency distributions and some measure of central tendency; you may want to compare means or look for significant differences where applicable (see the sketch below)

Depiction of Information

Graphs and tables: pictures can be very helpful in explaining data

Reporting

Do report, and you may need to produce more than one report depending on the number of different audiences
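None of this analysis appears in the slides themselves. The Python sketch below illustrates the three steps the slide names: a frequency distribution, a measure of central tendency, and a comparison of means. The ratings and group labels are hypothetical, and pandas and scipy are assumed to be installed.

```python
# A minimal sketch of basic analysis for assessment data; the 1-5
# satisfaction ratings and the two groups here are hypothetical.
import pandas as pd
from scipy import stats

data = pd.DataFrame({
    "group":  ["attended"] * 6 + ["did_not_attend"] * 6,
    "rating": [5, 4, 4, 5, 3, 4, 3, 2, 4, 3, 3, 2],
})

# Frequency distribution of ratings.
print(data["rating"].value_counts().sort_index())

# Central tendency (mean rating) by group.
print(data.groupby("group")["rating"].mean())

# Compare means: independent-samples t-test, where applicable.
attended = data.loc[data["group"] == "attended", "rating"]
others = data.loc[data["group"] == "did_not_attend", "rating"]
t_stat, p_value = stats.ttest_ind(attended, others)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```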

Assessment Planning

Decisions/Recommendations (Closing the Loop)

Now that we know, what are we doing about it?

Celebrate

Initiate changes

Study further

Enlist others to help with making further meaning of the data

Add to or take away from X

Revise assessment methods

When will we look at this again?

Using Assessment Information in Decision-Making and Planning

Documentation is Important

As a record

To use as a guide for future decision making

To talk with the constituencies we serve

To use with staff and others

To show progress to ourselves and our constituencies


Using Assessment Information in Decision-Making and Planning

Discussions with Staff

Share results with all staff and in multiple formats

Make results as transparent and public within the department as possible

Openness can build trust and foster the integrity of the process

Department meetings, planning retreats, unit meetings


Using Assessment Information in Decision-Making and Planning

Discussions with Students and Others

For students to invest time in our assessment efforts, they must see the value

Sharing results and including them in conversations about how to make improvements based on data builds investment


Questions?? Comments??

OSU Student Affairs Research and Evaluation Web Site

http://oregonstate.edu/studentaffairs/assessment/index.html

Sanderson, 2007