"Assessing the Effectiveness" Powerpoint

Download Report

Transcript "Assessing the Effectiveness" Powerpoint

Assessing The Effectiveness Of Your Academic Advising Program

Tom Grites, Assistant Provost, The Richard Stockton College ([email protected])

Audience Poll

What are your expected learning outcomes for this webinar?

A. Why do we assess?

B. What do we assess?

C. Where do we begin?

D. What tools do I use?

E. Who defines our success?

F. What I need to do tomorrow.

Overview

• Terminology and Rationale
• Assessment as a Process
• Focus on the Contexts
• Non-Assessment
• Summary

Terminology

• Assessment
• Evaluation
• Measurement (the tools)

Assessment (re: academic advising)

“Assessment is the process through which we gather evidence about the claims we are making with regard to student learning and the process/delivery of academic advising in order to inform and support improvement.” (Campbell, 2008)

Uses/Contexts

• Assessment – tends to be more related to programmatic issues and outcomes

• Evaluation – tends to be more related to people: (advisor) skills, performance, and outcomes

• It’s OK to use evaluation as part of the assessment process

Intentions

(related to both)

• Formative – more associated with assessment; includes a wider range of efforts; requires more analysis; provides a broader perspective; focus on improvement

• Summative – more associated with evaluation; more limited effort; focus on “Does it work?” or “How well was the job performed?”

The Rationale

“…a lack of assessment data can sometimes lead to policies and practices based on intuition, prejudice, preconceived notions, or personal proclivities – none of them desirable bases for making decisions” (Upcraft and Schuh, 2002, p. 20)

More Rationale

“In God we trust; all others bring data.”

“An ounce of data is worth a pound of opinion.” (Magoon, c. 1975)

Other Reasons

• Accountability
• Effectiveness
• Accreditation
• Trustees/Regents
• Legislators
• Program Improvement (to monitor and improve student success) – the most important reason

The Assessment Process: A Cycle

• Resources:
• Assessment of Academic Advising Package (3 CDs available from NACADA via www.nacada.ksu.edu)
• Assessment of Academic Advising Institute (Feb 12-14, 2014, Albuquerque, NM)

Getting Started: Identify Stakeholders

• Complete set of advising constituents (students, staff and faculty advisors)
• Broad range of key offices (Registrar, Enrollment Management, similar advising units, certain campus referral resources, IR office)
• Critics, Antagonists, and Naysayers
• FYIs – Faculty Senate, Deans Council, Retention Committee, others as appropriate

The Advising Hub

What Do We Want to Know or Demonstrate as a Result of Academic Advising?

• Focus on student learning
• Connect learning to mission, vision, values, goals in your advising program
– How will your program contribute to student learning?
– Who, what, where, when, how will learning take place?
• Define measures of student learning
– Gather evidence, set levels of expected performance

The Assessment Process/Cycle

• Alignment with institutional and unit missions
• Specify goals and/or objectives
• Identify the expected outcomes (student learning and/or programmatic)
• Gather evidence (the measurements)
• Share findings, interpretations, and recommendations
• Begin implementation and re-start the cycle

Mission/Purpose

• A working model… • Academic advising is integral to fulfilling the teaching and learning mission of higher education. Through academic advising, students learn to become members of their higher education community, to think critically about their roles and responsibilities as students, and to prepare to be educated citizens of a democratic society and a global community. Academic advising engages students beyond their own world views, while acknowledging their individual characteristics, values, and motivations as they enter, move through, and exit the institution.

• (Preamble, Concept of Academic Advising, NACADA, 2006)

Goals/Objectives

(how we intend to achieve our mission)

These need to emanate from and reflect the nature of the unit to be assessed (total institution, Advising Center and its clientele, College Dean’s Office, etc.)

Examples:
• To assist students to become independent and lifelong learners
• To assist students in understanding the relevance of the total curriculum
• To assist students in making good decisions based on their own evidence (e.g., selecting a major)

Identify Outcomes

• Student Learning Outcomes – examples

• All students will select an appropriate major by the end of their third semester.

• All students will become engaged in at least one co-curricular activity each semester.

• All students will be able to identify and will select courses that enhance their human capital.

• At least 30% of the students will choose to participate in a service learning course.

• All (CC) students will be able to distinguish among the A.A., A.S., and A.A.S. degree programs.

A Task for You… re: Course Selection

• How many courses are in your Catalog? (A)
• How many courses are required to earn a degree from your institution? (B)
• What percentage of what your institution offers do students actually take in order to earn a degree? (B/A)
• Now, for each course a student takes, how many are eliminated? (See the worked example below.)
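A quick worked illustration, using invented numbers (your own Catalog and degree counts will differ):

    A = 2,000 courses in the Catalog
    B = 40 courses required for a degree
    B/A = 40 / 2,000 = 2% of the institution’s offerings actually taken
    Eliminated per course taken ≈ (A - B) / B = 1,960 / 40 ≈ 49 other courses

On these hypothetical figures, a student uses only 2% of the Catalog, and each course selected forecloses roughly 49 alternatives – one reason course selection is worth assessing.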

Outcomes (continued)

• Programmatic/Process Outcomes – examples

• As a result of our advising services, the retention/persistence rate of first-year students will increase by 10% in the next 3 years.

• As a result of our intervention strategies, the percentage of students who are removed from academic probation will increase by 10% in the next academic year.

• After two advising sessions, all students will come to their future sessions with a degree audit already run and with a plan for meeting outstanding requirements.

Everybody’s Favorite

All students will be able to understand, appreciate, and articulate the value of general education.

Gather Evidence

Mapping the Experience (Maki, 2004)*

• Not all outcomes will necessarily occur as a direct result of what we do as advisors, so we need to know what other learning opportunities exist in order for the students to meet our stated goals/objectives.

• WHAT learning is to occur?
• WHERE might it be learned?
• By WHEN should it be learned?

*This process can also inform the kinds of evidence that need to be gathered for appropriate assessment.

The Advising Hub

Types of Measurement and Data

Qualitative – open-ended survey questions; focus groups; in-depth responses, but small N

Quantitative – descriptive, structured, numbers and statistics from surveys, demographics, etc; limited content responses, but large N

Direct – observations; recorded data; pre-post information

Indirect – perceptions, inferences, even “inclinations”

Use Multiple Measures!!!

Gather (Multiple) Evidence

• Satisfaction Surveys (OK, but not enough)

• Institutional Data (changes of major, drop/add transactions, grades in gateway courses, retention and graduation rates, use of services provided elsewhere, advisor:advisee ratios, NSSE, etc)

• Office Data (number of appointments vs. walk-ins, nature of sessions, results of sessions, transcript analyses, other advisor tasks/activities; “What did you learn?”)

• Focus groups (of clients, of faculty advisors, others) – a qualitative measure

• The Advising Syllabus* can inform what evidence should be collected

* http://www.nacada.ksu.edu/Clearinghouse/AdvisingIssues/syllabus101.htm
http://intraweb.stockton.edu/eyos/page.cfm?siteID=123&pageID=42#syllabus

Share the Results

Tips…

• Be sure that the stakeholders you identified earlier are informed throughout the process, in order to enable their support in the decision making for implementation of your recommendations.

• Academics have a preferred method of review, so it makes sense to conform to their expectations.

Sharing the Results

(Format and Content)

These elements are often best provided in a standard research report or journal format…

• Purpose of the assessment project
• Method of data collection
• Results found
• Interpretation of the results
• Recommendations, with anticipated cost and timetable for implementation
• Executive Summary or Abstract

How Results Will Inform Decision-Making

• Revise pedagogy or curriculum or policy/procedure
• Develop/revise advisor training programs
• Design more effective programming – advising, orientation, mentoring, etc.
• Increase out-of-class learning opportunities
• Shape institutional decision making – planning, resource allocation

Sample Implementation Recommendations

• Redesign the advising effort in the Orientation Program
• Develop a peer advising/mentoring program
• Streamline office procedures
• Initiate proposals for policy changes
• Improve communication with other service offices and personnel
• Request/Reallocate resources (human, fiscal, and/or physical)

You Did It!!

• This will complete the assessment cycle, which provides the evidence for change and improvement.

• Completion of the cycle may also provide new goals and objectives, new assessment strategies and tools, and other aspects that will need to be included in beginning the next cycle. (See Darling, 2005 handout)

You’ve Earned a Break

Please take a few minutes to submit any questions you may have at this point via the chat function.

Back to the Original Contexts

People…

Academic advising, as a teaching and learning process, requires a pedagogy that incorporates the preparation, facilitation, documentation, and assessment of advising interactions. Although the specific methods, strategies, and techniques may vary, the relationship between advisors and students is fundamental and is characterized by mutual respect, trust, and ethical behavior. (Concept of Academic Advising, NACADA, 2006)

NACADA Core Values

Academic Advisors are responsible
• to the individuals they advise
• for involving others, when appropriate, in the advising process
• to their institutions
• to higher education in general
• to their educational community
• for their professional practices and for themselves personally

Assessment (Evaluation) of Advisors

• SELECTION
• TRAINING
• EVALUATION
• RECOGNITION/REWARD

Selection of Academic Advisors

• Use the best
• Add from other resources/units
• Target specific populations
• Cross disciplinary lines
• Develop mentors
• Use other skills/expertise

Potential Pitfalls

Making a distinction:
• Faculty Advising (Programmatic; Assessment)
• Faculty Advisors (Personal; Evaluation)

Inappropriate Comparisons:
• Professional Academic Advisors
• Peer Advisors

No Improvement Plan

Training

Faculty vs. Professional Staff Advisors

Faculty Advisors:
• Too often all are expected or required to advise, but also teach, publish, seek grants, etc – no selection
• Training ranges from near nothing to perhaps a day or 2, but usually only a few hours
• Evaluation is not systematic
• Recognition/Reward is very limited in the tenure and promotion process; mostly intrinsic; can also be a reverse structure (better = more)

Professional Staff Advisors:
• They are hired via a search process and have specific job descriptions – they are selected
• Their training is systematic, intentional, and ongoing; staff development is expected
• They are evaluated through annual performance reviews
• They are rewarded with salary and benefits

ASSESSMENT (Evaluation)

• 37% OF ALL INSTITUTIONS HAD NO PERFORMANCE EVALUATION MEASURES FOR FACULTY IN THEIR ACADEMIC ADVISING ROLE
• 44% in 2-yr public institutions
• 25% in 4-yr public institutions
• 39% in 4-yr private institutions
(Habley, 2004)

PARAMETERS (faculty advisors)

• Faculty Contract
• List of Responsibilities
• Availability of Resources
• Assignment of Advisees
• Recognition/Reward

Tools for Assessment (and/or Evaluation) of Advisors

• Self-evaluation
• Student surveys (locally designed)
• Survey of Academic Advising (ACT)
• Academic Advising Inventory (NACADA)
• Student Satisfaction Inventory (Noel-Levitz)
• NACADA Clearinghouse

Back to the Original Contexts

Program…

“…a lack of assessment data can sometimes lead to policies and practices based on intuition, prejudice, preconceived notions, or personal proclivities – none of them desirable bases for making decisions” (Upcraft and Schuh, 2002, p. 20)

Other Tools and Strategies

• Satisfaction Surveys
• Institutional Data
• Office Data
• Focus groups
• The Advising Syllabus
• External Reviews
• CAS Standards
• Others…

CAS Assessment Worksheet

An Economic Model

• Though not an outcomes-based model per se, this approach to assessment is a functional analysis based on the premise that every task an advisor performs and every operation that an advising unit conducts has some monetary value related to it.

• The analysis results in a comparison of the fiscal expenditures required to perform the tasks to the cost benefits that result.

• The model operates from the perspective of a threat to the existence of an advising unit, function, or personnel. A quick example…

Determining Your Worth

• Identify every function the unit performs
• Identify all possible alternatives for each function, if the unit were dissolved
• Determine the cost of those functions that cannot be replaced and who would perform them; estimates will sometimes be required
• Determine the cost of those functions that could be eliminated (In Markee and Joslin, 2011)

(A hypothetical illustration follows.)
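A minimal sketch of the model for one function, with all figures invented for the sake of the arithmetic:

    Function: degree-audit reviews, 1,200 per year
    Current cost: 0.5 FTE advisor at a $50,000 salary = $25,000
    Alternative if the unit dissolved: per-audit stipends at $30 each = $36,000
    Case for the unit on this one function: $36,000 - $25,000 = $11,000 per year

Summing such comparisons across every function the unit performs yields its demonstrable monetary worth.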

Where Are the Data?

Bill Gates – “colleges today know more about how many kids attend basketball games and which alumni give money than how many students showed up for economics class during the week…” (in a review of Academically Adrift)

Where Are the Data?

Jeff Selingo – “Think about it. Before we buy a car, we can find various measures on everything from gas mileage to results of safety tests. We can turn to objective sources to check comparisons of similar vehicles and see which cars hold their value over time. But when it comes to potentially one of the most expensive purchases in a lifetime, the attitude from colleges has always been that we should just trust them on the quality of their product.” (p. 25)

What Are We Not Assessing…And Should We Be?

• Student expectations, intentions
• Whether different types of student success (removed from probation, successful choice of major, overcoming a skills deficiency or harmful social habit, etc.) can actually be attributed to advising strategies
• Retention and graduation rates of transfer students

Expectations vs. Experience

                                  Expect    Experience
Be undecided                        7%        20%
Change majors                      12%        65-85%
Fail a course                       1%        16%
Extra time to complete degree       8%        60%
Drop out                            1%        40%
Transfer institutions              12%        28%
Work while in school               36%        60%
Seek personal counseling            6%        27%
Need tutoring                      15%        20%
Seek career guidance                5%        25%

(Habley, 2011)

Non-Assessment

(continued)

• Use and value of articulation agreements – number of students who use them, whether they are kept updated
• Currency of academic policies, e.g., course repeats, course pre-requisite criteria, drop/add/withdrawal processes, academic warning, probation, and suspension policies
• Does advisor training result in better advising?

Summary

• Assessment is a process, not an event
• Collaboration and cooperation are necessary for productive assessment to occur
• “An ounce of data is worth a pound of opinion” (Magoon, c. 1975) – avoid the N of 1 syndrome
• The purpose and results of assessment should always be used for program and/or advisor improvement in order to realize maximum student development and success

Questions?

[email protected]