Addressing Assessment Fatigue: Generating Student Interest


Addressing Assessment Fatigue: Generating Student Interest, Participation, and Ownership in Assessment

Marissa Cope, StudentVoice
Nathan Lindsay, University of North Carolina Wilmington

Overview of Session

• General information on applicable research
• Techniques for generating student participation
• Coordination of assessment activity and student ownership
• Generating interest: sharing results with students
• Examples from UNCW and other campuses
• Discussion questions

Audience participation will be requested throughout the session.

Assessment fatigue among students has been a challenge at our university.

(Audience poll. Response options: A. Strongly Agree; B. Agree; C. Neither Agree nor Disagree; D. Disagree; E. Strongly Disagree; F. Don't Know/Not Applicable.)

General Information

• Survey response rates have been falling
  ▫ It is increasingly difficult to contact people
  ▫ Refusals to participate are increasing
• National survey response rates fell from about 60% in the 1960s to just above 20% in the 1980s
• Nonresponse may not be random
• Two strategies for correcting low response rates:
  1. Weight the data for nonresponse (see the sketch below)
  2. Implement strategies to increase response rates
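To make the first strategy concrete, here is a minimal sketch of weighting respondents so that under-represented groups count proportionally more. The strata, population shares, and respondent counts are illustrative assumptions, not figures from the presentation.

```python
# Minimal sketch of strategy 1: weighting survey data for nonresponse.
# The strata (class year) and all numbers are illustrative only.

population_share = {"first_year": 0.30, "sophomore": 0.25, "junior": 0.23, "senior": 0.22}
respondents      = {"first_year": 120,  "sophomore": 180,  "junior": 150,  "senior": 250}

total_respondents = sum(respondents.values())

# Weight = population share / respondent share, so under-represented groups count more.
weights = {
    group: population_share[group] / (respondents[group] / total_respondents)
    for group in population_share
}

for group, weight in weights.items():
    print(f"{group}: weight = {weight:.2f}")
```

By construction, a group that responds at half the rate implied by its share of the population ends up with roughly double the weight, which is what corrects for (non-random) nonresponse.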

Research on Survey Fatigue

• Multiple surveys are seen as a burden; we would expect response rates to decrease as requests increase
• Time is a major issue
• Example - 1998 study at the Air Force Academy:
  ▫ Anecdotal evidence from a survey about surveys
  ▫ 97 percent felt at least somewhat over-surveyed
  ▫ Respondents felt they should be surveyed only 3-4 times a year
  ▫ "Over-surveyed" meant a combination of frequent surveys perceived as irrelevant

Implications from Research

• The potential of multiple surveys can reduce response rates
• Non-respondents cite time concerns as a reason
• Effects of survey fatigue may be moderated by the salience of survey content
• The number of previous surveys may have an impact on current survey response
• Survey fatigue may have the biggest impact on surveys administered back-to-back
• A feeling of "I have done enough" (reciprocity)

Theories of Survey Response

1. Reasoned action approach
   ▫ Calculation of costs and benefits
   ▫ Social exchange: rewards, costs, and trust
     • Increase rewards
     • Reduce costs
     • Establish trust
2. Psychological approach
   ▫ Informal decision rules
   ▫ Norm of reciprocity
   ▫ Norm of social responsibility

Theories of Survey Response

• What this means in action:
  ▫ People are more likely to comply when the request appears to come from a legitimate authority
  ▫ Requests for help should be clear and straightforward in the survey invitation
  ▫ Emphasize that recipients are part of a select group

Techniques for Generating Student Participation


Technique 1: Multiple Contacts

• Pre-notification: provides information about the project, including data collection times, incentives, purpose, and contact information
• Invitation with survey information: repeats the pre-notification information
• Follow-up reminders: ideally sent only to non-respondents, at 2-5 day intervals
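One possible way to lay these contacts out as a concrete schedule is sketched below. The launch date, the 3-day reminder interval, and the function name are illustrative assumptions; the presentation itself only gives a 2-5 day reminder range and recommends a pre-notification 2-3 days before the survey mailing.

```python
# Minimal sketch of a multiple-contact schedule (pre-notification, invitation,
# reminders to non-respondents). Dates and intervals are illustrative only.
from datetime import date, timedelta

def contact_schedule(launch: date, reminders: int = 2, interval_days: int = 3):
    """Return (label, send date) pairs for one survey administration."""
    plan = [
        ("Pre-notification", launch - timedelta(days=2)),  # 2-3 days prior
        ("Invitation", launch),
    ]
    for i in range(1, reminders + 1):
        plan.append((f"Reminder {i} (non-respondents only)",
                     launch + timedelta(days=i * interval_days)))
    return plan

# Hypothetical mid-week launch date.
for label, when in contact_schedule(date(2024, 3, 6)):
    print(f"{when:%a %b %d}: {label}")
```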

We usually use pre-notifications for our surveys.

(Audience poll; same A-F agreement scale as above.)

Technique 2: Length

• The longer the survey, the higher the perceived cost
• There is a moderate correlation between length and nonresponse
• Studies recommend no more than:
  ▫ 22 questions
  ▫ 13 minutes
• Also consider the type of questions: text fields vs. answer choices

Our surveys are generally shorter than 22 questions.

(Audience poll; same A-F agreement scale as above.)

Technique 3: Incentives

• Research on the effect of incentives on college students is lacking
• Prepaid incentives consistently raise levels of response
• The effect of postpaid incentives is minimal
• Payment contingent upon completion = compensation, which has two effects:
  ▫ Getting paid removes the reciprocity aspect
  ▫ It creates expectations for future participation

Incentives for assessments improve response rates at our institution.

(Audience poll; same A-F agreement scale as above.)

Technique 4: Salience

Salience: how important or relevant a survey topic is to the survey recipient
• 12-14% increase in response rates for salient surveys
• Difficult to control
• Highlight salience in survey invitations

Technique 5: Statement of Confidentiality

• Anonymous vs. confidential
  ▫ Anonymous: responses are unidentifiable
  ▫ Confidential: responses can be identified but will not be shared
• A voluntary statement assuring participants of the confidentiality of information gathered through the survey (may be required)
• Reduces perceived cost
• Establishes trust
• Heightens awareness of what may be asked
• Misuse can lead to a decrease in response rates

Other Techniques

Technique 6: Request for Help
• Follows social responsibility theory
• Be careful of wording in the contact, especially with subject lines

Technique 7: Sponsorship
• Who is coordinating the survey
• Collaborations with an external entity may need elaboration

Technique 8: Deadlines
• A scarce/limited opportunity is more valuable
• Results are mixed: won't hurt, may not help

Contact/Invitation Checklist

• Should convey the importance of the study
• Should include a request for help
• Should guarantee confidentiality (as appropriate)
• Should provide information on how long it will take to complete the assessment
• Should indicate the time period during which the survey will be accepting responses
• Should include information on incentives

Timing of Contact/Administration

• Avoid busy times (e.g., finals) or holidays
• Send an email/pre-announcement 2-3 days prior to the survey mailing
• The first half of the semester/term may be better if you are surveying in an academic environment
• Mid-week invitations are typically recommended

Web Survey Practices

• Possible ways of motivating respondents to continue:
  ▫ Keep the design simple
  ▫ Make directions clear
  ▫ Include questions on the first page
  ▫ Break the survey up into multiple pages
  ▫ Utilize a progress tool
  ▫ Utilize skip patterns so that respondents do not have to see questions that do not apply to them (see the sketch after this list)
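As an illustration of the last two points, here is a minimal sketch of how skip patterns and a progress tool might work together in a web survey. The question IDs, wording, and skip conditions are hypothetical, not taken from the presentation.

```python
# Minimal sketch of skip patterns plus a progress indicator for a web survey.
# Question IDs and skip conditions are illustrative only.

questions = [
    {"id": "lives_on_campus", "text": "Do you live on campus?"},
    {"id": "residence_hall",  "text": "Which residence hall do you live in?",
     "skip_if": lambda answers: answers.get("lives_on_campus") == "no"},
    {"id": "overall_satisfaction", "text": "How satisfied are you with campus services?"},
]

def next_question(answers):
    """Return the next applicable question and percent complete, or (None, 100)."""
    applicable = [q for q in questions
                  if not q.get("skip_if", lambda a: False)(answers)]
    for i, q in enumerate(applicable):
        if q["id"] not in answers:
            progress = int(100 * i / len(applicable))
            return q, progress
    return None, 100

# Example: an off-campus student never sees the residence-hall question.
answers = {"lives_on_campus": "no"}
q, progress = next_question(answers)
print(progress, q["text"])  # 50 How satisfied are you with campus services?
```

Because progress is computed over only the applicable questions, skipped items never inflate the apparent length of the survey for a given respondent.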

Generating Interest, Participation, and Ownership in Qualitative Research

• Many of the same principles outlined for quantitative assessment apply
• For sampling, a purposeful sample is often the best strategy, but captive audiences can also be used
• Incentives (t-shirts, food, drinks) are helpful in recruiting participants
• Provide opportunities to discuss positive/negative feedback
• Students can help in data collection, analysis, and reporting

Ownership: Include Students

When developing and administering the project, reviewing the data, and discussing potential changes/action, think about:
• Resident assistants
• Peer mentors
• Orientation leaders
• Student club/organization leaders
• Tutors/Supplemental Instruction leaders
• Student employees

Coordination of Assessment Activity

• Overseeing body: a person or committee that monitors assessment activity for a department, division, or institution
• Assessment calendar for the division and institution
• Think about what other areas on campus are doing and collaborate with them
• Partnerships between units with related missions and objectives
• Find opportunities to share assessment activities and results (e.g., "Assessments of the Month")
• Share results widely (internally and externally): publish an Assessment Update

Sharing Results

• Know your audience
• Use drawings/other visuals to represent your assessment plan
• Use graphs/charts/other visuals to represent your findings:
  ▫ Size (biggest or smallest)
  ▫ How things change over time
  ▫ What is typical
  ▫ What is exceptional
  ▫ How one piece of data is related to another
• Share results back with assessment participants/publicize assessment results in meaningful ways

Sharing Results with Students at UNCW

• Reports posted on the Assessment Website and other websites
  ▫ See the University Learning Center website: http://www.uncw.edu/stuaff/uls/si_leaders.htm

• Results shared with leadership groups (Student Government, Resident Assistants, Campus Entertainment Executive Board, Peer Mentors)
• Normative data on healthy behaviors shared in a poster campaign
• "Your Voice Has Been Heard" marketing campaign
• Space-specific flyers within each department (e.g., Housing, Union)
• Campus television station/student newspaper
• Parent newsletter

Our institution has effectively shared assessment results with students.

(Audience poll; same A-F agreement scale as above.)

Other Examples from UNCW

• Career Center counseling evaluations and tutor evaluations in the University Learning Center
  ▫ A web link on a laptop was used instead of an emailed survey
• Health Center and Housing and Residence Life
  ▫ Bubble forms and paper/pencil surveys were used to capture responses from the captive audience
• Health Survey
  ▫ Acronyms removed and incentives bolded in the email invitation
• Campus Life Study
  ▫ High salience and $100 incentives
• Sustainability Film Series
  ▫ PDAs used directly after the film

Additional Campus Examples

• University of Michigan - Spring Commencement
  ▫ New, temporary location for graduation
  ▫ Initial PDA survey followed by online surveys to graduating students
  ▫ High publicity: emails, newspaper, website
• University of Richmond - First-Year Student Intervention
  ▫ Email from the Vice President of Student Development
  ▫ 6 questions about the experience at UR
  ▫ Direct follow-up by UR staff if requested

Discussion: Incentives

• What incentives have you used at your institution?

• How have these affected participation rates?

• What recommendations do you have for using incentives?

Discussion: Sharing Results

• How have you shared results with your students?

• Have you noticed any effects on future participation?

• Are there other “best practices” for sharing results that you can pass along?

Discussion: Other Best Practices

• What other recommendations do you have for… ▫ generating student interest?

▫ increasing student participation?

▫ enhancing student ownership in assessment?


Thank you!

Please feel free to contact us with any questions or further ideas regarding these topics!

Marissa Cope
Associate Director of Assessment Programs, StudentVoice
[email protected]

Nathan Lindsay, Ph.D.

Director of Student Life Assessment, University of North Carolina Wilmington
[email protected]