
Back Again……
Westdrewten 2013
to…..
All of the CWS Staff
With special thanks to
Carol Peters and Dan Cook-Huffman
WONDERFUL STUDENTS
Nancy Vooys
Joel Rhodes
AND NOW ALUMNI
Nick Talisman
Alden Hiller
Catie Stroup
Rudi Carter
Courtney DeGemmis
Brooke Luke
Paige Stott
The questions
Broadly ...
• How do institutional policies shape attendance at cultural events?
• How does attendance at cultural events on campus impact students?
What we have…
Class of 2014
• Manipulation CWS:
• 1) Required 5 events;
• 2) 5 events restricted; or
• 3) 0 events
• Fall 2010 – Soc ID (pre & post), Openness to Experience, Cultural Event Inventory (CEI), Student Event Critiques
• Spring 2011 – Soc ID, CEI
• Spring 2012 – Soc ID, CEI
Class of 2015
• Manipulation CWS:
• 1) Required 8 events; or
• 2) 0 events
• Fall 2011 – Soc ID (pre & post), Openness to Experience, Cultural Event Inventory (CEI), Student Event Critiques
• Spring 2012 – Soc ID, CEI
Also have GPA and retention data for each semester
Attendance at Events
Currently….
Connecting
• Is there a relationship between
• Social Identity (across time)
• Openness to Experience
AND
• Number of events attended (across time)
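The relationship question above is essentially a correlation analysis. A minimal sketch of how such a relationship could be tested, using hypothetical openness scores and event counts (not the study's data), via a hand-rolled Pearson correlation:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Covariance numerator and the two standard-deviation terms.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores: openness to experience vs. number of events attended.
openness = [3.1, 4.2, 2.8, 3.9, 4.5, 3.3]
events = [2, 6, 1, 4, 7, 3]
print(round(pearson_r(openness, events), 3))
```

In practice one would pull these columns from the semester-level dataset and also test Soc ID scores against attendance the same way; this only illustrates the computation.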
And….
CODING STUDENT EVENT CRITIQUES
• Reviewed 763 reaction papers from 206 students
• Initial review of essays coded for:
1. Expectations about the event
2. Prior knowledge, experience, or familiarity with the event/type of event
3. Specific connection with the event itself
4. Overall assessment of the event
5. Judgment about how the event reflected on the campus/college community
6. Judgment about the impact upon the attendee themselves
Rates of Reported Impact (n = 763)

Category          Rater 1   Rater 2
Positive Impact    57.4      59.1
Negative Impact    40.5      38.5
No Reference        2.0       2.2

Inter-Rater Reliability: Kappa = 0.768
(Substantial agreement; Landis & Koch, 1977)
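Cohen's kappa corrects raw percent agreement for the agreement two raters would reach by chance given their marginal category frequencies. A minimal sketch of the computation, using made-up codes for a handful of papers rather than the study's actual ratings:

```python
from collections import Counter

def cohens_kappa(codes1, codes2):
    """Cohen's kappa from two parallel lists of category labels."""
    assert len(codes1) == len(codes2)
    n = len(codes1)
    # Observed agreement: proportion of items both raters coded the same.
    observed = sum(a == b for a, b in zip(codes1, codes2)) / n
    # Chance agreement: sum over categories of the product of each
    # rater's marginal proportion for that category.
    freq1, freq2 = Counter(codes1), Counter(codes2)
    expected = sum((freq1[c] / n) * (freq2[c] / n) for c in freq1)
    return (observed - expected) / (1 - expected)

# Hypothetical impact codes ("pos" / "neg" / "none") for eight papers:
r1 = ["pos", "pos", "none", "neg", "pos", "none", "pos", "pos"]
r2 = ["pos", "pos", "none", "pos", "pos", "none", "pos", "neg"]
print(round(cohens_kappa(r1, r2), 3))
```

By the Landis & Koch (1977) benchmarks cited above, values between 0.61 and 0.80, such as the reported 0.768, indicate substantial agreement.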
Frequency of Impact Categories
What’s new
Coding Impact: Part 2
Type of Impact
1) Attitude
2) Knowledge
3) Behavior
Frequency of Impact Categories*
*Preliminary data
Inter-rater reliability: Kappa = 0.768
Next?
• Self-selection bias? Communication issues? Timing issues?
• Other measures of impact we have missed?
• Anything else we should be worried about?