Course Evaluations Open a Door to Assessment


Using NSSE to Inform Course-Evaluation Revision
Edward Domber
Christopher J. Van Wyk
Drew University
NSSE Workshop, SCSU, October 2006
Mise en scène
• 23 items, written in the mid-1970s
• Scantron® with free response on reverse
• Conducted near end of semester
• Returned via dept chairs, 2-4 months later
  – In a plain manila envelope
  – With printout showing response distribution for section and means for section, department, division, and College
Spring 2000 Survey (part of Middle States self-study)
• “Drew’s current use of student course evaluations for assessing teaching and learning is adequate” (1 = strongly disagree; 6 = strongly agree)
• Faculty response
  – mean: 3.5
  – s.d.: 1.5 (among the largest on the survey)
Audience Participation
What would faculty on your campus say about student course-evaluation forms?
Memorable Words
• “When we surveyed several hundred faculty and administrators, we found a surprising lack of knowledge about the literature of student ratings and even about the basic statistical information necessary to interpret ratings reports accurately. That lack of knowledge correlated significantly with negative opinions about evaluation, student ratings, and the value of student feedback.” (Theall and Franklin, p. 46)
Two Handy Starting Points
• American Psychologist 52 (1997)
  – Greenwald, ed.
  – incl. McKeachie
• New Directions in IR, no. 109 (2001)
  – Theall, Abrami, Mets, eds.
  – incl. Theall & Franklin
  – incl. Kulik
Re: Presentation of Results
• “[T]he use of norms not only leads to comparisons that are invalid but also is damaging to the motivation of the 50% of faculty members who find that they are below average. Moreover, presentation of numerical means or medians (often to two decimal places) leads to making decisions based on small numerical differences” (McKeachie, p. 1223, emphasis added)
Re: Use of Results
• “[E]valuation experts usually advise teachers with low ratings to concentrate on their greatest relative weakness. Fix it, the experts advise, and the whole profile of ratings may go up. . . . changes in profile elevation are commonplace with highly intercorrelated rating scales” (Kulik, p. 22).
Facts about Drew’s data
• On each of the 11 items that are answered on a scale of 1-7:
  – Modal response = 7 (the best possible)
  – Mean response ≈ 5.9
  – s.d. ≈ 1.3
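A minimal sketch, in Python with pandas, of how such per-item summaries could be produced; the file name and column layout (one row per completed form, columns q1-q11 holding the 1-7 responses) are assumptions, not Drew’s actual data format:

```python
import pandas as pd

# Assumed layout: one row per completed evaluation form,
# columns q1..q11 holding the 1-7 responses to the scaled items
responses = pd.read_csv("evaluations.csv")

items = [f"q{i}" for i in range(1, 12)]
summary = pd.DataFrame({
    "mode": responses[items].mode().iloc[0],  # most frequent response
    "mean": responses[items].mean(),
    "s.d.": responses[items].std(),
})
print(summary.round(2))  # slide reports mode 7, mean ≈ 5.9, s.d. ≈ 1.3
```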
Analysis of Drew data, I
• Slight relationship (r < 0.2) between EXPECTED GRADE and other responses [N.B., “absence of evidence is not evidence of absence” (Rumsfeld)]
• 13 items load onto one factor that explains 38% of the variation in responses
• Both results replicate findings in the literature
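A hedged sketch of how these two results could be checked, using pandas and scikit-learn; the slides do not say what software or exact procedure Drew used, and the file and column names here are hypothetical:

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

# Hypothetical layout: q1..q13 are the scaled items,
# expected_grade is the student's anticipated grade
responses = pd.read_csv("evaluations.csv").dropna()
items = [f"q{i}" for i in range(1, 14)]

# Correlation of expected grade with each item (slide: r < 0.2)
print(responses[items].corrwith(responses["expected_grade"]))

# One-factor solution on standardized responses (slide: one factor
# explaining 38% of the variation in the 13 items)
X = StandardScaler().fit_transform(responses[items])
fa = FactorAnalysis(n_components=1).fit(X)
loadings = fa.components_[0]
explained = (loadings ** 2).sum() / len(items)  # share of total variance
print(f"one factor explains {explained:.0%} of the variation")
```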
Added to this mix:
NSSE results
Analysis of Drew data, II
• Specific request: what can current course evaluations tell us about engagement?
• Regression analyses using the following as independent variables (sketch after this list)
  – Class size
  – Level of study
  – Curricular division
  – Reason for taking
  – Anticipated grade
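A minimal sketch of one such regression with statsmodels, under the assumption of hypothetical column names; the slides do not specify the dependent variables or the model form:

```python
import pandas as pd
import statsmodels.formula.api as smf

evals = pd.read_csv("evaluations.csv")  # hypothetical file and columns

# One engagement-related item (here, reported effort) regressed on the
# five predictors named above; C() marks categorical variables
model = smf.ols(
    "effort ~ class_size + C(level) + C(division)"
    " + C(reason_for_taking) + anticipated_grade",
    data=evals,
).fit()
print(model.summary())
```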
Class Size Matters
• Two definitions of average
  – Mean class size: 18 (“as seen by faculty”)
  – Mean class size weighted by enrollment: almost 25 (“as seen by students”)
  – Why the difference? There are “more students in a large class than in a small class,” so a randomly chosen student is more likely to sit in a large one (toy example below)
• Punch line: Student effort, reported amount of work assigned, and satisfaction all decrease as class size increases.
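A toy illustration of the two averages; the enrollment numbers are invented for the example, not Drew’s:

```python
# Enrollments of five hypothetical classes (invented numbers)
sizes = [5, 10, 15, 40, 40]

# "As seen by faculty": every class counts once
mean_by_class = sum(sizes) / len(sizes)  # 22.0

# "As seen by students": weight each class by its own enrollment,
# since a randomly chosen student is likely to sit in a big class
mean_by_student = sum(s * s for s in sizes) / sum(sizes)  # ~32.3
print(mean_by_class, mean_by_student)
```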
Added Other Items
• Some phenomenological (e.g., pacing)
• “How often did instructor cancel class?”
• Some inspired by NSSE & the mission statement
  – Student inputs (e.g., how often the student missed class)
  – Student outputs (e.g., how much the class contributed to the ability to write clearly and effectively)
Sample Items Relating Mission to NSSE Items

Questions 12-22 assess the extent to which this course contributes to various learning objectives. We understand that not all courses are intended to contribute to all of the objectives listed. With that in mind, please feel free to select “not at all” if that seems to be the most appropriate answer for this course.
“College challenges students…to develop their capacities for:”
Phrase from Mission: “critical thought”
NSSE-Related Item (11c): “This class contributed to my ability to think critically.”
“College challenges students…to develop their capacities for:”
Phrase from Mission: “Effective Communication”
NSSE-Related Item (11d): “This class contributed to my ability to speak clearly and effectively.”
“College challenges students…to develop their capacities for:”
Phrase from Mission: “Problem Solving”
NSSE-Related Item (11f): “This class contributed to my ability to analyze quantitative problems.”
“College challenges students…to develop their capacities for:”
Phrase from Mission: “Living… in an increasingly diverse world”
NSSE-Related Item (11l): “This class contributed to my ability to understand an increasingly diverse world.”
“College challenges students…to develop their capacities for:”
Phrase from Mission: “Creativity”
NSSE-Related Item (none?): “This class contributed to my ability to be creative.”
Faculty Discussions
• Draft circulated
• Sticking points
  – “How often was class cancelled?”
  – Length
  – Order of items
  – Unipolar v. bipolar Likert scales (!)
Vote in May 2005
• Linked two items’ fate
  – “student missed class” (min response: never)
  – “class cancelled” (min response: never)
• Provide a cover sheet for instructors to explain “unusual circumstances”
• Condone the length for now; a follow-up will identify redundant questions
Work in Fall 2005
• On-line version developed
• Pilot of on-line version to check technology
  – Small sample of classes
  – Generate comments on each item
  – Check web-based interface
Work in Spring 2006
• Further refinement
• Plan for major pilot testing
  – 1/3 of courses sampled
  – Communication with faculty and students carefully designed
  – Both paper (current) and on-line versions administered and linked
  – Administration window planned
  – Ways of increasing response rate considered, but not instituted
Work in Fall 2006
• Pilot data: preliminary results
  – Response rate: N=2397 (85% paper, 66% on-line, with n=730 able to be matched)
  – Relationship of “paper” to new on-line form
    • Comparable paper and on-line items
    • Correlation of items measuring quality (sketch below)
  – Relationship holds within different course levels
  – Open-ended comments
  – Length, redundant items, and response rate
  – Considerable variability on NSSE-related items
  – Factor structure
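A minimal sketch of how the matched paper/on-line pairs could be linked and their comparable items correlated; the file names, ID columns, and item names are all assumptions:

```python
import pandas as pd

paper = pd.read_csv("paper_responses.csv")    # hypothetical files
online = pd.read_csv("online_responses.csv")

# Link the two administrations on student and course, yielding the
# matched pairs (n=730 in the pilot), then correlate each comparable item
matched = paper.merge(online, on=["student_id", "course_id"],
                      suffixes=("_paper", "_online"))
for item in ["q1", "q2", "q3"]:               # item names assumed
    r = matched[f"{item}_paper"].corr(matched[f"{item}_online"])
    print(item, round(r, 2))
```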
Reporting to Faculty and Faculty Evaluators
• Should we provide several report formats?
• Should we provide an overall rating of “teaching quality”?
• What kinds of norms should we supply?
  – Breakdown by class size?
• Should constituents get reports on-line?
• How do we protect student “confidentiality”?
• How do we report “qualitative” data?
Should we…
• Ask faculty to indicate their expectations for student responses to the “mission-related” and “other” items? Discrepancies may provide a benefit similar to NSSE-FSSE comparisons.
• Develop an overall measure of quality and provide more norms?
Work is Not Over
• Ongoing assessment was promised
• Eventually, hope to have a bank of optional items
• Suggestions for use by instructors
Summary: NSSE’s Influence
• NSSE results engaged and enraged faculty
• NSSE inspired content of many new items
• NSSE results suggested new kinds of questions to ask on course evaluations
Summary: NSSE’s Influence
• Embedding NSSE in course evaluations keeps NSSE prominent in faculty conversation
• New course evaluations will provide information useful for course revision, not just instructor evaluation
• Faculty able to derive useful information are less critical of the process.