IDEA--Student Ratings of Instruction


IDEA Student Ratings of Instruction:
A Diagnostic Guide for Improvement
Dr. Kristi Roberson-Scott
Purpose of Presentation

• Interpretation of the Student Ratings of Instruction
• Forms
• Reports
• Interpreting the Diagnostic Form Report for Improved Teaching Effectiveness
IDEA is an acronym for...

Individual Development and Educational Assessment
IDEA Uses

The IDEA system:
• Should be able to use IDEA for ID = Individual Development
• Should be able to use IDEA for EA = Educational Assessment
Improvement of Student Learning

Student ratings can have a positive impact if...
• The instrument
  – Is "learning focused"
  – Provides a diagnostic
• The emphasis for "summative" faculty evaluation is appropriate
  – 30%-50% of the overall evaluation of teaching
  – Results are not over-interpreted
  – Faculty trust the process
IDEA: What you should know about the student ratings

• Reliability and validity of the IDEA system
• How to interpret IDEA reports and use IDEA resources for faculty improvement plans
• How to interpret adjusted vs. unadjusted scores
• How to use group summary reports for program improvement
IDEA: What you should know about the student ratings

• How to complete the FIF
• How to interpret the reports
• How to use IDEA reports to improve teaching effectiveness
• How to use IDEA resources to improve teaching
• Student ratings are data that must be interpreted
Student Ratings – Reliable? Valid?

• In general, student ratings tend to be statistically reliable, valid, and relatively free from bias (probably more so than other data used to evaluate teaching)
• Reliability – the likelihood that you will get the same results if the survey is administered again to the same group of students
• Validity – the survey measures what it is supposed to/intended to measure
Reliability & Validity

Sample report header:
Dog, Saul T.
IDEA University
Spring 2007
Composition I 1010 (MWF – 10:00)

There were 12 students enrolled in the course and 9 students responded. Your results are considered unreliable because the number responding is so small. The 75% response rate indicates that results are representative of the class as a whole.

Number responding    Reliability
Fewer than 10        Unreliable
10-14                Marginally Reliable
15-24                Fairly Reliable
25-39                Reliable
40 or more           Highly Reliable
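As an illustration only (not part of the IDEA system itself), the reliability bands above and the response-rate arithmetic from the sample report can be expressed as a small lookup. The 40-or-more cutoff for "Highly Reliable" is assumed, since the final band is garbled in this transcript.

```python
def reliability_category(num_responding: int) -> str:
    """Map the number of student respondents to the reliability band shown above."""
    if num_responding < 10:
        return "Unreliable"
    elif num_responding <= 14:
        return "Marginally Reliable"
    elif num_responding <= 24:
        return "Fairly Reliable"
    elif num_responding <= 39:
        return "Reliable"
    else:
        return "Highly Reliable"

# Saul T. Dog example: 9 of 12 enrolled students responded.
enrolled, responded = 12, 9
print(reliability_category(responded))              # Unreliable
print(f"{responded / enrolled:.0%} response rate")  # 75% response rate
```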
Understanding the Value of the IDEA System's Uniqueness

• Student Learning Focus
• Diagnostic Component
• Scores Adjusted for Extraneous Influences
  – What was the instructor's influence on learning?
• Documented Validity and Reliability
• National Comparative Data
• Group Summary Reports
  – Program Assessment
IDEA as a Diagnostic to Guide Improvement and as a Tool to Evaluate Teaching Effectiveness

Underlying Assumptions

Students are not qualified to assess:
• Faculty expertise
• Appropriateness of goals, content, and organization of the course
• Materials used in delivery
• How student work is evaluated, including grading practices
Underlying Assumptions

Nor are students qualified to assess "indirect" contributions to instruction:
• Support for departmental efforts
• Assistance to colleagues
• Contributing to a positive atmosphere
IDEA Student Ratings of Instruction: The Student Learning Model

Student Learning Model

• Types of learning must reflect the instructor's purpose
• Effectiveness is determined by student progress on the objectives stressed by the instructor
• Specific teaching behaviors influence certain types of student progress under certain circumstances
IDEA Student Ratings of Instruction – Forms

• Faculty Information Form
• Student Survey: Diagnostic Form

IDEA: FIF
Faculty Information Form

• One FIF per class being evaluated
• Course Information
  – IDEA Department Codes (extended list: http://www.idea.ksu.edu/StudentRatings/deptcodes.html)
• Course Description Items
• 12 Learning Objectives
  – Best answered toward the end of the semester
FIF: Selecting Objectives

Select 3-5 objectives as "Essential" or "Important":
• Is it a significant part of the course?
• Do you do something specific to help students accomplish the objective?
• Does the student's progress on the objective influence his or her grade?
• In general, progress ratings are negatively related to the number of objectives chosen (Research Note 3).
Relevant Objectives

• Basic Cognitive: Items 1, 2
• Applications of Learning: Items 3, 4
• Expressiveness: Items 6, 8
• Intellectual Development: Items 7, 10, 11
• Lifelong Learning: Items 9, 12
• Team Skills: Item 5
Best Practices

• Multi-section courses
• Curriculum committee review
• Prerequisite-subsequent courses
• Incorporate into course syllabus

Best Practices

Discuss the meaning of the objectives with students:
• Early in the semester
• Inform students that they will be asked to rate their own progress on the objectives
• Have students reflect on their understanding of the course purpose and how parts of the course fit the 12 objectives
• Discuss differences in perception of the objectives' meaning
Student Survey: Diagnostic Form

• Teaching Methods: Items 1-20
• Learning Objectives: Items 21-32
• Student and Course
  – Student Characteristics: Items 36-39, 43
  – Course Management/Content: Items 33-35
• Global Summary: Items 40-42
• Experimental Items: Items 44-47
• Extra Questions: Items 48-67
• Comments
False Assumptions

• Effective instructors effectively employ all 20 teaching methods.
• The 20 teaching methods items are used to make an overall judgment about teaching effectiveness.
• Students should make significant progress on all 12 learning objectives.
Using Extra Questions

• 20 Extra Questions available
• May be used to address questions at various levels:
  – Institution
  – Department
  – Course
  – Or all three

Student Survey

• How to use extra questions
• Comments
  – Constructive
Report Background: Comparison Groups and Converted Scores

The Report: Comparative Information

Comparison Groups
• IDEA
• Discipline
• Institution
Comparison Groups (norms)

IDEA Comparisons
• Classes rated in 1998-99, 1999-2000, and 2000-2001
• Diagnostic Form
• Exclude first-time institutions
• Exclude classes with fewer than 10 students
• No one institution comprises more than 5% of the database
• 128 institutions
• 44,455 classes
Comparison Groups (norms)

Discipline Comparisons
• Most recent 5 years of data (2000-2005)
• Minimum of 400 classes
• Exclusions same as IDEA Comparisons
• Also exclude classes with no objectives selected
Comparison Groups (norms)

Institutional Comparisons
• Minimum of 400 classes
• Most recent 5 years of data
• Exclude classes with no objectives selected
• Include all class sizes
Report Background

Report: Types of Scores
• Average Scores – numerical averages on a 5-point scale
• Converted Scores – compensate for different averages among course objectives and provide normative comparisons
• Raw Scores – unadjusted scores
• Adjusted Scores – compensate for extraneous factors beyond the instructor's control, i.e., "level the playing field"
Converted Scores – Why?

• A method of standardizing scores with different averages and standard deviations
• Able to compare scores on the same scale
Converted Averages

• In classes where "Gaining factual knowledge" was an Important or Essential objective, the average student rating of progress was 4.00 (5-point scale)
• In classes where "Gaining a broader understanding of intellectual/cultural activity" was an Important or Essential objective, the average rating of progress was 3.69
• If only 5-point averages are considered, instructors choosing the second objective would be at a disadvantage
Norms: Converted Averages

• Method of standardizing scores with different averages and standard deviations
• Able to compare scores on the same scale
• Uses T scores
  – Average = 50
  – Standard Deviation = 10
• These are not percentiles
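A minimal sketch of the conversion described above: a class's raw progress average is expressed as a T score relative to its norm group's mean and standard deviation. The 4.00 and 3.69 norm means come from the "Converted Averages" example; the standard deviation of 0.55 is an assumed, illustrative value, not an IDEA figure.

```python
def to_t_score(raw_average: float, norm_mean: float, norm_sd: float) -> float:
    """Convert a raw 5-point average to a T score (mean 50, SD 10) within its norm group."""
    return 50 + 10 * (raw_average - norm_mean) / norm_sd

# Illustrative: the same 4.10 class average compared against the two objectives'
# norm means from the previous slide, assuming a hypothetical norm SD of 0.55.
print(round(to_t_score(4.10, norm_mean=4.00, norm_sd=0.55), 1))  # 51.8
print(round(to_t_score(4.10, norm_mean=3.69, norm_sd=0.55), 1))  # 57.5
```

The same raw average lands in a different comparison category depending on which objective's norm it is compared against, which is exactly why converted scores are reported.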
Standard Deviation Tells us What?
What do the converted ratings mean?

• Much Higher: 63 or higher (highest 10%)
• Higher: 56-62 (next 20%, 71st-90th percentile)
• Similar: 45-55 (middle 40% of courses, 31st-70th percentile)
• Lower: 38-44 (next 20%, 11th-30th percentile)
• Much Lower: 37 or lower (lowest 10%)
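Treating the bands above as inclusive cutoffs (an assumption where the slide's ">" and "<" leave the boundary values ambiguous), the mapping from a converted score to its comparison category is a simple lookup:

```python
def idea_category(converted_score: float) -> str:
    """Map a converted (T) score to the comparison category bands listed above."""
    if converted_score >= 63:
        return "Much Higher"
    elif converted_score >= 56:
        return "Higher"
    elif converted_score >= 45:
        return "Similar"
    elif converted_score >= 38:
        return "Lower"
    else:
        return "Much Lower"

print(idea_category(57.5))  # Higher
print(idea_category(51.8))  # Similar
```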
Adjusted Scores

• Control for factors beyond the instructor's control
• Computed using regression equations
Adjusted Scores: Diagnostic Form

• Student Motivation (Item 39)
• Student Work Habits (Item 43)
• Class Size (enrollment, from the FIF)
• Course Difficulty (multiple items)
• Student Effort (multiple items)
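The slides do not give the regression equations themselves, so the following is only a schematic sketch of how an adjustment of this kind works: the extraneous factors predict an expected progress rating, and the adjusted score removes that predicted advantage or disadvantage from the raw average. Every coefficient and centering value below is invented for illustration and is not an IDEA value.

```python
# Schematic only: coefficients and centering values are invented, not IDEA regression weights.
GRAND_MEAN = 4.0  # hypothetical norm-group mean progress rating

def adjusted_progress(raw_avg: float, motivation: float, work_habits: float, class_size: int) -> float:
    """Remove the portion of the raw average predicted by extraneous factors."""
    expected = (GRAND_MEAN
                + 0.30 * (motivation - 3.0)     # Item 39, centered on a made-up mean
                + 0.20 * (work_habits - 3.0)    # Item 43, centered on a made-up mean
                - 0.002 * (class_size - 30))    # larger classes predicted slightly lower
    return raw_avg - (expected - GRAND_MEAN)

# A class with unusually motivated students has its raw average adjusted downward.
print(round(adjusted_progress(raw_avg=4.2, motivation=4.0, work_habits=3.5, class_size=25), 2))  # 3.79
```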
Impact of Extraneous Factors

"Gaining Factual Knowledge" – average progress ratings by Student Motivation (Item 39) and Work Habits (Item 43). Source: Technical Report 12, page 40.

Work Habits   |          Student Motivation (Item 39)
(Item 43)     |  Low    Low Avg.   Avg.   High Avg.   High
Low           |  3.51     3.66     3.80     3.95      4.08
Low Avg.      |  3.60     3.76     3.91     4.05      4.07
Average       |  3.73     3.87     4.02     4.12      4.21
High Avg.     |  3.88     3.97     4.13     4.23      4.33
High          |  4.01     4.12     4.25     4.33      4.48
IDEA... The Report

The IDEA Report

Diagnostic Form Report
• What were students' perceptions of the course and their learning?
• What might I do to improve my teaching?
The Report: Questions

• What was the response rate, and how reliable is the information contained in the report?
• What overall estimates of my teaching effectiveness were made by students?
• What is the effect of "adjusting" these measures to take into consideration factors I can't control?
• How do my scores compare to those of the comparison groups?
Summary Evaluation of Teaching Effectiveness

[Report excerpt: the summary evaluation combines its component measures with weights of 50%, 25%, and 25%; in the standard IDEA report these correspond to Progress on Relevant Objectives, Excellent Teacher, and Excellent Course.]
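To make the weighting arithmetic concrete, here is a small sketch of combining three component scores at 50/25/25. The component names follow the standard IDEA report layout; the converted-score values are invented.

```python
# Illustrative 50/25/25 weighting of the summary measures (converted scores).
# Component names follow the standard IDEA report; the score values are invented.
components = {
    "Progress on Relevant Objectives": (52.0, 0.50),
    "Excellent Teacher":               (55.0, 0.25),
    "Excellent Course":                (49.0, 0.25),
}

summary = sum(score * weight for score, weight in components.values())
print(f"Summary evaluation: {summary:.1f}")  # Summary evaluation: 52.0
```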
Questions Addressed: Page 2

• How much progress did students report on the learning objectives that I identified as "Essential"?
• How does this progress compare to the available comparison groups?
• How much progress did students report on the "Important" objectives?
• How does this progress compare to the available comparison groups?
• Do conclusions change if "adjusted" rather than "raw" ratings are used?
Progress on Specific Objectives

[Report excerpt showing progress ratings on specific objectives; the individual values are not fully recoverable from this transcript.]
Questions: Teaching Effectiveness

• Which of the 20 teaching methods are most related to my learning objectives?
• How did students rate my use of these important methods?
• What changes should I consider in my teaching methods?
• Do these results suggest some general areas where improvement efforts should focus?
Improving Teaching Effectiveness

• POD-IDEA Center Notes: www.idea.ksu.edu/podidea
• POD-IDEA Center Learning Notes
• IDEA Papers: www.idea.ksu.edu/resources/Papers.html
• IDEA Seminars: www.idea.ksu.edu
Questions Addressed: Page 2

• How distinctive is this class with regard to the amount of reading, the amount of other (non-reading) work, and the difficulty of the subject matter?
• How distinctive is this class with regard to student self-ratings?

Description of Course and Students
Questions Addressed: Page 4

• What was the average rating on each of the questions on the IDEA form?
• How much variation was there in these ratings?
• Are the distributions of responses relatively "normal" (bell-shaped), or is there evidence of distinctive subgroups of students?
• What are the results for the additional questions I used?

Statistical Detail
Teaching & Student Learning Improvement

By using the recommendations afforded by the IDEA analysis, individual faculty members can formulate changes in their pedagogical methods and course structure that produce tangible results in the next semester's scores.
IDEA Results

Faculty whose adjusted T scores place them in the Similar, Higher, or Much Higher comparison categories for progress on relevant objectives in the majority of classes evaluated have arguably demonstrated effective teaching performance.
IDEA Results

• One semester of student ratings does not serve any useful long-term evaluation purpose
• Multiple (4-5) evaluations spread over time are needed to draw long-term implications
IDEA Results

• IDEA ratings should account for no more than 33% of the measure of teaching effectiveness
• Faculty are ill-served when teaching effectiveness is defined by a single measure of teaching performance
Teaching Effectiveness – IDEA Is One Piece of Data

Triangulation – Other Sources of Evidence
• Self-evaluation/reflective statement
• Description of current teaching
• Course materials
• Graded appraisal tools (tests, essays, papers, etc.)
• Feedback from mentors/colleagues
• Peer observations
• Classroom assessment/research efforts
Teaching Effectiveness – IDEA Is One Piece of Data

Consider:
• Teaching occurs over time; a single rating is just a snapshot (compare progress in the same course over several semesters)
• The type of course being evaluated
• The number of students responding, and the percentage of students responding
• General education course vs. major course
• Global summary items
• Written comments
• The number, kind, and difficulty of the learning objectives selected
Interpreting Diagnostic Form Reports

• Review results
• Use results to identify areas for improvement
• Use IDEA resources: http://www.idea.k-state.edu/podidea/index.html
IDEA Center – POD Resources

These succinct papers were written in collaboration with the Professional & Organizational Development Network in Higher Education (POD). As a resource to support teaching improvement, each is useful to anyone wanting to address specific ways to employ different teaching methods – each of which is utilized in the Diagnostic Form of the IDEA Student Ratings of Instruction System.
http://www.idea.k-state.edu/index.html (Learning Notes)
IDEA Trainers

Faculty trainers within the division: a trained faculty member who can assist other faculty with interpreting IDEA data and using IDEA resources to improve teaching effectiveness.
IDEA Resources

• Institutional Effectiveness & Research Office & website: http://www.roanestate.edu/effectiveness/resources/
• IDEA Online
• IDEA Papers
• POD-IDEA Center Notes
Faculty Workshop Dates

• Today, Saturday, August 23rd
• Oak Ridge Campus
  – Wednesday, Oct. 1st, 6:00-7:30 PM, Room TBA
• Cumberland Campus
  – Wednesday, Sept. 17th, 12:30-2:00 PM, Room TBA
  – Wednesday, Sept. 17th, 6:00-7:30 PM, Room TBA
• Harriman Campus
  – Harriman Campus, 1:00-2:30, O-101
  – Wednesday, Oct. 8th, 11:00-12:30 CST, Room TBA
• Additional workshops as needed
Questions