
Jo-Ellen Asbury, Ph.D.
Rebecca Kruse
Office of Institutional Research and Assessment
Stevenson University
 We don’t have all the answers.
 We invite audience input and insights.
 We are not cheerleaders for national tests; it was the decision that we made at that time.
  No, we get no kickback from ETS!
 We are not here to advocate use of the MAPP/ETS Proficiency Profile or any specific test or assessment. We want to share our experience and generate a conversation.
SU Core Curriculum Requirements (Bachelor’s Degree)
(General “Cafeteria” Style)
Minimum of 16 academic courses in the liberal arts and sciences and 1 course in physical education. All students must complete the following:
 Skills Courses:
 Three writing courses
 One communication course
 One physical education course
 Computer literacy requirement
 Distribution Courses:
 One fine arts course
 Two social science courses
 Three math and science courses (at least one lab)
 Four humanities courses
 Core Electives (2 courses, 6 credits)
 Foreign Language (Bachelor of Arts only) 2 courses
How to assess the General Education program
 Unlike a major (psychology, math, etc.), general education does not have:
  A firm, fairly prescribed list of requirements.
  A faculty member (or group of faculty members) who takes sole responsibility for oversight.
  A capstone project/paper/experience that could be used to assess student learning outcomes.
 Student learning outcomes for gen ed were evolving.
 Currently, no centralized oversight.
Possible General Education
Assessment Approaches
Individual Course-Based Approach
 Information is collected about learning in individual courses. Faculty demonstrate that students are acquiring the knowledge, skills, and values associated with one or more gen ed goals. Assignments, exams, portfolios, etc.
Multicourse (Theme-Based) Approach
 Focus on faculty from a number of disciplines rather than on individual courses. Review of syllabi, focus groups.
Noncourse-Based Approach
 Campuswide, focusing on individual students or groups of students rather than on courses. A gen ed assessment is given to all students or to a sample. Standardized testing, student and alumni surveys, transcript analysis.
Source: Palomba, C. A., & Banta, T. W. (1999). Assessing general education. In Assessment essentials:
Planning, implementing, improving (pp. 239-268). San Francisco: Jossey-Bass.
Selecting a Gen Ed Assessment Method
 The method(s) used need to match learning goals
 Because gen ed programs include a broad range of learning goals and objectives (critical thinking, communication, values, attitudes, ...), we need to be careful that the methods used will address all of these objectives
 May need more than one method
 We settled on some type of nationally normed instrument
~ from the paper “The Role of Published Tests and Assessments in Higher Education,” March 2006, by Linda Suskie, MSCHE Vice President
 Developed by testing professionals (better test design and question quality)
 Can provide comparison data
 Provide detailed, diagnostic feedback
 Variety of published tests to reflect diversity among schools and programs
 Confidence in longitudinal data
Examples of Published Tests
 ETS Measure of Academic Proficiency & Progress (MAPP)
 ACT Collegiate Assessment of Academic Proficiency (CAAP)
 Council for Aid to Education Collegiate Learning Assessment (CLA)

Examples of Tested Writing Skills
 Discriminate between appropriate and inappropriate use of parallelism.
 Recognize redundancy.
 Formulate an assertion about a given issue.
 Organize and connect major ideas.
 Support ideas with relevant reasons and examples.
 Sustain a coherent discussion.

Examples of Tested Critical Thinking Skills
 Evaluate competing causal explanations.
 Determine the relevance of information for evaluating an argument or conclusion.
 Generalize and apply information beyond the immediate context.
 Make appropriate comparisons.
 Deal with inadequate, ambiguous, and/or conflicting information.
 Spot deceptions and holes in the arguments made by others.

“The Role of Published Tests and Assessments in Higher Education,” Linda Suskie, Middle States Commission on Higher Education, March 25, 2006
 If there is no compelling incentive, students may not give their best effort. It is a challenge both to get students to take the test and to get them to give their best effort.
 Published tests for higher ed have less evidence of quality than K-12 tests (smaller numbers of students, samples that may not be representative, less funding, etc.).
 Certain published tests may not yield enough useful feedback.
from “The Role of Published Tests and Assessments in Higher Education,” Linda Suskie, Middle States Commission on Higher Education, March 25, 2006
 Match goals for student learning set by the institution
 Specific content must correspond with the institution’s concepts (how does the institution define critical thinking, for example?)
 Provide rich, detailed feedback that can be used to
identify areas for improvement
 Have evidence of validity and reliability
 Provide some incentive for students to do their best
from “The Role of Published Tests and Assessments in Higher Education,” Linda Suskie, Middle States Commission on Higher Education, March 25, 2006
Selected the MAPP by ETS: Measure of Academic Proficiency and Progress (now called the ETS Proficiency Profile)
 Corresponds well with university core and measures what we want to
measure
 Several different formats to choose from (online, standard,
abbreviated)
 Can add up to 50 of our own supplemental questions
 Rich reporting features including comparative data and diagnostic
feedback, norm-referenced scores and criterion-referenced scores
 SU has changed so rapidly and is still changing – it is important for us to be able to do comparisons, benchmarking, see differences between cohorts, etc.
Measure of Academic Proficiency and Progress
(now called ETS Proficiency Profile…)
 Assesses four core skill areas – critical thinking, reading,
writing and mathematics at three levels
 Measures academic skills developed, as opposed to
subject knowledge taught, in general education courses
 Multitude of reporting options available
 Comparison between cohorts/subgroups (separate out specific groups – majors, schools within the University, commuters vs. noncommuters, etc.; can ask different cohorts different supplemental questions) – see the sketch after this list
 Identify the specific proficiency level (1-3) of core skill deficiencies (ETS has specific definitions at each level)
 External and internal benchmarking
 Value-added – compare against other metrics such as GPA, SAT, etc.
 Identify patterns (e.g., do students do better in certain areas if certain courses are taken in a certain order?)
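Where such subgroup comparisons are run in-house rather than read off the ETS reports, the computation is straightforward. A minimal sketch in Python, assuming a hypothetical CSV export with student_id, subgroup, and total_score columns (not an ETS file format):

```python
# Minimal subgroup comparison of mean total scores.
# "mapp_scores.csv" and its column names are hypothetical stand-ins.
import csv
from collections import defaultdict

groups = defaultdict(list)
with open("mapp_scores.csv", newline="") as f:
    for row in csv.DictReader(f):
        groups[row["subgroup"]].append(float(row["total_score"]))

# Report group size and mean score for each subgroup.
for name, scores in sorted(groups.items()):
    print(f"{name}: n={len(scores)}, mean={sum(scores) / len(scores):.1f}")
```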
Test students when they enter, then test again at a later point in their Stevenson career.
 WHEN should the second testing take place?
 Internal validity threats
 History
 Maturation
 Mortality
 Selection
 Testing
Compensates for (most of) the internal validity threats.
Provides both between-subject data (e.g., comparing Cohort 1 freshmen with Cohort 2 freshmen) and within-subject data (the same students tested as freshmen and again as sophomores).
                COHORT #1 (entered F ’08)   COHORT #2 (enters F ’10)
AY 2008-2009    Fall 2008                   –
AY 2009-2010    Spring 2010                 –
AY 2010-2011    –                           Fall 2010
AY 2011-2012    –                           Spring 2012
 Administer to incoming freshmen
 Test the same students again at the end of sophomore year
How do we get a large number of freshmen to
take the test?
 Commitment from the Director of First Year Experience to administer in First Year Seminars (all incoming freshmen take an FYS)
 Goes on the syllabus
 Peer leaders (not us) to administer
 Test version? (long, abbreviated, online)
 2007 – used the long version (2 hrs); switched to the abbreviated version (40 mins)
 Cost (tests, materials)
 Student leader instructions for administering
 Very specific instructions / script
 Customize instruction book
 Materials to and from student leaders
 Tests, pencils, instructions, ID Cards, calculators
384 freshmen took the test in fall 2008
 Where and how can we test that many students now as sophomores?
 Do we test all 384 at the same time, on the same day, in the same location? Do we have the room on campus?
 Do we have enough supplies to test all at one time?
 What’s the best time during the semester?
 Who would proctor the tests?
 How do we get sophomores to volunteer to take the test? There is no easy way to capture them – no single class that all sophomores take.
Used to use scholarship hours
Pizza lunch
Gift card drawings
Offered choice of two different days
Marketed through emails, plasma screens in
student union, faculty
A week before, response was still not great
Added more gift cards
Opened it up to ALL sophomores, not just the ones who took it as freshmen
46 students out of 384 signed up
27 showed up, split between the two days
 Gift certificates or pay for all students who take the test
 Change the test format – use the online format
 Reward those with high scores so the test is taken seriously and they do their best
  ETS reports that the most effective approach is a combination of extrinsic and academic rewards – something to get them there and something to get them to take it seriously
  Put high scores on an honor roll
 Make it a requirement for registration for junior year
 Withhold grades until the test is taken
 Try the online, non-proctored version.
 Recruit 100 random students from the 384 tested as freshmen in 2008 who didn’t retake it in the spring (see the sketch below).
 Give each one a $10 gift card to take it online.
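A minimal sketch of that random draw in Python; the ID lists here are hypothetical stand-ins for the actual testing rosters:

```python
# Draw 100 students at random from the 384 fall-2008 testers who did not
# retake the test in spring 2010. The ID lists below are hypothetical.
import random

fall08_testers = [f"SU{i:04d}" for i in range(1, 385)]   # the 384 freshmen
spring_retakers = set(fall08_testers[:27])               # the 27 spring retakers
eligible = [s for s in fall08_testers if s not in spring_retakers]

random.seed(2010)                  # fixed seed so the draw is reproducible
recruits = random.sample(eligible, k=100)
print(len(recruits))               # 100
```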
Cohort 1:
Summary of Stevenson University Proficiency Classifications (national comparison in parentheses)

                     | Proficient            | Marginal              | Not Proficient
                     | FR (FA08) | SO (SP10) | FR (FA08) | SO (SP10) | FR (FA08) | SO (SP10)
Reading Level 1      | 56% (57%) | 67% (59%) | 24% (22%) | 26% (23%) | 20% (20%) |  7% (18%)
Reading Level 2      | 26% (28%) | 44% (27%) | 19% (19%) | 26% (20%) | 56% (53%) | 30% (53%)
Reading Level 3*     |  3% (4%)  |  7% (3%)  |  9% (12%) |  7% (10%) | 88% (84%) | 85% (86%)
Writing Level 1      | 62% (61%) | 70% (59%) | 26% (25%) | 26% (28%) | 12% (14%) |  4% (13%)
Writing Level 2      | 15% (16%) | 19% (14%) | 38% (35%) | 37% (34%) | 48% (49%) | 44% (52%)
Writing Level 3      |  6% (7%)  |  4% (5%)  | 24% (25%) | 30% (24%) | 70% (68%) | 67% (71%)
Math Level 1         | 52% (50%) | 59% (45%) | 26% (28%) | 22% (29%) | 22% (22%) | 19% (26%)
Math Level 2         | 26% (25%) | 22% (19%) | 26% (26%) | 37% (26%) | 48% (49%) | 41% (54%)
Math Level 3         |  7% (6%)  |  7% (4%)  | 14% (15%) | 11% (12%) | 78% (79%) | 81% (84%)

*Reading Level 3 = Critical Thinking. FR = freshmen (FA08); SO = sophomores (SP10).
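Reading the table: subtracting the freshman percentage from the sophomore percentage gives the shift in each classification (keeping in mind the small sophomore n of 27). A quick sketch for the Reading Level 1 row, with SU values copied from the table above:

```python
# Percentage-point shifts between administrations for Reading Level 1.
# Pairs are (FA08 freshmen %, SP10 sophomores %), taken from the table.
rows = {
    "Proficient":     (56, 67),
    "Marginal":       (24, 26),
    "Not Proficient": (20, 7),
}
for label, (fa08, sp10) in rows.items():
    print(f"{label}: {sp10 - fa08:+d} percentage points")
# Proficient: +11, Marginal: +2, Not Proficient: -13
```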
Cohort 1:
Distribution of Individual Student Scores and Subscores

                    | Possible | SU Mean (FA08) | Natl. Freshmen | SU Mean (SP10)  | Natl. Sophomore | SU Increase/   | SU Increase/
                    | Range    | n=380 Freshmen | Comparison     | n=27 Sophomores | Comparison      | Decrease (pts) | Decrease (%)
Total Score         | 400-500  | 439.81         | 441.1          | 443.41          | 439.6           | 3.60           | 0.82%

Skills Subscores:
Critical Thinking   | 100-130  | 110.23         | 110.3          | 110.26          | 110.0           | 0.03           | 0.03%
Reading             | 100-130  | 116.34         | 117.1          | 118.89          | 117.1           | 2.55           | 2.19%
Writing             | 100-130  | 113.75         | 113.8          | 114.67          | 113.5           | 0.92           | 0.81%
Mathematics         | 100-130  | 112.80         | 113.0          | 112.81          | 112.0           | 0.01           | 0.01%

Context-Based Subscores:
Humanities          | 100-130  | 113.59         | 113.9          | 114.30          | 113.8           | 0.71           | 0.63%
Social Sciences     | 100-130  | 112.15         | 112.6          | 112.59          | 112.5           | 0.44           | 0.39%
Natural Sciences    | 100-130  | 114.17         | 114.0          | 115.93          | 113.9           | 1.76           | 1.54%
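The increase/decrease columns follow directly from the cohort means: points = SP10 mean − FA08 mean, and percent = points ÷ FA08 mean. A quick arithmetic check on three rows, with values copied from the table above:

```python
# Verify the SU increase/decrease columns from the FA08 and SP10 means.
means = {
    "Total Score":      (439.81, 443.41),
    "Reading":          (116.34, 118.89),
    "Natural Sciences": (114.17, 115.93),
}
for name, (fa08, sp10) in means.items():
    pts = sp10 - fa08
    print(f"{name}: {pts:+.2f} pts ({pts / fa08:+.2%})")
# Total Score: +3.60 pts (+0.82%)
# Reading: +2.55 pts (+2.19%)
# Natural Sciences: +1.76 pts (+1.54%)
```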
 Determine the mechanism for internal decision-making and the process used for identifying deficiencies and implementing change
 Share results
 Other measures of same core skills
 Content mapping
- assessing general education?
- recruiting students?
- using data and closing the loop?
- other?
Suskie, L. (2006, March 25). The role of published tests and assessments in higher education. Middle States Commission on Higher Education. Retrieved from http://www.msche.org/publications/published-instruments-in-higher-education.pdf
ETS® Proficiency Profile Case Studies. (2008). Educational Testing Service. Retrieved from http://www.ets.org/proficiencyprofile/case_studies/
ETS® Proficiency Profile Content. (n.d.). Educational Testing Service. Retrieved from http://www.ets.org/proficiencyprofile/about/content/
Walvoord, B. E. (2004). For general education. In Assessment clear and simple: A practical guide for institutions, departments and general education (pp. 67-79). San Francisco: Jossey-Bass.
Palomba, C. A., & Banta, T. W. (1999). Assessing general education. In Assessment essentials: Planning, implementing, improving (pp. 239-268). San Francisco: Jossey-Bass.