Interpreting and Using Assessment Results


The Lehman College Assessment Council
http://www.lehman.edu/research/assessment/council-documents.php
Interpreting and Using Assessment Results
October 20, 2010
Timeline

Spring 2010
• First Assessment Plan
• Programs begin gathering evidence
• Supporting workshops
• Results and analysis reported
• Learning objectives on syllabi

Fall 2010
• First completed assessment cycle of student learning goals
• Identify goal/objective and begin gathering evidence on second goal (9/15)
• Report on how spring assessment results were used (11/15)
• Supporting workshops through the fall semester
• Submission of fall assessment results
• Syllabi collection

Spring 2011
• Middle States report due April 1
• Second completed assessment cycle of student learning goals
• Analyze evidence
• Report on how fall assessment results were used

Ongoing assessment
Timeline: Fall 2010

September 15
Assessment Plan identifying second major/program learning objective to be assessed

November 15
Completed Assessment Report, indicating how results from spring 2010 assessments are being used

Late December/January
Results from fall assessments due

Ongoing
• Evidence gathering
• Meetings with ambassadors
• Syllabi revisions
• Development opportunities
• Planning for next spring/fall assessments
Assessment as a Four-Step Continuous Cycle
(Establish learning goals → provide learning opportunities → assess student learning → use results for improvement)
Source: Suskie 2004: 4.
Step 1: Establishing Learning Goals

"Assessment begins not with creating or implementing tests, assignments, or other assessment tools but by first deciding on your goals: what you want your students to learn" (Suskie 2004: 73).
Overview of Exemplary Goals Across Departments
• Identify the contributions of key figures and events to the historical development of sociology as a scientific discipline
• Calculate and interpret descriptive and inferential statistics
• Make ethical decisions by applying standards of the National Association of Social Workers code of ethics
• Design and conduct a study using an appropriate research method
• Demonstrate an understanding of the basic research process and advocacy for the protection of human subjects in the conduct of research
Department-Specific Samples of Exemplary Goals and Objectives

Goal: Develop written proficiency in the target language consistent with an ACTFL Proficiency rating of Advanced-Mid (www.actfl.org/files/public/writingguidelines.pdf)
• Express themselves in written form on a variety of topics and in academic, professional and informal styles
• Employ grammatically correct and semantically appropriate language in their writing
• Use complex linguistic structures to express complex ideas
• Organize their thoughts into coherent and logical prose
• Demonstrate critical thinking in the target language
• Produce target-language writing ranging from letters to summaries to essays
Department-Specific Samples of Exemplary Goals and Objectives

Goal: Develop Visual Literacy
• Describe, analyze and interpret artwork of their own creation.
• Analyze, interpret and evaluate the form and content of works of art.
• Produce creative works that demonstrate innovation in concepts, formal language and/or materials.
• Compare and contrast contemporary works with their art historical antecedents.
• Analyze works of art contextually.
Step 2: Provide Learning Opportunities
• Provide multiple learning opportunities for a single goal across courses within a program
• Articulate learning goals for every assignment
  ◦ Identify specific important learning goals for each assignment, then create meaningful tasks or problems that correspond to those goals
• Some goals are not quantifiable: habits of mind, behaviors, etc.
  ◦ Allow time for reflection, honest self-appraisals of actions, minute papers
Learning Opportunities
• Provide a variety of assignments and assignment types
  ◦ Examples in Suskie, p. 158
• Will students learn significantly more from a larger assignment than from a shorter one, enough to justify the time that they and you will spend on it?
• Break apart large assignments into pieces that are due at various times ("scaffolding")
Suskie (2009), pp. 155-157
Step 3: Assessing Student Learning: Gathering and Analyzing Data (Direct Evidence)
• Direct evidence of student learning is tangible, visible, self-explanatory evidence of exactly what students have and haven't learned.
• Examples of Direct Evidence:
  ◦ Embedded course assignments (written/oral) graded with a rubric
  ◦ Department-wide exams
  ◦ Standardized tests
  ◦ Capstone projects
  ◦ Field experiences
  ◦ Score gains, pre-test/post-test
Suskie (2004), p. 95
Assessing Student Learning: Gathering and Analyzing Data (Indirect Evidence)
• Indirect evidence provides signs that students are probably learning, but evidence of exactly what they are learning may be less clear and less convincing.
• Examples of Indirect Evidence:
  ◦ Pre- and post-course surveys
  ◦ Open-ended questionnaire surveys
  ◦ Focus groups
  ◦ Tracking of admissions to graduate and professional schools
Suskie (2004), p. 95
Assessing Student Learning: Gathering and Analyzing Data (Direct Evidence)

Examples of Direct Evidence (Lehman departments utilizing each):
• Samples of student writing (e.g., journals, forensic reports, term papers, reflective papers) scored with a rubric: English; History; Languages and Literature; Latin American and Puerto Rican Studies; Philosophy; Anthropology
• Class observations of student responses to questions on assigned readings: Music; Environmental, Geographic & Geological Sciences; Nursing
• Homework assignments: Music
• Mid-term and final exams: Languages and Literature; Chemistry; Economics, Accounting and Business Administration; Health Sciences; Physics and Astronomy; Sociology; Social Work
• Pop quiz (grades not factored into final course grade): Speech, Language & Hearing Sciences
• Capstone experience (scored with a rubric): Health Sciences (Health Education)
• PowerPoint presentation (both written and oral skills assessed): Speech, Language & Hearing Sciences
• Rating of student skills by field supervisor: Nursing
Assessing Student Learning: Gathering and Analyzing Data (Indirect Evidence)

Examples of Indirect Evidence (Lehman departments utilizing each):
• Course grades: History; Environmental, Geographic and Geological Sciences
• Assignment grades not accompanied by a rubric: Social Work; Environmental, Geographic and Geological Sciences
• Student satisfaction with their learning expressed through surveys: Environmental, Geographic and Geological Sciences
Step 4: Closing the Loop: Using Results for Improvement

In your report, you will be asked to . . .
• Explain the implications of the assessment results for the program.
  ◦ How can the results be used to improve planning, teaching and learning?
  ◦ Are changes in the program suggested? If so, what kinds of changes? Are changes in the assessment plan indicated? If so, what kinds of changes? Program changes may refer to curriculum revision, faculty development, changes in pedagogy, student services, resource management and/or any other activity that relates to student success.
  ◦ What, if any, additional information would help inform decision making regarding student achievement of the objective(s)?
  ◦ What kinds of resources will you need to make changes?
Evaluating the Quality of Your Assessment Process

Using the results from the assessment of your first goal(s), discuss plans you have regarding your:
• Learning Goals
• Curriculum
• Teaching Methods
• Assessment Strategies and Tools
(See handout for questions to guide your thinking for each of these categories.)
Assessment Council Membership
• Kofi Benefo (Sociology) [email protected]
• Salita Bryant (English) [email protected]
• *Nancy Dubetz (ECCE) [email protected]
• Robert Farrell (Lib) [email protected]
• Judith Fields (Economics) [email protected]
• Marisol Jimenez (ISSP) [email protected]
• Lynn Rosenberg (SLHS) [email protected]
• Renuka Sankaran (Biology) [email protected]
• Robyn Spencer (History) [email protected]
• Minda Tessler (Psych) [email protected]
• Janette Tilley (Mus) [email protected]
*Committee Chair

Administrative Advisor – Assessment Coordinator: Ray Galinski - [email protected]
References/Resources

Suskie, L. (2004). Assessing student learning: A common sense guide. San Francisco: Anker Publishing Co., Inc.

Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco: John Wiley & Sons, Inc.