Selecting Assessment Tools for Gathering
Evidence of Learning Outcomes
Lehman College Assessment Council
February 24, 2010
Timeline

Fall 2009
• Articulate learning goals and objectives for majors and programs.
• Identify learning opportunities in the curriculum and places where students demonstrate learning of objectives. (February 16 target date)
• Middle States report due April 1

Spring 2010
• First Assessment Plan
• Programs begin gathering evidence.
• Supporting workshops
• Results and Analysis reported (end May)
• Learning objectives on syllabi continue

Fall 2010
• First completed assessment cycle of student learning goals
• Report on how assessment results were used
• Identify 2nd goal and begin to gather evidence on the second goal
• Supporting workshops through fall semester

Spring 2011
• Second completed assessment cycle of student learning goals
• Analyze evidence
• Report on how assessment results were used (May)

Ongoing assessment
Timeline: Spring 2010

February 16: Curriculum Maps; Assessment Plans
April 16: Assessment Council workshop
May 31: Assessment Results (with supporting documents)
Ongoing: Evidence gathering; Meetings with ambassadors; Development opportunities; *** Syllabi ***
What do we want our students to learn?

What knowledge, skills, abilities, and habits of mind do we expect graduates of our program to have?
Assessment Toolbox
Assessment tools recommended by Suskie (2004)
Portfolios
Tests (blueprinted – i.e., mapped back onto objectives)
Focus Groups
Interviews
Assignment Scoring Guides/Rubrics
Surveys
SEE ALSO: Suskie, Assessing Student Learning, 2nd ed., Ch. 2
Direct vs. Indirect Evidence

Direct evidence of student learning is tangible, visible, self-explanatory evidence of exactly what students have and haven't learned.

Indirect evidence provides signs that students are probably learning, but evidence of exactly what they are learning may be less clear and less convincing.

While indirect evidence (feedback/surveys) can be useful, direct evidence is often best for getting concrete indications that students are learning what we're hoping they're learning.
This Assessment Cycle: Direct Evidence

Embedded course assignments (written/oral)
Department-wide exams (blueprinted)
Standardized tests (blueprinted)
Capstone projects
Field experiences
Pre-test/post-test score gains
Videotape and audiotape evaluations of students' own performance
Portfolio evaluation and faculty-designed examinations
Summaries and assessments of electronic class discussion threads
Student reflections on outcomes related to values, attitudes, and beliefs

See: Suskie (2009), ch. 2, table 2.1
What Is a Scoring Guide or Rubric?

A list or chart describing the criteria used to evaluate or grade completed student assignments such as presentations, papers, performances, etc.
Includes guidelines for evaluating each of the criteria
Serves both as a means of evaluating student work and as a way of providing meaningful feedback to students
Using Scoring Guides/Rubrics to Assess Program Goals

How can rubrics be used to assess program learning goals?
Embedded course assignments
Capstone experiences
Field experiences
Employer feedback
Student self-assessments
Peer evaluations
Portfolios
Using a Scoring Guide/Rubric: Advantages

Clarify vague, fuzzy statements such as "Demonstrate effective writing skills"
Help students understand expectations
Help students self-improve (metacognition)
Make scoring easier and faster
Make scoring accurate, unbiased, and consistent
Reduce arguments with students
Help improve teaching and learning
Developing a Scoring Guide/Rubric: Steps

Step I – Look for models
Step II – Define the traits or learning outcomes to assess: Structure, Content, Evidence, Presentation, Technical, Accuracy, etc.
Step III – Choose the scale / levels of performance (5-point/3-point, letter grades, Excellent-Poor, etc.)
Step IV – Draw a table
Step V – Describe the characteristics of student work at each level. Start with the high end, then the low end, and then describe the points in between. (A minimal sketch of Steps IV-V follows below.)
Step VI – Pilot test the rubric
Step VII – Discuss the results
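To make Steps IV and V concrete, here is a minimal sketch, in Python, of a rubric "drawn as a table" in code, with a point value and level description for each criterion and a helper that totals a score. The criteria, levels, point values, and descriptions are illustrative assumptions, not a rubric from any particular Lehman program.

```python
# Minimal sketch of Steps IV-V: the rubric as a small table in code.
# Criteria, levels, points, and descriptions below are hypothetical examples.

RUBRIC = {
    "Organization": {
        "Exemplary": (3, "Ideas are logically ordered with smooth transitions."),
        "Satisfactory": (2, "Ideas are mostly ordered; some transitions are abrupt."),
        "Below Expectations": (1, "No apparent organization."),
    },
    "Evidence": {
        "Exemplary": (3, "Claims are consistently supported by relevant evidence."),
        "Satisfactory": (2, "Most claims are supported; some evidence is weak."),
        "Below Expectations": (1, "Claims are asserted without support."),
    },
}

def score(ratings):
    """Total the points for one student's ratings, e.g. {"Organization": "Exemplary", ...}."""
    return sum(RUBRIC[criterion][level][0] for criterion, level in ratings.items())

print(score({"Organization": "Exemplary", "Evidence": "Satisfactory"}))  # -> 5
```

The same table can then be pilot tested (Step VI) by having two raters score a few sample assignments and comparing their totals.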
Developing a Scoring Guide/Rubric: List the Things You're Looking For

Why are we giving students this assignment?
What are its key learning goals?
What do we want students to learn by completing it?
What skills do we want students to demonstrate in this assignment?
What specific characteristics do we want to see in completed assignments?
Using Scoring Guides/Rubrics: Rating Scale
[Example rating-scale rubric shown on slide]
Source: Assessing Student Learning: A Common Sense Guide by Linda Suskie

Using a Scoring Guide/Rubric: Descriptive Rubric
[Example descriptive rubric shown on slide]
Source: Assessing Student Learning: A Common Sense Guide by Linda Suskie
How It All Fits Together: Course Embedded

EXAMPLE 1 – Communication (any discipline)
Program Goal I: Students will be able to communicate effectively
Learning Objective IA: Express ideas in a clear and coherent manner in an oral presentation
Class: Speech 101
Assignment: Students will make a persuasive argument (pro or con) on a current domestic or international issue (health care, the Afghan war, the financial crisis, etc.)
Assessment Technique: Using an oral presentation rubric, students will be evaluated on Organization, Content, and Style
How It All Fits Together: Course Embedded Rubric

Organization
Below Expectations (0-2): No apparent organization. Evidence is not used to support assertions.
Satisfactory (3-5): The presentation has a focus and provides some evidence that supports conclusions.
Exemplary (6-8): The presentation is carefully organized and provides convincing evidence to support conclusions.

Content
Below Expectations (0-2): The content is inaccurate or overly general. Listeners are unlikely to learn anything or may be misled.
Satisfactory (5-7): The content is generally accurate, but incomplete. Listeners may learn some isolated facts, but they are unlikely to gain new insights about the topic.
Exemplary (10-13): The content is accurate and complete. Listeners are likely to gain new insights about the topic.

Style
Below Expectations (0-2): The speaker appears anxious and uncomfortable, and reads notes rather than speaks. Listeners are largely ignored.
Satisfactory (3-6): The speaker is generally relaxed and comfortable, but relies too often on notes. Listeners are sometimes ignored or misunderstood.
Exemplary (7-9): The speaker is relaxed and comfortable, speaks without undue reliance on notes, and interacts effectively with listeners.

Total Score: ___

Source: Assessing Academic Programs in Higher Education by Mary J. Allen
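As a worked illustration of how the point ranges above combine: the maximum possible total is 8 (Organization) + 13 (Content) + 9 (Style) = 30 points. A presentation rated Satisfactory on Organization (say, 4 points), Exemplary on Content (say, 11), and Satisfactory on Style (say, 5) would earn 4 + 11 + 5 = 20 of a possible 30.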
How It All Fits Together: Capstone

EXAMPLE 2 – Research (any discipline)
Program Goal I: Students will understand how to conduct research
Program Goal II: Students will be able to demonstrate proficiency in writing mechanics
Program Goal III: Students will understand sociological concepts and theories
Objectives: Several objectives pertaining to each of the above goals will be assessed (see the program's learning goals and objectives)
Class: Sociology 495
Assignment: Students will write a 20-page research paper on a topic in sociology. Students are expected to develop a thesis, gather and analyze information, synthesize this information to support their argument, and demonstrate proficiency in writing mechanics.
Assessment Technique: Using a research rubric developed and tested by department faculty, students' work will be evaluated on six different criteria. The rubric is attached.
How It All Fits Together: Capstone Rubric

Level 4
Thesis/Problem/Question: Student(s) posed a thoughtful, creative question that engaged them in challenging or provocative research. The question breaks new ground or contributes to knowledge in a focused, specific area.
Information Seeking/Selecting and Evaluating: Student(s) gathered information from a variety of quality electronic and print sources, including appropriate licensed databases. Sources are relevant, balanced, and include critical readings relating to the thesis or problem. Primary sources were included (if appropriate).
Analysis: Student(s) carefully analyzed the information collected and drew appropriate and inventive conclusions supported by evidence. Voice of the student writer is evident.
Synthesis: Student(s) developed an appropriate structure for communicating the product, incorporating a variety of quality sources. Information is logically and creatively organized with smooth transitions.
Documentation: Student(s) documented all sources, including visuals, sounds, and animations. Sources are properly cited, both in-text/in-product and on Works-Cited/Works-Consulted pages/slides. Documentation is error-free.
Product/Process: Student(s) effectively and creatively used appropriate communication tools to convey their conclusions and demonstrated thorough, effective research techniques. Product displays creativity and originality.

Level 3
Thesis/Problem/Question: Student(s) posed a focused question involving them in challenging research.
Information Seeking/Selecting and Evaluating: Student(s) gathered information from a variety of relevant sources, print and electronic.
Analysis: Student(s)' product shows good effort was made in analyzing the evidence collected.
Synthesis: Student(s) logically organized the product and made good connections among ideas.
Documentation: Student(s) documented sources with some care. Sources are cited, both in-text/in-product and on Works-Cited/Works-Consulted pages/slides. Few errors noted.
Product/Process: Student(s) effectively communicated the results of research to the audience.

Level 2
Thesis/Problem/Question: Student(s) constructed a question that lends itself to readily available answers.
Information Seeking/Selecting and Evaluating: Student(s) gathered information from a limited range of sources and displayed minimal effort in selecting quality resources.
Analysis: Student(s)' conclusions could be supported by stronger evidence. Level of analysis could have been deeper.
Synthesis: Student(s) could have put greater effort into organizing the product.
Documentation: Student(s) need to use greater care in documenting sources. Documentation was poorly constructed or absent.
Product/Process: Student(s) need to work on communicating more effectively.

Level 1
Thesis/Problem/Question: Student(s) relied on teacher-generated questions or developed a question requiring little creative thought.
Information Seeking/Selecting and Evaluating: Student(s) gathered information that lacked relevance, quality, depth, and balance.
Analysis: Student(s)' conclusions simply involved restating information. Conclusions were not supported by evidence.
Synthesis: Student(s)' work is not logically or effectively structured.
Documentation: Student(s) clearly plagiarized materials.
Product/Process: Student(s) showed little evidence of thoughtful research. Product does not effectively communicate research findings.
Group Exercise
Write a Rubric!
Using Surveys as an Assessment Tool

Potential Advantages

Measure of attitudes, dispositions, values, and habits of mind
Complement other forms of assessment data
Triangulation of data from different perspectives about how well a goal or objective is met
Efficient way of gathering information from program completers or alumni in the workforce
Example: Using Complementary Survey Data to Evaluate Program Outcomes

ECCE Undergraduate Program Goal:
Candidates must be able to plan instructional tasks and activities appropriate to the needs of students who are culturally diverse and those with exceptional learning needs in elementary schools. They must be able to teach the literacy skills of listening, speaking, reading, and writing to native English speakers and students who are English language learners at the childhood level, including methods of reading enrichment and remediation. (NYSDOE, ACEI)

Divisional surveys of education program completers, conducted every semester, showed that graduates overall do not feel they are well prepared to teach English Language Learners effectively.
Example: Using Complementary Survey Data to Evaluate Program Outcomes

ECCE courses where the goal is addressed: ECE 300, ECE 301, ECE 431, ECE 432, ECE 433

ECCE program assessment data:
Student portfolios (lesson plans demonstrating differentiated instruction for ELLs and effectiveness at having ELLs meet lesson objectives)
Student teaching evaluations (completed by students, teachers, supervisors)
Online survey data at the focus school from graduates and "experts" (cooperating teachers, college supervisors) using a Likert rating scale (a minimal tally sketch follows below)

Recent changes in the program:
Explicit attention to working with ELLs during field placements attached to courses prior to final-semester student teaching
Program faculty are revisiting the content of courses at program meetings (return to curriculum mapping)
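To show how Likert-scale survey responses like these might be summarized alongside the direct evidence, here is a minimal sketch in Python. The item wording, the 1-5 scale, and the sample ratings are hypothetical illustrations, not the division's actual survey or data.

```python
# Minimal sketch of tallying Likert-scale survey responses.
# Items, the 1-5 scale, and ratings below are hypothetical examples.
from statistics import mean

# 1 = strongly disagree ... 5 = strongly agree
responses = {
    "I feel well prepared to teach English Language Learners.": [2, 3, 2, 4, 3],
    "I feel well prepared to differentiate instruction.": [4, 4, 3, 5, 4],
}

for item, ratings in responses.items():
    agree_share = sum(r >= 4 for r in ratings) / len(ratings)  # share rating 4 or 5
    print(f"{item}\n  mean = {mean(ratings):.2f}, agreeing = {agree_share:.0%}")
```

Summaries of this kind (item means, percent agreeing) are what would then be set beside the portfolio and student-teaching results for triangulation.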
Example: Using Complementary Survey Data to Evaluate Program Outcomes

One Goal. Three Assessments. Triangulation:

Direct evidence: portfolio assessment
Direct evidence: teaching performance evaluations scored using rubrics
Indirect evidence: later survey (graduates, cooperating peers) regarding student preparedness for real-world teaching
Using Surveys as an Assessment Tool

Potential Limitations

Self-report data from a survey may or may not accurately reflect student learning (an indirect measure of student learning)
Responses might be influenced by participants' perceptions of what they believe the evaluator wants to hear, e.g., if the course instructor is conducting the survey
If used as an "add-on" assessment, participation is voluntary and may require follow-up
Issues of sampling, validity, and reliability
Assessment Council Membership
Salita Bryant (English) [email protected]
Nancy Dubetz (ECCE) [email protected]
*Robert Farrell (Lib) [email protected]
Judith Fields (Economics) [email protected]
Marisol Jimenez (ISSP) [email protected]
Teresita Levy (LAPRS) [email protected]
Lynn Rosenberg (SLHS) [email protected]
Robyn Spencer (History) [email protected]
Minda Tessler (Psych) [email protected]
Janette Tilley (Mus) [email protected]
Esther Wilder (Soc) [email protected]
*Committee Chair
Administrative Advisor – Assessment Coordinator
• Ray Galinski - [email protected]
References/Resources

Suskie, L. (2004). Assessing student learning: A common sense guide. San Francisco: Anker Publishing Co.
Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco: John Wiley & Sons.
SurveyMonkey: www.surveymonkey.com