ASSESSMENTS - San Juan College

Welcome…
The attendee will understand assessment basics, with a focus on creating learning activities and identifying assessment expectations.
• Apply the definition of formative assessment to a specific program
• Apply the definition of summative assessment to a specific program
• Design formative learning activities and experiences that are linked to course outcomes to measure student learning and progress (assessment FOR learning)
• Design summative learning activities and experiences that are linked to course outcomes to measure student learning and progress (assessment OF learning)
• Create direct and indirect data-collection methods used in each program to assess student competency
• Identify the assessment expectation(s)/benchmark(s) that result from the assessment process in each program area
Did you know that…
“Students can learn better when their college experiences are not collections of isolated courses and activities but are purposefully designed as coherent, integrated learning experiences in which courses and out-of-class experiences build on and reinforce one another.”
(Suskie, 2004)
Assessment of Student Learning
The Higher Learning Commission defines assessment of student learning as a participatory, interactive process that:
• provides data/information you need on your students’ learning
• engages you and others in analyzing and using this data/information to confirm and improve teaching and learning
Assessment of Student Learning
• produces evidence that students are learning the outcomes you intended
• guides you in making educational and institutional improvements
• evaluates whether changes made improve/impact student learning, and documents the learning and your efforts
http://www.uni.edu/assessment/definitionofassessment.shtml
Drivers of Assessment
• Learning-centered paradigm
• Requirements for accreditation
• Discipline/program accreditation
• Accountability
• Support for faculty and students to improve their performance
Assessment Cycle
The assessment cycle is an ongoing process of:
• establishing clear, measurable expected outcomes of student learning or service (established learning goals/expected learning outcomes)
• ensuring that students or service users have sufficient opportunities to achieve those outcomes (provide learning opportunities/curriculum map)
Assessment Cycle
As well as…
• analyzing and selecting assessment methods used to monitor the alignment of the curriculum with the student learning outcomes (assessment methods used: direct and indirect)
Assessment Cycle
And…
• systematically gathering, analyzing, and interpreting evidence to determine how well student learning or service matches our expectations (assess student learning/expectations/benchmarks)
• using the resulting information to understand and improve student learning or service (use the results/improvement plan)
Assessment Cycle
[Cycle diagram] Expected Learning Outcomes → Learning Opportunities (Curriculum Map) → Methods Used to Assess Student Competency (Direct/Indirect) → Assessment Expectations/Benchmark → How Results Will Be Used (Improvement Plan) → back to Expected Learning Outcomes
Adapted from: http://manoa.hawaii.edu/assessment
Linking Assessments to Curriculum
When assessment is linked to the curriculum, it demonstrates what students know and are able to do. This becomes the ACHIEVED curriculum: what the assessment data say students know and are able to do.
Quantitative Assessments
Use structured, predetermined response options that can be summarized into meaningful numbers and analyzed statistically:
• GPA
• Grades
• Exam scores
• Standardized test scores
• Standardized teaching scores
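For illustration only, here is a minimal sketch (in Python, with made-up exam scores and a hypothetical passing cutoff, not SJC data) of how such quantitative results can be reduced to summary statistics for program review:

```python
# Minimal sketch: summarizing quantitative assessment results.
# The scores and the passing cutoff below are hypothetical, not SJC data.
from statistics import mean

exam_scores = [78, 85, 92, 64, 71, 88, 95, 59, 83, 76]  # one section's final-exam scores
passing_cutoff = 70                                      # assumed local passing score

pass_rate = sum(score >= passing_cutoff for score in exam_scores) / len(exam_scores)

print(f"Mean score: {mean(exam_scores):.1f}")
print(f"Pass rate:  {pass_rate:.0%}")
```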
Qualitative Assessments
Use flexible, naturalistic methods and are usually analyzed by looking for recurring patterns and themes:
• Ethnographic studies
• Exit interviews
• Participant observations
• Writing samples
• Open-ended questions on surveys and interviews
Formative Assessment
An informal product or performance designed to give the student immediate feedback for self-monitoring strengths and weaknesses relative to personal learning expectations.
“Assessment FOR learning”
Summative Assessment
A formal product or performance designed to inform the student and others about personal achievement regarding learning expectations.
“Assessment OF learning”
Formative vs Summative
FORMATIVE
• Improve teaching and learning (or service and satisfaction)
• Used while learning is taking place
• Focus on feedback and adjustment

SUMMATIVE
• Document learning or service and satisfaction
• Occurs at the end of a course or service period
• Focus on sum or total, with little feedback
Assessments in 2004
PERFORMANCE
• Students asked to demonstrate skills
• Authentic assessments – ‘real-life’ tasks
• Field experiences
• Studio assignments
• Projects
• Research papers

TRADITIONAL
• ‘Blue Book’ essay questions
• Oral examinations
• Controlled and timed exam setting
• Objective tests
Assessments in 2009
CONTEMPORARY
• Carefully aligned with goals and the most important information students must learn
• Focused on thinking and performance skills
• Developed from research and best practices on teaching and assessment

TRADITIONAL
• Often planned and implemented without considering learning goals
• Often focused on memorizing knowledge – low on Bloom’s Taxonomy
• Frequently poor-quality tests without consideration of mastery of the subject
Assessment at the Program Level
• Embedded course assignments
• Capstone experiences
• Field experiences
• Portfolios
• Certification tests
• Common Student Learning Outcome Rubric
Direct & Indirect Evidence of Student Learning
Data-collection methods for assessment purposes typically fall into two categories: direct and indirect. Direct evidence of student learning comes in the form of a student product or performance that can be evaluated. Indirect evidence is the perception, opinion, or attitude of students (or others). Both are important, but indirect evidence by itself is insufficient; direct evidence is required. Ideally, a program collects both types.
Types of Evidence
DIRECT
• Tangible, visible, self-explanatory, compelling, and acceptable
• Scores/pass rates on licensure/certification exams
• Portfolios of student work
• Capstone experiences

INDIRECT
• Less convincing indicators
• Grades
• Student self-ratings
• Student/Alumni satisfaction with learning
• Honors, awards, scholarships
Types of Direct Data-Collection Methods
DIRECT METHODS and Examples

Licensure or certification:
Nursing program students' pass rates on the NCLEX examination.

National exams or standardized tests:
a) Freshmen's and seniors' scores on the Collegiate Learning Assessment (CLA) or the Collegiate Assessment of Academic Proficiency (CAAP).
b) Senior-level biology students' scores on the GRE Subject Test in Biology.

Local exams (external to courses):
Entering students' scores on the SJC Placement Exams.

Embedded testing or quizzes:
Students' pass rates on course tests or final exams; for example, two questions from final exams are scored by a team of faculty members and the results are used for program-level assessment.

Embedded assignments:
The program selects course assignments that can provide information on a student learning outcome. Students complete these assignments as a regular part of the course. The assignments are scored using criteria or a scoring rubric, and the scores are used for program-level assessment; they are not typically used to give students a grade on the assignment. (A worked scoring sketch appears after this table.)

Grades calibrated to clear student learning outcome(s):
Professors give grades based on explicit criteria that are directly related to particular learning outcomes.

Portfolios:
A collection of student work such as written assignments, personal reflections, and self-assessments. Developmental portfolios typically include work completed early, middle, and late in the students' academic career so growth can be noted. Showcase portfolios typically include students' best work and aim to show the students' highest achievement level.
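For illustration only, here is a minimal sketch (hypothetical rubric scores, target level, and benchmark) of how embedded-assignment rubric scores might be checked against an assessment expectation such as "80% of students score 3 or higher on a 4-point rubric":

```python
# Minimal sketch: checking embedded-assignment rubric scores against a
# program expectation/benchmark. All values below are hypothetical.
rubric_scores = [4, 3, 2, 3, 4, 1, 3, 3, 2, 4]  # one outcome, scored by a faculty team

target_level = 3   # assumed "meets expectations" rubric level
benchmark = 0.80   # assumed program benchmark (80% at or above target)

proportion_meeting = sum(s >= target_level for s in rubric_scores) / len(rubric_scores)

print(f"{proportion_meeting:.0%} of students met the target level "
      f"(benchmark: {benchmark:.0%})")
if proportion_meeting >= benchmark:
    print("Benchmark met.")
else:
    print("Benchmark not met; results feed the improvement plan.")
```

Whether the benchmark is met (or not) then feeds the "How Results Will Be Used" step of the assessment cycle.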
Direct Data-Collection Methods
Pre-/post-tests:
When used for program assessment, students take the pre-test as part of a required introductory course. They take the post-test during their senior year, often in a required course or capstone course. (A gain-calculation sketch appears after this table.)

Employer's or internship supervisor's direct evaluations of students' performances:
Evaluation or rating of student performance in a work, internship, or service-learning experience by a qualified professional.

Observation of a student performing a task:
A professor or an external observer rates each student's classroom discussion participation using an observation checklist.

Culminating project (capstone projects, senior theses, senior exhibits, senior dance performances):
Students produce a piece of work, or several pieces, that showcases their cumulative experiences in a program. The work is evaluated by a pair of faculty members, a faculty team, or a team of faculty and community members.

Student publications or conference presentations:
Students present their research to an audience outside their program. Faculty and/or external reviewers evaluate student performance.

Description or list of what students learned:
Students are asked to describe or list what they have learned. The descriptions are evaluated by faculty in the program and compared to the intended student learning outcomes.
Example: After completing a service-learning project, students are asked to describe the three most important things they learned through their participation in the project. Faculty members evaluate the descriptions in terms of how well the service-learning project contributed to the program outcomes.
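For illustration only, here is a minimal sketch (hypothetical paired scores) of how pre-/post-test results might be summarized as gains for program assessment:

```python
# Minimal sketch: summarizing pre-/post-test gains.
# Scores are hypothetical and paired by student (same order in both lists).
pre_scores  = [52, 61, 45, 70, 58]
post_scores = [74, 80, 66, 85, 79]

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
average_gain = sum(gains) / len(gains)

print(f"Average gain: {average_gain:.1f} points")
print(f"Students who improved: {sum(g > 0 for g in gains)} of {len(gains)}")
```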
Types of Indirect Data-Collection Methods
INDIRECT METHODS and Examples

Student surveys:
Students self-report via a questionnaire (online, telephone, or paper) about their abilities, attitudes, and/or satisfaction.

End-of-course or mid-semester course evaluations:
Students report their perceptions of the quality of a course, its instructor, and the classroom environment.

Alumni surveys:
Alumni report their perceptions via a questionnaire (online, telephone, or paper). For example, alumni answer questions during a telephone survey about the importance of particular program learning outcomes and whether they are pertinent to their current career or personal life.

Employer surveys:
Potential employers complete a survey on the job skills that graduates should possess. Employers can also be asked about the quality of current employees who graduated from SJC.

Interviews:
Face-to-face, one-to-one discussions or question-and-answer sessions.

Focus group interviews:
Face-to-face, one-to-many discussions or question-and-answer sessions.

Percent of time or number of hours/minutes spent on various educational experiences in and out of class:
Students' self-reports, or observations made by trained observers, of time spent on, for example:
• co-curricular activities
• homework
• classroom active learning activities versus classroom lectures
• intellectual activities related to a student learning outcome
• cultural activities related to a student learning outcome
Indirect Data-Collection Methods
Grades given by professors that are not based on explicit criteria directly related to a learning outcome:
Grade point averages or grades of students in a program.

Job placement data:
The percent of students who found employment in a field related to the major/program within one year.

Enrollment in higher degree programs:
The number or percent of students who pursued a higher degree in the field.

Maps or inventories of practice:
A map or matrix of the required curriculum and instructional practices/signature assignments.

Transcript analysis or course-taking patterns:
The actual sequence of courses students take (instead of the program's desired course sequence).

Institutional/Program Research data:
Information such as the following:
• Registration or course enrollment data
• Class size data
• Graduation rates
• Retention rates
• Grade point averages
Specific examples:
a) Number of sections in each course.
b) Percent of seats filled.
c) Number of students who dropped a course after the first day of classes.
d) Average enrollment in sections by course level.
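For illustration only, here is a minimal sketch (hypothetical enrollment figures) of the kinds of institutional/program research calculations listed above, such as percent of seats filled and average enrollment per section:

```python
# Minimal sketch: simple institutional-research style metrics.
# Course names and enrollment figures are hypothetical, not SJC data.
sections = [
    {"course": "BIOL 111", "seats": 30, "enrolled": 28},
    {"course": "BIOL 111", "seats": 30, "enrolled": 19},
    {"course": "BIOL 211", "seats": 24, "enrolled": 24},
]

total_seats = sum(s["seats"] for s in sections)
total_enrolled = sum(s["enrolled"] for s in sections)

print(f"Number of sections: {len(sections)}")
print(f"Percent of seats filled: {total_enrolled / total_seats:.0%}")
print(f"Average enrollment per section: {total_enrolled / len(sections):.1f}")
```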
Guidelines for Selecting Assessment Methods
• The evidence you collect depends on the questions you want to answer…
1. Does the program meet or exceed certain standards?
2. How does the program compare to others?
3. Does the program do what it sets out to do?
4. How can the program experience be improved?
Using these assessment questions to guide method selection can help you set your data-collection priorities.
Adapted from Volkwein, J., Program evaluation and assessment: What’s the question? (1996).
Guidelines for Selecting Assessment Methods
• Use multiple methods to assess each learning outcome
• Include both direct and indirect measures
• Include qualitative and quantitative measures
• Choose assessment methods that allow you to assess the strengths and weaknesses of the program…finding out what is working well is only one goal of program assessment
Adapted from UMass Amherst, OAPA Handbook, Program-Based Review and Assessment
Your Turn…
Student Competency Assessments - Direct Methods
DIRECT METHODS
EXAMPLES
Student Competency Assessments - Indirect Methods
INDIRECT METHODS
EXAMPLES
Pilot Reports
Program-Specific Student Learning Outcomes Assessment Plan (DRAFT)
Program:
School:
Report submitted by:
Approval of the Dean:
(CURRICULUM 101) (CURRICULUM MAPPING) (ASSESSMENT BASICS) (EVALUATION)
Plan columns:
1. Program Outcomes (Observable Student Learning Outcome)
2. Course(s) in which the competency will be assessed
3. Method used to assess student competency (process/instrument/rubric used to assess performance level); D = DIRECT, I = INDIRECT (min. 1 each)
4. Assessment Expectations/Benchmark
5. How Results Will Be Used (to make improvements in courses, instruction, and/or student support activities)
Questions?
Comments?
Thank you for attending!
Reminder
Evaluation Measures
(with Guest Speaker Dr. Henry Oh)
Wednesday, April 20
4 – 6 pm
Thursday, April 21
8 – 10 am and 12 – 2 pm