Essentials of Assessment - Indiana State University


Eric Hampton, Ph.D.
Cindy Crowder, Ph.D.
Assessment in Context
 Evaluation
 A process of reaching a conclusion, judgment, or
decision about an evaluation object. This involves
judging the worth of something (Scriven, 1967).
 Assessment
 Procedures and processes which identify, collect, and
prepare data to serve evaluative needs (e.g. student
outcomes, educational objectives, program objectives).
 Measurement
 A process of systematically assigning numbers to
measured attributes according to established rules.
Attributes of High Quality Assessment (Stiggins, 1997)
1. Clear Targets
 Clear achievement targets are a must; know what you are after.
2. Focused Purpose
 Know why the assessment is conducted and how results will be used.
3. Proper Method
 The method of assessment must match the target.
4. Sound Sampling
 A representative sampling of possible performances is gathered.
5. Accuracy in Assessment
 Assessment limits error and bias in measurement.
Potential Educational Assessment Targets (Adapted from Stiggins, 1997)
 Knowledge
 Reasoning/problem solving
 Skill
 Creation of products
 Dispositions/attitudes
Potential Assessment Methods
Direct Assessment Methods
Student knowledge, skill or product is directly examined or observed.
Student performance on the direct measure is compared against
measurable performance criteria.
 Student work products assessment
 Classroom assessment of knowledge/reasoning
 Observations of student skills
 Standardized tests
 National certification exams
 Locally developed tests
 Juried review
 Simulations
 Internship evaluations based on learning outcomes
Potential Assessment Methods
Indirect Assessment Methods
Assessment data gathered on reported perceptions of student learning
 Student self-assessment of learning
 Surveys (alumni, employer)
 Exit Interviews
 Focus groups
Assessment Paradigms
Quantitative
 Gathering assessment data in numeric form.
 Can be analyzed statistically.
Qualitative
 Gathering assessment data in narrative form.
 Can provide rich detail.
Matching Data to Outcomes
Do not address a specific outcome with a global measure.
 Course grades are good reflections of overall performance in a class.
 Course grades are poor reflections of particular learning outcomes.
  Can be impacted by many factors not tied to the outcome (attendance, participation, etc.).
 Summative exam scores are good reflections of overall performance in a content area or construct of knowledge.
 Summative exams may not provide sufficient information about particular outcomes.
  Particular items from exams may provide a more direct measure of student performance on particular outcomes.
Matching Data to Outcomes
Rubrics are useful in assessment of specific performance criteria for outcomes (a minimal scoring sketch follows the steps below).
 Can be used in student papers, theses, presentations, portfolios, projects, etc.
1. Review existing rubrics for a match with desired student learning outcomes.
 Revise when the coherency of the match between rubric and outcome can be improved.
2. Analyze existing student work (course projects, papers, etc.) for match with desired student learning outcomes and develop new rubrics.
3. Find those outcomes not adequately assessed by the first two steps and develop new student performances and corresponding rubrics.
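As a rough illustration of how a rubric ties specific performance criteria to an outcome, here is a minimal sketch that scores a single piece of student work against a small analytic rubric. The criteria, the 4-point scale, and the ratings are hypothetical assumptions for illustration, not an ISU-prescribed instrument.

```python
# Minimal sketch: score one student artifact against a hypothetical analytic rubric.
# Criteria, level descriptors, and ratings are illustrative assumptions only.

RUBRIC = {
    # criterion -> descriptors for levels 1 (lowest) through 4 (highest)
    "Identifies the problem": ["missing", "partial", "adequate", "thorough"],
    "Supports claims with evidence": ["missing", "partial", "adequate", "thorough"],
    "Draws a justified conclusion": ["missing", "partial", "adequate", "thorough"],
}

def score_artifact(ratings):
    """Summarize rubric ratings (1-4) for one piece of student work."""
    for criterion, level in ratings.items():
        if criterion not in RUBRIC:
            raise ValueError(f"Unknown criterion: {criterion}")
        if not 1 <= level <= len(RUBRIC[criterion]):
            raise ValueError(f"Level out of range for {criterion}: {level}")
    total = sum(ratings.values())
    return {"ratings": ratings, "total": total, "average": total / len(ratings)}

if __name__ == "__main__":
    # Hypothetical ratings assigned by a reviewer to one student paper.
    print(score_artifact({
        "Identifies the problem": 4,
        "Supports claims with evidence": 3,
        "Draws a justified conclusion": 3,
    }))
```

Keeping ratings criterion by criterion, rather than as a single holistic grade, is what lets the results speak to a particular outcome instead of overall performance.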
Some Assessment Guidelines
 All assessment methods have advantages and disadvantages. There is no “perfect” method in the abstract.
 A good assessment has a strong match with the specific outcome to be assessed.
 A good assessment demonstrates validity for the purposes for which the data will be used.
 A good assessment is measured with accuracy and reliability.
 A good assessment is feasible in terms of resources (time, effort, and money).
 Use multiple methods of assessment and data gathering.
  Any one method of data collection carries its own strengths and weaknesses.
  Use of only one method leads to mono-method bias.
  Student learning or process outcomes should be approached from multiple aspects, utilizing multiple methods, and carried out by multiple individuals.
Assessment/Evaluation Steps
1. Generate assessment questions based on student learning outcomes.
2. Generate ideas on behaviors, skills, attitudes, performances, and dispositions which would provide data to answer these assessment questions.
 Compare questions with ideas, looking for congruence. If congruence is not perfect, consider whether ideas need to be added or removed, or assessment questions added or removed.
3. Assess existing data sources.
 What is already being gathered?
 What would need to be developed?
 What can be gathered?
Assessment/Evaluation Steps
4. Develop an assessment plan which collects data to provide answers to assessment questions.
 Identify how/when the data will be collected.
 Identify how/when the data will be analyzed.
 Identify how/when the data will be disseminated and to whom.
5. Operationally define each measure.
 What behaviors, skills, etc. are targeted?
 Where will each be exhibited or measured?
 In what way will each be exhibited or measured?
6. Develop measurement/evaluation/assessment instruments.
 Identify what differentiates successful and unsuccessful performance.
 Develop rubrics for assessment of performance.
Assessment/Evaluation Steps
7. Initial assessment of evaluation plan adequacy.
 Adjust plan/measures/metrics accordingly.
8. Carry out assessment.
 Collect initial data.
 Analyze initial data.
9. Evaluate analyzed data for adequacy in answering the assessment questions posed.
 Evaluate effectiveness of instruments.
 Evaluate effectiveness of analysis.
 Make revisions to the assessment plan if necessary.
Assessment/Evaluation Steps
10. Evaluate data in comparison to assessment questions.
 What are the strengths of student performance or program preparation?
 What are the weaknesses of student performance or program preparation?
11. Disseminate assessment results.
 Consider gathering and listening to feedback from stakeholders on the usefulness/appropriateness of the findings.
12. Continue data collection.
13. Revise assessment plan as necessary to meet
program/accreditation needs.
Assessment at ISU
 Standing Requirements
 Mission Statement
 Outcomes Library
 Curriculum Map
 Communication of Outcomes
 Assessment Plan
 2011-2012; 2012-2013; 2013-2014
 Assessment Cycle
 Assessment Findings
 Action Plan based on findings
 Status Report
Assessment at ISU
Timelines
 2011-2012 Assessment Cycle
 Assessment plan May 1, 2011.
 Assessment findings May 1, 2012.
 Action plan December 1, 2012.
 Status report May 1, 2013.
 2012-2013 Assessment Cycle
 Assessment plan May 1, 2012.
 Assessment findings May 1, 2013.
 Action plan December 1, 2013.
 Status report May 1, 2014.
Aspects Involved in Assessment
1. Mission Statement
2. Program Educational/Process Objectives (3-6)
3. Student Learning Outcomes/Process Outcomes (3-6 for each objective)
4. Student Learning Outcomes/Process Outcomes Aligned with Practices (Curriculum Mapping)
5. Assessment Plan: which objectives/outcomes will be assessed when; methods; performance targets
6. Assessment: Collection, Analysis of Evidence
7. Evaluation: Interpretation of Evidence
8. Action Plan
Mission Statement
• Links the function of the unit to the overall mission and
strategic priorities of ISU
• Identifies the program’s purpose
• Identifies the primary stakeholders (e.g., students)
• Formulating a mission statement:
 What is the primary function of the unit?
 What are the core activities?
 What should those whom you serve experience while/after interacting with your unit?
Outcomes Library
Language of assessment:
 LEARNING OBJECTIVE: general knowledge/skill/ability a student should have at the time of graduation.
 LEARNING OUTCOME: specific accomplishments to be achieved. What are you looking for in student performance to tell if they “get it”?
  Stated in the form of: <one action verb> + <one something>
 In general, aim for 3-6 OBJECTIVES and 3-6 OUTCOMES for each objective.
OBJECTIVE 1: Students will be able to conduct, analyze, and interpret experiments, and apply experimental results to improve processes.
 SLO 1.1: Students will develop and execute experiments to validate designs.
 SLO 1.2: Students will design and execute test plans as a part of system commissioning.
OBJECTIVE 2: Students function effectively on teams.
 SLO 2.1: Student gathers information that relates to the team’s topic.
 SLO 2.2: Student shares in the work of the team.
 SLO 2.3: Student listens to other teammates.
(Could be measured using a rubric during observation and for peer evaluation.)
Curriculum Map
 Educational experiences (e.g., courses, internships)
mapped to Objectives/Outcomes
 Ensure that experiences are present at appropriate
levels to support student achievement of each
outcome
 Communication tool
  Faculty identify gaps in curriculum (see the sketch following the example map below).
  Sharing with students helps them understand how their courses form a curriculum and support achievement of identified outcomes.
Curriculum Map
Example map for Educational Objective #1: courses and learning activities (EDU 101, EDU 115, EDU 207, EDU 242, EDU 302, EDU 499 Capstone) form the rows and Student Learning Outcomes 1.1-1.4 the columns; each cell is marked to show how the course supports that outcome.
Legend: I = Introduced; P = Practiced; R = Reinforced
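To make the gap-finding use of a curriculum map concrete, here is a minimal sketch that represents a map as course-to-outcome markings and lists any outcome no course introduces, practices, or reinforces. The course codes, outcomes, and markings are hypothetical placeholders, not the map of any actual ISU program.

```python
# Minimal sketch: represent a curriculum map as course -> {outcome: marking}
# and flag outcomes with no supporting course. All data here are placeholders.

OUTCOMES = ["SLO 1.1", "SLO 1.2", "SLO 1.3", "SLO 1.4"]

CURRICULUM_MAP = {
    "EDU 101": {"SLO 1.1": "I", "SLO 1.2": "I"},
    "EDU 207": {"SLO 1.1": "P"},
    "EDU 499": {"SLO 1.1": "R", "SLO 1.2": "R"},
}

def find_gaps(outcomes, curriculum_map):
    """Return outcomes that no course introduces (I), practices (P), or reinforces (R)."""
    covered = {o for markings in curriculum_map.values() for o in markings}
    return [o for o in outcomes if o not in covered]

if __name__ == "__main__":
    # With the placeholder map above, SLO 1.3 and SLO 1.4 are unsupported.
    print("Outcomes with no supporting course:", find_gaps(OUTCOMES, CURRICULUM_MAP))
```

The same structure can be extended to check sequencing, for example that an outcome is introduced before it is reinforced in the capstone.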
Communication of Outcomes
 A description of how students and other stakeholders are
informed about the programmatic learning outcomes.
 Identify programmatic stakeholders.
 Identify methods for informing constituents about learning
outcomes (e.g. program handbooks, syllabi, programmatic
documents, web publishing)
 The best method of communication for students may not be
the best method of communication for stakeholders.
 This is distinct from communication of assessment findings.
Assessment Plan
 Which outcomes are assessed?
 Include timeline – not all outcomes need be assessed
every year.
 How are the outcomes assessed?
 What methodology is employed to collect assessment
data?
 Where will outcomes be assessed?
 What are the targets for student achievement?
 Who is responsible for carrying out each element of
the assessment plan (identified by title)?
Assessment Cycles
 Developing assessment cycles:
 Don’t try to assess every outcome every year.
 Develop a timetable for assessment activities.
 Identify person(s) responsible for each assessment
activity.
 Try to avoid random acts of assessment.
Assessment Cycle Table
Example cycle: Student Learning Outcomes (SLO 1.1, 1.2, 1.3, 2.1, 2.2, 2.3, 3.1) form the rows and academic years 2010-11 through 2015-16 the columns; an X marks each year in which a given outcome is scheduled for assessment, so that every outcome is covered across the cycle without assessing every outcome every year.
Assessment Findings
 Aggregate your data.
 Analyze your data (a minimal aggregation sketch follows this list).
 Ask whether the target for achievement was met.
  If not, what are the recommendations for improvement?
  If you cannot tell from the collected data, revise your assessment plan.
 Provide supporting evidence (meeting minutes, etc.)
of faculty discussions of the assessment findings and
proposed improvements.
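To make the aggregate-analyze-compare sequence concrete, here is a minimal sketch that tallies rubric scores for one outcome and checks them against a performance target. The scores, the 4-point scale, and the target of 80% of students at level 3 or above are hypothetical assumptions, not ISU requirements.

```python
# Minimal sketch: aggregate rubric scores for one outcome and test a performance target.
# The scores, scale, and 80%-at-level-3 target are illustrative assumptions only.

from statistics import mean

def summarize_findings(scores, proficient_level=3, required_proportion=0.80):
    """Return a small findings summary for one student learning outcome."""
    proficient = sum(1 for s in scores if s >= proficient_level)
    proportion = proficient / len(scores)
    return {
        "n": len(scores),
        "mean_score": round(mean(scores), 2),
        "proportion_proficient": round(proportion, 2),
        "target_met": proportion >= required_proportion,
    }

if __name__ == "__main__":
    # Hypothetical rubric scores (1-4) for one SLO gathered in a single cycle.
    slo_scores = [4, 3, 3, 2, 4, 3, 2, 3, 4, 3]
    print(summarize_findings(slo_scores))
```

If the target is not met, the summary feeds directly into the Action Plan; if the data cannot answer the question, that is a signal to revise the assessment plan itself.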
Action Plan
 Focus on the findings.
 In light of the findings, what will the program do?
 Include a timetable for implementing this response.
 Evaluate and discuss the resources necessary to
support the action plan.
 Identify individuals responsible for ensuring that
implementation occurs.
Assessment and Evaluation Cycles
Definitions
From G. Rogers, ABET, Program Assessment of Student Learning: Keep it Simple

 Program Educational Objectives: broad statements that describe what graduates are expected to attain within a few years after graduation; early-career (3-5 years) attributes that students will be able to demonstrate upon graduation.
 Student (Learning) Outcomes: describe what students are expected to know and be able to do by the time of graduation. These relate to the knowledge, skills, behaviors, and attitudes that students acquire as they progress through the program.
 Performance Criteria: specific, measurable statements identifying the performance(s) required to meet the outcomes; confirmable through evidence.
 Assessment: one or more processes that identify, collect, and prepare data to evaluate the attainment of student outcomes and program educational objectives. Effective assessment uses relevant direct, indirect, quantitative, and qualitative measures as appropriate to the objective or outcome being measured. Appropriate sampling methods may be used as part of an assessment process.
 Evaluation: one or more processes for interpreting the data and evidence accumulated through assessment processes. Evaluation determines the extent to which student outcomes and program educational objectives are being attained. Evaluation results in decisions and actions regarding program improvement.
Questions?
An Assessment Plan Table
Objective 1: one row for each Student Learning Outcome/Process Outcome (1.1 through 1.5), with columns for:
 Courses/Educational Strategies
 Assessment Method(s)
 Source(s) of Assessment Data
 Time of Data Collection
 Person(s) Responsible