Assurance of Learning in Business Schools: Observations, Implementation Issues & Guidance

Thomas G. Calderon
Chair & Professor of Accounting / Director, CBA Quality Assessment
College of Business Administration
The University of Akron
Akron, OH 44325-4802
Email: [email protected]
Tel: 330 972-6228

APLG/FSA 2006 Annual Seminar
February 12–14, 2006
http://aaahq.org/aplg/seminars/2006/program.htm
Material for this presentation is based on my experience in leading assessment projects since 1996, my various research projects, the AAA Teaching & Curriculum Section's Best Practices in Accounting Program Assessment (Calderon, Green & Harkness), and Kathryn Martell & Thomas G. Calderon, Assessment of Student Learning in Business Schools: Best Practices Each Step of the Way, Vols. 1 & 2, AIR/AACSB, 2005.
http://aacsb.edu/resource_centers/assessment/default.asp
About this Presentation

This presentation covers:
– Assurance of learning fundamentals
– Six observations on assurance of learning
– Lessons learned from each observation

The presentation offers practical guidance for implementing assurance of learning projects and identifies key issues that need attention.
Congruency, Alignment & Change

– Learning goals for each program are determined by the faculty and based on the program's mission
– Curriculum alignment
– Evidence showing that students are achieving the program's learning goals
– Use evidence to initiate change and continuous improvement
Curriculum Alignment

– What are the program's learning goals?
– Where in the curriculum are these goals addressed?
– What is taught and how are students assessed?

[Diagram: Program Mission → Program Learning Goals → Courses/Co-Curriculum]
Curriculum Alignment – From an AIS Course

Learning Goal 1 (Students will demonstrate knowledge and understanding of core accounting fundamentals: financial reporting, cost and management accounting, auditing, tax, and systems).
Covered? Yes.
How covered and assessed: This course (i) covers the basic AIS concepts and principles, business processes, data modeling methods, systems documentation techniques, data normalization, database approaches, internal control frameworks, fraud, auditing computer-based systems, and systems analysis and design issues, and (ii) applies various concepts and principles to the five transaction cycles (i.e., revenue, expenditure, payroll, manufacturing, and financial cycles). Special emphasis is placed on the COSO, SOX, and COBIT frameworks.

Learning Goal 3 (Students will demonstrate the ability to think creatively and apply their knowledge of accounting fundamentals in innovative ways).
Covered? Yes.
How covered and assessed: Students work both individually and in groups to examine internal control issues in various accounting cycles. Further, students work on a semester-long project on an emerging issue in AIS. As part of the requirements, students write a detailed report and make a formal presentation before the class.

Learning Goal 6 (Students will demonstrate the ability to research an issue, analyze qualitative and quantitative data, and integrate information from multiple sources).
Covered? Yes.
How covered and assessed: The semester-long project requires students to research an emerging issue, analyze various issues, and prepare a formal report. Students leverage OhioLINK sources, Web-based sources, and the library to conduct research and analysis. Another assignment requires students to analyze SOX Section 404 material weaknesses in selected companies' 10-K and 10-Q reports accessed from the SEC's website.
Evidence

– Not course grades
– Direct: based on actual samples of students' work; linked to program learning objectives
– Indirect: attitudes, opinions, and assumptions about learning; linked to program learning objectives
– Must be "public": shared with faculty in the program
– Must be based on individual student performance unless assessing teamwork
– May be shared with students
Closing the Loop

– Analyze and report evidence
– Share evidence with faculty and stakeholders
– Discuss evidence and implications
– Use evidence in decision-making processes
– Act on evidence and propose improvements
– Implement proposals and continuously improve
Plan Before You Leap!

Before starting, develop an assessment plan that includes:
– Learning objectives
– Evidence to be used
– Method to collect evidence
– Desired expectations
– Who is primarily responsible for collecting evidence
– When and how evidence will be reported, discussed, and used

Then follow the plan.
Observations and Practical Lessons

– Six observations on assurance of learning
– Lessons learned from each observation
– Examples and implementation ideas
Assessment and the apparent dominance of indirect evidence

Observation 1:
– Business schools and accounting programs use multiple types of evidence to assess learning
– Indirect evidence appears to dominate
Assessment and the apparent dominance of indirect evidence

Type of Evidence | Frequency | Percent (N=160) | Type
Student Survey | 114 | 71 | Indirect
Alumni Survey | 106 | 66 | Indirect
Required Capstone or Senior Project | 92 | 58 | Direct
Business Advisory Board input | 88 | 55 | Indirect
Employer Survey | 83 | 52 | Indirect
Syllabi Review | 77 | 48 | Indirect
ETS Major Field Test | 66 | 41 | Direct
EBI surveys | 60 | 38 | Indirect
Course-embedded assessment with program standards (rubric) | 59 | 37 | Direct
Student Interviews | 55 | 34 | Indirect
Student GPAs | 45 | 28 | Indirect
Focus Groups | 41 | 26 | Indirect
Assessment and the apparent dominance of indirect evidence

Lesson 1: Recognize important characteristics of assessment evidence
– Must be appropriate within the context of the school's mission
– Focus is on the degree program
– Must relate to specific learning objectives
– Place emphasis on direct evidence of student learning
– Use indirect evidence to supplement direct evidence
Critical pragmatism vs. scientific orthodoxy in evidence collection

Observation 2:
– Measurement is a concern for everyone
– There is an inherent quest to "measure" learning
– Measurement comes with certain connotations:
  • Precise
  • Objective and unbiased
  • Representationally faithful
  • Valid and reliable
– Assessment data are sometimes criticized as:
  • unreliable and unscientific,
  • having high inter-rater variability, and
  • difficult to develop and use
Critical pragmatism vs. scientific orthodoxy in evidence collection

Lesson 2: Recognize vital evidence collection and analysis issues
– Root evidence collection and measurement in common sense, and use evidence that facilitates dialogue and understanding of student learning.
– Avoid fixation on traditional validity concepts; instead, employ the contemporary holistic approach to validity: validity in use.
– Be aware of possible scoring variability among different raters. How much are you willing to accept?
Critical pragmatism vs. scientific orthodoxy in evidence collection

Lesson 2 (continued): Recognize vital evidence collection and analysis issues
– Build consensus at the alignment stage and in planning for assessment (e.g., standardized rubric development)
– Have dialogue and invest time to develop a shared understanding of evidence and traits
– Employ observable behavioral checklists rather than traditional Likert scales with inherently wide variability (see the example rubric below)
Example Behavioral Rubric (adapted from Fogarty 2005)

Trait: Recognizes own personal biases that can influence decision-making outcomes.
– Good: States assumptions and identifies and clarifies personal beliefs that may affect decision outcomes.
– Satisfactory: States assumptions and identifies, but does not clarify, personal beliefs that may affect decision outcomes.
– Not Acceptable: Does not state assumptions or does not identify personal beliefs that may affect decision outcomes.

Trait: Learns from history by including a discussion from the past where managers faced similar ethical issues.
– Good: Demonstrates a good appreciation for prior history where managers faced similar ethical issues.
– Satisfactory: Demonstrates a fair appreciation for prior history where managers faced similar ethical issues.
– Not Acceptable: Does not include a discussion of prior history where managers faced similar ethical issues.

Trait: Limits the expression of self-interest and other outcomes of marketplace logic.
– Good: Demonstrates substantive constraint in the expression of self-interest and other outcomes of marketplace logic.
– Satisfactory: Demonstrates some constraint in the expression of self-interest and other outcomes of marketplace logic.
– Not Acceptable: Demonstrates no constraint in the expression of self-interest and other outcomes of marketplace logic.

Trait: Abstains from the tendency to justify the means by virtue of the end.
– Good: Clearly abstains from the tendency to justify the means by virtue of the end; issues are clearly considered and decision-making is mindful of this tendency.
– Satisfactory: Shows awareness of the tendency to justify the means by virtue of the end, but actions to avoid the tendency are not very deliberate.
– Not Acceptable: Does not abstain from the tendency to justify the means by virtue of the end.
A Hypothetical Summary of Evidence Based on a Standardized Rubric

Trait | N | Good | Satisfactory | Not Acceptable
Recognizes own personal biases that can influence decision-making outcomes. | 200 | 30 (15%) | 160 (80%) | 10 (5%)
Learns from history by including a discussion from the past where managers faced similar ethical issues. | 200 | 22 (11%) | 88 (44%) | 90 (45%)
Limits the expression of self-interest and other outcomes of marketplace logic. | 200 | 40 (20%) | 100 (50%) | 60 (30%)
Abstains from the tendency to justify the means by virtue of the end. | 200 | 36 (18%) | 160 (80%) | 4 (2%)
Average | | 16% | 64% | 21%
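For clarity, here is a minimal sketch of the arithmetic behind the "Average" row: the per-trait percentages at each rating level are averaged across the four traits. The counts come straight from the hypothetical table above; the trait labels are abbreviated.

```python
# Average row of the hypothetical rubric summary: for each rating level,
# average the per-trait percentages (count / 200) across the four traits.
counts = {
    "Recognizes own personal biases":      (30, 160, 10),
    "Learns from history":                 (22, 88, 90),
    "Limits expression of self-interest":  (40, 100, 60),
    "Abstains from justifying the means":  (36, 160, 4),
}

n = 200  # students rated per trait
for level, idx in (("Good", 0), ("Satisfactory", 1), ("Not Acceptable", 2)):
    avg = sum(row[idx] for row in counts.values()) / (len(counts) * n) * 100
    print(f"{level}: {avg:.1f}%")
# Prints 16.0%, 63.5%, 20.5% -- the slide rounds these to 16%, 64%, and 21%.
```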
The assessor's dilemma and course-embedded assessment

Observation 3:
– Assessment is useful and necessary
– Assessment requires resources, commitment, and attention:
  • At many schools, students complete a battery of assessment center activities as part of a required course (Bommer, Rubin & Bartles 2005).
  • Rockhurst University invested significant sums to create an infrastructure to support course-embedded assessment (Bassett, Daley & Haefele 2005).
  • Many schools administer assessment activities in the capstone course (e.g., the Business Strategy & Policy courses at The University of Akron).
  • Surveys used by many schools are costly and generate reams of data (e.g., EBI).
  • Assessment evidence gathering, analysis, and reporting require significant time commitments.
  • Accounting departments spent less than $5,000 on assessment in 2000, and 60% said they had no resources for assessment in 2004 (Hindi & Miller 2000; Calderon et al. 2004).
  • Assessors (faculty) need time for other valued activities.
The assessor's dilemma and course-embedded assessment

Direct evidence:
– Specialized assessment activities (e.g., assessment center, assessment day, special tests): generally more overhead. UA uses the following in-house tests: the CCT (core curriculum test) and major field tests in finance, accounting, and marketing.
– Course-embedded assessment (done as part of regular, ongoing activities in a course): generally less overhead. UA uses "Management Projects"; cases, simulations, and projects; and writing samples and writing portfolios.

Indirect evidence:
– Attitudes and opinions (surveys, interviews, focus groups): generally more overhead. UA uses EBI surveys, graduating students' exit surveys, employer surveys, alumni surveys, and advisory board reviews.
The assessor's dilemma and course-embedded assessment

Lesson 3: Course-embedded assessment (CEA) transforms the evidence collection process from a potentially burdensome activity to one that is part of normal classroom activity
– Create a support infrastructure for CEA: Consensus, Classroom Activities, Training, Instruments, and a Medium for reporting and dialogue (CATIM)
– Focus on program learning goals rather than learning goals for a specific course
– Make achievement of program learning goals public
– Use incentives to generate faculty interest in CEA (e.g., Rockhurst)
Data reporting and active engagement vs. data hoarding and passive compliance

Observation 4:
– Close to 75 percent of business school deans report that their faculty discuss assessment data regularly at faculty meetings; most do so at least once each academic year.
– The USAF Academy reports data to faculty regularly and holds dialogue on how the data compare with expectations.
– Seton Hall and Ohio Northern prepare an annual assessment report for faculty.
– Kings College sets aside an Assessment Day for faculty to make presentations on assessment results.
Data reporting and active engagement vs. data hoarding and passive compliance

How do programs report results? Percent (N=105)
Faculty retreat | 26
Regular faculty meetings | 69
Departmental meetings | 44
Committee meetings | 44
Oral reports | 25
Written reports | 50
Special faculty meeting | 14

N=160 vs. N=105: what are the implications?
Data reporting and active engagement vs. data hoarding and passive compliance

Lesson 4: Assessment reports are the basis for dialogue, reflection, and proposals for change
– Offer relatively seamless opportunities for viewing and reviewing reports (e.g., WebCT)
– Discuss reports regularly
– Identify opportunities for improvement
– Create a set of action items after discussions
A Hypothetical Summary of Evidence Based on a Standardized Rubric

Trait | N | Good | Satisfactory | Not Acceptable
Recognizes own personal biases that can influence decision-making outcomes. | 200 | 30 (15%) | 160 (80%) | 10 (5%)
Learns from history by including a discussion from the past where managers faced similar ethical issues. | 200 | 22 (11%) | 88 (44%) | 90 (45%)
Limits the expression of self-interest and other outcomes of marketplace logic. | 200 | 40 (20%) | 100 (50%) | 60 (30%)
Abstains from the tendency to justify the means by virtue of the end. | 200 | 36 (18%) | 160 (80%) | 4 (2%)
Average | | 16% | 64% | 21%
Visualization and improvement opportunities

Observation 5: Some schools use innovative charts and graphs to focus faculty attention (e.g., Redle & Calderon 2005). Three example figures follow, with an illustrative code sketch after them.
[Figure: Radar chart of item analysis — the percent of students answering each of 40 test items correctly, plotted on radial axes from 0% to 100%.]
[Figure: Trends in the University of Akron's major field test in finance — mean and median student scores by test date, Fall 1996 through Fall 2005, plotted on a 40%–80% scale.]
[Figure: Domain objectives — percent scores by finance content area, ranging from 51.87% to 78.70%, across areas including time value of money, working capital management, security valuation, risk and return, portfolio theory and analysis, interest rates and financial markets, financial ratio analysis, capital budgeting, and financial institutions. Series compare results through Fall 2001, during 2002, through Fall 2002, through Spring 2004, through Fall 2004, and through Spring 2005.]
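As an illustration of the kind of attention-directing chart described above, here is a minimal matplotlib sketch of a 40-item radar chart. The item scores are randomly generated placeholders, not data from the presentation.

```python
# Illustrative sketch only: a radar (spider) chart of per-item percent-correct
# scores, similar in spirit to the item-analysis chart above. The scores are
# fabricated placeholders, not actual assessment results.
import numpy as np
import matplotlib.pyplot as plt

n_items = 40
rng = np.random.default_rng(0)
scores = rng.uniform(0.3, 0.95, n_items)  # fabricated percent-correct values

# Spread the items evenly around the circle and close the polygon.
angles = np.linspace(0, 2 * np.pi, n_items, endpoint=False)
angles = np.concatenate([angles, angles[:1]])
scores = np.concatenate([scores, scores[:1]])

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, scores, linewidth=1)
ax.fill(angles, scores, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels([str(i) for i in range(1, n_items + 1)], fontsize=6)
ax.set_ylim(0, 1)  # radial axis: 0% to 100% correct
ax.set_title("Radar Chart of Item Analysis (illustrative)")
plt.show()
```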
Visualization and improvement opportunities

Lesson 5: Attention-directing graphs and charts lead to more focused assessment dialogue
– Oriented toward problem-solving
– Focus may include:
  • Assessment methods
  • Faculty development
  • Students
  • Classroom teaching and learning methods
  • Teaching resources and the physical infrastructure for learning
  • Learning goals and curriculum alignment
Closing the Loop

Observation 6: Has assessment led to significant curriculum changes? (Percent, N=143)
No | 42
Yes | 58
Total | 100
Closing the loop

– 36% of business schools report that they have a formal written assessment plan.
– Schools with a formal plan are more likely than those without to report success in using assessment to improve the curriculum (chi-square = 4.77, df = 1, p-value = .02); a sketch of this kind of test appears below.
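Here is a minimal sketch of the kind of test reported above: a chi-square test of independence between having a formal written assessment plan and reporting curriculum change. The 2×2 cell counts are invented for illustration; only the margins (N = 143, roughly 36% with a plan, 58%/42% change split) mirror the slide, so the result lands near, not exactly at, the reported chi-square of 4.77.

```python
# Hypothetical sketch: chi-square test of independence between "has a formal
# written assessment plan" and "assessment led to curriculum changes".
# The cell counts below are assumptions chosen to match the slide's margins.
from scipy.stats import chi2_contingency

#                changed  not changed
table = [[36, 15],   # formal written plan (hypothetical counts)
         [47, 45]]   # no formal plan (hypothetical counts)

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, df = {dof}, p-value = {p:.3f}")
# -> chi-square = 5.12, df = 1, p-value = 0.024 (same ballpark as the slide)
```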
Closing the loop

Lesson 6: Plan before you leap!
– An assessment plan is a powerful tool for effective management and communication of assessment activities.
– It guides schools through each major assessment activity.
– A basic plan could be designed as a grid that includes columns for:
  • program learning objectives
  • assessment methods to be used in collecting evidence about student learning for each objective
  • the measurement metrics and expectations for each learning goal
  • a procedure for administering the assessment activity
  • a timeline for implementation
  • the persons responsible for coordinating data collection
Assessment Plan – A Basic Example

Objective: Students will demonstrate knowledge and understanding of core accounting fundamentals (financial, cost, systems, auditing, taxation).
– Method/Evidence: Major Field Test in Accounting (MFTA), 75 multiple-choice questions; evidence consists of raw percentage scores on the MFTA.
– Expectations: satisfactory value-added (≥45%); satisfactory raw scores on the post-test (≥70%); satisfactory achievement on each cognitive learning goal.
– Assessment procedure: computerized pre/post test; pre-test administered in the first accounting course; post-test administered in the accounting capstone; consistent incentives and awards (e.g., 5% of the grade in the capstone).
– Primary responsibility: Department Chair.

Objective: Students will demonstrate the ability to apply core accounting fundamentals through case analyses and projects.
– Method/Evidence: course-embedded case/project demonstrating the ability to apply accounting fundamentals; evidence consists of ratings for individual students based on a standardized rubric developed for the program by the faculty.
– Expectations: at least a satisfactory rating on all traits in the standardized rubric; overall, 70% perform at least at the satisfactory level.
– Assessment procedure: a relatively short case/project that spans multiple areas in accounting; students analyze cases or complete the project as part of the capstone requirements; case analyses/projects are rated by the instructor and a panel of professionals.
– Primary responsibility: faculty who teach the accounting capstone course.
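To make the grid concrete, here is a minimal sketch (not from the presentation) of how each plan row could be captured as structured data, so every objective carries the same required fields. The class and field names are my own.

```python
# Hypothetical representation of one row of the assessment plan grid.
# Field names are illustrative assumptions, not the presentation's terms.
from dataclasses import dataclass

@dataclass
class PlanRow:                 # requires Python 3.9+ for list[str]
    objective: str             # program learning objective ("Students will...")
    method_evidence: str       # assessment method and the evidence it yields
    expectations: list[str]    # measurement metrics / expected performance
    procedure: list[str]       # how and when the assessment is administered
    responsible: str           # who coordinates data collection

row = PlanRow(
    objective="Demonstrate knowledge of core accounting fundamentals",
    method_evidence="Major Field Test in Accounting (MFTA); raw percentage scores",
    expectations=["value-added >= 45%", "post-test raw score >= 70%"],
    procedure=["pre-test in first accounting course", "post-test in capstone"],
    responsible="Department Chair",
)
print(row.objective, "->", row.responsible)
```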
Conclusion

– Assessment is effective if it leads to congruency, alignment, and change
– Assessment must be planned, supported, and executed in a systematic manner
– Assessment is a systematic and deliberate extension of what faculty normally do
– Keep what you do simple
– Avoid fixations!