Review of Measures Used In U.S. News & World Report


IE Presentation, November 20, 2008
Institutional Effectiveness at the
University of North Alabama
Dr. Andrew L. Luna
Institutional Research, Planning, and Assessment
Connections to IE?
Telephones and IE?
Walter Shewhart
W. Edwards Deming
Joseph Juran
Hawthorne Works, Bell Laboratories
Shewhart Cycle
Plan → Do → Check → Act (Continuous Improvement)
FACT...
The Shewhart Cycle is the foundation for all quality and continuous improvement processes that we use today.
Points of Discussion
• Similarities between the Shewhart Cycle and Institutional Effectiveness
• Overview of Institutional Effectiveness at UNA
• Review of Outcomes and Improvement Processes
• Review of Assessment
• Questions
More on the Shewhart Cycle
• Plan – Create a strategy as to what you want to do and how you will measure success
• Do – Follow the plan and do what you say you will do
• Check – Assess the effectiveness of the current plan by looking at the outcome measures of success
• Act – Make changes to the strategies to improve the measured outcomes
Repeat the Cycle!
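As a rough illustration only, the four steps can be written as a loop that keeps repeating until the measured outcome reaches the planned target. The numeric "score" model and all of the function names below are hypothetical sketches, not anything from the presentation.

```python
# A minimal, hypothetical sketch of the Shewhart (Plan-Do-Check-Act) cycle.
def plan(target, current):
    """Plan: set a strategy and decide how success will be measured."""
    return {"target": target, "expected_gain": (target - current) * 0.5}

def do(current, strategy):
    """Do: follow the plan (simulated as achieving most of the expected gain)."""
    return current + 0.8 * strategy["expected_gain"]

def check(result, strategy):
    """Check: compare the measured outcome against the planned target."""
    return strategy["target"] - result   # the remaining gap

def act(gap):
    """Act: decide whether the strategy still needs changing."""
    return gap > 1.0                     # True means run the cycle again

score, target = 60.0, 90.0
while True:                              # Repeat the Cycle!
    strategy = plan(target, score)
    score = do(score, strategy)
    gap = check(score, strategy)
    print(f"measured score = {score:.1f}, gap to target = {gap:.1f}")
    if not act(gap):
        break
```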
Why is the Shewhart Cycle Important?
• If you can’t measure something, you can’t understand it…
• If you can’t understand it, you can’t control it…
• If you can’t control it, you can’t improve it…
• If you can’t improve it… then why the heck are you doing it?
So, What is Institutional Effectiveness?
• A sharpened statement of institutional mission and objectives
• Identification of intended departmental/programmatic outcomes or results (Plan)
• Establishment of effective means of assessing the accomplishment of those outcomes and results (Do, Act, Check)
FACT...
Institutional Effectiveness is primarily undertaken to
improve what we do…not just to pass accreditation.
Shewhart Cycle and SACS
Macro IE
Core Requirement 2.5 (Plan – Do – Check and Act):
“The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that (1) incorporate a systematic review of institutional mission, goals, and outcomes; (2) result in continuing improvement in institutional quality; (3) demonstrate the institution is effectively accomplishing its mission.”
Key Points to Core Requirement 2.5
• Emphasizes an expectation that the institution is the primary focal point for compliance
• Sets expectations for the description of planning and evaluation processes that are active and continuous rather than static or single occurrences
• Points to a clear and strong expectation for documentation of the systematic review of institutional mission, goals, and accomplishments consistent with its mission
• Sets expectations for the documented use of results of institutional planning and evaluation to achieve institutional improvements
Shewhart and SACS, Cont.
Micro IE
Comprehensive Standard 3.3.1 (Plan – Do and Act – Check):
“The institution identifies expected outcomes for its education programs … and its administrative and educational support services; assesses whether it achieves those outcomes; and provides evidence of improvement based on analysis of those results.”
Key Points to Comprehensive Standard 3.3.1
• Emphasizes the unit level of individual educational programs and support services
• The expected achievements of educational programs and support services should be articulated, and evidence presented concerning accomplishments
• Distinguishes between program outcomes and learning outcomes
• Sets expectations that improvement is guided by the establishment and evaluation of program and learning outcomes
Shewhart and SACS, Cont.
General Education and IE
Comprehensive Standard 3.5.1 (Plan – Do, Check, Act):
“The institution identifies college-level competencies within the general education core and provides evidence that graduates have attained those competencies.”
Key Points to Comprehensive Standard 3.5.1
• General Education should be part of the institutional mission
• The expected achievements of the General Education program should be articulated, and evidence presented concerning accomplishments
• Improvement should be guided by the establishment and evaluation of learning outcomes
Overview of Institutional Effectiveness
Focus on Assessment
Mission and Strategic Goals drive two assessment tracks:
• Comprehensive Dept./Program Review – Program Outcomes (Quality Indicators, Productivity Indicators, Viability Indicators) → Continuous Improvement of Programs and Departments
• Evaluation of Learning – Learning Outcomes (what graduates know, what graduates can do, what attitudes/values graduates possess) → Continuous Improvement of Student Learning
Together, these two tracks make up Institutional Effectiveness
Institutional Effectiveness System at UNA
• Annual Report – Annual Action Plan and Assessment Report
• Comprehensive Program and Department Review – Five-Year Review
• Review of General Education – Five-year cycle of General Education assessment
Schematic of Institutional Effectiveness Process
• Years 1–5: each year produces Annual Reports along with a Five-Year Review for Selected Depts.
• General Education runs on the same cycle: Area I, Area II, Area III, and Area IV Five-Year Assessments, followed by an Overall Gen. Ed. Assessment
• At the end of the 5-Year Cycle (Yes), a Review of Strategic Goals takes place; otherwise (No), the annual cycle continues
• Involves OIRPA, the IE Committee, and the Gen. Ed. Committee
Five-Year Program/Department Review Timeline (pending IE Committee Approval)
• September: Depts. that underwent last year’s five-year review submit the outcomes of that review as AAPAR priority initiatives
• October–November: OIRPA submits the Five-Year Enrollment report to academic departments
• November–December: OIRPA meets with departments up for review
• December–January: OIRPA meets with Deans/VPs for an overview; OIRPA conducts an assessment workshop for the UNA campus
• February–March: OIRPA initiates individual department meetings
• By August: Five-Year Reviews are completed and sent to Deans/VPs
• August–September: Deans/VPs meet with departments to discuss the review
• September: OIRPA submits an overview of the Five-Year process to the IE Committee
Annual Action Plan and Assessment Report Timeline (pending IE Committee Approval)
• September: President, VP, and Dean Initiatives due
• October: Next FY Priority Initiatives by VPs
• December–January: 1st part of AAPAR due for the current fiscal year, with one Priority Initiative for the next FY
• February–March: Next FY Priority Initiatives by Deans
• March–April: OIRPA submits AAPAR overview to the IE Committee; budget initiatives based on Priority Initiatives are established
• June–July: SPBS reviews Next FY Priority Initiatives
• September: 2nd part of AAPAR completed by depts.
Outcomes
• Operational Outcomes – measures of how well the institution/division/department is meeting/exceeding requirements
• Learning Outcomes – statements of the knowledge, skills, and abilities the individual student possesses and can demonstrate upon completion of a learning experience or sequence of learning experiences (e.g., course, program, degree)
Problems with Outcomes
• Outcomes are too broad
• Outcomes do not address core requirements/competencies or mission
• Outcomes are not measurable
Types of Measurement
• Discrete or Attributes data – binary data with only two values (e.g., Yes/No, Good/Bad, On/Off, Male/Female, Pass/Fail)
• Continuous or Variable data – information that can be measured on a continuum or scale (e.g., Height/Weight, Temperature, Test Scores, Time, Distance)
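A small, hypothetical illustration of the difference (the pass/fail list and the test scores below are made up, not from the presentation): attribute data are typically summarized as counts or proportions, while variable data live on a scale and can be averaged.

```python
# Hypothetical examples of the two kinds of data (not taken from the presentation).
pass_fail = ["Pass", "Fail", "Pass", "Pass", "Fail"]    # discrete / attribute data
test_scores = [88.5, 92.0, 77.5, 64.0, 81.5]            # continuous / variable data

# Attribute data are usually summarized as counts or proportions.
pass_rate = pass_fail.count("Pass") / len(pass_fail)
print(f"Pass rate: {pass_rate:.0%}")                     # Pass rate: 60%

# Variable data can be averaged because they fall on a continuous scale.
print(f"Mean score: {sum(test_scores) / len(test_scores):.1f}")   # Mean score: 80.7
```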
Bloom’s Taxonomy of Learning Outcomes
• Knowledge – recalling or remembering something without necessarily understanding, using, or changing it. Related behaviors: define, describe, identify, label, list, match, memorize, point to, recall, select, state
• Comprehension – understanding something that has been communicated without necessarily relating it to anything else. Related behaviors: alter, account for, annotate, calculate, change, convert, group, explain, generalize, give examples, infer, interpret, paraphrase, predict, review, summarize, translate
• Application – using a general concept to solve problems in a particular situation; using learned material in new and concrete situations. Related behaviors: apply, adopt, collect, construct, demonstrate, discover, illustrate, interview, make use of, manipulate, relate, show, solve, use
Bloom’s Taxonomy, Cont.
• Analysis – breaking something down into its parts; may focus on identification of parts or analysis of relationships between parts, or recognition of organizational principles. Related behaviors: analyze, compare, contrast, diagram, differentiate, dissect, distinguish, identify, illustrate, infer, outline, point out, select, separate, sort, subdivide
• Synthesis – creating something new by putting parts of different ideas together to make a whole. Related behaviors: blend, build, change, combine, compile, compose, conceive, create, design, formulate, generate, hypothesize, plan, predict, produce, reorder, revise, tell, write
• Evaluation – judging the value of material or methods as they might be applied in a particular situation; judging with the use of definite criteria. Related behaviors: accept, appraise, assess, arbitrate, award, choose, conclude, criticize, grade, judge, prioritize, recommend, referee, select, support
Forms of Measurement
Longitudinal data is gathered over an extended period:
Semester₁, Semester₂, Semester₃, …, Semesterₜ
Forms of Measurement, Cont.
Cross-sectional data represent a snapshot of one point in time
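A small, hypothetical illustration of the two forms (the students and scores below are made up, not UNA data): the longitudinal view follows the same student from Semester 1 through Semester t, while the cross-sectional view takes every student at a single semester.

```python
# Hypothetical (student, semester, score) records.
records = [
    ("A", 1, 70), ("A", 2, 78), ("A", 3, 85),   # student A followed over three semesters
    ("B", 1, 60), ("B", 2, 66), ("B", 3, 74),   # student B followed over three semesters
    ("C", 3, 90),                               # student C observed only once
]

# Longitudinal: one student's scores gathered over an extended period.
longitudinal_a = [score for student, sem, score in records if student == "A"]
print("Student A across semesters:", longitudinal_a)     # [70, 78, 85]

# Cross-sectional: every student's score at one point in time (Semester 3).
cross_section = [score for student, sem, score in records if sem == 3]
print("All students in Semester 3:", cross_section)      # [85, 74, 90]
```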
What is Improvement?
• Measurable actions that increase learning, efficiency, effectiveness, and/or the bottom line
• Decrease the Bad
• Increase the Good
• Decrease Variability
Decrease Variability?
What the heck is that?
Class A: 100, 100, 99, 98, 88, 77, 72, 68, 67, 52, 43, 42 (Mean = 75.5, STD = 21.93)
Class B: 91, 85, 81, 79, 78, 77, 73, 75, 72, 70, 65, 60 (Mean = 75.5, STD = 8.42)
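The slide’s comparison can be reproduced with the Python standard library; stdev() is the sample standard deviation, which is the form that matches the STD values shown.

```python
from statistics import mean, stdev

class_a = [100, 100, 99, 98, 88, 77, 72, 68, 67, 52, 43, 42]
class_b = [91, 85, 81, 79, 78, 77, 73, 75, 72, 70, 65, 60]

for name, scores in (("Class A", class_a), ("Class B", class_b)):
    # Same mean (75.5) for both classes, but very different spread.
    print(f"{name}: mean = {mean(scores):.1f}, std = {stdev(scores):.2f}")

# Output: Class A: mean = 75.5, std = 21.93
#         Class B: mean = 75.5, std = 8.43   (the slide's 8.42 truncates 8.426...)
```

Decreasing variability means moving from Class A’s pattern toward Class B’s: the average stays the same, but individual results become far more consistent.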
Inputs, Processes, and Outcomes Measurement
• Inputs (X’s): Materials, Methods, Measurement, Environment, People, Machines
• Outcomes: Y’s
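One compact way to read this diagram, borrowing the Y = f(X) shorthand commonly used in quality-improvement work (the shorthand is an assumption here, not something stated on the slide), is to treat the measured outcome as a function of the six input factors:

```latex
% Assumed Y = f(X) reading of the inputs/outcomes diagram
\[
  Y = f\left(X_{\text{materials}},\, X_{\text{methods}},\, X_{\text{measurement}},\,
             X_{\text{environment}},\, X_{\text{people}},\, X_{\text{machines}}\right)
\]
```

Improving an outcome Y then means identifying and adjusting the X’s that drive it.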
Assessment Steps
• Develop learning/operational objectives
• Check for alignment between the curriculum/business process and the objectives
• Develop an assessment plan
• Collect assessment data
• Use results to improve programs/departments
• Routinely examine the assessment process and correct, as needed
Types of Assessment – Direct
Academic
• Published Tests
• Locally Developed Tests
• Embedded Assignments and Course Activities
• Competence Interviews
• Portfolios
Types of Assessment – Direct
Educational Support/Administrative
• People enrolled/participating/served
• Work accomplished
• Revenue generated
• Turnaround time
• Decrease in nonconformities
Types of Assessment – Indirect
• Surveys
• Interviews
• Focus Groups
• Reflective Essays
How Can OIRPA Assist?
• Create university-wide reports – Five-year departmental reports
• Analyze university-wide assessment data – NSSE, CAAP
• Hold workshops on assessment and IE
• Work with individual departments on annual reports, program review, and outcomes assessment
• Provide ad hoc data reports for departments
• Work with committees to develop assessment plans – IE Committee, Gen. Ed. Committee
Questions or Comments?