
Tribal Colleges & Universities
Chief Academic Officers
3rd Annual Meeting
Wisdom Sharing: Assessment
and Academic Program Review
Dr. Koreen Ressler, Vice President of Academics
Sitting Bull College
HLC Criterion
Criterion Four. Teaching and Learning: Evaluation and
Improvement
The institution demonstrates responsibility for the quality of its
educational programs, learning environments, and support
services, and it evaluates their effectiveness for student learning
through processes designed to promote continuous
improvement.
Core Components
4.A. The institution demonstrates responsibility for the quality of
its educational programs.
1. The institution maintains a practice of regular program reviews.
Program Review Purpose
1. Report for degree programs and
certificates.
2. Purpose of report:
– Analysis
– Evaluation
– Improvement
Information in Program Review
• The role of the program within the institution
• Staff/faculty
• Student information – program numbers,
retention, persistence, graduation rates,
graduate employment data
• Revenue and budget information
• Future needs – who is involved in planning
for the program (e.g., the advisory committee)
SBC Program Review Process
[Program Review Preparation and Process flow chart, showing the chain:
Board of Trustees – President – Vice-President of Academics – Faculty –
Curriculum Committee – Division Director or Faculty of Record for the
Program – Program Faculty, Staff, Involved Adjunct Faculty, and Consultant]
Curriculum at SBC
• FUNCTION: Recommend academic and instructional policy to the Board of
Trustees.
• SCOPE: Covers all matters of instructional policy, programs, and activities
as they relate to the curriculum.
• Goal #1: To provide and refine a systematic evaluation of current
academic and technical programs through 2017.
  Objective 1: Assign programs to the annual review for the year.
  Objective 2: Review & revise curricular components of the college catalog.
• Goal #2: To explore and evaluate future academic and technical programs
through 2017.
  Objective 1: Evaluate & review potential new courses.
  Objective 2: Evaluate & review potential new programs.
  Objective 3: Explore online/hybrid delivery of course and/or program
offerings.
SBC Program Review
Evaluation Criterion
• Evaluation completed by the Curriculum
Committee, with one of four recommendations:
  – Maintain a program
  – Enhance a program
  – Reconfigure a program
  – Reduce or phase out a program
• Five-year cycle for all programs, unless
recommendations are made by the Curriculum
Committee to complete within a designated
time.
HLC Criteria
4.B. The institution demonstrates a commitment to
educational achievement and improvement through
ongoing assessment of student learning.
1. The institution has clearly stated goals for student learning
and effective processes for assessment of student learning
and achievement of learning goals.
2. The institution assesses achievement of the learning
outcomes that it claims for its curricular and co-curricular
programs.
3. The institution uses the information gained from
assessment to improve student learning.
4. The institution’s processes and methodologies to assess
student learning reflect good practice, including the
substantial participation of faculty and other instructional staff
members.
Diagram for Assessment of
Student Learning
[Diagram: a continuous four-step cycle – Establish Learning Goals →
Provide Learning Opportunity for Students → Assess Student Learning →
Use the Results to Implement Change → back to Establish Learning Goals]
Goals versus Outcomes
• Goals (Intended)
– What do you want your students to know
upon completion? Goals need to connect to the mission.
• Institutional
• General Education
• Program
• Outcomes (Achieved)
– Describe essential learning that students have
achieved and can reliably demonstrate at the
end of a program.
Establishment of
Program Outcomes
• Are program outcomes based on industry
standards?
– Advisory Committee input
• Are the program outcomes precise,
specific, and measurable?
Measurement of
Program Outcomes
• What do students complete throughout the
program that will provide evidence of mastery
of program outcomes?
– Pre- and Post-Tests (see the sketch after this list)
– National Tests – National Center for Construction
Education and Research (NCCER), Health
Education Systems Incorporated (HESI)
– Internships/Practicums
– Self-Assessments
– Projects
– Portfolios
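Where pre- and post-tests are used, the evidence is usually simple score arithmetic. A minimal sketch of that computation in Python; the paired-score layout, the 70-point mastery cutoff, and all scores are illustrative assumptions, not SBC data:

```python
def average_gain(pre: list[float], post: list[float]) -> float:
    """Mean improvement from pre-test to post-test, paired by student."""
    if len(pre) != len(post) or not pre:
        raise ValueError("Need one pre and one post score per student")
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

def mastery_rate(post: list[float], cutoff: float = 70.0) -> float:
    """Share of students at or above the mastery cutoff on the post-test."""
    return sum(s >= cutoff for s in post) / len(post)

# Hypothetical scores for illustration only
pre_scores  = [52.0, 61.0, 48.0, 70.0]
post_scores = [74.0, 80.0, 66.0, 88.0]
print(f"Average gain: {average_gain(pre_scores, post_scores):.1f} points")  # 19.2
print(f"Mastery rate: {mastery_rate(post_scores):.0%}")                     # 75%
```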
Principal Indicators for Assessment
Sitting Bull College’s assessment is broken down into four areas: institution-wide, pre-entry
and freshman level, general education, and program.
1. Institution-Wide Assessment
   a. Enrollment Trends
   b. Persistence and Retention Rates (see the sketch after this list)
   c. Tracking of Student Withdrawals
   d. Student Satisfaction Survey (Noel-Levitz) or Community College Survey of Student Engagement (every other year for each survey)
   e. Student Service Satisfaction Graduate Survey
   f. Satisfaction of Institutional Outcomes Graduate Survey
   g. Graduation Rates/IPEDS/AKIS
   h. Employer Survey
   i. Alumni Survey
2. Pre-Entry and Freshman Assessment
   a. COMPASS placement (pre) scores
   b. 1st-Year Freshman Advising
   c. 1st-Year Experience Course
   d. Freshman Orientation Evaluation
   e. Enrollment Trends
3. General Education Assessment
   a. General Education Outcomes Assessment Plan – English, Speech, Computers, NA Language, Science, Math
   b. Post-COMPASS results
   c. Completion Rates
4. Program Assessment
   a. Program Assessment Plans & one-page papers
   b. Program Reviews
   c. Retention/Persistence – reported in the program review
   d. Graduation Rates – reported in the program review
   e. Employer Survey
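Several of the institution-wide indicators above are plain cohort arithmetic. A minimal sketch of fall-to-fall retention and an IPEDS-style graduation rate in Python; the record layout and the three-student cohort are illustrative assumptions, not SBC's actual data system:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Student:
    cohort_year: int               # fall term of first enrollment
    enrolled_next_fall: bool       # persisted to the following fall?
    graduated_year: Optional[int]  # completion year, if any

def retention_rate(cohort: list[Student]) -> float:
    """Share of a fall cohort still enrolled the following fall."""
    return sum(s.enrolled_next_fall for s in cohort) / len(cohort) if cohort else 0.0

def graduation_rate(cohort: list[Student], normal_time_years: int = 2) -> float:
    """IPEDS-style completion within 150% of normal program time."""
    if not cohort:
        return 0.0
    window = 1.5 * normal_time_years
    done = sum(1 for s in cohort
               if s.graduated_year is not None
               and s.graduated_year - s.cohort_year <= window)
    return done / len(cohort)

# Hypothetical three-student cohort for illustration
fall_2012 = [Student(2012, True, 2014),
             Student(2012, True, None),
             Student(2012, False, None)]
print(f"Retention:  {retention_rate(fall_2012):.0%}")   # 67%
print(f"Graduation: {graduation_rate(fall_2012):.0%}")  # 33%
```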
Assessment at SBC
• FUNCTION: Review, report, and make recommendations concerning student learning and
institutional effectiveness for continual quality improvement for all our stakeholders.
• SCOPE: To oversee all institutional data collection and recommend new data that will
measure institutional effectiveness.
• Goal #1: To review academic and student support data that demonstrate institutional
effectiveness through 2017.
  Objective 1: Annually review program assessment data that support the continued
improvement of student learning.
  Objective 2: Annually review essential learning outcomes (general education) data that
support the continued improvement of student learning.
  Objective 3: Meet monthly during the academic year to review assessment data as they
become available and/or plan for needed data collection to assist in data-driven
decisions.
  Objective 4: Annually review Student Support Services data, including the Enrollment
Management Plan, which supports the continued improvement of student learning.
Annual Plan
Program/General Education
The annual plan template has the following columns (a record-type sketch follows):
• Program Outcomes
• Measurement Tool (Who, what, how, when?)
• Measurement Goal (expected results)
• Findings (Actual results)
• Analysis of Data (What students learned and what they didn't learn)
• Action or Recommendation
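The same template can be modeled as a simple record type. A minimal sketch in Python; the field names paraphrase the column headings, and the nursing example values (including the course number) are hypothetical, not an SBC artifact:

```python
from dataclasses import dataclass

@dataclass
class PlanRow:
    """One row of an annual program/general-education assessment plan."""
    outcome: str            # program outcome being assessed
    measurement_tool: str   # who, what, how, when?
    measurement_goal: str   # expected results
    findings: str           # actual results
    analysis: str           # what students learned / didn't learn
    action: str             # action or recommendation

# Hypothetical example row for illustration only
row = PlanRow(
    outcome="Graduates apply the nursing process in clinical settings",
    measurement_tool="HESI exit exam, all second-year students, each spring",
    measurement_goal="80% of students score at or above the national mean",
    findings="72% scored at or above the national mean",
    analysis="Students were weak on pharmacology items",
    action="Add pharmacology case studies to the second-year curriculum",
)
print(row.action)
```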
Rubric for Annual Review of
Program/General Education Plans
Performance criteria are scored at four levels: No Evidence (0), Emerging (1),
Developing (2), Achieving (3).
• Program Outcomes
  – Emerging (1): Competencies/program outcomes are unclear.
  – Developing (2): Over 50% of discipline/program outcomes are clear and
understandable.
  – Achieving (3): Over 75% of competencies/program outcomes are clear and
understandable.
• Measurement Tools
  – Emerging (1): Measurement tool is not clear on answering the “Who, What,
How, and When”; competencies/outcomes have only indirect measures.
  – Developing (2): Measurement tool is over 75% clear on answering the “Who,
What, How, and When”; competencies/outcomes have only direct measures.
  – Achieving (3): Measurement tool clearly answers the “Who, What, How, and
When”; competencies/outcomes have both direct and indirect measures.
• Measurement Goal (Expected Results)
  – Emerging (1): Measurement goal is not clearly stated and is not obtainable.
  – Developing (2): Measurement goal is either not obtainable or not clearly stated.
  – Achieving (3): Measurement goal is clearly stated and obtainable.
• Findings (Actual Results)
  – Emerging (1): There are no actual results for the measurement goals.
  – Developing (2): Over 50% of the measurement goals have actual results.
  – Achieving (3): Over 75% of the measurement goals have actual results.
• Measurement Styles
  – *Faculty, please note that you will NOT be scored on this criterion this year,
but it will be applicable for the 2014-2015 academic year!
• Comments
Rubric Continued
Performance criteria and levels as above: No Evidence (0), Emerging (1),
Developing (2), Achieving (3).
• Analysis of the Results
  – Emerging (1): Analysis states the relationship between actual and expected
results.
  – Developing (2): Analysis states the relationship between actual and expected
results and describes what it means.
  – Achieving (3): Analysis states the relationship between actual and expected
results and describes what it means; strengths and opportunities for
improvement are identified.
• Recommended Action(s)
  – Emerging (1): Outcomes have actions identified.
  – Developing (2): Outcomes showing concerns have recommended actions listed.
  – Achieving (3): Outcomes showing concerns have detailed recommended actions
assigned to individuals to be accomplished by a given date; data analysis is
interpreted to justify recommended actions.
• Results of Last Year’s Recommended Actions
  – Emerging (1): Some actions implemented.
  – Developing (2): All actions implemented.
  – Achieving (3): All actions implemented as assigned and completed on time;
analysis of effectiveness included.
• Comments / Strengths / Opportunities
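The program ratings shown later in the deck (e.g., Nursing 2.90) look like averages of these 0-3 rubric scores. A minimal sketch of that arithmetic in Python; the averaging rule and the example scores are my assumptions, since the slides do not state how the overall rating is computed:

```python
# Rubric levels: 0 = No Evidence, 1 = Emerging, 2 = Developing, 3 = Achieving
RUBRIC_CRITERIA = [
    "Program Outcomes",
    "Measurement Tools",
    "Measurement Goal",
    "Findings",
    "Analysis of the Results",
    "Recommended Action(s)",
    "Results of Last Year's Recommended Actions",
]

def overall_rating(scores: dict[str, int]) -> float:
    """Mean of the per-criterion rubric scores (0-3) -- assumed rating rule."""
    missing = [c for c in RUBRIC_CRITERIA if c not in scores]
    if missing:
        raise ValueError(f"Unscored criteria: {missing}")
    if any(not 0 <= s <= 3 for s in scores.values()):
        raise ValueError("Rubric scores must be between 0 and 3")
    return sum(scores.values()) / len(scores)

# Hypothetical scores for illustration only
nursing = dict.fromkeys(RUBRIC_CRITERIA, 3)
nursing["Measurement Goal"] = 2
print(f"{overall_rating(nursing):.2f}")  # 2.86 with these made-up scores
```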
Direct Measures
• Instruments in which students demonstrate what they
have achieved or learned related to explicitly stated
learning outcomes. All involve the evaluation of actual
student performance vis-à-vis stated learning
outcomes.
  – Standardized tests
  – Locally developed tests
  – Essay tests
  – Projects
  – Juried exhibits
  – Oral presentations
  – Performance in internships
• http://www.uta.edu/ctle/assessment/direct-indirect.php
Indirect Measures
• Measures which rely on perceptions or
opinions about student learning.
– Surveys (employer, alumni, student)
– Exit interviews
– Focus groups
– Global indicators of student achievement (graduation
rates, job placement rates)
• http://www.uta.edu/ctle/assessment/direct-indirect.php
Examples – Good and Average
• Program
  – Nursing: high rating – 2.90
  – Energy Technology: lower rating – 2.23
• General Education
  – English & Communication: high rating – 2.84
  – Science: lower rating – 2.42

Select a partner and discuss the program
review and assessment process at your
institutions.
Thank You
• For additional information
– Contact:
Dr. Koreen Ressler
[email protected]
701-854-8001