Transcript Slide 1

ARE STUDENTS LEARNING WHAT WE SAY THEY ARE?
THE IMPORTANCE AND PROCESS FOR CONDUCTING EFFECTIVE
PROGRAM REVIEWS IN THE BUSINESS CURRICULUM
Presented by: Dr. Jorge A. Cardenas, DBA
Dr. Patricia A. Ryan, Ph.D.
Forbes School of Business at Ashford University
WHAT SHOULD YOU KNOW?
Program reviews are not that complicated!
Program reviews are not that difficult!
Program reviews do require time!
Program reviews do require multiple experts and sources!
Program reviews do require objectivity!
Program reviews are good stories, logical in thought and process!
And most importantly…
Program reviews distinguish facts from IMPORTANT FACTS and tell us: Are Students Learning What We Say They Are?
WHAT IS A PROGRAM REVIEW
AND WHY IS IT IMPORTANT?
A program review is an in-depth analysis of the quality and effectiveness
of an academic program through the assessment of:
a. curriculum quality;
b. faculty and teaching quality;
c. student learning success; and
d. program sustainability.
Program reviews allow the University to make informed, data-driven
decisions to improve academic programs and keep them current and
viable.
PURPOSE OF PROGRAM REVIEW
• To determine whether an academic program is doing what it was intended to
do, asking questions such as:
•Is the curriculum relevant and current?
•Are the faculty who teach qualified experts in the discipline?
•Are students engaged in the program, learning, and meeting the program objectives?
•Do support services exist for students and faculty?
•To meet requirements for regional accreditation.
SAMPLE PROGRAM REVIEW OUTLINE
… have a template to tell your story!
Here is ours!
WHERE DO YOU START?
•With a hypothesis!
• The academic program is doing what it was intended to do.
•How do you support or disconfirm the hypothesis?
• Collect and analyze data related to the curriculum, students and faculty.
[Diagram: Inputs to the academic program (students, curriculum, faculty) feed a mixed-methods analysis drawing on student assessment data, student learning data (quantitative and qualitative), and faculty data.]
TRIANGULATION
OF DATA
Observations that have been confirmed by two or more independent measures.
* Notice each tracks with Part A of the Outline:
- Curriculum Quality
- Faculty & Teaching Quality
- Student Learning Success
CURRICULUM
- Addresses the extent to which students are achieving program goals/learning outcomes
based on evidence. Claims made about the program’s strengths and weaknesses must be
supported with qualitative and quantitative evidence.
Note: To have an impact, data needs to be presented as evidence (i.e., use the data to make your case,
rather than just listing numbers in a table, while continuing to be mindful of telling your program’s
story).
Qualitative: End of Course Surveys, End of Program Surveys, Alumni Surveys, Focus Groups
Quantitative: Retention/Persistence/Graduation Rates, ILOs, PLOs, CLOs, Capstone Exams/Assignments, Course Completion Rates
STUDENT SUCCESS
Data about students includes:
- Demographics
- GPA, ACT/SAT scores, Transfer Credits, etc.
- Retention, Persistence, Graduation Rates
- Direct and Indirect Assignment Target Achievements
- Career advancement, Promotions, Facebook feedback
FACULTY AND TEACHING QUALITY
Data about faculty includes:
- Demographics
- Rank Profiles (% terminal degrees), FT/PT, On-ground/Online
- Scholarship, Experience, Licenses, Professional Development
- Faculty Performance Evaluations, Student Satisfaction Surveys
- Governance, Committee Participation, Training, Recruitment
- Program Review Focus Groups
PART B: ANALYSIS
PROGRAM SUSTAINABILITY/VIABILITY
ADDRESSES THE PROGRAM’S CONTRIBUTIONS TO THE COMMUNITY
AND PROFESSION AS WELL AS ITS VIABILITY
Societal and professional demand for program
- Bureau of Labor Statistics
- Benchmark Comparison to Comparable Programs/Competitors
Contributions of the program to the community and/or profession
- To what extent are graduates succeeding in relevant careers, community service, creative endeavors, etc.?
- Student perceptions about reaching their personal and professional goals
PART C: SUMMARY REFLECTIONS
•What did you discover about the program, student learning, faculty, courses, etc.?
•Were there any “AHA” moments and insights?
•Is the program doing what we say it is?
•How can it be improved?
PART D: PROPOSED IMPROVEMENTS
•Does the program align with the College/School/University mission?
•What changes need to be made to ensure and increase student
learning?
•What is working and should be maintained?
•What resources are needed to make the changes?
* Be as specific as possible.
GENERAL TIMELINE
EXAMPLE
•Program Review (PR) Cycle
•17 months
•PR Launch
•Kickoff meeting
•Present External Reviewers' CVs
•Qualified faculty CVs
•Present self-study draft to
Executive Dean & Provost
•Finalize self-study report
& submit to Executive Dean &
Provost for approval
GENERAL TIMELINE cont.
•External Reviewers Visit
•Meet with Deans, Chairs, Faculty
•Incorporate external reviewers' recommendations and submit the final report for approval by the Provost & President
•17 month mark
DELIVERABLES
•External Reviewer CVs
•Must be doctorally qualified and experts in their discipline
•Located locally (e.g., CA, NV, AZ, CO, NM)
•Self Study/Program Review Document
•Length: 25 pages, not including cover page, table of contents, and appendices
•Executive Summary
•Summary of Self Study/Program Review Document
•Action Plan Report
•Details with regard to action items based on self study/program review recommendations
•Overview of Action Plan
•Summary of action items presented in the Action Plan Report
RESOURCES
Dr. Cardenas and Dr. Ryan
Self-Study Template (happy to share)
Interested in becoming an external reviewer for us? See us after!
FAILURE IS NOT AN OPTION!
Dr. Michael Reilly, Executive Dean
Forbes School of Business