AY 2011 CTE Instructional
Program Review Process
Data Description
Process – Timeline
Rev. 10-18-11
Intro
The purpose of this presentation is to describe the process we follow for both our local
Comprehensive and the system-required Annual Program Reviews, as well as how
the data are calculated.
We have been asked to produce an annual program review for every one of
our instructional programs and units. These are required of each system CC and will
be taken to the U of H Board of Regents for their review.
If you are normally scheduled to do a comprehensive review or are “Jumping”, you
will need to complete a comprehensive review this year. Additionally, every
instructional and non-instructional program will do an Annual Program Review this
year.
Not sure if you’re scheduled for a comprehensive review or not?
Click here for the
Comprehensive Program-Unit Review Cycle and Schedule
2
What are we doing to improve our program review process?
Upon conclusion of every program/unit review cycle, the IR Office works to improve our
campus program/unit review process. This is accomplished by sending out
questionnaires specific to each group and by meeting with various groups across campus
to collect feedback. Your suggestions for improving this process are then posted on the
Assessment Website and linked here for your convenience.
2010 Program-Unit Process Improvement Summary
Based on the feedback we received last year through our program-unit review process
improvement focus groups, the following changes have been incorporated
into the planning of this year's review:
The timeline for program/unit reviews has been reevaluated. Reviews are due to Joni on November 18th this year.
We will continue to use our college website as the mode of delivery for your program-unit review
documentation.
We will continue the practice of providing program-unit review training and presentation materials.
3
What are we doing to improve our program review process cont.?
We will continue the practice of providing training specific to your group (i.e., a presentation just for CTE).
You said that the training you received last year regarding changes to the Instructional Program
Review Comprehensive Template was helpful. We will continue to provide a brief overview of the
templates upon conclusion of our normally scheduled training.
We will continue to schedule our trainings based on the scheduling requests we send out. We
have 3 scheduled training sessions this year, one for Units, one for Liberal Arts, and one for CTE.
We will continue to seek your scheduling input to consider room size and number of sessions.
Online submissions seemed to work quite well for most folks last year. Please plan to do any
formatting of your report within the online tool to keep your formatting from getting scrubbed.
4
What are we doing to improve our program review process cont.?
There is no need to wait for the IR Office to copy, paste, and format your 3 years of data
into the comprehensive template this year, because all 3 years of data are now provided to you online.
Additionally, the CERC made a change to the comprehensive template requiring you to paste the
entire annual review into your comprehensive review template. You may also hyperlink the
annual review in if you wish. More on template changes later in the presentation today.
There was interest last year in having our annual review feedback given to the program initiators.
This was accomplished last year through email from the VCAA to the Initiators.
There were suggestions last year that assessment needs to be better integrated into the
comprehensive program reviews. The requirement to list your unit outcomes and their
assessments has now been added to the Instructional Program Review Comprehensive
Template.
It was suggested that having writing assistance in the program review process would be helpful.
Mary Goya has offered to help programs out until we have our Assessment Coordinator on-board.
All templates have been updated since last year based on both your feedback, and an evaluation
of the templates by CERC.
5
What is different this year?
Three separate training sessions will be given this year in order to focus on specific groups. They
are:
1. CTE Programs
2. Liberal Arts Division Programs
3. Units
3 full years of data provided to programs—no more fall data mixed with academic year data.
No upload capabilities with web submission tool. Your analysis should be on the data provided.
There is a new tab in the online submission tool, requested by the VCAAs, for the
inclusion of SLOs. This is now required with your submission.
The new “due to VCAA” date this year is November 18th.
Your budget numbers will be made available to you this year on the Assessment website at
AY 2011 Instructional Program Review Budget Table
Annual Reports of Program Data are completely on-line this year for Student Services.
Remedial/Developmental programs have a new data element this year called "Success at the
next level."
6
What’s a Jumper?
A Jumper is a locally defined term used to describe an instructional
program or non-instructional program (unit) that has decided to jump out of
its normally scheduled slot for its comprehensive program review and
into this year's cycle.
Jumping into this year's comprehensive cycle means that you will have an
opportunity to be considered for any budgetary decisions that will be made
in this year's budget process.
Jumpers will still have to do their comprehensive review on their next
scheduled review. Jumping does not affect the existing schedule; you are
voluntarily doing an extra review to be considered in this budget cycle.
7
I belong to an Instructional Program… which template do I use?
COMPREHENSIVE REVIEWS
Comprehensive Instruction Program Review Template
(Use this template ONLY if you are scheduled for a comprehensive program review
this year or are jumping)
-------------------------------------------------------------------------------------------------------------------
ANNUAL REVIEWS
Your program's data table is available online using the link below. You should have
everything you need to begin writing your review within the web submission tool.
Plan to save your work often, especially when switching between screens, and
plan to do most of your formatting within the tool if you are copying and pasting in
from Word.
UHCC Annual Report of Program Data Web Submission Tool
(ALL Instructional programs will need to complete this, even if you're completing a
comprehensive review)
8
Terminology / Timing
The Census freeze event is the fifth Friday after the first day of instruction.
The end-of-semester freeze event is 10 weeks after the last day of instruction.
FISCAL_YR_IRO: Fiscal year, where the value indicates the ending of the fiscal year. For
example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year 2004-2005 (July 1,
2004 to June 30, 2005) which includes Summer 2004, Fall 2004, and Spring 2005
semesters.
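To make the FISCAL_YR_IRO mapping concrete, here is a minimal sketch in Python; the function name is ours, for illustration only:

```python
from datetime import date

def fiscal_yr_iro(d: date) -> int:
    # FISCAL_YR_IRO names the ending year of the July 1 - June 30
    # fiscal year, so dates from July onward roll forward one year.
    return d.year + 1 if d.month >= 7 else d.year

# Fall 2004 and Spring 2005 both belong to FISCAL_YR_IRO 2005.
assert fiscal_yr_iro(date(2004, 10, 15)) == 2005
assert fiscal_yr_iro(date(2005, 3, 1)) == 2005
```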
9
Instructional Program Review Data Elements
Student information data this year comes exclusively from the Operational Data Store
(ODS). Organizationally this means that all community colleges are getting the data from
the same place and at the same snapshot in time (this is a good thing).
The following slides will explain in detail what data have been provided to you for your
comprehensive and annual instructional program review write-ups and how they have been
calculated.
10
#1
New and Replacement Positions (State)
Annual new and replacement jobs at the state level, from Economic Modeling
Specialists Inc. (EMSI). EMSI compiles data based on Standard Occupational
Classification (SOC) codes that the college has linked to the instructional
program.
Data are based on annual new/replacement position projections as of Spring
2011.
State position numbers are not pro-rated.
From their website, “…EMSI specializes in reports that analyze and quantify
the total economic benefits of community and technical colleges in their
region, and also creates data-driven strategic planning tools that help
colleges maximize their impact through labor market responsiveness…”
11
#2
New and Replacement Positions (County prorated)
Annual new and replacement jobs at the county level, from Economic Modeling
Specialists Inc. (EMSI). EMSI compiles data based on Standard Occupational
Classification (SOC) codes that the college has linked to the instructional
program.
Note: It is possible for the number of new and replacement positions in the
county to be higher than the state if the projection in other counties is for a
loss of new and replacement positions.
County data are pro-rated (when more than one CC is in the county) to reflect the
number of programs aligned to the SOC code, weighted by the number of
majors in each program/institution.
Data based on annual new/replacement positions projections as of Spring
2011.
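The deck does not spell out the proration formula, so the sketch below is only a plausible reading of "weighted by number of majors": the county projection is split across the programs aligned to the SOC code in proportion to each program's majors. The function and its inputs are hypothetical.

```python
def prorated_county_positions(county_positions, majors_by_program: dict, program: str) -> float:
    # Hypothetical proration: each aligned program gets a share of the
    # county's projected positions proportional to its number of majors.
    total_majors = sum(majors_by_program.values())
    return county_positions * majors_by_program[program] / total_majors

# Example: 40 county positions split between two aligned programs
# with 30 and 10 majors gives the first program 30 of the positions.
print(prorated_county_positions(40, {"A": 30, "B": 10}, "A"))  # 30.0
```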
12
#3
Number of Majors
Count of program majors whose home institution is your college.
The count excludes students who have completely withdrawn from the
semester at Census.
This is an annual number. Programs receive a count of 0.5 for each
term (fall and spring) within the academic year that the student is a
major, with a maximum count of 1.0 (one) for each student.
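As a quick sketch of the counting rule (illustrative only; the real count comes from the ODS):

```python
def annual_major_count(major_in_fall: bool, major_in_spring: bool) -> float:
    # 0.5 for each term (fall, spring) the student is a major, which
    # caps naturally at 1.0 per student per academic year.
    return 0.5 * major_in_fall + 0.5 * major_in_spring

assert annual_major_count(True, True) == 1.0   # major both terms
assert annual_major_count(True, False) == 0.5  # major in fall only
```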
13
#4
SSH Program majors in Program Classes
The sum of Fall and Spring SSH taken by program majors in
courses linked to the program. Captured at Census and excludes
students who have already withdrawn (W) at this point.
Note: for programs where year‐round attendance is mandatory,
Summer SSH are included.
Excludes Directed Studies (99 series). Differs from MAPS as UHCC
data includes Cooperative Education (93 series) as there is a
resource cost to the program.
Not sure what your program classes are? Click here to find out:
Courses Taught Aligned to Instructional Programs
14
#5
SSH Non-Majors in Program Classes
The sum of Fall and Spring SSH taken by non‐program majors (not
counted in #4) in courses linked to the program. Captured at Census
and excludes students who have already withdrawn (W) at this
point.
Note: for programs where year‐round attendance is mandatory,
Summer SSH are included
Excludes Directed Studies (99 series). Differs from MAPS as UHCC
data includes Cooperative Education (93 series) as there is a
resource cost to the program.
15
#6
SSH in All Program Classes
The sum of Fall and Spring SSH taken by all students in classes
linked to the program. Captured at Census and excludes students
who have already withdrawn (W) at this point.
Note: for programs where year‐round attendance is mandatory,
Summer SSH are included.
Excludes Directed Studies (99 series). Differs from MAPS as UHCC
data includes Cooperative Education (93 series) as there is a
resource cost to the program.
16
#7
FTE Enrollment in Program Classes
Sum of Student Semester Hours (SSH) taken by all students in
classes linked to the program (#6) divided by 30. Undergraduate,
lower division Full Time Equivalent (FTE) is calculated as 15 credits
per term.
Captured at Census and excludes students who have already
withdrawn (W) at this point.
Note: for programs where year‐round attendance is mandatory,
summer SSH are included.
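A worked example with made-up numbers:

```python
# Hypothetical: program classes generated 4,500 SSH across Fall and
# Spring (#6). At 15 credits per term (30 per year), that is 150 FTE.
ssh_all_program_classes = 4500
fte_enrollment = ssh_all_program_classes / 30
print(fte_enrollment)  # 150.0
```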
17
#8
Total Number of Classes Taught
Total number of classes taught in Fall and Spring that are linked to
the program. Includes Summer classes if year‐round attendance is
mandatory.
Concurrent and cross-listed classes are only counted once, for the
primary class.
Excludes Directed Studies (99 series). Differs from MAPS as UHCC
data includes Cooperative Education (93 series) as there is a
resource cost to the program.
18
CTE Program Scoring Rubric Definitions
Your program's health is determined by 3 separate types of measures:
Demand, Efficiency, and Effectiveness. This slide explains why
these measures were chosen to determine program health.
Demand: A seeking or state of being sought after.
i.e., your program's ability to attract new students every year based
on your offering.
Efficiency: Acting or producing effectively with a minimum of waste,
expense, or unnecessary effort.
i.e., your program's ability to use its resources in the best possible
way.
Effectiveness: Stresses the actual production of, or the power to
produce, an effect.
i.e., your program's ability to produce the desired result.
19
Determination of program’s health based on demand
This year the system office will calculate and report health calls for all
instructional programs using academic year 2011 data. The following
instructions illustrate how those calls are made.
Program Demand is determined by taking the number of majors (#3) and
dividing it by the number of New and Replacement Positions by County
(#2).
The following benchmarks are used to determine demand health:
Healthy: 1.5 - 4.0
Cautionary: .5 - 1.49; 4.1 - 5.0
Unhealthy: < .5; > 5.0
Finally, an Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy
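Putting the demand call together, here is a minimal sketch of the scoring (ours, not the system office's code):

```python
def demand_health(majors: float, county_positions: float) -> int:
    # Demand ratio: majors (#3) / prorated county positions (#2),
    # scored 2/1/0 for Healthy/Cautionary/Unhealthy per the benchmarks.
    ratio = majors / county_positions
    if 1.5 <= ratio <= 4.0:
        return 2
    if 0.5 <= ratio < 1.5 or 4.0 < ratio <= 5.0:
        return 1
    return 0

print(demand_health(majors=60, county_positions=20))  # ratio 3.0 -> 2 (Healthy)
```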
20
#9
Average Class Size
Total number of students actively registered in Fall and Spring
program classes divided by the number of classes taught (#8). Does not include
students who have already withdrawn from the class by Census.
Excludes Directed Studies (99 series). Differs from MAPS as UHCC
data includes Cooperative Education (93 series) as there is a
resource cost to the program.
21
#10 Fill Rate
Total active student registrations in program classes
(number of seats filled) at Fall and Spring census divided
by the maximum enrollment (number of seats offered).
Captured at Census and excludes students who have
already withdrawn (W) at this point.
22
#11 FTE BOR Appointed Faculty
Sum of the appointments (1.0, 0.5, etc.) of all BOR appointed program faculty
(excludes lecturers and other non-BOR appointees).
Uses the "hiring status" of the faculty member, not the teaching/work load.
Uses the Employing Agency Code (EAC) recorded in the Human Resources
(HR) database to determine faculty’s program home.
Data as of 10/31/2010
Data provided by UH Human Resources Office.
Click here for the count of BOR Appointed Program Faculty in your
program:
2011 BOR Appointed Program Faculty
23
#12 Majors to FTE BOR Appointed Faculty
Number of majors (#3) divided by the sum of appointments (#11) (1.0, 0.5,
etc.) of all BOR appointed program faculty.
Data show the number of student majors in the program for each
faculty member (25 majors to 1 faculty is shown as "25").
24
#13 Majors to Analytic FTE Faculty
Number of majors (#3) divided by number of Analytic
FTE faculty (13a).
25
#13a Analytical FTE Faculty (Workload)
Calculated as the sum of Semester Hours (not Student Semester
Hours) taught in program classes, divided by 27.
Analytic FTE is useful as a comparison to FTE of BOR appointed
faculty (#11). Used for analysis of program offerings covered by
lecturers.
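A worked example with hypothetical numbers:

```python
# If 81 semester hours of program classes were taught in the year,
# analytic FTE is 81 / 27 = 3.0. Comparing this to BOR-appointed FTE
# (#11) suggests how much of the load is covered by lecturers.
semester_hours_taught = 81
analytic_fte = semester_hours_taught / 27
print(analytic_fte)  # 3.0
```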
26
Where can I find my budget numbers?
Again this year, we will use the actual expenditures of both B-budgets and salaries as part of your overall budget.
The budget numbers you need for your program review are now
available on the Assessment website at
AY 2011 Instructional Program Review Budget Table
27
How do I get the budget numbers into the
web submission tool?
Go to the UHCC Annual Report of Program Data (ARPD) Web Submission Tool
Click on the 2011 Instructional ARPD(*) link.
Click on Web submission link.
Log in, using your UH username and password.
Select the “Cost per SSH” tab.
Click the “Edit” button for your program.
Go to the link on the assessment website for your budget and…
Enter the value of general funds (this is B-budget plus salary).
Enter the value of federal funds.
Enter any other funds that you wish to add.
Click the “Save Cost per SSH Data” button at bottom of page.
Your overall budget allocation (sum of everything you entered) and
the cost per SSH will automatically populate the data table in your
review.
28
#14 Overall Program Budget Allocation
The overall program budget allocation = General Funded Budget
Allocations (14a) + Special/Federal Budget Allocations (14b)
29
#14a General Funded Budget Allocation
The general funded budget allocation = actual personnel costs + B-budget
expenditures.
Personnel costs this year include the salaries for: faculty, lecturers,
overload, APT, student help, and clerical.
30
#14b Special/Federal Budget Allocation
The expenditure of dollars from Federal grants
31
#15 Cost per SSH
Overall Program Budget Allocation (#14) divided by SSH in all
program classes (#6)
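For example, with hypothetical figures:

```python
# A $450,000 overall budget allocation (#14) spread over 4,500 SSH
# in all program classes (#6) works out to $100 per SSH.
overall_budget_allocation = 450_000
ssh_in_all_program_classes = 4_500
print(overall_budget_allocation / ssh_in_all_program_classes)  # 100.0
```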
32
#16 Number of Low Enrolled (<10) Classes
Classes taught (#8) with 9 or fewer active students at
Census.
Excludes students who have already withdrawn (W) at
this point.
Excludes Directed Studies (99 series).
Includes Cooperative Education (93 series) as there is a
resource cost to the program.
33
Determination of program’s health based on efficiency
This year the system office will calculate and report health calls for all
instructional programs using AY 2011 data. The following instructions
illustrate how those calls are made.
Program Efficiency is calculated using 2 separate measures: Fill Rate
(#10) and Majors to FTE BOR Appointed Faculty (#12).
The following benchmarks are used to determine health for Fill Rate:
Healthy: 75 - 100%
Cautionary: 60 - 74%
Unhealthy: < 60%
An Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy
34
Determination of program’s health based on efficiency cont…
The following benchmarks are used to determine health for Majors/FTE
BOR Appointed Faculty:
Healthy: 15 - 35
Cautionary: 30 - 60; 7 - 14
Unhealthy: 61+; 6 or fewer
An Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy
Finally, average the 2 overall health scores for Class Fill Rate and
Majors/FTE BOR Appointed Faculty, then use the following rubric:
1.5 - 2 = Healthy
.5 - 1 = Cautionary
0 = Unhealthy
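Here is a minimal sketch of the whole efficiency call under the benchmarks above (our illustration; where the published bands overlap, the sketch checks Healthy first):

```python
def fill_rate_score(fill_rate_pct: float) -> int:
    # Fill Rate (#10): 75-100% Healthy, 60-74% Cautionary, <60% Unhealthy.
    if fill_rate_pct >= 75:
        return 2
    if fill_rate_pct >= 60:
        return 1
    return 0

def majors_per_fte_score(ratio: float) -> int:
    # Majors/FTE BOR Appointed Faculty (#12). The deck's Healthy (15-35)
    # and Cautionary (30-60) bands overlap, so Healthy is checked first.
    if 15 <= ratio <= 35:
        return 2
    if 7 <= ratio <= 14 or 30 <= ratio <= 60:
        return 1
    return 0

def efficiency_health(fill_rate_pct: float, majors_per_fte: float) -> int:
    # Average the two category scores, then apply the rubric:
    # 1.5-2 Healthy (2), .5-1 Cautionary (1), 0 Unhealthy (0).
    avg = (fill_rate_score(fill_rate_pct) + majors_per_fte_score(majors_per_fte)) / 2
    if avg >= 1.5:
        return 2
    if avg >= 0.5:
        return 1
    return 0

print(efficiency_health(fill_rate_pct=80, majors_per_fte=25))  # 2 (Healthy)
```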
35
#17 Successful Completion
(Equivalent C or higher)
Percentage of students actively enrolled in program classes at Fall
and Spring census who at end of semester have earned a grade
equivalent to C or higher.
36
#18 Withdrawals (grade = W)
Number of students actively enrolled (at this point have not
withdrawn) at Fall and Spring census who at end of semester have a
grade of W.
37
#19 Persistence Fall to Spring
Count of students who are majors in program at fall
census (from Fall semester #3) and at subsequent
Spring semester census are enrolled and are still majors
in the program.
Example:
31 majors start in Fall
21 majors of the original 31 persist into Spring
21/31 = .6774 or 67.74%
38
#20 Unduplicated Degrees/Certificates Awarded
Unduplicated headcount of students in the fiscal year reported to whom a program
degree or any certificate has been conferred. (Sum of 20a, 20b, 20c, and 20d).
Uses most recent available freeze of fiscal year data.
For ARPD year 2009, the most recent fiscal data on August 15, 2009, was from FY
2008.
For ARPD year 2010, the most recent fiscal data on August 15, 2010, was from FY
2010.
For ARPD year 2011, the most recent fiscal data on August 15, 2011, was from FY
2011.
39
#20a Number of Degrees Awarded
Degrees conferred in the FISCAL_YEAR_IRO.
The count is of degrees and may show duplicate degrees received in the
program by the same student if the program offers more than one degree.
Uses most recent available freeze of fiscal year data.
FISCAL_YEAR_IRO:
“Fiscal year, where the value indicates the ending of the fiscal year. For
example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year
2004‐2005 (July 1, 2004 to June 30, 2005) which includes Summer 2004,
Fall 2004, and Spring 2005 semesters…”
40
#20b Certificates of Achievement Awarded
Certificates of achievement conferred in the FISCAL_YEAR_IRO.
The count is of program certificates of achievement and may show multiple
certificates of achievement in the same program received by the same
student.
Uses most recent available freeze of fiscal year data.
FISCAL_YEAR_IRO:
“Fiscal year, where the value indicates the ending of the fiscal year. For
example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year
2004‐2005 (July 1, 2004 to June 30, 2005) which includes Summer 2004,
Fall 2004, and Spring 2005 semesters…”
41
#20c Academic Subject Certificates Awarded
The count is of program Academic Subject Certificates and may show
multiple Academic Subject Certificates in the same program received by the
same student.
Uses most recent available freeze of fiscal year data.
42
#20d Other Certificates Awarded
The count is of other program certificates (such as APC) and will show
multiples received by the same student.
Uses most recent available freeze of fiscal year data.
43
#21 Transfers to UH 4-yr programs
Students whose home campus is UH Manoa, UH Hilo, or UH West Oahu for the
first time in Fall 2010 and who, prior to Fall 2010, had a UH community college as
their home campus.
Also includes students who, for the first time in Fall 2010, show Maui CC
Applied Business Information Technology as their home campus and major.
This is a program measure. A student is included in the count of program
transfers for as many programs as they have been a major in at the
college.
44
#21a Transfers with credential from program
Students included in #21 who have received a degree from the
community college program prior to transfer. Does not include any
certificates.
45
#21b Transfers without credential from program
Students included in #21 who did not receive a degree from the
community college program prior to transfer.
46
Determination of program’s health based on effectiveness
This year the system office will calculate and report health calls for all instructional
programs using academic year 2011 data. The following instructions illustrate how
those calls are made.
Program Effectiveness is calculated using 3 separate measures: Unduplicated
Degrees/Certificates Awarded (#20) / Majors (#3), Unduplicated Degrees/Certificates
Awarded (#20) / Annual new and replacement positions (County prorated) (#2), and
Persistence Fall to Spring (#19).
The following benchmarks are used to determine health for Unduplicated
Degrees/Certificates Awarded per major:
Healthy: > 20%
Cautionary: 15 - 20%
Unhealthy: < 15%
An Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy
47
Determination of program’s health based on effectiveness cont…
The second measure used to determine health is Unduplicated Degrees/Certificates
Awarded (#20) / Annual new and replacement positions (County prorated) (#2).
The following benchmarks are used for this measure:
Healthy: .75 - 1.5
Cautionary: .25 - .75 and 1.5 - 3.0
Unhealthy: < .25 and > 3.0
An Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy
48
Determination of program’s health based on effectiveness cont…
The third measure used to determine health is Persistence (Fall to Spring) (#19).
The following benchmarks are used for this measure:
Healthy: 75 - 100%
Cautionary: 60 - 74%
Unhealthy: < 60%
An Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy
49
Determination of program’s health based on effectiveness cont…
You should now have a value of zero, one, or two for each of the 3 effectiveness
measures. The process of determining the Effectiveness health call score contains
the following 3 steps:
Step #1: Add up all 3 Overall Category Health scores for the effectiveness
measures (the zeros, ones, and twos you assigned earlier).
Step #2: Determine the effectiveness category health call range where:
5 - 6 = Healthy
2 - 4 = Cautionary
0 - 1 = Unhealthy
Step #3: Now use the scoring rubric below to determine the effectiveness health call
score (for example, if you had a healthy 5 in the previous step, you would assign it a
healthy 2 here):
2 = Healthy
1 = Cautionary
0 = Unhealthy
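The three steps reduce to a few lines of code; this sketch is ours, assuming each input is the 0/1/2 category score from the previous slides:

```python
def effectiveness_health(awards_per_major: int,
                         awards_per_position: int,
                         persistence: int) -> int:
    # Step 1: sum the three category scores (each 0, 1, or 2).
    total = awards_per_major + awards_per_position + persistence
    # Steps 2-3: band the total, then map the band to a 2/1/0 call.
    if total >= 5:
        return 2  # 5-6 = Healthy
    if total >= 2:
        return 1  # 2-4 = Cautionary
    return 0      # 0-1 = Unhealthy

print(effectiveness_health(2, 2, 1))  # total 5 -> 2 (Healthy)
```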
50
Determination of program’s overall health
You should now have a value of zero, one, or two for each of the 3 program health
calls: Demand, Efficiency, and Effectiveness. Simply add those 3 values together
and use the Scoring Range Rubric below to determine the overall health of your
program.
5 - 6 = Healthy
2 - 4 = Cautionary
0 - 1 = Unhealthy
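In code form (a sketch, using the same rubric):

```python
def overall_health(demand: int, efficiency: int, effectiveness: int) -> int:
    # Sum the three health calls (each 0, 1, or 2) and band the total:
    # 5-6 Healthy, 2-4 Cautionary, 0-1 Unhealthy.
    total = demand + efficiency + effectiveness
    return 2 if total >= 5 else (1 if total >= 2 else 0)

print(overall_health(demand=2, efficiency=1, effectiveness=2))  # 2 (Healthy)
```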
51
#22
Number of Distance Education Classes Taught
Measures the number of classes taught with the mode of delivery as
“Distance Completely Online.”
In setting up the class, the college indicates the method of
instruction used by the instructor in conducting the class.
If the method is Distance Education and the college indicates the
"mode" of distance delivery as "Distance Completely Online," the
class will be included in this count.
52
#23
Enrollment Distance Education Classes
At the Fall and Spring census, the number of students actively
enrolled in all classes owned by the program and identified as
Distance Completely On‐Line (#22).
Does not include students who at Census have already withdrawn
from the class.
53
#24
(DE) Fill Rate
Total active student registrations in program distance education
classes (#23) (number of seats filled) at Fall and Spring
census divided by the maximum enrollment (number of seats
offered).
Does not include students who at Census have already withdrawn
from the class.
54
#25
(DE) Successful Completion (Equivalent C or
higher)
Percentage of students enrolled in program Distance Education
classes (#23) at Fall and Spring census who at end of semester
have earned a grade equivalent to C or higher.
55
#26
(DE) Withdrawals (Grade=W)
Number of students actively enrolled in program Distance Education
classes (#23) at Fall and Spring census who at end of semester
have a grade of W.
56
#27
(DE) Persistence (Fall to Spring not limited to Distance Ed)
Students enrolled in program distance education classes at Fall census who
at subsequent Spring semester census are enrolled in the college.
Not limited to students continuing to take program distance education
classes.
Example:
31 majors enrolled in DE classes in Fall
21 majors of the original 31 majors persist into any Spring class
21/31 = .6774 or 67.74%
57
#28 Perkins Core Indicator: Technical Skills
Attainment (1P1)
Perkins Core Indicators are used in the development of the HawCC Program Health
Indicator (PHI) reports completed annually for CTE programs. The data that you see
in your program review is for the 2011 academic year. All Perkins Core Indicator
goals not met must be addressed in the narrative and action plan in your review.
A concentrator is a student who has a major (taken from the major field in Banner) for
a CTE program, and who has completed 12 or more credits by the end of the Perkins
year.
Uses Perkins data from prior year Perkins Consolidated Annual Report (CAR). Data
show State goal and College actual.
Technical Skills Attainment is calculated as:
Numerator: number of concentrators who have a cumulative GPA of 2.00 or higher in Career and Technical Education courses and who have stopped program participation in the year reported.
Denominator: number of concentrators who have stopped program participation in the year reported.
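Each Perkins Core Indicator is a simple rate; this generic sketch (with hypothetical counts) shows the 1P1 arithmetic:

```python
def perkins_rate(numerator_count: int, denominator_count: int) -> float:
    # For 1P1: numerator = concentrators with a CTE GPA of 2.00 or
    # higher who stopped program participation in the year reported;
    # denominator = all concentrators who stopped participation.
    return numerator_count / denominator_count

# Hypothetical: 18 of 24 exiting concentrators met the GPA threshold.
print(f"{perkins_rate(18, 24):.1%}")  # 75.0%
```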
58
#29 Perkins Core Indicator: Completion (2P1)
Perkins Core Indicators are used in the development of the HawCC Program Health
Indicator (PHI) reports completed annually for CTE programs. The data that you see
in your program review is for the 2011 academic year. All Perkins Core Indicator
goals not met must be addressed in the narrative and action plan in your review.
A concentrator is a student who has a major (taken from the major field in Banner) for
a CTE program, and who has completed 12 or more credits by the end of the Perkins
year.
Uses Perkins data from prior year Perkins Consolidated Annual Report (CAR). Data
show State goal and College actual.
Completion is calculated as:
Numerator: number of concentrators who received a degree or certificate in a Career and Technical Education program and who have stopped program participation in the year reported.
Denominator: number of concentrators who have stopped program participation in the year reported.
59
#30 Perkins Core Indicator: Student Retention or
Transfer (3P1)
Perkins Core Indicators are used in the development of the HawCC Program Health
Indicator (PHI) reports completed annually for CTE programs. The data that you see
in your program review is for the 2011 academic year. All Perkins Core Indicator
goals not met must be addressed in the narrative and action plan in your review.
A concentrator is a student who has a major (taken from the major field in Banner) for
a CTE program, and who has completed 12 or more credits by the end of the Perkins
year.
Uses Perkins data from prior year Perkins Consolidated Annual Report (CAR). Data
show State goal and College actual.
Student Retention or Transfer is calculated as:
Numerator: number of concentrators in the year reported who have not completed a program and who continue postsecondary enrollment or who have transferred to a baccalaureate degree program.
Denominator: number of concentrators in the year reported who have not completed a program.
60
#31 Perkins Core Indicator: Student Placement
(4P1)
Perkins Core Indicators are used in the development of the HawCC Program Health
Indicator (PHI) reports completed annually for CTE programs. The data that you see
in your program review is for the 2011 academic year. All Perkins Core Indicator
goals not met must be addressed in the narrative and action plan in your review.
A concentrator is a student who has a major (taken from the major field in Banner) for
a CTE program, and who has completed 12 or more credits by the end of the Perkins
year.
Uses Perkins data from prior year Perkins Consolidated Annual Report (CAR). Data
show State goal and College actual.
Student Placement is calculated as:
Numerator: number of concentrators in the year reported (previous Perkins year) who have stopped program participation and who are placed or retained in employment, military service, or an apprenticeship program within the unemployment insurance quarter following program completion.
Denominator: number of concentrators in the year reported (previous Perkins year) who have stopped program participation.
61
#32 Perkins Core Indicator: Nontraditional
Participation (5P1)
Perkins Core Indicators are used in the development of the HawCC Program Health
Indicator (PHI) reports completed annually for CTE programs. The data that you see
in your program review is for the 2011 academic year. All Perkins Core Indicator
goals not met must be addressed in the narrative and action plan in your review.
A concentrator is a student who has a major (taken from the major field in Banner) for
a CTE program, and who has completed 12 or more credits by the end of the Perkins
year.
Uses Perkins data from prior year Perkins Consolidated Annual Report (CAR). Data
show State goal and College actual.
Nontraditional Participation is calculated as:
Numerator: number of participants from underrepresented groups who participated in a program that leads to employment in nontraditional fields.
Denominator: number of participants who participated in a program that leads to employment in nontraditional fields.
62
#33 Perkins Core Indicator: Non-Traditional
Completion (5P2)
Perkins Core Indicators are used in the development of the HawCC Program Health
Indicator (PHI) reports completed annually for CTE programs. The data that you see
in your program review is for the 2011 academic year. All Perkins Core Indicator
goals not met must be addressed in the narrative and action plan in your review.
A concentrator is a student who has a major (taken from the major field in Banner) for
a CTE program, and who has completed 12 or more credits by the end of the Perkins
year.
Uses Perkins data from prior year Perkins Consolidated Annual Report (CAR). Data
show State goal and College actual.
Non-Traditional Completion is calculated as:
Numerator: number of concentrators from underrepresented gender groups who received a degree or certificate in a program that leads to employment in nontraditional fields.
Denominator: number of concentrators who received a degree or certificate in a program that leads to employment in nontraditional fields.
63
Comprehensive & Annual Program Review Timeline
Sept-Oct: Plan this year's program review based on suggested improvements from last year's review.
Sept: Build out the 2011 program review page on the assessment website and publish all updated materials.
Sept: Schedule training for all LA, Instruction, and Units.
Sept-Oct: Develop training materials for all LA, Instruction, and Units.
Sept 22nd: Provide Annual Program Review training to campus.
Oct: Review and edit all PR (not CERC) documentation.
Oct: Collect and deliver needed data to Student Support Services.
Nov 18: All program and unit reviews (Comprehensive and Annual) due to the Interim VCAA or the UHCC ARPD website by EOB Friday, November 18th.
Dec 15th: Collect and deliver needed data to Academic Support Services.
Dec 20th: Work with the AC and Interim VCAA to determine what feedback needs to be taken back to the UHCC IPRC.
Jan 5th-Jan 25th: Publish all Program & Unit Reviews to the Assessment Website; develop and administer PR Process Improvement Sessions (questionnaire, schedule meetings, collect feedback); summarize PR Process Improvement Feedback, communicate results to groups, and publish to the web.
Jan 27th: Package and deliver all Program & Unit Reviews to the System Office.
64
AY 2011 Comprehensive & Annual Program Review Process
Step 1
Write your program review using the appropriate template or
online submission tool.
Step 2
Send your documents (one Word document per review) to
Interim VCAA Joni Onishi by email no later than end of
business, Friday November 18th, 2011.
Step 3
Interim VCAA will ensure that all required documents
have been received and that they are adequate.
Step 4
Interim VCAA will forward all approved reviews to the
Institutional Research Office for further processing.
Step 5
The Annual reviews will be sent to the System Office for review
by the UH Board of Regents. Comprehensive reviews will be
forwarded along as appropriate following CERC guidelines.
Step 6
All reviews will finally be converted to PDF and posted
to the Assessment Web Site.
65
Questions?
The intention of this presentation was to provide a single source for all of
the documentation related to the Comprehensive & Annual Program Review
process. I have linked all of the documents you should need directly into
this presentation.
If you need more information on this process please feel free to contact me:
Shawn Flood
934-2648
Mahalo!
66
CERC Comprehensive Program
Review Template & Process
Mary will now discuss the comprehensive review template and briefly cover the
current CERC process.
Comprehensive Instruction Program Review Template
If you need assistance contact me:
Mary Goya, Professor
Early Childhood Education
Assessment Coordinator (part time)
Email: [email protected]
(Email is best)
Phone: 934-2629
Mahalo!
67