Using Instructional Program and Administrative Unit Review to Accomplish Institutional Effectiveness


Using Instructional Program and Administrative Unit Review to Accomplish Institutional Effectiveness
AIR Conference, June 1, 2004
Boston, MA
Terri M. Manning, EdD
Denise H. Wells, MS
Central Piedmont Community College
Charlotte, North Carolina
Some history….
• CPCC is the largest community college in either of the Carolinas, with approximately 60,000 students.
• We have over 100 instructional program areas, including for-credit, continuing education and literacy.
• We have almost 50 administrative and student affairs units.

Some history….
• When we started this process (1998), the College had done nothing truly “IE related” since the last SACS visit in 1992, when it received a recommendation for every “must” statement in Section III.
• The IR office staff had turned over, and all institutional memory was gone.
• We had less than five years to put every program and unit through a review process.
Institutional Effectiveness and the Southern Association of Colleges and Schools
• At the heart of SACS’ philosophy is the concept of institutional effectiveness (IE).
• IE involves a process of planning, evaluation, assessment and use of results.
• SACS requires that, in addition to providing evidence of planning and evaluation in its educational program, the institution must demonstrate planning and evaluation in its administrative and educational support services.
IE at Central Piedmont
• CPCC established an IE committee in 1998 and an IE website in 2000 (http://inside.cpcc.edu/IE).
• Committee members were from various instructional departments, student services and administrative/business offices.
• The committee established an IE plan for the college.

Visual of the IE Plan
Four Major Overlapping Pieces of the IE Plan:
• Annual Goal Setting Cycle: annual goal setting and follow-up required by all units
• Program/Unit Review: program/unit review on a 3-5 year cycle
• College Assessment Process: a college assessment plan involving surveys, assessment and data analysis
• General Education Assessment: the evaluation of general education
Developing the Process
• We wanted to create a meaningful program/unit review process.
• We wanted programs to complete the process having learned something valuable (not a document to sit on a shelf).
• We wanted the process to be “outcome-based” so it would stand the test of time.
Developing the Process
• The first review process was created for instructional programs (the main focus at most institutions).
• During the 1998-99 year, 18 programs were reviewed (approximately 100 programs over five years).
• The perception at the beginning was that “this is just another academic exercise.”
• But the results were very different from any review process previously completed.
Organizational Stages of Outcome Evaluation
• Stage 1: Disbelief & denial; paralysis - passive resistance
• Stage 2: Anger and antagonism; resistant & reactive
• Stage 3: Bargaining - no time/no money; seek outside sources
• Stage 4: Depression - exhaustion; compliance - passive reactive
• Stage 5: Acceptance & adaptation; challenge & competition; catalyst - proactive
Instructional Program Review
The review process contained five sections:
I. The Program Profile
II. Program Content
III. Student Learning Outcomes
IV. Need for Change
V. Future Issues
Instructional Program Review
Tasks accomplished through review:
• Defined their program
• Detailed faculty and staff (credentials, professional development, accomplishments)
• Stated the unit mission or purpose
• Linked the unit mission to the college mission
• Defined the content of their program
• Defined the population served
• Determined student learning outcomes specific to their program
• Set outcome objectives
• Performed some means of assessment
• Analyzed results
• Determined strengths and weaknesses
• Created strategies for improvement
• Determined future needs (including curricular change, needed resources, future trends, staff and space)
All units had to complete a one-year follow-up.
We focused on measuring…
• Administrative or service objectives
• Outcomes (student learning outcomes and outcome objectives)
Administrative Objectives
"Administrative objectives" are measurable descriptions of what a unit hopes it or its clients will achieve through the delivery of services.
Administrative Objectives
• Many units do not directly serve students, or they want results within their units that are not truly outcomes.
• They want to improve services or approach an old problem in a new way.
• They want to become more efficient and effective.
• They will set administrative objectives.
Administrative Objectives
Objectives set for the program (nothing to do with student outcomes). Examples:
• to recruit one new faculty member
• to seek and gain accreditation
• to increase retention by 10%
• to send each faculty member to at least one professional conference per year
• to gain funding for an innovative program through a grant proposal
Outcomes
• "Outcomes" are benefits for people: changes in knowledge, values, position, skills, behavior or status. More simply stated, outcomes are typically what service providers hope recipients achieve once they complete a program or receive services. This is not the “what” but the “why” of education.
• Student learning outcomes are outcomes related to the learning that takes place in the classroom… what are the benefits to a student who receives an associate degree in Nursing or completes a math class?
• Outcome objectives are simply objectives that relate to the identified outcomes.
Program Outcome Model
• INPUTS (resources): staff, buildings, facilities, state funds, FTE; constraints: laws, state regulations
• ACTIVITIES (services): education (classes), services, counseling, student activities
• OUTPUTS (products or results of activities): numbers served, FTE (input next year), classes taught, students recruited
Program Outcomes Model
INPUTS > ACTIVITIES > OUTPUTS > OUTCOMES
Benefits for People
*New knowledge
*Increased skills
*Changes in values
*Modified behavior
*Improved condition
*Altered status
Instructional Program Review
Rules for the process included:
• Involvement on the part of all faculty in the department under review (not just one person). It is recommended that the program begin with a brief faculty retreat to discuss and divide tasks.
• All programs must use at least one external committee (advisory groups are fine) to provide feedback to the program.
• All programs must utilize feedback from students.
During the first few years…
• Planning and Research conducted training on “how to conduct program review” with each unit scheduled to go up for review that year.
• A staff member from Planning and Research served as a liaison to each program being reviewed to help them with the process.
What We Learned….
• The first group was dragged kicking and screaming through the process
• Faculty had very little time for the details
• Faculty had trouble identifying “student learning outcomes”
• The results produced great marketing materials
• Once they saw the results, faculty embraced the process
What Happened
• Deans used the results to make a case for resources
• The administration became interested in what was learned through the review process
• Instruction created a position to deal mainly with program review and IE issues within instruction
• Faculty knew their programs were working
IE Committee Decision
• The instructional program review process was successful
• The review of instructional units was important, but administrative/student services units helped create an environment conducive to learning at the college and supported the learning process
• Administrative units should go through a similar process
IE Committee Decision, cont.
• Create a committee to draft a similar process for the administrative and student services areas of the college
• Administrative units were spread across three VP areas
• Two representatives from each VP area participated in drafting the new review process
• We spent approximately six months creating a workable process
Timeline for Administrative Units
During the:
• 1999-2000 year, six units went through the review
• 2000-2001 year, eight units went through the review
• 2001-2002 year, nine units went through the review
Vice Presidents set an annual timeline for their units with due dates for each section.
The Administrative Unit Review Design
I. The Unit/Program Profile
  A. The Mission/Purpose
    1. Role the unit plays in the college mission
    2. Unit/program goals as they relate to the college’s mission
  B. The Staff
    1. Professional and administrative staff (full-time since the last review)
      a. Position description/duties
      b. Credentials (full- and part-time, if any)
      c. Accomplishments (if applicable)
      d. Service to college, community and nation
The Administrative Unit Review (continued)
  B. The Staff (continued)
      e. Professional development activities
    2. Classified Staff
      a. List of names and positions
      b. List of required credentials (if any)
  C. The Customer/Client Served
    1. Breakdown of students, faculty or staff by type or demographic information (thorough explanation of who is served)
The Administrative Unit Review (continued)
II. Definition of Services or Program
  A. Definition of the day-to-day duties of the unit
  B. Innovations, new projects, new initiatives, local, state-wide or national efforts
  C. Required functions of the unit (description and status of compliance)
    1. SACS "must" statements
    2. State mandates
    3. Federal mandates
    4. Other
The Administrative Unit Review (continued)
III. Administrative Objectives and Student Outcomes (where appropriate)
  A. Administrative Objectives (2-3 objectives)
  B. Outcomes (or status if incomplete) of innovations, new projects, new initiatives, local, state or national efforts
  C. Assessment explanation (what was assessed, who, when, how many)
  D. Results of Administrative Objectives based on assessment
Assessment of Administrative Units
• During the year of program review, the “Annual Faculty/Staff Survey” contained questions from those units being reviewed.
• Results were given to each unit, broken out by campus and job type.
• http://inside.cpcc.edu/planning (click on “survey results”)
The Administrative Unit Review (continued)
IV. Need for Change
  A. Strengths identified by external sources, faculty, staff and students
  B. Weaknesses identified by external sources, faculty, staff and students
  C. Recommendations by faculty, staff, external sources and students to improve the unit's services and programs
  D. Strategies for change (based on input from A, B & C above) - closing the loop
  E. A brief one-year follow-up report to the unit VP on the progress of D above (due April 15 of the year following review)
The Administrative Unit Review (continued)
V. Future Issues - Resources needed for future efforts
  A. Market trends within the broad service unit or program area (based on best practices, the literature or training received)
  B. Anticipated future changes and needs (based on market trends)
  C. Resources, equipment, space, staffing and workload changes needed for future growth or continuation
  D. Future plans of the unit
Assistance from the Website
The IE website: http://inside.cpcc.edu/IE
• Explanation of the IE process
• Templates and forms for review
• “Solid” examples for clarification
• Schedule for review
Unit Review – Lessons Learned
To do this well you need several critical pieces:
1. Support from the top (President or Chancellor, Vice Presidents or Vice Chancellors)
2. Buy-in from the grassroots level (participation in the development of the process)
3. Across-the-college participation (no one is exempt)
4. Technology to make it easy (web page and review templates)
5. Technical support from institutional research
Overall Benefits
• #1: No recommendations in the area of Institutional Effectiveness from SACS during the October 2002 re-accreditation visit
• The college became change-oriented
  - Units had to define strategies for change
  - Units didn’t have to be perfect, but rather had to make continuous progress
  - Strategies for change helped identify needed resources for units
  - Units had to “close the loop” with the one-year follow-up (couldn’t promise and not deliver)
  - Once they started, we couldn’t get them to stop doing it
Overall Benefits
• Units became empowered to perform their functions in an optimal manner and to ask for what they needed (no one noticed them before, now they do)
• Created data to support needs
• Accomplishments were reported to major administrative groups across the college (President’s Cabinet, Planning Council, IE Committee, etc.)
Overall Benefits
• The college community understood the purpose and function of every unit
• Senior administration realized the benefits and became strong supporters
  - Data are reported annually from program/unit review
  - VPs became stronger advocates for their units making changes to improve services
Administrative Units Reviewed
• Security
• Health and Safety Office
• Human Resources
• Professional Development
• Resource Development
• The Foundation
• EEO Office
• President’s Office
• Facilities Management
• Distribution Services
• Facilities Design and Construction
• Financial Services
• Payroll
• Budgeting
• Financial Reporting
• Purchasing
• Accounts Payable
• Cashiering
• Audit and Compliance
• Basic Skills Reporting
• Inventory Control
• Bookstore
• Campus Printing
• Vending
• Information Technology
• Administrative Computing Services
• Planning and Research
• Marketing and Community Relations
Student Affairs Units Reviewed
• Counseling and Advising
• Financial Aid/Veteran’s Affairs
• Career Services
• Disability Services
• “Campus-based” student services (6 campuses, six units)
• Admissions and Records
• Registration Services
• The Library
• The Academic Learning Center (tutoring)
• Testing Center
• Graduation Office
• Student Life
Some Brief Results
Planning and Research

Weaknesses Identified:
• Data/information needs to be disseminated to the grass roots level of the College (not just administration)
• Data needs to be made more accessible and easier to understand
Strategy for Change:
• Make improvements in the department’s webpage
• Place more information on the webpage
Result:
• Units are using data to make decisions
• Planning and Research has a heavily used website with user-friendly spreadsheets, tables and an online Fact Book. In-person requests for data have dramatically declined because faculty/staff are using the webpage.
Professional Development

Weaknesses Identified:
• Instructors could not attend training sessions due to the times they were scheduled (top of the hour, when classes were scheduled on the half hour)
Strategy for Change:
• Starting times for all training sessions will be adjusted to begin on the half hour rather than on the hour
Results:
• More faculty were able to attend trainings and numbers increased in some areas
Health and Safety Office

Weaknesses Identified:
• Not all employees are sure what Health and Safety does
• The Health and Safety Officer title is confusing (perception of enforcement, relative to security)
Strategy for Change:
• Change the department’s name to Occupational Health and Safety
Results:
• Clear understanding of who is responsible for security and who is responsible for occupational safety
The Bookstore

Weaknesses Identified:
• Communication between faculty and staff and the bookstore management needs improvement
Strategies for Change:
• A bookstore advisory committee has been formed. Action items identified will be followed up to ensure that the bookstore is meeting the needs of faculty/staff
Results:
• Decided to outsource the bookstore
• Worked with faculty to ensure an adequate number of books were available at the beginning of the term
• Created a process to move books from campus to campus to accommodate student needs
• Created more buy-back centers on the campuses to improve services
Information Technology Services

Weaknesses Identified:
• Better follow-up is needed on Help Desk requests
• Requests need to be answered faster
• Knowledgeable people are needed at the Help Desk
Strategies for Change:
• Employ students with pertinent certification to staff the Help Desk
• Empower Help Desk staff with decision-making ability that helps provide the required level of service
Results:
• The Help Desk created a tracking system that has radically improved time from request to completion
Financial Services

Weaknesses Identified:
• Lack of training in various financial services areas (payroll, budgeting, purchasing)
Strategies for Change:
• Work with Professional Development to establish a comprehensive training schedule
• Announce trainings in advance and hold them at different campuses
Results:
• Fewer calls with questions
• Fewer errors on forms
• People have more confidence in their ability to manage budgets
Major Benefits for IR Office
• We created work for ourselves, with great payoff.
• Each unit being reviewed had a liaison from IR to help them with surveys, objectives, etc.
• Now, everyone knows us.
• They think we are very helpful and a great asset to the institution.
• They call us for information and then listen to what we say.
“Best” Results of “Best” Practice
• Better use of data across the college
• We have become more student/customer focused
• Review gives direction for goals and needed changes
• Departments are empowered to do their jobs (can’t slip through the cracks and be unnoticed)
• Problems must be resolved (there is no hiding and no excuses)
• Surveys provide needs assessment data as well as evaluation, which gives departments direction
For a copy of this presentation
• Contact Terri Manning: [email protected]
• Download or print the presentation: http://inside.cpcc.edu/planning
  - Click on “studies and reports”
  - Listed as SAIR Presentation