
Using Program/Unit Review in Instructional, Business/Administrative and Student Services Areas to Facilitate Change

The League for Innovation, Innovations Conference 2005
NYC, March 8, 2005
Terri M. Manning, EdD
Denise Wells, MS
Kathy Drumm, DBA, CPA
Central Piedmont Community College, Charlotte, North Carolina

Some history….

CPCC is the largest community college in the Carolinas, with approximately 60,000 enrolled students.

We have over 100 instructional program areas, including for-credit, continuing education and literacy
We have almost 50 administrative and student affairs units

Some history….

When we started this process (1998), the College had done nothing truly “IE related” since the last SACS visit in 1992, when it received a recommendation for most “must” statements in Section III.

The IR office staff had turned over, and all institutional memory was gone.

We had less than 5 years to put every program and unit through a review process.

Why this Process?

The evaluation of teaching and educational effectiveness is a top priority for most institutions
However, the evaluation of business, administrative and student services units is often undervalued and overlooked
Evaluating the effectiveness of services (not just customer satisfaction) makes good business sense

Institutional Effectiveness and the Southern Association of Colleges and Schools

At the heart of SACS’ philosophy is the concept of institutional effectiveness (IE)
IE involves a process of planning, evaluation, assessment and use of results
Under the new core requirements and comprehensive standards, there are two “mandates” that specifically address program or unit reviews

Under the New Core Requirements and Comprehensive Standards

Core Requirement 2.5 states: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement, and (b) demonstrates that the institution is effectively accomplishing its mission.”
Comprehensive Standard 3.3.1 states: “The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”

IE at Central Piedmont

CPCC established an IE committee in 1998 and an IE website in 2000 (http://inside.cpcc.edu/IE); the website gets an average of 61 hits a day, most from outside the college
Committee members were from various instructional departments, student services and administrative/business offices
The committee established an IE plan for the college

Visual of the IE Plan

Four Major Overlapping Pieces of the IE Plan

General Education Assessment
Annual Goal Setting Cycle
Program/Unit Review
College Assessment Process

Annual goal setting and follow-up required of all units (Strategic Plan)
Program/unit review on a 3-5 year cycle
College assessment plan involving surveys, assessment and data analysis
The evaluation of general education

Developing the Process

We wanted to create a meaningful program/unit review process
We wanted programs to complete the process having learned something valuable (not just produce a document to sit on a shelf)
We wanted the process to be “outcome-based” so it would stand the test of time

Developing the Process

The first review process was created for instructional programs
During the 1998-99 year, 18 programs were reviewed (approximately 100 programs over five years)
The perception at the beginning was that “this is just another academic exercise”
But the results were very different from those of any review process previously completed

Organizational Stages of Outcome Evaluation
Stage 1: Disbelief and denial; paralysis, passive resistance
Stage 2: Anger and antagonism; resistant and reactive
Stage 3: Bargaining (no time, no money); seek outside sources
Stage 4: Depression and exhaustion; compliance, passive-reactive
Stage 5: Acceptance and adaptation; challenge and competition; catalyst, proactive

Instructional Program Review

The review process contained five sections:
I. The Program Profile
II. Program Content
III. Student Learning Outcomes
IV. Need for Change
V. Future Issues

Instructional Program Review

Tasks accomplished through review:
Defined their program
Detailed faculty and staff (credentials, professional development, accomplishments)
Stated the unit mission or purpose
Linked the unit mission to the college mission
Defined the content of their program
Defined the population served
Set outcome objectives
Performed some means of assessment
Analyzed results
Determined strengths and weaknesses
Created strategies for improvement
Determined future needs (including curricular change, needed resources, staff and space)

Instructional Program Review

Rules for the process included:

Involvement on the part of all faculty in the department under review (not just one person). It is recommended that the program begin with a brief faculty retreat to discuss and divide tasks.

All programs must use at least one external committee (advisory groups are fine) to provide feedback to programs.

All programs must utilize feedback from students.

What We Learned….

The first group was dragged kicking and screaming through the process
Faculty had very little time for the details
Faculty had trouble identifying “student learning outcomes”
The results produced great marketing materials
Once they saw the results, faculty embraced the process

Identifying Outcomes

We focused on three types of outcomes:
Program Outcomes
Student Learning Outcomes
Administrative Outcomes

Outcomes and the Classroom

   "Outcomes" are benefits for people: changes in

knowledge, values, position, skills, behavior or

status. More simply stated, outcomes are typically what service providers hope recipients achieve once they complete a program or receive services. This is not the “what” but the “why” of education.

Student learning outcomes are outcomes related to the learning that takes place in the classroom. We measure improvements in writing, speaking, understanding the scientific method, etc.

Program outcomes are the benefits to a student who receives an associate degree in Nursing or completes a certificate in Network Administration. Typical outcomes might be licensure exam scores, job placement, etc.

Program Outcome Model

INPUTS (Resources): staff, buildings, facilities, state funds, FTE; constraints such as laws and state regulations
ACTIVITIES (Services): education (classes), services, counseling, student activities
OUTPUTS (Products or Results of Activities): numbers served, FTE (input next year), number of classes taught, number of students recruited, number who participated

Program Outcomes Model

INPUTS > ACTIVITIES > OUTPUTS > OUTCOMES
OUTCOMES (Benefits for People): new knowledge, increased skills, changes in values, modified behavior, improved condition, altered status, new opportunities

Administrative Outcomes

Many units do not directly serve students, or they want results within their units that are not truly student learning outcomes.

They want to improve services or approach an old problem in a new way.

They want to become more efficient and effective.

They will set administrative outcomes.

My Administrative Outcomes

1. 80% of faculty/staff responding to the faculty/staff survey will perceive that Planning and Research responds quickly to their requests for data.

2. 80% of faculty/staff responding to the survey will perceive that Planning and Research makes a significant contribution to the College.

3. 80% of faculty/staff responding to the survey will perceive that Planning and Research contributes to the effectiveness of CPCC.

4. 80% of faculty/staff responding to the survey will indicate that Planning and Research produces enough reports to meet the planning and information needs of faculty and staff.

We Had to do Training on How to Set Outcome Objectives

There’s no magic number (e.g., 80% or 90%)
What is reasonable?
What can you afford?
What realistically can your staff accomplish?
May need to benchmark (e.g., enrollment growth)
What percent shows you’re not committed, and what percent shows you’re naïve?

How to Set Outcome Objectives

Examples:
Fifty percent of students will be able to communicate effectively in writing (complete the writing exam with a grade of 60 [D] or better)
By the end of the spring term, 95% of faculty and staff will have completed 20 contact hours of professional development (workshops, college courses, conferences, onsite trainings, etc.)

More Realistic

Seventy percent of students will be able to communicate effectively in writing (complete the writing exam with a grade of 75 [C+] or better)
By the end of the spring term, the professional development office will increase its offerings for faculty and staff by 10% over what was offered last year (workshops, college courses, conferences, onsite trainings, etc.)

What Happened

Deans used the results to make a case for resources
The administration became interested in what was learned through the review process
Faculty knew their programs were working

Unit Review – Lessons Learned

To do this well you need several critical pieces:
1. Support from the top (President or Chancellor, Vice Presidents or Vice Chancellors)
2. Buy-in from the grassroots level (participation in the development of the process)
3. Across-the-college participation (no one is exempt)
4. Technology to make it easy (web page and review templates)
5. Technical support from institutional research (a liaison to mentor)

IE Committee Decision

The instructional program review process was successful
The review of instructional units was important, but administrative/ESS units helped create an environment conducive to learning at the college and supported the learning process
Administrative units should go through a similar process
Create a committee to draft a similar process for the administrative and student services areas of the college
Administrative units were spread across three VP areas
Two representatives from each VP area participated in drafting the new review process
We spent approximately six months creating a workable process

Administrative Units Reviewed

       

Planning and Research
Health and Safety Office
Human Resources
Professional Development
Resource Development
Equal Opportunity Employment Office
Facilities Services (Distribution Services, Facilities Design and Construction, Facilities Management)
Financial Services (Payroll, Budgeting, Financial Reporting, Purchasing, Accounts Payable, Cashiering)

Administrative Units Reviewed, cont.

          

Audit and Compliance
Basic Skills Reporting
Inventory Control
College Services (Bookstore, Campus Printing and Vending)
Information Technology
Administrative Computing Services
Security
Executive Assistant to the President
Marketing and Community Relations
The College’s Foundation
The Library

Student Services Units Reviewed

            

Counseling and Advising
Career Services
Graduation Office
Financial Aid/Veteran’s Affairs
Student Activities
Admissions
Registration
Retention Services
Recruitment
Campus Student Services
Information Services
Welcome Center/Student Success Center
Testing Center

The Administrative Unit Review Design

I. The Unit/Program Profile
A. The Mission/Purpose
1. Role the unit plays in the college mission
2. Unit/program goals as they relate to the college’s mission
B. The Staff
1. Professional and administrative staff (since the last review)
a. Position description/duties
b. Credentials (full and part-time, if any)
c. Accomplishments (if applicable)
d. Service to college, community and nation

The Administrative Unit Review (continued)

B. The Staff (continued)
e. Professional development activities
2. Classified Staff
a. List of names and positions
b. List of required credentials (if any)
C. The Customer/Client Served
1. Breakdown of students/faculty or staff by type or demographic information (thorough explanation of who is served)

The Administrative Unit Review (continued)

II. Definition of Services or Program
A. Definition of day-to-day duties of the unit
B. Innovations, new projects, new initiatives, local, state-wide or national efforts
C. Required functions of unit (description and status of compliance)
1. SACS requirements
2. State mandates
3. Federal mandates
4. Other

The Administrative Unit Review (continued)

III. Administrative Outcome Objectives and Student Outcomes (where appropriate)
A. Administrative Outcome Objectives (2-3 objectives)
B. Outcomes (or status if incomplete) of innovations, new projects, new initiatives, local, state or national efforts
C. Assessment explanation (what was assessed, who, when, how many)
D. Results of Administrative Outcome Objectives Based on Assessment

Assessment of Administrative Units

During the year of program review, the “Annual Faculty/Staff Survey” contained questions from the units being reviewed.

Results were given to each unit, broken out by campus and job type.

A link to the survey results is available at http://inside.cpcc.edu/planning.

The Administrative Unit Review (continued)

IV. Need for Change
A. Strengths identified by external sources, faculty, staff and students
B. Weaknesses identified by external sources, faculty, staff and students
C. Recommendations by faculty, staff, external sources and students to improve the unit's services and programs
D. Strategies for change (based on input from Sections A, B & C) - closing the loop
E. A brief one-year follow-up report to the unit VP on the progress of D above (due at the end of the year following review)

The Administrative Unit Review (continued)

V. Future Issues - Resources needed for future efforts
A. Market trends within the broad service unit or program area (based on best practices, the literature or training received)
B. Anticipated future changes and needs (based on market trends)
C. Resources, equipment, space, staffing and workload change needs for future growth or continuation
D. Future plans of unit

Assistance from the Website

The IE website: http://inside.cpcc.edu/IE
Explanation of the IE process
Templates and forms for review
“Perfect” examples for clarification
Schedule for review

Overall Benefits

#1: No recommendations in the area of Institutional Effectiveness from SACS during the October 2002 re-accreditation visit
The college became change-oriented:
Units had to define strategies for change
Units didn’t have to be perfect, but they did have to make continuous progress
Strategies for change helped identify needed resources for units
Units had to “close the loop” with the one-year follow-up (they couldn’t promise and not deliver)

Overall Benefits

Units became empowered to perform their functions in an optimal manner and to ask for what they needed (no one noticed them before; now they do)
Created data to support needs
Accomplishments were reported to major administrative groups across the college (President’s Cabinet, Planning Council, etc.)

Overall Benefits

The college community understood the purpose and function of every unit
Senior administration realized the benefits and became strong supporters
Data are reported annually from program/unit review
VPs became stronger advocates for their units making changes to improve services

Overall Lessons Learned Through Program/Unit Review

The response rate to online surveys was double that of pencil-and-paper surveys
Faculty and staff did not know what many units did
Faculty and staff were unsure of many policies (how to obtain funds from the Foundation, etc.)
Email was the preferred method of communication for faculty and staff
Overall, faculty and staff were pleased with services
Wednesday afternoons were the best time to hold trainings

Overall Lessons Learned Through Program/Unit Review (cont.)

Faculty and staff had good ideas on how to improve services, if we just asked them
Use of P-Cards was saving the college time and money, and faculty/staff preferred using them
Faculty and staff wanted to continue to receive paycheck receipts in the mail (all employees are on direct deposit)
More than 80% of faculty and staff were using remote access to email
Faculty and staff wanted changes in the budgeting process, which was too complicated

Administrative Response

   Units spend a lot of time working on the program review.

If we want them to take it seriously, we have to take it seriously.

Once reviews are reported at the end of the year, the unit’s Dean or Vice President/Chancellor needs to:
1. Read them
2. Respond to them

From Personal Experience

Sit down with the unit (group meeting)
Give them my attention
Discuss what was learned
Discuss their major issues
Discuss what they need to make improvements
Actually attempt to direct resources to them to make those improvements
Then they do not see it as an academic exercise

How It Works for Us

Administrative Units Reviewed in 2003:
Administrative Technology Services
Compliance & Audit, Inventory Control, Basic Skills Compliance & Reporting
Facilities Services
Information Technology Services
Institutional Advancement/Foundation
Planning and Research
Resource Development

Facilities Services

Consists of the following departments:
Facilities Design and Construction – oversees planning, design, and construction of new facilities and major renovations
Facilities Management – operates and maintains the existing facilities of the College
Security – provides physical security services at all campuses
Distribution Services – manages collection and delivery of U.S. and intercampus mail, package deliveries, and centralized shipping and receiving services
Inventory Control – manages the College’s inventory tracking, reporting, and disposal functions

Facilities Services

Survey respondents gave high rankings (in the 80-90% range) for:
Newly constructed or newly renovated facilities being of high quality and meeting needs
Knowing what was planned for the future facilities of their respective campuses
Classrooms in buildings built since 1990 being adequate in size, furnishings and other amenities

Facilities Services

Classrooms and common areas being generally clean
Lighting levels being adequate
A safe and secure environment being provided for all members of the College community
Knowing the procedures to follow if a medical emergency occurs on campus
Providing mail, shipping and receiving, and inventory control services

Areas Needing Improvement And Actions To Be Taken

 Only 73% of respondents were pleased with new office spaces.

 To address this concern, the Facilities Standards Subcommittee of Facilities Partners will review the adopted standards for offices and furnishings and make recommendations for revisions.

Only 50% of faculty and staff were pleased with the temperatures in classrooms.

The Central Energy Plant began operating in the summer of 2003; when connected to the main body of Central Campus, the back-up capabilities of this facility should minimize the disruption of cooling. The College is also developing a plan for the future pairing of boilers to provide back-up heating for Central Campus buildings. At the suburban campuses, central plant approaches are being used to link buildings, thereby providing back-up capability.

Areas Needing Improvement And Actions To Be Taken

Only 57% of the respondents expressed satisfaction with the cleanliness of bathrooms.

The College entered into a new housekeeping contract as of July 1, 2003, which employs a new vendor and has enforceable cleanliness standards that must be met. Additionally, Facilities Management is addressing the appearance of some of the older restroom facilities on Central Campus as part of a cyclical rehabilitation program.

Future Considerations

 The College will be adapting its long-term capital planning to new guidelines from the County and seeking the use of alternative funding sources.

 Facilities Services will be working with other units of the College to ensure coordinated planning relating to the staffing and specialty services for new facilities.

Facilities Services will be developing new strategies for meeting operational demands in the face of decreasing dollars per square foot in the County budgets.

One Thing We Learned

People have a lot to say when you hit them where they live (e.g., facilities services and information technology)
The open-ended questions produced pages and pages of comments

An Example from Instruction

Workplace Basic Skills
This program is a literacy initiative that goes directly into the worksite and teaches ESL classes, GED prep and GED classes.

During their review, they surveyed both employers and students. This was the first time they had ever done this.

What They Learned

Employers said:
43.8% of employers reported increases in employee performance as a result of participation in the program.
31.3% reported a reduction in absenteeism by participants.
87.5% said classes improved the morale of their employees.
37.5% said participants received raises.
50% said communication had improved.

What Students Said

70.2% reported being able to fill out job forms better
35.5% said they could now help their children with their homework
91.1% said they felt better about themselves
44.4% said they had received a raise, promotion or opportunity as a result of the courses
86.3% said their ability to communicate in the workplace had improved

What Has Happened Since

   Their assessment data has shown up in their marketing brochures to employers.

Their enrollment has grown dramatically.

They have received funding and marketing support from Charlotte Reads (considered a model adult literacy program).

“Best” Results of “Best” Practice

Better use of data across the college
We have become more client/customer focused
Review gives direction for goals and needed changes
Departments are empowered to do their jobs (they can’t slip through the cracks and go unnoticed)
Problems must be resolved (there is no hiding and no excuses)
Surveys provide needs assessment data as well as evaluation data, which gives departments direction

For a copy of this presentation

Contact Terri Manning or Denise Wells:
[email protected]
[email protected]

Download or print presentation: http://inside.cpcc.edu/planning
Click on “studies and reports”
Listed as League Presentation