Designing Assessment and Institutional Research for a New Institution


Lily Hwang, Director, Institutional Research
Juliana Lancaster, Director, Institutional Effectiveness





Georgia Gwinnett College (GGC)
• 4-year state college in the University System of Georgia
• Authorized by the GA Legislature in May 2005
• President hired in September 2005
• Campus opened with 118 students and 10 faculty in August 2006
• Home of the Grizzlies!

Students:
• Fall 2007 enrollment: 787 headcount
• Spring 2008 enrollment: 867 headcount
• Fall 2008 enrollment: 1,563 headcount

Faculty (Fall 2008):
• Instructional full-time faculty: 120
• Instructional part-time faculty: 10
Facilities:
• 6 buildings: A, B, C, D (Student Services Ctr), E (Valentine Bldg), F (Fitness Ctr)
  • Building E not occupied yet
• Total: 474,351 square feet
• Parking deck: 734 cars
• Total acreage: >200


Four Degree Programs:
• BBA, Business
• BS, Biology
• BS, Information Technology
• BS, Psychology




• Commitment at every level to student learning and effectiveness
• Institutional focus on interdisciplinary/integrated education
• Openness to going “outside the box” – provided there is a plan for assessment
• Created the opportunity for a ground-up design of an INSTITUTIONAL assessment plan and of well-integrated institutional research functions

Overview:
• Initial Design
• The First Full Year
• Lessons Learned
• Next Steps


Advantages of starting from scratch:
• Strong executive-level support for and understanding of IE
• Limited number of programs and offices at start-up
• Absence of legacy or standing processes and structures

Disadvantages of starting from scratch:
• Absence of legacy or standing processes and structures
• Each individual brings a different set of assumptions and expectations
• Rapid growth and hiring lead to a continuous need for explanation/education


In order to achieve “…ongoing, integrated, and institution-wide research-based planning and evaluation processes…” [SACS], we needed:
• Structure and resources
• Broad buy-in, consensus, and agreement
• Working “ground rules”:
  • Institution-wide and pervasive
  • Integrated with the institution’s mission & strategic plan
  • Faculty/staff participation and basic control
  • Interdisciplinary and developmental assessment of student learning





• Program-level student learning outcomes and assessment plans
• General Education curriculum designed around student learning outcomes
• Agreement to develop and assess institutional student learning outcomes
• Agreement to integrate curricular and co-curricular student learning efforts

Leading to: Integrated Educational Experience (IEE) Student Learning Outcome Goals for GGC
Conceptual Relationships Among Outcome Goals and Objectives

[Diagram: hierarchy of goals, from Institutional Goals through Administrative Unit Outcome Goals and Integrated Educational Experience SLO Goals (Student Affairs Goals, General Education Goals, Program of Study Goals) down to Course Goals, Student Affairs Activity Goals, and Lesson Objectives.]
Organizational Structure to Manage Resulting Flood of Data

Assessment Steering Committee
  • Integrated review of all assessment results
  • Strategic analysis of results; impact on strategic plans
IEE Assessment Review Committee
  • Communication
  • Integrated review of IEE assessment results
IEE Goal Team
  • Interdisciplinary
  • Operationally define & plan assessment(s)
  • Integrated review of program findings
Related committees and teams: Administrative Review Committee, General Education Committee, General Education Goal Teams, Program Goal Teams

Planning
• All operating units, both academic and administrative, developed assessment plans.
• Academic units focused on course-level, embedded assessments.
• All faculty and numerous staff engaged in discussing and planning assessment.
• Goal teams developed operational definitions of each institution-level student learning outcome (GE and IEE).

Execution
• All units attempted to fully execute their assessment plans
  • Some outcomes were not measurable
  • Some measures called for unobtainable data
• All units were able to collect valid data on at least one outcome
• Most units were able to identify at least one needed action in response to assessment
  • 60% identified needed changes in curriculum or operations
  • 34% identified needed changes in assessment plans

Challenges & Lessons Learned
Implementing program-level assessment plans while still developing the
institutional framework
 Communicating the history of and basis for having both General Education and
IEE student learning outcomes at the institutional level
 Articulating the initial task of the Goal Teams: To operationally define each
Student Learning Outcome
 Managing expectations at multiple levels


Next Steps
• Review the conceptual and actual relationships between the two sets of institution-wide student learning outcomes
• Initiate a campus-wide discussion about whether to make changes and, if so, what those might be
• Continue developing a broad base of informed, skilled individuals across campus to lead assessment efforts
• Continue efforts to establish systematic, manageable assessment at all levels


• Unique Setting/Environment
• Major Tasks
• Major Challenges

Institutional Environment
• Banner-hosted institution: the technical environment is located centrally at the Office of Information & Instructional Technology (OIIT)
• Internal support available for IR: a core data manager (the Banner functional person) and a programmer (IT)

Major Tasks
• Learn the legacy data systems, e.g., the Student Information Reporting System (SIRS), Curriculum Inventory Reporting (CIR), etc.
• Learn the USG reports and their definitions, e.g., the Semester Enrollment Report (SER)
• Learn the new Academic Data Mart (ADM) system
• Produce reports (routine and ad hoc, internal and external); a sketch follows this list
• Produce the College Fact Book
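To make the reporting task concrete, here is a minimal, purely illustrative sketch of a routine headcount report. It assumes a flat CSV enrollment extract with invented column names (term, student_id); in practice the data would come from Banner or the ADM, not a file like this.

```python
# Hypothetical sketch: term headcounts from a flat enrollment extract.
# The file name and column names are invented for illustration; real
# source data would come from Banner or the Academic Data Mart (ADM).
import csv
from collections import defaultdict

def term_headcounts(path):
    """Count distinct students enrolled in each term."""
    students_by_term = defaultdict(set)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # A student may appear once per course registration;
            # collecting IDs in a set keeps the headcount unduplicated.
            students_by_term[row["term"]].add(row["student_id"])
    return {term: len(ids) for term, ids in sorted(students_by_term.items())}

if __name__ == "__main__":
    for term, count in term_headcounts("enrollment_extract.csv").items():
        print(f"{term}: headcount {count}")
```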


New Major Tasks
• Began IPEDS reporting
• Began many other surveys:
  • CUPA Faculty Salary Survey (began earlier)
  • National Postsecondary Student Aid Study (no data to report, due to non-Title IV status at the data point)
  • Consortium for Student Retention Data Exchange (CSRDE) and National Student Clearinghouse, both supported by USG

Major Challenges
• Entering the transition period from the legacy data system to the new ADM system, which allowed only a very brief learning curve
• Learning together with other units, e.g., the Registrar’s Office and Human Resources, which requires close working relationships

Example:
• A collaborative effort to establish a CIP list representing GGC’s teaching disciplines/areas (see the sketch after this list)

Why is this important for GGC?
• GGC does not have departments.
• Structure: School >> Major (program) >> Tracks/Concentrations
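Because external reporting frameworks such as IPEDS assume discipline-level units, the CIP list has to be keyed to majors rather than departments. The sketch below shows one hypothetical shape for such a list; the school assignments are illustrative, and the CIP codes are the standard NCES codes these general programs would typically map to.

```python
# Hypothetical sketch: a CIP list keyed by major rather than department,
# mirroring GGC's School >> Major >> Tracks hierarchy. School assignments
# are illustrative; CIP codes are standard NCES codes for the disciplines.
cip_list = {
    "Business (BBA)": {"school": "Business", "cip": "52.0201"},
    "Biology (BS)": {"school": "Science & Technology", "cip": "26.0101"},
    "Information Technology (BS)": {"school": "Science & Technology", "cip": "11.0103"},
    "Psychology (BS)": {"school": "Liberal Arts", "cip": "42.0101"},
}

# With no departments, discipline-level reporting rolls up from majors:
for major, info in sorted(cip_list.items()):
    print(f"{info['school']} >> {major}: CIP {info['cip']}")
```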

IE and IR
• As does every unit of GGC, IR operates within the college framework that IE facilitates and monitors.
• Specific tasks for IR in support of IE operations:
  • Institutional information requests for accreditation purposes
  • Information support for assessment projects, e.g., NSSE and Course Evaluations
• Anticipated tasks for IE in support of IR:
  • Providing benchmark and assessment data for the Fact Book
  • Collaboration in the design of specific studies
Presenters:
Juliana Lancaster
Director, Institutional Effectiveness
Lily Hwang
Director, Institutional Research