Designing Assessment and Institutional Research for a New College
Stas Preczewski
Vice President, Academic & Student Affairs
Juliana Lancaster
Director, Institutional Effectiveness
Lily Hwang
Director, Institutional Research
4-year state college in the University System of Georgia (the first in over 100 years)
Authorized by GA Legislature in May 2005
President hired in September 2005
Leadership team assembled during Spring 2006
Charge from University System of Georgia
Desires of Gwinnett County
Initial Environmental Scan
Background research on student engagement &
learning
Continuous review, assessment, change &
experimentation
Holistic student focus
Partnerships with local constituents
Competent, action-oriented, innovative
faculty/staff
Innovative and appropriate use of technology
Global/Multicultural environment and focus
Supportive and collegial work environment
Relatively flat organizational structure
Academic & Student Affairs combined
Deliberate integration of personnel across areas
Frequent, focused discussion among decision-makers
Careful hiring of faculty/staff who “fit”
Leadership efforts to model collegiality, etc.
Institutional focus on interdisciplinary/
integrated education
Commitment at every level to student learning
and effectiveness
Openness to going “outside the box” –
provided there is a plan for assessment
Students:
  Semester       Headcount      FTE
  Fall 2007            788      696
  Spring 2008          867      753
  Fall 2008          1,563    1,374
  Spring 2009        1,608    1,401
Enrolled students from Gwinnett County at time of matriculation: 72.7%
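To make the headcount/FTE distinction above concrete, here is a minimal sketch in Python. It assumes the common convention of dividing total undergraduate credit hours by 15; the divisor and the sample credit-hour total are illustrative assumptions, not figures from the college.

# Minimal sketch: deriving full-time-equivalent (FTE) enrollment for one term.
# Assumption: undergraduate FTE = total credit hours / 15 (a common convention);
# the sample credit-hour total below is illustrative only.

def fte(total_credit_hours: float, divisor: float = 15.0) -> float:
    """Return FTE enrollment given the term's total credit hours."""
    return total_credit_hours / divisor

# Example: a term in which enrolled students carry 20,100 credit hours in total.
print(round(fte(20_100), 1))  # 1340.0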
Faculty (Fall 2008):
Instructional full-time faculty: 105
Instructional part-time faculty: 10
Facilities:
Total acreage: >200
5 Occupied Buildings: A, B, C, D (Student Services Ctr), F (Fitness Ctr)
Library, Student Housing under construction
Student Center groundbreaking in 2009
Parking Deck: 734 cars
Current Degree Programs
BBA, Business; BS, Biology; BS, Information Technology; BS,
Psychology
Planned Future Programs (pending accreditation, system
approval, and substantive change approval)
Education: Early Childhood, Special Education, Secondary
Nursing & Allied Health areas
Spectrum of Liberal Arts & Sciences
Accreditation Status
Initial Application to SACS-COC in Fall 2007
Hosted Candidacy Committee in Spring 2008
Admitted to Candidacy in June 2008
Hosted Accreditation Committee in Spring 2009
Awaiting action by Commission in June 2009
Initial Design
The First & Second Years
Lessons Learned
Next Steps
Advantages of starting from scratch
Strong executive level support for and understanding of IE
Limited number of programs and offices at start-up
Absence of legacy or standing processes and structures
Disadvantages to starting from scratch
Absence of legacy or standing processes and structures
Each individual brings a different set of assumptions and
expectations
Rapid growth and hiring lead to a continuous need for explanation/education
To achieve the "…ongoing, integrated, and institution-wide research-based planning and evaluation processes…" that SACS requires, we needed:
Structure and resources
Broad buy-in, consensus and agreement
Working “ground rules”
Institution-wide and pervasive
Integrated with institution’s mission & strategic plan
Faculty/staff participation and basic control
Interdisciplinary and developmental assessment of student
learning
Program level student learning outcomes and assessment plans
General Education curriculum designed around student learning
outcomes
Agreement to develop and assess for institutional student learning
outcomes
Agreement to integrate curricular and co-curricular student
learning efforts
Leading to: Integrated Educational Experience (IEE) Student
Learning Outcome Goals for GGC
Conceptual Relationships Among Outcome Goals and Objectives
Institutional Goals
Administrative Unit Outcome Goals
Integrated Educational Experience SLO Goals
Student Affairs Goals
General Education Goals
Program of Study Goals
Course Goals
Student Affairs Activity Goals
Lesson Objectives
Organizational Structure to Manage Resulting Flood of Data
Assessment Steering Committee
•Integrated review of all assessment results
•Strategic analysis of results; impact on strategic plans
IEE Assessment Review Committee
•Communication
•Integrated review of IEE assessment results
IEE Goal Team
•Interdisciplinary
•Operationally define & plan assessment(s)
•Integrated review of program findings
Administrative Review Committee
General Education Committee
General Education Goal Teams
Program Goal Teams
Planning
All operating units, both academic and administrative, developed assessment plans.
Academic units focused on course-level, embedded
assessments.
All faculty and numerous staff engaged in discussing and
planning assessment.
Goal teams developed operational definitions of each
institution-level student learning outcome (GE and IEE)
Execution
All units attempted to fully execute their assessment plans
Some outcomes were not measurable
Some measures called for unobtainable data
All units were able to collect valid data on at least one outcome
Most units were able to identify at least one needed action in
response to assessment
60% identified needed changes in curriculum or operations
34% identified needed changes in assessment plans
Planning
Academic and administrative assessment plans improved.
Academic units continued course-level, embedded assessments
and began identifying critical program-level assessment points.
All faculty and numerous staff engaged in discussing and
planning assessment.
Goal teams completed operational definitions of each
institution-level student learning outcome (GE and IEE)
Execution
All units executed their assessment plans
All units were able to collect valid data on each outcome
Most units were able to identify at least one needed action in
response to assessment
Challenges & Lessons Learned
Implementing program-level assessment plans while still developing
the institutional framework
Communicating the history of and basis for having both General
Education and IEE student learning outcomes at the institutional
level
Articulating the initial task of the Goal Teams: To operationally
define each Student Learning Outcome
Managing expectations at multiple levels
Next Steps
Review the conceptual and actual relationships between the
two sets of institution-wide student learning outcomes
Initiate a broad-based process to determine what, if any,
changes are needed
Continue developing a broad base of informed, skilled
individuals across campus to lead assessment efforts.
Continue efforts to establish systematic, manageable
assessment at all levels
Unique Setting/Environment
Major Tasks
Major Challenges
IE and IR
Plans
Institutional Environment
Banner-hosted institution; the technical environment is located centrally at the Office of Information & Instructional Technology (OIIT).
Internal support available for IR: a core data manager (the Banner functional position, currently vacant) and a programmer (IT).
Major Tasks
Learning the legacy data systems, e.g., the Student Information Reporting System (SIRS) and Curriculum Inventory Reporting (CIR).
Learning USG reports and their state definitions, e.g., the Semester Enrollment Report (SER).
Learning the new Academic Data Mart (ADM) system.
Producing reports (required, routine, and ad hoc, both internal and external); see the sketch after this list.
State reports, IPEDS, common surveys (e.g., CUPA)
Institutional information support for accreditation purposes
College Factbook (currently in its second edition).
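As a purely illustrative example of the routine report production listed above, here is a minimal Python sketch. It assumes a one-term enrollment extract is already available as a CSV with hypothetical columns student_id and level; this is not the actual SIRS/ADM/Banner schema.

# Minimal sketch of a routine enrollment report pull.
# Assumption: a term-level extract exists as a CSV with hypothetical columns
# 'student_id' and 'level'; not the actual SIRS/ADM/Banner schema.
import csv
from collections import Counter

def headcount_by_level(extract_path: str) -> Counter:
    """Count distinct students per academic level in a one-term extract."""
    seen = set()
    counts = Counter()
    with open(extract_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["student_id"] not in seen:
                seen.add(row["student_id"])
                counts[row["level"]] += 1
    return counts

# Example usage (hypothetical file name):
# print(headcount_by_level("fall_term_extract.csv"))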
Major Challenges
Entering the transitional period from the legacy data system to the new ADM system, which allows only a very brief learning curve.
Learning together with other units, e.g., the Registrar's Office and Human Resources (e.g., the transition from PeopleSoft to ADP), which requires close relationships.
IE and IR
IR operates within the college framework that IE
facilitates and monitors.
Specific tasks for IR in support of IE operations:
Information generated for assessment projects, e.g., NSSE
and Course Evaluations
Anticipated tasks for IE in support of IR
Collaboration in design of specific studies
Plans
Identifying and developing a research agenda (for major studies) in support of institutional decisions on growth
Team (committee) required
e.g., environmental scanning
Continuous support for Enrollment Management
Identifying report items to be routinely supplied, e.g., retention/graduation analysis, analysis of fall enrollment, and benchmarking analyses; see the sketch after this list.
Planning for Program Review.
Team (committee) required
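As a minimal sketch of the kind of retention analysis named above: the share of a fall cohort still enrolled the following fall. The cohort definition and the sample IDs are illustrative assumptions, not institutional data.

# Minimal sketch of a first-fall-to-second-fall retention calculation.
# Assumption: the cohort and next-fall enrollment are available as sets of
# student IDs; the sample IDs below are illustrative only.

def retention_rate(cohort_ids: set, next_fall_ids: set) -> float:
    """Share of a fall cohort still enrolled the following fall."""
    if not cohort_ids:
        return 0.0
    return len(cohort_ids & next_fall_ids) / len(cohort_ids)

# Illustrative example only: 3 of 5 cohort members return the next fall.
fall_cohort = {"S01", "S02", "S03", "S04", "S05"}
next_fall_enrolled = {"S01", "S03", "S04", "S99"}
print(f"{retention_rate(fall_cohort, next_fall_enrolled):.0%}")  # 60%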
Presenters:
Stanley Preczewski
Vice President, Academic & Student Affairs
[email protected]
Juliana Lancaster, Director, Institutional Effectiveness
[email protected]
Lily Hwang, Director, Institutional Research
[email protected]
AIR Forum 2009, Atlanta, GA
Session 682
June 2, 2009