Creating an Assessment Bridge


The Academic Program Review

Bridging Standards 7 and 14
Middle States Annual Conference, December 10, 2010

Presenters

Mr. H. Leon Hill, Director of Institutional Research
Dr. Joan E. Brookshire, Associate Vice President of Academic Affairs

Overview

• Framework to Address the APRs
• Structure/Challenges/Approach
• Examples of Metrics
• Current Action Plan
• Integration of “End User” Technology
• Next Steps
• Benefits of Our Approach
• Questions

Assessment Cycle-2005

Plan to meet → Meet to plan → Meet to plan → Plan to meet → Report out on planning

What we had to build on

• Strong focus on programs.

• State-mandated 5-year academic program review in need of revision.

• Institutional Effectiveness Model (IEM) with performance indicators benchmarked against state and national databases.

Mission → Strategic Initiative: Access & Success → Institutional Effectiveness

IEM

• Needed a way to assess how the College was performing on key metrics in relation to prior years/semesters and compared with other institutions.

• Historical/trend data
• Benchmark data: Pennsylvania and national peers
• Institutional Effectiveness Model

Where we started

• Restructured the Academic Program Review process
• Incorporated the use of technology

Goal of the restructuring

• Measure student performance as evidenced by the results of assessment of student learning outcomes.

• Measure program performance as evidenced by comparison of program performance to overall college performance on specific key indicators (current and aspirational).

Challenges

• Usual issues with assessment in general.

• Faculty had little knowledge of the College’s performance indicators.

• Organizational separation of assessment of institutional and student learning outcomes.

Approach

We began by building backwards from the IEM, mapping specific core indicators to program data and making additions where needed.

Examples of Metrics Used for APR

College's Graduation Rate by Cohort:
Fall 2002: 12.87%   Fall 2003: 13.26%   Fall 2004: 12.70%   Fall 2005: 14.14%   Fall 2006: 14.10%

TARGETS (Graduation Rate): Caution <19% | Acceptable 19%-23% | Aspirational >23%

College's Transfer Rate by Cohort:
Fall 2002: 27.24%   Fall 2003: 32.27%   Fall 2004: 30.37%   Fall 2005: 26.40%   Fall 2006: 26.80%

TARGETS (Transfer Rate): Caution <29% | Acceptable 29%-32% | Aspirational >32%
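The target bands used for the graduation and transfer rates can be sketched as a small helper function. The thresholds come from the slides; the function name and structure are illustrative assumptions, not part of the College's actual tooling.

```python
def target_band(rate_pct, caution_below, aspirational_above):
    """Classify a percentage rate into the APR target bands.

    Matches the slide notation, e.g. for the graduation rate:
    Caution <19%, Acceptable 19%-23%, Aspirational >23%.
    """
    if rate_pct < caution_below:
        return "Caution"
    if rate_pct > aspirational_above:
        return "Aspirational"
    return "Acceptable"

# Graduation rate, Fall 2002 cohort (12.87%): below the 19% floor
print(target_band(12.87, 19, 23))   # Caution
# Transfer rate, Fall 2003 cohort (32.27%): above the 32% ceiling
print(target_band(32.27, 29, 32))   # Aspirational
```

The same helper covers both metrics because only the two thresholds differ between them.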

Definitions of Success & Retention

Success = grades of (A, B, C, P) / (A, B, C, D, F, P, W)
Retention = grades of (A, B, C, D, F, P) / (A, B, C, D, F, P, W)
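As a sketch, these definitions translate directly into code. The grade lists follow the slide; retention is computed here as the share of graded enrollments that did not end in a withdrawal, which is consistent with the roughly 90% retention figures reported for the core courses.

```python
from collections import Counter

SUCCESS_GRADES = {"A", "B", "C", "P"}
ALL_GRADES = {"A", "B", "C", "D", "F", "P", "W"}

def course_rates(grades):
    """Return (success_rate, retention_rate) for a list of final grades."""
    counts = Counter(g for g in grades if g in ALL_GRADES)
    total = sum(counts.values())
    success = sum(counts[g] for g in SUCCESS_GRADES) / total
    retention = (total - counts["W"]) / total  # share who did not withdraw
    return success, retention

# Hypothetical ten-student section: six A/B/C/P, three D/F, one W
success, retention = course_rates(list("ABCCPADFFW"))
print(f"success {success:.0%}, retention {retention:.0%}")  # success 60%, retention 90%
```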

Retention Rates in Core Courses

Course            Fall 2005  Fall 2006  Fall 2007  Fall 2008  Fall 2009
Comp I               90.3%      89.9%      90.9%      91.6%      91.7%
Comp II              85.6%      85.4%      86.7%      86.9%      87.5%
College Algebra      72.4%      76.0%      74.5%      81.9%      79.0%
Speech               93.1%      90.5%      90.5%      90.3%      89.6%

Success Rates in Core Courses

Course            Fall 2005  Fall 2006  Fall 2007  Fall 2008  Fall 2009
Comp I               76.7%      74.6%      76.9%      76.9%      75.6%
Comp II              74.1%      80.3%      77.1%      79.3%      78.8%
College Algebra      61.2%      61.7%      62.9%      61.8%      66.2%
Speech               88.0%      83.3%      83.1%      86.3%      84.3%

Added a curricular analysis

• How well program goals support the college’s mission.

• How well individual course outcomes reinforce program outcomes.

• How well instruction aligns with the learning outcomes.

• Specific assessment results.

• Changes made based on the assessment findings.
• Evidence of closing the loop.
• Changes made to the assessment plan.

Action Plan

• Outcomes expected as a result of appropriate action steps.

• Timelines and persons responsible for each action step.

• Resources needed with specific budget requests.

• Evaluation plan with expected benefits.

Bottom Line

• Is there sufficient evidence that the program learning outcomes are being met?
• Is there sufficient evidence that the program is aligned with the college on specific key indicators?

The Framework

STANDARDS 1-7 Focus on Institutional Performance

Standard 7: Assessment of Institutional Effectiveness. How well are we collectively doing what we say we are doing, with a specific focus on supporting student learning? Assessment must be included in Standards 1-6.

STANDARDS 8-14 Focus on Student Performance Standard 14: Assessment of Student Learning

Do we have clearly articulated learning goals, offer appropriate learning activities, assess student achievement of those learning outcomes, and use the results of assessment to improve teaching and learning and to inform budgeting and planning? Assessment also appears in Standards 8-13.

Strategic Analysis

Institutional data: linked to the IEM through the use of common data sets

Curriculum Analysis

• Course assessment
• Program assessment
• Core assessment

Planning and Budgeting (Standard 2)

• APR Action Plan
• APR Annual Report
• Annual Academic Planning

Assessment Results

• Curriculum Committee
• President’s Office
• Board of Trustees (BOT)

Addition of Technology

• Worked in concert with Information Technology to integrate iStrategy with ERP (Datatel).

• This implementation let end users obtain the data needed for program assessment without going through an intermediary (IR and/or IT).

Next Steps in the Evolution of College and Program Outcomes

Example of APR Report Card

Full-time and Part-time Persistence (Fall to Fall)

Cohort   Full-time Program   Full-time College   Part-time Program   Part-time College
2005                   75%              64.39%                 25%              33.52%
2006                   50%              63.16%              43.48%              37.09%
2007                   70%                 50%              33.33%              34.41%

Persistence: these data are based on a cohort of first-time students from a specific semester and follow their enrollment patterns one year out.

Persistence

Full-time and Part-time Persistence (Fall to Spring)

Cohort   Full-time Program   Full-time College   Part-time Program   Part-time College
2006                   70%              78.19%              69.57%              45.12%
2007                   80%              80.98%              45.83%              41.60%
2008                76.92%              80.93%              46.15%              39.00%

Persistence: these data are based on a cohort of first-time students from a specific semester and follow their enrollment patterns from the Fall semester to the following Spring semester.
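A persistence rate of this kind reduces to a set intersection over student IDs: how many members of a first-time cohort appear in a later term's enrollment. The sketch below uses hypothetical IDs, not data from the report card above.

```python
def persistence_rate(cohort_ids, later_term_ids):
    """Share of a first-time cohort still enrolled in a later term."""
    cohort = set(cohort_ids)
    if not cohort:
        raise ValueError("empty cohort")
    return len(cohort & set(later_term_ids)) / len(cohort)

# Hypothetical four-student Fall cohort; three return the following Spring
fall_cohort = ["s001", "s002", "s003", "s004"]
spring_enrolled = ["s002", "s003", "s004", "s099"]
print(f"{persistence_rate(fall_cohort, spring_enrolled):.0%}")  # 75%
```

Fall-to-fall persistence is the same calculation with the following fall's enrollment list substituted for the spring list.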

Degrees Conferred

                         2006   2007   2008   Three-Year % Change
Program                     9      9      8              -11.11%
College degrees earned   1058   1046   1127                6.52%

Degrees Conferred: these are the actual number of degrees conferred, not pass rates on licensure exams.
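The three-year percent change column is simply the change from the first to the last year of the series; a quick check reproduces both figures in the table.

```python
def three_year_pct_change(first, last):
    """Percent change from the first to the last value of a series."""
    return (last - first) / first * 100

print(round(three_year_pct_change(9, 8), 2))        # Program degrees: -11.11
print(round(three_year_pct_change(1058, 1127), 2))  # College degrees: 6.52
```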

Graduation and Transfer Rate: Cohort Entering 2005

                                   Full-time              Part-time
                                Program   College     Program   College
Total Students                       14      1524           9       990
Degrees                               1       192           0        26
Graduation Rate (Within 3 Years)  7.14%   12.60%*          0%     2.62%
Transfer                              5       402           4       232
Transfer Rate (Within 3 Years)   35.71%    26.38%      44.44%    23.43%

First-time students to the College, no transfer credits. *Acceptable: 19-23%.

Fiscal

Evidence of High Priority and Employment Predictions

• Examples of Course Success

Success in ACC 111

Total College Success: ACC 111

Term      % Success   # Success   % Non-Success   # Non-Success
2003/FA       61.4%         329           38.6%             207
2004/FA       57.1%         276           42.9%             207
2005/FA       55.4%         253           44.6%             204
2006/FA       55.3%         281           44.7%             227
2007/FA       51.4%         261           48.6%             247
2008/FA       44.2%         244           55.8%             308
2009/FA       48.3%         249           51.7%             267

Success in ACC 111

ACC 111: Success by Gender

Term      % Female Success   # Female Success   % Male Success   # Male Success
2003/FA              63.3%                145            59.8%              180
2004/FA              57.5%                111            56.8%              163
2005/FA              58.8%                104            53.2%              149
2006/FA              57.7%                123            53.6%              158
2007/FA              57.3%                114            47.2%              145
2008/FA              51.8%                115            39.1%              127
2009/FA              58.7%                105            42.1%              133

Success in Math 010

Total College Success: Math 010

Term      % Success   # Success   % Non-Success   # Non-Success
2003/FA       53.6%         310           46.4%             268
2004/FA       46.3%         266           53.7%             309
2005/FA       47.3%         276           52.7%             307
2006/FA       45.7%         293           54.3%             348
2007/FA       44.8%         297           55.2%             366
2008/FA       43.3%         288           56.7%             377
2009/FA       47.4%         344           52.6%             381

Success in Math 010

Math 010: Success by Race

Term      % African American Success   # Success   % Caucasian Success   # Success
2003/FA                        42.6%          43                 58.2%         202
2004/FA                        37.7%          46                 51.8%         184
2005/FA                        38.5%          40                 50.3%         180
2006/FA                        25.8%          33                 52.7%         217
2007/FA                        26.9%          45                 53.5%         206
2008/FA                        29.9%          56                 48.4%         180
2009/FA                        34.7%          51                 52.0%         141

Benefits

• Built a bridge between Standards 7 and 14.

• Better data.

• By putting data in the hands of faculty, we engage them actively in using data for decisions and planning.

• IR time better used.

• Annual planning cycle developed.

• Built a culture of assessment in several of the academic divisions.

• Curricular changes that align with the graduation initiative.

• Curricular and program improvement.

• Created a college-wide model for improvement of student learning.

Evolution of the Dashboard

• Creation of a Student Success Dashboard with the following metrics:
  - Course-level success and retention (developmental and college-level)
  - Persistence (fall to spring and fall to fall)
  - Progression of various cohorts of students
  - College-level success in Math or English after developmental Math or English
  - Graduation
  - Transfer

Graphic Representation for the SSD

Final Thoughts

• It’s not perfect, but it works for us.
• Do the research on which tools are appropriate for your college.
• Assessment of the core curriculum
• Launching of assessment software
• It all starts with asking the right question.
• PRR 2010

Questions

Presenters

Mr. H. Leon Hill, Director of Institutional Research ([email protected])
Dr. Joan E. Brookshire, Associate Vice President of Academic Affairs ([email protected])