Data: What Do We Mean and What Kind Do We Need?

Facilitator: Chris Hill, Grossmont College
Presenters: Elaine Kuo, Foothill College
Mallory Newell, De Anza College
State of Accreditation Today
American Council on Education recommendations on accreditation:
1. Transparency – accreditation documents; student outcomes
2. Metrics and measures – improvement of data; common, comparative measures
3. Defined pathways to student success – increased focus on matriculation and student support; incentives for completion
4. New pedagogies and new entities – MOOCs; assure comparable quality of online learning
5. Adaptive accreditation reviews – flexibility for expedited reviews of sound institutions
6. Enhanced gatekeeping – more pressure on accreditors and institutions to maintain quality between site visits
From ACCJC News, Fall 2012
State of Accreditation Today
From June 2011 to June 2012, the ACCJC took the following institutional actions:
1. 28 institutions had a comprehensive evaluation
2. 12 institutions were reaffirmed
3. 10 institutions were placed on warning
4. 5 institutions were placed on probation
5. 1 institution was placed on show cause
From ACCJC Report of Institutional Actions from the June 2011, January 2012, and June 2012 Commission Meetings
WHAT BRINGS YOU HERE?
Overview of session
 What does ACCJC say about data and evidence?
 What types of data/evidence are needed?
 Where and how are data discussed?
 How are data used for continuous improvement?
WHAT DOES ACCJC SAY ABOUT DATA AND EVIDENCE?
Characteristics of Evidence
Good evidence used in evaluations has the following characteristics:
1. It is intentional, and a dialogue about its meaning and relevance
has taken place.
2. It is purposeful, designed to answer questions the institution has
raised.
3. It has been interpreted and reflected upon, not just offered up in
its raw or unanalyzed form.
4. It is integrated and presented in a context of other information
about the institution that creates a holistic view of the institution or
program.
5. It is cumulative and is corroborated by multiple sources of
data.
6. It is coherent and sound enough to provide guidance for
improvement.
Source: ACCJC/WASC Guide to Evaluating Institutions, 2009
Data 101 - Principles
1. Use longitudinal data when possible
2. Use data in context
3. Look for both direct and indirect data
4. Do not oversimplify cause and effect of data
5. Use appropriate levels of data for appropriate levels of decisions
6. Perception is the reality within which people operate
7. Use of data should be transparent
8. Consider carefully when to aggregate or disaggregate data
9. Focus on data that is actionable
10. Consider implications and the “What if?”
From Data 101: Guiding Principles for Faculty (ASCCC, 2010)
WHAT TYPES OF DATA ARE NEEDED?
Data Sources

 External
◦ Accountability Reporting for California Community Colleges (ARCC)
◦ Community College Survey of Student Engagement (CCSSE)
◦ CSU Analytics/UC Statfinder
◦ Department of Education – Data Quest
◦ Career Technical Education Outcomes Survey
◦ CCCCO DataMart
◦ NCES (National Center for Education Statistics)
◦ CPEC (California Postsecondary Education Commission)
◦ CalPASS

 Internal
◦ Annual Master Plan Updates
◦ Degree and Transfer Data
◦ Annual Factsheets by Program
◦ Annual Program Review Data
◦ Student Learning and Service Outcomes Assessments
◦ Key Performance Indicators (KPIs)
◦ Environmental Scans
◦ Surveys
◦ Ad hoc research studies
Information Capacity Challenges

 Building an Evidence-based Infrastructure
◦ Thinking about everything you do as a research study
◦ Working with your researcher and department to collect data
◦ Sharing the data with your department and/or campus
◦ Setting aside a time each year to discuss the data in your area

 Keeping Up with the Demand
◦ Linking research to (resource) planning
◦ Making changes to your program based on the data

 Turning Data into Action
◦ Ask questions, share it, refine it, ask for more
◦ Keep track of all the great things you are doing with the data
◦ Share with others that may be data weary

From Fulks, Hasson, and Mahon – 2010 AI presentation
Helpful Hints

 Disaggregate the data by ethnicity!
 Collect data and evidence on an annual basis to inform your accreditation cycle (you don’t have to collect it all in year 6).
 Monitor the college’s progress on completing its planning agendas on an annual basis.
 Assess the college’s institutional-level outcomes on an annual basis to stay on track within your cycle, and don’t forget about your AUOs as well.
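The first hint, disaggregating outcomes by ethnicity, takes only a few lines of code once grade records are pulled from a student information system. The sketch below is a minimal illustration, not FHDA's actual process: the records, group labels, and function name are all hypothetical, and it uses the standard California community college convention that grades of A, B, C, P, and CR count as successful completion.

```python
from collections import defaultdict

# Hypothetical (ethnicity, grade) records, as might be extracted
# from a student information system.
records = [
    ("Latino/a", "A"), ("Latino/a", "W"), ("White", "B"),
    ("White", "C"), ("African American", "P"), ("African American", "D"),
    ("Asian", "A"), ("Asian", "CR"),
]

# Grades that count as successful course completion.
SUCCESS_GRADES = {"A", "B", "C", "P", "CR"}

def success_rates_by_group(records):
    """Return {group: (successes, total, rate)} for each group."""
    tally = defaultdict(lambda: [0, 0])  # group -> [successes, total]
    for group, grade in records:
        tally[group][1] += 1
        if grade in SUCCESS_GRADES:
            tally[group][0] += 1
    return {g: (s, n, s / n) for g, (s, n) in tally.items()}

for group, (s, n, rate) in sorted(success_rates_by_group(records).items()):
    print(f"{group}: {s}/{n} = {rate:.0%}")
```

Even a toy run like this makes equity gaps visible immediately, which is the point of the hint: the disaggregated rates, not the college-wide average, are what drive targeted interventions.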
Examples of evidence related to student learning

Information on Student Achievement (i.e., student progress through the institution)
Is there evidence that the college has the capacity to collect, does collect, and uses in its own evaluation and planning processes, data on student achievement? Is there evidence that the college does so regularly? Are these data disaggregated by subpopulations where appropriate? This includes data on:
◦ Student demographics
◦ Student preparedness for college, including performance on placement tests and/or placement
◦ Student needs (i.e., local employment training needs, transfer education needs, basic skills needs, etc.)
◦ Course completion data
◦ Data on student progression to the next course/next level of course
◦ Data on student program (major) completion
◦ Data on student graduation rates
◦ Data on student transfer to a four-year institution
◦ Data on student job placements
◦ Data on licensure exams (scores, pass rates)
Examples of evidence related to student learning (cont’d)

Information on Student Learning Outcomes (i.e., student mastery of the knowledge, skills, abilities, competencies, etc., identified by those designing the educational experience of the college)
Is there evidence that student learning outcomes are defined?
◦ By course
◦ By program
◦ By degree (including General Education requirements)
Is there evidence there was dialogue about the SLOs?
◦ Prior to development
◦ As part of developing integrated educational services and courses/programs
◦ As part of institutional self evaluation, planning, and improvement
◦ At the appropriate level of inclusion for the SLOs for courses, programs, and degrees (is it evident that SLOs are “tracked” from courses, through programs, and to certificates and degrees?)
◦ In terms of how institutional processes can be oriented to better support learning
Examples of evidence related to student learning (cont’d)

Information on Student Learning Outcomes (cont’d)
Is there evidence the SLOs are measured and the measurements are analyzed in order to:
◦ Inform pedagogy and help improve the educational services
◦ Evaluate institutional effectiveness and plan institutional improvements
Examples include:
◦ The rubrics created to describe SLOs and related measurement strategies
◦ The ways in which specific pedagogical practices are changed in response to analyses of SLO attainment
◦ Analyses of SLO attainment used in the Program/Unit Review process to improve student learning, programs, and services
Is there evidence that students are learning?
◦ Samples of student work
◦ Copies of summary data on measured student learning outcomes
Course Level Data:
Math Performance Success – Course Success Rates by Ethnicity
2001-02 to 2011-12

| Course | Section Type | Pass | Did Not Pass | Withdrew | Total Grades |
|---|---|---|---|---|---|
| MATHD210. Pre-Algebra | MPS (face-to-face) | 176 (69%) | 54 (21%) | 25 (10%) | 255 (100%) |
| | Other face-to-face | 7,916 (57%) | 3,471 (25%) | 2,608 (19%) | 13,995 (100%) |
| MATHD212. Beginning Algebra | MPS (face-to-face) | 869 (76%) | 155 (14%) | 124 (11%) | 1,148 (100%) |
| | Other face-to-face | 13,621 (54%) | 6,140 (24%) | 5,352 (21%) | 25,113 (100%) |
| MATHD114. Intermediate Algebra | MPS (face-to-face) | 1,073 (80%) | 163 (12%) | 98 (7%) | 1,334 (100%) |
| | Other face-to-face | 18,136 (60%) | 6,614 (22%) | 5,695 (19%) | 30,445 (100%) |
| | Other distance | 405 (40%) | 308 (30%) | 299 (30%) | 1,012 (100%) |
| MATHD010. Elementary Statistics | MPS (face-to-face) | 982 (88%) | 86 (8%) | 53 (5%) | 1,121 (100%) |
| | Other face-to-face | 19,115 (65%) | 4,443 (15%) | 5,920 (20%) | 29,478 (100%) |
| | Other distance | 1,759 (48%) | 855 (24%) | 1,020 (28%) | 3,634 (100%) |

From FHDA IR&P
Institution Level Data:
Online Course Success Rates by Ethnicity

[Bar chart: Foothill 2011-12 course success rates by instructional method and ethnicity, comparing online with not-online success rates for African American, Asian, Filipino/PI, Latino/a, Native American, White, and Decline/Unknown students]

Foothill total enrollment=141,538; African American online=2,972; African American not online=4,868; Asian online=10,970; Asian not online=29,902; Filipino/PI online=2,297; Filipino not online=5,831; Latino/a online=6,054; Latino/a not online=18,737; Native American online=358; Native American not online=879; White online=13,457; White not online=33,956; Decline to state/Unknown online=2,324; Decline to state/Unknown not online=8,942. Source: FHDA IR&P
Survey Based Data – CCSSE: How many hours do you spend in a 7-day week preparing for class?
Full-time students who spend over 21 hours per week preparing for classes (studying, homework, rehearsing, reading, writing):

| | De Anza | Foothill | 2012 CCSSE Cohort |
|---|---|---|---|
| Developmental Students | 15% | 17% | 23% |
| Non-Developmental Students | 15% | 14% | 23% |

Developmental: students who have taken or plan to take a developmental course. Developmental – DA=619, FH=290; Non-Developmental – DA=306, FH=302
Survey Based Data – CCSSE: How often do you use academic advising/planning?

| Ethnicity | De Anza Respondents | De Anza Percent | Foothill Respondents | Foothill Percent |
|---|---|---|---|---|
| African American | 14 | 56% | 19 | 58% |
| Asian/PI | 387 | 58% | 199 | 60% |
| Latino/a | 112 | 52% | 92 | 60% |
| White | 129 | 53% | 136 | 48% |
| Other | 78 | 56% | 46 | 45% |

Includes students who selected often or sometimes. Other includes Native American, Other, Decline to state. Total respondents DA=1,286; FH=904. DA: African American=25, Asian/PI=663, Latino=215, White=224, Other=139; FH: African American=33, Asian/PI=329, Latino=154, White=286, Other=102.
From FHDA IR&P, CCSSE 2012
Program level data: Labor Projections and Program Completions
Radiologic Technology

Radiologic Technology Occupation Performance
| Target Occupation | Regional Openings (2011) | Median Hourly Earnings | Growth (2012-2015) | Job Postings (as of 10/29/12) |
|---|---|---|---|---|
| Radiologic Technologists and Technicians | 106 | $41.17 | 5.8% | 0 |

Regional Radiologic Technology Training Providers: 3 institutions, 110 completions (2010-2011)*
*Based on IPEDS data.

| Institution | Degrees | Certificates | Total Completions |
|---|---|---|---|
| Institute of Medical Education | 0 | 66 | 66 |
| Foothill College | 25 | 0 | 25 |
| Cañada College | 19 | 0 | 19 |

From EMSI, 2012
Service Area Level Data:
Vocational Training Projections
1/3 of jobs will require training involving up to one year’s worth of experience, training, and/or instruction.
From Tim Nadreau, EMSI
HOW ARE DATA AND ASSESSMENT RESULTS COMMUNICATED, DISCUSSED, AND USED FOR CONTINUOUS IMPROVEMENT?
Institutional Data Fills Two Important Gaps

Standard Level of Organizational Awareness (Poor Planning)
 ↓ Knowledge Gap – addressed by the Strategic Planning Function (Institutional Outcomes and Benchmarks)
Elevated Level of Organizational Awareness (Good Planning)
 ↓ Performance Gap – addressed by the Institutional Effectiveness & Student Success Function (Classroom and Service Area Assessment)
Improved Processes & Performance (Good Performance)

From Fulks, Hasson, and Mahon – 2010 AI presentation
Review of data and evidence is most meaningful when it informs decision making at the proper place of practice

1,000 ft Perspective – Institutional Strategies
• Resource Allocation
• Institutional Policies
• System Structures

100 ft Perspective – Program Improvements
• Program Alignment
• Program Redesign
• Program Curriculum

On the Ground – Classroom Innovation
• Pedagogy
• Course Redesign
• Innovations in Learning

From Gregory Stoup, Pinpointing Areas to Improve (2012)
Where are data communicated and discussed?

Department
• Professional Development Activities
• Department Meetings
• Program Review
• Student Learning Outcomes Assessment
• Annual Planning Documents

College and/or District
• Committee/Council meetings
• Institutional Planning Documents
• Email/Newsletters
• Factsheets/Environmental Scans
• Institutional Website
Cañada College Student Performance and Equity Dashboard
Developed and maintained by the Office of Planning, Research and Student Success
From Gregory Stoup, Pinpointing Areas to Improve (2012)
DETAILED TABLES
1. Successful Course Completion Rates (p. 8)
2. Fall-to-Spring Persistence (p. 10)
3. Fall-to-Fall Persistence (p. 12)
4. Student Success Rates during their first year (p. 14)
5. Success Rates in Gen Ed Courses (p. 16)
6. Success Rates in CTE Courses (p. 18)
7. Success Rates in Pre-Transfer Courses (p. 20)
8. Success Rates in ESL Courses (p. 22)
9. Six Year Degree Completion Rates (p. 24)
10. Six Year Certificate Completion Rates (p. 26)
11. Median Number of Years to Degree (p. 28)
12. Average # of Credits Accumulated after 1 Year (p. 30)
13. Average # of Credits Accumulated after 2 Years (p. 32)
14. Pct Placed into BS Math & taking BS math in first term (p. 34)
15. Pct Placed into BS Math & taking BS math in first term (p. 36)
16. Pct Placed into BS Math & taking BS math in first term (p. 38)
From Gregory Stoup, Pinpointing Areas to Improve (2012)
Five Year Trend in Successful Course Completion Rates
(Cañada College 2007/08 – 2011/12)

Why this matters: Course completion is perhaps the most widely used and reported indicator of student academic achievement. Higher levels of course completion are associated with higher levels of degree and certificate completion. This report highlights that for Cañada students, course completion rates vary widely by both student ethnicity and student age.

| 2007/08 | 2008/09 | 2009/10 | 2010/11 | 2011/12 |
|---|---|---|---|---|
| 71.8% | 71.2% | 69.9% | 68.8% | 69.8% |

Course Completion Rate: also referred to as the college-wide course pass rate and the college success rate, this is an aggregation of student course-taking performance. The success rate is the percentage of grades awarded that indicate successful course completion, namely a grade of A, B, C, P, or CR.
From Gregory Stoup, Pinpointing Areas to Improve (2012)
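The success-rate definition above reduces to a single ratio: grades of A, B, C, P, or CR divided by all grades awarded. A minimal sketch of that arithmetic, using a made-up grade distribution rather than Cañada's actual data:

```python
# Grades that count as successful course completion, per the
# definition above (A, B, C, P, or CR).
SUCCESS_GRADES = {"A", "B", "C", "P", "CR"}

def completion_rate(grade_counts):
    """Compute the success rate from a dict of grade -> count awarded."""
    total = sum(grade_counts.values())
    successes = sum(n for g, n in grade_counts.items() if g in SUCCESS_GRADES)
    return successes / total if total else 0.0

# Hypothetical distribution for one term: 730 successful grades
# (300 + 250 + 150 + 30) out of 1,000 awarded.
counts = {"A": 300, "B": 250, "C": 150, "D": 80, "F": 70, "W": 120, "P": 30}
print(f"{completion_rate(counts):.1%}")  # prints "73.0%"
```

Note that withdrawals (W) count against the rate because they are included in the denominator; this is why the MPS tables earlier report Pass, Did Not Pass, and Withdrew against a single Total Grades column.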
Feedback from college constituencies
College-wide Successful Course Completion Rate

Summary of the discussion of findings

College Planning Council. Reviewed in September 2012.
Why the overall decline over this period? What are the underlying forces driving the trend, and what are their various magnitudes? How much of this is due to demographic changes? Budget? Policy? Hiring patterns? Process reengineering?

Student Equity Committee. Reviewed in September 2012.
The 25 percentage point gap by ethnicity is not acceptable. We must understand where we can best target efforts to pull up the lowest performing groups. We need to embark on a path of modest gains in terms of both closing the gap and improving the overall college average.

Next steps

College Planning Council.
OPRSS will identify completion rate trends within the college’s most enrolled courses. How do the rates of different courses compare? Did student enrollment shift toward courses with traditionally lower completion rates over this period? CPC will review Program Review data that provides completion rate trends by department.

Student Equity Committee.
OPRSS will provide completion information on cross-sectional groups, including ethnicity by gender, ethnicity by age, ethnicity by geography, and ethnicity by financial aid status. The goal is to identify specific populations where focused interventions can be deployed and be most effective.

From Gregory Stoup, Pinpointing Areas to Improve (2012)
Setting objectives related to college mission and goals

FOOTHILL COLLEGE
Core Mission Workgroup Objectives for 2012-13

Institutional Goal:
Workgroup Objective:
Target:
Metrics:
References & Notes:
Lead Role:
Map to Institutional Learning Outcomes: Critical Thinking / Computation / Communication / Community
Resource Planning: Estimated Cost $________; Funding Source: Existing / Potential / Requested / NA
Timeline: Target Date(s) ___________
Supporting Documentation: ACCJC Standard _______; District Priority; Educational and Strategic Master Plan; Equity Plan; PaRC Initiative; Program Review; Other _____________
Workgroup Participants:
Documenting your process and progress

FOOTHILL COLLEGE
Core Mission Workgroup Reflections for 2012-13

Institutional Goal:
Workgroup Objective:
Target Summary: Completed / In Process / Not Initiated (Explain:)
Successes:
Challenges:
Resource Planning Review: Cost(s) $________; Financial / Personnel / Technology / Time / Other _______
Progress Indicators (Metrics Update):
References & Notes:
Workgroup Participants:
Example - Planning Agenda Progress and Completion Template
(Columns: Standard | Planning Agenda | Group | Date to be Completed | *Date Assignment Completed)

I.A.3 (College Council): As a component of the new planning process, the mission statement will be publicized on a regular basis. The review of the mission statement will be integrated into the planning process.

I.B.1 (College Council): The college will implement the integrated planning process that incorporates outcomes assessment results into institutional planning, and provide the time and space for broad-based dialogue aimed at improving student learning.

I.B.3 (College Council): The college will implement the integrated planning process that incorporates outcomes assessment results into institutional planning, and provide the time and space for broad-based dialogue aimed at improving student learning.

II.A.1 (College Council, PBTs, Academic Senate): Working with other Planning and Budgeting Teams and College Council, the Instructional Planning and Budget Team (IPBT) will review and modify the Annual Program Review Update and Comprehensive Program Review processes on a regular basis.

II.A.1.b (Academic Senate & FA): Develop a Distance Learning course student evaluation, based on the Foothill-De Anza Faculty Agreement Article 6 and Appendix J2W.

II.A.1.b (Academic Services): Develop faculty training on effective online teaching strategies to improve student success and retention.

II.A.1.c (College Council): Institutional Research will continue its commitment to assisting faculty and staff in their assessment efforts at the course and program levels.
Action Research Guided Questions

Developing the Research Agenda
1. What and who will be researched?
2. How is research tied to college plans, goals, initiatives, and/or activities?
3. How will the information be used, by whom, and how often?
4. Which methodology or approach will be used?

Turning Data into Information
1. What do the data tell us?
2. Which questions were fully answered by the research, and which need more exploration?
3. What are reasonable benchmarks based on the research?

Taking Action on the Information
1. What interventions or strategies do we need to deploy in order to move the needle?
2. How should this information be shared and applied across the college?

From Fulks, Hasson, and Mahon – 2010 AI presentation
WHAT IS YOUR BIGGEST TAKEAWAY FROM THIS SESSION?