
USING DATA TO IMPROVE LEARNING

Pre-Conference Workshop Presenters

Rob Johnstone, Senior Research Fellow, The Research & Planning Group for California Community Colleges, Berkeley, CA

Kurt Ewen, Assistant Vice President, Assessment and Institutional Effectiveness, Valencia College, Orlando, FL

Overview of the Workshop

• Introductions
• Discussion about the Nature and Use of Data in Higher Education
  – Implications for practice
• Playing with Actual Data and discussions about data display

INTRODUCTIONS

• Name
• The name and location of your College
• Your role at the College
• What are you hoping for from the workshop?

USING DATA TO IMPROVE STUDENT LEARNING

• Think – Pair – Share
  – In your work in higher education, what is the best piece of data you have ever seen? The most effective / actionable data?
  – Why?
• An example of Great Data
• What is different about data from learning assessment and student success?

LESSONS LEARNED ABOUT DATA COLLECTION AND USE

LESSON 1: THE PURPOSE OF ASSESSMENT (DATA) IS TO IMPROVE STUDENT LEARNING

• Assessment of learning creates the possibility of better conversations
  – Course Level Assessment
    • Faculty – Student
    • Student – Students
  – Program Level Assessment
    • Faculty – Faculty

LESSON 2: DATA FROM LEARNING OUTCOMES ASSESSMENT AT THE PROGRAM LEVEL IS MESSY AND THE MESSINESS IS AN IMPORTANT PART OF THE PROCESS

An Example

What should we expect from students exiting Comp 1 at a community college?

LESSON 3: THE MORE USEFUL / UNDERSTANDABLE DATA ABOUT STUDENT LEARNING SEEMS TO AN “OUTSIDE” AUDIENCE, THE LESS ACTIONABLE THAT DATA WILL BE TO FACULTY.

An Example

What kinds of data can be reported as a result of assessment efforts using this rubric?

LESSON 4

• A “culture of evidence” requires the development of an institutional practice that
  1. gives careful consideration to the question being asked, and
  2. gives careful consideration to the data needed in order to answer the question.

• A genuine “culture of evidence” is dependent on a “culture of inquiry”

WHAT IS A CULTURE OF INQUIRY?

• Institutional capacity for supporting open, honest and collaborative dialog focused on strengthening the institution and the outcomes of its students.

CULTURE OF INQUIRY: FEATURES

● Widespread sharing and easy access to user-friendly information on student outcomes
● Encouraging more people to ask a wider collection of questions and use their evidence and conclusions to enhance decision making
● Shared, reflective and dynamic discussions

CULTURE OF INQUIRY: MORE FEATURES

● Multiple opportunities to discuss information within and across constituency groups
● Continuous feedback so adjustments can be made along the way and processes can be adapted
● Culture that values curiosity, questions and robust conversations

Climate of Innovation

[Diagram: an innovation pipeline, with an “Eye for Evidence” that grows more rigorous at each level.]

Hypothesis: 1000’s of opportunities tried. Maintain a Research and Development component.

Level I (“Angel Capital Stage” – Prototype): 100 are selected for support as Phase I Innovations.

Level II (“Venture Capital Stage” – Pilot Implementation, Limited Scale): 10 supported as Phase II Innovations. Level II Innovations must be scalable and must show potential to bring systemic change and “business-changing results.”

Level III (Institutionalization): 1 or 2 taken to scale and institutionalized. The challenge is moving from Level II to Level III. We have yet to figure out how this will work in the new structure.

The standard of evidence and reflection increases at each level.

When gathering evidence, make sure you are focusing on the right data.

20 YEAR TREND FOR CALIFORNIA CC COURSE SUCCESS RATES

[Chart: Retention Rate and Success Rate for California community college courses, 1989–2008, plotted on a 0%–100% scale; both lines are essentially flat across the 20-year period.]

What does that tell us about the usefulness of these metrics in setting …?

LESSON 5

A culture of evidence is one that seeks data-supported decisions.

– Data-driven decision making runs the risk of overlooking / underestimating the human factor, which is very often concealed by the desire for statistical significance.

– Data-driven decision making runs the risk of underestimating the role / significance / importance of evidence-informed hunches in informing our decisions.

LESSON 5 (CON’T)

Data-supported decisions concerning student success oriented programs require a consideration of "meaningful improvement" and may require balancing all or most of the following:

– Statistically significant improvement in target measures.

– Reflection on the “human impact.”

– Economic efficiency in relationship to the difficulty of the task at hand.

– A consideration of perception as it relates to benefit versus cost.

LESSON 6

Structured reflection and dialogue allow data to be transformed into meaningful (actionable) information.

– The “meaning” of data in Higher Education is not generally self-evident and requires the benefit of the intersection of multiple perspectives.

– The more meaningful learning data is to an outside audience, the less actionable it is to those who work with students.


Data do not speak for themselves.

THE VITAL ROLE OF CONVERSATION

● In order to make data useful, ample time and space are needed to discuss and analyze the information and connect it back to the original research question.

● Answers are not always immediately apparent, so skilled facilitation may be needed to dig out the deeper meaning.

● Multiple perspectives and types of information are often needed to make sense of individual data points.


DIALOGUE VS. DISCUSSION

AN ETYMOLOGICAL DISTINCTION

Dialogue:
• Seeing the whole among the parts
• Seeing the connections between parts
• Inquiring into assumptions
• Learning through inquiry and disclosure
• Creating shared meaning

Discussion:
• Breaking issues / problems into parts
• Seeing distinctions between parts
• Justifying / defending assumptions
• Gaining agreement on one meaning / result

LESSON 7

Meaningful information promotes consensus about lessons learned and a shared vision / plan for the future

– No data should be shared as information until it has been processed in a collaborative and thoughtful way.

LESSON 8

Meaningful information from data does not generally emerge from a single data point but from the intersection of multiple and varied (quantitative and qualitative) sources of data

– There is rarely a silver bullet (and if there is, then the question being asked is probably not particularly interesting).
– What data can be added to learning data to make it more meaningful?
  • Student Assessment of Instruction data
  • CCSSE
  • Grade distribution report
  • ??

LESSON 9

Data about student learning / success must be tailored to the needs / questions of particular groups

Data concerning Students’ ability to write at the college level:
• Course
• Program
• Department
• Institution
• External Audience
• Etc.

LESSON 10

• A simple assessment measure does not necessarily produce less meaningful / actionable data.
  – Checklist
  – 4-question multiple-choice “test”

LESSON 11

The best insights often come as an unintended result of simple questions asked about things you were not planning to question.

LESSON 11 (CON’T)

• How are our new students doing?

• Data was provided on FTIC Degree-Seeking Students

• Who are our new students?

• Development of our Philosophy statement on the New Student
  – All Students with fewer than 15 college-level credits at Valencia

• Who are our new students?

WHO ARE OUR NEW STUDENTS?

HOW ARE OUR NEW STUDENTS DOING?

National Institute for Learning Outcomes Assessment (NILOA) “From Gathering to Using Assessment Results: Lessons from the Wabash National Study”

FROM GATHERING TO USING ASSESSMENT RESULTS

• “Most institutions have routinized data collection, but they have little experience in reviewing and making sense of data. It is far easier to sign up for a survey offered by an outside entity or to have an associate dean interview exiting students than to orchestrate a series of complex conversations with different groups on campus about what the findings from these data mean and what actions might follow.”

LESSON 12

The sharing of Information should reflect standards of scholarly communication and evidence

– Biases and conclusions about the information should be clearly articulated.
– Open and unanswered questions should be articulated (but should not be allowed to stop the process).
– Differing perspectives on the meaning of the information should be given equal time.
– The use of programmatic and academic jargon should be kept to a minimum.
– The visual presentation of information should be monitored for consistency.