
Evaluation

(a.k.a. Assessment)

Krista S. Schumacher, Schumacher Consulting

918-284-7276 [email protected]

www.schumacherconsulting.org

Prepared for the Oklahoma State Regents for Higher Education

2010 Summer Grant Writing Institute

To Evaluate or to Assess?

Technically speaking…

• Assessment: long-term outcomes, aggregated judgment
• Evaluation: short-term outcomes, "unique event" judgment

Why Evaluate?

• How will you know your project is progressing adequately to achieve its objectives?
• How will funders know your project was successful?
  ◦ Increasing emphasis is placed on evaluation by funders such as:
    - U.S. Department of Education
    - National Science Foundation
    - Substance Abuse and Mental Health Services Administration (SAMHSA)

Why Evaluate?

• Improve the program: "Balancing the call to prove with the need to improve." (W.K. Kellogg Foundation)
• Determine program effectiveness: evaluation supports "accountability and quality control" (Kellogg Foundation) and has significant influence on a program's future
• Generate new knowledge: not just research knowledge
  ◦ Determines not just that a program works, but analyzes how and why it works
  ◦ With whom is the program most successful?
  ◦ Under what circumstances?


Why Evaluate?

WHAT WILL BE DONE WITH THE RESULTS?

"Evaluation results will be reviewed (quarterly, semi-annually, annually) by the project advisory board and staff. Results will be used to make program adjustments as needed."

Types of Evaluation

Process evaluation:

• What processes are used, and how well do they work?

Outcome evaluation:

• Did the project achieve its stated objectives?

Process Evaluation

• What was provided and to whom?
  ◦ Services (modality, type, intensity, duration)
  ◦ Recipients (individual demographics and characteristics): gender, age, race/ethnicity, income level, first-generation status
  ◦ Context (institution, community, classroom)
  ◦ Cost (did the project stay within budget?)
• Do processes match the proposed project plan?
  ◦ What types of deviation from the plan occurred?
  ◦ What led to the deviations?
  ◦ What effect did the deviations have on the project and evaluation?

Outcome Evaluation

• What effect did the program have on participants?
  ◦ Activities / objectives
  ◦ Achievement / attitudes and beliefs
• What program/contextual factors were associated with outcomes?
• What individual factors were associated with outcomes?
• How durable were the effects?
• What correlations can be drawn between outcomes and the program?
• How do you know that the program was the cause of the effect?

Who will Evaluate?

• External evaluators increasingly required or strongly recommended
• Partners for effective and efficient programs
• Consider when choosing an evaluator:
  ◦ Methodological orientations
  ◦ Philosophical orientations
  ◦ Experience and qualifications

How much will it cost?

• External evaluations cost money… period.
• Standard recommendation: 5% to 10% of the total budget (Kellogg Foundation; U.S. Dept. of Ed.)
• Check funder limits on evaluation costs
• Ensure the cost is reasonable but sufficient

Two Types of Data

• Quantitative
  ◦ Numbers based on objectives and activities
  ◦ Types of data needed:
    - Number of participants (process)
    - Grade point averages (outcome)
    - Retention rates (outcome)
    - Survey data (outcome and process)
• Qualitative
  ◦ Interviews
  ◦ Focus groups
  ◦ Observation

Methods/Instruments

 ◦ ◦ ◦ ◦ ◦ ◦ ◦ How are you going to get your data? Establish baseline data Pre- and post-assessments (knowledge, skills) Pre- and post-surveys (attitudinal) Enrollment rosters Meeting minutes Database reports Institutional Research Office (I.R.) [email protected] 918-284-7276 12

Data Analysis


• Quantitative data
  ◦ Data analysis programs: SPSS (Statistical Package for the Social Sciences), Stata, etc.
  ◦ Descriptive and statistical data. Example survey item: "On a scale of 1 to 5, with 1 being 'strongly disagree' and 5 being 'strongly agree,' please indicate the extent to which you agree or disagree that pigs can fly."
    - Descriptive: 150 (or 75%) of respondents agree or strongly agree that pigs can fly.
    - Statistical (t-test, ANOVA, etc.): There is a statistically significant difference (p < .05) between the sexes in agreement that pigs can fly, with men more likely than women to agree or strongly agree with this statement.
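As a rough sketch of how the descriptive and statistical results above might be produced, the example below uses Python with NumPy and SciPy rather than SPSS or Stata (the tools named above); the sample sizes and the randomly generated Likert responses are invented purely for illustration.

```python
# Minimal sketch: descriptive summary and an independent-samples t-test
# on made-up 1-5 Likert responses to the "pigs can fly" item.
# All data here are invented; SPSS or Stata would produce the same
# kinds of output from real survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)
men = rng.integers(1, 6, size=100)    # hypothetical responses from men
women = rng.integers(1, 6, size=100)  # hypothetical responses from women
responses = np.concatenate([men, women])

# Descriptive: count and share who agree (4) or strongly agree (5)
agree = int((responses >= 4).sum())
print(f"{agree} ({agree / responses.size:.0%}) agree or strongly agree")

# Statistical: t-test comparing mean agreement of men vs. women
t_stat, p_value = stats.ttest_ind(men, women)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
print("significant at p < .05" if p_value < 0.05 else "not significant")
```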

Data Analysis

• Qualitative data
  ◦ Data analysis programs: NVivo, ATLAS.ti, etc.
  ◦ More than pithy anecdotes: "May explain – and provide evidence of – those hard-to-measure outcomes that cannot be defined quantitatively." (W.K. Kellogg Foundation)
  ◦ Provides insight into how and why a program is successful
  ◦ Analyze for themes that support (or don't) the quantitative data
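To make "analyze for themes" concrete, here is a small sketch of tallying hand-coded interview excerpts in Python; in practice NVivo or ATLAS.ti would manage the coding, and the excerpts and theme labels below are invented for illustration.

```python
# Minimal sketch: tally themes across interview excerpts that have
# already been hand-coded (the coding step itself is the hard part,
# normally done in NVivo or ATLAS.ti). All examples are invented.
from collections import Counter

coded_excerpts = [
    ("Tutoring made me feel like I belonged here.", "belonging"),
    ("My advisor checked in with me every week.", "advising"),
    ("I almost left, but my study group kept me going.", "belonging"),
    ("No one explained how to re-enroll for the second year.", "advising"),
]

theme_counts = Counter(theme for _, theme in coded_excerpts)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} excerpt(s)")
```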

Two Types of Timeframes

• Formative
  ◦ Ongoing throughout the life of the grant
  ◦ Measures activities and objectives
• Summative
  ◦ At the conclusion of grant funding
• You need BOTH!


Timelines

• When will evaluation occur?
  ◦ Monthly?
  ◦ Quarterly?
  ◦ Semi-annually?
  ◦ Annually?
  ◦ At the end of each training session?
  ◦ At the end of each cycle?

Origin of the Evaluation:

Need and Objectives

Need: For 2005-06, the fall-to-fall retention rate of first-time, degree-seeking students was 55% for the College's full-time students, compared to a national average retention rate of 65% for full-time students at comparable institutions (IPEDS, 2006).

Objective: The fall-to-fall retention rate of full-time undergraduate students will increase by 3% each year, from a baseline of 55% to 61% by Sept. 30, 2010.
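As a quick arithmetic check on this sample objective (an illustration, not part of the original slide), "increase by 3% each year" reads here as +3 percentage points per year, so reaching 61% from a 55% baseline takes two yearly increases; the short sketch below enumerates the implied annual targets.

```python
# Sketch: enumerate the annual retention targets implied by the
# sample objective (55% baseline, +3 percentage points per year,
# 61% final target). The numbers come from the slide itself.
baseline, target, step = 55.0, 61.0, 3.0

year, rate = 0, baseline
while rate < target:
    year += 1
    rate += step
    print(f"Year {year} target: {rate:.0f}%")
# Output:
# Year 1 target: 58%
# Year 2 target: 61%
```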

Evaluation Data Collection and Reporting Plan

Objective: Increase fall-to-fall retention by 3% per year, to 61%

Data collected and timeline: Student enrollment in first fall and second fall; within one month of start of second fall

Methods for data collection and timeline: Enrollment entered by gender and race/ethnicity into database; within first four weeks of each semester

Instruments to be developed and timeline: Enrollment rosters separated by gender and race/ethnicity; by Jan. 15, 2009

Reports/outcomes timeline: At midpoint of each semester
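Purely as an illustration (nothing like this appears in the original slide), a plan row such as the one above can also be kept as a structured record, which makes it easy to print reporting checklists; the class and field names below are assumptions mirroring the table's column headers.

```python
# Sketch: the evaluation plan row above as a structured record.
# The EvaluationPlanRow class and its field names are invented here,
# mirroring the table's column headers.
from dataclasses import dataclass

@dataclass
class EvaluationPlanRow:
    objective: str
    data_collected: str   # what is collected, and by when
    methods: str          # how it is collected, and by when
    instruments: str      # tools to develop, and by when
    reporting: str        # when results are reported

row = EvaluationPlanRow(
    objective="Increase fall-to-fall retention by 3% per year, to 61%",
    data_collected="Enrollment in first and second fall; within one "
                   "month of start of second fall",
    methods="Enrollment entered by gender and race/ethnicity into "
            "database; within first four weeks of each semester",
    instruments="Enrollment rosters separated by gender and "
                "race/ethnicity; by Jan. 15, 2009",
    reporting="At midpoint of each semester",
)

for field, value in vars(row).items():
    print(f"{field}: {value}")
```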

BEWARE THE LAYERED OBJECTIVE!

• By the end of year five, five (5) full-time developmental education instructors will conduct 10 workshops on student retention strategies for 200 adjunct instructors.

Logic Models

From: University of Wisconsin-Extension, Program Development and Evaluation
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html

A Logic Model is…

◦ A depiction of a program showing what the program will do and what it is to accomplish
◦ A series of "if-then" relationships that, if implemented as intended, lead to the desired outcomes
◦ The core of program planning and evaluation

Example: Situation (Hungry) → Inputs (Get food) → Outputs (Eat food) → Outcomes (Feel better)
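To show the "if-then" chain in code form (a sketch invented for this writeup, not part of the UW-Extension material), the slide's hungry-person example can be written as a tiny data structure whose stages read left to right:

```python
# Minimal sketch of a logic model as an ordered if-then chain.
# The stage values come from the slide's example; the LogicModel
# class itself is invented for illustration.
from dataclasses import dataclass

@dataclass
class LogicModel:
    situation: str  # the need the program responds to
    inputs: str     # resources invested
    outputs: str    # activities actually delivered
    outcomes: str   # the change expected to follow

    def chain(self) -> str:
        # Render the model's "if-then" reading.
        return (f"IF '{self.situation}' and we provide '{self.inputs}', "
                f"THEN '{self.outputs}' occurs, "
                f"and THEN we expect '{self.outcomes}'.")

print(LogicModel("Hungry", "Get food", "Eat food", "Feel better").chain())
```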

Evaluation Resources

• W.K. Kellogg Foundation, "Evaluation Toolkit": http://www.wkkf.org/default.aspx?tabid=75&CID=281&NID=61&LanguageID=0
• Newsletter resource, The PEN (Program Evaluation News): http://www.the-aps.org/education/promote/content/newslttr3.2.pdf
• NSF-sponsored program: www.evaluatorsinstitute.com
• American Evaluation Association: www.eval.org
• Western Michigan University, The Evaluation Center (directory of evaluators): http://ec.wmich.edu/evaldir/index.html
• OSRHE list of evaluators and other resources: http://www.okhighered.org/grant%2Dopps/writing.shtml
• "Evaluation for the Unevaluated" course: http://pathwayscourses.samhsa.gov/eval101/eval101_toc.htm


Evaluation Resources

• The Research Methods Knowledge Base: http://www.socialresearchmethods.net/
• The What Works Clearinghouse: http://www.w-w-c.org/
• The Promising Practices Network: http://www.promisingpractices.net/
• The International Campbell Collaboration: http://www.campbellcollaboration.org/
• Social Programs That Work: http://www.excelgov.org/
• Planning an Effective Program Evaluation short course: http://www.the-aps.org/education/promote/pen.htm