Transcript Slide 1

Planning and Focusing an Evaluation

Program Evaluation Basics Webinar Series

Mary E. Arnold, Ph.D.

Associate Professor and Youth Development Specialist, Oregon State University

4-H Professional Development Webinar

January 10, 2013

Webinar Agenda

Building on the previous month's topic of logic modeling, we will:

• Review the basic components of a logic model, using the YA4-H! Teens as Teachers program as an example
• Learn the importance of identifying the purpose and stakeholders of a program evaluation
• Learn the Four Level Model of educational program evaluation:
  1. Reaction
  2. Learning
  3. Transfer of learning
  4. Results

• Introduce three important parts of evaluation planning:
  1. Evaluation questions
  2. Evaluation design
  3. Data collection methods

• Learn about IRB regulations for evaluation projects

YA4-H! Teens as Teachers Program

We have received a $50,000 competitive grant to Washington State University for 18 months of program funding.

Program Outline

This is the educational part of the program and where we will look for short-term outcomes.

1. YA4-H! Teens as Teachers
• Develop the YA4-H! Teens as Teachers curriculum
• Provide 3 days of training to teens and adults
• Participants Learn About:
  o Developing youth-adult partnerships
  o Forming a county YA4-H! Teens as Teachers team
  o Learning and implementing the Choose Health: Fun, Food, and Fitness (CHFFF) curriculum

2. Choose Health: Fun, Food, and Fitness (CHFFF)
• Nutrition education for youth ages 9-12
• Teens teach the 6 CHFFF lessons to younger youth

Younger Youth Learn:

o The amount of sugar in drinks
o Eating more fruits and vegetables
o How to read a nutrition label
o Whole grains
o High-fat and high-sugar foods
o Eating breakfast

Again, this is educational content. Do you think the younger youth are the only ones who will learn? NO! The teens will learn these things as well, simply through teaching them to others!

YA4-H! Teens as Teachers Program

We have received a $50,000 competitive grant to Washington State University for 18 months of program funding.

Note that these outcomes are focused on what we want to see HAPPEN! ACTION!

3. Medium Term Outcomes (Behaviors)

Younger Youth
• Replace sweetened drinks with low-fat milk and water
• Eat more fruits and vegetables
• Eat fewer high-fat and high-sugar foods and more nutrient-rich and high-fiber foods
• Eat only as often and as much as needed to satisfy hunger
• Play actively 60 minutes a day
• Limit screen time to two hours or less a day

Teens as Teachers

• All of the above outcomes AND
• Actively promote healthy behaviors
• Develop and practice teaching and leadership skills
• Increase positive youth development (PYD)
• Act as a role model for younger youth

YA4-H! Teens as Teachers Program

We have received a $50,000 competitive grant to Washington State University for 18 months of program funding

4. Long Term Outcomes (Results)

• Youth are less obese and more active
• Healthier food choices are available for youth
• There is a reduction in the prevalence of sugary drinks and low-nutrition food
• Communities are united to provide healthy spaces for youth
• There is a reduction in chronic diseases related to obesity over time

YA4-H! Teens as Teachers Program Evaluation

Inputs:
• Staff
• Money
• Partners
• Teens as Teachers Curriculum

Outputs (What is done):
• Provide training to youth-adult teams
• Teams return to the community and implement the CHFFF program

Outputs (Who is reached):
• Youth-Adult Teams are trained

Outcomes:
• The short-, medium-, and long-term outcomes described above

• Program theory is important, but it also has to make practical sense in order for evaluations to be meaningful!
• That is why logic models are so useful: they can identify critical links in the program's theory of change.
• If the links aren't "logical," then the program may have little practical value.

Why Do We Evaluate?

• Help others understand the program (stakeholders)
• Understand the need for a program
• Improve the program
• Improve teaching
• Understand the program's impact
• Determine if the program is progressing as planned
• Determine if the program is worth the cost
• Meet grant reporting criteria
• Meet administrative requirements

Why Do We Evaluate?

Poll:

What is the number ONE reason you evaluate your programs?

Share Your Experience:

What motivates you to do evaluation, and how do you feel about that motivator?

Ahhhh …. Stakeholders!

Stakeholders will have different needs for evaluative evidence!

Who Cares?

• People affected by the program either directly or indirectly (youth, parents, volunteers)
• County boards, elected officials
• Community leaders
• Colleagues, volunteers, supporters, collaborators
• Extension administrators
• Grantors
• Tenure committees
• Other stakeholders

It is essential to think about who cares about the evaluation results and determine how the results will be used early in the evaluation planning process.

Poll:

Who are your most important stakeholders? (Check all that apply)

Share Your Experience:

Share an experience of needing different evaluation evidence for different stakeholders

The Four Level Model of Educational Program Evaluation (Kirkpatrick & Kirkpatrick, 2006)

1. Participant Reaction to the program: you want a favorable response to the program; people are more motivated to participate and learn if the program is a positive experience for them.

2. Participant Learning: the extent to which participants change attitudes, improve knowledge, and increase skills.

3. Participant Behavior Change (Transfer of Learning)
• 4 conditions for transfer of learning to take place:
  1) The person must have a desire to change
  2) The person must know what to do and how to do it
  3) The person must be in a climate that supports the change
  4) The person must receive some type of reward for changing (intrinsic and/or extrinsic)

4. Participant Results: One way to guide this part of the evaluation is to ask, "What is the main reason for this program?" Then we also have to determine the links between the learning and behavior changes and these results (yep, this is a cloaked logic model!)

An introduction to three important aspects of evaluation planning:

• Evaluation questions • Evaluation design • Data collection methods

We begin with the evaluation questions, because the questions determine the design and data collection methods

Focusing Evaluation Questions

Questions of participant reaction:

Who actually participated in the program?

Are there barriers to participation?

Were participants satisfied with the program?

Was the content of the program relevant to the participants?

Questions of participant learning:

Did participants learn the intended program content?

Did the program change participant attitudes?

Did participants leave the program with new or enhanced skills?

Questions about behavior change:

Do participants plan to use what was learned? (potential change)

Do participants have a supportive climate for implementing what was learned?

Did participants actually implement new behaviors? Why or why not?

Questions about results:

What difference has this program made for the audience, community, and other stakeholders? (the SO WHAT question)

Choosing an Evaluation Design

Critical to Understand!

Your evaluation question determines your evaluation design and data collection methods!!!

(Too often I hear: "I need a survey to evaluate my program.")

Evaluation Designs

1) Post-only design: (X O)

2) Post-only control group design:
   E (X O)
   C (  O)

3) Retrospective pretest: (X O)

O = "Observation" (data collection)
E = Experimental group (program participants)
X = "Intervention" (program)
C = Control group (non-participants)

Evaluation Designs

4) One-group pretest-posttest design: (O X O)

5) Control group pretest-posttest design (illustrated in the sketch below):
   E (O X O)
   C (O   O)

6) Time series design (with control group):
   E (O O O X O O O)
   C (O O O   O O O)

O = "Observation" (data collection)
E = Experimental group (program participants)
X = "Intervention" (program)
C = Control group (non-participants)
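To make the design notation concrete, here is a minimal sketch (an illustration added to this transcript, not part of the original slides) of how hypothetical scores from design 5, the control group pretest-posttest design, might be summarized in Python. The group names, the idea of a nutrition-knowledge score, and every number are invented for illustration.

# Minimal sketch (illustration only): summarizing hypothetical scores from
# design 5, the control group pretest-posttest design -- E (O X O), C (O  O).
# All data below are made up.

experimental = {"pre": [12, 10, 14, 11, 13], "post": [17, 15, 18, 14, 19]}
control = {"pre": [12, 11, 13, 12, 14], "post": [13, 11, 14, 12, 15]}

def mean(values):
    return sum(values) / len(values)

def mean_change(group):
    # Average post-program score minus average pre-program score
    return mean(group["post"]) - mean(group["pre"])

gain_e = mean_change(experimental)  # change among program participants
gain_c = mean_change(control)       # change among non-participants

# The difference in gains is a simple estimate of the program effect, because
# the control group shows how much change happens without the program.
print(f"Participant gain:         {gain_e:.1f}")
print(f"Control group gain:       {gain_c:.1f}")
print(f"Estimated program effect: {gain_e - gain_c:.1f}")

Comparing the participants' gain with the control group's gain is what makes this design stronger than the one-group pretest-posttest design: the control group shows how much change would have occurred without the program.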

Choosing an Evaluation Data Collection Method

Some Common Methods

• Mailed survey
• Online survey
• Individual face-to-face interviews
• Focus group interviews
• Phone survey or interview
• Observation
• Archival data (records and documents)
• Test (e.g., scenarios or skill/knowledge tests)

Stay tuned on February 14th for our next webinar, which will focus more in depth on design and methods!

Poll:

What data collection methods have you used? (Check all that apply)

Share Your Experience:

Share an experience of using a data collection method that challenged you or gave you great information

Oregon State University Institutional Review Board (IRB)

AKA… Do I have to do that?

Research is determined by three qualities:

1. Systematic inquiry into a phenomenon
2. That is designed to develop or contribute to
3. Generalizable knowledge

Human Subjects Are:

Living individuals about whom an investigator conducting research obtains:
1. Data through an intervention or interaction with the individual, or
2. Identifiable private information

Institutional Review Board (IRB)

Okay, I have to do it… now what?

Go to the IRB Website at: http://oregonstate.edu/research/irb/

1. Complete the "Does Your Study Require IRB Review?" form
2. Complete online ethics training modules
3. Complete the steps listed under "Preparing an Initial Submission" at http://oregonstate.edu/research/irb/preparing-initial-submission

That’s all for now!

Join us next month for:

Evaluation Designs and Methods

Don’t forget to complete an evaluation of today’s webinar at: http://www.surveymonkey.com/s/4HEvaluationwebinar