How do we know it works?
Evaluating Learning Technology Projects
EDUCAUSE Learning Initiative Seminar
Clare van den Blink
[email protected]
Seminar Goals
Seminar participants will be able to…
 Select project goals that can be evaluated.
 Identify relevant indicators to evaluate project goals.
 Identify data collection methods.
 Create an evaluation plan for an academic technology project that is tied to their project assumptions and strategies.
 Assess considerations for developing evaluation activities.
Please indicate the type of projects
you’re interested in evaluating.
What are the key challenges in
fully evaluating your projects?
A Framework
Project Goals
Focus of Evaluation
Evaluation Design
Overview
Indicators - measures of “success”
Data collection
-Methods
-Population
-Procedures
Timeline
Data Analysis
Reporting Findings
Introduction
How this process was developed…
“To evaluate the effectiveness of the technology enhancement and its impact on student learning”
…with the constraints of staff, time, and budget limitations.
Importance-Complexity
[Chart: example projects plotted by staff effort (low to high) against complexity of evaluation (low to high), from a small project with a technology intervention (survey, interview), to an LMS pilot project informing service decisions (surveys, interviews, usability, tech review), to quasi-experimental research on a technology intervention with a control group (surveys, interviews, observations).]
To inform the evaluation
 What led to the development of the project?
 Assumptions about the strategies & technologies selected?
 Based on prior research? Literature review?
 What will inform the evaluation?
Selecting Goals…
Since not all projects may be evaluated within the timeframe of the project, or may be difficult to measure…
 How can you SELECT goals that can be evaluated within the scope of the project?
 How would you PRIORITIZE the goals that are the most critical to evaluate?
Sample Goals:
Review examples of goals and identify goals that could be evaluated within the project constraints.
EXAMPLE #1
Instructional Goals:
1.) Encourage active participation and critical thinking
skills by using video clips of main stream movies to initiate
class discussions.
2.) Encourage student involvement and active learning by
creating a mechanism for students to record interviews in
the field. (part of a class assignment)
3.) Create a repository of student-collected audio
interviews for ongoing use in the curriculum. Audio clips
will be used to illustrate the diversity of public education
experiences.
Other Project Goals:
4.) Develop work flow and documentation for student
recording of audio interviews and video clip processing.
5.) Choose, create, and provide an archiving mechanism
for cataloguing clips.
Sample Goals:
Review examples of goals and identify goals that can be evaluated within the project constraints.
EXAMPLE #2
Instructional Goals:
A.) Students will be able to practice application of fluid therapy, under various conditions, employing a unique computer-based simulation.
B.) Students will be able to interpret symptoms presented
in a sick dog, select an appropriate treatment, administer
fluids, monitor patient’s reaction and modify treatment
plan accordingly.
C.) Case simulation will enable students to experience
clinical variability in a manner similar to hands-on practice.
Other Project Goals:
D.) Simplify the creation of a set of teaching models, or
prototypes, that are the basis of the cases.
E.) Create a method for generating unique computer-based cases that build from the prototypes.
F.) Provide a method for saving case data for comparison.
Developing the
evaluation plan
Evaluation Process
Identify Project Goals
Focus of the Evaluation: what goals will be evaluated?
Indicators: what type of data to collect?
Methods: how will data be collected? (e.g., surveys, focus groups, interviews, observation, case study, data analysis)
Population: from whom will the data be collected?
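A minimal sketch (not part of the original seminar materials) of how these pieces fit together: one entry per evaluated goal, tying it to indicators, data collection methods, and a population. The names and example values below are hypothetical.

```python
# Minimal sketch (hypothetical names): one evaluation-plan entry ties a
# project goal to its indicators, data collection methods, and population.
from dataclasses import dataclass
from typing import List

@dataclass
class EvaluationPlanEntry:
    goal: str              # project goal selected as a focus of the evaluation
    indicators: List[str]  # direct/indirect measures of "success"
    methods: List[str]     # how the data will be collected
    population: str        # from whom the data will be collected

plan = [
    EvaluationPlanEntry(
        goal="Use polling questions to encourage critical thinking & engagement",
        indicators=["frequency of student responses", "self-reported engagement"],
        methods=["survey", "classroom observation"],
        population="students enrolled in the pilot course",
    ),
]

for entry in plan:
    print(f"{entry.goal}: {', '.join(entry.methods)} -> {entry.population}")
```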
Evaluation Process
Process: Select Goals … Focus
 What goals will be the Focus of the evaluation?
 What is feasible to evaluate in the project's timeframe?
Project Goals
1. Use Personal Response System (polling) with questions to encourage critical thinking & student engagement.
2. Use of PowerPoint presentations with interactive lecture.
3. Implement use of a Tablet PC for annotating presentations for visual and interactive lectures.
Focus of the Evaluation
• Use Personal Response System (polling) with questions to encourage critical thinking & active participation. (student engagement)
 Identify what INDICATORS can be used to collect data, both indirect and direct measures.
Evaluation Process
Evaluation Focus: Example 1
Formative:
Since the project is assisting with the development of online modules, a formative evaluation of the modules will be conducted to look at the interface design, navigation, usability, organization and presentation of content, and the usefulness for student learning.
The key focus will be on the “functionality” of the module.
 Interface / Navigation / Design (not important in phase I)
 Technology performance: test across browsers, OS, distance, etc.
 Organization/presentation of content
 Use of images and illustrations
 Learning objectives**
Summative: (part of the overall program evaluation)
Since the project is assisting with the development of web-based modules, the summative evaluation will examine the effect on student perception of learning from the implementation of instructional technology in the course.
Measures of success may include student perception of greater ease in learning difficult concepts and positive feedback about new modules.
Evaluation Process
Select Methods
Develop data collection METHODS for the indicators, such as surveys, interviews, observations, etc.
What method(s) would you select to evaluate the focus area, the “functionality” of the online module?
1. Surveys
2. Interviews
3. Observation
4. Log analysis
5. All of the above
6. Other – post in chat
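As an illustrative sketch (not from the seminar), option 4, log analysis, might amount to scanning a web server access log for the module's pages and counting views and error responses. The log file name and the "/module/" path prefix below are assumptions.

```python
import re
from collections import Counter

# Hypothetical sketch: count page views and error responses for module pages
# in a standard web server access log. "access.log" and the "/module/" path
# prefix are assumptions, not part of the seminar.
LOG_LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

views, errors = Counter(), Counter()
with open("access.log") as log:
    for line in log:
        match = LOG_LINE.search(line)
        if match and match.group("path").startswith("/module/"):
            views[match.group("path")] += 1
            if match.group("status").startswith(("4", "5")):
                errors[match.group("path")] += 1

for path, count in views.most_common(10):
    print(f"{path}: {count} views, {errors[path]} errors")
```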
Evaluation Process
Select Methods
Develop data collection METHODS for the indicators, such as surveys, interviews, observations, etc.
What method(s) would you select to evaluate how well the online module met the “learning objectives”?
1. Surveys
2. Interviews
3. Observation
4. Log analysis
5. All of the above
6. Other – post in chat
Evaluation Process
Other considerations
 Identify the population from which the data will be collected. Can you contact this group?
 Identify other data sources, such as logs, documents, etc.
 Are there considerations for human subjects research & informed consent on your campus? For example, if using grades as data, what permissions are necessary?
Evaluation Process
Timelines: How much time do you have? How much do you need?
[Timeline chart: fall project development; Aug–Dec fall semester project implementation; Dec–Jan project transition & closeout; running alongside: project evaluation planning, implementing the project evaluation, and evaluation data analysis & reports.]
Evaluation Process
Evaluation Timeline
Develop an initial timeline & staffing effort.
Develop evaluation plan: September 1
Create an observation protocol: September 15
Observe EDU 271 during two class sessions: Sept–October
Complete the student survey (instrument): October 1, 2005
Create an interview protocol: October 15
Administer student survey: mid-November
Conduct student interviews: end of November
Conduct data analysis: December–January
Complete evaluation report: February 1
Where reality meets ideal evaluation methods…
Implementing the Plan & Reporting Results
Evaluation Process
Implementing Methods
Surveys:
 Identify or develop questions.
 Do survey questions map to indicators? (see the sketch below)
 Survey distribution & associated permissions.
Interviews:
 Develop interview questions & protocols.
 Schedule and conduct interviews.
Resources about quantitative and qualitative methods can guide development and implementation of methods and data analysis.
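An illustrative sketch (not from the seminar) of that mapping check: record which survey questions cover each indicator and flag gaps before the survey is distributed. The indicator names and question IDs are hypothetical.

```python
# Hypothetical sketch: record which survey questions cover each indicator,
# then flag indicators with no question before the survey is distributed.
INDICATOR_TO_QUESTIONS = {
    "student engagement": ["Q1", "Q4"],
    "ease of learning difficult concepts": ["Q2"],
    "module usability": [],  # gap: no question covers this indicator yet
}

uncovered = [name for name, questions in INDICATOR_TO_QUESTIONS.items() if not questions]
if uncovered:
    print("Indicators without survey questions:", ", ".join(uncovered))
```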
Evaluation Process
Analysis & Reporting
DATA ANALYSIS
What type of analysis will be completed?
Quantitative: survey analysis
Qualitative: interview analysis based on interview protocols.
Example survey item: “Overall, I am satisfied with the use of instructional technology in this course.”
Mean = 1.2
1 Strongly Agree: 82%
2 Agree: 18%
3 Neutral: 0
4 Disagree: 0
5 Strongly Disagree: 0
Example interview excerpt: “I interviewed Prof. X about her experience with the email simulation…”
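A minimal sketch (not from the seminar) of the quantitative side: computing the mean and the response distribution for a Likert item like the one above. The response counts below are made up, chosen only to reproduce the 82% / 18% split shown.

```python
from collections import Counter

# Illustrative sketch: summarize responses to a 5-point Likert item
# (1 = Strongly Agree ... 5 = Strongly Disagree). The data are invented,
# chosen only to reproduce the 82% / 18% split shown above.
LABELS = {1: "Strongly Agree", 2: "Agree", 3: "Neutral",
          4: "Disagree", 5: "Strongly Disagree"}

responses = [1] * 41 + [2] * 9  # e.g., 50 students

mean = sum(responses) / len(responses)
counts = Counter(responses)

print(f"Mean = {mean:.1f}")
for value in sorted(LABELS):
    share = counts.get(value, 0) / len(responses)
    print(f"{value} {LABELS[value]}: {share:.0%}")
```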
Constraints
Evaluation Considerations & Tools
Staffing
How can this planning all be completed within limited staff hours, while maintaining the INTEGRITY of the evaluation process?
 How can staff be trained in this process without having a deep evaluation background?
 What staff skills might be adapted?
 Other campus resources?
Do you think it is feasible to re-train staff for evaluation?
What type of existing skills could be adapted for evaluation?
Evaluation Considerations & Tools
Supporting Tools
 Have an overall evaluation plan template that can be adapted to other projects.
 Informed consent templates.
 Use common interview/observation protocols.
 Develop question banks for survey questions, for example on:
- Use of video in course presentations and lectures
- Use of online instructional tutorials
- Use of presentations in lecture
Evaluation Considerations & Tools
Question banks
[Slides showing example survey question banks.]
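As an illustrative sketch (not from the seminar materials), a question bank can be kept as structured data keyed by topic, so common items can be pulled into project-specific surveys. The topics and question wording below are hypothetical.

```python
# Hypothetical sketch: a reusable survey question bank keyed by topic, so
# common items can be pulled into a project-specific survey instrument.
QUESTION_BANK = {
    "video_in_lecture": [
        "The video clips used in lecture helped me understand key concepts.",
        "The video clips encouraged me to participate in class discussion.",
    ],
    "online_tutorials": [
        "The online tutorials were easy to navigate.",
        "The online tutorials helped me learn difficult concepts.",
    ],
}

def build_survey(topics):
    """Assemble a survey from the selected question bank topics."""
    return [question for topic in topics for question in QUESTION_BANK.get(topic, [])]

for item in build_survey(["video_in_lecture", "online_tutorials"]):
    print(item)
```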
Summary
Consider…
 How can this METHODOLOGY be applied to your projects and institution?
 How VIABLE is this as an evaluation methodology for your projects?
 When does a more ROBUST process need to be put in place?
A Framework
1. Project Goals
2. Focus of Evaluation
3. Evaluation Design
 Overview
 Indicators - measures of “success”
 Data collection
- Methods
- Population
- Procedures
4. Timeline: what, when, how
5. Data Analysis
6. Reporting Findings
Questions?
[email protected]