Evaluating Educational Technology Planning and Implementation


NECC 2007 - SAP126
Program Evaluation Tools and Strategies for Instructional Technology
[email protected]
978-251-1600 ext. 204
www.sun-associates.com/necc2007
This presentation is linked from that site, along with other resources on program evaluation.
In particular, download a copy of the district evaluation workbook available on that page.
Where Do We Stand?
Who’s working on an actual project?
Current?
Anticipated?
Your expectations for today
Workshop Goals
To review the key elements of effective program evaluation as applied to instructional technology evaluations
To consider evaluation in the context of some example projects
Why Evaluate?
To fulfill program requirements
Evaluation is part of program/project accountability
Most state and federal proposals require an evaluation component
Not simply a statement that “we will evaluate,” but actual information on who will evaluate, the evaluation questions, and the methodologies
Project sustainability
Generation of new and improved project ideas
Others?
By Definition, Evaluation…
Is both formative and summative
Helps clarify project goals, processes, and products
Should be tied to indicators of success written for your project’s goals
Is not a “test” or simply a checklist of completed activities
Qualitatively, are you achieving your goals?
What adjustments can be made to your project to realize greater success?
A Three-Phase Evaluation Process
Evaluation Questions
Tied to original project goals
Indicator rubrics
Allow for authentic, qualitative, and holistic evaluation
Data Collection
Tied to indicators in the rubrics
Scoring and Reporting
Role of the evaluation committee
pg 5 in workbook
Who Evaluates?
Committee of stakeholders (pg 10)
Outside facilitator?
Task checklist (pg 6)
Other issues…
Perspective
Time-intensive
Project Sample
An Iterative Process
Evaluation breaks your vision down into increasingly observable and measurable pieces.
Goals Lead to Questions
What do you want to see happen?
These are your goals
Rephrase goals into questions
Achieving these goals requires a process that can be measured through a formative evaluation
…And Then to Indicators
What is it that you want to measure?
What are the conditions of success, and to what degree are those conditions being met?
By what criteria should performance be judged?
Where should we look, and what should we look for, to judge performance success?
What does the range in the quality of performance look like?
How should different levels of quality be described and distinguished from each other?
Indicators should reflect your project’s unique goals and aspirations
Rooted in proposed work
Indicators must reflect your own environment; what constitutes success for you might not for someone else
Indicators need to be highly descriptive and can include both qualitative and quantitative measures
You collect data on your indicators
Try it on a Sample
Using the Evaluation Logic Map, map your:
Project purpose/vision
Goals
Objectives
Actions
We’ll take 15 minutes for this…and then come back for indicators
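If it helps to see the structure concretely, here is a minimal sketch in Python of how a logic map’s layers nest, purpose down to actions. All of the project content in it is hypothetical; substitute your own.

```python
# A minimal sketch of an evaluation logic map.
# Every string below is a hypothetical example, not a prescription.
logic_map = {
    "purpose": "All students use technology to support learning",
    "goals": [
        {
            "goal": "Teachers integrate technology into daily instruction",
            "objectives": [
                {
                    "objective": "Most teachers use the district LMS weekly",
                    "actions": [
                        "Offer monthly professional development sessions",
                        "Assign a building-level technology coach",
                    ],
                },
            ],
        },
    ],
}

# Walking the map top-down shows the chain you will later evaluate.
for goal in logic_map["goals"]:
    print("Goal:", goal["goal"])
    for obj in goal["objectives"]:
        print("  Objective:", obj["objective"])
        for action in obj["actions"]:
            print("    Action:", action)
```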
Next…Indicators
Pick one of your intermediate outcomes
Write a brief statement of what it would LOOK LIKE to achieve ultimate success in this indicator.
What would change (diminish) if success were less than ultimate?
To Summarize...
Start with your proposal or technology plan
Logic map the connections between actions, objectives, and goals
From your goals/objectives, develop evaluation questions
Questions lead to indicators
Indicators are organized into rubrics
Data collection flows from those rubrics
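As a rough illustration of how indicators organized into a rubric can drive a holistic score, here is a short sketch. The indicator text, performance levels, and committee score are invented for the example.

```python
# A sketch of an indicator rubric: each indicator describes what the
# levels of performance look like. All text and levels are invented.
rubric = {
    "Teachers use technology for instruction": {
        4: "Technology is woven into daily lessons across subjects",
        3: "Technology is used weekly in most classrooms",
        2: "Technology use is occasional and teacher-centered",
        1: "Technology is rarely used for instruction",
    },
}

# The evaluation committee assigns a level to each indicator based on
# the collected evidence; a simple average gives a holistic score.
committee_scores = {"Teachers use technology for instruction": 3}
overall = sum(committee_scores.values()) / len(committee_scores)
print(f"Holistic score: {overall:.1f} out of 4")
```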
Evidence/Data Collection
Classroom observation, interviews, and work-product review
What are teachers doing on a day-to-day basis to address student needs?
Focus groups and surveys
Measuring teacher satisfaction
Triangulation with data from administrators and staff
Do other groups confirm that teachers are being served?
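One way to picture triangulation: line up what each data source reported on the same evaluation question and look at how far apart they are. A sketch with entirely made-up figures:

```python
# Hypothetical percent-agreement figures for one evaluation question
# ("Are teachers' technology needs being served?"), by data source.
responses = {
    "teacher survey": 0.78,
    "administrator interviews": 0.71,
    "staff focus group": 0.45,
}

# A wide spread between sources is a signal to dig deeper, not a
# verdict; here the staff figure diverges from the other two.
spread = max(responses.values()) - min(responses.values())
for source, agree in responses.items():
    print(f"{source}: {agree:.0%} agree")
print(f"Spread across sources: {spread:.0%}")
```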
Data Collection Basics
Review Existing Data
Current technology plan
Curriculum
District/school improvement plans
Sample questions on the webpage for this presentation
www.sun-associates.com/eval/sample
Surveys
Creating good surveys
Length
Differentiation (teachers, staff, parents, community, etc.)
Quantitative data
Attitudinal data
Timing/response rates (getting returns!)
www.sun-associates.com/eval/samples/samplesurv.html
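Response rates are simple arithmetic, but tracking them per group makes the “getting returns” problem visible. A sketch with invented counts:

```python
# Invented counts for illustration: surveys sent vs. returned,
# broken out by the groups the survey differentiates.
sent = {"teachers": 120, "staff": 40, "parents": 600}
returned = {"teachers": 95, "staff": 31, "parents": 180}

for group in sent:
    rate = returned[group] / sent[group]
    print(f"{group}: {returned[group]}/{sent[group]} = {rate:.0%}")
# A low rate (parents here) suggests re-timing the survey or sending
# a follow-up reminder before drawing conclusions from that group.
```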
Online Survey Tools
VIVED
Profiler
LoTi
Zoomerang
SurveyMonkey.com
Survey Issues
Online surveys produce high response rates
Easy to report and analyze data
Potential for abuse
Depends on access to connectivity
Focus Groups/Interviews
Teachers
Parents
Students
Administrators
Other stakeholders
Classroom Observations
Using an observation template
Using outside observers
Other Data Elements?
Artifact analysis
A rubric for analyzing teacher and student work?
Solicitation of teacher/parent/student stories
This is a way to gather truly qualitative data
What does the community say about the use and impact of technology?
Dissemination
Compile the report
Determine how to share the report
School committee presentation
Press releases
Community meetings
Conclusion
Build evaluation into your technology planning effort
Remember, not all evaluation is quantitative
You cannot evaluate what you are not looking for, so it is important to develop expectations of what constitutes good technology integration
More Information
[email protected]
978-251-1600 ext. 204
www.sun-associates.com/evaluation
www.edtechevaluation.com
[email protected]
978-251-1600 ext. 204
www.sun-associates.com/necc2007
This presentation will be linked to that site