Evaluating Educational Technology Planning and Implementation
How Do We Know It’s Working?
Creating Evaluations for
Technology Projects (Part I)
Contact Information
[email protected]
978-251-1600 ext. 204
www.edtechevaluation.com
This presentation will be linked to that site (on the
Tools page)
Where Do We Stand?
Who’s working on an actual project?
Current?
Anticipated?
Your expectations for today
Workshop Goals
To review the key elements of effective
program evaluation as applied to
technology evaluations
To consider evaluation in the context of
your actual projects
Why Evaluate?
To fulfill program requirements
NCLB and hence Title IID carry evaluation
requirements
To realize your investment in technology
What sort of “difference” has all of this
technology made?
Basis in NCLB
“The application shall include:…
A description of the process and accountability measures that the
applicant will use to evaluate the extent to which activities funded
under this subpart are effective in integrating technology into curricula
and instruction, increasing the ability of teachers to teach, and enabling
students to meet challenging State academic content and student
academic achievement standards.”
NCLB Act, Title II, Part D, Section 2414(11)
One consistent thread in NCLB is
evaluation and assessment
How can you document that this
“intervention” is making a difference?
All funded work must be based in
reflection and data-driven decision-making
Naturally, this translates to local district
proposals
A Framework for Review
From Designing Professional
Development for Teachers of Science
and Mathematics, Loucks-Horsley,
Hewson, Love, and Stiles. Corwin
Press Inc., 1998
Evaluation
Helps clarify project goals, processes, products
Must be tied to indicators of success written for your
project’s goals
Not a “test” or checklist of completed activities
Qualitatively, are you achieving your goals?
What adjustments can be made to your project to
realize greater success?
The Basic Process
Evaluation Questions
Tied to original project goals
Performance Rubrics
Allow for authentic, qualitative,
and holistic evaluation
Data Collection
Tied to indicators in the rubrics
Data Analysis
Scoring and Reporting
Role of this committee (the
evaluation committee)
Creating a District-wide
Technology Evaluation
Stage 1: Committee orientation,
evaluation framing, and training
Generate leadership support
Determine scope of the evaluation effort
Appoint Committee
Orient and Train In-District
Evaluation Committee
Formulate Evaluation Questions
Review Questions
Develop Indicator Rubrics
Stage 2: Data collection and analysis
Data Collection
Data Analysis
Stage 3: Findings, recommendations,
and reporting
Scoring the Rubrics
Findings
Recommendations
Dissemination of Report
Initiating the Next Review Cycle
Who Evaluates?
Committee of stakeholders (pg 12)
Outside facilitator?
Data collection specialists?
Task checklist
Other issues:
Honesty
Perspective
Time-intensive
Evaluation Starts with Goals
Evaluation should be rooted in your goals
for how you are going to use or integrate
that technology
Is more than an infrastructure plan
Focuses on technology’s impact on teachers
and students
Has clear goals and objectives for what you
want to see happen
Evaluation Logic Map
Project Sample
Your Project?
Using the Evaluation Logic Map, map
your:
Project purpose/vision
Goals
Objectives
Actions
Goals Lead to Questions
What do you want to see happen?
These are your goals
Rephrase goals into questions
Achieving these goals requires a process
that can be measured through a formative
evaluation
We Start with Goals…
To improve student achievement through their participation in
authentic and meaningful science learning experiences.
To provide advanced science and technology learning
opportunities to all students regardless of learning styles or
abilities.
To produce high quality science and technology curriculum in
which the integration of technology provides “added value” to
teaching and learning activities.
To increase students’ knowledge of the Connecticut River’s
history and geology, and to gain an understanding of its past,
present, and possible future environmental issues.
…and move to questions
Has the project developed technology-enhanced
science learning experiences that have been
instrumental in improving student mastery of the
Skills of Inquiry, understanding of the
history/geology/ecology of the Connecticut
River, and of the 5-8 science curriculum in
general?
Has the project offered teacher professional
development that has resulted in improved
teacher understanding of universal design
principles and technology integration strategies?
…And Then to Indicators
What is it that you want to measure?
Whether the projects have enhanced learning
The relationship between the units and:
the selected curriculum
the process by which they were developed
Increases in teacher technology skills (in relation to
particular standards)
Whether the professional development model met
with its design expectations
Collaborative and sustainable
Involves multiple subjects and administrators
Indicators should reflect your project’s unique
goals and aspirations
Rooted in proposed work
Indicators must be indicative of your unique
environment...what constitutes success for you might
not for someone else
Indicators need to be highly descriptive and can
include both qualitative and quantitative measures
Try a Sample Indicator
Going back to the Logic Map, try to
develop a few indicators for your sample
project
Keep it simple
Qualitative and quantitative
Will you be able to see the indicator?
To Summarize...
Start with your proposal or technology
plan
From your goals, develop indicators and a
performance rubric
Coming in Part II
Data Collection
Reporting
How Do We Know It’s Working?
Creating Evaluations for
Technology Projects (Part II)
Creating a District-wide
Technology Evaluation
Stage 1: Committee orientation,
evaluation framing, and training
Generate leadership support
Determine scope of the evaluation effort
Appoint Committee
Orient and Train In-District
Evaluation Committee
Formulate Evaluation Questions
Review Questions
Develop Indicator Rubrics
Stage 2: Data collection and analysis
Data Collection
Data Analysis
Stage 3: Findings, recommendations,
and reporting
Scoring the Rubrics
Findings
Recommendations
Dissemination of Report
Initiating the Next Review Cycle
A Basic Process
Evaluation Questions
Must be tied to original planning goals
Performance Rubrics
Allow for authentic, qualitative, and holistic
evaluation
Data Collection
Tied to indicators in the rubrics
Scoring and Reporting
Measures?
Classroom observation, interviews, and workproduct review
What are teachers doing on a day-to-day basis to
address student needs?
Focus groups and surveys
Measuring teacher satisfaction
Triangulation with data from administrators and
staff
Do other groups confirm that teachers are being
served?
Data Collection
Review Existing Data
Current technology plan
Curriculum
District/school improvement plans
www.sun-associates.com/eval/sample
Create a checklist for data collection
Surveys
Creating good surveys
length
differentiation (teachers, staff, parents,
community, etc.)
quantitative data
attitudinal data
timing/response rates (getting returns!)
www.sun-associates.com/eval/samples/samplesurv.html
Surveys
Online
Profiler
LoTi
Zoomerang
Survey Issues
Online surveys produce high response
rates
Easy to report and analyze data
Potential for abuse
Depends on access to connectivity
Focus Groups/Interviews
Teachers
Parents
Students
Administrators
Other stakeholders
Classroom Observations
Using an observation template
Using outside observers
Other Data Elements?
Artifact analysis
A rubric for analyzing teacher and student
work?
Solicitation of teacher/parent/student
stories
This is a way to gather truly qualitative data
What does the community say about the use
and impact of technology?
Dissemination
Compile the report
Determine how to share the report
School committee presentation
Press releases
Community meetings
Conclusion
Build evaluation into your technology
planning effort
Remember, not all evaluation is
quantitative
You cannot evaluate what you are not
looking for, so it’s important to
develop expectations of what constitutes
good technology integration
More Information
[email protected]
978-251-1600 ext. 204
www.sun-associates.com/evaluation
www.edtechevaluation.com
This presentation is linked to that page