Data Collection Techniques


Data Collection Techniques
For Technology Evaluation
and Planning
Contact Information
[email protected]
978-251-1600 ext. 204
www.edtechevaluation.com
This presentation will be linked from that site (on the
Tools page)
Where Do We Stand?
Who’s working on an actual project?
Current?
Anticipated?
Your expectations for today
Objectives
To review the key elements of effective
program evaluation as applied to
technology evaluations
Understanding the role of data collection
in an overall evaluation process
Reviewing various data collection
strategies
Why Evaluate?
To fulfill program requirements
NCLB and hence Title IID carry evaluation
requirements
One of the seven program requirements for NY
Title IID Competitive Grants
“Each grantee will be required to develop “process and
accountability measures” to evaluate the extent to which activities
funded are effective in (1) integrating technology into curricula and
instruction; (2) increasing the ability of teachers to teach; and (3)
enabling students to meet challenging State standards. Records
relating to these “process and accountability measures” are to be
made available on request to the NYS Education Department (or
its agents).”
Project evaluation is also required as an
overall part of each proposal…
“Describe the plan for evaluating the effectiveness of the
competitive grant project. The plan should include clear
benchmarks and timelines to monitor progress toward
specific objectives and outcome measures to assess impact
on student learning and achievement. It must address the
extent to which activities funded are effective in (1)
integrating technology into curricula and instruction; (2)
increasing the ability of teachers to teach; and (3) enabling
students to meet challenging State standards.”
10% of the points…10% of the budget?
A Framework for Review
From Designing Professional Development for Teachers of Science and Mathematics, Loucks-Horsley, Hewson, Love, and Stiles. Corwin Press Inc., 1998
Evaluation
Helps clarify project goals, processes, and products
Must be tied to indicators of success written for your project's goals
Not a "test" or checklist of completed activities
Qualitatively, are you achieving your goals?
What adjustments can be made to your project to realize greater success?
The Basic Process
Evaluation Questions
Tied to original project goals
Performance Rubrics
Allow for authentic, qualitative, and holistic evaluation
Data Collection
Tied to indicators in the rubrics
Data Analysis
Role of this committee (the evaluation committee)
Scoring and Reporting

[Flowchart: Creating a District-wide Technology Evaluation]
Stage 1 (Committee orientation, evaluation framing, and training): Generate leadership support → Determine scope of the evaluation effort → Appoint Committee → Orient and Train In-District Evaluation Committee → Formulate Evaluation Questions → Review Questions
Stage 2 (Data collection and analysis): Develop Indicator Rubrics → Data Collection → Data Analysis
Stage 3 (Findings, recommendations, and reporting): Scoring the Rubrics → Findings → Recommendations → Dissemination of Report → Initiating the Next Review Cycle
Who Evaluates?
Committee of stakeholders (pg 13)
Outside facilitator?
Data collection specialists?
Task checklist (pg 11)
Data Collection vs. Evaluation
Evaluation is more than data collection
Evaluation is about…
Creating questions
Creating indicators
Collecting data
Analyzing and using data
Data collection occurs within the context
of a broader evaluation effort
Evaluation Starts with Goals
Evaluation should be rooted in your goals
for how you are going to use or integrate
technology
A logic map can help highlight the
connections between your project’s
purpose, goals, and actions
And actions form the basis for data
collection!
pg 15
Example Project Logic Map
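As a rough illustration only (this is not the map from page 15 of the handout), a logic map can be written down as a simple structure that traces each goal to the actions that serve it and the evidence those actions should produce. The goal, action, and evidence entries below are hypothetical examples drawn loosely from the sample goals later in this presentation.

```python
# A minimal sketch of a project logic map: each goal points to the
# actions that serve it and the evidence those actions should produce.
# All entries are hypothetical examples, not the handout's map.
logic_map = {
    "Improve student achievement through authentic science learning": {
        "actions": ["Develop technology-enhanced inquiry units",
                    "Provide teacher professional development"],
        "evidence": ["Student work samples", "Classroom observations",
                     "Teacher surveys"],
    },
    "Provide advanced learning opportunities to all students": {
        "actions": ["Apply universal design principles to each unit"],
        "evidence": ["Unit reviews against a design rubric",
                     "Teacher interviews"],
    },
}

# Actions form the basis for data collection, so list the evidence
# each goal depends on.
for goal, detail in logic_map.items():
    print(goal)
    for source in detail["evidence"]:
        print("  collect:", source)
```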
Goals Lead to Questions
What do you want to see happen?
These are your goals
Rephrase goals into questions
Achieving these goals requires a process
that can be measured through a formative
evaluation
We Start with Goals…
To improve student achievement through their participation in authentic and meaningful science learning experiences.
To provide advanced science and technology learning opportunities to all students regardless of learning styles or abilities.
To produce high quality science and technology curriculum in which the integration of technology provides "added value" to teaching and learning activities.
To increase students' knowledge of the Connecticut River's history and geology, and to gain an understanding of its past, present, and possible future environmental issues.
…and move to questions
Has the project developed technology-enhanced
science learning experiences that have been
instrumental in improving student mastery of the
Skills of Inquiry, understanding of the
history/geology/ecology of the Connecticut
River, and of the 5-8 science curriculum in
general?
Has the project offered teacher professional
development that has resulted in improved
teacher understanding of universal design
principles and technology integration strategies?
…And Then to Indicators
What is it that you want to measure?
Whether the projects have enhanced learning
The relationship between the units and:
the selected curriculum
the process by which they were developed
Increases in teacher technology skills (in relation to
particular standards)
Whether the professional development model met
with its design expectations
Collaborative and sustainable
Involves multiple subjects and administrators
Indicators should reflect your project’s unique
goals and aspirations
Rooted in proposed work
Indicators must be indicative of your unique
environment...what constitutes success for you might
not for someone else
Indicators need to be highly descriptive and can
include both qualitative and quantitative measures
You collect data on your indicators
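One hedged way to picture an indicator rubric (a sketch under assumptions, not the workshop's actual rubric) is as a set of indicators, each with descriptive performance levels and the data sources used to score it:

```python
# Illustrative indicator rubric: each indicator carries descriptive
# performance levels and the data sources used to score it.
# Wording and levels are hypothetical.
indicator_rubric = [
    {
        "indicator": "Technology adds value to science teaching and learning",
        "levels": {
            4: "Technology is integral to inquiry in most observed lessons",
            3: "Technology supports inquiry in some lessons",
            2: "Technology is present but largely supplemental",
            1: "Little or no meaningful technology use observed",
        },
        "data_sources": ["observations", "artifact review", "teacher interviews"],
    },
]

def score_indicator(entry, level):
    """Record the committee's holistic score alongside its descriptor."""
    return {"indicator": entry["indicator"],
            "score": level,
            "descriptor": entry["levels"][level]}

print(score_indicator(indicator_rubric[0], 3))
```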
[Flowchart repeated: Creating a District-wide Technology Evaluation, Stages 1-3 as shown above]
Evidence?
Classroom observations, interviews, and work-product review
What are teachers doing on a day-to-day basis to
address student needs?
Focus groups and surveys
Measuring teacher satisfaction
Triangulation with data from administrators and
staff
Do other groups confirm that teachers are being
served?
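As a loose sketch of triangulation, findings from each source can be logged against the same question and compared, with disagreement flagged for follow-up. The question, sources, and findings below are hypothetical:

```python
# Hedged sketch of triangulation: log findings for one evaluation
# question from several sources and flag disagreement for follow-up.
# The question, sources, and findings are hypothetical.
evidence = {
    "Are teachers using the units in day-to-day instruction?": {
        "teacher survey": "yes",
        "classroom observations": "partially",
        "administrator interviews": "yes",
    },
}

for question, sources in evidence.items():
    findings = set(sources.values())
    status = "consistent" if len(findings) == 1 else "needs follow-up"
    print(f"{question} -> {status}: {sources}")
```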
Data Collection
Review Existing Data
Current technology plan
Curriculum
District/school improvement plans
Others?
Tools and Techniques
Surveys
Interviews
Observations
Artifact Analysis
Surveys
Online vs. Paper
Is there sufficient connectivity?
Doesn’t have to be at the classroom level
Often works best if people complete the instruments
all at the same time
Same goes for paper surveys
Online surveys provide immediate data
Results come as spreadsheets that can be exported to a variety of programs for analysis
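Because online tools deliver results as spreadsheets, a short script can give an immediate read on the data. The sketch below assumes a hypothetical export named responses.csv with school and confidence columns; real column names depend on the survey tool.

```python
import csv
from collections import Counter, defaultdict

# Hedged sketch: summarize a survey export. "responses.csv", "school",
# and "confidence" are hypothetical names; substitute whatever your
# survey tool actually exports.
with open("responses.csv", newline="", encoding="utf-8") as f:
    responses = list(csv.DictReader(f))

by_school = Counter(row["school"] for row in responses)
confidence = defaultdict(list)
for row in responses:
    confidence[row["school"]].append(int(row["confidence"]))

print("Responses received:", len(responses))
for school, count in by_school.items():
    average = sum(confidence[school]) / count
    print(f"{school}: {count} responses, average confidence {average:.1f}")
```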
Surveys
Online
VIVED
Profiler
LoTi
Zoomerang
SurveyMonkey.com
Make Your Own!
www.sun-associates.com/neccsurv.html
Based on a CGI script on your webserver
Outputs to a text file, readable by Excel
Works with yes/no, choose from a list, and free text input (no branching)
www.sun-associates.com/surveyws/surveys.html
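The script mentioned above is Sun Associates' own; purely as a hypothetical stand-in, a server-side script along the following lines could append each submission to a tab-delimited text file that Excel opens directly. The field names and output path are assumptions.

```python
#!/usr/bin/env python3
# Hypothetical sketch of a server-side survey collector (not the
# Sun Associates script): reads a posted form and appends one
# tab-delimited row per response to a text file Excel can open.
import os
import sys
from urllib.parse import parse_qs

FIELDS = ["school", "grade", "uses_tech_weekly", "comments"]  # assumed field names
OUTFILE = "survey_results.txt"                                # assumed output path

length = int(os.environ.get("CONTENT_LENGTH") or 0)
form = parse_qs(sys.stdin.read(length))

row = [form.get(name, [""])[0].replace("\t", " ") for name in FIELDS]
with open(OUTFILE, "a", encoding="utf-8") as out:
    out.write("\t".join(row) + "\n")

print("Content-Type: text/html\n")
print("<p>Thank you. Your responses have been recorded.</p>")
```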
Survey Tips
Keep them short (under 10 minutes)
Avoid long checklists
Allow for text comments
Support anonymity
But allow for categorical identification (school, job function, grade, etc.)
Coordinate and support survey
administration
Avoid the “mailbox stuffer”
Work with building leaders
Provide clear deadlines
Three Big Points
Surveys alone mean nothing
TRIANGULATE!
100% response rate is virtually impossible
On the other hand, nearly 100% is very
possible if you follow our tips!
Share the data
No one wants to fill in forms for no purpose
Interviews
Serve to back up and triangulate survey
data
Less anonymous than surveys
Mixed blessing...
Allows for immediate follow-up of
interesting findings
Interviewing Tips
Keep groups as homogeneous as feasible
By grade, job function, etc.
Be attentive to power structures
Don’t mix principals with teachers; tech
coordinators with teachers; central office staff
with principals; etc.
Use outside interviewers
People will explain things to us (because they have
to!)
We avoid the power structure issues
We’ve done this before
Structure and focus the interviews
Use a well-thought-out and designed protocol
Only diverge after you’ve covered the basic question
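A protocol can be as simple as a fixed list of core questions per stakeholder group, covered in order before any divergence. The groups and questions below are illustrative only, not the workshop's protocol:

```python
# Illustrative interview protocol: a fixed set of core questions per
# stakeholder group, covered before any divergence. The questions are
# hypothetical examples, not the workshop's protocol.
protocol = {
    "teachers": [
        "Describe a recent lesson in which you used the new units.",
        "What support have you received for integrating technology?",
        "What barriers remain?",
    ],
    "principals": [
        "How is the project visible in classrooms in your building?",
        "What evidence of changed practice have you seen?",
    ],
}

for group, questions in protocol.items():
    print(f"--- {group} ---")
    for number, question in enumerate(questions, 1):
        print(f"{number}. {question}")
```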
Three Big Points
Create protocols after you’ve seen survey
data
Homogeneity and power
Use outsiders to conduct your interviews
Observations
The third leg of your data triangle
Surveys - Interviews - Observations
Familiar yet different
You’ve done this before...but not quite
Progressively less “objective” than
surveys and interviews
Observation Tips
Ensure that teachers understand the point and
focus of the observations
You’re evaluating a project, not individuals!!
Sample
You can’t “see” everything
So think about your sample
You can learn as much from an empty
classroom as an active one
Look at the physical arrangement of the room
Student materials
How is this room being used?
Outside observers are necessary unless
you simply want to confirm what you
already know
Avoid turning observations into a
“technology showcase”
Showcases have their place -- mostly for
accumulating and reviewing “artifacts”
But the point of observations is to take a
snapshot of the typical school and teacher
Three Big Points
Observe the place as well as the people
Observations are not intended to record
the ideal...rather, the typical
Use outside observers
Artifact Analysis
Reviewing “stuff”
Lesson plans
Teacher materials
Student work
Create an artifact rubric
Not the same as your project evaluation
indicator rubric
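To make the distinction concrete, an artifact rubric can be sketched as a few criteria applied to each lesson plan or piece of student work reviewed; the criteria and scores below are hypothetical:

```python
# Hedged sketch of an artifact rubric, separate from the project
# indicator rubric: a few criteria applied to each lesson plan or
# piece of student work. Criteria and scores are hypothetical.
artifact_criteria = [
    "Technology adds value beyond what paper alone could do",
    "Activity aligns with the district curriculum",
    "Student work shows evidence of inquiry skills",
]

def review_artifact(name, scores):
    """scores: one rating from 1 to 4 per criterion, in order."""
    review = dict(zip(artifact_criteria, scores))
    review["artifact"] = name
    review["average"] = sum(scores) / len(scores)
    return review

print(review_artifact("Grade 6 river-ecology lesson plan", [3, 4, 2]))
```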
10 Tips for Data Collection
Challenge your assumptions
But also don’t waste time by asking the
obvious
Cast a wide net
It’s all about stakeholders
Dig deep
Try to collect the data that can’t easily be
observed or counted
Use confirming sources
Triangulate! Surveys alone do nothing.
Have multiple writers
Stakeholders and different perspectives
Think before you collect
Choose questions carefully and with regard
to what you really expect to find
Set (reasonable) expectations for
participation
Time and effort
Forget about mailbox surveys
They usually waste more time than they're worth
Report back
Don’t be a data collection black hole!
It’s a process, not an event!
It does little good to collect data once and
then never again
Data collection is part of a long-term process
of review and reflection
Even if the immediate goal is only to get
“numbers” for the state forms
Dissemination
Compile the report
Determine how to share the report
School committee presentation
Press releases
Community meetings
Conclusion
Build evaluation into your technology
planning effort
Remember, not all evaluation is
quantitative
You cannot evaluate what you are not
looking for, so it’s important to —
Develop expectations of what constitutes
good technology integration
Data collection is not evaluation
Rather, it’s an important component
Data must be collected and analyzed within the context of goal-focused project indicators
More Information
[email protected]
978-251-1600 ext. 204
www.sun-associates.com/evaluation
www.edtechevaluation.com