EDIT9990 - University of Georgia



Design-Based Research for
Advancing Educational Technology
EDIT 9990
Goals
 Critique the state-of-the-art of educational research.
 Describe applications of “design-based research.”
 Encourage new thinking about why and how we do research.
Improving the quality of teaching and learning through educational research is critical to our survival.
Global Warming
Dutch Floating Homes
The lack of scientific literacy is appalling in even the most developed countries.
Bad News
Most educational research has little impact on practitioners and yields few discernible benefits.
Oh...no!
The Failure of
Educational Research
– Vast resources going into education research are wasted.
– They [educational researchers] employ weak research methods, write turgid prose, and issue contradictory findings.
The Failure of
Educational Research
– Too much useless work is done under the banner of qualitative research.
– Qualitative research… [yields] …little that can be generalized beyond the classrooms in which it is conducted.
College of Education
The University of Georgia
 Ranked 27th of 187 education colleges in the USA
 240 faculty members in 9 departments
 5,000 students in a 33,000+ student university
Research Productivity 1997-2001
Refereed Journal Articles (in-cites.com)
 U. of Wisconsin - 202
 U. of Georgia - 201
 U. of Michigan - 164
 Indiana U. - 161
 U. of Maryland - 146
Georgia vs. Wisconsin

                        Georgia    Wisconsin
  Per-pupil spending    $7,824     $8,604
  Teacher salary        $44,073    $42,232
  HS graduation rate    51%        78%
  Ranking               49th       7th
It is time we put the PUBLIC
back in publication!
Bush Administration Position
“There’s been no improvement in education over the last 30 years, despite a 90 percent increase in real public spending per pupil.”
 Promotes randomized controlled trials as used in medical research.

Four Reform
Principles
Accountability: Guaranteeing Results
Flexibility: Local Control for Local
Challenges
Research-Based Reforms: Proven
Methods with Proven Results
Parental Options: Choices for
Parents, Hope for Kids
What Works
Clearinghouse
The What Works Clearinghouse
(WWC) has been established by the
U.S. Department of Education’s
Institute of Education Sciences to
provide educators, policymakers,
researchers, and the public with a
central and trusted source of
scientific evidence of what works in
education.
Slavin’s 5 Questions for Valid
Educational Research
Is there a control group?
 Are the control and
experimental groups
assigned randomly?
 If a matched study, are the
groups extremely similar?
 Is the sample size large
enough?
 Are the results statistically
significant?

Robert Slavin
“What Works” Position
“Once we have dozens or
hundreds of randomized
or carefully matched
experiments going on
each year on all aspects
of educational practice,
we will begin to make
steady, irreversible
progress.”
 NCLB funds “scientifically
based research.”

Robert Slavin
“It Won’t Work” Position
Double blind experiments
impossible in education
 Implementation variance
reduces treatment
differences
 Causal agents are underspecified in education
 Goals, beliefs, and intentions of students and teachers affect treatments

David R. Olson
Medical and health knowledge
is rarely applied sufficiently.
Another “It Won’t Work” Position



The What Works Clearinghouse
(WWC) standards “ignore the
critical realities about social,
organizational, and policy
environments in which
educational programs and
interventions reside.”
Advocates “decision-oriented”
evaluation research over
“conclusion-oriented” academic
research.
Recommends extended-term
mixed-method (ETMM) designs
as a viable alternative.
Madhabi Chatterji
American Evaluation Association
The priority given to randomized controlled
trials “manifests fundamental
misunderstandings about 1) the types of
studies capable of determining causality, 2)
the methods capable of achieving scientific
rigor, and 3) the types of studies that support
policy and program decisions. We would like
to help avoid the political, ethical, and
financial disaster that could well attend
implementation of the proposed priority.”
Randomized controlled trials are the only way we’ll ever be able to prove “what works” in education!
Randomized controlled trials promote pseudoscience and will limit effective change!
Educational researchers have failed to make a clear appeal to the public for their support.
People learn when…
Ellen Lagemann argues
that educational
researchers, in a
misguided effort to be
“scientific,” have
turned away from the
pragmatic vision of
John Dewey.
 She attacks the
excessive emphasis
on quantitative
measurement.

Kieran Egan argues
that progressive ideas
from Herbert Spencer,
John Dewey, and Jean
Piaget are responsible
for the “general
ineffectiveness” of our
schools.
 He also assails the
notion that education
can be improved
through research as
traditionally conceived.

Thomas Kuhn
The Structure of Scientific Revolutions
"I'm not sure that there can now be such a
thing as really productive educational
research. It is not clear that one yet has
the conceptual research categories,
research tools, and properly selected
problems that will lead to increased
understanding of the educational process.
There is a general assumption that if
you've got a big problem, the way to solve
it is by the application of science. All you
have to do is call on the right people and
put enough money in and in a matter of a
few years, you will have it. But it doesn't
work that way, and it never will."
Complexity of Interactions
We cannot store up
generalizations and
constructs for
ultimate assembly
into a network.
 When we give
proper weight to
local conditions, any
generalization is a
working hypothesis,
not a conclusion.

Lee Cronbach
Learning Styles
“Research into learning styles can, in the main, be characterised as small-scale, non-cumulative, uncritical and inward-looking. It has been carried out largely by cognitive and educational psychologists, and by researchers in business schools and has not benefited from much interdisciplinary research.”
Dichotomies
 convergers versus divergers
 verbalisers versus imagers
 holists versus serialists
 deep versus surface learning
 activists versus reflectors
 pragmatists versus theorists
 adaptors versus innovators
 assimilators versus explorers
 field dependent versus field independent
 globalists versus analysts
 assimilators versus accommodators
 imaginative versus analytic learners
 non-committers versus plungers
 common-sense versus dynamic learners
 concrete versus abstract learners
 random versus sequential learners
 initiators versus reasoners
 intuitionists versus analysts
 extroverts versus introverts
 sensing versus intuition
 thinking versus feeling
 judging versus perceiving
 left brainers versus right brainers
 meaning-directed versus undirected
 theorists versus humanitarians
 activists versus theorists
 pragmatists versus reflectors
 organisers versus innovators
 lefts/analytics/inductives/successive processors versus rights/globals/deductives/simultaneous processors
 executive, hierarchic, conservative versus legislative, anarchic, liberal
If Sisyphus were a scholar, his field would be educational research.
- David Labaree
Educational Technology Research
Pseudoscience Results
Insert cone of experience example
Educational technology researchers
are not doing much better than other
educational researchers.
NCLB Requirements
"every student is
technologically literate
by the time the student
finishes the eighth
grade," and
 "that technology will be
fully integrated into the
curricula and
instruction of the
schools by December
31, 2006."

Abundant technology
has not led to
extensive use of
computers for
“tradition-altering
classroom
instruction.”
 The small percentage of computer-using instructors use them only to maintain existing classroom practices.

Teachers have legitimate concerns.
Is it simple enough for me to learn quickly?
 Is it versatile?
 Will it motivate students?
 Is it aligned with skills I’m expected to teach?
 Is it reliable?
 If it breaks, who will help?
 Will it weaken my classroom authority?

Ed. Tech Research Reality
Isolated researchers conduct
individual studies rarely linked to
a research agenda or concerned
with any relationship to practice.
 Studies are presented at
conferences attended by other
researchers and published in
journals few people read.
 Occasional literature reviews
and meta-analyses are
published.

Ed. Tech Research Reality
Many educational technology
studies claim to have predictive
goals (testing theories) and use
quasi-experimental designs with
quantitative measures.
 Research reviewers usually
must reject 75 percent or more
of the published studies to find
the few worthy of further review
or inclusion in meta-analyses.

Ed. Tech Research Reality
Dillon & Gabbard’s 1998 literature
review of “Hypermedia as an
Educational Technology” highlights
problems with IT research.
 Major conclusion: “Clearly, the
benefits gained from the use of
hypermedia technology in learning
scenarios appear to be very limited
and not in keeping with the generally
euphoric reaction to this technology
in the professional arena.”

Ed. Tech Research Reality
Fabos & Young 1999 literature review
of “Telecommunications in the
Classroom: Rhetoric Versus Reality” is
another bad sign.
 Major conclusion: “…many
of the expected benefits of
telecommunications
[enhancing writing,
multicultural awareness,
and economic possibilities]
are inconclusive, optimistic,
and even contradictory.”

Bernard et al. (2004) Meta-analysis:
“How Does Distance Education
Compare to Classroom Instruction?”
 a very small but positive mean effect size for interactive distance education over traditional classroom instruction on student achievement
 a small negative effect for retention rate
DE Research from 1985-2002
 1,010 potential “studies”
retrieved
 232 studies met all criteria
 599 independent effect sizes
 47,341 students (achievement)
Results: Overall Effects
 325 independent outcomes (total
achievement)
 Hedges’ g = +0.0122, p < .001
 Range of findings from –2.17 to +2.66
 177 outcomes with low methodology
removed
 Hedges’ g = +0.017, p > .05
 Significantly heterogeneous
Distribution of Effect Sizes
[Figure: the 325 independent achievement effect sizes (Hedges’ g) ordered by magnitude, ranging from about –3 to +3; overall Hedges’ g = +0.0122, p < .001.]
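The Hedges’ g statistic reported above is a standardized mean difference between two groups, adjusted for small-sample bias. A minimal Python sketch of the computation (the means, standard deviations, and group sizes below are hypothetical illustrations, not values from the Bernard et al. meta-analysis):

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Hedges' g: standardized mean difference with a small-sample correction."""
    # Pooled standard deviation of the two groups
    sp = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                 # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)   # small-sample correction factor
    return j * d

# Hypothetical example: a distance-education group scoring slightly
# above a classroom group on the same achievement test.
print(round(hedges_g(75.1, 10.0, 50, 74.0, 10.0, 50), 3))  # → 0.109
```

Seen this way, the meta-analysis’s overall g of +0.0122 means the group means differed by about one-hundredth of a pooled standard deviation, which is why the effect is described as very small.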
Sir John Daniel - UNESCO
… the futile tradition of comparing
test performances of students
using new learning technologies
with those who study in more
conventional ways…is a pointless
endeavor because any teaching
and learning system, old or new,
is a complex reality. Comparing
the impact of changes to small
parts of the system is unlikely to
reveal much effect and indeed,
“no significant difference” is the
usual result of such research.
Chewing Gum More Effective than
Interactive Multimedia CD-ROM
Dr. Ken Allen at NYU wanted to compare CD-ROM with lectures
 Wrigley’s wanted to fund chewing gum study
 Combined study
 Gum chewers = B-; Abstainers = C+
 CD-ROM no better

Media comparison studies are akin to
comparing copper bracelets with
voodoo dolls as medical cures.
Experimental approaches to
educational technology research
won’t work in the way that they do
in medical research!
Genetics research (basic) → Medical cures (applied)
“Pasteur’s Quadrant” approach to
research is needed (Stokes, 1997).
Research is inspired by:

                                        Considerations of use?
                                        No          Yes
  Quest for fundamental     Yes         Bohr        Pasteur
  understanding?            No                      Edison
Good News
There are new strategies for conducting “design-based research” that can improve our research so that it can become a socially responsible enterprise.
Thank
goodness!
Educational researchers often fail
to distinguish between research
goals and methods.
Six Ed. Tech research goals:
Theoretical
Predictive
Interpretivist
Postmodern
Design/Development
Action/Evaluation
Theoretical Goals
Focus on explaining
phenomena through
logical analysis and
synthesis of
principles and results
from other studies
 EXAMPLE: Gagné’s theory of the conditions of learning

Predictive Goals
Focus on determining
how education works
by testing hypotheses
related to theories of
learning, teaching,
performance, etc.
 EXAMPLE:
cooperative learning
and control studies by
Hooper, Temiyakarn,
and Williams

Simon Hooper
Interpretivist Goals
Focus on determining
how education works by
describing and
interpreting phenomena
related to learning,
teaching, performance,
etc.
 EXAMPLE: Delia
Neuman’s observations of
disabled children using
commercial software

Delia Neuman
Postmodern Goals


Focus on examining the
assumptions underlying
educational programs with
the goal of revealing
hidden agendas and
empowering
disenfranchised minorities
EXAMPLE: Ann
DeVaney’s analysis of IT in
relation to race, gender,
and power
Design/Development Goals


Focus on dual
objectives of
developing creative
approaches to solving
problems and
constructing reusable
design principles
EXAMPLE: Sasha
Barab’s “Quest Atlantis”
project and “Learning
Engagement Theory”
Action/Evaluation Goals
Focus on describing,
improving, or estimating
the effectiveness and
worth of a particular
program
 EXAMPLE: Hill and Reeves’ four-year evaluation of a ubiquitous computing initiative.

Methods should not be selected until
goals & research questions are clear:
 Quantitative
 Qualitative
 Critical Theory
 Historical
 Literature Review
 Mixed-methods
So what does design-based research look like in the real world?
Design-Based Research Collective
Goals of designing learning environments
and theories are intertwined
 Development and research occur in
continuous cycles
 Research on designs leads to sharable
theories relevant to practitioners
 Research must account for how designs
function in authentic settings
 Development of accounts relies on methods
that connect actions to outcomes

Design-Based Research Strategies
Define a pedagogical outcome and create
learning environments that address it.
 Emphasize content and
pedagogy rather than
technology.
 Give special attention to
supporting human
interactions.
 Modify learning
environments until
outcome is reached.

Chris Dede – Harvard University
River City Curriculum &
Situated Learning Theory
Yasmin Kafai - UCLA
van den Akker, Nieveen,
McKenney – University of Twente
Design-Based Research Example
“Authentic Learning in Interactive
Multimedia Environments.”
 Ph.D. dissertation by Jan Herrington at
Edith Cowan University in Australia.
 Supervised by Professor
Ron Oliver.
 Winner of AECT
Young
Researcher
of the Year
in 1999.

Outcome Practitioners Desired
New teachers will
use a wider
variety of
assessment
methods in their
student teaching
experience and
eventual practice.
Learning Environment Design
 Identified the critical characteristics of a situated learning model.
 Developed an interactive multimedia learning environment based on those characteristics.
Situated Learning Model
Herrington 1997





Provide an authentic context
reflecting the way the knowledge
will be used in real-life
Provide authentic activities
Provide access to expert
performances and the
modeling of processes
Provide multiple roles
and perspectives
Support collaborative
construction of knowledge
Situated Learning Model
Herrington 1997
Promote reflection to enable
abstractions to be formed
 Promote articulation to enable
tacit knowledge to be made
explicit
 Provide coaching and scaffolding
at critical times
 Provide for integrated
assessment of learning within
the tasks.

Mixed Methods Design
Videotaped
preservice teachers
using program
 Interviewed teachers
and their supervisors
in schools during
student teaching
practicum
 Logged usage data

Findings

Problem Solved:
– Novice teachers acquired
advanced knowledge while
engaging in higher order
thinking
– New knowledge and skills
applied in practicum

Design Principles:
– Situated learning model is
a successful design model
for interactive learning
Authentic Learning Team
Reeves, Herrington, Oliver
What challenges do we face in adopting design-based research approaches?
Design-Based Research Challenges
–Sampling bias
–Response bias
–Researcher bias
–Overwhelming data
–Confounded variables
–Dissemination
–Scaling up
Ann Brown
Design-Based Research Collective
We suggest that the value of design-based research should be measured by its ability to improve educational practice.
Because we are a design profession (not a discipline), educational technologists should pursue design-based research that integrates the desire to solve problems with the search for knowledge.
“The status of research
deemed educational
would have to be
judged, first in terms of
its disciplined quality
and secondly in terms
of its impact. Poor
discipline is no
discipline. And
excellent research
without impact is not
educational.”
- Charles W. Desforges (2000)