Wieman Presentation - CCLI/TUES: Course, Curriculum, and Laboratory Improvement

Measuring Impact in STEM Ed;
Are they thinking like experts?
Carl Wieman
Assoc. Director for Science
White House Office of Science and
Technology Policy
The White House perspective
“Maintaining our leadership in research and
technology is crucial to America’s success. But if
we want to win the future – if we want innovation
to produce jobs in America and not overseas –
then we also have to win the race to educate our
kids.” B. Obama
Major Policy questions
What is effective teaching, particularly in STEM?
Can it be developed? How?
How can we achieve better learning? (evidence!)
switching hats to science education researcher
What is the broad goal of your project?
→ how to measure
What is the learning that matters to you?
(30 s)
Bunch of facts & solution techniques?
May be useful, but students use a tiny fraction of what they
learn in school, and in a career need vastly more than they
learned in school.
Want them to understand _____! [DNA, relativity, PH…]
What does “understand” mean?
How measure if achieved?
Think about and use ____ like a scientist/engineer.
“Think like a scientist/engineer.”
I. What does that mean?
Expert thinking (cog. psych.)
II. Development of expert thinking
III. More details on expert thinking
IV. Measuring --developing tools
Major advances past 1-2 decades
Consistent picture of achieving learning, emerging from
classroom studies, brain research, and cognitive psychology
→ principles of learning help design experiments and
make sense of results. Understand both what and why.
Expert competence research*
historians, scientists, chess players, doctors,...
Expert competence =
• factual knowledge
• mental organizational framework (patterns, relationships,
scientific concepts) → retrieval and application
• ability to monitor own thinking and learning
("Do I understand this? How can I check?")
New ways of thinking-- require MANY hours of intense
practice to develop
*Cambridge Handbook on Expertise and Expert Performance
Significantly changing the brain, not just adding
bits of knowledge.
Building proteins, growing neurons → enhanced
neuron connections, ...
Brief digression on research on development
of expertise.
Essential element of developing expertise*
“Deliberate practice” (A. Ericsson)
• task at a challenging but achievable level that requires
explicit expert-like thinking. Intensely engaged.
•reflection and guidance on result
•repeat & repeat & repeat, ...
10,000 hours later-- very high level expertise
Different brain, develops with “exercise.”
CEW interpretation -- “formative assessment”,
“constructivism”, “self-regulated learning” are all
contained in the “deliberate practice” framework.
* accurate, readable summary in “Talent Is Overrated,” by G. Colvin
“Think like a scientist/engineer.”
I. What does that mean?
Expert thinking (cog. psych.)
II. Development of expert thinking
III. More details on expert thinking
IV. Measuring --developing tools
How experts solve a problem --
“Cognitive task analysis”
(and how it differs from non-experts)
What are the features in your discipline? (1 min)
•concepts and mental models (analogies)
• testing these and recognizing when they do or
do not apply
•distinguishing relevant & irrelevant information
•established criteria for checking suitability of
solution method or final answer (“sensemaking and self-checking”)
“How Scientists Think in the Real World: Implications for
Science Education,” K. Dunbar, Journal of Applied Developmental
Psychology 21(1): 49-58 (2000)
Lots of complex pattern recognition
What features and relationships important?
Which are not?
(“surface features” vs. “underlying structure”)
Often hear: “Novice problem solvers just do pattern matching;
experts use more sophisticated concept-based
strategies.”
CEW unproven claim (not official WH position) --
it is all pattern matching -- experts just
look for and recognize different patterns.
Non-cognitive elements of thinking like a scientist.
Perceptions/attitudes/beliefs
(important, but can be changed more quickly; an essential
precursor to “deliberate practice”)
Perceptions about science (& how learned and used)*

Novice:
• Content: isolated pieces of information to be memorized.
• Handed down by an authority. Unrelated to world.
• Problem solving: simple matching to memorized recipes.

Expert:
• Content: coherent structure of concepts.
• Describes nature, established by experiment.
• Problem solving: systematic concept-based strategies. Widely applicable.

Consistent views across scientists in a discipline (physics, chem, bio).
*adapted from D. Hammer
Student Perceptions/Beliefs
Kathy Perkins, M. Gratny
[Histogram: CLASS Overall Score, 0 (Novice) to 100 (Expert),
measured at start of 1st term of college physics; percent of students
(0-60%) for All Students (N=2800), Intended Majors (N=180), and
Actual Majors (N=52).]
Student Beliefs
[Histogram: CLASS Overall Score, 0 (Novice) to 100 (Expert),
measured at start of 1st term of college physics; percent of students
(0-60%) for Actual Majors who were originally intended phys majors
vs. Actual Majors who were NOT originally intended phys majors.]
Course Grade in Phys I or Phys II
(beliefs more important factor than grades)
[Histogram: grade in 1st term of college physics (DFW, C, B, A);
percent of students (0-45%) for All Students (mean 2.7/4), Intended
Majors (2.7/4), and Actual Majors (3.0/4).]
Creating tests to measure expert thinking
as different from non-expert (technical details)
A. Cognitive
Must understand student thinking!
No substitute for interviews.
Cognitive -- think-aloud solution to a task. Look for
consistent features that appear.
Code interviews and have independent coding to make it
objective (see the agreement sketch below). (BEWARE CONFIRMATION BIAS!)
Things to look for
•What mental models?
•How make decisions?
•What resources called upon (or not)?
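A minimal sketch of checking the independent coding for agreement, assuming two coders and Cohen's kappa; the statistic, the category labels, and the data are my illustrative choices, not named on the slide:

from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Agreement between two interview coders, corrected for chance."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance agreement from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Two coders labeling the same ten think-aloud segments.
coder1 = ["model", "recipe", "model", "check", "model",
          "recipe", "check", "model", "recipe", "model"]
coder2 = ["model", "recipe", "model", "model", "model",
          "recipe", "check", "model", "recipe", "check"]
print(cohens_kappa(coder1, coder2))  # ~0.68, substantial agreement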
Creating tests to measure expert thinking
as different from non-expert
Example -- testing use of expert mental model
“troubleshooting”
Your laser suddenly puts out only half as much light
as it did before. What change may have
produced this result?
“redesign”
What are all the ways you could double the power
coming out of your laser?
You would like to …(e.g. build a bridge across this river).
What information do you need to solve this problem?
Steps in test development
1. Interview faculty.
2. Interview students -- understand student thinking.
3. Open-ended survey questions to probe.
4. Create multiple-choice test -- answer choices reflect
actual student thinking.
5. Validation interviews on test -- experts and
sample population.
6. Administer to classes -- run statistical tests on
results (see the sketch below).
Often iterate and/or skip steps, refine.
“Reasonable” data much better than no data!
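A hedged sketch of the statistical tests in step 6, using classical item difficulty and point-biserial discrimination; the helper names (item_stats, corr) and the data are made up for illustration:

import statistics

def item_stats(responses):
    """responses: one list of 0/1 item scores per student."""
    totals = [sum(student) for student in responses]
    stats = []
    for i in range(len(responses[0])):
        item = [student[i] for student in responses]
        difficulty = sum(item) / len(item)  # fraction answering correctly
        # Discrimination: correlate the item with the rest of the test.
        rest = [t - item[k] for k, t in enumerate(totals)]
        stats.append((difficulty, corr(item, rest)))
    return stats

def corr(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Five students, three items (1 = correct).
scores = [[1, 1, 0], [1, 0, 0], [0, 0, 0], [1, 1, 1], [1, 1, 0]]
for i, (p, r) in enumerate(item_stats(scores), 1):
    print(f"item {i}: difficulty {p:.2f}, discrimination {r:.2f}")

Low-difficulty, low-discrimination items are the candidates to revise in the iteration step.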
Measuring perceptions. Same basic approach.
Interview students, capture perceptions in
their own words.
Survey as to level of agreement:
~40 statements, strongly agree to strongly disagree, e.g.:
“Understanding physics basically means being able to recall something
you've read or been shown.”
“I do not expect physics equations to help my understanding of the
ideas; they are just for doing calculations.”
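A minimal sketch of how such level-of-agreement responses might be scored against an expert consensus, CLASS-style; the statements are abbreviated from the slide, and the collapsing scheme is my assumption, not the actual instrument:

# Collapse the 5-point scale to agree (+1) / neutral (0) / disagree (-1).
LIKERT = {"SD": -1, "D": -1, "N": 0, "A": 1, "SA": 1}

# Expert consensus direction; experts disagree with both statements.
EXPERT = {
    "understanding = recalling what you've been shown": -1,
    "equations are just for doing calculations": -1,
}

def overall_score(responses):
    """responses: {statement: Likert label}. Returns % expert-like."""
    scored = [(s, LIKERT[r]) for s, r in responses.items() if LIKERT[r] != 0]
    favorable = sum(1 for s, v in scored if v == EXPERT[s])
    return 100 * favorable / len(scored) if scored else 0.0

print(overall_score({"understanding = recalling what you've been shown": "D",
                     "equations are just for doing calculations": "A"}))  # 50.0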
Conclusion
Important educational goal “Thinking like a scientist”
Requires careful analysis to make it explicit, and to
distinguish it from the thinking of non-experts.
Straightforward process to create tests that measure it.
More sensitive and meaningful than typical exams.
Development and validation of instruments to measure learning
of expert-like thinking, W. Adams and C. Wieman, Int. J. Sci. Ed.
(in press). Covers the last part of the talk and technical details.
Tips for developing assessment tools.
1. Interview largest possible range of people.
Patterns and expert-novice differences more obvious.
2. 100+ student classes at a large university don't
vary year-to-year. Good way to get test-retest
reliability and find out if you can measure changes.
3. Best questions: a) measure an important aspect of
student thinking and learning;
b) measure an aspect that instructors care about & are
shocked by poor results on.
4. Hard and not so useful to measure expert-like
thinking on everything. Sample as proxy.
Key elements of a good concept inventory:
• created by physicists; key concepts where student
failure is shocking
(not probed by standard exams)
• easy to administer pre & post -- measures learning
from this course
• set of hard-to-learn topics (not everything);
proxy for broader learning (mastery & application of
concepts)
• suitable for use with a wide range of institutions and
students
How to administer?
Attitude surveys -- online, 1st and last week of class;
small bonus mark for completion. 80-98% participation.
Concept inventories -- Pre: in class, 1st week. Paper,
scantron. Students do not keep the test.
Post: in class, last week (“guide to in-class review and study for
final exam”). No effect on course mark. Occasional question on the
final. 90+% participation.
Summary:
Data to drive educational improvement
Requirements
• measure value added (pre-post)
• easy to use (more important than perfection)
• test “expert-thinking” of obvious value to instructor
• validated (measure what is claimed)
• need many such instruments to use across
curriculum (collaborate)
instruments & research papers
class.colorado.edu
CWSEI.ubc.ca
Measuring conceptual mastery
• Force Concept Inventory -- basic concepts of force and motion,
1st-semester university physics. Simple real-world applications.
Ask at start and end of semester -- what % learned? (100's of courses)
[Histogram: fraction of unknown basic concepts learned, averaged
per course; 16 traditional lecture courses cluster low, improved
methods score higher.]
On average, students learn <30% of the concepts they did not already know.
Lecturer quality, class size, institution,...doesn't matter!
Similar data for conceptual learning in other courses.
R. Hake, “...A six-thousand-student survey...,” AJP 66, 64-74 (1998).
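The “fraction of unknown basic concepts learned” is the normalized gain from Hake's cited survey; a small sketch of the arithmetic (the example percentages are illustrative, not data from the talk):

def normalized_gain(pre_pct, post_pct):
    """<g> = (post - pre) / (100 - pre): the fraction of concepts
    not already known at the start that were learned."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

print(normalized_gain(45, 60))  # ~0.27, in line with the <30% for traditional lectures
print(normalized_gain(45, 72))  # ~0.49, typical of improved (interactive) methods in Hake's survey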
In nearly all intro classes, the average shifts to be
5-10% less like a scientist.
Explicit connection with real life → ~0% change
+ Emphasize process (modeling) → +10% !!
What every teacher should know
Components of effective teaching/learning
apply to all levels, all settings
(basic cognitive & emotional psychology, diversity)
1. Motivation (lots of research)
2. Connect with prior thinking
3. Apply what is known about memory
a. short-term limitations (relevant to you)
b. achieving long-term retention:
retrieval and application -- repeated & spaced in time
*4. Explicit authentic practice of expert thinking.
Extended & strenuous.
Measuring student (dis)engagement. Erin Lane
Watch a random sample group (10-15 students). Check
against a list of disengagement behaviors every 2 min (see the sketch below).
[Chart: example of data from an earth science course; x-axis: time (minutes).]
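A minimal sketch of turning those 2-minute checks into a time series; the function name and the counts are hypothetical:

def disengagement_series(counts, sample_size):
    """counts: disengaged students seen at each 2-minute check."""
    return [(2 * i, c / sample_size) for i, c in enumerate(counts)]

# 12 students sampled; checks at t = 0, 2, 4, ... minutes.
for minute, frac in disengagement_series([1, 2, 2, 5, 7, 4], 12):
    print(f"t={minute:2d} min: {frac:.0%} disengaged")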
Design principles for classroom instruction
1. Move simple information transfer out of class.
Save class time for active thinking and feedback.
2. “Cognitive task analysis” -- how does an expert think
about problems?
3. Class time filled with problems and questions that
call for explicit expert thinking, address novice
difficulties, are challenging but doable, and are
motivating.
4. Frequent specific feedback to guide thinking.
(items 3 & 4: “DP”, deliberate practice)