Challenges and Trade-offs in Measuring the Outcomes of NSF’s Mathematics and Science Partnership Program:
Lessons from four years on the learning curve
Five Key Features of NSF MSP Projects
• Partnership-driven, with significant engagement of
faculty in mathematics, the sciences, and engineering
• Teacher quality, quantity, and diversity
• Challenging courses and curricula
• Evidence-based design and outcomes
• Institutional change and sustainability
Design-Implementation-Outcomes = the DIO Cycle of Evidence
From Evidence: An Essential Tool, a Math and Science Partnership Program publication developed by MSP principal investigators and evaluators of Cohort 1 and 2 projects to formulate a statement that would guide effective project-level evaluation (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf0531).
Determining What to Measure
Basic professional development logic model
Steps in decision making
• Determine a general outcome domain,
• Identify one or more indicators of that domain,
• Select a measure.
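For instance (an illustrative reading, not a sequence prescribed in the slides): a project might take improved teacher content knowledge as the outcome domain, mathematical knowledge for teaching as the indicator, and an instrument such as the Mathematical Knowledge for Teaching measure listed later in this presentation as its measure; the sample item that follows is of that kind.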
Example of a mathematical knowledge for teaching item:
8. As Mr. Callahan was reviewing his students’ work from the day’s lesson on
multiplication, he noticed that Todd had invented an algorithm that was different from
the one taught in class. Todd’s work looked like this:
  983
x   6
  488
+5410
 5898
What is Todd doing here? (Mark ONE answer.)
a) Todd is regrouping (“carrying”) tens and ones, but his work does not
record the regrouping.
b) Todd is using the traditional multiplication algorithm but working
from left to right.
c) Todd has developed a method for keeping track of place value in the
answer that is different from the conventional algorithm.
d) Todd is not doing anything systematic. He just got lucky—what he
has done here will not work in most cases.
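(A quick check of the arithmetic, for reference: Todd’s partial results are consistent with the correct product, since 488 + 5410 = 5898 and 983 × 6 = 5898.)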
Professional development logic model with illustrative indicators and measures

Inputs:
• Federal funds
• Other inputs
• Other human and material resources

Activities:
• Summer institutes
• Ongoing school-year workshops
• Online course supplements

Short-term outcomes:
• Improved content knowledge in mathematics or science
• Improved understanding of pedagogy as it relates to the content area

Long-term outcomes:
• Higher quality content area instruction
• Improved student learning in math and/or science
• Other teacher outcomes: retention, satisfaction, leadership ability
• Enhanced district/school capacity

Illustrative indicators and the measures paired with them in the model:
• Content for teaching: mathematical knowledge for teaching
• Inquiry-based teaching: Horizon classroom observation protocol
• Grouping practices, questioning strategies, student-centered work: survey
• Persistence in career: school records
• Performance on achievement test: XYZ State Test
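To connect the logic model back to the decision-making steps above, it can help to write the indicator and measure choices down as data and check them for coverage. The following is a minimal sketch in Python, assuming a team simply wants a reviewable list; it is an illustration, not anything the MSP program prescribes, and the outcome_domain groupings are one reading of the model above.

# Hypothetical sketch (not from the presentation): one way a project team
# might record its domain -> indicator -> measure decisions as data so the
# evaluation plan can be reviewed for coverage before data collection.
# The indicator/measure pairs restate the model above; the outcome_domain
# groupings are one reading of where each pair sits in that model.

measurement_plan = [
    {"outcome_domain": "Improved content knowledge in mathematics or science",
     "indicator": "Content for teaching",
     "measure": "Mathematical knowledge for teaching"},
    {"outcome_domain": "Higher quality content area instruction",
     "indicator": "Inquiry-based teaching",
     "measure": "Horizon classroom observation protocol"},
    {"outcome_domain": "Other teacher outcomes (retention, satisfaction, leadership)",
     "indicator": "Persistence in career",
     "measure": "School records"},
    {"outcome_domain": "Improved student learning in math and/or science",
     "indicator": "Performance on achievement test",
     "measure": "XYZ State Test"},
]

# Coverage check: every indicator in the plan has a named measure.
for entry in measurement_plan:
    assert entry["measure"], "No measure selected for " + entry["indicator"]
    print(entry["indicator"] + " -> " + entry["measure"])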
Group Discussion Time
Challenges and trade-offs in the context of NSF project experience
• Develop, adopt, or adapt: are there existing measures sufficiently aligned with your program to provide valid measures of what you are trying to achieve? How closely aligned to the treatment and its specific focus should your measure be? Is it really measuring what you intend to measure? What can you do, in instrument development and administration, to ensure the collection of more reliable data? Does the instrument lend itself to the type of analysis you’d like to do, i.e., have you thought through how the data will be analyzed?
• Knowledge of facts or application: to what extent do you want to measure knowledge of facts versus the application of that knowledge? Do you also want to measure understanding of misconceptions?
• Stakeholder credibility: what types of measures do you need to use to promote confidence in your findings? Will your stakeholders accept teacher self-report? Do you need a more objective measure of the outcome area? When might multiple measures of the same indicator be useful?
• Practical concerns: what can you afford? For example, can your budget support observations? Do you have resources to follow up on missing or incomplete data? What do you or your staff have the skill and time to do? Do you have staff on board who can, for example, conduct observations or score portfolios? What is the relative burden on respondents as far as data collection is concerned?
Examples of instruments used in MSP projects
Examinations
Assessing Teacher Learning about Science Teaching (ATLAST) –
http://www.horizon-research.com/atlast/
Concept Inventories – Force
(http://modeling.asu.edu/R&E/Research.html);
Function/Precalculus Concept Assessment (currently being
validated at Arizona State University;
http://cresmet.asu.edu/prods/pca.shtml)
Mathematical Knowledge for Teaching –
http://sitemaker.umich.edu/lmt/home
MOSART: Misconception Oriented Standards-based Assessment
Resource for Teachers – http://hub.mspnet.org/index.cfm/11777
Behavioral Observations
Horizon Classroom Observation and Analytic Protocol –
http://www.horizon-research.com/instruments/clas/cop.php
OMLI Classroom Observation Protocol –
http://hub.mspnet.org/index.cfm/11980
Reformed Teaching Observation Protocol (RTOP) –
http://cresmet.asu.edu/prods/rtop.shtml
Surveys
Student Motivation – http://www.mspmap.org/index.html
Survey of Enacted Curriculum –
http://www.ccsso.org/projects/Surveys_of_Enacted_Curriculum/
Ongoing sources for tools/instruments
MSPNet–Toolbox — http://hub.mspnet.org/index.cfm/msp_tools
Online Evaluation Resource Library (OERL) — http://oerl.sri.com/