MSP Program Evaluation


Carol L. Fletcher, Ph.D.

TRC Project Director Meetings 1/27/09 and 2/5/09

TRC Criteria

• How has your Regional Collaborative impacted:
  • Teacher Content Knowledge
  • Classroom Practice
  • Student Achievement

Guskey Model

• Based on mediators of the relationship between professional development and improvements in student learning
• i.e., administrator knowledge and practice, school culture, teacher knowledge and practice, parent knowledge and practice

Five Critical Levels of PD Evaluation

• Inform planning, formative and summative evaluations
• Helps to identify the “what” and the “why” of evaluation
• Hierarchically arranged from simple to complex
• Success at one level is necessary for success at following levels

Level 1: Participants’ Reactions

What questions are addressed?
• Did they like it?
• Will it be useful?
• Was the leader knowledgeable and helpful?
• Were the facilities comfortable?

How will information be gathered?
• Questionnaires
• Interviews
• Participant journals

Level 2: Participants’ Learning

What questions are addressed?
• Did participants acquire the targeted knowledge and skills?

How will information be gathered?
• Questionnaires
• Interviews
• Participant journals/portfolios
• Demonstrations

Level 3: Organization Support and Change

What questions are addressed?
• Did it affect organizational climate, procedures, or policies?
• Were sufficient resources made available?
• Were successes recognized & shared?

How will information be gathered?
• Questionnaires
• Structured Interviews
• Participant journals/portfolios
• Focus groups
• District/school records

Level 4: Participants’ Use of New Knowledge and Skills

What questions are addressed?
• Did participants effectively apply the new knowledge and skills?

How will information be gathered?
• Questionnaires
• Structured Interviews
• Participant journals/portfolios/reflections
• Direct observation
• Videotapes

Level 5: Student Learning Outcomes

What questions are addressed?
• Did it affect student performance or achievement?
• Are students more engaged learners?
• Do students select more rigorous coursework?
• Did it affect student attitudes toward science or math courses/careers?

How will information be gathered?
• Questionnaires
• School/student records
• Structured Interviews
• Participant journals/portfolios/reflections
• Direct observation
• Videotapes
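Taken together, Guskey's five levels pair evaluation questions with data-gathering methods, arranged from simple to complex. As an illustrative sketch only (not part of the original presentation), the framework could be captured in a small data structure that a project team might use to organize its evaluation plan; the Python below paraphrases the level names, questions, and data sources from the slides above.

```python
# Illustrative sketch of Guskey's five-level PD evaluation framework.
# Level names, questions, and data sources are paraphrased from the slides;
# the data structure itself is an assumption, not part of the presentation.

GUSKEY_LEVELS = [
    {"level": 1, "name": "Participants' Reactions",
     "questions": ["Did they like it?", "Will it be useful?"],
     "data_sources": ["questionnaires", "interviews", "participant journals"]},
    {"level": 2, "name": "Participants' Learning",
     "questions": ["Did participants acquire the targeted knowledge and skills?"],
     "data_sources": ["questionnaires", "portfolios", "demonstrations"]},
    {"level": 3, "name": "Organization Support and Change",
     "questions": ["Did it affect organizational climate, procedures, or policies?"],
     "data_sources": ["structured interviews", "focus groups", "district/school records"]},
    {"level": 4, "name": "Participants' Use of New Knowledge and Skills",
     "questions": ["Did participants effectively apply the new knowledge and skills?"],
     "data_sources": ["direct observation", "videotapes", "reflections"]},
    {"level": 5, "name": "Student Learning Outcomes",
     "questions": ["Did it affect student performance or achievement?"],
     "data_sources": ["school/student records", "questionnaires", "observation"]},
]

def print_evaluation_plan():
    """Walk the levels in order, reflecting the hierarchy: success at one
    level is necessary for success at the following levels."""
    for level in GUSKEY_LEVELS:
        print(f"Level {level['level']}: {level['name']}")
        for question in level["questions"]:
            print(f"  ask: {question}")
        print(f"  gather: {', '.join(level['data_sources'])}")

if __name__ == "__main__":
    print_evaluation_plan()
```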

NSF Developed Tools

• Horizon Research
• Assessing Teacher Learning About Science Teaching (ATLAST)
• www.horizon-research.com/atlast/
• Instruments to measure student and teacher science content knowledge:
  • Force and Motion
  • Flow of Matter and Energy in Living Systems
  • Plate Tectonics

ATLAST continued

• Student tests - paper copies only
• Teacher tests - paper or online versions
• The student and teacher assessments measure knowledge of the science content, while the teacher assessment also measures a teacher's ability to use science content knowledge to analyze student thinking and to make instructional decisions. All of the items on the teacher assessment are set in instructional contexts.

ATLAST continued

• HRI provides the following analyses:
  • Percent of respondents choosing each answer choice;
  • Percent of respondents answering each item correctly;
  • Group mean score; and
  • Significance testing and computation of effect sizes for pre- and post-test group mean scores (see the sketch below).
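To make the last analysis concrete, here is a minimal sketch of a paired significance test and effect size for pre- and post-test group scores. This is not HRI's actual code or procedure; the score arrays are hypothetical placeholders, and SciPy is assumed to be available.

```python
# Illustrative only: paired t-test and effect size (Cohen's d for paired
# differences) on hypothetical pre/post teacher scores.
from statistics import mean, stdev
from scipy import stats  # assumes SciPy is installed

pre_scores = [12, 15, 11, 14, 13, 16, 10, 12]    # hypothetical pre-test scores
post_scores = [15, 18, 14, 17, 15, 19, 13, 14]   # hypothetical post-test scores

# Significance test on the pre/post difference for the same respondents.
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

# Effect size for paired data: mean of the differences / SD of the differences.
diffs = [post - pre for post, pre in zip(post_scores, pre_scores)]
cohens_d = mean(diffs) / stdev(diffs)

print(f"Pre mean = {mean(pre_scores):.2f}, post mean = {mean(post_scores):.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")
```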

MOSART

www.cfa.harvard.edu/smgphp/mosart
• Misconception Oriented Standards-based Assessment Resource for Teachers (MOSART)
• Harvard; PI - Phillip Sadler
• Content instruments for K-12 physical science and earth science, based on research about student misconceptions
• Completion of online tutorial (~30 min) required to download instruments

MOSART continued

• K-4 & 5-8: Physical Science, Earth Science, Astronomy/Space Science
• 9-12: Physics, Chemistry, Earth Science, Astronomy/Space Science
• Multiple versions of the test are available for pre/post administration

MSPnet Toolbox

• http://hub.mspnet.org/index.cfm/msp_tools
• Clearinghouse for tools used and shared by MSP projects
• Tools may include assessment instruments, evaluation protocols, form letters, etc.

University of Louisville

http://louisville.edu/education/research/centers/crmstd/diag_sci_assess_middle_teachers.html

• Diagnostic Assessments for Middle School Teachers
• Each assessment is composed of 25 items: 20 multiple-choice and 5 open-response
• Six versions of each assessment are available in paper-and-pencil format

University of Louisville (cont.)

• Assessments available free of charge
• Optional scoring service for $10/teacher
• Detailed summary of teachers' performance that includes scores on individual items, on each science subdomain in the content area, and on four different knowledge types (memorized, conceptual understanding, higher-order thinking, pedagogical content knowledge)

University of Louisville (cont.)

Physical Science: Matter, Motion & Forces, Energy
Life Science: Structure/Function, Internal Regulation, Heredity/Diversity, Interdependence
Earth/Space Science: Atmosphere/Hydrosphere, Lithosphere, Space

MSP Knowledge Management and Dissemination

http://www.mspkmd.net/
• Knowledge reviews are useful for citing the research base
• Searchable instrument database for science and math

Online Evaluation Resource Library (OERL)

http://oerl.sri.com/
• Teacher and student instruments
• Content assessments, interviews, surveys, classroom observations

Thinking About Mathematics Instruction (TMI)

• http://www2.edc.org/tmi/tmi_survey.html

• Survey to investigate elementary and middle school principals’ leadership content knowledge (LCK) for mathematics

Distributed Leadership

• http://www.sesp.northwestern.edu/dls/instruments/
• Instructional Leadership Daily Practices Log
• Principal Experience Sampling Method (ESM) Log
• School Staff Network Survey