Bintz-BSCS AIM Overview


Evaluating the Conceptual Coherence
and Rigor of Instructional Materials
Florida Association of Science Supervisors
5 May 2005
Jody Bintz, BSCS Center for Professional Development
• Think of a district science team
working together to select or
recommend instructional materials…
– What are they saying?
– What is influencing their decision?
• What are they looking at in the materials?
• What context issues are being considered?
– How will they make their decision?
Goal
• Provide insights into an evidence-based process
for evaluating instructional materials that:
– Builds the awareness that high-quality instructional
materials matter in the learning process for students.
– Develops common understandings among teachers
about the characteristics of high-quality instructional
materials.
– Requires teacher dialogue around evidence to build
consensus during the decision-making process.
– Serves as a professional development strategy to
support implementation of instructional materials.
We believe that we found powerful evidence that
textbooks exert a strong influence on what teachers
teach. This seems to be true in most countries
despite differences in the nature and use of
textbooks. Textbook coverage is important both for
what topics are taught and for the levels of
performances and accomplishments expected of
students. Whatever should be the case, textbooks
have an impact almost everywhere, and it is perilous
to ignore the ways in which they at least partially
shape what is taught.
Why Schools Matter, Schmidt et al., {date}
Premises…
• Student learning is directly linked to the quality of
the instructional materials selected.
• Teachers are more effective in the classroom
when they implement high quality instructional
materials.
• Teachers make better decisions about instructional
materials when the dialogue is based on evidence.
• The process of selecting instructional materials
presents an opportunity for professional
development.
Teacher Dialogue
AIM
A process and tools for
Analyzing Instructional Materials
Development of the AIM
Process and Tools
• Collaboration between the BSCS Center for
Professional Development and the K-12 Alliance
of WestEd
• Adapted and field-tested as part of the NSF
funded National Academy for Curriculum
Leadership (NACL)
• Used by thousands of teachers across the country
What is AIM?
• It is…
– A process and set of tools
to select instructional
materials
– A professional
development strategy
– A collaborative process that
uses consensus-building
– Based on National Science
Education Standards
– Linked to the research
findings described in How
People Learn
• It is NOT…
– A checklist; goes beyond
the thumb test
– Overly complicated and
prescriptive
Standards and Research
Today we will…
• Pique
• Challenge
• Engage
• Model
Research on How People Learn
Key Findings for Students and for Teachers

First
• Key finding: Students come to the classroom with preconceptions about how the world works.
• For teachers: Recognize preconceptions and adjust instruction.

Second
• Key finding: Students must have a deep foundation of usable knowledge and understand facts in the context of a conceptual framework.
• For teachers: Understand the content and conceptual framework for a discipline.

Third
• Key finding: Students must be taught explicitly to take control of their own learning by monitoring their progress.
• For teachers: Teach metacognitive skills.

Based on How People Learn, National Research Council, 2000
Application of Key Findings
Instructional materials should:

First
• Include structured strategies to elicit and challenge student preconceptions
• Incorporate background for the teacher about common preconceptions

Second
• Be organized around a conceptual framework
• Connect factual information to the framework
• Provide relevant examples to illustrate key ideas

Third
• Make learning goals explicit
• Integrate metacognitive skill development into content
Facts and Concepts
(Our Definitions)
• Facts or definitions are pieces of
information. Focus is on verifiable and
discrete details.
• Concepts are over-arching ideas that
clearly show the relationships between
facts. Frequently abstract and often define
the discipline.
Science Content: Gathering Evidence
• Scan the entire unit/chapter and note the overarching
concept.
• Read your assigned section carefully.
• Write the big ideas or concepts addressed in your section
on sticky notes. Write concepts in complete sentences.
Limit yourself to one concept per sticky note.
• Note how the concepts in your section build from
those in previous sections, how concepts in your
section are extended in subsequent sections, and how
the concepts in your section connect to the
overarching concept of the unit or chapter.
• Note the context (e.g., real-world, engaging to
students) in which the concepts are learned.
Gathering Evidence
Science Content
Work Students Do
Assessment
NACL Team’s Work
Pittsburgh Team, Summer 2001
Conceptual Flow Diagram
Major Concepts in a Unit from BSCS Biology: A Human Approach
Looking at the Science Content Rubric
• Note the criteria addressed in the
rubrics.
• Are any of these similar to those we
brainstormed previously?
• For your assigned row, visualize
what would be present in “ideal”
instructional materials that would
justify a score of “5”. In other words,
what would the evidence look like?
SCIENCE CONTENT RUBRIC

STANDARDS ALIGNMENT
Science content standards:
– May originate at the national, state, district, or school level,
– May include the subject matter disciplines (physical, life, earth and space sciences) as well as science as inquiry, science and technology, science in personal and social perspectives, history and nature of science, and/or unifying concepts and processes.
(5) Most of the science content standards designated for the specific course and/or grade level are addressed.
(3) Some of the science content standards designated for the specific course and/or grade level are addressed.
(1) Few of the science content standards designated for the specific course and/or grade level are addressed.

ACCURACY
Accurate science content:
– Is grounded in current research and conforms to fact,
– Includes explanations about science that translate information into developmentally appropriate content without losing original meaning or distorting fact.
(5) Content is accurate with very few errors of fact or interpretation.
(3) Content is accurate with some errors of fact or interpretation.
(1) Content has many errors of fact or interpretation.

CONCEPT DEVELOPMENT (HPL 2)
Content developed for conceptual understanding:
– Includes a limited number of key concepts,
– Develops concepts in-depth at a developmentally appropriate level,
– Requires students to apply and demonstrate their understanding in multiple ways.
(5) Most key science concepts are developed for conceptual understanding.
(3) Some key science concepts are developed for conceptual understanding.
(1) Few key science concepts are developed for conceptual understanding.

SEQUENCING (HPL 2)
Content with a coherent sequence:
– Is organized in a deliberate fashion to promote student understanding,
– Links facts and concepts in ways that facilitate retrieval and application,
– Builds from and extends concepts previously developed,
– Strongly connects concepts to an overarching conceptual framework.
(5) The materials have a coherent sequence.
(3) The materials have a somewhat coherent sequence.
(1) The materials lack a coherent sequence.

CONTEXT (HPL 2)
Content that is context-rich:
– Is presented in an engaging context that is related to real-world experiences and situations,
– Facilitates the assimilation of new knowledge or reorganization of knowledge in a way that allows students to build on their prior conceptions and/or experience with the world.
(5) Most key science concepts are addressed in a context-rich setting.
(3) Some key science concepts are addressed in a context-rich setting.
(1) Few key science concepts are addressed in a context-rich setting.
The SCI Center at BSCS: National Academy for Curriculum Leadership
Copyright © 2002 by BSCS and WestEd.
Applying the Science Content Rubric
• With your group, give this unit/chapter a
score for the bottom three rows in this
rubric.
• Note two or more pieces of evidence that
justify each score.
• On the AIM Summary Sheet, note strengths
and limitations for the bottom three rows of
the rubric.
AIM Process: Paper Screen Score Sheet

AIM Paper Screen Score Sheet
For each component, record a Score; multiply each criterion total by its Weight for a Weighted Total; convert the Grand Total to a Percent.

Content
– Standards Alignment
– Accuracy
– Concept Development (HPL 2)
– Context (HPL 2)
– Organization (HPL 2)
TOTAL Content Criterion x 0.40

Work Students Do
– Engaging Prior Knowledge (HPL 1)
– Metacognition (HPL 3)
– Abilities to Do Scientific Inquiry
– Understandings about Scientific Inquiry
– Accessibility
TOTAL Work Students Do Criterion x 0.20

Assessment
– Quality
– Multiple Measures
– Use of Assessments
– Accessibility
TOTAL Assessment Criterion x 0.20

The Work Teachers Do
– Instructional Model
– Effective Teaching Strategies
– Teaching Strategies for Inquiry
– Support for the Work Teachers Do
TOTAL Work Teachers Do Criterion x 0.20

Grand Total: T = sum of the weighted totals; Percent = T / total possible x 100
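The arithmetic on the score sheet can be sketched in a few lines of Python. This is only an illustration of the Score x Weight and Percent calculations; the criterion scores and the total-possible value below are hypothetical placeholders, not values from an actual AIM review.

```python
# Sketch of the AIM paper-screen weighting arithmetic.
# All scores below are hypothetical; a real sheet is filled in by the review team.
CRITERION_WEIGHTS = {
    "Content": 0.40,
    "Work Students Do": 0.20,
    "Assessment": 0.20,
    "Work Teachers Do": 0.20,
}

def weighted_totals(criterion_scores):
    """Score x Weight for each criterion (the 'Weighted Total' column)."""
    return {name: criterion_scores[name] * weight
            for name, weight in CRITERION_WEIGHTS.items()}

def grand_total_percent(criterion_scores, total_possible):
    """Grand Total T as a percent: T / total possible x 100."""
    t = sum(weighted_totals(criterion_scores).values())
    return t / total_possible * 100

# Hypothetical criterion totals, each the sum of that criterion's 1-5 rubric-row scores:
scores = {"Content": 20, "Work Students Do": 18,
          "Assessment": 14, "Work Teachers Do": 12}
```

Because Content carries twice the weight of the other criteria, a weak Content score pulls the percent down more than a comparable weakness elsewhere, which reflects the process's emphasis on science content.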
AIM Summary Sheet
Content criteria and components: Standards Alignment, Accuracy, Concept Development, Sequencing, Context
Columns: Summary of Strengths | Summary of Limitations
AIM includes . . .
• Pre-Screen
• Paper Screen: Identify Criteria → Gather Evidence → Analyze Evidence & Apply Rubric → Score Components → Summarize Results
• Implementation: Identify Criteria → Gather Evidence → Analyze Evidence & Apply Rubric → Score Components → Summarize Results
• Select
• Scaling-Up for Full Implementation
The AIM Process: Paper Screen & Implementation

Identify Criteria
• Paper Screen strands: Science Content, Work Students Do, Assessment, Work Teachers Do
• Implementation strands: Student Understanding, Teacher Implementation

For each strand: Gather Evidence → Analyze Evidence and Apply Rubric → Score Components

Summarize Results (for the Paper Screen and for Implementation)
Merge Paper Screen and Implementation Results
Select Instructional Materials
The AIM Process:
Combining the Paper Screen and Implementation Scores
Score Sheet for AIM Process: Paper Screen
Columns: Criteria/Component, Score, Weight, Weighted Total, Percent

CONTENT
– Standards Alignment
– Accuracy
– Readability
– Concept Development (HPL 2)
– Sequencing (HPL 2)
– Context (HPL 2)
TOTAL Content Criterion x 0.40

WORK STUDENTS DO
– Engaging Prior Knowledge (HPL 1)
– Metacognition (HPL 3)
– Abilities Necessary To Do Scientific Inquiry
– Understandings About Scientific Inquiry
– Accessibility
TOTAL Work Students Do Criterion x 0.20

ASSESSMENT
– Quality
– Multiple Measures
– Use of Assessments
– Accessibility
TOTAL Assessment Criterion x 0.20

THE WORK TEACHERS DO
– Instructional Model
– Effective Teaching Strategies
– Teaching Strategies for Inquiry
– Support for the Work Teachers Do
TOTAL Work Teachers Do Criterion x 0.20

GRAND TOTAL (Paper Screen): T = sum of the weighted totals; T/24 x 100 = percent

Score Sheet for AIM Process: Implementation
Columns: Criteria/Component, Score, Weight, Weighted Total, Percent

STUDENT UNDERSTANDING
– Pre-Post Assessment of Unit Concept(s)
– Investigation
– Activity
– Reading
– Assessment
Total Student Understanding x 0.60

TEACHER IMPLEMENTATION
– Content Background
– Teaching Strategies
– Teaching Strategies for Inquiry
– Assessment Strategies
TOTAL Teacher Implementation x 0.40

GRAND TOTAL (Pilot): T = sum of the weighted totals; T/23 x 100 = percent

(0.6) (pilot score) + (0.4) (paper screen score) = Total Score
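The combining rule above is a simple weighted sum of the two percent scores. A minimal sketch in Python, using the 0.6/0.4 weights from the formula (the 80 and 70 are hypothetical percent scores, not data from a real review):

```python
# Combine the pilot (implementation) and paper-screen percent scores
# using the 0.6 / 0.4 weights from the AIM formula.
def total_score(pilot_score, paper_screen_score):
    return 0.6 * pilot_score + 0.4 * paper_screen_score

# Hypothetical example: pilot percent of 80, paper-screen percent of 70.
example = total_score(80, 70)  # 0.6*80 + 0.4*70 = 76.0
```

Weighting the pilot more heavily than the paper screen means that classroom evidence from actual implementation counts for more than the desk review.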
AIM Summary Sheet

Criteria and Components
• Content: Standards Alignment, Accuracy, Concept Development (HPL 2), Sequencing (HPL 2), Context (HPL 2)
• Work Students Do: Engaging Prior Knowledge (HPL 1), Metacognition (HPL 3), Abilities Necessary To Do Scientific Inquiry, Understandings About Scientific Inquiry, Accessibility
• Assessment: Quality, Multiple Measures, Use of Assessments, Accessibility
• The Work Teachers Do: Instructional Model, Effective Teaching Strategies, Teaching Strategies for Inquiry, Support for the Work Teachers Do

Columns: Summary of Strengths | Summary of Limitations
Customize AIM
• Establish your own criteria
• Adopt or adapt existing rubrics
• Vary the weighting system
• Adjust to fit your schedule and needs
Reflection
• What insights into instructional
materials could AIM provide?
• What questions have been
raised by this brief introduction
to AIM?
Why does AIM work?
• Informed, justifiable, evidence-based decision-making
• Customized and flexible
• Leads to rich conversations about learning and
teaching science
• Builds common understandings about high-quality instructional materials
• Informs PD for scaling-up implementation
• Develops teacher-leaders
What people say about AIM...
• “AIM really validates the money spent on curriculum.”
• “Profitable to team building.”
• “Teachers liked the process and it made sense to them.”
• “Very useful for conceptual teaching and understanding materials. It also plays a role in understanding inquiry.”
• “Vital to our ability to implement back in our district.”
• “Analyzing materials based on data, not on feeling.”
• “For the first time (in 32 years), I feel I have some insight into selecting instructional materials.”
• “It makes great sense. Why would you choose curriculum any other way?”
Options…
• FASS Leadership Team
– Come to BSCS and learn to facilitate the
process
• BSCS Staff
– Come to Florida to teach leadership
teams around the state how to facilitate
the process
– Work directly with district teams
Contact Us:
Center for Professional Development
BSCS
5415 Mark Dabling Blvd.
Colorado Springs, CO 80918
719.531.5550, ext. 119
[email protected]
www.bscs.org