Understanding by Design:


Self-Assessment, Peer Review & Feedback
Testing Design Work Against Standards
Self-Assessment & Self-Adjustment
The research jibes with this common sense:
To achieve any goal, you must learn how to self-assess your learning and your performance.
One of 3 chief findings in How People Learn:
• Success depends upon valid self-assessment and timely self-adjustment on your own, with minimal outside prompting and close supervision.
From the research: one of 3 chief findings
“The teaching of metacognitive skills should be integrated into the curriculum. Because metacognition often takes the form of an internal dialogue, many learners may be unaware of its importance unless the processes are explicitly emphasized by teachers.
“Research has demonstrated that learners can be taught these strategies, including the ability to predict outcomes, explain to oneself, note failures to understand, activate background knowledge, plan ahead, and apportion time and memory…”
How People Learn, pp. 14, 21
Research on metacognition, part 2
The model for using the meta-cognitive strategies is provided initially by the teacher, and learners practice and discuss the strategies as they learn to use them.
• Ultimately, learners are able to prompt themselves and monitor their own comprehension without teacher support. [transfer]
How People Learn, pp. 18-19
So, what follows for staff?
Models, feedback, adjustments, feedback:
• This cycle has to be woven into professional development and team meetings.
• This is the essence of Lesson Study, the highly successful Japanese approach.
• It depends on model units, UbD standards, a self-assessment protocol, and practice in self-assessment after having it modeled.
Unit Design Cycles
[Diagram: a repeating Design → Trial → Feedback cycle]
• Design (Backward Design: Stage 1, Stage 2, Stage 3), based on program goals and performance gaps
• Trial of the unit
• Feedback from the designer’s observation, student feedback, expert review, and analysis of student work
• Designs reviewed against the Design Standards by: self, peers
• Working smarter via: design teams
UbD – Design Standards
See the design standards and review materials in the UbD Workbook:
• design standards linked to the three stages of ‘backward design’
• each standard contains criteria, a three-point performance scale, and indicators
• they serve as guides for peer & external reviews
Peer Review: Goals
• Provide specific feedback and guidance to the designer, based on the design standards.
• Improve the quality of unit designs.
• Increase understanding of the qualities of effective curricular design.
• Develop a more collaborative and open culture.
Peer Review Process
Step 1 - Overview of unit
• Designer presents a brief overview of the unit and requests specific feedback from the group.
• Designer leaves the group.
Suggestion: The designer should take another unit to review while waiting.
Peer Review Process
Step 2 - Individual reviews of unit designs, without designer present
• Establish group roles.
• Reviewers individually (and silently) review and evaluate the design based on the review criteria and record comments about strengths and weaknesses on the Individual Review Form.
Peer Review Process
Group roles:
• Timekeeper/Facilitator
• Recorder
• Spokesperson(s)
Peer Review Process
Step 3/4 - Review group discussion, without designer present
• Review group discusses the design and records key questions, feedback, and guidance on the Group Review Form.
• Spokesperson(s) verbally summarizes the group’s comments.
Peer Review Process
Step 5 - Group discussion of unit with designer
• Reviewers present feedback and guidance to the designer.
• Designer listens, takes notes, and asks clarifying questions.
Feedback
Feedback is descriptive. It provides commentary on the extent to which the design meets the stated goals (intent vs. effect).
e.g., The first performance task does not seem to align with any of the unit Understandings.
Feedback, defined
Useful information on the actual against the optimal; the intent vs. the effect.
• Feedback reports back on what you did, against a specific target—no personal value or aesthetic judgment is made.
• Feedback is descriptive, not evaluative.
• Feedback is not praise or blame.
• Feedback is not guidance/advice.
Examples of feedback about UbD
• “That question seems to me to be the most ‘essential’ question - it is thought-provoking and can profitably be asked and re-asked throughout the unit.”
• “That second one is really more of a leading factual question than an essential question.”
• “I cannot find any assessment of your 2nd understanding.”
• “Those activities align nicely with Understandings 1 and 2. But I don’t see any activities directly related to #3.”
• “I think a student could do well on that performance assessment without really understanding Understanding #2 (which you say it addresses).”
Feedback
The most useful feedback is:
• specific
• guided by criteria
• understandable to the receiver
• timely
© 2000 Grant Wiggins & Jay McTighe
Feedback vs. Guidance
Guidance offers suggestions for improving the design, based on the designer’s stated goals (i.e., how to narrow the gap between intent & effect).
e.g., Perhaps a different Understanding would be more closely aligned with your first assessment task.
© 2000 Grant Wiggins & Jay McTighe
Don’t Confuse Feedback with Guidance (Advice)
Too many people jump to guidance without first providing feedback and making sure the performer understands it and agrees.
• Feedback: what you did or did not do, given a standard; a neutral description of your performance or product.
• Guidance: what you might do to honor the feedback - good advice.
Check for Understanding
KEY: F = Feedback   G = Guidance   N = Neither
Mark each statement F, G, or N:
1. What a great idea!
2. The first Essential Question is factual in nature (recall only).
3. Perhaps you could begin with an experiential activity as a ‘hook’.
4. I never use Word Searches in my classroom.
© 2000 Grant Wiggins & Jay McTighe
Check for Understanding
KEY: F = Feedback   G = Guidance   N = Neither
Mark each statement F, G, or N:
5. The mystery game is likely to ‘hook’ and engage the students.
6. We didn’t like the culminating activity.
7. You might try using the ‘concept attainment’ technique here.
8. We didn’t see any assessment of the second understanding.
© 2000 Grant Wiggins & Jay McTighe
Tip for Effective Peer Review
• Remember that the primary goal of peer review is to provide useful feedback and guidance to help improve the design.
Tips for Effective Peer Review
• Reviewers’ comments should be specific and “objective” - i.e., guided by and made in reference to the design standards, not personal taste.
Tips for Effective Peer Review
• Try to understand the design and its intent before offering feedback and guidance. The designer/teacher should feel that you are trying to improve their design, not substitute your own goals or methods for theirs.
Feedback and its power
Feedback and its use is key to great gains
Black & Wiliam meta-analysis:
“There is a body of firm evidence that formative assessment is essential... We know of no other way of raising standards for which such a strong prima facie case can be made.”
Black and Wiliam (1998), “Inside the Black Box: Raising Standards through Classroom Assessment,” Phi Delta Kappan, vol. 80, no. 2 (October), pp. 139 ff.
Cf. Black, Harrison, Lee, Marshall, and Wiliam (2004), “Working Inside the Black Box: Assessment for Learning in the Classroom,” Phi Delta Kappan, vol. 86, no. 1 (September).
Feedback: Harvard’s “most effective” courses
from Making the Most of College:
“The big point—it comes up over and over again as crucial—is the importance of quick and detailed feedback. Students overwhelmingly report that the single most important ingredient for making a course effective is getting rapid response on assignments and quizzes.
“Students suggest that it should be possible in certain courses to get immediate feedback. They suggest that the professor should hand out an example of an excellent answer.”
- Richard Light
Feedback as key (cont.):
“Secondly... an overwhelming majority are convinced that their best learning takes place when they have a chance to submit an early version of their work, get detailed feedback and criticism, and then hand in a final revised version...
“Many students observe that their most memorable learning experiences have come from courses where such opportunities are routine policy.”
Feedback is vital for faculty, too
“Faculty members at Harvard were asked what single change most improved their teaching. Two ideas swamped all others. One is enhancing student awareness of the big picture, ‘the big point of it all’. The second is the importance of helpful and regular feedback from students so a professor can make midcourse corrections.”
- Harvard Assessment Seminar, 1993
Feedback Depends upon Models
Based on:
• Exemplars/Models
• Expert commentary on models and non-models
• Ideal specifications or performance standards
No gains = poor feedback system
No improvement over time shows that the problem is not “professional development” or “staff” but the feedback system.
• Ponder: how would a school’s cross-country team do if we only kept the place of finish and graded runners on their relative place of finish - and did not keep their times?
Feedback/Self-assessment System - Healthy
• Performers seek feedback on their own and know that it is in their interest - even if the news is bad
• Performance improves at all levels; there is obvious “value added”
• Improved performance occurs more rapidly than is typical or expected
• Few quarrels about the feedback (if there are, they are about the meaning of the results)
• Feedback-use opportunities are central to the job
• Norms and standards rise over time: what was once considered extraordinary performance becomes more common
Feedback/Self-assessment System - Unhealthy
• Learners fear, resist, do not seek, or ignore feedback
• Learner performance rarely improves much beyond what is typical
• Novices struggle to improve; they do not know “what you want.” Their self-assessment is very inaccurate
• Many quarrels about the credibility and meaning of the results; anecdotes and effort are trusted more than anything
• Training is too prone to coverage and activities, with little opportunity to get feedback and use it repeatedly
• Norms stay the same while standards rise - and expectations are thus lowered
Excellent feedback
Some criteria:
• Timely
• User-friendly - in approach and amount
• Descriptive & specific re: performance
• Consistent
• Expert
• Accurate
• Honest, yet constructive
• Derived from concrete standards
• Ongoing
So, what follows for staff?
How can we make feedback more common, useful, credible - and timely?
• Sample their work regularly
• Peer review
• Use online resources (e.g., ubdexchange)