Transcript IPT Tool Prototype - Shannon Rist
Slide 1
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all-inclusive list, but rather a list of the
models we felt were most beneficial for each phase of design. Click
Continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design & Conduct Summative Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Navigation buttons are available at the top right side of each page. The left arrow will
take you to the previous page viewed. The Home button will take you back to the
home menu. The right arrow will take you to the next page.
2 of 4
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the Helpful Tips icon to get inside information regarding important Do’s and
Don'ts for each phase.
3 of 4
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise Instruction
Conduct Instructional Analysis
Identify Instructional Goals
Write Performance Objectives
Develop Assessment Instruments
Develop Instructional Strategy(ies)
Develop & Select Instructional Materials
Design & Conduct Formative Evaluations
Analyze Learners & Contexts
Design & Conduct Summative Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996), a Training Needs Assessment should answer:
1. Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
• Cognitive abilities
• Previous experiences
• Motivational interests
• Personal learning styles
2. What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
3. What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need (as
cited in “ID Final Project: Lesson 3,” 2003).
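The three questions above can be captured in a simple record that yields the organized summary the final step asks for. Below is a minimal sketch in Python; the class and field names are our own assumptions, not Dick and Carey's terminology.

```python
# A minimal sketch of the three needs-assessment questions as a record.
from dataclasses import dataclass, field

@dataclass
class NeedsAssessment:
    # Question 1: who are the learners?
    cognitive_abilities: list[str] = field(default_factory=list)
    previous_experiences: list[str] = field(default_factory=list)
    motivational_interests: list[str] = field(default_factory=list)
    learning_styles: list[str] = field(default_factory=list)
    # Question 2: what is the instructional need (the performance gap)?
    performance_gap: str = ""
    # Question 3: why do we believe instruction can close the gap?
    rationale_for_instruction: str = ""

    def summary(self) -> str:
        """Organized summary describing the learning need."""
        return (
            f"Learners: abilities={self.cognitive_abilities}, "
            f"experiences={self.previous_experiences}\n"
            f"Gap: {self.performance_gap}\n"
            f"Why instruction: {self.rationale_for_instruction}"
        )

assessment = NeedsAssessment(
    cognitive_abilities=["basic computer literacy"],
    performance_gap="Cannot write measurable objectives",
    rationale_for_instruction="Gap is due to missing knowledge, not motivation",
)
print(assessment.summary())
```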
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
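Mager's five steps amount to a filter-and-refine pass over candidate performance statements. Below is a rough sketch of that pass; the sample data and variable names are our own illustration, not Mager's.

```python
# Step 1: write down the goal in outcome terms.
goal = "Learners will understand behavioral objectives"

# Step 2: jot down observable performances that would show the goal was achieved.
jottings = [
    "writes an objective with performance, condition, and criterion",
    "appreciates good objectives",  # a 'fuzzy' - not observable
    "writes an objective with performance, condition, and criterion",  # duplicate
]

# Step 3: sort the jottings - drop duplicates and fuzzy/unwanted items.
fuzzies = {"appreciates good objectives"}
performances = [p for p in dict.fromkeys(jottings) if p not in fuzzies]

# Step 4: expand each performance into a complete statement of acceptable quality.
statements = [
    f"Given a goal, the learner {p} with no component missing." for p in performances
]

# Step 5: the test question - if someone demonstrated every statement,
# would you agree the goal was achieved? When yes, the analysis is finished.
for s in statements:
    print(s)
```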
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified in the goal analysis. The primary method is a Task
Analysis, a list of the steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing
organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of process.
Step 5: Is the task procedural?
• If yes, identify the steps and go to Step 7.
• If no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
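Steps 5 through 7 above form a small branching procedure. The sketch below models that branch in Python; the Task type and its fields are assumptions made for illustration, not part of the source.

```python
# A minimal sketch of the branching logic in steps 5-7 of the task analysis.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    procedural: bool                      # Step 5: is the task procedural?
    steps: list[str] | None = None        # identified when procedural
    guidelines: list[str] | None = None   # identified when principle-based

def analyze_task(task: Task) -> dict:
    """Apply steps 5-7: branch on task type, then capture required knowledge."""
    result = {"task": task.name}
    if task.procedural:
        # Step 5 (yes): identify the steps, then go to Step 7.
        result["steps"] = task.steps or []
    else:
        # Step 6: identify guidelines of the principle-based task.
        result["guidelines"] = task.guidelines or []
    # Step 7: identify knowledge needed to complete the task.
    result["knowledge"] = f"Prerequisite knowledge for '{task.name}'"
    return result

print(analyze_task(Task("Reset a password", procedural=True,
                        steps=["Open admin console", "Select user", "Reset"])))
```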
Conduct Instructional Analysis
Helpful Tips
• Use the identified tasks and steps as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files to determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or
reflective, sequential or global
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: the Performance Context and the Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners’ environment will be like on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills?” (Nicholson, 2004)
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make the simulation more realistic?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are an overarching vision of what the course
will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists
of three components:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
Performance (What will they do?): Learners should be able to create effective
behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
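Mager's three components map naturally onto a small record type. Below is a hedged Python sketch; the class name, comments, and rendered sentence are our own illustration.

```python
# A small sketch of Mager's three-part behavioral objective as a record.
from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    performance: str  # What will they do?
    condition: str    # Under what conditions?
    criterion: str    # How well?

    def render(self) -> str:
        """Assemble the three components into one objective statement."""
        return f"{self.condition}, learners will {self.performance} {self.criterion}."

obj = BehavioralObjective(
    performance="create effective behavioral objectives",
    condition='Given the "Write Performance Objectives" portion of the IIDT',
    criterion="that define how well learners should perform on the job",
)
print(obj.render())
```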
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
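Because each level carries its own verb list, the lists can double as a lookup table for checking the verb in Mager's Performance column against the intended level. A hedged sketch follows; the dictionary is abridged from the lists above, and the function name is our own.

```python
# An abridged verb-to-level lookup built from Bloom's cognitive domain.
BLOOM_VERBS = {
    "Knowledge": {"define", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application": {"apply", "demonstrate", "illustrate", "solve", "use"},
    "Analysis": {"analyze", "compare", "contrast", "differentiate"},
    "Synthesis": {"assemble", "compose", "create", "design", "plan"},
    "Evaluation": {"appraise", "assess", "judge", "rate", "evaluate"},
}

def levels_for_verb(verb: str) -> list[str]:
    """Return every cognitive level whose verb list contains the given verb."""
    return [level for level, verbs in BLOOM_VERBS.items() if verb.lower() in verbs]

print(levels_for_verb("create"))  # ['Synthesis']
```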
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Bloom’s Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis
(X) Synthesis ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective
behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
Under the Performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of prerequisite
skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills identified during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they
are learning and to allow for corrective feedback
• Post-test – Given following instruction to determine if learners have achieved the ability to
carry out the performance objectives
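The four instruments form an ordered schedule. Below is a minimal sketch pairing each with its timing and what it measures; the tuple layout and wording are our own paraphrase of the list above.

```python
# A simple ordered schedule of the four assessment instruments.
ASSESSMENTS = [
    ("Entry Behaviors Test", "before instruction",
     "mastery of prerequisite skills"),
    ("Pre-test", "before instruction",
     "skills already mastered from the instructional analysis"),
    ("Practice Tests", "during instruction",
     "rehearsal of new skills with corrective feedback"),
    ("Post-test", "after instruction",
     "achievement of the performance objectives"),
]

for name, timing, measures in ASSESSMENTS:
    print(f"{name}: given {timing}; measures {measures}")
```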
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, Briggs, & Wager, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize on the job. Find a way to ensure what learners gained from the
course will transfer to their jobs.
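The nine events can be treated as a coverage checklist for a draft lesson plan. The Python sketch below (with names of our own choosing) reports which events a plan has not yet addressed; as the tips that follow note, omitting or reordering events can be a deliberate choice, so this is a report, not an enforcement.

```python
# A small coverage check of a draft lesson plan against Gagné's nine events.
NINE_EVENTS = [
    "gain attention", "inform learners of objectives", "stimulate recall",
    "present content", "provide learning guidance", "elicit performance",
    "provide feedback", "assess performance", "enhance retention and transfer",
]

def coverage_report(lesson_activities: dict[str, str]) -> list[str]:
    """Return the events a lesson plan does not yet address."""
    return [event for event in NINE_EVENTS if event not in lesson_activities]

lesson = {
    "gain attention": "Open with a failed-objective case study",
    "present content": "Walk through Mager's three components",
    "elicit performance": "Learners draft one objective each",
}
# Prints the six events this draft still needs (or deliberately omits).
print(coverage_report(lesson))
```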
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from “Strategies for Developing Instructional
Materials for the Interpersonal Domain” (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Need discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can use with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
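The table can also be read as a set of selection constraints. The sketch below encodes a loose, assumed reading of the considerations (group size and cost) as boolean attributes and filters for compatible media types; the attribute values are our interpretation, not hard rules from the source.

```python
# A rough filter over the media table using two practical constraints.
MEDIA = {
    "Simulations":       {"large_group_ok": True,  "low_cost": False},
    "Training Games":    {"large_group_ok": False, "low_cost": True},
    "Role Playing":      {"large_group_ok": False, "low_cost": True},
    "Interactive Games": {"large_group_ok": False, "low_cost": True},
    "Video":             {"large_group_ok": True,  "low_cost": False},
    "Job Aids":          {"large_group_ok": True,  "low_cost": True},
}

def candidates(large_group: bool, tight_budget: bool) -> list[str]:
    """Media types compatible with the given group size and budget."""
    return [m for m, a in MEDIA.items()
            if (not large_group or a["large_group_ok"])
            and (not tight_budget or a["low_cost"])]

print(candidates(large_group=True, tight_budget=True))  # ['Job Aids']
```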
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International (1999)
identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS (Attention, Relevance, Confidence, Satisfaction) factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
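The six stages can be thought of as an ordered pipeline whose findings all feed the Revise Instruction phase. A minimal sketch follows, with illustrative findings of our own invention.

```python
# The six formative evaluation stages as an ordered pipeline of findings.
STAGES = [
    "Design Review", "Expert Review", "One-to-One",
    "Small Group", "Field Trials", "Ongoing Evaluation",
]

findings: dict[str, list[str]] = {stage: [] for stage in STAGES}
findings["Expert Review"].append("Bloom level mislabeled on objective 3")
findings["Small Group"].append("Practice test took twice the allotted time")

# Anything recorded at any stage becomes input to the Revise Instruction phase.
revision_inputs = [f for stage in STAGES for f in findings[stage]]
print(revision_inputs)
```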
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved performance and how it has affected the workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
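One way to audit a summative plan is to map each Kirkpatrick level to the question it answers and flag the levels a plan skips. A small sketch follows; the structure is ours, and the questions paraphrase the list above.

```python
# Map each Kirkpatrick level to its evaluation question and flag gaps.
KIRKPATRICK = {
    1: "Learner Reaction - were learners satisfied with the training?",
    2: "Performance Evaluation - did they gain the intended KSABs?",
    3: "Behavior Evaluation - are they applying the KSABs on the job?",
    4: "Effect on the Organization - did training achieve the desired impact?",
}

planned_levels = {1, 2}  # a plan that stops at 'smile sheets' and tests

missing = sorted(set(KIRKPATRICK) - planned_levels)
for level in missing:
    print(f"Level {level} not evaluated: {KIRKPATRICK[level]}")
```

As the Helpful Tips that follow note, a plan that stops at Levels 1 and 2 misses the indicators most tied to organizational value.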
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 2
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 3
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing
organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of process.
Step 5: Is the task procedural?
• If yes, identify the steps and go to Step 7.
• If no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
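Step 5 is the only decision point in the procedure. As a rough sketch of the flow for a single task (the function and field names are hypothetical, not from the cited manual):

```python
# Hypothetical sketch of steps 5-7 of the job task analysis above.
# Names are illustrative; the cited manual defines the steps, not this code.
def analyze_task(task_name: str, is_procedural: bool) -> dict:
    analysis = {"task": task_name}
    if is_procedural:
        # Step 5, yes branch: identify the steps, then go to Step 7.
        analysis["steps"] = []        # filled in from observation/interviews
    else:
        # Step 6: identify guidelines of the principle-based task.
        analysis["guidelines"] = []
    # Step 7: identify the knowledge needed to complete the task.
    analysis["required_knowledge"] = []
    return analysis
```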
Conduct Instructional Analysis
Helpful Tips
• Use the tasks and steps identified in the analysis as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files to determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or
reflective, sequential or global
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learner’s environment will be like on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills?” (Nicholson, 2004)
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more realistic?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant the information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of
three components:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
# 1
Performance (What will they do?): Learners should be able to create effective behavioral
objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
# 1
Bloom’s Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis
(X) Synthesis ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective behavioral
objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
Under the Performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs.
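Where it helps to see the two models joined, here is a minimal sketch in Python. It is illustrative only: the dataclass, its field names, and the abbreviated verb sets are our assumptions; the three components are Mager's and the six levels are Bloom's.

```python
# Illustrative sketch: a behavioral objective per Mager's
# performance/condition/criterion parts, tagged with a Bloom level.
# The verb sets are abbreviated from Bloom et al. (1956).
from dataclasses import dataclass

BLOOM_VERBS = {
    "Knowledge": {"define", "list", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "restate"},
    "Application": {"apply", "demonstrate", "solve", "use"},
    "Analysis": {"analyze", "compare", "contrast", "examine"},
    "Synthesis": {"assemble", "compose", "create", "design"},
    "Evaluation": {"appraise", "assess", "defend", "judge"},
}

@dataclass
class BehavioralObjective:
    performance: str   # What will they do?
    condition: str     # Under what conditions?
    criterion: str     # How well?
    bloom_level: str   # One of the six cognitive-domain levels

    def verb_matches_level(self, verb: str) -> bool:
        """Check that the action verb sits at the intended Bloom level."""
        return verb.lower() in BLOOM_VERBS.get(self.bloom_level, set())

obj = BehavioralObjective(
    performance="create effective behavioral objectives",
    condition='given the "Write Performance Objectives" portion of the IIDT',
    criterion="that define how well learners should perform on the job",
    bloom_level="Synthesis",
)
assert obj.verb_matches_level("create")  # "create" is a Synthesis verb
```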
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of prerequisite
skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they
are learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, Briggs, & Wager, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners gained from the
course will transfer to their jobs.
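Because the nine events are an ordered checklist rather than a fixed script (see the Helpful Tips that follow), they can be treated as data a designer trims or reorders. A minimal sketch; the helper function and its name are ours, not part of Gagné's model:

```python
# Gagné's (1992) nine events as a starting checklist; the helper is
# illustrative only.
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

def draft_lesson_outline(omit=()):
    """Start from all nine events and drop any the design does not need."""
    return [event for event in GAGNE_EVENTS if event not in omit]

# e.g., a short refresher course might skip the prior-learning recall:
outline = draft_lesson_outline(omit=("Stimulate recall of prior learning",))
```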
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table below was adapted from Strategies for Developing Instructional Materials for the
Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Need discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can use with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
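Captured as data, the table supports simple filtering when one constraint (here, audience size) dominates the media decision. A sketch under our own naming, with only two rows of the table shown; this is not a prescribed selection method:

```python
# Two rows of the media table above, restated as data (illustrative only).
MEDIA = {
    "Role Playing": {"considerations": ["Difficult with large groups",
                                        "Can require extensive guidance to be effective"]},
    "Video": {"considerations": ["Technology requirements", "Difficult to adapt",
                                 "Need discussion & practice opportunities"]},
}

def workable_for_large_groups(media=MEDIA):
    """Drop media types whose considerations flag trouble with large groups."""
    return [name for name, row in media.items()
            if "Difficult with large groups" not in row["considerations"]]

print(workable_for_large_groups())  # ['Video']
```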
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International (1999)
identifies seven questions the Formative Evaluation should attempt to answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to do
formative training evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how it has affected workplace
performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
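Since the Helpful Tips that follow warn against stopping at the first two levels, an evaluation plan can be checked mechanically for that mistake. A minimal sketch; the level names are Kirkpatrick's, the code and function name are ours:

```python
# Kirkpatrick's (2009) four levels as data, plus a check that a plan
# reaches the value-indicating levels (3 and 4). Illustrative only.
KIRKPATRICK_LEVELS = {
    1: "Learner Reaction",
    2: "Performance Evaluation",
    3: "Behavior Evaluation",
    4: "Effect on the Organization",
}

def covers_value_levels(planned_levels):
    """True if the plan includes Behavior (3) and Results (4)."""
    return {3, 4} <= set(planned_levels)

assert covers_value_levels([1, 2, 3, 4])
assert not covers_value_levels([1, 2])   # a "smile sheet"-only plan
```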
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 4
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 5
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of
three parts:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#: 1
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
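Mager's three parts compose naturally into a single objective statement. Below is a minimal
Python sketch of that composition; the BehavioralObjective class and its statement helper are
illustrative names, not part of Mager's method:

    from dataclasses import dataclass

    @dataclass
    class BehavioralObjective:
        performance: str  # What will they do?
        condition: str    # Under what conditions?
        criterion: str    # How well?

        def statement(self) -> str:
            # Combine the three parts into one complete objective sentence.
            return f"{self.condition}, {self.performance} {self.criterion}."

    # The example row from the table above:
    obj = BehavioralObjective(
        performance="learners should be able to create effective behavioral objectives",
        condition="given the “Write Performance Objectives” portion of the IIDT",
        criterion="that define how well learners should perform on the job",
    )
    print(obj.statement())

Printed, the example row above reads as one complete objective.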
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#: 1
Bloom's Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis (X) Synthesis ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
Under the Performance column, notice that the verb “create” corresponds to Bloom's Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom's Cognitive Domain, designers ensure learners are developing the correct KSABs.
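Designers who keep objectives in a script or spreadsheet can automate this verb-to-level
check. A minimal Python sketch, assuming the verb lists are exactly those given by Bloom et
al. (1956) above; the dictionary and function names are illustrative:

    # Verb lists restated from Bloom et al. (1956) as given in this tool.
    BLOOM_VERBS = {
        "Knowledge": {"arrange", "define", "duplicate", "label", "list", "memorize",
                      "name", "order", "recognize", "relate", "recall", "repeat",
                      "reproduce", "state"},
        "Comprehension": {"classify", "describe", "discuss", "explain", "express",
                          "identify", "indicate", "locate", "recognize", "report",
                          "restate", "review", "select", "translate"},
        "Application": {"apply", "choose", "demonstrate", "dramatize", "employ",
                        "illustrate", "interpret", "operate", "practice", "schedule",
                        "sketch", "solve", "use", "write"},
        "Analysis": {"analyze", "appraise", "calculate", "categorize", "compare",
                     "contrast", "criticize", "differentiate", "discriminate",
                     "distinguish", "examine", "experiment", "question", "test"},
        "Synthesis": {"arrange", "assemble", "collect", "compose", "construct",
                      "create", "design", "develop", "formulate", "manage",
                      "organize", "plan", "prepare", "propose", "set up", "write"},
        "Evaluation": {"appraise", "argue", "assess", "attach", "choose", "compare",
                       "defend", "estimate", "judge", "predict", "rate", "score",
                       "select", "support", "value", "evaluate"},
    }

    def levels_for_verb(verb: str) -> list[str]:
        """Return every cognitive-domain level whose verb list contains the verb."""
        v = verb.lower()
        return [level for level, verbs in BLOOM_VERBS.items() if v in verbs]

    # Matches the worked example above: "create" maps to the Synthesis level.
    print(levels_for_verb("create"))  # ['Synthesis']

Note that some verbs (e.g., “write”) appear at more than one level, so the lookup returns a
list and the designer still makes the final judgment call.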
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, given to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills identified during the instructional analysis
• Practice Tests – During instruction, given to let learners rehearse the new skills they are
learning and to allow for corrective feedback
• Post-tests – Following instruction, given to determine whether learners have achieved the
ability to carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, Briggs, & Wager, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from “Strategies for Developing Instructional
Materials for the Interpersonal Domain” (2010).
Develop and Select Instructional Materials
Simulations
Benefits:
• Permits independence in learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion & practice opportunities

Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can be used with any size group
• Provides opportunities for self-assessment
Considerations:
• Best used as a support tool
• Needs practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as the materials created for the course. SIL International (1999)
identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluation:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course's impact on ARCS factors (Attention, Relevance, Confidence, Satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how that improvement has carried over
to the workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
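Because evaluation plans often stop at the first two levels, the four levels can double as a
simple completeness checklist. A minimal Python sketch under that assumption; the names
here are illustrative, not part of Kirkpatrick's framework:

    KIRKPATRICK_LEVELS = {
        1: "Learner Reaction",
        2: "Performance Evaluation",
        3: "Behavior Evaluation",
        4: "Effect on the Organization",
    }

    def unplanned_levels(planned_measures: dict[int, str]) -> list[str]:
        """Return the names of Kirkpatrick levels that have no planned measure."""
        return [name for level, name in KIRKPATRICK_LEVELS.items()
                if level not in planned_measures]

    # Example: a plan that stops at a reaction survey and a post-test.
    plan = {1: "end-of-course reaction survey", 2: "post-test"}
    print(unplanned_levels(plan))
    # ['Behavior Evaluation', 'Effect on the Organization']

Run against a plan that covers only Reaction and Learning, the check flags the missing
Behavior and Results levels, the same gap the tips below warn against.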
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 6
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 7
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction:
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure what learners gained from the course
will transfer to their jobs.
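Because the events form an ordered but flexible sequence (see the Helpful Tips below), a
designer might treat them as a reorderable checklist. A minimal Python sketch follows; the
helper name is our own invention, only the event names come from Gagné:

GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

def build_plan(events, omit=None):
    """Keep the default ordering but allow events to be omitted,
    since the nine events are a guideline rather than a fixed script."""
    omit = omit or set()
    return [e for e in events if e not in omit]

# A short refresher course might skip the prior-learning recall step:
for step in build_plan(GAGNE_EVENTS, omit={"Stimulate recall of prior learning"}):
    print(step)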
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for Developing Instructional
Materials for the Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Need discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can use with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
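The table lends itself to a quick filtering aid. In the Python sketch below, the True/False flags
are our own coarse reading of the Benefits and Considerations columns (for example,
“Difficult with large groups” becomes large_groups=False); the budget flags in particular are
assumptions, not rules from the source:

MEDIA = {
    "Simulations":       {"large_groups": False, "low_budget": False},
    "Training Games":    {"large_groups": False, "low_budget": True},
    "Role Playing":      {"large_groups": False, "low_budget": True},
    "Interactive Games": {"large_groups": False, "low_budget": True},
    "Video":             {"large_groups": True,  "low_budget": False},
    "Job Aids":          {"large_groups": True,  "low_budget": True},
}

def candidates(large_group, tight_budget):
    """Return media types whose flags fit the stated constraints."""
    return [name for name, flags in MEDIA.items()
            if (not large_group or flags["large_groups"])
            and (not tight_budget or flags["low_budget"])]

# A large class on a tight budget points toward job aids:
print(candidates(large_group=True, tight_budget=True))  # -> ['Job Aids']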
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International (1999)
identifies seven questions the Formative Evaluation should attempt to answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluation:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS (attention, relevance, confidence,
satisfaction) factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
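Since each stage surfaces its own kind of issue, it helps to log findings stage by stage so
revision decisions stay tied to the stage that produced them. A small Python sketch (only the
stage names come from Dick and Carey; the record layout and sample findings are ours):

STAGES = ["Design Review", "Expert Review", "One-to-One",
          "Small Group", "Field Trials", "Ongoing Evaluation"]

# One list of findings per stage, kept in evaluation order.
findings = {stage: [] for stage in STAGES}

findings["Expert Review"].append("Module 2 procedure is out of date")
findings["Small Group"].append("Practice exercise instructions unclear")

# Report every stage that produced an issue, in order:
for stage, issues in findings.items():
    if issues:
        print(f"{stage}: {issues}")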
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved performance and how that improvement carries over to the
workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
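One way to honor the “all four levels” tip that follows is to give the evaluation report one slot
per level, so a report cannot silently stop at Reaction and Learning. A Python sketch with
illustrative field names and sample data (not from Kirkpatrick):

from dataclasses import dataclass, fields

@dataclass
class SummativeReport:
    reaction: str = ""   # Level 1: learner satisfaction
    learning: str = ""   # Level 2: KSABs gained
    behavior: str = ""   # Level 3: KSABs applied on the job
    results: str = ""    # Level 4: impact on the organization

    def missing_levels(self):
        """Names of the levels still lacking evidence."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

report = SummativeReport(reaction="4.2/5 average", learning="88% pass rate")
print(report.missing_levels())  # -> ['behavior', 'results']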
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959–2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 8
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 9
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (Attention, Relevance, Confidence, Satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after the Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how that improvement carries over to
the workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
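Where evaluation data is collected electronically, a per-level summary takes only a few lines
of code. The sketch below is purely illustrative: the records, field names, and 0–100 scale
are invented for the example, and Level 4 data would normally come from organizational
metrics rather than per-learner records.

# Hypothetical sketch: summarizing summative-evaluation data by
# Kirkpatrick level. The records and scale are invented for illustration;
# they are not part of Kirkpatrick's model itself.

from statistics import mean

# One record per learner; scores are on a 0-100 scale (assumption).
records = [
    {"reaction": 90, "post_test": 85, "on_job_rating": 70},
    {"reaction": 75, "post_test": 92, "on_job_rating": 80},
    {"reaction": 88, "post_test": 78, "on_job_rating": 65},
]

summary = {
    "Level 1 (Reaction)": mean(r["reaction"] for r in records),
    "Level 2 (Learning)": mean(r["post_test"] for r in records),
    "Level 3 (Behavior)": mean(r["on_job_rating"] for r in records),
}
# Level 4 (Results) usually comes from organizational measures
# (error rates, sales, rework) rather than per-learner records.

for level, score in summary.items():
    print(f"{level}: {score:.1f}")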
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY:
HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959–2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from
http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 13
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 14
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to achieve the results identified in the goal analysis. The primary method is a Task Analysis: a list of the steps and skills used for each procedure in the course being designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of process.
Step 5: Is the task procedural?
• If yes, identify the steps and go to Step 7.
• If no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
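The only decision point in the steps above is Step 5, which branches on whether the task is procedural or principle-based. The Python sketch below makes that flow explicit; it is our illustration, and every name in it (the function, the dictionary keys, the sample task) is hypothetical rather than from the source.

# Illustrative walk-through of the seven-step job task analysis above.
# Steps 1-4 (learners, functions, tasks, stages) happen before this point.

def analyze_task(task):
    result = {"task": task["name"]}
    if task["is_procedural"]:                      # Step 5: is the task procedural?
        result["steps"] = task["steps"]            # yes -> list the steps (to Step 7)
    else:
        result["guidelines"] = task["guidelines"]  # Step 6: principle-based guidelines
    result["knowledge"] = task["required_knowledge"]  # Step 7: supporting knowledge
    return result

# Example: a procedural task
print(analyze_task({
    "name": "Process a refund",
    "is_procedural": True,
    "steps": ["Verify receipt", "Approve amount", "Issue credit"],
    "required_knowledge": ["Refund policy", "POS system"],
}))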
Conduct Instructional Analysis
Helpful Tips
• Use the steps identified in the task analysis as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions, and personnel files to determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or reflective, sequential or global
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners’ environment will be like on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual workplace? Are there physical, social, or motivational constraints to the use of the new skills?” (Nicholson, 2004)
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be simulated at the site? Can anything be done to make it more realistic?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant the information to be learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
Example #1
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How well?): That define how well learners should perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize, relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate, recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret, operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop, formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate, judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Example #1
Bloom’s Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis (X) Synthesis ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How well?): That define how well learners should perform on the job
Under the Performance column, notice that the verb “create” corresponds to Bloom’s Synthesis level. When developing activities for this objective, designers should ask learners to “create effective behavioral objectives.” By making activities congruent with the correct level in Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs.
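The congruence check just described can be pictured as a verb-to-level lookup: take the action verb from Mager’s Performance column and confirm it belongs to the intended Bloom level. The Python sketch below is our illustration of that idea, using only a small excerpt of the verb lists quoted above; it is not part of the original tool.

# Minimal congruence check between a performance verb and a Bloom level,
# using a small excerpt of the verb lists above. Illustrative only.

BLOOM_VERBS = {
    "Knowledge":     {"define", "list", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application":   {"apply", "demonstrate", "solve", "use"},
    "Analysis":      {"analyze", "compare", "contrast", "differentiate"},
    "Synthesis":     {"create", "design", "develop", "plan"},
    "Evaluation":    {"assess", "defend", "judge", "evaluate"},
}

def is_congruent(verb, intended_level):
    """True if the objective's action verb belongs to the intended Bloom level."""
    return verb.lower() in BLOOM_VERBS.get(intended_level, set())

# The worked example: "create" sits at the Synthesis level, not Knowledge.
assert is_congruent("create", "Synthesis")
assert not is_congruent("create", "Knowledge")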
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, given to assess learners’ mastery of prerequisite skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, given to determine if learners have achieved the ability to carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies and techniques that most effectively achieve the performance objectives” (Gagné, Briggs, & Wager, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which learners can utilize on the job. Find a way to ensure what learners gained from the course will transfer to their jobs.
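Because the Helpful Tips below stress that the nine events can be reordered, combined, or omitted, a lesson plan is better modeled as an ordered selection from the event list than as a fixed pipeline. The Python sketch below is our illustration of that point; nothing in it beyond the event names comes from Gagné.

# The nine events as a template from which a lesson plan selects and orders
# activities. Illustrative sketch only; event names follow the list above.

GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer",
]

def build_lesson_plan(chosen):
    """Build a plan from 1-based event numbers, in the designer's chosen order.

    Events may be reordered, repeated, or omitted -- a guideline, not a blueprint.
    """
    return [GAGNE_EVENTS[i - 1] for i in chosen]

# A short refresher course might skip the objectives and recall events:
print(build_lesson_plan([1, 4, 6, 7, 8]))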
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training (e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See the table of media types below, along with benefits and considerations for each. The table was adapted from Strategies for Developing Instructional Materials for the Interpersonal Domain (2010).
Develop and Select Instructional Materials
Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback is important to success

Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Need discussion & practice opportunities

Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can use with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and improve instruction as well as materials created for the course. SIL International (1999) identifies seven questions a Formative Evaluation should attempt to answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them? (“How to Do Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of the instruction as a whole. Summative Evaluation begins after Formative Evaluations are complete and the instruction has been implemented. It provides information about how much the instruction has improved performance and how that improvement carries over to the workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired impact on the organization?
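Because the four levels form an ordered scale, an evaluation plan can be audited for the levels it leaves unmeasured, which is the point of the Helpful Tips that follow. The Python sketch below is our illustration; it uses the canonical Kirkpatrick level names (Reaction, Learning, Behavior, Results), and the audit function is hypothetical.

# Kirkpatrick's four levels as an ordered scale, with the guiding question
# from the slide above attached as a comment. Illustrative sketch only.
from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    REACTION = 1   # Were the learners satisfied with the training?
    LEARNING = 2   # Did they gain the intended KSABs?
    BEHAVIOR = 3   # Are they applying their newly acquired KSABs on the job?
    RESULTS = 4    # Did the training achieve the desired organizational impact?

def evaluation_gaps(measured):
    """Return the levels a summative evaluation plan does not yet cover."""
    return [level for level in KirkpatrickLevel if level not in measured]

# A plan that only collects smile sheets and post-tests misses the two levels
# that demonstrate value to the organization:
print(evaluation_gaps({KirkpatrickLevel.REACTION, KirkpatrickLevel.LEARNING}))
# -> levels 3 (BEHAVIOR) and 4 (RESULTS)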
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust the instructional intervention accordingly. You may have to revise your goals, objectives, or analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals (Handbook I: Cognitive domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.). Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50 years, 1959-2009. Retrieved from http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them (3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010). Retrieved from http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 15
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 16
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are an overarching vision of what the course will
accomplish; objectives are measurable descriptions of what you want the learners to
demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of
three components:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below (a sketch of how such an objective might be stored follows it):

Example objective 1:
• Performance (What will they do?): Learners should be able to create effective behavioral
objectives
• Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
• Criterion (How Well?): That define how well learners should perform on the job
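If objectives are stored or generated programmatically, Mager's three components suggest a simple structure. The following Python sketch is hypothetical (the class and method names are ours, not Mager's); it shows only that the three parts compose into a complete objective statement.

from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    performance: str  # What will they do?
    condition: str    # Under what conditions?
    criterion: str    # How well?

    def as_sentence(self) -> str:
        # Compose the three Mager (1997) components into one statement.
        return (f"{self.condition}, learners should be able to "
                f"{self.performance} {self.criterion}.")

obj = BehavioralObjective(
    performance="create effective behavioral objectives",
    condition='Given the "Write Performance Objectives" portion of the IIDT',
    criterion="that define how well learners should perform on the job")
print(obj.as_sentence())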
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Example objective 1:
• Bloom’s Taxonomy Level:
( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis (X) Synthesis ( ) Evaluation
• Performance (What will they do?): Learners should be able to create effective behavioral
objectives
• Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
• Criterion (How Well?): That define how well learners should perform on the job
In the Performance column, notice that the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs. One
way to automate this verb-to-level check is sketched below.
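The sketch below is our own illustration, not part of Bloom's work: the verb lists are abridged, and since some verbs appear at more than one level in the full taxonomy, the lookup returns the first (lowest) match.

# Abridged verb lists per Bloom et al. (1956); the lookup itself is
# our own illustration. Dict order gives lowest-level-first matching.
BLOOM_VERBS = {
    "Knowledge":     {"define", "label", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify", "restate"},
    "Application":   {"apply", "demonstrate", "illustrate", "operate", "solve"},
    "Analysis":      {"analyze", "categorize", "contrast", "differentiate"},
    "Synthesis":     {"assemble", "compose", "construct", "create", "design"},
    "Evaluation":    {"appraise", "argue", "assess", "defend", "judge", "rate"},
}

def bloom_level(verb: str) -> str:
    """Return the first cognitive level whose verb list contains `verb`."""
    for level, verbs in BLOOM_VERBS.items():
        if verb.lower() in verbs:
            return level
    return "unclassified"

print(bloom_level("create"))   # Synthesis
print(bloom_level("explain"))  # Comprehension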
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have the necessary prerequisites to learn the skills taught in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before developing the lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop; their
timing is summarized in the sketch after the list:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of prerequisite
skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills identified during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they are
learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine whether learners have achieved the
ability to carry out the performance objectives
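The timing and purpose of the four instruments can be kept in a small data table. A minimal sketch follows; the structure and wording are our own, and only the four instrument types come from the list above.

# The four instrument types and their timing, per "ID Final Project:
# Lesson 7" (2003); the tuple layout is our own illustration.
ASSESSMENT_PLAN = [
    ("before instruction", "Entry Behaviors Test",
     "assess mastery of prerequisite skills"),
    ("before instruction", "Pre-test",
     "assess prior mastery of course skills"),
    ("during instruction", "Practice Tests",
     "rehearse new skills with corrective feedback"),
    ("after instruction",  "Post-test",
     "verify learners can perform the objectives"),
]

for timing, instrument, purpose in ASSESSMENT_PLAN:
    print(f"{timing:<19} {instrument}: {purpose}")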
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction (a planning-checklist sketch follows the list):
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
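Because the nine events work well as a planning checklist, some designers track them against planned activities. The following is a hypothetical sketch: the helper and its behavior are our own illustration, and only the event names come from Gagné.

# Gagné's (1992) nine events used as a lesson-planning checklist.
# The checklist helper is our own illustration, not Gagné's.
GAGNE_EVENTS = [
    "Gain attention", "Inform learners of objectives",
    "Stimulate recall of prior learning", "Present the content",
    "Provide learning guidance", "Elicit performance (practice)",
    "Provide feedback", "Assess performance",
    "Enhance retention and transfer to the job",
]

def checklist(planned: dict) -> None:
    # A gap is a prompt to decide deliberately, not an error: as the
    # Helpful Tips below note, events may be moved, combined, or omitted.
    for i, event in enumerate(GAGNE_EVENTS, start=1):
        print(f"{i}. {event}: {planned.get(event, '-- none planned --')}")

checklist({"Gain attention": "opening scenario video",
           "Present the content": "demo of the IPT tool"})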
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
The table below, adapted from “Strategies for Developing Instructional Materials for the
Interpersonal Domain” (2010), lists media types along with their benefits and considerations.
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback is important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can be used with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Needs practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International (1999)
identifies seven questions a Formative Evaluation should attempt to answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluation (one way to track them is
sketched after the list):
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
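Teams sometimes track these stages as an ordered pipeline so nothing is skipped. A minimal sketch follows; the status-tracking structure is our own illustration, and only the stage names are Dick and Carey's.

# Dick and Carey's (1996) six formative-evaluation stages as an
# ordered pipeline; the tracking structure is our own illustration.
FORMATIVE_STAGES = [
    ("Design Review", "design matches the analysis"),
    ("Expert Review", "content is accurate"),
    ("One-to-One", "course impact on ARCS factors"),
    ("Small Group", "one-to-one feedback verified; new issues found"),
    ("Field Trials", "small-group feedback verified; context issues found"),
    ("Ongoing Evaluation", "training remains relevant over time"),
]

completed = {"Design Review", "Expert Review"}  # example progress
for stage, focus in FORMATIVE_STAGES:
    status = "done" if stage in completed else "pending"
    print(f"[{status:>7}] {stage}: {focus}")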
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved performance and how that improvement carries over to the
workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation (a tracking sketch follows the list):
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
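When summative results are gathered, pairing each level with its evidence source helps keep the evaluation honest at all four levels. In the sketch below, the level names come from the list above; the evidence entries and the structure are our own hypothetical illustrations.

# Kirkpatrick's (2009) four levels paired with example evidence
# sources; the evidence entries are hypothetical illustrations.
KIRKPATRICK_LEVELS = {
    1: "Learner Reaction",
    2: "Performance Evaluation",
    3: "Behavior Evaluation",
    4: "Effect on the Organization",
}

evidence = {
    1: "end-of-course satisfaction survey",
    2: "post-test scores vs. performance objectives",
    3: "manager observations 90 days after training",
    4: "change in error rate on the targeted process",
}

for level, name in KIRKPATRICK_LEVELS.items():
    print(f"Level {level} ({name}): {evidence.get(level, 'no measure')}")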
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY:
HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959–2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from
http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 17
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 18
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure that what learners gained from the
course will transfer to their jobs.
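As an illustration, the nine events can be treated as a reorderable checklist for auditing a draft lesson plan. The Python sketch below is hypothetical (GAGNE_EVENTS and audit_lesson are not from the source), and, per the Helpful Tips that follow, a missing event is a prompt to reconsider rather than an automatic defect.

# Illustrative only: Gagné's 9 Events as a checklist for auditing a
# draft lesson plan. Event names follow the list above.
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer",
]

def audit_lesson(covered):
    """Return events the draft plan does not yet address. A gap is a
    prompt to reconsider, not an automatic defect: events may be moved,
    merged, or omitted (see the Helpful Tips below)."""
    return [event for event in GAGNE_EVENTS if event not in covered]

draft = {"Gain attention", "Present the content", "Provide feedback"}
for gap in audit_lesson(draft):
    print("Consider:", gap)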
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See the table below for media types along with their benefits and considerations. The table
was adapted from Strategies for Developing Instructional Materials for the Interpersonal
Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can be used with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Needs practice opportunities to ensure transfer
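The Considerations column can also be encoded as data, so candidate media types can be filtered against a constraint such as group size. The Python sketch below mirrors the table above; the CONSIDERATIONS dictionary and the helper function are illustrative assumptions, not part of the tool.

# Illustrative only: the Considerations column as data, filtered by a
# group-size constraint drawn from the table's own wording.
CONSIDERATIONS = {
    "Simulations":       {"Can be expensive", "Feedback important to success"},
    "Training Games":    {"Difficult with large groups",
                          "Can require extensive guidance to be effective"},
    "Role Playing":      {"Difficult with large groups",
                          "Can require extensive guidance to be effective"},
    "Interactive Games": {"Best with individuals or small groups",
                          "May require support materials to ensure learning"},
    "Video":             {"Technology requirements", "Difficult to adapt",
                          "Needs discussion & practice opportunities"},
    "Job Aids":          {"Good as a support tool",
                          "Needs practice opportunities to ensure transfer"},
}

GROUP_SIZE_FLAGS = {"Difficult with large groups", "Best with individuals or small groups"}

def ok_for_large_groups(media):
    """True when the table lists no group-size concern for this media type."""
    return not (CONSIDERATIONS[media] & GROUP_SIZE_FLAGS)

print([m for m in CONSIDERATIONS if ok_for_large_groups(m)])
# -> ['Simulations', 'Video', 'Job Aids']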
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International (1999)
identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS factors (Attention, Relevance,
Confidence, Satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved performance and how it has affected performance in the
workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
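As a sketch, the four levels can be captured alongside their guiding questions and reviewed from Level 4 down, echoing the earlier tip to decide the desired Results first and work backward. The structure below is a hypothetical illustration, not Kirkpatrick’s own notation or part of the tool.

# Illustrative only: the four levels with their guiding questions,
# reviewed from Level 4 down, per the tip to decide desired Results first.
KIRKPATRICK = {
    1: ("Learner Reaction", "Were the learners satisfied with the training?"),
    2: ("Performance Evaluation", "Did they gain the intended KSABs?"),
    3: ("Behavior Evaluation", "Are they applying the new KSABs on the job?"),
    4: ("Effect on the Organization",
        "Did the training achieve the desired organizational impact?"),
}

for level in sorted(KIRKPATRICK, reverse=True):  # plan backward: 4 -> 1
    name, question = KIRKPATRICK[level]
    print(f"Level {level} ({name}): {question}")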
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 19
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 20
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) list six stages of Formative Evaluation:
1. Design Review – Determine whether the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course's impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before the official roll-out
of the training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how it has affected performance in the
workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick's model.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals (Handbook I: Cognitive domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.). Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50 years, 1959–2009. Retrieved from http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them (3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010). Retrieved from http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 21
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 22
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 25
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified in the goal analysis. The primary method is a Task Analysis: a list of the steps and skills required for each procedure in the course being designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of process.
Step 5: Is the task procedural?
• If yes, identify the steps and go to Step 7.
• If no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
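For designers who want to prototype this flow, the sketch below (ours, not part of the cited seminar manual) encodes the Step 5 branch in Python. The Task fields, function name, and placeholder notes are illustrative assumptions, not part of the IPT tool.

from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    is_procedural: bool                                  # the Step 5 decision
    steps: list[str] = field(default_factory=list)       # Step 5, "yes" branch
    guidelines: list[str] = field(default_factory=list)  # Step 6, "no" branch
    knowledge: list[str] = field(default_factory=list)   # Step 7 output

def analyze_task(task: Task, notes: list[str]) -> Task:
    # Route one task through Steps 5-7 of the job task analysis above.
    if task.is_procedural:
        task.steps = notes       # Step 5 (yes): record the steps, then go to Step 7
    else:
        task.guidelines = notes  # Step 6: record guidelines of the principle-based task
    task.knowledge = [f"Prerequisite knowledge for: {task.name}"]  # Step 7
    return task

# Example: a procedural task takes the yes-branch through Step 5.
reset = analyze_task(Task("Reset a user password", is_procedural=True),
                     ["Open the admin console", "Locate the user", "Send a reset link"])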
Conduct Instructional Analysis
Helpful Tips
• Use the identified tasks and steps as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions, and personnel files to determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or reflective, sequential or global
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners’ environment will be like on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual workplace? Are there physical, social, or motivational constraints to the use of the new skills?” (Nicholson, 2004)
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be simulated at the site? Can anything be done to make the simulation more realistic?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant the information to be learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of three parts:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
# 1
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
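As an illustration only, the three parts of a Mager-style objective can be modeled as a small data structure; the class name, field names, and render method below are our assumptions, not part of Mager (1997) or the IIDT.

from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    performance: str  # What will they do?
    condition: str    # Under what conditions?
    criterion: str    # How well?

    def render(self) -> str:
        # Condition-first reads naturally: "Given X, learners will Y, Z."
        return f"{self.condition}, {self.performance} {self.criterion}."

obj = BehavioralObjective(
    performance="learners should be able to create effective behavioral objectives",
    condition='Given the "Write Performance Objectives" portion of the IIDT',
    criterion="that define how well learners should perform on the job",
)
print(obj.render())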
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize, relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate, recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret, operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop, formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate, judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
# 1
Bloom’s Taxonomy Level: ( ) Knowledge  ( ) Comprehension  ( ) Application  ( ) Analysis  (X) Synthesis  ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
Under the Performance column, notice the verb “create” corresponds to Bloom’s Synthesis level. When developing activities for this objective, designers should ask learners to “create effective behavioral objectives.” By making activities congruent with the correct level in Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs.
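To make the congruence check concrete, here is a minimal sketch assuming abridged verb lists drawn from the six levels above; the dictionary and function names are hypothetical, and the lookup only suggests a level rather than replacing designer judgment.

BLOOM_VERBS = {
    "Knowledge":     {"arrange", "define", "label", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify", "restate"},
    "Application":   {"apply", "demonstrate", "illustrate", "operate", "solve", "use"},
    "Analysis":      {"analyze", "compare", "contrast", "differentiate", "examine"},
    "Synthesis":     {"arrange", "compose", "construct", "create", "design", "develop"},
    "Evaluation":    {"appraise", "assess", "defend", "judge", "rate", "evaluate"},
}

def bloom_levels(verb: str) -> list[str]:
    # Return every level whose verb list contains the verb; some verbs
    # (e.g., "arrange") legitimately appear at more than one level.
    return [level for level, verbs in BLOOM_VERBS.items() if verb.lower() in verbs]

print(bloom_levels("create"))   # ['Synthesis'] - matches the example above
print(bloom_levels("arrange"))  # ['Knowledge', 'Synthesis']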
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of prerequisite skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered some of the course skills determined during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they are learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine if learners have achieved the ability to carry out the performance objectives
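One way to keep these four instruments straight is to key them by when they are administered relative to instruction; the plan structure and function below are an illustrative sketch of ours, not part of the cited lesson.

ASSESSMENT_PLAN = [
    ("before", "Entry Behaviors Test", "mastery of prerequisite skills"),
    ("before", "Pre-test", "skills already mastered, per the instructional analysis"),
    ("during", "Practice Tests", "rehearsal of new skills with corrective feedback"),
    ("after",  "Post-tests", "ability to carry out the performance objectives"),
]

def instruments_for(phase: str) -> list[str]:
    # List the instruments given in a phase: "before", "during", or "after".
    return [name for when, name, _ in ASSESSMENT_PLAN if when == phase]

print(instruments_for("before"))  # ['Entry Behaviors Test', 'Pre-test']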
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc., to help learners better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc., which learners can use on the job. Find a way to ensure what learners gained from the course will transfer to their jobs.
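As a rough planning aid, the nine events can be treated as a checklist to audit a draft lesson plan against. The plan format and audit function below are our assumptions, and, as the tips below note, a missing event may be a deliberate omission rather than an error.

GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer",
]

def audit(plan: dict[str, str]) -> list[str]:
    # Report the events the draft plan does not yet address; a gap may be a
    # deliberate omission, since the nine events are a guideline, not a rule.
    return [event for event in GAGNE_EVENTS if event not in plan]

draft_plan = {
    "Gain attention": "Open with a short failed-training anecdote",
    "Present the content": "Walk through the three parts of a Mager objective",
}
print(audit(draft_plan))  # the seven events still to plan (or consciously skip)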
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training (e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table below was adapted from Strategies for Developing Instructional Materials for the Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Need discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can use with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and improve instruction as well as materials created for the course. SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS (attention, relevance, confidence, satisfaction) factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
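Here is a sketch of how the six stages might be chained in practice, with each stage surfacing revision items that feed the Revise Instruction phase; the stage functions and the example findings are hypothetical, not from Dick and Carey (1996).

from typing import Callable

Stage = tuple[str, Callable[[], list[str]]]

def run_formative(stages: list[Stage]) -> dict[str, list[str]]:
    # Execute each stage's review in order; keep only stages with findings,
    # which become the input to the Revise Instruction phase.
    findings: dict[str, list[str]] = {}
    for name, review in stages:
        issues = review()
        if issues:
            findings[name] = issues
    return findings

stages: list[Stage] = [
    ("Design Review", lambda: []),  # design matches the analysis, no issues
    ("Expert Review", lambda: ["Update the outdated policy example"]),
    ("One-to-One",    lambda: ["Learners overlook the Helpful Tips icon"]),
]
print(run_formative(stages))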
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of the instruction as a whole. Summative Evaluation begins after Formative Evaluations are complete and the instruction has been implemented. It provides information about how much the instruction has improved learning and how it has affected workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired impact on the organization?
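The four levels form an ordered scale, which the sketch below models so a summative plan can be checked for uncovered levels; the enum comments paraphrase the table above, while the class layout, function name, and example plan are our assumptions.

from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    REACTION = 1  # Were the learners satisfied with the training?
    LEARNING = 2  # Did they gain the intended KSABs?
    BEHAVIOR = 3  # Are they applying the new KSABs on the job?
    RESULTS = 4   # Did the training achieve the desired organizational impact?

def uncovered(measured: set[KirkpatrickLevel]) -> list[KirkpatrickLevel]:
    # Levels a summative evaluation plan does not yet measure.
    return [level for level in KirkpatrickLevel if level not in measured]

# A plan built only on smile sheets and post-tests misses Levels 3 and 4:
print(uncovered({KirkpatrickLevel.REACTION, KirkpatrickLevel.LEARNING}))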
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust the instructional intervention accordingly. You may have to revise your goals, objectives, or analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50 years, 1959-2009. Retrieved from http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010). Retrieved from http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 26
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 27
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are an overarching vision of what the course will
accomplish; objectives are measurable descriptions of what you want the learners to
demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of
three components:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#1
Performance (What will they do?): Learners should be able to create effective behavioral
objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
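Because every Mager-style objective has the same three parts, the format can be expressed
as a simple template. The sketch below is our own hypothetical illustration of assembling the
example above into one sentence; the class and method names are ours, not Mager's:

```python
from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    performance: str  # What will they do?
    condition: str    # Under what conditions?
    criterion: str    # How well?

    def render(self) -> str:
        """Assemble the three components into a single objective statement."""
        return f"{self.condition}, {self.performance} {self.criterion}."

example = BehavioralObjective(
    performance="learners should be able to create effective behavioral objectives",
    condition='Given the "Write Performance Objectives" portion of the IIDT',
    criterion="that define how well learners should perform on the job",
)
print(example.render())
```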
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
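To make the level-to-verb mapping operational, a designer could keep the verb lists in a small
lookup table and check which level a draft objective's action verb falls under. The Python
sketch below is our own hypothetical illustration, with abridged verb lists; note that some
verbs (such as “write” and “appraise”) appear at more than one level, so the lookup is a
guide rather than a rule:

```python
# Hypothetical lookup from action verb to Bloom et al. (1956) cognitive level.
# Verb lists are abridged from the list above; a real tool would include all of them.
BLOOM_VERBS = {
    "Knowledge": {"define", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application": {"apply", "demonstrate", "illustrate", "solve", "use"},
    "Analysis": {"analyze", "compare", "contrast", "differentiate"},
    "Synthesis": {"assemble", "compose", "create", "design", "develop"},
    "Evaluation": {"assess", "defend", "judge", "score", "support"},
}

def bloom_level(verb: str) -> str | None:
    """Return the first level whose abridged verb list contains the verb."""
    for level, verbs in BLOOM_VERBS.items():
        if verb.lower() in verbs:
            return level
    return None

print(bloom_level("create"))  # Synthesis, matching the example that follows
```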
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#1
Bloom’s Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis
(X) Synthesis ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective behavioral
objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
Under the Performance column, notice that the verb “create” corresponds to Bloom’s
Synthesis level. When developing activities for this objective, designers should ask learners
to “create effective behavioral objectives.” By making activities congruent with the correct
level in Bloom’s Cognitive Domain, designers ensure learners are developing the correct
KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have the necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of the lesson plan and
instructional materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of
prerequisite skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills identified during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they are
learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine whether learners have achieved the
ability to carry out the performance objectives
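One way to act on pre-test and post-test data is to compare scores objective by objective
and flag objectives where instruction produced little gain. The sketch below is a hypothetical
illustration; the data shape and the 20-point gain threshold are our assumptions, not part of
the source:

```python
# Hypothetical pre/post comparison per performance objective.
# Scores are proportions correct; the 0.20 gain threshold is an assumption.
pre_scores = {"objective_1": 0.40, "objective_2": 0.55}
post_scores = {"objective_1": 0.85, "objective_2": 0.60}

for objective in pre_scores:
    gain = post_scores[objective] - pre_scores[objective]
    flag = "" if gain >= 0.20 else "  <- revisit this instruction"
    print(f"{objective}: pre={pre_scores[objective]:.0%} "
          f"post={post_scores[objective]:.0%} gain={gain:+.0%}{flag}")
```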
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching
strategies and techniques that most effectively achieve the performance objectives” (Gagné,
1992). Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure that what learners gained from the
course will transfer to their jobs.
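Because the nine events work well as a planning checklist, a designer might track which
events the draft lesson already covers. The sketch below is our own hypothetical illustration;
the activity entries in the plan are invented examples:

```python
# Gagné's nine events (from the list above) used as a planning checklist.
# The plan entries are invented examples, not part of the source.
GAGNE_EVENTS = [
    "Gain attention", "Inform learners of objectives",
    "Stimulate recall of prior learning", "Present the content",
    "Provide learning guidance", "Elicit performance",
    "Provide feedback", "Assess performance",
    "Enhance retention and transfer",
]

plan = {
    "Gain attention": "Opening scenario video",
    "Present the content": "Module 1 walkthrough",
    "Elicit performance": "Guided practice worksheet",
}

# Unchecked events are candidates to add, combine, or deliberately omit.
for event in GAGNE_EVENTS:
    print(f"[{'x' if event in plan else ' '}] {event}: {plan.get(event, '-')}")
```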
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for Developing Instructional
Materials for the Interpersonal Domain (2010).
Develop and Select Instructional Materials
Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback is important to success

Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Interactive Games
Benefits:
• Highly motivational
• Engages learners
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion & practice opportunities

Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can be used with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Needs practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve the instruction as well as the materials created for the course. SIL International
(1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluation
Dick and Carey (1996) list six stages:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS factors (attention, relevance,
confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learning and how it has affected workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
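A designer could track summative evidence against all four levels in one place, making it
obvious when evaluation has stopped at Reaction and Learning. The sketch below is a
hypothetical illustration; the measures and results are invented examples:

```python
# Hypothetical record of summative-evaluation evidence per Kirkpatrick level.
# Measures and results are invented examples; None marks missing evidence.
evaluation = {
    1: {"measure": "Post-course satisfaction survey", "result": "4.2 / 5"},
    2: {"measure": "Post-test mastery rate", "result": "88% passed"},
    3: {"measure": "90-day on-the-job observation", "result": None},
    4: {"measure": "Quarterly error-rate change", "result": None},
}

for level, entry in sorted(evaluation.items()):
    status = entry["result"] or "NOT YET COLLECTED"
    print(f"Level {level}: {entry['measure']} -> {status}")
```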
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY:
HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 28
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 29
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure that what learners gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table below was adapted from Strategies for Developing Instructional Materials for the
Interpersonal Domain (2010).
Develop and Select Instructional Materials
Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback is important to success

Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give and receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion and practice opportunities

Job Aids
Benefits:
• Provide for rapid instruction
• Inexpensive
• Can be used with any size group
• Provide opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as the materials created for the course. SIL International (1999)
identifies seven questions a Formative Evaluation should attempt to answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS (attention, relevance, confidence, satisfaction) factors
4. Small Group – Verify feedback from one-to-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved performance and how that improvement carries over to the
workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 30
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 31
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluation:
1. Design Review – Determine whether the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS factors (attention, relevance,
confidence, satisfaction)
4. Small Group – Verify feedback from the one-to-one evaluations; look for additional issues
5. Field Trials – Verify feedback from the small-group evaluations; look for context-related
issues
6. Ongoing Evaluation – Continually evaluate the training to ensure it remains relevant
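Because the six stages are sequential, they lend themselves to a simple ordered checklist.
The following sketch is a hypothetical planning aid, not part of Dick and Carey’s model or
of this tool; only the stage names are taken from the list above.

STAGES = [
    "Design Review",
    "Expert Review",
    "One-to-One",
    "Small Group",
    "Field Trials",
    "Ongoing Evaluation",
]

def next_stage(completed: set[str]) -> str | None:
    # Return the earliest stage that has not been completed yet,
    # or None once every stage is done.
    for stage in STAGES:
        if stage not in completed:
            return stage
    return None

print(next_stage({"Design Review", "Expert Review"}))  # One-to-One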
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how it has affected performance in the
workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
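To make the progression from learner reaction to organizational results explicit, each
level can be paired with its guiding question, as in this illustrative sketch. The
dictionary is our own hypothetical encoding; the level names and questions come from the
list above.

# Hypothetical mapping of Kirkpatrick's four levels to their guiding questions.
KIRKPATRICK_LEVELS = {
    1: ("Learner Reaction", "Were the learners satisfied with the training?"),
    2: ("Performance Evaluation", "Did they gain the intended KSABs?"),
    3: ("Behavior Evaluation", "Are they applying their newly acquired KSABs to their job?"),
    4: ("Effect on the Organization",
        "To what degree did the training achieve the desired impact on the organization?"),
}

# Print an evaluation-planning worksheet, one guiding question per level.
for level, (name, question) in sorted(KIRKPATRICK_LEVELS.items()):
    print(f"Level {level} – {name}: {question}")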
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of the formative evaluations and
adjust the instructional intervention accordingly. You may have to revise your goals,
objectives, or analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY:
HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959-2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from
http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 35
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 36
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
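If goal statements are tracked electronically, the tip about avoiding non-observable verbs can be screened automatically. Below is a minimal Python sketch of that idea; the verb list and function are illustrative assumptions, not part of the IPT Tool.

```python
# Minimal sketch: flag goal statements that open with a non-observable verb.
# NON_OBSERVABLE is an illustrative assumption, not part of the IPT Tool.
NON_OBSERVABLE = {"know", "understand", "appreciate", "learn"}

def screen_goal(statement: str) -> str:
    """Warn when a goal statement starts with a verb that cannot be observed."""
    first_word = statement.strip().lower().split()[0]
    if first_word in NON_OBSERVABLE:
        return f"Revise: '{first_word}' is not observable; use an action verb."
    return "OK: goal opens with an observable action verb."

print(screen_goal("Understand the safety policy"))       # flagged
print(screen_goal("Demonstrate the lockout procedure"))  # passes
```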
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified in the goal analysis. The primary method is a Task Analysis: a
list of the steps and skills used for each procedure in the course being designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing
organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of process.
Step 5: Is the task procedural? If yes, identify the steps and go to Step 7; if no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
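The branch at Steps 5-7 (procedural vs. principle-based) can be expressed compactly in code. A hedged Python sketch follows; the function and field names are illustrative assumptions, not taken from the cited manual.

```python
# Sketch of the Step 5 branch in the job task analysis above:
# procedural tasks get a list of steps (Step 5), principle-based tasks get
# guidelines (Step 6); both paths end by listing required knowledge (Step 7).
def analyze_task(name: str, procedural: bool) -> dict:
    record = {"task": name, "knowledge": []}  # Step 7 applies on both paths
    if procedural:
        record["steps"] = []                  # Step 5: identify the steps
    else:
        record["guidelines"] = []             # Step 6: principle-based guidelines
    return record

print(analyze_task("Replace a toner cartridge", procedural=True))
print(analyze_task("Handle an upset customer", procedural=False))
```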
Conduct Instructional Analysis
Helpful Tips
• Use the identified tasks and steps as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files to determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or
reflective, sequential or global
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learner's environment will be like on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills?” (Nicholson, 2004)
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more realistic?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant the information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of
three parts:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
Example 1
Performance (What will they do?): Learners should be able to create effective behavioral objectives.
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT.
Criterion (How Well?): That define how well learners should perform on the job.
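For designers who keep objectives in a script or spreadsheet, the three-part structure maps directly onto a small record type. A minimal Python sketch, assuming a plain dataclass suffices; the class and field names are hypothetical, not taken from the IIDT.

```python
# Sketch: a record type mirroring Mager's three-part behavioral objective.
# The class and field names are illustrative, not taken from the IIDT.
from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    performance: str  # What will they do?
    condition: str    # Under what conditions?
    criterion: str    # How well?

example = BehavioralObjective(
    performance="Create effective behavioral objectives",
    condition='Given the "Write Performance Objectives" portion of the IIDT',
    criterion="That define how well learners should perform on the job",
)
print(example)
```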
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize, relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate, recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret, operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop, formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate, judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Example 1
Bloom's Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis (X) Synthesis ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective behavioral objectives.
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT.
Criterion (How Well?): That define how well learners should perform on the job.
Under the Performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs
(knowledge, skills, abilities, and behaviors).
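The congruence check described above (the objective's verb matches the intended Bloom level) can also be mechanized. A minimal Python sketch; the verb sets below are abbreviated samples of the lists on the previous page, so coverage is deliberately partial.

```python
# Sketch: check that an objective's verb matches the intended Bloom level.
# Verb sets are abbreviated samples of the lists above, not exhaustive.
BLOOM_VERBS = {
    "Knowledge": {"define", "list", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application": {"apply", "demonstrate", "solve", "use"},
    "Analysis": {"analyze", "compare", "contrast", "differentiate"},
    "Synthesis": {"assemble", "compose", "create", "design"},
    "Evaluation": {"assess", "defend", "judge", "evaluate"},
}

def verb_matches_level(verb: str, level: str) -> bool:
    """True when the verb appears in the sample list for that level."""
    return verb.lower() in BLOOM_VERBS.get(level, set())

print(verb_matches_level("create", "Synthesis"))  # True: congruent activity
print(verb_matches_level("list", "Synthesis"))    # False: verb sits at Knowledge
```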
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of prerequisite
skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they are
learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine if learners have achieved the ability to
carry out the performance objectives
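Read as a schedule, the four instrument types pair each test with a point on the instructional timeline. A small illustrative Python sketch; the tuple layout and phrasing are assumptions for demonstration only.

```python
# Sketch: the four instrument types keyed to when they are administered.
# Layout and wording are illustrative assumptions.
ASSESSMENT_PLAN = [
    ("Entry Behaviors Test", "before instruction", "mastery of prerequisite skills"),
    ("Pre-test", "before instruction", "prior mastery of course skills"),
    ("Practice Tests", "during instruction", "rehearsal with corrective feedback"),
    ("Post-tests", "after instruction", "achievement of performance objectives"),
]

for name, timing, purpose in ASSESSMENT_PLAN:
    print(f"{name}: administered {timing} to check {purpose}")
```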
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, Briggs, &
Wager, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure that what learners gained from the
course will transfer to their jobs.
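One lightweight way to apply the nine events is as a lesson-plan checklist the designer fills in event by event. A hedged Python sketch; the fill-in outline format is an assumption, not Gagné's.

```python
# Sketch: the nine events as a fill-in lesson-plan checklist.
# Events may be reordered, combined, or omitted (see Helpful Tips below).
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

for i, event in enumerate(GAGNE_EVENTS, start=1):
    print(f"{i}. {event}: <activity to be designed>")
```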
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table below was adapted from Strategies for Developing Instructional Materials for the
Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits: Permits independence in learning process; contextualizes content; can provide multiple perspectives; develops critical thinking skills
Considerations: Can be expensive; feedback important to success

Type: Training Games
Benefits: Highly motivational; encourages teamwork; uses problem solving skills; develops communication skills
Considerations: Difficult with large groups; can require extensive guidance to be effective

Type: Role Playing
Benefits: Introduces real world situations; promotes understanding of other positions; emphasizes working together; provides opportunities to give & receive feedback
Considerations: Difficult with large groups; can require extensive guidance to be effective

Type: Interactive Games
Benefits: Highly motivational; engages learner; develops strategic thinking skills
Considerations: Best with individuals or small groups; may require support materials to ensure learning

Type: Video
Benefits: Great for large groups; provides for safe observation; can include real life situations; can develop critical thinking
Considerations: Technology requirements; difficult to adapt; need discussion & practice opportunities

Type: Job Aids
Benefits: Provides for rapid instruction; inexpensive; can use with any size group; provides opportunities for self-assessment
Considerations: Good as a support tool; need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as the materials created for the course. SIL International (1999)
identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learning and how it has affected workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired impact on the organization?
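A summative plan often records one intended measure per Kirkpatrick level before any data collection begins. An illustrative Python sketch; the keys and sample measures are assumptions for demonstration only.

```python
# Sketch: one evaluation record spanning Kirkpatrick's four levels.
# Keys and sample measures are illustrative assumptions.
evaluation_plan = {
    "level_1_reaction": "post-course satisfaction survey score",
    "level_2_learning": "post-test pass rate on the performance objectives",
    "level_3_behavior": "30/60/90-day on-the-job observation checklist",
    "level_4_results": "change in the organizational metric the training targets",
}

for level, measure in evaluation_plan.items():
    print(f"{level}: {measure}")
```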
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959-2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 37
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 38
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below (a small data-structure sketch follows it):
Objective #1
• Performance (What will they do?): Learners should be able to create effective behavioral objectives
• Condition (What conditions?): Given the "Write Performance Objectives" portion of the IIDT
• Criterion (How well?): That define how well learners should perform on the job
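The three components also map naturally onto a small record type, which makes objectives easy to store, sort, and review. A minimal Python sketch, assuming a dataclass is an acceptable stand-in for one row of the example above; the class and method names are illustrative:

from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    """One Mager-style objective: performance, condition, criterion."""
    performance: str  # What will they do?
    condition: str    # Under what conditions?
    criterion: str    # How well?

    def as_statement(self) -> str:
        # Reading condition + performance + criterion as one sentence is a
        # quick check that the three parts actually fit together.
        return f"{self.condition}, {self.performance} {self.criterion}."

example = BehavioralObjective(
    performance="learners should be able to create effective behavioral objectives",
    condition='given the "Write Performance Objectives" portion of the IIDT',
    criterion="that define how well learners should perform on the job",
)
print(example.as_statement())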
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom's Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager's Performance column (a small verb-lookup sketch follows the list).
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
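Because the verb lists are flat, checking which level a chosen action verb belongs to is a simple lookup. A minimal sketch: the verb sets are copied from the six levels above (note that some verbs, such as "arrange" and "write", appear at more than one level, so the lookup returns every match); the function name is this sketch's own:

BLOOM_VERBS = {
    "Knowledge": {"arrange", "define", "duplicate", "label", "list", "memorize",
                  "name", "order", "recognize", "relate", "recall", "repeat",
                  "reproduce", "state"},
    "Comprehension": {"classify", "describe", "discuss", "explain", "express",
                      "identify", "indicate", "locate", "recognize", "report",
                      "restate", "review", "select", "translate"},
    "Application": {"apply", "choose", "demonstrate", "dramatize", "employ",
                    "illustrate", "interpret", "operate", "practice", "schedule",
                    "sketch", "solve", "use", "write"},
    "Analysis": {"analyze", "appraise", "calculate", "categorize", "compare",
                 "contrast", "criticize", "differentiate", "discriminate",
                 "distinguish", "examine", "experiment", "question", "test"},
    "Synthesis": {"arrange", "assemble", "collect", "compose", "construct",
                  "create", "design", "develop", "formulate", "manage",
                  "organize", "plan", "prepare", "propose", "set up", "write"},
    "Evaluation": {"appraise", "argue", "assess", "attach", "choose", "compare",
                   "defend", "estimate", "judge", "predict", "rate", "score",
                   "select", "support", "value", "evaluate"},
}

def bloom_levels_for(verb: str) -> list[str]:
    """Return every cognitive-domain level whose verb list contains `verb`."""
    return [level for level, verbs in BLOOM_VERBS.items() if verb.lower() in verbs]

print(bloom_levels_for("create"))  # ['Synthesis']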
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Objective #1
• Bloom's Taxonomy Level: Synthesis (checked from the six options: Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation)
• Performance (What will they do?): Learners should be able to create effective behavioral objectives
• Condition (What conditions?): Given the "Write Performance Objectives" portion of the IIDT
• Criterion (How well?): That define how well learners should perform on the job
Under the Performance column, notice the verb "create" corresponds to Bloom's Synthesis
level. When developing activities for this objective, designers should ask learners to "create
effective behavioral objectives." By making activities congruent with the correct level in
Bloom's Cognitive Domain, designers ensure learners develop the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have the necessary prerequisites to learn the skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners' progress
• Aid in creating Formative and Summative Evaluations
• Determine performance measures before developing the lesson plan and instructional
materials ("ID Final Project: Lesson 7," 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop
(condensed in the sketch after the list):
• Entry Behaviors Test – Given prior to instruction to assess learners' mastery of prerequisite
skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills identified during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they are
learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine whether learners have achieved the
ability to carry out the performance objectives
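Each instrument is defined by when it is administered and what it tells the designer, so the four types condense into a small table-like structure. A minimal sketch; the tuple layout and wording are this sketch's condensation of the list above:

# (instrument, when administered, what it tells the designer)
ASSESSMENT_INSTRUMENTS = [
    ("Entry Behaviors Test", "before instruction",
     "mastery of prerequisite skills"),
    ("Pre-test", "before instruction",
     "prior mastery of some course skills"),
    ("Practice Tests", "during instruction",
     "rehearsal of new skills with corrective feedback"),
    ("Post-tests", "after instruction",
     "achievement of the performance objectives"),
]

for name, timing, purpose in ASSESSMENT_INSTRUMENTS:
    print(f"{name}: {timing} -> {purpose}")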
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction (a checklist sketch follows the list):
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners' mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize on the job. Find a way to ensure what learners gained from the
course will transfer to their jobs.
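Because the Helpful Tips below note that events can be moved, combined, or omitted, the nine events behave less like a fixed procedure and more like a default checklist. A minimal Python sketch of that idea; the function and parameter names are illustrative:

GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

def lesson_plan(omit=frozenset()):
    """Gagné's default order, minus any events the designer chooses to skip."""
    return [event for event in GAGNE_EVENTS if event not in omit]

# e.g., a short refresher for experienced staff might skip the formal test:
print(lesson_plan(omit={"Assess performance"}))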
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See the table of media types below, along with benefits and considerations for each (a
filtering sketch follows the table). The table was adapted from Strategies for Developing
Instructional Materials for the Interpersonal Domain (2010).
Develop and Select Instructional Materials
Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback is important to success

Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion & practice opportunities

Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can be used with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Needs practice opportunities to ensure transfer
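Re-encoded as data, the table above can drive a quick first pass when narrowing down media types. A minimal sketch; the two attributes and their True/False values are rough readings of the Benefits and Considerations columns (the source does not state a cost for every type), so adjust them to your own situation:

# works_for_large_groups / low_cost are this sketch's shorthand readings of
# the table (e.g., "Difficult with large groups", "Can be expensive",
# "Inexpensive"); they are assumptions, not categories from the source.
MEDIA_TRAITS = {
    "Simulations":       {"works_for_large_groups": False, "low_cost": False},
    "Training Games":    {"works_for_large_groups": False, "low_cost": True},
    "Role Playing":      {"works_for_large_groups": False, "low_cost": True},
    "Interactive Games": {"works_for_large_groups": False, "low_cost": True},
    "Video":             {"works_for_large_groups": True,  "low_cost": False},
    "Job Aids":          {"works_for_large_groups": True,  "low_cost": True},
}

def candidate_media(need_large_groups, need_low_cost):
    """Return media types that satisfy the stated constraints."""
    return [media for media, traits in MEDIA_TRAITS.items()
            if (not need_large_groups or traits["works_for_large_groups"])
            and (not need_low_cost or traits["low_cost"])]

print(candidate_media(need_large_groups=True, need_low_cost=False))
# ['Video', 'Job Aids']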
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve the instruction, as well as the materials created for the course. SIL International
(1999) identifies seven questions a Formative Evaluation should attempt to answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to do
formative training evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluation:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-to-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved performance and how that improvement carries over to the
workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation (a report sketch follows the list):
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
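Since the Helpful Tips below insist on covering all four levels, an evaluation plan can make that requirement explicit by refusing to call itself complete until every level has findings. A minimal sketch; the class and field names are illustrative, not Kirkpatrick's:

from dataclasses import dataclass

@dataclass
class SummativeEvaluationReport:
    """Findings for one course offering, one field per Kirkpatrick level."""
    reaction: str = ""   # Level 1: were learners satisfied?
    learning: str = ""   # Level 2: did they gain the intended KSABs?
    behavior: str = ""   # Level 3: are the KSABs applied on the job?
    results: str = ""    # Level 4: impact on the organization?

    def covers_all_four_levels(self) -> bool:
        # An empty string at any level means the evaluation is not done yet.
        return all([self.reaction, self.learning, self.behavior, self.results])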
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959-2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 39
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 40
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure what learners gained from the course
will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint (see the sketch below).
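To make that flexibility concrete, a lesson outline based on the nine events can be treated as an ordered but editable list. The sketch below is hypothetical; GAGNE_EVENTS simply restates the list above, and the refresher example is an assumption for illustration.

# Sketch: Gagné's nine events as a starting outline that a designer
# may reorder, combine, or trim. Names are illustrative only.
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

# Hypothetical example: a short refresher course drops the events
# that assume the material is new to learners.
refresher_outline = [event for event in GAGNE_EVENTS if event not in {
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
}]
print(refresher_outline)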
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table below was adapted from Strategies for Developing Instructional Materials for the
Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can be used with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Needs practice opportunities to ensure transfer
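The table above can also be kept as plain data so that considerations can be searched when shortlisting media. The sketch below is a hypothetical convenience showing only the Considerations column; MEDIA_CONSIDERATIONS and flagged_for are illustrative names, not an established tool.

# Sketch: considerations from the table above, keyed by media type,
# with a crude keyword search over the notes. Illustrative only.
MEDIA_CONSIDERATIONS = {
    "Simulations": ["Can be expensive", "Feedback important to success"],
    "Training Games": ["Difficult with large groups",
                       "Can require extensive guidance to be effective"],
    "Role Playing": ["Difficult with large groups",
                     "Can require extensive guidance to be effective"],
    "Interactive Games": ["Best with individuals or small groups",
                          "May require support materials to ensure learning"],
    "Video": ["Technology requirements", "Difficult to adapt",
              "Needs discussion & practice opportunities"],
    "Job Aids": ["Good as a support tool",
                 "Needs practice opportunities to ensure transfer"],
}

def flagged_for(keyword: str) -> list[str]:
    """Return media types whose considerations mention the keyword."""
    keyword = keyword.lower()
    return [media for media, notes in MEDIA_CONSIDERATIONS.items()
            if any(keyword in note.lower() for note in notes)]

print(flagged_for("large groups"))  # ['Training Games', 'Role Playing']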
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve the instruction and the materials created for the course. SIL International (1999)
identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS factors (Attention, Relevance, Confidence, Satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before the official roll-out of
the training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how it has affected performance in the
workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
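Since the tip under Develop Assessment Instruments recommends starting at Kirkpatrick's Level 4 and working backward, the four levels can be walked in reverse when drafting an evaluation plan. The sketch below is illustrative only; KIRKPATRICK_LEVELS restates the list above and is not an official structure.

# Sketch: Kirkpatrick's four levels as a lookup, walked from Level 4
# down, matching the work-backward tip earlier in this tool.
KIRKPATRICK_LEVELS = {
    1: ("Learner Reaction", "Were the learners satisfied with the training?"),
    2: ("Performance Evaluation", "Did they gain the intended KSABs?"),
    3: ("Behavior Evaluation",
        "Are they applying their newly acquired KSABs to their job?"),
    4: ("Effect on the Organization",
        "To what degree did the training achieve the desired impact?"),
}

for level in sorted(KIRKPATRICK_LEVELS, reverse=True):
    name, question = KIRKPATRICK_LEVELS[level]
    print(f"Level {level} ({name}): {question}")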
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959–2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 41
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 42
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
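Teams that track these stages across revisions sometimes keep them as a simple checklist. Below is a minimal sketch in Python, assuming only the stage names above; the function and its use are hypothetical, not part of Dick and Carey's model.

# Illustrative sketch: tracking which formative-evaluation stages a
# course revision has cleared. Stage names mirror the list above.

STAGES = [
    "Design Review",
    "Expert Review",
    "One-to-One",
    "Small Group",
    "Field Trials",
    "Ongoing Evaluation",
]

def next_stage(completed: set[str]) -> str | None:
    """Return the earliest stage not yet completed, or None when all are done."""
    for stage in STAGES:
        if stage not in completed:
            return stage
    return None

# Example: after design and expert reviews, one-to-one evaluation is next.
print(next_stage({"Design Review", "Expert Review"}))  # One-to-One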
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved performance and how it has affected the workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
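To make the levels concrete, an evaluation plan can pair each level with an instrument. The short Python sketch below is illustrative only; the instruments named are hypothetical examples, not prescribed by Kirkpatrick.

# Illustrative sketch: pairing Kirkpatrick's four levels with example
# instruments. The instrument choices are hypothetical, not prescribed.

EVALUATION_PLAN = {
    1: ("Learner Reaction", "post-course satisfaction survey"),
    2: ("Performance Evaluation", "post-test scored against the objectives"),
    3: ("Behavior Evaluation", "on-the-job observation after 90 days"),
    4: ("Effect on the Organization", "business metric tied to the goal"),
}

for level, (name, instrument) in EVALUATION_PLAN.items():
    print(f"Level {level}: {name} -> {instrument}")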
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959–2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 2
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing
organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of process.
Step 5: Is the task procedural?
• If yes, identify the steps and go to Step 7.
• If no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
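The decision at Step 5 is the one part of this sequence that branches, so it is worth seeing
explicitly. Here is a hedged Python sketch of Steps 5 through 7; the dictionary keys and the
sample task are invented for illustration.

def analyze_task(task: dict) -> dict:
    # Step 5: is the task procedural?
    if task["procedural"]:
        details = task["steps"]             # yes: identify the steps, go to Step 7
    else:
        details = task["guidelines"]        # no: Step 6, guidelines of the principle-based task
    knowledge = task["required_knowledge"]  # Step 7: knowledge needed to complete the task
    return {"task": task["name"], "details": details, "knowledge": knowledge}

analyze_task({"name": "greet a customer", "procedural": False,
              "guidelines": ["be courteous", "use the customer's name"],
              "steps": [], "required_knowledge": ["store greeting policy"]})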
Conduct Instructional Analysis
Helpful Tips
• Use the steps and skills identified in the task analysis as a guide for structuring the
learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files to determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or
reflective, sequential or global
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learner’s environment will be like on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills?” (Nicholson, 2004)
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more realistic?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant the information to
be learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a three-part format for writing Behavioral Objectives,
consisting of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
Objective #1
• Performance (What will they do?): Learners should be able to create effective
behavioral objectives
• Condition (What conditions?): Given the “Write Performance Objectives” portion of the IIDT
• Criterion (How well?): That define how well learners should perform on the job
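Because every Mager objective has the same three parts, it can be treated as a simple
record. Below is a minimal Python sketch, with type and field names that are ours rather than
Mager’s; printing the record reads the parts back as one objective statement.

from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    performance: str  # What will they do?
    condition: str    # Under what conditions?
    criterion: str    # How well?

    def __str__(self) -> str:
        # Condition, performance, criterion read as a single objective sentence.
        return f"{self.condition}, {self.performance}, {self.criterion.lower()}."

objective = BehavioralObjective(
    performance="learners should be able to create effective behavioral objectives",
    condition="Given the “Write Performance Objectives” portion of the IIDT",
    criterion="That define how well learners should perform on the job",
)
print(objective)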
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Objective #1
• Bloom’s Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis
(X) Synthesis ( ) Evaluation
• Performance (What will they do?): Learners should be able to create effective
behavioral objectives
• Condition (What conditions?): Given the “Write Performance Objectives” portion of the IIDT
• Criterion (How well?): That define how well learners should perform on the job
Under the Performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs
(knowledge, skills, abilities, and behaviors).
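One way to operationalize this congruence check is a verb-to-level lookup built from the
lists on the previous page. The Python sketch below abbreviates each verb set; note that
some verbs (such as “write”) appear at more than one level, so the lookup returns every
match and context must decide.

BLOOM_VERBS = {  # abbreviated from the six verb lists above
    "Knowledge": {"define", "list", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application": {"apply", "demonstrate", "solve", "use", "write"},
    "Analysis": {"analyze", "compare", "differentiate", "examine"},
    "Synthesis": {"compose", "create", "design", "develop", "write"},
    "Evaluation": {"assess", "defend", "judge", "evaluate"},
}

def levels_for_verb(verb: str) -> list:
    # Return every cognitive-domain level whose verb list contains the verb.
    return [level for level, verbs in BLOOM_VERBS.items() if verb.lower() in verbs]

print(levels_for_verb("create"))  # ['Synthesis'] - matches the example objective
print(levels_for_verb("write"))   # ['Application', 'Synthesis'] - context decides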
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of prerequisite
skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they
are learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine if learners have achieved the ability to
carry out the performance objectives
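The four instruments differ mainly in when they are given and what they measure, which
makes them easy to hold in a small lookup a design team can extend. A Python sketch; the
entry wording paraphrases the list above.

ASSESSMENT_INSTRUMENTS = [
    # (instrument, when given, what it measures)
    ("Entry Behaviors Test", "before instruction", "mastery of prerequisite skills"),
    ("Pre-test", "before instruction", "prior mastery of course skills"),
    ("Practice Tests", "during instruction", "rehearsal of new skills, with corrective feedback"),
    ("Post-tests", "after instruction", "achievement of the performance objectives"),
]

for name, when, measures in ASSESSMENT_INSTRUMENTS:
    print(f"{name}: given {when} to assess {measures}")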
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, Briggs, &
Wager, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize on the job. Find a way to ensure what learners gained from the
course will transfer to their jobs.
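Since the nine events function as a lesson-plan skeleton, they can be stored as an ordered
template a designer fills in per lesson. A Python sketch; the structure and function name are
ours, and, per the tips that follow, events may be reordered or omitted.

GAGNE_EVENTS = [
    "Gain attention", "Inform learners of objectives", "Stimulate recall of prior learning",
    "Present the content", "Provide learning guidance", "Elicit performance (practice)",
    "Provide feedback", "Assess performance", "Enhance retention and transfer to the job",
]

def lesson_plan(activities: dict) -> list:
    # Pair each event with a planned activity, preserving order; unplanned events drop out.
    return [(event, activities[event]) for event in GAGNE_EVENTS if event in activities]

plan = lesson_plan({"Gain attention": "open with a real incident report",
                    "Present the content": "walk through the procedure",
                    "Elicit performance (practice)": "learners repeat the procedure"})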
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See the table on the following page for media types along with their benefits and
considerations, adapted from “Strategies for Developing Instructional Materials for the
Interpersonal Domain” (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Need discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can use with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International
identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (attention, relevance,
confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved learning and how it has affected workplace
performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
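Planning data collection per level can be sketched as a mapping from level to guiding
question and instrument. In this Python sketch the questions come from the list above, while
the instrument column is our own illustration, not Kirkpatrick’s; the loop starts from Level 4,
echoing the tip under Develop Assessment Instruments.

KIRKPATRICK_PLAN = {
    1: ("Were the learners satisfied with the training?", "post-course survey"),
    2: ("Did they gain the intended KSABs?", "post-test scores"),
    3: ("Are they applying their newly acquired KSABs to their job?", "on-the-job observation"),
    4: ("To what degree did the training achieve the desired impact?", "organizational metrics"),
}

for level in (4, 3, 2, 1):  # plan Results first, then work backward
    question, instrument = KIRKPATRICK_PLAN[level]
    print(f"Level {level}: {question} (measure: {instrument})")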
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 4
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of
three parts:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
Objective 1:
• Performance (What will they do?): Learners should be able to create effective behavioral
objectives
• Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
• Criterion (How Well?): That define how well learners should perform on the job
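Treating the three parts as separate fields keeps each one distinct before the full statement is
assembled. The sketch below is a hypothetical Python illustration (the class name and the
statement() helper are ours, not part of Mager's method):

from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    performance: str  # What will they do?
    condition: str    # Under what conditions?
    criterion: str    # How well?

    def statement(self) -> str:
        # Assemble in Condition -> Performance -> Criterion order.
        return f"{self.condition}, {self.performance} {self.criterion}."

objective = BehavioralObjective(
    performance="learners should be able to create effective behavioral objectives",
    condition='given the "Write Performance Objectives" portion of the IIDT',
    criterion="that define how well learners should perform on the job",
)
print(objective.statement())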
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Objective 1:
• Bloom’s Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis
(X) Synthesis ( ) Evaluation
• Performance (What will they do?): Learners should be able to create effective behavioral
objectives
• Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
• Criterion (How Well?): That define how well learners should perform on the job
In the Performance column, notice that the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs.
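Since the Performance verb is what ties an objective to a Bloom level, a simple lookup can flag
mismatches before activities are built. This is a hypothetical Python sketch using abbreviated
versions of the verb lists above (extend them as needed; note that the full lists share some
verbs, such as "write", across levels):

# Abbreviated verb lists from Bloom et al. (1956).
BLOOM_VERBS = {
    "Knowledge": {"define", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application": {"apply", "demonstrate", "illustrate", "solve", "use"},
    "Analysis": {"analyze", "compare", "contrast", "differentiate"},
    "Synthesis": {"assemble", "compose", "create", "design", "develop"},
    "Evaluation": {"appraise", "assess", "defend", "judge", "evaluate"},
}

def bloom_level(verb):
    """Return the cognitive level whose verb list contains the verb, or None."""
    for level, verbs in BLOOM_VERBS.items():
        if verb.lower() in verbs:
            return level
    return None

print(bloom_level("create"))  # "Synthesis", matching the example above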
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, given to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, given to determine if learners have achieved the ability to
carry out the performance objectives
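Because each instrument is tied to a point in the instruction, a small mapping makes the
assessment schedule explicit. A minimal, hypothetical Python sketch (our naming, condensed
from the list above):

# Instrument -> (when it is given, what it checks)
ASSESSMENT_PLAN = {
    "Entry Behaviors Test": ("before instruction", "mastery of prerequisite skills"),
    "Pre-test": ("before instruction", "skills already mastered"),
    "Practice Tests": ("during instruction", "rehearsal with corrective feedback"),
    "Post-test": ("after instruction", "achievement of the performance objectives"),
}

for name, (timing, purpose) in ASSESSMENT_PLAN.items():
    print(f"{name}: given {timing} to check {purpose}")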
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide learning guidance – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure what learners gained from the
course will transfer to their jobs.
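Designers who storyboard lessons in a structured file can start from the nine events as a
checklist. The sketch below is a hypothetical Python example (the structure is ours, not
Gagné's); because the plan is a plain list, events can be reordered, combined, or dropped, in
line with the tip that follows:

GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

def lesson_skeleton(events=GAGNE_EVENTS):
    """Return an editable plan with one empty activity slot per event."""
    return [{"event": e, "activity": ""} for e in events]

plan = lesson_skeleton()
plan[0]["activity"] = "Open with a short incident report from the field"
for step in plan:
    print(f"{step['event']}: {step['activity'] or '(to be designed)'}")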
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table below was adapted from Strategies for Developing Instructional Materials for the
Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Need discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can use with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve the instruction, as well as the materials created for the course. SIL International
(1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluation
Dick and Carey (1996) list six stages of Formative Evaluation:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS factors (attention, relevance,
confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented, and it provides information about how
much the instruction has improved learning and how it has affected workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
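An enumeration can keep the four levels and their guiding questions together when tagging
evaluation items. A minimal, hypothetical Python sketch (wording condensed from the levels
above):

from enum import Enum

class KirkpatrickLevel(Enum):
    REACTION = (1, "Were the learners satisfied with the training?")
    LEARNING = (2, "Did they gain the intended KSABs?")
    BEHAVIOR = (3, "Are they applying their new KSABs on the job?")
    RESULTS = (4, "Did the training achieve the desired organizational impact?")

    def __init__(self, number, question):
        self.number = number
        self.question = question

for level in KirkpatrickLevel:
    print(f"Level {level.number}: {level.question}")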
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 5
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 6
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure that what learners gained from the
course will transfer to their jobs.
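Because the nine events can be reordered, combined, or omitted (see the tips that follow), a lesson outline based on them can be treated as an ordered, editable list. The sketch below is a hypothetical illustration of that idea, not part of Gagné's model; the helper function and placeholder activities are assumptions.

```python
# Hypothetical sketch: treating Gagné's 9 Events as an editable lesson outline.
# Event names follow the list above; the example activities are placeholders.

GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer",
]

def build_outline(events, omit=(), activities=None):
    """Return (event, planned activity) pairs, skipping any omitted events."""
    activities = activities or {}
    return [(e, activities.get(e, "TBD")) for e in events if e not in omit]

# Example: a short refresher course that skips the formal post-test.
outline = build_outline(
    GAGNE_EVENTS,
    omit={"Assess performance"},
    activities={"Gain attention": "Open with a real incident report"},
)
for step, (event, activity) in enumerate(outline, start=1):
    print(f"{step}. {event}: {activity}")
```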
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from "Strategies for Developing Instructional
Materials for the Interpersonal Domain" (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Need discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can use with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
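One way to read the table above is as a set of selection constraints. The sketch below encodes a deliberately simplified version (group-size and budget suitability only) as a lookup; the boolean flags are a loose interpretation of the table's considerations, not values given by the source.

```python
# Simplified sketch: filtering the media types above by group size and budget.
# The suitability flags are a loose reading of the table, not source data.

MEDIA = {
    "Simulations":       {"large_group_ok": False, "low_cost": False},
    "Training Games":    {"large_group_ok": False, "low_cost": True},
    "Role Playing":      {"large_group_ok": False, "low_cost": True},
    "Interactive Games": {"large_group_ok": False, "low_cost": True},
    "Video":             {"large_group_ok": True,  "low_cost": False},
    "Job Aids":          {"large_group_ok": True,  "low_cost": True},
}

def candidates(large_group: bool, tight_budget: bool):
    """Media types whose constraints fit the training situation."""
    return [
        name for name, traits in MEDIA.items()
        if (not large_group or traits["large_group_ok"])
        and (not tight_budget or traits["low_cost"])
    ]

print(candidates(large_group=True, tight_budget=True))  # ['Job Aids']
```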
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve the instruction as well as the materials created for the course. SIL International
(1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? ("How to Do
Formative Training Evaluation," 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
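Since each stage's findings feed revisions before the next stage runs, it helps to log findings against the stage that produced them. The sketch below is a hypothetical bookkeeping aid, not part of Dick and Carey's model; the example findings are invented.

```python
# Hypothetical sketch: logging findings per stage of the formative-evaluation
# sequence so that later revisions can be traced back to their source.

STAGES = [
    "Design Review", "Expert Review", "One-to-One",
    "Small Group", "Field Trials", "Ongoing Evaluation",
]

findings = {stage: [] for stage in STAGES}

def record(stage, note):
    if stage not in findings:
        raise ValueError(f"Unknown stage: {stage}")
    findings[stage].append(note)

record("Expert Review", "Terminology in module 2 is out of date")
record("Small Group", "Learners needed more practice before the post-test")

for stage in STAGES:  # report findings in evaluation order
    for note in findings[stage]:
        print(f"[{stage}] {note}")
```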
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learning and how it has affected workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
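A summative-evaluation plan can pair each of the four levels with a concrete data source. The mapping below is an invented example of such a plan; the data sources are illustrative, not Kirkpatrick's own instrumentation.

```python
# Invented example: pairing each Kirkpatrick level with one possible data
# source for a summative-evaluation plan.

KIRKPATRICK_PLAN = {
    1: ("Learner Reaction", "End-of-course satisfaction survey"),
    2: ("Performance Evaluation", "Post-test scores against the objectives"),
    3: ("Behavior Evaluation", "On-the-job observation 90 days after training"),
    4: ("Effect on the Organization", "Quarterly error-rate and output metrics"),
}

for level in sorted(KIRKPATRICK_PLAN):  # report levels 1 through 4 in order
    name, source = KIRKPATRICK_PLAN[level]
    print(f"Level {level} ({name}): {source}")
```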
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959–2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 8
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learning and how it has affected workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired impact on the organization?
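For teams that track evaluation plans in software, the four levels map naturally onto an enumeration. The following is a rough Python sketch, not Kirkpatrick's own tooling; the class and attribute names are assumptions, and the question strings paraphrase the list above:

from enum import Enum

class KirkpatrickLevel(Enum):
    # Guiding question for each of the four levels described above.
    REACTION = (1, "Were the learners satisfied with the training?")
    LEARNING = (2, "Did they gain the intended KSABs?")
    BEHAVIOR = (3, "Are they applying their newly acquired KSABs to their job?")
    RESULTS = (4, "To what degree did the training achieve the desired "
                  "impact on the organization?")

    def __init__(self, number: int, question: str):
        self.number = number
        self.question = question

for level in KirkpatrickLevel:
    print(f"Level {level.number}: {level.question}")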
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Goal Analysis
Try Using:
According to Dick and Carey (1996), a Training Needs Assessment should answer:
1. Who are your learners? What are they like? What characteristics might affect your design of the learning environment? Find out information about learners:
• Cognitive abilities
• Previous experiences
• Motivational interests
• Personal learning styles
2. What is the instructional need? According to the data collected in Step 1, what are learners currently not able to do that you need them to do?
3. What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need (as cited in “ID Final Project: Lesson 3,” 2003).
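The three answers above amount to a small record a designer could fill in per project. Here is a hedged Python sketch of one such record; the class, field names, and sample values are invented for illustration:

from dataclasses import dataclass, field

# Hypothetical record for capturing the three needs-assessment answers
# above; names and sample data are illustrative, not from Dick and Carey.

@dataclass
class NeedsAssessment:
    learner_characteristics: dict = field(default_factory=dict)
    instructional_need: str = ""     # what learners currently cannot do
    why_instruction_helps: str = ""  # evidence instruction will address it

    def summary(self) -> str:
        """Organized summary describing the learning need (step 3)."""
        traits = "; ".join(f"{k}: {v}" for k, v in
                           self.learner_characteristics.items())
        return (f"Learners ({traits}) currently cannot: "
                f"{self.instructional_need}. Instruction is appropriate "
                f"because: {self.why_instruction_helps}")

na = NeedsAssessment(
    learner_characteristics={"cognitive abilities": "mixed",
                             "previous experiences": "little CAD use"},
    instructional_need="produce dimensioned part drawings",
    why_instruction_helps="skill gap, not a motivation or tooling problem",
)
print(na.summary())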
Goal Analysis
Try Using:
Robert Mager (1997) describes a five-step Goal Analysis Model:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
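Mager's procedure is iterative: performances are jotted down, pruned, and any remaining fuzzies are decomposed again until every statement passes the Step Five test. The sketch below models that loop in Python under the assumption that a human supplies both the jotting and the yes/no judgment; the two helper functions are placeholders for that judgment, not part of Mager's model:

# Sketch of Mager's loop: decompose a goal into observable performances,
# prune duplicates, and repeat until the Step Five test passes. The two
# helpers stand in for human judgment and are purely illustrative.

def jot_performances(goal_or_fuzzy: str) -> list[str]:
    # Step Two: a person lists performances that would show achievement.
    return {"communicate clearly": ["writes a one-page status summary",
                                    "answers questions without jargon"]
            }.get(goal_or_fuzzy, [goal_or_fuzzy])

def is_observable(statement: str) -> bool:
    # Step Five test: would demonstrating this convince you the goal is met?
    return statement not in ("communicate clearly",)  # still a fuzzy

def goal_analysis(goal: str) -> list[str]:
    pending, performances = [goal], []
    while pending:
        item = pending.pop()
        for p in dict.fromkeys(jot_performances(item)):  # Step Three: dedupe
            (performances if is_observable(p) else pending).append(p)
    return performances

print(goal_analysis("communicate clearly"))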
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is a Task
Analysis: a list of the steps and skills required for each procedure in the course being
designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of process.
Step 5: Is the task procedural?
• If yes, identify the steps and go to Step 7.
• If no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
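The only decision point in the list is Step 5, where procedural tasks branch to step identification and principle-based tasks branch to guideline identification. Here is a minimal Python sketch of steps 5 through 7; the Task type and its fields are invented for illustration:

from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    procedural: bool                                 # Step 5 decision
    steps: list = field(default_factory=list)        # if procedural
    guidelines: list = field(default_factory=list)   # if principle-based

def analyze_task(task: Task) -> dict:
    """Apply steps 5 through 7 of the job task analysis above."""
    result = {"task": task.name}
    if task.procedural:
        result["steps"] = task.steps            # Step 5 (yes), then Step 7
    else:
        result["guidelines"] = task.guidelines  # Step 6, then Step 7
    result["knowledge"] = "identify knowledge needed to complete the task"
    return result

print(analyze_task(Task("file an expense report", procedural=True,
                        steps=["collect receipts", "enter totals", "submit"])))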
Conduct Instructional Analysis
Helpful Tips
• Use the steps and skills identified in the analysis as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files to determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or
reflective, sequential or global
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learner's environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills?” (Nicholson, 2004)
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more realistic?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant the information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of
three components:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#: 1
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How well?): That define how well learners should perform on the job
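Because every Mager-style objective has the same three parts, it can be treated as a simple record that renders into a single objective statement. The sketch below assumes that framing; the class name and rendering format are illustrative, not Mager's notation:

from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    performance: str   # What will they do?
    condition: str     # Under what conditions?
    criterion: str     # How well?

    def render(self) -> str:
        # Condition first, then performance, then criterion.
        return f"{self.condition}, {self.performance} {self.criterion}."

obj = BehavioralObjective(
    performance="learners should be able to create effective "
                "behavioral objectives",
    condition="given the “Write Performance Objectives” portion of the IIDT",
    criterion="that define how well learners should perform on the job",
)
print(obj.render())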
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize, relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate, recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret, operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop, formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate, judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#: 1
Bloom’s Taxonomy Level: ( ) Knowledge  ( ) Comprehension  ( ) Application  ( ) Analysis  (X) Synthesis  ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How well?): That define how well learners should perform on the job
In the Performance column, notice that the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs.
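The congruence check described above lends itself to a lookup: map each level's verb list to its name and classify an objective by its performance verb. Below is a minimal sketch with abridged verb lists; note that some verbs (e.g., "arrange", "write") legitimately appear at more than one level, so the helper returns every match rather than guessing:

# Verb lists abridged from the six levels above; a real tool would
# include all of them. Some verbs map to several levels by design.

BLOOM_VERBS = {
    "Knowledge": {"arrange", "define", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application": {"apply", "demonstrate", "illustrate", "solve", "use"},
    "Analysis": {"analyze", "compare", "contrast", "differentiate"},
    "Synthesis": {"arrange", "compose", "construct", "create", "design"},
    "Evaluation": {"appraise", "assess", "defend", "judge", "score"},
}

def bloom_levels(verb: str) -> list[str]:
    """Return every cognitive-domain level whose verb list contains verb."""
    return [lvl for lvl, verbs in BLOOM_VERBS.items() if verb in verbs]

print(bloom_levels("create"))   # ['Synthesis']
print(bloom_levels("arrange"))  # ['Knowledge', 'Synthesis']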
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of prerequisite skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered some of the course skills determined during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they are learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine if learners have achieved the ability to carry out the performance objectives
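Since each instrument is tied to a point in the instruction timeline, the four types can be organized as a simple schedule. A small illustrative sketch (the structure is an assumption, and the phase labels paraphrase the list above):

# The four instrument types keyed by when they are administered.
# Descriptions paraphrase the list above; the layout is illustrative.

ASSESSMENT_SCHEDULE = {
    "prior to instruction": ["Entry Behaviors Test", "Pre-test"],
    "during instruction": ["Practice Tests"],
    "following instruction": ["Post-tests"],
}

for phase, instruments in ASSESSMENT_SCHEDULE.items():
    print(f"{phase}: {', '.join(instruments)}")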
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, Briggs, & Wager, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize on the job. Find a way to ensure what learners gained from the
course will transfer to their jobs.
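One way to operationalize the nine events is as an ordered checklist that a lesson plan maps to concrete activities, dropping or reordering events where appropriate (see the tips that follow). The sketch below assumes that approach; the sample activities are invented:

# Sketch: the nine events as an ordered checklist that a lesson plan
# maps to concrete activities. Activities here are invented examples;
# events may be reordered, combined, or omitted (see tips below).

GAGNE_EVENTS = [
    "Gain attention", "Inform learners of objectives",
    "Stimulate recall of prior learning", "Present the content",
    "Provide learning guidance", "Elicit performance (practice)",
    "Provide feedback", "Assess performance",
    "Enhance retention and transfer to the job",
]

lesson_plan = {
    "Gain attention": "open with a mis-dimensioned part drawing",
    "Present the content": "demo the dimensioning workflow",
    "Elicit performance (practice)": "learners dimension two drawings",
    "Provide feedback": "instructor marks up each drawing",
}

for event in GAGNE_EVENTS:
    activity = lesson_plan.get(event, "-- not used in this lesson --")
    print(f"{event}: {activity}")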
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals (Handbook I: Cognitive domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.). Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50 years, 1959–2009. Retrieved from http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them (3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010). Retrieved from http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 12
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 13
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing
organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of process.
Step 5: Is the task procedural?
   • If yes, identify the steps and go to Step 7.
   • If no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
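The branching in Steps 5 through 7 can be illustrated with a short sketch. The following Python fragment is purely illustrative: the Task structure and its field names are our own assumptions, not part of the cited “Unit 2: Job Task Analysis” material.

from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    procedural: bool                                # Step 5: is the task procedural?
    steps: list = field(default_factory=list)       # captured when procedural
    guidelines: list = field(default_factory=list)  # captured when principle-based (Step 6)
    knowledge: list = field(default_factory=list)   # Step 7: knowledge needed

def analyze_task(task):
    # Step 5: procedural tasks get their steps documented, then jump to Step 7;
    # otherwise Step 6 documents the guidelines of the principle-based task.
    if task.procedural:
        print(task.name, "- steps:", task.steps)
    else:
        print(task.name, "- guidelines:", task.guidelines)
    # Step 7 applies to both branches.
    print(task.name, "- knowledge needed:", task.knowledge)

analyze_task(Task("Process a purchase order", procedural=True,
                  steps=["Verify request", "Enter order", "Route for approval"],
                  knowledge=["Approval thresholds"]))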
Conduct Instructional Analysis
Helpful Tips
• Use the steps and skills identified in the task analysis as a guide for structuring the
learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files to determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or
reflective, sequential or global
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learner’s environment will be like on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills?” (Nicholson, 2004)
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more realistic?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
Example 1:
• Performance (What will they do?): Learners should be able to create effective
behavioral objectives
• Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
• Criterion (How Well?): That define how well learners should perform on the job
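As a worked illustration of the three-part structure, here is a minimal Python sketch. It is our own illustrative representation, not Mager’s notation: the class and method names are assumptions.

from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    performance: str   # What will they do?
    condition: str     # Under what conditions?
    criterion: str     # How well?

    def statement(self):
        # Conventional ordering: condition, then performance, then criterion.
        return f"{self.condition}, {self.performance} {self.criterion}."

obj = BehavioralObjective(
    performance="learners should be able to create effective behavioral objectives",
    condition='Given the "Write Performance Objectives" portion of the IIDT',
    criterion="that define how well learners should perform on the job")
print(obj.statement())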
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attack, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Example 1:
• Bloom’s Taxonomy Level:
   ( ) Knowledge
   ( ) Comprehension
   ( ) Application
   ( ) Analysis
   (X) Synthesis
   ( ) Evaluation
• Performance (What will they do?): Learners should be able to create effective
behavioral objectives
• Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
• Criterion (How Well?): That define how well learners should perform on the job
Under the Performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct knowledge,
skills, abilities, and behaviors (KSABs).
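This congruence check can be roughed out mechanically. The sketch below is an assumption-laden illustration: the verb table is abbreviated from the six-level list on the previous page, and matching any word of the performance statement against it is a simplification, not a published procedure.

# Abbreviated verb table drawn from the Bloom et al. (1956) list above.
BLOOM_VERBS = {
    "Knowledge": {"define", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application": {"apply", "demonstrate", "illustrate", "solve", "use"},
    "Analysis": {"analyze", "compare", "contrast", "differentiate"},
    "Synthesis": {"assemble", "compose", "construct", "create", "design"},
    "Evaluation": {"appraise", "assess", "defend", "judge", "evaluate"},
}

def congruent(performance, intended_level):
    # Flag the objective when none of its words is a verb at the intended level.
    words = {w.strip(".,").lower() for w in performance.split()}
    return bool(words & BLOOM_VERBS[intended_level])

print(congruent("create effective behavioral objectives", "Synthesis"))  # True
print(congruent("list the parts of an objective", "Synthesis"))          # False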
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of prerequisite
skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they are
learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine whether learners have achieved the
ability to carry out the performance objectives
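One way to keep the four instruments and their timing straight is a simple lookup, sketched below. The structure is our own illustration; the wording is condensed from the list above.

# When each assessment instrument is given, per "ID Final Project: Lesson 7" (2003).
ASSESSMENTS = [
    ("Entry Behaviors Test", "before instruction", "mastery of prerequisite skills"),
    ("Pre-test", "before instruction", "prior mastery of course skills"),
    ("Practice Tests", "during instruction", "rehearsal with corrective feedback"),
    ("Post-tests", "after instruction", "achievement of performance objectives"),
]

for name, timing, purpose in ASSESSMENTS:
    print(f"{name}: given {timing} to check {purpose}")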
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure that what learners gained from the
course will transfer to their jobs.
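Because the nine events amount to an ordered checklist, a lesson-plan scaffold can be generated mechanically. The sketch below is a hypothetical planning aid of our own, not an artifact of Gagné’s work.

# Gagné's nine events as an ordered planning checklist.
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

def lesson_scaffold(topic):
    # Emit one planning prompt per event; designers fill in the activity.
    return [f"{i}. {event} for '{topic}': ____" for i, event in enumerate(GAGNE_EVENTS, 1)]

for line in lesson_scaffold("Writing behavioral objectives"):
    print(line)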
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from “Strategies for Developing Instructional
Materials for the Interpersonal Domain” (2010).
Develop and Select Instructional Materials
Simulations
• Benefits: Permits independence in learning process; contextualizes content; can provide
multiple perspectives; develops critical thinking skills
• Considerations: Can be expensive; feedback important to success
Training Games
• Benefits: Highly motivational; encourages teamwork; uses problem-solving skills; develops
communication skills
• Considerations: Difficult with large groups; can require extensive guidance to be effective
Role Playing
• Benefits: Introduces real-world situations; promotes understanding of other positions;
emphasizes working together; provides opportunities to give & receive feedback
• Considerations: Difficult with large groups; can require extensive guidance to be effective
Interactive Games
• Benefits: Highly motivational; engages the learner; develops strategic thinking skills
• Considerations: Best with individuals or small groups; may require support materials to
ensure learning
Video
• Benefits: Great for large groups; provides for safe observation; can include real-life
situations; can develop critical thinking
• Considerations: Technology requirements; difficult to adapt; needs discussion & practice
opportunities
Job Aids
• Benefits: Provides for rapid instruction; inexpensive; can use with any size group; provides
opportunities for self-assessment
• Considerations: Good as a support tool; needs practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International (1999)
identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS (attention, relevance, confidence,
satisfaction) factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved learner performance and how it has affected
performance in the workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
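A summative-evaluation plan can record which of the four levels it covers. The sketch below is an illustrative structure only; the function and field names are our own assumptions.

# Kirkpatrick's four levels with their guiding questions, from the list above.
KIRKPATRICK_LEVELS = {
    1: ("Learner Reaction", "Were the learners satisfied with the training?"),
    2: ("Performance Evaluation", "Did they gain the intended KSABs?"),
    3: ("Behavior Evaluation", "Are they applying the new KSABs on the job?"),
    4: ("Effect on the Organization", "Did the training achieve the desired impact?"),
}

def plan_summary(levels_covered):
    # Warn when a plan stops at Reaction/Learning, per the Helpful Tips below.
    for level in sorted(levels_covered):
        name, question = KIRKPATRICK_LEVELS[level]
        print(f"Level {level} ({name}): {question}")
    if not {3, 4} & set(levels_covered):
        print("Warning: no Behavior or Results measures planned.")

plan_summary([1, 2])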
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY:
HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959-2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from
http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 14
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 15
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
Example #1
Performance (What will they do?): Learners should be able to create effective behavioral
objectives
Condition (What conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How well?): That define how well learners should perform on the job
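Because Mager’s three parts are the same for every objective, designers who maintain many objectives can treat them as structured records. Below is a minimal, hypothetical Python sketch of that idea; it is not part of the IIDT, and all names in it are ours.

```python
from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    """One objective in Mager's Performance / Condition / Criterion form."""
    performance: str  # What will they do?
    condition: str    # Under what conditions?
    criterion: str    # How well?

    def statement(self) -> str:
        # Assemble the three parts into a single readable objective sentence.
        return f"{self.condition}, {self.performance}, {self.criterion}."

# The example from the table above, expressed as a record:
objective = BehavioralObjective(
    performance="learners should be able to create effective behavioral objectives",
    condition='Given the "Write Performance Objectives" portion of the IIDT',
    criterion="that define how well learners should perform on the job",
)
print(objective.statement())
```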
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Example #1
Bloom’s Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis
(X) Synthesis ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective behavioral
objectives
Condition (What conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How well?): That define how well learners should perform on the job
In the Performance column, notice that the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs
(knowledge, skills, abilities, and behaviors).
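One way to check that congruence is to invert the verb lists above into a lookup table and ask which level(s) an objective’s action verb belongs to. A minimal sketch, assuming only the verbs on this slide matter; the table is abbreviated and the names are hypothetical.

```python
# Bloom's cognitive-domain levels mapped to action verbs from the list above
# (each set is abbreviated here for space).
BLOOM_VERBS = {
    "Knowledge": {"arrange", "define", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify", "restate"},
    "Application": {"apply", "demonstrate", "illustrate", "operate", "solve", "use"},
    "Analysis": {"analyze", "compare", "contrast", "differentiate", "examine"},
    "Synthesis": {"assemble", "compose", "construct", "create", "design", "develop"},
    "Evaluation": {"appraise", "assess", "defend", "judge", "rate", "evaluate"},
}

def levels_for_verb(verb: str) -> list[str]:
    """Return every level whose verb list contains the given action verb."""
    verb = verb.lower()
    # Some verbs (e.g. "write", "arrange") legitimately appear at more than
    # one level, so a list is returned rather than a single level.
    return [level for level, verbs in BLOOM_VERBS.items() if verb in verbs]

# "create" maps to Synthesis, matching the (X) in the example above.
print(levels_for_verb("create"))  # ['Synthesis']
```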
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plans and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, given to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, given to let learners rehearse the new skills they are
learning and to allow for corrective feedback
• Post-tests – Following instruction, given to determine if learners have achieved the ability to
carry out the performance objectives
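The four instruments differ mainly in when they are given and what they measure, so they can be modeled as a simple schedule. A hypothetical sketch; the phase labels and data structure are ours, not the source’s.

```python
# When each assessment instrument is administered, per the list above.
ASSESSMENT_SCHEDULE = [
    ("Entry Behaviors Test", "before instruction", "mastery of prerequisite skills"),
    ("Pre-test", "before instruction", "prior mastery of course skills"),
    ("Practice Tests", "during instruction", "rehearsal with corrective feedback"),
    ("Post-tests", "after instruction", "achievement of performance objectives"),
]

def instruments_for(phase: str) -> list[str]:
    """List the instruments given in a phase (before/during/after instruction)."""
    return [name for name, when, _ in ASSESSMENT_SCHEDULE if when == phase]

print(instruments_for("before instruction"))  # ['Entry Behaviors Test', 'Pre-test']
```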
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure that what learners gained from the
course will transfer to their jobs.
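Because the nine events work as a checklist (and, per the tip below, an adjustable one), a draft lesson plan can be audited against them. A minimal sketch under that assumption; the event strings and the sample lesson are illustrative only.

```python
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

def missing_events(lesson_events: list[str]) -> list[str]:
    """Return the events a draft lesson plan does not yet address."""
    covered = set(lesson_events)
    return [event for event in GAGNE_EVENTS if event not in covered]

# A deliberately thin draft plan, to show what the audit flags:
draft = ["Gain attention", "Present the content", "Assess performance"]
for event in missing_events(draft):
    print("Consider adding:", event)
```

Note that the audit only flags gaps; as the tip below says, omitting or reordering an event can be the right design decision.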
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See the table of media types, benefits, and considerations on the following page, adapted
from Strategies for Developing Instructional Materials for the Interpersonal Domain (2010).
Develop and Select Instructional Materials
Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback is important to success

Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion & practice opportunities

Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can be used with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Needs practice opportunities to ensure transfer
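The table’s benefits and considerations amount to selection constraints, so a first-pass filter can be expressed in code. A hypothetical sketch: the two boolean flags are our simplified reading of the table, not something the source defines.

```python
# One entry per media type from the table above; flags are deliberately coarse.
MEDIA = {
    "Simulations":       {"large_groups": False, "low_budget": False},
    "Training Games":    {"large_groups": False, "low_budget": True},
    "Role Playing":      {"large_groups": False, "low_budget": True},
    "Interactive Games": {"large_groups": False, "low_budget": True},
    "Video":             {"large_groups": True,  "low_budget": False},
    "Job Aids":          {"large_groups": True,  "low_budget": True},
}

def candidate_media(large_group: bool, tight_budget: bool) -> list[str]:
    """Filter media types against two coarse constraints from the table."""
    return [
        name for name, flags in MEDIA.items()
        if (not large_group or flags["large_groups"])
        and (not tight_budget or flags["low_budget"])
    ]

# A large class on a tight budget narrows quickly:
print(candidate_media(large_group=True, tight_budget=True))  # ['Job Aids']
```

A real selection would weigh more factors (guidance needed, technology available), but the shape of the decision is the same filtering step.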
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International (1999)
identifies seven questions a Formative Evaluation should answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS factors (attention, relevance,
confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
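Since the six stages run in order, tracking where a course sits in the sequence is a simple pipeline. A minimal sketch, assuming each stage is marked complete once its findings have been folded back into revision; the function name is ours.

```python
FORMATIVE_STAGES = [
    "Design Review",
    "Expert Review",
    "One-to-One",
    "Small Group",
    "Field Trials",
    "Ongoing Evaluation",
]

def next_stage(completed: list[str]) -> str | None:
    """Return the next formative-evaluation stage, or None when all are done."""
    for stage in FORMATIVE_STAGES:
        if stage not in completed:
            return stage
    return None

print(next_stage(["Design Review", "Expert Review"]))  # 'One-to-One'
```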
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how it has affected performance in the
workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
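An evaluation plan can pair each Kirkpatrick level with a data source that answers its question. The pairings below are a common convention we supply for illustration, not something this tool prescribes.

```python
# Kirkpatrick's four levels, each paired with a typical data source (our pairing).
EVALUATION_PLAN = {
    1: ("Learner Reaction", "post-course satisfaction survey"),
    2: ("Performance Evaluation", "post-test scores compared with pre-test scores"),
    3: ("Behavior Evaluation", "on-the-job observation 60-90 days after training"),
    4: ("Effect on the Organization", "business metrics tied to the training goal"),
}

# Print the plan in level order, matching the slide above.
for level, (name, source) in EVALUATION_PLAN.items():
    print(f"Level {level} ({name}): collect {source}")
```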
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 16
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 17
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
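To see how the nine events can serve as an audit checklist for a lesson plan, consider the illustrative Python sketch below. The event names follow the list above; the audit_lesson helper and the lesson-plan format are hypothetical.

    # Illustrative sketch only: audit_lesson and the plan format are
    # hypothetical; event names follow Gagné's list above.
    GAGNE_EVENTS = [
        "gain attention",
        "inform learners of objectives",
        "stimulate recall of prior learning",
        "present the content",
        "provide learning guidance",
        "elicit performance",
        "provide feedback",
        "assess performance",
        "enhance retention and transfer",
    ]

    def audit_lesson(plan_events):
        """Return the events a lesson plan does not yet address.
        Order is not checked, since the nine events are a flexible guideline."""
        covered = {event.lower() for event in plan_events}
        return [event for event in GAGNE_EVENTS if event not in covered]

    plan = ["gain attention", "present the content", "elicit performance"]
    print("Events not yet addressed:", audit_lesson(plan))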
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training (e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table that follows was adapted from Strategies for Developing Instructional Materials for the Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Need discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can use with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as the materials created for the course. SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluation:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (Attention, Relevance, Confidence, Satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it remains relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of the instruction as a whole. Summative Evaluation begins after Formative Evaluations are complete and the instruction has been implemented. It provides information about how much the instruction has improved learning and how it has affected workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired impact on the organization?
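As an illustration of how results might be recorded at each level, the Python sketch below pairs every level with one piece of evidence. The metrics shown are invented examples, not prescribed measures; record whatever evidence fits your organization.

    # Illustrative sketch only: the metrics are invented examples,
    # not prescribed measures.
    KIRKPATRICK_LEVELS = {
        1: "Learner Reaction",
        2: "Performance Evaluation",
        3: "Behavior Evaluation",
        4: "Effect on the Organization",
    }

    results = {
        1: "4.2 out of 5 on the end-of-course satisfaction survey",
        2: "88% of learners passed the post-test",
        3: "supervisors observed the new procedure in use after 60 days",
        4: "error rate on the target task fell the following quarter",
    }

    for level, name in KIRKPATRICK_LEVELS.items():
        print("Level {} ({}): {}".format(level, name, results[level]))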
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust the instructional intervention accordingly. You may have to revise your goals, objectives, or analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50 years, 1959–2009. Retrieved from http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them (3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 18
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 19
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved performance and how the instruction has affected workplace
performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
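As an illustrative aside (not part of Kirkpatrick's model), the four levels and their guiding
questions can be written down as a simple checklist structure. All names and formatting in the
sketch below are hypothetical.

# Minimal sketch (illustrative only): Kirkpatrick's four levels paired with
# the evaluation questions listed above, rendered as a designer's checklist.

KIRKPATRICK_LEVELS = [
    (1, "Learner Reaction", "Were the learners satisfied with the training?"),
    (2, "Performance Evaluation", "Did they gain the intended KSABs?"),
    (3, "Behavior Evaluation",
     "Are they applying their newly acquired KSABs to their job?"),
    (4, "Effect on the Organization",
     "To what degree did the training achieve the desired impact?"),
]

def summative_checklist() -> str:
    """Render the four levels as a fill-in checklist."""
    return "\n".join(
        f"[ ] Level {number}: {name} - {question}"
        for number, name, question in KIRKPATRICK_LEVELS
    )

print(summative_checklist())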
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959–2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 20
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 21
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 24
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all-inclusive list, but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design & Conduct Summative Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Navigation buttons are available at the top right side of each page. The left arrow will
take you to the previous page viewed. The Home button will take you back to the home
menu. The right arrow will take you to the next page.
2 of 4
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the Helpful Tips icon to get inside information regarding important Do’s and
Don’ts for each phase.
3 of 4
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise Instruction
Conduct Instructional Analysis
Identify Instructional Goals
Write Performance Objectives
Develop Assessment Instruments
Develop Instructional Strategy(ies)
Develop & Select Instructional Materials
Design & Conduct Formative Evaluations
Analyze Learners & Contexts
Design & Conduct Summative Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996), a Training Needs Assessment should answer:
1. Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
• Cognitive abilities
• Previous experiences
• Motivational interests
• Personal learning styles
2. What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
3. What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need (as
cited in “ID Final Project: Lesson 3,” 2003).
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified in the goal analysis. The primary method is a Task Analysis:
a list of the steps and skills required for each procedure in the course being designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing
organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of process.
Step 5: Is the task procedural?
• If yes, identify the steps and go to Step 7.
• If no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
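The branch at Step 5 is the one part of this flow that is easy to misread in prose. As a rough Python sketch only (the function and its placeholder values are ours, not part of the cited manual), the Step 5-7 logic looks like this:

    # Hypothetical sketch of Steps 5-7 of the task-analysis flow above.
    def analyze_task(name: str, is_procedural: bool) -> dict:
        record = {"task": name}
        if is_procedural:
            # Step 5, yes branch: list the steps, then continue to Step 7.
            record["steps"] = ["<step 1>", "<step 2>"]   # filled in by the designer
        else:
            # Step 6: a principle-based task gets guidelines rather than steps.
            record["guidelines"] = ["<guideline 1>"]
        # Step 7: every task needs its supporting knowledge identified.
        record["knowledge"] = ["<required knowledge>"]
        return record

    print(analyze_task("Reset a user password", is_procedural=True))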
Conduct Instructional Analysis
Helpful Tips
• Use the steps and skills identified in the task analysis as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files to determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or
reflective, sequential or global
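If learner data ends up in a script or database, Nicholson's five categories translate directly into a record type. A minimal Python sketch, with field names of our own choosing (Nicholson, 2004, prescribes no data format):

    from dataclasses import dataclass, field

    @dataclass
    class LearnerProfile:
        """One learner, grouped by Nicholson's (2004) five categories."""
        general: dict = field(default_factory=dict)          # age, gender, language, culture
        personal_social: dict = field(default_factory=dict)  # maturity, expectations, aspirations
        academic: dict = field(default_factory=dict)         # education level, test scores, GPA
        entry: dict = field(default_factory=dict)            # prerequisite skills, reading level
        learning_styles: list = field(default_factory=list)  # e.g., ["visual", "sequential"]

    profile = LearnerProfile(general={"age": 34, "language": "English"},
                             learning_styles=["visual", "global"])
    print(profile.learning_styles)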
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: the Learning Context and the Performance Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learner's environment will be like on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills?” (Nicholson, 2004)
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more realistic?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant the information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives that consists of
three parts:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
Example 1:
• Performance (What will they do?): Learners should be able to create effective behavioral
objectives
• Condition (What conditions?): Given the “Write Performance Objectives” portion of the IIDT
• Criterion (How well?): That define how well learners should perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Example 1:
• Bloom’s Taxonomy Level: Synthesis
• Performance (What will they do?): Learners should be able to create effective behavioral
objectives
• Condition (What conditions?): Given the “Write Performance Objectives” portion of the IIDT
• Criterion (How well?): That define how well learners should perform on the job
In the Performance column, notice that the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs.
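Because each Bloom level comes with a verb list, the alignment check described above can be partly automated. The Python helper below is hypothetical, not part of the IIDT; the verb sets are abridged from the lists quoted earlier, and the "know"/"understand" guard reflects the Goal Analysis Helpful Tips:

    # Hypothetical helper: classify a Mager-style performance verb by Bloom level.
    # Verb sets are abridged from the Bloom et al. (1956) lists quoted above.
    BLOOM_VERBS = {
        "Knowledge":     {"arrange", "define", "list", "name", "recall", "state"},
        "Comprehension": {"classify", "describe", "explain", "identify", "restate"},
        "Application":   {"apply", "demonstrate", "illustrate", "solve", "use"},
        "Analysis":      {"analyze", "compare", "contrast", "differentiate", "examine"},
        "Synthesis":     {"assemble", "compose", "construct", "create", "design"},
        "Evaluation":    {"appraise", "assess", "defend", "judge", "evaluate"},
    }
    NOT_OBSERVABLE = {"know", "understand"}  # flagged in the Helpful Tips

    def bloom_level(verb: str) -> str:
        v = verb.lower()
        if v in NOT_OBSERVABLE:
            raise ValueError(f"'{verb}' is not an observable behavior; pick an action verb")
        for level, verbs in BLOOM_VERBS.items():
            if v in verbs:
                return level
        return "unclassified"

    print(bloom_level("create"))  # -> Synthesis, matching the example above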
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of the lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of prerequisite
skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they
are learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine if learners have achieved the ability to
carry out the performance objectives
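Since the four instruments differ mainly in when they are given, a simple lookup keeps the timing straight. A hypothetical Python sketch (the mapping paraphrases the list above):

    # When each instrument from "ID Final Project: Lesson 7" (2003) is administered.
    ASSESSMENTS = {
        "entry behaviors test": "before instruction",
        "pre-test":             "before instruction",
        "practice test":        "during instruction",
        "post-test":            "after instruction",
    }

    def instruments_at(phase: str) -> list[str]:
        """Return the instruments given during the named phase."""
        return [name for name, when in ASSESSMENTS.items() if when == phase]

    print(instruments_at("before instruction"))  # -> ['entry behaviors test', 'pre-test']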
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, Briggs, & Wager, 1992).
Designing instruction is about more than choosing the mode of delivery. Much as a
screenplay sets the stage for a film, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction:
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure what learners gained from the
course will transfer to their jobs.
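Because the nine events form an ordered checklist, they can seed a lesson-plan skeleton. A minimal Python sketch; nothing here is prescribed by Gagné, Briggs, and Wager (1992):

    # Hypothetical lesson-plan skeleton built from the nine events above.
    GAGNE_EVENTS = [
        "Gain attention",
        "Inform learners of objectives",
        "Stimulate recall of prior learning",
        "Present the content",
        "Provide learning guidance",
        "Elicit performance (practice)",
        "Provide feedback",
        "Assess performance",
        "Enhance retention and transfer to the job",
    ]

    def lesson_plan_skeleton(topic: str) -> list[str]:
        # One planning prompt per event; reorder or drop events as the
        # Helpful Tips below suggest -- the sequence is a guideline.
        return [f"{i}. {event}: how will the '{topic}' lesson do this?"
                for i, event in enumerate(GAGNE_EVENTS, start=1)]

    for line in lesson_plan_skeleton("Writing behavioral objectives"):
        print(line)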
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table that follows was adapted from Strategies for Developing Instructional
Materials for the Interpersonal Domain (2010).
Develop and Select Instructional Materials
Simulations
• Benefits: Permits independence in the learning process; contextualizes content; can provide
multiple perspectives; develops critical thinking skills
• Considerations: Can be expensive; feedback is important to success
Training Games
• Benefits: Highly motivational; encourages teamwork; uses problem-solving skills; develops
communication skills
• Considerations: Difficult with large groups; can require extensive guidance to be effective
Role Playing
• Benefits: Introduces real-world situations; promotes understanding of other positions;
emphasizes working together; provides opportunities to give and receive feedback
• Considerations: Difficult with large groups; can require extensive guidance to be effective
Interactive Games
• Benefits: Highly motivational; engages the learner; develops strategic thinking skills
• Considerations: Best with individuals or small groups; may require support materials to
ensure learning
Video
• Benefits: Great for large groups; provides for safe observation; can include real-life
situations; can develop critical thinking
• Considerations: Technology requirements; difficult to adapt; needs discussion and practice
opportunities
Job Aids
• Benefits: Provides for rapid instruction; inexpensive; can be used with any size group;
provides opportunities for self-assessment
• Considerations: Good as a support tool; needs practice opportunities to ensure transfer
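The table also lends itself to a small selection helper: encode each medium's constraints, then filter by the situation at hand. The Python sketch below is hypothetical; the boolean flags are our coarse reading of the Considerations column, not an official taxonomy:

    # Hypothetical media-selection helper distilled from the table above.
    MEDIA = {
        "simulations":       {"large_groups": True,  "low_cost": False},
        "training games":    {"large_groups": False, "low_cost": True},
        "role playing":      {"large_groups": False, "low_cost": True},
        "interactive games": {"large_groups": False, "low_cost": True},
        "video":             {"large_groups": True,  "low_cost": False},
        "job aids":          {"large_groups": True,  "low_cost": True},
    }

    def candidate_media(large_group: bool, tight_budget: bool) -> list[str]:
        """Media types not ruled out by group size or budget."""
        return [m for m, a in MEDIA.items()
                if (not large_group or a["large_groups"])
                and (not tight_budget or a["low_cost"])]

    print(candidate_media(large_group=True, tight_budget=True))  # -> ['job aids']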
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International (1999)
identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to do
formative training evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how it has affected workplace
performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
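One way to honor the "look at all four levels" tip that follows is to template the evaluation plan around the levels themselves. A hypothetical Python sketch:

    # Hypothetical evaluation-plan template over Kirkpatrick's four levels above.
    KIRKPATRICK = {
        1: ("Learner Reaction", "Were the learners satisfied with the training?"),
        2: ("Performance Evaluation", "Did they gain the intended KSABs?"),
        3: ("Behavior Evaluation", "Are they applying the new KSABs on the job?"),
        4: ("Effect on the Organization",
            "To what degree did the training achieve the desired impact?"),
    }

    def evaluation_plan(course: str) -> list[str]:
        # Levels 3 and 4 require follow-up after roll-out, not just end-of-course forms.
        return [f"{course} | Level {n} ({name}): {question}"
                for n, (name, question) in sorted(KIRKPATRICK.items())]

    for row in evaluation_plan("IPT Tool orientation"):
        print(row)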
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 25
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 26
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
Example 1:
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
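Treating the three components as a record makes it easy to keep objectives uniform across a course. A minimal sketch follows; BehavioralObjective and its as_sentence method are hypothetical names, not Mager's:

```python
from dataclasses import dataclass

@dataclass
class BehavioralObjective:
    """Hypothetical container for Mager's (1997) three components."""
    performance: str  # What will they do?
    condition: str    # Under what conditions?
    criterion: str    # How well?

    def as_sentence(self):
        """Render the objective as a single statement."""
        return f"{self.condition}, learners will {self.performance} {self.criterion}."

objective = BehavioralObjective(
    performance="create effective behavioral objectives",
    condition="Given the 'Write Performance Objectives' portion of the IIDT",
    criterion="that define how well learners should perform on the job",
)
print(objective.as_sentence())
```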
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge – arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension – classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application – apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis – analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis – arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation – appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
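Since each level carries its own verb list, a draft objective's verb can be checked against the intended level automatically. A minimal sketch; the table abbreviates the lists above, and bloom_level_of is a hypothetical helper:

```python
# Abbreviated from the Bloom et al. (1956) verb lists above.
BLOOM_VERBS = {
    "Knowledge":     {"define", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application":   {"apply", "demonstrate", "illustrate", "solve", "use"},
    "Analysis":      {"analyze", "compare", "contrast", "differentiate"},
    "Synthesis":     {"assemble", "compose", "create", "design", "develop"},
    "Evaluation":    {"appraise", "assess", "defend", "judge", "rate"},
}

def bloom_level_of(verb):
    """Return the first cognitive level whose verb list contains the verb."""
    for level, verbs in BLOOM_VERBS.items():
        if verb.lower() in verbs:
            return level
    return None

print(bloom_level_of("create"))  # Synthesis
```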
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
Example 1:
Bloom’s Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis (X) Synthesis ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
Under the Performance column, notice that the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners develop the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of prerequisite
skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills identified during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they
are learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine whether learners have achieved the
ability to carry out the performance objectives
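If assessments are tracked programmatically, the four instrument types and their timing reduce to a small table that can drive scheduling. A minimal sketch; the tuple layout and the due_at helper are hypothetical:

```python
# The four instrument types ("ID Final Project: Lesson 7," 2003) and when each is given.
INSTRUMENTS = [
    ("Entry Behaviors Test", "before", "mastery of prerequisite skills"),
    ("Pre-test",             "before", "skills already mastered"),
    ("Practice Tests",       "during", "rehearsal with corrective feedback"),
    ("Post-tests",           "after",  "achievement of the performance objectives"),
]

def due_at(phase):
    """Return the instruments administered in a given phase of instruction."""
    return [name for name, timing, _ in INSTRUMENTS if timing == phase]

print(due_at("before"))  # ['Entry Behaviors Test', 'Pre-test']
```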
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post-test or final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find ways to ensure that what learners gained from the
course transfers to their jobs.
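One practical use of the nine events is as a coverage check against a draft lesson plan. A minimal sketch; GAGNE_EVENTS and missing_events are hypothetical names, and the events are copied from the list above:

```python
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer",
]

def missing_events(planned):
    """Return the events a draft lesson plan does not yet address.

    The events are a flexible guideline, so a non-empty result is a prompt
    to review, not necessarily a defect.
    """
    covered = set(planned)
    return [event for event in GAGNE_EVENTS if event not in covered]

draft = ["Present the content", "Elicit performance (practice)", "Assess performance"]
print(missing_events(draft))
```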
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table below was adapted from Strategies for Developing Instructional Materials for the
Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback is important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give and receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion and practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can be used with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Needs practice opportunities to ensure transfer
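Read as data, the table also supports a quick first-pass filter when shortlisting media. A minimal sketch; the boolean flags are my own reading of the Benefits and Considerations columns, not part of the source table:

```python
# Flags distilled (by my reading) from the media table above.
MEDIA = {
    "Simulations":       {"large_groups": True,  "low_cost": False},
    "Training Games":    {"large_groups": False, "low_cost": True},
    "Role Playing":      {"large_groups": False, "low_cost": True},
    "Interactive Games": {"large_groups": False, "low_cost": True},
    "Video":             {"large_groups": True,  "low_cost": False},
    "Job Aids":          {"large_groups": True,  "low_cost": True},
}

def shortlist(large_group, tight_budget):
    """Return media types compatible with the group size and budget constraints."""
    return [
        name for name, traits in MEDIA.items()
        if (not large_group or traits["large_groups"])
        and (not tight_budget or traits["low_cost"])
    ]

print(shortlist(large_group=True, tight_budget=True))  # ['Job Aids']
```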
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve the instruction and the materials created for the course. SIL International (1999)
identifies seven questions a Formative Evaluation should attempt to answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to do
formative training evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how it has affected performance in
the workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
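When results are logged per level, it becomes obvious which of the four levels an evaluation has actually reached. A minimal sketch; KirkpatrickResults and its fields are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class KirkpatrickResults:
    """Hypothetical log of findings at each of Kirkpatrick's (2009) four levels."""
    reaction: Optional[str] = None  # Level 1: were learners satisfied?
    learning: Optional[str] = None  # Level 2: intended KSABs gained?
    behavior: Optional[str] = None  # Level 3: KSABs applied on the job?
    results: Optional[str] = None   # Level 4: impact on the organization?

    def levels_covered(self):
        """Count how many levels have recorded findings."""
        return sum(
            value is not None
            for value in (self.reaction, self.learning, self.behavior, self.results)
        )

summary = KirkpatrickResults(reaction="4.2/5 average", learning="87% passed post-test")
print(summary.levels_covered())  # 2 -- Levels 3 and 4 still unmeasured
```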
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick’s model.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals (Handbook I: Cognitive domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.). Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50 years, 1959–2009. Retrieved from http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them (3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010). Retrieved from http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 27
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 28
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that
learners can use on the job. Find a way to ensure what learners gained from the
course will transfer to their jobs.
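Because the events work best as a planning checklist rather than a rigid script (see the Helpful Tips below), one way to use them is as a template with an activity slot per event; a minimal Python sketch (illustrative only):

# The nine events as an ordered planning template. As the tips note,
# events may be moved, combined, or omitted, so a trimmed or reordered
# list can be passed in.
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

def lesson_plan_skeleton(events=None):
    """Return an empty activity slot for each selected event."""
    return {event: [] for event in (events or GAGNE_EVENTS)}

plan = lesson_plan_skeleton()
plan["Gain attention"].append("Open with a short real-world incident")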
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
The table below, adapted from Strategies for Developing Instructional Materials for the
Interpersonal Domain (2010), lists media types along with their benefits and considerations;
a selection sketch follows the table.
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can be used with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Needs practice opportunities to ensure transfer
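One way to use the table is to encode each Considerations entry as a constraint flag and filter candidates by the training situation; a minimal Python sketch (the flags are one reading of the Considerations column, not part of the original tool):

# Rough constraint flags derived from the Considerations column above.
MEDIA = {
    "Simulations":       {"large_group_ok": True,  "low_budget_ok": False},
    "Training Games":    {"large_group_ok": False, "low_budget_ok": True},
    "Role Playing":      {"large_group_ok": False, "low_budget_ok": True},
    "Interactive Games": {"large_group_ok": False, "low_budget_ok": True},
    "Video":             {"large_group_ok": True,  "low_budget_ok": False},
    "Job Aids":          {"large_group_ok": True,  "low_budget_ok": True},
}

def candidate_media(large_group, low_budget):
    """Return media types compatible with the group size and budget."""
    return [name for name, c in MEDIA.items()
            if (not large_group or c["large_group_ok"])
            and (not low_budget or c["low_budget_ok"])]

print(candidate_media(large_group=True, low_budget=True))  # ['Job Aids']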
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve the instruction as well as the materials created for the course. SIL International
(1999) identifies seven questions a Formative Evaluation should attempt to answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations (see the sketch after this list):
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS (attention, relevance, confidence, satisfaction) factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
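Since each stage verifies the feedback from the one before it, the stages behave like a sequential pipeline; a minimal Python sketch (illustrative only) that reports the next stage still to run:

# The six stages in order; each verifies the previous stage's findings.
FORMATIVE_STAGES = (
    "Design Review",
    "Expert Review",
    "One-to-One",
    "Small Group",
    "Field Trials",
    "Ongoing Evaluation",
)

def next_stage(completed):
    """Return the earliest stage not yet conducted, or None when done."""
    for stage in FORMATIVE_STAGES:
        if stage not in completed:
            return stage
    return None

print(next_stage({"Design Review", "Expert Review"}))  # One-to-One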
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. The Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented, and it provides information about how
much the instruction has improved workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation (see the sketch after this list):
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired impact on the organization?
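To act on the tip below about covering all four levels, an evaluation plan can be checked for gaps; a minimal Python sketch (illustrative only, using the guiding questions above):

# One guiding question per Kirkpatrick level, from the list above.
KIRKPATRICK_LEVELS = {
    1: ("Learner Reaction", "Were the learners satisfied with the training?"),
    2: ("Performance Evaluation", "Did they gain the intended KSABs?"),
    3: ("Behavior Evaluation", "Are they applying the new KSABs on the job?"),
    4: ("Effect on the Organization",
        "Did the training achieve the desired organizational impact?"),
}

def uncovered_levels(plan):
    """Return the levels an evaluation plan does not yet address."""
    return [lvl for lvl in KIRKPATRICK_LEVELS if lvl not in plan]

plan = {1: "end-of-course survey", 2: "post-test scores"}
print(uncovered_levels(plan))  # [3, 4]: exactly the levels the tips warn not to skip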
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959–2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010). Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 29
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 30
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how it has affected performance in the
workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
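As a rough illustration only (the names below are assumptions, not Kirkpatrick's notation), the four levels can be encoded as an ordered enumeration paired with the guiding question each level answers, so a summative plan can verify that every level is addressed:

from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    """Kirkpatrick's four levels, ordered from reaction to organizational results."""
    LEARNER_REACTION = 1
    PERFORMANCE_EVALUATION = 2
    BEHAVIOR_EVALUATION = 3
    ORGANIZATIONAL_EFFECT = 4

# The guiding question each level answers (from the list above).
GUIDING_QUESTIONS = {
    KirkpatrickLevel.LEARNER_REACTION:
        "Were the learners satisfied with the training?",
    KirkpatrickLevel.PERFORMANCE_EVALUATION:
        "Did they gain the intended KSABs?",
    KirkpatrickLevel.BEHAVIOR_EVALUATION:
        "Are they applying their newly acquired KSABs to their job?",
    KirkpatrickLevel.ORGANIZATIONAL_EFFECT:
        "To what degree did the training achieve the desired impact on the organization?",
}

for level in KirkpatrickLevel:
    print(f"Level {level.value}: {GUIDING_QUESTIONS[level]}")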
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
References

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959–2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 31
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 32
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 35
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified in the goal analysis. The primary method is a Task
Analysis: a list of the steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R. A. (1996). Analysis for improving performance: Tools for diagnosing
organizations and documenting workplace expertise. San Francisco, CA: Berrett-Koehler.
Gupta, K. (2007). A practical guide to needs assessment (2nd ed.). San Francisco, CA: Pfeiffer.
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step 1: Analyze learners and determine prerequisites.
Step 2: Identify job functions.
Step 3: Identify tasks within each function.
Step 4: Identify stages of process.
Step 5: Is the task procedural?
   • If yes, identify the steps and go to Step 7.
   • If no, go to Step 6.
Step 6: Identify guidelines of the principle-based task.
Step 7: Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
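The branch in Steps 5 through 7 is easy to misread in table form, so here is a small illustrative sketch of that decision flow (the function and field names are ours, not from the cited source):

```python
def record_task(name: str, is_procedural: bool, steps: list | None = None,
                guidelines: list | None = None,
                knowledge: list | None = None) -> dict:
    """Steps 5-7 of the job task analysis above: procedural tasks list
    their steps; principle-based tasks list guidelines; both capture
    the knowledge needed to complete the task."""
    record: dict = {"task": name}
    if is_procedural:                        # Step 5: yes -> steps, then Step 7
        record["steps"] = steps or []
    else:                                    # Step 5: no -> Step 6
        record["guidelines"] = guidelines or []
    record["knowledge"] = knowledge or []    # Step 7
    return record

# Example: a procedural task lists its steps, then the knowledge needed.
print(record_task("Process a refund", True,
                  steps=["Verify receipt", "Issue credit"],
                  knowledge=["Refund policy"]))
```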
Conduct Instructional Analysis
Helpful Tips
• Use the identified tasks and steps as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files to determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, active or
reflective, sequential or global
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learner's environment will be like on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills?” (Nicholson, 2004)
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more realistic?
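As a worksheet, Nicholson’s two context checklists can be captured in a few lines. The sketch below is ours and only paraphrases the questions above; none of the names come from the source.

```python
# Nicholson's (2004) context questions, paraphrased, as a fill-in worksheet.
PERFORMANCE_CONTEXT = [
    "Managerial support: how will supervisors/managers support learners?",
    "Physical aspects of the site: equipment, facilities, tools available?",
    "Social aspects of the site: alone or in teams? office or field?",
    "Relevance of skills: constraints on using the new skills at work?",
]
LEARNING_CONTEXT = [
    "Number and nature of sites: training facilities and equipment?",
    "Compatibility with instructional requirements: site limitations?",
    "Compatibility with learner needs: conveniences, equipment, space?",
    "Feasibility of simulating the workplace: how realistic can it be?",
]

def context_worksheet() -> dict:
    """Blank worksheet keyed by question, ready for the designer's notes."""
    return {"performance": {q: "" for q in PERFORMANCE_CONTEXT},
            "learning": {q: "" for q in LEARNING_CONTEXT}}
```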
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant the information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different from goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for writing Behavioral Objectives, which consists of
three parts:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
# 1
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
# 1
Bloom’s Taxonomy Level: (X) Synthesis; Knowledge, Comprehension, Application, Analysis, and Evaluation are unchecked
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
Under the Performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs
(knowledge, skills, abilities, and behaviors).
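The congruence check described above can be expressed in a few lines of Python. This sketch is illustrative only: the verb sets are abbreviated from the Bloom et al. (1956) lists on the previous slide, and all class and method names are ours.

```python
from dataclasses import dataclass

# Abbreviated from the Bloom et al. (1956) verb lists above.
BLOOM_VERBS = {
    "Knowledge": {"define", "list", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain"},
    "Application": {"apply", "demonstrate", "solve", "use"},
    "Analysis": {"analyze", "compare", "differentiate"},
    "Synthesis": {"create", "design", "develop", "compose"},
    "Evaluation": {"assess", "judge", "defend", "evaluate"},
}

@dataclass
class BehavioralObjective:
    performance: str   # What will they do?
    condition: str     # Under what conditions?
    criterion: str     # How well?
    bloom_level: str   # intended cognitive level

    def verb_is_congruent(self, verb: str) -> bool:
        """True if the action verb sits at the chosen Bloom level."""
        return verb.lower() in BLOOM_VERBS.get(self.bloom_level, set())

obj = BehavioralObjective(
    performance="create effective behavioral objectives",
    condition='given the "Write Performance Objectives" portion of the IIDT',
    criterion="that define how well learners should perform on the job",
    bloom_level="Synthesis",
)
assert obj.verb_is_congruent("create")  # "create" is a Synthesis-level verb
```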
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of
prerequisite skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills identified during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they are
learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine whether learners can carry out the
performance objectives
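One way to keep the four instrument types straight is as a timing lookup. The sketch below is ours and simply paraphrases the list above; the names are illustrative.

```python
# The four instrument types above, keyed by when each is given.
ASSESSMENT_INSTRUMENTS = {
    "Entry Behaviors Test": ("before", "mastery of prerequisite skills"),
    "Pre-test": ("before", "course skills already mastered"),
    "Practice Tests": ("during", "rehearsal plus corrective feedback"),
    "Post-tests": ("after", "performance objectives achieved"),
}

def instruments_at(phase: str) -> list:
    """Instruments given at a phase: 'before', 'during', or 'after'."""
    return [name for name, (timing, _purpose) in ASSESSMENT_INSTRUMENTS.items()
            if timing == phase]

print(instruments_at("before"))  # ['Entry Behaviors Test', 'Pre-test']
```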
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc., which
learners can use on the job. Find a way to ensure what learners gained from the
course will transfer to their jobs.
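A designer could use the nine events as an ordered scaffold when drafting a lesson plan. The sketch below is purely illustrative (the function name and output format are ours):

```python
# Gagne's nine events as an ordered scaffold for a lesson plan.
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer",
]

def lesson_outline(activities: dict) -> list:
    """Pair each event with its planned activity; per the tips below,
    events may legitimately be reordered, combined, or omitted."""
    return [f"{event}: {activities.get(event, '(omitted)')}"
            for event in GAGNE_EVENTS]

print(lesson_outline({"Gain attention": "Open with a failed-training anecdote"}))
```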
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table below was adapted from Strategies for Developing Instructional Materials for the
Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits: Permits independence in learning process; contextualizes content; can provide
multiple perspectives; develops critical thinking skills
Considerations: Can be expensive; feedback important to success

Type: Training Games
Benefits: Highly motivational; encourages teamwork; uses problem solving skills; develops
communication skills
Considerations: Difficult with large groups; can require extensive guidance to be effective

Type: Role Playing
Benefits: Introduces real world situations; promotes understanding of other positions;
emphasizes working together; provides opportunities to give & receive feedback
Considerations: Difficult with large groups; can require extensive guidance to be effective

Type: Interactive Games
Benefits: Highly motivational; engages learner; develops strategic thinking skills
Considerations: Best with individuals or small groups; may require support materials to
ensure learning

Type: Video
Benefits: Great for large groups; provides for safe observation; can include real life
situations; can develop critical thinking
Considerations: Technology requirements; difficult to adapt; need discussion & practice
opportunities

Type: Job Aids
Benefits: Provides for rapid instruction; inexpensive; can use with any size group; provides
opportunities for self-assessment
Considerations: Good as a support tool; need practice opportunities to ensure transfer
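The Considerations column can double as a screening checklist. The sketch below is ours: it paraphrases the table entries and encodes nothing beyond them.

```python
# "Considerations" column above, paraphrased as a lookup.
CONSIDERATIONS = {
    "Simulations": ["can be expensive", "feedback important to success"],
    "Training Games": ["difficult with large groups", "needs guidance"],
    "Role Playing": ["difficult with large groups", "needs guidance"],
    "Interactive Games": ["best with small groups", "may need support materials"],
    "Video": ["technology requirements", "difficult to adapt",
              "needs discussion and practice"],
    "Job Aids": ["support tool only", "needs practice for transfer"],
}

def screen_media(constraint: str) -> list:
    """Media types whose considerations do not mention the constraint."""
    return [media for media, notes in CONSIDERATIONS.items()
            if all(constraint not in note for note in notes)]

print(screen_media("large groups"))  # candidates for a large-group course
```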
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International (1999)
identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to do
formative training evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
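Because the stages are ordered, a design team could track them as a simple checklist. This sketch is ours; the stage names and focuses paraphrase the list above.

```python
# Dick and Carey's six formative stages as an ordered checklist.
FORMATIVE_STAGES = [
    ("Design Review", "design matches the analysis"),
    ("Expert Review", "content is accurate"),
    ("One-to-One", "impact on ARCS factors"),
    ("Small Group", "verify one-to-one feedback; find new issues"),
    ("Field Trials", "verify small-group feedback; context issues"),
    ("Ongoing Evaluation", "training stays relevant"),
]

def next_stage(completed: set) -> str:
    """First stage not yet signed off, in order; '' when all are done."""
    for name, _focus in FORMATIVE_STAGES:
        if name not in completed:
            return name
    return ""

print(next_stage({"Design Review"}))  # -> 'Expert Review'
```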
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how it has affected the workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
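Since the levels are ordered, they map naturally onto a small enumeration. The sketch below is illustrative only (the names and the coverage check are ours, not Kirkpatrick’s).

```python
from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    """Kirkpatrick's four levels; higher numbers measure deeper impact."""
    REACTION = 1   # were learners satisfied?
    LEARNING = 2   # did they gain the intended KSABs?
    BEHAVIOR = 3   # are the KSABs being applied on the job?
    RESULTS = 4    # did the organization see the desired impact?

# A summative plan should cover all four levels (see the tips below).
planned = {KirkpatrickLevel.REACTION, KirkpatrickLevel.LEARNING}
missing = set(KirkpatrickLevel) - planned
print(sorted(level.name for level in missing))  # ['BEHAVIOR', 'RESULTS']
```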
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 36
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 37
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#1
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
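The three components can also be captured in a simple structure. Below is a minimal sketch
in Python (not part of the original tool; the class name and verb list are ours for
illustration) that records a Mager-style objective and flags performance statements built on
non-observable verbs such as “know” or “understand.”

from dataclasses import dataclass

# Sample of non-observable verbs to avoid in performance statements;
# this set is illustrative, not exhaustive.
NON_OBSERVABLE_VERBS = {"know", "understand", "appreciate", "learn"}

@dataclass
class BehavioralObjective:
    performance: str  # What should the learner be able to do on the job?
    condition: str    # Under what conditions will the performance occur?
    criterion: str    # How well should the learner be able to perform?

    def uses_observable_verb(self) -> bool:
        # True when no non-observable verb appears in the performance statement.
        words = {w.strip(".,").lower() for w in self.performance.split()}
        return not (words & NON_OBSERVABLE_VERBS)

objective = BehavioralObjective(
    performance="Create effective behavioral objectives",
    condition="Given the 'Write Performance Objectives' portion of the IIDT",
    criterion="That define how well learners should perform on the job",
)
print(objective.uses_observable_verb())  # True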
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, and Krathwohl (1956) provide six levels in their cognitive domain:
1. Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
2. Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
3. Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
4. Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
5. Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
6. Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, score, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#1
Bloom’s Taxonomy Level: ( ) Knowledge ( ) Comprehension ( ) Application ( ) Analysis (X) Synthesis ( ) Evaluation
Performance (What will they do?): Learners should be able to create effective behavioral objectives
Condition (What Conditions?): Given the “Write Performance Objectives” portion of the IIDT
Criterion (How Well?): That define how well learners should perform on the job
In the Performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Bloom’s Cognitive Domain, designers ensure learners are developing the correct KSABs.
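As a hedged illustration (our own sketch, not a feature of the IIDT), the verb lists above
can be stored as a lookup so a designer can check which cognitive level an objective’s
action verb implies. The verb sets here are abbreviated from the slide, and some verbs
genuinely appear at more than one level (e.g., “write”), so the lookup returns every
matching level rather than a single answer.

# Abbreviated verb lists from Bloom's cognitive domain (see the slide above).
BLOOM_VERBS = {
    "Knowledge": {"define", "list", "name", "recall", "state"},
    "Comprehension": {"classify", "describe", "explain", "identify"},
    "Application": {"apply", "demonstrate", "solve", "use", "write"},
    "Analysis": {"analyze", "compare", "contrast", "differentiate"},
    "Synthesis": {"assemble", "compose", "create", "design", "write"},
    "Evaluation": {"appraise", "assess", "defend", "judge", "evaluate"},
}

def bloom_levels(action_verb: str) -> list:
    # Return every level whose verb list contains the action verb;
    # an empty list means the verb is not on these (abbreviated) lists.
    verb = action_verb.lower()
    return [level for level, verbs in BLOOM_VERBS.items() if verb in verbs]

print(bloom_levels("create"))  # ['Synthesis'], matching the worked example
print(bloom_levels("write"))   # ['Application', 'Synthesis'] - overlap is real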
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
• Determine if learners have necessary prerequisites to learn skills used in this course
• Evaluate what knowledge, skills, and abilities learners gained during the course
• Document learners’ progress
• Aid in creating Formative and Summative evaluations
• Determine performance measures before development of the lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Given prior to instruction to assess learners’ mastery of
prerequisite skills
• Pre-test – Given prior to instruction to assess whether learners have already mastered
some of the course skills identified during the instructional analysis
• Practice Tests – Given during instruction to let learners rehearse the new skills they
are learning and to allow for corrective feedback
• Post-tests – Given following instruction to determine whether learners have achieved the
ability to carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing an Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
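To make the checklist nature of the nine events concrete, here is a small sketch (our
illustration; the lesson-plan format is invented) that reports which events a draft plan
has not yet addressed. Because the events are a flexible guideline, gaps are prompts to
consider, not errors.

# Gagné's nine events, in their conventional order (from the list above).
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer",
]

def unaddressed_events(lesson_plan: dict) -> list:
    # Events the draft plan does not yet cover; omission may be deliberate.
    return [event for event in GAGNE_EVENTS if event not in lesson_plan]

draft_plan = {
    "Gain attention": "Open with a short scenario of a failed training rollout",
    "Present the content": "Walk through the objective-writing template",
}
print(unaddressed_events(draft_plan))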
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table below was adapted from Strategies for Developing Instructional Materials for the
Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Need discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can use with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Need practice opportunities to ensure transfer
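One way to act on the Considerations column is a small filter over the table. The sketch
below is hypothetical (the group-size cutoff and the groupings are our reading of the
table, not part of the source): it drops the media types the table marks as difficult
with large groups or best with individuals and small groups when the audience is large.

ALL_MEDIA = {"Simulations", "Training Games", "Role Playing",
             "Interactive Games", "Video", "Job Aids"}

# Per the Considerations column: difficult with large groups, or best
# with individuals/small groups.
SMALL_GROUP_MEDIA = {"Training Games", "Role Playing", "Interactive Games"}

def candidate_media(group_size: int, large_group_cutoff: int = 20) -> set:
    # The cutoff is an assumption for illustration; the source gives no number.
    if group_size > large_group_cutoff:
        return ALL_MEDIA - SMALL_GROUP_MEDIA
    return ALL_MEDIA

print(sorted(candidate_media(50)))  # ['Job Aids', 'Simulations', 'Video']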
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. SIL International (1999)
identifies seven questions a Formative Evaluation should attempt to answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do
Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluation:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learner performance and how it has affected the workplace.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
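As a minimal sketch (our own, assuming only the four levels listed above), an evaluation
plan can pair each level with a planned instrument and flag the levels still uncovered,
typically Levels 3 and 4, as the tips that follow warn.

KIRKPATRICK_LEVELS = {
    1: "Learner Reaction",
    2: "Performance Evaluation",
    3: "Behavior Evaluation",
    4: "Effect on the Organization",
}

def uncovered_levels(planned_instruments: dict) -> list:
    # Levels with no planned instrument; Levels 3 and 4 are the usual gaps.
    return [level for level in KIRKPATRICK_LEVELS
            if level not in planned_instruments]

plan = {1: "End-of-course reaction survey", 2: "Post-test against objectives"}
print(uncovered_levels(plan))  # [3, 4]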
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 38
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 39
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction:
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide “learning guidance” – Use examples, case studies, role play, etc. to help learners better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners’ mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. that learners can use on the job, and find a way to ensure what learners gained from the course will transfer to their jobs.
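As a minimal sketch only (all names illustrative), a designer could treat the nine events as data rather than a fixed script, which matches the flexibility tip in the next section:

```python
# Gagné's 9 events kept as data so a lesson can reorder or omit them.
GAGNE_EVENTS = [
    "Gain attention",
    "Inform learners of objectives",
    "Stimulate recall of prior learning",
    "Present the content",
    "Provide learning guidance",
    "Elicit performance (practice)",
    "Provide feedback",
    "Assess performance",
    "Enhance retention and transfer to the job",
]

def plan_lesson(include=None):
    """Return the events kept for one lesson; `include` optionally filters."""
    return [e for e in GAGNE_EVENTS if include is None or e in include]

# e.g., a short refresher might skip the formal assessment events entirely:
refresher = plan_lesson(include=GAGNE_EVENTS[:7])
```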
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(e.g., skills vs. attitudes).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may take various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See the table of media types, benefits, and considerations on the following page, adapted
from Strategies for Developing Instructional Materials for the Interpersonal Domain (2010).
Develop and Select Instructional Materials
Type: Simulations
Benefits:
• Permits independence in the learning process
• Contextualizes content
• Can provide multiple perspectives
• Develops critical thinking skills
Considerations:
• Can be expensive
• Feedback is important to success

Type: Training Games
Benefits:
• Highly motivational
• Encourages teamwork
• Uses problem-solving skills
• Develops communication skills
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Role Playing
Benefits:
• Introduces real-world situations
• Promotes understanding of other positions
• Emphasizes working together
• Provides opportunities to give & receive feedback
Considerations:
• Difficult with large groups
• Can require extensive guidance to be effective

Type: Interactive Games
Benefits:
• Highly motivational
• Engages the learner
• Develops strategic thinking skills
Considerations:
• Best with individuals or small groups
• May require support materials to ensure learning

Type: Video
Benefits:
• Great for large groups
• Provides for safe observation
• Can include real-life situations
• Can develop critical thinking
Considerations:
• Technology requirements
• Difficult to adapt
• Needs discussion & practice opportunities

Type: Job Aids
Benefits:
• Provides for rapid instruction
• Inexpensive
• Can be used with any size group
• Provides opportunities for self-assessment
Considerations:
• Good as a support tool
• Needs practice opportunities to ensure transfer
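One hedged way to make this table actionable is to encode it as data and filter candidate media by a constraint such as group size; the flags below are a rough, lossy reading of the Benefits and Considerations columns, not part of the source:

```python
# Rough encoding of the media table; `large_groups` reflects the table's
# group-size notes, and `notes` condenses the Considerations column.
MEDIA = {
    "Simulations":       dict(large_groups=False, notes="can be expensive; feedback matters"),
    "Training Games":    dict(large_groups=False, notes="can require extensive guidance"),
    "Role Playing":      dict(large_groups=False, notes="can require extensive guidance"),
    "Interactive Games": dict(large_groups=False, notes="best for individuals/small groups"),
    "Video":             dict(large_groups=True,  notes="technology requirements; needs discussion"),
    "Job Aids":          dict(large_groups=True,  notes="support tool; needs practice for transfer"),
}

def for_group_size(large: bool) -> list[str]:
    """Return media types suited (or not suited) to large groups."""
    return [m for m, attrs in MEDIA.items() if attrs["large_groups"] == large]

print(for_group_size(large=True))   # ['Video', 'Job Aids']
```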
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of the learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve the instruction, as well as the materials created for the course. SIL International
(1999) identifies seven questions the Formative Evaluation should attempt to answer:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate, or do you need to adjust them? (“How to Do Formative Training Evaluation,” 1999)
Design and Conduct Formative Evaluation
Try Using:
Dick and Carey (1996) list six stages of Formative Evaluation:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine the course’s impact on ARCS factors (attention, relevance, confidence, satisfaction)
4. Small Group – Verify feedback from one-to-one evaluations; look for additional issues
5. Field Trials – Verify feedback from small-group evaluations; look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it remains relevant
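As an illustrative sketch only, the six stages can be run as a pipeline in which each stage’s findings feed revision before the next, larger-scale stage; the `evaluate` callable stands in for whatever review method a team actually uses:

```python
# Illustrative pipeline over Dick and Carey's six formative-evaluation stages.
STAGES = [
    "Design Review", "Expert Review", "One-to-One",
    "Small Group", "Field Trials", "Ongoing Evaluation",
]

def run_formative_evaluation(evaluate):
    """`evaluate` maps a stage name to a list of issues found at that stage."""
    findings = {}
    for stage in STAGES:
        issues = evaluate(stage)
        findings[stage] = issues
        for issue in issues:          # each finding drives a revision pass
            print(f"[{stage}] revise: {issue}")
    return findings

# Usage with a placeholder reviewer that finds nothing:
notes = run_formative_evaluation(lambda stage: [])
```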
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learning and how it has affected workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Level 4: Effect on the Organization – To what degree did the training achieve the desired impact on the organization?
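A small sketch (assumed names, not Kirkpatrick’s own notation) models the four levels as an ordered enum so an evaluation plan can be laid out from Level 4 backward, echoing the earlier tip to start with Results:

```python
# The four Kirkpatrick levels as an ordered enum; plan top-down from Results.
from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    REACTION = 1   # Were learners satisfied with the training?
    LEARNING = 2   # Did they gain the intended KSABs?
    BEHAVIOR = 3   # Are they applying the KSABs on the job?
    RESULTS = 4    # Did the training achieve the desired organizational impact?

for level in sorted(KirkpatrickLevel, reverse=True):
    print(f"Level {level.value}: {level.name.title()}")
```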
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick's model.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals (Handbook I: Cognitive domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.). Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50 years, 1959–2009. Retrieved from http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them (3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010). Retrieved from http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 40
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 41
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. Summative Evaluation begins after Formative Evaluations are
complete and the instruction has been implemented. It provides information about how much
the instruction has improved learning and how it has affected workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level 1: Learner Reaction – Were the learners satisfied with the training?
Level 2: Performance Evaluation – Did they gain the intended KSABs?
Level 3: Behavior Evaluation – Are they applying their newly acquired KSABs to their jobs?
Level 4: Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
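To make the four levels concrete, here is a small hypothetical sketch (ours, not Kirkpatrick’s) of a summative evaluation plan that pairs each level with an example data source; the pairings are illustrative assumptions only.

```python
# Hypothetical sketch: pair each Kirkpatrick level with an example data
# source for a summative evaluation plan. The pairings are illustrative
# assumptions, not prescribed by Kirkpatrick (2009).

EVALUATION_PLAN = [
    (1, "Learner Reaction",           "post-course satisfaction survey"),
    (2, "Performance Evaluation",     "pre-test vs. post-test scores"),
    (3, "Behavior Evaluation",        "on-the-job observation after 90 days"),
    (4, "Effect on the Organization", "quarterly performance metrics"),
]

for level, name, data_source in EVALUATION_PLAN:
    print(f"Level {level} ({name}): collect {data_source}")
```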
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four of Kirkpatrick's levels.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instructional intervention accordingly. You may have to revise your goals, objectives, or
analyses, as well as your materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following three questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals (Handbook I: Cognitive domain). New York, NY: David McKay Company, Inc.
Dick, W., & Carey, L. (1996). The systematic design of instruction. New York, NY: HarperCollins Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.). Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToDoFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Kirkpatrick, J., & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50 years, 1959–2009. Retrieved from http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx
Mager, R. F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them (3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis [PowerPoint slides]. Retrieved from http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010). Retrieved from http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 42
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/
Slide 43
IPT 536 - IPT Tool
Perri Kennedy, Shannon Rist, Chester Stevenson
Boise State University
Spring 2010
Continue
Dick and Carey’s
Instructional Design Model
This training tool is intended to provide instructional designers
with an overview of Dick and Carey’s Instructional Design Model.
It is not meant to be an all inclusive list but rather a list of the
models we felt most beneficial for each phase of design. Click
continue.
Continue
The next four slides will explain how to use this Tool. Click the Blue Arrows
in the text box to move between slides.
Instructions:
On the Home screen, you will see ten icons
similar to this. Each icon represents a phase
in Dick & Carey’s Instructional Design
Model. Click each icon to learn more.
1 of 4
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningNavigation
event. Twobuttons
tasks happen
during the
GoaltopAnalysis phase:
are available
at the
right side of each page. The left arrow will
• Needs Analysis take
- Used
to to
gain
understanding
of:
you
theanprevious
page viewed.
The
• Optimal performance
or knowledge
Home button
will take you back to the
• Actual or current
or knowledge
home performance
menu. The right
arrow will take you
• Feelings of to
trainees
and
others
the next page.
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
2 of 4
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
Instructions:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learningClick
event.
tasksTips
happen
Goal Analysis phase:
theTwo
Helpful
iconduring
to getthe
inside
information regarding important Do’s and
• Needs Analysis Don'ts
- Used for
to gain
understanding of:
eachanphase.
•
•
•
•
•
Optimal performance or knowledge
Actual or current performance or knowledge
Feelings of 3trainees
of 4 and others
Causes of the problem from many perspectives
Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis - Determine the overall goals of the training course.
Instructions:
Click the hyperlinks to get expanded
information on important topics.
Open Tool
4 of 4
Revise
Instruction
Conduct
Instructional
Analysis
Identify
Instructional
Goals
Write
Performance
Objectives
Develop
Assessment
Instruments
Develop
Instructional
Strategy(ies)
Develop
& Select
Instructional
Materials
Design
& Conduct
Formative
Evaluations
Analyze
Learners
& Contexts
Design
& Conduct
Summative
Evaluations
Goal Analysis
What Happens:
During the Goal Analysis Phase, designers should determine what will be accomplished
during the learning event. Two tasks happen during the Goal Analysis phase:
• Needs Analysis - Used to gain an understanding of:
• Optimal performance or knowledge
• Actual or current performance or knowledge
• Feelings of trainees and others
• Causes of the problem from many perspectives
• Solutions to the problem from many perspectives (Rossett, 1987, p. 4)
• Goal Analysis – Used to determine the overall goals of the training course.
Goal Analysis
Try Using:
According to Dick and Carey (1996) a Training Needs Assessment should answer:
1.
2.
3.
Who are your learners? What are they like? What characteristics might affect your
design of the learning environment? Find out information about learners:
•
•
•
•
Cognitive abilities
Previous experiences
Motivational interests
Personal learning styles
What is the instructional need? According to the data collected in Step 1, what are
learners currently not able to do that you need them to do?
What leads you to believe that the need can be addressed by instruction?
Using the data collected, create an organized summary describing your learning need. (as
cited in “ID Final Project: Lesson 3,” 2003)
Goal Analysis
Try Using:
Robert Mager’s (1997) Goal Analysis Model states:
Step One: Write down the goal in outcome terms.
Step Two: Jot down, in words and phrases, the performances that, if observed, would cause
you to agree the goal was achieved.
Step Three: Sort out the jottings. Delete duplications and unwanted items. Repeat Steps One
and Two for any remaining abstractions (fuzzies) considered important.
Step Four: Write a complete statement for each performance describing the nature, quality, or
amount you will consider acceptable.
Step Five: Test the statements with the question, ‘If someone achieved or demonstrated each
of these performances, would I be willing to say he or she had achieved the goal?’
When you can answer ‘yes,’ the analysis is finished (p. 86).
Goal Analysis
Helpful Tips
• Use action verbs to describe the performance objective and indicate observable
behaviors.
• Describe the desired behavior that should result from the training.
• Avoid using verbs like "know" or "understand" to describe the performance behavior,
because these are not observable behaviors.
• Don't write objectives that describe what the instructor or student will do in class - the
objective describes the result, not the process.
Conduct Instructional Analysis
What Happens:
When conducting an Instructional Analysis, designers discover what skills are needed to
achieve the results identified from the goal analysis. The primary method is through a Task
Analysis, which is a list of steps and skills used for each procedure in the course being
designed.
Additional Resources:
Swanson, R.A. (1996). Analysis for Improving Performance: Tools for Diagnosing
Organizations and Documenting Workplace Expertise. San Francisco, Ca: Berrett – Koehler
Gupta, K. (2007). A Practical Guide to Needs Assessment (2nd Ed.). San Francisco, Ca: Pfeiffer
Conduct Instructional Analysis
Try Using:
Task Analysis information can be collected by:
• Observing employees on the job
• Interviewing employees and supervisors
• Reviewing documents, processes, policies, etc. for the job position
Step
Action
1
Analyze learners and determine prerequisites.
2
Identify job functions.
3
Identify tasks within each function.
4
Identify stages of process.
5
Is the task procedural?
• If yes, identify the steps and go to Step 7
• If no, go to Step 6.
6
Identify guidelines of the principle-based task.
7
Identify knowledge needed to complete the task.
(“Unit 2: Job Task Analysis,” n.d., p. 4)
Conduct Instructional Analysis
Helpful Tips
• Use the events as a guide for structuring the learning activities.
• When structuring learning activities, avoid including information that learners already
know or don't need to know.
Analyze Learners and Contexts
What Happens:
When conducting a Learner and Context Analysis, designers discover what knowledge, skills,
abilities, and personalities their learners will bring to the training event.
Analyze Learners and Contexts
Try Using:
Learner Analysis
Nicholson (2004) suggests using records, interviews, surveys, observations, job descriptions,
and personnel files determine:
• General characteristics: age, gender, language, culture
• Personal/social characteristics: maturity level, emotional level, expectations, aspirations,
talents/interests, experience, physical capabilities
• Academic characteristics: education level, training levels completed, special courses
completed, previous performance levels, test scores, GPA
• Specific entry characteristics: prerequisite skills, prior experience with topic, reading levels,
attention span, attitudes towards work or the subject
• Learning styles: visual or auditory, sensory or intuitive, inductive or deductive, actively or
reflectively, sequentially or globally
Analyze Learners and Contexts
Try Using:
Context Analysis – Two parts: Performance and Learning Context
The Learning Context describes what the learning environment will be like.
The Performance Context describes what the learners environment will be on the job.
Analyze Learners and Contexts
Try Using:
Performance Context
Nicholson (2004) explains four components of the performance context as:
• Managerial Support – How will supervisors and managers support learners on the job?
• Physical Aspects of the Site - What equipment, facilities, and tools will be available?
• Social Aspects of the Site – Will learners work alone or in teams? Will they work in the
office or in the field?
• Relevance of Skills to Workplace – “How relevant are the new skills to the actual
workplace? Are there physical, social, or motivational constraints to the use of the new
skills” (Nicholson, M. 2004)?
Analyze Learners and Contexts
Try Using:
Learning Context
Nicholson (2004) explains four components of the learning context as:
• Number and Nature of Sites – What facilities and equipment will be available for training?
• Compatibility of the Site With the Instructional Requirements – Are there any limitations to
using the available training site(s)?
• Compatibility of the Site With the Learner Needs – Does the site have necessary
conveniences, necessary equipment, and adequate space available?
• Feasibility for Simulating the Workplace – How well can the actual work environment be
simulated at the site? Can anything be done to make it more?
Analyze Learners and Contexts
Helpful Tips
• Determine what prior knowledge the learners have and how relevant information to be
learned is to them.
• Don't assume what learners do and do not know. This can lead to unexpected and
unwelcome surprises when you launch the project.
Write Performance Objectives
What Happens:
When writing Performance Objectives, designers translate data from the needs and goal
analysis into specific objectives. These objectives will be used later to measure the quality of
instruction and learning that takes place. They will also be used to:
• Determine whether the instruction being developed relates to its goals
• Guide the development of evaluation tools
• Give learners an idea of what content they should focus on
Objectives are different than goals. Goals are more of an overarching vision of what the
course will accomplish. Objectives are measurable descriptions of what you want the learners
to demonstrate on the job.
Two methods that can be used to create objectives are:
• Mager’s Behavioral Objectives
• Bloom’s Taxonomy
Write Performance Objectives
Try Using:
Robert Mager (1997) developed a method for developing Behavioral Objectives which
consisted of:
• Performance – What should the learner be able to do on the job?
• Condition – Under what conditions will the performance occur on the job?
• Criterion – How well should the learner be able to perform on the job?
Review the example below:
#
1
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able to create
effective behavioral objectives
Given the “Write Performance
Objectives” portion of the IIDT
That define how well learners should
perform on the job
Write Performance Objectives
Try Using:
The Cognitive Domain in Bloom’s Taxonomy can be used to align objectives with instructional
activities. Decide what level the Performance Objective falls under, then use an action verb
from the list below in Mager’s Performance column.
Bloom, Engelhart, Furst, Hill, & Krathwohl, (1956) provide six levels in their cognitive domain:
1.
2.
3.
4.
5.
6.
Knowledge - arrange, define, duplicate, label, list, memorize, name, order, recognize,
relate, recall, repeat, reproduce, state.
Comprehension - classify, describe, discuss, explain, express, identify, indicate, locate,
recognize, report, restate, review, select, translate.
Application - apply, choose, demonstrate, dramatize, employ, illustrate, interpret,
operate, practice, schedule, sketch, solve, use, write.
Analysis - analyze, appraise, calculate, categorize, compare, contrast, criticize,
differentiate, discriminate, distinguish, examine, experiment, question, test.
Synthesis - arrange, assemble, collect, compose, construct, create, design, develop,
formulate, manage, organize, plan, prepare, propose, set up, write.
Evaluation - appraise, argue, assess, attach, choose, compare, defend, estimate,
judge, predict, rate, core, select, support, value, evaluate.
See an example
Write Performance Objectives
Example of Mager’s Behavioral Objectives used with Bloom’s Taxonomy:
#
1
Blooms Taxonomy Level
( ) Knowledge
( ) Comprehension
( ) Application
( ) Analysis
(X) Synthesis
( ) Evaluation
Performance
Condition
Criterion
(What will they do?)
(What Conditions?)
(How Well?)
Learners should be able
to create effective
behavioral objectives
Given the “Write
Performance Objectives”
portion of the IIDT
That define how well
learners should perform on
the job
Under the performance column, notice the verb “create” corresponds to Bloom’s Synthesis
level. When developing activities for this objective, designers should ask learners to “create
effective behavioral objectives.” By making activities congruent with the correct level in
Blooms Cognitive Domain, designers ensure learners are developing the correct KSABs.
Write Performance Objectives
Helpful Tips
• Analyze the desired performance carefully to determine the levels to be covered in the
training.
• Higher on the taxonomy is not necessarily better. If Application is the appropriate highest
taxonomy level, then stay at that level.
Develop Assessment Instruments
What Happens:
Before designing training, develop Assessment Instruments to:
•
•
•
•
•
Determine if learners have necessary prerequisites to learn skills used in this course
Evaluate what knowledge, skills, and abilities learners gained during the course
Document learners’ progress
Aid in creating Formative and Summative evaluations
Determine performance measures before development of lesson plan and instructional
materials (“ID Final Project: Lesson 7,” 2003)
Develop Assessment Instruments
Try Using:
ID Final Project: Lesson 7 (2003) lists four types of assessment instruments to develop:
• Entry Behaviors Test – Prior to instruction, give to assess learners’ mastery of prerequisite
skills
• Pre-test – Prior to instruction, given to assess whether learners have already mastered
some of the course skills determined during the instructional analysis
• Practice Tests – During instruction, give learners a chance to rehearse the new skills they
are learning and allow for corrective feedback
• Post-tests – Following instruction, give to determine if learners have achieved the ability to
carry out the performance objectives
Develop Assessment Instruments
Helpful Tips
• Make sure to begin with the desired performance and work backward to determine what
will be needed to achieve objectives.
• Don't wait until you've developed the training before you look at how to assess it. Training
design should begin at Kirkpatrick's 4th level of outcomes. Until you know what you want
for Results, the other three levels are irrelevant (Kirkpatrick & Kirkpatrick, 2009).
Develop Instructional Strategy
What Happens:
When developing a Instructional Strategy, designers “identify and employ teaching strategies
and techniques that most effectively achieve the performance objectives” (Gagné, 1992).
Designing instruction is about more than choosing the mode of delivery. Much like a
screenplay sets the stage for a play or movie, the instructional strategy is the screenplay for
learning. One method available to guide designers in developing instruction is Gagné’s 9
Events of Instruction.
Develop Instructional Strategy
Try Using:
Robert Gagné’s (1992) 9 Events of Instruction
1. Gain attention – Ensure learners are ready to learn
2. Inform learners of objectives – Ensure learners know what they are going to learn
3. Stimulate recall of prior learning – Tie prior knowledge to what they are about to learn
4. Present the content – Introduce the new material
5. Provide "learning guidance” – Use examples, case studies, role play, etc. to help learners
better understand the material
6. Elicit performance (practice) – Allow the learner to practice
7. Provide feedback – Provide specific and immediate feedback to guide learners
8. Assess performance – Give a post/final test to assess learners mastery of the material
9. Enhance retention and transfer to the job – Create job aids, references, tools, etc. which
learners can utilize for their job. Find a way to ensure what learners’ gained from the
course will transfer to their jobs.
Develop Instructional Strategy
Helpful Tips
• Classify learning outcomes. Different types of learning require different types of training
(eg; skills vs. attitude).
• The 9 events do not have to be performed in sequential order, as separate segments, or at
all. If it makes more sense to move, combine, or omit certain events, do it. Gagné's 9
Events are a flexible guideline, not an absolute blueprint.
Develop and Select Instructional Materials
What Happens:
When developing and selecting Instructional Materials, designers select print and electronic
instructional materials to use in the course. Ideally, existing materials should be used,
although they may need improvement or revision.
Develop and Select Instructional Materials
Try Using:
The materials may be in various forms: print, computer, audio, audio-video, etc. There are
benefits and drawbacks to each media type depending on the budget and learning situation.
See a table of media types along with benefits and considerations.
The table on the following page was adapted from Strategies for developing Instructional
Materials for the Interpersonal Domain (2010)
Develop and Select Instructional Materials
Type
Benefits
Considerations
Simulations
•
•
•
•
Permits independence in learning process
Contextualizes content
Can provide multiple perspectives
Develops critical thinking skills
•
•
Can be expensive
Feedback important to success
Training Games
•
•
•
•
Highly motivational
Encourages teamwork
Uses problem solving skills
Develops communication skills
•
•
Difficult with large groups
Can require extensive guidance to be effective
Role Playing
•
•
•
•
Introduces real world situations
Promotes understanding of other positions
Emphasizes working together
Provides opportunities to give & receive feedback
•
•
Difficult with large groups
Can require extensive guidance to be effective
Interactive Games
•
•
•
Highly motivational
Engages learner
Develops strategical thinking skills
•
•
Best with individuals or small groups
May require support materials to ensure
learning
Video
•
•
•
•
Great for large groups
Provides for safe observation
Can include real life situations
Can develop critical thinking
•
•
•
Technology requirements
Difficult to adapt
Need discussion & practice opportunities
Job Aids
•
•
•
•
Provides for rapid instruction
Inexpensive
Can use with any size group
Provides opportunities for self-assessment
•
•
Good as a support tool
Need practice opportunities to ensure transfer
Develop and Select Instructional Materials
Helpful Tips
• Consider both the work and the learner when designing a job aid. What steps need to be
taken to complete the task? What is the experience level of learner?
• Avoid confusing the learner. Include only the steps necessary to complete the task. Use
words that the learner can easily understand. Don't use industry jargon or long, obscure
words.
• Include job aids that learners can keep for reference.
Design and Conduct Formative Evaluation
What Happens:
When designing and conducting Formative Evaluations, designers gather data to revise and
improve instruction as well as materials created for the course. The Formative Evaluation will
attempt to answer the following questions:
SIL International (1999) identifies seven questions to ask during a Formative Evaluation:
• Did you identify training needs correctly?
• Have you noticed other areas which need attention?
• Are there indications that the training objectives will be met?
• Do you need to revise the objectives?
• Are you fully covering training topics?
• Do you need to include additional training topics?
• Are the training methods appropriate or do you need to adjust them” (“How To Do
Formative Training Evaluation?,” 1999)
Design and Conduct Formative Evaluation
Try Using:
The Six Stages of Formative Evaluations:
Dick and Carey (1996) lists six stages of Formative Evaluations:
1. Design Review – Determine if the instructional design matches the analysis
2. Expert Review – Ensure the content is accurate
3. One-to-One – Determine course impact on ARCS factors
4. Small Group – Verify feedback from one-on-one evaluations. Look for additional issues
5. Field Trials – Verify feedback from small group evaluations. Look for context-related issues
6. Ongoing Evaluation – Continually evaluate training to ensure it continues to be relevant
Design and Conduct Formative Evaluation
Helpful Tips
• Make sure to conduct a formative evaluation with a test group before official roll-out of
training.
• Don't rely on a simple "smile sheet". Although you hope learners will enjoy the instruction,
your focus must be on whether or not they have learned the desired skills and can apply
them to their jobs. Happy learners are not the same as better performers.
Design and Conduct Summative Evaluation
What Happens:
When designing and conducting Summative Evaluations, designers study the effectiveness of
the instruction as a whole. It begins after Formative Evaluations are complete, and the
instruction has been implemented. Summative Evaluations provide information about how
much the instruction has improved performance, and how the instruction has affected
workplace performance.
Design and Conduct Summative Evaluation
Try Using:
Donald Kirkpatrick’s (2009) 4 Levels of Training Evaluation:
Level
Level
Level
Level
1:
2:
3:
4:
Learner Reaction – Were the learners satisfied with the training?
Performance Evaluation – Did they gain the intended KSABs?
Behavior Evaluation – Are they applying their newly acquired KSABs to their job?
Effect on the Organization – To what degree did the training achieve the desired
impact on the organization?
Design and Conduct Summative Evaluation
Helpful Tips
• Look at all four levels of Kirkpatrick.
• Levels 3 and 4 (Behavior and Results) are important indicators of the training's value to
the organization. Do not limit yourself to only the first two levels (Reaction and Learning).
Revise Instruction
What Happens:
When revising instruction, designers examine the results of formative evaluations and adjust
the instruction intervention accordingly. You may have to revise your goals, objectives, or
analyses as well as materials and methods.
Revisions might include the following:
• Modify the instructional objective to focus it more clearly on the organizational goal
• Adjust assumptions about learners' prior knowledge of similar subjects
• Increase or decrease the speed at which new information is delivered
• Replace or delete less effective learning activities
Revise Instruction
Helpful Tips
• Ask yourself the following 3 questions:
1. What is my instructional strategy?
2. What is my budget?
3. What resources do I already have available?
• After editing one phase, consider its effect on all other phases.
Sources
References
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy
of educational objectives: The classification of educational goals (Handbook I: Cognitive
domain). New York, NY: David McKay Company, Inc.
Dick, W. & Cary, L. (1996). The systematic design of instruction. New York, NY: HarperCollins
Publishers.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.).
Fort Worth, TX: Harcourt Brace Jovanovich College Publishers.
How to do formative training evaluation. (1999). Retrieved from
http://www.silinternational.org/lingualinks/literacy/ImplementALiteracyProgram/HowToD
oFormativeTrainingEvalua.htm
ID final project: Lesson 3 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson3.htm
ID final Project: Lesson 7 – instructional analysis pt. 1. (2003). Retrieved from
http://www.itma.vt.edu/modules/spring03/instrdes/lesson7.htm
Kirkpatrick, D. (2009). The Kirkpatrick philosophy. Retrieved from
http://www.kirkpatrickpartners.com/OurPhilosophy/tabid/66/Default.aspx
Sources
References
Kirkpatrick, J. & Kirkpatrick, W. K. (2009). The Kirkpatrick four levels: A fresh look after 50
years, 1959 - 2009. Retrieved from
http://www.kirkpatrickpartners.com/Resources/tabid/56/Default.aspx.
Mager, R.F. (1997). Goal analysis: How to clarify your goals so you can actually achieve them
(3rd ed.). Atlanta, GA: CEP.
Nicholson, M. (2004). Learner and context analysis ppt. Retrieved from
http://iit.bloomu.edu/Id/LearnersContext/LearnerAnalysis.htm
Rossett, A. (1987). Training needs assessment. Englewood Cliffs, NJ: Educational Technology
Publications.
Strategies for developing instructional materials for the interpersonal domain. (2010).
Retrieved from
http://en.wikiversity.org/wiki/Strategies_for_Developing_Instructional_Materials_for_the_
Interpersonal_Domain
Unit 2: Job task analysis. (n.d.). Retrieved from http://www.ewrite.biz/files/seminar_manual_sample.pdf
Williams, B. (n.d.). Designing and conducting formative evaluation. Retrieved from
https://www.courses.psu.edu/trdev/trdev518_bow100/D_C10present/