Best Practices for Writing Objective Test Items
College of Nursing
January 2011
Writing Objective Test Items
Presenter: Dr. James Coraggio, Director, Academic Effectiveness and Assessment
Contributor: Alisha Vitale, Collegewide Testing Coordinator
January 7, 2011, Academic Effectiveness and Assessment
Former Life…
Director of Test Development, SMT
Director of Measurement and Test Development, Pearson
Taught EDF 4430 Measurement for Teachers, USF
Purpose
This presentation will address the importance of establishing a test purpose and developing test specifications. It will explain how to create effective multiple-choice test questions and provide item-writing guidelines as well as best practices to prevent students from simply guessing the correct answers.
Agenda
Purpose of a Test
Prior to Item Writing
Advantages of Objective Tests
Types of Objective Tests
Writing Multiple Choice Items
The Test-wise Student
Test Instructions
Test Validity
Purpose of a Test
To clearly delineate between those who know the content and those who do not.
To determine whether the student knows the content, not whether the student is a good test taker. Likewise, confusing and tricky questions should be avoided to prevent incorrect responses from students who know (and understand) the material.
Prior to Writing Items
Establish the test purpose
Conduct the role delineation study/job analysis
Create the test specifications
Establish the Test Purpose
Initial Questions
How will the test scores be used?
Will the test be designed for minimum competency or content mastery?
Will the test be low-stakes, moderate-stakes, or high-stakes (consequences for examinees)?
Will the test address multiple levels of thinking ( higher order, lower order, or both )?
Will there be time constraints?
Establish the Test Purpose
Responses to those initial questions have implications for the overall length of the test, the average difficulty of the items, the conditions under which the test will be administered, and the type of score information to be provided. Take the time to establish a singular purpose that is clear and focused so that goals and priorities will be effectively met.
Conduct the Job Analysis
The primary purpose of a role delineation study or job analysis is to provide a strong linkage between the competencies necessary for successful performance on the job and the content on the test.
This work has already been conducted for the National Council Licensure Examination for Registered Nurses. [See Report of Findings from the 2008 RN Practice Analysis: Linking the NCLEX-RN® Examination to Practice, NCSBN, 2009]
Create Test Specifications
Test specifications are essentially the 'blueprint' used to create the test.
Test specifications operationalize the competencies that are being assessed.
The NCLEX-RN® Examination has established test specifications. [See 2010 NCLEX-RN® Detailed Test Plan, April 2010, Item Writer/Item Reviewer/Nurse Educator Version]
Create Test Specifications
Test specifications:
Support the validity of the examination
Provide standardized content across administrations
Allow for subscores that can provide diagnostic feedback to students and administrators
Inform the student (and the item writers) of the required content
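A blueprint is, at bottom, a table of content-area weights. As a minimal sketch of how a fixed-length test can be allocated across areas (the content areas and percentages below are hypothetical illustrations, not taken from the NCLEX-RN® test plan):

```python
# Hypothetical blueprint: content area -> target weight (weights sum to 1.0).
blueprint = {
    "Safe and Effective Care": 0.30,
    "Health Promotion": 0.20,
    "Psychosocial Integrity": 0.15,
    "Physiological Integrity": 0.35,
}

def items_per_area(blueprint, test_length):
    """Allocate whole items to each area using largest-remainder rounding."""
    raw = {area: w * test_length for area, w in blueprint.items()}
    counts = {area: int(n) for area, n in raw.items()}
    # Hand leftover items to the areas with the largest fractional remainders.
    leftover = test_length - sum(counts.values())
    for area in sorted(raw, key=lambda a: raw[a] - counts[a], reverse=True)[:leftover]:
        counts[area] += 1
    return counts

print(items_per_area(blueprint, 75))
# -> {'Safe and Effective Care': 23, 'Health Promotion': 15,
#     'Psychosocial Integrity': 11, 'Physiological Integrity': 26}
```

Largest-remainder rounding guarantees the per-area counts sum exactly to the intended test length, which simple rounding does not.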
Item Development
After developing the test specifications, item development can begin.
The focus of the remainder of this presentation will be on creating 'appropriate' objective items.
Objective Tests
Measure several types (and levels) of learning
Cover wide content in a short period of time
Offer variations for flexibility
Easy to administer, score, and analyze
Scored more reliably and quickly
What type of learning cannot be measured?
Types of Objective Tests
Written-response
Completion (fill-in-the-blank)
Short answer
Selected-response
Alternative response (two options)
Matching
Keyed (like matching)
Multiple choice
Written-response
Single questions/statements or clusters (stimuli)
Advantages: Measure several types of learning; minimize guessing; point out student misconceptions
Disadvantages: Time to score; misspelling and writing clarity; incomplete answers; more than one possible correct response (novel answers); subjectivity in grading
Completion
Example: A word that describes a person, place, or thing is a ________.
1. Remove only 'key' words
2. Place blanks at the end of the statement
3. Avoid multiple correct answers
4. Eliminate clues
5. Paraphrase statements
6. Use answer sheets to simplify scoring
Short Answer
Example: Briefly describe the term proper noun. ____________________________
Terminology: stimulus and response
1. Provide an appropriate blank (word(s) or sentence)
2. Specify the units (inches, dollars)
3. Ensure directions for clusters of items are appropriate for all items
Selected-response
Select from provided responses
Advantages: Measure several types of learning; measure the ability to make fine distinctions; cover a wide range of material; administered quickly; reliably scored; multiple scoring options (hand, computer, scanner)
Disadvantages: Allow guessing; distractors can be difficult to create; student misconceptions not revealed
Alternative Response
T F 1. A noun is a person, place, or thing.
T F 2. An adverb describes a noun.
1. Explain the judgments to be made
2. Ensure answer choices match
3. Explain how to answer
4. Only one idea to be judged
5. Use positive wording
6. Avoid trickiness, clues, and qualifiers
Matching Item
Column A: __ Person, place, or thing. __ Describes a person, place, or thing.
Column B: a. Adjective b. Noun
Terminology: premises and responses
1. Clear instructions
2. Homogeneous premises
3. Homogeneous responses (brief and ordered)
4. Avoid one-to-one matching
Keyed Response
Responses: a. A noun b. A pronoun c. An adjective d. An adverb
__ Person, place, or thing.
__ Describes a person, place, or thing.
Like matching items, but with more response options
MC Item Format
What is the part of speech that is used to name a person, place, or thing?
A) A noun*
B) A pronoun
C) An adjective
D) An adverb
MC Item Terminology
Stem: Sets the stage for the item; a question or incomplete thought; should contain all the information needed to select the correct response.
Options: The possible responses, consisting of one and only one correct answer.
Key: The correct response.
Distractor: A wrong response; plausible but not correct; attractive to an under-prepared student.
Competency
Items should test for the appropriate or adequate level of knowledge, skill, or ability (KSA) for the students.
Assessing lower-division students on graduate-level material is an 'unfair' expectation.
The competent student should do well on an assessment; items should not be written for only the top students in the class.
Clarity
Clear, precise items and instructions
Correct grammar, punctuation, and spelling
Address one single issue
Avoid extraneous material (teaching)
One correct or clearly best answer
Legible copies of the exam
Bias
Tests should be free from bias:
No stereotyping
No gender bias
No racial bias
No cultural bias
No religious bias
No political bias
Level of Difficulty
Ideally, a test should be aimed at a middle level of difficulty. This cannot always be achieved when the subject matter is based on specific expectations (e.g., a workforce area).
Level of Difficulty
To make an M/C item more difficult, make the stem more specific or narrow and the options more similar.
To make an M/C item less difficult, make the stem more general and the options more varied.
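Difficulty is typically verified after administration with the item difficulty index: the proportion of examinees who answered the item correctly. A minimal sketch (the function name and the sample response data are illustrative):

```python
def difficulty_index(responses, key):
    """Proportion of examinees who chose the keyed option (the p-value).
    p near 1.0 means an easy item, p near 0.0 a hard one;
    values around 0.5 correspond to middle difficulty."""
    return sum(1 for r in responses if r == key) / len(responses)

# Illustrative data: ten students' answers to one item keyed 'A'.
answers = ["A", "A", "C", "A", "B", "A", "A", "D", "A", "A"]
print(difficulty_index(answers, "A"))  # 7 of 10 correct -> 0.7
```

Items with very high or very low p-values provide little information about who knows the material, which is why the middle range is preferred when the subject matter allows it.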
Trivial and Trick Questions
Avoid trivia and trick questions.
Avoid humorous or ludicrous responses.
Items should be straightforward; they should cleanly delineate those who know the material from those who do not.
Make sure every item has value and contributes to the final score.
http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf
Test Taking Guidelines
When you don't know the answer
As with all exams, attempt the questions that are easiest for you first. Come back and do the hard ones later. Unless you will lose marks for an incorrect response, never leave a question blank. Make a calculated guess if you are sure you don’t know the answer. Here are some tips to help you guess ‘intelligently’.
Use a process of elimination
Try to narrow your choice as much as possible: which of the options is most likely to be incorrect? Ask: are options in the right range? Is the measurement unit correct? Does it sound reasonable?
Test Taking Guidelines
Eliminate options where the question and the answer do not combine to make a grammatically correct sentence. Also look for repetition of key words from the question in the responses; if words are repeated, the option is worth considering. e.g.:
a) The apparent distance hypothesis explains…
b) The distance between the two parallel lines appears…
Test Taking Guidelines
Be wary of absolutes: options containing words such as 'always', 'only', 'never', and 'must' tend to be incorrect more often. Similarly, options containing strong generalizations tend to be incorrect more often.
Favor look-alike options
If two of the alternatives are similar, give them your consideration. e.g.:
A. tourism consultants
B. tourists
C. tourism promoters
D. fairy penguins
Test Taking Guidelines
Favor more inclusive options
If in doubt, select the option that encompasses the others. e.g.:
A. an adaptive system
B. a closed system
C. an open system
D. a controlled and responsive system
E. an open and adaptive system
Please note: None of these strategies is foolproof and they do not apply equally to the different types of multiple choice questions, but they are worth considering when you would otherwise leave a blank.
Test-wise Students
Are familiar with item formats
Use informed and educated guessing
Avoid common mistakes
Have testing experience
Use time effectively
Apply various strategies to solve different problem types
Test-wise Students
Vary your keys; test-wise students will otherwise 'always pick option C.'
Avoid 'all of the above' and 'none of the above.'
Avoid extraneous information: it may assist in answering another item.
Avoid item 'bad pairs' or 'enemies.'
Avoid clueing with the same word in the stem and the key.
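Key variation can be audited mechanically: tally how often each option letter serves as the key across the test and look for a lopsided distribution. A small sketch (the answer key below is made up):

```python
from collections import Counter

def key_balance(answer_key, options="ABCD"):
    """Count how often each option letter serves as the key."""
    counts = Counter(answer_key)
    return {opt: counts.get(opt, 0) for opt in options}

# Hypothetical 12-item answer key; 'C' is clearly over-used.
answer_key = ["C", "A", "C", "C", "B", "C", "C", "D", "C", "C", "A", "C"]
print(key_balance(answer_key))  # {'A': 2, 'B': 1, 'C': 8, 'D': 1}
```

On a four-option test, any letter keying far more than a quarter of the items rewards the student who guesses that letter throughout.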
Test-wise Students
Make options similar in terms of length, grammar, and sentence structure; different options stand out.
Avoid 'clues.'
Item Format Considerations
Put the needed information in the stem
Avoid negatively stated stems and qualifiers; highlight qualifiers if used
Avoid irrelevant symbols ('&') and jargon
Use a standard, set number of options (preferably four)
Ideally, tie each item to a reference (and rationale)
Test Directions
Highlight the directions.
1. State the skill measured.
2. Describe any resource materials required.
3. Describe how students are to respond.
4. Describe any special conditions.
5. State time limits, if any.
Ensure Test Validity
Congruence between items and course objectives
Congruence between items and student characteristics
Clarity of items
Accuracy of the measures
Item formatting criteria
Feasibility: time, resources
Questions