Writing Good Multiple Choice Test Questions
by Cynthia J. Brame, CFT Assistant Director
Cite this guide: Brame, C. (2013). Writing good multiple choice test questions. Retrieved [todaysdate]
from https://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/.
Multiple choice test questions, also known as items, can be an effective and efficient
way to assess learning outcomes. Multiple choice test items have several potential
advantages:
Versatility: Multiple choice test items can be written to assess various levels of learning outcomes, from basic recall to application, analysis, and evaluation.
A multiple choice item consists of a problem, known as the stem, and a list of suggested
solutions, known as alternatives. The alternatives consist of one correct or best
alternative, which is the answer, and incorrect or inferior alternatives, known as
distractors.
2. The stem should not contain irrelevant material, which can decrease the reliability
and the validity of the test scores (Haladyna and Downing 1989).
3. The stem should be negatively stated only when significant learning outcomes
require it. Students often have difficulty understanding items with negative phrasing
(Rodriguez 1997). If a significant learning outcome requires negative phrasing, such as
identification of dangerous laboratory or clinical practices, the negative element should
be emphasized with italics or capitalization.
4. The stem should be a question or a partial sentence. A question stem is
preferable because it allows the student to focus on answering the question rather than
holding the partial sentence in working memory and sequentially completing it with each
alternative (Statman 1988). The cognitive load is increased when the stem is
constructed with an initial or interior blank, so this construction should be avoided.
6. The alternatives “all of the above” and “none of the above” should not be
used. When “all of the above” is used as an answer, test-takers who can identify more
than one alternative as correct can select the correct answer even if unsure about other
alternative(s). When “none of the above” is used as an alternative, test-takers who can
eliminate a single option can thereby eliminate a second option. In either case, students
can use partial knowledge to arrive at a correct answer.
8. The number of alternatives can vary among items as long as all alternatives are
plausible. Plausible alternatives serve as functional distractors, which are those chosen
by students who have not achieved the objective but ignored by students who have
achieved the objective. There is little difference in difficulty, discrimination, and test
score reliability among items containing two, three, and four distractors.
Additional Guidelines
1. Avoid complex multiple choice items, in which some or all of the alternatives
consist of different combinations of options. As with “all of the above” answers, a
sophisticated test-taker can use partial knowledge to achieve a correct answer.
Considerations for Writing Multiple Choice Items that Test Higher-Order Thinking
When writing multiple choice items to test higher-order thinking, design questions that
focus on higher levels of cognition as defined by Bloom’s taxonomy. A stem that
presents a problem that requires application of course principles, analysis of a problem,
or evaluation of alternatives is focused on higher-order thinking and thus tests students’
ability to do such thinking. In constructing multiple choice items to test higher-order
thinking, it can also be helpful to design problems that require multilogical thinking,
defined as “thinking that requires knowledge of more than one fact to logically and
systematically apply concepts to a … problem” (Morrison and Free 2001, p. 20).
Finally, designing alternatives that require a high level of
discrimination can also contribute to multiple choice items that test higher-order
thinking.
Additional Resources
Burton, Steven J., Sudweeks, Richard R., Merrill, Paul F., and Wood, Bud. How
to Prepare Better Multiple Choice Test Items: Guidelines for University Faculty,
1991.
Cheung, Derek and Bucat, Robert. How can we construct good multiple-choice
items? Presented at the Science and Technology Education Conference, Hong
Kong, June 20-21, 2002.
Haladyna, Thomas M. Developing and validating multiple-choice test items,
2nd edition. Lawrence Erlbaum Associates, 1999.
Haladyna, Thomas M., and Downing, S. M. Validity of a taxonomy of multiple-choice
item-writing rules. Applied Measurement in Education, 2(1), 51-78, 1989.
Morrison, Susan and Free, Kathleen. Writing multiple-choice test items that
promote and measure critical thinking. Journal of Nursing Education 40: 17-24,
2001.