Building Good Multiple Choice Questions (MCQs)

This blog post considers how multiple choice questions can be used to assess learning outcomes effectively and to provide useful, actionable feedback to the students who sit the assessment.

To consider what a good MCQ looks like, we need to think about each element of the question. Every MCQ has three elements: the stem (the information containing the question itself), the key (the correct answer) and the distractors (the incorrect answers).
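
To make the structure concrete, here is a minimal sketch of these three elements as a Python data structure. The `MCQ` class and its field names are illustrative assumptions for this post, not part of any standard or question bank format.

```python
# A minimal, illustrative model of an MCQ's three elements.
# The class and field names are assumptions for this sketch only.
from dataclasses import dataclass, field


@dataclass
class MCQ:
    stem: str  # the information containing the question itself
    key: str   # the single correct answer
    distractors: list[str] = field(default_factory=list)  # the incorrect answers

    @property
    def options(self) -> list[str]:
        """All answer options shown to the learner; shuffling is left to the test setter."""
        return [self.key] + self.distractors
```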

The Stem

The stem is (not surprisingly) fundamental to writing a good MCQ. There are some general principles that should be followed in order to avoid confusion and make the question as clear as possible.

  1. Put as much information in the stem as possible rather than including it in the key and the distractors.
  2. Make the stem as clear and definite as possible, leaving no room for ambiguity in its interpretation.
  3. Avoid unnecessary or irrelevant material.
  4. Try to avoid negatives; where one is unavoidable, embolden or capitalise it (a simple check is sketched after this list).
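
As a small illustration of principle 4, the hypothetical check below flags negatives that appear in lowercase in a stem. Plain text cannot carry bold, so the sketch simply treats a fully capitalised negative such as 'NOT' as already emphasised; the word list is an assumption to be adapted to your subject.

```python
# Hypothetical check for principle 4: flag unemphasised (lowercase) negatives.
NEGATIVES = {"not", "no", "never", "except", "least"}


def unstressed_negatives(stem: str) -> list[str]:
    """Return negatives written in lowercase; capitalised ones (e.g. 'NOT') pass."""
    return [word for word in stem.split() if word.strip(".,;:?!") in NEGATIVES]


print(unstressed_negatives("Which of these is not a mammal?"))   # ['not']
print(unstressed_negatives("Which of these is NOT a mammal?"))   # []
```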

The Distractors

The distractors should be chosen carefully. Well planned MCQs will often use common errors as distractors, or statements that are factually correct but do not answer the question that has been posed.

Other considerations when setting distractors include:

  1. Make sure that the alternatives provided are plausible and attractive (the use of common errors and correct-but-irrelevant statements will help here).
  2. Ensure that no distractor includes information that may give a clue to the correct answer.

The Key

The main consideration with the key is to ensure that there is exactly one correct response and that none of the distractors provided could be considered correct. The question should therefore be objective in nature, so that there is one clear and unambiguous answer. It is also inadvisable to use ‘All of the above’ or ‘None of the above’ as options.
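
Building on the earlier `MCQ` sketch, the hypothetical checks below encode the two mechanical rules from this section: the key must not also appear among the distractors, and ‘All of the above’ / ‘None of the above’ should not appear as options. Whether an answer is genuinely unambiguous still requires human judgement.

```python
# Hypothetical checks for the key rules above, reusing the MCQ sketch.
BANNED_OPTIONS = {"all of the above", "none of the above"}


def key_problems(mcq: MCQ) -> list[str]:
    """Return rule violations; an empty list means the mechanical checks pass."""
    problems = []
    if mcq.key in mcq.distractors:
        problems.append("The key also appears as a distractor, so more than one option is correct.")
    for option in mcq.options:
        if option.strip().lower() in BANNED_OPTIONS:
            problems.append(f"Avoid the option {option!r}.")
    return problems
```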

Using MCQs for Summative Assessment

MCQs are relatively easy to set in a way that assesses the lower levels of cognitive ability (remembering, understanding); they are much more difficult to set in a way that tests higher-level cognitive ability (applying, analysing, evaluating, creating). If the learning outcomes on your module use primarily higher-level descriptors, as is likely in the higher education environment, it is harder to devise MCQs that test at this level. Any summative assessment set for students must assess those learning outcomes, so a great deal of effort will need to be spent devising MCQs that adequately test at the level of the learning outcome.

For example, rather than having one MCQ that tests whether a learner ‘knows’ a principle, you could create a short scenario with a series of MCQs testing the learner’s application of it: is the information in the scenario consistent with the principle? Would it still be consistent if some information were changed? How might it be changed to be consistent with another principle? In this way, the learner has to apply the principle in one context and then consider it from several points of view.
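
One way to organise such an item is to attach the series of MCQs to a single shared scenario. The sketch below reuses the `MCQ` class from earlier; the `ScenarioItem` name and the economics content are purely hypothetical examples.

```python
# A hypothetical container pairing one scenario with a series of linked MCQs.
from dataclasses import dataclass


@dataclass
class ScenarioItem:
    scenario: str         # the shared context every question refers back to
    questions: list[MCQ]  # each question probes the principle from a different angle


# Purely illustrative content: several views of the same principle.
item = ScenarioItem(
    scenario="A shop raises its price by 10% and unit sales fall by 25%.",
    questions=[
        MCQ(stem="Is the scenario consistent with demand being price-elastic?",
            key="Yes: the percentage fall in sales exceeds the percentage price rise.",
            distractors=["No: sales fell, so demand must be inelastic.",
                         "No: the price rose by less than 25%."]),
        MCQ(stem="Would the scenario still be consistent if sales fell by only 5%?",
            key="No: demand would then be price-inelastic.",
            distractors=["Yes: any fall in sales indicates elastic demand."]),
    ],
)
```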

Morrison and Free (2001) suggested that verbs which, at first glance, could not be used in MCQs, such as ‘apply’ or ‘compare’, can be accommodated by replacing each with its noun derivative. This creates stems such as ‘which of the options provides the best application of….‘, which allows higher-level verbs to be tested, although it is limited by the fact that the learner does not have to think through and suggest the best application themselves.

Burns (2010) suggested the use of a sequence of questions, each requiring the answer to the previous one: ‘multiple neuron’ questions that require several areas of knowledge to be combined in order to answer all aspects of the question. This technique relies on the learner understanding the interconnections between those areas of knowledge.

The suggestions above should help you to consider whether MCQs are appropriate for assessing at the level required by the learning outcomes on your module and, if so, how best to create good ones.

References

Burns, E.R. (2010). “Anatomizing” reversed: Use of examination questions that foster use of higher order learning skills by students. Anatomical Sciences Education, 3(6), 330-334.

Morrison, S. and Free, K.W. (2001). Writing multiple-choice test items that promote and measure critical thinking. Journal of Nursing Education, 40(1), 17-24.