Reducing the Number of Distractors in Multiple-Choice Questions: A Randomized Study in Undergraduate Medical Education

Open Access | Apr 2026

Abstract

Background: Multiple-choice questions (MCQs) are widely used in medical education due to their efficiency and scalability. However, the quality of distractors plays a critical role in item validity. Evidence suggests that reducing distractors may enhance psychometric performance without compromising assessment quality.

Methods: We conducted a randomized study involving 198 MCQs used in second-year medical exams at the University of Navarra. Each exam included 50 items with two distractors and 50 with three distractors. Items were analyzed for difficulty index, point-biserial coefficient, low-functioning distractors, item-writing flaws, and overall quality (difficulty index 0.45–0.75 and point-biserial > 0.20). In addition, items were classified as either factual recall or clinical case-based to assess the effect of question type. Univariate and multivariate analyses were performed.

Results: No significant differences were found between two- and three-distractor items in difficulty, discrimination, or overall quality. However, items with two distractors presented fewer item-writing flaws (60.2% versus 46%; p = 0.045) and more often had all of their distractors functioning (38.8% versus 18%; p = 0.01). Factual-recall questions were easier and more likely to contain poorly functioning distractors, whereas clinical-case items were more complex and more discriminative. Multivariate analyses confirmed that distractor quality, not quantity, was the main factor associated with item performance.

Conclusions: Multiple-choice questions with two well-constructed distractors are a valid and efficient alternative to traditional formats. Reducing the number of distractors improves distractor quality and reduces item-writing flaws without compromising psychometric standards. These findings support a shift toward more streamlined item formats in medical education assessment.

DOI: https://doi.org/10.5334/pme.2583 | Journal eISSN: 2212-277X
Language: English
Submitted on: Mar 19, 2026
Accepted on: Mar 26, 2026
Published on: Apr 22, 2026
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2026 Patricia Sunsundegui, Marcos Llorente-Ortega, Felipe Lucena, Nerea Fernández-Ros, Maite Solas, Ana Belén Alcaide, Manuel F. Landecho, Jorge Quiroga, Mercedes Iñarrairaegui, José Ignacio Herrero, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.