International Journal of Anatomy and Research

Type of Article: Original Research

Year: 2015 | Volume 3 | Issue 4 | Page No. 1607-1610

Date of Publication: 30-11-2015



Pradip Chauhan *1, Girish Rameshbhai Chauhan 2, Bhoomika Rameshbhai Chauhan 3, Jayesh Vrajlal Vaza 4, Suresh Punjabhai Rathod 5.

*1 Department of Anatomy, Pandit Dindayal Upadhyay Government Medical College, Gujarat, India.

2 Assistant Professor, Department of Oral Pathology, Government Dental College, Gujarat, India.

3 Assistant Professor, Department of Gynaecology, GMERS Medical College, Sola, Gujarat, India.

4 Assistant Professor, Department of Orthopaedics, L.G. Medical College, Gujarat, India.

5 Professor and Head, Department of Anatomy, P.D.U. Medical College, Gujarat, India.

Address: Dr. Pradip Rameshbhai Chauhan, Department of Anatomy, Pandit Dindayal Upadhyay Government Medical College, Rajkot, Gujarat, India. Mob. No.: +918866199560


Background: Single best-answer multiple-choice questions (MCQs) consist of a question (the stem), two or more incorrect choices from which examinees must distinguish the correct option (the distracters), and one correct or best response (the key). Item analysis is the process of collecting, summarizing and using information from students’ responses to assess the quality of test items. Classical test theory is the most widely followed method of item analysis; it determines reliability by calculating the Difficulty Index (P score), the Discriminating Index (D score) and Distracter Effectiveness.
Aim: This study aimed to calculate the P score and distracter effectiveness, and to find out the relationship between the P score and distracter effectiveness.
Material and methods: In this cross-sectional study, 65 items answered by 120 first-year M.B.B.S. students were studied for item analysis. The Difficulty Index and Distracter Effectiveness were calculated for each item. Distracters were identified and classified as functioning or non-functioning. The interrelationship between the P score and Distracter Effectiveness was calculated and analyzed with Epi Info 7 software.
Result: We found that items with two functioning distracters were more difficult than the others, followed by items with three functioning distracters.
Conclusion: Distracters affect the item Difficulty Index and thereby also affect the quality of the assessment.
KEY WORDS: Item Analysis, Difficulty Index, Distracter Effectiveness, Multiple Choice Question.
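The two indices named above can be computed directly from raw item responses. The sketch below is a minimal illustration, not the authors' own procedure: it assumes the conventional cut-off that a distracter chosen by fewer than 5% of examinees counts as non-functioning, and the function and variable names are illustrative.

```python
from collections import Counter

def item_analysis(responses, key, options="ABCD", threshold=0.05):
    """Analyse one single best-answer MCQ item.

    responses: list of the option each examinee chose
    key:       the correct option
    threshold: assumed cut-off below which a distracter is
               treated as non-functioning (conventionally 5%)
    """
    n = len(responses)
    counts = Counter(responses)

    # Difficulty Index (P score): proportion of examinees answering correctly
    p_score = counts[key] / n

    # Functioning distracters: incorrect options chosen by >= threshold
    distracters = [opt for opt in options if opt != key]
    functioning = [d for d in distracters if counts[d] / n >= threshold]

    # Distracter Effectiveness: share of distracters that function
    effectiveness = len(functioning) / len(distracters)
    return p_score, functioning, effectiveness

# Example: 10 examinees, key is "B"; 7 correct, each distracter chosen once
p, fn, eff = item_analysis(list("BBABBCBBDB"), "B")
# p = 0.7; all three distracters function, so eff = 1.0
```

With real data the same function would be applied to each of the 65 items, after which items can be grouped by their number of functioning distracters and their mean P scores compared.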




Pradip Chauhan, Girish Rameshbhai Chauhan, Bhoomika Rameshbhai Chauhan, Jayesh Vrajlal Vaza, Suresh Punjabhai Rathod. RELATIONSHIP BETWEEN DIFFICULTY INDEX AND DISTRACTER EFFECTIVENESS IN SINGLE BEST-ANSWER STEM TYPE MULTIPLE CHOICE QUESTIONS. Int J Anat Res 2015;3(4):1607-1610. DOI: 10.16965/ijar.2015.299



