Type of Article:  Original Research

Volume 6; Issue 2.1 (April 2018)

Page No.: 5156-5162

DOI: https://dx.doi.org/10.16965/ijar.2018.143


Padamjeet Panchal 1,  Bheem Prasad *2, Sarita Kumari 3.

1,*2 Assistant Professor, Department of Anatomy, All India Institute of Medical Sciences, Phulwari Sharif, Patna, Bihar, India.

3 Senior Resident, Department of Anatomy, All India Institute of Medical Sciences, Phulwari Sharif, Patna, Bihar, India.

Corresponding Author: Dr. Bheem Prasad, Assistant Professor, Department of Anatomy, All India Institute of Medical Sciences, Phulwari Sharif, Patna, Bihar-801507, India. E-Mail: drbheemp@aiimspatna.org


Background: Accurate, reliable and timely assessment of students is an essential domain of teaching in medical professional courses. Multiple Choice Questions (MCQs) are a time-tested method for rapid assessment of undergraduate students. Although they evaluate a student's cognitive knowledge, they do not evaluate professional skills. It is also said that MCQs emphasize recall of factual information rather than conceptual understanding and interpretation of concepts.

Objectives: The main objective of the study was to evaluate the test items by item analysis and to reliably select those suitable for incorporation into a future question bank.

Materials and Methods: This study was done in the Department of Anatomy, AIIMS, Patna. A total of 396 first-year MBBS students from different batches took an MCQ test comprising 60 questions in two sessions. During evaluation of the MCQs, each correct response was awarded one mark and no mark was awarded for an incorrect response. Each item was analysed for difficulty index, discrimination index and distractor effectiveness.
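The three indices named above have standard definitions in classical item analysis. The following is a minimal sketch, not the authors' actual computation: it assumes the conventional formulas in which the difficulty (facility) index is the percentage of correct responses, the discrimination index compares the top and bottom 27% of examinees ranked by total score, and a distractor is counted as functional when at least 5% of examinees choose it. All function names and data are illustrative.

```python
def difficulty_index(correct_flags):
    """Difficulty (facility) index: percentage of examinees
    who answered the item correctly."""
    return 100.0 * sum(correct_flags) / len(correct_flags)

def discrimination_index(total_scores, correct_flags, fraction=0.27):
    """Discrimination index: (H - L) / n, where H and L are the numbers
    of correct responses to this item in the top and bottom `fraction`
    groups ranked by total test score, and n is the group size."""
    n = max(1, round(fraction * len(total_scores)))
    # Rank examinees by total score, highest first.
    order = sorted(range(len(total_scores)),
                   key=lambda i: total_scores[i], reverse=True)
    high = sum(correct_flags[i] for i in order[:n])
    low = sum(correct_flags[i] for i in order[-n:])
    return (high - low) / n

def distractor_effectiveness(choices, key, options="ABCD", threshold=0.05):
    """Percentage of distractors (incorrect options) that are functional,
    i.e. chosen by at least `threshold` of examinees."""
    total = len(choices)
    distractors = [o for o in options if o != key]
    functional = sum(1 for o in distractors
                     if choices.count(o) / total >= threshold)
    return 100.0 * functional / len(distractors)
```

For example, an item answered correctly by half of four examinees has a difficulty index of 50; if only the top scorer answers it correctly, its discrimination index is 1.0.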

Results: The overall means of facility value, discrimination index, distractor effectiveness and correlation coefficient were 66.09 (±21.55), 0.26 (±0.16), 18.84 (±10.45) and 0.55 (±0.22) respectively.

Conclusion: MCQs should be framed according to Bloom's taxonomy to assess the cognitive, affective and psychomotor domains of students. MCQs with poor or negative discrimination should be reframed and then re-analysed.

Key words: Item Analysis, Multiple Choice Question, Difficulty Index, Discrimination Index, Distractor Effectiveness, Correlation Coefficients.



Cite this article: Padamjeet Panchal, Bheem Prasad, Sarita Kumari. MULTIPLE CHOICE QUESTIONS – ROLE IN ASSESSMENT OF COMPETENCY OF KNOWLEDGE IN ANATOMY. Int J Anat Res 2018;6(2.1):5156-5162. DOI: 10.16965/ijar.2018.143