Type of Article:  Original Research

Volume 6; Issue 2.2 (May 2018)

Page No.: 5250-5255

DOI: https://dx.doi.org/10.16965/ijar.2018.171

A STUDY OF TEST CONSTRUCT AND GENDER DIFFERENCES IN MEDICAL STUDENT ANATOMY EXAMINATION PERFORMANCE

Cheryl Melovitz-Vasan 1, Matthew Gentile 2, Susan Huff 3, and Nagaswami Vasan *4.

1 Assistant Professor, Department of Biomedical Sciences, Cooper Medical School of Rowan University, Camden, New Jersey, USA.

2 Director of Assessment, Office of Medical Education, Cooper Medical School of Rowan University, Camden, NJ, USA.

3 Assessment Coordinator, Office of Medical Education, Cooper Medical School of Rowan University, Camden, NJ, USA.

*4 Professor of Anatomy, Department of Biomedical Sciences, Cooper Medical School of Rowan University, Camden, NJ, USA.

Address for correspondence: Prof. Dr. Nagaswami Vasan, Professor, Department of Biomedical Sciences, Cooper Medical School of Rowan University, 401 South Broadway, Camden, NJ 08103 USA. Phone: +1 (856) 361-2890 E-Mail: vasan@rowan.edu

ABSTRACT:

Background: A number of studies have compared the cognitive abilities of male and female students from elementary school through high school using various test constructs, and have reported gender-related differences associated with the mode of assessment. The purpose of the present study was to examine whether male and female medical students perform differently. We employed two types of test constructs, viz., constructed-response (CR) and selected-response (SR) examinations that use the same question stem.

Materials and Methods: Two types of test questions, CR and SR, were studied. Each CR and SR question used the same question stem to ensure all questions were matched. Study participants were incoming first-year medical students enrolled in a six-week summer enrichment anatomy course prior to the start of the school year. Group 1 included 16 students (8 male and 8 female) and Group 2 included 19 students (7 male and 12 female). The course focused on the thorax and abdomen, and student performances were analyzed.

Results and Discussion: Mean scores and statistical analysis showed comparable performance between male and female students. The independent-samples t-test showed no statistically significant differences between male and female students on the CR or SR examinations, except on the Group 1 abdomen SR examination [t(14) = 1.934, p < .05, one-tailed]. Collectively, these results show that male and female students in both groups performed better on the SR test than on the CR test.
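As a minimal sketch of the analysis described above, the pooled-variance independent-samples t statistic reported for Group 1 (df = 14, i.e., 8 + 8 − 2) can be computed as follows. The scores below are hypothetical placeholders for illustration only, not the study's data:

```python
import math

def independent_t(a, b):
    """Pooled-variance independent-samples t statistic and degrees of freedom."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # Sample variances (Bessel-corrected)
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    # Pooled variance across the two groups
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical exam scores for 8 male and 8 female students (matching Group 1 sizes)
male = [78, 82, 85, 74, 90, 88, 76, 81]
female = [75, 80, 79, 72, 84, 83, 77, 70]
t, df = independent_t(male, female)
print(f"t({df}) = {t:.3f}")
```

The resulting t value would then be compared against the one-tailed critical value for 14 degrees of freedom (1.761 at alpha = .05) to judge significance, as in the Group 1 abdomen SR comparison.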

Conclusions: Gender poses no limitation on medical student performance, irrespective of examination format. CR examinations could be adopted as a formative evaluation tool that helps students identify their deficits and strategize effective learning.

Key words: Selected-response question examination, Constructed-response examination, Gender performance in examination format, Medical education.

Cite this article: Cheryl Melovitz-Vasan, Matthew Gentile, Susan Huff, Nagaswami Vasan. A STUDY OF TEST CONSTRUCT AND GENDER DIFFERENCES IN MEDICAL STUDENT ANATOMY EXAMINATION PERFORMANCE. Int J Anat Res 2018;6(2.2):5250-5255. DOI: 10.16965/ijar.2018.171