Mediterranean Journal of Pharmacy and Pharmaceutical Sciences
https://app.periodikos.com.br/journal/medjpps/article/doi/10.5281/zenodo.7869133

Short communication

Evaluation of multiple-choice and short essay questions in pharmacology education

Ali I. Banigesh, Abdulsalam O. Elfowiris, Abdulsalam Sughir

Abstract

Multiple-choice questions (MCQs) and short essay questions (SEQs) are common methods for assessing medical students in pharmacology courses. Poorly constructed test items (questions) are a widespread problem and result in failure to assess the intended learning objectives; flawed test items have been reported to account for 36.0% to 65.0% of items in medical education assessment tools. The objective of this study was therefore to evaluate MCQs by identifying item-writing flaws (IWFs) and to evaluate SEQs by determining the cognitive level of each item. Four pharmacology tests administered to third-year pharmacy students at the Department of Pharmacology, Faculty of Pharmacy, Omar Al-Mukhtar University, Bayda, Libya, were evaluated for IWFs and for the cognitive level of each item. Based on Buckwalter's modification of the cognitive levels of Bloom's taxonomy, 30.0% of the SEQs assessed recall of information, 26.0% assessed understanding and interpretation of data, and 43.0% assessed the application of knowledge to solve a particular problem. Among the MCQs, 94.6% of the questions assessed understanding and interpretation of data, and more than 40.0% of the questions contained IWFs. The most common writing flaws were a negative stem (47.4%), an unfocused item (16.0%), options that were non-homogeneous in grammar and content (10.0%), use of 'all of the above' (10.0%) and clang association (5.0%). The SEQs were of excellent quality because they were equally distributed among the three cognitive levels (levels I, II and III), whereas the most common IWFs of the MCQs were a negative stem (47.0%) and a stem that did not state the central idea clearly and concisely (16.0%). This study concludes that the SEQs, but not the MCQs, were valid measures of the learning objectives in pharmacology courses in Libya.
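
For readers who wish to reproduce this kind of tally, the computation behind such category percentages can be sketched as follows in Python. This is illustrative only; all item labels and counts in the sketch are hypothetical and are not the study's data.

    # Illustrative sketch (hypothetical data): tallying item-writing flaw (IWF)
    # categories and reporting each as a percentage, as in "negative stem (47.4%)".
    from collections import Counter

    # Hypothetical dominant IWF assigned to each of 20 MCQ items (None = no flaw).
    iwf_labels = [
        "negative stem", "negative stem", "unfocused item", None, None,
        "negative stem", "all of the above", "clang association", None, None,
        "non-homogeneous options", None, "negative stem", None, None,
        "unfocused item", None, None, "negative stem", None,
    ]

    total = len(iwf_labels)
    flawed = [label for label in iwf_labels if label is not None]
    print(f"Flawed items: {len(flawed)}/{total} ({100 * len(flawed) / total:.1f}%)")

    # Share of each flaw type among the flawed items.
    for flaw, count in Counter(flawed).most_common():
        print(f"{flaw}: {100 * count / len(flawed):.1f}%")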

Keywords

Cognitive level, IWFs, Libya, MCQs, questions, SEQs

References

  1. Downing SM (2003) Validity: On the meaningful interpretation of assessment data. Medical Education. 37 (9): 830-837. doi.org/10.1046/j.1365-2923.2003.01594.x.
  2. Buckwalter JA, Schumacher R, Albright JP, Cooper RR (1981) Use of an educational taxonomy for evaluation of cognitive performance. Journal of Medical Education. 56 (2): 115-121. doi.org/10.1097/00001888-198102000-00006.
  3. Kozikoğlu İ (2018) The examination of alignment between national assessment and English curriculum objectives using revised Bloom’s taxonomy. Educational Research Quarterly. 41 (4): 50-77.
  4. Haladyna TM, Downing SM (1989) A taxonomy of multiple-choice item-writing rules. Applied Measurement in Education. 2 (1): 37-50. doi.org/10.1207/s15324818ame0201_3.
  5. Downing SM (2005) The effects of violating standard item writing principles on tests and students: The consequences of using flawed test items on achievement examinations in medical education. Advances in Health Sciences Education. 10 (2): 133-143. doi.org/10.1007/s10459-004-4019-5.
  6. Baig M, Ali SK, Ali S, Huda N (2014) Quality evaluation of assessment tools: OSPE, SEQ & MCQ. Pakistan Journal of Medical Sciences. 30 (1): 3-6. doi.org/10.12669/pjms.301.4458.
  7. Betts SC (2008) Teaching and assessing basic concepts to advanced applications: Using Bloom's taxonomy to inform graduate course design. Academy of Educational Leadership Journal. 12 (3): 99-107.
  8. Boland RJ, Lester NA, Williams E (2010) Writing multiple-choice questions. Academic Psychiatry. 34 (4): 310-316. doi.org/10.1176/appi.ap.34.4.310.
  9. Rodriguez MC (2005) Three options are optimal for multiple-choice items: a meta-analysis of 80 years of research. Educational Measurement: Issues and Practice. 24 (2): 3-13. doi.org/10.1111/j.1745-3992.2005.00006.x.
  10. Nedeau-Cayo R, Laughlin D, Rus L, Hall J (2013) Assessment of item-writing flaws in multiple-choice questions. Journal for Nurses in Professional Development. 29 (2): 52-57. doi.org/10.1097/NND.0b013e318286c2f1.
  11. Palmer EJ, Duggan P, Devitt PG, Russell R (2010) The modified essay question: Its exit from the exit examination. Medical Teacher. 32 (7). doi.org/10.3109/0142159X.2010.488705.

Submitted date:
03/13/2023

Reviewed date:
04/14/2023

Accepted date:
04/22/2023

Publication date:
07/20/2023
