Evaluation of student assessment practices in a medical college

Authors

  • Omer A. Elfaki Department of Medical Education, College of Medicine, King Khalid University, KSA
  • Abdulaziz A. Alamri Department of Medical Education, College of Medicine, King Khalid University, KSA

DOI:

https://doi.org/10.18203/2320-6012.ijrms20174968

Keywords:

Assessment, Course, Practices

Abstract

Background: The importance of students’ assessment and its role in driving student learning are well recognized, and guidelines for good assessment practice have been developed. The General Medical Council (GMC) issued important recommendations on the assessment of students’ performance to be followed by medical schools in the UK. The Liaison Committee on Medical Education (LCME) developed standards emphasizing the importance of documenting students’ performance. The utility concept of an assessment tool has been proposed by Van der Vleuten, stating a number of weighted criteria. Assessment of clinical competence was proposed to be well covered by Miller’s model. No single method of assessment can be recommended as appropriate for all assessment purposes and all domains of competence; therefore, multiple methods of assessment are required.

Methods: The MBBS program in the College of Medicine, KKU, comprises 35 courses taught over five years, in addition to a preparatory year and an internship year. The curriculum can still be described as discipline based. A cross-sectional descriptive study was planned to examine the current assessment situation. Data were collected through an online questionnaire and a review of course documents. The responses were analyzed using descriptive statistics to determine frequencies, averages, and percentages. The study was conducted during the period January-May 2014.

Results: Twenty course coordinators responded to the survey (57%). Eleven of the courses covered were basic and nine were clinical. Multiple tests, as well as multiple methods of continuous assessment, were used in the courses studied. Some of the methods used for summative assessment are no longer recommended in current medical education assessment practice. A real OSCE was used in only one clinical course. Standard-setting methods were not used; a fixed pass mark was applied instead.

Conclusions: Important shortcomings in the student assessment system were identified in many of the courses studied. Less educationally desirable assessment methods and practices, such as the unattended single long case examination, are still used in some courses. More attention should be given to the technical aspects of assessment.

References

World Federation for Medical Education Executive Council. International standards in medical education: assessment and accreditation of medical schools’ educational programmes. A WFME position paper. Med Educ. 1998;32:549-58.

Van der Vleuten CP. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ. 1996;1(1):41-67.

Schuwirth LW, van der Vleuten CP. How to design a useful test: the principles of assessment. In: Understanding Medical Education: Evidence, Theory and Practice. 2010:241-54.

Norcini J, Anderson B, Bollela V, Burch V, Costa MJ, Duvivier R, Galbraith R, Hays R, Kent A, Perrott V, Roberts T. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33(3):206-14.

General Medical Council. Recommendations on Undergraduate Medical Education. London, UK: GMC; 2002.

Liaison Committee on Medical Education (LCME). 2003. Available at: http://www.lcme.org.

Van der Vleuten CP. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ. 1996;1(1):41-67.

Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63-7.

Cizek G, Bunch M. Standard setting. In: Downing S, Haladyna T, eds. Handbook of Test Development. 2006.

Downing S, Tekian A, Yudkowsky R. Procedures for establishing defensible absolute passing scores on performance examinations in health professional education. Teach Learn Med. 2006;18(1):50-7.

Norcini JJ. Setting standards on educational tests. Med Educ. 2003;37(5):464-9.

Shannon S, Norman G, eds. Evaluation Methods: A Resource Handbook. McMaster University; 1995:47-54.

Bauer D, Holzer M, Kopp V, Fischer MR. Pick-N multiple choice-exams: a comparison of scoring algorithms. Adv Health Sci Educ. 2011;16(2):211-21.

Bridge PD, Musial J, Frank R, Roe T, Sawilowsky S. Measurement practices: methods for developing content-valid student examinations. Med Teach. 2003;25(4):414-21.

Ponnamperuma GG, Karunathilake IM, McAleer S, Davis MH. The long case and its modifications: a literature review. Med Educ. 2009;43(10):936-41.

Wilkinson TJ, Campbell PJ, Judd SJ. Reliability of the long case. Med Educ. 2008;42(9):887-93.

Hamdy H, Prasad K, Williams R, Salih FA. Reliability and validity of the direct observation clinical encounter examination (DOCEE). Med Educ. 2003;37(3):205-12.

Published

2017-10-27

How to Cite

Elfaki, O. A., & Alamri, A. A. (2017). Evaluation of student assessment practices in a medical college. International Journal of Research in Medical Sciences, 5(11), 5048–5051. https://doi.org/10.18203/2320-6012.ijrms20174968

Section

Original Research Articles