Year: 2020 | Volume: 7 | Issue: 2 | Page: 74-78
Practical skills evaluation of undergraduate medical students by an objective structured practical examination (OSPE): An effective tool for formative assessment in microbiology
Gurjeet Singh, Raksha Singh
Department of Microbiology, N.C. Medical College and Hospital, Panipat, Haryana, India
Date of Submission: 23-May-2020
Date of Acceptance: 23-May-2020
Date of Web Publication: 19-Jun-2020
Correspondence Address:
Dr. Gurjeet Singh
Department of Microbiology, N.C. Medical College and Hospital, Israna, Panipat 132107, Haryana, India.
Source of Support: None, Conflict of Interest: None
Background: The objective structured practical examination (OSPE) has proved to be a good alternative that overcomes the pitfalls of the traditional method of evaluation. However, OSPE is not commonly used in medical colleges in India because of resource constraints; consequently, both teachers and students are largely unaware of the OSPE model. We therefore conducted this study to create awareness of the OSPE method and to analyze students' assessment of the technique. Aims and Objectives: This study was conducted to determine how OSPE overcomes the subjectivity of conventional practical methods of evaluation in medical education. Materials and Methods: A total of 130 second professional year MBBS students were included in the study. The skills of the medical students were assessed on the basis of their performance. Four stations were formed, of which two were procedure stations and two were response stations. Students were evaluated using the OSPE model. The OSPE was additionally analyzed to identify lacunae in any step or any question with respect to students. Traditional techniques and the procedural and response stations in OSPE were investigated to ascertain the knowledge, skill, and competency of students in this subject. Students' opinion about the new method was collected through a questionnaire and analyzed. Results: A total of 130 students from the MBBS second professional year were included in the study. Of these, 91.54% of students felt that the time given at each station was adequate, 71.54% responded that they had been sensitized about OSPE beforehand, 93.08% responded that the exam was not stressful, 94.62% said that the OSPE was well organized, applicable, and uniform, 80.77% said that the exam covered the knowledge area more appropriately than the traditional examination, and 76.15% opined that such exams should be included in the future as a method of formative assessment in practical examinations.
Conclusion: This study showed that OSPE overcame the subjectivity of conventional methods. The students showed lacunae in some steps of the exercise, which were communicated to the teachers so that they could concentrate more on those steps. The students had good knowledge and skills but lower competency. Our study highlights the implementation of OSPE for formative assessment in the curriculum of undergraduate medical education, so that students can achieve more skills and knowledge to improve their competency.
Keywords: Examination, medical education, microbiology, objective, skills
How to cite this article:
Singh G, Singh R. Practical skills evaluation of undergraduate medical students by an objective structured practical examination (OSPE): An effective tool for formative assessment in microbiology. MGM J Med Sci 2020;7:74-8
How to cite this URL:
Singh G, Singh R. Practical skills evaluation of undergraduate medical students by an objective structured practical examination (OSPE): An effective tool for formative assessment in microbiology. MGM J Med Sci [serial online] 2020 [cited 2022 Jan 25];7:74-8. Available from: http://www.mgmjms.com/text.asp?2020/7/2/74/287172
Introduction
Traditional methods of assessment of practical skills of students are riddled with many problems. There is a lot of scope for bias during evaluation and many subjective factors, which will lead to incorrect assessment of skills. Objective structured practical examination (OSPE) has proved to be a good alternative to overcome such deficiencies of traditional method.
The OSPE is a reliable, objective, and structured method of skill-based assessment by direct observation of the student's performance in an adaptable examination setup that is based on a circuit of laboratory “stations.” Although the traditional practical examination (TPE) primarily lays emphasis on the “knows” and “knows how” aspects (the first and second levels of Miller's framework of the development of competency), the OSPE assesses the third level (the “shows how” level), which stresses the evaluation of the specific performance of psychomotor skills. For a reliable skill-based assessment, student performance has to be evaluated across a range of situations.
Assessment is an integral component of competency-based medical education. Over time, the methods of student assessment in medical education have changed. We have moved from a standard of pen-and-paper tests of knowledge toward a more complex system of evaluation. The use of OSPEs in the quantitative assessment of competence has become widespread in undergraduate and postgraduate medical education, mainly because of the improved reliability and unbiased nature of this assessment format. It offers a fairer test of the candidate's practical abilities, as all candidates are presented with the same tasks. In this format, the candidates rotate around a circuit of stations, at each of which specific tasks have to be performed, usually involving a practical skill. The marking scheme for each station is structured and determined in advance in the form of a checklist. So far, few published documents are available in the Indian context on the use of OSPE in medical education assessments.
TPEs are chiefly subjective, and they largely examine the cognitive (knowledge) component rather than the psychomotor (competence) component. Compared with TPEs, the OSPE appraises a range of competencies, measures practical psychomotor skills better, eliminates subjectivity, has a wider discrimination index, and helps students to grasp various components of competencies and to obtain feedback. The barriers to the use of OSPE include its labor-intensive nature, snags in maintaining uniform difficulty levels, and observer fatigue. Currently, OSPE has been introduced in select Indian universities, but there are no national-level guidelines on conducting OSPE. The OSPE exercise was selected for this study because it is categorized in the “must-know” component of the university syllabus for the subject of microbiology in the second-year MBBS course and carries five marks in the university practical examination. Furthermore, performance at the stations evaluates the psychomotor skills of the students. However, conducting the OSPE model evaluation requires holistic preparation. Hence, OSPE is not usually used in many medical colleges because of staff and resource constraints, and both teachers and students are not very aware of the OSPE model or its mode of conduct, scoring, and assessment.
Aims and objectives: The aims and objectives of this study were as follows:
- To determine whether the subjectivity of traditional practical methods of evaluation in medical education is overcome in OSPE
- To evaluate student feedback on the OSPE model
- To determine whether OSPE acts as a tool for assessing the competency of students, and whether students were stress-free during the exams
Materials and methods
This prospective study was conducted at the Department of Microbiology, N.C. Medical College and Hospital, Israna, Panipat, Haryana, India, over a period of 6 months from January to June 2018. A total of 130 students from the MBBS second professional year were included in the study. The skills were assessed on the basis of the students' performance.
All fourth-semester students, of either sex, who gave written informed consent to participate in the study were included in this study.
Those who did not give written informed consent were excluded from this study.
This study was part of a project that was reviewed and examined by the ethical review committee of N.C. Medical College and Hospital, Israna, Panipat, Haryana, India. Consent was obtained from the students before the start of the study.
OSPE station questionnaires/tasks and checklists were well structured and pre-validated by the faculty of microbiology after discussion, with necessary modifications. The venue for the OSPE was arranged in the practical hall of the Microbiology Department. All candidates were given clear instructions on what task they should perform at each of the stations. The examiners were given instructions on awarding marks at each station according to the checklist provided to them for individual students, so that the students and examiners clearly and precisely understood their respective roles during the conduct of the OSPE. At the beginning of the examination, the attendance and signatures of all the students and examiners were taken. The required instruments and materials were provided at the respective stations. A total of four stations were used to test the students' skills. Each station was allotted 3 min to answer, and 30 s were given in between stations. Each station was structured into three subsets of observations/tasks/questions, carrying one mark each, with an additional two marks for global assessment (how students technically handle instruments, use microscopes, and mount and focus slides) [Table 1].
The questionnaire for the experiment and the standardized answers were prepared, and students were briefed on (exposed to) the OSPE model in previous practical classes. Students were evaluated using the OSPE model. Students' opinion about the new method was collected through a questionnaire and analyzed.
The faculty gave mixed responses regarding OSPE and TPE. Most agreed that OSPE eliminated examiner bias better, but only if multiple faculty members were involved in assessing and a well-drafted answer key was prepared in advance. All faculty members unanimously said that OSPE should be further included in formative assessments, and if possible in summative assessments as well. Another important point the faculty expressed was that students' attitude and communication skills cannot be assessed by OSPE alone; hence, a combination of assessment methods has to be used rather than sticking to a single method.
Results
A total of 130 second professional year MBBS students were enrolled in this study for OSPE station feedback analysis.
In the feedback questionnaire for students, 91.54% felt that the time given at each station was adequate, 71.54% responded that they had been sensitized about OSPE beforehand, 93.08% responded that the exam was not stressful, 94.62% said that the OSPE was well organized, applicable, and uniform, 80.77% said that the examination covered the knowledge area more appropriately than the traditional examination, and 76.15% opined that such exams should be included in the future as a method of assessment in practical formative assessments [Table 2], [Graph 1].
Discussion
Evaluation is a process that systematically and objectively determines the relevance, effectiveness, and impact of activities in light of their objectives. Thus, evaluation should include the cognitive, psychomotor, and affective domains. Conventional methods of evaluation have many shortcomings. Apart from the performance of the student, various other factors such as experiment factors, instrument condition, and examiner factors also play a significant role in scoring. Besides, individual skills are not evaluated, and only the final result is taken into consideration in awarding the marks, leading to incomplete and unjust evaluation. Most students are assessed only for the cognitive domain, and not for the psychomotor and affective domains. To overcome these disadvantages of conventional methods of evaluation, new models have been proposed in an attempt to remove these biases. The most promising among them is the OSPE, described in 1975 by Harden et al. In OSPE, several stations are created, and at each station the student performs a skill.
In our study, 91.54% of students felt that the time given at each station was adequate, 71.54% responded that they had been sensitized about OSPE beforehand, 93.08% responded that the exam was not stressful, 94.62% said that the OSPE was well organized, applicable, and uniform, 80.77% said that the exam covered the knowledge area more appropriately than the traditional examination, and 76.15% opined that such exams should be included in the future as a method of assessment in practical formative assessments [Table 2], [Graph 1]. Similar findings were reported by different authors. However, teachers differed on this opinion and felt that OSPE did not offer the option to ask more questions and tested only very basic knowledge. Sixty-two percent of students felt that OSPE is easier to pass than traditional exams. A few other studies also showed a significant improvement in scores compared with traditional methods. Students also felt that OSPE was more relevant and useful than traditional examinations.
Assessment of students in medicine has always remained debatable. It is seen as the single strongest determinant of what students learn (as opposed to what they are taught) and is considered to be uniquely powerful as a tool for manipulating the whole education process. There are continuous attempts to make assessment more objective and reliable rather than subjective. Traditional, age-old methods such as essay-type questions, which suffer from lack of objectivity, are giving way to newer objective methods of assessment in the form of multiple-choice questions, short-answer questions, and such other tools, for the assessment of the cognitive domain. As far as skill assessment is concerned, conventional methods are not only subjective in nature but also lack scope for direct observation of the performance of skills by the assessor. Moreover, the coverage of the contents may be limited. Hence, attempts have been made to introduce methods that can overcome the aforementioned limitations. For assessment in preclinical and para-clinical subjects, a modified version of the objective structured clinical examination (OSCE), the OSPE has been introduced. In India, the use of OSPE for assessment of skills has been reported by some institutes.
In this study, analysis of feedback showed that OSPE ensured objectivity, measured practical skills better, and eliminated examiner bias to a greater extent. Similarly, Kundu et al. endorsed OSPE as a method for the assessment of practical skills and learning and for determining student satisfaction with the OSPE. Abraham et al. analyzed student feedback and found that students favored OSPE over the TPE. Rehman et al. stated that OSPE is an easy, uniform, fair, unstressful, and unbiased method for practical examination. Rahman et al., Yaqinuddin et al., Dandannavar and Schwartz, and Bashir et al. [Table 3] reported that OSPE is the better choice as an assessment method.
Table 3: Some important differences between OSPE and TPE as pointed out by Bashir et al.
Conclusion
Opinion on OSPE concluded that it is a fair, better, and relevant method with several advantages. Although OSPE is difficult to conduct, efforts should be made to include it in internal assessments so that both teachers and students become accustomed to the new method and benefit from it. Most of the students opined that OSPE was well structured, covered the appropriate knowledge area, was less stressful, and should be continued in future formative assessments. Most faculty members agreed that OSPE eliminated examiner bias better, but only if multiple faculty members were involved in assessing and a well-drafted answer key was prepared in advance. All faculty members unanimously said that OSPE should definitely be further included in formative assessments, and if possible in summative assessments as well.
Financial support and sponsorship
Nil.
Conflicts of interest
There are no conflicts of interest.
References
Ananthakrishnan N. Objective structured clinical/practical examination (OSCE/OSPE). J Postgrad Med 1993;39:82-4.
Smee S. Skill based assessment. BMJ 2003;326:703‐6.
Harden RM. What is an OSCE? Med Teach 1988;10:19-22.
Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ 1979;13:41-54.
Harden RM. Assess clinical competence—An overview. Med Teach 1979;1:289-96.
Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447‐51.
Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65(9 Suppl):S63-67.
Abraham RR, Raghavendra R, Surekha K, Asha K. A trial of the objective structured practical examination in physiology at Melaka Manipal Medical College, India. Adv Physiol Educ 2009;33:21-3.
Charles J, Janagond A, Thilagavathy, Rajendran, Ramesh, Vidhya. Evaluation by OSPE (objective structured practical examination)—A good tool for assessment of medical undergraduates—A study report from Velammal Medical College, Madurai, Tamil Nadu, India. IOSR-JRME 2006;6:1-6.
Reznick RK, Smee S, Baumber JS, Cohen R, Rothman A, Blackmore D, et al. Guidelines for estimating the real cost of an objective structured clinical examination. Acad Med 1993;68:513-7.
Gupta P, Dewan P, Singh T. Objective structured clinical examination (OSCE) revisited. Indian Pediatr 2010;47:911-20.
Gitanjali B. The other side of OSPE. Indian J Pharmacol 2004;36:388-9.
Ashok NS, Prathab AG, Indumathi VA. Objective structured practical examination as a tool for evaluating competency in gram staining. J Educ Res Med Teach 2013;1:48-50.
Wadde SK, Deshpande RH, Madole MB, Pathan FJ. Research article assessment of III MBBS students using OSPE/OSCE in community medicine: Teachers’ and students’ perceptions. Sch J App Med Sci 2013;1:348-53.
Malik SL, Manchanda SK, Deepak KK, Sunderam KR. The attitudes of medical students to the objective structured practical examination. Med Educ 1988;22:40-6.
Rahman N, Ferdousi S, Hoq N, Amin R, Kabir J. Evaluation of objective structured practical examination and traditional practical examination. Mymensingh Med J 2007;16:7-11.
Malhotra SD, Shah KN, Patel VJ. Objective structured practical examination as a tool for the formative assessment of practical skills of undergraduate students in pharmacology. J Educ Health Promot 2013;2:53.
Kundu D, Mandal T, Osta M, Sen G, Das H, Gautam D. Objective structured practical examination in biochemistry: An experience in Medical College, Kolkata. J Nat Sci Biol Med 2013;4:103-7.
Rehman R, Syed S, Iqbal A, Rehan R. Perception and performance of medical students in objective structured practical examination and viva voce. Pak J Physiol 2012;8:33-6.
Yaqinuddin A, Zafar M, Ikram MF, Ganguly P. What is an objective structured practical examination in anatomy? Anat Sci Educ 2013;6:125-33.
Dandannavar VS, Schwartz A. A comparative study to evaluate practical skills in physiology among 1st phase medical under graduates at JNMC Belgaum: Traditional practical examinations versus objective structure practical examinations (TPE v/s OSPE). Int J Educat Res Technol 2014;5:126-34.
Bashir A, Tahir S, Khan JS. Objectively structured performance evaluation—A learning tool. Biomedical 2014;2:139-47.