ORIGINAL ARTICLE
Year : 2020  |  Volume : 7  |  Issue : 2  |  Page : 74-78

Practical skills evaluation of undergraduate medical students by an objective structured practical examination (OSPE): An effective tool for formative assessment in microbiology


Department of Microbiology, N.C. Medical College and Hospital, Panipat, Haryana, India

Date of Submission: 23-May-2020
Date of Acceptance: 23-May-2020
Date of Web Publication: 19-Jun-2020

Correspondence Address:
Dr. Gurjeet Singh
Department of Microbiology, N.C. Medical College and Hospital, Israna, Panipat 132107, Haryana, India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/mgmj.MGMJ_45_20

  Abstract 

Background: The objective structured practical examination (OSPE) has proved to be a good alternative that overcomes the pitfalls of conventional methods of assessment. OSPE is not routinely used in medical colleges in India because of resource constraints; consequently, both teachers and students are largely unfamiliar with the OSPE model. We therefore conducted this study to create awareness of the OSPE method and to analyze students' opinions regarding it. Aims and Objectives: This study was conducted to determine how OSPE overcomes the subjectivity of conventional practical methods of evaluation in medical education. Materials and Methods: A total of 130 second professional year MBBS students were included in the study. The skills of the medical students were assessed on the basis of their performance. Four stations were set up, of which two were procedure stations and two were response stations. Students were evaluated using the OSPE model. OSPE scores were also analyzed to identify lacunae in any step or question on the students' part. Traditional techniques and the procedural and response stations of the OSPE were examined to gauge the knowledge, skill, and competency of students in the subject. Students' opinions about the new method were collected through a questionnaire and analyzed. Results: A total of 130 students from the MBBS second professional year were included in the study. Of these, 91.54% felt that the time given at each station was adequate, 71.54% responded that they had been sensitized about OSPE beforehand, 93.08% responded that the exam was not stressful, 94.62% said that the OSPE was well organized, applicable, and uniform, 80.77% said that the exam covered the appropriate knowledge better than the traditional examination, and 76.15% opined that such exams should be included in the future as a method of formative assessment in practical examinations. Conclusion: This study showed that OSPE overcame the subjectivity of conventional methods. The students showed lacunae in some steps of the exercise, which were reported to the teachers so that more attention could be paid to those steps. The students had good knowledge and skills but less competency. Our study highlights the implementation of OSPE for formative assessment in the curriculum of undergraduate medical education, so that students can achieve more skills and knowledge to improve their competency.

Keywords: Examination, medical education, microbiology, objective, skills


How to cite this article:
Singh G, Singh R. Practical skills evaluation of undergraduate medical students by an objective structured practical examination (OSPE): An effective tool for formative assessment in microbiology. MGM J Med Sci 2020;7:74-8

How to cite this URL:
Singh G, Singh R. Practical skills evaluation of undergraduate medical students by an objective structured practical examination (OSPE): An effective tool for formative assessment in microbiology. MGM J Med Sci [serial online] 2020 [cited 2023 Mar 28];7:74-8. Available from: http://www.mgmjms.com/text.asp?2020/7/2/74/287172




Introduction


Traditional methods of assessing students' practical skills are riddled with problems. There is ample scope for bias during evaluation, and many subjective factors can lead to an incorrect assessment of skills. The objective structured practical examination (OSPE) has proved to be a good alternative that overcomes such deficiencies of the traditional method.[1]

The OSPE is a reliable, objective, and structured method of skill-based assessment by direct observation of the student's performance in an adaptable examination setup based on a circuit of laboratory “stations.”[2],[3],[4],[5],[6] Although the traditional practical examination (TPE) primarily emphasizes the “knows” and “knows how” aspects (the first and second levels of Miller's framework of the development of competency), the OSPE assesses the third level (the “shows how” level), which stresses the evaluation of the specific performance of psychomotor skills.[7] For a reliable skill-based assessment, student performance has to be evaluated across a range of situations.[2]

Assessment is an integral component of competency-based medical education. Over time, the methods of student assessment in medical education have changed: we have moved from a standard of pen-and-paper tests of knowledge toward a more complex system of evaluation. The use of OSPEs in the quantitative assessment of competence has become widespread in undergraduate and postgraduate medical education, mainly because of the improved reliability and unbiased nature of this assessment format. It offers a fairer test of the candidate's practical abilities, as all candidates are presented with the same tasks. In this format, candidates rotate around a circuit of stations, at each of which specific tasks, usually involving a practical skill, have to be performed. The marking scheme for each station is structured and determined in advance in the form of a checklist. So far, few published documents are available in the Indian context on the use of OSPE in medical education assessments.[8]

TPEs are chiefly subjective, and they largely examine the cognitive (knowledge) component rather than the psychomotor (competence) component. Compared with TPEs, the OSPE appraises a range of competencies,[9],[10] measures practical psychomotor skills better, eliminates subjectivity,[9] has a wider discrimination index, and helps students grasp the various components of competencies as well as obtain feedback.[11] The barriers to the use of OSPE include its labor-intensive nature, snags in maintaining uniform difficulty levels, and observer fatigue.[12] Currently, OSPE has been introduced in select Indian universities,[8],[9] but there are no national-level guidelines on conducting OSPE. The OSPE exercise was selected for this study because it falls in the “must-know” component of the university syllabus for microbiology in the second-year MBBS course and carries five marks in the university practical examination. Furthermore, observing performance evaluates the psychomotor skills of the students. However, conducting the OSPE model of evaluation requires holistic preparation. Hence, OSPE is not usually used in many medical colleges because of constraints of staff and resources, and both teachers and students are not very aware of the OSPE model, its mode of conduct, scoring, and assessment.

Aims and objectives: The aims and objectives of this study were as follows:

  1. To determine whether the subjectivity of traditional practical methods of evaluation in medical education is overcome by OSPE


  2. To evaluate student feedback on the OSPE model


  3. To assess whether OSPE acts as a tool for determining the competency of students, and whether students were stress free during the examination



Materials and methods


This prospective study was conducted at the Department of Microbiology, N.C. Medical College and Hospital, Israna, Panipat, Haryana, India, over a period of 6 months from January to June 2018. A total of 130 students from the MBBS second professional year were included in the study. Skills were assessed on the basis of the students' performance.

Inclusion criteria

All fourth-semester students, of either sex, who gave written informed consent to participate in the study were included in this study.

Exclusion criteria

Those who did not give written informed consent were excluded from the study.

Ethical clearance

This study was part of a project that was reviewed and examined by the ethical review committee of N.C. Medical College and Hospital, Israna, Panipat, Haryana, India. Consent was obtained from the students before the start of the study.

The OSPE station questionnaires/tasks and checklists were well structured and pre-validated by the faculty of microbiology after discussion and necessary modifications. The venue for the OSPE was arranged in the practical hall of the Microbiology Department. All candidates were given clear instructions on the task to be performed at each station. The examiners were instructed to award marks at their station according to the checklist provided for each student, so that students and examiners clearly understood their respective roles during the conduct of the OSPE. At the beginning of the examination, the attendance and signatures of all students and examiners were taken. The required instruments and materials were provided at the respective stations. A total of four stations were used to test the students' skills. Each station was allotted 3 min, with 30 s given between stations. Each station was structured into three subsets of observations/tasks/questions, carrying one mark each, with an additional two marks for global assessment (how students technically handle instruments, use microscopes, and mount and focus slides) [Table 1].
Table 1: OSPE stations with checklist
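As a rough illustration of the marking scheme just described, the following minimal sketch tallies a checklist-based OSPE score: three observations/tasks worth one mark each plus two marks for global assessment give a maximum of five marks per station, or 20 marks across the four-station circuit. This is not the authors' actual checklist or any software used in the study; the station names, task labels, and data structures are assumed for the example.

```python
# Illustrative sketch of checklist-based OSPE scoring (assumed example,
# not the study's actual checklist or marking software).
from dataclasses import dataclass, field
from typing import List

MARK_PER_TASK = 1      # each of the three checklist subsets carries one mark
GLOBAL_MARKS_MAX = 2   # global assessment (instrument handling, focusing, etc.)


@dataclass
class StationScore:
    station: str
    tasks_done: List[bool] = field(default_factory=list)  # three checklist items
    global_marks: int = 0                                  # 0, 1, or 2

    def total(self) -> int:
        if len(self.tasks_done) != 3:
            raise ValueError("each station has exactly three checklist items")
        if not 0 <= self.global_marks <= GLOBAL_MARKS_MAX:
            raise ValueError("global assessment is scored out of two marks")
        return MARK_PER_TASK * sum(self.tasks_done) + self.global_marks


def ospe_total(stations: List[StationScore]) -> int:
    """Sum the station scores (4 stations x 5 marks = 20 maximum)."""
    return sum(s.total() for s in stations)


# Hypothetical student across two procedure and two response stations.
scores = [
    StationScore("Gram stain (procedure)", [True, True, False], global_marks=2),
    StationScore("ZN stain (procedure)",   [True, True, True],  global_marks=1),
    StationScore("Spotter 1 (response)",   [True, False, True], global_marks=2),
    StationScore("Spotter 2 (response)",   [True, True, True],  global_marks=2),
]
print(ospe_total(scores), "/ 20")  # -> 17 / 20
```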


The questionnaire for the exercise and standardized answers were prepared, and students were briefed on (exposed to) the OSPE model in previous practical classes. Students were evaluated using the OSPE model. Student opinion about the new method was collected through a questionnaire and analyzed.

The faculty gave mixed responses regarding OSPE and TPE. Most agreed that OSPE eliminated examiner bias better, provided that multiple faculty members were involved in the assessment and a well-drafted answer key was prepared in advance. All faculty members unanimously said that OSPE should be included further in formative assessments and, if possible, in summative assessments as well. Another important point expressed by the faculty was that students' attitudes and communication skills cannot be assessed by OSPE alone; hence, a combination of assessment methods has to be used rather than a single method.


Results


A total of 130 second professional year MBBS students were enrolled in this study for OSPE station feedback analysis.

In the feedback questionnaire, 91.54% of students felt that the time given at each station was adequate, 71.54% responded that they had been sensitized about OSPE beforehand, 93.08% responded that the exam was not stressful, 94.62% said that the OSPE was well organized, applicable, and uniform, 80.77% said that the examination covered the appropriate knowledge better than the traditional examination, and 76.15% opined that such examinations should be included in the future as a method of formative assessment in practical examinations [Table 2], [Graph 1].
Table 2: Student feedback form for OSPE

Graph 1: Student feedback form for OSPE
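To make the percentages easier to interpret, the brief check below converts the reported feedback figures back into approximate head counts. This is an illustrative calculation only, assuming each percentage is a simple proportion of the 130 respondents; the counts are inferred, not taken from the study records.

```python
# Back-of-the-envelope check relating the reported percentages to head counts,
# assuming simple proportions of the 130 respondents (illustrative only).
N = 130
feedback = {
    "Adequate time at each station": 91.54,
    "Sensitized about OSPE beforehand": 71.54,
    "Exam was not stressful": 93.08,
    "Well organized, applicable, and uniform": 94.62,
    "Covered appropriate knowledge better than TPE": 80.77,
    "Should be included in future formative assessment": 76.15,
}
for item, pct in feedback.items():
    print(f"{item}: {pct}% ~= {round(N * pct / 100)} of {N} students")
# e.g. 91.54% of 130 ~= 119 students; 76.15% of 130 ~= 99 students.
```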



Discussion


Evaluation is a process that systematically and objectively determines the relevance, effectiveness, and impact of activities in light of their objectives. Thus, evaluation should include the cognitive, psychomotor, and affective domains.[1] Conventional methods of evaluation have many shortcomings. Apart from the performance of the student, various other factors, such as the experiment, the condition of instruments, and the examiner, also play a significant role in scoring. Besides, individual skills are not evaluated, and only the final result is taken into consideration in awarding marks, leading to incomplete and unjust evaluation. Most students are assessed only for the cognitive domain, and not for the psychomotor and affective domains.[1],[13] To overcome these disadvantages of conventional methods of evaluation, new models that attempt to remove these biases have been proposed. The most promising among them is the OSPE, described in 1975 by Harden et al.[6] In OSPE, several stations are created, and at each station the student performs a skill.[1]

In our study, 91.54% of students felt that the time given at each station was adequate, 71.54% responded that they had been sensitized about OSPE beforehand, 93.08% responded that the exam was not stressful, 94.62% said that the OSPE was well organized, applicable, and uniform, 80.77% said that the exam covered the appropriate knowledge better than the traditional examination, and 76.15% opined that such examinations should be included in the future as a method of formative assessment in practical examinations [Table 2], [Graph 1]. Similar findings were reported by different authors.[14],[15] However, teachers differed in this opinion and felt that OSPE did not offer the option of asking more questions and tested only very basic knowledge. Sixty-two percent of students felt that OSPE is easier to pass than traditional exams. A few other studies also showed a significant improvement in scores compared with traditional methods.[13],[15] Students also felt that OSPE was more relevant and useful than traditional examinations.[16]

Assessment of students in medicine has always remained debatable. It is seen as the single strongest determinant of what students learn (as opposed to what they are taught) and is considered uniquely powerful as a tool for manipulating the whole education process. There are continuous attempts to make assessment more objective and reliable rather than subjective. Traditional, age-old methods such as essay-type questions, which suffer from a lack of objectivity, are giving way to newer objective methods, such as multiple-choice questions and short-answer questions, for the assessment of the cognitive domain. As far as skill assessment is concerned, conventional methods are not only subjective in nature but also lack scope for direct observation of the performance of skills by the assessor. Moreover, the coverage of the contents may be limited. Hence, attempts have been made to introduce methods that can overcome these limitations. For assessment in preclinical and para-clinical subjects, a modified version of the objective structured clinical examination (OSCE), the OSPE, has been introduced. In India, the use of OSPE for the assessment of skills has been reported by some institutes.[17]

In this study, analysis of feedback showed that OSPE ensured objectivity, measured practical skills better, and eliminated examiner bias to a greater extent. Similarly, Kundu et al.[18] evaluated OSPE as a method of assessing practical skills and learning and determined student satisfaction regarding the OSPE. Abraham et al.[8] analyzed student feedback and found that students favored OSPE over the TPE. Rehman et al.[19] stated that OSPE is an easy, uniform, fair, unstressful, and unbiased method for practical examination. Rahman et al.,[16] Yaqinuddin et al.,[20] Dandannavar and Schwartz,[21] and Bashir et al.[22] reported that OSPE is the better choice for assessment [Table 3].
Table 3: Some important differences between OSPE and TPE as pointed out by Bashir et al.[22]



Conclusion


Feedback on OSPE indicated that it is a fair, relevant, and better method with several advantages; although it is difficult to conduct, efforts should be made to include OSPE in internal assessments so that both teachers and students become accustomed to the new method and benefit from it. Most students opined that OSPE was well structured, covered the appropriate knowledge area, and was less stressful, and that it should be continued in future formative assessments. Most faculty members agreed that OSPE eliminated examiner bias better, provided that multiple faculty members were involved in the assessment and a well-drafted answer key was prepared in advance. All faculty members unanimously said that OSPE should definitely be included further in formative assessments and, if possible, in summative assessments as well.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
References

1. Ananthakrishnan N. Objective structured clinical/practical examination (OSCE/OSPE). J Postgrad Med 1993;39:82-4.
2. Smee S. Skill based assessment. BMJ 2003;326:703-6.
3. Harden RM. What is an OSCE? Med Teach 1988;10:19-22.
4. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ 1979;13:41-54.
5. Harden RM. Assess clinical competence—An overview. Med Teach 1979;1:289-96.
6. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447-51.
7. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65(9 Suppl):S63-7.
8. Abraham RR, Raghavendra R, Surekha K, Asha K. A trial of the objective structured practical examination in physiology at Melaka Manipal Medical College, India. Adv Physiol Educ 2009;33:21-3.
9. Charles J, Janagond A, Thilagavathy, Rajendran, Ramesh, Vidhya. Evaluation by OSPE (objective structured practical examination)—A good tool for assessment of medical undergraduates—A study report from Velammal Medical College, Madurai, Tamil Nadu, India. IOSR-JRME 2006;6:1-6.
10. Reznick RK, Smee S, Baumber JS, Cohen R, Rothman A, Blackmore D, et al. Guidelines for estimating the real cost of an objective structured clinical examination. Acad Med 1993;68:513-7.
11. Gupta P, Dewan P, Singh T. Objective structured clinical examination (OSCE) revisited. Indian Pediatr 2010;47:911-20.
12. Gitanjali B. The other side of OSPE. Indian J Pharmacol 2004;36:388-9.
13. Ashok NS, Prathab AG, Indumathi VA. Objective structured practical examination as a tool for evaluating competency in gram staining. J Educ Res Med Teach 2013;1:48-50.
14. Wadde SK, Deshpande RH, Madole MB, Pathan FJ. Assessment of III MBBS students using OSPE/OSCE in community medicine: Teachers' and students' perceptions. Sch J App Med Sci 2013;1:348-53.
15. Malik SL, Manchanda SK, Deepak KK, Sunderam KR. The attitudes of medical students to the objective structured practical examination. Med Educ 1988;22:40-6.
16. Rahman N, Ferdousi S, Hoq N, Amin R, Kabir J. Evaluation of objective structured practical examination and traditional practical examination. Mymensingh Med J 2007;16:7-11.
17. Malhotra SD, Shah KN, Patel VJ. Objective structured practical examination as a tool for the formative assessment of practical skills of undergraduate students in pharmacology. J Educ Health Promot 2013;2:53.
18. Kundu D, Mandal T, Osta M, Sen G, Das H, Gautam D. Objective structured practical examination in biochemistry: An experience in Medical College, Kolkata. J Nat Sci Biol Med 2013;4:103-7.
19. Rehman R, Syed S, Iqbal A, Rehan R. Perception and performance of medical students in objective structured practical examination and viva voce. Pak J Physiol 2012;8:33-6.
20. Yaqinuddin A, Zafar M, Ikram MF, Ganguly P. What is an objective structured practical examination in anatomy? Anat Sci Educ 2013;6:125-33.
21. Dandannavar VS, Schwartz A. A comparative study to evaluate practical skills in physiology among 1st phase medical undergraduates at JNMC Belgaum: Traditional practical examinations versus objective structured practical examinations (TPE v/s OSPE). Int J Educat Res Technol 2014;5:126-34.
22. Bashir A, Tahir S, Khan JS. Objectively structured performance evaluation—A learning tool. Biomedical 2014;2:139-47.







 
