Year: 2022 | Volume: 17 | Issue: 3 | Page: 589-594
A study to assess the reliability of structured viva examination over traditional viva examination among 2nd-year pharmacology students
Marya Ahsan1, Ayaz Khurram Mallick2
1 Department of Pharmacology, College of Medicine, Imam Mohammad Ibn Saud Islamic University, Riyadh, Saudi Arabia
2 Department of Clinical Biochemistry, College of Medicine, King Khalid University, Abha, Saudi Arabia
Date of Submission: 05-Dec-2021
Date of Decision: 20-Aug-2022
Date of Acceptance: 23-Aug-2022
Date of Web Publication: 02-Nov-2022
Dr. Marya Ahsan
Department of Pharmacology, College of Medicine, Imam Mohammad Ibn Saud Islamic University, Riyadh 13317
Source of Support: None, Conflict of Interest: None
Introduction: Although viva voce gives the examiner an opportunity to probe and assess a student's reasoning and higher-order thinking, it is marred by inconsistency, subjectivity, and bias. These issues can be addressed by increasing the objectivity and standardization of the viva examination. Hence, this study was conducted to explore perceptions and opinions regarding the structured viva examination and to compare and correlate structured viva marks with theory examination marks among 2nd-year medical students. Materials and Methods: An educational study was carried out on 150 students after obtaining institutional ethics committee clearance and informed consent; 42 students subsequently dropped out. The students took a theory examination of 30 marks, after which they were divided into two groups. One group was assessed by traditional viva examination, whereas the other was assessed by structured viva. The perceptions of students and faculty were recorded using a five-point Likert scale. Marks obtained in the viva examinations were correlated with the theory examination using the Pearson coefficient. Differences in means were tested using Student's t-test. P < 0.05 was considered statistically significant. Results: A moderate positive correlation (r = 0.442; P = 0.001) was seen between marks scored in the structured viva and the theory examination. The overall response of students and examiners was favorable toward the structured examination. Conclusion: The structured viva examination may be considered more reliable and fairer than the traditional viva examination and hence may be preferred over it.
Keywords: Fairness, objectivity in viva, reliability
How to cite this article:
Ahsan M, Mallick AK. A study to assess the reliability of structured viva examination over traditional viva examination among 2nd-year pharmacology students. J Datta Meghe Inst Med Sci Univ 2022;17:589-94
How to cite this URL:
Ahsan M, Mallick AK. A study to assess the reliability of structured viva examination over traditional viva examination among 2nd-year pharmacology students. J Datta Meghe Inst Med Sci Univ [serial online] 2022 [cited 2023 Feb 4];17:589-94. Available from: http://www.journaldmims.com/text.asp?2022/17/3/589/360224
Introduction
Along with knowledge, skills, and attitude, communication is an essential domain of learning. These domains are assessed by different forms of assessment, each with its advantages and limitations. Assessment tools form an integral component of any learning process and drive learning. One such tool is the traditional viva voce or oral examination: a professional conversation between assessor and student intended to assess core competencies, especially higher-order learning such as evaluation and synthesis. It gives the examiner an opportunity to probe for deeper understanding, which may not be assessed in a written theory examination. In addition, timely feedback during and after the viva can help students enhance their knowledge and work on their communication skills.
However, most marks in medical colleges are allocated to theory and practical examinations, emphasizing the knowledge and skills domains. Students, especially in the preclinical and paraclinical years, tend to treat the viva voce as a formality because it is usually conducted alongside the practical examination, which carries more weightage. Moreover, the viva voce is marred by its highly subjective nature and lack of uniformity. If conducted properly, however, viva voce can be as reliable, valid, and fair as a theory examination. One way to achieve this is to structure the viva voce according to guidelines agreed upon by the examiners beforehand, thus increasing its objectivity.
With the COVID-19 pandemic forcing colleges to assess students online, a structured viva voce can serve as an essential and fair assessment tool. Although studies have explored perceptions and opinions regarding the structured viva examination, few have directly compared the structured viva examination with the theory examination. Hence, this study was carried out to compare traditional and structured viva examinations and to correlate both with theory examinations among 2nd-year medical students.
Materials and Methods
This educational study was carried out on 2nd-year medical students in a medical college in India. Institutional ethics committee clearance was obtained before commencing the study. We planned to include all 2nd-year MBBS students (n = 150) attending pharmacology classes after obtaining their informed consent. All students were assured that their scores in the assessment would be kept confidential and would in no way affect their internal or summative assessment. Students who did not consent or did not appear for both the theory and viva examinations were excluded from the study.
Two recently covered units, "cardiovascular drugs" and "drugs affecting blood and blood formation," were selected for assessment. The topics were announced 2 weeks in advance to provide ample time for preparation. All participating students first took a closed-book theory assessment of 30 marks comprising multiple-choice and short-answer questions. Questions of varying difficulty were selected from a prevalidated question bank, and a checklist was used for evaluation. Each student's answers were blind-marked independently by two examiners using the checklist; the mean of the two scores was recorded as the final theory score. Following the theory assessment, all students took a viva examination over a span of 4 days (day 1 to day 4). Students were randomly allocated to Group A and Group B (75 students each) using a table of random numbers. Group A was assessed by traditional viva voce, whereas Group B was assessed by structured viva voce. Group A was further subdivided into A1 (n = 37), called on day 1, and A2 (n = 38), called on day 2. On both days, the students were divided equally among three examiners (Examiners A, B, and C), who assessed them individually by traditional viva voce, and the marks awarded were recorded.
Similarly, Group B was divided into B1 (n = 37), called on day 3, and B2 (n = 38), called on day 4, for the structured viva. Before the structured viva, all students and examiners were sensitized to the process, and any queries regarding its conduct were addressed. A predesigned, peer-reviewed set of 10 items was used for the structured viva. Students were distributed among the three examiners on both days. Each examiner was allowed a maximum of 20 min per student, and students were given no more than 2 min per item. The marks scored by each student were recorded. Feedback was collected from students and examiners after the structured viva examination using a prevalidated questionnaire.
For the structured viva examination, two sets of questions were prepared from the units included in the study. The difficulty of the questions ranged from recall through application to evaluation and synthesis, aligned with the difficulty level of the theory examination. Two senior faculty members who were not part of the study reviewed all the questions and the answer checklist for ambiguity. Each question set consisted of 10 items, each comprising two related questions, for a total of 20 questions carrying one mark each. The questions were arranged in increasing order of difficulty.
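As a rough illustration, the blueprint just described (10 items per set, two related one-mark questions per item, arranged in ascending difficulty) can be expressed as a small data structure with a consistency check. The item labels and difficulty tags below are hypothetical, not the study's actual questions:

```python
# Hypothetical sketch of the structured viva blueprint described above:
# each set has 10 items, each item two related questions of 1 mark each,
# ordered by ascending difficulty (recall -> synthesis, per Bloom's taxonomy).
DIFFICULTY = {"recall": 1, "application": 2, "evaluation": 3, "synthesis": 4}

item_set = [
    {"difficulty": "recall",      "questions": ["Q1a", "Q1b"]},
    {"difficulty": "recall",      "questions": ["Q2a", "Q2b"]},
    {"difficulty": "application", "questions": ["Q3a", "Q3b"]},
    # ...items 4-8 would continue here in ascending difficulty...
    {"difficulty": "evaluation",  "questions": ["Q9a", "Q9b"]},
    {"difficulty": "synthesis",   "questions": ["Q10a", "Q10b"]},
]

def validate(items, marks_per_question=1):
    """Check the blueprint and return the total marks it carries."""
    # Every item must hold exactly two related questions.
    assert all(len(it["questions"]) == 2 for it in items)
    # Difficulty must never decrease from one item to the next.
    levels = [DIFFICULTY[it["difficulty"]] for it in items]
    assert levels == sorted(levels), "items must be in ascending difficulty"
    return sum(len(it["questions"]) for it in items) * marks_per_question

print(validate(item_set))  # this 5-item excerpt carries 10 marks; a full set carries 20
```

Such a check is a simple way to keep two parallel question sets structurally equivalent, which matters here because the two days of structured viva used different sets.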
Following the viva examination, a prevalidated questionnaire was given to the examiners and students to collect feedback. The students' questionnaire consisted of ten five-point Likert scale items and one open-ended question on their perception and experience of the structured viva examination. The examiners' questionnaire consisted of seven five-point Likert scale items and one open-ended question for additional remarks.
All data were analyzed using SPSS (Statistical Package for the Social Sciences) version 26.0 (IBM Corp., Armonk, NY, USA). Marks were expressed as mean ± standard deviation. Differences in means were tested using Student's t-test, and the correlation between theory and viva examination marks was analyzed using the Pearson coefficient. P < 0.05 was considered statistically significant. Responses of students and examiners to the questionnaires were expressed as median with interquartile range and as percentages.
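For readers without SPSS, the two computations named above (the Student's t statistic and the Pearson coefficient) reduce to short formulas. The sketch below implements them in plain Python on invented mark vectors purely for illustration; it is not the study's data or analysis code:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length mark lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def students_t(x, y):
    """Two-sample Student's t statistic (pooled variance, unequal group sizes)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((a - mx) ** 2 for a in x) / (nx - 1)
    vy = sum((b - my) ** 2 for b in y) / (ny - 1)
    pooled = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)
    return (mx - my) / math.sqrt(pooled * (1 / nx + 1 / ny))

# Invented marks for illustration only
theory = [12, 18, 21, 15, 24, 9]          # out of 30
structured_viva = [8, 11, 14, 10, 16, 7]  # out of 20
r = pearson_r(theory, structured_viva)
print(f"r = {r:.3f}")
```

SPSS additionally supplies the P value for each statistic. A quick sanity check: a perfectly linear relationship gives r = 1, and two identical groups give t = 0.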
Results
Of the 150 2nd-year MBBS students enrolled in the study, 132 appeared for the theory examination. During the course of the study, 24 more students dropped out, leaving 108 students and a response rate of 72%. There were 52 students in Group A and 56 in Group B. The mean marks obtained in the theory and viva examinations are summarized in [Table 1]. The mean theory marks of the two groups were comparable. However, there was a significant difference (P = 0.003) in the viva marks: Group B students scored a mean of 12.40 ± 2.80 (range 6.75-19.0), compared with 13.12 ± 2.40 (range 10.0-18.0) for Group A students.
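The participant flow reported above reduces to simple arithmetic; as a quick consistency check using only the counts stated in the text:

```python
# Consistency check of the participant counts reported above.
enrolled = 150
appeared_for_theory = 132      # 18 never appeared for the theory examination
dropped_during_study = 24
remaining = appeared_for_theory - dropped_during_study
assert remaining == 108        # matches the 108 students who completed the study

response_rate = remaining / enrolled * 100
print(f"response rate = {response_rate:.0f}%")  # 72%

group_a, group_b = 52, 56      # final group sizes
assert group_a + group_b == remaining
```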
[Table 2] compares the marks given by the three examiners in the two viva examinations. The marks given by Examiners B and C were comparable across both formats. However, there was a significant difference in the marks given by Examiner A (P = 0.017): a mean of 14.55 (range 10.0-18.0) in the traditional viva versus 11.1 (range 6.75-16.0) in the structured viva.
Table 2: The mean marks given by the three examiners in traditional and structured viva examinations
To study the relationship between marks scored in the two types of viva examination and the theory examination, the Pearson correlation coefficient was determined. There was no correlation between marks obtained in the traditional viva and the theory examination. However, there was a moderate positive correlation (r = 0.442) between marks obtained in the structured viva and the theory examination, which was statistically significant (P = 0.001) [Figure 1] and [Figure 2].
Figure 1: A plot showing the correlation between marks obtained by the students in their structured viva and theory examinations. The Pearson correlation coefficient (r) was 0.442 with P = 0.001 (P < 0.05 considered significant)
Figure 2: A plot showing the correlation between marks obtained by the students in their traditional viva and theory examinations. The Pearson correlation coefficient (r) was 0.202, which was not significant (P = 0.151)
Examiners' and students' feedback on their experience and perception of the structured viva was collected through a prevalidated questionnaire consisting of five-point Likert scale questions and open-ended questions. Cronbach's alpha was used to check the reliability (internal consistency) of the questionnaire; the value of 0.88 indicated good internal consistency. The responses of the students are summarized in [Table 3].
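Cronbach's alpha, reported above as 0.88, is computed from the item variances and the variance of each respondent's total score: alpha = k/(k-1) * (1 - sum of item variances / variance of totals), where k is the number of items. A minimal sketch in plain Python (the response matrix below is invented for illustration, not the study's feedback data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of per-item score lists.

    items[i][j] is respondent j's score on item i; every item list
    must cover the same respondents in the same order.
    """
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def var(xs):                        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Invented 2-item, 4-respondent example for illustration
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 2, 4, 4]])
print(f"alpha = {alpha:.3f}")  # 0.941
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, so the study's 0.88 sits comfortably in the "good" range.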
Table 3: The response of the Group B students (n=56) to the five-point Likert scale questionnaire, with median and interquartile range
Of the 56 students who appeared for the structured viva, almost all (54) found the atmosphere conducive to assessment, and about 60% were relaxed during the examination. Over 80% opined that the range of questions covered the entire syllabus, and they found the questions easy to understand.
There was a mixed response regarding the time limit per question: about 60% of the students were satisfied with the 2 min allotted, whereas the rest felt more time should have been given. More than 86% of the students said they would prefer a structured viva, about 77% felt it would remove bias, and over 80% opined that it would encourage them to study better and thus help them score better.
As seen in [Table 4], the examiners unanimously opined that the structured viva was fairer than the traditional viva as it removes bias. All the faculty said that the time allotted per question was adequate and that they would prefer conducting structured viva in the future. They did, however, find the structured viva time-consuming, which could be overcome by increasing the number of examiners.
Table 4: The response of the examiners (n=3) to the five-point Likert scale questionnaire, with median and interquartile range
Discussion
Assessment is an integral part of the curriculum and plays a vital role in the learning process. It directly impacts learning by indicating whether a learner has achieved the required competency. Over the years, assessment has evolved continuously to make the process robust, reliable, valid, fair, feasible, and backed by evidence-based approaches. Assessment of medical graduates is crucial because they will be responsible for patient care and safety; a graduate must therefore attain adequate knowledge, develop the desired skills, and cultivate effective communication and appropriate attitudes toward patients. Hence, assessing a medical graduate requires multiple modes and tools to ensure that all learning domains are covered. The written examination is the most common mode of assessment in paraclinical disciplines and carries the most weightage in terms of marks. Its advantages include the capacity to assess large numbers of students and to test all learning levels from recall to synthesis and evaluation.
On the other hand, the oral or viva examination, which once played a central role in assessment, gradually lost importance. Its fall from favor relative to the theory examination has been attributed to its unreliability and subjectivity. The marring effect of anxiety on student performance in high-stakes examinations was another primary reason for its loss of popularity.
Traditionally, the viva examination in Indian medical colleges is a one-to-one interaction between assessor and learner in which the assessor is in charge. The examination is often driven by the assessor's position, personality, strictness, topic preferences, and favoritism rather than the learner's knowledge. In this study, we tried to eliminate these factors by standardizing the viva examination: all students were assessed with the same set of questions, beginning with easy recall-based questions and gradually moving up Bloom's taxonomy to evaluation and synthesis. Every question was allotted a fixed time, the examiner gave no prompts or comments on the students' responses, and a student unable to answer could move on to the next question. Various studies have described the assessment environment as a factor influencing the viva examination: an unfavorable atmosphere can aggravate students' anxiety and nervousness, resulting in poor performance. In our study, almost all the students felt comfortable during the structured viva. Such a conducive atmosphere can positively affect performance by reducing anxiety; indeed, about 60% of the students reported not being anxious during the structured viva.
Beginning the viva with a simple recall-based question also instills confidence in the student and helps settle initial nerves, and moving gradually to higher-order questions may help the student grow in confidence during the structured viva. Apart from the atmosphere, another critical factor affecting the viva is its high subjectivity, which reduces reliability and fairness. Conducting a viva examination is an exhaustive process involving multiple examiners. In a traditional viva, each examiner questions according to their own preference of topics; this inter-examiner variation, together with incomplete syllabus coverage, affects the quality of the examination. Apart from topic bias, personal bias, the student's performance in a previous assessment, and the previous student's performance in the same assessment (the carryover effect) can further compromise fairness.
As seen in [Table 3], the majority of the students preferred a structured viva in the future because it would be fairer and would encourage them to study better. A structured viva can increase fairness by ensuring complete syllabus coverage and eliminating biases by grading students against a prevalidated checklist, which would also reduce inter-examiner variation. [Table 2] compares the marks given by the same examiners in the traditional and structured viva. As seen in the table, the lowest mark awarded by each examiner in the traditional viva was 50%, which was the passing mark. Moreover, for Examiner A there was a significant difference (P = 0.017) between the mean marks given in the two formats, with a wider range of marks awarded in the structured viva (6.75-16.0). The trend of awarding at least the minimum pass mark in the traditional viva threatens its validity, and such a narrow range of marks fails to differentiate good learners from poor ones. To stress this point further, we compared the students' viva marks with their theory marks. As seen in Figures 1 and 2, there was no correlation between marks obtained in the traditional viva and the theory examination, whereas a significant positive correlation (r = 0.442, P = 0.001) was observed between theory and structured viva marks. We believe this correlation coefficient can increase further with repeated structured vivas, better planning, and more robust linking of questions to learning objectives. Another implication of this study is the importance of the structured viva in the present COVID-19 pandemic: high infectivity and multiple waves have forced worldwide lockdowns, affecting medical institutions. Although many countries have adopted online teaching, assessment remains a challenge.
One major concern regarding online assessment is fairness, as written examinations conducted remotely are difficult to invigilate. Here, the structured viva can provide a solution: an online, face-to-face viva examination would be fairer.
Despite its advantages, the structured viva can limit the examiner's role. Many consider the viva an opportunity to assess and analyze the student's critical thinking, which requires the examiner to have the freedom to question as they see fit and to create scenarios that open discussion. Other challenges concern planning, time management, and manpower; one essential step is ensuring that students do not interact with those yet to be examined once their own viva is over.
This study has a few limitations. First, it was conducted in only one department; similar studies in other disciplines are needed to consolidate the findings. Second, we would have preferred to conduct the structured viva on a single day to minimize contamination, but only three faculty members were available, so the viva was spread over 2 days. We minimized contamination by using different question sets on the two days.
Conclusion
Validity, reliability, feasibility, and fairness are essential attributes of any assessment. The viva voce or oral examination is an essential assessment tool for medical graduates, but its high subjectivity calls its reliability and fairness into question. The significant correlation between marks obtained in the structured viva and the theory examination indicates that conducting the viva in a favorable environment and assessing students with prevalidated questions and a checklist can increase objectivity, remove bias, and ensure good differentiation of students.
Financial support and sponsorship
Nil.
Conflicts of interest
There are no conflicts of interest.
References
Ahmed I, Ishtiaq S. Assessment methods in medical education: A review. Isr Med J 2020;6:95-102.
Ganji KK. Evaluation of reliability in structured viva voce as a formative assessment of dental students. J Dent Educ 2017;81:590-6.
Mudey G, Damke S, Tankhiwale N, Mudey A. Assessment of perception for objectively structured viva voce amongst undergraduate medical students and teaching faculties in a medical college of central India. Int J Res Med Sci 2016;4:2951-4.
Dhasmana DC, Bala S, Sharma R, Sharma T, Kohli S, Aggarwal N, et al. Introducing structured viva voce examination in medical undergraduate pharmacology: A pilot study. Indian J Pharmacol 2016;48:S52-6.
Mallick AK, Mallick AK, Patel A. Comparison of structured viva examination and traditional viva examination as a tool of assessment in biochemistry for medical students. EJMCM 2020;7:1785-93.
Nehm R, Mead L. Evolution assessment: Introduction to the special issue. Evo Educ Outreach 2019;12:7.
Krupat E. Critical thoughts about the core entrustable professional activities in undergraduate medical education. Acad Med 2018;93:371-6.
Al-Wardy NM. Assessment methods in undergraduate medical education. Sultan Qaboos Univ Med J 2010;10:203-9.
Scott M, Unsworth J. Matching final assessment to employability: Developing a digital viva as an end of programme assessment. High Educ Pedagogy 2018;3:373-84.
Huxham M, Campbell F, Westwood J. Oral versus written assessments: A test of student performance and attitudes. Assess Eval High Educ 2012;37:125-36.
de Silva V, Hanwella R, Ponnamperuma G. The validity of oral assessment (viva) that assesses specific and unique competencies in a post-graduate psychiatry examination. Sri Lanka J Psychiatry 2012;3:16-9.
Knight RA, Dipper L, Cruice M. The use of video in addressing anxiety prior to viva voce exams. Br J Educ Technol 2013;44:E217-9.
Framp A, Downer T, Layh J. Using video assessments as an alternative to the Objective Structured Clinical Examination (OSCE). Aust Nurs Midwifery J 2015;23:42.
Naseem S, Javed M, Baneen B. Developing and implementing structured viva voce examination as a valid and reliable assessment tool in biochemistry for first year BDS students. J Clin Diagn Res 2019;13:5-8.
Georgiou G. Are oral examinations objective? Evidence from the hiring process for judges in Greece. Eur J Law Econ 2017;44:217-39.
Knight RA, Dipper L, Cruice M. Viva survivors – The effect of peer-mentoring on pre-viva anxiety in early-years students. Stud High 2018;43:190-9.
Moleyar V. How to conduct medical viva. Med J DY Patil Vidyapeeth 2018;11:374-88.