As students often ask for a viva assessment in philosophy, we set out to answer three questions:
I. Would there be a sizable uptake of viva assessment if it were offered?
II. Would students perform roughly as well as they did in other modes of assessment?
III. Would students find the experience educationally satisfying?
It turns out that there was not a sizable uptake, and that there was no statistically significant difference in performance between modes of assessment. Finally, students did find the experience educationally satisfying. In this paper we reflect on these results, consider the limitations of our study, and conclude with some suggestions for future work. The paper should be particularly useful for anyone within the Arts and Humanities who is considering introducing assessed oral presentations into their course.
The use of vivas in 2nd year philosophy: thoughts and insights.
Oral assessment has been around in one form or another for hundreds of years (Stray, 2001), and as Joughin (2010, p. 1) comments, there is “every reason to believe that oral forms of assessment are as important now as they ever were”. Some disciplines use it more than others (Huxham et al., 2012); it is, of course, the standard way of examining the PhD thesis, whereas other disciplines, such as philosophy, use it less frequently.
In this short paper we will be concerned with one very specific type of oral assessment, the ‘viva’. The definition of ‘viva’ is discussed in the literature, e.g. Goffman (1974) and Dobson (2008) and it also links to broader concerns about dialogue, e.g. Bohm (2003). However, in this case, the viva involves an undergraduate student being questioned for 30 minutes by two members of staff. The topic is selected by the student from the syllabus and the viva is used to ascertain the depth of the student’s knowledge on that elected topic.
There are many reasons that might be given to undertake vivas, and, in fact, Joughin (2010, p. 5) cites seven: the learning outcomes demand it; it allows probing of the student’s knowledge; it reflects the world of work; it improves learning (Carless, 2002); it suits some students (Waterfield and West, 2005); the meaning of questions can be clarified; it helps to ensure academic integrity.
We cannot discuss all of these here, but those of particular importance to us are: student interest, programme specification, and the world of work.
At our annual teaching day we consider the ways in which our current curriculum might be refined and augmented. In 2010 we received the results of an anonymous survey of philosophy students, conducted by students (133 respondents from a possible 390).
More than 10% of students indicated that, given the choice of exam, essay or viva, they would prefer a viva (note that each of these assessment methods was explained in the questionnaire). One student remarked: ‘I think the best assessment would be just a discussion with the lecturer or something similar’. Another: ‘giving a “presentation”, if it involved being asked questions, would demonstrate how well students have a grip on what they’ve been studying’.
However, 57% of the students who responded to the survey indicated that they were opposed to viva examinations. A number of students expressed an extreme level of concern at the possibility of being made to take any kind of assessed oral presentation: ‘I do not enjoy presentations’; ‘I don’t enjoy presenting to a group of people’; ‘presentations would be awful!’; ‘I resent graded group presentations…because you can be penalised for other students’ failings.’ In general: “nerves”, “freezing”, “shyness”, “embarrassing” and “incompetency” are all words students used in relation to assessed oral presentations.
Another pertinent point was the kinds of skills and abilities the students expected to acquire from a philosophy degree. For instance, one student remarked: ‘Whilst some may not be comfortable with presentations, this adds another skill other than essay writing.’ Another noted: ‘I think a philosophy degree should teach you to think, write AND debate. I have experienced too little of the latter, even in seminars.’ Finally, our student course representative argued that:
Oratory skill is underdeveloped in philosophy students because of shyness etc. I believe the department should really consider developing this skill. Perhaps presentation is the wrong method of doing this, but at the very least, oral philosophical input during seminars should be graded and included in the overall module grade. Most students either go on to pursue a postgraduate philosophy course, or try to find a job. This skill will make you more successful as a teacher, PhD student, lecturer etc. it will also prepare you for other careers that philosophy students are known for pursuing, e.g. consulting, marketing, management, local governance and sales. If you really think it through, you cannot deny that the ability to communicate and articulate yourself verbally to individuals is not only an underdeveloped skill in our course, but a skill that will always be required throughout life.
- 43% of students were in favour of some form of oral presentation.
- A little over 10% indicated that some form of oral presentation would be their preferred mode of assessment.
- There were some extremely negative feelings towards group presentations.
- There was some recognition of the value of developing good oral communication skills.
The programme learning outcomes for our Undergraduate Philosophy Programme include the aim to:
Develop students’ ability to think logically and critically, to acquire problem-solving skills and communicate effectively both orally and in writing (italics added for emphasis)
The assumption made was that students would meet this outcome through the process of attending and contributing to seminars. However, the student comments do not support this claim, as students did not believe they were acquiring these skills. Moreover, research suggests that students best acquire skills when they are assessed on them (Gibbs and Simpson, 2004, p. 4). This presented us with a choice: either we could better assess the aims of the programme by including some form of viva, or we could modify the programme specification. Upon reflection and discussion we agreed that the programme aims were indeed correct, and that consequently we would include some form of oral assessment within the programme.
Delivering a philosophical education is our primary goal. However, we are aware that the chief qualification that our graduates will carry with them into the world of work is their degree certificate. When considering the assessments that we offer, and the skills that we think it best for our students to have, we think that it would be irresponsible not at least to consider the potential gains that might be made by including some form of oral assessment. (The link between assessment and employability is well documented: see for example, Knight and Yorke, 2004, especially chapter 6).
Once successful in getting a job (which, of course, often involves an interview), students may frequently find themselves in positions where they are required to explain their views on a range of matters.
Most fields of practice are dominated by talking rather than writing – listening and responding as a client discusses his or her needs; explaining a course of treatment to a patient; teaching a class of students. (Joughin, 2010, p. 6)
It is therefore in the students’ own interests to have some form of assessment where they are able to try out those skills and refine them in light of feedback (see also Crosling, 2000, and Davis, Misra and van Auken, 2002, on the importance of communication skills).
This is also relevant for academic progression as we expect graduates who are working towards research degrees to communicate their views clearly and effectively. This is why one of the requirements for most post-graduate degree programmes is an oral thesis or dissertation defence. So, adequately preparing our students for such professional interaction would be another motivation for assessing the students’ ability to communicate orally as well as in writing.
Consequently, we needed some form of oral assessment that would give our undergraduates the opportunity to test their oral skills and to receive some feedback. However, we had to be careful about how we defined ‘viva’ and, in particular, how it related to group presentations; recall that there was resistance to group presentations.
The nature of the assessment we chose took the form of ‘vivas’. In the following section we describe the structure and rationale of these.
Although there is quite a lot of detail in this section, our hope is that other academics can use it as a model when starting to think about using vivas in their own courses. We encourage academics to use Joughin’s framework when considering their design, in particular the six key questions he uses: (1) What is being assessed? (2) What is the interaction going to look like? (3) How authentic or ‘realistic’ is the viva (this is particularly relevant to clinical practice)? (4) What is the structure going to be like? (5) Who is going to run the assessment? (6) Will it be a mix of different modes of assessment, e.g. oral and written? (Joughin, 2010, pp. 10-13).
We chose to run the vivas in the second year because students in their first year do not always engage robustly with their course and because the final year of study is weighted so as to count for 60% of the final mark. Including the assessment in the second year, with only 40% of their final mark at stake, meant that students would take the task seriously, but not be scared off from trying out a new mode of assessment. Furthermore, we decided to make the viva assessment optional for students.
We chose to use Metaphysics, an optional second-year module with a typical enrolment of around 80 students. The normal method of assessment mirrors that of the wider Department: a 2-hour written exam (contributing 60% of the final mark) and a 2,200-word piece of coursework (contributing 40% of the final mark). The only slight deviation from the departmental norm is that the exam for Metaphysics is a seen paper; questions were released around 3-4 weeks ahead of the exam.
For the years included in the study (2011-12 and 2012-13) students were offered a choice of how they were to be assessed:
- Option 1: A 2-hour exam in which two questions had to be answered, plus a 2,200-word essay (the departmental norm)
- Option 2: A 1-hour exam in which one question had to be answered, a 30-minute viva, and a 2,200-word essay
In option 1, the written exam was divided into two sections, A and B, and students were required to answer one question from each section.
In option 2, the written exam consisted solely of Section A, and the questions in Section B were covered by the viva questions; this meant that options 1 and 2 covered the same material.
There needed to be some structure to the viva in order to strike a balance between presentation and unstructured discussion:
Students need a more-or-less predictable structure to allow them to plan for the assessment and to reduce unnecessary anxiety about unknowns; while a high degree of structure can also increase the reliability of the assessment. However, if the assessment is overly structured the capacity to ask probing follow-up questions can be lost… (Joughin, 2010, p. 11)
The option that we fixed upon was as follows. Students were given a choice of three main questions (each one approximating one of the questions on the longer written exam), and each of these three questions had three sub-questions. The reason for this was that we believed that answering just one main question for 30 minutes was too open-ended a task, especially for students who were new to this assessment method. All of this material (main questions and sub-questions) was released to the students at the same point as the written exam questions were released to students not taking the viva. The following structure is representative:
Main Question: Critically assess view P
Sub Q 1: What is view P?
Sub Q 2: What is the best argument against view P?
Sub Q 3: What is the best argument in favour of view P?
In addition to this, we prepared follow-up questions designed to give more structure and to allow the examiners to probe the students’ knowledge. These questions were not seen in advance, though students were told what to expect. Preparing them in advance minimized the need for unscripted questions, thereby ensuring greater consistency and fairness.
Students were told that the exam would last 30 minutes. During that time they would be expected to answer each of the sub-questions and at least one of the follow-up questions. Students were told roughly what kind of follow-up questions to expect and that time-keeping was down to them.
The exams were all videoed, for three reasons. First, research on implicit bias suggests that decisions that are reached in haste, and that cannot be reviewed, are more likely to be prone to implicit biases (see, for example, http://www.biasproject.org/); by recording the exam, student performance could be reviewed. (For more on bias and related problems with vivas, see Kehm, 2001; Birley, 2001.) Second, to allow for moderation. Third, to allow review by an external examiner.
To further reduce bias, each viva was carried out by two examiners: one was the module convenor, the other a philosophy lecturer who did not specialize in metaphysics. So that students could choose between the viva and non-viva options, the optional mode of assessment was announced in the first and second lectures of the module; it was described in the module guide; explanatory notes about it were placed in the virtual learning environment; and two emails were sent to all students on the course.
We required students to make their choice by the end of week two of the semester. Ideally they would have had longer; however, practicalities concerning the University’s exam timetable made this impossible. The vivas were then scheduled with students individually, via email, in the week immediately prior to the written exam. Hence, preparation time for the viva was reduced by only a few days.
The uptake was disappointing and unexpected. It also meant that we could have little confidence in the quantitative data (to have confidence we would hope for at least 25 participants). We discuss this worry below, and for this reason we only touch on the quantitative data. Consequently, although we felt confident answering our first question, regarding uptake, we were less confident regarding the second: would students perform roughly as well as they did in other modes of assessment?
In the academic year 2011/12, 7 students (out of the 82 enrolled on the module) opted to take the viva. In the academic year 2012/13, 11 students (out of the 80 enrolled on the module) opted to take the viva. In terms of gender balance, a greater proportion of male students (4.34%) than female students (1.85%) chose to sit the viva.
Given the feedback from students, we had expected uptake to be higher. As noted, a little over 10% of students declared that, out of exam, essay and oral presentation, an oral presentation would be their preferred mode of assessment, with 43% of the population being generally in favour of such a mode of assessment. In light of this, we found the approximately 11% uptake to be low. We conducted further qualitative research pursuing this theme; in particular, we consider risk and how it relates to moving away from the perceived departmental norm.
One other notable feature of the uptake was the extremely low proportion of female students who opted to take the vivas. There is research that would predict this result; it typically points to the greater likelihood of men taking risks in HE (see, for example, Byrnes, Miller and Schafer, 1999).
In order to pursue an answer to question three (namely, would the students find the experience educationally satisfying?), the viva examiners ran a focus group with students who had opted for the viva. The focus group had no agenda or list of set questions. The discussion was recorded, and the students gave permission for their contributions to be used on the basis that their input would be anonymous. A number of general themes arose. We will give a description of these and then discuss them more fully below.
The first concerned the students’ motivations for opting for the viva. One student summed up a common feeling:
It meant there was less pressure in the exam – it spreads the pressure out.
In fact, interestingly, when talking about their motivation for opting for the viva, students said very little about the positive aspects of the viva itself. Rather, they talked about the benefits as they related to the other assessments on the course. Consider what is actually referred to in the quotation below. When asked about the motivation for doing vivas, one student responded:
[Yes] I think maybe that if a few more exams would have that option [viva + shorter exam] that would be good…. I do like chopping the exams up into little bits – it is less stressful
We further pressed this issue by presenting the group with a scenario in which they were taken back in time to before they had chosen the viva. We then presented them with the option of either two shorter exams, or a viva and a short exam. This choice made them less convinced that they would have picked the viva.
I wouldn’t have a preference…
That would be cool if all exams were like that!
Notice again that the shorter exams were the focus. Given their reservations about the positive benefit of the viva qua viva, the authors wondered if the students considered themselves risk-takers. When asked explicitly about this, something interesting happened.
None of the students were prepared to talk about their opting for the viva as a ‘risk’. Likewise, none were prepared to say that the reason their peers didn’t opt for the vivas was that they perceived them as a risk.
It is unclear why they didn’t like the narrative of ‘risk’. The interesting thing was that, despite their reticence to use the language of ‘risk’, it was clear that they did think it was a risk. Interestingly, there is evidence to suggest that the effect of student anxiety and risk is over-emphasized (Pearce and Lee, 2009).
I don’t think it was a risk – when I was worried about it I just thought I will just work really really hard. And come really prepared. That was my like calculated bet.
It was a calculated bet.
Calculated but not a risk
I think I would be slightly more tempted to do the exams and play it safe… (emphasis added throughout)
When talking further about why their peers might not have taken the viva, another idea emerged:
One of my friends said that she didn’t like the idea because it was obviously her that was putting the ideas forward. She said that because [exams] were anonymous and you are hidden behind that paper then if you write something that sounds maybe not quite right then it doesn’t have a direct impact on you – immediately I mean. There is no immediate feedback.
This speaks to a worry about the immediacy and transparency of the viva as a form of assessment; there is nowhere to hide!
In an exam you are just a number but the presentation is personalized and you’re in direct contact with the people who assess you. (Joughin, 1999, p. 152)
However, there is another related issue: that of perceived bias. That is, if students felt that they had been identified by the lecturer as weak, a trouble-maker or whatever, then they would be reluctant to choose the viva for this reason.
Another general observation was that there was very little mention of transferable skills. Recall that one reason the viva was introduced – a reason which echoed what students were saying – was to improve communication and to help further equip students for the ‘world of work’ and further study. However, very little of this was talked about in the focus group. In fact, only one comment in the 50-minute focus group mentioned the perceived benefit in this area:
I was recently preparing for job application and interviews and you can cite [the viva] as evidence that you have good communication skills. So I was applying for a research based position and you can say that you’d done all this research and communicated it to other people. So it is useful.
We then asked whether they thought the viva was a good way of assessing what they had learned. Their response was again unexpected. For example, we might have expected an answer along these lines: ‘yes, it was a good test of what I learnt. It allowed me to develop more fully what I had read about universals, truth… etc’. Whereas this is what they said (emphasis ours):
I can’t remember the feedback that I got – on that basis I guess I say it was. I felt that it was.
I think it was a good test because you did structure the questions like how you would answer an essay question anyway.
I can’t remember exactly but I think it was mirrored on the structure of what you did in the lectures, so that it was a good test of what we’d learnt. But then it also had to be a test of wider reading. Yes, and actually I was able to put my individual input into [it] as well.
I am the same – I normally sort of like when I write an exam or a piece of coursework I believe what I write (normally). That is what makes you feel like a philosopher I think it is probably about the same for me.
When asked how we might improve the vivas so as to improve the development of their communication skills the students did not seem to have the same understanding of the question as we had.
Doesn’t having an hour exam improve it anyway?
I think it does! Maybe having a mock would help – then you could get feedback and then improve on your communication. The reason that the feedback you get after the exam is kind of looked over sometimes is because you don’t really need to know – because you won’t have any other vivas. But it doesn’t really directly fit into any other exam we have; maybe if we had a mock and then had some feedback that would help communication skills.
In particular, notice how ‘communication skills’ is tied directly to assessment. When pushed on whether there was anything else that they thought could improve communication skills, including changing the assessment method completely, all agreed with this reply:
‘no, can’t think of anything else…[at least] nothing that would work with philosophy’.
To remind the reader, our questions were:
- Would there be a sizable uptake of viva assessment if we offered it?
- Would students perform roughly as well as they did in other modes of assessment?
- Would students find the experience educationally satisfying?
Drawing on comments made in the focus group, we make a few remarks which may start to explain our findings relating to the first two questions. We end with a couple of suggestions for further research and development.
Uptake: given that a significant proportion of students wanted their philosophy degree to help them with their communication skills – with some even mentioning presentations – we would have expected a sizable uptake for the vivas. There wasn’t.
One plausible reason is that students are strategic in the type of assessment they pick. Indeed, the students in our focus group backed up this suggestion: they talked in terms of ‘weighing up’ and making a ‘calculated bet’. As the vivas carried marks, in the students’ eyes the risk was too big, and so they stayed with the departmental norm, where they felt ‘safer’.
There are two ways in which we might increase uptake, and consequently increase the richness and statistical significance of the data. The first is to make vivas compulsory. There would no doubt be some resentment about this, as the vivas would become a compelled perceived risk; nonetheless, we feel this would be an excellent response to the issue, since the principle of not doing something because students might resent it is not a sound pedagogical one, and would probably result in a very uneventful and quiet classroom.
Second, we could make the vivas non-assessed; the risk here, however, is that in our experience students are reluctant to complete work which isn’t assessed. Nevertheless, if the pedagogy and associated benefits were carefully explained to the students, we would hope that they would use the viva to improve their oral skills even if it did not count towards a grade.
Performance: there is an expectation that students will not perform well in vivas, because they have very little experience of formally using their oral skills with a formalised assessment as an outcome. Exams and essays are second nature to them, and thus they prefer those modes of examination.
There is a belief that the knowledge required to take a viva is different from that required to take an exam or write an essay. Arguably (though we do not defend this here), the knowledge required to complete a viva is ‘deeper’ (Sayce, 2007; Floyd and Gordon, 1998), or is at least of a different form from that required for essays and exams (Wass et al., 2001). Consider that in exams and essays the critical element is very delimited and ‘safe’: the students choose what to include and are themselves their own prompt and critic; there will be no challenge or idea which will suddenly throw them off their focus.
This contrasts with the viva. Taking a viva forces students to reflect on why they hold certain views – and, importantly, they are not choosing the ‘why’ questions (it is perhaps no surprise, then, that Socrates chose to develop his ideas through dialogue). This in turn means that they have to think about the material from ‘every possible angle’. This is harder.
Satisfaction: taking our findings from the focus group, we think the vivas were a qualified success: the students did seem to feel that the experience was a positive one, and no one felt it was unpleasant. Furthermore, there are comments indicating that the students found the experience to be one that tested them appropriately.
As the viva examiners ran the focus groups, the students may have been too reluctant to give a true account of what they felt about the vivas. This was a serious flaw in how the project was undertaken (see Morgan, 1997, on how bias and power structures can undermine focus groups). So even though it was made clear that anything the students said would be ‘off the record’, they may not have fully believed this, and hence may not have said anything which might have reflected negatively on them. Consequently, an obvious area for improvement would be for someone other than the viva examiners to run the focus group.
Another improvement would be to give a longer time for the students to consider their options which hopefully would lead to a greater ‘buy in’.
We have already mentioned some possible avenues for further work. One obvious step is to increase the data set so as to allow us to make some justifiable statistical claims. This in turn would mean we could meaningfully reflect on the different outcomes of written and oral work, with a mind to what Joughin (2010, p. 13) calls ‘concurrent validity’.
Another would be to increase the number of focus groups – run by neutral members of staff – and to include students who didn’t opt for the viva. This would allow some of the current themes and ideas, such as ‘risk-avoidance’, to be explored further.
There is a student desire for the philosophy course to ‘improve communication skills’, and yet students neither opted for the viva nor acknowledged that the vivas were helping to develop this skill set. Consequently, the Department of Philosophy ought to think about the ways in which we can meet this student request, especially as ‘communication skills’ are central to many degree programme specifications.
We suggest that this might start with further work with students on how they understand the term ‘communication’, and in particular how they hear the word in conjunction with their degree programme. The outcome of this discussion may include ‘rebranding’ what we already do, or perhaps something more radical.
For example, one current suggestion is to run a module entitled ‘Communicating Philosophy’. Not only does this have ‘communication’ in the title, but the module will involve students being trained by various professionals outside academia with regards to how to communicate philosophy to different audiences; for instance, how might a newspaper communicate certain content? Or how might a radio programme? Or a primary school teacher?
This is a radical move for any philosophy department to make. The traditional distinction between rhetoric and philosophy – where the former is inferior to the latter – still holds sway, and opponents might put this module firmly in the former category. However, if the philosophical content is still being taught and – most importantly – quality assured, then this explicit focus on communication may meet the students’ expectations regarding communication, with the added benefit of giving them a direct link to employability.
Birley, H. (2001). The Society of Apothecaries Diploma examination in Genitourinary Medicine: death of the viva voce? Sexually Transmitted Infections 77 (3), pp. 223-228.
Bohm, D. (2003). On Dialogue. London: Routledge.
Byrnes, J. P., Miller, D. C. and Schafer, W. D. (1999). Gender differences in risk taking: A meta-analysis. Psychological Bulletin 125 (3), pp. 367-383.
Carless, D. (2002). The ‘mini-viva’ as a tool to enhance assessment for learning. Assessment and Evaluation in Higher Education 27 (4), pp. 353-363.
Crosling, G. (2000). Transition to university: The role of oral communication in the undergraduate curriculum. Journal of Institutional Research 9 (1), pp. 69-77.
Davis, R., Misra, S. and van Auken, S. (2002). A gap analysis approach to marketing curriculum assessment: a study of skills and knowledge. Journal of Marketing Education 24 (3), pp. 218-224.
Dobson, S. (2008). Theorising the academic viva in higher education: the argument for a qualitative approach. Assessment and Evaluation in Higher Education 33 (3), pp. 277-288.
Floyd, C. and Gordon, M. (1998). What skills are most important? A comparison of employer, student, and staff perceptions. Journal of Marketing Education 20 (2), pp. 103-109.
Gibbs, G. and Simpson, C. (2004). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education (1), pp. 3-32.
Goffman, E. (1974). Frame Analysis: An Essay on the Organization of Experience. Cambridge, MA: Harvard University Press.
Huxham, M., Campbell, F. and Westwood, J. (2012). Oral versus written assessments: a test of student performance and attitudes. Assessment and Evaluation in Higher Education 37 (1), pp. 125-136.
Joughin, G. (1999). Dimensions of oral assessment and student approaches to learning. Assessment and Evaluation in Higher Education 23 (1), pp. 146-156.
Joughin, G. (2010). A Short Guide to Oral Assessment. Leeds: Leeds Met Press.
Kehm, B. (2001). Oral examinations at German universities. Assessment in Education: Principles, Policy & Practice 8 (1), pp. 25-31.
Knight, P. and Yorke, M. (2004). Assessment, Learning and Employability. Maidenhead: Open University Press.
Miller, C. and Parlett, M. (1974). Up to the Mark: A Study of the Examination Game. Guildford: Society for Research into Higher Education.
Morgan, D. (1997). Focus Groups as Qualitative Research. Qualitative Research Methods Series 16. Thousand Oaks, CA: Sage.
Pearce, G. and Lee, G. (2009). Viva voce (oral examination) as an assessment method: insights from marketing students. Journal of Marketing Education 31 (2), pp. 120-130.
Sayce, S. (2007). Managing the fear factor (or how a mini-viva assessment can improve the process of learning for international students). Proceedings of the 6th European Conference on Research Methodology for Business and Management Studies. Academic Conferences Limited, pp. 275-285.
Snyder, B. (1971). The Hidden Curriculum. New York: Knopf.
Stray, C. (2001). The shift from oral to written examination: Cambridge and Oxford 1700–1900. Assessment in Education: Principles, Policy & Practice 8 (1), pp. 33-50.
Wass, V et al. (2001). Assessment of clinical competence. The Lancet 357 (9260), pp. 945-949.
Waterfield, J. and West, B. (eds) (2005). Inclusive Assessment in Higher Education: A Resource for Change. Plymouth: University of Plymouth.
Wistedt, I. (1998). Assessing student learning in gender inclusive tertiary mathematics and physics education. Evaluation and Program Planning 21 (1), pp. 143-153.