The paper examines the author’s experience of developing e-learning materials to support a traditionally delivered first-year undergraduate module, setting this in the context of institutional and national moves towards the greater utilisation of e-learning in enhancing student learning. The paper focuses on the impact of the e-learning materials on student performance over a three-year period and assesses the possible reasons behind the dramatic improvement in marks achieved by those students who engaged with the e-learning materials.
or How to Improve Retention and Progression by Engaging Students
Paul Catley is a senior lecturer in the law department and is currently researching the effectiveness of e-learning in enhancing student retention and progression as part of his University teaching fellowship.
E-learning at Brookes
‘E-learning offers significant benefits and some unique attributes compared with traditional course delivery, such as (i) time and location shifting, (ii) flexible sequencing, (iii) widen access and increasing diversity, (iv) access to extensive resources, and (v) improved communications and acceleration of feedback. The above aims and the overall vision and strategic goals of the University requires excellence and innovation in teaching and learning and e-learning is central to realising these.’ (Oxford Brookes University)
The University’s commitment to e-learning did not begin in 2002, as the Brookes Virtual Project had already been launched in 2000 and before that there were individual pioneers ploughing their own furrows. Since 2000 there have been three biennial Brookes Virtual conferences and this paper is developed from a paper given at the most recent conference in July 2004.
Throughout higher education recent years have seen increased pressure on resources. These pressures are detailed in the Report of the National Committee of Inquiry into Higher Education (‘the Dearing Report’). The report, which was published in 1997, expressed concern that ‘further planned cuts on top of a unit cost reduction of 40 per cent over the last 20 years…may damage the intrinsic quality of the learning experience which underpins the standing of UK awards’ (Dearing – Summary Report, paragraph 13). Subsequent experience suggests that these fears of further financial stringency were not misplaced. As universities’ finances have become stretched they have looked for alternative sources of revenue and this has imposed additional demands on staff, who in addition to their teaching face growing pressures to bring in money through research, consultancy, short courses, or any other money-making project which can be viewed as falling somewhere within their universities’ often incredibly broadly written mission statements. In such a world it is perhaps unsurprising that there are concerns about standards. Logic suggests that if student numbers keep increasing and unit funding of students keeps falling, at some point the maintenance of quality must come into question. Teaching quality assessments and similar quality assurance mechanisms have become an embedded feature of university life. As universities are expected to teach ever-increasing numbers of students at reduced sums per student, e-learning is seen as a potential answer:
‘In the present context of financial stringency and greatly increased numbers of students entering HE, the maintenance of quality in the face of reduced units of resource is something to which technology assisted teaching might contribute.’ (HEFCE, 1996)
The term ‘might’ in the above quotation is significant. Will ‘technology assisted teaching’ or e-learning make a contribution to the ‘maintenance of quality’? Is it a cheap fix to the problem? Or will it fail to deliver?
The report commissioned by HEFCE noted that ‘massive increases of the use of IT for teaching and learning has led to growing awareness of the potential use of computer-assisted methods in universities’ (Consortium of Telematics, 1997). This report concluded:
‘We recommend to the funding bodies that all staff entering HE be required to achieve a defined threshold level of appropriate ITATL [Information Technology Assisted Teaching and Learning] skills and awareness as suited to their role of support, teaching or management.’ (Consortium of Telematics, p.6)
In the same document they also called for ITATL provision to become part of the quality assurance mechanism:
‘We recommend the new Quality Assurance Agency take specific account of the quality and extent of ITATL in the review of teaching quality.’ (Consortium of Telematics)
Similarly, the Dearing Report discussed the need to develop good computer-based learning and teaching materials for higher education (Dearing, 1997; see, for example, recommendation 15). This belief in ICT as a means of enhancing higher education is also very apparent in the Government’s response to the Dearing Report, in which it is stated that:
‘Communications and information technology offers opportunities to increase the effectiveness of learning and to provide improved access to higher education. All those concerned with the delivery of higher education have a responsibility to seek to ensure that the benefits of communications and information technology are exploited as fully as possible.’ (Department for Education and Employment, 1998)
This attitude is also seen in the Dearing Report, which states:
‘We believe that the innovative exploitation of Communications and Information Technology (C&IT) holds out much promise for improving the quality, flexibility and effectiveness of higher education. The potential benefits will extend to, and affect the practice of, learning and teaching and research.’ (Dearing, paragraph 13.1)
‘C&IT will have a central role in maintaining the quality of higher education in an era when there are likely to be continuing pressures on costs and a need to respond to an increasing demand for places in institutions.’ (Dearing, paragraph 13.2)
I did not start as a zealot convinced of the value of e-learning and longing to be allowed to unleash it on my unsuspecting students. Nor did I consider that I had to become involved in e-learning because of Government or HEFCE diktat. My own route to e-learning began somewhat circuitously. I had enrolled on the Postgraduate Diploma in Learning and Teaching organised by the Oxford Centre for Staff and Learning Development and was looking for acceptable modules which would not involve a regular weekly commitment as I was anxious that my studying should not interfere with my teaching commitments. A number of School of Education modules were acceptable for the Diploma and I found one module on Information and Communication Technology that was taught over a weekend and then subsequently via distance learning. The weekend’s training centred on web-design and the use of Dreamweaver. The assessed work was to create web-based materials for a course. Prior to taking the course I had absolutely no skills in web-design and had never used Dreamweaver or any other similar product. Furthermore, I guess I would have classed myself as sceptical about the use of such technologies and doubtful that they could replace or even effectively supplement university teaching.
Stage 1: introducing web-based materials
Committed to incorporating e-learning into one of the modules on which I taught, I decided to focus on a first-year, first-term module for which I was module leader. The module, Legal Method, was a compulsory module taken by all law students. Approximately three-quarters of law students have not studied law prior to university, so the idea of the module is to enable these students to acquire the basic skills in reading cases and interpreting statutes that they will need in their future legal studies. Many students find the transition to university difficult: they are suddenly part of a large cohort of students and away from what for many was a much smaller, more supportive school regime. Where they are taking a subject they have not previously studied, they have further problems in that they are not exactly sure what is expected of them or what will be required. The old Brookes’ system of termly assessment meant that they had to hit the ground running. In Legal Method, the first assessed coursework was submitted in the fifth week of term. Then after eight weeks of teaching, there was a revision week and then exams. The new semesterised system, even though it allows a little more time for students to find their feet, still involves it being all over by Christmas.
Initial thoughts about e-learning
I was aware from student feedback that Legal Method students did not feel sufficiently prepared for the coursework or the exam. Although they practise the skills in seminars and were provided with a Skills Booklet which included exam and coursework guidance, a significant number of feedback responses indicated that they did not know what was expected of them. I therefore decided to focus my initial e-learning materials on web-based exam and coursework guidance. Using questions from previous years I produced a three-step scheme whereby the student would be first faced with the question on screen and would be asked to write down their answer (you can see how low-tech the approach I adopted was!). Having written down the answer the student could click on to the next screen which would present them with details of what the markers were looking for in answer to that question and then a final click would take the student to the marking guide for that question and would invite the student to assess their own work. Student feedback was very positive about the exercise; the feedback was taken in the final lecture – after they had completed and received back their coursework but before they had taken the exam. The feedback indicated that 42% had made use of the coursework guidance and of these 86% said they had found it valuable. Every student responded to the feedback questionnaire by stating that they intended to look at the online exam guidance before the exam.
The Proof of the Pudding
Prior to the introduction of web-based materials, results in Legal Method had been disappointing. Typically approximately 25% of students failed the module at first attempt and a similar number got marks between 40 and 49%. Approaching 80% of students who re-sat the module passed the re-sit which suggested that they had the ability to get through but for whatever reason did not do so at first attempt. If one wanted to be upbeat about the results one could have pointed to a pass rate including the re-sit of around 90%, but the truer picture I felt was one where about half the students were getting what they were likely to view as a disappointing result and half of these students’ study in the following term was being disrupted by the necessity to work for a re-sit exam whilst embarking on their term 2 studies.
Results of the first attempt prior to the introduction of e-learning materials
As can be seen, not only was the number of marks below 50% disappointingly high, the number of marks of over 70% was low. The picture after introducing the e-learning materials was transformed.
Results of first attempt with online materials
The number of fails fell from around 25% to less than 15% and the number of marks below 50% dropped from around half the cohort to just over a third. The number of Firsts and 2:1s was now over a third, whereas before it had been closer to a fifth.
Whilst writing the computer-based materials had been time-consuming, the benefits were not just in better student performance but were also seen in time saved through having fewer re-sit students to see prior to their re-sit and fewer re-sit scripts to mark. The other benefit was that, once written, the materials would not have to be changed (or so I thought), so, if the improvement in performance continued, the benefits would continue to accrue year on year.
From a position of scepticism I was now well on the way to becoming a convert. The students in 2001-02 had done much better than students in previous years, yet aside from the provision of web-based materials they seemed a comparable group. The entry requirements for the course remained unchanged, the lectures and seminars similarly had not altered. The exam and coursework format continued as before and I continued to deliver all the lectures. However, before being convinced I wanted to see if the improvement would be sustained in future years.
Stage 2: Introducing WebCT-based quizzes
Well on the way to convert status I happily signed up to MediaWorkshop courses on using WebCT and was impressed by what could be done with the package and particularly, having struggled with Dreamweaver, with how easy it was to use. I decided to concentrate on quizzes. My aim was to enable students to assess whether they had understood the material covered in the lectures, as I have found, perhaps because it is a subject that most of them have never previously studied, that many students are not entirely clear what they should be learning from the lecture. I found the mechanics of designing WebCT quizzes very straightforward. However, I found writing the questions more difficult. WebCT has a range of styles of question that can be adopted. All but one of the styles are based around a multiple-choice format. Being inexperienced in drafting multiple-choice questions I found the process of coming up with plausible, yet definitely wrong, answers difficult and time consuming. The opportunity to provide additional feedback that would be prompted by particular answers was a nice feature and felt far more supportive than a stark ‘correct’ or ‘incorrect’ message. My original plan to release a quiz on each week’s lecture immediately after the lecture was thwarted by the fact that writing the quizzes took longer than I had anticipated and instead students had quizzes on the first four lectures released in week 4 and then a further five quizzes released to coincide with the final lecture.
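The answer-specific feedback described above is not tied to any particular package. The following minimal Python sketch (the question, options and feedback strings are invented for illustration, not taken from the actual quizzes) shows the underlying idea: the response to a wrong answer explains why that particular distractor is wrong, rather than returning a bare ‘incorrect’.

```python
# Sketch of a multiple-choice item with answer-specific feedback.
# The question, options, and feedback text are illustrative only.

QUESTION = {
    "stem": "Which court's decisions bind the Court of Appeal?",
    "options": {
        "a": "The High Court",
        "b": "The House of Lords",
        "c": "The Crown Court",
    },
    "correct": "b",
    # Feedback keyed to the specific option chosen, including distractors.
    "feedback": {
        "a": "Incorrect: the High Court sits below the Court of Appeal.",
        "b": "Correct: decisions of the House of Lords bind the Court of Appeal.",
        "c": "Incorrect: the Crown Court is a court of first instance.",
    },
}

def mark_answer(question, choice):
    """Return (is_correct, feedback) for the chosen option."""
    is_correct = choice == question["correct"]
    return is_correct, question["feedback"][choice]

correct, message = mark_answer(QUESTION, "a")
print(correct, message)
```

Writing the tailored messages for each distractor is where the drafting time goes, which matches the experience described above: the mechanics are trivial, the question writing is not.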
In a system where increasing student numbers translate into larger seminar group sizes, formative feedback is often one of the first casualties. Staff struggle to find the time to mark and return work quickly and work which does not form part of the module’s assessment may well be sacrificed to allow time for more pressing demands. Peer assessment is one option, but online quizzes provide a useful alternative which has many advantages. Feedback is immediate. The student completes the quiz, submits their answers, and they are instantly marked and returned. Students can do the quiz whenever they want and they can do it as often as they want. Taking in coursework, marking it, and returning it the next week cannot begin to compare with the immediacy of the online quiz as a means of providing what Graham Gibbs has termed ‘quick and dirty’ feedback.
As the designer of a WebCT course you have access to an immense amount of information about the students doing your module. You know who does the quizzes, how often they do the quizzes, how long it takes them to do the quizzes, what marks they get, and even the time at which they did the quiz. You also get information about the cohort so you can identify those questions that students are finding easy and those that are proving difficult. You could then look at those questions which are creating difficulties and decide whether it is a problem in how the question is phrased or whether the fact that a number of students are getting it wrong perhaps suggests that you should go over the point again in a subsequent lecture. Given the amount of data one can glean from WebCT, I feel it is only fair to alert students to this before they decide to have a go at any of the questions.
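The kind of per-question analysis described above does not depend on WebCT’s own tracking screens; it can be sketched from any exported attempt log. In this hypothetical example (the records are invented), each row is one student’s response to one question, and the script reports the proportion of correct responses per question so that problem questions stand out:

```python
from collections import defaultdict

# Hypothetical exported attempt log: (student, question_id, answered_correctly).
attempts = [
    ("s1", "q1", True),  ("s1", "q2", False),
    ("s2", "q1", True),  ("s2", "q2", False),
    ("s3", "q1", False), ("s3", "q2", True),
    ("s4", "q1", True),  ("s4", "q2", False),
]

def facility(attempts):
    """Proportion of responses that were correct, per question."""
    right = defaultdict(int)
    total = defaultdict(int)
    for _student, question, correct in attempts:
        total[question] += 1
        right[question] += correct
    return {q: right[q] / total[q] for q in total}

for question, proportion in sorted(facility(attempts).items()):
    print(f"{question}: {proportion:.0%} correct")
```

A question on which only a quarter of responses are correct is then a prompt either to redraft the question or to revisit the point in a later lecture, as suggested above.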
Percentage of Fails
As can be seen from the chart the fall in the number of fails was sustained following the introduction of e-learning materials. From a position where over 25% of students failed the module at first attempt the situation became one where only just over half that number failed. The transformation of results did not only see struggling students perform better, it also saw more students performing very well.
Percentage of Firsts
Over the period the number of Firsts has more than doubled. As other factors have remained the same it seems likely that the explanation for the improved performance is linked to the e-learning materials.
Looking at the ‘before and after’ picture, it is very apparent that the introduction of e-learning materials has coincided with a very marked improvement in results. As stated, the quizzes were largely in the form of multiple-choice questions. Neither the exam nor the coursework involved any multiple-choice questions. However, Section A of the exam did contain ten short compulsory questions (worth 30% of the module’s marks) which were based on material covered in the lectures, for which the knowledge tested in the quizzes would be useful.
Before and after e-learning
Who took the quizzes?
WebCT enables one to look at the results of those who have used the WebCT quizzes and compare them with those who have not done the quizzes. This produces some interesting findings.
In 2003-04, 54.2% of students did one or more of the quizzes. The figure for 2002-03 was 42% – exactly the same percentage as had said they had made use of the web-based coursework guidance in 2001-02. Interestingly, more students did all the quizzes in 2002-03 than in 2003-04. Perhaps the release of the quizzes in two batches in 2002-03 led more students to work through all of them, whereas the weekly release in 2003-04, perhaps combined with feedback on the success of quiz-takers in 2002-03, led to a higher percentage of students attempting at least one quiz.
Female students were marginally more likely to take the quizzes than male students (56.8% of female students did one or more quiz as against 50.6% of male students). This perhaps fits the stereotype of female students working harder slightly better than the stereotype of male students feeling more comfortable with computer technology, but perhaps more noteworthy is the fact that there was little gender difference in terms of quiz taking.
With regard to entry qualifications there was no real distinction – the average A-level points of those who did the quizzes was 22.1 points, the average for those who did not take any of the quizzes was almost identical at 22.6 points. Therefore in so far as A-level grades are an indicator of likely university success there was virtually no difference between the two groups.
The final comparison that I elicited from the statistics was whether the students had previously studied A-level law. This showed that those who had previously studied law were slightly more likely to take the quizzes than those who had not (59.1% cf. 52.9%).
Comparing quiz-takers’ performance with that of non quiz-takers
So far what has been shown is that the introduction of e-learning materials coincided with an improvement in results and, as there were apparently no other significant changes, this perhaps suggested a causal link. What the tracking qualities of WebCT enabled me to investigate further was the relative performance of quiz-takers against non quiz-takers. As shown above, the two groups were substantially similar: female students were slightly more likely to take the quizzes, as were those who had previously studied law (a small minority of the overall cohort), but the entry qualifications of quiz-takers and non quiz-takers were basically the same, the quiz-takers having very slightly worse A-level grades than the non quiz-takers.
Comparing the results of quiz-takers and non quiz-takers 2002-03 and 2003-04
The chart shows a dramatic difference in performance between the two groups. The picture of results for non quiz-takers is very similar to the results of students prior to the introduction of e-learning materials (Chart 1 above) – about 25% are failing, a similar number are getting marks in the 40s and very few of them are getting Firsts or 2:1s. The picture for those who took the quizzes (and remember the threshold for being classed as a quiz-taker was completing one quiz) is quite different. Less than 5% of the quiz-takers failed and less than 15% got a mark in the 40s. The most common result was a mark in the 60s and the percentage of Firsts was approaching 25%. Looking at the comparison in terms of average mark it possibly looks less dramatic but is still very significant – quiz-takers averaged 58.8%, whereas non quiz-takers averaged 49.8%. Therefore quiz taking was linked to an average performance almost a whole class higher than the average performance of non quiz-takers.
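The group comparison reported above can be reproduced from a simple mark list. In this sketch the records are invented, but the calculation mirrors the one described: split the cohort by whether a student attempted at least one quiz, then compare the average module mark of the two groups.

```python
# Hypothetical mark list: (student, quizzes_attempted, module_mark).
# The students and marks are illustrative, not real cohort data.
records = [
    ("s1", 5, 68), ("s2", 0, 45), ("s3", 1, 62),
    ("s4", 9, 74), ("s5", 0, 51), ("s6", 3, 58),
]

def group_averages(records):
    """Average mark for quiz-takers (>= 1 quiz attempted) vs non quiz-takers."""
    takers = [mark for _s, quizzes, mark in records if quizzes >= 1]
    non_takers = [mark for _s, quizzes, mark in records if quizzes == 0]
    return sum(takers) / len(takers), sum(non_takers) / len(non_takers)

taker_avg, non_taker_avg = group_averages(records)
print(f"quiz-takers: {taker_avg:.1f}%, non quiz-takers: {non_taker_avg:.1f}%")
```

Note that, as in the study, the threshold for counting as a quiz-taker here is a single attempted quiz; a finer-grained analysis could instead correlate the number of quizzes attempted with the mark achieved.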
Were quiz-takers more engaged with the course?
In 2003-04 attendance at Legal Method seminars was monitored. Seminars remained optional, but students were aware that records of attendance were being taken. Once the module was completed and the exams marked, the seminar leaders pooled the attendance records and these were then compared with the quiz-taking records.
Links between quiz taking and seminar attendance
[Table: average number of seminars attended (maximum 8) and percentage of seminars attended, compared for quiz-takers and non quiz-takers]
It is very apparent from these figures that quiz-takers had a much better attendance record than non quiz-takers. This raises a chicken and egg question of whether being a quiz-taker leads to someone going to seminars or whether being a keen seminar attender makes someone more likely to have a go at the quizzes. It might seem that this evidence casts into doubt the value of the quizzes. Were the quiz-takers just the harder working students who then got the best results? This conclusion, however, does not explain the transformation of results following the introduction of e-learning. There is nothing to suggest that there was not an equivalent proportion of potentially hardworking students prior to the introduction of the e-learning materials as there was after the introduction. Yet results prior to the introduction of the e-learning materials were significantly worse. It therefore seems more plausible to argue that it is the e-learning materials that are helping to engage students. This hypothesis is supported by the results of Legal Method quiz-takers in other law modules.
[Table: average marks of quiz-takers and non quiz-takers in other law modules, and the percentage by which Legal Method quiz-takers outperformed non quiz-takers]
None of these three courses involved any online quizzes or other WebCT elements.
Legal Process, like Legal Method, is a first-term, first-year, single module. During the period in question it was compulsory for those studying for an LLB (hons) degree and was an option for those studying for a combined honours degree in Law. It is assessed entirely by exam. Those taking quizzes in Legal Method on average performed 6.8% better than those who did not take the quizzes.
Contract is a double module which during the period in question ran in the first and second terms. It is required for students wanting to enter the legal profession so, whilst not compulsory, is taken by nearly all law students. It is assessed by means of examination (75%) and coursework (25%). Remarkably the difference in performance when one compares quiz-takers and non quiz-takers was even greater than in Legal Method (10.8% cf. 9%). One possible explanation is that Contract is a module where knowledge builds on material covered in previous weeks so that students cannot pick and choose the parts on which they want to concentrate, but must attend regularly and work hard throughout. A possible reason why a number of students struggle in the module is that they concentrate in term one on those modules that are assessed in term one and hope, often mistakenly, that they can catch up with their Contract Law studies in the second term. If the Legal Method quizzes help to engage students with the course as a whole and not just with the Legal Method module it may be that the quiz-takers’ better seminar attendance also applied to Contract and that their greater commitment to the course led to markedly better performance than those who were less engaged and possibly fell behind in their studies.
Constitutional Law, like Contract, is required by students wanting to enter the legal professions and therefore is taken by nearly all law students. Throughout the period in question it ran in term 2, so it took place the term after Legal Method. Though less marked, there is still a clear difference between the performance of quiz-takers and non quiz-takers which suggests that the engagement achieved by the quizzes does have some lasting benefit – at least into the following term.
After three years of using e-learning materials in Legal Method, I am persuaded as to the value of quizzes and other online materials. At a time when we are seeing increasing student numbers and pressures to improve our performance in widening access I believe that e-learning can provide an important supplement to traditional teaching approaches. The time taken in producing e-learning materials should not be underestimated, but, with a little training, systems such as WebCT are relatively straightforward to use. The time saved in having fewer re-sits is an important benefit and whilst quizzes may, depending on the subject area, need a little updating the work is largely done for future years whereas the benefits will hopefully continue year after year.
It is not, however, a panacea. Some students love online quizzes. Others do not. Despite being able to tell students about the relative success of those who took the quizzes in the past, over 45% of the students taking the module last year did not attempt a single quiz. Perhaps their failure to engage with the quizzes was symptomatic of an unwillingness to engage with the course – a worrying conclusion when considering first-year, first-term students. Alternatively, it may be explained by technophobia or possibly a lack of technical ability. However, what is clear is that we should not assume that all students will want to embrace the technology. This does not mean that the old ways are necessarily the best. My linked research on seminar attendance found that 9% of students did not go to a single seminar and, given that it was a first-year, first-term module, they could not really claim to have been put off by their experiences of seminars.
Notwithstanding these reservations, from my experience in this module and in other modules where I have incorporated e-learning elements, I would endorse the statement in the Dearing Report (1997) that ‘Communications and information technology offers opportunities to increase the effectiveness of learning.’ I would also support the aim of the Brookes Virtual Project of developing ‘an integrated learning environment through extensive use of C&IT, and thereby to enhance the learning experience gained by students studying a Brookes course.’ One of the downsides of WebCT is that it creates its own demand. If students use and enjoy WebCT on one module it develops a hope and expectation that it will feature in other modules. One of the issues for the future may be how to satisfy that demand, whilst recognising that not all students want an e-learning dimension to their studies.
Consortium of Telematics for Education (University of Exeter), The NatWest Financial Literacy Centre (University of Warwick) and The Institute of Learning and Research Technology, Bristol University, (1997) Information Technology Assisted Teaching and Learning in Higher Education, commissioned by the Higher Education Funding Council for England, HEFCE Research Series M11/97, (Retrieved on 21st October 2004 from the World Wide Web: http://www.hefce.ac.uk/pubs/hefce/1997/m11%5F97.htm).
Department for Education and Employment, (1998), Higher Education for the 21st century: Response to the Dearing Report. (Retrieved on 21st October 2004 from the World Wide Web: http://www.lifelonglearning.co.uk/dearing/index.htm).
Higher Education Funding Council for England, (1996), commissioning document, quoted in Consortium of Telematics for Education et al., (1997), Information Technology Assisted Teaching and Learning in Higher Education, HEFCE Research Series M11/97, (Retrieved on 21st October 2004 from the World Wide Web: http://www.hefce.ac.uk/pubs/hefce/1997/m11%5F97.htm).
National Committee of Inquiry into Higher Education, (1997), Report of the National Committee of Inquiry into Higher Education, (Dearing Report), (Retrieved on 21st October 2004 from the World Wide Web: http://www.ncl.ac.uk/ncihe/index.htm).
Oxford Brookes University, E-learning Strategic Plan 2002 to 2005, (Retrieved on 21st October 2004 from the World Wide Web: http://www.brookes.ac.uk/virtual/strategy/files/brookes_e-learning_strategy.pdf).