Tag to track? Analytics to measure the impact of educational policies

Abstract

Analytics, or the utilisation of user data to enhance education, derives from business intelligence and has received considerable attention over the last few years (Cooper, 2012; Goldstein and Katz, 2005). In the context of institutional research, it is argued that data can aid the decision making, implementation and analysis of policy and change (e.g. Saupe, 1990), and that new forms of online data collection make educational data more accessible and analysable for this purpose (e.g. Campbell and Oblinger, 2007).

An academic analytics approach has been used to evaluate the impact of two recently introduced educational policies designed to enhance the student experience at a London-based university. These are a revised academic framework, which resulted in the redesign of most courses, and an online submission, marking and feedback policy. Each has had significant implications for the use and uptake of technologies to support learning, teaching and assessment.

The virtual learning environment of the institution has been used to collect longitudinal user data, including through customized page tagging, to enable the impact of the policies to be visualised and assessed. This paper discusses the findings.

Introduction

Analytics derives from business intelligence services; its possibilities for the educational context were explored by Goldstein and Katz (2005), who introduced the term ‘academic analytics’. Various forms of analytics related to different aspects of education have been proposed since, and there is no consensus on the precise definition, orientation and aims of analytics in an educational context (Cooper, 2012; van Barneveld et al., 2012, p. 5). Siemens et al. (2011) differentiate between learning analytics, where the primary aim is the use of student data in combination with analysis and modelling techniques to improve student retention, attainment and success, and academic analytics, which focuses on the collection and analysis of institutional data to improve organisational effectiveness related to student learning. Academic analytics, which is the focus of this article, aims to use “institutional data to produce actionable intelligence” (Campbell et al., 2007, p. 42), which can help institutions address student success, accountability and policy making, while better fulfilling their academic missions (Campbell and Oblinger, 2007).

Academic analytics introduces data sources not previously routinely available to institutional researchers. Widespread information and communication technologies have made it relatively straightforward to mine data deriving from web-based learning technologies such as Virtual Learning Environments (VLEs) and library systems (e.g. Cooper, 2012; MacNeill and Mutton, 2013). New techniques of data collection, such as web analytics, used in the commercial sector to track and analyse website traffic (e.g. Cutroni, 2010; Ledford et al., 2010), make it possible to gather and analyse user behaviour from web-based learning technologies with relative ease (e.g. Kraan and Sherlock, 2013; MacNeill and Mutton, 2013).

The importance and relevance of using data to support the evaluation of institutional policies, to inform new iterations and to aid future decision making, is expounded by Saupe (1990) and Yorke (2004). Case studies describing how academic analytics can be used to support the evaluation of institutional policy are sparse; most studies describe the opportunities and challenges, highlighting the importance of theoretical frameworks, approaches, policy development and investment to stimulate institutional uptake and to safeguard the success of institutional analytics projects (c.f. Daniel, 2014; Dziuban et al., 2012; Ferreira and Andrade, 2014).

In this study longitudinal data was collected to evaluate the impact of two new institutional policies at a London-based HEI. The first was a complete review of the academic framework (RAF), as a result of which all modules were redesigned and revalidated with increased attention paid to Technology Enhanced Learning (TEL). The second policy prescribed the use of e-submission, marking and feedback (e-SMF) for all formative and summative assignments and assessments within the university. Both policies were expected to have an impact on the use of the VLE. User data collected over two academic years is used in this analysis.

This paper presents a method for measuring the impact of institutional policies through analysis of VLE user data. It describes the policies, data collection, methods and results followed by discussion of the findings.

Revised Academic Framework (RAF)

This was developed to enhance student satisfaction, achievement and graduate outcomes by:

  • Providing greater course coherence and cohort identity,
  • Developing academic and employability skills,
  • Enabling more individual support and monitoring,
  • Reducing the number of summative and formative assessments and instead focusing on “progressive learning”,
  • Reducing the amount of module and quality assurance administration,
  • Adopting a simplified course structure (30-credit instead of 15-credit modules).

To achieve this all modules were redesigned to include the following characteristics:

  • ‘Assessment for learning’ and feedback designed at course level,
  • Technology enhanced learning and communication at course level,
  • Employability skills (including placement opportunities),
  • Academic skills and wider skills and knowledge (including online resources),
  • Research and practice-led teaching and Capstone projects (final year project),
  • A university-wide Personal Tutor Scheme.

Learning technologies were seen as an important medium for delivering the RAF objectives, and academic staff were encouraged to utilise online communication tools such as discussion boards to enhance course identity. The use of multimedia, including blogs, wikis, YouTube and visual images, was encouraged as an alternative to traditional textual content to enhance interaction and collaboration between students.

In addition to changes to the curriculum and delivery, the academic timetable was revised to reflect concerns about attendance and to align teaching blocks more closely. The academic year remained divided into two teaching blocks, with an optional third for courses that run over the summer. It was recommended that modules be developed to run over two semesters rather than a single one.

e-Submission, marking and feedback (e-SMF) policy

This policy was designed to increase submission flexibility and enhance the quality of feedback to students. It does not prescribe a particular technology, although for coursework Turnitin, a third-party service for online assignment submission, was recommended because it checks submissions for similarity to identify potential plagiarism. The Blackboard assignment tools were also suggested, to allow submission of a wider variety of formats, along with the Blackboard Test Manager for objective tests. Although these instruments were used by staff before the introduction of the policy, it was anticipated that during academic year 2013-14 usage would increase as a result of the RAF implementation, despite an expected decrease in the number of submissions.

TEL and the VLE

This study focuses on the use of the VLE (Blackboard), which supports the majority of e-learning and communication activities, either directly or by integration with third-party tools such as Turnitin, as is largely the case across the sector (e.g. Browne et al., 2010; Matušů et al., 2012; Walker et al., 2014). However, other technologies, for example WordPress blogs, are available to all members of staff and students.

To investigate the impact of the policies, the functionalities of a VLE are grouped as follows:

  • Content distribution: dissemination of mainly course-related content, including files, text items, and multimedia such as audio, images and video,
  • Content creation: creation of content by students, e.g. blogs, journals, wikis and video conferencing (Blackboard Collaborate),
  • Communication and dissemination of information: including e-mail, announcement and discussion board activity,
  • Assessment and assignment instruments: objective testing (Blackboard tests), and the dissemination, e-submission and online marking of text-based assignments (Turnitin and Blackboard assignments).

A recent European survey of educational TEL by Matušů et al. (2012) and the UK surveys of TEL by Browne et al. (2010) and Walker et al. (2014) show that the functionalities in the content distribution, assessment and assignment, and communication categories tend to be most utilised. However, the use of TEL tools was found to be in general “less than 25% across an institution’s range of courses”, with the exceptions of assignment submission, plagiarism detection, use of external web-based resources and asynchronous collaborative tools (Browne et al., 2010, p. 25). The increase in assignment submission and plagiarism tools reflects a wider trend in the sector, and the report concludes that “the results reinforce […] that much TEL usage is still supplementary to traditional forms of delivery” (Browne et al., 2010, p. 25).

This study starts with the premise that VLE user behaviour changes as a result of both the RAF and e-SMF policies. The impact of the RAF is measured by analysing the use of content creation tools to stimulate learning, and of multimedia to supplement the traditional curriculum (Wilks and van der Sluis, 2014); the impact of the e-SMF policy is measured by analysing the use of online assessment tools to support online submission, marking and feedback. However, the e-SMF policy is likely to be influenced by the RAF, which has resulted in fewer modules and prescribes less summative assessment.

Methodology

In this analysis two types of data collection techniques are applied:

  • Client-side data collection, whereby user interaction with websites is tracked by a tracking code on each web page, which sends the data to a third party for collection and analysis.
  • Server-side data collection, whereby web interactions are stored in log files which are then analysed.

Although both methods can be used in conjunction, a comparison can be difficult for several reasons, as the two methods result in slightly different metrics (Ledford et al., 2010). Client-side data collection through Google Analytics is the main data source used in this paper.

Rather than track the number of page hits, which is not regarded as a useful metric (e.g. Cutroni, 2010; Ledford et al., 2010), goals were set on functionalities to track what was used or visited within the VLE. Google Analytics can be customized to count the number of goal completions for each day, indicating how many visitors have used a functionality at least once. This metric is used throughout this paper (for more details about the methodology and customization see van der Sluis, 2012; Clifton, 2010; Cutroni, 2010).
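
As an illustration of how this goal-completion metric can be turned into the per-semester figures reported later, the sketch below shows one possible way of aggregating a daily export of Google Analytics goal completions in R. It is not the authors' script; the file name, column names and labels are assumptions made for illustration only.

```r
# Illustrative sketch only: average daily Google Analytics goal completions
# per semester. The CSV export and its columns (date, goal, completions)
# are assumptions, not the actual export used in the study.
goals <- read.csv("ga_goal_completions.csv", stringsAsFactors = FALSE)
goals$date <- as.Date(goals$date)

# Label each day with its semester, using the dates given in Table 1 below
goals$semester <- cut(goals$date,
                      breaks = as.Date(c("2012-09-24", "2013-01-28", "2013-06-06",
                                         "2013-09-23", "2014-01-06", "2014-05-26")),
                      labels = c("Sem1 12-13", "Sem2 12-13", "Summer 2013",
                                 "Sem1 13-14", "Sem2 13-14"))

# Average completions per day for each goal and semester (cf. Tables 5-12)
aggregate(completions ~ goal + semester, data = goals, FUN = mean)
```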

Server-sided data collection was used, for example, to distinguish between staff and students. This data has been supplemented with data from the Human Resources department to enable a breakdown of staff by a range of categories. Data from third-party providers such as Turnitin has been used to enrich the university data.
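
By way of illustration, the sketch below shows how distinct logins per day might be derived from a server-side activity log and broken down into staff and students using an HR extract. The file and column names are assumptions and this is not the institution's actual processing.

```r
# Illustrative sketch only: distinct logins per day from a server-side
# activity log, split by role via an HR lookup. File and column names
# are assumptions, not the institution's schema.
activity <- read.csv("vle_activity_log.csv", stringsAsFactors = FALSE)  # timestamp, user_id, event
roles    <- read.csv("hr_roles.csv", stringsAsFactors = FALSE)          # user_id, role ("staff"/"student")

logins <- subset(activity, event == "LOGIN")
logins$date <- as.Date(logins$timestamp)
logins <- merge(logins, roles, by = "user_id", all.x = TRUE)

# One row per user per day, then count distinct users per day and role
per_user_day <- unique(logins[, c("date", "user_id", "role")])
daily <- aggregate(user_id ~ date + role, data = per_user_day, FUN = length)
names(daily)[names(daily) == "user_id"] <- "distinct_logins"

# Express as a percentage of the potential user base (Table 2, 2013-14)
daily$pct_of_user_base <- round(100 * daily$distinct_logins / 24823, 1)
head(daily)
```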

The data was collected and analysed using descriptive statistics and, where relevant, grouped by semester and academic year to make comparisons possible. The period for each semester is given in Table 1 below. Comparisons of semesters 1 and 2 for the academic years 2012-13 and 2013-14 were investigated for statistical significance where relevant. In most cases normality could not be assumed, and to provide consistency all group comparisons are made using the Mann-Whitney U (MWU) test as a nonparametric alternative to the independent t-test (Cohen et al., 2007). This was done using the R software for statistical computing.

Academic Year Semester 1 Semester 2
2012-2013 24/09/2012 – 27/01/2013 28/01/2013 – 05/06/2013
2013-2014 23/09/2013 – 05/01/2014 06/01/2014 – 25/05/2014

Table 1 Semester dates
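
As a worked example of the semester-wise comparison just described, the sketch below applies R's wilcox.test (the Wilcoxon rank-sum test, equivalent to the Mann-Whitney U test) to the daily counts of one goal for the 1st semesters of the two academic years. It reuses the assumed goals data frame from the earlier sketch, and the goal name is illustrative rather than taken from the authors' analysis.

```r
# Illustrative sketch only: MWU comparison of daily goal completions,
# semester 1 of 2012-13 vs semester 1 of 2013-14, for one example goal.
# 'goals' is the assumed data frame built in the earlier sketch.
x <- subset(goals, goal == "Add File" & semester == "Sem1 12-13")$completions
y <- subset(goals, goal == "Add File" & semester == "Sem1 13-14")$completions

# Descriptive statistics per group (cf. the average/day values in Table 5)
c(mean_12_13 = mean(x), mean_13_14 = mean(y))

# Nonparametric alternative to the independent t-test: wilcox.test() on two
# samples performs the Wilcoxon rank-sum / Mann-Whitney U test
wilcox.test(x, y)
```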

User Base

To put the populations and user numbers for particular functionalities into perspective, a potential user base is needed.

Student numbers went down by around 6%, from 23105 in academic year 2012-13 to 21614 in 2013-14. The number of staff went up from 3155 to 3209 over the same period. However, not all members of staff and students will need to access the VLE, either because their role does not include support for teaching and learning or because students might be on courses which do not use the VLE. Nevertheless the numbers are indicative; see Table 2.

Year Students Staff Grand Total
2012-13 23105 3155 26260
2013-14 21614 3209 24823

Table 2 Total number of potential users

Results

This section explores the impact of the two policies by analysing changes in VLE usage in the academic year after the introduction of the policies (2012-13 to 2013-14), through investigation of login, session duration and functionality data.

Login

The login data was not significantly altered by the two policies. The average number of logins per day (server-sided) fluctuated a little each semester and dropped in 2013-14, which might reflect the smaller potential user base; however, logins as a percentage of the user base remained fairly constant at around 30% throughout both semesters and academic years; see Table 3 below.

The comparison of the logins per day in the 1st and 2nd semesters of academic years 2012-13 and 2013-14 shows some variation due to the changes in the length of the semesters and the start of the assessment periods, but otherwise the patterns during the teaching weeks remain similar, as shown in Figures 1 and 2 below. Most logins occurred on Mondays and tailed off towards the end of the week, with low numbers of logins during the weekends. The graph of the 1st semester (Figure 1) clearly shows a drop in logins during the reading/enrichment weeks and the festive holidays. The 2nd semester shows a drop in logins towards the end of the teaching weeks (end of March and April), followed by higher logins during the assessment period after the preparation weeks, tailing off towards the end. The two academic years have different semester periods, and the 2nd semester of 2013-14 starts and ends earlier as a result of the RAF; nevertheless the pattern of logins is very similar across both academic years, as shown in Figures 1 and 2.

(login/day) Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14
average 7923 7185 7083 7289
median 8167 7586 7471 7742
max 11963 10916 11225 11090
min 2254 2369 1692 2095
range 9709 8547 9533 8995
average % from user base 30% 27% 29% 29%

Table 3 Distinct logins/day per semester (server-sided)


Figure 1 Logins, 1st semester 2012-13 & 2013-14

 


Figure 2 Logins, 2nd semester 2012-13 & 2013-14

Visit duration and pages per visit

Another measure of engagement with the VLE is the average session duration and the number of pages viewed per session. In general the average session duration was between 30 and 35 minutes, with little variation across the different semesters, as shown in Table 4.

More notable is the average number of pages viewed per session, which increased from 14.21 and 13.03 in the 1st and 2nd semesters of 2012-13 to 20.34 and 19.20 respectively in 2013-14. This indicates that although the session duration remained constant, the number of pages visited per session increased following the introduction of the RAF and e-SMF policies.

Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14
Average session duration (min) 32.05 30.23 31.10 34.46
Average pages/session 14.21 13.03 20.34 19.20

Table 4 Duration and pages per session (client-sided)

Content distribution

In general the dissemination of traditional content or documents, such as presentation slides and written documents (files), written content posted as an item, and links (URLs), is preferred to multimedia content, such as images, audio and video, or mashup links to Flickr photos, YouTube videos or SlideShare presentations; see Tables 5-7 below.

Table 5 and Figure 3 below show that in general more files are added to the VLE (staff only) in the 1st semester than in the 2nd. A significant increase was found in semester 1 of 2013-14 (p <.05). This increase is likely to be a result of the RAF and the need to repopulate courses with new material. A slight increase in the number of files viewed in academic year 2013-14 was not statistically significant, and the percentage of the average logins/day (client-sided) remained largely unchanged at around 11%.

Written content, both posted as an item and added as external links (URL), remained relatively stable over both academic years, and no significant difference was found in a semester-wise comparison; see Table 5. The general trend is that more items are written in the 1st semester than in the 2nd, which follows the pattern for the dissemination of files.

Multimedia and mashups to enrich course content did not seem to be affected by the implementation of the RAF, see Tables 6 & 7.

(average/day) Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14 MWU test (Sem 1) MWU test (Sem 2)
Add File 24 18 32 21 p <.05 p >.05
View File 964 988 1087 1061 p >.05 p >.05
Add/Edit Item 118 93 129 91 p >.05 p >.05
Add/edit URL 8 7 9 6 p >.05 p >.05
% View Files/Average login 10% 11% 13% 11%

Table 5 Documents (files, items, URLs) (client-sided)


Figure 3 Documents (client-sided) (note: left and right axis differ)

(average/day) Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14 MWU test (Sem 1) MWU test (Sem 2)
Add/edit Image File 0.7 0.3 0.5 0.6 p >.05 p <.05
Add/edit Audio File 0.4 0.3 0.5 0.3 p >.05 p >.05
Add/edit Video File 1.1 0.5 1.2 0.8 p <.05 p >.05

Table 6 Multimedia files (image, audio, video) (client-sided)

(average/day) Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14 MWU test (Sem 1) MWU test (Sem 2)
Add Flickr Image 0.1 0.1 0.0 0.1 p <.05 p >.05
Add Slideshare presentation 0.2 0.0 0.1 0.1 p <.05 p <.05
Add YouTube Video 1.3 0.7 1.6 1.5 p <.05 p <.05

Table 7 Mashups links (Flickr, Slideshare, YouTube) (client-sided)

Content creation

The VLE has various interactive functionalities that enable student led creation of content. An increase in the use of these functionalities in academic year 2013-14 would indicate that the RAF and its strong emphasis on TEL influenced the course design and delivery.

Discussion board

The discussion board is a versatile functionality, and Figure 4 and Table 8 show that, in general, usage (views and creation of threads) varies between the 1st and 2nd semesters, with lower usage during the latter. A semester-wise comparison shows that the discussion board was used intensively, but that the viewing of discussion board posts decreased significantly in academic year 2013-14 compared with the year before, in both semesters (sem1 p <.05, sem2 p <.05); see Table 8. The average number of threads created per day remained similar during the 1st semester, but dropped significantly (p <.05) from 20 to 15 per day on average during the 2nd semester. Views as a percentage of the average logins/day (client-sided) confirm the general trend, with fewer views during the second semester.

Various reasons could be proposed to explain this. The more intensive use during the first semesters suggests that discussion boards are used for communication purposes at the start of the academic year. More in-depth investigation with other research methods would be required, but the lower usage of the discussion board during 2013-14 might be explained by the overall reduction in the number of modules, which limits the need for clarification through, for example, FAQs.

(average/day) Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14 MWU test (Sem 1) MWU test (Sem 2)
Create Discussion Board 3 2 3 2 p >.05 p >.05
Create Discussion Board Thread 25 20 26 15 p >.05 p <.05
View Discussion Board 561 423 500 283 p <.05 p <.05
% View DB/average login/day 5.8% 4.8% 5.8% 3.0%

Table 8 Discussion board (client-sided)


Figure 4 Discussion board (note: left and right axis differ)

Blogs and wikis

The usage of blogs and wikis, in comparison to the discussion board, is relatively low; see Table 9 below. The average number of blog posts added per day, in combination with the range (0 min, 106 max), indicates that blogs were used in only a few modules. A small but significant increase is apparent in the number of blog posts added or edited, for both semesters.

The relatively similar average/day and the range (0 min, 44 max) of wiki page modifications indicate use by only one or two groups or cohorts of students; this went up significantly for semester 2 (p <.05), but not for semester 1. The average wiki views/day went up significantly for both semester 1 (p <.05) and semester 2 (p <.05). However, the significance of this increase is limited, as the number of users is still very low, and it is too early to judge whether the increase will be sustained in the future. Nevertheless, the increase in blog posts and wiki pages modified may be an indication of increased awareness of these interactive tools as a result of the RAF.

(average/day) Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14 MWU test (Sem 1) MWU test (Sem 2)
Create Blog 0.2 0.2 0.3 0.1 p >.05 p >.05
Add/edit Blog Entry 8 6 13 16 p <.05 p <.05
Create Wiki 0.2 0.3 0.5 0.2 p <.05 p >.05
View Wiki 15 12 23 40 p <.05 p <.05
Modify Wiki Page 3.6 3.3 5.2 10.9 p >.05 p <.05

Table 9 Blogs & Wikis (client-sided)

Video conferencing

The average number of staff per day creating a video conferencing session with Blackboard Collaborate is low; see Table 10. The number of staff and students launching Blackboard Collaborate to meet is also low. Although the number of sessions created did not go up, the number of sessions launched went up significantly for both semester 1 (p <.05) and semester 2 (p <.05), which might indicate an increased awareness of this collaboration tool as a result of the RAF. However, it is too early to tell if this increase will be sustained in the future.

(average/day) Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14 MWU test (Sem 1) MWU test (Sem 2)
Create Collaborate session 0.4 0.3 0.4 0.3 p >.05 p >.05
Launch Collaborate session 0.4 0.4 2.1 3.2 p <.05 p <.05

Table 10 Video conferencing (client-sided)

Communication

The data shows a slight, non-significant drop in the average number of visitors (staff only) sending e-mails during academic year 2013-14 in comparison to the year before; see Table 11. Slightly more e-mails/day were sent during the 1st than the 2nd semester. The small reduction from 2012-13 to 2013-14 might be explained by the reduced number of modules due to the RAF, which reduced the need to communicate across different modules.

The metric add/edit announcement needs to be handled with care, as the URL snippet for this destination goal changed due to an upgrade of the VLE over the summer between academic years 2012-13 and 2013-14. Semester 1 can be compared with semester 2, but the academic years cannot be compared. In general more announcements are added or edited (staff only) in the 1st semester than in the 2nd. Viewing announcements, however, is an important source of information for students, and as a percentage of the average logins/day (client-sided) it remained relatively constant at around 80% over both academic years.

(average/day) Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14 MWU test (Sem 1) MWU test (Sem 2)
Sent Email 38 31 34 28 p >.05 p >.05
Add/edit Announcement* 61 59 130 92
View Announcement 7615 7176 7136 7531 p >.05 p >.05
% View Announcement/average login 78% 81% 83% 81%

* goal definition changed between the academic years due to the VLE upgrade, so years cannot be compared (see text)

Table 11 E-mail & Announcement (client-sided)

Assignment and assessment

The VLE has an important role in supporting the distribution, collection and marking of assignments and/or assessments, either formative or summative, and this was an area, as discussed above, where both policies were expected to have an impact.

The number of Blackboard assignments created (staff only) per day went up significantly in a semester-wise comparison of academic years 2012-13 and 2013-14 (sem1 p <.05, sem2 p <.05). Various other metrics related to the Blackboard assignment (e.g. submission and viewing) could not be explored further as a result of the VLE upgrade discussed above, and the new goals measure slightly different activities, making a comparison inappropriate; see Table 12.

The number of Turnitin assignments added (staff only) went up slightly, but significantly, in a semester-wise comparison of academic years 2012-13 and 2013-14 (sem1 p <.05, sem2 p <.05). The percentage of visitors who view/submit a Turnitin assignment each day remained fairly constant at around 7% for the first three semesters, but grew to 15% of all visitors in the 2nd semester of 2013-14. A semester-wise comparison shows that the average number of views/submissions per day fell slightly for the 1st semesters, but grew significantly for the 2nd semesters (p <.05), from 674 in 2012-13 to 1403 in 2013-14. This could be explained by the RAF, which stimulated the use of long, thin modules with assignments at the end of the academic year.

The pattern of viewing/submitting Turnitin assignments during academic year 2013-14 in comparison to 2012-13 is reflected in the third-party statistics from Turnitin. Table 13 and Figure 5 below summarise the number of papers submitted and the number of graded papers (sem1 Sep-Jan, sem2 Feb-June). Figure 5 shows a noticeable growth in graded papers in Turnitin between the first and second semesters of 2013-14; over this period online marking went up from 2% to a little over 75% of all e-submissions. The graph also indicates that using Turnitin to submit papers was relatively well established before the introduction of the e-SMF policy, but that the policy had a considerable impact on e-marking by staff.

The number of started tests went down from 2012-13 to 2013-14; see Table 12 below. Slightly more tests were started during the 1st semester, 278 and 184 on average per day for 2012-13 and 2013-14 respectively, compared with 163 and 148 during the 2nd semester. The drop was significant for the 1st semesters (p <.05), but not for the 2nd semesters (p >.05). This, as discussed above, is likely to be a result of the RAF, which stipulated less assessment, both summative and formative, and instead stimulated progressive learning.

(average/day) Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14 MWU test (Sem 1) MWU test (Sem 2)
Add Bb Assignment 2.3 2.1 4 3.3 p <.05 p <.05
Review Bb Assignment 55 41
re-View/upload Bb Assignment 146 150
Add Turnitin 8 9 11 13 p <.05 p <.05
View/Submit Turnitin 637 674 565 1403 p >.05 p <.05
% View/Submit Turnitin/average login 7% 8% 7% 15%
Add Bb Test 3 1 4 2 p <.05 p <.05
Start Bb Test 278 163 184 148 p <.05 p >.05

Table 12 E-assignments and e-test (client-sided)

(total papers) Sem1 12-13 Sem2 12-13 Sem1 13-14 Sem2 13-14
Turnitin Submissions 39398 42369 36363 48831
Turnitin Graded papers 754 1343 11085 37279

Table 13 Submission to Turnitin and graded papers (third-party stats)


Figure 5 Turnitin submission and grading (Third-party stats)

Discussion

This study indicates that both the RAF and e-SMF policies had an impact on student and staff engagement, although to different extents, and that data from educational systems such as the VLE can be used for traditional institutional research objectives, including the evaluation of aspects of institutional policies. A richer picture would require an evaluation with a more holistic approach, including the use of a variety of quantitative data sources, such as attendance and attainment, as well as qualitative data to capture the experience of staff and students. This case study reports at a university level, which was seen as appropriate for this paper; a multi-level analysis, by faculty or school for instance, could have provided deeper insights into differences in the uptake and impact of the policies, and provided educational managers with more detailed information upon which to act. Nevertheless, as the results here indicate, academic analytics is capable of providing rich insight into changes over time, which aids evaluation at the HEI.

The extent to which each policy could be measured using analytics data differed. The RAF as a framework has prescriptive elements, such as a reduced number of modules and changed timetables. The redesigned and repopulated modules had to incorporate its objectives, and TEL was the recommended means of realising these. The RAF did not prescribe the use of TEL, so measuring the impact of the RAF using academic analytics was, in this respect, a relatively indirect measurement; in contrast, the e-SMF policy had a direct impact on the use of the VLE. The degree to which a policy drives the use of TEL needs to be taken into consideration when utilising analytics as a data collection method to measure impact.

Within this analysis the metrics have been identified and defined carefully to align with the impact being measured. However, issues arose regarding interaction with discussion boards, because they can be used for multiple purposes, such as facilitating communication or disseminating content, so the extent to which this metric changed as a result of policy is open to question. Further, the use of educational tools is subject to many different factors, and a simple change in design will result in different user behaviour and may generate unexpected applications and utilisation. Cross-referencing findings with user accounts should be considered to develop a more complete picture.

Some of the metrics described above, for example submission of Blackboard assignments, did not remain stable over the data collection period. The use of web analytics in this respect needs to be treated with care, since data from learning systems are subject to continual change and upgrades. The design and use of the Blackboard assignment manager changed considerably, enhancing its features and ease of use, so the extent to which the metrics were measuring the same thing pre- and post-upgrade may be an issue. Clearly, academic analytics metrics and the findings deriving from them have a limited useful life.

There was considerable interplay between the two policies, as we have seen, especially in the assessment and assignment category. Ideally they would have been introduced in succession so the independent impact of each could be established more fully. Nevertheless this paper shows that data collection methods and analysing techniques can give important insights into the impact of institutional policies.

Conclusions

The overall usage of the VLE seemed to reflect that within the HE sector: functionalities that support the dissemination, collection and distribution of content, assessment and communication are most utilised, as discussed above (c.f. Browne et al., 2010; Matušů et al., 2012; Walker et al., 2014). The impact of both policies on the general usage of the VLE appears minimal; there was little change in the number of logins. The session duration did not change significantly; however, the number of pages visited per session increased, which suggests an increase in activity, although this could be due to various reasons.

A comparison of the usage of the VLE functionalities by academic year showed varying outcomes that might be the result of the two policies. Little change was found in the use of communication tools.

The RAF policy appears to have had the greatest impact on student interaction with assessments and assignments. The number of Turnitin assignments viewed and submitted went up in the 2nd semester of 2013-14 as a result of the RAF’s long and thin modules. The use of objective testing is likely to have been reduced because the RAF prescribes less formative and summative assessment for a smaller number of modules. It was expected that the RAF would have an impact on the content distribution and content creation categories; however, it seems to have reinforced the traditional distribution of content, which went up in 2013-14. This probably needs to be seen in the light of the redesigned and repopulated modules, an effect which could wear off in the future.

While a semester-wise comparison indicated an increase in the use of some multimedia metrics and of content creation tools, such as blogs, wikis and video conferencing, the overall numbers are too small to signify a sustained change in behaviour. As such, the RAF cannot be said to have resulted in a significant change in the delivery of content and the approach to teaching and learning through enhanced use of TEL. The low uptake of the VLE for e-learning is also apparent in the reduced use of the discussion board in the 2nd semester of 2013-14, which could be explained by the reduced number of modules and the utilisation of alternative content dissemination tools.

The impact of the e-SMF policy is mainly seen in the sharp increase in submitted papers and in the viewing/submitting of Turnitin inboxes, especially during the 2nd semester of 2013-14; this is confirmed by the rapid growth in marked Turnitin papers, even though, as a submission inbox, Turnitin was already well established. While the e-SMF policy did not prescribe which tools needed to be used, it was to be expected that the use of Turnitin, as the current and recommended instrument, would be intensified. The impact on the alternative option, the Blackboard assignment tool, could not be fully established due to an upgrade halfway through the process and the change in data collection methods, but the limited data available also suggests an increase in usage.

References

Browne, T., Hewitt, R., Jenkins, M., Voce, J., Walker, J. and Yip, H. (2010) Survey of Technology Enhanced Learning for higher education in the UK. UCISA TEL Survey, Oxford: UCISA.

Campbell, J. P., DeBlois, P. B., and Oblinger, D. G. (2007) Academic analytics: A new tool for a new era. Educause Review, 42(4), pp. 40-57.

Campbell, J. P., and Oblinger, D. (2007) Academic analytics. Educause Center for Applied Research. Washington (DC): Educause.

Clifton, B. (2010) Advanced web metrics with Google Analytics. 2nd ed. Indianapolis: Wiley Publishing.

Cohen, L., Manion, L. and Morrison, K. (2007) Research Methods in Education. 5th ed. London: Routledge Falmer.

Cooper, A. (2012) What is analytics? Definition and essential characteristics. CETIS Analytics Series, 1(5), pp. 1-10.

Cutroni, J. (2010). Google Analytics. Sebastopol: O’Reilly Media, Inc.

Daniel, B. (2014) Big Data and analytics in higher education: Opportunities and challenges. British Journal of Educational Technology, 2014, pp. 1-17

Dziuban, C., Moskal, P., Cavanagh, T., and Watts, A. (2012) Analytics that Inform the University: Using Data You Already Have. Journal of Asynchronous Learning Networks, 16(3), pp. 21-38.

Ferreira, S. A., and Andrade, A. (2014) Academic analytics: Anatomy of an exploratory essay. Education and Information Technologies, pp. 1-15.

Goldstein, P. J., and Katz, R. N. (2005) Academic analytics: The uses of management information and technology in higher education (Vol. 8), Educause Center for Applied Research. Boulder: Educause.

Kraan, W., and Sherlock, D. (2013) Analytics Tools and Infrastructure. Cetis Analytics Series, 1(11), pp. 1-24.

Ledford, J. L., Teixeira, J., and Tyler, M. E. (2010) Google analytics. 3rd ed. Indianapolis: John Wiley and Sons.

MacNeill, S., and Mutton, J. (2013) Case study: Engaging with analytics. Cetis Analytics Series, 2(1), pp. 1-7.

Matušů, R., Vojtěšek, J., and Dulík, T. (2012) Technology-enhanced learning tools: A survey of use in European higher education. WSEAS Transactions on Information Science and Applications, 10(9), pp. 316-326

Saupe, J.L. (1990) The Functions of Institutional Research. 2nd ed. Tallahassee: The Association for Institutional Research.

Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Buckingham Shum, S., Ferguson, R., Duval, E., Verbert, K., and Baker, R.S.J.D. (2011) Open Learning Analytics: An Integrated and Modularized Platform. Society for Learning Analytics Research (SoLAR), available at: http://solaresearch.org/OpenLearningAnalytics.pdf, [12/04/14].

van Barneveld, A., Arnold, K. E., and Campbell, J. P. (2012) Analytics in higher education: Establishing a common language. EDUCAUSE Learning Initiative, ELI Paper 1.

van der Sluis, H. (2012) Applying Google Analytics on Blackboard for Academic Analytics. Available at: http://hvdsluis-technicalities.blogspot.co.uk/2012/10/applying-google-analytics-on-blackboard.html, retrieved 01 October 2014.

Walker, R., Voce, J., Nicholls, J., Swift, E., Ahmed, J., Horrigan, S. and Vincent, P. (2014) 2014 Survey of Technology Enhanced Learning for higher education in the UK. UCISA TEL Survey, Oxford: UCISA.

Wilks, C.F. and van der Sluis, H. (2014) Personal conversation with Clarissa F. Wilks, Dean of Learning and Teaching, Kingston University, 07/05/2014.

Hendrik van der Sluis

Hendrik van der Sluis is a lecturer in Learning and Teaching in Higher Education at the Centre for Higher Education Research and Practice at Kingston University. His various roles in Further and Higher Education in the United Kingdom and the Netherlands include those of lecturer, course director, staff developer, and blended learning leader. His research interests have followed his professional roles and include widening participation, student attainment, technology enhanced learning, academic and learning analytics, and educational professional development. Hendrik van der Sluis Lecturer Learning and Teaching in Higher Education, Centre for Higher Education Research and Practice (CHERP), Kingston University Tel: 020 8417 5400 Email: H.vanderSluis@Kingston.ac.uk

Steve May

Steve May is a Senior Researcher within the Centre for Higher Education Research and Practice at Kingston University where he oversees and contributes to a range of projects and chairs its research ethics committee. He was academic lead of the Kingston Education Research Network up to 2015 and led the university Student Retention Project to completion in 2003. His research interests are around social mobility and the experience and attainment of students from diverse backgrounds. Before entering Higher Education, Steve was a Further Education lecturer and a researcher in the steel industry. Steve May Senior Researcher, Centre for Higher Education Research and Practice (CHERP), Kingston University Tel: 020 8417 5646 Email: S.May@Kingston.ac.uk

Creative Commons License
This paper is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.

Double Blind Review
This paper has been subject to a double blind peer review by at least two reviewers. For more information about our double blind review process please visit: http://bejlt.brookes.ac.uk/about/double-blind-review/

