Student learning technology use: preferences for study and contact

Greg Benfield, Ruslan Ramanau  and Rhona Sharpe


This paper reports some of the results of a mixed-method research project to evaluate learners' experiences of e-learning at Oxford Brookes University. It concentrates primarily on the findings directly related to students' preferences for location of study, form of contact and technology used, and discusses the implications for strategic planning and development of the Brookes student learning experience.

The Pathfinder evaluation was funded by the UK Higher Education Academy through its e-learning Pathfinder programme. The data collection work was carried out entirely in the academic year 2007–08. The aims of the study were to

  • investigate undergraduate learner experiences with social use of software in different learning activities
  • explore the strategies and techniques learners employ in using software tools for interpersonal communication
  • investigate formal and informal aspects of learning technology use

The study involved a questionnaire distributed to undergraduates in both online and paper form and seven case studies of course-contextual applications of technologies supporting interpersonal communication and collaboration. Patterns of technology use found in the survey have implications for design of learning spaces and communication infrastructure and for the development of Brookes Virtual, the Oxford Brookes Virtual Learning Environment (VLE).


The last systematic evaluation of e-learning at Oxford Brookes University was completed in 2001 (Breen et al., 2001). Between then and now we have lived through three e-learning strategies and a substantial period of institutional change involving e-learning (see, for example, Sharpe et al., 2006). E-learning is firmly embedded in the student experience at Brookes. The VLE is a standard aspect of most students' learning experience and its administration is tightly integrated with the electronic Course and Student Information System (eCSIS). There is a growing number of innovative learning and teaching projects focused on the use of social software such as wikis, blogs and e-portfolios and of technology-mediated applications that support collaboration, networks and community building. There has been a pressing need for some time to understand the impact of these changes on the student experience of learning at Brookes.

In 2006, Brookes became one of the pilot institutions for the HEA's e-learning Pathfinder programme. We seized the opportunity Pathfinder provided to design and implement an institutional evaluation of e-learning at Brookes. The Pathfinder evaluation proposal was discussed and its aims and approach agreed upon in a variety of forums, most importantly the Centre for E-Learning, the E-Learning Forum, and meetings of the school E-Learning Coordinators. The Pathfinder project itself was approved by the University senior management team and was supported with matched funding. It provided a much-needed opportunity to evaluate the impact of the rapid advance of e-learning in the institution. The Oxford Centre for Staff and Learning Development (OCSLD) employed a full-time research fellow to support the evaluation project. The aim was to investigate the relationships between students' use of technology for formal and informal learning and social networking and the increasingly sophisticated pedagogical use of social software by Brookes staff.

Learning technologies play an increasingly important part in higher education worldwide. In a previous review of the literature we noted that the majority of research in this area has been carried out from the practitioner’s perspective and rarely had the learner’s own expressions of their experience as their focus (Sharpe et al., 2005). Two projects funded by the UK Joint Information Systems Committee (JISC) in response to this review discovered an underworld of learning technology use which is integral to student lives (Conole et al., 2006; Creanor et al., 2006). They showed that some learners are making sophisticated use of an increasing range of educational media and are capable of customising and personalising the learning technologies provided to them by their institutions to suit their needs. It was found that learners use the Internet as their primary source of information and the widespread use of Google and Wikipedia by students has been confirmed in survey research (White, 2007).

As well as using technology to access information, it was noted that effective e-learners are adept in using learning technology for interpersonal interaction and social networking. The data on Internet use in the UK supports these findings, showing that students are more likely than other occupational groups to undertake communication activities online (Dutton and Helsper, 2007). Evidence from research in the United States shows similar trends (Salaway et al., 2007).

Although these projects have started to reveal the types of online activities today's learners are likely to be engaged in, there is still little empirical evidence regarding learner-directed activities and patterns of learning technology use. In a large-scale qualitative study observing and interviewing young people, Green and Hannon (2007) identified four patterns of use: digital pioneers, creative producers, everyday communicators and information gatherers. These terms were offered in an attempt to characterise and make visible the ways in which young people are using technology, but this raises the question of whether groups of learners are using technologies in distinct ways. According to both recent studies on e-learning experiences in the UK (JISC, 2007) and research on 'the Net generation' (Oblinger and Oblinger, 2005), today's learners actively construct their learning process, regulating and adapting both their behaviour and the study context to the demands of their studies.

The current study addresses the dearth of research in the area and explores patterns of learning technology use and their interrelationships with perceptions of learning. The qualitative stage of the study was intended to evaluate student experiences of the use of social software, e-portfolios, personal learning environments and virtual environments for reflective learning and personal development planning. The quantitative part of the project aimed to investigate patterns in online technology use in their relationship to self-regulation, learner perceptions of learning community and choice in learning. The key research question was: How do students experience social uses of technology in different learning contexts?


The project involved two distinct but overlapping and interrelated data collection strands. The first strand involved collecting seven case studies of student experiences of using technology to support cooperative/collaborative learning or the formation of social networks. The principal data collection method in this strand was semi-structured interviews with learners, both group and individual. The aim was to gather rich descriptions of student experiences of using technology during their university life.

Design of survey

The second strand of data collection and the focus of this paper was the survey. In November 2007 full-time undergraduate students from across the University were asked to complete printed or online questionnaires. This survey explored student uses and perceptions of technologies to support their learning. Its design was informed in part by preliminary case study findings.

A 47-item self-report questionnaire comprising Likert-like statements was designed to address the key goals of the study. The questionnaire consisted of four sections:
  • Section A: demographic information, including age, gender, school affiliation, level of study, preferred study location and information on disabilities
  • Section B: preferences of devices for gaining online access and methods of contact with peers, friends and tutors
  • Section C: learner use of online media. This section included 27 items concerned with learner use of media. Some of the items were used in the JISC LXP student experience study (Conole et al., 2006) and others were developed by the project team from an initial item pool that was partly based on the results of qualitative evaluation. The participants were asked to indicate how often they performed various online activities using a five-point scale from ‘virtually never’ to ‘very often’.
  • Section D: learner self-regulatory activities, help-seeking and peer learning. Section D used scales of the Motivated Strategies for Learning Questionnaire (MSLQ) (Pintrich et al., 1991) and Course Experience Questionnaire (CEQ, Ramsden, 1991). As the analysis of data from these scales is not reported here, we will not discuss the design of this section of the survey further.

We felt that school affiliation might be an important contextual factor in the way students use technologies to support their studies. Accordingly, the sampling strategy we adopted was proportional sampling by school affiliation.  We aimed for a 10% sample of the full-time undergraduate student population of each of the eight Brookes academic schools. In the event, we achieved this aim, with almost 1,200 survey respondents, representing around 10% of the undergraduate population of each school.
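The proportional sampling strategy described above can be sketched as follows. This is an illustrative outline only: the per-school population figures below are invented placeholders, not the actual Brookes enrolment data.

```python
# Sketch of proportional sampling by school affiliation: a 10% target
# is drawn from each school's full-time undergraduate population, so
# the achieved sample mirrors the distribution of school sizes.
# NOTE: the population counts here are hypothetical, for illustration.

school_populations = {
    "Arts & Humanities": 1500,
    "Built Environment": 1100,
    "Business School": 2200,
    "Life Sciences": 900,
}

SAMPLING_FRACTION = 0.10  # aim for 10% of each school


def sample_targets(populations, fraction):
    """Return the per-school number of respondents to aim for."""
    return {school: round(n * fraction) for school, n in populations.items()}


targets = sample_targets(school_populations, SAMPLING_FRACTION)
for school, target in targets.items():
    print(f"{school}: {target} respondents")
```

Because each school contributes in proportion to its size, school-level comparisons (such as those in Table 2) are not dominated by the largest schools.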

The questionnaire was pilot tested on a sample of respondents with characteristics similar to those of the target population. Both internal consistency coefficients and item-total correlations were at appropriate levels and so, after minor modifications, the instrument was deemed appropriate for use. The key data analysis techniques included the analysis of descriptive statistics, factor and correlation analysis. This report details only the descriptive statistics, concentrating on sections A, B and C of the survey. Our first attempts at factor and correlation analysis have been presented elsewhere (Ramanau et al., 2008).
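The internal consistency coefficient referred to above is typically Cronbach's alpha. A minimal pure-Python sketch of the calculation is given below; the respondent scores are invented for illustration and do not come from the pilot data.

```python
from statistics import pvariance


def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.

    items: one list of respondent scores per questionnaire item,
    all lists the same length (one entry per respondent).
    """
    k = len(items)
    # Per-respondent totals across the whole scale:
    totals = [sum(scores) for scores in zip(*items)]
    item_variance_sum = sum(pvariance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_variance_sum / pvariance(totals))


# Three hypothetical 5-point Likert items answered by five respondents:
pilot_items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
print(round(cronbach_alpha(pilot_items), 3))
```

Alpha approaches 1 as items co-vary strongly; values around 0.7 or above are conventionally taken as acceptable internal consistency for a scale.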


Paper and online versions of the instrument were administered to a sample of full-time undergraduate students studying at Oxford Brookes University in November 2007. The participants either completed the questionnaire online from the institutional VLE or filled in the paper version of the survey before or after lectures. In total there were 1,180 valid survey returns: 400 online and 780 paper. The broad demographic characteristics of the sample corresponded fairly closely to those of the undergraduate student population as a whole in the following areas:

  • age range was 17–64 years, with a mean of 21.7 years
  • gender distribution was 37% male and 63% female
  • 88% of the sample were UK residents and 12% were international students
  • 8.8% declared a disability

That said, Table 1 shows that almost half of the sample were first-year students. In some schools the distribution by year level is even more extreme. For example, 84% of Built Environment and 78% of Health and Social Care students were first-year students, while only 6% of Education students came from the first year. There were also differences in the mode of completion of the survey across the schools. There were no online returns from Education, while in Health and Social Care, and Social Sciences and Law, a substantial majority of returns were online.

Table 1: Number of survey responses by year and academic school

School | Online returns | Print returns | 1st year % | 2nd year % | 3rd year %
Arts & Humanities (n=150) | 25 | 125 | 36 | 22 | 41
Built Environment (n=109) | 15 | 94 | 84 | 7 | 9
Life Sciences (n=92) | 33 | 59 | 33 | 28 | 39
Business School (n=216) | 82 | 134 | 36 | 39 | 25
Health & Social Care (n=197) | 105 | 92 | 78 | 9 | 13
Social Sciences & Law (n=122) | 76 | 46 | 30 | 31 | 39
Technology (n=132) | 64 | 68 | 72 | 12 | 16
Education (n=162) | 0 | 162 | 6 | 47 | 47
Totals (n=1,180) | 400 | 780 | 47 | 25 | 28

A series of tests were performed in order to assess the possible impact of these differences in year of study and mode of survey completion within the sample. Online survey respondents were older (sample mean of 23.6 compared to 21.6 years of age for paper survey submitters, F(1,1144) = 30.33, p < .001), and a larger proportion of online submitters were in the first year of their studies: 52.5% compared to 43.5% for students who chose to fill in a paper questionnaire (χ² = 11.42, d.f. = 2, p = .002).
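The tests reported above are a one-way ANOVA F test and a Pearson chi-square test. A stdlib-only sketch of both statistics follows; the data passed in at the bottom are invented for illustration, while the real analysis would run over the full 1,180 survey responses using a statistics package.

```python
from statistics import mean


def one_way_f(groups):
    """F statistic for a one-way ANOVA: between-group MS / within-group MS."""
    all_obs = [x for g in groups for x in g]
    grand = mean(all_obs)
    k, n = len(groups), len(all_obs)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))


def chi_square(table):
    """Pearson chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    return sum(
        (table[i][j] - row_totals[i] * col_totals[j] / n) ** 2
        / (row_totals[i] * col_totals[j] / n)
        for i in range(len(table))
        for j in range(len(table[0]))
    )


# Invented ages: online submitters vs paper submitters.
print(one_way_f([[24, 23, 25, 22], [21, 22, 20, 23]]))

# Invented year-of-study counts (rows: online/paper; columns: years 1-3).
print(chi_square([[210, 110, 80], [339, 230, 211]]))
```

In practice one would look the resulting statistics up against F and chi-square distributions (or use a library such as SciPy) to obtain the p-values quoted in the text.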

One might expect online submitters to be heavier users of the Internet than paper submitters. Students who completed the online survey did use the Internet as a learning resource more often (F(5,1113) = 23.64, p < .001), but paper survey submitters were more likely to contribute to wikis and websites (F(5,1113) = 3.87, p = .002), use online multimedia (F(5,1113) = 31.59, p < .001), play online games (F(5,1113) = 26.06, p < .001) and use Web 2.0 services (F(5,1113) = 7.09, p < .001). It seems that, since the link to the survey was posted on the student VLE homepage, the choice to submit the survey online was influenced by the nature and frequency of use of the VLE on particular courses.


The first section of the survey asked questions about employment and preferences for place of study. Almost a quarter of our sample (22%) performed paid work of 10–30 hours a week, with an additional 3% working full time.

In response to the question ‘Where do you usually study? (tick one)’, 79% chose ‘home’, 11% chose ‘on campus’, 6% chose ‘library’ and 2% chose ‘pooled computer room’ (see Figure 1).


Figure 1: preferred locations for studying

When students were asked about their preferred device for getting online, the responses (Figure 2) show a reliance on their own computers at home. At least 57% of students owned their own laptop computer, but only 7% used it on campus. Once again, it is notable that just 7% of respondents used on-campus pooled computer rooms as their primary method for getting online.


Figure 2: preferences for devices to get online

Three questions asked students how they contacted fellow students, tutors and friends. Students used a fairly broad combination of methods to contact other Brookes students or friends, but a much narrower range of methods to contact their tutors. Figure 3 shows responses to the question ‘How do you usually contact other Brookes students? (tick all that apply)’, offering ten alternatives (the tenth being ‘other’) including landline phone, social networking sites and online discussion forums. The three most popular were face-to-face (75%), mobile phone (73%) and e-mail (71%). Social networking lagged behind at 53%.


Figure 3: preferences for modes of contacting students, friends and tutors

The top four responses for ‘How do you usually contact your friends?’ were similar to the results for contacting students, although higher proportions preferred face-to-face (88%) and mobile phone (88%) contact for friends than fellow students. When asked ‘How do you usually contact your tutors?’ the range was much narrower. E-mail was the most popular form (82%), face-to-face was second (52%) and Brookes Virtual VLE was a distant third (10%).

Section C of the survey contained 27 items asking students to rate from 5 (very often) to 1 (virtually never) how frequently over the last three months they had engaged in various online activities. Although the effects of other variables (e.g., student age, year of study or gender) were not taken into account in the analysis, the ANOVA results showed that students in the eight academic schools reported statistically significant differences at least at the .05 level in their responses on all 27 items in section C. Table 2 shows the items where differences were particularly noticeable in the frequency of activity between academic schools.

Table 2. Means (Standard Deviations) for Selected Survey Items (5-Point Likert Scale).

School | C4 Searched for library resources | C18 Watched online videos | C20 Played multiplayer games | C21 Shared files online | C26 Used instant messenger
Arts and Humanities | 4.01 (1.14) | 3.70 (1.37) | 1.94 (1.34) | 2.59 (1.47) | 3.54 (1.55)
Built Environment | 2.68 (1.21) | 3.94 (1.19) | 1.83 (1.12) | 2.63 (1.36) | 3.36 (1.41)
Life Sciences | 3.78 (1.09) | 3.30 (1.39) | 1.44 (0.86) | 2.46 (1.39) | 3.61 (1.47)
Business School | 4.06 (0.96) | 3.90 (1.21) | 2.21 (1.44) | 2.88 (1.40) | 3.80 (1.28)
Health and Social Care | 3.99 (1.00) | 2.74 (1.41) | 1.36 (0.89) | 2.04 (1.22) | 2.98 (1.68)
Social Sciences and Law | 4.27 (0.86) | 3.81 (1.25) | 1.57 (1.07) | 2.50 (1.29) | 3.59 (1.40)
Technology | 2.75 (1.40) | 3.96 (1.06) | 2.29 (1.54) | 2.80 (1.35) | 3.79 (1.40)
Westminster Institute of Education | 3.91 (1.06) | 3.23 (1.34) | 1.49 (0.94) | 2.52 (1.33) | 3.38 (1.48)
Overall | 3.75 (1.21) | 3.54 (1.37) | 1.78 (1.24) | 2.55 (1.37) | 3.50 (1.49)


The survey aimed to discern patterns of technology use among full-time undergraduate students and make recommendations for institutional provision.

It is of interest to this study and some surprise to us that more than 40% of our sample of undergraduates are in paid employment in addition to studying full time. It is likely that this group of students requires a high degree of flexibility in how, when and where they access learning resources and activities. This may in part explain why the majority of students prefer to study at home on their own computer rather than on campus. Appropriate and well-designed use of learning technologies will be of critical importance to these students.

Students have a clear preference for studying at home. Among those who prefer to study on campus, we found a preference for studying in designated learning spaces or the library rather than in a pooled computer room. This raises questions about the suitability of pooled computer rooms for teaching and learning. The results do, however, provide evidence of an untapped demand for social spaces and give added impetus to the ‘Space to Think’ project currently taking place at Brookes and the recent provision of social learning spaces in the main library and ASKe and Reinvention CETLs.

We found that although ownership of laptops is high, students preferred to leave their laptops at home. It is possible that more students will bring their laptops to university with the planned availability of a ubiquitous wireless network on all of our campuses, and we suggest this should be monitored in the future.

In terms of what activities students engage in online, we found that the most common activities were accessing online resources including multimedia. Most students used the web extensively to find resources to support their studies. Fewer students used some of the most popular Web 2.0 services, such as social bookmarking and contributing to wikis and blogs. This shows the high course-related use of the Internet by our students. Our undergraduate students spend a lot of time reading learning materials, using online library sources, and searching for and using a variety of learning resources, including external sources.

The other high-usage activities involve multimedia use, particularly listening to audio and watching video. It is clear that many of our students use their computers virtually throughout their waking hours, listening to music, reading the news, watching either live or on-demand TV or video, as well as reading, writing, and researching for their courses. By comparison online communication is a relatively less frequent activity, although it may be no less important.

One of the interesting aspects of the analysis was the degree to which the initial hypotheses regarding differences in online media use across schools were supported by the data on the frequency of web use. Students in some schools were more likely to be engaged in certain types of online media usage than students in others. For example, students in the School of Social Sciences and Law reported searching for library resources online more frequently than students in other schools, while Technology students watched online videos and played multiplayer online games more frequently. Although the nature of these differences should not be viewed in isolation from the differences in sample characteristics reported in Table 1, these findings might provide some insights into the types of e-learning activities that students of different subjects engage in both as part of their curriculum and in their spare time. Further analysis to explore these differences is ongoing.

Although Creanor et al. (2006) report that the boundary between using technology for leisure and for study is often blurred for today's generation of learners, we see here that the precise nature of technology use is influenced by the context of use. The high levels of activity related to using the web as a learning resource, and the differences between schools, suggest that these behaviours are influenced by the context in which the learners find themselves (the course, the institution) rather than by their personal attributes. This implies a high level of institutional relevance and responsibility for shaping learner behaviours in this area. The dominance of study activities related to searching for, accessing and reading online resources suggests that institutions should actively seek to shape learner experiences and skills in searching for and evaluating online information.

It is worth noting that the evaluation took significant institutional effort, which was only possible with external funding. Even so, questions remain that are worthy of further investigation. Some of the questions about how students choose which technology to use in which location to support social learning will be informed by the analysis of the interviews from the case studies, which is ongoing. This illustrates the necessity of undertaking mixed-mode evaluations such as this in order to fully understand the results. Also, considering the apparent demand for flexible working, including working from home, we would like to undertake further research beyond the current sample of full-time undergraduates to see how part-time and postgraduate students make use of technology.


Greg Benfield is an e-learning specialist educational developer in the Oxford Centre for Staff and Learning Development at Oxford Brookes University. He tutors on the PGCTHE and runs workshops and online courses for higher-education staff across the UK. His recent work has focused on learner experiences of e-learning, especially technology-mediated communication and collaboration, and on promoting transformative curriculum design through expanded, team-based practices.

Ruslan Ramanau is a research assistant on the nationwide project funded by the ESRC (Economic and Social Research Council) that looks at experiences of e-learning among first-year undergraduate students at five UK universities. He is based at the Institute of Educational Technology of the Open University where he also teaches on the MAODE (Master of Online and Distance Education) programme. His research interests include student self-regulation strategies and cross-cultural differences in e-learning.

Rhona Sharpe is an educational developer in the Oxford Centre for Staff and Learning Development at Oxford Brookes University. She runs workshops, online courses, and offers consultancy on e-learning topics for higher education staff across the UK. In recent years her work has focused on improving our understanding of learners’ experiences of e-learning.


Breen, R., Lindsay, R., Jenkins, A. and Smith, P. (2001) ‘The role of information and communication technologies in a university learning environment’, Studies in Higher Education, vol. 26, no. 1, pp. 95–115.

Conole, G., De Laat, M., Dillon, T. and Darby, J. (2006), JISC LXP: Student Experiences of Technologies Final Report. December 2006. Retrieved 19 January 2007.

Creanor, L., Trinder, K., Gowan, D. and Howells, C. (2006), LEX: The Learner Experience of e-Learning Final Project Report. August 2006. Retrieved 2 November 2006.

Dutton, W. H. and Helsper, E. J. (2007), The Internet in Britain 2007. Oxford: Oxford Internet Institute, University of Oxford.

Green, H. and Hannon, C. (2007), Their Space: Education for a Digital Generation. Retrieved 17 November 2008.

JISC (2007), Understanding my learning outcomes. Retrieved 26 February 2007.

Oblinger, D. G. and Oblinger, J. L. (2005), Educating the Net Generation. Boulder, Colorado: Educause.

Pintrich, P. R., Smith, D., Garcia, T. and McKeachie, W. (1991), A Manual for the Use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor, Michigan: The University of Michigan.

Ramanau, R., Sharpe, R. and Benfield, G. (2008), ‘Exploring patterns of student learning technology use in their relationship to self-regulation and perceptions of learning community’, Sixth International Networked Learning Conference, Halkidiki, Greece. Retrieved 16 December 2008.

Ramsden, P. (1991), ‘A performance indicator of teaching quality in higher education: The Course Experience Questionnaire’, Studies in Higher Education, vol. 16, pp. 129–50.

Salaway, G., Caruso, J. B. and Nelson, M. R. (2007), The ECAR Study of Undergraduate Students and Information Technology, 2007. Retrieved 9 October 2007.

Sharpe, R., Benfield, G. and Francis, R. (2006), ‘Implementing a university e-learning strategy: levers for change within academic schools’, ALT-J, vol. 14, no. 2, pp. 135–51.

Sharpe, R., Benfield, G., Lessner, E. and De Cicco, E. (2005), Final Report: Scoping Study for the Pedagogy Strand of the JISC e-Learning Programme. Retrieved 16 December 2008.

White, D. (2007), Results and analysis of the Web 2.0 services survey undertaken by the SPIRE Project. Retrieved 17 November 2008.
