Written assignment types in assessment: a varied and healthy diet?

Abstract

This research project investigated university written assignment types with the aim of identifying the extent of the existing range and thereby aiding lecturers in setting, and students in completing, assignments, thus facilitating academic literacy skills development. In the literature there has been, for several reasons, a call to increase the range of assignment types from the previously more restricted diet of exams and essays. The question addressed in this research is: if the diet of assignment types is indeed more varied, to what extent is this of benefit to students? Assignment briefs were collected and analysed to determine the assignment type required. Results confirmed, in line with findings from similar studies, that the range of assignment types students have to produce is now increasingly varied. Though this may be currently recommended practice, areas of pedagogic concern are raised. Ways to address these concerns are offered, primarily through proposed development of several aspects of assessment literacy knowledge and skill among staff and students alike.

Fiona Gilbert is responsible for the academic English provision for undergraduate students with English as a second language in the International Centre at Oxford Brookes. In the field of English language teaching, Fiona is a fully qualified language trainer, holds an MA in applied linguistics and is an accredited Cambridge ESOL teacher trainer. She became an Oxford Brookes teaching fellow in 2008. Her pedagogic research interests include academic literacy, discipline-specific academic English provision, teacher education, e-learning and assessment. Her main current focus is written assignment tasks and written assignment briefs.

Introduction

Assessment in higher education, both for students and for those involved in designing and delivering assignment tasks, is of central importance. Given that assessment is ‘one of the single most powerful influences on the approach students take to learning’ (Rust, 2008, n.p.) and given that students cannot escape the effects of poor assessment practice (Boud, 1995, in Rust, 2007), it is clear why the practice of assessment is allocated so much attention. This importance is increasingly reinforced as assessment practices receive negative evaluation in student surveys (ASKe, n.d.) and as the call to make the process more equitable (Williams, 2005) and more transparent becomes ever louder. With employability also a consideration, the ability to become assessment literate, and thereby to develop the key graduate attributes of academic and professional literacy, is an ever more urgent need. Becoming assessment literate may have been more straightforward previously, when students were a more homogeneous group; however, the student population is now diversifying, with an increase in users of English as an additional language and in students from non-traditional study backgrounds. Study routes are also more diverse, with an increase in the study modes available, differing entry points and an increase in cross-disciplinary study. Ensuring that all students can engage with assessment, that assessment practice is equitable, and that all students have the opportunity to become sufficiently academically and professionally literate is therefore not only more challenging, but also more pressing than previously.

At the core of assessment is the written assignment, as it is the most frequent mode of assessment, and since the written assignment type is our main concern here, a definition is first necessary. Several terms are used in the literature to refer to assignment type, including assessment techniques, assignment tasks and assessment methods (Dunn et al., 2004). This differs from text types (sometimes also referred to as rhetorical functions in the literature), another term often used when referring to written assignment types, which ‘represent groupings of texts which are similar in terms of co-occurrence of linguistic patterns’ (Paltridge, 1996, p. 237), irrespective of genre. Examples of these are descriptive writing, problem/solution and reflective writing.

A written assignment type is defined, for the purposes of this paper, as a group of texts sharing similarities in structure, format or rhetorical organisation and which may or may not differ in linguistic features. One instance of this is a case study report. This definition is in fact closest in usage to a widely employed conceptualisation of genre, which is ‘a class of communicative events, the members of which share some set of communicative purposes which are recognised by the expert members of the parent discourse community’ (Swales, 1990, p.58).

The written assignment type menu at universities was previously more restricted, consisting primarily of essays or unseen exams. For a number of reasons, explored more fully below, since the 1980s there has been a call for innovative, authentic and diverse assignment tasks, and this has resulted in an increase in the written assignment types students are required to produce; this increase is the focus of this paper. A distinct assignment type implies distinct features of the written text itself, including structure (sometimes referred to in the literature as format or rhetorical organisation), audience, text purpose, text type and style. The more the assignment type and its features are made explicit to students, the more likely they are to fulfil these assessment requirements.

The research reported herein surveyed a range of assignments and analysed the written assignment briefs. The aim was to ascertain whether a range of assignment types indeed exists at Oxford Brookes University and if so, to gain insight into the nature of this range of assignment types, whilst also offering suggestions why this more varied range might or might not be beneficial to students. The aim was not to question the value of the assignment type variety, but to gather knowledge about the range of assignment types that exist, and as a consequence, to better equip staff to support students.

This paper begins by reviewing the literature concerning academic literacy and written assignment tasks. This is followed by an exploration of the extent to which written assignment task diversity is beneficial. It then moves on to look at the central concern of this research, which is the written assignment type. The methodology used for this research is outlined and a brief overview of results provided. This is followed by a discussion of the implications of there being an increasing range of written assignment types and thus an increasing number of expected variations between them. Several tentative ways forward are suggested.

The primacy of written tasks in academic literacy

Academic literacy is one of the five core graduate attributes in the Oxford Brookes Strategy for Enhancing the Student Experience and, at undergraduate level, is defined as follows:

Disciplinary and professional knowledge and skills, understanding the epistemology and ‘landscape’ of the discipline, and what it means to think and behave as a member of that disciplinary and/or professional community of practice. (Strategy for Enhancing the Student Experience, 2010-2015) 

Writing is one mode of demonstrating academic literacy and Candlin and Plum (1999, p.197 cited in Magyar, McAvoy and Forstner, 2011) refer to ‘the ineluctable integration of writing with the display of disciplinary knowledge’.  Students need to attain ‘expertise in the particular text types or genres and the acquisition of shared and specialised knowledge and terminology’ (Hughes 2009, p. 560). One of the main purposes of university study ‘is to induct students into particular disciplinary communities through supporting their development from novice to expert users of the text types which characterise these communities’ (ibid. p. 561) thus becoming academically literate.  However, diverse understandings of academic literacy both within and between subject discipline communities exist because tutors develop their own implicit view of what good writing is and students have to contend with this (Bloxham and Boyd 2007, p. 27).

Alongside the disciplinary discourse students need to master to become academically literate, Williams (2005) highlights assessment discourse: understanding assessment as a situated social practice to which students have to gain access, and whose discourse operates as a distinct element within academic discourse. Bowstead (2009, p.5), meanwhile, in discussing non-traditional students, speaks of the ‘language of academic discourse and the power relations at play’ and how ‘assignment briefs, marking criteria, feedback, lectures, tutorials, even learner support, are all couched in terms that reinforce the barriers between the members of the discourse community and those on the outside’. It is our role to help students cross these barriers and not only master the academic discourse of their discipline, but also the assessment discourse within this, in order to be able to engage as effectively as possible with the assessment process:

the social constructivist process model of assessment argues that students should actively engage with every stage of the assessment process in order that they truly understand the requirements of the process, and the criteria and standards being applied, and should subsequently produce better work (Price, O’Donovan & Rust, 2007, p.4).

To become members of the discourse community, to handle assessment, to engage with it, to gain mastery in it and to fulfil the aim of assessment, which is ‘to obtain evidence of how well students can do what is required’ (Sadler, 2010, p.1), the primary form of discourse is the written word. Students therefore have to produce a written text. The type of written text that is required is central to the process, as the type determines text features such as structure, audience, text purpose, text type and language style. Research indicates that coherent structure is one shared feature of student writing that tutors across subject groups value, even though differences exist in writing concerning structure, purpose and audience, both within and outside a field (Nesi and Gardner, 2005). It is therefore essential for students to be fully aware of the assignment type in order to fulfil these text feature requirements, thereby not only optimising their performance, but also gaining an introduction to, and practice in, the disciplinary discourse.

The call for diversity in written assignments

Previously, traditional assessment consisted of a limited number of assessment methods such as unseen exams, essays, reports or multiple-choice questions (see for example Falchikov, 2005, p. 32); these are described by Brown (2011, n.p.) as ‘uninspiring, tame and excessively traditional’, while Dunn claims learners become ‘essay producing machines’ (Dunn, 2002, p.2). However, the range of written text types that students now have to produce within a single discipline, across disciplines and over their academic career can be extensive, as the current research shows. Even within a single discipline such as business, each sub-discipline will have its own specific academic literacy requirements. For instance, although both may be framed as reports, an events proposal required in the hospitality field differs from a performance evaluation report in business management; indeed, multiple types of report can be required within a single module (Reid, 2010, p.6). The academic literacies approach warns that academic writing is not just subject specific but module specific, dependent on the course orientation and course designers, hence ‘literacies’ not literacy (Creme and Lea, 2003, cited in Bloxham and Boyd, 2007).

A call for an increase in the range of assignment tasks between and within different subjects has emerged in the literature, for the following reasons. Firstly, for reasons of employability, students need to develop multiliteracies (New London Group, 1996, cited in Hughes, 2009, p. 560) in order to ‘lay the foundations for lifetime learning’ and ‘preparation for work’ (Falchikov, 2005, p.32). A further reason is the impact assignment tasks can have on learning, with more innovative and authentic assignment tasks resulting in more engaged and deeper learning (Macquarie University, 2008). Additionally, these encourage the use of interpretation and criticism as opposed to memorisation (Boud, 1990, cited in Falchikov, 2005, p.39), whilst also testing a wide range of learning outcomes (Higher Education Academy, 2006). Furthermore, employing relevant and authentic assessment is said to increase levels of satisfaction (ASKe, n.d.). It is also claimed that diversity of assessment practices develops autonomous learning habits, thus enabling students to continue ‘education throughout life’ (Williams, 1992, p. 46, cited in Falchikov, 2005, p. 86) and thereby empowering learners (Falchikov, 2005, p. 82). A range of assessment tasks also allows assessment to be made more culturally diverse (MacKinnon and Manathunga, 2003, p. 142), developing intercultural communication and empathy writing skills more readily (Lea and Street, 2008), thus contributing to internationalising the curriculum. Reducing plagiarism is another argument, since designing authentic or innovative tasks reduces the likelihood of plagiarism (Carroll, 2002). Finally, employing a range of tasks which go beyond the traditional essay or report appeals to a diverse range of students with a ‘range of learning styles and preferences’ and is appropriate for ‘students with disabilities’ (Devlin, 2002, in James, McInnis and Devlin, 2002, p. 3). As can be seen from the above, a number of convincing arguments are offered for increasing the range of assignment tasks.

Alongside this call for an increase in the range of assignment tasks has come a request to provide students with a choice of assignment task (Rust, 2008), primarily since personalisation has been shown to increase motivation to undertake the task and to foster deep learning. Furthermore, providing an element of choice in assessment makes the assignment more inclusive, as all students have a greater opportunity to be successful (Macquarie University, 2008b). Having a culturally diverse choice of topics and texts not only helps international students to ‘demonstrate their understandings of concepts but also offers the opportunity for local students to broaden their horizons’ (Devlin, 2002, in James, McInnis and Devlin, 2002, p. 3), thereby going some way towards internationalising the curriculum. If choice is provided in assignment type, the range of assignment types a student undertakes would be further extended.

This increase in the range of assignment tasks corresponds with a change in the range of audiences the student is expected to write for, which has implications for the style of writing. Students are no longer writing solely for an academic community but also potentially for a professional audience, a combined academic/professional audience and even non-specialist audiences, as they are required to do in empathy writing assignments for instance. According to Lea and Street (1998), even traditional disciplines such as physics expect students to write for audiences outwith the academic community. Corresponding with the variety of assessment tasks and range of audiences, the purpose of the writing also varies. A reflective essay in health care, for example, would be written for a different audience and purpose from an events proposal in the hospitality field. An increased range of assignment types also affects the text types that students are expected to produce; it has been found that on average each assignment type requires 2.5 different text types (Moore and Morton, 2005). An increase in the range of assignment tasks therefore increases not only the assignment types that students have to produce, but also the corresponding range of audience, purpose and text type.

There is evidence in the literature that this call has been responded to and that a wider variety of assignment types now exists. For example, ASKe’s (Assessment Standards Knowledge Exchange, n.d.) research at Oxford Brookes claims that since the early 1980s ‘a huge variety of novel forms of assessment’ has been approved, while Ganobcsik-Williams (2001) identified 64 different types of writing, excluding the essay, in the education, English and engineering fields; findings which correspond to research into the range of assignment tasks at other institutions (see for example Falchikov, 2005; Moore and Morton, 2005; Gardner, 2006; Nesi et al, 2008; Gillett and Hammond, 2009; Nesi and Reid, 2010).

Several arguments exist as to why an increase in the range of assignment types students are required to produce is not necessarily to be welcomed. Gaining access to a disciplinary community, and therefore becoming academically literate, requires students to be skilled at writing in the discourse of that discipline. With a more limited range of assignment task types, students and assignment brief designers alike could quickly develop expertise not only in the content and processes inherent in these tasks, but also in the characteristic elements of the texts themselves. This was achievable previously through practice in a limited number of assignment types and through students coming to understand the criteria and learning outcomes via repeated practice (ASKe, n.d.). However, as discussed above, a range of assignment tasks now clearly exists, so students and lecturers are less likely to become experts in assignment types and more likely to remain novices in multiple types, with students having ‘little opportunity to get better at anything’ (Gibbs, 2011, n.p.). One study, for example, showed that a postgraduate student in education had to produce twelve different genres (Stierer, 2000, cited in Nesi and Gardner, 2005). Neither lecturers nor students will have been able to build up the bank of experience they would have acquired previously. If students no longer get the opportunity to become skilled writers in their discipline, due to the diverse range of assignment types they are required to produce, they may be denied access to the discourse community. Disciplinary acculturation may therefore be a long process, ‘especially as undergraduate students are often required to write in a much wider range of knowledge areas than experts do’ (Nesi and Gardner, 2005, p. 2) and, as the current research indicates, in a much wider range of genres.

A number of further arguments against a range of innovative assignment types have been raised. These include such assignments not being welcomed by students, due to unfamiliarity or the increased time needed to undertake them (Gibbs, 2006b, cited in Bloxham and Boyd, 2007); student confusion at the ‘sheer variety of types of assignment they are asked to tackle’ (ASKe, n.d., p. 15); students questioning the relevance of the skills involved (Hounsell et al, 2006, cited in Bloxham and Boyd, 2007); and, as Gibbs (2006b) suggests, students challenging assessments for which they have not been adequately prepared, which, one could argue, may be the case with unfamiliar assessments. Finally, although one of the arguments given in favour of a range of assignment types is that their authenticity fosters a deeper approach to learning, Gibbs (2011) has argued that, if faced with a variety of assignments, students become confused by the inherent ambiguity, experiencing anxiety which then tends to encourage a surface approach to learning.

The call in the literature to increase the range of innovative and authentic assignment types appears to have been responded to. However, there are convincing arguments as to why this might not be as beneficial to students as suggested, and, as pointed out by Evans and Abbott (1998, cited in Nesi and Gardner, 2005), innovation has been promoted without being based on empirical research. The paper will now turn to research analysing whether Oxford Brookes has responded to the call.

Methodology

Eighty written assignment task briefs were collected and analysed for the type of assignment the brief required, along with other key features. The majority of the sample briefs were from first-year undergraduate modules in one discipline, due to ease of access, although assignments from across disciplines and all stages of study were also included. The sample was restricted to written coursework assignment briefs and did not include exam briefs. Rather than providing a comprehensive picture of one discipline at one stage of study, the written assignment brief bank consisted of a range of written assignment tasks that Brookes students are expected to produce. It is important to emphasise that the sample size is limited and does not claim to represent the range of written assignment types across the whole university. This is partly because gaining access to assignment briefs is not straightforward as, unlike exam questions, there is as yet no central repository at Oxford Brookes.

With the primary aim of capturing a student perspective, the assignment types were identified from the briefs themselves, rather than inferred from exemplar texts, assessment criteria or other sources. The assignment type categories were derived inductively, that is, they emerged from the briefs themselves rather than from predetermined categories. The assignment type product was categorised, rather than the process, i.e. the skills students use to produce that assignment (as used by Gillett and Hammond, 2009). The assignment briefs were further analysed to see whether the characteristics of assignment types, namely structure (sometimes referred to in the literature as format or rhetorical organisation), audience, text purpose, text type and style (register), were identified and clarified for students. The findings from this, although of note, are not the focus of this paper.

Findings

Exploratory research on assignment briefs undertaken here at Brookes during 2009-2010 confirms the existence of a diverse range of written assessment tasks.

Figure 1: Assignment Type Analysis (%)

From the analysis, the two traditional assignment types, essays and reports, constituted 27% and 18% of the total respectively (see Figure 1 above). Case studies, research proposals, research projects and reviews made up a small minority (15%). A quarter (25%) of the sample consisted of various other types, including assignments such as an event proposal, a new magazine proposal, a media diary, a reflective statement, a formal letter, a questionnaire, a lesson plan, a written blog and a CV (see Appendix 1 for the full range of assignment types identified). Interestingly, the remaining 15% of the assignment briefs analysed did not make the written text type explicit, i.e. the written text type was unspecified (see Gilbert and Maguire, forthcoming, for a discussion of the implications of an unspecified written assignment text type for the student experience and assignment performance).

Discussion

According to the categorisation used in Nesi et al’s (2008) research into assignment genres, the 25% of various assignment types could be sub-divided into ‘genre families’. A ‘literature survey genre family’, for example, could include both an article summary and an annotated bibliography, i.e. one genre according to Nesi et al’s (2008) classification; yet these would not share text features and would therefore be distinct assignment types, requiring different structures, academic conventions, audience, purpose, text type, style, and so forth. Members of a single ‘genre family’ thus remain clearly distinct in structure and linguistic features. So, for the purposes of this research, with its base firmly rooted in analysing assignment briefs from students’ perspectives, it is not a categorisation with much surrender value. Furthermore, the assignment types identified were those specified in the briefs themselves and these are ‘known to be unreliable indicators of genre across disciplines’ (Currie, 1993, p. 102, cited in Nesi and Gardner, 2006, p. 104), so a report assignment will differ not only across disciplines but also within disciplines. Taking report writing as an example, findings from Reid’s (2010, p.5) study confirmed that students are required to write a range of ‘different types of reports… often within a single module’, although it is interesting that Reid notes these were not clearly defined as reports in the brief. Therefore, for the purposes of this research, any attempt at further subdivision would not be of benefit.

It is clear from the assignment briefs surveyed and analysed herein that a greater proportion (55%) of the written assignments that students at Oxford Brookes are required to produce are not of the traditional essay and report assignment types, which together account for 45%. There is thus some evidence that a wider range of assessed written text types than would previously have been available is now being set. The implications of this increased range are explored below.

Implications

Although often lauded in the literature as recommended practice, the existence of such a range of assignment types has several implications for practice worthy of consideration.

The first implication for students is that, as the range has increased, the assignment requirements have become significantly more varied, with students having to become more skilled than they might previously have needed to be in deciphering what is required of them, to ‘unpack what kind of writing any particular assignment might require’ (Lea and Street, 1998, p. 8). This skill of deciphering would certainly apply to the 15% of unspecified assignment types identified in this research. Sloan and Porter’s (2010) study reports student concerns regarding ‘understanding the different styles of assignment briefs’, the ‘language used in briefs’ and ‘unpacking the task’. Norton and Pitt (2009) quote a second-year student talking about assessment:

It’s hard to know exactly what they want. Some tutors write what they want in the lectures, others don’t, so it’s a bit of a guessing game really. I often go and see them if I’m unsure. The module handbook gives you an idea but it’s not always that great if you don’t understand. (Norton and Pitt, 2009, p. 27)

Evidence shows that if students do not understand the requirements of the assignment task, they will underperform (Nicol, 2007; Rust et al. 2003, cited in Macquarie University 2008a).

Given that a diverse variety of assignment types exists, it can be assumed that students not only have to produce different genres in different text types, but also have to switch between varying writing requirements, as significant differences exist in both academic styles and conventions across disciplines (Swales, 1990). Lea and Street’s (1998, p. 4) research found that a ‘dominant feature of academic literacy practices is the requirement to switch practices between one setting and another, to deploy a repertoire of linguistic practices appropriate to each setting’. Such switching can range across academic disciplines, fields of study, interdisciplinary courses and specific modules within programmes (Lea and Street, 1998, p. 6). This switching operated at a more complex level than genre, such as the ‘essay’ or ‘report’, but ‘lay more deeply at the level of writing particular knowledge in a specific academic setting’ (1998, p. 8) and may even extend to the different demands of individual subject tutors and their personal interpretation of writing requirements (Lea and Street, 1998, p. 6). According to one student in the study, ‘everybody seems to want something different’ (Lea and Street, 1998, p. 6). ‘University students must develop proficiency in multiple literacies in multiple Discourses’ (Gee, 1999a; Paxton, 1998, cited in Williams, 2005, p. 158), a skill not previously necessary, nor currently so in traditional universities.

A further implication of the widening range of assignment types is that, depending on a student’s past educational experience and/or cultural background, the types of HE assignment task set may be unfamiliar to students, increasingly so given the current call for innovation and authenticity in assignment tasks. Differences in academic conventions and styles exist not only across disciplines, and even to some extent within disciplines (Nesi and Gardner, 2005), as discussed previously, but also inevitably across cultures and languages (Duszak, 1996; Golebiowski and Liddicoat, 2002, cited in Magyar, McAvoy and Forstner, 2011). Some cultures, for instance, still rely primarily on exams as the assessment tool (see Carroll, 2008), hence students from such cultures may never have encountered an assignment task as coursework before, or may have experience of only a limited range of assignment types, which themselves may no longer even be relevant now that the range of assignment tasks has been extended. These students may never have undertaken any independent writing, particularly in types of discourse that may be peculiar to a new academic culture, such as, for example, reflective writing in British HE. This unfamiliarity concerns not only the disciplinary discourse required for the set task in the particular discipline but also the assessment discourse, i.e. the language used for assessment, for example in the assignment criteria and briefs:

Where students come from a background that is culturally similar to that of the university, the ‘default’ may not be as distant and therefore there may be apparent initial correlation with the lecturers’ usage of assessment discourse … access to that Discourse is more easily acquired. Where the Discourse ‘gap’ is wider the fallback is more noticeable. (Williams, 2005, p. 170)

A further implication of employing a range of assignment types is that, along with the variety of assignments, inevitably comes a variety of learning outcomes and assessment criteria. Even though these are explicit at Oxford Brookes, ‘it is common for outcomes to be unique to a module and to be associated with a unique form of assignment with unique criteria’ (ASKe, n.d., p. 16). This can result in students having a ‘single experience of each combination of outcomes, assignments, and criteria’, leading to confusion and never being able to build up expertise from repeated exposure (ASKe, n.d., p. 16). A range of assignment tasks can therefore potentially militate against ‘constructive alignment’, the interdependence of teaching, learning outcomes and assessment that HE strives towards (ASKe, n.d.).

In conclusion then, although having a range of assignment tasks on the assessment menu is often perceived to be beneficial for students for the reasons discussed earlier, it is important to note and take into account a number of implications that ensue. These are: the number of different assignment types required at university, unfamiliarity with certain assignment types, potential difficulty in understanding what is required, the requirement to develop multiple literacies and the potential lack of constructive alignment. In accordance with Magyar, McAvoy and Forstner, we need to be aware that ‘students can struggle with the variety of assignments they are asked to write’ (2011, p. 14). Becoming assessment literate can therefore present a challenge to both staff and students, particularly given this increased range of assignment types. Given that it is our responsibility to minimise any confusion and anxiety students may feel, several potential ways to achieve this are now suggested.

Ways forward

Developing assignment brief design

Assignment brief design has perhaps been a neglected stage in the assignment process but, given the call for innovative, authentic and diverse assignment tasks, coupled with the diverse student population and diverse study modes, it should be assigned more attention than it currently attracts. Not making the implicit explicit is a key contributing factor in student difficulty in interpreting assignment briefs. Students in a study at the University of Kent said they ‘would find it useful to have more explicit detail of what is expected from the tutor, this is especially the case for less familiar forms of assessment’ (Frith, 2005). Gibbs (1992, p.165) has highlighted how requirements have to be made explicit when employing new methods of assessment and Bailey (2010, p.11) has called for ‘more explicitness about the (frequently tacit) expectations of academic teaching staff as students produce writing and assessments for different teachers across various fields of study’. Dunn et al (2004, p. 243) also speak of not being explicit when communicating the task and our intentions. Developing a shared understanding of assessment discourse at module or field level, as discussed above, would help make the implicit explicit and, as a consequence, may have a positive effect on assignment brief design and subsequent student performance.

Lecturers, both experienced and novice, are expected to design and write assignment briefs with no training to do so. This is challenging and, with increasingly large cohort sizes and higher workloads, there is less time to allocate to such tasks. Dunn et al (2004, p. 244) list reasons why the task may be less effectively communicated to students, namely ‘hurried preparation, attitudes about ‘spoon-feeding’, and our own lack of clarity’. With no training in designing and writing assignment briefs, combined with the aforementioned factors militating against effective assignment brief design, it is hardly surprising that university educators do not always communicate the written assignment task in the brief as effectively as they might. Combine the above factors with designing and writing a brief for an unfamiliar assignment type, and it is easy to see how assignment brief design can be problematic, particularly if the assignment is also innovative and authentic.

There are two ways that the issue of assignment brief design could be addressed. One is by establishing a central consultancy (see for example Gilbert and Maguire, 2011), the aim being to promulgate more effective assignment brief design. Another is by offering training in assignment brief design. In addition to the above two possible courses of action, it is also suggested that a central digital repository be set up where lecturers submit their assignment task briefs (as happens currently at Oxford Brookes with exam briefs), either to an institutional or departmental repository. This repository would not only prove invaluable in research such as the exploratory research described herein, but also function as a resource bank of exemplar assignment tasks for lecturers, as well as serving for subject, field or institutional course development, audits and accreditations.

Developing assessment literacy of staff

Higher education institutions are aware of the need to increase ‘transparency in assessment processes’ (Rust, Price and O’Donovan, 2003). At Oxford Brookes, for example, this has manifested itself in the ASKe project, one of whose outcomes is an assessment agreement, the Brookes Assessment Compact 2009, which aims to develop assessment literacy in staff and students alike. The call for higher education staff to understand the assessment process in greater depth is evident throughout the literature. Carroll, for example, suggests that for teachers the most effective way of helping students is to become ‘more knowledgeable about their own academic culture’ (Carroll, 2005, in Carroll and Ryan, 2005, pp. 26-27). Similarly, Gillett and Hammond (2009, p.133) state that:

… lecturers need to understand their assessment practices fully in order to make them explicit for learners, to identify and disseminate best practice amongst colleagues and ultimately help their students to succeed (Gillett and Hammond 2009, p.133).

One lecturer interviewed in Bailey’s (2010) research summarises this as follows:

They (students) need to understand why we ask them to write (for assessment) in certain ways and we need to show more understanding of the difficulties they face. We need to do this to retain students. (Bailey, 2010, p.11)

Several authors refer to our duty as teachers in higher education and how it is our responsibility to help students ‘develop strategies for handling examinations and coursework’ and ‘the skills of competence and literacy needed in their research and study skills’ (Gillett and Hammond 2009, p. 120).  In the same vein, Sadler (2010, p.1) says it is an ‘ethical imperative’ to help students understand the assignment task and to know when their work fulfils the task requirements.

One solution which may go some way towards developing assessment literacy would be to promote shared understandings of assessment amongst teaching teams. There are numerous references in the literature to the need for shared understanding (see for example Dunn et al, 2004, p. 247) and one area of assessment that often gains attention is the key assessment terms used in assignment briefs, for example ‘critically analyse’ and ‘discuss’. These terms are open to multiple interpretations and, if staff as experts cannot agree on the definition and uses of such terms, how can ‘novice’ students be expected to ‘mirror our interpretation’? (O’Donovan, Price and Rust, 2004, p. 6). Similarly, Lea and Street’s (1998, p.7) research found that ‘while academic staff can describe what constitutes successful writing, difficulties arose when they attempted to make explicit what a well developed argument looks like in a written assignment’. While it would not be feasible to share the same interpretation of key assessment terms across a whole institution, at field or programme level staff could agree on a shared interpretation and make this explicit to students.

Two further areas of practice which could be standardised within a field or programme are, firstly, a shared understanding of the assignment types common to that discipline and their features and, secondly, academic referencing conventions. Bloxham and Boyd (2007) refer to the varying emphases of different tutors and modules and the considerable differences between academic practices. Discussing these areas of practice and, if necessary, making the implicit explicit may provide a partial solution to ‘the conflicting and contrasting requirements for writing on different courses and from different tutors’ documented in Lea and Street’s study (1998, n.p.), and assessment discourse would move towards being used more consistently within a field or programme.

Developing academic literacy in students through scaffolding  

It has been shown that ‘assessment has most effect when students are inducted into the assessment practices and cultures of higher education’ (Boud et al., 2010, n.p.) and Nesi and Gardner (2005, p.114) found significant evidence that tutors felt it was their ‘responsibility to introduce students to norms specific to their area’. One way to induct students into the range of tasks facing them in their discipline is to provide scaffolding for students (particularly novice students) undertaking novel assessment tasks. Scaffolding occurs where an expert provides carefully selected and staged activities of an appropriate degree of challenge, introduced and then removed at the appropriate time, preferably sequenced developmentally towards student autonomy. Scaffolding supports the learners’ development in gaining competence in the task as learners ‘generalize the skill, to learn when the skill is or is not applicable, and to transfer the skill independently when faced with novel situations’ (Collins, Brown and Holum, 1991, p. 3). Key to this is that, after this support has been removed, the learner becomes fully autonomous. In terms of assignment tasks, scaffolding deconstructs the task to aid students in performing it (Collins, Brown and Holum, 1991) and allows the sharing of tacit knowledge (Bloxham and Boyd, 2007, p. 80). Scaffolding assignment tasks would therefore enable students to ‘have a clear understanding of expectations and, therefore, a reasonable chance of success’ (Macquarie University, 2008a). Through scaffolding, students are introduced to the ‘expectations of writing in their subject area’ and learn ‘writing practices’ and ‘how written products are organised’ in their discipline, practices which Nesi and Gardner (2005, p.112) found evidence of in their research. Where it is assumed that students are ‘familiar with a text type, there may be less instruction’ (Nesi and Gardner, 2005, p. 112), i.e. scaffolding becomes less of a requirement as student expertise increases.

As a result of providing scaffolding, not only is less student support needed for an assignment, but, as students progress through their academic career, they will be better prepared to process assignment briefs, developing and activating the schemata of assignment types more readily, thus enabling them to produce what the lecturer has in mind for future assignment tasks:

Over time, scaffolding can be progressively withdrawn as students become more familiar and competent in understanding and meeting the requirements of particular assessment tasks (Macquarie University, 2008a). 

Use of scaffolding in the early stages of a student’s career acts as assignment brief training, and will contribute to helping students ‘master’ the assessment discourse (Williams, 2005).

It may be worth highlighting, however, that given the increase in the range of tasks discussed above, students may arguably come across ‘novel assessment tasks’ throughout their academic career and, since the ‘text types which characterise disciplinary communities’ (Hughes, 2009, p. 561) may therefore no longer be characteristic of that academic community, scaffolding may well be necessary throughout a student’s studies. Alternatively, students could be trained directly to become more autonomous in processing, deconstructing and dealing with unfamiliar written assignment tasks, a discussion of which follows.

Developing academic literacy in students through training

Students from a traditional study background will be accustomed to being assessed on their performance in traditional assessment tasks (see the student quote below), such as essay writing. However, as seen in this research and elsewhere, the range of assignment types is growing, so students will increasingly come across unfamiliar assignment tasks, particularly in modularised or cross-disciplinary courses. Where essays may no longer be on the assignment type menu, students should perhaps be trained to adapt to the requirements of diverse, unfamiliar written assignment tasks. Gibbs (1981) states that for students to be effective learners in higher education they need to develop skills in recognising the requirements of a task and to respond flexibly to the different demands of assignment tasks, a flexibility which is echoed by North (2005). In terms of report writing, for example, ‘the report genre should be taught as a tool for invention, not merely as an organisational pattern or formula’ (Sheehan and Flood, 1999, cited in Reid, 2010, p. 14), so for students to negotiate the literacies required to write reports they need to be supported in identifying the audience and purpose of their reports (Reid, 2010, p.15). This flexibility is particularly pertinent as the growing range of tasks is diverse, innovative and not possible to predict (Nesi et al, n.d.). Developing flexibility in response to tasks also reflects the workplace, where there is generally no set way to approach a task. Therefore, training students to interpret and approach the assignment brief appropriately develops skills of autonomy and literacy, both of which are key graduate attributes.

Developing academic literacy through embedding writing in the disciplines

A further way to help students respond to the range of tasks and associated discourses they have to engage with at university is to address writing skills within the discipline, rather than as a separate academic writing module or as study skills support. Research into academic literacies (see Lea and Street, 1998) has shown that academic discourse cannot be separated from context or from cultural practices within the discipline; there has therefore been ‘a move away from generic provision and teaching of ‘academic writing skills’ to embedding writing in the disciplines’ (Bailey, 2010; Ganobcsik-Williams, 2003, cited in Magyar, McAvoy and Forstner, 2011, p. 2). Students need to be prepared for assessment in context, and context differs not only by discipline but by module and tutor; this is ‘particularly important if students are facing an unfamiliar method of assessment’ (Bloxham and Boyd, 2007, p. 75). Furthermore, in terms of learning resources, although there is a ‘wealth of material available to students wishing to prepare for their assessment tasks’, there is, according to Gillett and Hammond (2009, p. 122), a ‘mismatch’ between guidance which focuses on traditional assignment types and the wide range of assignment types that students have to produce in their discipline. By embedding academic writing in the disciplines, the restricted view of assessment in higher education that study skills support and resources target may be at least partially resolved.

Lea and Street (1998, p.8) found that students’ writing difficulties lay not in using the correct terminology, or in just learning to do ‘academic writing’ as the academic socialisation model would suggest, but rather in adapting previous knowledge of writing practices, academic and other, to varied university settings.

They continue by quoting a student:

The thing I’m finding the most difficult in my first term here is moving from subject to subject and knowing how you’re meant to write in each one. I’m really aware of writing for a particular tutor as well as for a particular subject. Everybody seems to want something different. It’s very different to A levels where we used dictated notes for essay writing.  (Lea and Street, 1998, p.8)

Nesi and Gardner (2006, p. 114) report similar findings, namely that students need to be ‘alert to differences not only across subject areas, but also across assignments’. Inducting students and embedding writing in the discipline would go some way towards helping students meet the range of assignment tasks they may face in that discipline.

Conclusion

As ‘assessment drives learning’ (Falchikov, 2005, p. 32), written assignment tasks are clearly a driving force in the assessment process. The call to increase the range of assignment types in higher education continues, in the interests of employability, of fostering more engaged and deeper learning, of cultural diversity and of internationalising the curriculum. A choice of assessment tasks has also been called for, in order to influence the approach to learning that students adopt. There is evidence from both the literature and the research reported herein that this call has been responded to, so, with the profession no longer ignoring the ‘full range of assessment techniques’ at its disposal (Rust, 2002, cited in Gillett and Hammond, 2009, p. 122), there now exists a growing range of innovative and authentic written assignment tasks used for assessment. The range is ‘varied and dynamic’ (Gillett and Hammond, 2009, p. 134) and, although there are several arguments opposing this extended range, it is widely accepted as reflecting good practice. It is not surprising, however, that hand in hand with this increase in the range of tasks, written perhaps for audiences outside the academic community and coupled with an element of choice for students, the challenge for assignment task designers to ensure students are clear about what is expected of them is also growing.

Therefore, whilst not denying the inherent benefits of this move away from the more restricted diet of traditional written assignments, there are resulting implications, several of which are discussed above, and these need to be addressed in order to enable students to perform at their optimal level when undertaking such a diverse range of tasks. The tentative solutions proposed here to address these concerns, primarily the development of several aspects of assessment literacy knowledge and skill among staff and students alike, may contribute towards meeting the challenges the current assignment task diet presents, whilst trying to ensure it remains as healthy as it is varied.

List of references

  • ASKe. (no date). Assessment – an ASKe position paper. Retrieved on 18 August 2011.
  • Bailey, R. (2010). The role and efficacy of generic learning and study support. Journal of Learning Development in Higher Education. Issue 2. February 2010.
  • Bloxham, S. and Boyd, P. (2007). Developing Effective Assessment in Higher Education: a practical guide. McGraw-Hill: Open University Press.
  • Boud, D. and Associates. (2010). Assessment 2020: Seven propositions for assessment reform in higher education.  Sydney: Australian Learning and Teaching Council
  • Bowstead, H. (2009). Opinion Piece: Teaching English as a foreign language – a personal exploration of language, alienation and academic literacy. Journal of Learning Development in Higher Education. Issue 1: February 2009.
  • Brookes Assessment Compact. (no date). Retrieved on 16 August 2011.
  • Brown, S. (2011). Authentic assessment at Masters level. Keynote speech. Assessment in Higher Education conference. July 2011. University of Cumbria.
  • Brown, E. and Glover, C. (2006). Evaluating written feedback. Innovative Assessment in Higher Education. Bryan, C and Clegg, K (eds) London: Routledge.
  • Carroll, J. (2002). Dealing with plagiarism. Learning and Teaching Briefing Papers Series. Oxford Centre for Staff and Learning Development OCSLD. Oxford Brookes University.
  • Carroll, J. (2008). Assessment Issues for International Students and for Teachers of International Students. The Enhancing Series Case Studies: Internationalisation. HEA: Hospitality, Leisure, Sport and Tourism Network. Assessment Standards Knowledge Exchange (ASKe).
  • Carroll, J & Ryan, J. (2005). Teaching International Students. London: Routledge.
  • Collins, A., Brown, J.S., Holum, A. (1991). Cognitive apprenticeship: making thinking visible. American Educator. (Winter) 6-11, 38-46.
  • Devlin, M. (2002). Assessing students unfamiliar with assessment practices in Australian higher education. In James, R., McInnis, C. and Devlin, M. (eds) (2002) Assessing learning in Australian universities. Australian Universities Teaching Committee.
  • Dunn, L. (2002). Selecting methods of assessment. Learning and Teaching Briefing Papers Series. Oxford Centre for Staff and Learning Development.
  • Dunn, L., Morgan, C., O’Reilly, M., and Parry, S. (2004). The Student Assessment handbook. Abingdon: RoutledgeFalmer
  • Falchikov, N. (2005). Improving Assessment Through Student Involvement. Abingdon. Routledge.
  • Frith, L. (2005). University of Kent undergraduate students’ views of assessment 2005. UELT University of Kent.
  • Ganobcsik-Williams, L. (2001). Teaching writing at university: a survey of staff perspectives. Paper presented at Teaching Writing in Higher Education: An International Symposium (Warwick Writing Programme, the University of Warwick. March 2001). Cited in Nesi et al. (n.d.) Towards the compilation of a corpus of assessed student writing, an account of work in progress.
  • Gibbs, G. (1981). Teaching learners to learn. Buckingham: OUP.
  • Gibbs, G. (1992). Improving the quality of student learning. Oxford Centre for Staff Development. Bristol: Technical and Educational Services Ltd.
  • Gibbs, G. (2009). Developing students as learners- varied phenomena, varied contexts and a developmental trajectory for the whole endeavour. Journal of Learning Development in Higher Education. Issue 1: February 2009.
  • Gibbs, G. (2011). How to change assessment of degree programmes so as to improve student learning. 2011 Oxford Brookes University. ASKe seminar Online powerpoint.
  • Gilbert, F. and Maguire, G. (2011). Optimising Student Performance Through Assignment Brief Design. Workshop. Oxford Brookes Teaching and Learning conference. June 2011.
  • Gillett, A. and Hammond, A.  (2009). Mapping the Maze of Assessment.  Active Learning in Higher Education Vol.10 (2): 120-137.
  • Higher Education Academy. (2006). QAA Code of Practice for the assurance of academic quality and standards in higher education. Section 6 Assessment of students. September 2006.
  • Hughes, C. (2009). Assessment as text production: drawing on systemic functional linguistics to frame the design and analysis of assessment tasks. Assessment and Evaluation in Higher Education Vol. 34, No 5, October 2009, pp 553-563.
  • James, R., McInnis, C. and Devlin, M. (2002). Assessing Learning in Australian Universities. Australian Universities Teaching Committee.
  • Lea, M. and Street, B. (1998). Student writing in higher education: An academic literacies approach. Studies in Higher Education. June 1998 Vol. 23 Issue 2. pp. 157 – 173.
  • MacKinnon, D. and Manathunga, C. (2003). Going global with assessment: what to do when the dominant culture’s literacy drives assessment. Higher Education Research & Development, Vol. 22, No. 2, 2003.
  • Macquarie University, (2008a). Assessing first year students. Assessment toolkit resources. Learning and Teaching Forum: Assessing Learning, Communicating Standards. Macquarie University. 2008.
  • Macquarie University, (2008b). Designing Assessment for Learning. Assessment toolkit resource. Learning and Teaching Forum: Assessing Learning, Communicating Standards. Macquarie University 2008.
  • Magyar, A., McAvoy, D., and Forstner, K. (2011). ‘If only we knew what they wanted’: bridging the gap between student uncertainty and lecturers’ expectations. Journal of Learning Development in Higher Education. Issue 3: March 2011.
  • Moore, T., and Morton, J. (2005). Dimensions of difference: a comparison of university writing and IELTS writing. Journal of English for Academic Purposes 4 (2005). pp. 43-66
  • Nesi, et al. (n.d.). Towards the compilation of a corpus of assessed student writing, an account of work in progress. Retrieved on 20 August 2011.
  • Nesi, H. and Gardner. S. (2005). “Variation in disciplinary culture: University tutors’ views on assessed writing tasks.” In: Kiely, R., Clibbon, G., Rea-Dickins, P. & Woodfield, H. (eds) Language, Culture and Identity in Applied Linguistics. London: Equinox Publishing.
  • Nesi, H. et al. (2008). An Investigation of Genres of Assessed Writing in British Higher Education: Full Research Report ESRC End of Award Report, RES-000-23-0800. Swindon: ESRC.
  • Nicol, D. (2007). Principles of good assessment and feedback: Theory and practice. Assessment design for learner responsibility. pp. 29-31. Retrieved on 8 July.
  • North, S. (2005). Different values, different skills? A comparison of essay writing by students from arts and science backgrounds. Studies in Higher Education. Vol. 30, No. 5, October 2005. pp. 517-533.
  • Norton, L. and Pitt, E. (2009). Writing Essays at University. A guide for students by students. Write Now Centre for Excellence in Teaching and Learning. Assessment Plus. Retrieved on 12 July 2010 from
  • O’Donovan, B, Price, M and Rust, C (2004). Know what I mean? Enhancing student understanding of assessment standards and criteria. Teaching in Higher Education, 9 (3). pp. 325-335.
  • Oxford Brookes University. (no date). Strategy for Enhancing the Student Experience 2010-2015. Retrieved on 16 August 2011.
  • Paltridge, B. (1996). Genre, text type, and the language learning classroom. ELT Journal. Vol. 50/3. Oxford University Press.
  • Price, M., O’Donovan, B. and Rust, C. (2007). Putting a social-constructivist assessment process model into practice: building the feedback loop into the assessment process through peer review. Innovations in Education and Teaching International, 44 (2). pp. 143-152.
  • Reid, M. (2010). More than just having the right headings: supporting students’ report writing. Journal of Learning Development in Higher Education. Issue 2: February 2010. pp.1 -17
  • Rust, C., Price, M. and O’Donovan, B. (2003). Improving students’ learning by developing their understanding of assessment criteria and processes. Assessment and Evaluation in Higher Education, Vol. 28, No. 2, 147-164.
  • Rust, C., O’Donovan, B. and Price, M. (2005). A social constructivist assessment process model: how the research literature shows us this could be best practice, Assessment and Evaluation in Higher Education, 30 (3), 231-240.
  • Rust, C. (2007). Towards a scholarship of assessment. Assessment and Evaluation in Higher Education. Vol. 32. No. 2. 229-237.
  • Rust, C. (2008). Plenary on Assessment and Feedback. Promoting Enhanced Student Learning Conference. 2009 University of Nottingham. Online Recording.
  • Sadler, D. R. (2010). Assessment task design principles. ASKe Workshop on assessment. Oxford Brookes University. October 2010.
  • Sloan, D. and Porter, E. (2010). It’s all in the words: the impact of language on the design and development of assessment briefs for international students.  HEA Conference Shaping the Future: Future Learning. Northumbria University. June 2010.
  • Swales, J.M. (1990). Genre Analysis: English in academic and research settings. Cambridge: Cambridge University Press.
  • Williams, K. (2005). Lecturer and first year student (mis)understandings of assessment task verbs: ‘Mind the gap’. Teaching in Higher Education, 10:2, pp157-173.

Appendix 1

Assignment types identified from 80 assignment briefs at Oxford Brookes, 2009.
