
HERDSA Review of Higher Education, Vol. 6, 2019 www.herdsa.org.au/herdsa-review-higher-education-vol-6/65-89

Making the connections between academic work and learning analytics: A framework for driving learning analytics take up

Deborah West* Flinders University

Abstract

Developments in the field of learning analytics have heralded the promise of new insights into learning and teaching for some years now (Mazza & Milani, 2004; Romero, Ventura, & García, 2008). Responding to a range of drivers in the field, the sector took up the challenge of applying emerging data science techniques. Early work focussed on the issue of student retention but has since moved to a broader focus on student success, opening up a myriad of opportunities to improve the learning and teaching experience. Yet, despite the field making substantial progress in relation to analytics techniques, the translation of this work into practice has been somewhat limited. A key challenge for translation into practice has been the take up of learning analytics across institutions by those most connected to the teaching: academics. While many people in institutions have a role to play in supporting students and moving learning analytics forward, academics are pivotal as they are central to the curriculum, to the teaching and to the relationship with students. This review provides a background to developments in the field, focussing on the connections between learning analytics and academics and highlighting the key challenge we face in increasing the take up and application of learning analytics. In order to move forward, connections need to be made between academic work and learning analytics. These are presented in a framework for driving learning analytics take up, including the connections to the purpose of various reports and to learning and teaching lifecycles.

Keywords: learning analytics; academic engagement; applications.

* Email: [email protected]


1. Introduction

While the use of data in higher education is not new, the field of learning analytics is a relatively recent development which holds much promise. However, despite considerable work taking place (see Joksimović, Kovanović, & Dawson, 2019 for an overview), the impact—that is, the translation of research into practice—still appears to be somewhat limited. This is largely related to the take-up of the tools by key stakeholders, particularly academics.

The review explores the broader field of learning analytics, focussing on engagement with academics. It begins with a brief background to the key drivers and work in the field, drawing attention to the connections between educational technology, political drivers and developments in the field. Drawing on a range of literature, the key challenge of engaging academics in the use of learning analytics is then explored. The final section presents a framework for connecting key elements and academic requirements within the educational context to drive the use of learning analytics.

2. The development and drivers of learning analytics

While higher education has used data to inform its practices and understand various student cohorts for many years, learning analytics remains at an early stage of development. The application of analytics, seen as the overarching concept of data-driven decision making (Watson, 2010), took hold in business and other fields to drive recommendation engines, identify patterns of behaviour and develop targeted campaigns (Brown, Costello & Giolla-Mhichil, 2016; Manyika et al., 2011; Wilson, Watson, Thompson, Drew & Doyle, 2017). Underpinning the development of analytics is the rise of the digital environment across society, which generates 'big data' and, with it, the ability to capture, store and analyse such data (Manyika et al., 2011, p. 1). The education sector likewise began digitising data such as library records, student information records and learning transactions, allowing analytic techniques and approaches to be applied for a range of purposes. As a result, universities and academics began to consider how the affordances offered by this field might be applied to education.

In part, this led to an early focus on definitional debates which tried to articulate differences between business analytics, academic analytics, educational data mining, predictive analytics, action analytics (amongst others) and learning analytics (for example, see Chatti, Dyckhoff, Schroeder, & Thüs, 2012; Ferguson, 2012; van Barneveld, Arnold & Campbell, 2012). Fuelling this debate were the differences which arose from some definitions focussing on particular applications and others relating to data analysis techniques. Considerable effort was therefore expended on nuancing definitions and positioning, despite the considerable overlap between them depending on the perspective taken. Buckingham Shum and Ferguson (2012, p. 3) observed that the core proposition is that, as unprecedented amounts of digital data about learners' activities and interests become available, there is significant potential to make better use of this data to improve learning outcomes. Generally, however, the definition put forward by Long and Siemens (2011, p. 34) has been widely accepted, with learning analytics defined as "…the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs".

Several key drivers related to the use of technology in education and the broader political context led to the increasing focus on learning analytics, the techniques used and the focus of their application. The widespread introduction of virtual learning environments (Mazza & Milani, 2004; Romero, Ventura, & García, 2008) and their ability to capture transactional data provided a potential window into the application of analytics to learning and teaching. The first key question therefore became: how can we extract value from big sets of learning-related data?

Yet, as the use of technology provided opportunities for different approaches to education and a re-thinking of distance education, it also presented a new set of challenges for online learning (Dringus & Ellis, 2005). Early work in this area found that online courses have higher attrition rates than more traditional face-to-face classrooms (Diaz, 2000). Investigations into the differences between 'click' and 'brick' establishments suggest several causal factors, and conflicting results among the studies led to further debate as to which factors were the more important (Diaz, 2000; Dekker, Pechenizkiy, & Vleeshouwers, 2009; Rivera & Rice, 2002). For example, one possibility is differences in the student demographic, as there tend to be more students from lower socio-economic backgrounds, and with fewer formal educational qualifications, enrolled in online courses. There can also be difficulties with effectively using the technologies needed for online study (Mazza & Dimitrova, 2004), increased time constraints, less academic 'preparedness' for study, and students feeling less supported due to a lack of face-to-face contact (Frankola, 2001). The second driver was thus an educational challenge: how can we optimise and improve opportunities for online learning?


The growing use of online platforms was also connected to widening participation agendas in higher education. For example, the Review of Australian Higher Education, better known as the 'Bradley Review', set some key targets: "the target proposed for higher education is that 40 per cent of 25- to 34-year-olds will have attained at least a bachelor-level qualification by 2020" (Bradley, Noonan, Nugent, & Scales, 2008, p. xvi). Such targets were seen to be connected to the inclusion of more students from lower socioeconomic groups, non-English speaking backgrounds, students with a disability, women in non-traditional areas, regional and remote students, and Indigenous students. Similar participation agendas were also being promoted internationally (O'Shea, Onsman & McKay, 2011). This raised considerable concerns related to retention rates, which were traditionally lower amongst equity groups (DIICCSRTE, 2012). As such, the third driver for the adoption of learning analytics related to both political and economic concerns: how could widening participation and student success be enhanced in this changing environment?

With these drivers in place, and an institutional imperative to respond, the use of learning analytics techniques to address student retention became a clear focus. One of the earliest examples was the Purdue Signals Project (Arnold & Pistilli, 2012; Campbell, DeBlois & Oblinger, 2007), which generated much interest in the sector and prompted many to begin exploring how learning analytics could be used for this purpose while also focussing on institutional capacity building (for examples see Lawson, Beer, Rossi, Moore, & Fleming, 2016; Liu, Rogers, & Pardo, 2015; Marbouti, Diefes-Dux, & Madhavan, 2016; Norris & Baer, 2013; Sclater & Mullan, 2017). Initiatives such as the commissioning of two strategic projects on the use of learning analytics for retention and capacity building by the now-defunct Office for Learning and Teaching (OLT) in Australia demonstrate government interest in this field (Colvin et al., 2016; West, Huijser, Heath, Lizzio, et al., 2016). In the UK, JISC also began several long-term projects to investigate how learning analytics may be positioned, developed and utilised to improve outcomes in education (JISC, 2018a; 2018b; 2018c). However, both within and beyond the learning analytics field, the concept of retention was being broadened to take a greater focus on student success.

Student success

In 2009, Vincent Tinto suggested universities need to take a holistic approach to student retention in recognition of the critical role played by the broader educational environment. Tinto (2009) identified four key conditions of student success: expectations, support, feedback and involvement (or engagement). Building on Tinto, and with a focus on the first-year experience and progression through a course, Kift, Nelson and Clarke (2010) (see also Kift, 2009; Kift, 2015; Nelson, Kift & Clarke, 2012) developed a 'transition pedagogy', putting forward a framework for such a holistic approach. Further work by Nelson, Clarke, Stoodley, and Creagh (2014) developed a Student Engagement Success and Retention Model (SESR-MM), which included the categories of learning, supporting, belonging, integrating and resourcing, and allowed institutions to determine their capacity and readiness in relation to student support (Nelson & Clarke, 2014).

This work on student success and retention highlights several key elements which are connected to developments in learning analytics, as they:

> situate retention as part of a broader student success model

> highlight that student success is an institutional issue and responsibility and that all areas of the university have a role to play

> draw attention to various elements which are potentially measurable with learning analytics

> call attention to the process of progression through a lifecycle of experience and study

Together with increasing calls to ensure that learning analytics retains its focus on learning, a broader focus emerged.

Colvin et al.'s (2016) study found that universities focussed primarily on using learning analytics either to support student retention or to gain greater insight into teaching and learning practices. Numerous other papers discuss links between student engagement and success, suggesting that engagement tracking through the use of learning analytics can aid in supporting student success (Fritz & Whitmer, 2017; Li & Tsai, 2017; O'Riordan, Millard, & Schulz, 2016), while a number of others discuss identifying students who are at risk of either failing or leaving the institution (Lawson, Beer, Rossi, Moore & Fleming, 2016; Liu, Rogers, & Pardo, 2015; Marbouti, Diefes-Dux, & Madhavan, 2016; Sclater & Mullan, 2017).

In a parallel Office for Learning and Teaching (OLT) project, West, Huijser, Heath, Lizzio, et al. (2016) explored institutional practices linking learning analytics with student retention and success. They identified the importance of using and exploring data appropriately within institutional contexts, as well as ensuring relevant parties are included in developing and implementing policy and practice. The focus on appropriate use of data encompassed the investigation of concerns related to the ethical use of data (Colvin et al., 2016; Ifenthaler & Tracey, 2016; Lawson, Beer, Rossi, Moore, & Fleming, 2016; Pardo & Siemens, 2014; West, Huijser, & Heath, 2016; West, Huijser, Heath, Lizzio, et al., 2016), which was particularly prominent in early discussions of learning analytics. While much of the work related to ethics focussed on issues of data privacy and protection, connections were also made to the ethical imperative to apply increasingly sophisticated techniques for the benefit of students in a way which improved learning and supported student success (Prinsloo & Slade, 2016). As such, retention continued to be a key focus but was subsumed under the broader heading of student success.

With this broadened view, the field began to explore the application of increasingly sophisticated techniques and thinking to learning and student support. Work to date internationally (Sclater, Peasgood & Mullan, 2016) provides evidence to support the validity of predictive models via learning analytics and anticipates that they can be used in several ways:

> as a tool for quality assurance and quality improvement

> as a tool for improving retention

> as a tool for identifying and responding to varying outcomes for different and varied student cohorts
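
As a purely illustrative sketch of the kind of predictive model referred to above, the following example fits a simple classifier to hypothetical LMS engagement features in order to flag students who may be at risk of not completing a unit. The feature names, data and risk threshold are assumptions invented for illustration; they are not drawn from the studies cited in this review.

```python
# Illustrative sketch only: a simple predictive model flagging students at risk
# of non-completion from hypothetical LMS engagement features. The features,
# training data and risk threshold are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per student: [logins in weeks 1-3, items submitted, days since last access]
X_train = np.array([
    [12, 2, 1], [3, 0, 9], [8, 1, 2], [1, 0, 14], [15, 2, 0], [2, 1, 7],
])
y_train = np.array([0, 1, 0, 1, 0, 1])  # 1 = did not complete the unit (historical outcome)

model = LogisticRegression().fit(X_train, y_train)

# Score the current cohort and flag anyone above an (arbitrary) risk threshold.
current = np.array([[4, 0, 6], [11, 2, 1]])
risk = model.predict_proba(current)[:, 1]
flags = risk > 0.5
for student, (r, flagged) in enumerate(zip(risk, flags), start=1):
    print(f"Student {student}: risk={r:.2f}, follow up={'yes' if flagged else 'no'}")
```

Even a simple model of this kind would need to be validated against institutional data, and attention paid to the ethical issues discussed elsewhere in this review, before any intervention was attached to its output.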

Additionally, the 2016 Horizon strategic brief (Hall Giesinger, Adams Becker, Davis, & Shedd, 2016) and the 2017 Horizon report (Adams Becker, Cummins, Davis, Freeman, Hall Giesinger, & Ananthanarayanan, 2017) closely link learning analytics to adaptive learning. Adaptive learning refers to the capacity to modify course content in response to student learning and the development of skills, with the report claiming,

enabled by machine learning, these technologies can adapt to a student in real time, providing both instructors and students with actionable data (Adams Becker, Cummins, Davis et al., 2017, p. 38).

However, the impact of such developments will only occur when such techniques are applied and actioned at scale.

While the potential of learning analytics in the sector has been well documented over the last few years, its development and application in teaching (classroom) contexts remains at an early stage (Colvin et al., 2016; Sclater, Peasgood & Mullan, 2016; West, Huijser, Heath, Lizzio, et al., 2016).


3. The challenge: increasing the take up and application of learning analytics

Despite its potential to transform higher education, the field of analytics has yet to yield operational benefits in this sector equivalent to those seen in other fields such as banking, health and retail. Clearly, in order to reap any benefits, such techniques need to be applied, utilised and actioned systematically (Sclater & Mullan, 2017; West, Huijser, Heath, Lizzio, et al., 2016). Herein lies our challenge: how can we move forward on the take up and application of analytics?

An argument is often made that the education sector is different from other fields. Where it is similar is that each sector has had to work out how such methodologies are best applied to its particular context in order to apply techniques appropriately and gain the relevant benefit. Work in the field of learning analytics has moved us forward in this contextualisation, as outlined above. However, application and take up remain limited. A recent content analysis of learning analytics literature published from 2012 to early 2018 (West, Luzeckyj, Toohey, & Searle, 2017) found that most of the reported work remained local or discipline based and in pilot stages, with limited testing of efficacy. The fact that major initiatives such as the SHEILA Project (2018) and the work of JISC continue to focus on developing institution-wide capacity in learning analytics suggests that there is some way to go before the application can be considered widespread. There are several key, separate but connected, issues related to take up and application, including engagement of staff and students; differences in pedagogy, program design and delivery; and the cognitive nature of learning (including how to make this more transparent).

As noted in both the student success and the learning analytics literature, staff from across the institution have a role to play in each. It therefore follows that gaining traction with learning analytics and applying it to student success requires staff in a diverse range of roles and areas of expertise coming together, engaging in the process and utilising the techniques and tools available. However, several related but separate challenges present themselves. The first is the development of appropriate reports, algorithms and approaches; the second is the actual application and use of the reports.

The issue of skilled staff was evidenced as early as 2013, when Siemens, Dawson and Lynch emphasised a "capabilities and skills shortage and access to knowledgeable people who are able to link pedagogy and analytics" (p. 22). The call for connection to pedagogy has continued to be made by a range of authors (Lodge & Corrin, 2017; West, Huijser, Heath, Lizzio, et al., 2016; West, Luzeckyj, Searle, Toohey, & Price, 2018), both for developmental work and to ensure take up. Recognising context, including pedagogy, as a key element and driver for the development and use of learning analytics is critical, as it is teachers who can most appropriately identify where "student patterns of behaviour match with the pedagogical reasons for why that activity should lead to student learning" (Lodge & Corrin, 2017, p. 2). While increasing, this aspect remains somewhat elusive, as demonstrated by the unpublished content analysis undertaken by West et al. (2017), in which only 58% of learning analytics publications between 2012 and February 2018 made at least some connection to educational theory.

The data science aspect of learning analytics is largely confined to pockets of work in key institutions in Australia, and to informal research groups and formal networks both within Australia and internationally (e.g. SoLAR, JISC, LAK). While much of this work is sophisticated and cutting edge, it is generally quite intensive and requires considerable disciplinary academic input, which can be difficult to gain. As such, it is often undertaken by those for whom data science is their own disciplinary area (for example, see Kim, Park, Yoon, & Jo, 2016; Li & Tsai, 2017). Additionally, the work is predominantly translating into small projects using small, rather than big, data that are neither scalable nor sustainable at an institutional level. Again, this was borne out in the content analysis of the literature, which showed that only a limited proportion (35%) of reported studies went beyond one class of students.

Vigentini et al. (2017) and Froissard, Liu, Richards, and Atif (2017) provide insight into small-data, scalable projects supporting cross-disciplinary and institution-wide collaboration. They argue, as do other authors (Colvin et al., 2016; Siemens, Dawson & Lynch, 2013; West, Luzeckyj, Searle, Toohey, & Price, 2016a), that teaching staff, academic development staff and information technology and/or data scientists need to work together to develop or adapt applications to suit their own, specific teaching situations. Lodge and Corrin (2017) also suggest that psychology and neuroscience academics may provide value to the teams developing learning analytics approaches, as these disciplines bring specific insight into how learning occurs.

Yet, even if such approaches are taken, the actual use of learning analytics reports by academics is limited. One concerning evaluation discussed an institution-wide learning analytics program which attracted very little take up by academic staff, who could not see the value of investing their limited time in initiating or using the reports provided by the systems the institution had developed (Corrin et al., 2016). Other studies also highlight low levels of take up (Beer, Tickner, & Jones, 2014; Corrin et al., 2016; Vigentini et al., 2017; West, Tasir, et al., 2018), which is confirmed by common experiences in the sector. Given the financial and human (time) costs invested in developing learning analytics approaches, it is imperative that there is a return on investment for all parties involved in researching, developing and using these resources.

It needs to be recognised that learning analytics can operate at a number of levels, and each level has been assigned a different 'value' within the sector. For example, Morgan and Duncan (2016) provide an overview of the four primary categories of analytics, the functions they perform and the associated value they can yield (Table 1). Underpinning this is the idea that descriptive and diagnostic analytics are of lower value than predictive and prescriptive analytics, a view which has pervaded the field.

Table 1. Four primary categories of learning analytics

Category | Function | Value
Descriptive analytics | What happened? | Hindsight
Diagnostic analytics | Why did it happen? | Insight
Predictive analytics | What will happen? | Foresight
Prescriptive analytics | How can we make it happen? | Foresight

While it may be true that predictive analytics are more valued by learning analytics researchers, this disregards a number of other factors important in the take up of analytics, including the needs of academics; their engagement; the application of learning analytics; and ethical issues which arise in the education sector.

In the 2017 Horizon Report, Adams Becker et al. (2017) suggest that staff and student digital literacy impacts on the capacity to adopt and appropriately use learning analytics and adaptive learning. In their paper identifying the missing link for learning analytics, Gunn, McDonald, Donald, Blumenstein, and Milne (2016) suggest that training in the adoption and use of analytics is rarely discussed and even more sparingly provided to academics. This finding supports the need to improve the digital literacy of staff and students, and their understanding of learning analytics. However, the literature suggests that digital literacy is only one barrier to take up (Lodge & Corrin, 2017; West, Huijser, Heath, Lizzio et al., 2016; West, Tasir et al., 2018).


Several studies have investigated the needs of academics in relation to learning analytics. When exploring what academics were interested in, West, Huijser, Heath, Lizzio, et al. (2016) found that academics highlighted several key applications (from now on described as use cases), including improving curriculum/course design and teaching, and improving retention and student performance, while others identified types of data, including student behaviours, academic actions and behaviours, and student demographics. The final identified category, 'student success', was a broader combination of both a use case and a data source. Subsequent workshops across a range of institutions and at various conferences have reinforced the same requirements (West, 2018).

A subsequent international project undertaken by the Innovative Research Universities and the Malaysian Research Universities Network (West, Tasir, et al., 2018) also explored the learning analytics requirements of teaching academics. Findings indicated that they were interested in several key data sources being applied to both teaching practice and student success, which included:

> Access metrics (student access in relation to educational materials)

> Time measurements (time spent on various learning tasks and teaching resources)

> Engagement metrics (relationships between engagement and student success)

> Academic and grade performance (to assess efficacy of assessment)

They also emphasised questions concerning student preparedness for higher education study (e.g. background, how prepared students are for the subject, what kind of experience they have in relation to the discipline), and particularly their level of English and Maths proficiency.

It should be noted that the nature of the questions teaching staff asked is related to the understanding and context of learning analytics at each institution, and depends on pedagogy and teaching mode (e.g. blended or fully online). This, in turn, explains the varying complexity of the questions teaching staff want answered; more complex, higher order questions can be expected as understanding of learning analytics matures. Irrespective of the reasons why, academics placed value on all levels of learning analytics, from descriptive through to predictive. If we are to engage academics in the use of learning analytics, we need to work with them by addressing all of their needs and beginning at their level of understanding.

Additionally, many of the questions/use cases identified in West, Tasir et al. (2018) and West, Huijser, Heath, Lizzio et al. (2016) related to the academics' role (e.g. teacher, subject/unit coordinator, course/program coordinator) and aligned with different key tasks: addressing issues of student engagement in the teaching semester, finding ways to support students during semester, and reviewing and improving curriculum, largely as a reflective task. In addition, such tasks can be seen to sit in several different lifecycles: a course/program lifecycle from development through to continual improvement and review, and a subject/unit teaching lifecycle from preparation, to teaching, to review (see Figures 1-3 below).

However, returning to the idea that student success involves stakeholders across the institution, it is important to consider the needs of students and professional staff members, who are often overlooked in current work. There is a growing body of researchers interested in developing resources and student dashboards which allow students to monitor and take control of their own learning (Daniel, 2017; Kitto, Lupton, Davis, & Waters, 2016; Schumacher & Ifenthaler, 2018; Tan, Koh, Jonathan, & Yang, 2017; West, Luzeckyj, Toohey, Vanderlelie, & Searle, in press). Illustrative of the idea of working with stakeholders, Dollinger and Lodge (2018, p. 97) proposed at LAK 2018 that the commonly used definition of learning analytics be amended to "…the measurement, collection, analysis and reporting of data with learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs", rather than just about learners. While the current review does not focus on students, this is an important point to note, because it again draws attention to the broader range of stakeholders beyond academics.

Professional staff members are also overlooked in the current learning analytics dialogue, yet they are often involved in supporting academic staff and students. For example, they play key roles related to student retention, student success and education quality, where many of the learning analytics tools, dashboards and reports can be useful. Additionally, they often support academics in the provision of data for the improvement of curriculum and teaching. In many ways, however, professional staff are easier to engage in the use of the tools and reports, as their roles are generally more defined and focussed. Tools that assist them in their role are generally welcomed and can be embedded in procedures and processes. For example, student support or success officers are focussed specifically on that task, and reports that identify where to focus their efforts are seen as very useful. Yet this is often not the case for academics: given the complexity of academic roles and time pressures, it can unfortunately be challenging for them to identify how reports can help them focus their efforts.

It is clear from the literature that, despite the work being undertaken and the potential benefit of learning analytics, our key challenge is engaging more academics in the use of the tools and learning analytics reports. Without such engagement, the field is likely to have limited impact (and remain the focus of academics whose research areas align with it), particularly as academics are the people who are closest to the students, have responsibility for the curriculum and have the most influence on the student experience. While some will engage keenly due to personal interest and enthusiasm, this will not be the majority. It is well known in relation to technology that there are basically three groups: the early adopters; the majority, who will come to it but need to be supported and encouraged; and those who will not engage for various reasons (Corrin, Kennedy, de Barba et al., 2016; Newland, Martin & Ringan, 2015; Sclater, Peasgood & Mullan, 2016). The early adopters are those doing fantastic work in the sector, publishing their work and, as noted above, generally working in the data science/information technology fields. Yet, to have any real impact we need the broader base of academics engaged. The key question therefore becomes: how can we engage the majority of academics in the use of learning analytics to support student success?

4. Making the connections between academic work and learning analytics

In tackling the challenge of engaging academics in using learning analytics, the work outlined above provides us with some key insights which need to be drawn together and applied within a framework. These can be summarised as follows:

Academics are busy

Academics will be more attracted to tools that assist in undertaking the core elements of their role rather than adding to their workload. They need to be convinced of the value in investing time in learning about analytics and the value of actually using it to improve student learning.


Learning analytics reports need to be easy to use and non-technical

Many academics are not technical people and need tools that are easy to use in the context of their work. Combined with time limitations, this means academics are unlikely to go to another information technology system to find such information. Additionally, while manipulating data is interesting to some academics, even then most are unlikely to invest significant amounts of time doing so in relation to their teaching; those with this inclination are more likely to apply it in the context of their research. As such, learning analytics reports need to be easy to read and relevant to day-to-day teaching. It should not require training in learning analytics to use a learning analytics report.

Academics have varying roles

Some academics are unit/subject coordinators or teach into a unit/subject. Some academics are course/program coordinators. Different information is required for these different roles, but all information is important to students in a transition pedagogy. Additionally, learning analytics reports have different purposes. For example, some are linked with curriculum improvement while others are focussed on student retention. Academics will not necessarily see the connection without some guidance that explicitly links learning analytics reports to the use case or purpose.

Context and pedagogy are important

While some reports will be useful to all academics teaching a unit/subject, others are more appropriate to particular pedagogies. However, this can often only be identified through the lens of the discipline and through application to the teaching.

Usefulness is not related to sophistication of techniques

The sophistication of the data science techniques is not of concern to those who might use the reports. Academics are looking for learning analytics reports that help them to answer some key questions and this might sit at any level of sophistication from a data science perspective. In the earlier stages of development, this is often at the descriptive or insight level.


Learning analytics reports and approaches have different purposes

It is useful to make more explicit the actual purpose of each learning analytics report and how it aligns with use cases. Student success relies on students transitioning through the various years and stages of the curriculum, and this is connected to a student lifecycle. However, it is also connected to lifecycles related to teaching and curriculum improvement. As such, learning analytics reports and use cases can be attached to three key lifecycles: the course/program lifecycle (Figure 1), the unit/subject lifecycle (Figure 2) and the student success lifecycle (Figure 3). This provides a reference point to identify at which point in time learning analytics reports will be most useful, based on the task or issue at hand. With these lifecycles as a guide, automated notifications can be sent to highlight what might be useful to an academic at a key point in time.
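
To illustrate how such a mapping between lifecycle points, reports and notifications might be expressed, the sketch below shows one minimal way of attaching reports to points in a lifecycle so that a prompt can be generated at the relevant time. It is purely hypothetical: the report names, lifecycle labels, weeks and roles are assumptions loosely modelled on Table 2, not an implementation described in this review.

```python
# Minimal, hypothetical sketch (not from this review): linking lifecycle points to
# learning analytics reports so a prompt can be raised at the relevant time.
# Report names, lifecycle labels, weeks and roles are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ReportLink:
    lifecycle: str   # e.g. "unit/subject teaching lifecycle"
    point: str       # e.g. "weeks 1-3"
    role: str        # who is expected to act on the report
    report: str      # the learning analytics report to surface
    purpose: str     # the use case it supports

# Hypothetical mapping, loosely modelled on Table 2.
REPORT_LINKS = [
    ReportLink("unit/subject teaching lifecycle", "weeks 1-3",
               "unit/subject coordinator", "non-submission and LMS access report",
               "retention: identify students most at risk"),
    ReportLink("course/program lifecycle", "post-teaching review",
               "course/program coordinator", "progression and grades report",
               "curriculum review: is the unit/subject design effective?"),
]

def notifications_for(lifecycle, point):
    """Return prompts for reports attached to this point in a lifecycle."""
    return [
        f"To {link.role}: the '{link.report}' is now relevant ({link.purpose})."
        for link in REPORT_LINKS
        if link.lifecycle == lifecycle and link.point == point
    ]

if __name__ == "__main__":
    for message in notifications_for("unit/subject teaching lifecycle", "weeks 1-3"):
        print(message)
```

In practice, any such mapping would sit alongside the institution's policies, processes and systems (discussed in the next section) rather than being hard-coded.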

Perception that institutions do not value teaching

Although teaching may not be directly and explicitly seen to be valued in some institutions, the outcome of good teaching (i.e. greater student retention and success) is perceived to be very important by all institutions. Supporting academic staff to understand and use learning analytics reports as they develop and improve curriculum content will aid in improving teaching quality and make achieving good teaching outcomes easier for academics.

5. A framework for driving learning analytics take up

Connecting the key insights about academic work and learning analytics suggests the need for a framework which embeds learning analytics reports into the various lifecycles in a coherent, relevant and easy to interpret way. In implementing such a framework, it would be anticipated that there would be a variety of learning analytics reports or use cases attached to different points in the course/program lifecycle (Figure 1). The framework can then assist in embedding relevant reports within the context of teaching practice, as well as the policy, procedures and processes which support teaching and student success.


Figure 1. Course/program life cycle

The identification of roles and responsibilities is critical in terms of implementation. Data in and of itself will not impact on student retention, student success, curriculum or teaching improvement. Procedures and processes with clearly articulated responsibilities, in terms of who is doing what and for what purpose, are required (whether that be the unit/subject coordinator, course coordinator or professional support staff). For example, in Figure 2 a retention policy may identify that monitoring of student progress in a unit/subject needs to take place. The institution may assign this to the unit/subject coordinator or a student retention officer, or, at the course level, to a course/program coordinator or student retention officer.


Figure 2. Subject/unit life cycle

The next step in implementing a framework for learning analytics take up would be to embed the learning analytics reports into the relevant systems. For example, some reports would be connected to and accessible via the learning management system, while others might sit in the relevant curriculum management system (Figure 3). As such, the learning analytics report lives in whichever system the academic or professional staff member spends their time undertaking their primary role. For a teaching academic, the link to key learning analytics reports would be most beneficial within the learning management system, where they are most likely to utilise it.


Figure 3. Student success life cycle

The lifecycle figures above provide an example of how these might be mapped to the different points in time at which academic or professional staff members complete their everyday work. It should be noted, however, that the same learning analytics report may serve several purposes and be identified in various lifecycles.

Table 2 provides an example of how these various elements might connect. The table is not intended to be comprehensive but to show how we might link the elements to key questions about the student experience. High level templates may also be useful in this context, where different templates may be aligned with different pedagogical approaches and relevant learning analytics reports.

Engaging academics in the use of learning analytics therefore needs to be related to the primary purpose of their role as an academic. As such, the focus must be on the use cases identified by the academics: supporting student success, improving retention, and improving teaching practice and curriculum design. Any training that takes place should be within this context. For example, it should be included in any academic orientation to the university or within professional development related to curriculum improvement or development. It should also be included as part of a broader policy, procedural and process framework, with learning analytics seen as a tool to support further improvement in the given area.


Table 2. An example of how the various elements might connect

Focus | Key question | Possible metrics/data points/indicators | Pedagogical approach | Which lifecycle | Point on lifecycle | Role & responsibility
Retention | Which students are most at risk? | Who has/has not submitted work; access patterns and trends; date of first and last access to the LMS; early assessment score | Any | Teaching lifecycle | Weeks 1-3 | Unit/subject coordinator
Student success | Which, and how many, students are likely to struggle with the content of this unit? | Background preparation in terms of: pre-entry for earlier year students; success in related units/subjects for subsequent years; literacy/numeracy levels | Any | Learning and teaching lifecycle; curriculum lifecycle; student lifecycle (learning and teaching layer) | Weeks 1-6; pre- and post-teaching; post-teaching review | Unit/subject coordinator; course/program coordinator
Curriculum design/teaching | Is the unit/subject curriculum design effective? Are students engaging with the materials/discussions? | Student progression in the unit/subject; student progression through the course/program (are they prepared for the next unit/subject?); grades in the unit/subject (student success); student satisfaction (e.g. Student Evaluation of Teaching) | Any | Unit/subject lifecycle; course/program lifecycle | Weeks 4-8; post-teaching review | Unit/subject coordinator; course/program coordinator


Using the tools should not require separate technical training (this should be intuitive and simple); the focus should rather be on how to understand and apply the data within the context of the teaching. As one academic succinctly stated, 'Tell me what data is available, give me access to it, give me the time to use it and give me guidance in using it' (West, Huijser, Heath, Lizzio, et al., 2016).

6. Conclusion

The application of analytics approaches to higher education has made significant strides in recent years from a technical and data science perspective. However, the key challenge that remains is to increase the academic take up of the tools available. Small pockets of use will not substantially improve the experience or success of students, which is in fact the main goal.

Academics are not concerned with the sophistication of the techniques but rather want learning analytics reports that fit particular use cases to make their work easier and/or more effective. They will not necessarily see the relevance of learning analytics reports to their work at key points in time unless the reports are integrated within academic work by being easy to access and specific to their teaching context. Connecting the use cases to key policy, procedures and processes, and embedding them in the context of academic work so that their value and relevance is clear, is most likely to promote greater take up at a broader level. Learning analytics enthusiasts will continue to make great strides in the data science that drives learning analytics development, but this is not the focal point of the average academic. It is necessary to work at both ends of the spectrum, starting where the users are and with their needs, in order to gain traction with academic and professional staff and ultimately provide benefits for our students.

7. Acknowledgements

I would like to thank and acknowledge key colleagues who have contributed to and helped shape my thinking about learning analytics over a number of years. In particular, I would like to acknowledge Mr Bill Searle at Charles Darwin University and Dr Ann Luzeckyj at Flinders University. I would also like to thank the Australian Government's Office for Learning and Teaching and the Innovative Research Universities, which have funded the majority of the work on learning analytics that I have been privileged to lead.

8. References

Adams Becker, S., Cummins, M., Davis, A., Freeman, A., Hall Giesinger, C., & Ananthanarayanan, V. (2017). NMC Horizon report: 2017 Higher education edition. Austin, Texas: The New Media Consortium. Retrieved from https://www.nmc.org/publication/nmc-horizon-report-2017-higher-education-edition/

Arnold, K., & Pistilli, M. (2012). Course signals at Purdue: Using learning analytics to increase student success. In S. Buckingham Shum, D. Gasevic, & R. Ferguson, (Eds.), Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ’12) (pp. 269-270). New York: Association for Computing Machinery. doi:10.1145/2330601.2330666

Beer, C., Tickner, R., & Jones, D. (2014). Three paths for learning analytics and beyond: moving from rhetoric to reality. In B. Hegarty, J. McDonald, & S.-K. Loke (Eds.), Rhetoric and Reality: Critical Perspectives on Educational Technology. Proceedings ASCILITE Dunedin 2014 (pp. 242-250). Dunedin, New Zealand: Australasian Society for Computers in Learning in Tertiary Education.

Bradley, D., Noonan, P., Nugent, H., & Scales, B. (2008). Review of Australian higher education: Final report. Canberra, ACT: Department of Education, Employment and Workplace Relations. Retrieved May 12, 2015 from www.mq.edu.au/pubstatic/public/download.jsp?id=111997

Brown, M., Costello, E. & Giolla-Mhichil, M.N. (2016). A strategic response to MOOCs: What role should governments play? In S. Barker, S. Dawson, A. Pardo, & C. Colvin (Eds.), Show me the learning. Proceedings ASCILITE 2016 (pp. 86-96). Adelaide: Australasian Society for Computers in Learning in Tertiary Education.

Buckingham Shum, S., & Ferguson, R. (2012). Social learning analytics. Educational Technology & Society, 15(3), 3-26.

Campbell, J., DeBlois, P., & Oblinger, D. (2007). Academic analytics: A new tool for a new era. Educause Review, 42(4), 40–57. Retrieved from https://er.educause.edu/articles/2007/7/academic-analytics-a-new-tool-for-a-new-era

Chatti, M., Dyckhoff, A., Schroeder, U., & Thüs, H. (2012). A reference model for learning analytics. International Journal of Technology Enhanced Learning, 4(5/6), 318–331.

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gasevic, D., Buckingham Shum, S., Nelson, K., Alexander, S., Lockyer, L., Kennedy, G., Corrin, L., & Fisher, J. (2016). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching. Retrieved from http://www.olt.gov.au/project-student-retention-and-learning-analytics-snapshot-currentaustralian-practices-and-framework

Corrin, L., Kennedy, G., de Barba, P.G., Lockyer, L., Gašević, D., Williams, D., Dawson, S., Mulder, R., Copeland, S., & Bakharia, A. (2016). Completing the loop: Returning meaningful learning analytic data to teachers. Sydney: Office for Learning and Teaching. Retrieved from http://melbourne-cshe.unimelb.edu.au/__data/assets/pdf_file/0006/2083938/Loop_Handbook.pdf

Daniel, B. K. (2017). Big data and data science: A critical review of issues for educational research. British Journal of Educational Technology, 50(1), 101-113. doi:10.1111/bjet.12595

Dekker, G., Pechenizkiy, M., & Vleeshouwers, J. (2009). Predicting students’ drop out: A case study. In T. Barnes, M. Desmarais, C. Romero, & S. Ventura (Eds.) Proceedings of the 2nd International Conference on Educational Data Mining (EDM '09) (pp. 41-50). Cordoba, Spain: Universidad de Cordoba. http://www.educationaldatamining.org/EDM2009/uploads/proceedings/dekker.pdf

Department of Industry, Innovation, Climate Change, Science, Research and Tertiary Education (DIICCSRTE) (2012). 2011 full year higher education student statistics, Appendix 2 – Equity groups. Canberra: Australian Government. Retrieved 18 June, 2013 from http://www.innovation.gov.au/HigherEducation/HigherEducationStatistics/StatisticsPublications/Pages/2011StudentFullYear.aspx

Diaz, D. (2000). Comparison of student characteristics, and evaluation of student success, in an online health education course. (Unpublished doctoral dissertation). Nova Southeastern University, Florida, USA. Retrieved 12 May 2015 from http://home.earthlink.net/~davidpdiaz/LTS/pdf_docs/dissertn.pdf

Dollinger, M., & Lodge, J.M. (2018). Co-creation strategies for learning analytics. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK '18) (pp. 97-101). New York: Association for Computing Machinery. doi: https://doi.org/10.1145/3170358.3170372.

Ferguson, R., (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), 304–317.

Frankola, K. (2001, June 3). Why online learners drop out. Workforce. Retrieved May 12, 2015 from http://www.workforce.com/articles/why-online-learners-drop-out

Fritz, J., & Whitmer, J. (2017). Learning analytics research for LMS course design: Two studies. Educause Review. Retrieved from http://er.educause.edu/articles/2017/2/learning-analytics-research-for-lms-course-design-two-studies

Froissard, J.-C., Liu, D. Y.-T., Richards, D., & Atif, A. (2017). Learning analytics pilot in Moodle and its impact on developing organisational capacity in a university. In H. Partridge, K. Davis, & J. Thomas. (Eds.), Me, Us, IT! Proceedings ASCILITE2017: 34th International Conference on Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education (pp. 73-77). Toowoomba, Qld: Australasian Society for Computers in Learning in Tertiary Education.

Gunn, C., McDonald, J., Donald, C., Blumenstein, M., & Milne, J. (2016). The missing link for learning analytics. In S. Barker, S. Dawson, A. Pardo, & C. Colvin (Eds.), Show Me The Learning. Proceedings ASCILITE 2016 (pp. 255-260). Adelaide: Australasian Society for Computers in Learning in Tertiary Education.

Hall Giesinger, C., Adams Becker, S., Davis, A., & Shedd, L. (2016). Scaling solutions to higher education’s biggest challenges: An NMC Horizon project strategic brief. Austin, Texas: The New Media Consortium. Retrieved from https://www.nmc.org/publication/scaling-solutions-to-higher-educations-biggest-challenges/

Ifenthaler, D., & Tracey, M. W. (2016). Exploring the relationship of ethics and privacy in learning analytics and design: Implications for the field of educational technology. Educational Technology Research and Development, 64(5), 877-880. doi:10.1007/s11423-016-9480-3

Joksimović, S., Kovanović, V., & Dawson, S. (2019). The journey of learning analytics. HERDSA Review of Higher Education, 6, 27-63

JISC (2018a) Business Intelligence Project: Working in partnership with HESA to deliver a national business intelligence service for the higher education sector. Retrieved from https://www.jisc.ac.uk/rd/projects/business-intelligence-project

JISC (2018b) Effective learning analytics: Helping further and higher education institutions to analyse and understand their data. Retrieved from https://www.jisc.ac.uk/rd/projects/effective-learning-analytics

JISC (2018c) Horizon scanning: What’s next in digital technologies and how can we help? https://www.jisc.ac.uk/rd/projects/horizon-scanning

Kift, S. (2009). Articulating a transition pedagogy to scaffold and to enhance the first year student learning experience in Australian higher education: Final report. Retrieved April 14, 2015 from http://www.altc.edu.au/resource-transition-pedagogy-report-qut-2009

Kift, S. (2015). A decade of Transition Pedagogy: A quantum leap in conceptualising the first year experience. HERDSA Review of Higher Education, 2, 51-86. Retrieved from http://www.herdsa.org.au/herdsa-review-higher-education-vol-2/51-86

Kift, S., Nelson, K., & Clarke, J. (2010). Transition pedagogy: A third generation approach to FYE - A case study of policy and practice for the higher education sector. The International Journal of the First Year in Higher Education, 1(1), 1-20.

Kim, D., Park, Y., Yoon, M., & Jo, I.-H. (2016). Toward evidence-based learning analytics: Using proxy variables to improve asynchronous online discussion environments. The Internet and Higher Education, 30, 30-43. doi: http://dx.doi.org/10.1016/j.iheduc.2016.03.002

Kitto, K., Lupton, M., Davis, K., & Waters, Z. (2016). Incorporating student-facing learning analytics into pedagogical practice. In S. Barker, S. Dawson, A. Pardo, & C. Colvin (Eds.), Show Me the Learning. Proceedings ASCILITE 2016 (pp. 338-347). Adelaide: Australasian Society for Computers in Learning in Tertiary Education.

Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of ‘at risk’ students using learning analytics: the ethical dilemmas of intervention strategies in a higher education institution. Educational Technology Research and Development, 64(5), 957-968. doi:10.1007/s11423-016-9459-0


Li, L.-Y., & Tsai, C.-C. (2017). Accessing online learning material: Quantitative behavior patterns and their effects on motivation and learning performance. Computers & Education, 114(Supplement C), 286-297. doi: https://doi.org/10.1016/j.compedu.2017.07.007

Liu, D.-Y.T., Froissard, J.-C., Richards, D., & Atif, A. (2015). An enhanced learning analytics plugin for Moodle: student engagement and personalised intervention. In T. Reiners, B.R. von Konsky, D. Gibson, V. Chang, L. Irving, & K. Clarke (Eds.), Globally connected, digitally enabled. Proceedings ASCILITE 2015 (pp. FP:168- FP:177). Perth, Western Australia: Australasian Society for Computers in Learning in Tertiary Education.

Lodge, J. M., & Corrin, L. (2017). What data and analytics can and do say about effective learning. npj Science of Learning, 2(1), 5. doi:10.1038/s41539-017-0006-5

Long, P., & Siemens, G. (2011). Penetrating the fog: Analytics in learning and education. Educause Review, 46(5), 31–40. Retrieved from https://er.educause.edu/articles/2011/9/penetrating-the-fog-analytics-in-learning-and-education

Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., & Hung Byers, A. (2011). Big data: The next frontier for innovation, competition, and productivity. McKinsey Global Institute. Retrieved from https://www.mckinsey.com/~/media/McKinsey/Business%20Functions/McKinsey%20Digital/Our%20Insights/Big%20data%20The%20next%20frontier%20for%20innovation/MGI_big_data_exec_summary.ashx

Marbouti, F., Diefes-Dux, H. A., & Madhavan, K. (2016). Models for early prediction of at-risk students in a course using standards-based grading. Computers & Education, 103, 1-15. doi: https://doi.org/10.1016/j.compedu.2016.09.005

Nelson, K., & Clarke, J. (2014). The First Year Experience: Looking back to inform the future. HERDSA Review of Higher Education, 1, 23-45. Retrieved from http://www.herdsa.org.au/herdsa-review-higher-education-vol-1/23-45

Nelson, K., Clarke, J., Stoodley, I., & Creagh, T. (2014). Establishing a framework for transforming student engagement, success and retention in higher education institutions: Final Report. Canberra, Australia: Australian Government Office for Learning & Teaching.

Nelson, K., Kift, S & Clarke, J. (2012). A transition pedagogy for student engagement and first-year learning, success and retention. In I. Solomonides, A. Reid & P. Petocz (Eds.) Engaging with learning in higher education. Faringdon, UK: Libri Publishing.

Norris, D.M., & Baer, L.L. (2013). Building organizational capacity for analytics. Washington: Educause. Retrieved from https://www.educause.edu/ir/library/pdf/PUB9012.pdf

O'Riordan, T., Millard, D. E., & Schulz, J. (2016). How should we measure online learning activity? Research in Learning Technology, 24(1). doi:10.3402/rlt.v24.30088

O’Shea, H., Onsman, A., & McKay, J. (2011). Students from low socioeconomic status backgrounds in higher education: An annotated bibliography 2000-2011. Sydney: Australian Learning and Teaching Council. Retrieved from http://www.lowses.edu.au/assets/LSES-Annotated-Bibliography.pdf


Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438-450. doi: http://dx.doi.org/10.1111/bjet.12152

Prinsloo, P., & Slade, S. (2016). Student Vulnerability, Agency, and Learning Analytics: An Exploration. Journal of Learning Analytics, 3(1), 159-182. doi: https://doi.org/10.18608/jla.2016.31.10

Rivera, J., & Rice, M. (2002). A comparison of student outcomes & satisfaction between traditional & web based course offerings. Online Journal of Distance Learning Administration, 5(3).

Romero, C., Ventura, S., & García, E. (2008). Data mining in course management systems: Moodle case study and tutorial. Computers & Education, 51(1), 368-384.

Schumacher, C., & Ifenthaler, D. (2018). Features students really expect from learning analytics. Computers in Human Behavior, 78, 397-407. doi: https://doi.org/10.1016/j.chb.2017.06.030

Sclater, N., & Mullan, J. (2017) JISC briefing: Learning analytics and student success – assessing the evidence. Bristol, UK: JISC. Retrieved from https://repository.jisc.ac.uk/6560/1/learning-analytics_and_student_success.pdf

Sclater, N., Peasgood, A., & Mullan, J. (2016) Learning Analytics in Higher Education: A review of UK and international practice. Bristol, UK: JISC. Retrieved from https://www.jisc.ac.uk/sites/default/files/learning-analytics-in-he-v3.pdf

Newland, B., Martin, L., & Ringan, N. (2015). Learning Analytics in UK HE 2015: A HeLF Survey Report. London: Heads of eLearning Forum. Retrieved from helfuk.blogspot.co.uk/2015/10/uk-he-learning-analytics-survey-2015.html

SHEILA Project, (2018). Using data wisely for education futures. Retrieved from http://sheilaproject.eu/

Siemens, G., Dawson, S., & Lynch, G. (2013) Improving the quality and productivity of the higher education sector: Policy and strategy for systems-level deployment of learning analytics. Sydney: Office for Learning and Teaching. Retrieved from https://solaresearch.org/wp-content/uploads/2017/06/SoLAR_Report_2014.pdf

Tan, J. P. L., Koh, E., Jonathan, C. R., & Yang, S. (2017). Learner dashboards a double-edged sword? Students’ sense-making of a collaborative critical reading and learning analytics environment for fostering 21st century literacies. Journal of Learning Analytics, 4, 117–140.

Tinto, V. (2009, February 5). Taking student retention seriously: Rethinking the first year of university. A paper presented at the ALTC FYE Curriculum Design Symposium, Queensland University of Technology, Brisbane, Australia. Retrieved May 8, 2015 from http://www.fyecd2009.qut.edu.au/resources/SPE_VincentTinto_5Feb09.pdf

van Barneveld, A., Arnold, K. E., & Campbell, J. P. (2012). Analytics in higher education: Establishing a common language. Educause Learning Initiative, 1.

Vigentini, L., Kondo, E., Samnick, K., Liu, D., King, D., & Bridgeman, A. (2017). Recipes for institutional adoption of a teacher-driven learning analytics tool: case studies from three Australian universities. In H. Partridge, K. Davis, & J. Thomas (Eds.), Me, Us, IT! Proceedings ASCILITE2017: 34th International Conference on Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education (pp. 422-432). Toowoomba, Qld: Australasian Society for Computers in Learning in Tertiary Education.

West, D. (2018, February 1-2). Engaging the academics and students—making the promise of learning analytics real. Invited keynote address at the Blackboard Learning Analytics Symposium, Austin, Texas.

West, D., Huijser, H., & Heath, D. (2016). Putting an ethical lens on learning analytics. Educational Technology Research and Development, 64(5), 903-922. doi:10.1007/s11423-016-9464-3

West, D., Huijser, H., Heath, D., Lizzio, A., Miles, C., Toohey, D., Searle, B., & Bronnimann, J. (2016). Learning analytics: Assisting universities with student retention. Final Report (Part 1). Sydney: Office for Learning and Teaching. Retrieved from http://www.olt.gov.au/resource-learning-analytics-assisting-universities-student-retention

West, D., Luzeckyj, A., Searle, B., Toohey, D., & Price, R. (2018). The use of learning analytics to support improvements in teaching practice. Melbourne, Australia: Innovative Research Universities. Retrieved from https://www.iru.edu.au/wp-content/uploads/2018/04/MRUN-Learning-Analytics-report.pdf

West, D., Luzeckyj, A., Toohey, D. & Searle, B. (2017) Are we there yet? Reflecting on the literature to gauge learning analytics maturity. Unpublished manuscript.

West, D., Luzeckyj, A., Toohey, D., Vanderlelie, J., & Searle, B. (in press). Do academics and university administrators really know better? The ethics of positioning student perspectives in learning analytics. Australasian Journal of Educational Technology.

West, D., Tasir, Z., Luzeckyj, A., Si Na, K., Toohey, D., Abdullah, Z., Searle, B., Farhana Jumaat, N., & Price, R. (2018). Learning analytics experience among academics in Australia and Malaysia: A comparison. Australasian Journal of Educational Technology, 34(3). doi: https://doi.org/10.14742/ajet.3836

Wilson, A., Watson, C., Thompson, T. L., Drew, V., & Doyle, S. (2017). Learning analytics: challenges and limitations. Teaching in Higher Education, 22(8), 991-1007. doi: 10.1080/13562517.2017.1332026.