From manual to automatic: developing learning analytics at Aston

After Aston University carried out a project to identify students at risk of withdrawal or non-progression, it realised it needed to move from manual analysis to a more comprehensive learning analytics system. Now working in partnership with Jisc to develop such a system, the university is already seeing the benefits, as James Moran explains.

When I googled the phrase ‘learning analytics’, I found that, after the Wikipedia definition, the top two search results are occupied by Jisc. It will therefore come as no surprise that, when the opportunity arose to become part of a pilot project to help develop the Jisc learning analytics infrastructure, we were keen to be involved.

Aston University is working in partnership with Jisc on developing a learning analytics service for the higher education sector. Jisc describes this as “a basic learning analytics solution”, which aims to provide institutions with an analytics toolkit that can be adapted to their individual needs.

As part of this development, Aston University is a partner in implementing sections of this solution, including assessing the technical challenges of drawing together data from multiple sources such as the virtual learning environment (VLE) and student records databases.
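
To give a flavour of the data joining involved, here is a minimal sketch in Python using pandas. The file names, column names and the crude ‘no VLE activity’ flag are illustrative assumptions rather than the actual Aston or Jisc schema:

    import pandas as pd

    # Hypothetical exports from two separate systems -- the file and
    # column names are illustrative, not a real institutional schema.
    vle_activity = pd.read_csv("vle_activity.csv")        # student_id, logins, resources_viewed
    student_records = pd.read_csv("student_records.csv")  # student_id, programme, year

    # Join the two sources on a shared student identifier so engagement
    # data can be read alongside enrolment data.
    merged = student_records.merge(vle_activity, on="student_id", how="left")

    # A deliberately crude at-risk flag: students with no recorded VLE
    # logins. A real learning analytics model weighs far more signals.
    merged["no_vle_activity"] = merged["logins"].fillna(0).eq(0)

    print(merged.loc[merged["no_vle_activity"], ["student_id", "programme"]])

A left join is used so that every enrolled student stays visible, including those with no VLE footprint at all, which is exactly the group an early-warning process most needs to surface.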

Though we are at an early stage of involvement, we can already see the benefits of moving from our former manual approach to analysing data to a learning analytics system, one that offers much broader scope for supporting students across the institution.

Effective data reporting, complicated interpretation

Prior to working with Jisc, we ran a project to explore interventions with students who were at risk of withdrawal or non-progression. This revealed that, while we had effective data reporting facilities, our processes for identifying, analysing and interpreting the data were predominantly manual. Matters became more complicated still when we tried to compare data from multiple systems.

Having said that, once we had identified those students who might benefit from some additional support, we were able to intervene with a substantial number of them via telephone calls, face-to-face meetings and referrals to university services.

While causal links between our interventions and students’ outcomes were hard to evidence, the feedback from the students involved was overwhelmingly positive. The exercise has provided several case studies which have highlighted the issues students were facing and described how the institution was able to help.

After reflecting on this project, we recognised that we had already been using much of the methodology which underpins learning analytics, and we began exploring how we could extend this approach. The partnership with Jisc came just at the right time.

Early positive results

The early results from the retention and progression project look very positive, and we expect that embedding and automating these approaches through learning analytics will improve performance further.

The development of a wider analytics system will give staff greater individual agency to support the students it identifies, while also reinforcing student-tutor relationships. It also has the capacity to empower students to take greater control of their own engagement data, so they can explore the impact of their own approaches to learning. While still under development, this advance on our current level of analytics will greatly improve our institutional ability to provide bespoke support to students.

Next steps: engaging students and staff

The analytics function we now use is a moderate advance on our previous manual reporting method, and the opportunity to enhance the scope and scale of the project is not far away.

However, having explored what is technically possible, our next step is to engage students and staff through focus groups and facilitated discussions about the development and use of analytics. We will use this feedback, along with publications such as the Jisc Code of Practice for Learning Analytics, to consider carefully the scope of the project and how the extended analysis of data may be used.

Additionally, we will ensure that both ethical and privacy considerations are addressed, and assess how these may need to be reflected in institutional policy and procedures. Initial feedback from users is positive, but we want to encourage buy-in from all parties and ensure that developments are informed by their input.

Top tips

Based on our initial experiences, I would offer these three tips to colleagues who might be considering similar issues:

  • From what I have found so far, there is no single ‘correct’ way of developing a learning analytics system; it will depend hugely on management buy-in, the data available within the institution and the motivation behind the project.  
  • Before beginning the journey, I would recommend completing an audit of ‘institutional readiness’ including identifying key stakeholders, systems and data sets.
  • Identifying key external partners and resources from organisations such as Jisc and the Learning Analytics Community Exchange (LACE) will also ensure that a first venture into learning analytics is more likely to be useful for both your institution and your students.

Photo credit: Theen … My Filing Cabinet via photopin (license)

James Moran is an achievement enhancement adviser at Aston University
