Nick Moore is director of IT services at the University of Gloucestershire. In this post, and at Inside Government’s conference this week on utilising data effectively across higher education, he describes how his university is using attendance data to inform decisions on tailoring services to meet the needs of individual students.
Faster, better decision-making. It’s a goal that any organisation worth its salt aspires to in the daily battle to move ahead of the competition. And as we all know, in our connected, digital age there’s usually no shortage of data that can be used to inform those decisions. Frequently, it’s effective analysis of data that poses the problems.
Certainly, in higher education, institutions routinely gather a wealth of data, particularly about students. And we’re fortunate that our high speed, internet-connected IT systems give us a prime opportunity to adopt machine-learning techniques that can enable us to manipulate and analyse data.
Learning analytics is being developed by – and for – universities because it is an excellent means of informing decisions right across their operations; at the University of Gloucestershire our own current learning analytics focus is on the student perspective. We’re developing ways to learn more about how each student is engaging with their learning so that we can create more personalised learning pathways, provide the right kinds of timely, tailored support and improve both experience and attainment at the individual level.
Attendance data is one of the key inputs for learning analytics, so we are putting effort into capturing that information easily. Working with a group of past students, we're developing a geo-fencing app that lets students record their attendance at lectures automatically; the former students are giving us a valuable perspective on what learners will want from such an app, so that we can build a system that offers learners added value and that they will engage with willingly. That includes linking attendance to the timetable, and it might also include features such as deadline reminders and push notifications for events.
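At its simplest, a geofence check of this kind reduces to a distance test: if the device's reported position falls within a set radius of the lecture venue during the timetabled slot, attendance is recorded. The sketch below is illustrative only, not the university's actual app; the function names, coordinates and the 50-metre radius are all assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def mark_attendance(student_pos, venue_pos, radius_m=50):
    """Treat the student as present if their device sits inside the
    venue's geofence (here, a hypothetical 50 m radius)."""
    return haversine_m(*student_pos, *venue_pos) <= radius_m

# Example: a device at the venue itself is inside the fence;
# one a few streets away is not.
on_site = mark_attendance((51.886, -2.088), (51.886, -2.088))
off_site = mark_attendance((51.900, -2.088), (51.886, -2.088))
```

In practice the app would also check the student's timetable before recording anything, which is one reason linking attendance to the timetable matters.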
Of course, the success of our adventure in learning analytics depends on the goodwill of the students themselves. We’re gathering student data and, while we have to make sure we’re compliant with data protection legislation, it’s still more important to ensure that we have the understanding and buy-in of the students themselves before we use their data in new and experimental ways. To this end, we’ve taken care to communicate our plans and our objectives clearly, and we’ve developed a policy and a user-friendly student guide offering all the whys and the hows and explaining the benefits that we believe will follow.
We’re making good progress with this development – and we’re not working in isolation. Our university has been an early adopter of the learning analytics solution currently under development at Jisc. Jisc’s goal is to create a national learning analytics service for UK education that will roll out after the project is completed late in 2017, and we are one of 50 or so institutions currently involved in the development work. One early tool is an app, developed in consultation with university students (including some of our own), that enables students to track their own learning activity and compare it with their peers’. The feedback they receive in this way is neutral and non-judgemental, so that they can take control of their own learning and set themselves practical targets and milestones.
At the same time, teaching staff can use the data that is emerging to identify students who may need additional support in order to achieve their full potential.
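To illustrate what neutral, non-judgemental feedback of this kind might look like in code, the sketch below compares a student's weekly activity with the cohort average and phrases the result as a plain observation rather than a verdict. This is a hypothetical illustration, not the Jisc app's actual logic; the function name and the one-hour tolerance are assumptions.

```python
from statistics import mean

def engagement_feedback(student_hours, cohort_hours, tolerance=1.0):
    """Describe a student's weekly learning activity relative to the
    cohort average in neutral terms (hypothetical example)."""
    avg = mean(cohort_hours)
    diff = student_hours - avg
    if abs(diff) < tolerance:
        return "Your activity this week is in line with the cohort average."
    direction = "above" if diff > 0 else "below"
    return (f"Your activity this week is {abs(diff):.1f} hours "
            f"{direction} the cohort average.")

msg = engagement_feedback(10, [8, 9, 7])  # cohort average is 8 hours
```

Keeping the wording descriptive rather than evaluative is what lets students set their own targets, while the same underlying numbers can flag to teaching staff which students may need extra support.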
It’s still very early days, but we’re already examining the emerging data to extract insights, and to see how this work relates to positive feedback from staff and students and to improved outcomes.