UCISA Learning Analytics Conference

On the 12th of April, I attended the UCISA event on learning analytics, a single-day event held in Birmingham. Due to travel issues, we unfortunately missed the first speaker, Sarah Porter, Co-Chair of the Higher Education Commission's national inquiry into data in higher education and author of its report, "From Bricks to Clicks".

OU Analyse (ppt)

Zdenek Zdrahal, Knowledge Media Institute, The Open University

The second speaker was Zdenek Zdrahal from the Knowledge Media Institute at the Open University. He described the use of predictive analytics to identify at-risk students even before the first assignment had taken place, since 98.6% of students who failed the first assignment did not complete the programme. The models used a combination of static data, including gender and educational history, and fluid data, notably VLE interactions. Prediction accuracy improved greatly even when the only VLE feature used was a simple count of clicks.
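
As a rough illustration (not the OU's actual pipeline), the sketch below shows how static attributes and a simple VLE click count might be combined to estimate a student's risk of failing the first assignment; the column names, sample data and the choice of logistic regression are my own assumptions.

import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: one row per student on a past presentation.
students = pd.DataFrame({
    "gender":          ["F", "M", "M", "F", "M", "F"],               # static data
    "prior_education": ["A-level", "HE", "None", "A-level", "None", "HE"],
    "vle_clicks_wk1":  [120, 5, 0, 80, 3, 200],                      # fluid data: VLE clicks
    "failed_tma1":     [0, 1, 1, 0, 1, 0],                           # outcome to predict
})

X = pd.get_dummies(students.drop(columns="failed_tma1"))
y = students["failed_tma1"]
model = LogisticRegression(max_iter=1000).fit(X, y)

# Estimate the risk for a new (hypothetical) student with very little VLE activity.
new = pd.DataFrame({"gender": ["M"], "prior_education": ["None"], "vle_clicks_wk1": [2]})
new_X = pd.get_dummies(new).reindex(columns=X.columns, fill_value=0)
print(model.predict_proba(new_X)[:, 1])   # estimated probability of failing the first assignment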

In practice, each item in the VLE was given a label, and student paths through the materials were mapped so that typical successful paths could be compared with the typical failing path. The data analysis used four different modelling techniques; if at least two of them predicted failure, the student was classified as at risk and the School put an intervention in place. Spreadsheets highlighting the risk level of each student were sent to the School, and a dashboard has been created that highlights at-risk students and the data behind the prediction.
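
The talk did not spell out which four techniques were used, but the flagging rule itself is easy to picture. Here is a minimal sketch, with four placeholder scikit-learn models standing in for the OU's actual ones:

from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# Four placeholder models; the OU's actual techniques were not listed in the talk.
models = [
    LogisticRegression(max_iter=1000),
    RandomForestClassifier(n_estimators=100, random_state=0),
    GaussianNB(),
    KNeighborsClassifier(n_neighbors=3),
]

def at_risk(X_train, y_train, X_new, threshold=2):
    """Flag a student as at risk if at least `threshold` models predict failure (1)."""
    votes = sum(m.fit(X_train, y_train).predict(X_new) for m in models)
    return votes >= threshold

# e.g. at_risk(X, y, new_X), reusing the hypothetical data from the previous sketch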

A student dashboard has also been created that shows the student their "nearest neighbours": students with similar engagement behaviour, along with the predicted outcomes for those students. It also predicts whether the student will submit the next assignment, and it recommends activities to improve the student's predicted outcome. Currently, students cannot access this dashboard, but releasing it is now a priority. I think the extra step of showing suggested activities to students has real benefit for the learning experience, as an intervention strategy is automatically suggested to the student; the key will be making sure these suggestions are accurate and relevant.
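
The "nearest neighbours" view is essentially a similarity search over engagement data. The sketch below illustrates the idea with made-up weekly click counts; the real dashboard's features and distance measure were not described.

import numpy as np
from sklearn.neighbors import NearestNeighbors

# Weekly VLE click counts for previous students, plus how they fared.
past_engagement = np.array([
    [120, 90, 110],   # passed
    [5,   0,   2],    # failed
    [80,  60,  40],   # passed
    [3,   1,   0],    # failed
])
past_outcomes = ["pass", "fail", "pass", "fail"]

# Find the two previous students whose engagement most resembles the current one.
nn = NearestNeighbors(n_neighbors=2).fit(past_engagement)
current_student = np.array([[10, 4, 1]])
_, idx = nn.kneighbors(current_student)
print([past_outcomes[i] for i in idx[0]])   # -> ['fail', 'fail']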

They have released an anonymised open dataset: https://analyse.kmi.open.ac.uk/open_dataset
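
As a quick example of what can be done with it, the sketch below totals VLE clicks per student and compares them across final results. The file and column names (studentVle.csv, studentInfo.csv, id_student, sum_click, final_result) are taken from the dataset's documentation, so check them against the download.

import pandas as pd

clicks = pd.read_csv("studentVle.csv")    # one row per student, VLE item and day
info = pd.read_csv("studentInfo.csv")     # demographics and final_result

total_clicks = (clicks.groupby("id_student")["sum_click"]
                      .sum()
                      .reset_index(name="total_clicks"))
merged = info.merge(total_clicks, on="id_student", how="left")
print(merged.groupby("final_result")["total_clicks"].median())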

Newcastle University does not have a retention issue overall, but there may be areas where using data in this way would be beneficial, for example on distance learning programmes.

JISC Learning Analytics (ppt)

Michael Webb, Director of Technology and Analytics at JISC, talked through the history of learning analytics and then some of the work JISC is carrying out in the field.

Michael described their work with learning analytics as “the application of big data techniques such as machine-based learning and data mining to help learners and institutions meet their goals.”

Predictive learning analytics as seen in the Open University presentation was defined as the “statistical analysis of historical and current data derived from the learning process to create models that allow for predictions that can be used to improve learning outcomes. Models are developed by ‘mining’ large amounts of data to find hidden patterns that correlate to specific outcomes.” JISC are confident that predictive learning models would be ‘fairly’ transferable between institutions.

JISC currently have a learning analytics project that has three core strands:

  • Learning analytics architecture and service
  • Toolkit
  • Community

Learning analytics architecture and service

JISC are looking to create a national architecture that would help data models become transferable. They are working with several institutions to develop core services that institutions would be able to implement at a much lower cost than developing them in-house.
JISC demonstrated their student app. It is modelled on fitness apps: users can view an activity stream showing their own (and their peers') activity, including performance against student-defined targets.
JISC are also developing dashboards for students and for administrators.

Toolkit
They have released the JISC Code of Practice, which outlines the ethical considerations institutions should address before embarking on a learning analytics project; this was developed in consultation with the NUS, which has released its own guidance for university students' unions.

Michael finished off by discussing possible future developments in learning analytics, including links to the Teaching Excellence Framework and personalised, next-generation e-learning.

Beyond the dashboard: the practical application of analytics (ppt)

Ben Stein, Director, Student Success, Hobsons
Ben from Hobsons spoke about the inevitable rise of student dashboards. While dashboards are an integral part of learning analytics, providing a way to display predictive statistics and recommended activities, it is crucial that support structures are in place to deliver positive interventions. Ben asked, "does a deeper understanding of the problem actually lead to a solution?" and stated, "ultimately it's what you do with the information that makes the difference." He then demonstrated a Hobsons product that provides student and staff dashboards.

Personal tutor dashboards (ppt)

David Mutti, Head of Programme Management, University of Greenwich
David Mutti from the University of Greenwich showed the development of their personal tutoring system, which pulls together various pieces of information about a tutee, including a very basic application of learning analytics. Although feedback from academics was very positive, actual use was minimal, and there was no compulsion to use the system. I thought the system was very well thought out, with some good features, but it was developed and promoted by the IT service. Would greater academic involvement, perhaps an academic lead, result in better take-up?

Piloting learner analytics at the University of London International Programmes (ppt)

Dave Kenworthy, Head of Software Services, University of London Computer Centre, and Tom Inkelaar, Head of Management Information, University of London
Tom described how the University of London International Programmes are using learning analytics with their students to try to improve retention. The University has 50,000 distance learning students across 100 countries.
The University implemented Bloom Thrive analytics in partnership with ULCC and Altis, using student record data along with VLE usage to identify at-risk students. As part of this, they created a happiness block in their Moodle environments where students could record how happy they were and add free-text comments.
The University found it relatively easy to determine which students were at risk, but intervention is more difficult when students are spread across such a wide geographical area. Another challenge was data protection and whether appropriate consent had been obtained from all students.

The conference was a very useful event to attend, and it demonstrated that although we are not currently implementing any centralised learning analytics, we are in a good position to do so when required. The data sets we already hold could help provide a richer learning experience for students.
