Revolutionising Learning: AI and Group Work Unveil a New Approach to Reading Activities

Navigating the extensive volume of reading material in certain modules can be daunting, often leaving students overwhelmed by the sheer quantity of information. Recognising this challenge, the module leaders of ONC8017 took a pioneering approach to ease the burden on students. In a bold move towards innovation, they harnessed the power of artificial intelligence (AI) and embraced the collaborative spirit of group work to revolutionise the learning experience.

[Image used in the Discussion Board task: a tablet showing a research paper, and a robot saying "I can help with that"]

The Task

Article Allocation:

The first step involved compiling a comprehensive list of articles relevant to the module’s curriculum. Each article was carefully selected to contribute significantly to the students’ understanding of the subject matter. Subsequently, the articles were allocated to individual students, ensuring that each student had a unique piece of content to delve into. Students were asked to read and summarise their assigned article.

Student Autonomy:

To cater to diverse learning preferences, students were given the autonomy to choose their preferred approach to engaging with the assigned article. They could opt to read and summarise the content independently, a traditional method fostering individual comprehension and analysis. Alternatively, students could choose an AI tool for summarisation, exploring the cutting-edge capabilities of technology to streamline the information extraction process.

Students who opted to use an AI tool were tasked with critiquing the summaries generated. This not only encouraged a deeper engagement with the material but also honed their analytical skills as they assessed the accuracy, coherence, and relevance of the AI-generated summaries.

Following consultations with the Newcastle University Library, we recommended the AI tools Scholarcy and TLDR This. However, students were able to choose any tool that best suited their preferences. The library also provided valuable insights, including a copyright statement and links to AI Guidance, as well as the Uses and Limitations of AI.

If your allocated article is behind a sign-in wall, we kindly request that you do not upload or share this licensed material with third-party AI tools.

Copyright statement

Group Collaboration:

The students were asked to share their summaries to a discussion board and to look through the summaries posted by others. They could then identify which literature was most relevant to them and read the articles in depth themselves.

Recognising the significance of collaborative learning, the module leaders fostered a sense of community among students. Group discussions and collaborative sessions were encouraged, providing a platform for students to share insights, discuss varying perspectives, and collectively enhance their understanding of the subject matter. This collaborative element not only enriched the learning experience but also mirrored the collaborative environments students are likely to encounter in their future careers.

The Student Experience

[Chart: 40% used AI, 47% did not use AI, 13% were unable to use AI]

53% of students opted for AI-assisted summarisation, showcasing a keen interest in exploring the capabilities of technology for academic purposes. This choice not only demonstrated a desire for efficiency but also provided students with valuable hands-on experience in harnessing AI tools for practical applications.

However, the practical application of AI tools had its challenges: 25% of the students who chose AI encountered difficulties, with the tools failing to produce a summary at all.

[Chart of tools used: TLDR This (4), ChatGPT (4), Scholarcy (3), unknown (1)]

In their candid feedback, students highlighted both positive and negative aspects of their experiences. While some were impressed by the efficiency of AI tools, all students expressed concerns about gaps and missing details in the generated summaries. Specific instances of errors, omissions, and disjointed reading experiences were noted, revealing the practical limitations of relying solely on AI for complex tasks. Most students who chose AI eventually summarised the articles manually anyway, indicating a less-than-ideal outcome from the AI tools.

The AI tool also provided a second longer summary. This summarised most sections of the paper individually, which was presented like a smaller version of the paper. There was still important information missing, which was clear from the disjointed reading experience. Even so, I was still quite impressed with how well the AI tool had summarised the vast amount of information in the original paper into something relatively usable. 

Student experience of Scholarcy

No inaccuracies were noted. Good summary of the epidemiology, although it seems that the AI summary has basically just been derived from the abstract of the article. A number of gaps were identified. 

Student experience of TLDR This

The article has been summarised into ten key points, but these are not detailed. For example, only one of the statistics provided in the article has been included in the AI summary.

Student experience of ChatGPT

Final Thoughts

These nuanced results underscore the importance of balancing technological innovation with practical considerations. While the incorporation of AI offered students valuable exposure to emerging technologies, the outcome indicated that, for now, AI tools are not yet the solution we were hoping for.

Despite the unexpected challenges encountered in the use of AI, this experiment has provided invaluable insights. Recognising the evolving nature of technology, we remain committed to running the task each year, observing how AI technology progresses and whether the dialogue from students changes as the technology advances.


This post was written with the assistance of the AI tool ChatGPT.

Case Study – Audio Commentaries on Client Consultations – Susan Lennie

This post is about using audio recordings of patient consultations in teaching. Commentary was added to the recordings by the lecturer to create a richer resource.

Introduction

This case study concerns Dietetics and Nutrition module NUT2006, Measurement and Assessment of Dietary Intake and Nutritional Status. As part of this module, dietary interview consultations are recorded so that the students can listen to these as examples. The FMS TEL Podcasting Webinar provided initial inspiration for what could be done with the recordings to enhance them. With a little more support, a new audio resource has been developed which adds audio commentary to the recorded consultations, highlighting various features.

Consultations and Recordings

The work of Dietitians and Nutritionists involves gathering information from individuals and populations on their recent or typical food intake. This enables them to analyse nutrient intake and understand dietary behaviours so that they can make suitable recommendations. Taking a diet history, or a 24-hour dietary recall, involves a structured interview with questions exploring habitual food intake, timing of meals, cooking methods and quantities. The effectiveness of the interviewers' questioning technique impacts upon the quantity of information gathered and the quality of the nutritional analysis that can be undertaken. Students are working towards proficiency in these skills. Listening to recordings of these interviews exposes students to examples that will support them in improving their skills when they perform these tasks for themselves. They can also practise analysing the data provided in the audio recordings.

The recordings themselves are a very rich resource, which could be used in a variety of ways to help students improve their practice. The following task was developed, which required teaching staff to add audio commentary to the interviews.

The Task

Students first watched a short lecture on best practice for conducting interviews. They then listened to an interview recorded by an anonymous peer, and made notes critiquing the effectiveness of the questioning techniques and determining whether the quality of information obtained was sufficient to undertake nutritional analysis. Next, they listened to the same interview with professional commentary provided by staff, highlighting what could be improved, and were asked:

  • Did you spot the same things?
  • Reflect on the comments and try to think about how you might use this knowledge to improve your own skills in gathering dietary information from service users.

This task was designed to allow students to develop their skills in conducting the interviews, and to reflect on practice and identify areas for development. The use of peer recordings meant that there would be a range of areas to comment on, making the task itself much more active than simply listening to a professional. Students were also offered more interview recordings to practise this task further.

Adding Commentary with GarageBand

A recording was chosen that demonstrated a range of teaching points. Having listened to the recording and made brief notes, cuts were then made in the original recording at natural stopping points, for example, after the participant and interviewer had discussed breakfast. It was important to allow the original recording room to breathe by not interjecting too often – this makes for fewer edits too.

You can record audio with a range of devices – Windows laptops can run Audacity, and Macs come with GarageBand. It is also possible to record audio clips on a smartphone and import them. When doing any recording, make sure to do a quick test first to ensure there is no unwanted background noise – just record a few seconds and listen back. GarageBand was used in this case, but the Audacity user interface is very similar.

The first 20-minute recording took around two hours to produce, but this time included learning how to use the software. The screenshot below shows how the editing process looks in GarageBand. The top half shows the three tracks that were mixed to create the final output. By cutting and arranging the various sections, it is possible to quickly add commentary and even intro music to the basic original recording.

The project file, which contains all of the information in the top half of the screenshot such as individual tracks and cuts, can be saved for later use. This is helpful if you want the flexibility to change the content, or re-use elements. The single stream of audio can be exported separately as an audio file and embedded into Canvas or the MLE with accompanying text and other resources to build the desired task.
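For readers who prefer a scripted route, the same assemble-by-segments idea can be sketched in a few lines of Python using the standard library's wave module. This is only an illustration of the cut-and-interleave structure described above: the file names and clip lengths are hypothetical, and the case study itself used GarageBand's GUI rather than code.

```python
# Minimal sketch of interleaving commentary clips between interview segments.
# Silent clips stand in for real recordings; all names/durations are made up.
import struct
import wave

FRAMERATE = 44100  # mono, 16-bit, CD-quality sample rate


def make_clip(path, seconds):
    """Write a short silent WAV file standing in for a real recording."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)  # 16-bit samples
        w.setframerate(FRAMERATE)
        w.writeframes(struct.pack("<h", 0) * int(FRAMERATE * seconds))


def read_frames(path):
    """Return the raw audio frames of a WAV file."""
    with wave.open(path, "rb") as r:
        return r.readframes(r.getnframes())


def interleave(interview_segments, commentary_clips, out_path):
    """Alternate interview segments with commentary clips into one file,
    mirroring the cuts made at natural stopping points in the original."""
    with wave.open(out_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(FRAMERATE)
        for segment, commentary in zip(interview_segments, commentary_clips):
            w.writeframes(read_frames(segment))
            w.writeframes(read_frames(commentary))
```

The exported file can then be embedded into Canvas or the MLE exactly as described above; the advantage of a scripted approach is only that re-cuts are repeatable, while a GUI editor like GarageBand makes finding the natural stopping points far easier.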

Style and Substance

It is natural to worry about quality when producing an audio or audiovisual resource for the first time as the content should convey a level of professionalism matching its purpose. As long as content is clear and understandable, it will serve for teaching. Making a clean recording can be done relatively simply by avoiding background noise and speaking at a measured pace and volume. You can add a touch more professionalism to your recordings by adding a little music to the intro and using some basic transitions like fading between different tracks if needed, but there is no need to go out and buy specialist equipment. The content of the recordings was linked very closely to the students’ tasks and mirrored how they may receive feedback in future by showing what practitioners look for in their interviews. This clear purpose alongside the care taken in producing the audio ensures that this resource is valuable to listeners.

Conclusion

While at first it seemed like a big undertaking, a quick YouTube search for instructions on using the software, followed by experimenting with the audio recordings, has opened up a new avenue of teaching methodology: it was a lot easier to do than it first appeared, and in total took around two hours. The software has a lot of capabilities, but only the basics are really needed to produce a high-quality, rich teaching resource. Commentated practitioner interactions allow teaching staff to draw students' attention to key moments while remaining in the flow of the interaction, signposting how students can reflect on practice and develop their own interviewing skills.

Resources

Contact

Susan Lennie, Senior Lecturer, Biomedical, Nutritional and Sports Sciences

Case Study: Adapting a course for a larger cohort

Guest post by Sue Campbell from the FMS Graduate School, Module Leader for ONC8024: Chemotherapy Nurse Training.

The Challenge

In December 2020, we were informed that Lancashire Health would be sending their Nursing students to study our course, which was due to start in February 2021. We had already seen an increase in our own numbers, so with these additional students we were expecting a much larger cohort than usual. The increase was partly due to the COVID situation and the cancellation of study leave in the NHS. We needed to investigate whether the course structure would be suitable for 50 students instead of the usual 10-15 we had taught in previous years.

What did you do?

We reviewed each activity and imagined how it would work with 50 students. Activities that students completed on their own, such as crosswords and quizzes, were fine.

Our main concern was the collaborative wiki tasks: these are pages within Canvas, usually involving a table, that students complete together to create a resource. We wanted to keep these tasks as they encouraged teamwork, but they were not suitable for 50 students to contribute to. After discussing the problem with others who had experience of working with larger cohorts, we came up with a solution.

With help from the FMS TEL Team we were able to separate the students into groups of 10-15 and provide each group with its own collaborative wiki task to complete. Once the course began, we experienced registration issues, so students were all starting at different times. We decided to adjust the groups so that the late-starting students would be in the same group and would not feel left behind.

“It’s about finding solutions you are not aware of; groups was a really quick and effective fix for what I envisioned to be a much larger problem.”

We wanted to keep the discussion tasks as they had worked well in the past, but would they work with large numbers? We went through each discussion task and made changes.

Where we had previously asked students to discuss three points, we changed this so that students chose one discussion to take part in but were able to view all the discussions.

Modified Discussion Board: Before and After

We decided to change the scenario discussions into branching activities instead. The questions asked in these discussions had only one right answer and were more of a fact-checking exercise than something the students discussed. Students could complete the branching activities independently, so cohort size did not matter, but the objective of the task was still achieved. We also added a presentation to summarise the learning from the scenarios, which replaced the interaction from the Module Leader that would usually have occurred on the discussion board at the end of the week.

Branching Activity

Tips

  • Ask for advice: I spoke with the FMS TEL and Programme Teams, and they provided several solutions I wasn't aware of. I also spoke with our DPD, Victoria Hewitt, for help with marking
  • Consider running the module twice a year if numbers/demand remains too high to sustain within one cohort
  • Branching activities will work regardless of numbers so we can easily roll those over year after year now
  • The Groups feature in Canvas is easy to turn on/off and adjust depending on numbers

What might you do differently next time?

We shall wait and see what the student feedback says, but we are currently in week 5 of the course and so far it is going well; the group work is successful. Some things we are thinking about are:

  • We have a lot of activities, but they are now largely peer-to-peer or independent tasks, so to bring back the teacher presence I would like to include more videos and presentations
  • We provide a general Q&A discussion board, and for the rest of the course we are also introducing fortnightly, bookable 10-minute 1:1 Q&A slots via Zoom for any students preferring a one-to-one discussion with the tutor.

Resources:

The versatility of quizzes

Over the past couple of months I have been talking to a lot of teaching colleagues about how they use quizzes. A quick summary of some uses for quizzes can be found below. There are two quiz tools available in Canvas (old and new quizzes), as well as a lot of web services that offer quiz functionality.

Using quizzes before synchronous seminars allows students to check their knowledge and make sure they have understood things correctly before entering into a discussion. This boosts confidence and allows them to participate more effectively in the session, knowing they have definitely grasped the concepts. This is especially useful with topics that are very abstract or contain a lot of new concepts or terminology. The case study with Rosalind Beaumont and Lydia Wysocki can be found on the LTDS case studies site.

Quizzes can also be used in the sense of providing test-enhanced learning opportunities for students. Regular short quizzes encourage students to retrieve the information they have remembered and put it into practice, boosting knowledge retention. The case study with Nick Riches can be found on the LTDS case studies site.

Another use for quizzes is to replicate a workbook: something that might be used in Present in Person (PiP) teaching to guide students through a series of problems as teachers monitor the room. Here the quizzes are instructive and challenge students to find the information they need, practising the skills they are learning. Detailed feedback and extra information allow students to step through the processes they are learning, and approximate the monitoring that may be done in the classroom by anticipating difficulties that may need clarifying. Teachers can then look at analytics or ask students to send questions to identify anything that needs further explanation. More information can be seen in the case study with the Library Liaison team.

When testing higher-order thinking skills such as evaluation, automatically-marked quizzes may not spring to mind, as evaluation is often done in prose. The case studies mentioned above include examples of higher-order thinking questions. This can be achieved through careful question construction with high-quality distractors for testing, or as a learning activity, asking students to apply skills and enter a rating at each stage, as modelled by the Drop Bear activity in the Library Liaison team's case study.

How do you use quizzes?