Have you ever heard students complain about a particular topic? The Module Leaders for our Chemotherapy Nurse Training module often came across such complaints about its Cell Cycle topic. As the issue was becoming a perennial problem, we decided to thoroughly review the topic that was causing frustration among our students. This blog post shares our project and the transformative impact it had on our students’ perception and understanding of the once-hated topic.
Revamping the Content
We started with the learning objectives for the topic. Using Bloom’s Verb Wheel as a guide, we changed the wording so the objectives would be measurable, and therefore the students could better understand what was expected of them. We removed words such as “understand” and replaced them with “describe”.
Next we reviewed the flow of the topic. It originally opened with a text-heavy page of stock images covering cell basics, cancer and chemotherapy, ending with the phases of the cell cycle. This page was edited, removing over 100 words, and rearranged so that we covered normal cells and their cycles first, then moved on to cancer and chemotherapy. The stock images were replaced with animated infographics tailor-made for the content being discussed.
The biggest change was to the asynchronous lecture. The audio was good quality and the messages were still relevant; however, the lecture slides were text-heavy and had an outdated look.
The slides were given a facelift and where possible, animations were used to replace the text. We hoped the animations would assist the students in visualising the processes being described by the voiceover.
Next the students had two tasks to complete. One was to answer some basic questions, and the other was to put their knowledge into action within a group activity. We spruced up the tasks visually, but didn’t make any other changes to them.
We chose to add a “check your knowledge” quiz, built in H5P, at the end of the topic. The questions linked directly to the learning outcomes and, we hoped, would build the students’ confidence with the topic.
Student feedback (before):
Found the cell cycle topic very difficult to understand.
I am dreading the cell cycle topic
Student feedback (after):
The topic I have enjoyed most was topic 3 (The cell cycle)
I have particularly enjoyed topics 3 – going further in depth into the cell cycle and pharmacology of the drugs we use has been really useful
I quite enjoyed the cell cycle section…I enjoy the lightbulb moments I’ve had understanding how everything links together.
The module team is thrilled with the amazing transformation we witnessed. Our goal was to address the complaints and make things better for the students. But we ended up surpassing our own expectations. The topic that used to be dreaded by the students has now become one of their favourites. The positive feedback from the students has made us proud and motivated us to continue on our journey for improvement. It’s really amazing to see such a big change in how they feel about the subject, with only a few tweaks in how the content was delivered.
This case study concerns a range of activities created for MCR8032 Clinical Research Delivery in Practice. The module leader, Fraser Birrell, put me in touch with his colleague, Associate Lecturer Ann Johnson, to assist in the development of a piece of learning about Unconscious Bias in Healthcare.
Ann Johnson has been a Patient and Public Involvement (PPI) Advocate, Lay tutor, and facilitator for twenty-five years, researching and creating a Patient Involvement Framework for Leicester University Medical School. She has conducted extensive community outreach in London, Leicester, and Florida USA with the goal of bridging communications between patients and practitioners. She is continuing her work as an Associate Lecturer and PPI Advocate at the School of Medicine.
As part of this module, it was important to ensure that a patient-centred approach to healthcare was highlighted. As such, Ann’s experience in the field allowed her to challenge students to look at healthcare – and clinical trials in particular – from the patients’ points of view.
One particular topic inspired Ann to focus on the topic of unconscious bias in more detail. In cases of hypertension, GPs had been trained to prescribe different drugs and treatment plans to people based on ethnicity, even though there is no evidence to support this course of action (Gopal, D.P. et al., 2022). This is an example of taught bias – but at the same time, GPs were making assumptions about patients’ ethnicities which could also be erroneous. Naturally, this is an area of concern for patients.
Equality, Diversity and Inclusion and Bias
The difference between EDI concerns and bias is important to clarify at this stage. While EDI principles are focused on actively working to improve outcomes, unconscious bias is present in all of us as a survival instinct and extends beyond those ‘protected characteristics’ formalised in EDI policies. Unconscious bias allows us to make quick decisions based on assumptions – for example choosing to cross the road to avoid encountering someone walking along with an unleashed Pitbull Terrier.
As a clinician, it is especially important to recognise one’s own potential for unconscious bias as it can affect decision-making, resulting in poorer outcomes for some patients. When this bias extends to choosing who to include in clinical trials, it is easy to see how misconceptions or omissions could be compounded.
You have been asked to become involved with the recruitment for the trial of Nosuchximab, a targeted therapy for Paediatric Lymphoma. The research target group is children aged between 2 and 14. There is a significant disparity in survival rates between the South Asian population and the white European population, so you have been asked to recruit children from the target age range across both groups. However, the NHS Foundation Trust site for the Nosuchximab trial is located in a region where the South Asian population, though present, is under-represented.
How might Unconscious Bias impact the outcomes of this trial?
Is it important to strategically recruit this cohort?
How might you put in place a strategy for recruiting those particular subjects?
What attempts should be made to minimise barriers to their inclusion?
The above activity challenges students to consider a range of complex factors and is designed to explore the recruitment process for clinical trials, which can be affected by unconscious bias. As such, we designed a range of scaffolding activities to lead up to students exploring this topic in a more confident and informed manner.
The learning was divided into three stages, supported by Canvas’ tools.
An introduction to unconscious bias with a test-your-knowledge quiz. This covered the basics, with the quiz used to give students confidence that they had understood them. The introduction was also written in a way that highlighted this was a supportive environment.
An opportunity to explore the effect of unconscious bias through a key reading, and a test students could take to identify their own potential biases, followed by reflection in one of several ways.
An opportunity to attempt the activity in discussion with others. A webinar is also available for students to join and discuss the activity with Ann and their peers, as well as to explore the topic further if needed.
We understood that the topic of unconscious bias could be challenging for students to confront, as it is intensely personal and potentially triggering. To allow students to explore the area in a supportive way, we suggested a range of activities, from private reflection to group discussion of the topic in general, so that students could examine it in an environment where they felt comfortable. This was especially important because it meant students would not feel judged or blamed for sharing their experiences and feelings about bias, making the whole topic more approachable and the learning more effective.
The activities will soon be live for students to try out the materials and share feedback. Anything highlighted by the student feedback will be discussed, and appropriate changes made to the activities if necessary. These materials will then run as part of the module next year. Further distribution of this content can also be done via Canvas Commons, should other module leaders wish to incorporate them into their teaching.
How do oral presentations work for 100% online modules?
Oral presentations are a popular choice of assessment in the Faculty of Medical Sciences, especially in our e-Learning modules. Students are asked to submit a pre-recorded presentation to Canvas and the markers watch the presentations at a time and place that suits them.
Diarmuid Coughlan, module leader for ONC8028 Practical Health Economics for Cancer, has kindly agreed to walk us through how the Virtual Oral Presentation element works on his module.
This year we had 14 students on the module. We asked the students to create a 15-minute presentation using Zoom, Panopto (ReCap) or PowerPoint.
We informed the students right at the start of the module that an oral presentation was part of the assessment and 4 weeks into the module we provided a formative assessment. The formative assessment allowed students to familiarise themselves with their chosen software, gain experience talking to a camera and also get some limited feedback on their presentation skills.
The submissions are double-marked. Marking is completed separately by each marker outside of Canvas; the markers then meet to agree which marks and comments will be entered into Canvas and made visible to each student.
The Set Up
We provided 2 submission points in Canvas:
Recording Submission Point:
This area was used for the marking. It was set up as a Media Recording assignment for MP4 uploads (max 500 MB), with a Text Entry option for Panopto users (no size limit).
We allowed students to choose which technology they were most comfortable with and provided video and written instructions for Panopto and Zoom. PowerPoint instructions were added later as an option with links to guidance provided by Microsoft.
We also provided some instructions so students could crop their recordings to comply with the 15 minute time limit.
You are limited by time so remember to edit your recording so it is no longer than 15 minutes. Instructions: Windows | Mac | Panopto
Slide Submission Point:
This area had a 0-point value. It was set up as a File Upload area for students to submit their slides as .ppt or .pdf. This gave us a Turnitin plagiarism score for each presentation, as well as a reference copy of the slides should anything be unclear in the video recordings.
How did it go?
There was a lot of fear from students initially. We encouraged students to give it a go, informing them that we were not trying to trick them. We provided clear guidance on what we expected and provided a rubric with a breakdown of points, clearly showing only a small percentage of the grade would be based on their presentation style and delivery. The content of the presentation was the most important part!
The use of technology was varied across the cohort.
As markers we also had to overcome our fears of technology.
PowerPoint is easier once you know how to access recordings (you have to download the file, then click start slideshow).
Sometimes the Panopto recordings were hard to find, especially if students had experience of using the technology in Blackboard and did not follow the Canvas instructions correctly.
What are your next steps?
We only provided grades with a short feedback comment last year; we plan to provide more extensive feedback going forward
We will add more video content into the module as examples of how to create engaging slides and showcase our presentation styles – hopefully leading by example
We would also like to provide examples of a good presentation vs a bad presentation
Personal Tutoring, in this context, is the process of assigning the availability of university staff for student tutoring; it is not the process of assigning individual students to individual staff. Staff in the faculty are assigned to programme groupings, or “pots” (not individual programmes). Each grouping has a lead administrator who, by looking at the expected student intake, can work out whether they need more staff resource or can free it up for others.
The faculty admin team have run this process for many years using Excel spreadsheets and email to pass information back and forth. This kind of activity is both time-consuming and prone to errors caused by duplicate copies of data and missed communications.
This is the perfect example of a process that can be done better using a web application, the kind of work the Technologies Developers in FMS TEL undertake all the time.
No time to plan properly
Sometimes a project comes to the unit that needs to be completed in a short time frame. Ideally, the time spent on a new website would be spread evenly between specification, design, implementation, testing, and support/improvement; a project may even go through many loops of these stages.
When a project does not have the luxury of time, all these steps need to be compressed and decisions made about which to prioritise. In the case of Personal Tutoring, the specification and design were collated from the old Excel spreadsheets and turned into a simple tabular wireframe. These spreadsheets were also identified as the origin of the data the site would be based on, and import scripts were planned accordingly. No complex interface features were offered – just a clean display of the data, with filters and stats to help with the tutoring assignments.

For the implementation, we decided to host the new site on top of another, saving time on hosting framework and infrastructure. We chose a site with similar tools we could utilise (FMS Projects, which has a statistics section), and based the core of the new site on knowledge and experience the team already had: APIs, data tables, and spreadsheet importing/exporting. Finally, the testing and support side was compressed by keeping the interface simple, reducing the need for support and documentation. The limited number of users the site is likely to have also helps with support, as we can offer short-term direct guidance.
With all these measures, we managed to reduce a development process that could take 6 months down to 3.
The Personal Tutoring site is due to go live in July 2022. We managed to write a functional version of the site in roughly 2 months. This left 1 month for planning the release, testing and showcasing the site to the customers, and making improvements from their feedback. Overall we are pleased with the structure and quality of the site. The code design is based on solid principles and should offer a degree of flexibility when Personal Tutoring gets used and the inevitable suggested improvements come through.
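The core of the import-and-allocation logic described above can be sketched in a few lines. This is only an illustration: the column names, groupings and figures below are invented, not the site’s real data model. Once the spreadsheet rows are read in, the useful stats boil down to grouping staff capacity by programme pot and comparing it with expected intake:

```python
from collections import defaultdict

# Hypothetical rows, standing in for data imported from the old
# staff allocation spreadsheets (names and numbers are invented).
rows = [
    {"staff": "A. Smith", "grouping": "Oncology", "capacity": 5},
    {"staff": "B. Jones", "grouping": "Oncology", "capacity": 3},
    {"staff": "C. Lee", "grouping": "Dietetics", "capacity": 4},
]

def capacity_by_grouping(rows):
    """Total tutoring capacity available in each programme grouping."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["grouping"]] += row["capacity"]
    return dict(totals)

def shortfall(totals, expected_intake):
    """Positive numbers mean more staff resource is needed;
    negative numbers mean capacity could be freed up for others."""
    return {g: expected_intake.get(g, 0) - totals.get(g, 0)
            for g in expected_intake}

totals = capacity_by_grouping(rows)
print(totals)                                              # {'Oncology': 8, 'Dietetics': 4}
print(shortfall(totals, {"Oncology": 10, "Dietetics": 3})) # {'Oncology': 2, 'Dietetics': -1}
```

A display like this, with filters over the groupings, is essentially what the tabular wireframe presents to the lead administrators.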
If you are interested in this topic and wish to learn more, please contact:
This week students from the University of Cape Town have joined the pilot of the Exploring 3D Anatomy course.
The second pilot of the Exploring 3D Anatomy course is now live, with students participating at the University of Cape Town in South Africa.
Leonard Shapiro presented an introduction to the course on 18th March to a class of second-year medical students – many of whom joined the class remotely via Teams.
The course trailer was also shown, and students were invited to sign up using a QR code posted on the slide show as well as on posters in the lecture theatre. Senior lecturer in Anatomy, Dr Geney Gunston, also posted the sign-up link for the students on VULA, the university messaging system.
Many students have signed up, and we look forward to working with them over the next few weeks as they Explore 3D Anatomy from home.
After this pilot and a further development phase, we hope to make the course more widely available to many more students.
Images provided by UCT medical student Mulaudzi God-mother Matodzi-muswa and Leonard Shapiro.
Find out the pros and cons of 360-degree meeting room cameras in practical sessions.
Roisin Devaney, Susan Lennie, Andrea McGrattan
This practical session was based around students creating a meal to align with the recommendations of the EatWell guide, using ingredients provided in the food handling lab. The session involved a short introduction, followed by time for the activity, and feedback. At the last moment there was a request to record the practical session, and therefore a 360° camera was used as a quick solution – an ideal opportunity to test the camera’s capabilities.
These rooms are not set up for ReCap, so having this portable method of recording allowed the session to be captured. The camera was connected to Zoom so that a remote student could watch the session or review the recording. As it happened, no one watched the stream live, so there was no interaction with a remote participant. As shown in the screenshots below, the camera could cover the whole room without needing to be moved. It could also zoom in and out to focus on where the action was taking place.
The camera was quick and simple to set up, and lecturers used lapel mics to capture sound when they were addressing the class. The camera was placed on the demonstrators’ bench at the front of the room, where it could see the kitchen spaces as well as the session leads, and the whiteboard.
All present were made aware of the device. It was relatively unobtrusive and didn’t get in the way of the session – most people simply forgot about it. This is a bonus, as it means the lecturers don’t have to think about what to capture at various points during the session. They could also walk around the room naturally rather than being tied to a fixed presenting spot at the front, as the camera would follow them.
As a quick solution, it was easy to set up, and the predetermined settings allowed for much of the session to be captured. It provided a record of the practical session, though due to the level of automation in where the camera chose to focus, some detail and some sound was lost. For example, on some occasions, the camera would choose to focus based on an incidental sound in the room, like a pot being scraped. This kind of auto-selection would work better in a situation such as a seminar where everyone sits around one table with less background noise and more obvious turn-taking when speaking.
The camera also captured incidental discussions – as it was placed at the front where students were collecting ingredients, it captured discussions between students about what they were choosing to include in their meal.
It was also able to capture two areas simultaneously and display them side by side, allowing for an overview of the room as well as capturing individual discussions.
There are a range of applications for this type of device in a practical setting. For example, for a remote student, the device could be placed in a group of students so that some collaboration would be possible. This would allow them to be more involved in the practical aspect of the session, though of course it would not be a complete substitution for in-person attendance.
Though the session recordings are relatively long, once uploaded to Panopto, bookmarks could be added to allow students to quickly navigate to various parts of the session, for example the input and feedback.
Roisin Devaney, Susan Lennie and Andrea McGrattan, Nutrition and Dietetics, School of Biomedical, Nutritional and Sports Sciences
This post is about the use of Menti, a polling tool that can show responses on screen in real time. Lindsey Ferrie used it to gather student feedback during an in-person event. Links to guides are included.
In Biomedical, Nutritional and Sport Sciences this year there is a focus on improving assessment and feedback. Including student voice in this development work is key. As well as the normal routes for student voice, such as staff-student committees and module evaluation forms, there were certain questions that staff had in mind which could reveal a lot about how students were feeling about this aspect of their course.
The formal module evaluation forms don’t always capture a large sample from the student body, and as with any survey, generating a high response rate is very difficult, and can sometimes end up only revealing the most polarised responses. Informal live response tools were a potential way to gather large amounts of more representative data, as well as demonstrating that feedback is wanted, heard, and acted upon within a shorter cycle. It is also much more convenient to capture a response immediately when students are in the room.
Having seen Menti in use during the FMS TEL Conference, the decision was taken to try it out in a much bigger setting – the inductions for all programmes in the School. Overall, this includes around 1500 students.
A useful aspect of Menti in this situation was the capacity to provide a wide range of question types, as well as the ability to see answers as they are coming in. Naturally, this has some disadvantages too – students’ responses may be coloured by what they see on screen if this is displayed. Possibly due to its anonymous nature, students provided some genuinely challenging feedback which proved very useful.
When in use, Menti allowed for the sessions to be more interactive as sharing live responses is quite immersive. It also allowed for the presenters to directly react and respond to things as they came in and broke the ice between students and staff. This can be useful for quick questions about content as well as student voice activities, for example polling the level of understanding of a concept at the start and end of a teaching session.
In terms of providing evidence to share with other colleagues, or at committees, Menti retains the responses for you to review, and can generate lists of entered comments. It isn’t designed as a statistical analysis tool, so comments don’t come in a convenient format for analysis. Graphs can be screenshotted for simple sharing if needed. It is of course possible to read through the comments and type up an overview of common themes yourself if needed.
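If a quick machine-assisted overview is enough, a few lines of Python can surface the most frequently mentioned terms before you read the comments in full. This is only a sketch: it assumes the free-text responses have been copied out of Menti into a plain list of strings, and the example comments and stopword list here are invented:

```python
from collections import Counter
import re

# Invented example comments, standing in for exported Menti responses.
comments = [
    "More feedback on lab reports please",
    "Lab report feedback was too brief",
    "Really enjoyed the interactive sessions",
]

# A tiny ad-hoc stopword list; a real one would be much longer.
STOPWORDS = {"the", "on", "was", "too", "more", "please", "really"}

def common_terms(comments, n=3):
    """Count the most frequent non-stopword terms across all comments."""
    words = []
    for c in comments:
        words += [w for w in re.findall(r"[a-z]+", c.lower())
                  if w not in STOPWORDS]
    return Counter(words).most_common(n)

print(common_terms(comments))
```

This only counts words, so it is no substitute for reading the responses, but it can help decide which themes to type up first.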
The instant feedback received highlighted that some students weren’t entirely satisfied with the feedback they were receiving on their lab reports. Based on the responses, the School was able to put on a dedicated feedback session to further explore this, and to find out what could be improved, and have already been able to implement some changes.
Consider adding a delay before sharing answers on screen
For free text answers, it may be prudent to summarise rather than share the screen
Consider numbers when selecting your question types – it’s not easy to read 30+ typed responses quickly.
Don’t overuse it – keep use to key points so the novelty doesn’t wear off
Just try it – and be prepared for the honesty in the responses!
Menti works best in the moment in a session, or when you want to gather responses instantly
Can I try it?
You can create a free account on Mentimeter.com. This free account has limits on how many individual questions you can make, but you can always make multiple presentations and switch to them when sharing your screen. The university does not have any subscription to this service. As such, do not ask students to input any personal or sensitive information as this won’t be covered by our data policies.
When presenting, simply ask the audience to visit the site and enter the provided code that you show on the screen. When you’re ready, you can display the results to your audience. See the FMS TEL Community for a full walkthrough.
Other Ideas for Use
Gauging the experience already in the room before starting a session (free text or poll)
A quick litmus test on an opinion – yes/no/not sure (could be done at the beginning and end to see how things change)
Challenging students to define a term
Choices and branching scenarios
Feeding back results on a solo exercise
Anonymous feedback / rating for an aspect of the session
Lindsey Ferrie, Senior Lecturer, School of Biomedical, Nutritional and Sport Sciences
This post shows how cursor movement can be used in online presentations to show gesture, and the skills needed to add motion tracked items to video.
As part of the recently launched Exploring 3D Anatomy MOOC, two video presentations were created. These presentations involved explaining diagrams and pictures. One of these recordings had a moving cursor which the presenter had used to explain various parts of the screen, and the other was recorded without a cursor. To improve the clarity of the explanations, we had a request to display a larger cursor over the recorded material, using it to ‘point’ to the various significant areas shown.
When you’re planning a virtual presentation it’s worth checking if and how the software handles the cursor – some software will use a glowing highlight as you present, some will show and hide the cursor automatically depending on when you move it. The videos were recorded in ReCap, which automatically hides the cursor unless it is moving. The end of the post has links to various guides to help you choose your settings. The rest of this post details the animation process for how a larger cursor was added after the presentations had been recorded. This technique could be applied to other added graphical elements too if needed.
Creating the New Cursor
As the cursor is used in a lot of animations, there was already a scalable vector graphic image of a cursor available to use. This had been drawn in Adobe Illustrator. The next step was to use After Effects to add the cursor to the video and animate it.
Tracking the Cursor
For the video with the cursor visible, the motion tracking function of After Effects was used. After identifying the original cursor, the new larger cursor was set to track it. In places, the original cursor changed colour to remain visible over different backgrounds. It wasn’t necessary to replicate the colour change with the larger cursor, but it did add extra steps when setting up the motion tracking, as tracking had to be restarted each time the original cursor changed colour. For the video without the cursor, the process was simpler as there was nothing to hide or track, so the animations could be set up from scratch. Based on the clear explanation from the presenter, it was possible to add a cursor to trace the areas being explained.
Adding the New Cursor
The animations were set up to take place between certain moments of the video – like scenes. Key points in the video were identified and ‘key frames’ added which allow us to set up when certain animations should take place, and how long for. Simple animations such as changing size, position or rotation can be done relatively quickly using these linear key frames.
Once the start and end points are set, further customisation can be done to change the feel of the animation. For example, in this case, the speed of the cursor should somewhat mimic a natural movement rather than a precisely uniform speed. Using ‘ease in’ and ‘ease out’ (combined as ‘easy ease’) allows for the animation to look a little more natural, and less jarring, as the cursor starts to move more slowly before speeding up and gradually slowing to a stop.
When moving from point to point it’s very rare that a straight line is the best path to take; usually a slightly curved path helps give a more natural-looking movement. This might be used to instruct a viewer to click a series of buttons, for example. ‘Spatial interpolation’ in After Effects allows the path of the moving object to be linear (a straight line) or Bezier (curved). The temporal interpolation tool allows for variations in the speed of the movement – a more customisable version of easing. Adjusting these allows for a nice natural pace and movement, and for more creative effects. For example, an item moving from A to B may move slowly at first, then speed up towards the middle of its journey, then slow down again before arriving at its final destination – imagine a train travelling between stations!
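The easing behaviour described above can be sketched as a small function. This is a generic cubic “smoothstep” curve for illustration only – not After Effects’ actual implementation – but it shows how eased time produces the slow-fast-slow motion:

```python
def ease_in_out(t):
    """Cubic ease-in-out: slow start, faster middle, slow finish.
    Takes normalised time t in [0, 1] and returns eased progress."""
    return 3 * t**2 - 2 * t**3

def cursor_position(start, end, t):
    """Interpolate a 2D position between start and end using eased time."""
    e = ease_in_out(t)
    return (start[0] + (end[0] - start[0]) * e,
            start[1] + (end[1] - start[1]) * e)

# Halfway through the animation the cursor is halfway along the path,
# but early on it lags behind a linear (constant-speed) movement.
print(cursor_position((0, 0), (100, 50), 0.5))  # (50.0, 25.0)
print(ease_in_out(0.25))                        # 0.15625 - behind linear's 0.25
```

Swapping the straight-line interpolation in `cursor_position` for a curve (a Bezier, say) would give the spatial half of the story; the easing function is the temporal half.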
The final videos allowed for a clear approximation of gesture to be added to the presentations, mimicking how a presenter might usually point to a screen or demonstrate a movement. While this is something very natural to do in person, you may need to think more about how you use and move your cursor in online presenting. Often it can be tricky to see the cursor, so you may wish to consider moving it more slowly than usual if you are using it to indicate processes or changes. Selecting some form of pointer or cursor highlighting in your chosen software can improve the visibility of the cursor during your presentation, whether recording or in person. On the other hand, you may wish to put your mouse out of reach so that random or accidental cursor movements don’t detract from your content.
Motion Tracking Demonstration
This video demonstrates the full motion tracking procedure, showing how you can track an object and then map the position of a cursor to it.
Cursors on Panopto – remember that only the slides are captured if you add a PowerPoint, so to capture your cursor, you should record your screen instead.
This post is about using audio recordings of patient consultations in teaching. Commentary was added to the recordings by the lecturer to create a richer resource.
This case study concerns Dietetics and Nutrition module NUT2006, Measurement and Assessment of Dietary Intake and Nutritional Status. As part of this module, dietary interview consultations are recorded so that the students can listen to these as examples. The FMS TEL Podcasting Webinar provided initial inspiration for what could be done with the recordings to enhance them. With a little more support, a new audio resource has been developed which adds audio commentary to the recorded consultations, highlighting various features.
Consultations and Recordings
The work of Dietitians and Nutritionists involves gathering information from individuals and populations on their recent or typical food intake. This enables them to analyse nutrient intake and understand dietary behaviours so that they can make suitable recommendations. Taking a diet history, or a 24-hour dietary recall, involves a structured interview with questions exploring habitual food intake, timing of meals, cooking methods and quantities. The effectiveness of the interviewer’s questioning technique affects the quantity of information gathered and the quality of the nutritional analysis that can be undertaken. Students are working towards proficiency in these skills. Listening to recordings of these interviews exposes students to examples which will support them in improving their skills when they perform these tasks for themselves. They can also practise analysing the data provided by the audio recordings.
The recordings themselves are a very rich resource, which could be used in a variety of ways to help students improve their practice. The following task was developed, which required teaching staff to add audio commentary to the interviews.
Students first watched a short lecture on best practice for conducting interviews. They then listened to a recorded interview by an anonymous peer and made notes critiquing the effectiveness of the questioning techniques and determining whether the quality of information obtained was sufficient to undertake nutritional analysis. Next, they listened to the same interview with professional commentary provided by staff, highlighting what could be improved, and were asked:
Did you spot the same things?
Reflect on the comments and try to think about how you might use this knowledge to improve your own skills in gathering dietary information from service users.
This task was designed to help students develop their skills in conducting the interviews, reflect on practice and identify areas for development. Using peer recordings meant there was a range of areas to comment on, making the task much more active than simply listening to a professional. Students were also offered further interview recordings so they could practise the task again.
Adding Commentary with GarageBand
A recording was chosen that demonstrated a range of teaching points. After listening through and making brief notes, cuts were made in the original recording at natural stopping points – for example, after the participant and interviewer had discussed breakfast. It was important to give the original recording room to breathe by not interjecting too often; this also makes for fewer edits.
You can record audio with a range of devices – Windows laptops can run Audacity, and Macs come with GarageBand. It is also possible to record audio clips on a smartphone and import them. When doing any recording, make sure to do a quick test first to ensure there is no unwanted background noise – just record a few seconds and listen back. GarageBand was used in this case, but the Audacity user interface is very similar.
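GarageBand and Audacity handle these cuts visually, but the underlying operation is simple: the recording is split at a stopping point and the commentary clip is dropped in between. As an illustration only (the file names and cut point below are hypothetical), the same splice can be sketched with Python's standard `wave` module:

```python
import wave

def splice_commentary(original_path, commentary_path, out_path, cut_seconds):
    """Insert a commentary clip into a recording at a natural stopping point.

    Assumes both WAV files share the same sample rate, channel count and
    sample width (mixing mismatched formats needs a proper audio editor).
    """
    with wave.open(original_path, "rb") as orig, \
         wave.open(commentary_path, "rb") as comm:
        params = orig.getparams()
        cut_frame = int(cut_seconds * orig.getframerate())
        before = orig.readframes(cut_frame)                     # audio up to the cut
        after = orig.readframes(orig.getnframes() - cut_frame)  # rest of the interview
        commentary = comm.readframes(comm.getnframes())         # whole commentary clip
    with wave.open(out_path, "wb") as out:
        out.setparams(params)  # frame count in the header is corrected on close
        out.writeframes(before + commentary + after)

# Hypothetical usage: add a comment after the breakfast discussion (at 95 seconds)
# splice_commentary("interview.wav", "comment1.wav", "annotated.wav", 95.0)
```

An editor like GarageBand still earns its place for fades, intro music and fine-tuned timing, but the sketch shows that the core task is just rearranging sections of audio.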
The first 20-minute recording took around two hours to produce, but this time included learning how to use the software. The screenshot below shows how the editing process looks in GarageBand. The top half shows the three tracks that were mixed to create the final output. By cutting and arranging the various sections, it is possible to quickly add commentary and even intro music to the basic original recording.
The project file, which contains all of the information shown in the top half of the screenshot, such as the individual tracks and cuts, can be saved for later use. This is helpful if you want the flexibility to change the content or re-use elements. The mixed audio can then be exported as a single audio file and embedded into Canvas or the MLE, with accompanying text and other resources, to build the desired task.
Style and Substance
It is natural to worry about quality when producing an audio or audiovisual resource for the first time, as the content should convey a level of professionalism matching its purpose. As long as the content is clear and understandable, it will serve its teaching purpose. Making a clean recording is relatively simple: avoid background noise and speak at a measured pace and volume. You can add a touch more professionalism by adding a little music to the intro and using basic transitions, such as fading between tracks, but there is no need to go out and buy specialist equipment. The content of the recordings was linked very closely to the students' tasks and mirrored how they may receive feedback in future, by showing what practitioners look for in these interviews. This clear purpose, alongside the care taken in producing the audio, ensures the resource is valuable to listeners.
While it seemed like a big undertaking at first, a quick YouTube search for instructions on using the software, followed by some hands-on experimentation with the audio recordings, opened up a new avenue of teaching methodology – it was much easier than it first appeared, taking around two hours in total. The software has a lot of capabilities, but only the basics are really needed to produce a high-quality, rich teaching resource. Commentated practitioner interactions allow teaching staff to draw students' attention to key moments while remaining in the flow of the interaction, signposting how students can reflect on practice and develop their own interviewing skills.
A recent case study by David Kennedy explores personal tutoring in the School of Medical Sciences and its impact on student satisfaction.
Dr David Kennedy of the School of Medical Education recently shared his work on personal tutoring and academic mentorship via the LTDS blog. The case study details what changes were made, and shows the massive increase in student satisfaction that resulted.
“As a School, we had a desire to improve academic and professional development, as well as pastoral support, for all of our students to enable them to achieve their true potential and support their transition to the workplace.”
David Kennedy, SME
This case study will be of interest to anyone interested in mentoring, pastoral support and the student experience.