Reading group in LTDS

A little while ago we started a small reading group for colleagues in the Learning and Teaching Development Service to share ideas and discuss current issues and publications related to learning and teaching in Higher Education.

We set ourselves a few parameters to encourage engagement, as we had previously tried a journal club without a great deal of success.

This time we decided to limit ourselves:

  • to short, digestible readings that could be covered in under half an hour
  • to keep the discussion sessions to 30 minutes
  • to use small groups for discussion of themes, impressions, etc.

Over the past few meetings we have covered a range of materials and topics.

Our next meeting will explore micro-credentials by looking at the recent QAA Quality Compass paper, 'Which way for micro-credentials?'.

This will be the first meeting of a slightly expanded group, which now includes colleagues from FMS TEL.

One person runs the group for six months at a time (currently me!), collating suggestions from anyone who wants to put something forward. I try to include a range of different types of materials and to cover a range of learning and teaching viewpoints, as our group includes people who work in policy, practice, pedagogy, quality assurance, data and governance, professional development, and all the intersections thereof.

Last time we listened to a radio programme about closed captions, which really made me think about how we approach captioning in HE. Some great ideas resulted from the session and it certainly got us talking!

My internship supporting the rollout of learning analytics for students

By Em Beattie, Stage 2, Geography, Politics and Sociology student

This summer I worked as an intern with the learning analytics team, which has been developing a new system that allows Newcastle University students to review and access their own learning analytics data. Learning analytics refers to the measurement, collection, analysis, and reporting of data for the purpose of understanding and improving students' learning. Students' data is collected from a variety of sources to enable them to view their attendance, engagement and module summaries. The aim of this rollout is to empower students and positively impact their academic achievement and progress through smarter insights and stronger outcomes.

My key role was to contribute to the methodology and development of student communication channels, organise pop-ups, analyse and manipulate data, contribute to the design and evaluation of materials, and present findings. I really wanted this experience to develop my career skills, and I am passionate about academic growth and attainment.

I found the internship on MyCareer, a Newcastle University platform which advertises internships and work experience that students can apply for. After finding the learning analytics internship and reading through the description, I thought it would be a valuable and interesting opportunity. The application process was very simple: I attached my CV and answered three questions on how I would manage the hours required, what skills I brought to the internship, and why I was the right person for the experience. When writing these answers, I used the STAR technique to give clear details of the skills I brought to the table. A few weeks after submitting my application, I was fortunate to be emailed and asked if I was available for an interview. I was very nervous, as I had never had an in-person interview before. To prepare, I read over the description of the role and noted down what skills and ideas I could bring to each of the tasks I would be completing. I also looked at the advice Newcastle University gives about interviews on their website: https://www.ncl.ac.uk/careers/making-applications/interviews-assessments/interviews/. My first ever in-person interview, although nerve-wracking, gave me real-world experience that will be helpful later in life, and afterwards I waited to hear the result.

Before starting the internship I was slightly nervous, but I worked with an incredible team who were very supportive all the way. The learning analytics internship has given me an incredible experience, teaching me valuable skills and lessons that have allowed me to develop both professionally and personally.

Working with the learning analytics team has been so much fun. Through hosting pop-ups and interviewing students, I learned how to gather meaningful feedback, listen actively, and represent student voices in a constructive way. This experience also helped me understand the importance of real student insights and how they can inform and improve educational strategies. Although the pop up was quieter than expected as some students had left to go home for summer, we still gathered a range of responses online and in person.

After the pop-up, and once students had filled in the online forms, I analysed the data, which taught me valuable skills in critical thinking and paying close attention to detail to observe patterns and trends in students' responses. This experience confirmed my interest in qualitative and quantitative research, and I am now more confident in analysing data.

I thoroughly enjoyed putting a student perspective and spin on the marketing research. Another of my tasks for the internship was to develop communication channels for students. Multiple channels were highlighted from the pop-up, including email, Canvas, social media and in-person discussion. For social media I used Canva to design a post about the new learning analytics system, which was a fun experience. I also helped design the structure of the student-facing webpages, using PowerPoint to mock up an example and writing descriptions explaining why videos and images should be used. As someone who lacks creative skills, I found it really fun to try to design social media posts and webpages for learning analytics, and I felt it definitely developed my creativity.

The best part of the internship was knowing that what I was working on would help current students in their academic growth, allowing them to set targets and review their engagement with their work.

A typical day during the internship included a meeting, held either in the Kingsgate building or remotely on Teams depending on the team's availability. During this meeting we discussed what we had all been working on, gave each other feedback and ideas, and planned our tasks for the next week. A lot of the work I completed was online, such as analysing data, developing ideas for communication channels, and researching and comparing other universities' learning analytics systems.

The experience massively helped my confidence; interviewing students and presenting my research pushed me out of my comfort zone but helped me become much more comfortable putting myself out there.

One challenge I faced was managing all the weekly tasks. Some weeks were busier than others, but on those busy weeks I used my notebook to schedule when I was completing each task, how long each would take, when meetings were, and any questions I had during those tasks, to keep track of everything.

One tip I would give to students doing an internship is to write down the skills they have learnt during the experience, with a description of each. I did this, and it was helpful when I completed my student internship pathway reflection; it will also be useful for future interviews and applications, as I can explain clearly what skills I developed from this experience.

Looking into the future…

Moving into third year is scary, but knowing I am bringing valuable skills that I have learnt from this experience makes me feel more confident and ready. I am looking forward to using Study Goal to improve my academic progress and create targets to better myself.

Inspera new feature using Stimulus

Highlighting text within a Stimulus  

As seen on our dedicated content creation features website, within your Inspera Question Sets that use sections, you are able to apply a Stimulus to questions.

Stimulus can be helpful when authoring exam content, as information that is relevant to question(s) within a section can be displayed alongside the question itself. Stimulus examples include:  

  • case studies 
  • background information 
  • key concepts that students should refer to 

To draw students' attention to a particular part of a Stimulus, you can highlight text within a section Stimulus. The highlighted text becomes visible to students as set for each individual question in the section.

If a Stimulus is used for multiple questions, then as students move through the questions you can highlight different parts of the Stimulus text in line with what is relevant to the selected question.

Using highlighting: 

  1. Add a Stimulus (using the document option) to your Section 
  1. While editing the Stimulus, select the text that you want highlighted and click on the highlight brush (at the top right of the toolbar) 
  1. Select the question this highlight refers to: 

In this example, the MCQ question is selected 

  1. Click insert  
  1. Repeat for all questions as required 
  1. Review all highlights using ‘Section highlights’ 
  1. Save 

Important note: for highlighting to be achieved, the Stimulus must be created as a Document.  

Reflecting on year one of NULA – and what’s coming next 

As we near the end of the first academic year using the Newcastle University Learning Analytics (NULA) system, we’re taking a moment to reflect—and we want to hear from you.

NULA was introduced to support teaching and learning by giving colleagues greater insights into student engagement and progress. Over the past year, colleagues across the university have used the platform to inform tutoring conversations and connect with students in more meaningful ways.

Now, your feedback will help us understand what’s working, what could be improved, and how NULA can be better used to support students moving forward.

Share Your Experience 

We’ve created a short survey (it takes less than 10 minutes to complete) to gather your thoughts. Whether you’ve used NULA extensively or only briefly, your perspective is incredibly valuable. 

Complete the survey now  

What’s Next for NULA 

We’re excited to share that several important developments are on the way: 

Student app launch – September 2025 

The student-facing version of the NULA app will be available for the start of the 2025/26 academic year, designed to give students greater insight into their learning and engagement. Dedicated resources for students will be made available on the Academic Skills Kit website.

New data sources for colleagues 

From next academic year, the colleague-facing version of NULA will include ReCap lecture capture data and Library Reading List data—offering an even more comprehensive picture of student engagement. 

These enhancements are driven by your feedback, and we’re committed to ensuring NULA continues to support your work in meaningful and practical ways. 

Inspera Digital Exams Team Leading External User Group

Inspera External User Group

The Inspera Digital Exams Team at Newcastle has set up and leads a user group for Inspera users at other institutions in the UK.

Through existing connections and the support of our Account Manager at Inspera, we have contacted colleagues in similar roles to ourselves (Learning Enhancement and Technology Advisers) to create a group where we can share best practice, learn from each other and have the opportunity to discuss aspects of our Inspera use.

The group first met in November 2024 and as of June 2025 has 33 members across 13 institutions. We have received lots of positive feedback on how useful the group is, and we have enjoyed meeting fellow Inspera users.

The Inspera Leads at Newcastle are always looking for ways to improve the service our Digital Exams Team can offer, and work with Inspera to share feature requests. The group has been sharing its priorities for features it would like to see in Inspera, so that we can pass these on to Inspera.

Inspera Priorities at Newcastle 

Maddie and Kimberly of the Digital Exams Team, along with some of our academics who use Inspera, took part in some user research with Inspera in relation to marking and feedback. 

In relation to marking and feedback, our key priorities are: 

  1. Release of feedback restrictions, i.e. flexibility to release essay feedback comments without releasing banks of MCQs. 
  1. Editing questions after an exam, i.e. where mistakes are found during marking. A request has been raised for answer key corrections (currently available for MCQs) to be extended to all auto-marked question types. 
  1. Data availability for questions used in exams. Inspera are currently working on their analytics, with a psychometric dashboard currently available for the Inspera Digital Exam Leads to access. 

Other feature requests we support, which are in relation to question authoring, include: 

  • Author tab to include example questions and question sets. 
  • ‘Tick box’ to confirm when a question set has been finalised.   
  • Ability to have a template of example question sets to share across users.   
  • Being able to collapse sections in assessments to avoid having to scroll up and down when editing one of the later sections. 

If you would like to know more about our priorities at Newcastle, you can get in touch with the Digital Exams Team via: Digital.Exams@newcastle.ac.uk.  

Inspera: Marking Hints/Tips & New Community

Marking Hints & Tips 

You can check out some of our hints and tips for marking Inspera Digital Exams below, or visit our dedicated website.

  1. To access your exam(s) to mark, click the link within the assignment in Canvas. You must be a teacher or teaching assistant on the Canvas module; this will attach you to the exam in Inspera. 
  1. If you need to search for a specific student, use the search bar within the Inspera 'Grading Overview' section – you can search by student ID. 
  1. To download raw marks from Inspera as an Excel file, click the 'Options' button at the top right of the Grade screen, navigate to 'Downloads' and select 'Marks and Explanations as Excel file'. 
  1. Digital Exams are initially set up with the Canvas assignment associated with your Inspera exam marked out of 100 points. (If you'd like students to see raw marks, please edit the Canvas assignment points to match your total Inspera marks – see the sketch after this list.) 
  1. For manually marked questions, Markers can add Annotations to student submissions. (Within the student's submitted text, click and hold the left mouse button to select the text you want to annotate, then click Annotate.) 
  1. Once marking is complete in Inspera, don't forget to Confirm Marks; this completes the grading step and pushes the completed marks from Inspera to the Canvas Gradebook. 
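
As a quick illustration of the raw marks point above, the sketch below shows, with hypothetical numbers, how an Inspera raw mark would appear in Canvas if the assignment points are left at 100. It assumes the mark passed back to Canvas is scaled proportionally to the Canvas points value; this is an illustration only, not Inspera or Canvas documentation.

```python
# Hypothetical illustration: how an Inspera raw mark might appear in Canvas when the
# Canvas assignment points do not match the Inspera total.
# Assumption (for illustration only): the mark passed back to Canvas is scaled
# proportionally to the Canvas assignment's points value.

def canvas_display_mark(raw_mark: float, inspera_total: float, canvas_points: float) -> float:
    """Scale an Inspera raw mark to the Canvas assignment's points value."""
    return raw_mark / inspera_total * canvas_points

# Example: an exam marked out of 60 in Inspera, Canvas assignment left at 100 points.
print(canvas_display_mark(45, 60, 100))  # 75.0 -> students would see 75/100, not the raw 45
print(canvas_display_mark(45, 60, 60))   # 45.0 -> matching the points shows the raw mark
```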

Video demonstrations: There are a range of marking videos available on the Inspera website which provide on-screen demonstrations of grading tasks. See Video guides for Markers

Feedback Release: If you would like to release feedback to your students on your auto or manually marked Inspera questions, check out our dedicated feedback release webpage.  

Further questions? If you have any questions about marking an Inspera exam, please contact the Digital Exams Team via Digital.Exams@newcastle.ac.uk    

Internal Inspera Teams Community (Digital Exams using Inspera Assessment @ Newcastle) 

The Digital Exams Team have set up a Newcastle University community via a new Teams group. This new community will: 

  1. Enable colleagues who use Inspera for their Digital Exams to ask general questions in a dedicated area 
  1. Create a space for colleagues to share ideas and best practice and/or learn from others' experience. 
  1. Build a supportive community network of like-minded colleagues. 
  1. Enable the Digital Exams Team to share updates about Inspera such as feature developments, improvements and Roadmaps. 

The Digital Exams Team will monitor this community to help members enhance their Inspera digital exams and to provide support for queries relevant across all disciplines.

You can request to join the Inspera @ Newcastle Teams site.

If you need support with specific queries relating to your own exam(s), please email digital.exams@newcastle.ac.uk

E-Assessment in Mathematical Sciences (EAMS) 2025 Conference


The E-Assessment in Mathematical Sciences (EAMS) conference takes place between 16th and 27th June 2025.

Organised by the team behind our Numbas e-assessment system, the conference aims to bring together researchers and practitioners with an interest in e-assessment for mathematics and the sciences. It will consist of a mix of presentations of new techniques and pedagogic research, as well as workshops where you can get hands-on with leading e-assessment software.

EAMS 2025 is an entirely online conference, with a mix of live sessions and web-based activities, and plenty of opportunity for discussion and collaboration. 

Before the conference starts, there will be a programme of optional training workshops available for participants to get hands-on with state-of-the-art maths e-assessment software.  

Live talks will take place over Zoom at 9:30 and 15:30 BST (UTC +1) each weekday, with recordings available later. The online format and longer timescale allow participants to engage more deeply with the material presented. 

The call for talk and workshop proposals is currently open. If you have some research or an innovative technique related to mathematical e-assessment that you would like to present, then please submit an abstract at eams.ncl.ac.uk/call-for-speakers by 2nd May. 

We’re actively seeking to increase the diversity of our attendees and speakers, and particularly encourage speakers from groups under-represented in previous editions of EAMS to submit proposals. 

To attend the conference, please register for free at eams.ncl.ac.uk/register.  

Inspera New Feature: Multiple Attempts 

Multiple Attempts is a feature which supports formative, auto-marked Inspera digital exams. Module teams can now allow their students to take practice auto-marked Inspera exams repeatedly, either by letting students submit as many times as they wish or by setting a defined number of retakes. Please be aware that there is no option to lock down an exam using Multiple Attempts.

Multiple Attempts can help students to learn rapidly and understand topics by allowing them to practice until answers are correct. This can also allow for a dynamic and effective learning experience. 

Students can: 

  • Improve their understanding of the topic by practicing multiple times 
  • Increase their confidence by identifying and correcting mistakes 
  • Prepare more effectively for exams by identifying improvement areas 

Multiple Attempts can currently only be used with Inspera digital exams which are using automatically marked questions. The feature can also be used with pre-defined feedback. Pre-defined feedback is recommended if using multiple attempts, as this allows students to improve their understanding of the exam content. 

Setting up Multiple Attempts 

  1. Within the Deliver Tab on Inspera, you will need to edit the exam settings and click ‘Enable Multiple Attempts’. 
  1. You will then be prompted to set a maximum number of attempts. If you wish students to have an unlimited number of attempts, click 'Unlimited Attempts'. 
  1. Under 'Setting final result', choose the most appropriate option for your exam; this determines the result used in student feedback (a short illustrative sketch follows this list). The options are: 
  • Highest: The highest score achieved among all attempts will be the final result. 
  • Average: The average score obtained across all attempts will be calculated and used as the final result. 
  • Latest: The most recent score from the student’s attempts will be the final result. 
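
As a quick illustration of how each 'Setting final result' option resolves a set of attempt scores, here is a minimal sketch with hypothetical numbers; this is not Inspera code, just the arithmetic behind each option.

```python
# Hypothetical attempt scores for one student, in the order they were submitted.
attempt_scores = [55.0, 72.5, 68.0]

final_highest = max(attempt_scores)                         # Highest -> 72.5
final_average = sum(attempt_scores) / len(attempt_scores)   # Average -> 65.17 (rounded)
final_latest = attempt_scores[-1]                           # Latest  -> 68.0

print(final_highest, round(final_average, 2), final_latest)
```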

Student feedback when using Multiple Attempts 

When using Multiple Attempts, feedback should be set to be released to students immediately after each attempt. Immediate feedback enables students to see full details of their attempt straight away and build on it for the next attempt.

Full details about Multiple Attempts on Inspera can be found via: Content Creation Features in Inspera | Learning and Teaching @ Newcastle | Newcastle University 

Question Set Functionality in Inspera Digital Exams – Part 2

Introduction 

As we approach the Semester 2 (24/25) question set deadline on 7th March 2025, the Digital Exams Team are sharing another post about question set functionality in Inspera.  

You may also wish to check out our first blog post from Semester 1: Question Set Functionality in Inspera Digital Exams – Part 1. That post covers much of the functionality around creating your questions and enabling features such as randomisation, random pulling and candidate-selected questions. In this new post, we will cover some of the other functionality available, including:

  • sharing question sets with colleagues 
  • accessing and printing PDF copies of the question set 
  • duplicating your question set. 

Sharing question sets with colleagues 

There are a couple of ways you can share your question set content with other colleagues or your external examiners. 

For other colleagues on your module team, you may find it useful to add them as a ‘contributor’ to your question set. This will allow them to view questions and preview them in Inspera. This is also particularly useful if you are creating the content together with another colleague. 

How to add contributors to your question set:
  1. Open Inspera (this can be done by clicking on your previous exam within the Canvas assignment area) 
  1. Click on the Author tab 
  1. In the search bar, type the question set name/module name 
  1. Find the question set you wish to share and click on the name to open it 
  1. Select the person icon (on the left hand side of the icons in the top right corner) 
  1. Search for the person's name and select them from the drop-down menu that appears 
  1. If you would like them to receive an email about this, please keep the 'notify via e-mail' box ticked 
  1. Click 'Share'. 

Please note: if you cannot find a user when searching for their name, it is likely that they have not yet accessed Inspera via Canvas and therefore do not have an Inspera account. They would need to access Inspera via Canvas to activate this and for their name to show in the list. There is guidance available on the self-enrol Inspera guidance course on Canvas.

For external examiners, you can also add them as contributors and ask them to access Inspera via Canvas. Alternatively, you may wish to download a PDF copy of the question set content from Inspera to share with them.

Accessing and printing question sets as PDFs 

Within your question set in Inspera, it is possible to access and print your question set as a PDF. This could be to share a copy with External Examiners, or to create a paper copy if you are hosting a mock exam and want a backup paper copy available just in case.

How to print your question set:
  1. Open Inspera (this can be done by clicking on your previous exam within the Canvas assignment area) 
  1. Click on the Author tab 
  1. In the search bar, type the question set name/module name 
  1. Find the question set you wish to print and click on the name to open it 
  1. Select the print icon in the top right corner 
  1. On the right hand side, click 'Download question set' 
  1. The download may take a minute or two to prepare; once ready, select 'Download now' 
  1. The file will be saved to your Downloads folder 

Should you wish to change what is included in your PDF, you can edit the Settings: once within the Print screen, expand the Settings bar.

Settings allow you to: 

  • remove maximum marks 
  • remove 'documents' or instruction pages 
  • remove certain questions from the PDF (i.e. hide auto-marked questions) 

How to edit the PDF of your question set 

You can edit your PDF using ABBYY FineReader. You can familiarise yourself with the FineReader video guides and detailed written user guides which have been shared by NUIT.

How to duplicate your question set 

Once you have created a question set, you can re-use it. The Digital Exams Team recommend duplicating your Question Set and then editing the copy; this means there's a clear audit trail of the Question Set used in prior years.

How to duplicate your question set:
  1. Open Inspera (this can be done by clicking on your previous exam within the Canvas assignment area) 
  1. Click on the Author tab 
  1. In the search bar, type the Question Set name/module name (in this example a ‘demo’ Question Set is being searched for) 
  1. Find the question set you wish to duplicate and click into the tick box (on the left hand side) 
  1. An options bar appears along the bottom, click ‘duplicate’ 
  1. A message will appear about the duplication, click ‘continue’ 
  1. Click into this new ‘copy of’ question set 
  1. You can re-label the name of the question set in the top left corner using the pencil icon. 

You can now make any changes to the Question set for an upcoming exam. 

Question set deadlines

Once the Module Leader has completed their Digital Exam Form to confirm the details about their Inspera Digital Exam, the next step is to start creating their question sets.  

Question sets are created by the module team and shared with the Digital Exams Team via email to Digital.Exams@newcastle.ac.uk.   

The question set deadlines for 2024/25 are detailed in the following table:  

| Semester | Question Set Deadline |
| --- | --- |
| Semester 1 | 15th November |
| Semester 2 | 7th March |
| Semester 1 August Resit (Semester 3) | 17th April |
| Semester 2 August Resit (Semester 3) | 8th July |

Training webinars  

The Digital Exams Team deliver a training webinar, 'Creating and managing exam questions in Inspera', which colleagues can sign up to via the Elements training system. The training session includes an on-screen demo of setting up and adding questions to your question set.

If you have any queries about creating your question set, you can head over to our dedicated Creating Questions and Content Creation Features webpages. You can also contact the Digital Exams Team via Digital.Exams@newcastle.ac.uk.

Digital Accessibility Demo Day – 5th March 2025

What does “accessible” mean?

What difficulties do students have accessing the material we provide?

How do students surmount those difficulties?

How do you improve the accessibility of your material?

We’re putting on an event to help answer those questions.

It’s important that all of our digital services are accessible to their users, whether they’re students or colleagues. The Public Sector Bodies Accessibility Regulations set out some legal requirements that we must meet.

But digital accessibility is a complex topic and many colleagues have found it hard to understand what they need to do to ensure their teaching material is accessible.

At our digital accessibility demo day, you can have a go at accessing university teaching material at a selection of stations simulating different access requirements and supports, including:

  • Screen reader
  • Speech to text
  • Keyboard-only interaction
  • Low vision
  • Low mobility
  • Magnification
  • Canvas Ally

We’ll have plenty of pointers to guidance and training opportunities to help you improve the accessibility of your material.

People from LTDS, NUIT and the Disability Interest Group will be there to offer support and answer any questions you may have about digital accessibility.

Time and location

The event will take place 13:00 – 15:00 on Wednesday 5th March 2025, in the Boiler House.

The Boiler House is in the middle of campus, between the Armstrong Building and the Student Union. Access is step-free.

There’s no presentation as part of the session – just drop in and talk to one of the facilitators.

SpeedGrader Update – Coming in Spring 2025


What’s New?

In Spring 2025, there will be some updates to Canvas SpeedGrader. This update makes SpeedGrader faster and more stable, while keeping the interface easy to use. The grading process you know will stay the same, but with some improvements behind the scenes.

Previously, courses with large cohorts or assignments with large file submissions experienced frustratingly slow loading times. This update aims to enhance SpeedGrader’s performance, making navigation quicker and more efficient.

In addition to performance updates, there will be minor interface changes to assist with navigation. Although small, these changes will help with the usability of SpeedGrader. After these changes, the interface will still have the familiar SpeedGrader feel.

Let’s dive into the changes made to Canvas SpeedGrader…

Sections Selector Dropdown

The section selector now has a streamlined interface, making it easier to navigate between different class sections. Previously, filtering by section required more steps. With the new Sections Selector Dropdown, you can quickly filter submissions by section.

In the Student Dropdown List, you’ll now see a Sections header. Under ‘Showing,’ you’ll find the current section that the list is filtered to (point 1).

To apply a new section filter, click on the Section filter (point 2). A dropdown list will then appear, as shown below:

In the dropdown list, you’ll see all the sections associated with the assignment. A tick mark will indicate the section currently applied as the filter (point 3).

To choose a new section filter, click on the name of the desired section (point 4).


No Submission Alerts

The alert for assignments without submissions has been enhanced to be more prominent and visually clear.

Previously, this would be indicated with the assignment showing as blank in the DocViewer. It is now clearly indicated that there is no submission.

As shown below (point 1), this is now clearly displayed in the DocViewer.


Grade Status Selector

Changing the status of a submission is now easier with a new dropdown box. However, it’s generally not recommended to use this feature, as our assignment statuses are tracked via the NESS system.

Previously, this status was managed by a pencil icon located in the top corner of the marking pane in SpeedGrader.

To change a submission status, click on the dropdown box and select the appropriate status.

This is demonstrated in point 1 below:


Rubrics

Rubrics are now consistently displayed in the new traditional (grid) view. This view is very similar to the rubrics you’re used to marking with, though there are some minor changes.

The Instructor score is now displayed at the top of the rubric, making it easier to see while marking an assignment (point 1).

Providing feedback for rubric criteria is now easier with the feedback entry box clearly displayed (point 2). Previously, you had to access this feature via a button. Having the feedback option readily available encourages more frequent addition of comments to rubric criteria.


Media Attachments

Uploading and managing media attachments in submission comments is now more intuitive, thanks to an improved dialogue and a more straightforward deletion process.

Deleting an attachment has been made more intuitive with the introduction of a rubbish bin icon, replacing the previous red ‘x’ button (point 1). This change not only modernises the interface but also makes the deletion process clearer and more user-friendly. The rubbish bin icon is universally recognised, ensuring that users can easily identify and use this function without confusion.