Canvas October 2024 Updates

Over October there has been a series of updates to various features within Canvas, including New Quizzes, Assignments and Discussions.

In this blog post, we will cover what is new in Canvas for October 2024.

How to Navigate this Update

This update is broken down into pages dedicated to each Canvas feature.

By selecting the page numbers below, you will be taken to the updates for each feature.

Included in this update are:

New Quizzes Updates (Page 2)

  • Add Time to Existing Quiz Sessions
  • Manage Student Result View
  • Submitted Date Displays in Moderate Log
  • Attempt Log Stopped Viewing the Canvas Quiz Page

Assignment Updates (Page 3)

  • Assign to Interface Change – “Assign To” Location Change

Discussions Updates (Page 4)

  • Edit button added to Discussions Index Page

SpeedGrader Updates

Over the summer there has been a series of updates to the SpeedGrader tool within Canvas to improve the ability to provide feedback.

In this blog post, we will highlight some of the key changes to the SpeedGrader and how you can utilise these changes in your courses.


Submission Comment Drafts

In SpeedGrader, if a submission comment has been added but not yet saved, a Draft pill displays alongside the comment and a warning message alerts the teacher that the comment has not been saved.

Previously there was no clear indication of whether a comment had been submitted, which could lead to students not seeing comments/feedback on their assignments.

With this update, teachers can clearly see the status of a submission comment.

You can see in the example below that the submission comment has not been submitted and we have a draft pill alongside our comment:

When we press submit on this comment, the draft pill disappears which means the comment is visible to the student:


Rich Content Editor (RCE) In Submission Comments

In SpeedGrader, some Rich Content Editor (RCE) features are available when using submission comments. The available RCE features include:

  • Heading
  • Bold
  • Italic
  • Underline
  • Font colour
  • Insert Hyperlink
  • Bullets

This allows teachers to style feedback and provide further resources via linking. In the example below, you can see a link is provided to further resources to assist the student:

This functionality is available at the top of the submission comments box as demonstrated below:


Equation Editor in Submission Comments

In SpeedGrader, an Equation Editor function has been added to the Rich Content Editor. This feature enables instructors to incorporate math equations into their submission comments.

In the below example, you can see the new equation editor function within the submission comments in SpeedGrader:


Randomise Students in Submission List

In SpeedGrader Settings, instructors now have the option to randomise the order of students within each submission status. This update helps mitigate grading fatigue and biases by ensuring a random sorting of students. Additionally, it enhances grading efficiency by maintaining this random order within submission statuses.

Below are step by step instructions on how to do this:

  1. In the top left corner of the SpeedGrader, select the cog icon
  2. From the dropdown menu select “Options”
  3. Within SpeedGrader options, select “randomise students within a submission status”
  4. Select the “Save settings” button

Please note that when students are randomised, the preference is saved as the default in the browser for that course. When logging in on another device, instructors must select the sort option again.
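Conceptually, this behaves like shuffling within groups. The sketch below is purely illustrative — it is not Canvas's actual implementation, and the student names and statuses are invented — but it shows the idea of randomising student order while keeping each submission status grouped together:

```python
import random
from collections import defaultdict

# Hypothetical data: (student, submission status) pairs - illustrative only.
submissions = [
    ("Alice", "graded"), ("Bob", "not_graded"), ("Cara", "graded"),
    ("Dan", "not_submitted"), ("Eve", "not_graded"), ("Fay", "graded"),
]

def randomise_within_status(subs, seed=None):
    """Shuffle students inside each submission status, keeping the
    statuses themselves in their original relative order."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    status_order = []
    for student, status in subs:
        if status not in groups:
            status_order.append(status)  # remember first-seen status order
        groups[status].append(student)
    for status in status_order:
        rng.shuffle(groups[status])  # random order only within the group
    return [(s, status) for status in status_order for s in groups[status]]

print(randomise_within_status(submissions, seed=42))
```

The key point is that the shuffle happens per status group, so a marker still works through all "not graded" submissions together, just in an unpredictable student order.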

Question Set Functionality in Inspera Digital Exams

What is a question set? 

A question set is a group of questions created in Inspera. In Inspera you can make individual questions, or you can set up a question set and create a group of questions in there. Individual questions can also be imported into your question set.

Creating your question set

Accessing Inspera 

All colleagues (and students) access Inspera via Canvas. Colleagues can either access a ‘test’ exam set up in their Canvas Sandbox area, or go through an existing Inspera exam. 

Naming your question set 

You should give question sets meaningful names, e.g. ‘MOD1234 Semester 1 Exam 202425’. This makes it easier to identify the question set for the current academic year. Question sets are created in the Author tab of Inspera.

Naming your individual questions 

There are a variety of question types you can create in Inspera. For more information, you can see the full list of automatically marked and manually marked question types. 

Remember to rename your individual questions too, so that you can easily find them in the question set. Each new question is automatically given the name ‘New Question’, so if you are creating question sets containing multiple questions, giving each a meaningful name will help you find specific ones later. 

If you need to update a question, a meaningful name will make it easier and quicker to locate. 

Adding labels to your questions

Adding labels to your individual questions can also help with locating and re-using your questions in other question sets. 

If you’d like to know about adding labels and using filters to find questions, please see our dedicated label filtering blog post. 

Content creation in question sets 

There are various features you can use within Inspera to enhance your question sets. These are optional features you can apply within your question set. 

For example, you can randomise the order in which the questions appear for students when they sit the exam. You can also use the random pulling feature to pull a sub-set of questions from a larger bank of questions, so that each student receives a different combination of questions. 

If you are using both manually marked essay questions and a set of multiple choice questions, you can place these into what are known as sections. This means you could apply randomisation to the multiple-choice questions only. You could also use another feature on the essay questions, known as candidate selected questions.
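As a rough sketch of what random pulling amounts to — this is not Inspera's internal code, and the question names are invented — each candidate is dealt a fixed-size random sub-set from a larger bank:

```python
import random

# Hypothetical question bank - names are illustrative, not from Inspera.
question_bank = [f"Q{i}" for i in range(1, 21)]  # bank of 20 questions

def pull_questions(bank, n, candidate_id):
    """Give each candidate a reproducible random sub-set of n questions."""
    rng = random.Random(candidate_id)  # seed per candidate so re-runs match
    return rng.sample(bank, n)

# Two candidates (usually) receive different combinations of 5 questions.
print(pull_questions(question_bank, 5, candidate_id=101))
print(pull_questions(question_bank, 5, candidate_id=202))
```

Because each candidate's draw is sampled without replacement from the same bank, every student answers the same number of questions, but the combinations differ between students.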

There is a dedicated Content Creation Features website on our Learning and Teaching Inspera site which lists the full details on these different features. 

Question set deadlines 

Once the Module Leader has completed their Digital Exam Form to confirm the details about their Inspera Digital Exam, the next step is to start creating their question sets. 

Question sets are created by the module team and shared with the Digital Exams Team via email to Digital.Exams@newcastle.ac.uk.  

The question set deadlines for 2024/25 are: 

  • Semester 1: 15th November 
  • Semester 2: 7th March 
  • Semester 1 August Resit (Semester 3): 17th April 
  • Semester 2 August Resit (Semester 3): 8th July 

Training webinars 

The Digital Exams Team deliver a training webinar, Creating and managing exam questions in Inspera, which colleagues can sign up to via the Elements training system. The session includes an on-screen demo of setting up and adding questions to your question set. 

Digital Assessment Upcoming Training Webinars

The Digital Assessment Team in LTDS have various training webinars covering our Digital Assessment tools starting in the next few weeks.

You can check out the full list of sessions, dates/times and the links to book in our list of sessions.

Inspera Digital Exams

Inspera for Professional Service colleagues

6 November 2024, 10:00 AM – 11:00 AM

Creating and managing exam questions in Inspera

2 October 2024, 2:00 PM – 3:00 PM

7 November 2024, 9:00 AM – 10:00 AM

Marking an Inspera exam with auto marked questions

16 December 2024, 11:00 AM – 12:00 PM

22 January 2025, 9:00 AM – 10:00 AM

Marking and moderating an Inspera exam with manually marked questions

17 December 2024, 11:00 AM – 12:00 PM

23 January 2025, 9:00 AM – 10:00 AM

Digital Assignments: Canvas and Turnitin

Creating and Managing Digital Assignments

2 October 2024, 9:00 AM – 10:00 AM

Online Marking & Feedback (Canvas)

4 December 2024, 9:00 AM – 10:00 AM

Online marking and feedback (Turnitin)

11 December 2024, 9:00 AM – 10:00 AM

Inspera Marking: Hints and Tips

As we enter the assessment marking period, the Digital Exams Team want to share some marking ‘hints and tips’ for Inspera Digital Exams. Check out some of our hints and tips listed below. 

Hints and Tips

  1. To attach yourself to an exam as a grader, make sure you click the link from the Inspera assignment point in Canvas. This takes you into the ‘Deliver’ area of the exam, where you can click the ‘Open in Grade’ button to enter the ‘Grade’ area. 
  2. If you need to search for a specific student, within the ‘Overview’ section of the Grade area you can search for a student number to locate their submission. In the screenshot below, ‘stutestX’ is a placeholder for a student ID. In your exams you will see student numbers listed instead. 
  3. If you are in the Grade area and need to go back to the Deliver area (for example, to set the feedback settings), there is a shortcut available. Click the ‘Options’ button at the top of the screen, navigate to ‘Shortcuts’, select ‘Deliver’ and click ‘Open test in Deliver’. 
  4. It is possible to download raw marks from Inspera as an Excel file. Click the ‘Options’ button at the top of the screen, navigate to ‘Downloads’ and select ‘Marks as Excel file’. 
  5. As standard, the Digital Exams Team will set up the Canvas assignment associated with your Inspera exam as 100 points. This means (once released) students will view their Canvas Gradebook mark as a proportion. If you’d like students to see raw marks, please edit the Canvas assignment points to match your total Inspera marks. 
  6. For manually marked questions, graders can add annotations to student submissions. Within student submitted text, hold the left mouse button and drag across the text you want to annotate, then click Annotate: 
  7. Within the Grade ‘marking’ area there is now a student search icon available to all graders. When marking, use the bottom panel to navigate to specific students using their ID. For example: 

Further Support

Webinars

The Digital Exams Team run two dedicated marking webinars which colleagues can book onto: 

These training webinars cover a range of marking workflows, including how to amend auto-marked questions and adding annotations to manually marked questions such as essays. 

Videos

There are a range of marking videos available on the Inspera L&T website which provide on-screen demonstrations of grading tasks. 

Feedback Release

If you would like to release feedback to your students on your auto or manually marked Inspera questions, check out our dedicated webpage on Inspera Feedback Release for further information.

Further questions?

If you have any questions about marking an Inspera exam, please contact the Digital Exams Team via Digital.Exams@newcastle.ac.uk.  

If you have any hints or tips that you think we could add to the above list, please do share them with the Digital Exams Team. 

AI in Education: The Art of The Possible

26-30 June 2023


Artificial Intelligence is this year’s hot topic for our Art of the Possible week 26-30 June 2023. 

We will be offering a series of in-person, online and asynchronous opportunities to join the conversation, share ideas and reflect on the ways AI affects education.  

Save the time in your diaries to join in and hear from external speakers and colleagues, and to experiment with a range of AI tools.  

Schedule 

Monday 26 June 

  • Embracing the AI Landscape: Debbie Kemp from the University of Kent will open our week, sharing and reflecting on how she has incorporated AI in her teaching and assessment. 
    Online 10:00-10:45 
  • Introduction to AI: a one-hour overview from LTDS and FMS TEL colleagues.   
    In person 14:00-15:00 

Wednesday 28 June 

  • AI and Assessment: a one-hour session exploring the impact of AI on assessment.  
    In person 10:00-11:00 
  • Embracing AI @Newcastle: find out how colleagues at Newcastle University are embracing AI in their teaching and learning.  
    Online 14:00-15:00 

Thursday 29 June 

  • Hands on Explore AI Tools: Join us in the Herschel Learning lab to try out a range of AI tools. 
    In-person, bring your own device: 10:00-11:30 
  • Microsoft 365 and AI: Join the NUIT Digital Adoption team for an overview of what is currently possible, and what the future holds, for AI in Microsoft 365.  
    Online 14:00-15:00 

Friday 30 June 

Get involved 

We will be blogging over the week, gathering questions, and sharing comments and recordings on our Learning and Teaching Development Blog, so come back for updates.   

Visualising programme level assessment

As part of our Assessment and Feedback Sprint Series, a small team of students and colleagues has been investigating the question: 

How do we articulate a meaningful programme experience that ensures a cohesive assessment journey for all of our students?

Feedback (Stage Surveys, NSS etc.,) tells us that students and colleagues struggle to see assessments from a programme perspective and this disconnection can lead students to feel like assessment isn’t part of a wider programme and that their skills/feedback don’t link across modules and assessments.  

Being able to visualise the assessment journey across a stage or programme is important because, as one colleague said,

“An assessment journey builds confidence in the education (and the education provider) and underscores the importance of each individual assessment towards an overarching goal. Articulation of assessment journeys allows for broader reflection and helps explain the skill development (rather than focussing on siloed, module specific content).”

An overview of some of the visuals we found from within Newcastle University and other HE Institutions are shown below. In summary, we found a range of approaches, often highlighting the ‘journey’ through the stage or programme, making it easier for students to reflect on progress. 

What have we created?

Using these findings, we created some template visuals, which were then validated by colleagues and students, incorporating feedback from our first showcase.

We decided to create a variety of templates to reflect diverse practices/skillsets across programmes and areas. Some are more suitable for Semester-based programmes and others for block-taught programmes. 

You can explore these yourself:

We started by looking at a standard linear stage one programme – V400 BA Archaeology. We initially had a large amount of text on the visual explaining each assessment and how it aligned to the wider programme learning objectives. However, it quickly began to look overwhelming.

We then started to explore using H5P as a way to keep the visual relatively simple but incorporate pop-up boxes to make it more interactive and engaging. The version below has dummy text – click on the question marks to see how it would work.

We also considered how to visually represent a block-taught postgraduate programme and incorporated feedback from a Degree Programme Director (DPD) to represent larger-weighted modules with bigger circles. The DPD said this would be a useful tool for both staff and students including at recruitment and Induction events. 

The intention is that these editable templates will be useful for both students and programme teams to visualise assessment across a programme or stage. The visual could be produced as part of a workshop reviewing programme level assessment or could be a standalone tool designed to be student-facing. 

Find out more about our Sprint

We presented our Sprint adventures at the Sprint Showcase event on Friday 10 March, and you can watch the recording here:

To find out more about the Assessment and Feedback Sprint Programme contact Conny.Zelic@ncl.ac.uk in the Strategic Projects and Change Team.

New functionality for Inspera digital exams: question choice and marking rubrics

Inspera assessment is the University’s system for centrally supported digital exams. Inspera can be used for automatically marked exam questions, for manually marked question types including essays, or for exams with a combination of both.

New functionality has recently been launched that enables colleagues to do more with digital written exams.

Question choice for students

Candidate selected questions is used to give students taking your exam a choice of which questions to answer from a list.

For example, in an exam where students need to answer 2 essay questions from a list of 6, you can set this up so that a student can choose a maximum of 2 questions to answer.

How does it work for a student?

If candidate selected questions is used in an Inspera exam the student sees information above each question that shows how many questions to select in total, and how many they have already selected. To choose a question to answer they change the ‘Answering this question?’ drop down box to yes.

Screenshot showing student view of Inspera, with the option to choose whether to answer a question. Below the question title is some text which reads 'Answering this question? 0 of 2 questions selected.' There is a drop down box at the right of the text with the options 'Yes', 'No', 'Undecided' available to select.

If a student starts answering a question without changing the ‘Answering this question?’ drop down box, Inspera automatically changes it to ‘Yes’.

When they have selected the maximum number of questions, the student cannot start answering any more questions. However, if they change their mind about which question(s) they want to answer, they can simply change the ‘Answering this question?’ drop down to no, and select a different question instead.

How does it work for a marker?

A marker only sees answers to the questions that a student has chosen to answer.

As students can only submit answers for the maximum number of questions they are allowed to choose, this means you can say goodbye to the dilemma of trying to work out which questions to mark when a student has misread the instructions and answered too many questions!

How can I use it in my exam?

The Candidate selected questions function is available when you are authoring a question set for an Inspera digital exam. Find out more in the Inspera guide for Candidate selected questions.

Rubrics for marking

You can now create a rubric to use for marking any manually marked question type in Inspera. Rubrics allow you to build the assessment criteria for an exam question into Inspera, and use them in your marking.

Choose whether you want to use a quantitative rubric to calculate the mark for a question, or a qualitative rubric as an evaluation and feedback tool, and then manually assign the mark.

How to introduce a rubric for your exam

  1. When you are creating the exam question in Inspera, set up the rubric you want to use for marking that question. The Inspera guide to rubrics for question authors explains how to create a rubric and add it to your exam question.
  2. After the exam has taken place, use the rubric to mark the students’ answers.
  3. If you’ve chosen to use one of the quantitative rubric types, as you complete it the student’s mark for the question will automatically be calculated. If you’ve chosen a qualitative rubric, once you’ve completed the rubric use it to evaluate the student’s answer and help you decide on their mark for the question.
  4. You can choose to add feedback to the candidate in the box below the level of performance you’ve selected for each criterion (you can see an example of this in the image below).
Screenshot of the Grader view of a sample points-range rubric in Inspera.
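As a minimal sketch of how a quantitative rubric arrives at a mark — the criteria, levels and points below are invented for illustration, not taken from Inspera — the question mark is simply the sum of the points for the performance level selected against each criterion:

```python
# Hypothetical rubric: criterion -> points available at each performance level.
rubric = {
    "Argument":  {"excellent": 10, "good": 7, "developing": 4},
    "Evidence":  {"excellent": 10, "good": 7, "developing": 4},
    "Structure": {"excellent": 5,  "good": 3, "developing": 1},
}

def rubric_mark(rubric, selections):
    """Sum the points of the performance level selected for each criterion."""
    return sum(rubric[criterion][level] for criterion, level in selections.items())

# A grader's selections for one student's answer.
selections = {"Argument": "good", "Evidence": "excellent", "Structure": "developing"}
print(rubric_mark(rubric, selections))  # 7 + 10 + 1 = 18
```

A qualitative rubric skips this calculation step: the completed grid is used as an evaluation and feedback aid, and the grader assigns the mark manually.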

Want to learn more about using Inspera for digital exams?

Come along to a webinar to learn about creating exam questions or marking in Inspera.

Enrol onto the Inspera Guidance course in Canvas to learn about Inspera functionality at your own pace.

Find out about the process to prepare an Inspera digital exam, and how the Digital Assessment Service can help on the Inspera webpage.

Contact digital.exams@newcastle.ac.uk if you have questions or would like to discuss how you could use Inspera for a digital exam on your module.

Students evaluate using Inspera for 21/22 Digital Exams

Inspera Assessment, the University’s system for centrally supported digital exams, launched for the 21/22 academic year. A key part of understanding how we better use digital exams is to consider ways to improve the student experience of taking a digital exam. Following the launch, the Learning and Teaching Development Service (LTDS) asked for student feedback from those who took a digital exam in 21/22.

142 students submitted their feedback.

Here are our findings:

65% of students were somewhat or very satisfied with their overall experience of taking their exam using Inspera.

A pie chart titled ‘How satisfied are you with the experience of taking your exam(s) using Inspera?’ depicts that students reflected their experience(s) as:
1. Very dissatisfied 11%.
2. Somewhat dissatisfied 14%.
3. Neither satisfied nor dissatisfied 10%.
4. Somewhat satisfied 30%.  
5. Very satisfied 35%.
Results of the Inspera Student Evaluation

How easy is Inspera to use?

81% of students found starting their Inspera exam somewhat or very easy.

80% of students found handing in/submitting their Inspera exam somewhat or very easy.

When asked to compare a written exam paper and an Inspera paper which included written questions where students could type their answers, 63% of students stated they found it somewhat or much better using Inspera.

Is Inspera better for Take Home or on Campus PC cluster exams?

85% of students were somewhat or very satisfied with their overall experience of using Inspera for their take home exam(s).

73% of students were somewhat or very satisfied with their overall experience of using Inspera for their PC Cluster exam(s).

Thoughts for the future

Inspera seems to be a hit with students overall; the experience of using it is largely positive, with Inspera Take Home papers gaining the highest satisfaction scores. PC Cluster Inspera exam satisfaction scores showed the majority of students were satisfied with their overall experience. Feedback clearly indicated many students felt re-editing written answers works well in Inspera (and is better than trying to edit paper based written exams).

The most common concern raised was around plagiarism. LTDS is keen to work with colleagues to alleviate student concerns and ensure that the provision is developed and supported going forward.

LTDS opened its provision for digital exams to all modules, and the number of planned digital exams for 22/23 has increased.

To better support students before their exam, the LTDS recommend students practise with Inspera. Our survey showed 60% of students tried at least one demo before their main exam; we’d like to get that figure up! Practice exams can help with learning to use the tool and they are accessible via Canvas.

Try it out:

Student Inspera Demo Course

Announcing the University’s new Digital Exam System: Inspera Assessment

In September 2021 we will be launching a new system for centrally supported digital exams, called Inspera Assessment. Implementing the system will enable the Digital Exam Service to: 

  • Deliver secure locked down present-in-person exams on University computers and students’ own laptops, monitored by University invigilators 
  • Ensure that digital exams are accessible to all our students, and enhance the student experience of exams 
  • Increase the University’s digital exam capacity in the long term 
  • Enable more authentic exams by introducing new functionality

New exam types possible with Inspera will include:  

  • Students taking written exams online, by typing their answers on computer, and incorporating drawings or written calculations done on paper into their online answers where needed. 
  • Allowing access to specific online resources or applications during a secure exam, using allow listing functionality. 

Introducing Inspera is a big step forward for education, assessment and feedback at Newcastle University.  Adopting a specialist digital exam system allows us to do much more than would be possible if we continued to use the Virtual Learning Environment for digital exams.

Choosing a digital exam system 

Inspera has been selected as our digital exam system following a rigorous procurement process, which began with requirements mapping workshops in February 2020, attended by over 60 academic and professional services staff.  The procurement was postponed for a year as a result of the global pandemic, and restarted in semester 2 2020/21 when colleagues had the opportunity to feed in any new or updated requirements via an online survey.   

Once the tender was issued key digital exams stakeholders contributed to a rigorous evaluation process to decide on the system that best fit our requirements.  Students and staff were invited to volunteer for usability testing in each system that met the mandatory technical requirements. The team are very grateful to the 36 colleagues, and 13 undergraduate and postgraduate students, who completed a total of approximately 150 hours of usability testing between them! 

Inspera scored the highest overall for both usability, and for technical and functional requirements. 

Rolling out Inspera 

As standard all 2021/22 modules that have a present-in-person digital exam in MOFS will use Inspera.  If the public health situation requires, it will be possible for these modules to use the system for open book take home exams.

The Numbas maths assessment system remains an option for digital exams that need specialist mathematics functionality.

The system will be available for additional new digital exams from 2022/23 onwards.  There will be opportunities in the coming months to see demonstrations of the software, and to learn more about the new types of assessment it makes possible.  If you would like to learn more now, please contact digital.exams@newcastle.ac.uk.

How to get started  

The Digital Exams Service team will contact all 2021/22 module teams with a digital exam in their MOF at the beginning of September, with details of the process for preparing their exam. 

Training will also launch in September 2021, and all colleagues who will be using Inspera in the new academic year are encouraged to sign up.   

Online resources to help students prepare for a digital exam will be published in September, and students will also be able to try out a demo exam in Inspera to help familiarise themselves with the system. 

If you are interested in introducing a new digital exam using Inspera in future, or if you have any queries about a 2021/22 digital exam, please contact digital.exams@newcastle.ac.uk.