New functionality for Inspera digital exams: question choice and marking rubrics

Inspera assessment is the University’s system for centrally supported digital exams. Inspera can be used for automatically marked exam questions, for manually marked question types including essays, or for exams with a combination of both.

New functionality has recently been launched that enables colleagues to do more with digital written exams.

Question choice for students

Candidate selected questions gives students taking your exam a choice of which questions to answer from a list.

For example, in an exam where students need to answer 2 essay questions from a list of 6, you can set this up so that a student can choose a maximum of 2 questions to answer.

How does it work for a student?

If candidate selected questions is used in an Inspera exam, the student sees information above each question showing how many questions to select in total, and how many they have already selected. To choose a question to answer, they change the ‘Answering this question?’ drop-down box to ‘Yes’.

Screenshot showing the student view in Inspera, with the option to choose whether to answer a question. Below the question title is the text ‘Answering this question? 0 of 2 questions selected’, with a drop-down box at the right offering ‘Yes’, ‘No’ and ‘Undecided’.

If a student starts answering a question without changing the ‘Answering this question?’ drop-down box, Inspera automatically changes it to ‘Yes’.

When they have selected the maximum number of questions, the student cannot start answering any more. However, if they change their mind about which question(s) to answer, they can simply change the ‘Answering this question?’ drop-down to ‘No’ and select a different question instead.
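The selection rule is easy to picture as a small piece of logic. The sketch below is a minimal plain-Python illustration of the behaviour described above, not Inspera’s implementation; the names (QuestionChoice, start_answering and so on) are invented for the example.

```python
# Illustrative sketch of the candidate selected questions rule described
# above. Not Inspera's code: all names here are invented for the example.

class QuestionChoice:
    def __init__(self, question_ids, max_selections):
        self.max_selections = max_selections            # e.g. answer 2 of 6
        self.selected = {qid: False for qid in question_ids}

    @property
    def count(self):
        return sum(self.selected.values())

    def start_answering(self, qid):
        """Typing an answer auto-sets 'Answering this question?' to Yes."""
        if self.selected[qid]:
            return True                                 # already selected
        if self.count >= self.max_selections:
            return False                                # cap reached: blocked
        self.selected[qid] = True                       # auto-select
        return True

    def deselect(self, qid):
        """Changing the drop-down to No frees up a slot."""
        self.selected[qid] = False


choice = QuestionChoice(["Q1", "Q2", "Q3", "Q4", "Q5", "Q6"], max_selections=2)
assert choice.start_answering("Q1")      # auto-selected on first keystroke
assert choice.start_answering("Q2")      # second selection: cap now reached
assert not choice.start_answering("Q3")  # blocked until a slot is freed
choice.deselect("Q1")                    # change of mind about Q1...
assert choice.start_answering("Q3")      # ...so Q3 can now be answered
```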

How does it work for a marker?

A marker only sees answers to the questions that a student has chosen to answer.

Because students can only submit answers for the maximum number of questions they are allowed to choose, you can say goodbye to the dilemma of working out which questions to mark when a student has misread the instructions and answered too many!

How can I use it in my exam?

The Candidate selected questions function is available when you are authoring a question set for an Inspera digital exam. Find out more in the Inspera guide for Candidate selected questions.

Rubrics for marking

You can now create a rubric to use for marking any manually marked question type in Inspera. Rubrics allow you to build the assessment criteria for an exam question into Inspera, and use them in your marking.

Choose either a quantitative rubric, which calculates the mark for a question for you, or a qualitative rubric, which serves as an evaluation and feedback tool while you assign the mark manually.

How to introduce a rubric for your exam

  1. When you are creating the exam question in Inspera, set up the rubric you want to use for marking that question. The Inspera guide to rubrics for question authors explains how to create a rubric and add it to your exam question.
  2. After the exam has taken place, use the rubric to mark the students’ answers.
  3. If you’ve chosen one of the quantitative rubric types, the student’s mark for the question is calculated automatically as you complete the rubric (a simple illustration follows after the screenshot below). If you’ve chosen a qualitative rubric, use the completed rubric to evaluate the student’s answer and help you decide on their mark for the question.
  4. You can choose to add feedback to the candidate in the box below the level of performance you’ve selected for each criterion (you can see an example of this in the image below).
Screenshot of the Grader view of a sample points-range rubric in Inspera.
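To make step 3 concrete, here is a minimal sketch of how a quantitative, points-based rubric can turn a marker’s level selections into a question mark. It assumes a simple fixed-points-per-level scheme with invented criteria, so it illustrates the general idea only; Inspera’s actual rubric types and scoring options are described in the guide linked above.

```python
# Minimal illustration of quantitative rubric scoring: each criterion has
# levels of performance worth a fixed number of points, and the question
# mark is the sum of the points for the levels the marker selects.
# Hypothetical criteria and points; not Inspera's internal model.

rubric = {
    "Argument":  {"Excellent": 10, "Good": 7, "Developing": 4, "Weak": 1},
    "Evidence":  {"Excellent": 10, "Good": 7, "Developing": 4, "Weak": 1},
    "Structure": {"Excellent": 5,  "Good": 4, "Developing": 2, "Weak": 1},
}

def question_mark(rubric, selections):
    """Sum the points for the selected level of each criterion."""
    return sum(rubric[criterion][level] for criterion, level in selections.items())

# One marker's level selections for a single answer:
selections = {"Argument": "Good", "Evidence": "Excellent", "Structure": "Good"}
print(question_mark(rubric, selections))   # -> 21, out of a possible 25
```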

Want to learn more about using Inspera for digital exams?

Come along to a webinar to learn about creating exam questions or marking in Inspera.

Enrol on the Inspera Guidance course in Canvas to learn about Inspera functionality at your own pace.

Find out about the process to prepare an Inspera digital exam, and how the Digital Assessment Service can help, on the Inspera webpage.

Contact digital.exams@newcastle.ac.uk if you have questions or would like to discuss how you could use Inspera for a digital exam on your module.

Announcing the University’s new Digital Exam System: Inspera Assessment

In September 2021 we will be launching a new system for centrally supported digital exams, called Inspera Assessment. Implementing the system will enable the Digital Exam Service to: 

  • Deliver secure locked down present-in-person exams on University computers and students’ own laptops, monitored by University invigilators 
  • Ensure that digital exams are accessible to all our students, and enhance the student experience of exams 
  • Increase the University’s digital exam capacity in the long term 
  • Enable more authentic exams by introducing new functionality

New exam types possible with Inspera will include:  

  • Students taking written exams online, by typing their answers on computer, and incorporating drawings or written calculations done on paper into their online answers where needed. 
  • Allowing access to specific online resources or applications during a secure exam, using allow-listing functionality.

Introducing Inspera is a big step forward for education, assessment and feedback at Newcastle University.  Adopting a specialist digital exam system allows us to do much more than would be possible if we continued to use the Virtual Learning Environment for digital exams.

Choosing a digital exam system 

Inspera has been selected as our digital exam system following a rigorous procurement process, which began with requirements mapping workshops in February 2020, attended by over 60 academic and professional services staff.  The procurement was postponed for a year as a result of the global pandemic, and restarted in semester 2 2020/21 when colleagues had the opportunity to feed in any new or updated requirements via an online survey.   

Once the tender was issued, key digital exams stakeholders contributed to a rigorous evaluation process to decide on the system that best fit our requirements. Students and staff were invited to volunteer for usability testing in each system that met the mandatory technical requirements. The team are very grateful to the 36 colleagues, and 13 undergraduate and postgraduate students, who completed a total of approximately 150 hours of usability testing between them!

Inspera scored the highest overall for both usability, and for technical and functional requirements. 

Rolling out Inspera 

As standard, all 2021/22 modules that have a present-in-person digital exam in MOFS will use Inspera. If the public health situation requires, it will be possible for these modules to use the system for open-book take-home exams.

The Numbas maths assessment system remains an option for digital exams that need specialist mathematics functionality.

The system will be available for additional new digital exams from 2022/23 onwards.  There will be opportunities in the coming months to see demonstrations of the software, and learn more about the new types of assessment that it makes possible.  If you would like to learn more now, please contact digital.exams@newcastle.ac.uk

How to get started  

The Digital Exams Service team will contact all 2021/22 module teams with a digital exam in their MOF at the beginning of September, with details of the process for preparing their exam. 

Training will also launch in September 2021, and all colleagues who will be using Inspera in the new academic year are encouraged to sign up.   

Online resources to help students prepare for a digital exam will be published in September, and students will also be able to try out a demo exam in Inspera to help familiarise themselves with the system. 

If you are interested in introducing a new digital exam using Inspera in future, or if you have any queries about a 2021/22 digital exam, please contact digital.exams@newcastle.ac.uk

Assessment resources on Digital Learning website

Resources are available to help staff prepare for the semester 2 assessment period, including: 

Exams

Assignment set up 

  • Guidance is available on whether to use a Canvas Assignment or a Turnitin Assignment
  • It is important that module teams agree which assignment type to use before it is set up in Canvas, and that marking is done in the correct tool: SpeedGrader (link to Canvas Orientation course) must be used for a Canvas Assignment, and Turnitin Feedback Studio (link to screencast) must be used for a Turnitin Assignment.
  • When an assignment is created, the maximum number of marks available (for example 100) must be entered in the Points field. The points should never be set to zero, as this causes technical issues.

Marking and moderation 

Further help 

Buddycheck Updates

There has been a system update to Buddycheck which, alongside some improvements to the student view, has opened up new functionality when creating evaluations. The major changes that users will notice are described below. User guidance on the Digital Learning webpages has been updated to reflect these changes.

Creating an evaluation and reusing questions

When creating a new evaluation you will now be asked to add a title before moving to the full evaluation set-up page. There is now the option to use a previous evaluation as a template: to reuse existing questions in a new evaluation, select an old evaluation as the template.

Buddycheck create evaluation screen with title entry and template selection highlighted

Student introduction

There is now an option to add in an introduction to the evaluation for students. This will appear to students before they begin an evaluation alongside some new additional guidance on the question types included in the Buddycheck evaluation.

Student introduction text entry

Question ordering

Question order can now be updated by using drag and drop. You can preview, edit or remove questions from an evaluation using the appropriate icon.

question ordering alongside edit, preview and delete icons

Adjustment factor cap

It is now possible to set a minimum and maximum value adjustment factor cap for an individual evaluation.

The adjustment factor is a student’s average rating divided by the overall average rating for all members of the team. It is used to adjust the individual student’s mark.

When deciding final marks, it is possible to use either the capped adjustment factor or the original factor with no cap applied.

For more information on how the adjustment factor may affect marks, see the adjustment factor guidance and the adjustment factor Excel example.
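As a worked illustration of the definition above: a student’s adjustment factor is their average peer rating divided by the team-wide average rating, optionally clamped between the minimum and maximum cap values, and the adjusted mark is the group mark multiplied by that factor. The sketch below is our own example, not Buddycheck’s implementation; the ratings, group mark and cap values (0.9 to 1.1) are invented.

```python
# Worked example of the Buddycheck adjustment factor described above.
# Illustrative only: the ratings, group mark and cap values are invented.

def adjustment_factor(student_avg, team_avg, cap=None):
    """Student's average rating / team average rating, optionally capped."""
    factor = student_avg / team_avg
    if cap is not None:
        low, high = cap
        factor = max(low, min(high, factor))
    return factor

# Average peer ratings received by each member of a four-person team:
ratings = {"Amy": 4.5, "Ben": 4.0, "Cho": 3.5, "Dan": 2.0}
team_avg = sum(ratings.values()) / len(ratings)        # 3.5

group_mark = 70
for name, avg in ratings.items():
    uncapped = adjustment_factor(avg, team_avg)
    capped = adjustment_factor(avg, team_avg, cap=(0.9, 1.1))
    print(name, round(group_mark * uncapped), round(group_mark * capped))
# Amy's 90 uncapped becomes 77 with the 0.9-1.1 cap; Dan's 40 becomes 63.
```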

adjustment factor amendment options

Adding team questions

Alongside the existing ability to create scored questions, it is now possible to create team questions that ask students to answer a five-point scale question (strongly agree to strongly disagree) about the team as a whole. Team questions do not contribute to the adjustment factor.

Team question creation screen

Option to ask students to ‘motivate’ peer question score

When creating a peer question it is now possible to ask students to optionally motivate their scores, i.e. provide a comment explaining why they have given a peer that score. This is now part of the question itself, rather than relying on open questions at the end of the evaluation.

For any queries on these changes, please contact LTDS@ncl.ac.uk or see the guidance on the Digital Learning webpages.

Digital exam system usability testing

Are you a member of academic or professional services staff interested in digital exams?

The digital exam system procurement process is going ahead as planned, and we are making adjustments to enable staff to participate in usability testing while remote working. 

We appreciate that this is a very busy time for colleagues across the University. However, it is necessary to go ahead with usability testing now to support the digital exam system procurement process. If you are interested and have capacity to participate, your contribution will be very valuable.

We are looking for volunteers to test digital exam systems, to help assess how user friendly each one is. Testers’ feedback will be a key part of the evaluation stage of the tender process, and will have a direct impact on which digital exam system the University introduces from next academic year.

Usability testing is open to all University staff.  You can choose to test from the perspective of either: 

  • An exam administrator: testing how to create exam settings, and manage marking and moderation processes. Approximately 90 minutes per system.
  • An academic: testing how to create exam questions, and carry out marking and moderation. Approximately 2 hours per system.

To participate you need to commit to testing all of the systems that meet the University’s mandatory requirements, which we estimate may be between 2 and 4 systems. This is required to ensure that the evaluation process is fair; we’ll confirm the number of systems being tested the week before testing begins.

Full instructions and video demonstrations will be provided for each testing task. You can complete the testing tasks at any time that suits your schedule over the usability testing period from Monday 1 June to Monday 15 June. 

To register your interest in doing usability testing please complete this form by 12 noon on Tuesday 26 May 2020.  Please contact digital.exams@newcastle.ac.uk with any queries. 

Transition to the Digital Exams Service: A Timeline

Following our October 2019 post introducing the digital exam service, we have a progress update and some news about what’s happening next.  Centrally supported digital exam provision (including the OLAF Service, and the Diversifying online exam provision project) is being combined into a single service, and we are reviewing our requirements ready to tender for a system that meets our needs. 

February 2020 

Requirement Mapping Workshops will be taking place. The outcomes of these sessions will help to inform the requirements that we will take to system providers.  All academic and professional services staff with an interest in digital exams are invited to contribute.  Please sign up via the link to have your say! 

March 2020 

Tender for digital exam system (30-35 days response time). A set of final requirements will be issued. 

April – May 2020 

Scoring of tender submissions against requirements will take place alongside user testing of software that meets our mandatory requirements.  Look out for updates about how to get involved. 

June – July 2020

A provider will be awarded the contract to supply a digital exam system to the University.  

Following this, work will be undertaken to move as much of the existing digital exam questions and content into the new system as possible.

August 2020 

The new system will be rigorously tested and integrated with University systems. User guidance and training for all stakeholders will be developed.

August assessment period

Any exam deferrals and resits in the August assessment period will need to be completed/submitted in Canvas. The Blackboard licence ends on 31 July, and from that point no staff or students will be able to access that system.

Schools should adopt the same method of assessment that was used in Semester 2 for any resits/deferrals in the August assessment period. If a Blackboard test was used in the Semester 2 assessment period, then a Canvas quiz should be used in the August assessment period.

If you ran an OLAF exam in Semester 1, you can deliver the resit using either a Canvas quiz or a Turnitin submission.

Information and support is available via the Education Continuity webpages.

September 2020 

Digital Exam Service launches with new software – OLAF is no more. 

All digital exams previously taken in both Blackboard as part of the OLAF service and in WISEflow as part of the Diversifying Online Exam Provision project will be delivered using the chosen software. 

Training will be offered to all academic and professional services staff involved in delivering digital exams, and briefing information will be available for students. 

Phil Race and Sally Brown – Assessment and feedback videos

Heriot-Watt University have released a range of videos of Professor Phil Race and Professor Sally Brown discussing key elements of assessment and feedback.

Phil is an authority on assessment and is widely published; his books include the excellent “The Lecturer’s Toolkit”.

Sally is Emerita Professor at Leeds Metropolitan and regularly keynotes at educational conferences. She developed the National Teaching Fellowship scheme while working at the Higher Education Academy.

We’ve embedded some of the videos below, but please visit YouTube to view more of them.

Giving your first lecture

Marking your first assignment

Sally Brown – Marking your first assignment

Feedback on Assessment

Sally Brown – Feedback on assessment

Student Tips – feedback on assessment

Measuring Learning

Introducing the digital exams service

Building on the solid foundations of OLAF provision, and the successful first 2 years of the Diversifying and Expanding Online Exam Provision project, the University’s Technology Enhanced Learning Sub-Committee have approved the launch of a new combined Digital Exams service.

The story so far …

Newcastle University’s Online Assessment and Feedback (OLAF) Service has been running high stakes secure online exams using Blackboard’s test tool since 2007/08. The 13 years since that first exam have seen OLAF come of age, supported by well-established institutional processes that ensured all 132 OLAF exams in 2018/19 went smoothly.

In 2017/18 the Diversifying and Expanding Online Exam Provision project was launched, and the first of several new types of digital exam were piloted using software called WISEflow. Bring Your Own Device was introduced, enabling students to use their own laptops to sit a secure digital exam. Alongside this, moving essay and long written answer exam questions from paper to online also became possible for the first time.


Assessment and Feedback


By Helen Webster, Head of the Writing Development Centre

“The structure doesn’t flow”

“You need to engage more critically with the literature”

“More detail and greater depth of discussion needed”

“Hard to follow – make sure your points are clearly expressed”

It’s frustrating both to give and to receive feedback repeatedly on the same issues and not see any improvement. Feedback is highlighted in the NSS and NUSU campaigns, so we know that students see it as a priority. We also know that academic staff don’t always feel that students are engaging with their feedback, or even recognise it as such.

Student views on feedback forms


To find out more, a student intern working with staff in LTDS evaluated existing feedback forms and gathered opinions from students to identify what works and what could be improved. The project considered a total of 66 forms from 19 different schools, and included focus groups and interviews with individual students.

What did they find?

Here are a few key findings; you can find full details in the project report.

Form Design

Have clear, separate sections showing:

  • Strengths and areas for improvement
  • Clear advice for future work

Only use tick boxes for objective areas of the marking criteria, such as grammar. When tick boxes were used for subjective areas, such as argument, students found this unhelpful.

Look at your feedback forms and consider whether these should be redesigned. Consult with the students in your school as part of the process.

Utilising the form

Type feedback wherever possible.

Introduce structured opportunities to help students understand:

  • expectations of the marking criteria
  • the ways in which this is reflected in the feedback sheet

Discuss how you use marking sheets with your colleagues. Try to develop a consistent approach to:

  • the volume of feedback
  • the use of notes in margins

For more information, get in touch with LTDS@ncl.ac.uk