The role of Digital Exam Support Assistants (DESA)


Digital Exam Support Assistants (DESAs) are PGR students who support invigilators in digital exam venues, helping students troubleshoot any technical issues with the Safe Exam Browser software. Safe Exam Browser works alongside Inspera to provide a secure, ‘locked down’ digital exam. Inspera Assessment is the University’s digital exam system, used for present-in-person, secure online assessments.

How do DESAs support exam invigilators in digital exams?

DESAs are on hand to help students and invigilators troubleshoot issues faced when accessing Inspera for Bring Your Own Device (BYOD) exams. Exam invigilators have reported that the presence of DESAs makes them feel more confident in digital exam venues: feedback described DESAs as a ‘confidence booster’, with invigilators saying they ‘couldn’t do it without them’. Invigilators reported that DESAs responded to queries quickly, a view echoed by students who received DESA support.

How do students find the DESA support?

39 students submitted their feedback on their Semester 1 22/23 BYOD exam. When asked how satisfied they were with the technical support available in their exam, two thirds of students (67%) reported that they were satisfied or very satisfied.

Students reported that ‘those who requested support were dealt with quickly and there was little hassle.’

How did the DESAs find their experience?

We asked some of our DESAs how they found their experience in the role this year. Check out some of the quotes below:

“I had a wonderful experience with the team. Enough training was given to staff. Would like to work with the team again. Thanks for giving me the opportunity.”


“Regarding my experience in the DESA role this academic year, it provided me with a valuable opportunity to contribute to the Digital Assessment Office and engage with fellow students. The role not only enhanced my understanding of digital assessment practices but also allowed me to develop essential skills in communication and collaboration. I am grateful for the experience and the chance to be a part of improving the assessment process at Newcastle University.”

What’s next?

We are pleased to report that the DESA role will be returning in the 2023/24 academic year. This provision has been crucial in helping our students troubleshoot issues during their BYOD digital exams. For more information you can email the Digital Assessment Team.

You can find out more about Inspera in our other blog posts on Inspera and on our Inspera Digital Exams webpage.

Visualising programme level assessment

As part of our Assessment and Feedback Sprint Series, a small team of students and colleagues has been investigating the question:

How do we articulate a meaningful programme experience that ensures a cohesive assessment journey for all of our students?

Feedback (Stage Surveys, NSS, etc.) tells us that students and colleagues struggle to see assessments from a programme perspective. This disconnection can leave students feeling that assessment isn’t part of a wider programme and that their skills and feedback don’t link across modules and assessments.

Being able to visualise the assessment journey across a stage or programme is important because, as one colleague said,

“An assessment journey builds confidence in the education (and the education provider) and underscores the importance of each individual assessment towards an overarching goal. Articulation of assessment journeys allows for broader reflection and helps explain the skill development (rather than focussing on siloed, module specific content).”

An overview of some of the visuals we found within Newcastle University and other HE institutions is shown below. In summary, we found a range of approaches, often highlighting the ‘journey’ through the stage or programme and making it easier for students to reflect on progress.

What have we created?

Using these findings, we created some template visuals, which were then validated by colleagues and students, incorporating feedback from our first showcase.

We decided to create a variety of templates to reflect diverse practices/skillsets across programmes and areas. Some are more suitable for Semester-based programmes and others for block-taught programmes. 

You can explore these yourself:

We started by looking at a standard linear stage one programme – V400 BA Archaeology. We initially had a large amount of text on the visual explaining each assessment and how it aligned to the wider programme learning objectives. However, it quickly began to look overwhelming.

We then started to explore using H5P as a way to keep the visual relatively simple while incorporating pop-up boxes to make it more interactive and engaging. The version below has dummy text – click on the question marks to see how it would work.

We also considered how to visually represent a block-taught postgraduate programme and incorporated feedback from a Degree Programme Director (DPD) to represent larger-weighted modules with bigger circles. The DPD said this would be a useful tool for both staff and students, including at recruitment and induction events.

The intention is that these editable templates will be useful for both students and programme teams to visualise assessment across a programme or stage. The visual could be produced as part of a workshop reviewing programme level assessment or could be a standalone tool designed to be student-facing. 

Find out more about our Sprint

We presented our Sprint adventures at the Sprint Showcase event on Friday 10 March, and you can watch the recording here:

To find out more about the Assessment and Feedback Sprint Programme contact Conny.Zelic@ncl.ac.uk in the Strategic Projects and Change Team.

New functionality for Inspera digital exams: question choice and marking rubrics

Inspera Assessment is the University’s system for centrally supported digital exams. Inspera can be used for automatically marked exam questions, for manually marked question types including essays, or for exams with a combination of both.

New functionality has recently been launched that enables colleagues to do more with digital written exams.

Question choice for students

The candidate selected questions feature gives students taking your exam a choice of which questions to answer from a list.

For example, in an exam where students need to answer 2 essay questions from a list of 6, you can set this up so that each student can choose a maximum of 2 questions to answer.

How does it work for a student?

If candidate selected questions is used in an Inspera exam, the student sees information above each question showing how many questions to select in total, and how many they have already selected. To choose a question to answer, they change the ‘Answering this question?’ drop-down box to ‘Yes’.

Screenshot of the student view in Inspera, with the option to choose whether to answer a question. Below the question title, text reads ‘Answering this question? 0 of 2 questions selected’, with a drop-down box offering ‘Yes’, ‘No’ and ‘Undecided’.

If a student starts answering a question without changing the ‘Answering this question?’ drop-down box, Inspera automatically changes it to ‘Yes’.

When they have selected the maximum number of questions, the student cannot start answering any more. However, if they change their mind about which question(s) they want to answer, they can simply change the ‘Answering this question?’ drop-down to ‘No’ and select a different question instead.

How does it work for a marker?

A marker only sees answers to the questions that a student has chosen to answer.

As students can only submit answers for the maximum number of questions they are allowed to choose, this means you can say goodbye to the dilemma of trying to work out which questions to mark when a student has misread the instructions and answered too many questions!
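If it helps to picture the rules, the sketch below models the behaviour described above in Python. This is purely illustrative – it is not Inspera’s code or API, and all of the names in it are invented for the example.

  # A minimal model of candidate selected questions: a student can select
  # at most max_selected questions, typing auto-selects a question, and a
  # marker only sees answers to selected questions.
  class QuestionChoiceExam:
      def __init__(self, question_ids, max_selected):
          self.question_ids = set(question_ids)
          self.max_selected = max_selected   # e.g. answer 2 of 6
          self.selected = set()              # questions set to 'Yes'
          self.answers = {}                  # question id -> draft answer

      def select(self, qid):
          """Student sets 'Answering this question?' to Yes."""
          if len(self.selected) >= self.max_selected and qid not in self.selected:
              raise RuntimeError("Maximum number of questions already selected")
          self.selected.add(qid)

      def deselect(self, qid):
          """Student changes the drop-down back to No."""
          self.selected.discard(qid)

      def start_answer(self, qid, text):
          """Typing an answer automatically selects the question."""
          if qid not in self.selected:
              self.select(qid)
          self.answers[qid] = text

      def marker_view(self):
          """A marker only sees answers to the selected questions."""
          return {q: a for q, a in self.answers.items() if q in self.selected}

  exam = QuestionChoiceExam(["Q1", "Q2", "Q3", "Q4", "Q5", "Q6"], max_selected=2)
  exam.start_answer("Q1", "Essay draft...")  # auto-selects Q1
  exam.select("Q4")
  print(exam.marker_view())                  # only Q1 appears; Q4 has no answer yet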

How can I use it in my exam?

The Candidate selected questions function is available when you are authoring a question set for an Inspera digital exam. Find out more in the Inspera guide for Candidate selected questions.

Rubrics for marking

You can now create a rubric to use for marking any manually marked question type in Inspera. Rubrics allow you to build the assessment criteria for an exam question into Inspera, and use them in your marking.

Choose whether you want to use a quantitative rubric, which calculates the mark for a question, or a qualitative rubric, which acts as an evaluation and feedback tool alongside which you assign the mark manually.

How to introduce a rubric for your exam

  1. When you are creating the exam question in Inspera, set up the rubric you want to use for marking that question. The Inspera guide to rubrics for question authors explains how to create a rubric and add it to your exam question.
  2. After the exam has taken place, use the rubric to mark the students’ answers.
  3. If you’ve chosen one of the quantitative rubric types, the student’s mark for the question is calculated automatically as you complete the rubric. If you’ve chosen a qualitative rubric, complete it and use it to evaluate the student’s answer and help you decide on their mark for the question.
  4. You can choose to add feedback to the candidate in the box below the level of performance you’ve selected for each criterion (you can see an example of this in the image below).
Screenshot of the Grader view of a sample points-range rubric in Inspera.
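To make the quantitative case concrete, here is a small illustrative sketch of how a points-based rubric turns per-criterion judgements into a question mark. The criteria, levels and points are invented for the example; this shows the general arithmetic, not Inspera’s implementation.

  # Each criterion has levels of performance worth a set number of points;
  # the question mark is the sum of the points for the levels selected.
  RUBRIC = {
      "Argument":  {"Excellent": 10, "Good": 7, "Developing": 4, "Weak": 1},
      "Evidence":  {"Excellent": 10, "Good": 7, "Developing": 4, "Weak": 1},
      "Structure": {"Excellent": 5,  "Good": 4, "Developing": 2, "Weak": 1},
  }

  def rubric_mark(levels_chosen):
      """Sum the points for the level selected against each criterion."""
      return sum(RUBRIC[criterion][level] for criterion, level in levels_chosen.items())

  # The marker selects a level of performance for each criterion:
  mark = rubric_mark({"Argument": "Good", "Evidence": "Excellent", "Structure": "Good"})
  print(mark)  # 7 + 10 + 4 = 21, out of a possible 25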

Want to learn more about using Inspera for digital exams?

Come along to a webinar to learn about creating exam questions or marking in Inspera.

Enrol onto the Inspera Guidance course in Canvas to learn about Inspera functionality at your own pace.

Find out about the process to prepare an Inspera digital exam, and how the Digital Assessment Service can help on the Inspera webpage.

Contact digital.exams@newcastle.ac.uk if you have questions or would like to discuss how you could use Inspera for a digital exam on your module.

Students evaluate using Inspera for 21/22 Digital Exams

Inspera Assessment, the University’s system for centrally supported digital exams, launched for the 21/22 academic year. A key part of understanding how we better use digital exams is to consider ways to improve the student experience of taking a digital exam. Following the launch, the Learning and Teaching Development Service (LTDS) asked for student feedback from those who took a digital exam in 21/22.

142 students submitted their feedback.

Here are our findings:

65% of students were somewhat or very satisfied with their overall experience of taking their exam using Inspera.

A pie chart titled ‘How satisfied are you with the experience of taking your exam(s) using Inspera?’ shows the following responses:
1. Very dissatisfied 11%.
2. Somewhat dissatisfied 14%.
3. Neither satisfied nor dissatisfied 10%.
4. Somewhat satisfied 30%.  
5. Very satisfied 35%.
Results of the Inspera Student Evaluation

How easy is Inspera to use?

81% of students found starting their Inspera exam somewhat or very easy.

80% of students found handing in/submitting their Inspera exam somewhat or very easy.

When asked to compare a handwritten exam paper with an Inspera paper where they typed their answers to written questions, 63% of students said they found Inspera somewhat or much better.

Is Inspera better for Take Home or on-campus PC Cluster exams?

85% of students were somewhat or very satisfied with their overall experience of using Inspera for their take home exam(s).

73% of students were somewhat or very satisfied with their overall experience of using Inspera for their PC Cluster exam(s).

Thoughts for the future

Inspera seems to be a hit with students overall; the experience of using it is largely positive, with Inspera Take Home papers gaining the highest satisfaction scores. PC Cluster Inspera exam satisfaction scores showed the majority of students were satisfied with their overall experience. Feedback clearly indicated many students felt re-editing written answers works well in Inspera (and is better than trying to edit paper-based written exams).

The most common concern raised was around plagiarism. LTDS is keen to work with colleagues to alleviate student concerns and ensure that the provision is developed and supported going forward.

LTDS opened its provision for digital exams to all modules, and the number of planned digital exams for 22/23 has increased.

To better support students before their exam, LTDS recommends that students practise with Inspera. Our survey showed 60% of students tried at least one demo before their main exam; we’d like to get that figure up! Practice exams can help with learning to use the tool, and they are accessible via Canvas.

Try it out:

Student Inspera Demo Course

Announcing the University’s new Digital Exam System: Inspera Assessment

In September 2021 we will be launching a new system for centrally supported digital exams, called Inspera Assessment. Implementing the system will enable the Digital Exam Service to: 

  • Deliver secure locked down present-in-person exams on University computers and students’ own laptops, monitored by University invigilators 
  • Ensure that digital exams are accessible to all our students, and enhance the student experience of exams 
  • Increase the University’s digital exam capacity in the long term 
  • Enable more authentic exams by introducing new functionality

New exam types possible with Inspera will include:  

  • Students taking written exams online, by typing their answers on computer, and incorporating drawings or written calculations done on paper into their online answers where needed. 
  • Allowing access to specific online resources or applications during a secure exam, using allow listing functionality. 

Introducing Inspera is a big step forward for education, assessment and feedback at Newcastle University.  Adopting a specialist digital exam system allows us to do much more than would be possible if we continued to use the Virtual Learning Environment for digital exams.

Choosing a digital exam system 

Inspera has been selected as our digital exam system following a rigorous procurement process, which began with requirements mapping workshops in February 2020, attended by over 60 academic and professional services staff.  The procurement was postponed for a year as a result of the global pandemic, and restarted in semester 2 2020/21 when colleagues had the opportunity to feed in any new or updated requirements via an online survey.   

Once the tender was issued, key digital exams stakeholders contributed to a thorough evaluation process to decide on the system that best fitted our requirements. Students and staff were invited to volunteer for usability testing of each system that met the mandatory technical requirements. The team are very grateful to the 36 colleagues, and 13 undergraduate and postgraduate students, who completed a total of approximately 150 hours of usability testing between them!

Inspera scored the highest overall for both usability, and for technical and functional requirements. 

Rolling out Inspera 

As standard, all 2021/22 modules that have a present-in-person digital exam in their MOF will use Inspera. If the public health situation requires, it will be possible for these modules to use the system for open book take home exams.

The Numbas maths assessment system remains an option for digital exams that need specialist mathematics functionality.

The system will be available for additional new digital exams from 2022/23 onwards.  There will be opportunities in the coming months to see demonstrations of the software, and learn more about the new types of assessment that it makes possible.  If you would like to learn more now, please contact digital.exams@newcastle.ac.uk

How to get started  

The Digital Exams Service team will contact all 2021/22 module teams with a digital exam in their MOF at the beginning of September, with details of the process for preparing their exam. 

Training will also launch in September 2021, and all colleagues who will be using Inspera in the new academic year are encouraged to sign up.   

Online resources to help students prepare for a digital exam will be published in September, and students will also be able to try out a demo exam in Inspera to help familiarise themselves with the system. 

If you are interested in introducing a new digital exam using Inspera in future, or if you have any queries about a 2021/22 digital exam, please contact digital.exams@newcastle.ac.uk

Assessment resources on Digital Learning website

Resources are available to help staff prepare for the semester 2 assessment period, including: 

Exams

Assignment set up 

  • Guidance is available on whether to use a Canvas Assignment or a Turnitin Assignment
  • It is important that module teams agree which assignment type to use before it is set up in Canvas, and that marking is done in the correct tool. SpeedGrader (link to Canvas Orientation course) must be used for a Canvas Assignment, and Turnitin Feedback Studio (link to screencast) must be used for a Turnitin Assignment.
  • When an assignment is created, the maximum number of marks available (for example 100) must be entered in the Points field.  The points should never be set as zero, as this causes technical issues. 
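For module teams that script assignment creation, the same rule can be applied through the Canvas REST API. The sketch below is a hedged example: the endpoint and the points_possible parameter follow the public Canvas API documentation, but the URL, token, course ID and assignment name are placeholders to adapt for your own instance.

  import requests

  CANVAS_URL = "https://canvas.example.ac.uk"  # placeholder instance URL
  TOKEN = "YOUR_API_TOKEN"                     # placeholder access token
  COURSE_ID = 12345                            # placeholder course ID

  resp = requests.post(
      f"{CANVAS_URL}/api/v1/courses/{COURSE_ID}/assignments",
      headers={"Authorization": f"Bearer {TOKEN}"},
      json={
          "assignment": {
              "name": "Semester 2 essay",
              "submission_types": ["online_upload"],
              # Maximum marks available -- never set to zero, which
              # causes technical issues.
              "points_possible": 100,
          }
      },
  )
  resp.raise_for_status()
  print(resp.json()["id"])  # ID of the newly created assignment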

Marking and moderation 

Further help 

Digital exam system usability testing

Are you a member of academic or professional services staff interested in digital exams?

The digital exam system procurement process is going ahead as planned, and we are making adjustments to enable staff to participate in usability testing while remote working. 

We appreciate that this is a very busy time for colleagues across the University. However, it is necessary to go ahead with usability testing now to support the digital exam system procurement process.  If you are interested and have capacity to participate in usability testing your contribution will be very valuable. 

We are looking for volunteers to test digital exam systems, to help assess how user friendly each one is.  Testers’ feedback will be a key part of the evaluation stage of the tender process, and have a direct impact on which digital exam system the University introduces from next academic year.   

Usability testing is open to all University staff.  You can choose to test from the perspective of either: 

  • An exam administrator: testing how to create exam settings, and manage marking and moderation processes. Approximately 90 minutes per system.
  • An academic: testing how to create exam questions, and carry out marking and moderation. Approximately 2 hours per system.

To participate you need to commit to test all of the systems that meet the University’s mandatory requirements, which we estimate may be between 2 and 4 systems.  This is required to ensure that the evaluation process is fair, and we’ll be able to confirm the number of systems being tested the week before the testing begins.   

Full instructions and video demonstrations will be provided for each testing task. You can complete the testing tasks at any time that suits your schedule over the usability testing period from Monday 1 June to Monday 15 June. 

To register your interest in doing usability testing please complete this form by 12 noon on Tuesday 26 May 2020.  Please contact digital.exams@newcastle.ac.uk with any queries. 

Student views on feedback forms


To find out more, a student intern working with staff in LTDS evaluated existing feedback forms and gathered opinions from students to identify what works and what could be improved. The project considered a total of 66 forms from 19 different schools and included focus groups and interviews with individual students.

What did they find?

Here are a few key findings; you can find full details in the project report.

Form Design

Have clear, separate sections showing:

  • Strengths and areas for improvement
  • Clear advice for future work

Only use tick boxes for objective areas of the marking criteria, such as grammar. When tick boxes were used for subjective areas, such as argument, students found this unhelpful.

Look at your feedback forms and consider whether these should be redesigned. Consult with the students in your school as part of the process.

Utilising the form

Type feedback wherever possible.

Introduce structured opportunities to help students understand:

  • expectations of the marking criteria
  • the ways in which this is reflected in the feedback sheet

Discuss how you use marking sheets with your colleagues. Try to develop a consistent approach to:

  • the volume of feedback
  • the use of notes in margins

For more information get in touch with LTDS@ncl.ac.uk

Peer Mentoring: Feedback Sessions

With the Peer Mentoring Scheme well underway across the University, mentors have been meeting with convenors to check how things are going.

Alison Graham convenes the Peer Mentors in the School of Biology.

Students in a Peer Mentoring feedback session

She meets with Peer Mentors in the school in week 2, week 4 and week 7 or 8, just to check how students are doing and make sure that mentors and mentees are getting the most out of the scheme.

‘What I’ve started to try to do is to incentivise the meetings, so the students feel that they are getting something out of them, as well as just catching up.

‘I came up with the idea of tying them to the Graduate Skills Framework, so I often work through how the mentors will be able to use their skills in applying for jobs.

‘We go through how to evidence the skills that they’ve gained in applications and at interview.’

Alison hopes that this approach will make the scheme more worthwhile for second and third year students who may be unsure about giving up their time.

‘It’s really about making sure that students can see and really use the skills they are gaining from being a Peer Mentor, in addition to helping other students.’

Alison says the scheme has proved popular in the School and that students have described it as useful, but that the whole experience often relies on engagement from the mentors.

‘We have some excellent mentors who establish a real social group and relationship with their mentees by organising trips and events.

‘We try to encourage that and encourage teamwork within the groups – for example, we organise a treasure hunt in week one where they all have to work together.’

She says that the amount of engagement with mentors depends on individual students and often on circumstances.

‘But it depends on them. Some students only really liaise with their mentor in the first few weeks but some need a little bit more.

‘They also tend to turn to their mentors around exam and assignment time.

‘But it can also be really important for some students who are struggling.’

As a convenor for the programme, Alison points out that it’s important for the mentors to be trained and supported so that they know what queries they can answer.

‘We have to be quite careful to make sure that they know how much help they can give students with their academic work.

‘Obviously they can provide some advice but we don’t want people sharing assignments or anything, so that’s something we have to train them for.’

As well as the feedback meetings, Peer Mentors have all been invited to a Thank You party, taking place on 5th December in the Great North Museum.

Claire Burnham, the University’s Peer Mentoring Coordinator, said: ‘We’re very excited about the event.

‘The Mentor of the year award will be presented on the night and we’ve already had 400 nominations from students across the University.

‘It’s a great way of rewarding our mentors and our convenors for all of their hard work.’