New assessment resources: assessment briefs and programme perspectives

We have recently added two new assessment resources to the Effective Practice branch of our Teaching and Learning site.

Both of these draw on the outputs and findings from our Assessment and Feedback Sprints, which brought together students and academic and professional services colleagues to tackle common issues that students experience with assessment.

In this post we’ll fill you in on the background to the new resources.

Digital Assessment Webinar Training Programme Available

We are pleased to share our Digital Assessment Training Programme for 2023-24. Our training sessions are delivered as webinars via Microsoft Teams.

Inspera Assessment (the University’s system for centrally supported digital exams) is supported by the Learning and Teaching Development Service, which offers a range of training options open to all staff.

Follow the links below to find out more about each session and to book onto individual sessions via the University’s Elements system.

Inspera Webinars

Inspera for Professional Service Colleagues

  • 9 October 2023 – 11:00-12:00
  • 8 February 2024 – 10:00-11:00

Creating and Managing Exam Questions in Inspera

  • 23 October 2023 – 09:00-10:00
  • 19 February 2024 – 14:00-15:00

How to Grade using Rubric in Inspera

  • 9 November 2023 – 10:00-11:00
  • 26 February 2024 – 14:00-15:00

Marking and moderating an Inspera exam with manually marked questions

  • 13 December 2023 – 09:00-10:00 
  • 25 January 2024 – 10:00-11:00
  • 1 May 2024 – 14:00-15:00
  • 6 June 2024 – 14:00-15:00

Marking an Inspera exam with auto marked questions

  • 14 December 2023 – 09:00-10:00 
  • 24 January 2024 – 09:00-10:00 
  • 29 April 2024 – 14:00-15:00 
  • 3 June 2024 – 14:00-15:00 

Digital Assignments: Canvas and Turnitin

Creating and Managing Digital Assignments

  • 19 October 2023 – 11:00-12:00 
  • 29 January 2024 – 15:00-16:00 

Online Marking and Feedback (Canvas)

  • 5 December 2023 – 14:00-15:00 
  • 24 April 2024 – 14:00-15:00 

Online Marking and Feedback (Turnitin)

  • 4 December 2023 – 14:00-15:00 
  • 22 April 2024 – 14:00-15:00

Any queries?

If you have any queries on any of the above sessions, please contact digital.exams@newcastle.ac.uk.

Meet the Team

You can meet the Digital Assessment Team in this LTDS Blog post.

Meet the Digital Exams Team

As the new 2023-24 academic year begins, we’d like to introduce the Digital Exams Team here in LTDS, who lead on the University’s digital exams. The team’s Learning Enhancement and Technology Advisers work together to facilitate digital exams through Inspera Assessment, the University’s digital exam system used for present-in-person, secure online assessments.

Meet the Team

First up we’ve got Maddie Kinnair, one of our two Inspera Co-Leads. Maddie joined the team in September 2022 and has worked in learning and teaching for six years. She first joined HE in 2014 and has previously worked in the School of Computing, the HaSS Faculty and Central Services.

Maddie is also the lead for our peer assessment and feedback tool Buddycheck.

Next up, we’ve got Kimberly May-O’Brien, our second Inspera Co-Lead. Kimberly joined the team in July 2023, having worked at the University since 2019. Kimberly previously worked in the School of English Literature, Language and Linguistics, as well as the central Equality, Diversity and Inclusion team.

Finally, we have Susan Barfield, who started working at the University 13 years ago, initially in NUIT as part of the ReCap team. She joined LTDS in 2019 as lead on online marking and feedback using Canvas and Turnitin, whilst also supporting digital exams.

More Information

You can find out more about Inspera and the training webinars and videos available to colleagues via the LTDS website.

If you have any queries around Digital Exams, you can contact the Digital Exams Team via Digital.Exams@newcastle.ac.uk.

Numbas is the other centrally supported Digital Exam platform. You can contact the Numbas team via Numbas@ncl.ac.uk.

The role of Digital Exam Support Assistants (DESAs)

Digital Exam Support Assistants (DESAs) are PGR students who support invigilators in digital exam venues, helping students troubleshoot any technical issues with the Safe Exam Browser software. Safe Exam Browser works alongside Inspera to provide a secure, ‘locked-down’ digital exam. Inspera Assessment is the University’s digital exam system used for present-in-person, secure online assessments.

How do DESAs support exam invigilators in digital exams?

DESAs are on hand to support students and invigilators in troubleshooting issues faced when accessing Inspera for Bring Your Own Device (BYOD) exams. Exam invigilators have reported that the presence of DESAs makes them feel more confident in digital exam venues. Feedback described DESAs as a ‘confidence booster’, with invigilators saying they ‘couldn’t do it without them’. Invigilators also reported that DESAs responded to queries quickly, a view echoed by students who received DESA support.

How do students find the DESA support?

Thirty-nine students submitted feedback on their Semester 1 22/23 BYOD exam. When asked how satisfied they were with the technical support available in their exam, two thirds of students (67%) reported that they were satisfied or very satisfied.

Students reported that ‘those who requested support were dealt with quickly and there was little hassle.’

How did the DESAs find their experience?

We asked some of our DESAs how they found their experience in the role this year. Check out some of the quotes below:

“I had a wonderful experience with the team. Enough training was given to staff. Would like to work with the team again. Thanks for giving me the opportunity.”

“Regarding my experience in the DESA role this academic year, it provided me with a valuable opportunity to contribute to the Digital Assessment Office and engage with fellow students. The role not only enhanced my understanding of digital assessment practices but also allowed me to develop essential skills in communication and collaboration. I am grateful for the experience and the chance to be a part of improving the assessment process at Newcastle University.”

What’s next?

We are pleased to report that the DESA role will be returning in the 2023/24 academic year. This provision has been crucial in helping our students with any troubleshooting during their BYOD digital exams. For more information you can email the Digital Assessment Team.

You can find out more about Inspera in our other blog posts on Inspera and on our Inspera Digital Exams webpage.

Enabling Students to Plan for and Reflect on Programme Level Assessment

By Levi Croom, Meg Hardiman-Smythe

In the fifth sprint of the Assessment and Feedback Sprint Series, a small team of students and colleagues from across the University worked collaboratively for three weeks to investigate and design resources that would help answer the question:

How do we articulate a meaningful programme experience that ensures a cohesive assessment journey for all our students?

The initial discovery phase of the sprint revealed that students often struggle to see the ‘big picture’ of their programme and how their assessments relate to and are informed by one another within a modularised system. Rather than understanding their assessments as a journey that is an integral part of their learning, students chiefly viewed them as standalone, disconnected instances ‘tacked on’ to the end of a module with the sole purpose of assessing. Rarely did students recognise assessments as part of their continued learning and development of skills.

With this in mind, we have created an assessment and feedback planner. Its aim is to provide students with a resource where they can collate all of their assessment information across a stage of their programme, so that it can be easily visualised and stored in a single place. More importantly, however, the planner’s primary function is to encourage students to consider the skills that their assessments are designed to develop. This allows students to critically reflect on how these skills are transferable across their modules and the stages of their degree, and on how to carry their feedback forward, thereby building a clearer picture of their programme as a whole.

The assessment planner runs in OneNote, a platform available to all students as part of their Microsoft package. Students can type or handwrite information in the planner, and there is space to import or jot down any key notes. We have provided two links, one to a blank template and one to a mock-up of a completed planner, so you can visualise the planner in action. You can use the tabs to navigate through the planner, and we will soon be creating a ‘getting started’ video that offers a guide to using it. One of the key benefits of the planner is that it is fully editable, so students on any course can customise it to fit their specific programme’s needs and goals.

Screenshots: the OneNote menu and the Module Overview page.

In the version we have created, we decided to use a Stage One template, as student validation suggested that receiving the planner in Stage One would be most useful to reinforce assessment reflection across all stages of a programme.

“This would have greatly helped me in Stage One” – Student, HaSS

Screenshot: a notes page for module SEL1003.

The most important feature of the planner is the reflective output. We have included a “My Feedback” page for every module, a “Semester Reflection” to act as a bridge across semesters, and a “Thinking Back, Looking Forward” section to reflect on the stage as a whole and to feed forward into the next stage or into the post-degree future. All reflective sections offer students the opportunity to think critically about their assessment goals and knowledge, with questions such as “what assessment skills/knowledge have you developed since starting at Newcastle/since your previous years of study?” and “thinking back, how do you feel about the goals you set at the start of the year? (What progress have you made? Have your goals changed at all?)”. By asking open-ended questions that require detailed answers, the planner helps students reflect on their assessment journey and feed this forward. A link is embedded into the last section of the planner to encourage students to create a new planner for the next year of study, if applicable.

Screenshots: the “Thinking Back, Looking Forward” and “Time to Pause” pages.

We see the “My Assessment Planner” potentially being used as an active tool that students could work through with their Peer Mentor and discuss with their Personal Tutors. When we validated the planner with students, they suggested they would find it most useful if they had the opportunity to review the completed planner with peers or staff.

“I would want this to be a resource facilitated in partnership with staff” – Student, HaSS

The overarching aim of the planner is to provide more cohesion across assessments to enable students to better understand the links between stages and their overall programme.

Try out the Assessment Planner

We have two versions of the assessment planner available for download. These are “packaged” versions of the workbook – simply download them and click to open them in OneNote.

If you have any questions about the ‘My Assessment Planner’, please get in contact with Levi Croom (HaSS Faculty Student Experience Administrator) at Levi.Croom@newcastle.ac.uk.

See our earlier blog post to view the Sprint Showcase Recording and find out more about our second “Minimum Viable Product” – Programme Assessment Journey Map.

Visualising programme level assessment

As part of our Assessment and Feedback Sprint Series, a small team of students and colleagues has been investigating the question:

How do we articulate a meaningful programme experience that ensures a cohesive assessment journey for all of our students?

Feedback (stage surveys, NSS etc.) tells us that students and colleagues struggle to see assessments from a programme perspective. This disconnection can lead students to feel that assessment isn’t part of a wider programme and that their skills and feedback don’t link across modules and assessments.

Being able to visualise the assessment journey across a stage or programme is important because, as one colleague said,

“An assessment journey builds confidence in the education (and the education provider) and underscores the importance of each individual assessment towards an overarching goal. Articulation of assessment journeys allows for broader reflection and helps explain the skill development (rather than focussing on siloed, module specific content).”

An overview of some of the visuals we found within Newcastle University and other HE institutions is shown below. In summary, we found a range of approaches, often highlighting the ‘journey’ through the stage or programme and making it easier for students to reflect on progress.

What have we created?

Using these findings, we created some template visuals, which were then validated by colleagues and students, incorporating feedback from our first showcase.

We decided to create a variety of templates to reflect diverse practices/skillsets across programmes and areas. Some are more suitable for Semester-based programmes and others for block-taught programmes. 

You can explore these yourself:

We started by looking at a standard linear stage one programme – V400 BA Archaeology. We initially had a large amount of text on the visual explaining each assessment and how it aligned to the wider programme learning objectives. However, it quickly began to look overwhelming.

We then started to explore using H5P as a way to keep the visual relatively simple while incorporating pop-up boxes to make it more interactive and engaging. The version below has dummy text – click on the question marks to see how it would work.

We also considered how to visually represent a block-taught postgraduate programme, and incorporated feedback from a Degree Programme Director (DPD) to represent larger-weighted modules with bigger circles. The DPD said this would be a useful tool for both staff and students, including at recruitment and induction events.

The intention is that these editable templates will be useful for both students and programme teams to visualise assessment across a programme or stage. The visual could be produced as part of a workshop reviewing programme level assessment or could be a standalone tool designed to be student-facing. 

Find out more about our Sprint

We presented our Sprint adventures at the Sprint Showcase event on Friday 10 March, and you can watch the recording here:

To find out more about the Assessment and Feedback Sprint Programme contact Conny.Zelic@ncl.ac.uk in the Strategic Projects and Change Team.

New Inspera training offered

Inspera Assessment (the University’s system for centrally supported digital exams) is supported by the Learning and Teaching Development Service with a range of training options open to all staff. We now have a new training session aimed at professional services colleagues, due to run on 9 March from 3-4pm. You can sign up via Elements.

This session will introduce the digital exam platform Inspera and explain how to support an Inspera digital exam, covering:

  • Introduction to Inspera
  • Creating an account
  • Reviewing created questions and question sets
  • Basic functionality including randomisation and question choice options
  • Allow-listing and adding resources
  • Checking the student view
  • Entering or amending question marks
  • Inspera Scan sheets

Who should attend?

This webinar is suitable for any professional services colleague supporting an Inspera digital exam.

New functionality for Inspera digital exams: question choice and marking rubrics

Inspera Assessment is the University’s system for centrally supported digital exams. Inspera can be used for automatically marked exam questions, for manually marked question types including essays, or for exams with a combination of both.

New functionality has recently been launched that enables colleagues to do more with digital written exams.

Question choice for students

Candidate selected questions is a feature that gives students taking your exam a choice of which questions to answer from a list.

For example, in an exam where students need to answer two essay questions from a list of six, you can set this up so that each student can choose a maximum of two questions to answer.

How does it work for a student?

If candidate selected questions is used in an Inspera exam, the student sees information above each question showing how many questions to select in total and how many they have already selected. To choose a question to answer, they change the ‘Answering this question?’ drop-down box to ‘Yes’.

Screenshot showing the student view of Inspera, with the option to choose whether to answer a question. Below the question title, text reads ‘Answering this question? 0 of 2 questions selected’, with a drop-down box offering ‘Yes’, ‘No’ and ‘Undecided’.

If a student starts answering a question without changing the ‘Answering this question?’ drop-down box, Inspera automatically changes it to ‘Yes’.

When they have selected the maximum number of questions, the student cannot start answering any more questions. However, if they change their mind about which question(s) they want to answer, they can simply change the ‘Answering this question?’ drop-down to ‘No’ and select a different question instead.
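
To make the selection cap concrete, here is a minimal sketch, in TypeScript, of how a ‘choose m of n’ question-selection cap like this can behave. It is an illustration only, using our own hypothetical names and a simplified model, not Inspera’s actual code or API.

// A minimal sketch of a "choose m of n" question-selection cap.
// Hypothetical model for illustration; not Inspera's implementation.
type SelectionState = "Yes" | "No" | "Undecided";

class QuestionChoice {
  private selections: SelectionState[];

  constructor(totalQuestions: number, private maxSelectable: number) {
    this.selections = new Array<SelectionState>(totalQuestions).fill("Undecided");
  }

  private get selectedCount(): number {
    return this.selections.filter((s) => s === "Yes").length;
  }

  // Mirrors the 'Answering this question?' drop-down: a question can only
  // switch to "Yes" while the cap has not been reached.
  setSelection(index: number, state: SelectionState): boolean {
    const alreadyYes = this.selections[index] === "Yes";
    if (state === "Yes" && !alreadyYes && this.selectedCount >= this.maxSelectable) {
      return false; // cap reached: another question must be deselected first
    }
    this.selections[index] = state;
    return true;
  }

  // Starting to type an answer auto-selects the question.
  startAnswering(index: number): boolean {
    return this.setSelection(index, "Yes");
  }

  // The counter shown to the student: "X of Y questions selected".
  status(): string {
    return `${this.selectedCount} of ${this.maxSelectable} questions selected`;
  }
}

// Example: answer 2 essay questions from a list of 6.
const exam = new QuestionChoice(6, 2);
exam.startAnswering(0);                   // typing auto-selects question 1
exam.setSelection(3, "Yes");              // explicitly select question 4
console.log(exam.setSelection(5, "Yes")); // false: cap of 2 already reached
exam.setSelection(3, "No");               // change of mind frees a slot
console.log(exam.setSelection(5, "Yes")); // true
console.log(exam.status());               // "2 of 2 questions selected"

On this simplified model, only questions in the ‘Yes’ state can carry answers, which matches the marker-side behaviour described next.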

How does it work for a marker?

A marker only sees answers to the questions that a student has chosen to answer.

Because students can only submit answers for the maximum number of questions they are allowed to choose, you can say goodbye to the dilemma of trying to work out which questions to mark when a student has misread the instructions and answered too many questions!

How can I use it in my exam?

The Candidate selected questions function is available when you are authoring a question set for an Inspera digital exam. Find out more in the Inspera guide for Candidate selected questions.

Rubrics for marking

You can now create a rubric to use for marking any manually marked question type in Inspera. Rubrics allow you to build the assessment criteria for an exam question into Inspera, and use them in your marking.

Choose whether you want to use a quantitative rubric to calculate the mark for a question, or a qualitative rubric as an evaluation and feedback tool, and then manually assign the mark.

How to introduce a rubric for your exam

  1. When you are creating the exam question in Inspera, set up the rubric you want to use for marking that question. The Inspera guide to rubrics for question authors explains how to create a rubric and add it to your exam question.
  2. After the exam has taken place, use the rubric to mark the students’ answers.
  3. If you’ve chosen one of the quantitative rubric types, the student’s mark for the question is calculated automatically as you complete the rubric (see the sketch below for how this calculation works). If you’ve chosen a qualitative rubric, use the completed rubric to evaluate the student’s answer and help you decide on their mark for the question.
  4. You can choose to add feedback to the candidate in the box below the level of performance you’ve selected for each criterion (you can see an example of this in the image below).
Screenshot of the Grader view of a sample points-range rubric in Inspera.
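
As a rough illustration of the automatic calculation mentioned in step 3, here is a minimal TypeScript sketch of how a points-based rubric can produce a question mark: the marker selects one level per criterion, and the points attached to the selected levels are summed. The data shapes, names and example points below are assumptions for illustration only, not Inspera’s actual rubric model.

// A minimal sketch of a quantitative (points-based) rubric calculation.
// Hypothetical shapes for illustration; not Inspera's data model.
interface Level {
  label: string;
  points: number;
}

interface Criterion {
  name: string;
  levels: Level[];
}

// The marker selects one level per criterion (by index); a quantitative
// rubric sums the points attached to the selected levels.
function markFromRubric(rubric: Criterion[], selected: number[]): number {
  return rubric.reduce(
    (total, criterion, i) => total + criterion.levels[selected[i]].points,
    0
  );
}

// A hypothetical two-criterion essay rubric.
const essayRubric: Criterion[] = [
  {
    name: "Argument",
    levels: [
      { label: "Developing", points: 2 },
      { label: "Good", points: 4 },
      { label: "Excellent", points: 6 },
    ],
  },
  {
    name: "Use of evidence",
    levels: [
      { label: "Developing", points: 1 },
      { label: "Good", points: 2 },
      { label: "Excellent", points: 3 },
    ],
  },
];

// 'Excellent' for argument (6) plus 'Good' for evidence (2) gives a mark of 8.
console.log(markFromRubric(essayRubric, [2, 1]));

A qualitative rubric would use the same criterion-and-level structure, but leave the final mark for the marker to assign after completing the rubric.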

Want to learn more about using Inspera for digital exams?

Come along to a webinar to learn about creating exam questions or marking in Inspera.

Enrol onto the Inspera Guidance course in Canvas to learn about Inspera functionality at your own pace.

Find out about the process to prepare an Inspera digital exam, and how the Digital Assessment Service can help on the Inspera webpage.

Contact digital.exams@newcastle.ac.uk if you have questions or would like to discuss how you could use Inspera for a digital exam on your module.

Students evaluate using Inspera for 21/22 Digital Exams

Inspera Assessment, the University’s system for centrally supported digital exams, launched for the 21/22 academic year. A key part of understanding how we better use digital exams is to consider ways to improve the student experience of taking a digital exam. Following the launch, the Learning and Teaching Development Service (LTDS) asked for student feedback from those who took a digital exam in 21/22.

142 students submitted their feedback.

Here are our findings:

65% of students were somewhat or very satisfied with their overall experience of taking their exam using Inspera.

Results of the Inspera student evaluation – ‘How satisfied are you with the experience of taking your exam(s) using Inspera?’:

  • Very dissatisfied: 11%
  • Somewhat dissatisfied: 14%
  • Neither satisfied nor dissatisfied: 10%
  • Somewhat satisfied: 30%
  • Very satisfied: 35%

How easy is Inspera to use?

81% of students found starting their Inspera exam somewhat or very easy.

80% of students found handing in/submitting their Inspera exam somewhat or very easy.

When asked to compare a written exam paper and an Inspera paper which included written questions where students could type their answers, 63% of students stated they found it somewhat or much better using Inspera.

Is Inspera better for take home or on-campus PC cluster exams?

85% of students were somewhat or very satisfied with their overall experience of using Inspera for their take home exam(s).

73% of students were somewhat or very satisfied with their overall experience of using Inspera for their PC Cluster exam(s).

Thoughts for the future

Inspera seems to be a hit with students overall; the experience of using it is largely positive, with Inspera take home papers gaining the highest satisfaction scores. PC cluster exam scores also showed that the majority of students were satisfied with their overall experience. Feedback clearly indicated that many students felt re-editing written answers works well in Inspera (and is better than trying to edit paper-based written exams).

The most common concern raised was around plagiarism. LTDS is keen to work with colleagues to alleviate student concerns and ensure that the provision is developed and supported going forward.

LTDS opened its provision for digital exams to all modules, and the number of planned digital exams for 22/23 has increased.

To better support students before their exam, LTDS recommends that students practise with Inspera. Our survey showed 60% of students tried at least one demo before their main exam; we’d like to get that figure up! Practice exams can help with learning to use the tool, and they are accessible via Canvas.

Try it out:

Student Inspera Demo Course