As part of our Assessment and Feedback Sprint Series, a small team of students and colleagues has been investigating the question:
How do we articulate a meaningful programme experience that ensures a cohesive assessment journey for all of our students?
Feedback (Stage Surveys, NSS, etc.) tells us that students and colleagues struggle to see assessments from a programme perspective. This disconnection can leave students feeling that assessment isn’t part of a wider programme, and that their skills and feedback don’t link across modules and assessments.
Being able to visualise the assessment journey across a stage or programme is important because, as one colleague said,
“An assessment journey builds confidence in the education (and the education provider) and underscores the importance of each individual assessment towards an overarching goal. Articulation of assessment journeys allows for broader reflection and helps explain the skill development (rather than focussing on siloed, module specific content).”
An overview of some of the visuals we found within Newcastle University and at other HE institutions is shown below. In summary, we found a range of approaches, often highlighting the ‘journey’ through the stage or programme, making it easier for students to reflect on progress.
What have we created?
Using these findings, we created some template visuals, which were then validated by colleagues and students, incorporating feedback from our first showcase.
We decided to create a variety of templates to reflect diverse practices/skillsets across programmes and areas. Some are more suitable for Semester-based programmes and others for block-taught programmes.
We started by looking at a standard linear stage one programme – V400 BA Archaeology. We initially had a large amount of text on the visual explaining each assessment and how it aligned to the wider programme learning objectives. However, it quickly began to look overwhelming.
We then started to explore using H5P as a way to keep the visual relatively simple while incorporating pop-up boxes to make it more interactive and engaging. The version below has dummy text – click on the question marks to see how it would work.
We also considered how to visually represent a block-taught postgraduate programme, and incorporated feedback from a Degree Programme Director (DPD) to represent larger-weighted modules with bigger circles. The DPD said this would be a useful tool for both staff and students, including at recruitment and induction events.
The intention is that these editable templates will be useful for both students and programme teams to visualise assessment across a programme or stage. The visual could be produced as part of a workshop reviewing programme level assessment or could be a standalone tool designed to be student-facing.
Find out more about our Sprint
We presented our Sprint adventures at the Sprint Showcase event on Friday 10 March, and you can watch the recording here:
To find out more about the Assessment and Feedback Sprint Programme contact Conny.Zelic@ncl.ac.uk in the Strategic Projects and Change Team.
Inspera assessment is the University’s system for centrally supported digital exams. Inspera can be used for automatically marked exam questions, for manually marked question types including essays, or for exams with a combination of both.
New functionality has recently been launched that enables colleagues to do more with digital written exams.
For example, in an exam where students must answer 2 essay questions from a list of 6, you can set this up so that each student can choose a maximum of 2 questions to answer.
How does it work for a student?
If candidate-selected questions are used in an Inspera exam, the student sees information above each question showing how many questions to select in total, and how many they have already selected. To choose a question to answer, they change the ‘Answering this question?’ drop-down box to ‘Yes’.
If a student starts answering a question without changing the ‘Answering this question?’ drop-down box, Inspera automatically changes it to ‘Yes’.
When they have selected the maximum number of questions, the student cannot start answering any more questions. However, if they change their mind about which question(s) they want to answer, they can simply change the ‘Answering this question?’ drop-down to ‘No’ and select a different question instead.
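For the curious, the selection rules described above can be modelled in a few lines of code. This is purely an illustrative sketch of the behaviour, not Inspera’s implementation, and all names here are hypothetical.

```python
# Illustrative sketch (not Inspera's code) of the candidate-selected
# questions rules described above. All names are hypothetical.

class CandidateSelectedQuestions:
    def __init__(self, question_ids, max_selections):
        self.max_selections = max_selections   # e.g. answer 2 of 6
        self.selected = set()                  # questions marked 'Yes'
        self.question_ids = set(question_ids)

    def set_answering(self, qid, answering):
        """Change the 'Answering this question?' drop-down."""
        if answering:
            if len(self.selected) >= self.max_selections:
                # Mirrors the UI: no more questions can be started
                raise ValueError("Maximum number of questions already selected")
            self.selected.add(qid)
        else:
            self.selected.discard(qid)         # deselecting frees a slot

    def start_typing(self, qid):
        """Typing an answer auto-selects the question, as Inspera does."""
        if qid not in self.selected:
            self.set_answering(qid, True)

    def status(self):
        return f"{len(self.selected)} of {self.max_selections} questions selected"
```

For example, with `max_selections=2`, typing into a third question raises an error until one of the first two is switched back to ‘No’ – the same behaviour a student sees on screen.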
How does it work for a marker?
A marker only sees answers to the questions that a student has chosen to answer.
As students can only submit answers for the maximum number of questions they are allowed to choose, you can say goodbye to the dilemma of working out which questions to mark when a student has misread the instructions and answered too many!
You can now create a rubric to use for marking any manually marked question type in Inspera. Rubrics allow you to build the assessment criteria for an exam question into Inspera, and use them in your marking.
Choose either a quantitative rubric, which calculates the mark for a question, or a qualitative rubric, which acts as an evaluation and feedback tool while you manually assign the mark.
How to introduce a rubric for your exam
When you are creating the exam question in Inspera, set up the rubric you want to use for marking that question. The Inspera guide to rubrics for question authors explains how to create a rubric and add it to your exam question.
If you’ve chosen one of the quantitative rubric types, the student’s mark for the question is calculated automatically as you complete the rubric. If you’ve chosen a qualitative rubric, use the completed rubric to evaluate the student’s answer and help you decide on their mark for the question.
You can choose to add feedback to the candidate in the box below the level of performance you’ve selected for each criterion (you can see an example of this in the image below).
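To illustrate how a quantitative rubric can derive a question mark, here is a minimal sketch. The criteria, levels and points are invented examples, and Inspera’s own calculation options may differ, so treat this purely as an illustration of the idea.

```python
# Hypothetical points-based quantitative rubric: each criterion has
# levels of performance, each worth a number of points.
rubric = {
    "Argument":  {"Excellent": 10, "Good": 7, "Developing": 4},
    "Evidence":  {"Excellent": 10, "Good": 7, "Developing": 4},
    "Structure": {"Excellent": 5,  "Good": 3, "Developing": 1},
}

# The level the marker selected for each criterion of one answer
selected_levels = {"Argument": "Good", "Evidence": "Excellent", "Structure": "Good"}

# The question mark is the sum of the points for the selected levels
mark = sum(rubric[criterion][level] for criterion, level in selected_levels.items())
maximum = sum(max(levels.values()) for levels in rubric.values())
print(f"Question mark: {mark} / {maximum}")   # Question mark: 20 / 25
```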
Want to learn more about using Inspera for digital exams?
Come along to a webinar to learn about creating exam questions or marking in Inspera.
Inspera Assessment, the University’s system for centrally supported digital exams, launched for the 21/22 academic year. A key part of understanding how we better use digital exams is to consider ways to improve the student experience of taking a digital exam. Following the launch, the Learning and Teaching Development Service (LTDS) asked for student feedback from those who took a digital exam in 21/22.
142 students submitted their feedback.
Here are our findings:
65% of students were somewhat or very satisfied with their overall experience of taking their exam using Inspera.
How easy is Inspera to use?
81% of students found starting their Inspera exam somewhat or very easy.
80% of students found handing in/submitting their Inspera exam somewhat or very easy.
When asked to compare a handwritten exam paper with an Inspera paper that included written questions where they could type their answers, 63% of students said they found using Inspera somewhat or much better.
Is Inspera better for take home or on-campus PC cluster exams?
85% of students were somewhat or very satisfied with their overall experience of using Inspera for their take home exam(s).
73% of students were somewhat or very satisfied with their overall experience of using Inspera for their PC Cluster exam(s).
Thoughts for the future
Inspera seems to be a hit with students overall: the experience of using it is largely positive, with Inspera Take Home papers gaining the highest satisfaction scores, and a clear majority of students satisfied with their PC Cluster exam experience. Feedback clearly indicated that many students felt re-editing written answers works well in Inspera (and is easier than trying to edit paper-based written exams).
The most common concern raised was around plagiarism. LTDS is keen to work with colleagues to alleviate student concerns and ensure that the provision is developed and supported going forward.
LTDS opened its provision for digital exams to all modules, and the number of planned digital exams for 22/23 has increased.
To better support students before their exam, LTDS recommends that students practise with Inspera. Our survey showed 60% of students tried at least one demo before their main exam; we’d like to get that figure up! Practice exams help with learning to use the tool and are accessible via Canvas.
In September 2021 we will be launching a new system for centrally supported digital exams, called Inspera Assessment. Implementing the system will enable the Digital Exam Service to:
Deliver secure locked down present-in-person exams on University computers and students’ own laptops, monitored by University invigilators
Ensure that digital exams are accessible to all our students, and enhance the student experience of exams
Increase the University’s digital exam capacity in the long term
Enable more authentic exams by introducing new functionality
New exam types possible with Inspera will include:
Students taking written exams online, by typing their answers on computer, and incorporating drawings or written calculations done on paper into their online answers where needed.
Allowing access to specific online resources or applications during a secure exam, using allow-listing functionality.
Introducing Inspera is a big step forward for education, assessment and feedback at Newcastle University. Adopting a specialist digital exam system allows us to do much more than would be possible if we continued to use the Virtual Learning Environment for digital exams.
Choosing a digital exam system
Inspera has been selected as our digital exam system following a rigorous procurement process, which began with requirements mapping workshops in February 2020, attended by over 60 academic and professional services staff. The procurement was postponed for a year as a result of the global pandemic, and restarted in semester 2 2020/21 when colleagues had the opportunity to feed in any new or updated requirements via an online survey.
Once the tender was issued, key digital exams stakeholders contributed to a rigorous evaluation process to decide on the system that best fitted our requirements. Students and staff were invited to volunteer for usability testing of each system that met the mandatory technical requirements. The team are very grateful to the 36 colleagues, and 13 undergraduate and postgraduate students, who completed a total of approximately 150 hours of usability testing between them!
Inspera scored the highest overall for both usability, and for technical and functional requirements.
Rolling out Inspera
As standard, all 2021/22 modules that have a present-in-person digital exam in MOFS will use Inspera. If the public health situation requires it, these modules will be able to use the system for open book take home exams.
The system will be available for additional new digital exams from 2022/23 onwards. There will be opportunities in the coming months to see demonstrations of the software, and learn more about the new types of assessment that it makes possible. If you would like to learn more now, please contact firstname.lastname@example.org.
How to get started
The Digital Exams Service team will contact all 2021/22 module teams with a digital exam in their MOF at the beginning of September, with details of the process for preparing their exam.
Training will also launch in September 2021, and all colleagues who will be using Inspera in the new academic year are encouraged to sign up.
Online resources to help students prepare for a digital exam will be published in September, and students will also be able to try out a demo exam in Inspera to help familiarise themselves with the system.
If you are interested in introducing a new digital exam using Inspera in future, or if you have any queries about a 2021/22 digital exam, please contact email@example.com.
Assessment guidance for students is available, including how to submit an assignment and advice about accepted file types and file size that will help answer student queries. This page can be shared with students as part of assessment instructions.
It is important that module teams agree which assignment type to use before it is set up in Canvas, and that marking is done in the correct tool: SpeedGrader (link to Canvas Orientation course) must be used for a Canvas Assignment, and Turnitin Feedback Studio (link to screencast) must be used for a Turnitin Assignment.
When an assignment is created, the maximum number of marks available (for example 100) must be entered in the Points field. The points should never be set as zero, as this causes technical issues.
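For colleagues who create assignments programmatically, the same rule applies when using the Canvas REST API. The sketch below is a minimal, hypothetical example: the URL, token and course ID are placeholders, and you should check your institution’s Canvas API access arrangements before trying it.

```python
# Minimal sketch: creating a Canvas assignment via the REST API with the
# Points field set. URL, token and course ID below are placeholders.
import requests

CANVAS_URL = "https://your-institution.instructure.com"  # placeholder
TOKEN = "YOUR_API_TOKEN"                                 # placeholder
COURSE_ID = 12345                                        # placeholder

payload = {
    "assignment[name]": "Essay 1",
    "assignment[points_possible]": 100,  # never zero: zero points causes issues
    "assignment[submission_types][]": "online_upload",
}

resp = requests.post(
    f"{CANVAS_URL}/api/v1/courses/{COURSE_ID}/assignments",
    headers={"Authorization": f"Bearer {TOKEN}"},
    data=payload,
)
resp.raise_for_status()
print(resp.json()["id"])  # ID of the newly created assignment
```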
Are you a member of academic or professional services staff interested in digital exams?
The digital exam system procurement process is going ahead as planned, and we are making adjustments to enable staff to participate in usability testing while remote working.
We appreciate that this is a very busy time for colleagues across the University. However, it is necessary to go ahead with usability testing now to support the digital exam system procurement process. If you are interested and have capacity to participate in usability testing your contribution will be very valuable.
We are looking for volunteers to test digital exam systems, to help assess how user friendly each one is. Testers’ feedback will be a key part of the evaluation stage of the tender process, and have a direct impact on which digital exam system the University introduces from next academic year.
Usability testing is open to all University staff. You can choose to test from the perspective of either:
An exam administrator testing how to create exam settings, and manage marking and moderation processes. Approximately 90 minutes per system.
An academic testing how to create exam questions, and carry out marking and moderation. Approximately 2 hours per system.
To participate you need to commit to test all of the systems that meet the University’s mandatory requirements, which we estimate may be between 2 and 4 systems. This is required to ensure that the evaluation process is fair, and we’ll be able to confirm the number of systems being tested the week before the testing begins.
Full instructions and video demonstrations will be provided for each testing task. You can complete the testing tasks at any time that suits your schedule over the usability testing period from Monday 1 June to Monday 15 June.
Following our October 2019 post introducing the digital exam service, we have a progress update and some news about what’s happening next. Centrally supported digital exam provision (including the OLAF Service, and the Diversifying online exam provision project) is being combined into a single service, and we are reviewing our requirements ready to tender for a system that meets our needs.
Requirement Mapping Workshops will be taking place. The outcomes of these sessions will help to inform the requirements that we will take to system providers. All academic and professional services staff with an interest in digital exams are invited to contribute. Please sign up via the link to have your say!
Tender for digital exam system (30-35 days response time). A set of final requirements will be issued.
April – May 2020
Scoring of tender submissions against requirements will take place alongside user testing of software that meets our mandatory requirements. Look out for updates about how to get involved.
June – July 2020
A provider will be awarded the contract to supply a digital exam system to the University.
Following this, work will be undertaken to move as much of the existing digital exam question content into the new system as possible.
The new system will be rigorously tested and integrated with University systems. User guidance and training for all stakeholders will be developed.
August assessment period
Any exam deferrals and resits in the August assessment period will need to be completed/submitted in Canvas. The Blackboard licence ends on 31 July, and from that point no staff or students will be able to access that system.
Schools should adopt the same method of assessment that was used in Semester 2 for any resits/deferrals in the August assessment period. If a Blackboard test was used in the Semester 2 assessment period, then a Canvas quiz should be used in the August assessment period.
If you ran an OLAF exam in Semester 1, you can deliver the resit using either a Canvas quiz or a Turnitin submission.
Phil is an authority on assessment and is widely published, including the excellent “The Lecturer’s Toolkit”.
Sally is Emerita Professor at Leeds Metropolitan and regularly keynotes at educational conferences. Sally developed the National Teaching Fellowship scheme while working at the Higher Education Academy.
Building on the solid foundations of OLAF provision, and the successful first two years of the Diversifying and Expanding Online Exam Provision project, the University’s Technology Enhanced Learning Sub-Committee has approved the launch of a new combined Digital Exams service.
The story so far …
Newcastle University’s Online Assessment and Feedback (OLAF) Service has been running high stakes secure online exams using Blackboard’s test tool since 2007/08. The 13 years since that first exam have seen OLAF come of age, supported by well-established institutional processes that ensured all 132 OLAF exams in 2018/19 went smoothly.
In 2017/18 the Diversifying and Expanding Online Exam Provision project was launched, and the first of several new types of digital exam were piloted using software called WISEflow. Bring Your Own Device was introduced, enabling students to use their own laptops to sit a secure digital exam. Alongside this, moving essay and long written-answer exam questions from paper to online became possible for the first time.