Category Archives: Assessment

Online Assignment Submission Principles

In 2014, the University Learning, Teaching and Student Experience Committee agreed a set of principles which stated that all appropriate assessments should be submitted through Turnitin.

Now that we have moved to Canvas as the Virtual Learning Environment, new options for online submission have opened up. Alongside the Turnitin tool, it is now possible to create Canvas assignments, which offer features such as double-blind marking, group submission and moderated marking, while still using the Turnitin similarity checker.

Given the new functionality now available, this is an appropriate time to revisit the principles.  The updated Online Assignment Submission Principles were approved by University Education Committee in August 2020.   

These principles are guidelines for getting the most from online submission, advising that Turnitin similarity checks are carried out on both Canvas and Turnitin assignments. If you allow students to submit multiple drafts, they should not be allowed to see the similarity score unless the assessment is focused on improving the students’ academic writing. Where appropriate, students’ work should be added to the Turnitin repository.

The principles recommend that Schools tell their students when their work will be put through the Turnitin similarity checker.

Full details are available in the Online Assignment Submission Principles document: Online-Assignment-Submission-Principles.pdf

If you require support creating assignments, or using the marking tools, please see our list of Canvas webinars https://services.ncl.ac.uk/digitallearning/canvas/colleagues/training/ 

or the Flexible Learning 2020 Webinar programme https://services.ncl.ac.uk/digitallearning/contactandsupport/dropins/ 

Digital exam system usability testing

Are you a member of academic or professional services staff interested in digital exams?

The digital exam system procurement process is going ahead as planned, and we are making adjustments to enable staff to participate in usability testing while remote working. 

We appreciate that this is a very busy time for colleagues across the University. However, it is necessary to go ahead with usability testing now to support the digital exam system procurement process. If you are interested and have capacity to participate in usability testing, your contribution will be very valuable.

We are looking for volunteers to test digital exam systems, to help assess how user friendly each one is. Testers’ feedback will be a key part of the evaluation stage of the tender process, and will have a direct impact on which digital exam system the University introduces from next academic year.

Usability testing is open to all University staff.  You can choose to test from the perspective of either: 

An exam administrator testing how to create exam settings, and manage marking and moderation processes.  Approximately 90 minutes per system. 

An academic testing how to create exam questions, and carry out marking and moderation.  Approximately 2 hours per system. 

To participate you need to commit to test all of the systems that meet the University’s mandatory requirements, which we estimate may be between 2 and 4 systems.  This is required to ensure that the evaluation process is fair, and we’ll be able to confirm the number of systems being tested the week before the testing begins.   

Full instructions and video demonstrations will be provided for each testing task. You can complete the testing tasks at any time that suits your schedule over the usability testing period from Monday 1 June to Monday 15 June. 

To register your interest in doing usability testing please complete this form by 12 noon on Tuesday 26 May 2020.  Please contact digital.exams@newcastle.ac.uk with any queries. 

Support for online marking

There is an extensive range of new resources available on the Digital Learning website to support you with setting up assignments and marking online.

These include screencasts and TEL Guides that you can work through in your own time, as well as daily webinars and drop-ins if you would prefer some real-time support.

A number of resources cover how to get started with Turnitin and how to get the most out of the marking tools in Turnitin Feedback Studio. There’s a summary of what’s available below.

The website also includes support for other tools, including Blackboard Tests and Numbas, and guidance on using Recap assignment folders for presentations.

Turnitin

Getting started with marking online

For colleagues who want to find out:

  • How to access the Turnitin assignment
  • How to use the main marking tools including bubble comments, inline comments, feedback summary and audio feedback.

Support available


Making use of the additional tools in Turnitin

As well as the tools outlined above you might also be interested in additional marking tools including:

QuickMarks

Comment libraries that can help speed up your marking.

Support available

Rubrics

Rubrics can help the marker provide consistency in marking, and will help students clearly understand what is required to improve on future assignments.


Have a question about marking online?

Come along to an online drop-in session, happening every day, to speak to a member of the team. We can help with questions about applying any of the tools and approaches that support remote delivery of teaching and assessment.

You can also send your questions to LTDS@ncl.ac.uk or to the IT Service Desk.

Transition to the Digital Exams Service: A Timeline

Following our October 2019 post introducing the digital exam service, we have a progress update and some news about what’s happening next. Centrally supported digital exam provision (including the OLAF Service and the Diversifying Online Exam Provision project) is being combined into a single service, and we are reviewing our requirements ready to tender for a system that meets our needs.

February 2020 

Requirement Mapping Workshops will be taking place. The outcomes of these sessions will help to inform the requirements that we will take to system providers.  All academic and professional services staff with an interest in digital exams are invited to contribute.  Please sign up via the link to have your say! 

March 2020 

Tender for digital exam system (30-35 days response time). A set of final requirements will be issued. 

April – May 2020 

Scoring of tender submissions against requirements will take place alongside user testing of software that meets our mandatory requirements.  Look out for updates about how to get involved. 

June – July 2020

A provider will be awarded the contract to supply a digital exam system to the University.  

Following this, work will be undertaken to move as much existing digital exam question content into the new system as possible.

August 2020 

The new system will be rigorously tested and integrated with University systems. User guidance and training for all stakeholders will be developed.

August assessment period

Any exam deferrals and resits in the August assessment period will need to be completed and submitted in Canvas. The Blackboard licence ends on 31 July, and from that point no staff or students will be able to access that system.

Schools should adopt the same method of assessment that was used in Semester 2 for any resits/deferrals in the August assessment period. If a Blackboard test was used in the Semester 2 assessment period, then a Canvas quiz should be used in the August assessment period.

If you ran an OLAF exam in Semester 1 you can either deliver the resit using a Canvas quiz or a Turnitin submission.

Information and support is available via the Education Continuity webpages.

September 2020 

Digital Exam Service launches with new software – OLAF is no more. 

All digital exams previously taken in both Blackboard as part of the OLAF service and in WISEflow as part of the Diversifying Online Exam Provision project will be delivered using the chosen software. 

Training will be offered to all academic and professional services staff involved in delivering digital exams, and briefing information will be available for students. 

Introducing the digital exams service

Building on the solid foundations of OLAF provision, and the successful first two years of the Diversifying and Expanding Online Exam Provision project, the University’s Technology Enhanced Learning Sub-Committee has approved the launch of a new combined Digital Exams service.

The story so far …

Newcastle University’s Online Assessment and Feedback (OLAF) Service has been running high stakes secure online exams using Blackboard’s test tool since 2007/08. The 13 years since that first exam have seen OLAF come of age, supported by well-established institutional processes that ensured all 132 OLAF exams in 2018/19 went smoothly.

In 2017/18 the Diversifying and Expanding Online Exam Provision project was launched, and the first of several new types of digital exam were piloted using software called WISEflow. Bring Your Own Device was introduced, enabling students to use their own laptops to sit a secure digital exam. Alongside this, it also became possible for the first time to move essay and long written answer exam questions from paper to online.


Expanding and diversifying online exam provision

In the 2017/18 academic year the University launched a project that aims to make online summative assessment possible for a wide range of assessment types and in non-cluster environments, one of the objectives in the TEL Roadmap.

The project focuses on two key areas:

  • Expanding the types of online exams that we can deliver here at Newcastle
  • Exploring the possibility of students using their own laptops to take secure online exams


Student views on feedback forms

To find out more, a student intern working with staff in LTDS evaluated existing feedback forms and gathered opinions from students to identify what works and what could be improved. The project considered a total of 66 forms from 19 different schools, and included focus groups and interviews with individual students.

What did they find?

Here are a few key findings; you can find full details in the project report.

Form Design

Have clear, separate sections showing:

  • Strengths and areas for improvement
  • Clear advice for future work

Only use tick boxes for objective areas of the marking criteria, such as grammar. When tick boxes were used for subjective areas, such as argument, students found this unhelpful.

Look at your feedback forms and consider whether these should be redesigned. Consult with the students in your school as part of the process.

Utilising the form

Type feedback wherever possible.

Introduce structured opportunities to help students understand:

  • expectations of the marking criteria
  • the ways in which this is reflected in the feedback sheet

Discuss how you use marking sheets with your colleagues. Try to develop a consistent approach to:

  • the volume of feedback
  • the use of notes in margins

For more information get in touch with LTDS@ncl.ac.uk

Transforming Assessment Webinars

Dr Mathew Hillier (Monash Education Academy, Monash University, Australia) and Professor Geoffrey Crisp (PVC-Education, University of New South Wales, Australia) will be hosting a series of webinars over the coming months. These focus on transforming assessment, with topics such as digital literacy, written and audio feedback, and blended simulation-based learning. Take a look at the further details below.

Turnitin UK Academic Integrity Summit 2017

I recently attended the Turnitin UK Academic Integrity Summit 2017, held in Newcastle upon Tyne. This was a very timely conference, following the release of the QAA report into contract cheating. I was concerned that it would be a day-long sales pitch from Turnitin, but was pleasantly surprised to find the opposite. There were many presentations from institutions around the world, and very little ‘grandstanding’ from Turnitin.

Stephen Gow, Academic Integrity Coordinator, University of York

The first session I attended was a look at the approach from the University of York towards academic integrity. They discussed the importance of the language used at the University, moving away from terms such as “plagiarism” towards “academic integrity”. All their students have a mandatory academic integrity online tutorial they must complete in Semester 1 of Stage 1. They are working closely with the student union on their “integrity week” and are also working more closely with staff, including on their Postgraduate Certificate in Academic Practice (PGCAP).

Turnitin Data Workflows

The second session was a discussion session with the Turnitin staff exploring the types of data and statistics institutions would like to get out of Turnitin. This included reports on feedback return time, statistics around number of students receiving extensions, archiving, learning analytics, and reporting on the various functions used. We hope Turnitin will use this in the further developments of the software.

Bill Loller, Turnitin

The third session was facilitated by Bill Loller, Chief Product Officer at Jobvite, who is working on a technical solution to expose contract cheating for Turnitin. They are using expertise from the field of forensic linguistics to develop a product. Forensic linguistics may be used in a court case to determine whether a person did, or did not, write a document. They are currently testing their modelling and developing a report that will provide a confidence score.

Bill continued this theme into a larger session with the group, showing some of the contract cheating/essay mills websites prevalent online. He admitted that Turnitin may have helped with this problem – “crack down on plagiarism and students will look elsewhere”. These websites offer 10,000 words for approximately £300.

Simon Bullock, QAA

Simon Bullock from the QAA was next to discuss his recent publication “Contracting to Cheat in Higher Education – How to Address Contract Cheating, the Use of Third-Party Services and Essay Mills.” He discussed the risks to the public if students obtain their degrees through cheating, noting that despite attempts it is not yet illegal to offer essay mill services online. The QAA is exploring as many non-legislative methods as possible.

Irene Glendinning, Coventry University

Irene Glendinning of Coventry University presented her research work analysing the impact of policies for plagiarism in Higher Education across Europe. She highlighted the UK and Ireland as being some way ahead of many other countries in Europe. They have developed an academic integrity maturity model, a tool to compare the results of the impact analysis across 27 EU member states.

Cath Ellis, New South Wales

The presentation that had the most impact on me was from Cath Ellis of the University of New South Wales. Cath reported that too many decisions are being informed by anecdotal information, and not enough by hard data.

To find out how many students are using contract cheating services, Cath asked them anonymously. Of the 14,096 students surveyed, around 6% (n=814) admitted to cheating in some form during their programme. The vast majority of this cheating takes the form of assistance from other or former students; it is not commercially driven. Students in the cheating group were less likely to think cheating is wrong, although there was no discernible difference on this point between English-speaking and non-English-speaking students. The study also found that cheating goes up when students perceive a lot of opportunities to cheat, and when they are dissatisfied with the teaching environment.

Cath discussed the need for students to have “ethical fitness”: we should not try to remove every opportunity to cheat, because students need to learn to be ethical.

She then discussed the various types of contract cheating and reviewed some of the typical websites.

Assessment design is widely advocated as a possible solution to contract cheating, but Cath argued that this is a myth. We should not change our assessment design because of a small percentage of cheaters. Reduced assessment time (shorter deadlines) will actually drive students towards essay mills.

Cath noted that we are not having the right conversations with students, and advised us to discuss contract cheating with them. Part of the study compared perceptions of how prevalent contract cheating is with perceptions of how damaging it is. Students in the cheating group thought that a lot of students were doing it and that it was not very serious. Staff thought the opposite: that it was not very common, but very serious. Students in the non-cheating group followed the same pattern as the cheating group, believing it to be widespread but not very serious.

Professor Phil Newton – Swansea University

The last presentation was given by Professor Phil Newton from Swansea University. He presented various research projects that explored academic integrity.

I found the event extremely useful and I have reflected since on the way Newcastle University approaches academic integrity. The presentation from Cath Ellis convinced me that we should not be changing any approaches to assessment to attempt to counter the small number of cheating students, but we should be minimising their opportunities to cheat. We also need to be having more conversations with staff and students about the promotion of academic integrity, and the impact contract cheating could have on their career.


Assessment & Feedback Event – 1st Nov – all welcome

Assessment and feedback continue to be a source of student dissatisfaction across the sector. In particular, student surveys highlight concerns about the alignment of feedback to marking criteria, and inconsistencies in both the application of criteria and the quality of feedback received.

The HaSS Faculty will be holding a workshop, to discuss these issues in the context of student transitions, on:

Wednesday 1st November 2017
1300-1600 (lunch will be provided)
Lindisfarne Room, Hadrian Building

Though hosted by HaSS, anyone interested in this event from any faculty is welcome to attend – please feel free to disseminate details of this event to your colleagues.

The session will explore the following questions:

  • How can we better understand assessment and feedback in the context of student transitions?
  • What are the assumptions inherent in our assessment criteria and feedback?
  • Can electronic assessment and feedback tools enhance students’ academic literacy?
  • Can we develop a ‘student as partners’ approach in assessment and feedback?

It will be particularly relevant for Degree Programme Directors and Module Leaders, whose input will help identify some key priorities for further action.

It will include presentations from Rowan South (Education Officer, Newcastle University Students’ Union), Graeme Redshaw-Boxwell (Learning and Teaching Development Service) and Sarah Graham (Combined Honours).

To attend, please complete the booking form.

If you have any queries in the run up to the event, please contact susan.mclean@ncl.ac.uk.