Canvas & Turnitin Assignments: Key Issues

LTDS have offered each School the opportunity to receive a presentation ahead of the new academic year highlighting the key issues that should be considered when deciding how to implement coursework submission and marking procedures.

For Schools and subject areas that are not able to take up this offer, or as a refresher for those that have, a recording of the presentation and the presentation slides are now available.

The presentation includes issues that have occurred across the last academic year that have caused extra workload for colleagues and impacted the student experience, including:

Assignment Types

Canvas or Turnitin Assignment?

Using Similarity Checking

Online assignment submission principles

Assessment and Feedback Procedure

Avoiding common issues

Assignment, online marking and feedback guides

Student assignment submission guidance (ASK Website)

Delegated Marking

Canvas Delegated Marking

Turnitin delegated marking

Moderated Marking

Moderated grading and double blind marking

Where to find help

Canvas orientation

All L&T Workshops and webinars

To discuss any of the issues raised further, or if there are any issues that we have not captured, please contact LTDS@ncl.ac.uk

Buddycheck Updates

There has been a system update to Buddycheck which, alongside some improvements to the student view, has opened up some new functionality when creating evaluations. The major changes that users will notice are described below. User guidance available on the Digital Learning webpages has been updated to reflect these changes.

Creating an evaluation and reusing questions

When creating a new evaluation you will now be asked to add a title before moving to the full evaluation set-up page. There is also now the option to use a previous evaluation as a template; to reuse existing questions in a new evaluation, select an old evaluation as the template.

Buddycheck create evaluation screen with title entry and template selection highlighted

Student introduction

There is now an option to add an introduction to the evaluation for students. This will appear to students before they begin an evaluation, alongside new guidance on the question types included in the Buddycheck evaluation.

Student introduction text entry

Question ordering

Question order can now be updated by using drag and drop. You can preview, edit or remove questions from an evaluation using the appropriate icon.

question ordering alongside edit, preview and delete icons

Adjustment factor cap

It is now possible to set a minimum and maximum value adjustment factor cap for an individual evaluation.

The adjustment factor is the average rating of the student divided by the overall average rating for all members of the team. This is used to adjust the individual student mark.

It is possible to use either the capped adjustment factor or the original factor with no cap applied when deciding final marks.

For more information on how the adjustment factor may impact marks, see the adjustment factor guidance and the adjustment factor Excel example.

adjustment factor amendment options
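To illustrate how the factor and cap interact, the calculation described above can be sketched as follows. This is a minimal illustration of the arithmetic only, not Buddycheck's actual implementation; the function names and parameters are ours.

```python
def adjustment_factor(student_avg_rating, team_avg_rating,
                      min_cap=None, max_cap=None):
    """Adjustment factor: the student's average rating divided by the
    overall average rating for all members of the team, optionally
    clamped to the evaluation's minimum/maximum cap."""
    factor = student_avg_rating / team_avg_rating
    if min_cap is not None:
        factor = max(factor, min_cap)
    if max_cap is not None:
        factor = min(factor, max_cap)
    return factor


def adjusted_mark(group_mark, factor):
    """Scale the group mark by the adjustment factor to give an
    individual student mark."""
    return group_mark * factor
```

For example, a student whose average rating is 4.5 in a team averaging 4.0 has a factor of 1.125, so a group mark of 60 becomes 67.5; with a maximum cap of 1.1 applied, the same student's mark would instead be 66.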

Adding team questions

Alongside the existing ability to create scored questions, it is now possible to create team questions that ask students to answer a question on a five-point scale (strongly agree to strongly disagree) about the team as a whole. Team questions do not contribute to the adjustment factor.

Team question creation screen

Option to ask students to ‘motivate’ peer question score

When creating a peer question it is now possible to ask students to optionally motivate their scores, i.e. provide a comment explaining why they have selected a score for their peer. This is now part of the question itself rather than requiring open questions at the end of the evaluation.

For any queries on these changes please contact LTDS@ncl.ac.uk or see the guidance at the Digital Learning webpages.

Transition to the Digital Exams Service: A Timeline

Following our October 2019 post introducing the digital exam service, we have a progress update and some news about what’s happening next.  Centrally supported digital exam provision (including the OLAF Service, and the Diversifying online exam provision project) is being combined into a single service, and we are reviewing our requirements ready to tender for a system that meets our needs. 

February 2020 

Requirement Mapping Workshops will be taking place. The outcomes of these sessions will help to inform the requirements that we will take to system providers.  All academic and professional services staff with an interest in digital exams are invited to contribute.  Please sign up via the link to have your say! 

March 2020 

Tender for digital exam system (30–35 day response time). A set of final requirements will be issued.

April – May 2020 

Scoring of tender submissions against requirements will take place alongside user testing of software that meets our mandatory requirements.  Look out for updates about how to get involved. 

June – July 2020

A provider will be awarded the contract to supply a digital exam system to the University.  

Following this, work will be undertaken to move as much of the existing digital exam questions and content into the new system as possible.

August 2020 

The new system will be rigorously tested and integrated with University systems. User guidance and training for all stakeholders will be developed.

August assessment period

Any exam deferrals and resits in the August assessment period will need to be completed/submitted in Canvas. The Blackboard license ends on July 31st and from that point no staff or students will be able to access that system.

Schools should adopt the same method of assessment that was used in Semester 2 for any resits/deferrals in the August assessment period. If a Blackboard test was used in the Semester 2 assessment period, then a Canvas quiz should be used in the August assessment period.

If you ran an OLAF exam in Semester 1 you can either deliver the resit using a Canvas quiz or a Turnitin submission.

Information and support is available via the Education Continuity webpages.

September 2020 

Digital Exam Service launches with new software – OLAF is no more. 

All digital exams previously taken in both Blackboard as part of the OLAF service and in WISEflow as part of the Diversifying Online Exam Provision project will be delivered using the chosen software. 

Training will be offered to all academic and professional services staff involved in delivering digital exams, and briefing information will be available for students. 

WebPA Replacement: Buddycheck Pilot

There is a need to replace WebPA as the University's VLE-integrated tool for peer evaluation of group work contribution. WebPA is an open-source tool without dedicated technical support, which leaves the University vulnerable to system failure. There have been a number of bugs and stability issues affecting the usability and reliability of WebPA that have caused disruption for staff and students.

The viability of a number of options has been considered in order to identify a product that has high usability, i.e. a simple workflow for setting up and managing an evaluation, and that offers, as a minimum, the same functionality available in WebPA. Buddycheck has been identified as an option that meets these essential requirements and offers VLE integration. Buddycheck has a simple workflow for setting up evaluations and allows customisation of questions, rubrics, terminology and default settings.

Buddycheck is being piloted during Semester One 2019/20. A number of Semester One modules have signed up to take part but there may be capacity to support more. If you are interested in taking part in this pilot please contact LTDS@ncl.ac.uk with details of the group assessment.

WebPA will remain available in Blackboard until the end of 2019/20 and users will be supported by LTDS; however, the intention is that it will not be available once the University moves to Canvas at the beginning of 2020/21.

OLAF Service Capacity

Due to the success of the OLAF service and the limited capacity to support online exams across LTDS, Exams and Awards and NUIT, the service is currently not available to support new online exams for fewer than 30 students during the assessment periods. New exams for more than 30 students will go onto a waiting list and be considered based on student numbers, with exams with the highest numbers of students being prioritised.

Outside the formal exam period there is no capacity for additional in-semester assessments; however, the OLAF Service will continue to support all online exams that have previously used the service.

Resit exams with fewer than 15 students cannot be supported through the OLAF service. These exams could still be run through Blackboard, although this would be without the use of the locked-down browser or University invigilators. Staff who wish to run such exams online themselves can make use of self-help resources well in advance of the resit assessment periods.

Module Evaluation Results

How and when are results of Module Evaluations received by Academic Staff?

Each module should be evaluated every time it is delivered using the University's module evaluation system, EvaSys. The results are usually sent to academic staff via email in the form of PDF attachments, in one of two ways:

  1.  The survey is set up by local Professional Services staff to automatically send the PDF results upon closure of the survey. This option can be selected during the creation of the survey.
  2.  Local Professional Services staff manually send the results in PDF format from within the EvaSys system at an agreed time. This option can be used if the automatic dispatch is not selected during survey setup.

In both instances the timing of the surveys and the receipt of results should be agreed within the academic unit, paying particular attention to survey close times to allow for discussion of results with senior colleagues if required.

More information regarding Module and Stage Evaluations is available on our webpages.

The Policy on Surveying and Responding to Student Opinion details who is entitled to see results of Module Evaluations.

Postgraduate Loans for Masters Study

Soon LTDS will be asked to complete the postgraduate course information return for the next academic year in relation to Postgraduate Loans. Ahead of the release of the details by the Student Loans Company (SLC) for 2018/19 programmes and applicants, this post serves as a reminder of the funding available to postgraduate students in the current academic year. Note that students who have commenced studies during 2017/18 may still be able to apply for funding and should be directed to the relevant funding body.

Boosting ISB Response Rates

The International Student Barometer is currently open and, as with any survey, there are actions that could be taken to help boost response rates.

Mobile Devices

Actively encourage completion using a mobile device. Most people have at least one mobile device and the ISB Survey can be completed on any device by following the personalised link emailed to students. Wireless access is being continuously improved across campus (as a result of student feedback!) which should make this really easy and convenient.

If possible arrange dedicated information sessions or set aside a brief amount of time at the start or end of timetabled sessions for students to complete surveys on their own devices.

Engage Students

Task student ambassadors or stage reps with encouraging their cohort to take part in surveys by posting on School/Programme social media. Encouraging discussion among student cohorts may lead to positive suggestions for improvement. Announcements could also be made on Blackboard community or module pages.

For all internal and external surveys it is important to ensure examples of improvements made both in house and across the wider University in response to results are communicated to students. Try to highlight what has been achieved at local level in response to past surveys of any kind and direct students to the ‘You Said We Did‘ webpage for examples of how student feedback has helped shape the student experience.

Prizes to be won!

Don’t forget to remind students that in return for their valued opinions, all respondents are entered into a prize draw (see terms and conditions). In 2017, the prizes include:

  • 1st Place prize: 5-inch iPad Pro (one available to win)
  • 2nd Place prizes: iPad mini 4 (two available to win)
  • 3rd Place prizes: £20 Amazon gift card (20 available to win)

What does it matter anyway?

The Student Voice is an essential component of how the University does business. We need to hear about student experiences and work with students to improve the student experience for them and for future students. While feedback can be gathered in other ways, such as through Student-Staff Committees, student surveys give the opportunity to capture data that can be compared easily between academic years and stages. Positive and negative responses are equally important, as we need to know what we do well so it can be rolled out as best practice, and where we can improve to help students have the best experience possible.

The higher the response rate to a survey, the more representative the findings should be.

If you have any queries regarding the ISB or any examples of efforts to boost response rates you would like to share please contact us.

LTDS Working with QAA

QAA/Jisc/HESA Business Intelligence Labs

Over the past few months three members of the LTDS team have undertaken the role of ‘Development Team Member’ as part of a QAA team within the QAA/Jisc/HESA Business Intelligence labs project. The idea is that members of the Higher Education community develop data ‘dashboards’ that analyse existing data in new ways. If deemed of interest to the wider sector, these dashboards may be published on HESA’s HeidiPlus Community Dashboard site.

Using ‘Agile’ methodology and working with colleagues from Durham, Cardiff Met, Queen’s University Belfast and Bournemouth University (alongside support from a QAA ‘scrum master’ and data and Tableau experts), we set out to develop a data dashboard that would allow a university to consider the student journey, value added and learning gain by looking at how different factors affect outcomes, so that support needs, gap areas and the effectiveness of interventions can be identified.

Working remotely with only four face-to-face meetings, the team narrowed down the data sources to HESA and DLHE data to analyse the outcomes of students from different backgrounds and answer the following questions:

  • How can we demonstrate that as an institution we add value to students?
  • Does everyone get the same level of value from studying or do some groups continue to be disadvantaged?
  • How do our outcomes compare with the sector?
  • How do we compare at subject level with our comparators?

The final outcome was a set of data dashboards that can aid an institution to assess their position in terms of adding value. The dashboards were presented to a Jisc/HESA experts group with a voting session at the end. The work produced by the team received strong support and it is hoped that some of it will be earmarked to be made available in Heidi Plus Community Dashboards Beta in 2018.

You can find out more about the Business Intelligence labs project by following the link. If you would like further information regarding the project please contact LTDS@ncl.ac.uk


PRES & PTES: Record Response Rates, High Satisfaction

Record Response Rates

Both the Postgraduate Research Experience Survey and the Postgraduate Taught Experience Survey closed with the highest response rates that Newcastle University has ever achieved at 65% and 57% respectively.

This is a fantastic achievement which would not be possible without the continued support and promotion of the surveys from colleagues across the university, thank you!

High Satisfaction

In terms of Newcastle’s overall satisfaction rate for the PRES, 85% of students agreed with the statement ‘Overall, I am satisfied with the experience of my research degree programme’. This represents an increase of 2% on 2015, and is 3% higher than the Russell Group average.

Also in the PRES, over 90% of students agreed that their supervisor has the skills and subject knowledge to support their research and that they have regular contact with their supervisor that is appropriate to their needs. This represents an increase of 2% on 2016.

In the PTES, early results show that satisfaction has remained high, with over 90% of students agreeing that staff are good at explaining things and are enthusiastic about their teaching.


For further information regarding student surveys please visit our webpage.