Learning from Sprints

Hurrah! Our teams have recently added new assessment resources to the Effective Practice branch of our Teaching and Learning site.

Both of these draw on the outputs and findings from our Assessment and Feedback Sprints. I was delighted to be part of one of these, bringing together student, academic and PS colleagues to tackle common issues that students experience with assessment.

In this post I can fill you in on the background to the new resources.


Curriculum conversations

This week I am working with colleagues on our fifth Assessment and Feedback Sprint, in which we are asking the question:

How do we articulate a meaningful programme experience that ensures a cohesive assessment journey for all of our students?

We have been looking back at student feedback, surveying DoEs, looking at external HEIs and interviewing colleagues and students.

Here are a few of my thoughts on the hoof:

Programme journey and assessment journey

It is difficult to articulate the assessment journey separately from the programme journey – the two are inevitably linked. (This is no real surprise given that assessment drives learning.) If assessment evidences the skills and knowledge that the programme is designed to develop, then these are two sides of a single coin.

“My” vs “Our”

One of the things that we (LTDS) are super keen to do when we facilitate programme design/redesign is to start with the vision. There will be post-its, doodles, maybe even Lego – all props to provoke conversations around “What is this programme for?” and “What values will it embrace and reflect?”. Our goal here is twofold – defining and articulating these things is valuable, but more importantly we can begin to see a shift in language. “The programme” becomes “our programme”.

At small scale, I’ve seen this clearly with our MOOC work – the lead educator may have the initial idea, but as others contribute to it and shape it, it becomes our shared venture. It is all the more important at curriculum/programme level, where the number of contributors is much greater. Once there is a shared, co-created vision, the inevitable negotiations around content and compromises will be easier.

We benefit when we involve students (as peers) early in these conversations – they keep it real and grounded.

Who owns modules?

Do module leaders own modules, or are modules ultimately owned by the programmes they serve? You would hope it is the latter, but our survey responses suggest tensions in this area. If module leaders don’t have a clear idea of how their module contributes to the whole then they are not going to be able to share any clarity with their students.

Information, timing, trust?

A university programme is a supported, curated journey designed to meet particular aims. By signing up to come here, students are placing a level of trust in the programme, the teaching and the University to support their success. We do a great job articulating the programme and skills at open days, but after that I don’t have a sense of what level of detail is valuable. There’s lots of information out there, but let’s not get into TL;DR / overload.

  • Programme specifications/regulations are not student-friendly – only the brave will read these.
  • Induction has to focus on “need to know” for the first few weeks.
  • Students are really vocal over the basics “what are my assessment deadlines” – finding this out can be a steeplechase for some. Consistency and conventions help everyone.
  • Ongoing references to how elements of modules feedforward, mentioned in the moment, will feed into trust, but presuppose that module teams have a clear understanding of the programme and role of their module.
  • Visuals and infographics that we have seen so far reduce what is a complex journey of content, skills, knowledge, assessments and feedforward to something simple – I wonder whether, in doing so, they still have value.
  • Not all students will wish to engage with this sort of information; we shouldn’t beat ourselves up if they don’t.

So, at this point, I have more questions than answers – I gather this is normal! We are halfway through – clarity will come :-).

Our Sprint Team will be presenting our findings at the end of next week – do join us on Friday 10, 1:30-2:30, on Teams.

The ongoing dialogue between the problem and the solution…

We have been talking and thinking about how messy design is. It’s so easy to present it as a linear, procedural activity. But the reality is that it is backwards and forwards.

Our mentor @Jannah Aljafri kindly pointed us to the Double Diamond process (Design Council). There’s an exploration phase – lots of divergent ideas around “What is the problem we are trying to fix?” and “How are we hoping to add value?” – before coming to a shared understanding and definition of “the thing”, followed by yet more exploration in the develop stage.

As the Design Council say, “this isn’t a linear process”.

What might this look like from a L&T perspective?

[Image: Discover and Design stages with the MOF in the middle]

Maybe in our context the skeleton MOF could be thought of as the definition.

In the develop stage, experience tells us that:

  • videos will be started and end up on the cutting room floor
  • technologies will be abandoned and new ones adopted
  • new requirements or stakeholders will appear
  • prerequisites will emerge….

But, what about moving back and forth between the two diamonds?

When online or blended approaches are integral we may have time to get feedback on samples of the approach, but more often than not we are up against it to develop and deliver modules – laying the tracks just in time.

Unless there is a healthy design window it can be hard in practice to return to discover/define — changing MOFs brings with it variable amounts of friction depending on what has been published and the nature of the change. Ideally we need time for exploration, experiments and feedback early on.

Planning for evaluation

At our fourth bootcamp workshop we had a masterclass from the OU team on how they designed in evaluation as part of the module planning. Student feedback, a student reference group, analytic dashboards (and custom reports), and tutor reports come together to build a picture of how the module performs and to inform decisions about the following presentation.

During our Jamboard exercise we thought through some of the ways we could plan evaluation into our SML module. Here we aren’t dealing with students at a distance; instead there will be lots of opportunities to actually see and hear how they are getting on.

While we don’t have a reference group, we do have plenty of ways to gather feedback:

  • reviews of content pre-run
  • regular informal in-class check-ins
  • end of module survey / focus group
  • student reflections on the process and what they are learning
  • views from student reps via Student Voice / Student Staff Committees

Thinking about this further, it’s obvious that one size doesn’t fit all. Yes, there is a need for standard questions in centrally run surveys, but you need to know what the module is about to design how to evaluate it effectively. And, in many respects, where our focus is on skills development, the “mentor” role of the academic leads will give an immediacy to feedback and permit in-flight corrections.

Parameters for an NCL Bootcamp

We have also begun the process of thinking about how we gather up our learning to present an in-house variant of the Bootcamp. Some things are clear:

  • it needs to reflect our focus as a predominantly campus based university (blended is normal, online is rare)
  • tools and techniques need to complement and extend our existing module approval processes
  • any learning design frameworks or approaches need to be easy to pick up and easy to pass on (i.e. Made to Stick)
  • centred on a mentored or action learning approach – a supported journey
  • it needs a pacemaker – a structure and metronome to enable teams to complete the design (too fast and we will drop people, too slow and people will disengage)
  • it needs to be offered as a pilot, and developed with feedback.
  • it needs to be flexible enough to support our gloriously diverse range of disciplinary cultures

We have lots of options. In some ways this is freeing – we can take the best of our current practice, add in elements that are helpful, and document other approaches to come at the design task from a different angle.

However, if we are to develop something that could have traction we need input from stakeholders, and that is what is next on the agenda.

Storyboards, episodes and patterns

One of the really interesting ideas from last week’s “Developing your Design” bootcamp was that of considering an “episode level design” between the module level design and the detail of the activities. The episode could represent a week (or maybe a fortnight) and it could have one or more design patterns giving a rhythm and predictability to teaching.

Thinking about some of our examples, a pattern for an episode (a week) could be something like:

  • Big question
  • Unpacking theory and practice
  • Group activity
  • Q&A
  • Discussion and reflection

Once learning outcomes are authored for each episode, then tools like ABC activity cards, CoDesign cards, or the OU’s Activity cards can help to structure each element of the pattern into a set of tasks and content (e-tivities) geared towards meeting the learning outcomes.

But, before learning outcomes can be in the driver’s seat, we were reminded of the importance of carefully crafting them – with active verbs, defining a level – so we can evaluate whether the proposed activities will work, i.e. enable students to meet these learning goals.

I’ve really enjoyed working with ABC as a storyboarding tool, but am aware that people can get confused about the level at which they are working. Some are happy to abstract – “this is the pattern for weeks 2-6” – but in other situations I’ve found participants getting bogged down in the detail of what week 3 will contain and, in the time-constrained workshop, unable to see the sweep of the module. As a facilitator, you do encourage participants to work at the overview level, but placing an emphasis on a pattern for the week, as a prelude or additional step, may well help.

Lots to think about…

Design is….

Top down? iterative? collaborative? creative? hard work? systematic? hard work? stimulating? fun? messy?

This week my head has been full of design. We’ve had a Bootcamp workshop on Developing your Design and I’ve had some healthy conversations with colleagues as we scope out a NEPS unit on Programme Design. From where I sit now, all the descriptions above apply.

I’m intrigued by the fact that there is no one answer on how to design learning. We’ve been pointed to the Learning Design Family Tree – demonstrating the ongoing evolution of approaches (and tools), and at curriculum level Mick Healey collates a treasure trove of approaches.

Constructive alignment provides a sturdy skeleton for both programmes and modules, but it needs to be clothed, and we need other perspectives to inform and evaluate the many potential answers to “how?”. I had a chuckle when I read Jenkins’ Ouija board metaphor describing how curriculum design is influenced and shaped by forces: assessment as learning, student time, pedagogy, costs and resources, subject benchmarks… these are all super relevant. But rather than seeing them as forces, I view these as helpful factors which enable us to work within a smaller design space and provide us with insights that help us iterate towards better design.

For both of the tasks I’ve referred to (our Bootcamp module and the NEPS unit) we are at the slightly messy idea-generating stage, but importantly, the conversations that we are having now hold the aims and values of both. Convergence isn’t that far off.

Of courses and resources

Last year I was part of a team that authored our Flexible Learning 2020 (FL2020) course – 11 topics on how to rethink teaching and learning in the shadow of a global pandemic, while changing VLE from Blackboard to Canvas.

Canvas’s page-view stats confirm that the course had a relatively short shelf life. What do we do with it now – do we replace it with another course or a set of resources? Is one approach better than the other? I made a table…

Course

  • It has a beginning, a middle and an end.
  • There’s some kind of feedback (formative, auto-marked) or it is moderated.
  • There is motivation (intrinsic or extrinsic) to complete it.
  • Once complete there is little motivation to return to it, apart from reference.
  • There may be an idea of a cohort progressing through it at key times.
  • It’s designed to take participants from a defined level of knowledge/skill to a more advanced place.
  • Ideally, it contains activities for participants to do with the information.

Resource

  • Something that’s designed to work just-in-time.
  • Signposts further resources and information.
  • Designed to be searchable – jump in at any point.
  • Visited multiple (short) times.
  • Digestible chunks – works on the web.

Now that we are in Semester 2, it’s clear that the questions we are asked are not ones that our FL2020 course answers. We have all moved on. There’s a temptation to add more content for the intermediate audience, but we know this will make everything harder to find and our sense is that what will now be the most valuable is a set of searchable resources.

Coincidentally, the University of Kent have been running a series of “Digitally Enhanced Education Webinars” and I stumbled on Dominik Lukes’ presentation “What should educators know: User interface and User Experience”. I was struck by his description of how design needs to be aimed at intermediate users (with routes in for beginners). 2020’s collective baptism-into-blended leaves us with very different mental models than we had pre-Covid – and being reminded of the basics is plain annoying.

From this perspective what’s sensible now is to retire the course, it did a reasonable job, but the scaffolding isn’t needed any more. We can pull out some nuggets into shorter help guides, articles and case studies that colleagues can find more easily.

Persona Journeys

In this week’s bootcamp session we used a couple of techniques to put students at the heart of design.

The first was a “Student Profile” in which we imagined a “typical student” and noted motivations, expectations, enablers and barriers to study, alongside their educational background and experiences.

In UX domains these might be called personas, becoming the foundations of user stories and keeping users at the centre of design. I was introduced to the idea a while back and have found personas really helpful in curriculum design projects to date, particularly when thinking about vision, values and teaching methods.

We might suggest coming up with a few different personas – maybe an international student, a home student, one with some access needs. The personas can be informed by student feedback, real students, NSS feedback etc – and these imaginary people can be “walked through the design”. Although imaginary, our personas can take on a life of their own — when we were designing HSS8002 we asked questions like “which options would George take?”

We can also use personas to design from the future, by imagining graduates, say three years post-graduation. What did they most value about their programme? What job are they doing? What were the most valuable skills they picked up? What would they like to have seen more of? We can talk to alumni, and employers to flesh these out alongside attributes in our own graduate framework. We can use these insights to prioritise objectives and teaching approaches.

Our second activity involved thinking about the student journey. In our case, the journey through a module – What concerns or difficulties did we envisage students having at certain points? What could we build in at those times to mitigate? This is a really useful exercise to do once the module concept is fixed.

Digital tools: love them? hate them?

What tools do you use in work and outside work and how do you feel about them? In our introductory Bootcamp session we were asked to draw and then discuss our personal view.

There wasn’t a huge amount of time to do this in the session, so I redid my own diagram afterwards, using it as an excuse to try out Mural. The diagram is adapted from work by Lanclos and Phipps and, for me, the best bit was adding emojis to some of the tools. You will see that email and Teams warranted sad crying, but Excel and mindmapping put me in happier zones. (I know I am not exactly normal in my love of data visualisation and infographics.)

Like others in our team, I use a laptop at work and ignore it when not at work. In many ways Covid-safe working from home has polarised this even more. I don’t do any digital creation outside work, and whilst I would listen to leisure podcasts in my own time, it’s only on a rare occasion that I’d tune into the Wonkhe podcast in work time.

How does this help?

The point of this was to help us get into the shoes of our learners. Our students aren’t a homogeneous blob living up to Prensky’s musings on digital natives – they have gaps and preferences. How might they feel about digital tools? How does that impact our choice of tools to use for learning? How do we scaffold the learning? Is the learning curve worth the reward? Is it inclusive? These questions come well before Privacy/GDPR/Impact Assessment.

Taking a curriculum viewpoint

A tool focus is helpful, but I agree the real trick here is to take a step back, to a curriculum view. How are we developing digital skills and digital agility across the curriculum? Katharine Reedy’s blog post describes how the OU use card sets to map this out.

Bootcamp beginnings

We are at the start of our Learning Design Bootcamp journey and have been encouraged to reflect – so let’s do it!

I’m not completely new to designing online learning: I’ve supported colleagues through the design of free online courses on FutureLearn, I’ve authored self-paced learning units (on Canvas), and have worked with module teams to redesign Masters level modules for blended delivery.  I also had a rich online student experience studying on OU’s MAODE (Masters in Online and Distance Education).

Current approaches to learning design

As far as I know, we don’t really have an institution-wide approach to designing learning, but our programme and module approval process ensures that modules can be articulated in terms of learning outcomes, teaching methods and an assessment rationale.  Constructive alignment is hard-baked in! 

I’m based in LTDS, and with colleagues we have looked at a few design approaches in detail: Carpe Diem, Cairo, ViewPoints, ABC. I’ve dipped into others during my MAODE travels (Ulster’s Hybrid Learning Model and various frameworks from Grainne Conole), dabbled with rhizomatic learning with Dave Cormier in Change11 and lasted a few weeks in OLDS MOOC.

I’ve been involved in supporting a small number of modules/programmes to which I have been allocated on a “project basis”.  Project work could involve delivering a series of workshops running over a year, or a redevelopment project involving both design and content development.

“Blended” is not the goal

For the projects I’ve supported we’ve found UCL’s ABC particularly useful.  It works in our campus-based context and has been effective in helping module teams to consider blended approaches as options (rather than starting out with a goal of N% online). 

But ABC only works well when you come to it with a clear view of aims, students, learning objectives and possible assessment approaches. 

If these haven’t been thrashed out already, say for a new module or programme, we choose from a range of tools to come up with a shared view. 

Go-to tools for “vision” are things like student personas which we draw up to reflect our prospective students.  We can also imagine them in the future – and ask “what will students most value about the programme 2 years after graduation?”  And, where possible we back this up with input from prospective students, current students (in person or via student voice) and employers as we form the feel, shape and values of the project.

Once the concept is fixed, we’ll work with colleagues to write and refine clear learning outcomes – using guidance from our own institution (and QMU have a great guide too).  Next we weigh up appropriate assessment options – what methods will sit best with the outcomes and skills we want to develop.  If we want to encourage creative assessment we’ll offer some form of an assessment sorting hat activity and use prompts on viewpoint cards to spark conversation around feedback or authenticity.


The tools and activities we use are dotted around different workshop folders – we’ve not brought them to a single place.  Our pick-and-mix approach at the moment is somewhat “artisan” and isn’t scalable, or easily communicable to colleagues.  I’m up for picking up new ideas and learning new approaches.  One of the things I’d like to see by way of output from this project is a clear pathway of activities leading towards a design goal.

The power of collaboration

In my experience, multiple viewpoints and an understanding of the iterative nature of design make for a better end product.  I know design to be a messy, and sometimes contentious, process.  But with experience comes the knowledge that the uncomfortable thrashing-it-out process is essential.  It helps the project to become “our thing”, and at the end there are artefacts and storyboards that articulate what the thing is about and, almost as importantly, what it is not about.  If there isn’t a shared vision and understanding there will be trouble down the line!

Ups and downs of redesign

How is it possible to represent three and a half years of work in a poster? About a year ago Ros Beaumont and I met up to do just that. Our project – the redesign of the HaSS PG Cert Research Methods – involved a huge effort: the Gantt chart was redrawn regularly, and there were dramas, ups and downs.

We wanted to be honest about our struggles and to foreground the importance of the collaboration at the heart of the project. So, we represented it as a board game, with ups (the ladders) and downs (the snakes). It’s never a good idea to put an A1 poster on a blog post, but do have a closer look at it as a pdf.

The ladders

  • Early champions: We wouldn’t have got off the blocks without Dr Adam Potts taking on the module lead for HSS8007 and the library team being willing to rework HSS8002.
  • Teamwork: this was a shared endeavour, we had difficulties to overcome, we had to be open, supportive and bring a can-do attitude.
  • Capturing student feedback: early results showed that this was a positive experience for the majority; the flexibility was great for diverse learners, including those new to UK HE. Initial positive feedback helped us recruit collaborators.
  • Building momentum: we also encouraged our early contributors to share positive stories and talk about the impact on their own development
  • Review and revise: we made annual adjustments in response to student feedback, e.g. introducing a face-to-face introduction and expectation-setting session for HSS8002/007.
  • Having a back catalogue: As more content was blended it became easier to articulate and model possibilities for new topic leads.

The Snakes

  • Complexity: 32 contributors with competing priorities and workloads; inadequate time and a fixed project timetable – we needed to be able to deliver modules to students.
  • Reconceptualising learning: we had to take our contributors on a journey, to rethink their topics in a student-centred versus teacher-focused way, incorporating active learning and having a different relationship with students.
  • Reality Check: Understanding what can be achieved by when, and when to acknowledge defeat and re-scope.
  • Academic presence and identity: we found that pre-work familiarises students with the session leader without the involvement of the session leader. How then do they start a face-to-face session?
  • Loss of Momentum: Time pressures resulted in development of materials being put on hold – these though needed to be picked up later.
  • Own goals: Not trusting expectations which had been set and using valuable in person time to repeat pre-work rather than extend and explore.

The enablers

  • Vision: Clear goals, Sell and keep on selling the vision to different audiences
  • Expertise: Technology, pedagogy and content knowledge
  • Project management: prioritising, monitoring
  • Resource: Team members who have the project/ work as major part of role or responsibility. Time for contributors to engage.

The poster was made for our Covid-cancelled 2020 Learning and Teaching Conference; but we managed to submit this to the event in 2021 and won the poster competition!

What tools should I invest in?

We start 2020 with our new VLE, Canvas, and a rich array of digital learning tools that can be used to support teaching. There are so many possibilities and it could easily be overwhelming.

This is a short post to begin to answer one of the questions I heard last week: “What 5 tools should I invest in?”.

But let’s back up a bit – before considering tools we need to think about what we want them to help us achieve. Way back in 1998, Anderson and Garrison described the three most common types of interaction involving students:

  • Student-content interactions
  • Student-teacher interactions
  • Student-student interactions

Let’s use this to come up with our list…

Student-content interactions

Your starting point here is Canvas itself. You can present information on pages, embed documents, link to resources on library reading lists, and include videos, audio and ReCap recordings.

Go to tool #1 has to be Canvas itself.

Linked to this is tool #2 Canvas quizzes.

Canvas supports a wide range of question types: multiple choice, gap fill, short answer, matching, multiple answer.  Quizzes can help students practise skills, check their learning and encourage them to revisit material.

For short PowerPoint narrations the easiest place to start is the recording features that come as part of ReCap.  We tend to think of ReCap as a lecture recording tool, but there is also a fabulous ReCap Personal Capture tool that you can use to record yourself and publish in Canvas.  There are several bonuses to using ReCap – you can make simple edits, you can use automatic speech recognition to generate captions, and students can pause, rewind and make notes on the recordings that you publish.  ReCap Personal Capture comes in as tool #3 – you can install it on your computer, or if you prefer you can use the new browser-based recorder, Panopto Capture (beta).

Student to Teacher interactions

Outside the limited amount of PiP time you are likely to be meeting your students online.  For synchronous meetings there is increasingly little to choose between Zoom and Teams – the only significant factor being that Zoom permits people to connect by phone, so it supports those on lower bandwidth.

Now is a great time to become confident with the online meeting tool you are planning to use throughout your module.  I’ll leave it to you whether that is Teams or Zoom – it would be sensible to settle on one, for you and your students.  Teams could be a strong contender if you plan to use it as a collaboration space over the module/stage, in which case do review the article on Building an online community using Teams.

Once you’ve settled on your meeting tool, it’s a good moment to explore options for whiteboards, polling and breakout rooms in these spaces and to begin to plan active online sessions.

For tool #4 I’d go with Canvas Discussions – these are easy to use, work really well in the Canvas Student and Teacher apps and are great for Q&A sessions, introductions, crowd-sourcing activities, and of course discussions!

Student to Student interactions

Learning at university is social! There are huge limitations on what we can do in person – but what can we do to help learning be as social as it can be?  This isn’t so much about tools, but about the activities we design in: breakout room discussions, group tasks, peer reviews, debates – things that might start in a timetabled session and then spill out.

For synchronous meetings and study sessions all our students have access to Zoom and Teams.  We can model how to use these, build students’ confidence in these spaces and show them how they can collaborate in Microsoft 365 collaborative spaces (Word documents, OneNote…).   I’ve already mentioned Teams and Zoom, so for tool #5 I’ll pitch for Microsoft 365 with an emphasis on collaboration.

What do you think?

These are my top 5 tools – you may have a different list.  What have I missed out?

“Technology Enhanced”

If I want to write, I can use a pen or type it in some form using a computer/tablet/phone – is this technology-enhanced writing?  Or is it just writing?

In my view technology is something to be adopted if it is right, fitting and appropriate to the task.  If that is the case then it just becomes the way we do things.  Say I want to “share photos” – in 1983 I would get them developed at Boots, put them in a sticky album and inflict them on unsuspecting family and visitors.  The thought of doing this in 2018 is laughable.  Sharing photos is something you do digitally.  I still use the same language, “sharing photos”, but what we now understand by that phrase has changed.  I wouldn’t dream of talking about “technology enhanced photo sharing”.

You’ll imagine from this that I really dislike the term “Technology Enhanced Learning”.

  • If the tech-way is the best way, the technology becomes invisible. At some point in the future, e-submission is just going to be the way we do “submission” and e-marking is just going to become marking.  (Forcing the adoption of immature tech creates aggro.)
  • If you’ve worked in education for any time, there is a good chance you’ve had some form of post-traumatic stress from badly behaving technology that has ruined a session. It’s all too easy to jump on “shiny” and force it into the classroom (or VLE).  We need to be more critical about whether the technology *will* add or detract.
  • Lots of so-called TEL is actually e-administration. Take a tool like WebPA – it gives group members online forms to evaluate each other and works out a peer score.  Here all the tool is doing is the drudgery of collating, counting and presenting the scores.  It’s doing the background admin for the peer assessment task.  In my view this is a sensible way to administer the process.  The clever bit comes in how you frame the group task, how you introduce it to students, how you interpret the results and whether it is a good fit for your programme aims.  Doing this well is all about the skill of the academic lead; WebPA is the administrative enabler.
  • We need to think about the investment and payback. Technologies (apps, devices, systems) have learning curves. It’s all too easy to be hijacked by this learning curve and lose sight of the actual learning that is our focus.  Tools and tech need to be “frictionless”, or easy enough to learn that there is a real payback.  Talk to any academics involved in teaching – time is not something they have in abundance.
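To make the point about e-administration concrete, here is a rough sketch of the kind of collating and counting a tool like WebPA automates. This is my own illustration, not WebPA’s published code – the function name, the self-rating, and the scaling are all assumptions – but it shows why the “clever bit” isn’t in the arithmetic:

```python
# Hypothetical sketch of a WebPA-style peer score calculation (not WebPA's
# actual code). Each group member rates every member's contribution; the
# tool's drudgery is turning those ratings into one factor per student.

def peer_factors(ratings):
    """ratings maps rater -> {ratee: score}; returns ratee -> factor,
    where an average contributor ends up with a factor of 1.0."""
    members = list(ratings)
    shares = {m: 0.0 for m in members}
    for given in ratings.values():
        total = sum(given.values())
        for ratee, score in given.items():
            # Normalise each rater's scores so every vote carries equal weight
            shares[ratee] += score / total
    # Shares sum to the number of raters, so this scales factors to average 1
    n = len(members)
    return {m: shares[m] * n / sum(shares.values()) for m in members}

ratings = {
    "Ana": {"Ana": 3, "Ben": 4, "Cal": 3},
    "Ben": {"Ana": 3, "Ben": 3, "Cal": 4},
    "Cal": {"Ana": 4, "Ben": 4, "Cal": 2},
}
factors = peer_factors(ratings)                  # Ana 1.0, Ben 1.1, Cal 0.9
marks = {m: 65 * f for m, f in factors.items()}  # group mark of 65, adjusted
```

The code only does the counting; framing the group task and interpreting results like these is where the academic skill comes in.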

Critical quadrants

Now, neatly sidestepping the debate on “learning gain” let’s imagine we could graph “improved learning” (for students) against academic time.

                      More academic time   Less academic time
Enhanced learning             A                    B
Diminished learning           C                    D

We get four quadrants – C is the one to avoid at all costs – and sadly it’s easy to come up with a scenario that fits here.  Prof Smith spends weeks developing an online simulation exercise that bombs with the students.  Dr Jones runs a one-off webinar and the time is taken up with “can you hear me?”.  Result: it’s mothballed, never to return.   I’ve used tech examples here, but let’s face it, this grid is tech-neutral.  You could get into this quadrant just as easily by developing lecture materials pitched at completely the wrong level, or by having to work with bad systems that gobble up time and leave your creative energies depleted!

If we leave technology in the mix, B is perhaps the only quadrant that could genuinely qualify as “technology enhanced”: students learn more and the academic gains time.  You may get here after you’ve been through a learning curve with online marking – you spend less time, but students get richer feedback.

Living in quadrant A is only sustainable for a short time. It’s easy to see how you could get here: more meetings with students, smaller group sizes in seminars (so more seminars), short supplemental videos, a multiple-choice quiz that helps students revise key topics.  You need to be selective about what you do here, and may want to venture in for short sprints if you can make gains elsewhere.  Notice that my examples aren’t all about tech – any redevelopment or redesign work can bring you here.

Technology and automation taken to extremes can take us into quadrant D – standard or no responses, little chance for interaction, impoverished experiences.  A mentor-free MOOC won’t deliver a campus-based experience.

Where tech is e-administration it sits somewhere along the x-axis, depending on whether it is students or faculty doing the processes.  Online module choice, for example, is e-administration.
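For illustration only, the grid can be read as a simple two-axis test. This is a hypothetical helper – “learning delta” and “time saved” are made-up inputs for the sketch, not measurable quantities:

```python
def quadrant(learning_delta: float, time_saved: float) -> str:
    """Place an intervention on the grid above.

    learning_delta > 0 means enhanced learning, < 0 diminished;
    time_saved > 0 means the academic gains time, < 0 loses it.
    """
    if learning_delta > 0:
        return "A" if time_saved < 0 else "B"   # enhanced learning
    return "C" if time_saved < 0 else "D"       # diminished learning
```

Quadrant C – weeks of development time spent for a diminished experience – is the one the grid warns us away from.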

My point – let’s drop TEL as a phrase.  It’s meaningless.  Instead let’s consider what works, what is effective teaching – let’s use technology if it helps.  But please don’t call it TEL.


This post was provoked by:

Sian Bayne (2015). What’s the matter with ‘technology-enhanced learning’? Learning, Media and Technology, 40(1), 5–20. DOI: 10.1080/17439884.2014.915851

Kirkwood, Adrian and Price, Linda (2014). Technology-enhanced learning and teaching in higher education: what is ‘enhanced’ and how do we know? A critical literature review. Learning, Media and Technology, 39(1), 6–36.

Blended Learning 101

Here are a few practical tips on getting started.

Bling it up

Our institutional VLE, Blackboard, can be made to look more interesting.  It’s down to you…

  • add images to items
  • think about what you can embed – Video, Slideshare, Polls and quizzes
  • Add links to discussion boards – and if you use these, make sure you have time to contribute and to set the tone (be a good cocktail party host at the start).  Create an introductory post on each discussion board – it’s far too scary to be the first one to post.


Make the journey through your course really clear, and use weekly emails/announcements to reinforce what’s to be done and to reflect on what’s been good.  Use one of the new module templates based on our Blackboard baseline so that students get a consistent experience.

If you include a link to a document, tell the students what you want them to do with it – read it? Skim it? Make notes on specific points? Focus on pp. 14–19?

Include some getting-started material – how the course is organised, how to get help, how to subscribe to discussion boards (so you get notified when someone posts!)…

Set expectations of how much time students should spend on each section of the material, so that they know which parts are essential and don’t burn out on the introductory elements.


If you are going to be making any videos, think about where you are going to host them.  You can then include them on Blackboard with links or embed code.  We have a few choices:

Once you have the video hosted make sure you add your captions so that these play in the embedded player.

Making videos

If you don’t want to spend money, you can still make great narrated PowerPoints, talking heads or screen captures…

  • Use ReCap personal capture – talk to LTDS about getting a pCap folder set up, getting the software installed and getting a hand to get started
  • Use PowerPoint to record over your slides (Slide Show > Record Slide Show), or use it to do a screen recording (Insert > Screen Recording), then Save As mp4

If you have a longer lead time and need some professionally produced media get in touch with our crack Digital Media Services team.

Recording Audio

Have a smartphone?  Use a “sound recorder” app on your phone to record an audio track. Or you could download and use Audacity to record and edit yourself!  You can upload audio to ReCap and NUVision.  On ReCap you can add bookmarks (e.g. three views on one question).

Polls and Quizzes

Blackboard quizzes can look a bit scary, so if your questions are formative and you don’t need to know who answered what you can use Google Forms or Microsoft Forms to make a quiz.  Then you can embed the quiz, provide a link to it (or both), and share the results back.

Prototype and Get feedback

Ask colleagues or students to comment on your initial designs: does it work? Does it make sense? Is it clear what you are being asked to do?  Are the discussion spaces inviting?

Talk to colleagues

Shamelessly pick the brains of colleagues who have done similar projects! You will have more ideas together.

Talk to LTDS for “how-to” support on Blackboard, ReCap, ePortfolio and for tips on designing your learning materials.

Connect with NUTELA and join in their workshops and community (there is free pizza and pop).


Feeling Connected


LTDS hosted the first of our Feeling Connected events this week.  The series is all about how to engage large student cohorts in the classroom and beyond. In this initial session we considered how to engage students in the lecture.

Tony Chapman-Wilson brought an actor’s perspective on how to get the best out of, and look after, our voices.  He had us humming, thinking about our diaphragms and tripping over tongue twisters.

Sue Gill spoke about PowerPoint as an aid; we presented feedback from the recent TEA awards from NUSU on what students appreciate; we heard top tips from colleagues via podcasts; and Dr Alison Graham rounded the session off with real examples and insights from her use of the OMBEA student response system.



We interviewed Dr Sylvia de-Mars, Dr Julian Knight and Dr Keith Brewster, and shared their insights in the session.

The links below take you to the recordings on ReCap.

Thanks to all who attended and contributed to a lively session, and to colleagues who gave up their time to contribute before and after.

Next in the series…staying connected

Our next Feeling Connected event will focus on how to stay connected between lectures.  It will take place on Tuesday 19 September in the Herschel Learning Lab and have a hands-on feel.  You can book via our web-page: http://www.ncl.ac.uk/ltds/about/training/feelingconnected/