Digital technology and aphasia: From teaching machines to Eva Park and beyond

I don’t think Audrey Holland, a most eminent aphasiologist who published the first report (Holland & Matthews, 1970) of a computerized application for speech and language therapy, could have known how virally digital technology would spread. A ‘teaching machine’ is what we now call a ‘computer programme’ or even an ‘app’! Back then, words such as ‘Google’, ‘internet’, ‘digital technology’ and ‘blog’ were either alien or didn’t exist. Now they’re part of everyday conversation. More importantly, what they represent is not just aspects of digital technology but everyday communication environments and communication media. The digital revolution is happening, evolving and challenging aphasia therapists, their clients and services to embrace it, almost willy-nilly.

Speech & Language Sciences were privileged to host this year’s Research Update Meeting of the British Aphasiology Society, an annual forum where research projects at any stage of development (from initial ideas to finished or near-finished work) are presented. This year’s theme, ‘Digital technology and aphasia’, attracted an audience and presenters who were aphasia therapists and software engineers specialising in human-computer interaction (HCI). The event took place in April and included 12 cutting-edge projects.

The morning session started off with Fiona Menger’s (PhD student in Speech & Language Sciences) presentation of the IDEA project (Inclusion in the Digital Economy for Aphasia). Laorag Hunter (NHS Tayside) described the inter-disciplinary work carried out in Dundee aiming to facilitate computer use in people with aphasia. This was followed by Abi Roper’s PhD project (City University) on computer-assisted gesture therapy for severe aphasia and Gennaro Imperatore’s (Strathclyde University) word-prediction work for a mobile AAC application.

In the afternoon session, Faustina Hwang (University of Reading) discussed the challenges of assessing computer skills in people with aphasia, while Rachel McCrindle (University of Reading) described several cutting-edge applications that could help mitigate communication difficulties for people with aphasia. Laura McCain (Speech & Language Sciences) described Memo, a new software application for assessing and treating short-term memory difficulties for spoken language. And the firework of the day, or rather the big splash, at least for me, was Jane Marshall’s (City University) virtual communication game, Eva Park, a fictitious island that enables people with aphasia to use language in a virtual environment and even have a splash in the ocean!

And, last but not least, the poster presentations. Zula Haigh (City University) explored the challenges of remote aphasia therapy. Lexi Johnson (dissertation student in Speech & Language Sciences) presented the results of a UK-wide survey of speech and language therapists about the technological challenges they face in clinical practice. Becky Moss (City University) described a treatment study for reading and writing deficits in aphasia. Monika Winkler and Victoria Bedford (City University) talked about the potential power of blogs in the person-carer relationship in aphasia.

This was the most inter-disciplinary array of projects I’ve ever come across in one forum. It highlighted the close collaboration between speech and language therapists, HCI specialists, people with aphasia and their carers. The projects also underlined the new and quickly evolving challenges we have to embrace to make digital technology work for people with aphasia.

Christos Salis

Lecturer in Speech & Language Sciences

Speech & Language Therapist


Reference

Holland, A. L., & Matthews, J. (1970). Application of teaching machine concepts to speech pathology and audiology. Language, Speech, and Hearing Services in Schools, 1(2), 14.
