Alexa beats humble rabbit at tackling loneliness
Adam Tweed | 08 Apr 2019

"Alexa, I'm lonely."
"I'm sorry you are feeling that way, sometimes talking to a friend, listening to music, or even taking a walk can help. I hope you feel better soon."
This is how your Amazon Echo (or 'Alexa', as it is more commonly called) will reply if you say you're lonely. If you tell it you're depressed, it will tell you that you are not alone and that there are people who can help, such as a friend or your GP, and it will also give you the number of the Samaritans.
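At its simplest, this behaviour amounts to a small, hard-coded mapping from disclosure phrases to signposting replies. As a purely illustrative sketch (the phrase list, wording and function name here are invented, not Amazon's actual code), such a mapping might look like this:

```python
# Illustrative only: a hard-coded map from disclosure phrases to
# signposting responses, in the spirit of the Alexa replies quoted above.
# The phrases are examples; the Samaritans number (116 123) is the real UK helpline.
SIGNPOST_RESPONSES = {
    "i'm lonely": (
        "I'm sorry you are feeling that way. Sometimes talking to a friend, "
        "listening to music, or even taking a walk can help."
    ),
    "i'm depressed": (
        "You are not alone, and there are people who can help: a friend, "
        "or your GP. You can also call the Samaritans on 116 123."
    ),
}

def respond(utterance: str) -> str:
    """Return a signposting reply for a recognised disclosure, else a fallback."""
    key = utterance.lower().strip().rstrip(".!?")
    return SIGNPOST_RESPONSES.get(key, "Sorry, I don't know that one.")

print(respond("I'm depressed"))
```

Real assistants use far more flexible language understanding than exact string matching, of course, but the principle is the same: a recognised disclosure triggers a curated, pre-written response.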
In their report entitled "Mental health of those suffering with physical or learning disabilities", the Shaw Foundation highlighted that "UK researchers have found that 30% of those with a long-term physical condition also have a mental health problem, and 46% of people with a mental health problem have a physical condition" and that "25-40% of those in the UK with a learning disability have a dual diagnosis with a mental health disorder."
Long-term mental health conditions, including anxiety and depression, now account for a significant proportion of Disabled Students' Allowance (DSA) applications and specialist support, both human and tech-based.
Alexa vs the humble rabbit
An estimated one in ten people in the UK own a smart speaker (Amazon's Echo or Google's Home, for example), a number that has doubled since 2017. According to the retailer Argos, more people now own an 'Alexa' than a rabbit! This may seem a trivial comparison, but consider the role of a rabbit in offering companionship: something to talk to when we are lonely; something we share our feelings with, despite our knowledge that it neither understands nor can talk back.
The Amazon Echo was designed to give us a more convenient means of shopping: to drive consumers towards a zero-hassle ordering experience with Amazon. "Alexa, order toothpaste" and, like magic, it's on the doorstep (or thrown over the fence) the following day. However, according to a survey by YouGov, shopping accounts for only 9% of our interactions with the device. For many of us it is the convenience of interacting by voice: asking a question, setting a kitchen timer with wet hands, requesting a song or podcast. Integration with the smart home means that we now control our lights, our thermostats, our security cameras, even our microwaves with our voices. This may be a gimmick for some, but for many disabled people these same features offer a level of independence. Increasingly, for many people, disabled and non-disabled alike, these faceless devices are becoming a source of interaction and company.
Is a chatbot so different?
The previous blog in this series, 'A chat with a bot could help', talked about chatbots: text-based 'robots' with a basic ability to understand language and to signpost appropriate CBT-based support and wellbeing exercises. These text-based systems are very clear from the start that there are limits to the support they can provide and that they are not meant for complex mental health issues; because you communicate with them through text, this seems to be enough of a reminder. Voice assistants are slightly different in that they are designed to provide broad information, and it would be ridiculous to add a disclaimer saying "this device is not designed for complex mental health support" every time you started a conversation. For many of us this is not what we would consider the device to be used for, or even capable of. However, there has clearly been a recognition that in a human-like interaction someone might disclose something of concern, or choose to seek help or advice from the 'thing' they are talking to. Device manufacturers have therefore taken the responsibility to acknowledge this and code responses accordingly, as in the simple interaction above.
Anyone who has explored the capabilities of an Echo device will be aware of 'skills'. Skills are add-on programs created by third parties that take the abilities of the device (chiefly, the ability to understand what is being said) and expand them to do more. AbilityNet's Head of Digital Inclusion, Robin Christopherson, hosts a daily podcast called 'Dot to Dot' in which a new skill is demonstrated every day (you can check out Robin's Dot to Dot podcast on iTunes or search for it on your podcasting app of choice). There are skills for everything from playing games to reading stories, and there are a number of 'therapy' skills. At the time of writing, the available skills are largely hobby projects, proofs of concept at best; there is nothing in the list worthy of a mention and nothing that emulates the support offered by the chatbots mentioned in the previous post.

But the 'chat' of a 'chatbot' can clearly be emulated on a voice assistant, and it is only a matter of time before a mental health practitioner and a competent coder see the potential for a mindfulness/CBT-literate voice assistant. Even before that point, we need to acknowledge that at the most basic level these devices offer a source of conversation and companionship for those who are lonely or socially isolated. According to a 2017 survey by Scope, on a typical day one in eight (13%) disabled people had under half an hour's interaction with someone else, and 85% of young disabled adults (18-34) reported feeling lonely.
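The skill mechanism described above, third-party code that receives a recognised 'intent' from the device and returns a spoken response, can be sketched in miniature. The intent names, handlers and wording below are invented for illustration; a real Alexa skill would be built with Amazon's ASK SDK rather than plain Python:

```python
# A minimal sketch of how a voice-assistant 'skill' routes a recognised
# intent to a handler. Intent names and handlers are invented for
# illustration; a real Alexa skill would use Amazon's ASK SDK.
from typing import Callable, Dict

def handle_breathing_exercise(slots: Dict[str, str]) -> str:
    # 'Slots' carry the variable parts of an utterance, e.g. a duration.
    minutes = slots.get("duration", "2")
    return f"Okay, let's take {minutes} minutes to focus on your breathing."

def handle_mood_checkin(slots: Dict[str, str]) -> str:
    return "Thanks for checking in. How are you feeling today?"

HANDLERS: Dict[str, Callable[[Dict[str, str]], str]] = {
    "BreathingExerciseIntent": handle_breathing_exercise,
    "MoodCheckInIntent": handle_mood_checkin,
}

def dispatch(intent: str, slots: Dict[str, str]) -> str:
    """Route a recognised intent to its handler, with a polite fallback."""
    handler = HANDLERS.get(intent)
    if handler is None:
        return "Sorry, I can't help with that yet."
    return handler(slots)

print(dispatch("BreathingExerciseIntent", {"duration": "5"}))
```

The device does the hard work of turning speech into an intent plus slots; the skill itself is often little more than routing logic like this, which is why hobbyist 'therapy' skills are so easy to prototype and so hard to make clinically meaningful.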
So who would consider an Alexa as a companion?
Cornwall Live recently reported on a scheme run by Cornwall Council in which Echo devices were given to people for whom loneliness or isolation represented a significant risk: "Age UK put forward the bid that will allow people to use the technology in a variety of ways such as making video calls, setting reminders, and accessing Cornwall link, the charity's information portal, as well as other resources at home."
There are obviously ethical issues to consider. Should a device whose primary purpose is to provide us with products, to collect data about our shopping habits, and to serve us content based on this data have access to our mental health data? Is it ethical for the device to say, "you said you were depressed, so here are some books to help with depression"? I would argue that although this may seem uncomfortable, offering a person in crisis some appropriate, curated content based on mood may offer genuine support, and is perhaps not so different from a friend making a recommendation.
It's not just Amazon investing in the digital assistant and voice. Microsoft, despite limited uptake of its 'Cortana' digital assistant, is still investing resources in voice, and is doing so in a way that involves a diverse range of experts. Cortana looks to be one to watch as a digital PA for your working life: scheduling appointments, juggling meetings and filtering emails (and thereby tackling the growing problems linked with stress). Samsung too has a quiet voice in the conversation and has not yet abandoned its 'Bixby' project. For the moment, though, the Google Assistant is the real rival to Amazon, and although Amazon may have 70% of the smart speaker market, Google are keen to point out that the Google Assistant is present on over 3 billion devices (Android mobile phones).
"Errr, this is Google Duplex, umm..."
Google recently unveiled their latest development in the evolution of the voice assistant: Duplex. Duplex is an AI that mimics human speech, including intonation and conversational 'tics': the 'ums' and 'ers' we typically use to allow our brains time to process. The inclusion of these tics, Google suggests, increases the likelihood of a successful interaction when replicated in a digital assistant, and it appears to have worked. Duplex has aced the Turing Test, not only in its ability to mimic speech but also in the way it handles the unexpected and reacts appropriately. The initial demo recording was met with gasps of wonder from the crowd, but it was not long before the implications of this development began to sink in: is it ethical to dupe humans in this manner? Duplex may have been pitched as a digital assistant, doing nothing more than scheduling a haircut or making a restaurant reservation, but those are just the parameters it has been told to operate in ... so far.
Although we need to remain wary of the potential for abuse, having a system capable of natural, conversational speech, complete with the umms and errs, clearly has significant implications for companionship, as well as potential for the development of 'talking' therapies in the future. In an interview with CNET, Scott Huffman, Google's Vice President of Engineering for the Assistant, stated that within the next five years the Google Assistant will be able to understand tone and mood and detect if you are frustrated. Huffman has also suggested that the goal is for Assistant to remember previous discussions and pick them up where you left off. Consider a world where your personal assistant asks you if everything is ok as you slam the door and snap at it, demanding to know why there's no wine in the fridge. Or what if...
"Hey Google. When am I due at my first meeting?"
"I think we need to discuss how you spoke to me yesterday before I answer that."