Siri can now answer your calls ... but can't hang up

Guest blog: Colin Hughes

Colin is a former BBC producer who campaigns for greater access and affordability of technology for disabled people. Colin is a regular contributor to Aestumanda.

iOS 14.5 brings lots of new features to your iPhone, letting you do things like stop apps from tracking you and unlock your iPhone with Face ID while wearing a mask. One new feature received less attention than those headline grabbers, but it is great news for disabled users.

iOS 14.5 now lets users answer calls with Apple's virtual assistant, Siri, without having to press anything.

The option is part of the Announce Calls with Siri feature, which lets users hear the name of whoever is calling when they are using AirPods, the company's popular headphones. In addition to telling you the name of the caller, Siri now understands commands to answer or decline the call. There's no need to say "Hey Siri"; you just say "answer" or "decline".

iPhone displaying a 'Hey Siri' message on the screen

How to activate the new setting

The new option can be found under Settings > Phone > Announce Calls.

For someone like me, with a severe physical motor disability that means I can't touch the iPhone screen, this is the company's most life-changing feature in a long time. The beauty of it is that Apple doesn't even categorise it as an accessibility feature, which makes it very inclusive.

When I first tried the new capability last week, I cried tears of joy at its simplicity and effectiveness. It will change my life going forward, as I can now easily and spontaneously answer the numerous phone calls I receive each day hands-free, with just my voice.

I’ve lobbied Apple to include this ability for the past four years. Almost since Siri launched, you could place a call with a voice command, but until now you couldn't answer a call in the same way. I always found this situation untenable and it is good to see the company has finally acted.

Unfortunately, Voice Control, Apple's accessibility speech recognition feature, can't answer or hang up calls. In any case, Voice Control relies on your iPhone being out, with the screen on, and unlocked with Face ID, and this is not my preferred setup when out and about.

If you could do nothing about someone wrenching your phone from its holder, bold as brass, and strolling off with it, you too would prefer to have it stowed away and to rely on Siri through your AirPods as your main interface to all your iPhone's functions.

Accessibility feature or inclusive design approach?

Curiously, Apple hasn't explicitly highlighted this new 'answer calls with Siri' ability as an accessibility feature. I hope this is because the company has taken an inclusive design approach, which sees accessibility features incorporated into the core product rather than siloed off in a part of the operating system where they can be ignored.

However, I do have a slight worry that some disabled users may miss out should Apple fail to market it properly. I think the tech giant should be more proactive in its marketing and product information when it comes to reaching out to disabled users and saying: "Hey, we have this great new feature, it's for everyone and does X, Y and Z, and we think it will be really helpful for people with physical motor disabilities." So far, the company has said nothing.

Expanding on this inclusive approach, it would be great if the next AirPods advert, rather than showing a fit young person dancing down the street, showed how AirPods and Siri can help a disabled user answer phone calls and keep in touch with friends and family more easily.

Room for improvement

Apple's Worldwide Developers Conference (WWDC) kicks off in a few weeks, on 7th June. The annual keynote is where the company unveils details of the next major releases of its operating systems, such as iOS 15, watchOS 8 and macOS.

Despite the welcome introduction of the answer-calls feature, there is still plenty of room for improvement when it comes to accessibility in the company's forthcoming 2021 software updates. There is a tonne of stuff you can't do hands-free with your voice on Apple gear, which is very limiting.

iPhone upright on a table displaying different apps on its screen

Siri, why do you keep me hanging on the telephone?

A glaring flaw with iOS, and a major part of my advocacy over the past four years, is that users can't hang up a call with Siri, whether by saying "Hey Siri, end call" or "Hey Siri, hang up", or by dropping the need for "Hey Siri" altogether. Incredibly, users still can't do this hands-free on an iPhone or a cellular Apple Watch.

It isn't too much of a problem if the other person on the line can hang up and end the call for you. However, this does mean that if you call a number and go to voicemail, you have to wait until the mailbox times out before the call ends – which is very frustrating for both parties. I feel a real sense of powerlessness when this happens to me.

There may also be occasions when I don't want to speak to someone and want to end the call but simply can't hang up, if a persistent telemarketer calls me, for example. This unsatisfactory situation can't be allowed to persist, and I hope something can be done about it in iOS 15.

Auto answer option

Thankfully, perhaps in no small part thanks to The Register highlighting my experience, auto-answer was introduced as an accessibility option in iOS 11 in 2017.

However, the implementation has shortcomings because you have to touch the screen to turn it on and off. I can't do that, and there is no Siri command such as "turn on auto-answer". You might not want to answer every call, either: auto-answer is a crude catch-all, and there is no way to whitelist certain contacts.

There is no watchOS support for auto-answer either, despite cellular models effectively being wrist-worn phones.

Despite the introduction of answering calls with Siri, auto-answer remains important for severely physically disabled people: the less you have to project your voice the better, because it helps save vital bits of energy, so automation still matters.

Third party messaging apps

Many popular third-party messaging apps don't work with Announce Messages with Siri, which lets you hear incoming messages and dictate a reply hands-free when wearing AirPods. I am in the UK, where WhatsApp is more popular than Apple's iMessage, and all my friends and family use it, yet there is no Announce Messages with Siri integration with WhatsApp.

It is my understanding that Apple provides access to the feature to third-party developers through an API, and it is up to them whether they take advantage of it. Suffice it to say that if WhatsApp and Facebook Messenger integrated with Announce Messages with Siri, it would be of huge benefit to people with limited mobility.

It would be great to both hear and respond to WhatsApp and Facebook Messenger messages through your AirPods as they come in, in the same way you can with iMessage at the moment.
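For developers reading this, the route Apple provides here is understood to be SiriKit's messaging intents: an app ships an Intents extension that handles the send-message intent, so Siri can take a spoken reply and hand it back to the app. The Swift sketch below is only an illustration of that handler shape, not any real app's code, and an actual integration involves more (entitlements, Info.plist intent declarations, and wiring replies into the app's own messaging back end).

```swift
import Intents

// Hypothetical Intents extension entry point for a messaging app.
// Siri routes spoken replies through INSendMessageIntent: the app
// resolves who the message is for, confirms the text, then sends it.
class IntentHandler: INExtension, INSendMessageIntentHandling {

    override func handler(for intent: INIntent) -> Any {
        // This sketch only handles send-message requests.
        return self
    }

    func resolveRecipients(for intent: INSendMessageIntent,
                           with completion: @escaping ([INSendMessageRecipientResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            completion([.needsValue()])
            return
        }
        // A real app would match these against its own contact list.
        completion(recipients.map { .success(with: $0) })
    }

    func resolveContent(for intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(.success(with: text))
        } else {
            completion(.needsValue())
        }
    }

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // A real app would pass the message to its own sending pipeline here.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```

In other words, the hooks appear to be there; it is down to the app makers to adopt them.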

Security versus accessibility

Another problematic area is that dreaded announcement you receive from Siri on your iPhone when you ask the virtual assistant to read your messages: "You need to unlock your iPhone first."

I want to keep my iPhone locked with a passcode and Face ID, but I am unable to unlock the phone with my hands, so in this scenario Siri is unable to read out my new messages. If I turned off Face ID and the passcode and kept my phone unlocked this wouldn't happen, but I don't want to do that. I don't want to show a preview of my messages on the lock screen either.

This expected behaviour is about security, but surely there could be an override, with sufficient warnings, for accessibility purposes. I just want to be able to access and listen to all my messages (iMessages, emails, WhatsApp and Facebook Messenger messages) from a locked iPhone through my AirPods when I ask Siri to read them to me, or as they come in, whichever I set as a preference.

I constantly have to weigh up accessibility versus security, but for me independence and accessibility are important, and I would like to be able to make the choice rather than have Apple say: "No, you can't do that."

Read Colin's previous guest post 'How Alexa can change the life of a disabled person'

Apple Watch

It goes without saying that, as the owner of a cellular Apple Watch, which is effectively also a phone, I would like it to have the same Siri and AirPods functionality I have mentioned above: Siri to end a phone call, Siri to turn auto-answer on and off, and Siri to answer and decline phone calls.

Voice Control bugs

Voice Control, Apple's dedicated accessibility speech recognition feature, has disappointed since its launch to much fanfare at WWDC 2019. The company appears to have invested little in it since, and there have been no exciting new features of note.

Dictation accuracy needs improving, as do its editing capabilities, particularly for long-form dictation. Sophistication has to go beyond dictating short messages like "happy birthday" with a cute emoji. Disabled people need more from speech recognition, both for education and employment and for keeping in touch with friends and family in the online world we all rely on so much these days.

There are still bugs in Voice Control dictation. If you pause, even for a split second, Voice Control capitalises the next word for no reason. This happens quite frequently in a paragraph of dictation and always has done. When you correct a word, the list of alternative words that comes up rarely includes the word you are looking for. I don't think much clever machine learning is being used.

Voice Control dictation doesn't learn from its mistakes, or from its failure to recognise the words that you dictate, so the same errors keep happening over and over again, making it not very productive to use.

How accurate is Voice Control?

The accuracy and performance of Voice Control dictation vary depending on which text box you are using. For example, I have found Voice Control dictation to be most accurate in the iMessage text box, both on iPhone and Mac; much less accurate in the Mail application, particularly on the Mac; and in text boxes such as Google and WhatsApp in Safari the accuracy can sometimes be very poor.

Accuracy should be the same across all text boxes in the Apple ecosystem of devices and operating systems. Finally, you should be able to train words, so that after training they are always recognised the way you pronounce them.

Speech recognition in Voice Control doesn't come close to what Nuance offers Windows users with its Dragon products, which is somewhat ironic, as Nuance had to drop its dictation product for the Mac in 2018 because of the way Apple controls API access to its platforms, or so it was alleged at the time.

Try writing a long email to a loved one, running a business, writing a book, campaigning or doing journalism when you can't use a keyboard and rely 100 per cent on Voice Control recognising your voice and putting your words accurately onto the screen. It simply isn't up to the job at the moment, and hasn't been for the past two years. Soon after it was unveiled, I feared Voice Control would become a bit of a ghetto: a specialist accessibility feature that receives next to no investment and is not updated very often.

Mainstream speech recognition app

I think Apple would serve all its customers better if it made Voice Control a mainstream, inclusive speech recognition app powered by the improving Siri speech engine. I am sure lots of general users would be interested in it. There is an appetite and a need for high-quality speech recognition for all sorts of reasons: RSI, dyslexia, physical motor disability and, who knows, perhaps long Covid too.
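It is worth noting that Apple already exposes a general-purpose speech engine to developers through its Speech framework, including on-device recognition and a way to bias the recogniser towards custom vocabulary. The Swift sketch below is just an illustration of those developer-facing building blocks, not of how Voice Control itself works; the audio file path and the hint words are placeholders.

```swift
import Speech

// Transcribe a recorded dictation file with Apple's Speech framework.
// Requires the NSSpeechRecognitionUsageDescription key in Info.plist.
func transcribeDictation() {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_GB")),
              recognizer.isAvailable else {
            print("Speech recognition not available")
            return
        }

        let audioURL = URL(fileURLWithPath: "/path/to/dictation.m4a") // placeholder
        let request = SFSpeechURLRecognitionRequest(url: audioURL)
        request.requiresOnDeviceRecognition = true          // keep audio on the device where supported
        request.contextualStrings = ["Aestumanda", "auto-answer"] // bias towards words it tends to miss

        // A real app would keep a reference to the task so it can cancel it.
        _ = recognizer.recognitionTask(with: request) { result, error in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            } else if let error = error {
                print("Recognition failed: \(error.localizedDescription)")
            }
        }
    }
}
```

If that sort of engine, with per-user vocabulary and proper training, were surfaced through Voice Control itself, it would go a long way towards addressing the complaints above.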

All these accessibility improvements, and making it easier for everyone to control their Apple devices with just their voice, will, I believe, prove popular with a lot of people generally, meaning any improvements the company decides to make will be inclusive. This is exactly how it should be nowadays. Apple devices should be for everyone.

This article was written by Colin Hughes. Colin is a former BBC producer who campaigns for greater access and affordability of technology for disabled people. Colin is a regular contributor to Aestumanda.
