Three Apple updates and how they help disabled users with voice access

Guest blogger Colin Hughes puts Apple’s autumn hardware and software updates under the microscope and looks ahead to what more the company needs to do around voice accessibility in 2023.

Colin is a former BBC producer who campaigns for greater access and affordability of technology for disabled people. Colin is a regular contributor to Aestumanda.

Voice technology aids independence

Apple’s voice technology, baked into the iPhone, iPad, Apple Watch and Mac computers, has kept my independence from shrinking in recent years, and in some areas of my life has extended it.

As my progressive muscle wasting disability, muscular dystrophy, has become more challenging to live with I have turned to voice technology to get things done. It gives me independence, makes me feel safe, and instils confidence and peace of mind. The confidence to go out for a walk alone in my wheelchair knowing help is a Siri voice request away, if I had a problem.

There’s a lot of focus at the moment on new features for the iPhone and Apple Watch like car crash detection, and satellite SOS if you find yourself stuck in some remote place like the Sahara Desert.
 
However, people like me can get into problems in more mundane situations. For example, inside my home, going to the shops, or heading to the pub down the road. With Siri to help me make a phone call, control my smart home, or send a message hands-free, I have a lot more confidence and independence indoors, and away from home.

Apple Accessibility and HomeKit

This video shows how Apple Accessibility features and smart home tech have enabled me to maximise my independence:

Three updates that make a big difference

This autumn has seen a slew of Apple software and hardware releases, including iOS 16, macOS Ventura, watchOS 9, the iPhone 14, the Apple Watch Series 8 and the new Ultra smartwatch, as well as AirPods Pro 2. But what do they offer severely disabled people who rely on voice commands to control their devices?
 
I often find that it can be the smallest things in hardware and software that can make the difference when it comes to accessibility. In the past few weeks Apple has introduced three accessibility features that might sound quite small but are making a big difference to my independence.
 
I have been calling for these improvements for the past four years, and I am delighted Apple has listened, and that all three are present in the company’s recent updates.

1. Siri call hang up

iPhone users have long been able to ask Siri to initiate a phone call, but there was no way to ask the voice assistant to hang up a call. That left me reliant on the person on the other end of the line to do this for me – or left me stuck in a voicemail box unable to terminate a phone call.
 
Apple has fixed this in iOS 16 – and it has meant a huge amount to me, and to others who rely on voice to control our Apple gear. I’m not one for public displays of emotion, but being able to hang up my own phone call on my iPhone for the first time ever, hands-free, with just my voice, brought a lump to my throat. The sheer relief of finally being able to do what many take for granted is a moment I will never forget.
 
Siri call hang up is a very inclusive feature that should be useful for anyone who has their hands full – for example, when washing up and not wanting to touch your phone screen with wet or dirty hands.

2. Auto-answer calls

In 2016 I successfully campaigned for Apple to introduce auto-answer calls on the iPhone. Since then, you have been able to have phone calls answered automatically, hands-free – but you couldn’t toggle the feature on and off via a Siri voice command. iOS 16 fixes this.
 
Some disabled people find it easier to set their iPhone to auto-answer incoming calls. However, one of the great ironies has been that a feature designed for people who have problems touching the iPhone screen has, until now, required them to do something they couldn’t: go into Settings and flip an on-screen toggle to turn auto-answer on.

So, for the past couple of years I have been calling on Apple to enable Siri to turn auto-answer on and off with a voice command: “Hey Siri, turn on auto-answer”. I am delighted the company has listened, and the ability is now here in iOS 16 and watchOS 9.

I don’t want auto-answer on all the time, and I want to be able to turn it on and off myself without having to ask a carer for help. Personal independence is so important to disabled people, practically and psychologically, and technology can really help with this. This feature may sound niche, but it is massive for people like me.

3. Update to Announce notifications

A year ago, in iOS 15, Apple introduced the ability to have messages announced when you are wearing the company’s AirPods, and to reply to them with your voice as they come in.

A year on, in iOS 16, Apple has extended Announce Notifications so you no longer need to wear AirPods for the feature to work. It now works through the iPhone speaker, which is helpful when my iPhone is sitting on a charging stand on my desk next to my MacBook.
 
These three updates may not seem a lot to some people, but they add up to a lot in my day-to-day life, and the ease and independence they offer cannot be overstated.

Future accessibility improvements wish list

The job of improving and extending tech accessibility is never done.

Looking ahead to next year these are the accessibility improvements I would like to see from Apple in 2023.

Smarter replies to messages

Although a Siri command lets you reply to messages, I believe it could be smarter.
 
My friend Jane recently sent me a message on WhatsApp, which I listened to as it came in via Announce Notifications. A couple of minutes later I said, “Hey Siri, reply to Jane”. Even though I used the word “reply”, Siri defaulted, without telling me, to sending my dictated message by iMessage/SMS rather than WhatsApp, where Jane’s message had originated. This confused Jane.
 
You can instruct Siri to “Reply with WhatsApp,” but it should automatically use the same messaging platform – and ideally note which chat apps you use with which contacts.
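Conceptually, the fix is a small piece of per-contact state: remember which app each contact last messaged on, and route a “reply” back there. Here is a minimal sketch in Python – the `ReplyRouter` name and its methods are my own illustration, not a real Apple API:

```python
from dataclasses import dataclass, field


@dataclass
class ReplyRouter:
    """Sketch: route a Siri-style reply to the platform the original
    message arrived on. All names here are hypothetical."""
    default_app: str = "iMessage"
    # Remembers, per contact, which app their last message came from.
    last_app: dict = field(default_factory=dict)

    def message_received(self, contact: str, app: str) -> None:
        self.last_app[contact] = app

    def app_for_reply(self, contact: str) -> str:
        # "Reply to Jane" should use the app Jane last wrote on,
        # falling back to the default only if there is no history.
        return self.last_app.get(contact, self.default_app)


router = ReplyRouter()
router.message_received("Jane", "WhatsApp")
print(router.app_for_reply("Jane"))   # WhatsApp, not iMessage
print(router.app_for_reply("Colin"))  # no history, so iMessage
```

The same lookup could also power the “which chat apps you use with which contacts” idea: rather than only remembering the last message, the map could hold a preferred app per contact.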

A read back toggle, when sending messages

iOS offers two toggles that should ensure Siri always reads back dictated messages before sending – but they don’t work reliably in iOS 16. It would be better to have a single toggle to instruct Siri to never send a message without a read back. 
 
I have Automatically Send Messages set to OFF and Reply without Confirmation set to OFF, and yet when I dictate and send new messages, or reply to messages, what I dictate is rarely read back by Siri before the message is sent. In effect, my preference settings are being ignored.
 
I have recently discovered there is a way to have your dictated messages read out before sending: all you need to say is “read it”. But I preferred the previous iOS 15 behaviour, where you were given a choice of changing the message or sending it.
 
I am pleased to note there is a new Siri accessibility setting in iOS 16.2 beta 3 released in the past few days that should help with this, so it looks like a fix is on the way next month when iOS 16.2 is released to the public. 

Access older messages

The iPhone lets Siri read out incoming messages immediately after they arrive – but not once some time has passed. Instead, Siri asks you to unlock your phone first, which isn’t possible for someone entirely dependent on voice control. I believe this, and other issues, could be solved with a new iPhone unlock method.

iPhone unlock via voice authentication

Voice Control does let users speak their passcode, but this isn’t secure when around other people. Given that Siri on HomePod can now recognise individual voices, being able to unlock your iPhone by simply saying, “Hey Siri, unlock my phone” and having it check for a voice match before doing so would be perfect.

Keyboard dictation with Siri to send a message

The new keyboard dictation enhancement in iOS 16 means that, for the first time, iPhone owners can seamlessly switch between typing and dictation. But in this mode, there is no way to instruct Siri to actually send the message with a voice command as you can on a Pixel 7 phone.

Automate auto-answer calls with Siri Shortcuts 

Now that you can turn auto-answer on and off with a Siri voice command in iOS 16, I would like to see Apple go further and intelligently automate it.

I would love to be able to create a Shortcut to automatically enable auto-answer when my carer puts in my AirPods, or puts the Apple Watch on for me.
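The automation I have in mind is essentially a trigger-action rule: when a listening device connects, flip auto-answer on; when it disconnects, flip it off. A hypothetical sketch – the device names and `settings` dictionary are my own illustration, not a Shortcuts API:

```python
# Sketch of a trigger-action rule a Shortcuts automation could express.
# Everything here is hypothetical; it only models the desired behaviour.
AUTO_ANSWER_TRIGGERS = {"AirPods", "Apple Watch"}

settings = {"auto_answer": False}


def on_device_connected(device: str) -> None:
    # Enable auto-answer only when a listening device is put on.
    if device in AUTO_ANSWER_TRIGGERS:
        settings["auto_answer"] = True


def on_device_disconnected(device: str) -> None:
    # Turn it back off when the device is removed.
    if device in AUTO_ANSWER_TRIGGERS:
        settings["auto_answer"] = False


on_device_connected("AirPods")
print(settings["auto_answer"])  # True
```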

Improvements to Voice Dictation on the Mac

While Siri dictation on the iPhone is incredibly helpful for short messages, Voice Control dictation on a MacBook still has significant shortcomings for longer work, which can be incredibly frustrating when you are trying to write a report, blog post, work assignment, or essay.

To offer up real-world illustrations: 

  • a) I have a friend called Wojtek, whose name isn’t in the Voice Control vocabulary. I have added it to custom vocabulary with a capital “W”: Wojtek. However, when I dictate his name, Voice Control dictation always transcribes it with a small “w”: wojtek.
  • b) I frequently communicate with a company called SpeechWare. I have added the company name to custom vocabulary with a capital “S” and “W”, but when I dictate the word it is transcribed as “speechWare”.
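Both problems look solvable with a post-processing pass that restores the user’s preferred capitalisation after transcription. A minimal sketch, using the two example words above – this is illustrative only, not how Voice Control works internally:

```python
import re

# The user's preferred spellings, as stored in custom vocabulary.
CUSTOM_VOCABULARY = ["Wojtek", "SpeechWare"]


def fix_capitalisation(transcript: str) -> str:
    """Replace any casing variant of a custom-vocabulary word with the
    spelling the user originally stored."""
    for word in CUSTOM_VOCABULARY:
        # Match the word case-insensitively on word boundaries.
        pattern = re.compile(r"\b" + re.escape(word) + r"\b", re.IGNORECASE)
        transcript = pattern.sub(word, transcript)
    return transcript


print(fix_capitalisation("I emailed wojtek at speechWare today."))
# I emailed Wojtek at SpeechWare today.
```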

There are further strange anomalies in the way Apple dictation handles grammar. Both Voice Control dictation and Siri dictation always transcribe sun (the fiery orb) as Sun with a capital S (like the UK tabloid newspaper), even when you are dictating a sentence about the weather!

The verb “will” is often transcribed as “Will” – as in the man’s name – when the context should make it obvious you mean the verb, for example saying, “It will be hot later.” These sorts of errors are annoyingly common and draining to have to constantly clear up. 

Spelling Mode should add entries to custom vocabulary

The new Voice Control “spelling mode” in macOS Ventura (US-only at the moment) allows you to spell out words that dictation doesn’t understand – but it doesn’t remember your preferred spelling of a word, so the same mistakes keep happening. This is very inefficient.

Voice Focus mode for dictation

When you rely on dictation for everything you write, you don’t always have the luxury of a quiet, studio-like environment, and the MacBook’s internal microphones work less well in noisier surroundings – which is why I use an external SpeechWare microphone.

In the recently released Surface Pro 9, Microsoft appears to have come up with a solution to this. They call it Voice Focus, and the feature uses AI to isolate a voice and discard background sounds. This is something I would love to see Apple do for disabled users who are completely reliant on dictation.

Always-on Hey Siri on the Apple Watch

You can set the iPhone to always listen for Hey Siri voice commands, but not the Apple Watch. You can choose the setting, but the watch still requires you to twist and lift your wrist to wake it first – something severely disabled people can’t do.

I’m sure limited battery life has been the reason for this Siri behaviour on the Apple Watch, but with the recent release of the Apple Watch Ultra and its three-day battery life, I would happily accept a reduced 1.5-day battery life if Siri could truly be always listening, without any wrist movement required, just as it is on the iPhone.

Looking to future releases

It’s been a stellar year for voice accessibility and Apple technology, particularly for those of us who rely on voice enhancements. Apple has clearly been listening to what disabled users have been saying, and that’s great to see. However, there is more work to do if Apple devices and operating systems are going to be truly accessible to disabled people. 

If you are an Apple user and have accessibility issues or suggestions Colin would be interested in hearing from you: info@aestumanda.com 

Colin may not be able to reply to everyone, but welcomes feedback.

Further resources: