Accessibility and Apple’s autumn announcements 2021

Guest blog: Colin Hughes

Colin is a former BBC producer who campaigns for greater access and affordability of technology for disabled people. Colin is a regular contributor to Aestumanda.

Apple user and disability campaigner Colin Hughes earlier this year shared his experiences of using Apple gear as someone who uses a wheelchair and has only limited use of his hands. He’s now followed up with a look at the accessibility improvements Apple made in iOS 15 and watchOS 8 released this autumn.

Over the past few weeks, I have bought the new iPhone 13 Pro, Apple Watch Series 7, and AirPods 3, and have been trying them out along with Apple’s latest operating system updates, iOS 15 and watchOS 8.

Auto-answer calls

One of the biggest gains for me has been the addition of auto-answer for phone calls on the cellular Apple Watch.

A significant part of my advocacy over the past three years has been calling on Apple to introduce auto-answer on the company’s Watch. Auto-answer has been on the iPhone since iOS 10, so its absence on the cellular Apple Watch, which is effectively a wrist-worn phone, always struck me as odd. I turned on the functionality for the first time recently and soon received a phone call that was clear, and, as my disability means I can’t touch the Watch screen, I didn’t need to do anything to manage the call. The functionality brings a level of convenience, security and accessibility that is so important to people with severe upper limb disabilities.

Apple Watch series 7, 6 and 3 in comparison

However, there is a surprising failing – you can’t use Siri to switch on the auto-answer feature! It’s beyond ironic that auto-answer, a feature designed for people who can’t touch the screen, still requires you to touch the screen to toggle it on and off. It’s long been the same situation with auto-answer on the iPhone.

I would like to see users able to toggle auto-answer on and off with Siri voice commands, “Hey Siri, turn on/off auto-answer,” and by setting up Siri Shortcuts automations: for example, turning on auto-answer when I leave home, at a certain time, or when I put the Watch on my wrist.

Additionally, there is a bug: auto-answer remains active when the Watch is on its charger rather than your wrist. This really shouldn’t happen. Like many, I charge my Watch by my bed, so it gets topped up for sleep monitoring. It’s a bit unnerving to have auto-answer calls kick in when the Watch is off my wrist and in a bedroom.

Announce calls and notifications

Another huge win for accessibility this year has been Announce Calls and Notifications. It’s been massive for me. Every day, when I am out, I am answering calls, sometimes important calls, effortlessly and hands-free, just by saying the word “Answer”. Before this, unless auto-answer was on (which meant asking my carer to switch it on for me), I was never able to answer calls. This really increases independence for people like me.

Apple AirPods up close shot with packaging in background

Similarly, it has been a joy to have notifications from third-party apps like Facebook Messenger and WhatsApp read out to me for the first time while wearing AirPods, thanks to Announce Notifications in iOS 15. As someone who can’t pick up and open my iPhone to read messages and notifications, this new functionality makes me feel connected like never before. I’ve dealt with important Outlook emails, WhatsApp messages, and more besides hands-free, responding to and actioning important things promptly, using only what I’m hearing in my ears through my AirPods. This is really liberating and productive.

Siri and Shortcuts

Siri keeps getting better, faster, and more responsive. I can control more of my home devices with my voice, including my front door, allowing me to get in and out of my flat independently with the help of Siri on both my iPhone and Apple Watch: “Hey Siri, open the door.”

However, despite all this amazing progress there is still room for improvement.

Handling phone calls

You still can’t hang up a phone call with a Siri voice command, “Hey Siri, hang up,” which causes me problems most days as I can’t press the red button to end a call. The good news is I feel confident that Apple is listening to this gap in provision; a solution is coming, and it’s now a question of when, not if.

AssistiveTouch for Apple Watch

When I got the Apple Watch Series 7 one of the first features I was excited to try out was AssistiveTouch for Apple Watch. It’s a feature specifically designed for people with limited upper limb mobility. Here’s what the company has to say about it:

"AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls.

Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench. AssistiveTouch on Apple Watch enables customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Centre, Control Centre, and more."

Disappointingly, I am unable to activate AssistiveTouch with the limited muscle power in my arms and hands. It seems I just don’t have enough physical movement to trigger the feature. It’s made me question who Apple designed it for, because on paper it should be tailor-made for people like me. I have just enough upper limb movement to wake the Watch screen and clench my fist, but much of the technology relies on users being able to lift their wrist to look at the screen, and on hand movement and muscle power that I don’t have. It’s a clever idea, but its implementation falls short of what people with severe upper limb disabilities need. Hopefully, the company will refine the technology.

Speech recognition

Speech recognition on desktops and laptops, both Apple and Windows, is in a bad place at the moment. My productivity is hanging by a thread thanks only to Dragon Professional, the Firefox browser, and being able to run Dragon Professional with Parallels on my MacBook.

Sadly, speech recognition performs poorly when dictating copious amounts of text. Voice Control has hardly improved this year and remains good only for dictating a short (often error-strewn) sentence or two. You couldn’t write a 1,000-word blog article, run a business, or write a dissertation with its dictation capabilities; it would take you hours of frustration compared to Dragon Professional. As an Apple user I am looking enviously at what Google is doing on the new Pixel 6 with the Tensor chip. Although I have yet to try it, it offers the kind of advanced speech recognition I would like to see Apple provide for users who rely on voice dictation on the Mac for work, education and staying connected.

Access to technology and communication is a human right, and for some people speech recognition is the only means to communicate with the world and do grown-up things that go much further than dictating “happy birthday” with a cute emoji. Disabled people who rely on voice access deserve better than that, and I believe the big tech companies can and should do more. It’s not just Apple: I have tried the new Voice Typing tool in Windows 11, and it’s similar to the limited dictation capabilities of Voice Control.

Face ID fails with CPAP masks where older iPhones worked

Apple’s facial recognition tech worked flawlessly when I was wearing my CPAP mask, all the way from the iPhone X four years ago up to last year’s iPhone 12, but has ceased doing so with the more compact notch tech in the iPhone 13.

Person holding iPhone showing screen icons

It seems hardware and software changes Apple made in response to Covid mask-wearing may have had an unwelcome knock-on effect for those of us who wear CPAP masks. I have made use of “Unlock with Apple Watch”, a new iOS feature that helps you unlock your iPhone when you are wearing a face covering. However, the functionality is a £500 workaround for a function that previously worked perfectly, and it has shortcomings: it won’t work with banking apps and payments, so you still need to put your passcode in for those. Face ID’s failure to recognise me when I am wearing my CPAP mask represents a major step backwards for accessibility on the iPhone.

Whilst there is a lot to be thankful for in Apple’s 2021 releases, there is still some work to do.

