VoiceOver on the iPhone turns 10 - and turns blind access up to 11

10 years ago this week (19 June 2009), Apple introduced its screen-reading solution for blind iPhone users. Since then, VoiceOver has grown into a really powerful tool, giving access to every aspect of the iPhone and millions of apps. Now, however, VoiceOver has truly come of age and Apple has turned things up to 11.

Display of the different iPhone models from 2007

Image source: 9to5mac

No eyes and no keyboard

With the advent of the first iPhone in 2007, the blind community around the world was very sceptical about whether a flat sheet of glass could ever give them the access they enjoyed with their Nokia or BlackBerry feature phones, with their physical keyboards and (very costly) third-party screen-reading software.

When, two years later, VoiceOver first appeared on the iPhone 3GS, blind users soon came to realise that their concerns were largely unfounded. Although relatively basic compared to the support that VoiceOver offers today, it gave us access to every part of the phone’s features (including Apple Maps – it’s true) and to third-party apps with ease.

Accessibility menu – VoiceOver section on iPhone

Typing was indeed a little tricky on that shiny sheet of glass at first. With practice, however, we became more proficient – and VoiceOver also became smarter over time, with options such as touch typing (roam around with your finger until you hear the key you want, then lift it off to enter it), Braille and handwriting input modes (where the entire screen can be used to enter letters or Braille characters) and, of course, the addition of text dictation.

Along with everyone else, blind users benefitted from the simpler interface of the smartphone and its blossoming app ecosystem. With Apple’s excellent developer tools, most apps were accessible and people with disabilities weren’t left behind in the amazing new age of mobile computing.

VoiceOver is, of course, part of the iOS operating system, and every year has brought new refinements and capabilities to bring us to where we are today: a point where no one can doubt that a smartphone is as suitable a choice for a blind user as it is for anyone else. It has replaced many hundreds of specialist devices with one mainstream, much more affordable, package. Affordable? iPhone? Yes – a talking GPS device was circa £750, a talking notetaker over £1,000, and so on. The iPhone has replaced a backpack of devices and their chargers – and, of course, there’s always Android too.

It’s turning 10 - and turning it up to 11

At Apple’s recent week-long Worldwide Developers Conference (WWDC) there were many accessibility announcements – not least the life-changing full Voice Control for people with significant motor impairments. For blind users, however, the most exciting features were bound to be the bundle of brand-new abilities in VoiceOver.

In its 10-year history, there has never been a bigger single step forward than the set of new features coming to VoiceOver this year with iOS 13. Let’s list them all.

New VoiceOver features in iOS 13

Gleaned from the developer beta released at WWDC, the features below seem destined for iOS 13’s final launch in the autumn. It’s possible that some may be removed, and equally possible that new ones may be added – but the list is breathtaking as it stands today. With thanks to this post for the low-down – some items may only be completely understandable to VoiceOver users (whose minds they will undoubtedly blow), but hopefully everyone will come away with an appreciation of just how refined these additions to an already ‘Rolls-Royce’ solution are.


Faster performance

Already swift and responsive, VoiceOver’s general performance appears to have been noticeably improved, particularly when quickly dragging your finger through a lot of items or when switching between screens in an app.

Much more haptic feedback

Haptic feedback is hugely helpful for blind users. There is now subtle haptic feedback for most sound cues, including navigating and tapping items, turning the speech rotor and various errors. You also now have the ability to turn every sound and haptic effect on or off individually, in addition to the master switch for sounds and haptics.

Add or customise gestures and keyboard commands

iPhone accessibility VoiceOver and modifier keys menus

VoiceOver is driven by a wide range of on-screen gestures, as well as by many keyboard shortcuts when a Bluetooth keyboard is present. You can now reassign existing gestures and keyboard commands – and add new ones – to perform different actions. These range from basic things like navigating to different kinds of elements, adjusting speech settings, and quickly going to the home screen, app switcher and the notification and control centres, to really advanced and powerful ones like running any Shortcut. Most gestures can be changed, with the exception of the 1-finger swipes that move through items and the double-tap that activates whatever is focused.

Let’s let that sink in … You can now add or change gestures and keyboard hotkeys to do almost anything on your iPhone or iPad.

Custom VoiceOver ‘activities’

You now have the ability to create custom ‘activities’ – a set of settings that can be switched to quickly by hand, or applied automatically when you enter a specific app or a specific context, such as working in a word processor.

Currently, the settings you can apply include the preferred voice, speech rate and volume, as well as the punctuation level.

Custom punctuation levels

New in iOS 13 is the ability to create custom punctuation levels, in addition to the usual none, some and all. These let you change whether a character is passed straight to the speech synthesizer or spoken in a different way. This can be used, for example, to change the default pronunciation of the # character from "number" to "hashtag", or, shorter still, just "hash". These punctuation schemes are synced over iCloud to your other iOS devices and Macs, and can be exported to a file that can be shared with other people or backed up for importing later.

You can now also completely turn off the reading of emoji if you wish. If you interact with people who like to spam them to little benefit, or who include them in their usernames on social media, you can now silence them at the OS level.

Automatic analysis of images

Blind users have no idea what an image shows unless its creator has added a text description. You can now customise how VoiceOver handles image descriptions. Ever since iOS 11, Apple has used machine learning to guess at the objects and text in pictures, with VoiceOver reading the result out when you performed a 3-finger tap for additional information. Now, you can have VoiceOver read these descriptions automatically, or have it play a sound to let you know that one is available.
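For app creators, supplying that description means setting the image’s accessibility label (in UIKit, the `accessibilityLabel` property). The fallback behaviour described above can be sketched in plain Swift – the function name, parameter names and the “Possibly:” wording are illustrative assumptions, not Apple’s actual implementation:

```swift
// Hypothetical sketch of the announcement fallback: a creator-supplied
// description always wins; failing that, the machine-learning guess is
// offered (hedged); failing both, the element is simply an "Image".
func imageAnnouncement(creatorLabel: String?, mlGuess: String?) -> String {
    if let label = creatorLabel {
        return label                    // author's alt text / accessibilityLabel
    }
    if let guess = mlGuess {
        return "Possibly: \(guess)"     // hedged ML-derived description
    }
    return "Image"                      // nothing is known about the contents
}
```

The point of the sketch is simply that the machine-learning guess never overrides a description the app’s developer has taken the trouble to write.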

Better camera support

Yes, blind people want – and have every right – to take photos with their iPhone’s camera, and the Camera app now provides some extra guidance while taking a picture. In addition to telling you when one or more faces are in frame and where they are, you are now told if you are tilting your device, with additional audio, haptic and spoken feedback to help you hold the phone level.

Screen capture now includes VoiceOver

If you make a screen recording, VoiceOver speech is now included in the recorded audio. Previously, it wasn't. This makes the feature extremely helpful if you want to report accessibility issues to an app developer because you can just make a recording and demonstrate exactly where and how things aren't reading well.

Better Braille support

And finally let’s cover the improvement to Braille support in iOS 13. Blind users often connect an optional Bluetooth Braille display which allows them to read real-time Braille output of everything that is spoken on-screen.
Typing on a Bluetooth Braille display has been speeded up greatly, which should be particularly noticeable when using contracted grade-2 Braille.

VoiceOver now includes the open-source Liblouis Braille translator to provide translation for Braille displays. Liblouis has become an industry standard – it is used by Microsoft and Google – and supports a larger number of languages. However, if you prefer the old Braille tables, they are still available as well.

There is also now a separate rotor for changing the Braille language table. Previously, this was tied to the speech language rotor.

Lastly, VoiceOver now displays position information within a list in Braille. For example, if you focus the airplane mode switch in the Settings app, then in addition to the switch itself being indicated in Braille you’ll also see a message like "1/50", indicating that this is the first of the 50 items in the list of settings (this doesn’t appear to be indicated in speech for the moment, at least in the context of lists).
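As a sketch, that Braille position string is just the item’s one-based index over the total count (the helper’s name is made up for illustration):

```swift
// Illustrative helper: format a list position the way the Braille
// display shows it, e.g. the first of 50 settings reads as "1/50".
func braillePosition(index: Int, count: Int) -> String {
    return "\(index + 1)/\(count)"   // one-based position over the total
}
```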

Two thumbs-up for VoiceOver at 10

The above may have blown your mind or possibly left you cold (depending upon how much you know or use VoiceOver) but believe me when I say that this is a bumper year for blind users. It’s huge. 

You may be excited about some of iOS 13’s other flagship features – perhaps dark mode, external storage device support, or multiple windows for the iPad – but the VoiceOver features above are the biggest thing to come to blind users in many years. They take a miraculous device and make it even more empowering for those who are perhaps the least obvious users to accommodate.

Thanks Apple. Here’s to another great decade of accessibility.

Four iPhone screens showing the dark mode option

Image source: 9to5mac
