Appily Ever After: Robin Christopherson's speaker notes from FOWA 2012


AbilityNet's Head of Digital Inclusion, Robin Christopherson, presented at Future of Web Apps and Mobile in London in October. A video of the full presentation will be available shortly. 


Robin Christopherson speaking at FOWA London 2012

Today I want to talk about how mainstream apps and gadgets that are meant to make the lives of 'normal' users easier are also helping disabled users - sometimes much more so than their able-bodied counterparts.

Let's first hear from accessibility evangelist Matt May of Adobe who explains what I mean (video).

The iPhone is a better platform than ever to develop for, as its accessibility features just grow and grow. With the iOS 6 accessibility enhancements we now have digital hearing aid drivers, AssistiveTouch and Guided Access, and Zoom and VoiceOver can now be run simultaneously. And the new Apple Maps feature now allows a blind person to trace roads with their finger and investigate intersections and local POIs – how cool is that?

But there are still issues with even very high-profile apps. Facebook's recommended friends are invisible to VoiceOver users, Google Plus is just a complete nightmare, and the best deals in the Sainsbury's app all just say 'button' to VoiceOver users. Yet the products in the Tesco app all speak fine, so we know it can be done.
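The Sainsbury's problem is typically just a missing accessibility label. Here's a minimal sketch of the fix in Swift (a hypothetical deal-tile button - the image name and label text are my own illustration, not Sainsbury's actual code):

```swift
import UIKit

// A hypothetical image-only deal tile. With no label set, VoiceOver
// has nothing to read and announces it simply as "button".
let dealButton = UIButton(type: .custom)
dealButton.setImage(UIImage(named: "half-price-pasta"), for: .normal)

// One line gives VoiceOver something meaningful to speak instead.
dealButton.accessibilityLabel = "Half-price pasta, was £2.00, now £1.00"
dealButton.accessibilityTraits = .button
```

That's the whole trick: every custom control needs a spoken name, and the platform does the rest.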

There are also apps designed for people with no speech that are being used by those with, say, laryngitis on a blind date (video).

New apps like Fleksy add speed to touch-typing for those without Siri or without speech.

And there are keyboards that add convenience for the average user that really help those with a vision impairment like the new Touchfire tactile overlay (video).

The virtual assistants, Apple's Siri and Google's Voice Assistant (video), are battling it out for supremacy to help those with a motor, cognitive or visual impairment. But which one wins when the going gets tough? (video) Take a look at my blog post for more details on how disabled people could be the big winners in the battle of artificial intelligence interfaces.

Then there are apps like Vokul that can do some of what Siri does but completely hands-free. So with the right app you can even be productive if you have no vision and are completely paralysed (like my sister).

To wrap up this section let’s take a quick look at the OneVoice report on mobile accessibility best practice.

For the last few minutes I want to talk about the fantastic potential for disabled users represented by Google’s Project Glass (video). They’re undoubtedly sexy tech but they also offer huge possibilities for people who are blind (image recognition and geolocation information), those who are deaf (speech recognition giving real-time subtitles as people talk) and facial/emotion recognition for people with autism to name but a few.

Here’s a news story on a similar product that can detect emotions - the 'X-ray specs' (video).

And here’s a demo of emotion recognition software in action (video).

Here’s a news story about a UK firm developing similar glasses but using a mini projector in the arm rather than an embedded screen.

Lastly let’s look at Felix Baumgartner’s 24-mile skydive (video), which reached a maximum speed of 833mph. It was the highest freefall and the highest manned balloon flight, and yet he could have died because his visor had fogged up from the cold outside and the moisture inside – yet another example of where multiple inputs/outputs really count. In this case, having spoken output from his instruments as well as a heads-up display could have helped him know his altitude and hence when to open his parachute. I think that knowing how far you have to go before you plummet head-first into the ground is kinda reassuring, don’t you?

Finally I'm pleased to announce that we're working with the Royal London Society for the Blind on a free event in London called ‘Everybody Technology’ - book now and join us on 30 Nov.

Please follow me on Twitter @USA2DAY and @abilitynet for more of the above!
