Seven top tips for creating accessible mobile apps

AbilityNet tests apps for many of its clients

These seven top tips will help you crack mobile accessibility and are based on our experience of working with many companies on creating accessible mobile apps. Making apps accessible means millions more people can use your product, not just those with recognised disabilities such as sight or hearing loss.

1. Native controls are best

Use native controls and components rather than developing custom User Interface (UI) elements. Native controls typically have accessibility built in, especially when implementing them in line with the developer guidelines.

2. Make use of platform-specific guidance

Both Apple and Google provide specific accessibility guidance for developers. They're well worth following to ensure a high level of accessibility.

3. Test during development

Start thinking about accessibility from the beginning of a project and the development process will be easier and less costly than trying to fix it later. During the process there is a certain amount of in-situ testing that can be carried out under the Quality Assurance (QA) process, or by developers themselves.

Mobile screenreaders such as VoiceOver (iOS) or TalkBack (Android) can be used to expose the underlying accessibility of mobile applications. You can test using VoiceOver on iOS, and the Android website offers a testing checklist, including TalkBack.

Comprehensive accessibility reviews are also recommended.

A snapshot of the BBC accessibility page


4. Develop in-house guidelines

It's useful to have an in-house standard for accessibility to use across applications. While the Apple and Google guidelines are worthwhile as references for developers, the BBC has developed its own mobile accessibility guidelines, which you can find on its website.

They're platform-agnostic guidelines, covering both native and hybrid apps, with example code for both platforms. You could use them as the basis for creating your own guidelines, or simply adopt the BBC ones.

5. Create an accessible colour palette

All accessibility guidelines make reference to colour contrast. Rather than addressing issues during the QA or development stages, it is best to create a colour palette that uses the most accessible colours from the outset. The W3C's web-based colour contrast checkpoint is well recognised here and defines standard checks for contrast.

This blog from UX designer Stephanie Walter offers a useful guide to creating an accessible colour palette. And Tanaguru Contrast-Finder is a good online resource for testing the contrast of specific colours.
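If you want to bake such checks into your design process, the contrast ratio these tools report can be computed directly. Below is a minimal sketch of the WCAG 2.x formula (relative luminance plus a 0.05 offset); the function names are our own, not taken from any particular tool or library.

```javascript
// Linearise an 8-bit sRGB channel (0-255), as defined by WCAG 2.x.
function linearise(channel) {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an [r, g, b] colour.
function luminance([r, g, b]) {
  return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b);
}

// Contrast ratio between two colours: 1:1 (identical) up to 21:1 (black on white).
function contrastRatio(colourA, colourB) {
  const [hi, lo] = [luminance(colourA), luminance(colourB)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG AA asks for at least 4.5:1 for normal-size body text.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(2)); // "21.00"
console.log(contrastRatio([118, 118, 118], [255, 255, 255]) >= 4.5); // true - this grey passes AA on white
```

Running candidate palette pairs through a function like this at design time means contrast problems never reach QA.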

6. The benefits of user testing

Technical accessibility is half of the battle, but an application needs to be usable in addition to being accessible. This can be determined through user testing with disabled people. You could do this yourself, or outsource it to a third party. This web usability testing article is really useful, and applicable to mobile as well as regular computers.

7. Persona based testing

A simpler way to include provisional user testing during design and development is through the use of 'personas'. You could create a set of personas representing typical users and use these in design sessions to work through potential issues. A useful overview of this approach is available from the gov.uk website.

More information

Wayfindr creates a new way for blind people to navigate

AbilityNet Tech4Good Awards 2016

Now in its sixth year, the AbilityNet Tech4Good Awards showcase some of the amazing ways that tech can help make the world a better place, including digital health, skills and young people. The Accessibility Award is judged by AbilityNet and demonstrates innovation in meeting the needs of disabled people. This year’s winner is Wayfindr, which provides a new way for blind and partially sighted people to navigate their environment.

After the excitement of the ceremony in the summer, we caught up with Katherine Payne of Wayfindr to learn more about how it works and what it offers. She explained that the team is working with Transport for London (TfL) to develop a technical solution that could transform the lives of people around the world.

Tell us more about Wayfindr...

Essentially, it’s a set of instructions and code that can be built into an app, such as the Transport for London (TfL) app. When someone using the app passes a strategically-placed Bluetooth beacon, they will get audio instructions and directions spoken via their smartphone to help them navigate their environment, for example a tube station. The instructions are very detailed, so they tell a user how many steps they are about to walk down and so on.

Journeys are planned using humans and algorithms. It works a bit like a satnav, but with more detailed instructions, and can be used on the Underground or in places without phone signal. We're still doing trials at the moment, so it's not publicly available.

Where did the idea for Wayfindr come from?

In their 2014 manifesto, the youth forum of RLSB (Royal London Society for Blind People) said they'd love more independence while using the tube and being out and about. They also talked about how useful they find smartphone maps. Something sparked. Within six weeks, digital product studio ustwo had prototyped a basic version of Wayfindr, working with RLSB and the forum.

How is the idea coming to life?

It's not a new idea as such, but we're at the point where consumer technology has caught up so we can use off-the-shelf equipment, such as smartphones and the Bluetooth beacons.

We are creating a standard set of instructions (Open Standard) to be used by app developers and those managing public spaces. This will include advice for developers and a prototype demo app that can be incorporated into apps, for instance Google Maps, or an app for a shopping centre. The Open Standard was created after months of research, testing and designing, while collaborating with experts.

Wayfindr won the AbilityNet Accessibility Award at the Tech4Good Awards 2016

What stage are you at with Wayfindr?

Wayfindr is a joint venture between RLSB and ustwo, which received £1m of funding for the project through the Google Impact Challenge: Disabilities programme. Over the last 18 months or so we've trialled it with around 100 people who have sight loss, and we are adapting it all the time.

Our trials so far have been on transport networks in London and Sydney, but the idea of the Open Standard is that the user experience is consistent wherever it is used, making travelling simpler as people move from place to place.

Could Wayfindr also be useful for people other than those with sight loss?

Absolutely. We're realising the global scale of the solution. It can be useful for those who struggle to interact with signs or who have cognitive issues, as well as those who find London and crowds overwhelming or don’t have English as a first language. People with learning disabilities or anxiety might also find it useful because it can take some of the confusion out of travelling.

What's next for the project?

It's very much an ongoing piece of work. At the moment we're working on the next release of the Open Standard, which should be available by the end of the year. People are really keen to know when they can have it on their phones and we’re working tirelessly with transport networks to move Wayfindr forward, including looking at how this would work large scale within a transport system.

More information

Check out these blogs from people who've trialled Wayfindr:

AbilityNet resources

Wearable tech that interprets sign language

Wearable technology is poised to bridge the communication gap between the 70 million people in the world who use signing as their first language and the rest of the hearing world. A research team at Texas A&M University, led by associate professor of biomedical engineering Roozbeh Jafari, is developing technology that can detect hand and finger movements and translate them into spoken or written text, as we see in this short video.

The wearable technology combines sensors that detect hand motion with other sensors that measure the electromagnetic signals from the wrist muscles. It is able to recognise the hand gestures of the wearer and then translate the individual’s sign language into text or even spoken words using synthetic text-to-speech.

“We decode the muscle activities we are capturing from the wrist,” says Roozbeh Jafari. “Some of the activity is coming from the fingers indirectly, because if I happen to keep my fist like this versus this [slightly changes hand shape], the muscle activation is going to be a little different.”

A breakthrough in translation

British Sign Language sign for computer - both open hands held next to each other in front of body with palms facing down. Fingers wiggle.

Computers are getting ever better at translation from one written or spoken language to another. There are a huge number of apps, such as Google Translate on your smartphone, that make communicating with others who speak a different language much easier.

Even Skype on your computer now comes with an option to instantly translate what you have just said into another language which is then spoken out loud to the person on the other end of the call – and vice versa.

Sign language, however, presents a completely different challenge, not least because it varies considerably from English as we know it. Though based upon English, it has quite a different grammar (sentence structure) and a reduced vocabulary. It is in effect another language, and sign languages vary across the world. Translating from British Sign Language into spoken English presents the same challenge as translating from, for example, French into English.

Actions speak louder than words

So the technology of translating from one written, or even spoken, language to another is maturing nicely. Translating from hand gestures, however, has its own unique challenges and this breakthrough tech will help give the deaf a voice and put their actions into words.

The team at Texas A&M University hope eventually to shrink this prototype into something the size of a smartwatch – reading tendon and muscle contractions and variations in electrical pulses directly from the wrist and translating them into everyday spoken English.

More information

Take the AbilityNet CAPTCHA challenge

There's no doubt about it: CAPTCHA is evil. For a lot of people those distorted images you see on online forms are annoying, but for a disabled person they can be a complete showstopper. I have written about how CAPTCHA hinders people with disabilities (Website Security: Sorting the Humans From the Robots) but today’s blog is to give you a first-hand experience of its evil ways.

What is the AbilityNet CAPTCHA challenge?

It's a very simple way for you to understand why CAPTCHA is an accessibility no-go. All you need is pen and paper (or suitable digital alternative).

Below is a video in which I play an example of audio CAPTCHA. I challenge you to play the video, write down the audio version of the CAPTCHA and try your best to get the right answer.

The nightmare that is CAPTCHA

So how did you get on? You didn’t think that the audio would be the same as the visual CAPTCHA, did you? That would have been far too easy.

And how many times did you have to play it before you got two answers the same?

I’ve never successfully been able to interpret an audio CAPTCHA. For the record I listened to the video twice and my answers were:

  • 93797800908
  • 923797800908
  • 937978O0090
  • 9237978o00908

You’ve just experienced what a blind person like myself hears when we click on the audio-CAPTCHA icon (usually a wheelchair for some bizarre reason). It’s usually encountered at the very end of an online process for something important such as registering for an account or buying a product.

For many others with a vision impairment, dyslexia or learning difficulty the visual alternative is just as challenging. As it invariably comes at the end of the online process, for these users it’s a tragic tumble you aren’t getting up from - just as the finishing line is in sight.

CAPTCHA Be Gone

Yesterday I came across a new service that may help. Called ‘CAPTCHA Be Gone’, it enables someone to submit a CAPTCHA for solving with a simple keystroke. Within a few seconds you have your elusive answer miraculously pasted to your clipboard.

It’s a plug-in for Firefox, so it's only useful if that’s your preferred browser (which for many disabled users it is), and it carries a $3-a-month service charge. I suspect this is because it employs actual humans solving your CAPTCHAs across the internet.

For disabled people everywhere, however, who know the pain of being blocked from so many services by the evil that is CAPTCHA this is a very small price to pay.

Don’t be evil - Don’t use CAPTCHA

Google famously has the motto ‘Don’t be evil’ and is indeed taking steps to improve on the accessibility catch-22 that is CAPTCHA (more info in that blog on sorting the humans from the robots), but there is still a long way to go before its tyranny is no longer felt by users across the internet.

In the meantime remember the day you took the AbilityNet CAPTCHA challenge. Think about the impact that CAPTCHA, unlabelled images more generally, and so many other aspects of website inaccessibility are having on disabled users every day.

Try to be part of the solution. Try not to be evil - embrace accessibility in everything you do.

Challenge others and spread the word

Now you know how much of a challenge CAPTCHA is why not challenge others?

Share this post on all your social platforms using the widget below - or share the post on our Facebook page.

Rio 2016 Paralympics website: Gold medal lessons on accessibility

Rio Paralympics 2016

Over the past couple of weeks, we've been monitoring the Rio 2016 Paralympics site closely. At the start of the competition, we briefly highlighted some of the accessibility issues of the Paralympics site. Now we've dug a little deeper and can provide some examples of issues which need to be resolved, in the hope that lessons will be learned for the future.

The main points the site fails on are: 

  • inconsistent skip-to-content navigation
  • a missing focus indicator
  • assumptions about user understanding
  • unclearly tagged graphics.

Inconsistent Skip Navigation

The Paralympics 2016 website has a 'skip to content' option on some pages but not on others. The skip option, which is simple to implement, is used primarily by people who have a physical disability that makes mouse use painful or impossible. This could be ULD (upper limb disorder/RSI) or Parkinson's. Such users instead navigate mainly with a keyboard, using the Tab key.

Users with profound physical disabilities rely on switch input - these are one or two physical switches (buttons/ voice commands/ movements or other) which allow the user to step through the page in a linear way, interacting with any content of interest. Without skip navigation, the method of interaction can be very time consuming as users need to step through all content in a linear fashion.

In order to access the video on the Paralympics 2016 home page, a mouse user would click just once. But because the page has no skip-to-content option, a keyboard or switch user would have to tab 30 times to navigate to the video. In the following image, we've highlighted all of the content that a keyboard/switch user would have to tab through.

A focus indicator is essential

When navigating a page using a keyboard/switch it is essential for users to be able to follow their own progress as they tab through content. As they tab onto links or buttons, an outline shows them which link/button is selected.

This works well on the navigation menus of the Paralympics 2016 site. However, on the schedule and results page, tabbing through the results table is not possible for keyboard/switch users as there is nothing to show which link is selected. This is as simple to fix as one line of CSS (the language which governs how web pages look): simply adding an 'a:focus' selector to an existing 'a:hover' declaration would resolve this problem for sighted keyboard users.

Don't assume understanding

On the login form and registration page, there are a number of issues. Firstly, an asterisk is used to indicate required fields; however, there is no instruction telling users that this is so – some users may not be aware of the meaning of the asterisk.

In addition, there are no error messages at all on these pages. Instead, when registering with missing or incorrect information, input fields are highlighted in red (see image below). This is particularly problematic for users with learning difficulties, but will ultimately affect all users. Without error messages, users will find it difficult to know why they could not submit the form. This fails multiple accessibility checkpoints, but is not challenging to fix.

Developers need to make sure they:

  • Clearly identify errors to all users (including screenreader users)
  • Describe the error in plain text (eg, which field had the error)
  • Describe how to fix the error (eg, the username field cannot contain any punctuation)
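As a sketch of those three points, a validation routine can return plain-text messages that are shown next to each field and exposed to screenreaders. The field names and rules below are invented for illustration; they are not taken from the Rio registration form.

```javascript
// Hypothetical sketch: return descriptive, plain-text error messages rather
// than relying on red highlighting alone. Field names and rules are invented.
function validateRegistration(form) {
  const errors = [];
  if (!form.username) {
    errors.push("Username: this field is required.");
  } else if (/[^\w]/.test(form.username)) {
    errors.push("Username: the username field cannot contain any punctuation.");
  }
  if (!form.email || !form.email.includes("@")) {
    errors.push("Email: please enter a valid email address.");
  }
  return errors; // render these as text near each field, not just as colour
}

console.log(validateRegistration({ username: "rio!2016", email: "user" }));
// two messages: one naming the username problem, one naming the email problem
```

Each message names the field and says how to fix the problem, so a screenreader user (or anyone else) knows exactly why the form would not submit.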

Tag graphics clearly

The table above uses graphic icons to represent competition days and days with medals. Visually, the table is very useful - a sighted user can easily hover over certain areas of the chart to clearly and quickly see dates and events. However, reading through the table using the screenreader JAWS, the highlighted table cell for 9 September is announced as '9th Sept. Link graphic. Nav/medal-indicator. Athletics Fri. 9th Sept. Link graphic. Nav/medal-indicator. Column 4'. This is not very helpful! Friday is also abbreviated to 'Fri'.
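A sensible label can be assembled from information the cell already holds. As a hypothetical sketch (the function and data here are our own, not from the Rio site), the image's alt text could be built like this:

```javascript
// Hypothetical sketch: build a concise accessible label for a schedule cell
// instead of letting screenreaders announce raw file names like
// "Link graphic. Nav/medal-indicator".
const FULL_DAYS = { Mon: "Monday", Tue: "Tuesday", Wed: "Wednesday",
                    Thu: "Thursday", Fri: "Friday", Sat: "Saturday", Sun: "Sunday" };

function cellLabel(sport, day, date, isMedalEvent) {
  const fullDay = FULL_DAYS[day] || day;          // expand abbreviations like "Fri"
  const medal = isMedalEvent ? ", medal event" : "";
  return `${sport}, ${fullDay} ${date}${medal}`;  // use as the image's alt text
}

console.log(cellLabel("Athletics", "Fri", "9 September", true));
// "Athletics, Friday 9 September, medal event"
```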

It would only take seconds to make these descriptions more useful. Check out our expert accessibility review of the Rio 2016 Olympics website.

If you'd like AbilityNet's experts to carry out an accessibility check on your site, click here for more information.

Common questions about fibromyalgia and computing

Fibromyalgia has been in the news recently, with Kirsty Young, the presenter of Radio 4's "Desert Island Discs", announcing that she is going to take a break from the programme as her fibromyalgia is causing her issues. The condition is much misunderstood and causes pain all over the body. Fibromyalgia can also have symptoms such as non-refreshing sleep and clumsiness. This week (2-9 September) is Fibromyalgia Awareness Week.


FMA Awareness week logo

From day to day I have really sore fingers. I’ve heard about voice recognition. Is it difficult to set-up?

No! Not at all. If you have a fairly new Windows or Apple computer then you have built-in voice recognition. It is easy to use and, as long as you practise for a while, you should be able to get fairly good recognition. We’d always suggest getting a USB microphone though, as the microphones built into a computer tend not to be of a high enough quality to recognise your voice effectively.

I want to keep on typing to control the computer. What might work for me?

Depending on how you are affected by fibromyalgia, there are a couple of solutions that might work for you. There are keyboards known as “compact”. These don’t tend to have the number pad on the right-hand side, which means you don’t have to “stretch” from one side of the keyboard to the other. Other keyboards have a bit of a “softer touch”, so you don’t need to hit the keys quite as hard. Other technology might include word prediction software.

Fibromyalgia causes “brain fog” and I have real issues trying to work. I find software with many options confusing. What can I do?

Within software packages like Microsoft Word there are lots of ways of making things easier for yourself. One of the most effective is the ability to remove icons from the software that you never use. This should help you focus more effectively on the functionality that you do need.

What about smart home devices? Could they help me?

Devices such as Google Home and Amazon's Alexa-powered Echo can certainly help you in all sorts of ways. If you have a poor memory you can ask the devices to remind you about appointments or things that you need to buy at the supermarket. They can also help if you feel anxious, as there are lots of "skills" that can improve your mental health or encourage you just to take some "time out" from your day. If you have difficulty sleeping you can always turn on some relaxation sounds as you drift off to sleep.

Case study

Clive's sister Fiona has fibromyalgia and has lots of difficulty keeping up to date with hospital appointments. They had a chat with our friendly Advice and Information Officer, who suggested using an online diary in conjunction with their smart home device so Fiona could make, and more importantly remember, important hospital visits. Fiona is a very visual person, so one of our volunteers went out and helped her to colour-code her appointments to make them easier to see.

How can we help?

AbilityNet provides a range of services to help disabled people and older people.

Call our free Helpline. Our friendly, knowledgeable staff will discuss any kind of computer problem and do their best to come up with a solution. We’re open Monday to Friday from 9am to 5pm on 0800 269 545.

If you are in work, your employer has a responsibility to make reasonable adjustments. For more details on this, have a look at www.abilitynet.org.uk/ctod and www.cleartalentsatwork.com

Arrange a home visit. We have a network of AbilityNet ITCanHelp volunteers who can help if you have technical issues with your computer systems. They can come to your home, or help you over the phone.

We have a range of factsheets which talk in detail about technology that might help you, which can be downloaded for free. You may find our factsheets talking about voice recognition and keyboard alternatives useful.

My Computer My Way. A free interactive guide to all the accessibility features built into current desktops, laptops, tablets and smartphones.

Accessibility check: Is the Paralympics website ready for action?

At AbilityNet we're running checks to see how the Rio 2016 Paralympics website works for people with sight loss and other disabilities. The Rio 2016 Olympics site failed in several key areas. The good news is that the Paralympic site, at first glance, does look better. However, there are some issues too. See what our checkers have picked up on, below:

The positives

1. There is a ‘skip content’ link
This allows people who have trouble using a mouse, or those using a screenreader, to skip past site menus to the content they want, rather than having to tab through or hear content that isn't of interest.

2. Contrast / enlarge text customisation option
This allows anyone with a visual impairment, colour blindness or other disabilities to see content more clearly and easily.

3. Most images and buttons do have alt tags
Meaning screenreaders can decipher various elements of a site easily and everyone gets a fuller picture. This was not the case with the Olympics 2016 site.

The negatives

1. Confusing homepage link
The main Paralympics logo (which is a link to the homepage, as with most sites) is not labelled accurately as 'home page'. Instead, it's entitled 'Go to' which is not clear or useful to a screenreader.

2. Unlinked text for language selection
The words 'language selection' on the home page are static text. There is no option to actually select a language, whether you are sighted or blind.

3. Confused and broken links for auto play
There is a button on the homepage called ‘stop auto play’, but it is not clear what the purpose of this button is, and when clicked it just takes you to the top of the page.


Our team works with organisations around the UK to ensure their sites comply with UK law and meet accessibility standards. We will update you as we do further checks on the Rio 2016 Paralympics site.

For more information about how we make the AbilityNet site accessible, see here: https://www.abilitynet.org.uk/accessibility-statement.

Siri update makes AI work harder for disabled users

“Hey Siri, text John to say I’m running 10 minutes late.”

It takes only a few seconds to say, but for someone with a disability the same task could take several long minutes. Artificial Intelligence (AI) is powering a revolution for disabled smartphone users and the latest Siri updates announced by Apple show us how virtual assistants could transform the lives of disabled people.

Let your virtual assistant take the strain

Despite the truly excellent accessibility of many smartphones, having a disability can mean that performing tasks is often time-consuming, sometimes tiring and even painful.

For a blind user it can take time to review the screen with speech software – and typing when you can’t see the virtual keys on a sheet of glass hobbles even the most breath-taking of touch-typists. For someone with motor difficulties each tap takes some time and effort. For someone with a learning difficulty, there is considerable thought and concentration that goes into completing every task – and each separate stage in a process is another possible hurdle where one could stumble.

Enter AI.

Artificial Intelligence has the promise of turning a multi-stage process (a process that requires that you are familiar with its functions and features and can physically interact with its interface) into a much less daunting one, in which you simply have a chat with your device in the same way that you would with an attentive and obliging friend who is ever-ready to help. I say “promise” as, of course, we’re not quite there yet.

Sirious potential

SiriKit expands the power of Siri

For me as a blind person, sending a text to John in the positively old-fashioned way (i.e. unlock my phone, fire up the Messages app, choose John from Contacts, type or dictate the message and finally hit Send) might take two minutes at best. If I can’t dictate the message because I’m in a noisy environment it can take a lot longer – especially if I have a lot to say. To get Siri, my virtual assistant, to do the same task takes only a few seconds.

I know what you’re thinking – what you may even be positively shouting at your screen. “But Siri is about as intelligent as a toddler and half as reliable,” you say. And you’d be right. Well, actually, I would argue that in many ways a toddler is far smarter, but when Siri works well she works wonders, and when she doesn’t you just move on, having only wasted a few seconds in the trying.

For the average user this ‘First try the AI’ approach might on balance be too frustrating. But for many disabled users it’s a strategy well-worth exploring and, as our virtual assistants get ever-smarter, they may eventually find themselves with an ingrained way of working that will save significant time over everyone else who abortively abandoned AI.

SiriKit set to step up the smartness

Due to be released at Apple’s much-awaited event tomorrow, iOS 10 will include amongst its features the new ‘SiriKit’. SiriKit will allow many more apps to be controlled by the virtual assistant, which in turn will give us much more choice when it comes to doing things the lightning-fast way.

In addition to those current areas where Siri can be helpful (such as finding out information, making a call, asking for a particular song or playlist, sending a Tweet, creating an appointment or setting a reminder), the categories of app that will initially be supported include:

  • Messaging - “Tell John I’m running 10 minutes late using WhatsApp”
  • VOIP - “Call John on Skype”
  • Payments - “Send John £10 using PayPal”
  • Ride booking - “Call a taxi using Uber”
  • Workouts - “Start a 10-minute run with RunKeeper”

It is rumoured that a total of 600 apps will integrate with SiriKit at tomorrow’s iOS 10 launch. The list of supported categories will undoubtedly grow significantly in the coming months, as will the number of apps in each category.

Siriously simple to support

It seems pretty straightforward to support Siri’s new capabilities. Developers can readily build an extension within their app that communicates with the virtual assistant. Siri will do all the heavy-lifting - i.e. the difficult job of accurate recognition of the user's speech and the smarts needed to extract their meaning or intent. As a result the app receives a straightforward command to perform a certain task and can then communicate back to Siri for her to display the result.

That’s it. Within a matter of seconds I’ve paid the bill, booked the cab or begun a group video call. And for those inevitable times when Siri lets me down and I have to go the old-style, positively Neanderthal route of laboriously tapping my fingers multiple times on different places on a sheet of glass, I’ll do it gladly in the knowledge that I’ve only wasted mere seconds, that my virtual assistant may well come through for me next time, and that AIs will only get smarter and quicker as the weeks and months go by.

Regardless of how you may personally feel about your virtual assistant, rest assured that disabled users like me are really enjoying the power and productivity they bring. And when our work is done, we can ask Siri for the latest sports scores, quickly catch up with Tweets and posts, find a good place to eat and decide upon the best movie to rent tonight. Just think what we’ll be able to do after Siri’s update tomorrow…

For a full run-down on what’s new in iOS 10 and Siri after tomorrow’s launch check out Apple’s events page and read this TechCrunch article for more information on what SiriKit has in store.

Amazon Echo

Update: Siri, meet Alexa

Newsflash: It looks as if Amazon’s virtual assistant Alexa is finally coming to the UK. ETA 14 September.

According to Engadget UK’s breaking article on the Amazon Echo, we are soon going to be able to use Alexa to perform a plethora of tasks too. Alexa has a similar set of abilities to Siri – actually called ‘Skills’ – that a developer can use to teach Alexa to interact with apps and services.

Being blind I feel a particularly sweet anticipation for a completely screenless virtual assistant.

With Apple’s event tomorrow, and the final arrival of the Echo on 14 September, a fruitful future for virtual assistance for the disabled is virtually assured.

Pokémon Go: Are incense and Street View the key to more inclusivity for disabled people?

Everyone seems to be playing Pokémon. It's hard to explain the game unless you've played it, but the app uses augmented reality so that when you hold your phone camera up, you'll see a virtual Pokémon world around you, full of various species of Pokémon, which are cute and a bit tricky to catch.

The big question we’ve been pondering at AbilityNet is “how accessible is Pokémon Go?”. Few in our office admit to playing Pokémon, but bloggers, reviewers and disability charities have plenty to say on the matter, including how incense can offer a handy fix...

Unstoppable Gamer is not happy that people with a physical disability are almost instantly prevented from playing Pokémon Go.

One of the site's bloggers, AJ Ryan, who uses a wheelchair, has managed to find a fix so you can, to some degree, play Pokémon Go without leaving the house. It means making a small in-game purchase of “incense”. This seems to attract Pokémon so they come to you, rather than you having to go out and get them.

Pokéball flinging and colour blindness

Catching a Pokémon also relies heavily on your ability to see colours and have good targeting skills, notes the Unstoppable crew. Different colours define how easy a Pokémon is to catch: a green ring means it is easy to catch, and a red one means very hard. Those are exactly the two colours most often confused by people who are colour blind!

Flinging the Pokéball is pretty tricky too, as this excellent video about the craze shows.

The American Foundation for the Blind says that audio cues would be good to let you know where the Pokémon is and whether or not you’ve aimed the Pokéball properly.

Audio cues and Pokémon Go Plus

A visitor to the Apple accessibility site AppleVis agrees: “My sighted husband has been playing the game and we had a bit of a brainstorming session on how it could be made accessible. The most tricky part would be actually catching the Pokémons as you have to 'aim' and flick Pokéballs at it. I figured some sort of audio cue ala Audio Archery would probably work here.”

Interestingly, a special add-on due out at some point in September, at around $35, could help that gamer. The Pokémon Go Plus is a small bracelet that vibrates and has an LED light to let you know a Pokémon is near. It will apparently let you catch it simply by pressing a button.

Pokémon street view option?

Action for Blind People’s guest blogger is frustrated at the lack of thought around disabilities. They suggest one option for inclusivity would be to let people play Pokémon Go from home using a street view application.

AbilityNet’s Head of Digital Inclusion, Robin Christopherson, says: “While on the face of it, it seems an entirely visual game, it actually lends itself very well to being made accessible – even for blind VoiceOver users. Let’s hope that the practical suggestions provided by these users will lead to accessibility improvements in the near future, and that all users, regardless of ability or impairment, can enjoy participating.”

Pokémon Go positive

We should stress that there has been some really positive feedback too, such as the recent BBC clip of autistic teenager and Pokémon fan Adam, who has suffered from anxiety and had hardly left the house for five years. He's found the game easy to relate to, and he and his mum now love going out into green spaces every evening to catch Pokémon. She says it's changed their world.

The Huffington Post reports that Ralphie, who is six and has autism, has found Pokémon to be a great tool for building social skills. And guest blogger Shaylee Rosnes, who has cerebral palsy, writes in The Mighty that Pokémon Go gives her a chance to forget about her disability.

One last important note: while we can advise on all sorts of technology and accessibility (0800 269 545), including free home visits for older or disabled people, we do not advise on Pokémon.

Alex Barker and Claudia Cahalane

Three tech innovations for hearing loss

Amelia Lewis, 25, is a health and wellbeing advisor for DeafPLUS and a research assistant at the Institute of Cognitive Neuroscience. She tells us about the three tech developments that are most important to her.

1. Cochlear implant

I had a hearing aid from the age of 18 months. I was born with Pendred syndrome. My hearing deteriorated as time went on, so I had my cochlear implant fitted when I was 15, and it is absolutely essential to my life.

2. ComPilot

My number one piece of tech, apart from my implant, is a ComPilot. I wear it wirelessly around my neck. There's a little microphone on it, and in staff meetings I just put it in the middle of the table and it transmits directly into my ear implant. I can also use it for listening to music on my iPod: there's an additional lead you can plug into it and then into an MP3 player. It's really good at cutting out background noise.

3. Subtitles

There's so much tech that helps me make the most of life. Even the basics, like subtitles on TV and films, make such a difference. It's great being able to access everything said on TV. I love documentaries, and I never get bored of Friends either.

To hear more from Amelia, check out our video below.
