Use your voice to help you get organised this #UniMentalHealthDay 

The mental health of the more than 2.3 million students studying at UK universities is an important topic. With around a third of students reporting clinical levels of psychological distress, this Uni Mental Health Day 2019 students are being encouraged to make their voices and stories heard. One way your voice can help to reduce stress in your studies is by taking advantage of some of the many features and functions of smart assistants.

Take control with smart assistants

We all know about the virtual helpers within our phones and tablets, and increasingly common are those smart speakers on our desks and bookshelves. They can perform a wide range of tasks, from giving us useful facts and information such as the weather or news, to playing music, sending texts or starting calls. When they work they save time and feel a little like magic.

But did you know that your voice and these smart assistants can also help keep you organised, study more efficiently and reduce those stress levels?

Let's take a look at a few top tips to reduce Uni-related stress to more manageable levels.

Attack absent-mindedness with appointments

This first one's a no-brainer. There's nothing more stressful than being late for important meetings or forgetting to turn up for tutorials. Use your voice assistant to add recurring appointments that cover all your classes and extra-curricular activities so that you know where to be during your working week. It's easier than you think; simply ask your Google Assistant, Siri or Alexa to create a new appointment and they'll prompt you for the rest. Tip - when they ask for the day and time say something like "Every Friday morning at 10am." 

Don't forget to put in where each appointment is. Giving each appointment a location means that Siri or Google's Assistant can help to get you there if you need a little help those first few times. They'll also know how long it will take you to get there and will remind you in good time to leave. No more missed lectures or dates for coffee. 

Recruit the power of reminders

This is another obvious one. All of the smart assistants give you the ability to set reminders that will be triggered at a certain time or place. For example, ask Siri, "Remind me to hand in my essay next Friday morning at 10," or say to your Google Assistant, "Remind me to speak to Dr Smith about extra time when I'm at the Engineering Department." Taking the jumble of things to remember out of your brain and dumping them into your smart assistant will undoubtedly help you feel more organised and in control.

If you're in the US, then your Echo can do location-based reminders too. Even though Alexa is confined to a cylinder in your room, the companion app will use your phone's GPS to trigger the alert as you approach that location. This will be coming to the UK soon - just like all new things with Alexa, they take a few months before we get to play with them too.

Don't forget that you can also set recurring reminders; "Remind me every Thursday at 10pm that it's my turn to empty the bins."

Enlist the power of lists

This too is a simple but powerful idea. Finding a piece of paper to write down a task or note may be easy enough, but finding those scraps of paper again when you need them might not be. If you're creating longer lists of points, then you'll always need that same bit of paper to hand when the next notion strikes. It's easy to lose a piece of paper, but we all know where our phones are all of the time - don't we? Take the stress out of capturing those tasks and important notes by using your voice to pop them into your personal digital assistant.

Asking Siri, say, to "Add toilet paper to my shopping list," will do just that. You can then ask her to show you your shopping list when you're next at the supermarket, and away you go. Never again will there be another communal toilet-paper crisis like the famous one of Freshers' Week 2018… and there's nothing more stressful than that. With both Siri and the Google Assistant you can create as many lists as you like. With Alexa, at present, you're limited to 'Shopping' and 'To-dos' - but again the companion app will keep track of them so you can review them whenever and wherever you are.

You're probably thinking that making a few lists isn't going to make a massive difference to the levels of anxiety associated with Uni life - but we're reducing stress wherever we spot it to make things more manageable when we can.

Stay focused with the power of tomatoes

Feeling anxious about getting down to work and continually looking for distractions? If the prospect of open-ended, unstructured study is stressing you out then you may want to enlist the power of tomatoes. Well, pomodoros to be precise. What am I talking about? 'Pomodoro' is Italian for 'tomato' and the Pomodoro Technique is a simple way of segmenting your studies into manageable, juicy chunks.

This is how it works:

1. Choose your task (planning that essay or revising those notes) and start a timer for 25 minutes. This is your first Pomodoro. Commit yourself to giving your full focus to the task for that length of time - it isn't long after all is it? If you suddenly realize you have something else you need to do, note it down for later but don't get distracted from your task.
2. When the Pomodoro rings, put a satisfying checkmark on a piece of paper. Congratulations! You’ve spent an entire, interruption-free Pomodoro on a task.
3. Now take a short (5 minute) break. Breathe, meditate, grab a cup of coffee, go for a short walk or do something else relaxing (i.e., not work-related). Your brain will thank you later.
4. Ready for the next Pomodoro? Go for it. After 3 or 4 Pomodoros you deserve a 15-30 minute break. That's it. You're now using the Pomodoro Technique to work harder and smarter.
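
If you'd like to tinker, the timing routine above is simple enough to script yourself. Below is a minimal, illustrative Python sketch of a command-line Pomodoro timer - the durations, labels and round counts are just the defaults described in the steps above, not part of any official tool.

    import time

    WORK_MINUTES = 25
    SHORT_BREAK_MINUTES = 5
    LONG_BREAK_MINUTES = 20
    POMODOROS_BEFORE_LONG_BREAK = 4


    def wait(minutes: int, label: str) -> None:
        """Count down the given number of minutes, printing a reminder each minute."""
        print(f"\n{label} for {minutes} minutes - stay with it!")
        for remaining in range(minutes, 0, -1):
            print(f"  {remaining} minute(s) left...")
            time.sleep(60)  # one minute per tick


    def pomodoro_session(rounds: int) -> None:
        """Run a study session of `rounds` Pomodoros with breaks in between."""
        for completed in range(1, rounds + 1):
            wait(WORK_MINUTES, f"Pomodoro {completed}: focus on your task")
            print(f"Done! Tick off Pomodoro {completed}.")
            if completed % POMODOROS_BEFORE_LONG_BREAK == 0:
                wait(LONG_BREAK_MINUTES, "Long break: step away from the desk")
            else:
                wait(SHORT_BREAK_MINUTES, "Short break: stretch, breathe, grab a drink")


    if __name__ == "__main__":
        pomodoro_session(rounds=4)

Running it kicks off four Pomodoros with short breaks in between and a longer break at the end - exactly the rhythm described above, just without the satisfying paper checkmarks.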

"But where," I hear you say, "do voice assistants fit in?" Well, you can obviously use them to set those Pomodoro timers so that you don't lose track of time. On the Echo,Tomato helper symbol however, there is a skill called Tomato Helper which will do the timing for you. During each Pomodoro you can choose to listen to a quite ticking noise or else just silence. During each break you get some fabulous music to chill out to. To start the skill just ask Alexa to "Open Tomato Helper."

After work, wind down with help from your smart assistant

When some work has been done and your meetings not missed, when the shopping is fetched and some to-dos ticked off, you can de-stress with the help of your voice and your smart assistant. While Siri and Google's Assistant can launch your favourite tunes or open an app or two to take your mind off work, there's no better choice when it comes to entertainment than Alexa. With over 32,000 skills (over 50,000 in the US), Alexa can play every sort of game, from quizzes to immersive audio adventures, all using your voice alone. Regular readers of my posts may be aware that my enthusiasm for Alexa's many talents is featured daily in the 'Dot to Dot' podcast. If you have an Echo, why not try asking her to play a game or list her top skills? And if games aren't your thing when it comes to unwinding, then just ask her what's on TV or at the local cinema. She also knows the addresses and opening times of local nightlife venues. Once you've had a great, relaxing evening you can tell your smart assistant to tick that item off your to-do list.

Related articles:

 

Note-taking hacks to improve student mental health and well-being

It is unsurprising that 71% of students say university is one of their main sources of stress. For the staggering 1 in 4 students who suffer from a mental health condition, stress can make existing problems even worse. 

Having good note-taking skills, which is a common part of any educational experience, can give an important boost to any student. OneNote, which is free to download on Windows and iOS, is a multiplatform digital notebook for capturing thoughts in free-form text, drawings, screen-clippings and audio notes. 

Technology can be harnessed to help students focus on their mental health and organise their time better for improved wellbeing. OneNote enables students to feel in control, which not only helps general mental wellbeing but can also support those managing conditions such as bipolar disorder, anxiety, depression and ADHD. 

Discover the ways note-taking in OneNote can make university life easier for you:

Researcher

Researcher in OneNote is an invaluable tool for students who are not sure where to start. In just a few easy steps Researcher helps you to find relevant quotes and citable sources to start an outline in OneNote. Begin by typing a keyword for the topic you’re researching, and then sit back as OneNote pulls together a list of relevant and credible sources.

Bonus Hack!

Concentration can be a real challenge at the beginning of a project, even more so if you live with conditions such as ADHD and depression. Whether you’re in a lecture or you’re trying to keep to a deadline, use Microsoft’s Focus Assist in Windows 10 (Quiet Hours in earlier versions of Windows) to switch off distracting email and social media notifications in order to stay focussed.

Note Tags and To-Do Lists

Student Minds encourages small mindful actions to manage self-expectations, such as writing to-do lists, because “clear, time-specified goals enable us to succeed, as goals can be achieved”. Tags in OneNote are perfect for creating intuitive and versatile to-do lists. Use interactive tick boxes or create custom tags to organise both typed and handwritten notes. Your notes can be visually flagged by importance, action, question or topic. You can search your notes by tag, and on OneNote 2016 you can even see a tag summary, which gives you the option to narrow your search scope and create summary pages of chosen tags.

Share Notebooks

With group work, make sure you are all on the same page with OneNote. In OneNote for Windows 10, Mac, iOS and online you can share whole notebooks with other people and give them access to view and edit content. The ability to virtually collaborate on shared work in real time can alleviate the pressure of coordinating busy schedules and allows you to avoid crowded and loud communal areas.

Bonus Hack!

Working in group projects with new people you are not familiar with can be daunting and difficult. The ability in OneNote 2016 and OneNote for Mac to show and hide the authors of edits is a valuable tool for differing study preferences. Perhaps seeing the author’s initials clutters the page for you, or seeing the pace of your peers’ work creates unnecessary pressure? Hide the authors and settle into your own rhythm. Or maybe seeing the authors of edits provides the insight and motivation you need to feel more at ease in a group project? Reinstate the authors with the click of a button.

Tell Me

Microsoft’s Tell Me feature, found in all Office 365 apps including OneNote, is a multi-purpose tool. A lightbulb icon for those eureka moments aptly represents a text box which says ‘tell me what you want to do’. Simply type in a feature you want to use or an action you want to perform and Tell Me will take you there. Struggling to understand a word used by an author? You can use Tell Me to access Smart Lookup, which researches and defines the term you enter. 

AbilityNet can help

AbilityNet is a UK charity that helps people to use technology to achieve their goals. If you have questions about disability and technology you can call us on 0800 269 545 or email enquiries@abilitynet.org.uk.

Related Reading:

The radio industry and accessibility

With World Radio Day on our radar in February, AbilityNet chatted to stars of the radio world who are blind or have sight loss about life on air and behind the mixing decks. How has the accessibility of radio technology changed over the years and what do they love about their work? 

Interviewee 1: Darren Paskell, Mushroom FM

"I joined Mushroom FM in 2015 to rediscover my passion for internet broadcasting and devote some of my time in pursuit of an enjoyable hobby. I love the station because of its reputation for professional production standards and its well established spirit of innovation. We have a great community of listeners.

It’s easier than ever to be an internet radio broadcaster as a person with sight loss, thanks to some excellent developments in accessible software. StationPlaylist, which makes professional software for the broadcasting industry, has always been mindful of accessibility needs. StationPlaylist Studio has many advanced features and can handle the whole process of broadcasting, from playing music and managing schedules to connecting to the internet radio stream and monitoring listening levels.

Brian Hartgen from the station Team-FM has worked to build add-ons and adaptations for StationPlaylist software which make it even more accessible to use.

Radio broadcasting used to be so much more complicated. All that's required these days for a budding broadcaster is a decent Windows-based computer, a USB microphone/headset, some content and a radio station to complete the link between you and your audience. I would encourage anyone interested in internet broadcasting to get involved; we love welcoming new people into the industry."

Listen to Darren's Mushroom FM show, Sunday for Tea on Sundays from 5pm-7pm.


Interviewee 2: Allan Russell (pictured below), RNIB Connect

"I started volunteering at RNIB’s Connect Radio in 2001 and was offered a post as content producer in 2005. It’s a full-time job involving production and presenting. I didn’t think about radio presenting until I lost my sight and had to leave work. I had to take time to train with a white cane, a guide dog and a screenreader and my previous work took a back seat.

Allan inside the RNIB studio surrounded by equipment wearing headphones

While learning to use JAWS screenreader software on an RNIB course, I heard the charity was thinking about setting up a radio station and looking for volunteers and things evolved from there.

I use JAWS screenreader with a PC and VoiceOver with my smartphone or tablet and use a mix of both for editing and researching stories. We’ve got a mainstream studio set up, similar to those used by most radio stations. It’s not completely accessible, but it’s used because RNIB wants presenters to be able to use mainstream equipment, should the opportunity to work at a station like Radio 2 arise.

Studios are very tactile anyway and I use JAWS for some aspects of the work. The playout system, RCS, that we use for promos, adverts, music and to schedule shows is accessible to a point, though I have a support worker through Access to Work to help with this side of things, i.e. setting up shows, putting audio on to live logs and voicetracking programmes. When I go out and about, I can do things on my own. I have flash mics for recording when outside; they record everything in the microphone so it's simple. My job is always different. People think you just come in and talk on a mic, but there’s a lot more to it."

Listen to Allan on weekdays at 8am presenting on Early Edition: RNIB Connect Radio's newspaper review and discussion programme around issues relating to blind and partially sighted people. RNIB Connect Radio is available across the UK on Freeview channel 730 and online.
 

Interviewee 3: Brian Hartgen, Team-FM

"I’m a computer programmer by trade and in the last ten years I’ve made many adaptations to the StationPlaylist software to make it more accessible both for myself, my wife Lulu (pictured below) and the rest of the crew at Team-FM. These adaptations are also available for people who are blind or who have sight loss and work at other radio stations.

Lulu Hartgen speaking into mic, smiling

I write JAWS for Windows Scripts as my day job and so some of the accessibility measures are about getting the StationPlaylist Suite communicating with JAWS more clearly to improve accessibility for everyone.

I create scripts for StationPlaylist Studio (for planning playlists and song playout), together with StationPlaylist Creator (for station management and scheduling).

We refine accessibility as the need arises. I’m lucky to have development skills and there’s always something to improve upon.

We now sell the adaptations as part of our consultancy company. Without such adaptations blind people have to rely either on human assistance or convoluted methods to put together a radio show. A good example would be having to quickly learn how long a music track still has to run. Without using the JAWS script we’ve created, that is a very slow and complex thing to do. 

Most station software is cloud-based these days, but I don’t feel that method sounds anywhere near as good as the StationPlaylist suite of products and we prefer it for that reason."

Listen to Brian at 7pm on Saturdays on his Music Machine show - an interesting musical mix from different genres - and at 5pm on Sundays on Our Time, with a selection of music from the 60s and 70s.

If you have a disability and need help using technology, contact the AbilityNet helpline on 0800 269 545.

Check out 7 radio stations with an accessibility / disability slant here. 

Winner of Global Mobile Accessibility Award 2019 announced

I am very proud to once again be a judge at the Mobile World Congress in Barcelona this week (25-28 Feb). Now the winner of the award for 'Best Use of Mobile for Accessibility & Inclusion' has been announced. And the winner is…

The glamorous GloMos

The Global Mobile Awards (or GloMos for short) celebrate the best of mobile in dozens of categories as diverse as 'Best mobile tech breakthrough', 'Most innovative mobile app' and 'Best mobile innovation for health and biotech'. 

Podium set for announcing the winners at the GloMo Awards ceremony. "Intelligent connectivity #GLOMOAwards" on the backdrop

At a glitzy ceremony on Tuesday, the winner of the best use of accessibility award was announced. The well-deserved winner was Motorica and Beeline for their smart prosthetic limbs for children.

Motorica and Beeline for smart prosthetic limbs for children

The team at Motorica are working to revolutionise the power of prostheses. They say: "Disability shouldn’t limit human capabilities - modern technologies can expand them. We want prosthetics to be perceived not only as medical equipment but also as delivering its own unique super ability. Our users are already modern cyborgs and superheroes."

Photo of the Beeline and Motorica team accepting their award at the GLOMO Awards

Together with Russian mobile network giant Beeline, Motorica have developed cutting-edge IoT-enabled prosthetic hands. Most children rely on dummy hands until they’re 12-13 years old because high-tech products exist only for adults. Now, instead of useless cosmetic prostheses simulating the presence of a hand, both children and adults have the opportunity to use high-tech and truly functional products that expand their possibilities and bring joy to the patients.

Judges' comment

Read out at the ceremony was the following quote from the judges:

“Most children without upper limbs have to rely on immovable ‘cosmetic’ hands because high-tech products are only usually available for adults. Thanks to this fantastic work, young children now have the opportunity to use a range of smart, connected and fully-functional hands that greatly expand their abilities at home, in school and at play. Truly life-changing.”


Related articles:

Hot shortlist for Global Mobile Accessibility Award 2019 announced

Volunteering your IT skills to help others

AbilityNet has a network of volunteers that provide free IT support to older people and disabled people of any age. This month AbilityNet (AN) went up to Scotland to meet up with some of our volunteers in the north of the UK. As part of the trip we sat down with Chris Grant (CG) to get a first-hand perspective of what it’s like to be an AbilityNet ITCanHelp Volunteer. Here’s what he had to say…

Interview with an AbilityNet ITCanHelp Volunteer

AN: Hi Chris, it’s great to be able to meet face-to-face like this and thank you for agreeing to be interviewed. To start us off would you mind introducing yourself?

CG: Sure! My name is Chris Grant. I’m an AbilityNet ITCanHelp Volunteer and also the County Co-ordinator for AbilityNet’s network of volunteers in Scotland.

AN: And am I right in saying you’ve been volunteering with us for coming up to a year now? Thinking back, how did you first hear about AbilityNet?

CG: Well, whilst browsing a website for local volunteering opportunities I came across the role at AbilityNet and applied straight away. I was really impressed with what AbilityNet was doing and its mission. I read a lot online about your services which made me keen to start volunteering as soon as possible - following a successful interview process that is!

AN: We shared your enthusiasm! So, having been with us a fair few months now, can you tell us more about your role and some of the common things you help with as a volunteer?

CG: I mainly work with our volunteers in Scotland to ensure they can help people who contact AbilityNet for support, and do so in a timely manner. I also create links with local organisations to market AbilityNet’s ITCanHelp services. As a volunteer myself I often support people in their homes with Facebook-related queries and with staying safe online.

AN: Thank you, and can you tell us a bit about your life prior to becoming a volunteer?

CG: Prior to joining the AbilityNet family I volunteered with a number of organisations to support their IT projects. I also had two spells at BT as a Customer Service Advisor for Business Repair.

AN: So throughout your career you must have met and helped many people. In your time as a volunteer with AbilityNet do any particular moments stand out?

CG: You meet some lovely people volunteering. I really have met some great people. The support from AbilityNet’s central team is amazing and I have a lot of time for them. When it comes to the people we’re helping there’s one lady in the North East of Scotland that I’ve developed an amazing friendship with. I get great pleasure from helping people with their technology, but I also enjoy connecting with people on a personal level, so if they want to talk about what’s happening in their life and their favourite TV show then even better!

AN: Supporting people with their technology, you must be very knowledgeable about IT. What apps or device features have you been most impressed by recently?

CG: IT is such an ever-progressing thing, and over the last few months I’ve learnt about a lot of different assistive technologies. I’m very impressed with Seeing AI, an app that uses the camera on your phone to read text out loud and allows you to check the amount of money in your hand, amongst other things. It’s described as the ‘talking camera app for those with a visual impairment’.

AN: That’s amazing and it’s great to hear you’re always discovering new things in your role. So, for someone reading this, realising they could probably use their IT knowledge to help others, what would you want to say?

CG: If you want to use your IT knowledge to help people, then get in touch with AbilityNet and apply to become a volunteer.

AN: And before we finish, what are three attributes you need to be a good volunteer?

CG: You need to be energetic, reliable and committed.

AN: Thanks so much Chris for sharing your experience with us. Is there anything else you’d like to say?

CG: Get involved with AbilityNet. Being a volunteer is such a rewarding cause and great fun. You meet so many lovely people and remember, if you can help just one person, think about what they can now do all because of your support!

Find Out More

If you’d like to use your IT skills to support people you can enquire on our website about becoming an AbilityNet ITCanHelp Volunteer in your local area.

If you or someone you know could benefit from free IT support at home you can request a home visit on our website, call our helpline on 0800 269 545 or email us at enquiries@abilitynet.org.uk.

Two amazing ways that blind people can borrow other people's eyes

Smartphones - and increasingly smart glasses - have made it possible for blind and visually impaired people to instantly benefit from someone else's eyes and intelligence. Let's look at two top solutions in this space.

AIRA: iOS and Android

The first solution we're going to look at is AIRA. Shortlisted in the mobile accessibility and innovation category in the 2019 GloMo Awards and now available in the UK, AIRA uses the camera on your smartphone - or built into a pair of smart glasses - to bring assistance to people who need to borrow a pair of eyes and someone's trained support at those crucial times when they need that little extra help. Let's see it in action in this short video.

As we see in the video, there are many instances where the combination of a remote pair of eyes and efficient, informed support can make life easier for those with little or no vision. For example:

  • having those remote eyes look around and tell you where you are
  • having your agent use Google Maps and their training to guide you from where you are to where you need to go via the best and safest route
  • having the built-in software provide audible chimes to count you down to the next intersection as you walk (with additional verbal guidance from your agent if required) 
  • help to find that elusive pedestrian crossing button
  • help to find and then read airport or train station departure screens (or have the agent look up the data online)
  • help in guiding you to the right platform and onto the train and finally to a free seat
  • having the agent look up the Facebook profile of a colleague to ensure that they can then help you recognise them in the crowded foyer prior to that important meeting.

The ability to use the service hands-free with smart glasses and not have to hold your phone in front of you also makes for easier and less socially-awkward interactions - particularly when the assistance you need is longer than a few seconds or a few minutes. 

In a recent article I wrote about three cool smart glasses to help people who are blind or have sight loss, and the AIRA solution features among them – it most certainly earns its place in that list.

The service isn't free. Monthly price plans start at around £89 for 100 minutes of assistance. This includes the smart glasses, insurance and training on how to use them.

Be My Eyes: iOS and Android

Be My Eyes was a winner at the Tech4Good Awards 2018. Like AIRA, this free app uses the phone's camera to quickly connect with willing helpers and get sighted assistance with everyday tasks.

Be My Eyes, however, differs in two very significant ways. It's free (so no monthly subscriptions), which is an obvious plus in its favour, but on the flipside the people you connect with aren't paid professionals with the panoply of digital backup services that we've seen in action above - they're willing and often very able to help, but don't expect to keep them on the line for a long session of in-depth assistance.

Not yet with smart glasses support, this app is really meant for shorter (but hugely helpful) snatches of assistance at the tap of a virtual button. Be My Eyes is absolutely perfect for those moments when your computer has stopped talking to you, when you need to know if the dishwasher has finished its cycle (flood anyone?), when you're out and about and wondering where that postbox, doorway or doughnut stand is, or when you need to use that ATM and would rather a stranger across the internet helped you with the buttons than a stranger in the queue who could easily then stroll off with your money.

You too can be a Be My Eyes helper. Simply download the app on Android or iOS and select the option to be a sighted volunteer.

Example of Be My Eyes home screen with message "Call first available volunteer"
Example of volunteer screen on a mobile device being asked which button is HDMI 2 on a TV remote control


Related articles:

Barclays: Creating an accessible organisation

Is your organisation thinking about accessibility? Do you provide an accessible working environment for staff and customers? What could you do better to provide this? AbilityNet's TechShare Pro 2018 event featured a popular session on organisational accessibility, with presenters from Barclays and Google.  

In this blog, David Caldwell, senior digital accessibility consultant - who presented for Barclays - gives some top tips on creating an accessible culture at your organisation.

David Caldwell writes:

In 2012, a collaboration with RNIB for the Making Money Talk Campaign led us to become the first UK bank to enable our cash machines to talk. It was to help blind customers and those with sight loss use cash machines without assistance. We had thought about accessibility before, but this became a real catalyst for change at the company.

Talking cash machines really got internal leaders interested. We then began running a one-day immersive programme for leaders called 'Living in a Customer’s World' and started working with the Business Disability Forum for advice on how we could build on what we were doing.

David Caldwell (pictured above)

There are 80,000 colleagues across the business, 25,000 of whom are technologists. The rest work in a really broad spectrum of positions, from those on the high street to others working with complex financial instruments and those in investment banking. Trying to get accessibility embedded in all of that has been a challenge. But there have been wins, such as being the first bank in the UK to offer video relay for BSL users in branch.

Accessibility for Barclays staff

There’s a strong adjustments process for staff who might need adaptations in how they work, which colleagues get from the moment they join the company. If there’s something a member of staff thinks they need to do their job, it will be given to them. If you’re dyslexic and you used Read and Write Gold with your last employer, we’ll provide it. There’s no need to see an occupational therapist.

We also work and partner with disability organisations so that if, for example, a member of staff or their manager thinks they might be dyslexic, but they’ve never been tested - we work with them on testing and support.

Barclays also has a couple of programmes that look at employing people with disabilities, who might find it harder to get into the workplace (disabled people are far more likely to be out of work than the average population). Able to Enable is one of the programmes and is part of our apprenticeship proposition.

In addition there is the Purple Champions, an allies programme for disabled staff and those who wish to continue to raise awareness around disability issues.

Accessibility for customers 

Colleagues in customer facing roles receive training in how to interact with and best support customers with disabilities and mental health issues, while colleagues working within our technology departments receive training on how to design, develop and deliver accessible and inclusive products and services.

Then there are our Digital Accessibility Champions, who have gone through the baseline accessibility training and additional, more detailed training. The programme primarily, but not exclusively, includes those who have a tech role, i.e. developers, designers and testers. We work with them to become champions and develop a closer relationship between them and our in-house accessibility consultants.

We currently have 90 champions on-board and they each get a consultant within the digital accessibility team to go to if they need to know more and understand more about accessibility.

David Caldwell's top five tips on company-wide accessibility:

1. Where to start

The Business Disability Forum (BDF) is a useful place to start. They have an Accessible Technology Charter. We’ve found this very useful for getting leaders engaged and focused. Leaders like frameworks like this as it’s a consistent approach used by other organisations and we can provide data on what other companies are doing.
 
2. Getting colleagues and leaders engaged

While the messages need to be different for colleagues and leaders, it’s important that you approach things from both angles. Broadly speaking, leaders are interested in things which help achieve organisational goals or objectives, e.g. growing the top line, reducing complaints or reducing costs. Colleagues are more interested in outcomes for themselves or improving the work they do for others, e.g. improving the user experience, building their skillset or reducing effort.

3. Make it easy

Don’t subject people to standards unless they ask for them – focus on the impact accessible technology can have. Use a culture change approach to embed a new mindset about accessibility. Build tools and processes to support colleagues to deliver accessibility consistently. For example, we’ve built a training programme called the Accessibility Academy, which is a role-based training programme for colleagues. It includes specific resources for leaders too. We’ve also built accessibility into web development processes to make accessibility easy and deployed test tools to allow colleagues to check their own work.

4. Work with partners

Doing this alone is hard, if not impossible. Find partners who can help at both a strategic and practical level. Generally, this would look like a testing partner, plus someone to help you look at the bigger elements of work that need to be done to become more accessible and inclusive, e.g. training development/delivery, procurement, deploying testing tools etc.

5. Measuring progress

You can measure progress on accessibility with the BDF’s Accessibility Maturity Model (AMM). There are five steps to go through. Leaders are used to seeing maturity models for technology, so the approach feels familiar. The AMM framework provides not only the ‘you’re here right now’ but also a ‘how you can get there’ process.
 

Related content:

Did you miss TechShare Pro 2018? Here's what you missed.

Want to come to TechShare Pro 2019? Find out more here. 

Make your website accessible here. 

Our top 4 tech tips for Encephalitis sufferers

Today (February 22nd) is World Encephalitis Day. Whilst encephalitis is a comparatively rare illness, its after-effects can be long term and can cause difficulties with memory as well as physical movement. Here are some questions that we get asked regularly...

Frequently Asked Questions

Since my diagnosis, I've found that I get tired when trying to produce work. What tech can help?

Voice recognition is a really useful piece of technology and will help you get work done in a more effective way. You can edit documents and also control many of the functions of your computer. Enrolment is quick and easy too - all you need is a USB microphone and a little bit of patience! The built-in voice recognition is surprisingly good and, once correctly set up, the system should recognise around 95% of what is said if you speak clearly.

My memory has got worse with my illness and I need to be reminded to take my medication. What might help me?

You can download apps on your smartphone which will remind you when it's time to take your drugs. If you have a smart home device such as an Alexa, you can download a skill to help you remember your medication schedule.

I am now finding it difficult to control my pointing device. Are there any alternatives?

There are lots of other pointing devices that might be more effective for you to try out. We've got a factsheet with lots of ideas, and companies are often happy to let you try equipment out before you buy to see if it is suitable for you. You can also slow down your pointing device to make it more manageable.

I'm struggling at work and I don't know if my employer will understand my issues. How can I start a conversation with them?

It can be difficult to start a conversation with your employer about how your health is affecting your work. It might be useful to create a profile on Clear Talents, a resource that is designed to help you discuss your difficulties with your employer. Your employer might feel that it is appropriate for a workplace assessment to take place to identify useful solutions. 

Case study

Gulbadam contacted our helpline as she is recovering from encephalitis and wants to be able to use her voice to control her iPhone and iPad. We pointed her towards our My Computer My Way resources, where there are some great examples of what functions you can access by voice. She is not very good at remembering appointments, so the ability to use her voice is going to be very useful!

How we can help:

AbilityNet provides a range of free services to help people with disabilities and older people use computers and other digital technology to achieve their goals. There are a number of ways and situations in which you can contact us and request our help:

  • Call our free helpline - our friendly, knowledgeable staff can help with many computer problems and questions about adapting digital technology to your needs. Our helpline is open Monday to Friday from 9am to 5pm on 0800 269 545.
  • In a work environment, all employers have a responsibility to make Reasonable Adjustments to ensure people with disabilities can access the same opportunities and services as everybody else. For more details read How to Identify Reasonable Adjustments and visit the Clear Talents website.
  • Arrange a home visit - we have a network of volunteers who can help if you have technical issues with your computer systems. They can come to your home, or help you remotely over the phone.
  • We have a range of factsheets which can be downloaded for free and contain comprehensive information about technology that might help you.
  • My Computer My Way - a free interactive guide to all the accessibility features built into current desktops, laptops, tablets and smartphones.

Related content:

AI is making CAPTCHA increasingly cruel for disabled users

A CAPTCHA (an acronym for "completely automated public Turing test to tell computers and humans apart") is a test used in computing to determine whether or not the user is human. You've all seen those distorted codes or image-selection challenges that you need to pass to sign up for a site or buy that bargain. Well, improvements in AI mean that a crisis is coming … and disabled people are suffering the most.

CAPTCHAs are evil

Whatever the test - whether it's a distorted code, having to pick the odd one out from a series of images, or listening to a garbled recording - CAPTCHAs have always been evil and they're getting worse. The reason is explained in an excellent recent article from The Verge: Why CAPTCHAs have gotten so difficult. Increasingly smart artificial intelligence (AI) is the reason why these challenges are becoming tougher and tougher. As the ability of machine learning algorithms to recognise text, objects within images, the answers to random questions or a garbled spoken phrase improves month on month, the challenges must become ever more difficult - to the point where humans themselves struggle to crack them.

Example of a hard to read CAPTCHA

Jason Polakis, a computer science professor at the University of Illinois at Chicago, claims partial responsibility. In 2016 he published a paper showing that Google's own image and speech recognition tools could be used to crack their own CAPTCHA challenges. "Machine learning is now about as good as humans at basic text, image, and voice recognition tasks," Polakis says. In fact, algorithms are probably better at it: “We’re at a point where making it harder for software ends up making it too hard for many people. We need some alternative, but there’s not a concrete plan yet.”

We've all seen the 'I am not a robot' checkboxes that use clever algorithms to decide if the user's behaviour navigating the website is random enough to be a human. These used to work well - letting us through with that simple checking of the box - but increasingly the bots are able to mimic a human's mouse or keyboard use, and we get the same old challenge of a selection of images popping up as an additional test of our humanity.

The Verge article quite rightly bemoans the place we've arrived at – highlighting how difficult these ever-more-obscure challenges are for people with normal levels of vision, hearing and cognitive abilities. We just can't compete with the robots at this game.

Don't forget the disabled - we're people too

But what about all those people who don't have 'normal' abilities? People with a vision or hearing impairment or a learning disability are well and truly thwarted when it comes to CAPTCHAs that test the vast majority of humans to the very limit and beyond. After reading the article, I came away feeling that this very significant group (a fifth of the population and rising) deserve a mention at the very least - after all, they've been suffering in the face of these challenges far, far longer than those who do not have a disability or dyslexia (and have been locked out of many an online service as a result).

At the very heart of inclusive design is the ability to translate content from one format into another. For example, if a blind person can't see text on-screen, it should be possible to convert it into speech (that's how I'm writing this article). If someone can't easily read a certain text size or font style or certain colours, then it should be possible to resize the text or change the fonts and colours - this is all basic stuff that most websites accommodate quite well. Images should be clear and their subject easy to understand - and they should include a text description for those who can't see them at all. Audio should be clear. All aspects of ‘Web Accessibility 101’.

The whole point of CAPTCHA challenges is to allow for none of these. No part of the challenge can be machine-readable or the bots will get in. Text can't be plain text that can be spoken out by a screenreader for the blind - it has to be pictures of characters so excruciatingly garbled that no text-recognition software can crack it. Ditto with an audio challenge. Pictorial challenges must be so obscure that object recognition software can't spot the distant traffic lights amongst the foliage etc, etc. It has ever been thus. 

Today the road signs need to be obscured by leaves because the bots are better than ever at recognising them - but five years ago the images were still chosen to be just complex enough so as to thwart the bots of the day. And because the bots are using the same machine-learning AI as the assistive software used by disabled people to convert content into a form that is understandable to them, they were locked out too.

Did I mention? - CAPTCHAs are evil

So long as websites want to keep the bots from registering spam accounts or posting bogus comments, there will need to be some way for developers to detect and deflect their attempts. The use of CAPTCHA challenges, however, is not, and has never been, a fair (or even legal) way of doing so. It discriminates against and disenfranchises millions of users every day.

So, whilst the article neglects to mention the significant segment of users most egregiously affected by CAPTCHAs, I'm hopeful that its main message - namely that this arms-race is rapidly reaching a point where the bots consistently beat humans at their own game - is a herald of better times to come. 

As CAPTCHAs actually begin to capture the humans and let the bots in, then they begin to serve the opposite objective to that intended. They should then disappear faster than a disillusioned disabled customer with money to spend but wholly unable to access your services.

So what's the alternative?

Companies like Google, who have long provided commonly used CAPTCHA services, have been working hard on a next-generation approach based on a broader analysis of user behaviour on a website. Called reCAPTCHA v3, it is likely to use a mix of cookies, browser attributes, traffic patterns and other factors to evaluate 'normal' human behaviour - although Google are understandably being cagey about the details.
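
For the developers among you, the shape of that approach is already visible in Google's public documentation: the page quietly gathers a token while the visitor browses, and your server asks Google to score that token rather than showing the user a puzzle. The sketch below is an illustrative Python example of that server-side check, assuming the documented siteverify endpoint and the requests library; the secret key and the 0.5 score threshold are placeholders, not recommendations.

    import requests

    VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"
    SECRET_KEY = "your-secret-key-here"  # placeholder: issued when you register your site


    def looks_human(token: str, minimum_score: float = 0.5) -> bool:
        """Ask Google to score the token that the visitor's browser generated."""
        result = requests.post(
            VERIFY_URL,
            data={"secret": SECRET_KEY, "response": token},
            timeout=10,
        ).json()
        # reCAPTCHA v3 returns a score from 0.0 (very likely a bot) to 1.0 (very likely
        # human). Where to draw the line is a judgement call for each site - and, as the
        # next paragraphs argue, one that needs to account for users whose assistive
        # technology makes them behave in ways the model may not consider 'normal'.
        return result.get("success", False) and result.get("score", 0.0) >= minimum_score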

So hopefully by now you get the bigger picture. Hopefully you're saying to yourself, "Ah, but will the clever analysis cater for users who aren't so average or will they once again be excluded by not being 'normal' enough?" Excellent question - I'm glad you're on your game and on-board.

For example, will I, as a blind keyboard-only user of a website, be flagged as a bot and banished? Will a similar fate befall switch users (like the late and much missed Prof Stephen Hawking) who use certain software settings to methodically scan through a page? Dragon users issue voice commands that instantly move the mouse from one position to another in a very non-human way. I could go on.

I hope you get the picture. Moreover, I hope that Google and other clever types working on the issue elsewhere get the picture too. They certainly haven't to date. 

Related articles

‘Web Accessibility Guidelines’ turn 10 but still less than 10% of sites are accessible
A new way to log in will put an end to passwords, and that's good news for people with disabilities
Take the AbilityNet CAPTCHA challenge 
Website Security: Sorting the Humans From the Robots
Why are disabled people treated like spammers? 

How the TechShare Pro audience responded to Slido

We set out to make TechShare Pro, our flagship accessible technology conference, as inclusive an experience as possible. This included the audience participation elements. In the run up to the event, we worked with Question & Answer (Q&A) tech platform Slido to make their app more inclusive so we could road test it at the event. Here's how we got on at TechShare Pro...

Our Head of Marketing & Communications at AbilityNet, Mark Walker, introduced our journey with Slido during the welcome session. Just a few minutes later audience questions started to appear on the screens of our devices. The first few questions were mostly people saying hello to Mark on stage, but the tool quickly started to be used in all seriousness.

“Is Slido a good way to improve accessibility at events?”

Over the course of the day we had around 200 questions from our delegates and around 240 likes (people showing support for someone else’s question) to help order the relevance of the questions. The most popular question on the main stage was actually a question about Slido itself: “Is Slido a good way to improve accessibility at events? Tick if the answer is yes. From Mark on stage!”.  With 10 likes, it seemed our audience agreed. 

We used Slido throughout the day. In the conference breakout sessions the most popular question was “How do you select the users apart from their disabilities?”, a question put to RNIB and Google during the user research talk.

Digital questions are trackable questions

As well as capturing everyone’s questions during the day and providing an accessible platform for people to interact with, using a digital Q&A platform also means you can track topics and analyse the results. For example, the most frequent word used in the questions was, no surprises here, ‘accessibility’! Barclays, Google and Apple were also popular topics. Having that insight into the topics that interested our audience is invaluable for planning TechShare Pro 2019. 

Lessons learned for the next event

This was our first event using Slido and the feedback on the day in person, on screen and on social media was very positive. Our Q&As are now more inclusive and the questions asked were definitely more interesting for everyone. 

For the next event we have learned that we probably need a dedicated screen for Slido during the talks to make it more visible to the audience. In the sessions where Slido was on screen all the time, we had more, and better quality, questions from our delegates.

We now know more about the admin part of the platform and have discovered some little tricks that help us to be quicker. And at this year’s conference we will probably start using the polling functionality to interact more with our audience.

Improving accessibility through partnership

We set out to make TechShare Pro more accessible and inclusive and we achieved this by working in partnership with Slido. It was an absolute pleasure to collaborate with their enthusiastic, determined and hard-working team. I would like to give my very special thanks to Klara Sarkanyova for helping us during the process and on the day. 

Slido can now say that, by working with our accessibility consultants, they have made the process of asking a question easier not only for people too shy to put their hands up, but also for people with speech disabilities, dexterity issues or language barriers.  

Questions, anyone?

Related articles:

Part 1 - How assistive technology can increase interaction at events
