How merging minds with computers could help disabled people

The ultimate way to use a computer is by thought alone – you think it and it happens. While comprehensive brain control is some way off, today’s tech is already pretty mind-blowing and the thought-controlled tech of tomorrow has the backing of billionaires.

The rise of brain-computer interfaces

It's long been the goal of both the military and assistive-tech manufacturers to enable computers to be controlled by thoughts alone. Back in the mid-nineties at AbilityNet we were shown a headband that purported to read brainwaves and control the on-screen mouse.

Elon Musk is one of several billionaires known to be developing mind control interfaces

The headband had a number of sensors that attached to the forehead, but in reality we strongly suspected the electrical signals these sensors picked up were governed more by energetic facial muscle gymnastics than by anything our brains were thinking.

We soon learned to contort our faces in certain ways to make the arrow progress up, down or across the screen in the vague direction we’d intended. A successful click was achieved by a concerted effort to think “click” but again was probably more to do with the intense furrowing of the brows than any brainwaves we were generating.

Nevertheless this solution was used very successfully, if very slowly, by people with no other means of controlling a computer. For users able to combine this tech with the incredibly expensive and often temperamental eye-tracking technology of the day, it was even more effective.

Fast-forward a few years and we see a positive proliferation of brain-computer interfaces – many of which are being used to help people with paralysis control their environment, as I outlined in my recent post on a new age of thought-controlled robotics.

While tech billionaires like Amazon’s Jeff Bezos can have lots of fun developing and driving giant robotic suits, the ability to do away with the need for manual control, not only of robotic helpers but of how we interface with technology in its broadest sense, is a goal well worth pursuing.

The century of the cyborgs

For several years we’ve been able to peer into the human brain and understand much of its activity – as we see here in this video demonstrating how, using an MRI scanner, researchers at Berkeley can actually reconstruct the movie someone is watching from a brain scan alone.

However, if we want a seamless connection with computers, their processing power and knowledge, we need to be able to control technology with our thoughts. The dream of adding to our own brain all the power and potential of artificial intelligence and the internet is an awesome one for some – but it’s a dream that another tech billionaire, Tesla CEO Elon Musk, hopes to hasten into reality.

According to a recent article in The Wall Street Journal, Musk is heavily investing in a brain-computer interface venture called Neuralink. The project is centred on developing devices that can be implanted in the brain, with the ultimate objective of enabling human beings to ‘merge’ with software so that our cognitive abilities can keep pace with ever-faster advances in artificial intelligence. This is the ‘cyborg’ (or cybernetic organism) so familiar in science fiction.

Over the last few months Musk has often referred to the need for more sophisticated interfaces, hinting at the existence of Neuralink. Recently he told a crowd in Dubai: “Over time I think we will probably see a closer merger of biological intelligence and digital intelligence,” adding: “it's mostly about the bandwidth, the speed of the connection between your brain and the digital version of yourself, particularly output."

The benefits of having such ‘brain extensions’ are truly breathtaking - if still a long way off. For now, however, we’ve quite enough tech to get our heads around. Advances in brain-computer interfaces still continue apace.

Ever-better brain-computer connections

In recent months we’ve seen several announcements of technological advances which could lead us to believe that Musk’s vision may be more than idle fancy.

These include thought-controlled software that can type at the speed of the average one-finger typist and developments in mind-reading computers that can translate thoughts directly into words.

 

Thus, while we may still be some way from a cyborg future where our every thought is interpreted by brain implants embedded in our heads and our every wished-for command is acted upon with the swiftness of silicon (or possibly sub-atomic quanta), we have certainly come a long way from that first headband.

In the same way that interacting with computers through everyday natural language is not only helping people with disabilities but is soon set to become second nature to everyone (possibly supplanting the internet as we know it), we will one day take for granted that those who need it most will be talking directly to their technology through the power of their minds alone.

 

Why retro gaming is better for physically disabled people

Naomi McAdam is a gaming student at City College Brighton. She’s passionate about game design and uses a whole range of tech to help her deal with her spinal quadriplegia level C3. From the text-to-speech apps built into her phone to touch screens and her tablet, tech is a key part of her college success.





The tech I love, by Naomi McAdam


1 Speech-to-text
I use speech-to-text on my mobile phone all the time, it's really, really helpful. I use it for texts, emails and especially for work. It's much easier than writing.

2 Touch screens
Typing on a touch screen is 10 times easier than pressing buttons because I have no physical strength in my fingers. So pushing on a button like with the old Nokias was freaking impossible!

3 A tablet
I use Photoshop a lot because I'm an artist, so using a tablet is extraordinarily helpful for me instead of using loads and loads of paper.


My tech wish

Although her tech gives her the power she needs every day, there are plenty of things Naomi is hoping to see in the shops soon.

“With video game technology, I use big controllers that have got massive buttons, but obviously you can only use those controllers for retro gaming now. You can't use them for say, a PS4. There's been nothing recent made, that I know of, that's been made for disabled people to game with.

“Back in the early 90s, Nintendo and Atari made these big, big controllers that my father went out and got for me especially to play Zelda or Mario. They stopped making them because of the lack of demand.”

Thanks to Naomi and the film students at City College Brighton for working with us on this collaboration. You can see the full series here, including Hugo, who is dyslexic, and James, who is deaf.

 

If you're a university student with a disability and you'd like help with tech, we may be able to help.
 

Siri on the Mac and 9 other simple tech fixes that will make your life easier

1. Did you know virtual assistant Siri is now built into the latest version of OS X on Macs?

2. Would you rather dictate your documents and emails for your computer to do the typing? Try the Voice recognition option built into PCs. Information on voice recognition on Macs is here

3. Do you have difficulty spelling certain words, but always misspell them the same way (e.g. 'wierd' for 'weird')? Global AutoCorrect will help dyslexic people and others with their individual spelling mistakes

4. If you have trouble reading, or prefer to give your eyes a rest, try a free text-to-speech reader like Natural Reader.

5. Struggle with mouse buttons? Try Dwell Clicker 2 from Smartbox on a PC

6. Do you sometimes find it hard to see where your cursor is on screen? You can follow these instructions to make your cursor bigger in Windows 7 or 8

7. If you are, or know, an older or disabled person who needs help with tech and IT, AbilityNet can send you a free DBS-checked volunteer to help with computer, smartphone and tablet issues.


8.  If you're a student who has a disability, including a mental health problem, AbilityNet has a team to help you with tech

9. We can make your website accessible and inclusive, meaning a much wider audience reach

10. If you want the chance to come and work with us, we’ve got some of the best accessibility jobs in the world on our website 

  • Call our free helpline on 0800 269 545 for more information, or go to abilitynet.org.uk for loads more help and advice on getting the most out of your technology and IT.
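Tip 3's "same mistake every time" pattern is exactly what makes personal autocorrect tools useful. As a rough illustration only (not Global AutoCorrect's actual implementation), a per-user correction map might work like this:

```python
# A minimal sketch of a personal autocorrect: each user keeps their own map
# of habitual misspellings, so "wierd" is always fixed the same way for the
# same person. (Illustrative only -- not how Global AutoCorrect is built.)

import re

def build_corrector(personal_dictionary):
    """Return a function that rewrites a user's habitual misspellings."""
    def correct(text):
        def fix(match):
            word = match.group(0)
            return personal_dictionary.get(word.lower(), word)
        return re.sub(r"[A-Za-z']+", fix, text)
    return correct

# One user's habitual mistakes, learned over time
correct = build_corrector({"wierd": "weird", "recieve": "receive"})
print(correct("That was a wierd email to recieve"))
# -> That was a weird email to receive
```

A real tool would also preserve capitalisation and learn new corrections as the user works; this sketch only shows the core lookup-and-replace idea.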

Robin Christopherson Receives MBE from His Royal Highness The Duke of Cambridge

Robin Christopherson MBE, Head of Digital Inclusion at AbilityNet, has recently received an MBE for his services to digital inclusion.

Robin was presented with the MBE by His Royal Highness The Duke of Cambridge at Buckingham Palace shortly after it was announced in the New Year Honours List.

A founding member of AbilityNet in 1998, Robin established AbilityNet’s accessibility consultancy team in 2003. The team is now globally acclaimed for its expertise in accessibility auditing and disabled user testing, as well as helping clients design attractive websites and mobile apps that are both accessible and easy to use by all.

The MBE comes shortly after Robin’s winning the Tech4Good Special Award 2016, which recognised his longstanding contribution to inclusive design and digital accessibility.

Robin said:  “Receiving the MBE has meant a huge amount to me. I actually didn’t feel that my work was anything other than a pleasure and an opportunity to tell people about technology and solutions that I already felt very passionate about.

“Having been able to speak to audiences and decision-makers about the power of tech for nearly two decades has been its own reward.

“But receiving this honour is the icing on an otherwise already very tasty cake!”

He continued: “Going to the Palace was an extraordinary experience.

“It was amazing to be surrounded by so many who have made significant contributions to people’s lives across a wide range of areas, and meeting His Royal Highness, The Duke of Cambridge, was incredible.

“He was very interested and well informed. Considering there were over seventy people receiving their honours that day he must have a prodigious memory and really care.”

And Robin remains as humble and passionate as ever about his work in digital inclusion: “I didn’t feel like I personally needed to be recognised as I’ve just felt lucky being able to do what I do.

“But if it helps get the message out about how tech can transform people’s lives, regardless of their particular abilities, then it’ll be an amazing milestone in my life.

“Just enabling more people to hear the phrase ‘digital inclusion’ with regards to my honour will help.

“And if anyone out there would like to feature a news item or blog post on their website about why it’s so important to all users then I’d be delighted to provide more information.”

If you want to write about Robin and the need for inclusive design and digital accessibility you can email Robin at robin.christopherson@abilitynet.org.uk or call him on 0800 269 545 for more information or an interview.

Robin also writes numerous articles on the power of cutting-edge technology on the AbilityNet website.

 

 

The latest in accessible camper vans and ready made meals - what's new at Naidex 2017

Naidex is a great chance to go and see the newest equipment that might help people with disabilities. Held at the National Exhibition Centre near Birmingham, it showcases everything from ready-prepared meals right through to campervans specially designed to maximise accessibility.

I had carefully marked off all the important stands I wanted to visit, but instead I ended up wandering from stand to stand because nearly all of them had something very interesting to demonstrate!

Of course, there were lots of technology companies showing off their latest products. Alongside them, however, were companies with very good ideas that seem very simple.

Nudgu is one such service: it allows carers of people with conditions like autism to record phone messages and then have them played to their relatives at specific times. Handy if you want to remind someone to take their medication. You even get a message telling you whether your message has been received.

REMAP were there too; they can come up with a custom solution if you have a problem for which there isn't any commercially available technology that can help you out.

Ram Mounts also attended, and it was interesting to see how their mounts actually connect onto wheelchairs.

Well-known communication specialists such as Abilia Toby Churchill and Tobii Dynavox were also in attendance, but small companies such as Dad in a Shed were there too with their Articuloud software, which can turn any Android smartphone into a fully functional AAC device to help with communication. TippyTalk was also there, showing off its instant communication app, which gives a non-verbal child or adult the chance to communicate using pictures that are translated into text messages.

Moving away from communication, there were companies showing off OCR built into glasses, so if you have a condition such as dyslexia you can have text read out to you by a discreet voice in your ear. Games controllers were also much in evidence, with some nice hardware from Lepmis.

I think one of my favourite devices though was made by a company called VirtuCare who have brought the world of virtual reality into care centres where some of the patients might have dementia. If I didn’t know better I could have sworn I was watching the Northern Lights whilst wearing one of their VR headsets. It was so realistic I could hear the fire crackle outside of some 'virtual' log cabins!

Shows like Naidex really demonstrate how technology is helping people with disabilities regain and keep their independence. I'm already looking forward to next year's show.

Dyslexic student's top 3 tech hacks to improve grades

Technology moves so fast, and for people with dyslexia it can be even harder to keep up. As part of our Me, Myself and IT mini documentary series, made with City College Brighton, this two-minute video sees dyslexic film student Hugo Hobs talk us through his favourite tech for managing modern student life.

My top tech for dyslexia

How I do my assignments

  • Grammarly has really helped. It goes through your work and sorts out spelling and grammar and other words that might need to change.
  • Then I go through Thesaurus.com to add better words. This gives me a higher score and gets me to the level my colleagues are at.

Voice recording to remember details

  • I use an app on my phone called Google Keep to record meetings with other colleagues, or chats with the teacher about a deadline, for example.
  • The recording syncs up with Google Drive so I can access the recordings on my laptop to make things easier.

More information

Me, Myself and IT is a series of films about how disabled students use tech to get through college, made by students at City College Brighton. 

A big thank you to Hugo and the students at City College Brighton for their work in making these films.

Alexa, bots and how a future without websites could help disabled people

There are websites out there delivering everything from essential medical information and government services, to hate speech, fake news and the latest juicy posts from friends about their love lives. We’ve also seen the rise of the bots: bots that intervene to assist us when we’re messaging friends, those that help us with instant chat-support online, and those that annoy us when used to snap up all the tickets for a gig in the first seven seconds.

So, what is the evolution of our interface to the internet - and will it be friend or foe?

The death of the web as we know it?

Websites have been around for decades. First they were simple pages of information linked together to form an interactive digital booklet or brochure; then they became increasingly complex creatures that behave more like fully-fledged online applications. Websites now also have quite a bit of intelligence sprinkled around in the form of chatbots that can often feel as real as a human when responding to written enquiries.


But bots can readily exist in other habitats too – they can be found in popular messaging apps (such as Google’s Allo) popping up to help with useful autocorrect and emoji responses or with Siri-like assistance to typed questions, as well as at the end of the phone where a more natural conversation with a robot is often far preferable to the universally-loathed spoken menus.

My favourite poet, Pam Ayres, once wrote about such automated systems in one of her shorter pieces: “If you would like to meet the person who invented this system and shoot him with a gun, press 1.”

Bots, and virtual assistants more broadly, are undoubtedly here to stay and getting cleverer all the time.

Will increasingly intelligent bots eventually do away with the need for websites altogether? Will the laborious job of manually clicking our way around pages of information, navigating through menus and perhaps having to resort to a search or two one day feel as antiquated as writing a message on paper, putting it in an envelope, sticking on a stamp, and walking to the nearest postbox in order to communicate with another person?

Instant information without lifting a finger

In a recent article on Tech City News entitled ‘AI and chatbots: The future of customer service’, Richard Stevenson, CEO of Red Box Recorders, explored a future in which intelligent chat-based interfaces may well completely obviate the need for companies to have a website. The bot could be text-based or voice-driven, but its central feature will be the ability to hold a natural conversation with the user and answer any question, however complex, relating to its domain.

Once you’ve experienced the ease and satisfaction of asking a virtual assistant (such as Amazon’s Echo or Google Home) to give you the latest news or a piece of information, play music, perform a task such as setting a reminder or alarm, or operate a connected device such as a thermostat or lights, you might never want to lift a finger again.

Inclusive bots = profound benefits

For people who literally can’t lift a finger, or for whom a disability presents other challenges when it comes to using technology, an intelligent bot or virtual assistant that can understand natural language requests has even more profound benefits.

You can get the tiniest of insights into the many ways a virtual assistant like the Echo can be used by listening to my quick Echo demos podcast (currently at fifty episodes and counting, there's lots to share!)

The vast majority of websites still present huge issues for people with disabilities. Whether it’s unlabelled images, links or buttons that require a mouse, intrusive ads that confuse and distract, autoplaying videos, a cluttered appearance and wordy text, or just poor choices for colours and fonts, the internet is a challenging place. 

And just how many of these issues does someone without a disability encounter on a daily basis? Accessibility is certainly not just a disability issue – especially in this mobile-first world.

We’re still some way from AIs intelligent enough to let us entirely avoid using websites to manually trawl for information or to carefully complete online forms to use a service or order a product. But we can already do all of those things very successfully in many areas using the existing semi-smart virtual assistants on the market today, and they’re getting smarter all the time.

In a recent episode of iOS Today, hosts Leo Laporte and Megan Morrone demonstrate the relative intelligence of each of the most popular assistants and discuss the pitfalls of trying to provide single definitive responses to questions that haven’t got simple answers.

It’s possible to say how tall the Eiffel Tower is, for example, but even a question as seemingly straightforward as “Can I take knitting needles onto a plane?” could end in big trouble for the individual if the assistant gets it wrong.

Google is trying to overcome this problem in order to give the user a single answer rather than constantly pointing them at a list of search results. Many of us have had Siri or Cortana tell us “I don’t know, but I’ve found this on the web for you”.

They are achieving this by employing complex algorithms that compare a number of search results, try to ascertain from them the ‘right’ answer, and then finally provide it to the user while at the same time attributing the response to one of the more reliable sources. So in this case the Google Home AI will respond: “According to the Huffington Post, you can take knitting needles on internal US flights, but not circular thread cutters” (whatever they are).

Called Google snippets, these single answers are by no means always reliable, which is why they attribute the answer to the chosen source.
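The compare-and-attribute approach described above can be sketched as a simple vote over candidate answers. This is a hypothetical toy, not Google's actual ranking, which is far more sophisticated:

```python
from collections import Counter

# Toy sketch of the consensus-and-attribute idea: compare candidate answers
# extracted from several search results, pick the most common one, and
# credit a source that gave it. (Illustrative only.)

def pick_answer(results):
    """results: list of (source, answer) pairs extracted from search hits."""
    counts = Counter(answer for _, answer in results)
    best_answer, _ = counts.most_common(1)[0]
    # Attribute the reply to the first source that gave the winning answer
    source = next(src for src, ans in results if ans == best_answer)
    return f"According to {source}, {best_answer}"

results = [
    ("Huffington Post", "knitting needles are allowed on US flights"),
    ("Example Forum", "knitting needles are banned"),
    ("TSA blog", "knitting needles are allowed on US flights"),
]
print(pick_answer(results))
# -> According to Huffington Post, knitting needles are allowed on US flights
```

The weakness the article points out is visible even here: the majority of sources can be wrong, which is exactly why the assistant names its source rather than presenting the answer as fact.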

The natural, flexible, interface of the future

With the continual improvement of bots and virtual assistants that are inexorably helping to turn laborious clicks into natural chats, we may well see the weakening of the web as we know it. An intelligent natural language interface is flexible in the extreme – we are not just talking about cylinders in your kitchen now, but an intuitive interface that could have text or speech as the input or output in either case, with ever-more-sophisticated smarts helping to understand you and swiftly deliver exactly what you want.

There may well be a screen involved (on which the AI could display additional information) along with any number of other helpful output alternatives, but these would all be complementary rather than essential. In this way, the interface of the future could be used as easily over the phone as on your phone, and as easily by an eighty-year-old as by an eight-year-old.

Websites are great (when they are accessible, at least), but in the future we may well look back at how we accessed the internet and feel that it must have been, quite literally, like crawling a web.


 

"Getting help with my dyslexia meant I was able to do my job again"

Teresa started at Unilever in 1990 as an organic chemist. She couldn't tell her colleagues she wasn't able to read the names on chemicals. She thought she was stupid and had no idea she was severely dyslexic. Fifteen years and several promotions later, the pressure became intense. Teresa found herself off work for months with depression. But once she discovered she was dyslexic life changed for the better. She tells her story here. 

"In that first 15 years I got a degree in biochemistry and several promotions. Every day I'd walk in hoping they didn’t find out how stupid I really was. And then I got a further promotion into clinical studies for some big-name brands; I was given much more responsibility and had a budget to manage.

I loved the role and enjoyed solving problems. However, with it came the paperwork and time restrictions, and this is where it became very difficult for me. I still didn't know if I was dyslexic. I was struggling.

I ended up off work for four months with depression because of the issues I was having. When I returned, my end of year performance assessment was due and my manager was furious at my performance.

Son's dyslexia test raised questions

I spoke in confidence to a colleague and said that, after seeing my nine-year-old son struggle at school and undergo a dyslexia assessment, I suspected I had dyslexia myself.

We spoke to my union and HR and meanwhile my manager presented his case for disciplinary. HR asked whether there was anything they needed to be aware of and, at this point, I declared that I believed I was dyslexic.

They suggested I contact the British Dyslexia Association so I had a test with them and the score showed I was indeed severely dyslexic. The company doctor delivered the same results.

I was so relieved. I see it as a gift and not a burden, and always have. HR suggested I contact the government's Access to Work scheme (something an employee has to do themselves) and the company said they'd help in any way needed.

The Access to Work process was very interesting. A woman came to my workplace and asked me what I was good at. I said 'I only know what I'm bad at!'. She walked with me around my department and watched me work, sat at my desk, and then presented her report.

Tech solutions for managing dyslexia at work

My life radically changed that day. After her recommendations, my desk was moved to a better spot (less distracting and with less glaring light, as I also have an eye condition called Irlen, which makes my eyes sensitive). I was also provided with grammar and spelling teaching software.

She suggested assistive IT - Dragon, Claro and Global AutoCorrect - plus headphones to minimise distractions. I also now have overlays for my computer (these help with glare and enable people with dyslexia, Irlen and other conditions to better read text) and purple notepads. People with dyslexia or Irlen often see a certain colour best; it can be different for different people, and mine is purple.


I've had some excellent strategy training and coaching for my dyslexia with a company called Genius to Work. Separately, my manager has also received training, which has helped enormously.

New ways of working with my team after my diagnosis

I've created a set of guidelines so that my project team are aware of my needs, i.e. how best to communicate with me: emails rather than verbally, so I don’t forget. This helps a lot. I also have coloured signs on my desk so people don’t distract me: I put up the red label if I'm concentrating and green when I'm available. I now concentrate for no more than 45 minutes and in that time focus on one task, to aid processing and clearer comprehension.

Previously I would have had loads of applications open and flip from Word to do a report, to Excel to do data. It meant my output was poor - I was doing everything and yet nothing.

Being independent at work with dyslexia

All these tools really help with my independence at work, and they give my team confidence. Spelling mistakes in science really matter - it's only a single letter change between amines and amides, but my god, one can react very differently from the other! I check things more regularly.

With all the support, I'm gaining my confidence back, enjoying reading teen fiction for pleasure and exploring my own creativity. At work, my manager recognises when I am struggling and allows me to take some time out so we can grab a coffee and chat. 

I don't sneak into work any more hoping they won’t find out how stupid I am. My secret is out. I'm very clever. Clever because I hid my dyslexia all my life while getting a degree and my dream job.

Updated BBC Mobile Accessibility Guidelines promote a more inclusive gaming experience for disabled people

If you're not already aware of the BBC mobile accessibility guidelines, you should get to know them. They now contain inclusive design principles, and specific guidance for games as well. Examples of the guidance aimed at game developers include:

  • Provide means to aid focus or aim using visual, audio and haptic cues.
  • Provide means to remove or reduce the number of obstacles.
  • Provide a single-hit no-fail mode.

You can see how these would be useful for a range of users. The last option, a "single-hit no-fail mode", enables even users with complex requirements to enjoy progressing through a game without needing lightning reaction times (practically impossible with some disabilities or assistive technologies), so these guidelines really do promote a more inclusive gaming experience.

The BBC guidelines are a really useful resource as they include content not only for web (HTML) development, but also code examples for mobile (iOS and Android).

So, if you have a button in an app that's incorrectly labelled, you can view the guidance explaining how to label a button in iOS and Android apps. Developers will likely know how to do this, but the page can be a useful reference to see what's possible for mobile apps.
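On the web (HTML) side, the same labelling principle can even be checked automatically. Below is a minimal sketch, using Python's standard-library HTML parser, that flags buttons with no accessible name (no text content and no aria-label). Real audits use far more thorough tooling, and the accessible-name rules in WAI-ARIA cover many more cases; this only shows the idea:

```python
from html.parser import HTMLParser

# Minimal sketch of an automated check for one common accessibility issue:
# buttons with no accessible name (no text content and no aria-label).
# Does not handle nested buttons or other naming mechanisms.

class ButtonLabelChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_button = False
        self.has_label = False
        self.unlabelled = 0

    def handle_starttag(self, tag, attrs):
        if tag == "button":
            self.in_button = True
            self.has_label = any(k == "aria-label" and v for k, v in attrs)

    def handle_data(self, data):
        if self.in_button and data.strip():
            self.has_label = True  # visible text gives the button a name

    def handle_endtag(self, tag):
        if tag == "button":
            if not self.has_label:
                self.unlabelled += 1
            self.in_button = False

checker = ButtonLabelChecker()
checker.feed('<button></button><button aria-label="Search"></button><button>OK</button>')
print(checker.unlabelled)  # -> 1
```

Here only the first button is flagged: the second has an aria-label and the third has visible text, both of which give assistive technologies something to announce.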

Of course, don’t forget about the gaming accessibility guidelines produced collectively by experts in the field too.

Want to know more about accessible web design?

- 6 Top tips for a dementia friendly website

- 6 Essential easy checks for web accessibility

- How to create accessible emails

 

Could Microsoft’s in-car AI for driverless vehicles make us all safer and more equal?

The age of driverless cars is fast approaching. Autonomous vehicles will soon be carrying passengers of every shape, size and ability and Microsoft believes that artificial intelligence (AI) will be as important on the inside of every driverless car as it is on the outside…

Keeping up with driverless cars

Autonomous technology is rapidly reaching maturity. Not a day goes by when a simple but powerful search using Google’s news-aggregating feature doesn’t bring up a bootful of stories charting the advance of autonomous tech. At the time of writing, there were twenty-six articles.


Here is the link to what I see when using the Google news service with ‘autonomous’ as the keyword. Obviously the list will change from day to day and even hour to hour. And of course I have those articles come direct to me, minus ads, using the power of RSS.
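That keyword-filtered RSS workflow can be reproduced with a few lines of standard-library Python. This is a hedged sketch over a hard-coded sample feed; a real setup would fetch the feed over HTTP, typically with a feed-reader library:

```python
import xml.etree.ElementTree as ET

# Small sketch of keyword-filtering an RSS feed: parse the XML and keep
# only items whose title mentions the keyword. Sample feed is invented.

RSS = """<rss version="2.0"><channel>
  <item><title>Autonomous cars pass winter driving trials</title></item>
  <item><title>New smartphone released</title></item>
  <item><title>City approves autonomous shuttle roadtests</title></item>
</channel></rss>"""

def matching_titles(rss_text, keyword):
    root = ET.fromstring(rss_text)
    return [item.findtext("title")
            for item in root.iter("item")
            if keyword.lower() in (item.findtext("title") or "").lower()]

print(matching_titles(RSS, "autonomous"))
```

Run against the sample feed above, this keeps the two autonomous-vehicle stories and drops the unrelated one.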

The three main trends I see are:

  1. Driverless cars are getting better at coping with complex environments and inhospitable weather conditions.
  2. Authorities at both national and city level are increasingly active in road-test trials in their localities.
  3. The date that keeps being mentioned for when we’ll see them in numbers on our streets is only three short years away: 2020.

Artificial intelligence – inside and out

Despite the significant coverage of driverless cars' progress in coping with a wide range of daily traffic and pedestrian hazards, one aspect of the smarts behind autonomous vehicles has until now been absent from the discussion: how AI can be used on the inside of the vehicle to help passengers have the best, and safest, journey possible.

Now Microsoft has begun to explore this next frontier of autonomous functionality in a recent presentation at tech event DesignCon 2017.

As a result, the conversation has now begun around how autonomous cars of the future will need as many smarts on the inside to monitor and assist their passengers as they do on the outside to have a better chance of safely getting them where they need to go.

Adding in-car AI to assist every passenger

In his talk at DesignCon 2017, ‘The Internet of Things That Move - Connected Cars & Beyond’, Doug Seven, Microsoft’s principal programme manager for the Azure group, outlined several cases where internal sensors and applied intelligence have the potential to improve the in-car experience and may even save lives.

Using their Microsoft Cognitive Toolkit, which is open-source and assists in creating applications capable of deep learning - combined with the Microsoft Cognitive Services API, which includes an Emotion API capable of reading the emotions on someone's face - the car could be far more aware of what passengers are doing and even feeling. The Emotion API can recognise emotions including anger, contempt, disgust, fear, happiness, neutrality, sadness and surprise.

“If we could detect things like road rage or stress, we could [have a vehicle] do things to alter the environment for the driver or passengers,” Seven said.

Cars in the cloud?

At present the Cognitive API and Cognitive Toolkit are cloud-based, which won't suffice for self-driving cars.

“What we can do in the Cloud with our AI capabilities is to build algorithms and models to let cars become intelligent and make decisions,” Seven told the DesignCon audience. “But we can't rely on the Cloud because we might lose connectivity or there might be latency issues.”


Microsoft plans to create localised AI within the cars themselves that will be more aware of drivers, passengers, and the car's environment. Seven said, “We need hardware in the car capable of processing that data in real time.”

The benefits of AI from within a driverless car

Seven gave a specific example of how such smarts could be put to use, explaining how instances of road rage could be avoided when the car, through its built-in cameras and emotion-recognition algorithms, is aware of the emotions and motions of its passengers. But it’s possible to think of many more scenarios where an empathetic and vigilant car could make journeys more pleasant and potentially even save lives.

Many of us love the convenience of talking to Siri or Amazon’s Echo, and having the ear of your in-car AI has obvious applications. Being a passenger in a fully autonomous vehicle means you could be eating lunch, working, or engaged in any other activity you choose. If your hands are full and you need to change your route or destination (or just the song or video playing on the entertainment system), then voice is an easy way of doing it.

Feeling ill? With a word, the car can pull over or take you straight to the nearest pharmacy. And of course all the information available on the internet would be accessible through a natural, conversational interface.
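The voice scenarios above amount to routing spoken requests to car actions. Here is a toy sketch of that routing step; the phrases, matching rules and action names are assumptions for illustration, standing in for real speech recognition and intent detection.

```python
def handle_command(text):
    """Map a spoken phrase to a car action (very simplified keyword matching)."""
    text = text.lower()
    if "pull over" in text or "feel ill" in text:
        return "pulling over"
    if "pharmacy" in text:
        return "rerouting to nearest pharmacy"
    if "play" in text:
        return "changing media"
    if "destination" in text or "route" in text:
        return "updating route"
    return "sorry, not understood"

print(handle_command("I feel ill, pull over"))  # pulling over
print(handle_command("take me to a pharmacy"))  # rerouting to nearest pharmacy
```

A real assistant would use a trained intent classifier rather than keyword matching, but the shape of the problem, utterance in, action out, is the same.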

How far can a smart car go?

However, a truly smart car could go a lot further than that. Once the AI has access to data from other sensors, such as heart-rate monitors built into armrests (or connected fitness trackers worn by passengers), then someone experiencing an irregular cardiac rhythm could be alerted to take their medication, or a passenger who is unconscious after a heart attack or seizure could be taken directly to hospital.
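As a rough illustration of the heart-rate idea, an in-car monitor could watch the variation in beat-to-beat intervals and flag large swings for attention. The 20% variation threshold below is an assumption made for this sketch, not a clinical criterion.

```python
# Illustrative only: flag a possibly irregular rhythm from beat-to-beat
# intervals (in milliseconds), as an in-car monitor might.
def irregular_rhythm(intervals_ms, max_variation=0.20):
    """Return True if any beat interval deviates from the average
    by more than max_variation (a fraction of the average)."""
    if len(intervals_ms) < 2:
        return False
    avg = sum(intervals_ms) / len(intervals_ms)
    return any(abs(i - avg) / avg > max_variation for i in intervals_ms)

steady = [810, 800, 795, 805, 790]      # roughly even beats
erratic = [800, 1200, 600, 1100, 650]   # large swings between beats

print(irregular_rhythm(steady))   # False
print(irregular_rhythm(erratic))  # True
```

In practice the car would hand a flagged reading to a proper medical algorithm (and a human), rather than diagnose anything itself.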

Advanced sound detection algorithms could also be employed to listen for sounds of distress or alarm from passengers, as well as for external sounds of beeping horns or emergency vehicles that might require the car to make a sharp evasive manoeuvre. The AI could warn or reassure the passenger. For travellers with a hearing impairment who do not hear the sirens, or for blind passengers who’d appreciate an explanation for the drastic swerving, this level of assurance would be invaluable.
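Whatever model does the listening, the car then needs to pair each detected sound with both a driving response and a passenger-facing explanation, which matters most for the deaf or blind passengers described above. The sound labels, actions and messages in this sketch are illustrative assumptions; real classification would come from an audio ML model.

```python
# Map detected sound labels to a car action and a passenger message.
RESPONSES = {
    "siren":    ("pull over",     "Emergency vehicle approaching; pulling over."),
    "car_horn": ("evasive check", "A nearby car sounded its horn."),
    "distress": ("slow and stop", "Are you all right? Stopping the car."),
}

def respond_to_sound(label):
    """Return (car action, passenger message) for a detected sound label."""
    return RESPONSES.get(label, ("none", "No action needed."))

action, message = respond_to_sound("siren")
print(action)   # pull over
print(message)  # Emergency vehicle approaching; pulling over.
```

Delivering the message in the right modality, spoken aloud, displayed on screen, or as a haptic cue, is what would make the same event equally legible to every passenger.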

In a driverless future disability fades from view

In a future where autonomous vehicles must not only navigate complex and ever-changing streets, but also protect and interact with occupants regardless of what they happen to be doing throughout the journey, the car will need to be aware of their needs and able to decide on the best way to communicate with its precious cargo at any given time.

If passengers don’t respond to a verbal prompt, it might be because they are listening to loud music on almost invisible earbuds, or it might be that they have hearing loss. If their eyes are closed, they may be relaxing yet still alert, fast asleep or perhaps it’s because they’re blind (or blinded by the sun). 

The car will have the sensors and the smarts to deal with such cases - along with numerous others. When in-car AI reaches this level of awareness there will be little distinction between disability, difference and distraction in a driverless world.
