How Project Relate can help those with atypical speech patterns

ME: “Voice assistants are great.”
ALEXA: “I’m sorry, I didn’t get that – did you mean play songs by Taylor Swift?”

The trouble with voice

A person in a café holding a smartphone running the Relate app

We’ve all had the odd frustrating experience where a voice assistant has misheard us, or decided we said one thing rather than another. And how many of us have double-checked Alexa’s timer because we thought she said 50 minutes rather than 15, and we’d rather not risk a charcoal frisbee instead of a nicely cooked pizza?

Voice assistants have become a standard part of modern life: we control our homes, access information and music, and navigate, all with the expectation that we will be heard and understood.

But whilst accent recognition is improving, there are still people who may be more difficult to understand: people with amyotrophic lateral sclerosis (ALS) or multiple sclerosis (MS), people who are Deaf, have had a stroke, or have a stutter. These atypical speech patterns can make interacting via voice extremely challenging, and they aren’t included in the datasets on which the algorithms are trained.

Matching patterns

For a speech assistant to work, it needs to be trained on what words sound like. In the past this tended to mean one person reading the same word over and over, so that a single device was given a pattern it could match. Machine learning and ‘AI’ now mean that huge datasets can be fed in for a machine to train on: it can learn what a word sounds like when it is said in different ways, with different accents, pitches and tones. It ‘learns’ a word, not a voice.

The key thing with this is that the machine is naïve; it doesn’t have any preconceptions as to what a word should sound like. 
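For the curious, here is a toy Python sketch of that difference. It has nothing to do with Google’s actual models: the words, the threshold and the numbers standing in for audio features are all invented. It simply contrasts the old idea (match one stored voice pattern) with the newer one (learn a word from many different voices).

# Toy illustration only - not how Google's speech models actually work.
# The "features" below are made-up numbers standing in for real audio features.

import math

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Old approach: one device stores one speaker's pattern for a word.
template_for_timer = [0.9, 0.1, 0.4]   # "timer" as recorded by a single person

def matches_template(recording, template, threshold=0.3):
    # Only recordings close to that single speaker's pattern count as a match.
    return distance(recording, template) < threshold

# Newer approach: learn each word from many different voices.
# Hypothetical dataset: the same words spoken with different accents and pitches.
training_examples = {
    "timer":  [[0.9, 0.1, 0.4], [0.7, 0.2, 0.5], [1.0, 0.0, 0.3]],
    "selfie": [[0.1, 0.8, 0.9], [0.2, 0.9, 0.7], [0.0, 0.7, 0.8]],
}

def learn_word_models(examples):
    """Average the examples for each word: the model keeps the word, not the voice."""
    models = {}
    for word, vectors in examples.items():
        models[word] = [sum(column) / len(column) for column in zip(*vectors)]
    return models

def recognise(recording, models):
    """Return the word whose learned model is closest to the recording."""
    return min(models, key=lambda word: distance(recording, models[word]))

if __name__ == "__main__":
    new_voice = [0.6, 0.3, 0.6]   # an accent the old template has never heard
    print(matches_template(new_voice, template_for_timer))   # False: no match
    print(recognise(new_voice, learn_word_models(training_examples)))   # "timer"

In the sketch, a recording from a voice the old template never heard fails the single-speaker match, but the model averaged over several speakers still recognises it as “timer”: it has learnt the word, not the voice.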

Project Euphonia

The ability to understand atypical speech was something that Google Research Scientists Michael Brenner and Dimitri Kanevsky (who is Deaf and speaks with an atypical speech pattern) set out to achieve with Project Euphonia.

Dimitri provided the model with examples of how he spoke, recording thousands of sentences and training the algorithm to understand his atypical speech pattern – and it worked!

Dimitri himself was a guest at AbilityNet’s Techshare Pro in 2019.

The project then encouraged other people with atypical speech patterns to contribute recordings, providing the dataset needed to train the software and building a greater understanding of a more diverse range of voices and speech patterns.

Then, on 9 November 2021, Google announced Project Relate.

Project Relate

Project Relate is an app that builds on the data from Project Euphonia, providing a tool that translates an individual’s atypical speech pattern so they can connect with their phone.

The app operates in three modes: Assistant, Listen and Repeat.

Assistant

Assistant mode translates your speech to interact with the Google Assistant on your phone, allowing you to use voice commands for different features, such as “Send a text” or “Take a selfie”.

Listen

Listen mode transcribes your speech so that others can read what you are saying, should they have difficulty understanding you.

Repeat

Repeat mode will repeat back what you are saying using a synthesised voice.

The app has been developed with the input of disabled people, people who experience the frustration of not being easily understood by both people and machines.

How to get involved

Currently Project Relate is inviting English-speaking users in Australia, Canada, New Zealand, and the United States to sign up to test the app, with access being granted over the next few months.

All being well, it will be rolled out to the rest of the world in the months following this initial testing.


Further resources

Blog: Dictating with speech challenges: A tester's experience with Google’s Project Relate speech app

Blog: Ethics, Machine learning, and disabilities

Blog: 9 useful apps for people who are Deaf or have hearing loss

Blog: Siri can now answer your calls... but can't hang up

Get assistive technology training - including ClaroRead, Dragon and JAWS