Amazon’s Alexa wants to ease your ‘cognitive overload’

The goal with Alexa is to give it access to its full library of “skills” — which are like cloud-based apps — at all times, and let it decide on its own which ones to use, and when, to achieve an outcome.

Rohit Prasad, vice-president and head scientist, Amazon Alexa.

“You get the skill you need, without knowing about the skill”, says Prasad, who was in Melbourne for the annual WSDM (web search and data mining) conference.

“You just express your request. If you say ‘Alexa get me a car’, Alexa should be smart enough to know what you really need”.

It’s always worked this way for some core abilities; for example, you can just ask for music and Alexa will play the appropriate track from whatever services you’re subscribed to. But to access many of the 80,000 skills, you need to say “Alexa, tell X to do Y”.

The end result is that you still need to remember the name of the skill and an action, just like you need to remember an app. Prasad says it was designed this way to avoid Alexa misinterpreting what you were asking for.

“Imagine if you suddenly say ‘play songs by Sting’, and somebody writes a skill called ‘Sting’. And then it serves you [information on] how to treat bee stings. We didn’t want this ambiguity to hurt”, he says.

But the latest techniques being used by the company allow Alexa to analyse a request, shortlist the skills that might be invoked, test which of them are most suitable and then offer an answer almost instantaneously.

“It’s a very hard problem. Because the way these skill builders are building is independent of each other”, Prasad says. “Developers are going to be very innovative. They’re going to come up with all kinds of skills. And now it’s upon us to discover the right skill for the customer. Our long term vision is that you will not have to remember the name of the skill.”

The result will be that you can ask any question or make any request in natural language, and Alexa will connect to the appropriate skill automatically. For example, if you say “what are the surf conditions at Collaroy Beach”, Alexa will read information directly from the Coast Watch skill. The technology is already working for thousands of skills, Prasad says.


So what happens when Alexa still doesn’t know how to help with your request, or you don’t know how to ask for what you need? Prasad says you should be able to just keep talking, like you would when clarifying something to a friend, and Alexa will learn.

In fact, this is a key way the system becomes stronger over time. When a user has to ask multiple times to get to what they want, Alexa can analyse those previous attempts to learn more about the way people make requests. So next time you’re frustrated by a smart speaker, take comfort in the fact you might be making the experience of other users better in the future.

“And also everybody else’s interactions help you”, Prasad says, “so it’s a symbiotic relationship”.

Tim is the editor of The Age and Sydney Morning Herald technology sections.
