It’s easy to imagine a future in which your virtual personal assistant is everywhere you are. Before long, Alexa, Siri, Google, and others like them will be woven into the fabric of your home, ready to fulfill your every whim. Need milk? Tell your fridge. Forgot to close the garage door? Grumble about it to the mic in your dashboard. Want to order your post-marathon double cheeseburger and fries before even crossing the finish line? Scream an order into your smartwatch.
This isn’t as outlandish as it might sound. Amazon’s Alexa is about to be everywhere. On your phone. In your hotel room. Throughout your home. Even in your car. In the year or so since Amazon opened the Alexa developer kit, no end of companies have integrated simple voice commands into their products. Yet this seamlessly connected world still feels far away. The challenge isn’t in creating the devices, it’s in creating a consistent user experience as they proliferate.
This isn’t impossible, but it will take a while. “The next couple of years is going to be a lot of talking objects,” says Mark Rolston, former creative director of Frog and co-founder of studio Argodesign. The rules that dictate how you’ll interact with all your connected devices, and how those devices will interact with each other, are not yet codified. Developing norms and standards will take time—and experimentation.
It seems everyone is baking Alexa into something. LG peddles its InstaView Smart Fridge, which, among other things, displays mealtime suggestions on a 29-inch LCD screen with a simple “Alexa, show me recipes.” Ubtech is getting a lot of buzz for Lynx, a small robot whose simplistic responses to your Alexa queries were overshadowed by its entrancing dance moves. And there is no end of Echo copycats from the likes of Mattel, Lenovo, and Klipsch. Even Ford and Volkswagen rolled into CES with cars that featured Alexa in the dashboard.
In theory, Alexa everywhere is a good thing. The more devices that support it, the more streamlined your experience. In practice, the open nature of Alexa Voice Services makes a consistent user experience a colossal design challenge. That’s why Amazon is developing guidelines for third-party developers. It already requires everyone to use the wake word “Alexa.” It also encourages simple, explicit language in their commands.
“Our core goal is to make Alexa’s interactions with a customer seamless and easy,” says Brian Kralyevich, vice president of Amazon’s user experience design for digital products. “A customer shouldn’t have to learn a new language or style of speaking in order to interact with her. They should be able to speak naturally, as they would to a human, and she should be able to answer.”
This is easy when you’re asking your Echo to queue a song or telling your fridge to make ice. But as your home fills with smart devices, addressing each device individually will grow cumbersome. “At a high level you need to be able to interact with devices how you want to,” says Dan Faulkner, a senior VP at the software company Nuance. “If I think two years out, three years out, are we really going to have millions and millions of users who are learning the unique dialogue path with each disparate device? That just doesn’t seem likely to me.”
For now, though, most of these devices are simply Echos in elaborate packaging. When you say “Alexa” to your fridge, other Alexa-infused gadgets are listening, too. LG touted the fact that you can summon an Uber from its new fridge (which raises the question of why you’d want to, but put that aside for now), but this presents a new problem: What happens when everything in your kitchen can do that? And what happens when multiple devices don’t understand what you’re saying?
One solution is to diversify the wake word so you can address each device directly, Rolston says. Another approach is to let the gadget figure it out. If you have more than one Echo within earshot, Amazon’s “Echo Spatial Perception” technology calculates your proximity to each device so that only the closest one responds. But even that’s a temporary fix. Ideally, your gadgets will connect to a central hub. (Rolston predicts a smart can light that turns a room—or a car, or an office—into a communication device.) “What this is really portending is the day where, rather than standing in your kitchen and talking to your fridge, you stand in the kitchen and just talk to the house,” Rolston says.
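Amazon hasn’t published how Echo Spatial Perception works internally, but the basic arbitration idea—several devices hear the wake word, and only the one that heard it most clearly answers—can be illustrated with a toy sketch. Everything below (the device names, the scoring scale, the function itself) is hypothetical, not Amazon’s actual algorithm:

```python
# Toy sketch of wake-word arbitration (hypothetical; not Amazon's
# actual Echo Spatial Perception implementation).
# Each device reports how strongly it detected the wake word, on a
# 0.0-1.0 scale; the device with the strongest detection, presumably
# the closest one, is the only one allowed to respond.

def pick_responder(detections):
    """detections: dict mapping device name -> wake-word detection score."""
    if not detections:
        return None  # nobody heard the wake word
    return max(detections, key=detections.get)

# Example: three devices overhear "Alexa" at once.
heard = {"kitchen fridge": 0.42, "living-room Echo": 0.87, "dashboard mic": 0.13}
print(pick_responder(heard))  # prints "living-room Echo"; the rest stay silent
```

In practice the hard part is the scoring itself—estimating proximity from microphone signals—which is why a single coordinating service, rather than each gadget acting alone, makes the scheme workable.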
Faulkner agrees. “Our vision for this is voice capabilities should really be more knitted into the fabric of the home for it to be useful,” he says. “All the devices need to be aware of each other and you need to be able to talk to these devices in an interoperable way.” To make that work requires cooperation between platform providers—or a market dominated by a single company. Faulkner says Nuance is working with software companies to figure out how to stitch disparate platforms together. What’s more, getting to a place where talking to your smart home feels manageable—let alone natural—hinges on improving the natural language understanding and contextual awareness of these gadgets.
For now, Amazon remains focused on getting Alexa into as many places as possible. And the details are of little concern to most companies, which see Alexa as little more than a sellable upgrade. That’s OK. New technology is always messy. It exposes what works and what doesn’t, and the difference between ubiquity and utility. Eventually, all these disparate threads might come together to create a truly useful ecosystem. Until then, try to find some delight in the fact that you can call an Uber from your refrigerator.