Amazon’s assistant now has its own Alexa LLM, poised to improve the brains of your smart home thanks to its extensive knowledge of smart home APIs.
Amazon’s Alexa is about to come out of its shell, and what emerges could be very interesting. At its fall hardware event Wednesday, the company revealed an all-new Alexa voice assistant powered by its new Alexa large language model. According to Dave Limp, Amazon’s current SVP of devices and services, this new Alexa can understand conversational phrases and respond appropriately, interpret context more effectively, and complete multiple requests from one command.
Voice assistants need a shake-up. A general lack of innovation and barely perceptible improvements in comprehension have turned them into basic tools instead of the exciting technological advancements we hoped for when they broke onto the scene over a decade ago.
Generative AI has looked like their best shot at survival for a while. But while these digital assistants have always had an element of AI, they’ve lacked the complex processing abilities and more human-like interactions generative AI is capable of. This is a big moment for the smart home, as it could take home automation to the next level, moving it from a remote control experience to a home that’s, well, actually smart.
In an interview with The Verge ahead of the event, Limp explained that the new Alexa LLM “is a true generalizable large language model that’s very optimized for the Alexa use case; it’s not what you find with a Bard or ChatGPT or any of these things.”
However, this all-new Alexa isn’t being unleashed everywhere, on everyone, all at once. The company is rolling it out slowly through a preview program “in the coming months” — and only in the US. Clearly, there have been lessons learned from the missteps of Microsoft and Google, and Amazon is proceeding with caution.
“When you connect an LLM to the real world, you want to minimize hallucinations — and while we think we have the right systems in place … there is no substitute for putting it out in the real world,” says Limp. If you want to be notified when you can join the preview, tell your Echo device, “Alexa, let’s chat,” and your interest will be registered.
Unsurprisingly, this superpowered Alexa may not always be free. Limp said that while Alexa, as it is today, will remain free, “the idea of a superhuman assistant that can supercharge your smart home, and more, work complex tasks on your behalf, could provide enough utility that we will end up charging something for it down the road.”
Amazon’s voice assistant is about to get more conversational. Photo by Jennifer Pattison Tuohy / The Verge
The first big change with the new Alexa will be a more conversational assistant, one that can understand more of what you say and require less specific nomenclature to do what you ask. This is one of the most common causes of frustration with voice assistants — having to repeat yourself when you ask it to turn down the thermostat, or having it respond, “A few things share the name ‘lights.’ Please choose unique names and run discovery again,” for the 900th time when all you want is enough light to see where you left the remote control.
With the new Alexa, you can say a phrase like, “Alexa, I’m cold,” and the assistant should turn up the temperature on your connected thermostat. Or, as Limp explained, “Say, ‘Alexa, make this room feel like the Seahawks colors,’ and it’s going to know what room I’m in and what the Seahawks’ colors are and make those translations between APIs.”
It’s the APIs that are key, says Limp. “We’ve funneled a large number of smart home APIs, 200-plus, into our LLM.” This data, combined with Alexa’s knowledge of which devices are in your home and what room you’re in based on the Echo speaker you’re talking to, will give Alexa the context needed to more proactively and seamlessly manage your smart home.
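To make the API idea concrete, here is a minimal, speculative sketch of how a conversational request plus home context could be turned into structured device calls by an LLM. Amazon hasn’t described its actual architecture, so the prompt format, the `llm_client.complete` call, and the device registry fields below are all invented for illustration.

```python
# Speculative sketch only -- not Amazon's implementation. It imagines how an
# utterance plus home context (device registry, current room) might be resolved
# into structured smart home API calls by a large language model.
import json

HOME_CONTEXT = {
    "current_room": "living room",
    "devices": [
        {"id": "therm-1", "type": "thermostat", "room": "hallway", "api": "SetTargetTemperature"},
        {"id": "hue-living", "type": "color_light", "room": "living room", "api": "SetColor"},
    ],
}

PROMPT_TEMPLATE = """You control a smart home. Given the user's request and the
device registry below, reply with a JSON list of API calls:
[{{"device_id": ..., "api": ..., "parameters": {{...}}}}]

Registry: {registry}
User is in: {room}
Request: {utterance}"""

def plan_actions(llm_client, utterance: str) -> list[dict]:
    """Ask the model to translate a conversational request into device API calls."""
    prompt = PROMPT_TEMPLATE.format(
        registry=json.dumps(HOME_CONTEXT["devices"]),
        room=HOME_CONTEXT["current_room"],
        utterance=utterance,
    )
    raw = llm_client.complete(prompt)  # hypothetical LLM client call
    # e.g. "Alexa, I'm cold" -> [{"device_id": "therm-1", "api": "SetTargetTemperature", ...}]
    return json.loads(raw)
```

Feeding the model the registry and the room you’re speaking from is what lets a vague request like “I’m cold” land on the right device without the user naming it.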
This contextual understanding will extend beyond knowing what other connected devices you might want to control to things like inferring when something’s changed in your home. “If you add a new device to your home, you can say, ‘Alexa, turn on the new light,’ and it will know what the new light is. It will disambiguate things, so if you put in a new smart plug or light, it will be easier to control,” explains Limp.
Another new capability is responding to multiple requests at once. This is not just the basic stuff it could already do (to some extent), such as “Alexa, turn off the lights and lock the door.” This is more advanced. “You can say, ‘Alexa, turn on the sprinklers and open my garage door, and turn off the outside lights,’ and it’ll figure that all out,” says Limp.
This capability will extend to creating Routines on the fly entirely by voice — without any manual programming in the Alexa app. “I set one up this morning for my kid just by saying, ‘Alexa, every morning at 8AM, turn up the light, play wake-up music for my kid in his bedroom, and start the coffeemaker,’” says Limp. “That can be as complicated and ambiguous as you want, and immediately, it is going to pop up in your app as a Routine.”
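For a sense of what such a voice-created Routine might look like when it “pops up in your app,” here is a rough, hypothetical sketch of the kind of trigger-and-actions object an LLM could produce from Limp’s command. The field names are invented; Amazon hasn’t published the schema its model generates.

```python
# Hypothetical shape of a Routine generated from the spoken command
# "every morning at 8AM, turn up the light, play wake-up music, start the coffeemaker".
# Field names are illustrative, not Alexa's real Routine format.
morning_routine = {
    "name": "Kid's morning wake-up",
    "trigger": {"type": "schedule", "time": "08:00", "repeat": "daily"},
    "actions": [
        {"device": "bedroom_light", "action": "turn_on", "brightness": 80},
        {"device": "bedroom_speaker", "action": "play_music", "playlist": "wake-up"},
        {"device": "coffeemaker", "action": "turn_on"},
    ],
}
```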
Initially, the multiple command feature will only work with a subset of device types — including lights, smart plugs, and a few others, says Limp. But the team is working toward adding everything.
The possibly soon-to-be-Amazon-owned Roomba is getting in on Alexa’s new AI capabilities thanks to a new developer program that lets device manufacturers tap into its LLM abilities to enable more conversational commands. Photo by Jennifer Pattison Tuohy / The Verge
Developers will also be able to leverage Alexa’s new cognitive functions. Amazon is introducing two tools that allow the new Alexa to control certain unique features of third-party manufacturer products that aren’t necessarily in Amazon’s smart home ecosystem toolkit. These are called Dynamic Controller and Action Controller.
Dynamic Controller will enable features such as prebuilt scenes for lighting control to be surfaced more naturally. So, if you have GE Cync color light bulbs and say, “Alexa, make it look spooky in here,” Alexa will know what to do without you having to program a Routine or import scenes to the Alexa app.
Similarly, Action Controller allows developers to add simple actions that Alexa can act on. For example, if you say, “Alexa, the floor is dirty,” the assistant will know you want the robot vacuum to get to work.
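Amazon hasn’t published the Dynamic Controller or Action Controller interfaces, so the following is a purely speculative sketch of how a manufacturer might describe a scene and an action for Alexa’s LLM to match conversational requests against. Every class, name, and field here is invented for illustration.

```python
# Purely speculative: imagined capability descriptions a device maker might
# register so an LLM can map loose phrases onto a scene or an action.
from dataclasses import dataclass, field

@dataclass
class SceneCapability:            # Dynamic Controller-style: a prebuilt lighting scene
    scene_id: str
    description: str              # natural-language hints the model can match against
    keywords: list[str] = field(default_factory=list)

@dataclass
class ActionCapability:           # Action Controller-style: a simple device action
    action_id: str
    description: str
    keywords: list[str] = field(default_factory=list)

capabilities = [
    SceneCapability("spooky", "Dim, flickering orange light for a spooky mood",
                    ["spooky", "halloween", "eerie"]),
    ActionCapability("start_clean", "Run the robot vacuum on the current floor",
                     ["floor is dirty", "vacuum", "clean up"]),
]
# A manufacturer would register something like this with Alexa; the LLM could then
# map "make it look spooky in here" or "the floor is dirty" onto the right capability.
```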
Amazon says it’s already working with GE Cync, Philips, GE Appliances, iRobot, Roborock, and Xiaomi on these features and is opening the program up to more developers. Amazon hasn’t provided details on how or when developers will gain access to these tools; we’ve reached out for clarification.
“We’ve come a long way, baby.” The original Echo smart speaker arrived in 2014. Photo by Sean O’Kane / The Verge
Limp says this is just the start of Alexa’s new journey. “We’ve built a new generative AI LLM that’s going to — over time — power a bunch of areas of Alexa, including a bunch of new smart home experiences,” he says. “The first bucket is to try and simplify these everyday tasks.” Where it goes next will be interesting to watch.
The new Alexa LLM-powered voice assistant will launch first in preview in the US and will be available to anyone with an Echo device, though Amazon hasn’t announced a date. The new LLM-powered smart home features will be part of an additional, invitation-only preview; you’ll be able to request an invitation once you’re in the first preview, and Amazon says those invitations will go out at a later date.