Alexa, what will you be able to do in 2030?

Amazon Echo with Alexa (Image credit: Shutterstock)

Imagine your house 10 years from now. You might picture a huge TV screen that turns into a wall, art or something else entirely with a simple gesture, a driverless car parked in the garage and a robot butler emptying the washing machine for you. But just how likely is it that Alexa will have evolved from a humble voice assistant that ‘lives’ in a cylinder to a walking, talking robot companion that helps out around the house in a decade’s time? 

Alexa, you and your voice

Smart voice assistants are already a staple in many homes. According to the latest stats, more than 66.4 million people in the US – 26.2% of the total population – now own a smart speaker. Similar stats in the UK suggest they’re nearly as popular, with more than 22% of households reportedly now kitted out with a voice-activated device.

It’s difficult to predict whether these numbers will continue to rise – and whether Alexa, Google Assistant or Siri will be the most popular AI-powered assistant in your living room. But whichever is on top, there’s one thing we know for sure: we all like the sound of our own voices.

The latest data from Juniper Research, as reported by TechCrunch, suggests the use of voice-activated tech is likely to increase over the coming years. It’s quickly becoming one of the top ways we search online, and voice features and assistants are being added to our phones, smart TVs, fitness trackers and even our cars. This is no surprise: using your voice rather than relying on a screen or other interface is quick and convenient.

“Switching from traditional screen interfaces to voice interfaces is a logical move,” Dor Skuler, CEO and co-founder of Intuition Robotics, tells TechRadar. “It allows for a greater variety of third party skills to be employed, it keeps people present during interactions, and it’s a much more accessible form of interaction for many users who have certain visual or physical impairments or have a difficult time utilizing computers and phones.”

But as natural as barking commands at our voice assistants may seem right now, there’s one thing we can all agree on: they need to become smarter for us to stay interested.

Amazon’s Echo now comes in a range of colors – and has Alexa built in (Image credit: Amazon)

Advancing Alexa's intelligence 

Amazon is constantly upgrading the way Alexa works. Last year, it announced a raft of updates, including a ‘memory’ feature that enables Alexa to remember important dates, and ‘context carry over’, which lets you ask Alexa a follow-up question without starting the conversation over.

There’s more on the horizon. According to reports, Amazon has also filed a patent that would allow Alexa to respond to commands without needing a wake word. That means you could say “turn lights on Alexa” rather than having to say “Alexa” before the command.

Although this change raises privacy concerns (is Alexa always listening?), it could make interactions more fluid. “This technology will continue to improve, with much more natural sounding interactions, until the user feels as though they are in a conversation with another human,” Skuler says.

Many of the latest updates are focused on making conversation between you and Alexa feel more like you’re talking to a human than a computer. But it’s not just conversation skills that need to develop. The suggestions Alexa provides, and the way it helps around the home, will be important for evolving it from a smart assistant into an invaluable member of the family.

“The model of interaction, at its core, is not entirely different than going on a computer and typing a question into a browser,” Skuler tells us. “Until this format changes, and becomes more intuitive and more human with the AI initiating interactions and understanding context and users, there is a large chance the functionality of the devices will plateau.”

Sony’s Aibo is a robot dog that responds to your touch (Image credit: Sony)

Building a body for Alexa

One way Alexa could become more efficient, help out around the home and understand the context of interactions would be to give it more of a physical presence in our homes.

The vice president and head scientist of Alexa Artificial Intelligence, Rohit Prasad, has said he thinks Alexa would be better if it had a robot body and cameras. “The only way to make smart assistants really smart is to give it eyes and let it explore the world,” Prasad said at the MIT Technology Review’s EmTech Digital AI conference earlier this year.

Giving smart assistants physical bodies could enable them to learn more about the world, including your routines, how you move around your home and where furniture and other bits of technology are situated.

This context-building could be what Alexa needs to start serving up more personalized suggestions, help you accomplish manual tasks and pre-empt issues before they arise, straddling the line between AI assistant and robo-companion.

“We will see a move from voice assistants to companions,” Skuler says. “AI agents that encompass rich characters that anticipate our needs and are more integral to people's lives.”

He explains that this is important for more natural and helpful interactions. “Instead of users having to explicitly ask for what they want, the companion will understand the context of the situation and proactively interact with the user to assist them,” Skuler tells us. “It is natural that robot bodies will be built for AI companions and incorporate multiple modalities to allow for a natural interaction.”

On the other hand, there’s also a case for Alexa having even less of a physical presence. “It's also natural that regular voice assistants won't need a body,” Skuler tells us. “They will be invisible and embedded in other devices.” So rather than ‘living’ in a smart speaker or a robot body, Alexa could be integrated into tech, furniture and the very structure of our houses – all that’s necessary is a camera, sensors and a connection between all of your devices.

The Zoetic Kiki has a number of expressions, which helps to build a connection between you and your robo-pet (Image credit: Zoetic)

Robo-pets, companions and AI friends

A number of home robots have entered the market in recent years, promising to act as assistants, stand in for pets or even offer physical assistance.

Companies including Jibo, Sony, Anki and Zoetic have launched social robots – the Jibo, Aibo, Vector and Kiki respectively – that are both pets and personal assistants.

Some are even created to build an emotional connection. “Kiki's main purpose is to be a companion, a friend to keep you company,” Mita Yun, Co-Founder and CEO of Zoetic, tells TechRadar. “We don't intend for Kiki to be a total pet replacement – instead, we see Kiki as another character in your home, to interact with and to share in your life.”

Companion robots like Kiki can do a range of things, from looking at you and recognizing your face to responding to your touch via sensors, and some have their own Alexa-style AI assistant to help you check messages, call friends and more.

To some, a robo-pet may sound comforting; to others, a nuisance. There’s certainly no clear demand for social robots yet, as a number of big names in this space, including Anki and Jibo, have shuttered operations. Which raises the question: do we need robots in our homes?

The ElliQ is a digital companion aimed at keeping older adults sharp, connected and engaged (Image credit: ElliQ)

Skuler tells us this depends on the purpose of the AI and the user experience it’s designed to deliver. If you’re in a busy home and don’t need a physical presence, Alexa could best serve you integrated into your smart products. But there are other use cases where a robot with a body could be more important, including for older people or those who suffer from loneliness.

Skuler explains that for his robot product, ElliQ, the body was necessary for its core functions. “We were designing for an ageing population that needs multiple modalities of interaction in order to optimize their experience and make it intuitive, approachable and empathetic,” he explains. “Each physical element serves a clear purpose.”

Using different interfaces

Although voice interaction is expected to rise in usage and popularity, we could find different ways to interact with AI assistants in the future.

For example, many major companies are rumored to be working on augmented reality tech and eyewear – and there are already a number of headsets available to enterprises and developers. This means that Alexa, and other AI assistants, could work with a mixture of different interfaces and inputs, including AR eyewear, gestures and voice commands combined.

Alexa could also integrate into other kinds of technology beyond speakers and smart home products, removing the need for active interaction altogether. Last month, the Fitbit Versa 2 was announced with Alexa integration. Although it remains to be seen how useful the current iteration of Alexa is on your wrist, this could be the start of AI assistants paying close attention to your routine, body and health.

The Fitbit Versa 2 has Alexa built in. It allows you to ask queries like you’d ask an Amazon Echo home speaker, but you can now do that on your wrist (Image credit: Fitbit)

Rumors have suggested Amazon is working on its own Alexa-enabled wearable, which could tap into your emotional state by keeping tabs on breathing and heart rate. Although this may sound a little invasive, it could lead to more helpful assistants.

Imagine a future in which Alexa already knows you’re tired so, without asking, runs you a bath, dims the lights and has already picked up the phone to cancel your plans for the evening. Bliss.

This kind of joined-up thinking is the key to Alexa’s long-term usefulness. But of course there are plenty of other factors that could affect how we use and relate to Alexa in future.

For example, will we openly accept robots into our homes? Will we suddenly go off the idea of always-on tech in our homes after a few too many security blunders?

There’s a lot more to consider here than the technology itself: our own attitudes towards its increasing presence in our lives and in our living rooms will matter just as much.

The main focus for Amazon in the near future is on making Alexa’s capabilities, suggestions and language more natural and tailored to us. That could mean infinitely smarter conversational skills, personalized suggestions and an omnipresent Alexa in all our tech. It could also mean a robo-companion that gives Alexa a body, so it can whizz around helping us with chores and providing companionship. Or something else entirely. After all, 10 years ago many of us would have laughed at the idea of a disembodied voice taking up residence in our homes – who knows just how different the next 10 years will look.

Becca Caddy

Becca is a contributor to TechRadar, a freelance journalist and author. She’s been writing about consumer tech and popular science for more than ten years, covering all kinds of topics, including why robots have eyes and whether we’ll experience the overview effect one day. She’s particularly interested in VR/AR, wearables, digital health, space tech and chatting to experts and academics about the future. She’s contributed to TechRadar, T3, Wired, New Scientist, The Guardian, Inverse and many more. Her first book, Screen Time, came out in January 2021 with Bonnier Books. She loves science-fiction, brutalist architecture, and spending too much time floating through space in virtual reality.