BMW has announced a next-generation interface that it hopes will let drivers interact with their cars like they're chatting with another human.
Natural Interaction, unveiled at MWC 2019, combines gesture, voice and gaze recognition. The system will detect and interpret commands based on the situation and context, and will use machine learning to improve its accuracy over time.
The system will detect your exact hand and finger movements, as well as where you're looking when you make them. For example, you can point at a window and say 'down' to lower it, or point at a mysterious button on the dashboard to find out what it does.
Gestures aren't limited to the car's cockpit; thanks to HD mapping, you can also point to buildings outside and ask for information on them.
You could point at a restaurant that looks interesting and ask to see reviews or book a table, or point at a cinema to find out viewing times and book tickets. You can even point to a space and tell the car to park there.
There are currently seven gestures, two of which are customizable: pointing with two fingers, and opening and closing your hand (a little like squeezing a sponge).
BMW has also improved its voice controls, making them more human-focused. A video demonstration showed how telling the car you're feeling tired could turn down the temperature, open a window, and put on some rock music to sharpen you up.
"Our goal is to push the boundaries of interaction between humans and cars," explained Dr Christophe Grote, senior vice president of electronics, at a press conference. "Our vision is an intelligent car becoming more human-like."
The road ahead
Natural Interaction will make its debut in the BMW iNext, a prototype of which was on display at the conference, and which is due to launch in all major territories in 2021.
For now, most of the system focuses on the driver (though Natural Interaction can detect gestures for both front seats), but future iterations could extend to the rear seats as well. As Dr Grote explains, while the iNext features level three automation (which allows you to safely take your eyes off the road for a while), in cars with level five automation, everyone is a passenger.
Natural Interaction will also reduce the need for physical controls within the car, but Dr Grote says BMW doesn't intend for it to replace its iDrive infotainment system just yet.
Some processing takes place in the cloud, but a lot is performed within the car itself, which keeps your information private and means the system still works in places without reliable 5G connectivity.
None of your data will be sent to third parties. If you decide you specifically want to ask Alexa a question, you'll be able to do that, but the connection to Amazon will close as soon as you're done.
However, you might never need to turn to another virtual assistant. Ultimately, Dr Grote says, BMW envisions Natural Interaction as your "ultimate companion, who anticipates and suggests before I know what I want".
It's an ambitious goal, and we look forward to trying the system for real in 2021.
MWC (Mobile World Congress) is the world's largest showcase for the mobile industry, stuffed full of the newest phones, tablets, wearables and more. TechRadar is reporting live from Barcelona all week to bring you the very latest from the show floor. Head to our dedicated MWC 2019 hub to see all the new releases, along with TechRadar's world-class analysis and buying advice about your next phone.