In-car Voice AI Shifts from Basic Commands to Personal Assistants

by Editor

Say “I’m chilly” to your car, and it adjusts the temperature. Ask about the weather, and it answers conversationally. Welcome to the era of automotive AI assistants that finally understand how humans actually talk.

In-car AI now acts as “proactive copilots that help drivers and passengers get things done,” Christian Mentz, chief revenue officer at the U.S. software company Cerence, told Automotive News Europe in an email. “Driving is a high-stakes, safety-critical task, which makes voice the most practical and safest way to access intelligence, services and information.”

The global automotive voice recognition market is projected to reach $9.9 billion by 2034, up from $3.7 billion in 2024, according to market research firm GM Insights. The growth reflects automakers' shift toward software-defined vehicles (SDVs), which enable more advanced voice-controlled features.

The AI assistants of today and tomorrow will be controlled by voice because it is “the most natural and efficient way to interact,” especially in environments where attention and safety matter, Mentz said. Cerence uses large language models (LLMs) to make in-car assistants more intuitive and conversational for automakers including Volkswagen Group.

LLM-powered in-car AI assistants offer far better conversational capabilities because they allow the voice assistant to understand nearly everything the user says, Gartner Vice President of Research Pedro Pacheco told Automotive News Europe in an email. This marks the end of memorizing scripted commands and the start of the car understanding human context, Pacheco said.

Moreover, LLMs bring the vast general knowledge of GPT (generative pre-trained transformer) models, allowing the voice assistant to answer a broad range of general-domain questions, such as queries about the news, current events or the weather. Some can even detect emotions in the user's voice patterns and adapt their responses accordingly.

Mercedes, BMW, Tesla deploy distinct voice strategies

Automakers are taking varied approaches to implementing these conversational systems, from emotional companions to task-focused assistants.

The Mercedes-Benz User Experience (MBUX) virtual assistant uses generative AI as a “digital companion.” It can engage in natural dialogue, express four distinct emotional states, and offer proactive suggestions based on the driver’s routine.

Meanwhile, BMW is rolling out its next-generation personal assistant built on Amazon’s Alexa Custom Assistant technology. The 2025-26 update supports complex requests regarding music, news and sports without requiring rigid, scripted commands.

Finally, Tesla is using its Grok chatbot from xAI, an artificial intelligence company founded by Elon Musk in 2023. Grok differs from standard automotive assistants by its conversational depth, real-time data access and its distinctive rebellious personality.

Next: AI agents that work outside the car

The next step is the adoption of AI agents that can be controlled via the voice assistant, Pacheco said.

These agents, expected within two to three years, will let users control any vehicle feature and even perform tasks outside the car, such as arranging a meeting, booking a car service or answering phone calls.

“If properly done, this can become a game changer as it will allow the car to expand the amount of value it delivers to its occupants,” Pacheco said. “This means an AI assistant will become a real personal assistant in the full sense of the word.”

Originally written by: Lois Jones

Source: Automotive News

Published on: 19 April 2026

Link to original article: In-car voice AI shifts from basic commands to personal assistants
