Mercedes Builds In a Whole Lot of Artificial Intelligence

Although there are still plenty of car shows and well-publicized car collections, it may be that we don’t love our cars like we used to.  Spewing tailpipes, exploding airbags, gridlock, and road taxes dominate much of the current media narrative.

Things do come round, though, and if the latest automotive infotainment system from Mercedes-Benz is indicative of where vehicle interface technology is going, we may find ourselves becoming at least more ‘in sync’ with our cars again, or even, if Mercedes-Benz succeeds in its aims, feeling an “emotional connection between the vehicle, driver, and passengers.”

Mercedes is calling its new system MBUX, for Mercedes-Benz User Experience; ‘Emm-Bucks’ will likely be its folksy nickname, but it is far from homespun.  Automotive journalists are calling it a paradigm shift in technological development, one likely to keep Mercedes the standard-bearer for some time to come.  After all, as Eric Adams, writing for http://www.thedrive.com, put it, “Designing a completely new interface that will be deeply integrated into all the car’s systems and can be adapted to every vehicle across the lineup takes years of planning, development, and testing.”

Watching what Mercedes-Benz is rolling out in this regard may seem a rarefied exercise, but for the fact that, as they evolve, many ‘luxury’ car features, by regulation or market demand or both, often become ‘baseline’.

MBUX’s innovations are remarkable.  The system uses artificial intelligence (“AI”) to learn its particular driver’s preferences and adapt accordingly.  And because it is cloud-based, it will also continue to learn, from all of its users, how to comprehend language and intent more generally.  It responds to natural, informal language, not just a set roster of commands.

The system is activated by saying, “Hey Mercedes.”  Then, for example, you can ask it to adjust the temperature by saying you’re too cold or too hot.  You can ask it to phone someone, perhaps your boss or daughter or spouse.  You can ask it for highly specific destinations (one example given was “Find me an Asian restaurant, but not Japanese, in downtown LA”) or ask about game times or sports scores.  An unanswerable question may elicit what one writer calls ‘a silent shrug,’ but this should become increasingly rare.
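For readers curious about what “responding to intent” looks like under the hood, here is a deliberately simplified Python sketch of how a wake-word-gated assistant might route informal requests to vehicle functions.  The wake word comes from Mercedes’s own materials, but everything else (the function names, the keyword rules, the actions) is invented for illustration; MBUX’s cloud-based language understanding is far more sophisticated than keyword matching.

```python
# Illustrative only: a toy rule-based intent router, NOT the MBUX implementation.
# Real systems use cloud-based language models rather than keyword rules.

WAKE_WORD = "hey mercedes"

def route_intent(utterance: str) -> str:
    """Map an informal spoken request to a hypothetical vehicle action."""
    text = utterance.lower()
    if "cold" in text:
        return "climate: raise cabin temperature"
    if "hot" in text or "warm" in text:
        return "climate: lower cabin temperature"
    if text.startswith("call") or "phone" in text:
        return f"phone: dial the contact named in '{utterance}'"
    if "restaurant" in text or "find me" in text:
        return f"navigation: search points of interest for '{utterance}'"
    if "score" in text or "game" in text:
        return f"infotainment: look up sports information for '{utterance}'"
    return "no match: the 'silent shrug'"

def handle_speech(transcript: str) -> str | None:
    """Only act when the transcript begins with the wake word."""
    if not transcript.strip().lower().startswith(WAKE_WORD):
        return None  # assistant stays dormant
    request = transcript.strip()[len(WAKE_WORD):].strip(" ,.")
    return route_intent(request)

if __name__ == "__main__":
    for phrase in [
        "Hey Mercedes, I'm too cold",
        "Hey Mercedes, call my boss",
        "Hey Mercedes, find me an Asian restaurant, but not Japanese, in downtown LA",
        "Hey Mercedes, what's the meaning of life?",
    ]:
        print(phrase, "->", handle_speech(phrase))
```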

The dual displays appear as a single ultra-wide screen extending from behind the steering wheel to roughly the midpoint of the vehicle.  The left screen presents the primary instrument cluster in one of three styles.  The right screen displays the navigation, entertainment, and vehicle-systems interfaces.

In addition to spoken queries, these interfaces can be controlled with a trackpad, steering-wheel controls, and the touchscreen itself: horizontal swipes select among the scrolling icons, while vertical swipes retrieve the sub-systems the AI has deduced are wanted and used the most.
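That “deduced are wanted and used the most” behavior is essentially predictive shortcut ranking.  The sketch below is a generic, hypothetical illustration (the ShortcutPredictor class and its decay formula are invented for this article, not taken from Mercedes) of how a system might rank sub-systems by how often and how recently they have been used.

```python
# Hypothetical sketch of a usage-ranked shortcut list; not MBUX's actual logic.
import time
from collections import defaultdict

class ShortcutPredictor:
    """Rank sub-systems by a blend of usage frequency and recency."""

    def __init__(self, half_life_s: float = 7 * 24 * 3600):
        self.half_life_s = half_life_s      # older usage counts for less
        self.scores = defaultdict(float)    # sub-system name -> decayed score
        self.last_update = time.time()

    def record_use(self, subsystem: str) -> None:
        """Call this each time the driver opens a sub-system."""
        self._decay()
        self.scores[subsystem] += 1.0

    def _decay(self) -> None:
        """Halve every score once per half-life so stale habits fade out."""
        now = time.time()
        factor = 0.5 ** ((now - self.last_update) / self.half_life_s)
        for name in self.scores:
            self.scores[name] *= factor
        self.last_update = now

    def top(self, n: int = 3) -> list[str]:
        """The sub-systems a vertical swipe might surface first."""
        self._decay()
        return sorted(self.scores, key=self.scores.get, reverse=True)[:n]

predictor = ShortcutPredictor()
for use in ["navigation", "phone", "navigation", "media", "navigation", "phone"]:
    predictor.record_use(use)
print(predictor.top())  # e.g. ['navigation', 'phone', 'media']
```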

The graphics are described as “crisp and dynamic,” and the system as quick and precise, with almost “no latency.”  The navigation system offers augmented-reality features.  For example, it can superimpose data such as street names, directional arrows, and property numbers on the wide-angle image from a forward-facing camera.
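In principle, overlaying a street address on live video is a camera-projection problem: given a labeled point in space and the forward camera’s parameters, the label can be placed at the matching pixel.  The following sketch uses a standard pinhole-camera model with made-up intrinsics and an invented example point; it illustrates the general technique only and says nothing about how Mercedes actually implements its augmented-reality navigation.

```python
# Generic pinhole-camera projection sketch; the camera parameters and the
# example point are invented, and this is not Mercedes's implementation.
import numpy as np

# Hypothetical intrinsics (focal lengths and principal point, in pixels)
# for a 1280x720 forward-facing camera.
K = np.array([
    [1000.0,    0.0, 640.0],
    [   0.0, 1000.0, 360.0],
    [   0.0,    0.0,   1.0],
])

def project_label(point_cam: np.ndarray) -> tuple[int, int] | None:
    """Project a 3-D point (x right, y down, z forward, in metres, expressed
    in the camera frame) onto the image plane; return pixel coordinates,
    or None if the point lies behind the camera."""
    x, y, z = point_cam
    if z <= 0:
        return None
    u, v, w = K @ np.array([x, y, z])
    return int(round(u / w)), int(round(v / w))

# A building entrance 25 m ahead and 4 m to the right of the camera,
# roughly at road level (1.2 m below the lens).
pixel = project_label(np.array([4.0, 1.2, 25.0]))
print("Draw the property number overlay at pixel", pixel)  # (800, 408)
```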

It must be said that a lot of these AI features are already on our phones.  Still, it should be useful to have them right at hand and integrated into the vehicle.  Google MBUX to see it in action.