AI-Driven UX/HMI for Mercedes-Benz Autonomous Driving

Led the development of an autonomous-driving HMI with AI-based monitoring, designing clear, safe interactions for drivers and passengers; the design was later adopted across multiple vehicle classes.

Client

Mercedes-Benz

Role

Product Designer & PM: a lean, startup-size team within a focused division

Team

1 PL / 2 PMs / 1 Product Designer / 1 GUI Designer / 60 Developers + 200+

Period

24 months, 2017 - 2018

Overview

First AI-driven Autonomous HMI/UX Development for Mercedes


Mercedes-Benz collaborated with LG to develop an AI-based HMI and UX for Level 3 autonomous driving. Working with cross-functional teams, I gained valuable research and development experience while co-designing a multimodal AI framework that used machine learning to interpret head movement, gaze, gestures, posture, and driver health in real time. The system enabled safer and more intuitive interactions and enhanced the overall user experience; the direction was later implemented in S-, E-, and C-Class vehicles and continues today as a recurring-revenue program generating over one billion dollars annually.

Challenge

After repeated failures by senior engineers and global OEMs, the autonomous HMI project was handed to me, the youngest PM and HMI specialist on the team, with only one week remaining. With mass layoffs looming and no clear specifications in sight, I had to rebuild my understanding from the ground up, starting with used car markets and raw components.


(Fortunately, I could draw on my hands-on experience in sensor-based HMI and voice control, built through Samsung, LG, and collaborations with major Big Tech companies.)

Objective

We had to combine field research, rapid prototyping, and user-centric intuition to quickly create a realistic and scalable solution that would meet Mercedes’ expectations for the first real-world HMI prototype for autonomous driving.

Result

My engineers and I rapidly iterated through multiple rounds of prototyping, and within just 5 days, we delivered a complete HMI product plan, along with a driver and occupant monitoring system designed to handle a wide range of unexpected real-world scenarios.


Mercedes adopted and scaled this solution, ultimately generating over $1 billion in annual recurring revenue for LG. More importantly, it restored confidence in a team that had nearly been dismantled, proving the depth of talent that had gone unrecognized.

Preview final design

Monitoring from head to toe and pointing gestures


Our UX approach focused on enabling intuitive, full-body interactions that support both autonomy and comfort in high-stakes driving scenarios. By integrating head posture, eye tracking, and hand pointing gestures, we created a proactive HMI that eliminates the need for memorized commands.

The design allows users, whether driver or passenger, to interact naturally under pressure, minimizing distraction and maximizing safety. Every gesture and posture model was validated through rapid prototyping and real-world testing, ensuring clarity, speed, and emotional trust at every touchpoint.
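To illustrate the idea of a proactive, multimodal HMI, here is a minimal sketch of how signals from separate modality models (gaze, pointing gesture, posture) might be fused into a single interaction intent. All names, weights, and the weighted-vote scheme are hypothetical assumptions for illustration only; the production system's actual fusion logic is confidential and not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    """Output of one hypothetical modality model."""
    target: str        # e.g. "left_window", "nav_screen"
    confidence: float  # 0.0-1.0 score from the underlying ML model

def fuse_intent(gaze: Signal, pointing: Signal, posture: Signal) -> str:
    """Weighted vote across modalities: gaze and pointing dominate,
    posture acts as a tie-breaker / plausibility check.
    (Illustrative weights only.)"""
    weights = {"gaze": 0.4, "pointing": 0.4, "posture": 0.2}
    scores: dict[str, float] = {}
    for name, sig in (("gaze", gaze), ("pointing", pointing), ("posture", posture)):
        scores[sig.target] = scores.get(sig.target, 0.0) + weights[name] * sig.confidence
    return max(scores, key=scores.get)

# Driver glances at and points toward the left window; the posture
# model weakly suggests the nav screen, but is outvoted.
intent = fuse_intent(
    Signal("left_window", 0.9),
    Signal("left_window", 0.8),
    Signal("nav_screen", 0.6),
)
print(intent)  # left_window
```

The point of the sketch is that no single modality is trusted alone: agreement between gaze and gesture yields a high-confidence intent without any memorized command, which is the interaction principle described above.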

In addition to these multimodal interactions, several advanced techniques and proprietary sensing technologies were applied throughout the system. Due to confidentiality agreements, certain technical details cannot be disclosed here, but they played a critical role in delivering a seamless and intelligent experience.

More detail

I’ll walk you through the details during our meeting.

Hello

Any questions?

Let's connect and build something meaningful together