en:safeav:hmc:hmi · current revision 2026/03/26 11:25 by raivo.sell (previous revision 2025/10/20 21:52)
This chapter explores the specificities of Human–Machine Interaction (HMI) in the context of autonomous vehicles (AVs). It examines how HMI in autonomous vehicles differs fundamentally from traditional car dashboards. With the human driver no longer actively involved in operating the vehicle, a challenge arises: how should AI-driven systems communicate effectively with passengers, pedestrians, and other road users?
HMI in AVs extends far beyond the driver’s dashboard. It defines the communication bridge between machines, people, and infrastructure — shaping how autonomy is perceived and trusted. Effective HMI determines whether automation is experienced as intelligent and reliable or opaque and alien.
===== Changing Paradigms of Communication =====
Traditional driver interfaces were designed to support manual control. In contrast, autonomous vehicles must communicate intent, status, and safety both inside and outside the vehicle. The absence of human drivers requires new communication models to ensure safe interaction among all participants.
This section addresses the available communication channels and discusses how these channels must be redefined to accommodate the new paradigm.
A key concept in this transformation is the Language of Driving (LoD) — a framework for structuring and standardizing how autonomous vehicles express awareness and intent toward humans (Kalda et al., 2022).
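One way to picture a structured awareness-and-intent message in the spirit of LoD is a small data model. This is only an illustrative sketch: the field names, intent vocabulary, and channel labels below are assumptions for the example, not part of the published framework.

```python
from dataclasses import dataclass
from enum import Enum

class Intent(Enum):
    # Hypothetical intent vocabulary; the actual LoD taxonomy may differ.
    YIELDING = "yielding"
    PROCEEDING = "proceeding"
    STOPPING = "stopping"

@dataclass
class LoDMessage:
    """One structured 'utterance' from vehicle to human (illustrative)."""
    awareness: list   # agents the AV has detected, e.g. ["pedestrian_left"]
    intent: Intent    # what the AV is about to do
    channel: str      # how it is expressed: "led_strip", "display", "audio"

msg = LoDMessage(awareness=["pedestrian_left"],
                 intent=Intent.YIELDING,
                 channel="led_strip")
print(msg.intent.value)  # yielding
```

Separating //what// the vehicle wants to express (awareness, intent) from //how// it is rendered (channel) is what would allow the same message to be standardized across vehicle brands and display hardware.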
===== Human Perception and Driving =====
  * Subtle speed or direction changes as non-verbal cues.
Such behaviorally inspired signaling helps AVs become socially legible, supporting shared understanding on the road.
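The idea of motion itself as a non-verbal cue can be sketched as a tiny function that maps a planned maneuver to a target speed, so that the slowdown (or its absence) carries the message. The maneuver names and deceleration values are assumptions chosen for illustration only.

```python
# Sketch: translate a planned maneuver into an implicit, non-verbal motion cue.
# Maneuver names and the 2 m/s easing value are illustrative assumptions.
def implicit_cue(maneuver: str, current_speed_ms: float) -> float:
    """Return a target speed (m/s) that signals intent through motion alone."""
    if maneuver == "yield_to_pedestrian":
        # An early, gentle slowdown reads to a pedestrian as "I have seen you".
        return max(current_speed_ms - 2.0, 0.0)
    if maneuver == "claim_right_of_way":
        # Holding speed steady signals that the AV intends to proceed.
        return current_speed_ms
    return current_speed_ms

print(implicit_cue("yield_to_pedestrian", 8.0))  # 6.0
```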
===== Cultural and Social Interactions =====
Autonomous vehicles may need to adapt their communication style — from light colors and icons to audio tones and message phrasing — depending on cultural and regional expectations.
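In practice, such regional adaptation could be driven by a per-region style profile that the HMI consults before rendering a signal. The regions, colors, and tone names below are placeholder assumptions for the sketch, not validated cultural guidelines.

```python
# Illustrative mapping from region code to eHMI presentation defaults.
# All values here are assumptions for the example, not design recommendations.
REGION_STYLE = {
    "eu": {"yield_color": "cyan", "tone": "soft_chime", "text": "icons"},
    "us": {"yield_color": "white", "tone": "voice", "text": "short_phrases"},
}

def style_for(region: str) -> dict:
    """Look up a region profile, falling back to the 'eu' defaults."""
    return REGION_STYLE.get(region, REGION_STYLE["eu"])

print(style_for("us")["tone"])          # voice
print(style_for("jp")["yield_color"])   # cyan (fallback profile)
```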
Research explores whether AVs could adopt human-like communication methods, such as digital facial expressions or humanoid gestures, to support more natural interactions in complex social driving contexts.
===== AI Role in Communication =====
Modern HMI systems increasingly rely on artificial intelligence, including large language models (LLMs), to process complex situational data and adapt communication in real time.
AI enables:
  * Context-aware dialogue systems for passengers and operators.
  * Adaptive message prioritization based on urgency and environment.
  * Natural language explanations of AV behavior (e.g., “Slowing down for crossing pedestrian”).
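The second capability above — urgency-based message prioritization — can be sketched with a standard priority queue: more urgent messages are surfaced to the passenger first, regardless of arrival order. The urgency levels and message texts are assumptions for illustration.

```python
import heapq

# Minimal sketch of urgency-based prioritization for an HMI message queue.
# Urgency 0 is most urgent; messages here are illustrative assumptions.
class MessageQueue:
    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker preserves FIFO order within one urgency level

    def push(self, urgency: int, text: str):
        heapq.heappush(self._heap, (urgency, self._seq, text))
        self._seq += 1

    def pop(self) -> str:
        """Return the most urgent pending message."""
        return heapq.heappop(self._heap)[2]

q = MessageQueue()
q.push(2, "Route updated")
q.push(0, "Braking for crossing pedestrian")
q.push(1, "Speed reduced to 30 km/h")
print(q.pop())  # Braking for crossing pedestrian
```

In a real HMI, urgency would be derived from context (safety relevance, time criticality) rather than hard-coded, but the queuing discipline stays the same.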
The evolution toward AI-mediated interfaces marks a shift from fixed UI design toward conversational and contextual vehicle communication.