2015
DOI: 10.4218/etrij.15.0114.0076

Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System

Abstract: We propose a novel HMI UI/UX for an in‐vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface–based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that …


Cited by 24 publications (12 citation statements)
References 13 publications
“…A number of interaction studies have been conducted on infotainment systems. Various infotainment systems have been studied, such as touch buttons (Crundall et al, 2016;Feng et al, 2018;Kim et al, 2014;Suh & Ferris, 2019), overall user interface (UI) elements (Hua & Ng, 2010;Naujoks et al, 2019;Pankok & Kaber, 2018), icons/symbols (Silvennoinen et al, 2017), screen position (Kuiper et al, 2018), layout (Kim et al, 2015;Li, Chen et al, 2017), sound effects (Larsson & Niemand, 2015), gestures (Graichen et al, 2019;Parada-Loira et al, 2014), and modality (Gaffar & Kouchak, 2017).…”
Section: Literature Review
confidence: 99%
“…One study proposed a novel human-machine interface (HMI) user-interface/user-experience (UI/UX) system based on the recognition of diverse gestures that is applicable to an in-vehicle infotainment system [7]. Based on the results of these studies, we expect to be able to understand the cause and effect of driver distractions because emotions and stress also affect driver behaviors and states.…”
Section: Effects of Augmented-Reality Head-Up Display
confidence: 99%
“…It uses a near-infrared visual device that also provides hand-skeleton information. Similarly, hand-skeleton information has been used to control the radio inside a car in the field of driver assistance [2]. Another example is the virtual reality sickness simulator presented in [3], which makes use of hand-skeleton information provided by the Leap Motion device to interact with the virtual environment.…”
Section: Introduction
confidence: 99%
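The citation above describes using hand-skeleton information (e.g., from a near-infrared device such as the Leap Motion) to control an in-car radio. As a rough illustration of that idea — not the paper's actual method — the sketch below classifies a static gesture by counting extended fingers from fingertip and palm positions, then maps the gesture to a hypothetical infotainment command. All function names, thresholds, and command strings are illustrative assumptions.

```python
import math

def classify_gesture(fingertips, palm):
    """Map a static hand pose to an infotainment command (illustrative).

    fingertips: five (x, y, z) fingertip positions, in metres
    palm: (x, y, z) palm-centre position, in metres

    A finger counts as "extended" when its tip is far from the palm
    centre; the 8 cm threshold is an arbitrary assumption, not a value
    from the paper.
    """
    extended = sum(
        1 for tip in fingertips
        if math.dist(tip, palm) > 0.08
    )
    # Hypothetical gesture-to-command mapping:
    if extended == 5:          # open hand
        return "volume_up"
    if extended == 0:          # closed fist
        return "volume_down"
    if extended == 1:          # single pointed finger
        return "next_track"
    return "no_op"             # unrecognised pose: do nothing
```

A real system would of course work on tracked skeleton streams with temporal smoothing and dynamic (motion-based) gestures; this static snapshot version only shows the shape of the skeleton-to-command mapping.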