2013 18th International Conference on Digital Signal Processing (DSP)
DOI: 10.1109/icdsp.2013.6622782
Multimodal desktop interaction: The face - object - gesture - voice example

Abstract: This paper presents a natural user interface system based on multimodal human-computer interaction, which operates as an intermediate module between the user and the operating system. The aim of this work is to demonstrate a multimodal system that gives users the ability to interact with desktop applications using face, objects, voice and gestures. These human behaviors constitute the input qualifiers to the system. The Microsoft Kinect multi-sensor was utilized as the input device in order to achieve natural use…

Cited by 4 publications (2 citation statements); references 27 publications.
“…Interpreting the user's explicit intention, which contains valuable information, is vital in developing efficient human computer interfaces. In conventional human computer interface (HCI) and human robot interaction (HRI) environments, the user intention such as 'copy this file' or 'create a folder' can be explicitly conveyed through a keyboard and a computer mouse [11,12], which can be easily interpreted. The process of data visualisation is suitable for externalising the facts and enabling people to understand and manipulate the results at a higher level.…”
Section: Introduction
confidence: 99%
“…Interpreting the user's explicit intention, which contains valuable information, is vital in developing efficient human computer interfaces. In conventional human computer interface (HCI) and human robot interaction (HRI) environments, the user intention such as "copy this file" or "create a folder" can be explicitly conveyed through a keyboard and a computer mouse [18,19], which can be easily interpreted. The process of data visualisation is suitable for externalising the facts and enabling people to understand and manipulate the results at a higher level.…”
Section: Introduction
confidence: 99%