2016
DOI: 10.1017/atsip.2016.2

Multi-modal sensing and analysis of poster conversations with smart posterboard

Abstract: Multi-modal signal and information processing has been investigated primarily for intelligent human-machine interfaces, including smartphones, KIOSK terminals, and humanoid robots. Meanwhile, speech and image processing technologies have improved so much that their targets now include natural human-human behaviors, which occur without awareness of interface devices. In this scenario, sensing devices are installed in an ambient manner. Examples of this kind of directio…

Cited by 4 publications (3 citation statements); references 31 publications.
“…Even in quite sophisticated data collection approaches, capturing eye gaze accurately may be problematic. Kawahara et al.'s (2016) study of audiences at poster presentations used multiple forms of data collection involving a camera, a microphone array, and motion sensing to detect audience members' faces and track their eye gaze. While this data collection method supplemented the video, they still needed to approximate eye gaze from head movement.…”
Section: Technical Issues in Recording (mentioning)
confidence: 99%
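To make the "approximate eye gaze from head movement" step concrete, the sketch below intersects a head-direction ray with the poster surface. This is a hypothetical illustration, not the smart posterboard's actual implementation: the function name gaze_point_on_poster, the coordinate frame, and the assumption that the poster lies in the plane y = 0 are all introduced here only for the example.

```python
# A minimal sketch (not the authors' implementation) of approximating gaze
# from head pose: intersect the head-direction ray with the poster plane.
import numpy as np

def gaze_point_on_poster(head_pos, yaw_deg, pitch_deg):
    """Approximate the gaze point on a poster assumed to lie in the plane y = 0.

    head_pos  -- (x, y, z) head position in metres; y > 0 is the distance
                 from the poster plane.
    yaw_deg   -- horizontal head rotation; 0 means facing the poster squarely.
    pitch_deg -- vertical head rotation; 0 means a level head.
    Returns (x, z) on the poster plane, or None if the head faces away.
    """
    yaw, pitch = np.radians([yaw_deg, pitch_deg])
    # Unit vector of the head (and, by approximation, gaze) direction.
    direction = np.array([
        np.sin(yaw) * np.cos(pitch),    # x: left/right
        -np.cos(yaw) * np.cos(pitch),   # y: towards the poster (negative y)
        np.sin(pitch),                  # z: up/down
    ])
    if direction[1] >= 0:               # head turned away from the poster
        return None
    t = -head_pos[1] / direction[1]     # ray parameter where y reaches 0
    hit = np.asarray(head_pos, dtype=float) + t * direction
    return float(hit[0]), float(hit[2])

# Example: a listener 1.5 m from the poster, head turned 10 degrees to the
# right and tilted 5 degrees upward.
print(gaze_point_on_poster((0.3, 1.5, 1.6), 10.0, 5.0))
```

In this kind of setup, a head position and yaw/pitch estimate (e.g., from a depth or motion sensor) are mapped to an approximate gaze point on the poster; the approximation degrades when the eyes move independently of the head, which is exactly the limitation the citing authors point out.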
“…There are a number of studies that have analyzed the relationship between gaze and user intention, such as user focus (Yonetani et al., 2010), preference (Kayama et al., 2010), and reference expression understanding, and between gaze and turn-taking (Jokinen et al., 2010; Kawahara, 2012). Nakano et al. (2013) used face direction for addressee identification.…”
Section: Related Work (mentioning)
confidence: 99%
“…On the other hand, the users’ internal states are difficult to define and measure objectively. Many researchers have proposed recognition models for various kinds of internal states such as the level of interest [7,8], understanding [9], and emotion [10–12]. From the perspective of the relationship between dialogue participants (i.e.…”
Section: Introduction (mentioning)
confidence: 99%