2011
DOI: 10.1007/978-3-642-19170-1_18

Robot Emotional State through Bayesian Visuo-Auditory Perception

Abstract: In this paper we focus on auditory analysis as the sensory stimulus, and on vocalization synthesis as the output signal. Our scenario is one robot interacting with one human through the vocalization channel. Note that vocalization goes far beyond speech; while speech analysis tells us what was said, vocalization analysis tells us how it was said. A social robot should be able to perform actions in different manners according to its emotional state. Thus we propose a novel Bayesian approach to determine th…

Cited by 1 publication (8 citation statements) | References 8 publications (9 reference statements)
“…Progress was made in [23] and [10]; however, although they claim a high percentage of accuracy, they are not real-time. In this paper we use our own model, previously proposed in [15], which is a real-time system for auditory emotional classification and therefore well suited for our purposes.…”
Section: Related Work, 2.1 Proposed Approaches to Automatic Emotion (mentioning)
confidence: 99%
“…Most state-of-the-art emotive robots do not run in real-time and are therefore still inapplicable to real cases of human-robot interaction. However, recent research such as [20] and [15] presents results for two Bayesian classifiers within a human-robot-interaction framework that are applicable to HRI in real-time. Both classifiers aim to classify the human emotional state within the set {anger, fear, sad, neutral, happy}.…”
Section: Emotive Robots (mentioning)
confidence: 99%
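
The quoted statements describe Bayesian classifiers over the emotional-state set {anger, fear, sad, neutral, happy}. As a rough illustration only (not the authors' actual model), the sketch below applies Bayes' rule to a uniform prior over those five classes and hypothetical auditory-feature likelihoods; the numeric values and the `posterior` helper are assumptions.

```python
# Minimal sketch of a Bayesian emotional-state update, assuming hypothetical
# auditory-feature likelihoods. Only the five class names come from the cited
# statements; everything else is illustrative.
import numpy as np

EMOTIONS = ["anger", "fear", "sad", "neutral", "happy"]

def posterior(prior, likelihoods):
    """Bayes' rule: P(emotion | features) is proportional to
    P(features | emotion) * P(emotion), normalized over all emotions."""
    unnorm = prior * likelihoods
    return unnorm / unnorm.sum()

# Uniform prior over the five emotional states.
prior = np.full(len(EMOTIONS), 1.0 / len(EMOTIONS))

# Hypothetical likelihoods P(observed audio features | emotion), e.g. derived
# from pitch and energy statistics of the vocalization.
likelihoods = np.array([0.05, 0.10, 0.15, 0.30, 0.40])

post = posterior(prior, likelihoods)
print(dict(zip(EMOTIONS, post.round(3))))  # most probable class: "happy"
```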