2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC)
DOI: 10.1109/icsmc.2012.6378084
Uses of facial expressions of android head system according to gender and age

Cited by 11 publications (3 citation statements)
References 13 publications
“…We exploited a public dataset from a tweet-sentiment-extraction competition 8. The organiser provided 27k training samples composed of raw tweet sentence, selected text, and sentiment, as shown in Table 4.…”
Section: Exploratory Data Analysis (EDA) (mentioning)
Confidence: 99%
“…Both the television and the tablet were used for presenting math quizzes. The robot platform is a real human-sized female-type android robot called EveR-4, which has 30 degrees of freedom in the face for expressing natural emotions [1]. The robot interacted with verbal and non-verbal modalities.…”
Section: Interaction and System Design (mentioning)
Confidence: 99%
“…This is because text often contains limited information to describe emotions sufficiently. It also differs across gender, age, and cultural backgrounds [8]. Furthermore, sarcasm, emojis, and rapidly emerging new words restrict sentiment analysis from text data.…”
Section: Introduction (mentioning)
Confidence: 99%