Affective Computing 2008
DOI: 10.5772/6171
Towards Affect-sensitive Assistive Intervention Technologies for Children with Autism

Cited by 8 publications (7 citation statements)
References 56 publications (50 reference statements)
“…Basically, human feelings are translated for computers, which can then understand and express them. The identification of the six basic emotions can be used for developing assistive robots, such as those that detect and process the affective states of children with autism spectrum disorder [65], intelligent tutoring systems that use automatic emotion recognition to improve learning efficiency and adapt learning contents and interfaces in order to engage students [66], virtual reality games or immersive virtual environments that act as real therapists in anxiety disorder treatment [9,10], recommender systems that know the users' mood and adapt the recommended items accordingly [67,68], public sentiment analysis about different events, economic, or political decisions [69], and assistive technology [70][71][72].…”
Section: Discussion (mentioning)
confidence: 99%
“…HRI studies have also been conducted with robots using dimensional models for affect classification based on facial expressions [76][77][78], body language [5,79,80], voice [81,82], physiological signals [4,22,70,[83][84][85][86][87], and multi-modal systems [88][89][90]. The most common model used in HRI is the two-dimensional circumplex (valence-arousal) model.…”
Section: Affect Models Used in HRI (mentioning)
confidence: 99%
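To make the two-dimensional circumplex (valence-arousal) model cited above concrete, here is a minimal sketch of mapping a continuous valence-arousal estimate onto coarse affect quadrants. The thresholds, labels, and the `AffectEstimate` container are illustrative assumptions, not values or names taken from the cited studies.

```python
# Sketch: mapping a valence-arousal point to a circumplex quadrant.
# Thresholds and quadrant labels are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class AffectEstimate:
    valence: float  # -1.0 (negative) .. +1.0 (positive)
    arousal: float  # -1.0 (calm)     .. +1.0 (excited)

def circumplex_quadrant(est: AffectEstimate, neutral_band: float = 0.1) -> str:
    """Map a valence-arousal estimate to one of four quadrants or 'neutral'."""
    if abs(est.valence) < neutral_band and abs(est.arousal) < neutral_band:
        return "neutral"
    if est.valence >= 0 and est.arousal >= 0:
        return "high-arousal positive"   # e.g. excited, happy
    if est.valence < 0 and est.arousal >= 0:
        return "high-arousal negative"   # e.g. anxious, angry
    if est.valence < 0 and est.arousal < 0:
        return "low-arousal negative"    # e.g. bored, sad
    return "low-arousal positive"        # e.g. relaxed, content

print(circumplex_quadrant(AffectEstimate(valence=-0.4, arousal=0.7)))
# -> "high-arousal negative"
```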
“…Automated affect detection would promote effective engagement in interactions aimed at improving a person's health and wellbeing, e.g. interventions for children with autism [22] and for the elderly [23]. Mimicry HRI consists of a robot or person imitating the verbal and/or nonverbal behaviors of the other [24].…”
Section: Introduction (mentioning)
confidence: 99%
“…Comparative studies were performed to evaluate a range of machine learning algorithms for the recognition of three emotions, i.e., liking, engagement, and anxiety (Liu et al. 2005). Lessons learned from human-robot interaction with a typical population, together with the feature extraction and machine learning methods developed, were later applied to therapy for individuals with ASD (Liu et al. 2007, 2008a; Conn et al. 2010).…”
Section: Adaptive HRI and Psychophysiology (mentioning)
confidence: 99%
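The statement above describes a comparative evaluation of machine learning algorithms for recognizing liking, engagement, and anxiety from physiological data. The sketch below shows what such a comparison could look like in scikit-learn; the synthetic data, feature dimensionality, and classifier choices are placeholder assumptions, not the features or models used by Liu et al.

```python
# Sketch: cross-validated comparison of classifiers on physiological
# feature vectors labelled with one of three affective states.
# Data and features are synthetic placeholders for illustration.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Hypothetical features, e.g. heart-rate, skin-conductance, EMG statistics.
X = rng.normal(size=(120, 6))
y = rng.integers(0, 3, size=120)   # 0 = liking, 1 = engagement, 2 = anxiety

classifiers = {
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "SVM (RBF)": SVC(kernel="rbf", C=1.0),
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name:15s} mean accuracy = {scores.mean():.2f}")
```

On real physiological recordings, the same loop would be run on extracted feature vectors per session or time window, with subject-wise cross-validation to avoid leakage between training and test data.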