Proceedings of the 2015 ACM on International Conference on Multimodal Interaction
DOI: 10.1145/2818346.2830599
Social Touch Gesture Recognition using Random Forest and Boosting on Distinct Feature Sets

Abstract: Touch is a primary nonverbal communication channel used to convey emotions and other social messages. A variety of social touches exist, including hugging, rubbing, and punching. Despite its importance, this channel remains little explored in affective computing, where far more attention has been paid to the visual and aural channels. In this paper, we investigate the possibility of automatically discriminating between different social touch types. We propose five distinct feature sets for describing to…
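As a rough illustration of the classification setup named in the title (a random forest and a boosted ensemble trained on precomputed feature vectors), here is a minimal Python sketch using scikit-learn. The feature matrix, the 14-class gesture label space, and all parameter values are placeholders, and AdaBoost merely stands in for the paper's boosting method (the citing work below mentions MultiBoost):

```python
# Minimal sketch: random forest and boosting over per-gesture feature vectors.
# All data here is synthetic; shapes and labels are assumptions, not the
# paper's actual feature sets.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 54))        # one feature vector per recorded touch gesture
y = rng.integers(0, 14, 200)     # placeholder labels for 14 gesture classes

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
boost = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X, y)
print(rf.score(X, y), boost.score(X, y))  # training accuracy only
```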

Cited by 26 publications (36 citation statements). References 41 publications.
Citation types: 4 supporting, 32 mentioning, 0 contrasting.
“…This indicated that the additional features and the use of more complex classification methods with hyperparameter optimization improved the accuracy. Results reported in this paper fall within the 26-61% accuracy range reported for a data challenge using the CoST data set [3,12,18,36]. Our results are comparable to those of Gaus et al. and Ta et al., who reported accuracies of up to 59% and 61%, respectively, using random forest [12,36].…”
Section: Classification Results and Touch Gesture Confusion (supporting, confidence: 90%)
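The "hyperparameter optimization" this statement credits could, for instance, be a grid search over classifier settings. A minimal sketch, with a hypothetical parameter grid and synthetic data:

```python
# Hypothetical hyperparameter search for a random forest; the grid, data,
# and fold count are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X, y = rng.random((400, 54)), rng.integers(0, 14, 400)  # synthetic data

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10, 20]},
    cv=5,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```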
“…Results reported in this paper fall within the 26-61% accuracy range reported for a data challenge using the CoST data set [3,12,18,36]. Our results are comparable to those of Gaus et al. and Ta et al., who reported accuracies of up to 59% and 61%, respectively, using random forest [12,36]. However, it should be noted that the data challenge contained a subset of CoST (i.e., the gentle and normal variants) and that the train/test data division differed from the leave-one-subject-out cross-validation results reported in this paper [24].…”
Section: Classification Results and Touch Gesture Confusion (supporting, confidence: 75%)
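The leave-one-subject-out protocol mentioned above maps directly onto scikit-learn's LeaveOneGroupOut splitter, where each fold holds out every sample from one subject. A sketch with synthetic data (the sample, feature, and subject counts are assumptions):

```python
# Leave-one-subject-out cross-validation: train on all subjects but one,
# test on the held-out subject, and repeat for every subject.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.random((310, 54))            # placeholder feature vectors
y = rng.integers(0, 14, 310)         # placeholder gesture labels
subjects = rng.integers(0, 31, 310)  # placeholder subject ID per sample

scores = cross_val_score(RandomForestClassifier(random_state=0),
                         X, y, groups=subjects, cv=LeaveOneGroupOut())
print(f"mean accuracy over {len(scores)} held-out subjects: {scores.mean():.3f}")
```

Unlike a fixed train/test split, no subject's gestures ever appear on both sides of a fold, which is why the two evaluation protocols are not directly comparable.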
“…Features were derived from the mean over time and over rows or columns of the above values [41,42]. The mean absolute pressure difference for all channels was also calculated.…”
Section: Feature Extraction (mentioning, confidence: 99%)
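One plausible reading of the features described above, assuming each gesture is recorded as a sequence of 8x8 pressure frames (the grid size is an assumption borrowed from the CoST sensor):

```python
# Sketch of the described features: means over time and over sensor rows or
# columns, plus the mean absolute frame-to-frame pressure difference.
import numpy as np

def extract_features(frames: np.ndarray) -> np.ndarray:
    """frames: (T, 8, 8) array of pressure values over time."""
    mean_over_time = frames.mean(axis=0)     # 8x8 average pressure image
    mean_rows = mean_over_time.mean(axis=1)  # one value per sensor row
    mean_cols = mean_over_time.mean(axis=0)  # one value per sensor column
    # mean absolute pressure difference between consecutive frames, all channels
    mean_abs_diff = np.abs(np.diff(frames, axis=0)).mean()
    return np.concatenate([mean_rows, mean_cols, [mean_abs_diff]])

gesture = np.random.default_rng(0).random((100, 8, 8))  # synthetic gesture
print(extract_features(gesture).shape)                   # (17,)
```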
“…Ta et al [108] random forest a 61.3% Ta et al [108] random forest b 60.8% Ta et al [108] SVM b 60.5% Ta et al [108] SVM a 59.9% Gaus et al [41] random forest 58.7% Gaus et al [41] multiboost 58.2% Hughes et al [53] logistic regression 47.2%…”
Section: Papermentioning
confidence: 99%