2019 IEEE International Conference on Consumer Electronics (ICCE)
DOI: 10.1109/icce.2019.8662020

MoBeTrack: A Toolkit to Analyze User Experience of Mobile Apps in the Wild

Cited by 10 publications (7 citation statements). References 8 publications.
“…To the best of our knowledge, no studies so far have proposed the integrated use of gaze and emotion recognition systems based on deep learning algorithms to support the collection of information relevant to UX assessment of laptop and desktop web applications, nor to power recommender systems. Only [11] proposed a system to support UX assessment based on deep learning, but that solution works only for mobile applications. Only [7] proposed processing data from a common PC webcam to enable users' gaze and emotion detection in order to drive an intelligent e-commerce recommendation system, using an SVM algorithm for emotion recognition and a gradient-based method for gaze tracking.…”
Section: Research Background (mentioning)
confidence: 99%
“…For the proposed toolkit, a centralized architecture has been designed. This architecture has been inherited from the system described in [11] and adapted to work in synergy with web platforms instead of mobile applications. Figure 1 shows the main components: the Web Plugin and the Deep Learning Platform (DLP), respectively the front-end and the back-end sides.…”
Section: System Architecture (mentioning)
confidence: 99%
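The centralized split described in that citation, a lightweight front-end plugin shipping captured frames to a deep-learning back-end, can be sketched roughly as follows. This is a minimal sketch under stated assumptions: the endpoint path, field names, and JSON/base64 encoding are illustrative inventions, not the toolkit's documented API.

```python
import base64
import json

# Hypothetical DLP endpoint path; the real back-end's API is not documented here.
DLP_ENDPOINT = "/api/v1/frames"

def build_frame_payload(session_id: str, frame_bytes: bytes, timestamp_ms: int) -> str:
    """Package one captured frame the way a front-end Web Plugin might
    send it to a centralized Deep Learning Platform (DLP) back-end."""
    return json.dumps({
        "session": session_id,
        "ts": timestamp_ms,
        # Frames are binary, so base64-encode them to fit a JSON body.
        "frame": base64.b64encode(frame_bytes).decode("ascii"),
    })

payload = build_frame_payload("sess-42", b"\xff\xd8fake-jpeg", 1700000000000)
record = json.loads(payload)
```

Keeping all inference on the back-end, as the centralized design implies, means the plugin only needs capture-and-ship logic while models can be updated server-side.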
“…Most of these systems process data using AI, usually with CNNs [68,69], to predict the primary emotions of Ekman and Friesen's framework (as shown in Section 2.1). In fact, most of the currently available facial-expression datasets are based on this emotion framework (e.g., EmotioNet [70]).…”
Section: Emotion Recognition Technologies In The Vehicle (mentioning)
confidence: 99%
“…For this reason, in recent years this research area has started to focus on non-intrusive devices for automatically recognizing human emotions, particularly through speech and facial-coding analysis. Most facial-expression recognition systems today use Deep Neural Networks (especially Convolutional Neural Networks), like the one presented by Generosi et al. (2018; 2019), which take pictures of human faces as input and predict the corresponding Ekman primary emotions (i.e., happiness, surprise, sadness, anger, disgust, and fear) (Ekman and Friesen, 1978), as do most state-of-the-art systems of this kind (Li et al., 2018). The literature proposes different emotion-aware car systems, some using voice analysis (Jones and Jonsson, 2007), others wearable devices (Nasoz et al., 2010; Katsis et al., 2008).…”
Section: Introduction (mentioning)
confidence: 99%
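The pipeline these citations describe, a face image in and a distribution over Ekman's six primary emotions out, can be sketched in miniature. The tiny random linear "model" below is a labeled stand-in for a trained CNN, kept framework-free so the shape of the pipeline is the point, not the weights; the 48x48 crop size is an assumption, not taken from the cited systems.

```python
import numpy as np

# Ekman and Friesen's six primary emotions, the label set most
# facial-expression datasets (e.g., EmotioNet) are built around.
EKMAN_EMOTIONS = ["happiness", "surprise", "sadness", "anger", "disgust", "fear"]

rng = np.random.default_rng(0)
# Stand-in for a trained CNN: a random linear map from a flattened
# 48x48 grayscale face crop to six class scores.
W = rng.standard_normal((48 * 48, len(EKMAN_EMOTIONS)))

def predict_emotion(face_crop: np.ndarray):
    """Return the top emotion label and the softmax distribution."""
    scores = face_crop.reshape(-1) @ W
    probs = np.exp(scores - scores.max())   # stable softmax
    probs /= probs.sum()
    return EKMAN_EMOTIONS[int(probs.argmax())], probs

face = rng.random((48, 48))                 # fake grayscale face crop
label, probs = predict_emotion(face)
```

A real system would replace `W` with a CNN trained on a facial-expression dataset; the surrounding capture-classify-report loop stays the same.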