2019 IEEE 10th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON)
DOI: 10.1109/uemcon47517.2019.8993051
Facial Expression Recognition Using DCNN and Development of an iOS App for Children with ASD to Enhance Communication Abilities

Cited by 6 publications (4 citation statements) | References 13 publications
“…Ul Haque and Valles [32] built a Deep Convolutional Neural Network (DCNN) model based on the VGG-16 architecture. Compared with earlier work, model parameters such as the learning rate, batch size, number of epochs, and dropout rate were modified in order to improve accuracy and performance.…”
Section: Literature Review (unclassified)
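The statement above only names which hyperparameters were tuned. As a minimal sketch (not the authors' code), a VGG-16-based classifier with those values exposed as variables might look like the following; every numeric value is an assumed placeholder.

```python
# Minimal sketch, assuming Keras/TensorFlow and a 7-class FER-style task.
# None of the numeric values below come from the cited paper; they stand in
# for the hyperparameters the statement says were tuned.
import tensorflow as tf
from tensorflow.keras import layers, models

LEARNING_RATE = 1e-4   # assumed placeholder
BATCH_SIZE = 64        # assumed placeholder
EPOCHS = 50            # assumed placeholder
DROPOUT_RATE = 0.5     # assumed placeholder
NUM_CLASSES = 7        # anger, disgust, fear, happy, sad, surprise, neutral

def build_model(input_shape=(48, 48, 3)):
    # VGG-16 convolutional base (trained from scratch) followed by a small
    # dense head with dropout, matching the "VGG-16 based DCNN" description.
    base = tf.keras.applications.VGG16(
        include_top=False, weights=None, input_shape=input_shape)
    model = models.Sequential([
        base,
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(DROPOUT_RATE),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(LEARNING_RATE),
        loss="categorical_crossentropy",
        metrics=["accuracy"])
    return model

# Training would then use the remaining tuned values, e.g.:
# model.fit(x_train, y_train, batch_size=BATCH_SIZE, epochs=EPOCHS)
```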
“…The approach of Ul Haque and Valles [32] focuses on the problem of illumination variation, running tests with images under different lighting conditions. Wu et al. [16] aim for good results by concentrating on the problem of varying head poses and by identifying spatial landmark points.…”
Section: Discussion of the Literature Review (unclassified)
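A hedged sketch of the kind of illumination test described above: evaluate one trained model on brightness-shifted copies of the same test set. The shift factors, and the assumption of a compiled Keras model with accuracy as its metric, are illustrative rather than taken from [32].

```python
# Sketch only: accuracy of a compiled Keras model under global brightness shifts.
import numpy as np

def evaluate_under_illumination(model, x_test, y_test,
                                factors=(0.5, 0.75, 1.0, 1.25, 1.5)):
    """Return {brightness factor: accuracy} for brightness-scaled test images."""
    results = {}
    for f in factors:
        # Scale pixel intensities and clip back into the valid [0, 1] range.
        x_shifted = np.clip(x_test * f, 0.0, 1.0)
        _, acc = model.evaluate(x_shifted, y_test, verbose=0)
        results[f] = acc
    return results
```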
“…Use of BCI systems provides insight into the user's inner emotional state. Valles et al. [72] conducted research focused on mobile software design to provide assistance to children with ASD. They aimed to design a smart iOS app based on facial images, according to Figure 11.…”
Section: A. Mobile and Software Applications (mentioning, confidence: 99%)
“…In this way, people's faces at different angles and brightness levels are first photographed and then converted into various emoji so that the autistic child can express his or her feelings and emotions. In this group's investigation [72], Kaggle's FER2013 (Facial Expression Recognition 2013) and KDEF (Karolinska Directed Emotional Faces) databases were used to train the VGG-16. In addition, the LEAP system at the University of Texas was adapted to train the model.…”
Section: A. Mobile and Software Applications (mentioning, confidence: 99%)
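For illustration only, a sketch of how the two datasets named above could be assembled into 48x48 grayscale training arrays. The FER2013 column names match Kaggle's CSV release, but the KDEF folder layout, the label_map argument, and the helper names are assumptions, not the cited authors' pipeline.

```python
# Sketch: load FER2013 from its Kaggle CSV and KDEF from an assumed
# per-emotion folder layout, normalised to 48x48 grayscale in [0, 1].
from pathlib import Path

import numpy as np
import pandas as pd
from PIL import Image

def load_fer2013(csv_path="fer2013.csv"):
    # Each FER2013 row stores 48x48 grayscale pixels as a space-separated string.
    df = pd.read_csv(csv_path)
    x = np.stack([
        np.array(p.split(), dtype="uint8").reshape(48, 48)
        for p in df["pixels"]
    ]).astype("float32") / 255.0
    y = df["emotion"].to_numpy()
    return x, y

def load_kdef(root="KDEF", label_map=None):
    # Assumed layout: one subfolder per emotion; label_map maps folder name
    # to the FER2013 label index so both sets share one label space.
    label_map = label_map or {}
    xs, ys = [], []
    for img_path in Path(root).rglob("*.JPG"):
        label = label_map.get(img_path.parent.name)
        if label is None:
            continue
        img = Image.open(img_path).convert("L").resize((48, 48))
        xs.append(np.asarray(img, dtype="float32") / 255.0)
        ys.append(label)
    return np.stack(xs), np.array(ys)
```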