2018 IEEE 4th International Conference on Computer and Communications (ICCC)
DOI: 10.1109/compcomm.2018.8780705
Multi-dimensional Speaker Information Recognition with Multi-task Neural Network

Cited by 3 publications (4 citation statements)
References 9 publications
“…However, on the gender task, the results are close between single-task learning and multi-task learning. The work in [2] used the KSUEmotion dataset for emotion and gender recognition; the reported accuracy was 79.3% for emotion recognition and 98.7% for gender detection. The best experiment of the proposed model outperformed these results, as shown in Table 4.…”
Section: Data-Specific Results
confidence: 99%
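The statement above compares single-task and multi-task results on emotion and gender recognition. For readers unfamiliar with the setup, the sketch below shows one common way to wire a multi-task network with a shared encoder and per-task output heads; the layer sizes, class counts, and loss weighting are illustrative assumptions, not the architecture from the cited work.

```python
import torch
import torch.nn as nn

class MultiTaskSpeakerNet(nn.Module):
    """Shared acoustic encoder with separate emotion and gender heads (illustrative)."""
    def __init__(self, feat_dim=39, hidden_dim=256, n_emotions=6, n_genders=2):
        super().__init__()
        # Shared layers learn a common representation of the utterance-level features.
        self.shared = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Task-specific heads branch off the shared representation.
        self.emotion_head = nn.Linear(hidden_dim, n_emotions)
        self.gender_head = nn.Linear(hidden_dim, n_genders)

    def forward(self, x):
        h = self.shared(x)
        return self.emotion_head(h), self.gender_head(h)

# Joint training step: weighted sum of the two task losses (weights are assumed).
model = MultiTaskSpeakerNet()
criterion = nn.CrossEntropyLoss()
x = torch.randn(8, 39)                               # batch of utterance features
emo_y = torch.randint(0, 6, (8,))
gen_y = torch.randint(0, 2, (8,))
emo_logits, gen_logits = model(x)
loss = criterion(emo_logits, emo_y) + 0.5 * criterion(gen_logits, gen_y)
loss.backward()
```

Sharing the encoder lets the gender task act as an auxiliary signal for the harder emotion task, which is the usual motivation for the multi-task setup discussed here.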
“…However, in this paper, the emotions are categorized following [19]'s categorization: anger, surprise, happiness, sadness, questioning, and neutral. Emotion detection from Arabic speech has been explored in previous work, such as [13, 2], where the authors used multiple well-known classification methods and ensembles on raw features such as MFCC and pitch. Gender detection is an easier task compared with emotion and dialect detection in most languages, including Arabic.…”
Section: Arabic Speaker Profiling
confidence: 99%
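The cited prior work is described as applying classical classifiers and ensembles to raw features such as MFCC and pitch. As a minimal sketch of that kind of pipeline (not the exact setup from [13, 2]), the snippet below mean-pools MFCCs, appends an average pitch estimate from librosa, and fits an SVM; the synthetic waveforms, label names, and feature dimensions are assumptions for illustration.

```python
import numpy as np
import librosa
from sklearn.svm import SVC

def utterance_features(y, sr):
    """Mean-pooled MFCCs plus an average pitch estimate for one utterance."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)     # shape: (13, frames)
    f0 = librosa.yin(y, fmin=65, fmax=600, sr=sr)          # frame-level pitch track
    return np.concatenate([mfcc.mean(axis=1), [np.mean(f0)]])

# Hypothetical utterances; a real setup would load corpus files with librosa.load.
rng = np.random.default_rng(0)
utterances = [rng.standard_normal(16000) for _ in range(4)]
labels = ["anger", "neutral", "anger", "neutral"]

X = np.stack([utterance_features(y, sr=16000) for y in utterances])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:1]))
```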