Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.245
Neural Networks approaches focused on French Spoken Language Understanding: application to the MEDIA Evaluation Task

Abstract: In this paper, we present a study on a French Spoken Language Understanding (SLU) task: the MEDIA task. Many works and studies have been proposed for a variety of tasks, but most of them focus on the English language and English tasks. Exploring a richer language like French within the framework of an SLU task calls for recent approaches to handle this difficulty. Since the MEDIA task appears to be one of the most difficult, according to several previous studies, we propose to explore Neural Networks approaches focus…

Cited by 4 publications (4 citation statements)
References 15 publications
“…Based on the GESE oral test method, a formative hierarchical evaluation model of oral English teaching in higher vocational colleges is constructed, and its positive "backwash effect" law is obtained. Many of the research tasks proposed in [1] focus on English, and it is difficult to explore a richer variety of languages within the framework of the SLU task. Minor languages need to be updated in the training process of the SLU model.…”
Section: Introduction (mentioning)
confidence: 99%
“…This system is based on fine-tuning the French CamemBERT [26] model on the manual transcriptions of the MEDIA corpus. It achieved a state-of-the-art result on the manual transcriptions of the MEDIA corpus [19], yielding a CER of 7.56 when there are no errors in the transcription.…”
Section: Cascade Approach With Pre-trained Models (mentioning)
confidence: 97%
“…For the NLU module, we propose to use the one that achieved the state-of-the-art result on the manual transcriptions of the MEDIA corpus [19]. This system is based on fine-tuning BERT [16] on the MEDIA SLU task using the French CamemBERT [26] model.…”
Section: BERT and CamemBERT Models (mentioning)
confidence: 99%
“…They highlighted the competitiveness of Word2Vec and ELMo on the French SLU corpus MEDIA. More recently, [12] investigated the transferability of two French pre-trained BERT models [2] and their integration into BiLSTM- and BiLSTM+CNN-based architectures. They obtained state-of-the-art results on MEDIA using the CamemBERT [13] model.…”
Section: Introduction (mentioning)
confidence: 99%
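
The citation statements above all refer to the same kind of NLU module: CamemBERT fine-tuned as a token-level concept (slot) tagger on the MEDIA manual transcriptions. As a rough, hypothetical sketch of that setup, and not the cited authors' actual code or configuration, the Python snippet below loads camembert-base with a token-classification head via Hugging Face Transformers and tags one illustrative utterance; the BIO label subset, the example utterance, and the randomly initialised classification head are all assumptions for illustration, and the reported state-of-the-art CER only results after fine-tuning on the annotated MEDIA data.

```python
# Hypothetical sketch: CamemBERT with a token-classification head used as a
# MEDIA-style slot tagger. Requires `transformers`, `torch`, and sentencepiece.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Hypothetical BIO-style concept labels; the real MEDIA tag set is far larger.
labels = ["O", "B-command", "I-command", "B-localisation-ville", "I-localisation-ville"]

tokenizer = AutoTokenizer.from_pretrained("camembert-base")
model = AutoModelForTokenClassification.from_pretrained(
    "camembert-base",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
model.eval()

# Illustrative hotel-reservation utterance (not taken from the MEDIA corpus).
utterance = "je voudrais réserver une chambre double à Paris"
inputs = tokenizer(utterance, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, num_labels)
predicted_ids = logits.argmax(dim=-1)[0].tolist()

# Print each subword token with its predicted concept tag. Before fine-tuning
# on the MEDIA transcriptions, the head is untrained, so these tags are
# essentially random; they only show the input/output structure of the tagger.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, predicted_ids):
    print(f"{token}\t{labels[label_id]}")
```

In a cascade SLU system of the kind described above, such a tagger would be applied to ASR output (or manual transcriptions, for the error-free condition) and its predictions scored with the Concept Error Rate.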