2021
DOI: 10.3390/electronics10121412
Monolingual and Cross-Lingual Intent Detection without Training Data in Target Languages

Abstract: Due to recent DNN advancements, many NLP problems can be effectively solved using transformer-based models and supervised data. Unfortunately, such data is not available in some languages. This research is based on the assumptions that (1) training data can be obtained by machine-translating it from another language; (2) there are cross-lingual solutions that work without training data in the target language. Consequently, in this research, we use the English dataset and solve the intent detection problem …
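The cross-lingual setup the abstract describes — train an intent classifier on English data only, then classify target-language utterances mapped into the same multilingual embedding space — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `embed()` function is a hypothetical stand-in for a multilingual sentence encoder (e.g. multilingual BERT embeddings), returning hand-made toy vectors so the example runs without any model download, and the intent labels and utterances are invented.

```python
import math

# Toy stand-in for a multilingual sentence encoder: in a real system a
# multilingual model would map paraphrases across languages to nearby
# points in a shared vector space. These vectors are hand-crafted.
TOY_VECTORS = {
    # English training utterances
    "book a flight": [0.9, 0.1, 0.0],
    "reserve a plane ticket": [0.8, 0.2, 0.1],
    "what is the weather": [0.1, 0.9, 0.0],
    "will it rain today": [0.0, 0.8, 0.2],
    # A target-language utterance (Lithuanian, roughly "book a flight"),
    # placed near its English paraphrases as a real encoder would do.
    "rezervuok skrydi": [0.85, 0.15, 0.05],
}

def embed(text):
    """Hypothetical multilingual sentence embedding."""
    return TOY_VECTORS[text]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# "Training" on English only: one centroid embedding per intent.
TRAIN = {
    "book_flight": ["book a flight", "reserve a plane ticket"],
    "get_weather": ["what is the weather", "will it rain today"],
}

def centroid(vectors):
    return [sum(col) / len(col) for col in zip(*vectors)]

CENTROIDS = {intent: centroid([embed(t) for t in texts])
             for intent, texts in TRAIN.items()}

def detect_intent(utterance):
    """Assign the intent whose English centroid is nearest in the shared space."""
    vec = embed(utterance)
    return max(CENTROIDS, key=lambda i: cosine(vec, CENTROIDS[i]))

# A target-language utterance is classified without any target-language
# training data, purely via the shared embedding space.
print(detect_intent("rezervuok skrydi"))  # book_flight
```

The nearest-centroid classifier here is only one simple choice; the point is that all supervision stays in English while inference works in any language the encoder covers.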

Cited by 5 publications (4 citation statements)
References 30 publications
“…Such an approach allows for leveraging high-quality, abundant resources available in English to benefit less-resourced languages. However, it is important to acknowledge that crosslingual models often experience a drop in accuracy compared to their monolingual counterparts, a phenomenon well documented in the literature [20]. This accuracy dip is partly due to the intrinsic variability and complexity of language translation and contextual usage across languages.…”
Section: Related Work
confidence: 96%
“…This study presents the application of word embedding methods to effectively use rich information in the abstract and title of project proposals in Turkish. Neural network-based word embedding methods (e.g., FastText, BERT) are used in numerous NLP tasks successfully (Romanov and Khusainova, 2019; Kapočiūtė-Dzikienė et al ., 2021; Kalyan et al ., 2021). However, a project proposal grouping research based on such next-generation representation approaches has not yet been reported to the best of our knowledge.…”
Section: Literature Review
confidence: 99%
“…Neural network technologies for word embedding have recently shown remarkable results in NLP tasks (Romanov and Khusainova, 2019; Kapočiūtė-Dzikienė et al ., 2021; Kalyan et al ., 2021). However, a project proposal grouping study based on high-performance neural network-based textual feature extraction techniques has not yet been reported to the best of our knowledge.…”
Section: Introduction
confidence: 99%
“…Aiming at the problems of high computational cost and poor generalization ability of traditional intention recognition models, the BERT-FNN intention recognition model was proposed by Zheng et al [10]. Two BERT-based vectorization types, word embedding and sentence embedding, were used by Jurgita et al [11] to solve the problem of cross-domain intent recognition without training data. The performance of the model is improved to a great extent through the above channel fusion technique.…”
Section: Introduction
confidence: 99%