2021 Innovations in Intelligent Systems and Applications Conference (ASYU)
DOI: 10.1109/asyu52992.2021.9599055

Leveraging the Information in In-Domain Datasets for Transformer-Based Intent Detection

Cited by 2 publications (3 citation statements)
References 6 publications
“…Our experimental setup and results are provided in this section. In the experiments, we use BERTurk from https://huggingface.co/dbmdz/bert-base-turkish-cased as the baseline pre-trained transformer model, since it provided the best accuracies in our previous study (Büyük et al., 2021). The model has 12 transformer layers with 12 attention heads.…”
Section: Experimental Setup and Results
confidence: 99%
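As a minimal sketch (not code from the cited papers), the BERTurk checkpoint referenced above could be loaded for intent classification with the Hugging Face transformers library roughly as follows; the number of intent labels and the example utterance are placeholders, since the actual datasets are not described here.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "dbmdz/bert-base-turkish-cased"  # BERTurk: 12 transformer layers, 12 attention heads
tokenizer = AutoTokenizer.from_pretrained(model_name)

# num_labels is a placeholder; the real number of intent classes depends on the dataset used
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=7)

# Encode a sample Turkish utterance and predict its intent class
inputs = tokenizer("hesap bakiyemi öğrenmek istiyorum", return_tensors="pt")
logits = model(**inputs).logits            # shape: (1, num_labels)
predicted_intent = logits.argmax(dim=-1)   # index of the most likely intent class
```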
“…In (Dündar et al., 2020), it is concluded that contextual word embeddings from transformers improve intent detection accuracy compared to classical machine learning models. In our previous study, we implemented an intent detection system using pre-trained transformer models (Büyük et al., 2021). In that study, we used three different intent detection datasets.…”
Section: Introduction
confidence: 99%
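To illustrate the comparison this statement refers to, the following hedged sketch contrasts a classical bag-of-words intent classifier with one built on contextual embeddings from a pre-trained transformer; the training texts and labels are illustrative placeholders, not data from the cited studies.

```python
import torch
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from transformers import AutoTokenizer, AutoModel

train_texts = ["hesap bakiyem nedir", "kartımı kaybettim"]   # placeholder utterances
train_labels = [0, 1]                                        # placeholder intent ids

# Classical baseline: sparse TF-IDF features with a linear classifier.
classic_clf = LogisticRegression().fit(
    TfidfVectorizer().fit_transform(train_texts), train_labels)

# Transformer features: mean-pooled contextual token embeddings from BERTurk.
tok = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-cased")
enc = AutoModel.from_pretrained("dbmdz/bert-base-turkish-cased")
with torch.no_grad():
    batch = tok(train_texts, return_tensors="pt", padding=True)
    embeddings = enc(**batch).last_hidden_state.mean(dim=1)  # (n_texts, hidden_size)
contextual_clf = LogisticRegression().fit(embeddings.numpy(), train_labels)
```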