ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp40776.2020.9053899
Auxiliary Capsules for Natural Language Understanding

Cited by 6 publications (5 citation statements)
References 16 publications
“…The method of using SLU as fine-tuning with pre-training on another task, or vice versa, has shown improvements in SLU performance. However, the results of [Staliūnaitė and Iacobacci 2020], echoing those of [Louvan and Magnini 2019] on slot tagging, indicate that a parsimonious approach to adding extra simultaneous tasks more often yields better results.…”
Section: Target Variations (mentioning)
confidence: 93%
“…This explicit and direct feedback is stronger than the implicit or indirect joint learning typically found in RNN models. [Staliūnaitė and Iacobacci 2020] extended this work to a multi-task setting with extra mid-level capsules for NER and POS labels, with mixed results. [Wen et al 2018]: using hierarchy and context; two-layer (Bi)LSTM. [Wang et al 2018c]: capturing local semantic information; CNN, BiLSTM encoder-decoder. [Firdaus et al 2018a]: domain dependence; ensemble model, GRU; slow training time. Progressive multi-task model using user information [Li et al 2018a]: correlation of different tasks; multi-task model incl.…”
Section: Hierarchical Models (mentioning)
confidence: 99%
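The capsule models discussed in this statement route information from lower-level capsules (e.g. word-level or mid-level NER/POS capsules) to higher-level capsules by agreement. A minimal NumPy sketch of the generic squash non-linearity and routing-by-agreement loop follows; the shapes, names, and iteration count are illustrative assumptions, not the exact architecture of the cited paper:

```python
import numpy as np

def squash(s, eps=1e-8):
    """Capsule squash non-linearity: shrinks short vectors toward 0 and
    long vectors toward unit length, preserving direction."""
    norm2 = np.sum(s * s, axis=-1, keepdims=True)
    return (norm2 / (1.0 + norm2)) * s / np.sqrt(norm2 + eps)

def dynamic_routing(u_hat, iterations=3):
    """Routing-by-agreement between lower and upper capsules.
    u_hat: (n_lower, n_upper, dim) prediction vectors."""
    n_lower, n_upper, _ = u_hat.shape
    b = np.zeros((n_lower, n_upper))                           # routing logits
    for _ in range(iterations):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)   # softmax over upper capsules
        s = (c[..., None] * u_hat).sum(axis=0)                 # weighted sum -> (n_upper, dim)
        v = squash(s)                                          # upper capsule outputs
        b = b + (u_hat * v[None]).sum(axis=-1)                 # agreement update
    return v
```

The agreement term rewards lower capsules whose predictions align with the current upper-capsule output, which is the "explicit and direct feedback" the quoted survey contrasts with implicit joint learning in RNNs.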
“…Furthermore, many previous research articles have been published discussing implementations of ID and IC with slot filling. Papers [1,8,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37] have presented approaches to developing ID and IC with slot filling using various frameworks, methodologies, and techniques. For instance, [19,22,24,26,27,28,31,34,35,36] have implemented Bidirectional Long Short-Term Memory (BiLSTM) with a Conditional Random Field (CRF) in their implementations of ID and IC with slot filling.…”
Section: Introduction (mentioning)
confidence: 99%
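In the BiLSTM-CRF taggers this statement refers to, the BiLSTM produces per-token tag scores (emissions) and the CRF contributes a learned tag-transition matrix; the best slot sequence is then recovered with Viterbi decoding. A minimal NumPy sketch of that decoding step follows; the function name and toy scores are illustrative assumptions, not taken from any of the cited papers:

```python
import numpy as np

def viterbi(emissions, transitions):
    """Return the highest-scoring tag sequence.
    emissions: (T, K) per-token tag scores (e.g. from a BiLSTM head).
    transitions: (K, K) score of moving from tag i to tag j."""
    T, K = emissions.shape
    score = emissions[0].copy()            # best score ending in each tag
    back = np.zeros((T, K), dtype=int)     # backpointers
    for t in range(1, T):
        # cand[i, j] = best path ending in tag i, then stepping to tag j
        cand = score[:, None] + transitions + emissions[t][None, :]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    best = [int(score.argmax())]
    for t in range(T - 1, 0, -1):          # follow backpointers
        best.append(int(back[t, best[-1]]))
    return best[::-1]
```

The transition matrix is what distinguishes this from independent per-token argmax: with strong "stay on the same tag" transitions, the decoder can override weak emission scores, which is why CRF layers help enforce valid slot-label sequences.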