Findings of the Association for Computational Linguistics: ACL 2022
DOI: 10.18653/v1/2022.findings-acl.280

Incremental Intent Detection for Medical Domain with Contrast Replay Networks

Abstract: Conventional approaches to medical intent detection require fixed pre-defined intent categories. However, due to the incessant emergence of new medical intents in the real world, such a requirement is not practical. Considering that it is computationally expensive to store and re-train on the whole dataset every time new data and intents arrive, we propose to incrementally learn newly emerged intents while avoiding catastrophically forgetting old ones. We first formulate incremental learning for medical intent detection…
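The replay idea summarized in the abstract can be illustrated concretely. Below is a minimal, hypothetical Python/PyTorch sketch of class-incremental training with a small exemplar memory and a supervised contrastive term; the toy encoder, the loss weighting, and all names (IntentClassifier, supcon_loss, train_increment, replay_k) are illustrative assumptions, not the paper's actual contrast replay network.

    # Illustrative sketch only: replay-based incremental intent detection
    # with a contrastive term. Names and sizes are assumptions.
    import random
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class IntentClassifier(nn.Module):
        """Toy utterance encoder + intent head (not the paper's model)."""
        def __init__(self, vocab_size=5000, emb_dim=128, num_intents=10):
            super().__init__()
            self.emb = nn.EmbeddingBag(vocab_size, emb_dim)  # mean-pools token embeddings
            self.head = nn.Linear(emb_dim, num_intents)

        def forward(self, token_ids):          # token_ids: (batch, seq_len) int64
            z = self.emb(token_ids)            # (batch, emb_dim) utterance features
            return self.head(z), z             # logits and features

    def supcon_loss(feats, labels, temp=0.1):
        """Supervised contrastive loss: pull same-intent features together
        (one common formulation, assumed here for illustration)."""
        feats = F.normalize(feats, dim=1)
        sim = feats @ feats.t() / temp
        n = feats.size(0)
        diag = torch.eye(n, dtype=torch.bool, device=feats.device)
        sim = sim.masked_fill(diag, float('-inf'))        # drop self-pairs
        log_prob = sim - sim.logsumexp(dim=1, keepdim=True)
        pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~diag
        pos_counts = pos.sum(1).clamp(min=1)
        return -(log_prob.masked_fill(~pos, 0.0).sum(1) / pos_counts).mean()

    def train_increment(model, new_batches, memory, opt, replay_k=8, alpha=0.5):
        """One incremental stage: train on new-intent batches, each mixed
        with a few replayed exemplars of old intents drawn from `memory`,
        a list of (tokens (1, L), label (1,)) pairs with fixed length L."""
        model.train()
        for tokens, labels in new_batches:     # tokens: (b, L), labels: (b,)
            if memory:
                old = random.sample(memory, min(replay_k, len(memory)))
                tokens = torch.cat([tokens] + [t for t, _ in old])
                labels = torch.cat([labels] + [y for _, y in old])
            logits, feats = model(tokens)
            loss = F.cross_entropy(logits, labels) + alpha * supcon_loss(feats, labels)
            opt.zero_grad()
            loss.backward()
            opt.step()

The design point is that each gradient step sees a few stored old-intent utterances alongside the new ones, so decision boundaries for earlier intents are continually rehearsed rather than overwritten.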

Cited by 4 publications (5 citation statements) | References: 20 publications
“…To address this problem, Qin et al. constructed the GL-GIN model based on a graph attention network [16] and used non-autoregressive methods to alleviate the problem of inconsistent slots. Bai et al. [23] proposed a memory-based method that incrementally learns emerging intents, avoiding the high computational cost of storing all data and re-training the whole model whenever new data and intents arrive. Jiang et al. [28] proposed a separation-parsing method, which divides a sentence into multiple clauses each containing a single intent, parses each clause in turn, and finally integrates the parsing results.…”
Section: Related Work (mentioning, confidence: 99%)
“…The first is the problem of data sparsity. Data sources for multi-intent detection are scarce [25], the amount of data is insufficient, and annotating data is very costly [2], [23], making labeled data difficult to obtain. Additionally, some intents or slots occur only infrequently, which leads to poor detection performance on them.…”
Section: Introduction (mentioning, confidence: 99%)
“…In incremental learning, a model learns from new tasks while retaining what it learned on old ones, so that it remains effective on both. Incremental learning has been applied to various NLP tasks, including a few-shot class-incremental learning method for NER [20], a contrast replay network for intent detection [21], an incremental few-shot method for text classification [22], and an incremental meta self-training method for RE [23]. In this paper, incremental learning is employed by fine-tuning the pre-trained model with training samples from different tasks, and experimental results show that our proposed model is robust to various noisy questions.…”
Section: Related Work (mentioning, confidence: 99%)
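As a point of reference for the fine-tuning strategy described in the statement above, here is a minimal, hypothetical Python sketch of sequential fine-tuning across tasks, with per-task evaluation to expose catastrophic forgetting; tasks, evaluate, and sequential_finetune are illustrative names under assumed inputs, not code from any of the cited papers, and the model is assumed to be any classifier returning logits.

    # Illustrative sketch only: plain sequential fine-tuning, task by task,
    # then evaluation on every task seen so far to measure forgetting.
    import torch
    import torch.nn.functional as F

    def evaluate(model, batches):
        """Accuracy over a list of (inputs, labels) batches."""
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for x, y in batches:
                pred = model(x).argmax(dim=1)
                correct += (pred == y).sum().item()
                total += y.numel()
        return correct / max(total, 1)

    def sequential_finetune(model, tasks, lr=1e-3, epochs=1):
        """Fine-tune on tasks in order; `tasks` is a list of batch lists.
        (For brevity, evaluation reuses the training batches; real
        experiments would use held-out test sets.)"""
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for t, train_batches in enumerate(tasks):
            model.train()
            for _ in range(epochs):
                for x, y in train_batches:
                    loss = F.cross_entropy(model(x), y)
                    opt.zero_grad()
                    loss.backward()
                    opt.step()
            # accuracy drop on earlier tasks here is catastrophic forgetting
            for seen in range(t + 1):
                print(f"after task {t}: acc on task {seen} =",
                      evaluate(model, tasks[seen]))

Running this loop without any replay or regularization typically shows accuracy on earlier tasks degrading as later tasks are learned, which is exactly the forgetting that the replay-based methods discussed above are designed to counter.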
“…Bao et al (2020) build a chat-bot framework using user intents. Bai et al (2022) aim at incremental medical intent detection. Razzaq et al (2017); Amato et al (2017) develop an e-Health application using intent-context relationships.…”
Section: Related Work (mentioning, confidence: 99%)