2021
DOI: 10.1007/s11432-020-3055-1
Few-shot text classification by leveraging bi-directional attention and cross-class knowledge

Cited by 20 publications (7 citation statements)
References 17 publications
“…Xu and Xiang [57] proposed a multi-perspective aggregation-based graph neural network that observes through eyes (support and query instances) and speaks by mouth (pairs) for few-shot text classification. Pang et al. [58] proposed an adapted bi-directional attention mechanism to exploit the interaction between query and support instances in metric learning and better describe text classification features. The model we constructed was derived from the prototype network.…”
Section: Related Work
confidence: 99%
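The bi-directional attention idea described above — query tokens attending over support tokens and vice versa, so each instance is re-encoded in light of the other — can be sketched minimally as follows. This is an illustrative toy, not the authors' implementation; the function name and the plain dot-product similarity are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def bidirectional_attention(query, support):
    """Toy query<->support bi-directional attention.

    query:   (Lq, d) token embeddings of a query instance
    support: (Ls, d) token embeddings of a support instance
    Returns the attended representations in both directions.
    """
    # Similarity between every query token and every support token.
    sim = query @ support.T                   # (Lq, Ls)

    # Query-to-support: each query token attends over support tokens.
    q2s = softmax(sim, axis=1) @ support      # (Lq, d)
    # Support-to-query: each support token attends over query tokens.
    s2q = softmax(sim.T, axis=1) @ query      # (Ls, d)
    return q2s, s2q
```

The two attended views can then be pooled and compared by the metric-learning head; the key point is that attention runs in both directions rather than only from query to support.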
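The prototype network mentioned above classifies a query by its distance to per-class prototypes, each prototype being the mean embedding of that class's support examples. A minimal sketch under that standard formulation (function names are illustrative):

```python
import numpy as np

def prototypes(support, labels, n_classes):
    # Class prototype = mean of that class's support embeddings.
    return np.stack([support[labels == c].mean(axis=0)
                     for c in range(n_classes)])

def classify(query, protos):
    # Assign each query to the nearest prototype (squared Euclidean).
    d = ((query[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)
```

Because the prototypes are simple means, the classifier needs no gradient steps at test time — new classes are handled by averaging their few support embeddings.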
“…Bao et al. [3] trained a meta-learning framework to map feature codes into attention scores for weighting the word representations, and adopted a ridge regression classifier to predict labels after seeing only a few training examples. Pang et al. [20] used a bi-directional attention mechanism to encode classification features. Despite recent progress, the generalization abilities of existing models remain limited, and meta-learning application scenarios are in urgent need of improvement.…”
Section: Meta-Learning for Text Classification
confidence: 99%
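A ridge regression classifier of the kind described above has a closed-form fit, which is what makes it attractive in the few-shot setting: it can be solved exactly from a handful of support examples with no inner-loop gradient descent. The sketch below is a generic version (the regularization weight `lam` is fixed here, whereas meta-learning setups may learn it):

```python
import numpy as np

def ridge_classifier_fit(X, Y, lam=1.0):
    """Closed-form ridge regression onto one-hot targets.

    X: (n, d) support features; Y: (n, k) one-hot labels.
    Solves W = (X^T X + lam * I)^{-1} X^T Y.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

def ridge_classifier_predict(X, W):
    # Predicted class = argmax over the k regression outputs.
    return np.argmax(X @ W, axis=1)
```

The regularizer `lam * I` keeps the solve well-posed even when the number of support examples is smaller than the feature dimension, which is the typical few-shot regime.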
“…Heterogeneous model reuse takes advantage of a teacher from a related task, which relieves the burden of data storage and thereby reduces the risk of privacy leakage [35], [54]. Meta-learning has also been utilized to transfer knowledge across different label spaces, e.g., few-shot learning [55], [56], [57], [58], [59], [60]. These approaches usually require special training strategies for the teacher.…”
Section: Related Work
confidence: 99%