2021
DOI: 10.1609/aaai.v35i16.17694
Making the Relation Matters: Relation of Relation Learning Network for Sentence Semantic Matching

Abstract: Sentence semantic matching is one of the fundamental tasks in natural language processing, which requires an agent to determine the semantic relation among input sentences. Recently, deep neural networks have achieved impressive performance in this area, especially BERT. Despite the effectiveness of these models, most of them treat output labels as meaningless one-hot vectors, underestimating the semantic information and guidance of relations that these labels reveal, especially for tasks with a small number …
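For context, the standard setup the abstract critiques is BERT fine-tuned for sentence-pair classification against plain one-hot targets. A minimal sketch using the Hugging Face transformers API (the checkpoint name, label count, and example sentences are illustrative assumptions, not details from the paper):

```python
# Minimal sketch of BERT sentence-pair matching with one-hot class targets,
# the baseline setup the abstract critiques. Checkpoint and num_labels are
# illustrative assumptions, not the paper's configuration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # e.g. paraphrase vs. non-paraphrase
)

# BERT encodes the pair jointly as "[CLS] sent_a [SEP] sent_b [SEP]".
inputs = tokenizer("How can I learn Python quickly?",
                   "What is the fastest way to learn Python?",
                   return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, 2)

# The target is just an index into a one-hot vector; the relation classes
# themselves carry no usable semantics -- the limitation the paper targets.
print(logits.argmax(dim=-1).item())
```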


citations
Cited by 17 publications
(7 citation statements)
references
References 41 publications
(54 reference statements)
“…The generation length in Chinese is shorter and the difficulty is lower, which makes the generation effect better. Therefore, the boosting effect for Chinese PI is better than that of English.…”

Accompanying results table on QQP (“\” = score not reported):

Models | QQP ACC. | F1
CENN | 80.7 | \
L.D.C (Wang et al., 2016) | 85.6 | \
BiMPM (Wang et al., 2017) | 88.2 | \
DIIN (Gong et al., 2017) | 89.1 | \
DRCN (Kim et al., 2019) | 90.2 | \
R²-Net (Zhang et al., 2021) | 91.6 | \
ISG (RoBERTa-large) (Xu et al., 2022) | … | …

Section: Results (mentioning)
confidence: 99%
“…Inspired by ResNet (He et al., 2016a), Kim et al. (2019) propose a Densely-connected Recurrent and Co-attentive Network (DRCN), which combines residual connections with RNNs and an attention mechanism. Zhang et al. (2021) propose a Relation of Relation Learning Network (R²-Net) based on BERT, which is characterized by perceiving interactive relations among multi-granularity linguistic units. The recently proposed ISG-BERT (Xu et al., 2022) integrates syntactic alignments and semantic matching signals of the two sentences into an association graph to obtain a fine-grained matching process.…”

Section: Related Work (mentioning)
confidence: 99%
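As a rough illustration of the "residual/dense connection plus co-attention" pattern the excerpt attributes to DRCN, a minimal PyTorch sketch (layer sizes and the simplified single-layer design are assumptions; the published architecture has further components):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseCoAttentionLayer(nn.Module):
    """One recurrent + co-attention step in the spirit of DRCN (a sketch,
    not the published architecture): the layer output is concatenated onto
    its input, so lower-layer features are preserved across layers."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.rnn = nn.LSTM(in_dim, hidden_dim,
                           batch_first=True, bidirectional=True)

    def forward(self, a, b):
        # a, b: (batch, seq_len, in_dim) encodings of the two sentences.
        h_a, _ = self.rnn(a)
        h_b, _ = self.rnn(b)
        # Co-attention: each position in one sentence attends over the other.
        scores = torch.bmm(h_a, h_b.transpose(1, 2))        # (batch, la, lb)
        ctx_a = torch.bmm(F.softmax(scores, dim=-1), h_b)   # b-aware view of a
        ctx_b = torch.bmm(F.softmax(scores.transpose(1, 2), dim=-1), h_a)
        # Dense (concatenative) connection: keep original + new features.
        a_out = torch.cat([a, h_a, ctx_a], dim=-1)
        b_out = torch.cat([b, h_b, ctx_b], dim=-1)
        return a_out, b_out
```

Stacking such layers grows the feature dimension with depth, which is why the published DRCN compresses features with an autoencoder; that detail is omitted here for brevity.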
“…šæ āˆ‘ š‘¦ true log š‘¦ pred 1 š‘¦ true log 1 š‘¦ pred (4) ytrue represents the distribution of true labels, while ypred represents the predicted label distribution of the trained model. Crossentropy loss can measure the similarity between y true and y pred .…”
Section: 5loss Functionmentioning
confidence: 99%
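Equation (4) is the standard binary cross-entropy loss. A minimal NumPy sketch (the epsilon clipping is an added numerical-stability guard, not part of Eq. (4)):

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Eq. (4): L = -sum[ y_true*log(y_pred) + (1 - y_true)*log(1 - y_pred) ].
    eps keeps log() away from zero; it is an implementation guard only."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.sum(y_true * np.log(y_pred)
                   + (1.0 - y_true) * np.log(1.0 - y_pred))

# Usage: true labels and predicted probabilities for three sentence pairs.
y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.6])
print(binary_cross_entropy(y_true, y_pred))  # low loss for good predictions
```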
“…Inspired by DenseNet [2], Kim et al. proposed a densely connected co-attentive recurrent neural network that effectively preserves both original and co-attention features across layers [3]. Zhang et al. discussed the importance of sentence semantic matching in natural language processing and highlighted a limitation of existing models such as BERT, which treat output labels as one-hot vectors [4]. To address this issue, the authors introduced the Relation of Relation Learning Network (R²-Net).…”

Section: Introduction (mentioning)
confidence: 99%
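To make the one-hot limitation concrete, a hypothetical contrast between one-hot targets and learned label embeddings (an illustration of the general idea described in the excerpt, not R²-Net's actual mechanism):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

num_labels, dim = 3, 8  # illustrative sizes, e.g. entailment/neutral/contradiction

# One-hot view: labels are orthogonal indices with no notion of similarity.
one_hot = F.one_hot(torch.tensor([0, 1, 2]), num_classes=num_labels).float()
print(one_hot @ one_hot.T)  # identity matrix: all label pairs equally unrelated

# Label-embedding view: labels live in a learned space, so relations among
# the relation classes become trainable structure a model can exploit.
label_emb = nn.Embedding(num_labels, dim)
e = label_emb(torch.tensor([0, 1, 2]))          # (3, dim)
sim = F.cosine_similarity(e.unsqueeze(1), e.unsqueeze(0), dim=-1)
print(sim)                                      # (3, 3) learned similarities
```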