2020
DOI: 10.3390/app10030958
Improving Sentence Representations via Component Focusing

Abstract: The efficiency of natural language processing (NLP) tasks, such as text classification and information retrieval, can be significantly improved with proper sentence representations. Neural networks such as convolutional neural network (CNN) and recurrent neural network (RNN) are gradually applied to learn the representations of sentences and are suitable for processing sequences. Recently, bidirectional encoder representations from transformers (BERT) has attracted much attention because it achieves state-of-t…

Cited by 11 publications (7 citation statements)
References 29 publications
“…The input data length has a fixed value to match the matrix dimension of the CNN module, and a pad is added to enforce the same length for all data. CF-SBERT [24] uses the Siamese BERT and extracts the sentence vector using a GAP. A new sentence is generated using important component data obtained using Part-Of-Speech (POS) tagging from input data.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
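The excerpt above describes CF-SBERT extracting a sentence vector "using a GAP". Assuming GAP here means global average pooling over BERT token embeddings (a common reading of the acronym, not confirmed by this excerpt), the pooling step can be sketched as follows; the token embeddings are toy numbers, not real BERT outputs:

```python
# Hedged sketch: mean pooling ("GAP") over per-token embeddings to form a
# single fixed-size sentence vector, as in Siamese-BERT-style models.
# All values here are illustrative, not actual model outputs.

def gap_sentence_vector(token_embeddings):
    """Average each embedding dimension across all tokens."""
    dim = len(token_embeddings[0])
    n = len(token_embeddings)
    return [sum(tok[d] for tok in token_embeddings) / n for d in range(dim)]

# Three toy 2-dimensional token embeddings for one sentence.
tokens = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sentence_vector = gap_sentence_vector(tokens)
print(sentence_vector)  # [3.0, 4.0]
```

The resulting vector has the embedding dimension of a single token regardless of sentence length, which is why padding to a fixed input length (as the excerpt notes for the CNN module) is not needed after this pooling step.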
“…We measured the Spearman rank correlation between the cosine similarity of the sentence embedding and the gold labels. Spearman’s rank correlation has been used to measure semantic textual similarity in other studies [ 19 , 55 , 116 ]. We show the performance in Table 6 , highlighting the templates used in our article in bold.…”
Section: Results (citation type: mentioning)
confidence: 99%
“…Choi et al [25] proposed a sentence representation method for CNN-SBert, which aims to extract sentence vectors by adding a CNN module. Yin et al [26] proposed a method of CF-SBert, which uses Siamese Bert, extracts sentence vectors with the help of GAP, and uses part-of-speech tagging important component data obtained from input data to generate new sentences.…”
Section: Related Work (citation type: mentioning)
confidence: 99%