2021
DOI: 10.48550/arxiv.2104.05115
Preprint

Disentangling Semantics and Syntax in Sentence Embeddings with Pre-trained Language Models

Abstract: Pre-trained language models have achieved huge success on a wide range of NLP tasks. However, contextual representations from pre-trained models contain entangled semantic and syntactic information, and therefore cannot be directly used to derive useful semantic sentence embeddings for some tasks. Paraphrase pairs offer an effective way of learning the distinction between semantics and syntax, as they naturally share semantics and often vary in syntax. In this work, we present ParaBART, a semantic sentence embedding…
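
For context, a common baseline is to derive a sentence embedding by pooling a pre-trained encoder's contextual states; such a vector still entangles semantics and syntax, which is the problem the paper targets. Below is a minimal illustrative sketch of that baseline, not the paper's ParaBART method; the model name, mean pooling, and example sentences are assumptions for illustration.

import torch
from transformers import AutoTokenizer, AutoModel

# Illustrative choices, not from the paper: bart-base encoder + mean pooling.
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
model = AutoModel.from_pretrained("facebook/bart-base")

def embed(sentence: str) -> torch.Tensor:
    # Mean-pool the encoder's hidden states over non-padding tokens.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        states = model.encoder(**inputs).last_hidden_state   # (1, T, H)
    mask = inputs["attention_mask"].unsqueeze(-1).float()    # (1, T, 1)
    return (states * mask).sum(dim=1) / mask.sum(dim=1)      # (1, H)

# A paraphrase pair shares semantics but differs in syntax; an embedding
# capturing only semantics would score such a pair as highly similar.
a = embed("The cat chased the mouse.")
b = embed("The mouse was chased by the cat.")
print(torch.cosine_similarity(a, b).item())

With plain mean pooling, the similarity score reflects both shared meaning and shared surface form, which motivates learning a representation from paraphrase pairs that keeps the former and discards the latter.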

Cited by 0 publications
References 18 publications