Learning Semantic Textual Similarity from Conversations
2018, Preprint
DOI: 10.48550/arxiv.1804.07754

Cited by 21 publications (16 citation statements)
References 12 publications
“…To measure semantic similarity of text, various encoding methods have been developed which transform the text into a vector space [20,8,38,6]. Using the cosine similarity in conjunction with such an embedding space can ensure that similar text will be closer together.…”
Section: Feature Spaces
confidence: 99%

Adversarial Gain (Henderson, Sinha, Ke et al. 2018, Preprint)
“…Adversarial gain can be measured across different feature spaces (and thus different manifolds). However, another appropriate method may be to learn a specific embeddings (feature) space for the problem at hand similarly to Yang et al [38] since well-generalized embedding spaces are difficult to create [9]. By learning a feature space which ensures a well-defined distance-based correlation between inputs and outputs, the distance assumption can more accurately measure whether an adversarial attack falls in the gain range where a mistake is more likely.…”
Section: Feature Spaces
confidence: 99%

Adversarial Gain (Henderson, Sinha, Ke et al. 2018, Preprint)
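The excerpt above proposes learning an embedding space with a "well-defined distance-based correlation between inputs and outputs". One common way to train such a space (a generic choice, not necessarily the cited papers' exact objective) is a triplet margin loss, sketched here with hypothetical toy vectors:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge loss: require the positive to sit at least `margin` closer to the
    anchor than the negative, so distances track semantic relatedness."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

# Toy 2-d "embeddings": a satisfied triplet costs 0, a violated one incurs
# a positive loss that training would push down.
anchor, positive = np.array([0.0, 0.0]), np.array([0.1, 0.0])
print(triplet_loss(anchor, positive, np.array([3.0, 0.0])) == 0.0)  # True: well separated
print(triplet_loss(anchor, positive, np.array([0.2, 0.0])) > 0.0)   # True: negative too close
```

Minimizing this loss over many triplets shapes the space so that nearby points are semantically related, which is the property the distance-based adversarial-gain measurement relies on.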
“…Chit-chat agents, by contrast, might focus on coarse statistical regularities of dialogue data without accurately modeling the underlying "meaning"; but the data often covers a much wider space of natural language. For example, Twitter or Reddit chit-chat tasks (Li et al., 2016a; Yang et al., 2018; Mazaré et al., 2018) cover a huge spectrum of language and diverse topics. Chit-chat and goal-oriented dialogue are not mutually exclusive: when humans engage in chit-chat, their aim is to exchange information, or to elicit specific responses from their partners.…”
Section: Introduction
confidence: 99%
“…Deep learning based dialogue systems have shown promising performance in many applications such as smart reply [10], conversation semantic embedding [41], human-computer interaction [8] and others [21,45,48]. Deep neural nets extract rich representations with high-level semantic information that are useful for message retrieval [39,41] and response generation [48] in conversations.…”
Section: Introduction
confidence: 99%
“…The dual encoder model [10,41] is widely used among various dialogue models especially for retrieving response messages, due to its simple structure and competitive computational speed. The dual encoder model consists of two separate encoders, which extract features for the dialogue context (e.g.…”
Section: Introduction
confidence: 99%
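The last excerpt describes the dual encoder: two separate encoders, one for the dialogue context and one for the candidate response, whose outputs are compared to rank responses. A minimal sketch, where random projection matrices stand in for the learned encoder networks (all names and values here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: bag-of-words inputs over a tiny vocabulary, and two
# separate single-matrix "encoders" standing in for trained neural networks.
VOCAB, DIM = 50, 8
W_context = rng.normal(size=(VOCAB, DIM))   # encoder for the dialogue context
W_response = rng.normal(size=(VOCAB, DIM))  # encoder for the candidate response

def encode(bow, W):
    v = bow @ W
    return v / np.linalg.norm(v)  # unit-normalize so the dot product is a cosine score

def score(context_bow, response_bow):
    """Dual-encoder relevance score: dot product of the two separate encodings."""
    return float(encode(context_bow, W_context) @ encode(response_bow, W_response))

# Retrieval use: rank candidate responses for one context by score.
context = rng.random(VOCAB)
candidates = [rng.random(VOCAB) for _ in range(5)]
ranking = sorted(range(5), key=lambda i: score(context, candidates[i]), reverse=True)
print(ranking)
```

The "simple structure and competitive computational speed" noted in the excerpt comes from this factorization: response encodings can be precomputed once, so retrieval reduces to a dot product per candidate.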