2018
DOI: 10.1007/978-3-319-75477-2_5
Generating Bags of Words from the Sums of Their Word Embeddings

Cited by 11 publications (9 citation statements)
References 10 publications
“…It is easy to see that unigram embeddings can be written as a linear system Ax, where A ∈ R^{n×p} contains v_{w_i} in the i-th column and x ∈ R^p is the bag-of-words vector that counts the number of occurrences of each word in a text. This formulation is used in text processing applications to recover the original text document from its unigram embeddings [64].…”
Section: Basis Pursuit for Recovering Bag-of-Words of Text Documents
confidence: 99%
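
As a concrete illustration of the linear-system view in the excerpt above, the sketch below recovers a nonnegative bag-of-words vector from the sum of its word embeddings via basis pursuit, i.e., an l1-minimizing linear program. The random embedding matrix, vocabulary size, and the use of scipy.optimize.linprog are illustrative assumptions, not the exact setup of the cited papers.

# Sketch: recover a bag-of-words vector x from b = Ax, where column i
# of A holds the embedding v_{w_i} and b is the summed word embeddings.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p = 50, 200                        # embedding dimension n, vocabulary size p
A = rng.standard_normal((n, p))       # column i holds embedding v_{w_i}

# Ground-truth bag-of-words for a short "document" using a few words.
x_true = np.zeros(p)
x_true[[3, 17, 42]] = [2, 1, 1]       # word 3 appears twice, etc.
b = A @ x_true                        # observed sum of word embeddings

# Basis pursuit with nonnegativity: min sum(x) s.t. Ax = b, x >= 0.
# Since word counts are nonnegative, the l1 norm is a linear objective.
res = linprog(c=np.ones(p), A_eq=A, b_eq=b, bounds=(0, None), method="highs")
x_hat = np.round(res.x)               # round to integer counts

print(np.flatnonzero(x_hat), x_hat[x_hat > 0])  # recovered word ids and counts

With far fewer active words than the embedding dimension, the sparse count vector is recovered exactly with high probability; rounding cleans up solver noise.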
“…The Sum of Word Embeddings (SoWE) of a reviewer's review text is used as a semantic representation of each reviewer. SoWE refers to the simple linear function that aggregates the embeddings of words to represent a sentence, a reviewer, or a group, and it has been shown to be effective in different domains, e.g., Ren et al. [19], White et al. [20], and Lyndon et al. [21]. Therefore, to obtain the SoWE for a sentence we apply an element-wise average to the word embeddings of the sentence:…”
Section: A. Spatial Modeling
confidence: 99%
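
A minimal sketch of the SoWE representation described in this excerpt: each sentence is the element-wise average of its word embeddings, and a reviewer is the average of their sentence vectors. The toy vocabulary and random embedding table are illustrative assumptions, not the cited paper's data.

# Sketch: sentence- and reviewer-level SoWE via element-wise averaging.
import numpy as np

rng = np.random.default_rng(1)
vocab = {"great": 0, "phone": 1, "battery": 2, "bad": 3}
E = rng.standard_normal((len(vocab), 300))   # one 300-d embedding per word

def sowe(tokens, embeddings, vocabulary):
    """Element-wise average of the embeddings of in-vocabulary tokens."""
    ids = [vocabulary[t] for t in tokens if t in vocabulary]
    return embeddings[ids].mean(axis=0)

sentence_vec = sowe(["great", "battery"], E, vocab)        # sentence level
reviewer_vec = np.mean([sowe(s, E, vocab) for s in         # reviewer level
                        [["great", "phone"], ["bad", "battery"]]], axis=0)
print(sentence_vec.shape, reviewer_vec.shape)              # (300,) (300,)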
“…Therefore, we use the SoWE at two different levels in this work: reviewer level and group level. Note that it is also possible to use sentence- or document-embedding techniques such as sent2vec; we choose SoWE for its simplicity and proven effectiveness in various domains [16], [20], [21]. The sentence-level embeddings are fine-tuned by a CNN with max-pooling.…”
Section: A. Reviewer Representation Extraction
confidence: 99%
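
The excerpt mentions fine-tuning sentence-level embeddings with a CNN and max-pooling; a minimal sketch of that step follows. The layer sizes, kernel width, and the use of PyTorch are assumptions for illustration, not the cited paper's actual architecture.

# Sketch: a 1-D CNN over a reviewer's sequence of sentence SoWE vectors,
# with max-pooling over the sentence axis to get one reviewer vector.
import torch
import torch.nn as nn

class SentenceCNN(nn.Module):
    def __init__(self, emb_dim=300, channels=100, kernel=3):
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, channels, kernel, padding=1)

    def forward(self, x):                   # x: (batch, sentences, emb_dim)
        h = self.conv(x.transpose(1, 2))    # -> (batch, channels, sentences)
        h = torch.relu(h)
        return h.max(dim=2).values          # max-pool over sentences

model = SentenceCNN()
reviews = torch.randn(4, 10, 300)  # 4 reviewers, 10 sentence SoWEs each
print(model(reviews).shape)        # torch.Size([4, 100])

Max-pooling over the sentence axis keeps the strongest activation per filter, so the reviewer vector is invariant to how many sentences a review contains.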