The World Wide Web Conference 2019
DOI: 10.1145/3308558.3313516

Semantic Hilbert Space for Text Representation Learning

Abstract: Capturing the meaning of sentences has long been a challenging task. Current models tend to apply linear combinations of word features to conduct semantic composition for larger-granularity units, e.g., phrases, sentences, and documents. However, semantic linearity does not always hold in human language. For instance, the meaning of the phrase "ivory tower" cannot be deduced by linearly combining the meanings of "ivory" and "tower". To address this issue, we propose a new framework that models different leve…

Cited by 25 publications (18 citation statements) | References 30 publications
“…Zhang et al. [51] build neural networks to learn high-level interactions between questions and answers based on their entangled state representations for question answering (QA). The works in [22,21,20] leverage quantum superposition and mixture to model correlations between linguistic features and construct complex-valued language representations with neural networks. These representations lead to improved performance and enhanced interpretability.…”
Section: Quantum-Inspired Models for Text Analysis
Citation type: mentioning (confidence: 99%)
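The superposition-and-mixture construction these citing papers refer to can be made concrete with a short sketch. Below is a minimal numpy illustration, not the cited models' implementation: the word states are hypothetical random complex unit vectors, a sentence is their probability-weighted mixture (a density matrix), and a measurement state projects the mixture onto one real-valued feature.

```python
import numpy as np

def normalize(v):
    """Project a complex vector onto the unit sphere (a pure quantum state)."""
    return v / np.linalg.norm(v)

# Hypothetical complex-valued word embeddings (dimension 4 for brevity).
rng = np.random.default_rng(0)
dim = 4
words = {w: normalize(rng.normal(size=dim) + 1j * rng.normal(size=dim))
         for w in ["ivory", "tower"]}

# Mixture: a sentence is a probability-weighted mixture of its word states,
# represented by a density matrix rho = sum_i p_i |w_i><w_i|.
probs = {"ivory": 0.5, "tower": 0.5}
rho = sum(p * np.outer(words[w], words[w].conj()) for w, p in probs.items())

# Measurement: project rho onto a measurement state |m>; the probability
# <m|rho|m> serves as one real-valued feature of the sentence.
m = normalize(rng.normal(size=dim) + 1j * rng.normal(size=dim))
feature = np.real(m.conj() @ rho @ m)

print(f"trace(rho) = {np.trace(rho).real:.3f}")   # a valid density matrix has trace 1
print(f"measurement probability = {feature:.3f}")
```

In the actual models, the embeddings, mixture weights, and measurement states are all learned end-to-end by the neural network; the sketch above only fixes the algebra.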
“…The inspiration stems from the manifestation of non-classical phenomena in human cognition and decision making, which violate classical probability theory but admit a compact explanation via quantum theory (QT) [18]. QT has stimulated the successful construction of quantum-inspired models for human cognition-related tasks, such as information retrieval (IR) [19,20] and language understanding [21,22]. As a typical human cognitive task, however, multimodal sentiment analysis has received little attention from a quantum-inspired viewpoint [23,24], due to the challenge of modeling complicated interactions across different modalities in a quantum manner.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…Complex-valued text representation has shown promising performance in NLP tasks such as text classification [42], information retrieval [31], and question answering [23]. Complex embeddings have also been integrated into NN architectures such as the Transformer [47], convolutional feed-forward networks, and convolutional LSTMs [40], but these do not follow quantum operations such as mixture and measurement.…”
Section: Complex-Valued Text Representation
Citation type: mentioning (confidence: 99%)
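As a companion sketch for the complex embeddings mentioned above: one common way to parameterize a complex embedding table is with two real-valued tables, one for amplitudes and one for phases. Everything here (the class name, initialization ranges) is an illustrative assumption, not the exact parameterization used in any of the cited works.

```python
import numpy as np

class ComplexEmbedding:
    """A complex embedding table stored as amplitude and phase, so each
    token t maps per dimension to r_t * exp(i * theta_t). Illustrative
    only; initialization and shapes are assumptions, not a cited design."""

    def __init__(self, vocab_size, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.amplitude = rng.uniform(0.5, 1.5, size=(vocab_size, dim))
        self.phase = rng.uniform(-np.pi, np.pi, size=(vocab_size, dim))

    def __call__(self, token_ids):
        # Look up amplitude and phase rows, then combine into complex states.
        r = self.amplitude[token_ids]
        theta = self.phase[token_ids]
        return r * np.exp(1j * theta)

emb = ComplexEmbedding(vocab_size=100, dim=8)
states = emb(np.array([3, 17]))     # two token states
print(states.dtype, states.shape)   # complex128 (2, 8)
```

Splitting the table into amplitude and phase keeps both parts real-valued and trainable with standard optimizers, which is why architectures like those in the quote can absorb complex embeddings without otherwise adopting quantum operations.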
“…[7] proposes a mathematical framework for modeling semantic feature spaces using quantum mechanics, and verifies that latent semantic analysis and HAL (Hyperspace Analogue to Language) models are essentially standardized Hilbert spaces. [42] applies quantum theory to text representation learning tasks. [23] implements a quantum probability-based model for the QA problem.…”
Section: Complex-Valued Text Representation
Citation type: mentioning (confidence: 99%)