2018
DOI: 10.4204/eptcs.283.1

Towards Compositional Distributional Discourse Analysis

Abstract: Categorical compositional distributional semantics provide a method to derive the meaning of a sentence from the meaning of its individual words: the grammatical reduction of a sentence automatically induces a linear map for composing the word vectors obtained from distributional semantics. In this paper, we extend this passage from word-to-sentence to sentence-to-discourse composition. To achieve this we introduce a notion of basic anaphoric discourses as a mid-level representation between natural language di…
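The word-to-sentence composition the abstract describes can be sketched numerically: a transitive verb is represented as a tensor, and the grammatical reduction contracts it against the subject and object word vectors, yielding a linear map on distributional representations. A minimal sketch, assuming a toy noun space and invented random word vectors (dimension, vectors, and word choices are illustrative only, not from the paper):

```python
import numpy as np

d = 4  # dimension of the noun space N (assumed for illustration)

rng = np.random.default_rng(0)
alice = rng.random(d)  # distributional vector for a noun (toy data)
bob = rng.random(d)

# A transitive verb lives in N ⊗ S ⊗ N; taking the sentence space S
# to be 1-dimensional for simplicity, the verb is a d x d matrix.
likes = rng.random((d, d))

# The pregroup reduction n · (n^r s n^l) · n contracts the subject
# and object vectors against the verb tensor, leaving a sentence
# meaning in S (here a scalar, since dim(S) = 1).
sentence_meaning = alice @ likes @ bob

print(float(sentence_meaning))
```

With a higher-dimensional sentence space, `likes` would be a rank-3 tensor and the same contraction would return a sentence *vector* rather than a scalar.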

Cited by 11 publications (8 citation statements) · References 18 publications
“…As a corollary, the computational problems for RelCoCat Semantics and Entailment reduce to conjunctive query evaluation and containment, respectively. Building on previous work [4], we then show that the Question Answering problem is NP-complete. Logical semantics for pregroups has been developed in a line of work by Preller [5,6,7]; however, the corresponding reasoning problems were undecidable.…”
Section: Introduction
confidence: 71%
“…Membership follows immediately by reduction to Evaluation. Hardness follows by reduction from graph homomorphism; we give only a sketch of the proof and refer to [4], where EL is called matching. Any graph can be encoded in a corpus given by a set of subject-verb-object sentences, where EL maps nouns to their corresponding nodes.…”
Section: Construct the Corresponding Canonical Model
confidence: 99%
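The graph encoding mentioned in this citation statement can be made concrete: each directed edge becomes one subject-verb-object sentence, so that a graph homomorphism corresponds to a relabelling of nouns that preserves sentences. The function name and the verb "links" below are our own illustrative choices, not notation from the cited papers:

```python
# Encode a directed graph as an SVO corpus: each edge (u, v)
# becomes the sentence "u links v". A mapping of nouns then
# preserves the corpus exactly when it is a graph homomorphism.
def graph_to_corpus(edges, verb="links"):
    return [f"{u} {verb} {v}" for (u, v) in edges]

edges = [("a", "b"), ("b", "c")]
corpus = graph_to_corpus(edges)
print(corpus)  # ['a links b', 'b links c']
```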
“…There is some other relevant work in this area, but not directly via pregroups or vectors, e.g. for knowledge bases [8] and using Dynamic Syntax [39].…”
Section: Discussion
confidence: 99%
“…The image of the noun type n ∈ B is a vector space where the inner product computes noun-phrase similarity [SCC13]. When applied to question answering tasks, distributional models can be used to compute the distance between a question and its answer [CdMT18].…”
Section: Pregroup Semantics
confidence: 99%
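The role of the inner product described in this citation statement can be illustrated with cosine similarity between composed vectors: a question is matched to the candidate answer whose vector is closest. The vectors below are toy data, assumed purely for the example:

```python
import numpy as np

def cosine(u, v):
    # inner product normalised by the vector lengths
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy vectors (invented): a question and two candidate answers.
question = np.array([1.0, 0.0, 1.0])
answer_a = np.array([0.9, 0.1, 1.1])
answer_b = np.array([0.0, 1.0, 0.0])

# The candidate with the larger cosine (smaller angular distance)
# to the question is preferred.
print(cosine(question, answer_a), cosine(question, answer_b))
```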