Proceedings of the 6th Joint Conference on Lexical and Computational Semantics (*SEM 2017), 2017
DOI: 10.18653/v1/s17-1027
Classifying Semantic Clause Types: Modeling Context and Genre Characteristics with Recurrent Neural Networks and Attention

Abstract: Detecting aspectual properties of clauses in the form of situation entity types has been shown to depend on a combination of syntactic-semantic and contextual features. We explore this task in a deep-learning framework, where tuned word representations capture lexical, syntactic and semantic features. We introduce an attention mechanism that pinpoints relevant context not only for the current instance, but also for the larger context. Apart from implicitly capturing task-relevant features, the advantage of our …

Cited by 11 publications (13 citation statements). References 24 publications (49 reference statements).
“…Detecting aspectual properties of clauses in the form of semantic clause types has been shown to depend on a combination of syntactic, semantic and contextual features. We explore the task in a deep-learning framework, where tuned word representations capture lexical, syntactic and semantic features [6,7]. Given a clause in its context (previous clauses and previously predicted labels), the model predicts its semantic type (i.e., state, event, generic, generalizing sentence).…”
Section: Classifying Semantic Clause Types (citation type: mentioning, confidence: 99%)
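The statement above describes encoding each clause with tuned word representations and predicting its semantic type from that encoding. A minimal numpy sketch of the GRU recurrence (Chung et al., 2014) used as the clause encoder is given below; all class names, dimensions, and the random initialization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Plain-numpy GRU cell; weights are randomly initialized for illustration."""

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        init = lambda *shape: rng.standard_normal(shape) * 0.1
        self.Wz, self.Uz = init(hidden_dim, input_dim), init(hidden_dim, hidden_dim)
        self.Wr, self.Ur = init(hidden_dim, input_dim), init(hidden_dim, hidden_dim)
        self.Wh, self.Uh = init(hidden_dim, input_dim), init(hidden_dim, hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)            # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)            # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))  # candidate state
        return (1 - z) * h + z * h_tilde                  # interpolated new state

def encode_clause(cell, word_vectors):
    """Run the GRU over a clause's word vectors; the final state is the clause encoding."""
    h = np.zeros(cell.hidden_dim)
    for x in word_vectors:
        h = cell.step(x, h)
    return h
```

In a full model, the word vectors would be the tuned embeddings mentioned in the quote, and the final hidden state would feed the type classifier.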
“…In contrast, we focus on deriving dynamic clause representations informed by paragraph-level contexts and model context influences more extensively. Becker et al. (2017) proposed a GRU-based neural network model that predicts the SE type for one clause at a time, by encoding the content of the target clause using a GRU and incorporating several sources of context information, including contents and labels of preceding clauses as well as genre information, using additional separate GRUs (Chung et al., 2014). This model is different from our approach that processes one paragraph (with a sequence of clauses) at a time and extensively models inter-dependencies of clauses.…”
Section: Situation Entity (SE) Type Classification (citation type: mentioning, confidence: 99%)
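The statement above describes combining the target-clause encoding with separately encoded context (preceding clauses, their labels, genre) before predicting one of the SE types. The decision step can be sketched as a concatenation followed by a softmax; the feature dimensions, weight matrices, and label inventory shown here are illustrative assumptions.

```python
import numpy as np

# SE type inventory as named in the abstract's example set; illustrative subset.
SE_TYPES = ["STATE", "EVENT", "GENERIC", "GENERALIZING"]

def softmax(v):
    e = np.exp(v - v.max())  # shift for numerical stability
    return e / e.sum()

def predict_se_type(clause_enc, context_enc, label_enc, W, b):
    """Concatenate target-clause, context, and label-history encodings,
    score all SE types linearly, and return the argmax label."""
    features = np.concatenate([clause_enc, context_enc, label_enc])
    probs = softmax(W @ features + b)
    return SE_TYPES[int(np.argmax(probs))]
```

In the described setup, `clause_enc` would come from the target-clause GRU and `context_enc` / `label_enc` from the additional separate GRUs over preceding clauses and their predicted labels.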
“…Note that to diminish the effects of randomness in training neural network models and report stable experimental results, we ran each of the proposed models as well as our own baseline models ten times and reported the averaged performance across the ten runs. Following (Friedrich et al., 2016; Becker et al., 2017), we used the same 80:20 train-test split with balanced genre distributions. Preprocessing: As described in (Friedrich et al., 2016), texts were split into clauses using SPADE (Soricut and Marcu, 2003).…”
Section: Parameter Settings and Model Training (citation type: mentioning, confidence: 99%)
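The 80:20 split with balanced genre distributions mentioned above amounts to a stratified split: shuffle within each genre, then cut each genre at 80%. A minimal stdlib sketch follows; the document/genre inputs and function name are illustrative, not the released data-preparation code.

```python
import random
from collections import defaultdict

def stratified_split(docs, genres, train_frac=0.8, seed=42):
    """Split docs 80:20 so that each genre keeps the same train/test ratio."""
    rng = random.Random(seed)
    by_genre = defaultdict(list)
    for doc, genre in zip(docs, genres):
        by_genre[genre].append(doc)
    train, test_docs = [], []
    for genre, items in by_genre.items():
        items = list(items)
        rng.shuffle(items)                      # shuffle within each genre
        cut = int(round(train_frac * len(items)))
        train.extend(items[:cut])               # first 80% of this genre
        test_docs.extend(items[cut:])           # remaining 20%
    return train, test_docs
```

Stratifying by genre rather than splitting globally guarantees that every genre is represented in both partitions in the same proportion, which matters when per-genre label distributions differ.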