2018
DOI: 10.3390/sym10090357
Neural Relation Classification Using Selective Attention and Symmetrical Directional Instances

Abstract: Relation classification (RC) is an important task in information extraction from unstructured text. Recently, several neural methods based on various network architectures have been adopted for the task of RC. Among them, convolutional neural network (CNN)-based models stand out due to their simple structure, low model complexity and "good" performance. Nevertheless, there are still at least two limitations associated with existing CNN-based RC models. First, when handling samples with long distances between ent…

Cited by 4 publications (4 citation statements)
References 16 publications (29 reference statements)
“…Recently, many researchers have begun to apply deep learning techniques to RE [22, 23]. Socher et al. [6] proposed using RNNs to solve RE problems; the method first parses a sentence and then learns the vector representation for each node on the syntax tree.…”
Section: Related Work
confidence: 99%
“…Recently, deep learning has strongly influenced semantic relation learning. Word embeddings can provide attributional features for a variety of learning frameworks (Attia et al., 2016; Vylomova et al., 2016), and the sentential context, in its entirety or only the structured (through grammatical relations) or unstructured phrase expressing the relation, can be modeled through a variety of neural architectures: CNNs (Tan et al., 2018; Ren et al., 2018) or RNN variations (Zhang et al., 2018). Speer et al. (2008) introduce AnalogySpace, a representation of concepts and relations in CONCEPTNET built by factorizing a matrix with concepts on one axis and their features or properties (according to CONCEPTNET) on the other.…”
Section: Semantic Relation Classification
confidence: 99%
“…They find the optimal window size of 2 for training by applying different window sizes to the input sentence. In another work with the CNN model, they use two kinds of embeddings: position embedding and word embedding [10]. After embedding, the model generates a shortest dependency path from each sentence.…”
Section: Relation Classification
confidence: 99%
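The statement above describes the standard CNN input scheme for relation classification: each token is represented by its word embedding concatenated with two position embeddings encoding its distance to the two entity mentions, and a convolution with a small window (here, 2) slides over the resulting matrix. The following is a minimal sketch of that scheme, not the cited papers' implementation; all names, dimensions, and the random initialization are illustrative assumptions.

```python
# Hypothetical sketch of CNN input construction for relation classification:
# word embedding + two relative-position embeddings per token, then a
# window-size-2 convolution with max-over-time pooling. All sizes are toy values.
import numpy as np

rng = np.random.default_rng(0)

VOCAB, WORD_DIM = 100, 8   # toy vocabulary size and word-embedding dimension
MAX_DIST, POS_DIM = 10, 4  # clipped relative distance and position-embedding dimension

word_emb = rng.normal(size=(VOCAB, WORD_DIM))
pos_emb = rng.normal(size=(2 * MAX_DIST + 1, POS_DIM))  # distances in [-MAX_DIST, MAX_DIST]

def encode(tokens, e1_idx, e2_idx):
    """Concatenate word and position embeddings for each token in the sentence."""
    rows = []
    for i, tok in enumerate(tokens):
        d1 = int(np.clip(i - e1_idx, -MAX_DIST, MAX_DIST)) + MAX_DIST
        d2 = int(np.clip(i - e2_idx, -MAX_DIST, MAX_DIST)) + MAX_DIST
        rows.append(np.concatenate([word_emb[tok], pos_emb[d1], pos_emb[d2]]))
    return np.stack(rows)  # shape: (len(tokens), WORD_DIM + 2 * POS_DIM)

def conv_window2(x, filters):
    """Window-size-2 convolution (adjacent-token pairs) with ReLU and max pooling."""
    windows = np.concatenate([x[:-1], x[1:]], axis=1)    # pair each token with its successor
    return np.maximum(windows @ filters, 0).max(axis=0)  # max-over-time feature vector

tokens = [5, 17, 42, 3, 99]  # toy token ids; entity mentions at positions 1 and 3
x = encode(tokens, e1_idx=1, e2_idx=3)
filters = rng.normal(size=(2 * (WORD_DIM + 2 * POS_DIM), 6))  # 6 convolution filters
features = conv_window2(x, filters)
print(features.shape)  # (6,) — one pooled value per filter
```

In a trained model these pooled features would feed a softmax classifier over relation labels; the sketch stops at feature extraction, which is the part the citation statement describes.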