Hybrid neural tagging model for open relation extraction (2022)
DOI: 10.1016/j.eswa.2022.116951

Cited by 11 publications (6 citation statements) | References 12 publications
“…In sequence labeling open IE systems, when extracting arguments for a specific predicate, predicate-related features are used as input variables (Stanovsky et al., 2018; Zhan and Zhao, 2019; Jia and Xiang, 2019). We analyzed this extraction process from the perspective of multimodal learning (Mangai et al., 2010; Ngiam et al., 2011; Baltrušaitis et al., 2019), which defines an entire sequence and the corresponding predicate information as a modality. [Footnote 1: The code for our model and related resources can be found in https://github.com/youngbin-ro/Multi2OIE]…”
Section: Multi-head Attention for Open IE
Citation type: mentioning, confidence: 99%
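For concreteness, a minimal sketch of this "two modalities" view, with made-up embeddings and a hypothetical predicate span (the sentence, indices, and dimensions are illustrative, not from the cited papers): the full token sequence and the predicate-related features are fused into one per-token input before tagging.

import numpy as np

rng = np.random.default_rng(0)

tokens = ["Obama", "was", "born", "in", "Hawaii"]
pred_span = (1, 4)  # hypothetical predicate of interest: "was born in"

d = 8
tok_emb = rng.normal(size=(len(tokens), d))            # modality 1: the sequence
pred_vec = tok_emb[pred_span[0]:pred_span[1]].mean(0)  # modality 2: predicate info

# Fuse the two modalities per token: the token's own embedding, the pooled
# predicate vector, and a flag marking tokens inside the predicate span.
flags = np.array([[float(pred_span[0] <= i < pred_span[1])]
                  for i in range(len(tokens))])
fused = np.hstack([tok_emb, np.tile(pred_vec, (len(tokens), 1)), flags])

print(fused.shape)  # (5, 2*d + 1): per-token input to a BiLSTM/Transformer tagger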
“…(Fader et al., 2011; Mausam et al., 2012; Del Corro and Gemulla, 2013), most recent open IE research has focused on deep-neural-network-based supervised learning models. Such systems are typically based on bidirectional long short-term memory (BiLSTM) and are formulated in two categories: sequence labeling (Stanovsky et al., 2018; Sarhan and Spruit, 2019; Jia and Xiang, 2019) and sequence generation (Cui et al., 2018; Sun et al., 2018; Bhutani et al., 2019). The latter enables flexible extraction; however, it is more computationally expensive than the former.…”
Section: Introduction
Citation type: mentioning, confidence: 99%
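To make the labeling-versus-generation distinction concrete, a toy contrast (hypothetical sentence and tag set, not taken from the cited systems):

sentence = ["Obama", "was", "born", "in", "Hawaii"]

# Sequence labeling: exactly one tag per input token (BIO scheme), so the
# extraction is read directly off the input in a single pass.
bio_tags = ["B-ARG0", "B-PRED", "I-PRED", "I-PRED", "B-ARG1"]
print(list(zip(sentence, bio_tags)))

# Sequence generation: the extraction is decoded as a fresh token sequence,
# so output spans need not be copied verbatim from the input (more flexible,
# but each output token costs one decoding step).
generated = "<arg0> Obama </arg0> <pred> was born in </pred> <arg1> Hawaii </arg1>"
print(generated)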
“…Jia et al. [25] proposed a hybrid neural network model for tagging open relations, employing the ordered-neurons LSTM (ON-LSTM) to encode syntactic information and capture associations between arguments. To train it, they built a large-scale, high-quality, automatically constructed training corpus.…”
Section: Related Work
Citation type: mentioning, confidence: 99%
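The ordered-neurons mechanism referred to here can be summarized in a few lines: a cumulative softmax (cumax) yields monotone "master" gates, so lower-ranked neurons are overwritten before higher-ranked ones, which lets the cell state mirror syntactic nesting. A minimal numpy sketch of just the gating step, with toy dimensions (this is not the cited implementation):

import numpy as np

def cumax(x):
    """Cumulative softmax: a monotone gate rising from ~0 to 1."""
    e = np.exp(x - x.max())
    return np.cumsum(e / e.sum())

rng = np.random.default_rng(1)
d = 6  # toy hidden size

master_forget = cumax(rng.normal(size=d))       # rises 0 -> 1 over neuron ranks
master_input = 1.0 - cumax(rng.normal(size=d))  # falls 1 -> 0 over neuron ranks

# Neurons where both master gates are open mix old and new information, as in
# a standard LSTM; the rest are either fully kept or fully rewritten, which is
# how hierarchical (syntactic) structure is encoded in the cell state.
overlap = master_forget * master_input
print(np.round(master_forget, 2), np.round(master_input, 2), np.round(overlap, 2))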
“…It concatenates and encodes the query together with the context to generate a sequence embedding, with which this framework dynamically determines a sub-module (Single-span Extraction or Query-based Sequence Labeling) to label the potential relation(s) in the context. Jia et al. (2022) propose a hybrid neural network model (HNN4ORT) for open relation tagging. The task is defined over triples (h, t, s), where h and t denote the head entity and tail entity, respectively, and s denotes the sentence corresponding to the two entities.…”
Section: Open Relation Span Extraction
Citation type: mentioning, confidence: 99%
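A toy illustration of the (h, t, s) setting described above (hypothetical sentence and tags; this is a sketch of the task format, not HNN4ORT itself): the two entities are supplied as input features, and the tagger must mark the relation span in s.

h, t = "Obama", "Hawaii"
s = ["Obama", "was", "born", "in", "Hawaii"]

def entity_features(sentence, head, tail):
    """One marker per token telling the tagger where h and t sit in s."""
    feats = []
    for tok in sentence:
        if tok == head:
            feats.append("HEAD")
        elif tok == tail:
            feats.append("TAIL")
        else:
            feats.append("O")
    return feats

feats = entity_features(s, h, t)                       # model input alongside s
relation_tags = ["O", "B-REL", "I-REL", "I-REL", "O"]  # desired tagger output
print(list(zip(s, feats, relation_tags)))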