2021
DOI: 10.1016/j.engappai.2021.104262
Transformer based network for Open Information Extraction

Cited by 13 publications
(6 citation statements)
References 7 publications
“…Named Entity Recognition (NER) is used to identify the location of the complaint by extracting the complaint text in Indonesian on Twitter. In recent years, the NER model has often been used in deep learning, which uses convolutional neural networks and is proven to perform better than machine learning [12]. However, the structure of a neural network needs to be trained from scratch according to specific tasks and goals, so it takes a lot of time and resources.…”
Section: Related Work
confidence: 99%
“…Prefix I- (Inside) indicates the next word after the first word of an entity [24], [25]. The NER model used is transformer-based because it is proven to have excellent performance due to a self-attention mechanism [12]. The NER models are BERT and XLNet, which will train tweet complaint data to study entities such as location, geographic entity, building, road measurement, natural place, time, date, object, measurement, and other entities.…”
Section: Data Annotation
confidence: 99%
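The B-/I-/O labeling scheme described in the citation above can be sketched as follows. This is a minimal illustration of BIO tagging, not code from the cited paper; the tokens and the LOC entity type are invented for the example.

```python
# Minimal sketch of BIO tagging for NER: B- marks the first word of an
# entity, I- marks each following word inside it, O marks everything else.
# The example sentence and the "LOC" entity type are illustrative only.

def bio_tags(tokens, entities):
    """Assign B-/I-/O tags given entity spans as (start, end, type)."""
    tags = ["O"] * len(tokens)
    for start, end, etype in entities:
        tags[start] = f"B-{etype}"        # first word of the entity
        for i in range(start + 1, end):   # subsequent words inside it
            tags[i] = f"I-{etype}"
    return tags

tokens = ["Flood", "on", "Jalan", "Sudirman", "today"]
# "Jalan Sudirman" is a two-word location entity spanning tokens 2..3.
print(bio_tags(tokens, [(2, 4, "LOC")]))
# ['O', 'O', 'B-LOC', 'I-LOC', 'O']
```

A token classifier such as BERT or XLNet is then trained to predict one such tag per token.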
“…Recent innovations in deep neural networks have led to impressive advances in Natural Language Processing (NLP) (Devlin et al, 2018; Wolf et al, 2020). These advances include new state-of-the-art results in tasks as diverse as question answering, information extraction, sentiment analysis, conversational 'chatbot' agents and summarization, to name only a few (Reddy et al, 2019; Han and Wang, 2021; Naseem et al, 2020; Siblini et al, 2019; Liu, 2019). Due to the performance of these models, it has also become possible in recent years to use NLP tools for computational social science and digital humanities.…”
Section: Introduction
confidence: 99%
“…However, most transformer-based models have a quadratic complexity, which limits the token length as a trade-off between performance and memory usage, resulting in truncating training sentences (Fan et al, 2020). The authors of (Han and Wang, 2021) introduced a transformer-based OIE model; however, its performance was not evaluated against any state-of-the-art neural network model. Thus, as a future direction, we intend to further evaluate different neural OIE research trends, including transformer-based models on benchmark datasets.…”
Section: Results and Evaluation
confidence: 99%
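The quadratic complexity mentioned in the citation above comes from self-attention scoring every token pair: the score matrix has n x n entries for a sequence of n tokens. A minimal NumPy sketch of scaled dot-product attention (illustrative only, with invented dimensions; not the cited model):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention; the (n, n) score matrix is the
    quadratic term that limits sequence length in practice."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # shape (n, n): quadratic in n
    # Numerically stable softmax over each row of scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

n, d = 128, 16  # toy sizes; real models use n in the hundreds or thousands
rng = np.random.default_rng(0)
Q = rng.standard_normal((n, d))
K = rng.standard_normal((n, d))
V = rng.standard_normal((n, d))
out = attention(Q, K, V)
print(out.shape)  # (128, 16)
```

Doubling n quadruples the score matrix, which is why long inputs are truncated or handled with sparse-attention variants.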