Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2018
DOI: 10.18653/v1/n18-1029
Learning beyond Datasets: Knowledge Graph Augmented Neural Networks for Natural Language Processing

Abstract: Machine Learning has been the quintessential solution for many AI problems, but learning models are heavily dependent on specific training data. Some learning models can be incorporated with prior knowledge using a Bayesian setup, but these learning models do not have the ability to access any organized world knowledge on demand. In this work, we propose to enhance learning models with world knowledge in the form of Knowledge Graph (KG) fact triples for Natural Language Processing (NLP) tasks. Our aim is to de…
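The abstract describes augmenting a neural model with KG fact triples that can be retrieved on demand. As a rough illustration of that idea only (not the paper's exact architecture; the class name, attention scheme, and all dimensions below are hypothetical), the following PyTorch sketch encodes a sentence with an LSTM, soft-attends over learned fact-triple embeddings, and classifies from the fused representation:

# Minimal sketch of a KG-augmented classifier in the spirit of the abstract.
# NOT the authors' published architecture; names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class KGAugmentedClassifier(nn.Module):
    def __init__(self, vocab_size, num_triples, dim, num_classes):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True)
        # One learned vector per KG fact triple (e.g., head/relation/tail
        # embeddings pre-combined offline); a fixed table keeps this short.
        self.triple_emb = nn.Embedding(num_triples, dim)
        self.classifier = nn.Linear(2 * dim, num_classes)

    def forward(self, token_ids):
        h, _ = self.encoder(self.word_emb(token_ids))
        query = h[:, -1, :]                # final hidden state as query
        triples = self.triple_emb.weight   # (num_triples, dim)
        # Soft attention over all fact triples: the model retrieves a
        # weighted mixture of world knowledge conditioned on the input.
        scores = query @ triples.t() / triples.size(-1) ** 0.5
        knowledge = F.softmax(scores, dim=-1) @ triples
        return self.classifier(torch.cat([query, knowledge], dim=-1))

# Toy usage: a batch of 2 sentences, 7 token ids each.
model = KGAugmentedClassifier(vocab_size=100, num_triples=50, dim=32, num_classes=2)
logits = model(torch.randint(0, 100, (2, 7)))
print(logits.shape)  # torch.Size([2, 2])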

Cited by 19 publications (3 citation statements) · References 23 publications
“…Yao et al. (2019) employ KG-BERT in triple classification, link prediction, and relation prediction tasks. Furthermore, knowledge graphs are used in NLP tasks such as text classification (K M et al., 2018; Ostendorff et al., 2019), named entity recognition (Dekhili et al., 2019), and language modeling (Ahn et al., 2016; Logan et al., 2019). ERNIE (Zhang et al., 2019b) is an enhanced language representation model incorporating knowledge graphs.…”
Section: Related Work
confidence: 99%
“…Knowledge bases (KBs), such as Wikidata (Vrandečić and Krötzsch, 2014), constitute a valuable resource for collecting attributes and their values. In general, KBs have been shown to help improve multiple NLP applications as they contain structured information (Annervaz et al., 2018; Nakashole and Mitchell, 2015; Rahman and Ng, 2011; Ratinov and Roth, 2009). As a matter of fact, it is fairly simple to answer factoid questions such as "How old is Joe Biden?"…”
Section: Introduction
confidence: 99%
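The statement above notes that structured KBs such as Wikidata make factoid questions straightforward to answer. As a minimal sketch of that point, the Python snippet below queries Wikidata's public SPARQL endpoint for a date of birth; the endpoint URL is the standard one, while the entity ID Q6279 (Joe Biden) and property P569 (date of birth) are Wikidata identifiers that should be treated as assumptions to verify:

# Hedged sketch: answering "How old is Joe Biden?" from Wikidata.
import requests

QUERY = """
SELECT ?dob WHERE {
  wd:Q6279 wdt:P569 ?dob .   # Joe Biden -> date of birth (assumed IDs)
}
"""

resp = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "kb-factoid-demo/0.1"},  # polite UA, required by Wikimedia
    timeout=30,
)
resp.raise_for_status()
bindings = resp.json()["results"]["bindings"]
print(bindings[0]["dob"]["value"])  # an ISO 8601 timestamp; age follows by date arithmetic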
“…However, none of the previous works have explored the idea of incorporating commonsense knowledge in sarcasm detection. Common sense has been used in several natural language tasks like controllable story generation (Zhang et al., 2020; Brahman and Chaturvedi, 2020), sentence classification (Chen et al., 2019), question answering (Dzendzik et al., 2020), natural language inference (K M et al., 2018; Wang et al., 2019), and other related tasks, but not for sarcasm detection. We hypothesize that commonsense knowledge, capturing general beliefs and world knowledge, can prove instrumental in understanding sarcasm.…”
Section: Introduction and Related Work
confidence: 99%