2023 · DOI: 10.1609/aaai.v37i11.26565

KICE: A Knowledge Consolidation and Expansion Framework for Relation Extraction

Abstract: Machine Learning is often challenged by insufficient labeled data. Previous methods employing implicit commonsense knowledge of pre-trained language models (PLMs) or pattern-based symbolic knowledge have achieved great success in mitigating manual annotation efforts. In this paper, we focus on the collaboration among different knowledge sources and present KICE, a Knowledge-evolving framework by Iterative Consolidation and Expansion with the guidance of PLMs and rule-based patterns. Specifically, starting with…
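The abstract's loop of consolidating rule-labeled data and expanding the rule set can be illustrated with a toy self-training sketch. Everything below (the Rule class, the first-token "feature", the promotion heuristic) is a hypothetical stand-in for illustration, not KICE's actual bootstrapping, prompting, or rule-induction procedure; it also happens to show how such loops can propagate noisy pseudo labels, the failure mode the citing passage below mentions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    pattern: str   # surface pattern to look for in a sentence
    relation: str  # relation label the pattern implies

    def matches(self, sentence: str) -> bool:
        return self.pattern in sentence

def consolidate_and_expand(seed_rules, unlabeled, rounds=3):
    rules, pseudo_labels = set(seed_rules), {}
    for _ in range(rounds):
        # Consolidation: weakly label sentences with the current rule set.
        for s in unlabeled:
            for r in rules:
                if r.matches(s):
                    pseudo_labels[s] = r.relation
        # Expansion: promote pseudo-labeled examples to new symbolic rules.
        # A real system would train a classifier and keep only confident
        # predictions; this toy uses the sentence's first token as a pattern,
        # which is deliberately crude and shows how noisy pseudo labels can
        # snowball across rounds.
        for s, label in list(pseudo_labels.items()):
            rules.add(Rule(s.split()[0], label))
    return rules, pseudo_labels

if __name__ == "__main__":
    seeds = [Rule("was born in", "place_of_birth")]
    sentences = ["Ada was born in London", "Ada moved to Paris"]
    rules, labels = consolidate_and_expand(seeds, sentences)
    print(labels)  # round 2 mislabels the second sentence via the "Ada" rule
```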

Cited by 1 publication (1 citation statement) · References 18 publications
“…Since self-training methods (Rosenberg et al., 2005) accumulate noise in pseudo labels (Hu et al., 2021b), some methods (Han et al., 2018; Lin et al., 2019) … (Ye and Ling, 2019). Recently, language models have shown their ability on various tasks based on prompting (Li et al., 2021, 2022a,b; Lu et al., 2023). Since the scale of the KB is limited, PRBOOST (Zhang et al., 2022) asks the PLM via prompts to predict the relation directly and takes the predictions as rules.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
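The passage's description of prompt-based relation prediction (a PLM asked directly, via a prompt, which relation holds, with its answers kept as rules) can be sketched with an off-the-shelf masked language model. The template, model choice, and verbalizer notion below are assumptions for illustration, not the prompts used by PRBOOST or KICE.

```python
# Requires: pip install transformers torch
from transformers import pipeline

# Cloze-style relation prompt in the spirit of the passage; the template,
# model, and top_k are illustrative assumptions.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = "Ada Lovelace was born in London."
prompt = f"{sentence} London is the [MASK] of Ada Lovelace."

for guess in fill_mask(prompt, top_k=5):
    # token_str is the PLM's raw answer; a verbalizer would map it to a
    # relation label (e.g. "birthplace" -> place_of_birth) before the
    # prediction is kept as a rule.
    print(guess["token_str"], round(guess["score"], 3))
```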