Relation classification aims to recognize the semantic relation between two given entities mentioned in a text, with relations drawn from a knowledge graph. Existing models perform well on inverse relation classification with large-scale datasets, but their performance drops significantly under few-shot learning. In this paper, we propose a novel method, the function words adaptively enhanced attention framework (FAEA+), which uses a designed hybrid attention to capture class-related function words for few-shot inverse relation classification. An instance-aware prototype network is then presented to adaptively capture relation information associated with query instances and to eliminate the intra-class redundancy that the introduced function words bring. We theoretically prove that introducing function words increases intra-class differences, and that the designed instance-aware prototype network is competent at reducing this redundancy. Experimental results show that FAEA+ significantly improves over strong baselines on two datasets. Moreover, our model has a distinct advantage in solving inverse relations, outperforming state-of-the-art results by 16.82% under the 1-shot setting on FewRel 1.0.
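To make the instance-aware prototype idea concrete, the sketch below shows one minimal way to compute query-conditioned class prototypes in PyTorch. It is an illustrative assumption about the mechanism, not the paper's actual architecture; the function name, tensor shapes, and softmax-attention pooling are all ours.

```python
# Hypothetical sketch of an instance-aware (query-conditioned) prototype.
# Names and shapes are illustrative assumptions, not the paper's design.
import torch
import torch.nn.functional as F

def instance_aware_prototypes(support, query, temperature=1.0):
    """Compute one prototype per (query, class) pair.

    support: [N, K, D]  N classes, K shots, D-dim instance embeddings
    query:   [Q, D]     Q query instance embeddings
    returns: [Q, N, D]
    """
    # Similarity of each query to each support instance: [Q, N, K]
    sims = torch.einsum('qd,nkd->qnk', query, support) / temperature
    # Attention over the K shots of each class, conditioned on the query:
    # shots sharing relation-relevant cues with the query are weighted up,
    # redundant shots are weighted down.
    weights = F.softmax(sims, dim=-1)                       # [Q, N, K]
    return torch.einsum('qnk,nkd->qnd', weights, support)   # [Q, N, D]

# Classify each query by its highest-scoring (dot-product) prototype.
support = torch.randn(5, 1, 768)   # 5-way 1-shot, BERT-sized embeddings
query = torch.randn(3, 768)
protos = instance_aware_prototypes(support, query)
logits = torch.einsum('qd,qnd->qn', query, protos)           # [Q, N]
pred = logits.argmax(dim=-1)
print(pred)
```

The key difference from a plain prototypical network is that the prototype is recomputed per query, so support instances whose cues match the query dominate the class representation rather than being averaged uniformly.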
Relation classification identifies semantic relations between two entities in a given text. While existing models perform well at classifying inverse relations with large datasets, their performance drops significantly in few-shot learning. In this paper, we propose a function words adaptively enhanced attention framework (FAEA) for few-shot inverse relation classification, in which a hybrid attention model is designed to attend to class-related function words based on meta-learning. As the involvement of function words brings significant intra-class redundancy, an adaptive message-passing mechanism is introduced to capture and transfer inter-class differences. We mathematically analyze the negative impact of function words under dot-product measurement, which explains why the message-passing mechanism effectively reduces that impact. Our experimental results show that FAEA outperforms strong baselines; in particular, inverse relation accuracy improves by 14.33% under the 1-shot setting on FewRel 1.0.
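The abstract states the dot-product analysis only at a high level. One plausible way to see the effect, offered purely as an illustrative decomposition under our own assumptions rather than the paper's derivation, is to split each instance embedding into a class-specific content part and a function-word part:

```latex
% Illustrative decomposition (our assumption, not the paper's derivation).
% Split each instance embedding into class content plus function words:
%   x_i = c_{y_i} + f_i
\[
  \langle x_i, x_j \rangle
  = \underbrace{\langle c_{y_i}, c_{y_j} \rangle}_{\text{class signal}}
  + \underbrace{\langle c_{y_i}, f_j \rangle
              + \langle f_i, c_{y_j} \rangle
              + \langle f_i, f_j \rangle}_{\text{function-word terms}}
\]
% Because f_i varies from instance to instance and is largely shared
% across classes, the function-word terms add variance to same-class
% dot products (larger intra-class differences) and a class-independent
% offset to cross-class ones, shrinking the margin a prototype
% classifier relies on.
```

Under this reading, attending to function words injects useful class cues but also this noise term, which is what the adaptive message-passing mechanism (and, in FAEA+, the instance-aware prototype network) is introduced to suppress.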