2022
DOI: 10.1155/2022/4656837
KGDetector: Detecting Chinese Sensitive Information via Knowledge Graph-Enhanced BERT

Abstract: The Bidirectional Encoder Representations from Transformers (BERT) technique has been widely used in detecting Chinese sensitive information. However, existing BERT-based frameworks usually fail to emphasize key entities in the texts that contribute significantly to knowledge inference. To fill this gap, we propose a novel BERT- and knowledge graph-based framework to detect Chinese sensitive information (named KGDetector). Specifically, we first train a pretrained knowledge graph-based Chinese entity embedding …
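The abstract is truncated, so the exact fusion architecture is not visible here. The following is a minimal sketch of the general idea it describes: combining BERT sentence features with pretrained knowledge-graph entity embeddings for sensitive-text classification. The class name, late fusion by concatenation, mean pooling of entity vectors, and the 100-dimensional entity embeddings are illustrative assumptions, not the paper's actual design.

```python
# Hypothetical sketch: fuse BERT text features with pretrained KG entity
# embeddings for sensitive-information classification (assumptions noted above).
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class KGEnhancedClassifier(nn.Module):
    def __init__(self, bert_name="bert-base-chinese",
                 entity_dim=100, num_labels=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # Classify the concatenation of the [CLS] vector and a pooled entity vector.
        self.classifier = nn.Linear(hidden + entity_dim, num_labels)

    def forward(self, input_ids, attention_mask, entity_embeds):
        # entity_embeds: (batch, num_entities, entity_dim) pretrained KG vectors
        # for entities linked in each input text (zero-padded if none are linked).
        cls_vec = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask).last_hidden_state[:, 0]
        entity_pooled = entity_embeds.mean(dim=1)           # simple mean pooling
        fused = torch.cat([cls_vec, entity_pooled], dim=-1) # late fusion
        return self.classifier(fused)

# Usage sketch with a dummy batch
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
batch = tokenizer(["这是一条示例文本"], return_tensors="pt",
                  padding=True, truncation=True)
model = KGEnhancedClassifier()
fake_entities = torch.zeros(1, 4, 100)   # placeholder KG entity embeddings
logits = model(batch["input_ids"], batch["attention_mask"], fake_entities)
```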

Cited by 2 publications (1 citation statement) | References 22 publications (34 reference statements)
“…A knowledge graph is a knowledge system that represents knowledge in a structured way; it is mainly used to describe the entities and concepts that exist in the real world and the relationships among them. Many studies combine knowledge graphs with pretrained models or deep neural networks for text classification [22–26]. Zhao et al. [27] addressed the problem of small data samples by injecting sentiment domain knowledge into a language representation model, exploiting the additional information in sentiment knowledge graphs and obtaining the embedding vectors of knowledge-graph entities and text words in a consistent vector space.…”
Section: Related Work
confidence: 99%
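The citing statement above does not spell out how entity and word vectors end up "in a consistent vector space." One common way to do this is a learned linear projection from the entity-embedding space into the token-embedding space, added at the positions of linked entities. The sketch below illustrates only that generic idea; the module name, dimensions, and additive injection are assumptions, not Zhao et al.'s actual method.

```python
# Illustrative only: project KG entity vectors (e.g., TransE-style) into the
# word-embedding space and add them at entity token positions.
import torch
import torch.nn as nn

class EntityAligner(nn.Module):
    def __init__(self, entity_dim=100, word_dim=768):
        super().__init__()
        # Learned map from the entity-embedding space into the word space.
        self.proj = nn.Linear(entity_dim, word_dim)

    def forward(self, word_embeds, entity_embeds, entity_mask):
        # word_embeds:   (batch, seq_len, word_dim)   token embeddings
        # entity_embeds: (batch, seq_len, entity_dim) KG vector per token,
        #                zeros where no entity is linked
        # entity_mask:   (batch, seq_len, 1)          1.0 at entity tokens
        aligned = self.proj(entity_embeds)            # now in the word space
        return word_embeds + entity_mask * aligned    # knowledge injection

# Usage sketch with random tensors
aligner = EntityAligner()
words = torch.randn(2, 16, 768)
entities = torch.randn(2, 16, 100)
mask = torch.zeros(2, 16, 1)
mask[:, 3] = 1.0                                      # pretend token 3 is an entity
enriched = aligner(words, entities, mask)
```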