2023
DOI: 10.21203/rs.3.rs-2582775/v1
Preprint
A BERT-Based model: Improving Crime News Documents Classification through Adopting Pre-trained Language Models

Abstract: Text classification plays a key role in many fields, such as news classification, spam detection, and sentiment analysis. However, the classification of crime news continues to pose challenges, including low efficiency, low precision, and the scarcity of large-scale, high-quality annotated data. Using pre-trained language models, such as Bidirectional Encoder Representations from Transformers (BERT), has reduced the need for extensive amounts of labelled data in the categorization process. BERT boas…

Cited by 1 publication
References 31 publications