2023
DOI: 10.1007/978-981-99-0981-0_21
Comparative Study of Pre-trained Language Models for Text Classification in Smart Agriculture Domain


Cited by 3 publications (1 citation statement)
References 16 publications
“…There are studies concluding that BERT is superior in various text classification tasks. One such study is Yadav et al. (2023) [17], which examined how people perceive smart farming technologies by conducting sentiment analysis on data extracted from YouTube using three transformer models: GPT-2, a BERT model fine-tuned for the agricultural domain, and the BERT variant DistilBERT. The study's findings and statistical evaluations suggest that transformer models are effective for categorizing both technical and agricultural texts.…”
Section: Related Studies
Confidence: 99%