2015
DOI: 10.9787/kjbs.2015.47.4.409
A Roasted Chestnut Cultivar ‘Jangwon’ with High Pellicle Removability

Cited by 5 publications (13 citation statements)
References 0 publications
“…For the foundation models, we utilize three popular BERT-based models: KoBERT (Lee et al., 2020), KoELECTRA (Park, 2020), and KoBigBird (Park and Kim, 2021). For comparing various model architectures, we have also trained single-task models that classify only one of the two tasks: political orientation or level of pro-government stance.…”
Section: Proposed Architecture (mentioning)
confidence: 99%
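The architecture this statement describes (a shared Korean BERT-style encoder with one classification head per task) can be sketched as below. This is not the citing authors' released code; the checkpoint ID and the three-class label counts are assumptions for illustration.

# Minimal sketch of a two-head multi-task classifier over a shared encoder.
# Checkpoint ID and class counts are assumed, not taken from the cited paper.
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

ENCODER_ID = "monologg/koelectra-base-v3-discriminator"  # assumed KoELECTRA checkpoint

class MultiTaskClassifier(nn.Module):
    def __init__(self, encoder_id=ENCODER_ID, n_orientation=3, n_progov=3):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_id)
        hidden = self.encoder.config.hidden_size
        # Two task-specific heads over the shared [CLS] representation.
        self.orientation_head = nn.Linear(hidden, n_orientation)
        self.progov_head = nn.Linear(hidden, n_progov)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] token embedding
        return self.orientation_head(cls), self.progov_head(cls)

tokenizer = AutoTokenizer.from_pretrained(ENCODER_ID)
model = MultiTaskClassifier()
batch = tokenizer(["예시 문장입니다."], return_tensors="pt", padding=True)
orientation_logits, progov_logits = model(batch["input_ids"], batch["attention_mask"])

A single-task baseline of the kind the statement compares against is the same sketch with one head instead of two.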
“…We split the dataset in an 8:1:1 ratio into training, validation, and test sets. Although the training dataset is unbalanced, we compose a test dataset that has a nearly uniform class distribution […] KoBERT (Lee et al., 2020), KoBigBird (Park and Kim, 2021), and KoELECTRA (Park, 2020). We use these models from Hugging Face, which provides various Natural Language Processing (NLP) models and datasets.…”
Section: Experiments Setup (mentioning)
confidence: 99%
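For reference, an 8:1:1 split like the one this statement reports can be produced with the Hugging Face datasets library by carving off 20% and then halving it. The toy data below is a placeholder for the citing paper's (unnamed here) corpus; the seed is an arbitrary choice.

# Minimal sketch of an 8:1:1 train/validation/test split with `datasets`.
from datasets import Dataset

data = Dataset.from_dict({"text": [f"doc {i}" for i in range(100)],
                          "label": [i % 3 for i in range(100)]})

# First split off 20% for evaluation, then halve it into validation and test.
split = data.train_test_split(test_size=0.2, seed=42)
eval_half = split["test"].train_test_split(test_size=0.5, seed=42)
train_ds, val_ds, test_ds = split["train"], eval_half["train"], eval_half["test"]
print(len(train_ds), len(val_ds), len(test_ds))  # 80 10 10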