2020
DOI: 10.11591/eei.v9i5.2443

Cowbree: A novel dataset for fine-grained visual categorization

Abstract: Fine-grained visual categorization (FGVC) deals with objects belonging to one class that differ into subclasses through intra-class variations. FGVC is challenging because it is very difficult to collect enough training samples. This study presents a novel image dataset named Cowbree for FGVC. The Cowbree dataset contains 4000 images belonging to eight different cow breeds. Images are properly categorized under different breed names (labels) based on different texture and color features with the help of experts. While…

Cited by 3 publications (2 citation statements). References 17 publications.
“…These vectors allow different features to be assembled into a compact vector [13]. Principal component analysis (PCA) also allows the dimension of the feature vector to be reduced [2,7,21]. The authors in [22] studied the different reductions related to PCA applied to scale-invariant feature transform (SIFT) and Speeded Up Robust Features (SURF) descriptors on Wang (http://wang.ist.psu.edu/docs/ related/, accessed on 20 March 2021) and Coil100 (https://www1.cs.columbia.edu/CAVE/ software/softlib/coil-100.php, accessed on 20 March 2021) databases and concluded that the optimal reduction was at 70%.…”
Section: Dimensionality Reduction (mentioning). Confidence: 99%
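The citation statement above describes PCA-based reduction of SIFT/SURF feature vectors, with the cited study finding an optimal reduction of 70% (i.e., keeping roughly 30% of the original dimensions). A minimal numpy-only sketch of that idea, with made-up data standing in for real descriptors (the function name and sizes are assumptions, not from the cited work):

```python
import numpy as np

def pca_reduce(X, keep_ratio=0.3):
    """Project rows of X onto the top principal components, keeping
    keep_ratio of the original dimensionality (70% reduction -> keep 30%)."""
    Xc = X - X.mean(axis=0)                    # center each feature
    # SVD of the centered data; rows of Vt are the principal axes
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    k = max(1, int(round(X.shape[1] * keep_ratio)))
    return Xc @ Vt[:k].T                       # projected features, shape (n, k)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 128))                # e.g. 200 SIFT descriptors, 128-D
Z = pca_reduce(X, keep_ratio=0.3)
print(Z.shape)                                 # (200, 38)
```

In practice the components would be fit on the training descriptors only and then applied to test descriptors, but the projection step itself is just the matrix product shown.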
“…KNN is used for implementing the feature similarity to predict the value for new data point. It can also match the training dataset for analysing state assumptions [40]. Working procedure for KNN:…”
Section: 4 KNN Classifier (mentioning). Confidence: 99%
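The KNN working procedure referenced in the citation statement above — predicting a new point's label from the feature similarity of its nearest training samples — can be sketched as follows. This is a generic majority-vote KNN on toy 2-D data, not the cited paper's implementation; the dataset and parameter values are assumptions for illustration:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Predict the label of x by majority vote among its k nearest
    training points, using Euclidean distance as the similarity measure."""
    dists = np.linalg.norm(X_train - x, axis=1)   # distance to every training sample
    nearest = np.argsort(dists)[:k]               # indices of the k closest samples
    votes = y_train[nearest]
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]              # most frequent label wins

# Toy data: two clusters standing in for two breed classes
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.95, 1.0])))  # 1
```

Choosing k odd avoids ties in binary problems; with image features, the distance would be computed over the extracted texture/color vectors rather than raw coordinates.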