2013 IEEE International Conference on Computer Vision
DOI: 10.1109/iccv.2013.105
Heterogeneous Auto-similarities of Characteristics (HASC): Exploiting Relational Information for Classification

Abstract: Capturing the essential characteristics of visual objects by considering how their features are inter-related is a recent philosophy of object classification. In this paper, we embed this principle in a novel image descriptor, dubbed Heterogeneous Auto-Similarities of Characteristics (HASC). HASC is applied to heterogeneous dense feature maps, encoding linear relations by covariances and nonlinear associations through information-theoretic measures such as mutual information and entropy. In this way, highly c…

Cited by 15 publications (7 citation statements)
References 24 publications (48 reference statements)
“…The feature vector is obtained by calculating the pattern's distance to the set of prototypes, P. This feature vector is fed into the SVM to determine its class. Both the original images in the data sets and the HASC [42] descriptors (outlined in Section 2.5) serve as the input to the classification process.…”
Section: Proposed System
Mentioning confidence: 99%
“…HASC [42] is a local descriptor designed to capture the linear covariances (COV) and nonlinear entropy combined with mutual information (EMI) relational characteristics of an object. Some of the advantages of covariance matrices as descriptors include their low dimension, robustness to noise, and their ability to capture the features of the joint PDF.…”
Section: Proposed System
Mentioning confidence: 99%
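The COV/EMI construction this citation describes can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the three toy channels (intensity plus x/y gradients) and the 16-level quantization used for the entropy and mutual-information estimates are assumptions made for the example.

```python
import numpy as np

def cov_descriptor(features):
    """COV part: features is an (n_pixels, d) stack of dense feature maps.
    Returns the upper triangle of the d x d covariance matrix."""
    C = np.cov(features, rowvar=False)
    iu = np.triu_indices(C.shape[0])
    return C[iu]

def emi_descriptor(features, bins=16):
    """EMI part: marginal entropies for each channel, pairwise mutual
    information between channels, estimated from quantized values."""
    n, d = features.shape
    q = np.empty((n, d), dtype=int)
    for j in range(d):  # quantize each channel into `bins` levels
        edges = np.histogram_bin_edges(features[:, j], bins=bins)
        q[:, j] = np.digitize(features[:, j], edges[1:-1])
    out = []
    for i in range(d):
        for j in range(i, d):
            joint, _, _ = np.histogram2d(q[:, i], q[:, j], bins=bins)
            p = joint / joint.sum()
            pi, pj = p.sum(axis=1), p.sum(axis=0)
            if i == j:
                nz = pi > 0
                out.append(-np.sum(pi[nz] * np.log2(pi[nz])))  # entropy H(x_i)
            else:
                nz = p > 0
                ratio = p[nz] / (pi[:, None] * pj[None, :])[nz]
                out.append(np.sum(p[nz] * np.log2(ratio)))     # MI(x_i, x_j)
    return np.array(out)

# Toy dense feature maps: intensity plus vertical/horizontal gradients.
rng = np.random.default_rng(0)
patch = rng.random((16, 16))
gy, gx = np.gradient(patch)
F = np.stack([patch.ravel(), gx.ravel(), gy.ravel()], axis=1)  # (256, 3)

hasc = np.concatenate([cov_descriptor(F), emi_descriptor(F)])
```

For d channels, both parts have length d(d+1)/2 (here 6 + 6 = 12); the covariance half captures linear relations among the maps, while the entropy/MI half captures nonlinear associations, matching the linear/nonlinear split described in the abstract.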
“…• RICLBP [29], a multi-scale rotation invariant co-occurrence of adjacent LBP with values (R = 1, P = 8), (R = 2, P = 8) and (R = 4, P = 8). • MLPQ [30]. • HASC [31], applied to heterogeneous dense feature maps. • BSIF [23], where the standard BSIF descriptor is extracted by projecting sub-windows of the entire image onto sub-spaces.…”
Section: Step V3: Texture Descriptors
Mentioning confidence: 99%
“…Each LPQ trains a different classifier whose results are combined by sum rule. This ensemble is built up with 105 descriptors, and the scores are summed and normalised by dividing the sum by 105. Heterogeneous auto-similarities of characteristics (HASC) [31] are applied to heterogeneous dense feature maps. BSIF [23], where the standard BSIF descriptor is extracted by projecting sub-windows of the entire image onto sub-spaces. The images are binarised (using a threshold th) and a histogram is built (see [32] for details).…”
Section: Audio Image Representation
Mentioning confidence: 99%
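The sum-rule fusion mentioned in this citation (each descriptor variant trains its own classifier; per-class scores are summed and normalised by the ensemble size) can be sketched as below. The score shapes and the random scores are assumptions for the example, not the paper's actual classifiers.

```python
import numpy as np

def sum_rule(scores):
    """Sum-rule fusion: scores has shape (n_classifiers, n_samples,
    n_classes). Sums per-class scores across classifiers, normalises by
    the ensemble size, and returns fused scores plus predicted labels."""
    fused = scores.sum(axis=0) / scores.shape[0]
    return fused, fused.argmax(axis=1)

# Hypothetical ensemble: 105 classifiers, 4 test samples, 3 classes,
# each classifier emitting scores in [0, 1).
rng = np.random.default_rng(1)
scores = rng.random((105, 4, 3))
fused, labels = sum_rule(scores)
```

Dividing by the number of classifiers keeps the fused scores on the same [0, 1) scale as the individual classifier outputs, so ensembles of different sizes remain comparable.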