2007 IEEE 23rd International Conference on Data Engineering Workshop
DOI: 10.1109/icdew.2007.4400998
Learning the Relative Importance of Features in Image Data

Cited by 18 publications (11 citation statements)
References 10 publications
“…This study aims at reducing the dimensionality of the dataset to reduce the computational load in further processing [2]. The proposed method ranks features for learning a distance function in order to capture the semantics of the dataset [3]. It also uses the orthogonality properties of wavelets to decompose the dataset into spaces of coarse and detailed signals.…”
Section: Discussion
confidence: 99%
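The wavelet step quoted above can be made concrete with a short sketch. The following is a generic illustration, not code from the cited works: it uses PyWavelets with an arbitrarily chosen orthogonal wavelet ('haar') and a made-up feature vector to split the data into coarse (approximation) and detail coefficients, which is the decomposition the excerpt refers to.

```python
# Hypothetical sketch: one-level orthogonal DWT splitting a feature vector
# into a coarse (approximation) part and a detail part.
import numpy as np
import pywt

features = np.random.rand(64)       # stand-in for one image's feature vector

# 'haar' is an orthogonal wavelet; the excerpt does not say which wavelet is used.
coarse, detail = pywt.dwt(features, 'haar')
print(coarse.shape, detail.shape)   # each band holds half of the coefficients

# Orthogonality means the original vector is recovered exactly from both bands.
reconstructed = pywt.idwt(coarse, detail, 'haar')
assert np.allclose(reconstructed, features)
```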
“…Though domain experts may have subjective notions of similarity for comparison, they seldom have a distance function that ranks the image features based on their relative importance. The proposed method ranks features for learning such a distance function in order to capture the semantics of the images [3].…”
Section: Research and Methodology
confidence: 99%
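One plain reading of "ranking features for learning a distance function" is a per-feature weighted distance whose weights express relative importance. The sketch below is an assumption-laden illustration of that idea, not the method of [3]; the weight vector w and the descriptor values are made up.

```python
# Minimal sketch of a feature-weighted distance; in [3]-style approaches the
# weights would be learned, here they are simply invented for illustration.
import numpy as np

def weighted_distance(x, y, w):
    """Euclidean distance with per-feature weights (larger w = more important)."""
    diff = np.asarray(x) - np.asarray(y)
    return float(np.sqrt(np.sum(w * diff ** 2)))

w = np.array([0.7, 0.2, 0.1])   # hypothetical learned importances
a = np.array([0.9, 0.1, 0.5])   # e.g. color, texture, shape descriptors
b = np.array([0.4, 0.1, 0.6])
print(weighted_distance(a, b, w))
```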
“…The discrete Meyer adaptive wavelet (DMAW) is both translation- and scale-invariant and can represent a signal in a multi-scale format. While DMAW is not the best fit for the entropy criterion, it is well suited for the proposed compression and cancellation purposes [8][9][10].…”
Section: Theory of DWT-Based Filters for Noise Suppression and Order Reduction
confidence: 99%
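For the multi-scale representation mentioned in this excerpt, PyWavelets ships a discrete Meyer FIR approximation under the name 'dmey'. The sketch below is only an illustration of a plain multi-level DWT with that wavelet; it is not the adaptive DMAW of the cited work, and the signal and decomposition level are arbitrary.

```python
# Sketch: three-level decomposition of a noisy 1-D signal with the discrete
# Meyer wavelet ('dmey'), followed by a crude denoising/compression step that
# zeros the finest detail band before reconstruction.
import numpy as np
import pywt

t = np.linspace(0.0, 1.0, 512)
signal = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(512)

coeffs = pywt.wavedec(signal, 'dmey', level=3)   # [cA3, cD3, cD2, cD1]
for i, band in enumerate(coeffs):
    print(f"band {i}: {len(band)} coefficients")

coeffs[-1] = np.zeros_like(coeffs[-1])           # drop the finest details
smoothed = pywt.waverec(coeffs, 'dmey')
```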
“…shape, color, texture or text annotations. If we are to use more than one type of feature, we have the problem of finding an appropriate weighting parameter w. Research on combining two or more features tends either to assume that labeled training data is available [21][20], or to consider specialized domains where the value of w can be determined once and fixed forever. However, it is clear that for the general problem of manuscript annotation the best value for w is highly data-dependent.…”
Section: Background and Related Work
confidence: 99%
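The weighting problem in this excerpt reduces to combining two (or more) per-feature-type distances with a single mixing parameter w. A minimal sketch under that assumption, with made-up distance values:

```python
# Sketch of combining a shape distance and a color distance with one weight w.
# The value of w is the data-dependent quantity the excerpt says is hard to fix.
def combined_distance(d_shape, d_color, w):
    """Convex combination of two per-feature-type distances, with 0 <= w <= 1."""
    return w * d_shape + (1.0 - w) * d_color

print(combined_distance(0.8, 0.1, w=1.0))   # shape only -> 0.8
print(combined_distance(0.8, 0.1, w=0.0))   # color only -> 0.1
print(combined_distance(0.8, 0.1, w=0.5))   # equal mix  -> 0.45
```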
“…For the other extreme, imagine we are matching heraldic shields as in Figure 4 (right); here there is very little variation in shape (and none of it meaningful), and we would wish the algorithm to consider color only. There are many existing techniques for learning this mixing parameter w if we have access to subjective similarity judgments [20][16][21]. While we would not rule out human interaction to refine a distance measure in an important domain, the scale of the problems we wish to eventually consider means that we would like a completely automated system to at least bootstrap the process and produce an initial high-quality measure.…”
Section: Background and Related Work
confidence: 99%
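To make "learning this mixing parameter w from subjective similarity judgments" concrete, here is a generic grid-search sketch over pairwise judgments ("the first pair should be closer than the second"); it is a hypothetical illustration, not the procedure of [20], [16], or [21], and all numbers are invented.

```python
# Hypothetical sketch: choose the w that agrees with the most human judgments.
import numpy as np

def combined_distance(d_shape, d_color, w):
    return w * d_shape + (1.0 - w) * d_color

# Each judgment holds precomputed (shape, color) distances for a pair judged
# similar and a pair judged dissimilar; the similar pair should come out closer.
judgments = [
    ((0.2, 0.9), (0.8, 0.3)),
    ((0.1, 0.7), (0.6, 0.2)),
    ((0.3, 0.8), (0.9, 0.4)),
]

def agreement(w):
    hits = sum(
        combined_distance(*similar, w) < combined_distance(*dissimilar, w)
        for similar, dissimilar in judgments
    )
    return hits / len(judgments)

best_w = max(np.linspace(0.0, 1.0, 101), key=agreement)
print(best_w, agreement(best_w))
```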