2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
DOI: 10.1109/ijcnn.2008.4633966

Multi-label imbalanced data enrichment process in neural net classifier training

Abstract: Semantic scene classification, robotic state recognition, and many other real-world applications involve multilabel classification with imbalanced data. In this paper, we address these problems by using an enrichment process in neural net training. The enrichment process can manage the imbalanced data and train the neural net with high classification accuracy. Experimental results on a robotic arm controller show that our method has better generalization performance than traditional neural net training in solv…

Cited by 18 publications (13 citation statements)
References 18 publications
“…The laxity is randomly generated in the intervals of [1-5], [5-10], and [10-15] time units. The execution time and the task size are randomly selected from the fixed interval of [1-10] time units and from the interval of [5-10] reconfigurable tiles, respectively.…”
Section: Results
confidence: 99%
“…The task sizes are randomly generated in the intervals of [5-10], [10-15], [15-20], and [20-25] reconfigurable tiles for the fixed interval of [1-10] time units for laxity and of [1-10] time units for execution time. It can be seen from Figure 9 that for small task sizes (interval [5-10]), both rejection rates are low (in the range of a couple percent). Increasing the task size up to [20-25] increases the rejected task number up to 50% due to the space constraints, while the rejection rate due to the deadlines decreases down to z…”
Section: Results
confidence: 99%
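The two excerpts above describe the same experimental setup: task laxity, execution time, and size are drawn uniformly at random from fixed integer intervals. A minimal sketch of that generation step follows; the interval endpoints come from the quotes, while the function and field names are illustrative assumptions, not the citing paper's code.

```python
# Minimal sketch of the quoted task-generation setup; names are illustrative.
import random

def generate_task(laxity_range=(1, 5), exec_range=(1, 10), size_range=(5, 10)):
    """Draw one task: laxity/execution time in time units, size in tiles."""
    return {
        "laxity": random.randint(*laxity_range),    # e.g. interval [1-5]
        "exec_time": random.randint(*exec_range),   # fixed interval [1-10]
        "size_tiles": random.randint(*size_range),  # e.g. interval [5-10]
    }

tasks = [generate_task() for _ in range(100)]
```

Varying one interval at a time (e.g. task sizes from [5-10] up to [20-25]) while holding the others fixed reproduces the sweeps the excerpts report.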
“…Additionally, the weights connecting this hidden layer with the output layer are adjusted by applying an individual bias for each label. Also based on ANNs, an iterative enrichment process is proposed in [8]. The authors initialize the ANN by clustering part of the data, after which resampling is performed over each cluster in order to balance them in Euclidean space.…”
Section: MLC Algorithmic Adaptation Proposals
confidence: 99%
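The excerpt above summarizes the paper's enrichment initialization: cluster part of the data, then resample within each cluster to balance the clusters. A rough sketch of that idea, assuming scikit-learn's KMeans and simple random oversampling; the function name, cluster count, and oversampling rule are assumptions, not the authors' implementation.

```python
# Rough sketch of a cluster-then-resample initialization; illustrative only.
import numpy as np
from sklearn.cluster import KMeans

def enrich_init(X, n_clusters=10, seed=0):
    """Cluster the data, then oversample each cluster to the largest cluster's size."""
    rng = np.random.default_rng(seed)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(X)
    target = np.bincount(labels).max()              # size of the largest cluster
    balanced = []
    for c in range(n_clusters):
        members = np.flatnonzero(labels == c)
        # Sample with replacement so every cluster contributes `target` points.
        balanced.append(X[rng.choice(members, size=target, replace=True)])
    return np.vstack(balanced)
```

The balanced pool would then be used to train the neural net, so that no region of the input space dominates the early updates.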
“…It is also present in multilabel classification (MLC), since labels are unevenly distributed in most MLDs. To deal with imbalance in MLC, methods based on algorithmic adaptations [6][7][8], the use of ensembles [9,10], and resampling techniques [11][12][13] have been proposed.…”
Section: Introduction
confidence: 99%
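Of the three routes the excerpt lists, resampling is the simplest to sketch. Below is a minimal multi-label random oversampling illustration, not any specific published method: instances carrying under-represented labels are cloned until each label reaches the mean label frequency. The names and the stopping rule are assumptions.

```python
# Minimal sketch of multi-label random oversampling; illustrative assumptions.
import numpy as np

def ml_random_oversample(X, Y, seed=0):
    """X: (n, d) features; Y: (n, q) binary label matrix. Returns enlarged copies."""
    rng = np.random.default_rng(seed)
    counts = Y.sum(axis=0)              # per-label frequencies
    target = counts.mean()              # fixed target from the original data
    X_parts, Y_parts = [X], [Y]
    for label in np.flatnonzero(counts < target):
        carriers = np.flatnonzero(Y[:, label] == 1)
        if carriers.size == 0:
            continue                    # label never appears; nothing to clone
        picks = rng.choice(carriers, size=int(target - counts[label]), replace=True)
        X_parts.append(X[picks])
        Y_parts.append(Y[picks])
    return np.vstack(X_parts), np.vstack(Y_parts)
```

One known trade-off of this scheme: cloning an instance for one minority label also duplicates every other label it carries, so majority labels grow as well.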