2012
DOI: 10.7763/ijmlc.2012.v2.237
Learning of Neural Network with Reduced Interference – An Ensemble Approach

Abstract: This paper focuses on reducing the interference effect among input attributes. When different attributes are trained together, negative effects may arise among them due to interference. To reduce this interference, input attributes are placed into different groups such that attributes that do not interfere with one another are placed in the same group. Two types of grouping strategies are examined in this paper: non-overlapping and overlapping. To further enhance the performance, multiple learners…

Cited by 5 publications
(2 citation statements)
References 19 publications
“…In this section, we present the numerical performance of the proposed hyperbolic divergence test for distribution testing and compare it with other competing tests such as energy distance (ED; Székely & Rizzo, 2004), maximum mean discrepancy (MMD; Gretton et al, 2012), ball divergence (BD; Pan et al, 2018), and projective ensemble (PE; Li & Zhang, 2020). The projection‐averaging strategy (Kim, Balakrishnan & Wasserman, 2020) was not used here because the power performance of the projection‐averaging test method is equivalent to that of the PE approach but requires more computation (Li & Zhang, 2020). In our simulations, we employed the Gaussian kernel with the median heuristic for the MMD, and set κ=1 for the proposed method.…”
Section: Simulation Studies
confidence: 99%
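The statement above mentions the maximum mean discrepancy (MMD) with a Gaussian kernel and the median heuristic for the bandwidth. As an illustrative sketch only (not code from any of the cited papers), a biased MMD² estimate with that kernel choice can be computed as follows; here the median heuristic is taken as the median of the pooled pairwise Euclidean distances:

```python
import numpy as np

def mmd2_gaussian(x, y):
    """Biased MMD^2 estimate between samples x, y (n x d arrays),
    using a Gaussian kernel with a median-heuristic bandwidth."""
    z = np.vstack([x, y])
    # pairwise squared Euclidean distances among all pooled points
    sq = np.sum((z[:, None, :] - z[None, :, :]) ** 2, axis=2)
    # median heuristic: bandwidth = median of nonzero pairwise distances
    sigma = np.median(np.sqrt(sq[sq > 0]))
    k = np.exp(-sq / (2 * sigma ** 2))
    n = len(x)
    kxx, kyy, kxy = k[:n, :n], k[n:, n:], k[:n, n:]
    # biased estimate: E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)]
    return kxx.mean() + kyy.mean() - 2 * kxy.mean()

rng = np.random.default_rng(0)
a = rng.normal(0, 1, size=(100, 2))
b = rng.normal(0, 1, size=(100, 2))
c = rng.normal(2, 1, size=(100, 2))
print(mmd2_gaussian(a, b))  # small: same distribution
print(mmd2_gaussian(a, c))  # larger: shifted mean
```

In a two-sample test, this statistic would then be compared against a permutation-based null distribution; the sketch shows only the statistic itself.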
“…The metric in a unit hypersphere is the inverse cosine function of the inner product of two unit vectors. Recently, Li & Zhang (2020) and Kim, Balakrishnan & Wasserman (2020) investigated nonparametric two‐sample tests in which Euclidean data were expressed with the spherical model. A natural question is whether and how we may portray Euclidean data using the hyperbolic model.…”
Section: Introduction
confidence: 99%