2018
DOI: 10.1109/tnnls.2017.2695223

Structure Learning for Deep Neural Networks Based on Multiobjective Optimization

Abstract: This paper focuses on the connecting structure of deep neural networks and proposes a layerwise structure learning method based on multiobjective optimization. A model with better generalization can be obtained by reducing the connecting parameters in deep networks. The aim is to find the optimal structure with high representation ability and better generalization for each layer. Then, the visible data are modeled with respect to structure based on the products of experts. In order to mitigate the difficulty o…
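
The abstract frames each layer's structure as a trade-off between two objectives: the number of connecting parameters (fewer favors generalization) and the layer's representation ability. Below is a minimal sketch of scoring one candidate structure on both objectives, assuming a binary connection mask and mean-squared reconstruction error as the representation proxy; the function and variable names are illustrative, not the paper's formulation.

```python
# Bi-objective scoring of one candidate layer structure (a sketch under
# assumed conventions: binary mask prunes connections, reconstruction
# error stands in for representation ability).
import numpy as np

def layer_objectives(mask, weights, data):
    """mask    : binary matrix (visible x hidden), 1 keeps a connection
    weights : real-valued weight matrix of the same shape
    data    : visible samples, one per row
    Returns (n_connections, reconstruction_error), both to be minimized
    (representation ability is maximized by minimizing the error)."""
    w = mask * weights                    # prune masked-out connections
    hidden = np.tanh(data @ w)            # encode the visible data
    recon = np.tanh(hidden @ w.T)         # decode back to visible space
    error = np.mean((data - recon) ** 2)  # representation objective
    return int(mask.sum()), float(error)  # sparsity objective, error

rng = np.random.default_rng(0)
data = rng.standard_normal((64, 10))
weights = rng.standard_normal((10, 5))
mask = (rng.random((10, 5)) < 0.5).astype(float)
print(layer_objectives(mask, weights, data))
```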

Cited by 100 publications (41 citation statements); references 46 publications.
“…Specifically, the parameter ρ is compared with a random value drawn uniformly from [0, 1]. If ρ is larger than the random value, the binary vectors and real vectors of the parents are reduced by (1) and (3), respectively, and the offspring solutions are generated in the Pareto optimal subspace and then recovered by (2) and (4); otherwise, the offspring solutions are generated in the original search space without using the RBM or DAE. Algorithm 2 summarizes the procedure of the offspring generation strategy.…”
Section: B. Pareto Optimal Subspace Learning and Offspring Generation (mentioning)
confidence: 99%
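
The excerpt describes a ρ-controlled switch: with probability governed by ρ, the parents are reduced into the learned Pareto optimal subspace, varied there, and recovered; otherwise variation happens directly in the original search space. A minimal sketch of that control flow follows, with generic encode/decode callables standing in for the trained RBM or DAE and a generic variation operator; all names here are illustrative assumptions, not the cited algorithm's code.

```python
# rho-controlled offspring generation (a sketch under assumed interfaces).
import random

def generate_offspring(parents, rho, encode, decode, vary):
    """parents : list of solution vectors
    rho     : probability of searching in the learned subspace
    encode  : maps a solution into the low-dimensional subspace (RBM/DAE)
    decode  : recovers a subspace point back to the original space
    vary    : variation operator (e.g., crossover + mutation)"""
    if rho > random.random():
        # reduce the parents, vary them in the Pareto optimal subspace,
        # then recover the offspring to the original search space
        reduced = [encode(p) for p in parents]
        return [decode(c) for c in vary(reduced)]
    # otherwise vary the parents directly in the original search space
    return vary(parents)

# toy usage: identity "models" and a Gaussian-jitter variation operator
offspring = generate_offspring(
    [[0.2, 0.8], [0.5, 0.1]], rho=0.9,
    encode=lambda p: p, decode=lambda c: c,
    vary=lambda ps: [[x + random.gauss(0, 0.1) for x in p] for p in ps])
print(offspring)
```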
“…In the era of big data, complicated data abound in many research fields and real-world applications, raising a variety of optimization problems that have multiple objectives and a large number of decision variables [1]-[3]. These large-scale multiobjective optimization problems (LMOPs) present a huge search space that grows exponentially with the number of decision variables, posing stiff challenges for evolutionary algorithms to efficiently approximate the Pareto optimal solutions [4].…”
Section: Introduction (mentioning)
confidence: 99%
“…Artificial neural networks have been utilized in different engineering and science fields such as control, data processing [17], robotics [18], function approximation [19], and pattern and speech recognition [20]. An ANN consists of interconnected neurons organized into three layers, namely, the input layer, the hidden layer, and the output layer [21]. There can be more than one hidden layer in an ANN, making it more flexible and accurate at the cost of learning time and effort.…”
Section: Neural Network (mentioning)
confidence: 99%
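
A minimal sketch of the three-layer organization described in the excerpt: an input layer feeding one hidden layer feeding an output layer of interconnected neurons. The layer sizes and the tanh activation are illustrative assumptions.

```python
# Forward pass through input -> hidden -> output (a sketch; sizes and
# activation are assumed, not taken from the cited paper).
import numpy as np

def forward(x, w_hidden, w_output):
    """One forward pass through a three-layer ANN."""
    h = np.tanh(x @ w_hidden)     # hidden layer activations
    return np.tanh(h @ w_output)  # output layer activations

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 4))          # input layer: 4 features
w_hidden = rng.standard_normal((4, 8))   # input -> hidden weights
w_output = rng.standard_normal((8, 2))   # hidden -> output weights
print(forward(x, w_hidden, w_output))
```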
“…Evolutionary optimization has been leveraged both to design the network topology and to optimize the weights in the network [9]-[11]. Building upon these prior works [12], [13], we extend our evolutionary optimization algorithm to include multiple objectives. In [12], the primary objectives are 1) sparsity of the network and 2) classification accuracy.…”
Section: B. Multi-Objective Optimization in Deep Learning (mentioning)
confidence: 99%
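
With network sparsity and classification accuracy as the two objectives, candidates are compared by Pareto dominance rather than a single score. A minimal sketch of that comparison follows, assuming both objectives are recast for minimization (connection count and classification error); the tuple layout is an illustrative assumption, not the cited works' representation.

```python
# Pareto dominance over (n_connections, classification_error), both
# minimized; a sketch, not the cited papers' implementation.

def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def pareto_front(candidates):
    """Keep the candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o != c)]

# toy usage: three networks scored by (connections, error)
print(pareto_front([(120, 0.08), (90, 0.10), (150, 0.09)]))
```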