2016
DOI: 10.1109/tnnls.2015.2479223
Deep Learning of Part-Based Representation of Data Using Sparse Autoencoders With Nonnegativity Constraints

Abstract: We demonstrate a new deep learning autoencoder network, trained by a nonnegativity constraint algorithm (NCAE), that learns features exhibiting a part-based representation of data. The learning algorithm is based on constraining negative weights. The performance of the algorithm is assessed on its ability to decompose data into parts, and its prediction performance is tested on three standard image data sets and one text data set. The results indicate that the nonnegativity constraint forces the autoencoder to l…
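The idea summarized in the abstract — penalizing negative weights so the autoencoder learns additive, part-based features — can be sketched as an extra penalty term on the reconstruction loss. The snippet below is a minimal NumPy illustration of that penalty, not the paper's implementation; all array shapes, the function name `ncae_style_loss`, and the weight `alpha` are made-up values for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data and weights (shapes and values are illustrative only).
X = rng.random((8, 6))                  # 8 samples, 6 features
W1 = rng.normal(0.0, 0.1, (6, 4))       # encoder weights: 6 -> 4
W2 = rng.normal(0.0, 0.1, (4, 6))       # decoder weights: 4 -> 6

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ncae_style_loss(X, W1, W2, alpha=0.03):
    """Reconstruction error plus a quadratic penalty on negative weights."""
    H = sigmoid(X @ W1)                 # hidden representation
    X_hat = sigmoid(H @ W2)             # reconstruction
    recon = 0.5 * np.mean(np.sum((X_hat - X) ** 2, axis=1))
    # Penalize only the negative entries of the weight matrices; driving
    # them toward zero encourages additive, part-based decompositions.
    neg = np.sum(np.minimum(W1, 0.0) ** 2) + np.sum(np.minimum(W2, 0.0) ** 2)
    return recon + 0.5 * alpha * neg

print(ncae_style_loss(X, W1, W2))
```

With all-nonnegative weights (e.g. `np.abs(W1)`, `np.abs(W2)`), the penalty term vanishes and the loss reduces to the reconstruction error alone.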

Cited by 188 publications (106 citation statements). References 27 publications.
“…
Application Scenario          Solution Method            Optimization Goal                                 Refs
Renewable energy generation   Stochastic optimization    Handling data uncertainties of renewable energy   [10]-[12]
                              Robust optimization                                                          [14]-[17]
Wind power forecasting        Linear methods             Increasing the accuracy of the prediction model   [19], [20]
                              Nonlinear methods                                                            [24]-[27]
Microgrid management          Ordinary decision theory   Optimizing energy-scheduling strategies           [28]-[30]
                              Noncooperative games                                                         [33]-[36]
                              Cooperative games                                                            [37]-[40]

…and robust optimization [9]. On the one hand, stochastic optimization provides an effective framework to optimize statistical objective functions when the uncertain numerical data are assumed to follow a known probability distribution.…”
Section: Application Scenarios, Solution Methods, Optimization Goals (mentioning)
confidence: 99%
“…To efficiently handle complex, unlabeled, and high-dimensional time-series data, deep learning has been proposed in Ref. [26]. As an essential deep learning architecture, the SAE plays a fundamental role in unsupervised learning, and its objective function can be minimized efficiently via fast back-propagation [27].…”
Section: Application Scenarios, Solution Methods, Optimization Goals (mentioning)
confidence: 99%
“…The autoencoder is ineffective for dimensionality reduction and key-feature extraction if the number of hidden nodes equals or exceeds the number of input nodes, that is, L ≥ n. To address this, sparsity constraints are imposed on the hidden layer so that representative features and useful structures can be learned from the input data [36][37][38][39]. This yields sparse representations of the inputs and is useful for pre-training in many tasks.…”
Section: Sparse Autoencoder (mentioning)
confidence: 99%
“…We minimize the following loss function, which imposes a sparsity constraint on the reconstruction error, to obtain the optimal parameters of the sparse autoencoder [36][37][38][39]:…”
Section: Sparse Autoencoder (mentioning)
confidence: 99%
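The loss function referred to in the excerpt above is truncated in this page. A common formulation for a sparse autoencoder combines the reconstruction error with a KL-divergence sparsity penalty on the mean hidden activations; the sketch below assumes that standard formulation rather than the citing paper's exact equation, and the symbols `rho`, `beta`, and all array shapes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((10, 5))                 # toy inputs: 10 samples, 5 features
W1 = rng.normal(0.0, 0.1, (5, 3))       # encoder weights: 5 -> 3
W2 = rng.normal(0.0, 0.1, (3, 5))       # decoder weights: 3 -> 5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def kl_sparsity(rho, rho_hat):
    """KL divergence between target activation rho and mean activation rho_hat."""
    return np.sum(rho * np.log(rho / rho_hat)
                  + (1.0 - rho) * np.log((1.0 - rho) / (1.0 - rho_hat)))

def sparse_ae_loss(X, W1, W2, rho=0.05, beta=3.0):
    """Reconstruction error plus a weighted sparsity penalty."""
    H = sigmoid(X @ W1)
    X_hat = sigmoid(H @ W2)
    recon = 0.5 * np.mean(np.sum((X_hat - X) ** 2, axis=1))
    rho_hat = H.mean(axis=0)            # average activation of each hidden unit
    return recon + beta * kl_sparsity(rho, rho_hat)

print(sparse_ae_loss(X, W1, W2))
```

When every hidden unit's mean activation equals the target `rho`, the KL term is zero and only the reconstruction error remains; activations far from `rho` are penalized, which is what pushes the hidden code toward sparsity.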