2019
DOI: 10.1016/j.neunet.2019.07.005

Efficient training of interval Neural Networks for imprecise training data

Abstract: In this paper we attempt to build upon past work on Interval Neural Networks, and provide a robust way to train and quantify the uncertainty of Deep Neural Networks. Specifically, we propose a back propagation algorithm for Neural Networks with constant width predictions. In order to maintain numerical stability we propose minimising the maximum of the batch of errors at each step. Our approach can accommodate incertitude in the training data, and therefore adversarial examples from a commonly used attack mode…
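The abstract describes constant-width interval predictions trained by minimising the maximum error over each batch. The paper's own algorithm is not reproduced on this page, so the following is only a minimal PyTorch sketch of that general idea; the names IntervalNet and minimax_step, the log-parameterised half-width, and the 0.1 width penalty are illustrative assumptions, not the authors' method.

```python
import torch
import torch.nn as nn

class IntervalNet(nn.Module):
    """Point-prediction network plus one learnable constant half-width,
    so every prediction is the interval [f(x) - w, f(x) + w]."""
    def __init__(self, in_dim, hidden=32):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )
        # log-parameterised half-width keeps w > 0 (an assumption here)
        self.log_w = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        centre = self.f(x)
        w = torch.exp(self.log_w)
        return centre - w, centre + w

def minimax_step(model, opt, x, y_lo, y_hi):
    """One update minimising the *maximum* violation over the batch:
    each (possibly interval-valued) target [y_lo, y_hi] should be
    enclosed by the predicted interval, and the width is penalised."""
    lo, hi = model(x)
    # enclosure violations; zero when the target interval is covered
    viol = torch.relu(lo - y_lo) + torch.relu(y_hi - hi)
    loss = viol.max() + 0.1 * torch.exp(model.log_w)  # ad hoc width penalty
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

For point-valued data one simply sets y_lo = y_hi = y; imprecise observations enter through a nonzero gap, which is how a scheme like this can accommodate the incertitude the abstract mentions. Taking the batch maximum rather than the mean concentrates each update on the worst-covered example, which matches the numerical-stability motivation stated in the abstract.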

Cited by 24 publications (15 citation statements)
References 29 publications
“…The number of terms in (43) increases with each iteration because, unlike standard gradient descent, the updating rules for momentum include the functions S and D from previous iterations. However, expression (43) depends on the previously computed ∂W/∂Y and ∂b/∂Y and has the same neighboring derivatives as in formula (22), which need only be computed once. In addition, we use the relationships between the derivatives…”
Section: Gradient Descent With Momentum
mentioning (confidence: 99%)
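Equations (43) and (22) of the citing paper are not visible here, but the passage's point, that differentiating through a momentum update drags in quantities from every previous iteration, can be illustrated with a generic heavy-ball update. The sketch below is not the cited derivation, just the standard recursion it refers to.

```python
import numpy as np

def sgd_momentum(grad, theta0, lr=0.01, beta=0.9, steps=100):
    """Generic gradient descent with momentum: the velocity v folds in
    every past gradient with geometrically decaying weight, which is why
    derivatives from previous iterations keep reappearing in any
    expression differentiated through this update rule."""
    theta = theta0.copy()
    v = np.zeros_like(theta0)
    for _ in range(steps):
        v = beta * v - lr * grad(theta)  # accumulates gradient history
        theta = theta + v
    return theta
```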
“…The main feature of these proposed methods is that they exploit separate expressions for the lower and upper limits of the interval outputs in the learning algorithm. In [43], Sadeghi et al. presented an alternative way to efficiently train interval neural networks for imprecise data using the framework of interval predictor models [44,45]. Unlike standard regression techniques, these models try to bound an envelope of the data using a conceptualization of fit different from the traditional least-squares criterion, which produces interval estimates for the regression even from point-valued input data.…”
Section: Introduction
mentioning (confidence: 99%)
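As a point of reference for the interval-predictor-model framework mentioned above: the usual formulation fits an interval-valued map by minimising its average width subject to containing every observation, which for a linear parameterisation is a linear program. The sketch below assumes that standard formulation; the name fit_ipm, the centre-plus-width parameterisation, and the mean-width objective are illustrative choices, not necessarily those of [44,45].

```python
import numpy as np
from scipy.optimize import linprog

def fit_ipm(phi, y):
    """Fit a linear interval predictor model: find centre coefficients c
    and nonnegative width coefficients w so that every observation y_i
    lies inside [c.phi_i - w.|phi_i|, c.phi_i + w.|phi_i|], while the
    mean predicted width is minimal. This is a linear program in (c, w)."""
    n, k = phi.shape
    abs_phi = np.abs(phi)
    # objective: minimise mean width = (2/n) * sum_i w . |phi_i|
    obj = np.concatenate([np.zeros(k), 2.0 * abs_phi.mean(axis=0)])
    # containment constraints, written as A_ub @ [c, w] <= b_ub
    A_ub = np.vstack([
        np.hstack([ phi, -abs_phi]),   #  c.phi_i - w.|phi_i| <= y_i
        np.hstack([-phi, -abs_phi]),   # -c.phi_i - w.|phi_i| <= -y_i
    ])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * k + [(0, None)] * k  # c free, w >= 0
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    c, w = res.x[:k], res.x[k:]
    return c, w
```

For an affine model in one input variable, phi = np.column_stack([np.ones_like(x), x]) recovers the familiar straight-line envelope around the data.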
“…It is particularly popular for risk or reliability analyses in which the data are not perfectly known [24,6,5]. PBA objects and methods can also be used within machine learning techniques [15,25,26].…”
Section: Introduction
mentioning (confidence: 99%)
“…Quantile regression itself is a well-studied field, with the first results in [22]; see also [23,46]. Quantile regression in deep learning has also recently been considered as a general statistical modeling tool [38,53,40,50,45]. Bayesian quantile regression has also been studied [26,51].…”
Section: Introduction
mentioning (confidence: 99%)
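For concreteness, the quantile regression referenced in this last statement rests on the standard pinball (check) loss; the following is a minimal PyTorch sketch of that loss, not code from any of the cited works.

```python
import torch

def pinball_loss(pred, target, tau):
    """Pinball (check) loss for the tau-th conditional quantile:
    mean of max(tau * e, (tau - 1) * e) with e = target - pred.
    Minimising it drives pred toward the tau-quantile of target | input."""
    e = target - pred
    return torch.maximum(tau * e, (tau - 1) * e).mean()

# e.g. train one output head at tau = 0.05 and one at tau = 0.95 to
# obtain a 90% prediction interval, the usual recipe in deep
# quantile-regression models.
```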