1994
DOI: 10.1007/bf01582063

Representations of quasi-Newton matrices and their use in limited memory methods

Cited by 540 publications (256 citation statements)
References 13 publications
“…We have implemented a limited memory BFGS method using the compact representations described in [8]. Here B has the form…”
Section: Quasi-Newton Approximations (mentioning)
Confidence: 99%
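The excerpt truncates the formula. For reference, the compact representation derived in Byrd, Nocedal and Schnabel (1994) writes the quasi-Newton matrix, with initial matrix B_0 (typically B_0 = σ_k I in the limited memory setting), roughly as

\[
B_k \;=\; B_0 \;-\;
\begin{bmatrix} B_0 S_k & Y_k \end{bmatrix}
\begin{bmatrix} S_k^{T} B_0 S_k & L_k \\ L_k^{T} & -D_k \end{bmatrix}^{-1}
\begin{bmatrix} S_k^{T} B_0 \\ Y_k^{T} \end{bmatrix},
\]

where S_k = [s_0, …, s_{k-1}] and Y_k = [y_0, …, y_{k-1}] collect the stored correction pairs, L_k is the strictly lower triangular part of S_k^T Y_k, and D_k = diag(s_0^T y_0, …, s_{k-1}^T y_{k-1}). This is a paraphrased sketch of the paper's result; the citing work's exact notation for B may differ.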
“…the Quasi Newton Algorithm (QNA; Byrd et al 1994) and the Levenberg-Marquardt rule (Nocedal and Wright 2006), respectively.…”
Section: The Machine Learning Models (mentioning)
Confidence: 99%
“…a MLP implementation with learning rule based on the Quasi Newton Algorithm, belongs to the Newton's methods specialized to find the stationary point of a function through a statistical approximation of the Hessian of the training error, obtained by a cyclic gradient calculation. MLPQNA makes use of the known L-BFGS algorithm (Limited memory - Broyden Fletcher Goldfarb Shanno; Byrd et al 1994), originally designed for problems with a wide parameter space. The analytical details of the MLPQNA method, as well as its performances, have been extensively discussed elsewhere (Cavuoti et al 2012; Brescia et al 2013; Cavuoti et al 2014, 2015b).…”
Section: The Machine Learning Models (mentioning)
Confidence: 99%
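MLPQNA itself is not reproduced in the excerpt. As a rough illustration of the same ingredient, the sketch below (hypothetical data, layer sizes, and helper names; not the authors' code) trains a one-hidden-layer MLP by handing its flattened weights to SciPy's L-BFGS-B driver, which implements a limited memory BFGS update of the kind analysed in Byrd et al. (1994).

```python
# Illustrative only: a tiny MLP trained with SciPy's L-BFGS-B driver.
# Data, layer sizes, and helper names are hypothetical, not MLPQNA.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                 # toy inputs
y = np.sin(X @ np.array([1.0, -2.0, 0.5]))    # toy regression target

n_in, n_hid = 3, 8

def unpack(w):
    """Split the flat parameter vector into layer weights and biases."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid]; i += n_hid
    b2 = w[i]
    return W1, b1, W2, b2

def loss(w):
    """Mean squared training error of the MLP with tanh hidden units."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    return np.mean((pred - y) ** 2)

w0 = 0.1 * rng.normal(size=n_in * n_hid + 2 * n_hid + 1)
res = minimize(loss, w0, method="L-BFGS-B",
               options={"maxcor": 10, "maxiter": 500})
print("final training MSE:", res.fun)
```

Here the maxcor option plays the role of the limited memory parameter m, i.e. the number of stored correction pairs.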
“…This is achieved with the use of the following recursive formula which is directly derived from (22) [8].…”
Section: Projected Gradient Non-negative Least-squares Algorithm (mentioning)
Confidence: 99%
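The recursion denoted (22) is not reproduced in the excerpt. For orientation, a closely related and widely used recursion from the limited memory BFGS literature, the standard two-loop recursion for applying the inverse approximation H_k to a vector, can be sketched as follows; this is a generic illustration, not necessarily the exact formula the citing paper derives.

```python
import numpy as np

def lbfgs_two_loop(grad, s_list, y_list, gamma):
    """Standard L-BFGS two-loop recursion: returns H_k @ grad, where H_k is
    the inverse limited-memory BFGS matrix built from the stored correction
    pairs (s_i, y_i), oldest first, with initial scaling H_0 = gamma * I."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        alphas.append(alpha)
    r = gamma * q
    # Second loop: oldest pair to newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return r  # the search direction is typically -r
```

Products with the direct matrix B_k (as in the quoted passage) are instead usually formed through the compact representation shown earlier.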