2012
DOI: 10.1007/s10957-012-0150-2
An Inexact Accelerated Proximal Gradient Method and a Dual Newton-CG Method for the Maximal Entropy Problem

Cited by 7 publications (8 citation statements)
References 25 publications

“…For instance, in the field of large scale optimization, there is a growing interest in inexact and approximate Newton-type methods for [7,11,1,40,39,13], which can benefit from fast subroutines for calculating approximate solutions of linear systems. In machine learning, applications arise for the problem of finding optimal configurations in Gaussian Markov Random Fields [32], in graph-based semi-supervised learning and other graph-Laplacian problems [2], least-squares SVMs, Gaussian processes and more.…”
mentioning
confidence: 99%
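The statement above points to inexact Newton-type methods whose linear systems are solved only approximately, which is also the setting of the cited paper's dual Newton-CG method. The sketch below is a generic illustration of that idea rather than the paper's algorithm: the Newton system H d = -g is solved by conjugate gradients truncated at a forcing tolerance, and the forcing rule, the unit step, and the quadratic test problem are all illustrative assumptions.

```python
# Minimal sketch of an inexact Newton-CG step (illustrative, not the cited
# paper's dual Newton-CG method): the Newton system H d = -g is solved only
# approximately by conjugate gradients, stopped at a forcing tolerance.
import numpy as np

def truncated_cg(hess_vec, rhs, rel_tol, max_iter=200):
    """Conjugate gradients for hess_vec(d) = rhs, stopped once the relative
    residual drops below rel_tol (this is the 'inexact' part)."""
    d = np.zeros_like(rhs)
    r = rhs.copy()                      # residual rhs - H d, with d = 0
    p = r.copy()
    rs = r @ r
    rhs_norm = np.linalg.norm(rhs)
    for _ in range(max_iter):
        if np.sqrt(rs) <= rel_tol * rhs_norm:
            break
        Hp = hess_vec(p)
        alpha = rs / (p @ Hp)
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d

def inexact_newton_cg(grad, hess_vec, x0, iters=20):
    """Outer Newton loop whose CG forcing term tightens near the solution."""
    x = x0.copy()
    for _ in range(iters):
        g = grad(x)
        eta = min(0.5, np.sqrt(np.linalg.norm(g)))   # forcing term (assumed rule)
        d = truncated_cg(lambda v: hess_vec(x, v), -g, rel_tol=eta)
        x = x + d                                    # unit step; a line search would go here
    return x

# Usage on a strongly convex quadratic 0.5 x'Qx - b'x (hypothetical test data).
rng = np.random.default_rng(0)
M = rng.standard_normal((60, 30))
Q = M.T @ M + np.eye(30)
b = rng.standard_normal(30)
x = inexact_newton_cg(grad=lambda x: Q @ x - b,
                      hess_vec=lambda x, v: Q @ v,
                      x0=np.zeros(30))
print(np.linalg.norm(Q @ x - b))   # near zero despite the inexact inner solves
```

The point of the sketch is that the inner solver never needs to finish: a loose relative-residual test far from the solution and a tighter one near it is usually enough for the outer iteration to converge.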
“…It has been reported that the inexact scheme can achieve faster computational speeds and better local minima than can the exact scheme [22,24]. Moreover, the inexact scheme has been employed with APG for many problems in broad areas, such as quadratic semidefinite programming [25], maximal entropy [26], and tensor recovery [27]. Therefore, the inexact scheme and APG can naturally be extended to the NCP problem.…”
Section: Introduction
mentioning
confidence: 99%
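This statement describes pairing an inexact proximal step with the accelerated proximal gradient (APG) scheme. As a hedged illustration only, the sketch below runs a FISTA-style APG loop in which the prox (here a projection onto the probability simplex) is computed approximately by a few Dykstra iterations; the objective, the inner solver, and its iteration budget are assumptions made for this example, not the construction used in the cited works.

```python
# Minimal sketch of an inexact accelerated proximal gradient (APG) iteration:
# the proximal step is computed only approximately by a short inner loop.
# Everything here (objective, simplex constraint, Dykstra inner solver) is an
# illustrative assumption, not the method of the cited paper.
import numpy as np

def approx_simplex_prox(v, inner_iters=15):
    """Approximate projection onto {x >= 0, sum(x) = 1} via a few Dykstra
    iterations between the hyperplane and the nonnegative orthant; more
    inner_iters gives a more accurate prox."""
    x, p, q = v.copy(), np.zeros_like(v), np.zeros_like(v)
    n = v.size
    for _ in range(inner_iters):
        y = x + p
        y = y - (y.sum() - 1.0) / n        # project onto the hyperplane sum = 1
        p = x + p - y
        x = np.maximum(y + q, 0.0)         # project onto the nonnegative orthant
        q = y + q - x
    return x

def inexact_apg(grad_f, L, prox_approx, x0, iters=200):
    """FISTA-style momentum with a user-supplied (possibly inexact) prox."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = prox_approx(y - grad_f(y) / L)           # inexact prox step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # momentum extrapolation
        x, t = x_new, t_new
    return x

# Usage: least squares over the simplex with hypothetical data.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
L = np.linalg.norm(A, 2) ** 2                            # Lipschitz constant of the gradient
x = inexact_apg(lambda x: A.T @ (A @ x - b), L, approx_simplex_prox, np.full(10, 0.1))
print(round(x.sum(), 3), x.min() >= 0.0)                 # feasibility is only approximate
```

With only a handful of inner iterations the prox is never exact, which mirrors the quoted observation that an inexact scheme can trade subproblem accuracy for overall speed.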
“…From a Machine Learning perspective, problem (1) arises from a wide range of applications such as Gaussian processes (Rasmussen and Williams, 2008), Least-Square Support Vector Machines (Ye and Xiong, 2007), graph-based Semi-Supervised Learning and Graph-Laplacian problems (Bengio et al, 2006), Gaussian Markov Random Fields (Rue and Held, 2005), etc. Approximate solutions of Linear Systems can be of practical benefit in inexact Newton schemes (Jiang et al, 2012;Wang and Xu, 2013;Gondzio, 2013) that are gaining lots of traction in the field of large-scale optimization. Throughout the paper, we assume that problem (1) is consistent (there exists x * such that Ax * = b), dimensions are large and m n. In a large-scale setting, solving problem (1) with direct methods are infeasible.…”
Section: Introduction
mentioning
confidence: 99%
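This last statement concerns computing approximate solutions of large consistent linear systems Ax = b when direct methods are out of reach. One standard row-action method for exactly that setting is randomized Kaczmarz, sketched below; the choice of method, the sizes, and the names are assumptions for illustration, not necessarily what the citing paper uses.

```python
# Minimal sketch (not taken from the cited work): randomized Kaczmarz as one
# simple way to compute an approximate solution of a large consistent system
# Ax = b without any direct factorization. All sizes and names are illustrative.
import numpy as np

def randomized_kaczmarz(A, b, iters=5000, seed=0):
    """Project the iterate onto one randomly chosen row constraint a_i^T x = b_i
    per step, sampling rows proportionally to their squared norms."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms2 = np.sum(A * A, axis=1)
    probs = row_norms2 / row_norms2.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        x += (b[i] - A[i] @ x) / row_norms2[i] * A[i]
    return x

# Consistent test system: b is constructed from a known x_true.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 50))
x_true = rng.standard_normal(50)
b = A @ x_true
x = randomized_kaczmarz(A, b)
print(np.linalg.norm(A @ x - b) / np.linalg.norm(b))  # small relative residual
```

Each step touches a single row of A, which is why such subroutines are attractive as inexact inner solvers inside Newton-type outer loops when the dimensions are large.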