Hyperspectral imaging is a remote sensing technique that measures the spectrum of each pixel in the image of a scene. It can be used to detect objects or classify materials based on their optical reflectance spectra. Various methods have been developed to reduce the spectral dimension of hyperspectral images in order to facilitate their analysis. Independent Component Analysis (ICA) is a class of algorithms that extract statistically independent features. FastICA is one of the most widely used ICA algorithms because it is simple and fast. However, FastICA often finds irrelevant stationary points (e.g., minima instead of maxima) and is not scalable, since it uses the whole set of pixels at each iteration. In this paper, we present a new stochastic algorithm, called SHOICA, which smoothly approximates the non-convex loss functions of ICA using higher-order Taylor minorizers. Because SHOICA guarantees ascent of its objective function, it identifies (local) maxima. Moreover, because SHOICA is stochastic, it supports minibatching and is therefore scalable and appropriate for large datasets. The quality of the extracted features, as well as the time and number of epochs required by FastICA and SHOICA, are compared on dimensionality reduction and classification tasks on real hyperspectral images.
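For context, the baseline that SHOICA improves upon is FastICA's fixed-point iteration. Below is a minimal NumPy sketch of the standard one-unit FastICA update with the log-cosh contrast; it is not the SHOICA algorithm itself (whose higher-order Taylor minorizers are specific to the paper), and the function name and interface are illustrative assumptions.

```python
import numpy as np

def fastica_one_unit(X, max_iter=200, tol=1e-6, seed=None):
    """One-unit FastICA fixed-point iteration (log-cosh contrast).

    X : (d, n) whitened data matrix (zero mean, identity covariance).
    Returns a unit vector w such that w @ X is maximally non-Gaussian.
    Illustrative sketch of the classical FastICA update, not SHOICA.
    """
    rng = np.random.default_rng(seed)
    d, _ = X.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(max_iter):
        s = w @ X                                   # current source estimate, shape (n,)
        g = np.tanh(s)                              # contrast derivative g(s)
        g_prime = 1.0 - g ** 2                      # g'(s)
        w_new = (X * g).mean(axis=1) - g_prime.mean() * w  # fixed-point update
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:         # converged up to sign flip
            return w_new
        w = w_new
    return w
```

Note that this update is a fixed-point scheme with no ascent guarantee, which is exactly the failure mode the abstract attributes to FastICA (convergence to irrelevant stationary points).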
Majorization-minimization algorithms consist of successively minimizing a sequence of upper bounds of the objective function so that the objective function decreases along the iterations. Such a simple principle makes it possible to solve a large class of optimization problems, even nonconvex, nonsmooth and stochastic ones. We present a stochastic higher-order algorithmic framework for minimizing the average of a very large number of sufficiently smooth functions. Our stochastic framework is based on the notion of stochastic higher-order upper bound approximations of the finite-sum objective function and on minibatching. We present convergence guarantees for nonconvex and convex optimization when the higher-order upper bounds approximate the objective function up to an error that is p times differentiable and has a Lipschitz continuous p-th derivative. More precisely, we derive asymptotic stationary point guarantees for nonconvex problems, and for convex ones we establish local linear convergence results, provided that the objective function is uniformly convex. Unlike other higher-order methods, ours works with any batch size. Moreover, in contrast to most existing stochastic Newton and third-order methods, our approach guarantees local convergence faster than with a first-order oracle and adapts to the problem's curvature. Numerical simulations also confirm the efficiency of our algorithmic framework.
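To make the upper-bound principle concrete, here is a minimal sketch of the simplest (p = 1) instance of a stochastic majorization-minimization step: on each minibatch, the first-order Taylor model plus a quadratic proximity term majorizes the minibatch objective whenever M is at least its gradient Lipschitz constant, and the exact minimizer of that surrogate is the gradient step computed below. Higher-order instances of the framework replace this quadratic model with a p-th order Taylor polynomial plus a (p+1)-st power regularizer. The `grad_i` interface and parameter names are assumptions made for illustration.

```python
import numpy as np

def stochastic_mm_p1(grad_i, x0, n, M, batch_size=32, epochs=10, seed=None):
    """Stochastic majorization-minimization with a first-order (p = 1) surrogate.

    grad_i(x, i) -- gradient of the i-th component function (assumed interface).
    n            -- number of component functions in the finite sum.
    M            -- surrogate parameter; must dominate the minibatch gradient
                    Lipschitz constant for the quadratic model to be an upper bound.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(epochs):
        indices = rng.permutation(n)
        for batch in np.array_split(indices, max(1, n // batch_size)):
            g = np.mean([grad_i(x, i) for i in batch], axis=0)
            x -= g / M   # closed-form minimizer of the quadratic upper bound
    return x
```

Viewing the step as the exact minimizer of a model that upper-bounds the minibatch objective is what yields descent in expectation, and it is the property the framework generalizes to p > 1.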
Majorization-minimization algorithms consist of successively minimizing a sequence of upper bounds of the objective function so that the objective function decreases along the iterations. Such a simple principle makes it possible to solve a large class of optimization problems, convex or nonconvex, smooth or nonsmooth. We propose a general higher-order majorization-minimization algorithm for minimizing an objective function that admits an approximation (surrogate) such that the corresponding error function has a higher-order Lipschitz continuous derivative. We present convergence guarantees for our new method on general optimization problems with (non)convex or (non)smooth objective functions. For convex (possibly nonsmooth) problems we provide global sublinear convergence rates, while for problems with a uniformly convex objective function we obtain locally faster superlinear convergence rates. We also prove global asymptotic stationary point guarantees for general nonconvex (possibly nonsmooth) problems, and under the Kurdyka-Łojasiewicz property of the objective function we derive local convergence rates ranging from sublinear to superlinear for our majorization-minimization algorithm. Moreover, for composite (unconstrained) nonconvex problems we derive convergence rates in terms of first- and second-order optimality conditions.
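As one concrete higher-order surrogate: when p = 2 and the Hessian of f is M-Lipschitz, the cubic-regularized Taylor model of Nesterov and Polyak is a global upper bound on f, so minimizing it yields a valid majorization-minimization step. The sketch below illustrates a single such step; the inner gradient loop is a deliberately naive solver for the cubic subproblem, and the `f_grad`/`f_hess` interface is an assumption for illustration, not the paper's method.

```python
import numpy as np

def cubic_mm_step(f_grad, f_hess, x, M, inner_iters=200, lr=None):
    """One higher-order majorization step (p = 2, cubic-regularized model).

    If the Hessian of f is M-Lipschitz, the model
        m(s) = f(x) + g @ s + 0.5 * s @ H @ s + (M / 6) * ||s||^3
    upper-bounds f(x + s), so decreasing m cannot increase f.
    The inner loop is a naive gradient solver for the subproblem.
    """
    x = np.asarray(x, dtype=float)
    g, H = f_grad(x), f_hess(x)
    if lr is None:
        lr = 1.0 / (np.linalg.norm(H, 2) + M)  # crude inner step size
    s = np.zeros_like(x)
    for _ in range(inner_iters):
        model_grad = g + H @ s + 0.5 * M * np.linalg.norm(s) * s
        s -= lr * model_grad
    return x + s
```

In the general framework, the second-order Taylor polynomial is replaced by any surrogate whose error has a Lipschitz continuous higher-order derivative, with the regularization power adjusted accordingly.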