We develop an efficient computational solution for the training of deep neural networks (DNNs) with free-form activation functions. To make the problem well-posed, we augment the cost functional of the DNN with an appropriate shape regularization: the sum of the second-order total variations of the trainable nonlinearities. The representer theorem for DNNs tells us that the optimal activation functions are adaptive piecewise-linear splines, which allows us to recast the problem as a parametric optimization. The challenge is that the corresponding basis functions (ReLUs) are poorly conditioned and that the determination of their number and positioning is also part of the problem. We circumvent this difficulty by encoding the activation functions in an equivalent B-spline basis and by expressing the regularization as an ℓ1-penalty. This results in the specification of parametric activation-function modules that can be implemented and optimized efficiently on standard development platforms. We present experimental results that demonstrate the benefits of our approach.
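The parametrization described above can be illustrated with a minimal numpy sketch: a piecewise-linear activation expanded in a uniform linear B-spline basis, whose coefficients would be the trainable parameters. The function names and the zero extension outside the knot grid are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def tri_bspline(t):
    """Linear B-spline (triangle) basis: value 1 at t = 0, support (-1, 1)."""
    return np.maximum(0.0, 1.0 - np.abs(t))

def spline_activation(x, coeffs, h=1.0, x0=0.0):
    """Piecewise-linear activation sigma(x) = sum_k c_k * B((x - x0)/h - k).

    coeffs holds the (trainable) B-spline coefficients on the uniform knot
    grid x0, x0 + h, ...  The second-order total variation of sigma is then
    the l1-norm of the second finite differences of coeffs, which is how the
    regularization becomes an l1-penalty.  This toy sketch is zero outside
    the grid; a practical module would add a linear extension.
    """
    k = np.arange(len(coeffs))
    # Evaluate every shifted basis function at every input (broadcasting).
    basis = tri_bspline((x[:, None] - x0) / h - k[None, :])
    return basis @ coeffs

# With coefficients 0, 1, 2 on knots 0, 1, 2, the spline reproduces a ReLU
# on its support.
x = np.array([-0.5, 0.0, 0.5, 1.0, 1.5])
coeffs = np.array([0.0, 1.0, 2.0])
y = spline_activation(x, coeffs, h=1.0, x0=0.0)
```

Because the triangular basis functions have compact support and form a partition of unity on the grid, the resulting discretization is well conditioned, in contrast to an expansion in shifted ReLUs.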
Most existing bounds for signal reconstruction from compressive measurements assume additive signal-independent noise. However, in many compressive imaging systems, the noise statistics are more accurately represented by Poisson or Poisson-Gaussian noise models. In this paper, we derive upper bounds on the signal reconstruction error from compressive measurements which are corrupted by Poisson or Poisson-Gaussian noise. The features of our bounds are as follows: (1) The bounds are derived for a computationally tractable convex estimator with statistically motivated parameter selection. The estimator penalizes signal sparsity subject to a constraint that imposes a novel, statistically motivated upper bound on a term based on variance-stabilization transforms, which approximate the Poisson or Poisson-Gaussian distributions by distributions with (nearly) constant variance. (2) The bounds are applicable to signals that are sparse as well as compressible in any orthonormal basis, and are derived for compressive systems obeying realistic constraints such as nonnegativity and flux preservation. Our bounds are motivated by several properties of the variance-stabilization transforms that we develop and analyze. We present extensive numerical results for signal reconstruction under a varying number of measurements and varying signal intensity levels. Ours is the first work to derive bounds on compressive inversion for the Poisson-Gaussian noise model. We also use the properties of the variance stabilizer to develop a principle for the selection of the regularization parameter in penalized estimators for Poisson and Poisson-Gaussian inverse problems.
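The variance-stabilization transforms at the heart of these bounds can be illustrated with the classical Anscombe transform for Poisson data, 2√(x + 3/8), which maps Poisson(λ) samples to (nearly) unit variance for moderate λ. The numerical check below is a generic illustration of the stabilization property, not the paper's estimator or its Poisson-Gaussian generalization.

```python
import numpy as np

def anscombe(x):
    """Anscombe variance-stabilizing transform for Poisson data."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

rng = np.random.default_rng(0)
intensities = (5, 20, 80)
# The raw Poisson variance grows with the mean (it equals lambda) ...
raw_var = [np.var(rng.poisson(lam, 100_000)) for lam in intensities]
# ... but after stabilization it is close to 1 at every intensity level.
stab_var = [np.var(anscombe(rng.poisson(lam, 100_000))) for lam in intensities]
```

This near-constant variance is what lets signal-independent-noise machinery be adapted to the Poisson setting; for Poisson-Gaussian data, a generalized transform with an extra term for the Gaussian variance plays the same role.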
We focus on the generalized-interpolation problem. There, one reconstructs continuous-domain signals that honor discrete data constraints. This problem is infinite-dimensional and ill-posed. We make it well-posed by imposing that the solution balances data fidelity and some Lp-norm regularization. More specifically, we consider p ≥ 1 and the multi-order derivative regularization operator L = D^(N0). We reformulate the regularized problem exactly as a finite-dimensional one by restricting the search space to a suitable space of polynomial splines with knots on a uniform grid. Our splines are represented in a B-spline basis, which results in a well-conditioned discretization. For a sufficiently fine grid, our search space contains functions that are arbitrarily close to the solutions of the underlying problem in which the constraint that the solution live in a spline space is lifted. This remarkable property is due to the approximation power of splines. We use the alternating-direction method of multipliers (ADMM) along with a multiresolution strategy to compute our solution. We present numerical results for spatial and Fourier interpolation. Through our experiments, we investigate features induced by the Lp-norm regularization, namely, sparsity, regularity, and oscillatory behavior.
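The workhorse solver named above, ADMM, alternates a quadratic step, a proximal step, and a dual update. A minimal sketch for the prototypical ℓ1-regularized least-squares problem conveys the structure; this is a generic textbook ADMM, not the paper's multiresolution scheme, and all names are illustrative.

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal map of the l1-norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_l1(A, b, lam, rho=1.0, iters=200):
    """ADMM sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    # The same linear system is solved at every x-update.
    M = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(M, Atb + rho * (z - u))  # quadratic step
        z = soft(x + u, lam / rho)                   # proximal (l1) step
        u = u + x - z                                # dual update
    return z

# Sanity check: for A = I, the minimizer is soft(b, lam) in closed form.
b = np.array([3.0, -0.2, 1.5])
z = admm_l1(np.eye(3), b, lam=1.0)
```

In the paper's setting, the quadratic step involves the interpolation constraints and the proximal step acts on the B-spline coefficients of L applied to the spline.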
We study a variant of the interpolation problem where the continuously defined solution is regularized by minimizing the Lp-norm of its second-order derivative. For this continuous-domain problem, we propose an exact discretization scheme that restricts the search space to quadratic splines with knots on a uniform grid. This leads to a discrete finite-dimensional problem that is computationally tractable. Another benefit of our spline search space is that, when the grid is sufficiently fine, it contains functions that are arbitrarily close to the solutions of the underlying unrestricted problem. We implement an iteratively reweighted algorithm with a grid-refinement strategy that computes the solution within a prescribed accuracy. Finally, we present experimental results that illustrate characteristics, such as sparsity, of the Lp-regularized interpolants.
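The iteratively reweighted idea can be sketched on a discrete stand-in for the continuous-domain problem: penalizing the Lp-norm of second-order finite differences, with |d|^p majorized by w·d² and w = |d|^(p-2) refreshed at each pass. This is a generic IRLS sketch under those assumptions, not the paper's exact scheme or its grid-refinement strategy.

```python
import numpy as np

def irls_lp(y, lam=1.0, p=1.2, iters=50, eps=1e-8):
    """IRLS sketch for min_x 0.5*||x - y||^2 + lam*||D2 x||_p^p,
    where D2 takes second-order finite differences."""
    n = len(y)
    # Second-difference matrix of shape (n - 2, n).
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    x = y.copy()
    for _ in range(iters):
        d = D @ x
        # Reweighting: |d|^p <= w*d^2 + const with w = |d|^(p-2);
        # eps guards against division by zero where d vanishes.
        w = np.maximum(np.abs(d), eps) ** (p - 2.0)
        # Each pass is a weighted quadratic problem with a closed-form solve.
        x = np.linalg.solve(np.eye(n) + lam * p * D.T @ (w[:, None] * D), y)
    return x

# A linear ramp has zero second differences, so it is left untouched.
ramp = np.arange(5.0)
x = irls_lp(ramp, lam=1.0, p=1.2)
```

For p close to 1, the reweighting drives many second differences to zero, which is the discrete counterpart of the sparsity (few active knots) observed in the Lp-regularized interpolants.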
Most modern imaging systems incorporate a computational pipeline to infer the image of interest from acquired measurements. The Bayesian approach to solving such ill-posed inverse problems involves the characterization of the posterior distribution of the image. It depends on the model of the imaging system and on prior knowledge of the image of interest. In this work, we present a Bayesian reconstruction framework for nonlinear imaging models where we specify the prior knowledge of the image through a deep generative model. We develop a tractable posterior-sampling scheme based on the Metropolis-adjusted Langevin algorithm for the class of nonlinear inverse problems where the forward model has a neural-network-like structure. This class includes most practical imaging modalities. We introduce the notion of augmented deep generative priors in order to suitably handle the recovery of quantitative images. We illustrate the advantages of our framework by applying it to two nonlinear imaging modalities: phase retrieval and optical diffraction tomography.
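The sampling engine named above, the Metropolis-adjusted Langevin algorithm (MALA), combines a gradient-informed Langevin proposal with a Metropolis-Hastings correction that makes the chain exact for the target. The sketch below runs MALA on a toy one-dimensional target; it omits the forward model and the (augmented) deep generative prior that the paper plugs into log p, and all names are illustrative.

```python
import numpy as np

def mala_sample(logp, grad_logp, x0, tau=0.5, n=20_000, seed=0):
    """MALA sketch: Langevin proposal plus Metropolis-Hastings correction."""
    rng = np.random.default_rng(seed)

    def log_q(a, b):
        # Log density (up to a constant) of proposing a from b.
        return -((a - b - tau * grad_logp(b)) ** 2) / (4.0 * tau)

    x = x0
    out = np.empty(n)
    for i in range(n):
        # Langevin proposal: a gradient step plus Gaussian exploration noise.
        prop = x + tau * grad_logp(x) + np.sqrt(2.0 * tau) * rng.normal()
        # Accept/reject so that the chain targets exp(logp) exactly.
        log_alpha = logp(prop) - logp(x) + log_q(x, prop) - log_q(prop, x)
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        out[i] = x
    return out

# Toy target: standard normal, log p(x) = -x^2/2 up to a constant.
samples = mala_sample(lambda x: -0.5 * x * x, lambda x: -x, x0=0.0)
```

In the imaging setting, logp would be the log-posterior (data-fidelity term plus generative-prior term) over the latent or image variables, with grad_logp supplied by automatic differentiation through the neural-network-like forward model.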