Because balanced constraints can overcome the trivial solutions that arise when classifying data via the minimum cut method, many techniques with different balancing strategies have been proposed to improve classification accuracy. However, their performance has not yet been compared comprehensively. In this paper, we investigate seven balanced classification methods under the discrete non-local total variation framework and compare their classification accuracy on graphs. We study the two-class classification problem with equality constraints, inequality constraints, and the Ratio Cut, Normalized Cut, and Cheeger Cut models. For the equality-constraint case, we first compare the Penalty Function Method (PFM) and the Augmented Lagrangian Method (ALM), both of which transform constrained problems into unconstrained ones, to show the advantages of ALM. To make the comparison fair, we then solve all models with ALM, using the same proportion of fidelity points and the same neighborhood size on the graph. Experimental results demonstrate that ALM with the equality balance constraint achieves the best classification accuracy among the seven constraints.
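The ALM idea referred to above can be illustrated on a toy balance-constrained problem. The sketch below is a hypothetical example (function name and problem are my own, not the paper's graph model): it minimizes ||x − a||² subject to the equality balance constraint sum(x) = 0, alternating a closed-form inner minimization of the augmented Lagrangian with a multiplier ascent step.

```python
import numpy as np

def alm_balanced_projection(a, mu=10.0, iters=50):
    """Augmented Lagrangian Method for  min ||x - a||^2  s.t.  sum(x) = 0.

    A toy equality "balance" constraint: each inner problem has a
    closed-form minimizer, and the multiplier update drives sum(x) -> 0."""
    n = a.size
    lam = 0.0                        # Lagrange multiplier
    x = a.copy()
    for _ in range(iters):
        # Closed-form minimizer of ||x-a||^2 + lam*sum(x) + (mu/2)*sum(x)^2:
        # stationarity gives x = a - (lam + mu*s)/2 with s = sum(x).
        s = (a.sum() - n * lam / 2.0) / (1.0 + n * mu / 2.0)
        x = a - (lam + mu * s) / 2.0
        lam += mu * x.sum()          # multiplier ascent step
    return x

a = np.array([3.0, -1.0, 2.0, 4.0])
x = alm_balanced_projection(a)
```

Unlike a pure penalty method, the multiplier lets the constraint be satisfied exactly without driving the penalty weight mu to infinity; here the iterates converge to the mean-centered vector a - mean(a).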
Algorithm unfolding networks, which combine the explainability of classical algorithms with the efficiency of deep neural networks (DNNs), have received considerable attention for solving ill-posed inverse problems. Under the algorithm unfolding framework, we propose a novel end-to-end iterative deep neural network, together with a fast variant, for image restoration. The first network is designed by unrolling the proximal gradient descent algorithm for variational models and consists of denoiser and reconstruction sub-networks. The second is its accelerated version with momentum factors. For the denoiser sub-network, we embed the Convolutional Block Attention Module (CBAM) into the classical U-Net for adaptive feature refinement. Experiments on image denoising and deblurring demonstrate performance competitive in quality and efficiency with several state-of-the-art networks for image restoration. The proposed unfolding DNN can easily be extended to other similar image restoration tasks, such as image super-resolution and image demosaicking.
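The unrolled proximal-gradient structure described above can be sketched numerically. In the toy example below (names and parameters are my own), the learned CBAM U-Net denoiser is replaced by a simple box filter, so this illustrates only the unfolding scheme — gradient step on the data term, then a denoiser sub-network per stage — not the proposed trained network:

```python
import numpy as np

def box_denoise(x):
    """Stand-in for the learned denoiser sub-network: a 3x3 box filter."""
    p = np.pad(x, 1, mode="edge")
    return sum(p[i:i + x.shape[0], j:j + x.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def unfolded_restore(y, tau=0.5, stages=5):
    """One pass through an unfolding network: each stage performs a
    gradient descent step on the data-fidelity term ||x - y||^2,
    followed by the denoiser sub-network (the proximal step)."""
    x = y.copy()
    for _ in range(stages):
        x = x - tau * (x - y)      # gradient step on fidelity
        x = box_denoise(x)         # denoiser sub-network
    return x

rng = np.random.default_rng(0)
clean = np.ones((32, 32))
y = clean + 0.2 * rng.standard_normal((32, 32))
x = unfolded_restore(y)
```

In the actual network, tau and the denoiser weights would be learned end-to-end, and the accelerated variant would add a momentum combination of the last two iterates before each stage.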
Preserving contour topology during image segmentation is useful in many practical scenarios. By keeping the contours isomorphic, it is possible to prevent over-segmentation and under-segmentation, as well as to adhere to given topologies. The Self-repelling Snakes model (SR) is a variational model that preserves contour topology by combining a non-local repulsion term with the geodesic active contour model. The SR is traditionally solved using the additive operator splitting (AOS) scheme. In our paper, we propose an alternative solution to the SR using the Split Bregman method. Our algorithm breaks the problem down into simpler sub-problems that use lower-order evolution equations and a simple projection scheme rather than re-initialization. The sub-problems can be solved via the fast Fourier transform or an approximate soft thresholding formula, which maintains stability, shortens the convergence time, and reduces the memory requirement. The Split Bregman and AOS algorithms are compared theoretically and experimentally.
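The soft thresholding and FFT sub-problems mentioned above are the generic building blocks of any Split Bregman scheme. A minimal sketch for 1-D total-variation denoising is shown below (an illustrative problem with my own parameter choices, not the SR segmentation model itself): the quadratic x-sub-problem is diagonalized by the FFT under a periodic difference operator, and the d-sub-problem is the closed-form shrinkage formula.

```python
import numpy as np

def shrink(z, t):
    """Soft-thresholding (shrinkage) operator used in Split Bregman."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def split_bregman_tv1d(f, lam=1.0, mu=10.0, iters=100):
    """1-D TV denoising, min mu/2*||x-f||^2 + ||Dx||_1, with a periodic
    difference operator D so the x-sub-problem is a single FFT solve."""
    n = f.size
    # Fourier eigenvalues of the circular forward difference D
    dhat = np.exp(2j * np.pi * np.arange(n) / n) - 1.0
    denom = mu + lam * np.abs(dhat) ** 2       # eigenvalues of mu*I + lam*D^T D
    D = lambda v: np.roll(v, -1) - v           # forward difference
    Dt = lambda v: np.roll(v, 1) - v           # its adjoint (periodic)
    d = np.zeros(n); b = np.zeros(n); x = f.copy()
    for _ in range(iters):
        rhs = mu * f + lam * Dt(d - b)
        x = np.real(np.fft.ifft(np.fft.fft(rhs) / denom))   # FFT sub-problem
        d = shrink(D(x) + b, 1.0 / lam)                     # shrinkage step
        b = b + D(x) - d                                    # Bregman update
    return x

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0], 32)
f = clean + 0.1 * rng.standard_normal(64)
x = split_bregman_tv1d(f)
```

Because every sub-problem is either a pointwise formula or one FFT solve, only a handful of arrays of the image size are stored — the source of the memory advantage over AOS, whose tridiagonal solves require additional per-direction storage.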