“…Similar to the proof in [11], we can prove that, if f(x) is a differentiable convex function, then for λ ≥ ‖∇f(0)‖_∞, x = 0 is the optimal solution of problem (15). Thus, λ = ‖∇f(0)‖_∞ is large enough that (x, w) = (0, 1) is the optimal solution of problem (4) with s_f = 0.…”
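The bound above can be made concrete for the least-squares loss. The following is a minimal sketch, assuming f(x) = ½‖Ax − b‖² (a choice of loss not specified in the excerpt), in which case ∇f(0) = −Aᵀb and λ = ‖∇f(0)‖_∞ is the smallest penalty for which x = 0 minimizes f(x) + λ‖x‖₁; the function names are hypothetical:

```python
def grad_at_zero(A, b):
    # gradient of f(x) = 0.5 * ||Ax - b||^2 at x = 0, i.e. -A^T b,
    # computed with plain lists (A is a list of rows)
    n = len(A[0])
    return [-sum(A[i][j] * b[i] for i in range(len(A))) for j in range(n)]

def lambda_max(A, b):
    # infinity norm of the gradient at zero: the threshold penalty
    # above which x = 0 is optimal for the l1-penalized problem
    return max(abs(g) for g in grad_at_zero(A, b))
```

For example, with A = [[1, 0], [0, 2]] and b = [3, 4], the gradient at zero is [−3, −8] and λ_max = 8.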
Section: B. The Choice of λ
“…Motivated by the weighted thresholding method for individual-variable sparsity constrained optimization [11] and the ISTA with sparse group hard thresholding [1], we adopt the weighted thresholding framework and present the following Algorithm 3 for problem (4). For convenience, we denote the solution of Algorithm 1 by WT_{λ,L}(x^k); we then propose the weighted thresholding method for problem (4) in Algorithm 3.…”
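The excerpt does not spell out the thresholding operator, so the following is a hedged sketch of a generic ISTA-style step with a weighted soft-thresholding operator — a plausible shape for a weighted thresholding framework, though the paper's WT_{λ,L} (Algorithm 1) may differ in its exact form; all names here are hypothetical:

```python
def weighted_soft_threshold(z, weights, lam, L):
    # proximal map of (lam/L) * sum_i w_i |x_i| evaluated at z:
    # coordinate i is shrunk toward zero by t_i = lam * w_i / L
    out = []
    for zi, wi in zip(z, weights):
        t = lam * wi / L
        if zi > t:
            out.append(zi - t)
        elif zi < -t:
            out.append(zi + t)
        else:
            out.append(0.0)
    return out

def ista_step(x, grad, weights, lam, L):
    # one ISTA iteration: gradient step with stepsize 1/L,
    # followed by the weighted shrinkage
    z = [xi - gi / L for xi, gi in zip(x, grad)]
    return weighted_soft_threshold(z, weights, lam, L)
```

A larger weight w_i shrinks coordinate i more aggressively, which is how the weighted ℓ1 reformulation steers individual variables toward zero.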
Section: Weighted Thresholding Methods
“…First, we introduce an equivalent formulation of the sparsity constraint x ∈ C_f, where [11]: x ∈ C_f holds if and only if there exists w ∈ {0, 1}^n such that…”
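The excerpt truncates the binary constraint, but one common form of this reformulation requires w_i x_i = 0 for all i together with Σ_i (1 − w_i) ≤ s_f, so that w_i = 1 forces x_i = 0 (consistent with (x, w) = (0, 1) being optimal when s_f = 0). The following sketch checks this form of the equivalence by brute force on small vectors; the exact constraint in the paper may differ:

```python
from itertools import product

def in_Cf(x, s_f):
    # the sparsity constraint: x has at most s_f nonzero entries
    return sum(1 for xi in x if xi != 0) <= s_f

def exists_valid_w(x, s_f):
    # search for binary w with w_i * x_i = 0 for all i
    # and sum_i (1 - w_i) <= s_f
    n = len(x)
    for w in product((0, 1), repeat=n):
        if all(wi * xi == 0 for wi, xi in zip(w, x)) and sum(1 - wi for wi in w) <= s_f:
            return True
    return False
```

On every small instance the two predicates agree, illustrating why the binary variables w can stand in for the combinatorial sparsity constraint.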
Section: B. Equivalent Formulation
“…The function is then minimized over the constraints of problem (1), which can be done optimally by a dynamic programming algorithm. However, according to [11], minimizing a proximal function directly over an individual-variable sparsity constraint is restrictive. Hence it is of interest to improve the iterative sparse group hard thresholding algorithm [1] for problem (1) using the reformulation technique of [11].…”
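To illustrate the kind of dynamic program involved, here is a hedged sketch for one natural subproblem: keep at most s_g groups and at most s_f features overall so as to maximize the retained energy Σ x_i² (hard thresholding). The exact dynamic program in [1] is not given in the excerpt and may differ; this version tracks (groups opened, features kept):

```python
def sparse_group_hard_threshold(x, groups, s_g, s_f):
    # gains[g] is the prefix-sum of the top squared entries of group g:
    # gains[g][k] = best energy from keeping k features of that group
    gains = []
    for idx in groups:
        sq = sorted((x[i] * x[i] for i in idx), reverse=True)
        pre = [0.0]
        for v in sq:
            pre.append(pre[-1] + v)
        gains.append(pre)

    # dp[j][k] = best energy using j opened groups and k kept features
    NEG = float("-inf")
    dp = [[NEG] * (s_f + 1) for _ in range(s_g + 1)]
    dp[0][0] = 0.0
    for pre in gains:
        new = [row[:] for row in dp]
        for j in range(s_g):
            for k in range(s_f + 1):
                if dp[j][k] == NEG:
                    continue
                # open this group and keep `take` of its features
                for take in range(1, min(len(pre) - 1, s_f - k) + 1):
                    cand = dp[j][k] + pre[take]
                    if cand > new[j + 1][k + take]:
                        new[j + 1][k + take] = cand
        dp = new
    return max(v for row in dp for v in row if v != NEG)
```

For x = [3, 1, 2, 2, 5] with groups {0,1}, {2,3}, {4} and budgets s_g = 2, s_f = 2, the optimum keeps features 4 and 0 for energy 25 + 9 = 34.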
Section: Introduction
“…To achieve this, we reformulate the s_f-sparse individual constraint in (1) via a weighted ℓ1-norm strategy and obtain a new problem equivalent to problem (1). This reformulation is similar to that in [11], which does not consider the group sparsity constraint. To solve the reformulated problem, we penalize the side constraint and propose a weighted thresholding method based on a dynamic programming algorithm.…”
In this paper, we investigate the sparse group feature selection problem, in which covariates possess a grouping structure and sparsity is enforced at the level of both features and groups simultaneously. We reformulate the feature sparsity constraint as an equivalent weighted ℓ1-norm constraint in the sparse group optimization problem. To solve the reformulated problem, we first propose a weighted thresholding method based on a dynamic programming algorithm. We then improve the method to a weighted thresholding homotopy algorithm using a homotopy technique. We prove that the algorithm converges to an L-stationary point of the original problem. Computational experiments on synthetic data show that the proposed algorithm is competitive with some state-of-the-art algorithms.

INDEX TERMS Homotopy technique, weighted thresholding method, sparse group feature selection.
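The abstract's homotopy technique is not detailed in these excerpts; the standard pattern is to start at a large penalty (where x = 0 is optimal, per the choice λ = ‖∇f(0)‖_∞ above), then shrink λ geometrically while warm-starting each subproblem from the previous solution. A minimal sketch, with `solver` a hypothetical subproblem solver standing in for the paper's weighted thresholding step:

```python
def homotopy_path(lam_max, lam_min, eta, solver, x0):
    # follow a decreasing sequence of penalties lam_max, eta*lam_max, ...
    # (0 < eta < 1), warm-starting each solve from the previous solution;
    # returns the list of (lambda, solution) pairs along the path
    lam, x = lam_max, x0
    path = []
    while lam > lam_min:
        x = solver(x, lam)
        path.append((lam, x))
        lam *= eta
    return path
```

Warm-starting is what makes the path cheap: each subproblem starts close to its own solution, so few thresholding iterations are needed per value of λ.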