Group sparsity combines the underlying sparsity and group structure of the data. We develop a proximally linearized algorithm, InISSAPL, for the non-Lipschitz group sparse ℓ_{p,q}-ℓ_r optimization problem. The algorithm gives a unified framework for all parameters p ≥ 1, 0 < q < 1, 1 ≤ r ≤ ∞, and is applicable to different kinds of measurement noise. In particular, it includes the non-smooth ℓ_{1,q} regularization term and the non-smooth ℓ_1/ℓ_∞ fidelity terms as special cases. It allows an inexact inner loop, accessible to an implementation by scaled ADMM, and still has global convergence. The algorithm is efficient and fast, with computation restricted to the shrinking group support set. Numerical experiments are presented for the algorithm with a diversity of parameters p, q, r; the comparisons show that our algorithm is superior to those in existing works.

For Laplace noise or heavy-tailed noise such as impulsive noise, the ℓ_1 fidelity term (r = 1) is a good choice. For noise with a uniform distribution or quantization error, the ℓ_∞ fidelity term (r = ∞) is suitable.

Many references study the sparse optimization problem without group structure, i.e. the non-group model in which the number of groups g equals N. In that case the ℓ_{p,q} term in (1.1) degenerates to an ℓ_q (0 < q < 1) regularization term. One class of methods is smoothing approximation methods [5,12-14,24]: by a smoothing function ϕ(x, θ), the non-Lipschitz property of the objective function can be removed. A second class is general iterative shrinkage-thresholding algorithms (GISA) for the ℓ_q-ℓ_2 problem [9,34,40]; GISA was inspired by the great success of soft thresholding and iterative shrinkage-thresholding algorithms (ISTA) [3,16] for the convex ℓ_1-ℓ_2 problem. A third class is iterative reweighted minimization methods for the ℓ_q-ℓ_2 minimization problem; see, e.g. [10,15,23,27].
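To make the soft-thresholding idea behind ISTA concrete, the following is a minimal Python sketch for the convex ℓ_1-ℓ_2 problem min_x ½‖Ax − b‖₂² + λ‖x‖₁; the problem instance, sizes, and all variable names are illustrative assumptions, not taken from the cited works:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t*||.||_1: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=2000):
    # ISTA for the convex problem  min_x 0.5*||A x - b||_2^2 + lam*||x||_1.
    # Constant step size 1/L, where L = ||A||_2^2 is the Lipschitz constant
    # of the gradient of the smooth quadratic term.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)              # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Illustrative noiseless instance: recover a 3-sparse vector from 40
# random measurements (dimensions and seed are arbitrary choices).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 20, 77]] = [1.5, -2.0, 1.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
print(np.flatnonzero(np.abs(x_hat) > 0.5))
```

GISA-type methods replace the soft-thresholding step above by a generalized shrinkage operator adapted to the non-convex ℓ_q (0 < q < 1) penalty, which is what makes the extension beyond the convex ℓ_1 case non-trivial.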
Reweighted methods in fact reformulate the original non-Lipschitz ℓ_q-ℓ_2 problem into Lipschitz ones by a de-singularizing parameter. Very recently, [25,39] developed methods that successively shrink the support of the variables to overcome the non-Lipschitz property, in which [25] considered the non-group case with r = ∞ and [39] focused on image restoration with r = 2. To the best of our knowledge, most references consider only r = 2 in these methods.

For the group sparse optimization problem (1.1), most algorithms were likewise proposed only for the case r = 2. Hu et al. [21] investigated this problem via ℓ_{p,q} regularization; others developed algorithms for ℓ_{2,1}-regularized least squares, e.g. the group Lasso [8,17,38]. As noted before, it is important and necessary to develop an algorithm for general 1 ≤ r ≤ ∞. This brings the difficulty of handling the noise parameter r uniformly together with the regularization parameters p, q in the group structure. In addition, the regularization term ℓ_{p,q} with parameters p ≥ 1, 0 < q < 1 in the objective function E in (1.1) leads to a non-convex, non-Lipschitz optimization problem. This non-smoothness becom...