“…IRLS is a practical framework for optimizing non-smooth, possibly non-convex, high-dimensional objectives: it minimizes a sequence of quadratic models that majorize the objective. Due to its ease of implementation and favorable data-efficiency, it has been widely used in compressed sensing [GR97, CY08, DDFG10, LXY13, FPRW16, KMVS21], robust statistics [HW77, AH15, MGJK19], computer vision [CG17, LC22, SWL22], low-rank matrix recovery and completion [FRW11, MF12, KS18, KMV21], and inverse problems involving group sparsity [CHHL14, ZB15, CHLH18]. Recently, it has been shown [LK23] that dictionary learning techniques can be incorporated into IRLS schemes for sparse and low-rank recovery, allowing a sparsifying dictionary to be learned while the solution is recovered.…”
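To make the "quadratic models that majorize the objective" concrete, here is a minimal sketch of classical IRLS for the basis-pursuit problem min ‖x‖₁ subject to Ax = b, the compressed-sensing setting cited above. This is an illustrative instance under assumed parameters (the smoothing constant `eps` and iteration count are arbitrary choices), not any specific method from the cited works: at each step the ℓ₁ norm is majorized at the current iterate by a weighted quadratic, and the resulting weighted least-squares problem is solved in closed form.

```python
import numpy as np

def irls_l1(A, b, eps=1e-8, iters=50):
    """IRLS for min ||x||_1 s.t. Ax = b.

    Each iteration replaces |x_i| by the quadratic majorizer
    x_i^2 / (2 w_i) + w_i / 2 with w_i = |x_i| + eps, then solves
    the resulting equality-constrained weighted least squares.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # least-squares initialization
    for _ in range(iters):
        Winv = np.diag(np.abs(x) + eps)        # inverse weights W^{-1}
        # closed-form minimizer: x = W^{-1} A^T (A W^{-1} A^T)^{-1} b
        x = Winv @ A.T @ np.linalg.solve(A @ Winv @ A.T, b)
    return x

# tiny demo: recover a 1-sparse vector from 8 random measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 20))
x_true = np.zeros(20)
x_true[3] = 2.0
b = A @ x_true
x_hat = irls_l1(A, b)
```

In this regime (a 1-sparse vector, Gaussian measurements) the iterates concentrate on the true support, illustrating the data-efficiency the paragraph refers to; the smoothing parameter `eps` prevents division by zero as coordinates shrink toward zero.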