The paper provides global optimization algorithms for two particularly difficult nonconvex problems raised by hybrid system identification: switching linear regression and bounded-error estimation. While most works focus on local optimization heuristics without global optimality guarantees, or with guarantees valid only under restrictive conditions, the proposed approach always yields a solution with a certificate of global optimality. This approach relies on a branch-and-bound strategy for which we devise lower bounds that can be efficiently computed. In order to obtain algorithms that scale with the number of data points, we directly optimize the model parameters in a continuous optimization setting, without involving integer variables. Numerical experiments show that the proposed algorithms offer higher accuracy than convex relaxations with a reasonable computational burden for hybrid system identification. In addition, we discuss how bounded-error estimation is related to robust estimation in the presence of outliers and to exact recovery under sparse noise, for which we also obtain promising numerical results.

arXiv:1707.05533v3 [cs.LG] 23 Nov 2017

and the algorithm stops at iteration T.
Convergence

As for the switching regression case studied in Sect. 3.3, convergence is obtained, for both the $\ell_2$ and the $\ell_0$ loss functions, from the tightness of the bounds, under the following assumptions.

Assumption 3. For p = 2 (respectively, p = 0), the global optimum of Problem (39) (resp. Problem (40)) is strictly positive: $\min_{\theta} J^p_{BE}(\theta) > 0$.

Assumption 4. For any $p \in \{0, 2\}$, upper bounds in a box $B = [u, v]$ are computed as $\overline{J}(B) = J^p_{BE}(u)$ or such that $\overline{J}(B) \le J^p_{BE}(u)$.

Theorem 2. Under Assumptions 3-4, the branch-and-bound algorithm described above to minimize $J^p_{BE}$ with $p \in \{0, 2\}$, and with lower bounds $\underline{J}(B)$ computed as in Lemma 6 for p = 2 or Lemma 7 for p = 0, converges in a finite number of iterations for any TOL > 0.
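The scheme above can be sketched generically. The following is a minimal best-first branch-and-bound over boxes, not the paper's implementation: the `f` and `lower_bound` callables are hypothetical placeholders standing in for the objective $J^p_{BE}$ and the bounds of Lemmas 6 and 7, and upper bounds are taken by evaluating the objective at a box corner, in the spirit of Assumption 4. The algorithm stops once the gap between the best upper bound and the smallest remaining lower bound falls below the tolerance.

```python
import heapq

def branch_and_bound(f, lower_bound, box, tol=1e-6, max_iter=100000):
    """Best-first branch-and-bound minimization of f over the box [u, v].

    f           : objective evaluated at a point (yields upper bounds, as in
                  Assumption 4 where the upper bound is the value at corner u)
    lower_bound : cheap lower bound on f over a box (placeholder for bounds
                  such as those of Lemmas 6 and 7 in the paper)
    box         : (u, v) pair of coordinate lists defining the initial box
    """
    u, v = box
    best_x, best_upper = list(u), f(u)
    heap = [(lower_bound(u, v), (list(u), list(v)))]
    for _ in range(max_iter):
        if not heap:
            break
        lb, (u, v) = heapq.heappop(heap)
        if best_upper - lb <= tol:
            break  # certificate: best_x is tol-close to the global optimum
        # Branch: bisect the box along its longest edge
        i = max(range(len(u)), key=lambda k: v[k] - u[k])
        mid = 0.5 * (u[i] + v[i])
        for lo, hi in ((u[i], mid), (mid, v[i])):
            cu, cv = list(u), list(v)
            cu[i], cv[i] = lo, hi
            # Upper bound: evaluate f at the lower corner of the child box
            val = f(cu)
            if val < best_upper:
                best_upper, best_x = val, list(cu)
            clb = lower_bound(cu, cv)
            if clb < best_upper - tol:  # keep only boxes that can improve
                heapq.heappush(heap, (clb, (cu, cv)))
    return best_x, best_upper
```

The finite-iteration convergence of Theorem 2 hinges on the quality of `lower_bound`: the bounds must tighten as boxes shrink so that the gap eventually drops below TOL and exploration terminates with a certificate.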