“…Subsequently (in the 2000s and more recently), the theory was developed as a suboptimal guaranteed approach to systems with internal perturbations [27], towards convexification of anisotropy-based control synthesis [30,37,38] and suboptimal observer design [39,40], and also taking into account nonzero-mean input disturbances [28] and multiplicative noise [29] in the system, to mention some of the developments. The results of the anisotropy-based theory have been adapted to descriptor systems in [2].² Alternative proximity measures (such as the relative Rényi entropy and the Hellinger distance), instead of the Kullback–Leibler informational divergence, in application to the anisotropy functional are discussed in [10].³

¹ in the sense of statistical correlations between the entries of a multivariable noise at the same or different moments of time

² where there are mathematical and bibliographic errors and inaccuracies: for example, [2, Definition 3.1 on p. 61] introduces erroneous dimensions of vectors and corresponding spaces in the definition of anisotropy and a wrong sign of the differential entropy; [2, p. 62] specifies a wrong analyticity domain for transfer functions; the order of noncommuting matrix factors in the factorization of spectral densities in [2, Eq.…”
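For context on the anisotropy functional mentioned above (the following definition is standard in the anisotropy-based literature but is not spelled out in this excerpt): the anisotropy of a zero-mean Gaussian random vector in R^m with covariance Σ is its minimal Kullback–Leibler divergence from a zero-mean isotropic Gaussian N(0, λI), the minimum being attained at λ = tr(Σ)/m and equal to −(1/2) ln det(mΣ/tr Σ). The sketch below illustrates this; all function names are illustrative, not from any of the cited works.

```python
import numpy as np

def gaussian_kl_to_isotropic(sigma, lam):
    """KL divergence D(N(0, sigma) || N(0, lam * I)) between zero-mean Gaussians."""
    m = sigma.shape[0]
    # Standard closed form for Gaussian KL divergence with a scalar covariance target.
    return 0.5 * (np.trace(sigma) / lam - m + m * np.log(lam)
                  - np.linalg.slogdet(sigma)[1])

def anisotropy(sigma):
    """Anisotropy of a zero-mean Gaussian vector with covariance sigma:
    the minimum of the KL divergence above over lam > 0, which is attained
    at lam = trace(sigma)/m and equals -0.5 * ln det(m*sigma / trace(sigma))."""
    m = sigma.shape[0]
    return -0.5 * np.linalg.slogdet(m * sigma / np.trace(sigma))[1]

# The anisotropy vanishes exactly for isotropic noise and is positive otherwise,
# e.g. anisotropy(np.eye(3)) is 0, while anisotropy(np.diag([1.0, 4.0])) > 0.
```

By the AM–GM inequality applied to the eigenvalues of Σ, this quantity is nonnegative and zero precisely when Σ is a scalar multiple of the identity, which is why it serves as a measure of the "colouredness" of a noise.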