It is shown that every asymptotically controllable system can be globally stabilized by means of some (discontinuous) feedback law. The stabilizing strategy is based on pointwise optimization of a smoothed version of a control-Lyapunov function, iteratively sending trajectories into smaller and smaller neighborhoods of a desired equilibrium. A major technical problem, and one of the contributions of the present paper, concerns the precise meaning of "solution" when using a discontinuous controller.

* A more cumbersome but descriptive notation would be "clss-stabilizing", for stabilization under "closed-loop system sampling".
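As background for the abstract above, the decrease condition defining a control-Lyapunov function can be sketched as follows (the notation, including the system `\dot{x} = f(x,u)` and control set `U`, is assumed here and does not appear in the abstract; in the nonsmooth setting of the paper, the gradient would be replaced by suitable generalized directional derivatives):

```latex
% Sketch of the control-Lyapunov function condition (assumed notation):
% V is smooth, positive definite, and proper, and at every nonzero state
% some admissible control value makes V strictly decrease.
\[
  \inf_{u \in U} \, \langle \nabla V(x), \, f(x,u) \rangle < 0
  \qquad \text{for all } x \neq 0 .
\]
```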
ABSTRACT. A theory of generalized gradients for a general class of functions is developed, as well as a corresponding theory of normals to arbitrary closed sets. It is shown how these concepts subsume the usual gradients and normals of smooth functions and manifolds, and the subdifferentials and normals of convex analysis. A theorem is proved concerning the differentiability properties of a function of the form max{g(x, u) : u ∈ U}. This result unifies and extends some theorems of Danskin and others. The results are then applied to obtain a characterization of flow-invariant sets which yields theorems of Bony and Brezis as corollaries.

Introduction. Some of the most important recent advances in optimization have come about as a result of the systematic replacement of smoothness assumptions by convexity. This is exemplified by the work of Rockafellar [12], [13], which has extended the boundaries of treatable problems and has in addition led to new techniques for dealing with problems of a familiar nature.

It is natural to ask whether analogous results can be proven without either smoothness or convexity. A general theory of necessary conditions for such problems has been obtained [3] and the results described in [4]. The conditions are expressed, in part, by means of generalized "gradients". They subsume the results of the smooth and convex cases and they yield, among other things, significant extensions of the Pontryagin maximum principle of optimal control theory.

We describe in this article the generalized theory of gradients and some of its consequences. As mentioned, the principal application of this theory has been to variational problems, but the two main applications given here concern the differentiability properties of max functions (§2) and flow-invariant sets (§4).
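For orientation, the standard construction of the generalized gradient of a locally Lipschitz function f can be written via the generalized directional derivative (the symbols f° and ∂f are assumed here as the now-customary notation, not quoted from the text above):

```latex
% Generalized directional derivative of a locally Lipschitz f at x in
% direction v (assumed standard notation):
\[
  f^{\circ}(x; v)
  = \limsup_{\substack{y \to x \\ t \downarrow 0}}
    \frac{f(y + t v) - f(y)}{t},
\]
% and the generalized gradient is the set of linear functionals dominated
% by it:
\[
  \partial f(x)
  = \{\, \zeta : \langle \zeta, v \rangle \le f^{\circ}(x; v)
    \ \text{for all } v \,\}.
\]
```

When f is smooth this set reduces to {∇f(x)}, and when f is convex it coincides with the subdifferential of convex analysis, which is the sense in which the abstract says the concept subsumes both cases.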
We establish that differential inclusions corresponding to upper semicontinuous multifunctions are strongly asymptotically stable if and only if there exists a smooth Lyapunov function. Since well-known concepts of generalized solutions of differential equations with discontinuous right-hand side can be described in terms of solutions of certain related differential inclusions involving upper semicontinuous multifunctions, this result gives a Lyapunov characterization of asymptotic stability of either Filippov or Krasovskii solutions for differential equations with discontinuous right-hand side. In the study of weak (as opposed to strong) asymptotic stability, the existence of a smooth Lyapunov function is rather exceptional. However, the methods employed in treating the strong case of asymptotic stability are applied to yield a necessary condition for the existence of a smooth Lyapunov function for weakly asymptotically stable differential inclusions; this is an extension, to the context of Lyapunov functions, of Brockett's celebrated "covering condition" from continuous feedback stabilization theory.
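A smooth Lyapunov condition of the kind characterized above can be sketched as follows (the inclusion `\dot{x} \in F(x)` and the rate function W are assumed notation for illustration; the strong case requires decrease along every velocity in F(x)):

```latex
% Sketch of a smooth Lyapunov pair (V, W) for the differential inclusion
% \dot{x} \in F(x) (assumed notation): V positive definite and proper,
% W positive definite, with decrease enforced over ALL selections of F,
% which is what "strong" asymptotic stability demands.
\[
  \max_{v \in F(x)} \, \langle \nabla V(x), \, v \rangle \le -W(x)
  \qquad \text{for all } x .
\]
```

For weak asymptotic stability the maximum would be replaced by a minimum over F(x), and it is in that case that, as the abstract notes, smooth Lyapunov functions are rather exceptional.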