Differentiation is an important task in control, observation and fault detection. Levant's differentiator is unique in that it can exactly and robustly estimate the derivatives of a signal whose high-order derivative is bounded. However, its convergence time, although finite, grows unboundedly with the norm of the initial differentiation error, so it is uncertain when the estimated derivative becomes exact. In this paper we propose an extension of Levant's differentiator so that the worst-case convergence time can be assigned arbitrarily, independently of the initial condition, i.e., the estimation converges in fixed time. We also propose a family of continuous differentiators and provide a unified Lyapunov framework for their analysis and design.
I. INTRODUCTION

Given a (Lebesgue-measurable) signal f(t) defined on [0, ∞), the objective of a differentiator is to estimate some of its time derivatives as closely as possible. Usually, the signal f(t) is composed of a base signal f_0(t), which we want to differentiate and which is assumed to be n-times differentiable, and a noise signal ν(t), which we assume to be uniformly bounded, i.e., f(t) = f_0(t) + ν(t).
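To make the setting concrete, the following is a minimal sketch (not the fixed-time extension proposed in this paper) of the classical first-order Levant differentiator, Euler-discretized in Python. The gain choices 1.5√L and 1.1L are the standard ones from the literature; the function name and the test signal are illustrative assumptions.

```python
import numpy as np

def levant_diff(f, dt, L=1.0):
    """First-order Levant (super-twisting) differentiator, Euler-discretized.

    z0 tracks the signal f, z1 tracks its derivative f'. L is an assumed
    bound on |f''|; the gains 1.5*sqrt(L) and 1.1*L are the classic choices.
    """
    z0, z1 = f[0], 0.0
    est = np.empty_like(f)
    for k, fk in enumerate(f):
        e = z0 - fk
        # Continuous injection term plus the current derivative estimate.
        z0 += dt * (-1.5 * np.sqrt(L) * np.sqrt(abs(e)) * np.sign(e) + z1)
        # Discontinuous term driving the derivative estimate.
        z1 += dt * (-1.1 * L * np.sign(e))
        est[k] = z1
    return est

dt = 1e-4
t = np.arange(0.0, 5.0, dt)
f0 = np.sin(t)                    # base signal; |f0''| <= 1, so L = 1
d_est = levant_diff(f0, dt)
err = np.abs(d_est - np.cos(t))   # true derivative is cos(t)
print(err[t > 4.0].max())         # residual error after convergence
```

In the noise-free, exactly-sampled limit the estimate becomes exact in finite time; with the Euler discretization above, a small residual error of the order of the step size remains, and the time needed to reach it grows with the initial error z1(0) − f_0'(0), which is precisely the limitation the fixed-time extension addresses.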