Abstract. In a Hilbert space setting $H$, given $\Phi : H \to \mathbb{R}$ a convex, continuously differentiable function and $\alpha$ a positive parameter, we consider the inertial system with Asymptotic Vanishing Damping
$$ \mathrm{(AVD)}_\alpha \qquad \ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \nabla\Phi(x(t)) = 0. $$
Depending on the value of $\alpha$ with respect to $3$, we give a complete picture of the convergence properties as $t \to +\infty$ of the trajectories generated by $\mathrm{(AVD)}_\alpha$, as well as of the iterations of the corresponding algorithms. As shown by Su, Boyd, and Candès, the case $\alpha = 3$ corresponds to a continuous version of the accelerated gradient method of Nesterov, with the rate of convergence $\Phi(x(t)) - \min_H \Phi = O(t^{-2})$ for $\alpha \geq 3$. Our main result concerns the subcritical case $\alpha \leq 3$, where we show that $\Phi(x(t)) - \min_H \Phi = O(t^{-2\alpha/3})$. This overall picture reveals a continuous variation of the rate of convergence of the values, $\Phi(x(t)) - \min_H \Phi = O(t^{-p(\alpha)})$, with respect to $\alpha > 0$: the coefficient $p(\alpha)$ increases linearly up to $2$ as $\alpha$ goes from $0$ to $3$, and then displays a plateau. We then examine the convergence of the trajectories to optimal solutions. When $\alpha > 3$, we obtain weak convergence of the trajectories, thus recovering the recent results of May and of Attouch, Chbani, Peypouquet, and Redont. As a new result, in the one-dimensional framework, for the critical value $\alpha = 3$, we prove convergence of the trajectories without any restrictive hypothesis on the convex function $\Phi$. In the second part of the paper, we study the convergence properties of the associated forward-backward inertial algorithms. They aim to solve structured convex minimization problems of the form $\min \{\Theta := \Phi + \Psi\}$, with $\Phi$ smooth and $\Psi$ nonsmooth. The continuous dynamics serves as a guideline for this study, and is very useful for suggesting Lyapunov functions. We obtain a similar rate of convergence for the sequence of iterates $(x_k)$: for $\alpha \leq 3$ we have $\Theta(x_k) - \min \Theta = O(k^{-2\alpha/3})$, and for $\alpha > 3$, $\Theta(x_k) - \min \Theta = o(k^{-2})$. We conclude this study by showing that the results are robust with respect to external perturbations.
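To make the abstract's objects concrete, the following is a minimal numerical sketch (not taken from the paper) of a forward-backward inertial scheme of the kind described above, using the standard extrapolation coefficient $(k-1)/(k-1+\alpha)$ associated with $\mathrm{(AVD)}_\alpha$. The function names (`inertial_forward_backward`, `soft_threshold`) and the toy lasso problem are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1 (the nonsmooth part Psi here)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_forward_backward(grad_phi, prox_psi, x0, step, alpha, n_iter):
    """Sketch of an inertial forward-backward iteration for min Phi + Psi:
        y_k     = x_k + (k-1)/(k-1+alpha) * (x_k - x_{k-1})   # inertial extrapolation
        x_{k+1} = prox_{step*Psi}(y_k - step * grad Phi(y_k)) # forward-backward step
    """
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, n_iter + 1):
        y = x + (k - 1) / (k - 1 + alpha) * (x - x_prev)
        x_prev = x
        x = prox_psi(y - step * grad_phi(y), step)
    return x

# Toy structured problem: min 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad Phi
grad = lambda x: A.T @ (A @ x - b)
prox = lambda v, s: soft_threshold(v, lam * s)
x_star = inertial_forward_backward(grad, prox, np.zeros(10),
                                   step=1.0 / L, alpha=3.0, n_iter=500)
```

With $\alpha = 3$ this coincides with the familiar FISTA-type extrapolation; varying `alpha` across the critical value $3$ is what the paper's rate analysis addresses.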