A new approach to the study of the Lyapunov exponents of random matrices is presented. It is proved that, under general assumptions, any family of nonnegative matrices possesses a continuous, concave, positively homogeneous invariant functional ("antinorm") on $\mathbb{R}^d_+$. Moreover, the coefficient corresponding to an invariant antinorm equals the largest Lyapunov exponent. All conditions imposed on the matrices are shown to be essential. As a corollary, a sharp estimate for the asymptotics of the mathematical expectation of the logarithms of the norms of matrix products and of their spectral radii is derived. New upper and lower bounds for Lyapunov exponents are obtained, which lead to an algorithm for computing Lyapunov exponents. The proofs of the main results are outlined.

Lyapunov exponents (the growth exponents of norms of random matrix products) have been extensively studied in the literature ([1]–[10]). In this work we introduce the notion of an invariant functional ("antinorm") for a family of random matrices and prove its existence under certain assumptions. This yields asymptotically sharp estimates for the mean growth rate of the norms and spectral radii of matrix products, as well as new bounds for the Lyapunov exponent. For the sake of simplicity, we restrict our analysis to independent multipliers distributed on a finite set of matrices.

Let $\eta_k = X_k \cdots X_1$, where the $X_i$ are independent and identically distributed on a set $\mathcal{A} = \{A_1, \dots, A_m\}$ of $d \times d$ matrices. Suppose that $\mathbb{P}\{\eta_1 = A_j\} = p_j > 0$ and $\sum_{j=1}^{m} p_j = 1$. It is well known that the quantity $\frac{1}{k} \log \|\eta_k\|$ converges almost surely as $k \to \infty$ to the (maximal) Lyapunov exponent
$$\lambda \;=\; \lim_{k \to \infty} \frac{1}{k}\, \mathbb{E} \log \|\eta_k\|,$$
where $\mathbb{E}$ denotes mathematical expectation ([1], [3]). We set $\mathcal{A}^k = \{A_{d_k} \cdots A_{d_1} \mid A_{d_j} \in \mathcal{A},\ j = 1, \dots, k\}$.
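As an illustration (not part of the original text), the defining limit $\lambda = \lim_{k\to\infty} \frac{1}{k}\,\mathbb{E}\log\|\eta_k\|$ can be estimated by straightforward Monte Carlo simulation. The sketch below tracks the growth of $\|\eta_k v\|$ for a generic starting vector $v$, which for a generic $v$ almost surely gives the maximal Lyapunov exponent; the two nonnegative matrices and the probabilities are hypothetical examples, not taken from the paper.

```python
import numpy as np

def lyapunov_exponent_mc(matrices, probs, k=2000, trials=50, seed=0):
    """Monte Carlo estimate of lambda = lim (1/k) E log ||X_k ... X_1||.

    Accumulates the norm in log scale and renormalizes the iterate
    at every step to avoid floating-point overflow/underflow.
    """
    rng = np.random.default_rng(seed)
    d = matrices[0].shape[0]
    estimates = []
    for _ in range(trials):
        v = np.ones(d) / np.sqrt(d)  # generic starting vector in R^d_+
        log_norm = 0.0
        for _ in range(k):
            # draw X_i i.i.d. from {A_1, ..., A_m} with probabilities p_j
            A = matrices[rng.choice(len(matrices), p=probs)]
            v = A @ v
            n = np.linalg.norm(v)
            log_norm += np.log(n)
            v /= n  # keep the iterate on the unit sphere
        estimates.append(log_norm / k)
    return float(np.mean(estimates))

# Hypothetical example: two nonnegative 2x2 matrices, equal probabilities.
A1 = np.array([[1.0, 1.0], [0.0, 1.0]])
A2 = np.array([[1.0, 0.0], [1.0, 1.0]])
lam = lyapunov_exponent_mc([A1, A2], [0.5, 0.5])
```

This naive estimator converges slowly and carries Monte Carlo error; the bounds and the algorithm announced above are intended to give much sharper, certified estimates than such a simulation.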