In this paper, the fault detection problem is investigated for a class of networked multi-rate systems (NMSs) with network-induced fading channels and randomly occurring faults. The stochastic characteristics of the fading measurements are governed by mutually independent random channel coefficients taking values in the known interval [0, 1]. By applying the lifting technique, the system model for observer-based fault detection is established. With the aid of the stochastic analysis approach, sufficient conditions are established under which the stochastic stability of the state-estimation error dynamics is guaranteed and the prescribed H∞ performance constraint on the fault-estimation error dynamics is achieved. Based on the established conditions, the addressed fault detection problem for NMSs is recast as a convex optimization problem that can be solved via semidefinite programming, and the explicit expression of the desired fault detection filter is derived from the feasibility of certain matrix inequalities. The main results are then specialized to networked single-rate systems, which constitute a special case of the NMSs. Finally, two simulation examples are utilized to illustrate the effectiveness of the proposed fault detection method.
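The stability certificate underlying conditions of this kind can be illustrated with a minimal sketch: for a discrete-time error dynamic e_{k+1} = A e_k (as would arise after lifting), stochastic stability reduces to the feasibility of the Lyapunov matrix inequality AᵀPA − P < 0 for some P > 0. The matrix A below is an assumed toy example, not taken from the paper.

```python
# Minimal sketch (assumed toy data): certify stability of e_{k+1} = A e_k
# by solving the discrete Lyapunov equation A^T P A - P = -Q for P > 0.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

A = np.array([[0.5, 0.1],
              [0.0, 0.8]])   # hypothetical Schur-stable lifted error matrix
Q = np.eye(2)                # any positive-definite right-hand side

# solve_discrete_lyapunov(a, q) solves a X a^H - X + q = 0;
# passing a = A.T yields A^T P A - P = -Q.
P = solve_discrete_lyapunov(A.T, Q)

# A positive-definite P certifies stability of the error dynamics.
print(np.linalg.eigvalsh(P).min() > 0)
```

In the paper's setting the filter gains enter the inequality as decision variables, turning the feasibility test into a semidefinite program; the sketch above only checks the certificate for a fixed A.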