Abstract. The total least squares (TLS) estimation problem for random systems arises widely in engineering and science, for example in signal processing, automatic control, and system theory. In the linear Gaussian case, mature TLS parameter estimation algorithms are available. In the non-Gaussian case, research remains limited and shallow; the error entropy criterion (EEC) and the minimum error entropy (MEE) method based on it have attracted attention, but the traditional MEE assumes that only the output data contain errors, so it cannot obtain the optimal solution when the inputs are also noisy. In this paper, we consider noise in both the input and output data, derive the total error entropy criterion (TEEC), and propose the corresponding TLS method, named the minimum total error entropy (MTEE) method. In addition, the derived method is simulated, and the simulation results confirm the correctness of the algorithm.
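As background for the criterion discussed in the abstract, the sketch below illustrates the standard (output-noise-only) MEE approach: minimizing Renyi's quadratic entropy of the residuals, which is equivalent to maximizing the kernel-estimated information potential. This is a minimal illustration of the baseline MEE criterion, not the paper's MTEE method; the linear model, Gaussian kernel bandwidth `sigma`, and the gradient-ascent learning rate are assumptions chosen for the example.

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    """Empirical information potential V(e) of Renyi's quadratic entropy.
    MEE maximizes V(e), equivalently minimizes H2(e) = -log V(e)."""
    diffs = errors[:, None] - errors[None, :]
    # Gaussian kernel on pairwise error differences (bandwidth sigma*sqrt(2))
    k = np.exp(-diffs**2 / (4 * sigma**2)) / (2 * sigma * np.sqrt(np.pi))
    return k.mean()

def mee_fit(X, y, sigma=1.0, lr=0.5, iters=300):
    """Gradient ascent on the information potential over the weights w
    of a linear model y ~ X @ w, assuming noise in the outputs only
    (the standard MEE setting; the kernel normalization constant is
    dropped from the gradient since lr absorbs it)."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        e = y - X @ w
        d = e[:, None] - e[None, :]                  # pairwise error differences
        k = np.exp(-d**2 / (4 * sigma**2))           # unnormalized kernel values
        # dV/dw = (1 / (2 sigma^2 n^2)) * sum_ij k_ij * d_ij * (x_i - x_j)
        grad = (k * d)[:, :, None] * (X[:, None, :] - X[None, :, :])
        w += lr * grad.mean(axis=(0, 1)) / (2 * sigma**2)
    return w
```

In the TLS setting motivating the paper, the regressor matrix X is itself noisy, which is exactly the case this output-noise-only criterion does not handle and the proposed TEEC/MTEE is meant to address.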