In certain applications, such as genomic data, adjacent genes may share similar functions, so encouraging smoothness between the coefficients of adjacent genes is desirable; the standard lasso penalty, however, does not account for such structure. At the same time, in high-dimensional statistics, datasets are often contaminated by outliers or contain variables with heavy-tailed distributions, which renders many conventional methods inadequate. To address both issues, in this paper we propose an adaptive Huber regression for robust estimation and inference, in which the fused lasso penalty is used to encourage sparsity of the coefficients as well as sparsity of their successive differences, i.e., local constancy of the coefficient profile. Theoretically, we establish nonasymptotic estimation error bounds in the $\ell_2$-norm under the high-dimensional setting. The proposed estimator is defined by a convex, nonsmooth, and separable optimization problem, so the alternating direction method of multipliers (ADMM) can be employed to compute it. Finally, simulation studies and an analysis of real cancer data illustrate that the proposed method is more robust and delivers better prediction.
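As a rough sketch of the idea (the notation below is illustrative; the exact form of the loss, the choice of the robustification parameter, and the tuning parameters are specified later in the paper), the estimator can be viewed as minimizing a Huber loss penalized by a fused lasso term,
\[
  \widehat{\beta} \in \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p}
  \; \frac{1}{n} \sum_{i=1}^{n} \ell_{\tau}\!\left(y_i - x_i^{\top}\beta\right)
  \;+\; \lambda_1 \sum_{j=1}^{p} |\beta_j|
  \;+\; \lambda_2 \sum_{j=2}^{p} |\beta_j - \beta_{j-1}|,
  \qquad
  \ell_{\tau}(u) \;=\;
  \begin{cases}
    u^2/2, & |u| \le \tau,\\[2pt]
    \tau |u| - \tau^2/2, & |u| > \tau,
  \end{cases}
\]
where $\tau > 0$ is the robustification parameter (adapted to the sample size, dimension, and moment conditions), $\lambda_1 \ge 0$ controls sparsity of the coefficients, and $\lambda_2 \ge 0$ controls sparsity of their successive differences.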