Copper mobility and availability in soil environments are largely controlled by Cu sorption reactions and by its chemical forms. In this study, equilibrium and kinetic batch experiments and a chemical fractionation scheme were carried out to evaluate the effects of drinking water treatment residual (DWTR) application on the sorption and bioavailability of Cu in three arid-zone soils of differing properties. Distinct differences in the amounts of Cu sorbed were observed among the soils, with the highest sorption associated with high clay, organic matter (OM), and cation exchange capacity (CEC) contents. The quantity of Cu sorbed by the three soils increased markedly as the DWTR application rate was raised from 2% to 12% (w/w). Freundlich distribution coefficient (Kf) values indicated that Cu sorption affinity for the studied soils followed the order Typic Torrifluvent (TF) > Typic Calciorthid (CO) > Typic Torripsamment (TP). Cu sorption was initially fast, with 95, 92, and 73% of the Cu sorbed within the first 60 min on the unamended TF, CO, and TP soils, respectively. Following this initial fast reaction, sorption continued for 63 h, after which only a small additional amount (2-6%) was sorbed. The parabolic diffusion and power function models described Cu sorption kinetics equally well for all sorbents studied, with high R² values and low standard errors (SE). Addition of DWTR markedly reduced the non-residual (NORS) Cu fraction and simultaneously increased the residual (RS) Cu fraction. At the 12% application rate, DWTR decreased the NORS-Cu of the unamended soils from 10.9 to 4.2%, from 50.2 to 21.5%, and from 78.6 to 33.3% in the TF, CO, and TP soils, respectively. Our results suggest that as the DWTR application rate to Cu-contaminated soils increased, more Cu became associated with the residual fraction, which decreased potential Cu mobility and bioavailability in these soils.
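The Freundlich distribution coefficient (Kf) mentioned above is typically obtained by fitting the isotherm q = Kf·C^(1/n) to batch equilibrium data, most simply via log-log linearization (log q = log Kf + (1/n)·log C). A minimal sketch of that fit is shown below; the concentration/sorption data here are hypothetical illustrative values, not values from this study:

```python
import math

# Hypothetical equilibrium data (illustrative only, not from the study):
# equilibrium Cu concentration C (mg/L) and sorbed amount q (mg/kg).
C = [0.5, 1.0, 2.0, 4.0, 8.0]
q = [30.0, 50.0, 83.0, 138.0, 229.0]

def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Freundlich isotherm q = Kf * C**(1/n) linearizes to
# log q = log Kf + (1/n) * log C, so a straight-line fit in
# log-log space recovers both parameters.
slope, intercept = linfit([math.log10(c) for c in C],
                          [math.log10(v) for v in q])
Kf = 10 ** intercept   # distribution coefficient (sorption capacity)
inv_n = slope          # 1/n, sorption intensity

print(f"Kf = {Kf:.1f}, 1/n = {inv_n:.2f}")
```

Comparing Kf across soils and DWTR rates, as done in the abstract, then amounts to repeating this fit for each sorbent's isotherm data.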