Abstract: The electrocardiogram (ECG) is an effective and non-invasive indicator for the detection and prevention of arrhythmia. ECG signals are susceptible to noise contamination, which can lead to errors in ECG interpretation, so careful preprocessing is important for accurate analysis. In this paper, a two-stage noise reduction method based on deep learning is proposed, with one model for each stage. In the first stage, a one-dimensional U-net model is designed to remove as much noise as possible from the ECG signal. In the second stage, a one-dimensional DR-net model reconstructs the ECG signal and corrects the waveform distortion introduced by the noise removal in the first stage. Both the U-net and the DR-net are built from convolutional layers and achieve end-to-end mapping from noisy ECG signals to clean ECG signals. The ECG data used in this paper are from CPSC2018, and the noise signals are from the MIT-BIH Noise Stress Test Database (NSTDB). In the experiments, the improvement in signal-to-noise ratio (SNR_imp), the decrease in root mean square error (RMSE_de), and the correlation coefficient (P) are used to evaluate the performance of the networks. The two-stage method is compared with an FCN and with the U-net alone. The experimental results show that the two-stage method can eliminate complex noise in the ECG signal while retaining its characteristic waveform. Based on these results, we believe the proposed method has good prospects for clinical application.
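The three evaluation metrics named in the abstract (SNR_imp, RMSE_de, and P) can be illustrated with a minimal sketch using their common definitions; this is an assumption for clarity, as the paper's exact formulas may differ. Here SNR_imp is the output SNR minus the input SNR, RMSE_de is the RMSE of the noisy input minus the RMSE of the denoised output (both measured against the clean reference), and P is the Pearson correlation between the denoised and clean signals.

```python
import numpy as np

def snr_db(signal, residual):
    """SNR in dB of a reference signal relative to a residual (noise) component."""
    return 10 * np.log10(np.sum(signal ** 2) / np.sum(residual ** 2))

def rmse(a, b):
    """Root mean square error between two equal-length 1-D signals."""
    return np.sqrt(np.mean((a - b) ** 2))

def denoising_metrics(clean, noisy, denoised):
    """Illustrative denoising metrics; names follow the abstract, formulas are
    common definitions assumed here, not taken from the paper.

    Returns (SNR_imp, RMSE_de, P).
    """
    snr_in = snr_db(clean, noisy - clean)       # SNR before denoising
    snr_out = snr_db(clean, denoised - clean)   # SNR after denoising
    snr_imp = snr_out - snr_in                  # improvement in SNR
    rmse_de = rmse(noisy, clean) - rmse(denoised, clean)  # decrease in RMSE
    p = np.corrcoef(denoised, clean)[0, 1]      # correlation with clean signal
    return snr_imp, rmse_de, p
```

A successful denoiser should yield SNR_imp > 0, RMSE_de > 0, and P close to 1 on held-out signals.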