“…In the last three decades, there have been many studies of the approximation and representation properties of fully connected neural networks with a single hidden layer [17,5,4,26,2,20,37,43,38] and of deep neural networks (DNNs) with more than one hidden layer [29,42,35,44,28,1,36,13,9,32,12]. To our knowledge, however, there are very few studies of the approximation properties of CNNs [3,45,31,46,34,23]. In [3], the authors consider a one-dimensional (1D) ReLU-CNN consisting of a fully connected layer followed by a sequence of convolutional layers and, by showing that the identity operator can be realized by the underlying sequence of convolutional layers, they obtain the approximation property of the CNN directly from that of the underlying fully connected layer.…”
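The key step attributed to [3] — realizing the identity operator with convolutional layers so that a CNN inherits the approximation power of its fully connected layer — can be illustrated with a minimal numerical sketch. This is not code from the cited paper; it is an assumed, simplified illustration of two standard facts it relies on: a 1D convolution with a delta kernel leaves a signal unchanged, and the identity map can be recovered through ReLU activations via x = ReLU(x) − ReLU(−x).

```python
import numpy as np

# Arbitrary 1D input signal (hypothetical example data).
x = np.array([0.5, -1.2, 3.0, 0.7])

# Fact 1: convolution with a centered delta kernel is the identity operator.
delta = np.array([0.0, 1.0, 0.0])
y_conv = np.convolve(x, delta, mode="same")

# Fact 2: the identity passes through ReLU nonlinearities using two channels,
# since x = ReLU(x) - ReLU(-x) for every real x.
relu = lambda t: np.maximum(t, 0.0)
y_relu = relu(x) - relu(-x)

print(np.allclose(x, y_conv))  # the delta-kernel convolution preserves x
print(np.allclose(x, y_relu))  # the two-channel ReLU construction preserves x
```

Stacking such identity-realizing convolutional layers on top of a fully connected layer leaves the layer's output untouched, which is why the CNN's approximation error can be bounded by that of the fully connected layer alone.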