“…In the last several years, a number of interesting papers have addressed the role of depth and architecture of deep neural networks in approximating functions that possess special regularity properties, such as analytic functions [20,38], differentiable functions [45,52], oscillatory functions [29], and functions in Sobolev or Besov spaces [1,27,30,53]. High-dimensional approximation by deep neural networks has been studied in [16,17,39,48], and its applications to high-dimensional PDEs in [21,25,26,28,31,43,47]. Most of these papers used deep ReLU (Rectified Linear Unit) neural networks, since the rectified linear unit is a simple activation function preferred in many applications.…”
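Since the passage centers on deep ReLU networks, a minimal sketch may help fix ideas: the ReLU activation is simply x ↦ max(0, x), and a deep ReLU network alternates affine maps with this activation. The function names, the 1-D restriction, and the weight/bias values below are purely illustrative, not taken from any of the cited works.

```python
def relu(x):
    """Rectified Linear Unit: relu(x) = max(0, x)."""
    return x if x > 0 else 0.0

def deep_relu_net(x, layers):
    """Evaluate a toy 1-D deep ReLU network.

    `layers` is a hypothetical list of (weight, bias) pairs; each hidden
    layer applies an affine map followed by ReLU, and the final layer is
    affine with no activation.
    """
    for w, b in layers[:-1]:
        x = relu(w * x + b)
    w, b = layers[-1]
    return w * x + b

# Example: one hidden layer, then an affine output layer.
y = deep_relu_net(1.0, [(2.0, -1.0), (1.0, 0.0)])  # relu(2*1 - 1) = 1, then 1*1 + 0 = 1.0
```

Because ReLU is piecewise linear, any such network realizes a continuous piecewise linear function, which is the structural fact underlying many of the approximation results cited above.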