While model order reduction is a promising approach for multiscale time-dependent systems that are too large or too expensive to simulate for long times, the resulting reduced order models can suffer from instabilities. We have recently developed a time-dependent renormalization approach to stabilize such reduced models. In the current work, we extend this framework by introducing a parameter that controls the time decay of the memory of such models, and we optimally select this parameter based on limited fully resolved simulations. First, we demonstrate our framework on the inviscid Burgers equation, whose solution develops a finite-time singularity. Our renormalized reduced order models are stable and accurate for long times while being calibrated using only data from a full order simulation taken before the singularity occurs. Furthermore, we apply this framework to the three-dimensional (3D) Euler equations of incompressible fluid flow, where the problem of finite-time singularity formation is still open and where brute-force simulation is only feasible for short times. Our approach allows us to obtain a perturbatively renormalizable model that is stable for long times and includes all the complex effects present in the 3D Euler dynamics. We find that, in each application, the renormalization coefficients display algebraic decay with increasing resolution, and that the parameter which controls the time decay of the memory is problem-dependent.
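The reported observation that the renormalization coefficients decay algebraically with resolution, a(N) ~ c N^(-γ), can be checked by a simple log-log least-squares fit. The sketch below uses synthetic coefficient values (the exponent and prefactor are made up for illustration, not taken from the work); it only demonstrates the fitting procedure, not the authors' calibration method.

```python
import numpy as np

# Hypothetical renormalization coefficients a(N) measured at several
# resolutions N. Algebraic decay a(N) = c * N**(-gamma) becomes linear
# in log-log coordinates: log a = log c - gamma * log N.
# gamma_true and c_true are synthetic stand-ins, not values from the paper.
N = np.array([16, 32, 64, 128, 256], dtype=float)
gamma_true, c_true = 1.5, 2.0
a = c_true * N ** (-gamma_true)

# Least-squares fit of the decay exponent and prefactor in log-log space.
slope, intercept = np.polyfit(np.log(N), np.log(a), 1)
gamma_est, c_est = -slope, np.exp(intercept)
print(gamma_est, c_est)
```

In practice one would replace the synthetic array `a` with coefficients extracted from simulations at increasing resolution and inspect the residuals to confirm the algebraic form.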
Deep neural networks used for image classification often rely on convolutional filters to extract distinguishing features before passing them to a linear classifier. Most interpretability literature focuses on providing semantic meaning to convolutional filters to explain a model's reasoning process and confirm its use of relevant information from the input domain. Fully connected layers can be studied by decomposing their weight matrices using a singular value decomposition, in effect studying the correlations between the rows in each matrix to discover the dynamics of the map. In this work, we define a singular value decomposition for the weight tensor of a convolutional layer, which provides an analogous understanding of the correlations between filters, exposing the dynamics of the convolutional map. We validate our definition using recent results in random matrix theory. By applying the decomposition across the linear layers of an image classification network, we suggest a framework against which interpretability methods might be applied, using hypergraphs to model class separation. Rather than looking to the activations to explain the network, we use the singular vectors with the greatest corresponding singular values for each linear layer to identify those features most important to the network. We illustrate our approach with examples and introduce the DeepDataProfiler library, the analysis tool used for this study.
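One common way to associate an SVD with a convolutional layer is to unfold its weight tensor into a matrix whose rows are the flattened filters and decompose that matrix; the work above defines its own convolutional SVD, so treat the sketch below as an illustrative stand-in rather than the paper's definition. The weight shapes and the random weights are hypothetical.

```python
import numpy as np

# Hypothetical conv weights with shape (out_channels, in_channels, kH, kW).
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 3, 3, 3))

# Unfold: each row of M is one filter, flattened to length in_channels*kH*kW.
M = W.reshape(W.shape[0], -1)           # shape (64, 27)

# SVD of the unfolded weights; singular values are returned in
# descending order, so the leading right singular vectors describe the
# strongest shared structure (correlations) across the filters.
U, S, Vt = np.linalg.svd(M, full_matrices=False)

k = 5
top_modes = Vt[:k]                      # dominant filter-space directions
print(S[:3])
```

Keeping only the singular vectors with the largest singular values mirrors the paper's strategy of using the dominant modes of each linear map, rather than the activations, to identify the features the network relies on most.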
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.