Despite its great success, machine learning can reach its limits when training data are insufficient. A potential solution is to additionally integrate prior knowledge into the training process, which leads to the notion of informed machine learning. In this paper, we present a structured overview of various approaches in this field. We provide a definition and propose a concept for informed machine learning that illustrates its building blocks and distinguishes it from conventional machine learning. We introduce a taxonomy that serves as a classification framework for informed machine learning approaches. It considers the source of the knowledge, its representation, and its integration into the machine learning pipeline. Based on this taxonomy, we survey related research and describe how different knowledge representations, such as algebraic equations, logic rules, or simulation results, can be used in learning systems. This evaluation of numerous papers on the basis of our taxonomy uncovers key methods in the field of informed machine learning.
Abstract. We use tools from n-dimensional Brownian motion in conjunction with the Feynman-Kac formulation of heat diffusion to study nodal geometry on a compact Riemannian manifold M. On the one hand, we extend a theorem of Lieb (see [L]) and prove that any nodal domain Ω_λ almost fully contains a ball whose radius is made precise by Theorem 1.6 below. This also gives a slight refinement of a result by Mangoubi concerning the inradius of nodal domains ([Man2]). On the other hand, we also prove that no nodal domain can be contained in a reasonably narrow tubular neighbourhood of unions of finitely many submanifolds inside M (this is Theorem 1.5).
We study solutions of uniformly elliptic PDE with Lipschitz leading coefficients and bounded lower-order coefficients. We extend previous results of A. Logunov ([L]) concerning nodal sets of harmonic functions and, in particular, prove polynomial upper bounds on interior nodal sets of Steklov eigenfunctions in terms of the corresponding eigenvalue λ.
Let Ω ⊂ R^n be a bounded domain satisfying a Hayman-type asymmetry condition, and let D be an arbitrary bounded domain referred to as the "obstacle". We are interested in the behaviour of the first Dirichlet eigenvalue λ_1(Ω \ (x + D)). First, we prove an upper bound on λ_1(Ω \ (x + D)) in terms of the distance of the set x + D to the set of maximum points x_0 of the first Dirichlet ground state φ_λ1 > 0 of Ω. In short, a direct corollary is that if μ_Ω is large enough in terms of λ_1(Ω), then all maximizer sets x + D of μ_Ω are close to each maximum point x_0 of φ_λ1. Second, we discuss the distribution of φ_λ1 and the possibility of inscribing wavelength balls at a given point in Ω. Finally, we specialize our observations to convex obstacles D and show that if μ_Ω is sufficiently large with respect to λ_1(Ω), then all maximizers x + D of μ_Ω contain all maximum points x_0 of φ_λ1.