We consider the problem of exact and inexact matching of weighted undirected graphs, in which a bijective correspondence is sought to minimize a quadratic weight disagreement. This computationally challenging problem is often relaxed as a convex quadratic program, in which the space of permutations is replaced by the space of doubly stochastic matrices. However, the applicability of such a relaxation is poorly understood. We define a broad class of friendly graphs characterized by an easily verifiable spectral property. We prove that for friendly graphs, the convex relaxation is guaranteed to find the exact isomorphism or certify its nonexistence. This result is further extended to approximately isomorphic graphs, for which we develop an explicit bound on the amount of weight disagreement under which the relaxation is guaranteed to find the globally optimal approximate isomorphism. We also show that in many cases, the graph matching problem can be further harmlessly relaxed to a convex quadratic program with only n separable linear equality constraints, which is substantially more efficient than the standard relaxation involving 2n equality and n² inequality constraints. Finally, we show that our results remain valid for unfriendly graphs if additional information in the form of seeds or attributes is allowed, with the latter satisfying an easy-to-verify spectral characteristic.

graph isomorphism | graph matching | permutation | convex relaxation

Graphs are a natural abstraction in a variety of problems and are particularly useful for modeling structures that frequently arise in different domains of science and engineering. In many applications, graphs have to be compared or brought into correspondence.
The term "graph isomorphism," or the less precise term "graph matching" (used mainly in the applied community), refers to a class of computational problems consisting of finding an optimal correspondence between the vertices of two graphs that minimizes adjacency disagreement. The uses of graph models in general, and of graph matching in particular, are too numerous to allow a comprehensive review within the scope of this paper. In what follows, we list just a few prominent ones, referring the reader to a (partial) review of applications with a particular emphasis on the domain of pattern recognition (1). In computer vision and pattern recognition, graph matching is used for stereo vision and 3D reconstruction (2), object detection and recognition (3, 4), in particular optical character recognition (5), and image and video indexing and retrieval (6). In biometric applications, graph-based techniques have been widely used for identification tasks implemented by means of elastic graph matching. These include, among others, face recognition and pose estimation (7) and fingerprint recognition (8). In biomedical applications, graphs have been used to model vascular structures and, more recently, to represent connections between neurons (9). In data mining, graphs are used to model networks, including the Web and social ...
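As a concrete illustration of the relaxation discussed in the abstract above, the following is a small self-contained sketch (an illustrative reconstruction, not the authors' code). It builds a random weighted graph and an isomorphic copy, solves the equality-constrained convex relaxation of min ||AP − PB||_F² in closed form through its KKT system (the nonnegativity constraints are dropped, which the abstract suggests is often harmless), and rounds the result to a permutation with the Hungarian algorithm:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n = 6
# generic weighted symmetric adjacency (generic weights make the graph "friendly")
A = rng.random((n, n)); A = (A + A.T) / 2; np.fill_diagonal(A, 0)
pi = rng.permutation(n)
Pt = np.eye(n)[pi]                    # ground-truth permutation matrix
B = Pt @ A @ Pt.T                     # isomorphic copy of A

# Relaxed problem: min ||A P - P B||_F^2 subject to rows and columns of P
# summing to 1 (inequality constraints dropped).  Using column-major vec:
# vec(A P) = (I kron A) vec(P),  vec(P B) = (B^T kron I) vec(P).
I = np.eye(n)
M = np.kron(I, A) - np.kron(B.T, I)
C = np.vstack([np.kron(np.ones((1, n)), I),    # row sums:    P 1 = 1
               np.kron(I, np.ones((1, n)))])   # column sums: 1^T P = 1^T
# KKT system of the equality-constrained convex QP; lstsq handles the
# one-dimensional redundancy among the 2n constraints
K = np.block([[2 * M.T @ M, C.T],
              [C, np.zeros((2 * n, 2 * n))]])
rhs = np.concatenate([np.zeros(n * n), np.ones(2 * n)])
sol = np.linalg.lstsq(K, rhs, rcond=None)[0]
P = sol[:n * n].reshape(n, n, order='F')

# project the (doubly stochastic) solution onto the nearest permutation
r, c = linear_sum_assignment(-P)
P_hat = np.zeros((n, n)); P_hat[r, c] = 1.0
print(np.allclose(P_hat @ B @ P_hat.T, A))     # exact isomorphism recovered
```

With generic random weights the graphs are friendly with probability 1, so the unique minimizer of the relaxation is the permutation itself and the rounding step is exact.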
An important tool in information analysis is dimensionality reduction. There are various approaches for simplifying large data by scaling its dimensions down that play a significant role in recognition and classification tasks. The efficiency of dimension reduction tools is measured in terms of memory and computational complexity, which are usually a function of the number of given data points. Sparse local operators that involve substantially less than quadratic complexity at one end, and faithful multiscale models with quadratic cost at the other, make the design of a dimension reduction procedure a delicate balance between modeling accuracy and efficiency. Here, we combine the benefits of both and propose a low-dimensional multiscale modeling of the data at a modest computational cost. The idea is to project the classical multidimensional scaling problem into the spectral domain of the data, extracted from its Laplace-Beltrami operator. There, embedding into a low-dimensional Euclidean space is accomplished while optimizing for a small number of coefficients. We provide theoretical support and demonstrate that, by working in the natural eigenspace of the data, one can reduce the process complexity while maintaining the model fidelity. As examples, we efficiently canonize nonrigid shapes by embedding their intrinsic metric into R³, a method often used for matching and classifying almost isometric articulated objects. Finally, we demonstrate the method by exposing the style in which handwritten digits appear in a large collection of images. We also visualize clustering of digits by treating images as feature points that we map to a plane.

flat embedding | distance maps | big data | diffusion geometry

Manifold learning refers to the process of mapping given data into a simple low-dimensional domain that reveals properties of the data. When the target space is Euclidean, the procedure is also known as flattening.
The question of how to efficiently and reliably flatten given data is a challenge that occupies the minds of numerous researchers. Distance-preserving data-flattening procedures can be found in the fields of geometry processing, the mapmaker problem through exploration of biological surfaces (1-3), texture mapping in computer graphics (4, 5), nonrigid shape analysis (6), image (7-9) and video understanding (7, 10), and computational biometry (11), to name just a few. Flat embedding is usually a simplification process that aims to preserve, as much as possible, distances between data points in the original space, while being efficient to compute. One family of flattening techniques is multidimensional scaling (MDS), which attempts to map all pairwise distances between data points into low-dimensional Euclidean domains. A review of MDS applications in psychophysics can be found in ref. 12, which includes the computational realization that human color perception is 2D. A proper way to explore the geometry of given data points involves computing all pairwise distances. Then, a flattening procedure s...
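Classical scaling, the best-known member of the MDS family mentioned above, has a compact closed form: double-center the squared-distance matrix to obtain a Gram matrix, then embed using its leading eigenpairs. A minimal sketch follows (the textbook algorithm only, not the accelerated spectral variant proposed in the paper):

```python
import numpy as np

def classical_mds(D, dim=2):
    """Classical (Torgerson) MDS: embed points so that Euclidean
    distances approximate the entries of the distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    G = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
    w, V = np.linalg.eigh(G)
    idx = np.argsort(w)[::-1][:dim]              # leading eigenpairs
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0, None))

# sanity check: planar points are recovered up to a rigid motion,
# so all pairwise distances are preserved
rng = np.random.default_rng(1)
X = rng.random((20, 2))
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = classical_mds(D, dim=2)
D2 = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
print(np.allclose(D, D2, atol=1e-8))             # True: distances preserved
```

The quadratic cost referred to in the text is visible here: the full n-by-n distance matrix must be formed and eigendecomposed, which is exactly what the spectral acceleration aims to avoid.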
A proof of the optimality of the eigenfunctions of the Laplace-Beltrami operator (LBO) in representing smooth functions on surfaces is provided and adapted to the field of applied shape and data analysis. It is based on the Courant-Fischer min-max principle adapted to our case. The theorem we present supports the new trend in geometry processing of treating geometric structures via their projection onto the leading eigenfunctions of the decomposition of the LBO. This result can be used to construct numerically efficient algorithms that process shapes in their spectral domain. We review a couple of applications as practical use cases of the proposed optimality criteria. We refer to a scale-invariant metric, which is also invariant to bending of the manifold. This novel pseudometric allows constructing an LBO by which a scale-invariant eigenspace on the surface is defined. We demonstrate the efficiency of an intermediate metric, defined as an interpolation between the scale-invariant and the regular one, in representing geometric structures while capturing both coarse and fine details. Next, we review a numerical acceleration technique for classical scaling, a member of a family of flattening methods known as multidimensional scaling (MDS). There, the optimality is exploited to efficiently approximate all geodesic distances between pairs of points on a given surface, and thereby match and compare almost isometric surfaces. Finally, we revisit the classical principal component analysis (PCA) definition by coupling ...
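The discrete analogue of this optimality result is easy to observe numerically. In the sketch below (the signal, grid size, and truncation level are all illustrative choices), the eigenvectors of a 1D path-graph Laplacian, which are cosine modes, stand in for the LBO eigenfunctions; projecting a smooth signal onto the 20 leading modes retains almost all of its energy:

```python
import numpy as np

n = 200
# 1D path-graph Laplacian with Neumann-like boundary; its eigenvectors
# are DCT-style cosine modes, the discrete stand-in for LBO eigenfunctions
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1
w, V = np.linalg.eigh(L)              # eigenvalues in ascending order

t = np.linspace(0, 1, n)
f = np.sin(2 * np.pi * t) + 0.3 * np.cos(6 * np.pi * t)   # a smooth signal

k = 20                                 # keep only the 20 lowest-frequency modes
f_hat = V[:, :k] @ (V[:, :k].T @ f)    # orthogonal projection onto the span
err = np.linalg.norm(f - f_hat) / np.linalg.norm(f)
print(err)                             # small relative error
```

Because the signal is smooth, its coefficients in the cosine basis decay rapidly, so a 10x compression of the representation loses only a small fraction of the energy; rough (high-frequency) signals would not enjoy this property, which is exactly the content of the optimality claim.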
Multidimensional scaling (MDS) is a family of methods that embed a given set of points into a simple, usually flat, domain. The points are assumed to be sampled from some metric space, and the mapping attempts to preserve the distances between each pair of points in the set. In this setting, distances in the target space can be computed analytically. Generalized MDS (GMDS) is an extension that allows mapping one metric space into another, that is, multidimensional scaling into target spaces in which distances are evaluated numerically rather than analytically. Here, we propose an efficient approach for computing such mappings between surfaces based on their natural spectral decomposition, where the surfaces are treated as sampled metric spaces. The resulting spectral-GMDS procedure enables efficient embedding by implicitly incorporating smoothness of the mapping into the problem, thereby substantially reducing the complexity involved in its solution while practically overcoming its nonconvex nature. Numerical experiments comparing the method to existing techniques that compute dense correspondence between shapes demonstrate its efficiency and accuracy relative to state-of-the-art approaches.
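The central complexity reduction, optimizing a small k-by-k matrix of spectral coefficients rather than a dense n-by-n correspondence, can be illustrated on a toy instance. The sketch below is a simplified functional-map-style construction in the spirit of spectral GMDS, not the authors' algorithm: the two "shapes" are the same point cloud with relabeled vertices, the descriptor functions are band-limited by construction, and a dense Gaussian-affinity graph Laplacian stands in for the LBO:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 40, 6

# two "shapes": the same planar point cloud with relabeled vertices
X = rng.random((n, 2))
pi = rng.permutation(n)
Y = X[pi]

def laplacian_basis(pts, k_eig, sigma=0.2):
    # dense Gaussian-affinity graph Laplacian (a stand-in for the LBO)
    D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    W = np.exp(-D**2 / (2 * sigma**2)); np.fill_diagonal(W, 0)
    L = np.diag(W.sum(1)) - W
    _, V = np.linalg.eigh(L)
    return V[:, :k_eig]               # leading (low-frequency) eigenvectors

Phi, Psi = laplacian_basis(X, k), laplacian_basis(Y, k)

# band-limited descriptor functions on X, transported to Y by the
# (here simulated) correspondence
R = rng.random((k, k))
gX = Phi @ R
gY = gX[pi]

# solve for the k-by-k matrix of spectral coefficients: this is where
# the n-by-n correspondence problem shrinks to k-by-k
Cmap = (Psi.T @ gY) @ np.linalg.inv(Phi.T @ gX)

# recover the point-to-point map by nearest neighbors in spectral coordinates
emb_X = Phi @ Cmap.T
match = np.argmin(
    np.linalg.norm(Psi[:, None] - emb_X[None, :], axis=-1), axis=1)
print(np.array_equal(match, pi))
```

Because the relabeled cloud yields an exactly permuted Laplacian, the small coefficient matrix captures the correspondence (up to eigenvector sign flips), and nearest-neighbor matching in the k-dimensional spectral coordinates recovers the vertex relabeling without ever forming an n-by-n matching problem.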