In manifold learning, alignment derives the global low-dimensional coordinates of the input data from their local coordinates. In virtually all existing alignment processes, the relation between the local and global coordinates is designed intuitively, without mathematical deduction or detailed analysis. In this study, the authors propose a local nonlinear alignment (LNA) manifold learning algorithm for nonlinear dimensionality reduction, based on the concept of the local pullback and the mathematical characteristics of a manifold. According to manifold theory, a function defined on a manifold cannot be differentiated on the manifold directly; instead, it has to be pulled back to Euclidean space, where it can be differentiated, via a local homeomorphism between the manifold and Euclidean space. In the authors' proposed algorithm, the component functions of the global homeomorphism are regarded as functions defined on the manifold and are pulled back to Euclidean space. A Taylor expansion up to second order is then utilised to establish the relation between the global and local coordinates. The objective function of LNA is based on the alignment error and can be minimised by solving an eigenvalue problem. Experimental results on various datasets verify the validity of the authors' method.
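To illustrate the alignment-via-eigenvalue-problem idea, here is a minimal first-order sketch in the style of local tangent space alignment: local PCA coordinates are computed per neighbourhood, a global alignment-error matrix is assembled, and the global coordinates are its bottom eigenvectors. This is not the authors' exact LNA (which uses second-order Taylor terms); the function name and parameters are illustrative.

```python
import numpy as np

def align_global_coords(X, n_neighbors=8, d=2):
    """First-order alignment sketch: derive global d-dimensional
    coordinates from local PCA coordinates by minimising the total
    alignment error via an eigenvalue problem."""
    n = X.shape[0]
    B = np.zeros((n, n))                      # global alignment matrix
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    for i in range(n):
        idx = np.argsort(D[i])[:n_neighbors]  # neighbourhood of point i
        Xi = X[idx] - X[idx].mean(axis=0)     # centre the local patch
        U, _, _ = np.linalg.svd(Xi, full_matrices=False)
        # orthonormal basis spanning the constant vector and local coords
        G = np.hstack([np.full((n_neighbors, 1), 1 / np.sqrt(n_neighbors)),
                       U[:, :d]])
        # accumulate the local alignment-error projector
        B[np.ix_(idx, idx)] += np.eye(n_neighbors) - G @ G.T
    vals, vecs = np.linalg.eigh(B)            # eigenvalues ascending
    return vecs[:, 1:d + 1]                   # skip the constant eigenvector
```

The smallest nontrivial eigenvectors of the alignment matrix minimise the summed local reconstruction errors subject to a unit-scale constraint, which is the common mechanism behind alignment-based manifold learners.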
The radiative transfer equation (RTE) in Cartesian coordinates can be regarded as a special kind of convection-diffusion equation with strong convection. For such convection-dominated problems, standard Galerkin finite element solutions often suffer from spurious oscillations. To avoid this, upwind finite element methods based on the streamline upwind (SU) and streamline upwind Petrov-Galerkin (SUPG) schemes are developed to solve multidimensional radiative heat transfer in semitransparent uniform and graded index media. The two upwind schemes are compared on the solution of the RTE. The SUPG scheme is demonstrated to be more accurate than the SU scheme and to perform well for radiative heat transfer in semitransparent graded index media.
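The stabilisation idea can be sketched on a 1D model problem rather than the full multidimensional RTE. For -eps u'' + a u' = 0 with linear elements, SUPG reduces to adding streamline diffusion a^2*tau with the classical optimal tau = (h/2a)(coth Pe - 1/Pe), where Pe = a h / (2 eps) is the element Peclet number. The following is a hedged sketch (assumes a > 0 and a uniform mesh), not the paper's RTE discretisation.

```python
import numpy as np

def advection_diffusion_1d(a=1.0, eps=1e-3, n=20, supg=True):
    """Solve -eps u'' + a u' = 0 on [0,1], u(0)=0, u(1)=1, with
    linear finite elements; optionally add SUPG stabilisation.
    Assumes a > 0 and a uniform mesh of n elements."""
    h = 1.0 / n
    Pe = a * h / (2 * eps)                          # element Peclet number
    tau = (h / (2 * a)) * (1 / np.tanh(Pe) - 1 / Pe) if supg else 0.0
    eps_eff = eps + a * a * tau                     # SUPG = streamline diffusion in 1D
    # interior nodes 1..n-1; tridiagonal Galerkin (central) discretisation
    A = np.zeros((n - 1, n - 1))
    b = np.zeros(n - 1)
    for i in range(n - 1):
        A[i, i] = 2 * eps_eff / h
        if i > 0:
            A[i, i - 1] = -eps_eff / h - a / 2
        if i < n - 2:
            A[i, i + 1] = -eps_eff / h + a / 2
    # boundary value u(1) = 1 moves to the right-hand side of the last row
    b[-1] = eps_eff / h - a / 2
    u = np.linalg.solve(A, b)
    return np.concatenate([[0.0], u, [1.0]])
```

At Pe >> 1 the plain Galerkin solution oscillates node to node, while the SUPG solution resolves the outflow boundary layer monotonically, which is the behaviour the comparison in the abstract refers to.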
Covariance matrices, which are symmetric positive definite (SPD) matrices, are usually regarded as points lying on a Riemannian manifold. We describe a new covariance descriptor that improves the discriminative power of the region covariance descriptor by also taking into account the mean of the feature vectors. Owing to the specific geometry of Riemannian manifolds, classical learning methods cannot be applied to them directly. In this paper, we propose a subspace projection framework for the classification task on Riemannian manifolds and give its mathematical derivation. It differs from the common technique for Riemannian manifolds, which explicitly projects the points from a Riemannian manifold onto Euclidean space under a linear hypothesis. Under the proposed framework, we define a Gaussian radial basis function (RBF) kernel with the Log-Euclidean Riemannian Metric (LERM) to embed a Riemannian manifold into a high-dimensional Reproducing Kernel Hilbert Space (RKHS) and then project it onto a subspace of the RKHS. Finally, a variant of linear discriminant analysis (LDA) is recast onto the subspace. Experiments demonstrate the considerable effectiveness of the mixed region covariance descriptor and the proposed method.
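The kernel at the heart of this construction has a standard closed form: k(X, Y) = exp(-||log X - log Y||_F^2 / (2*sigma^2)), where log is the matrix logarithm, so SPD matrices are compared in the flat Log-Euclidean space. A minimal sketch (the authors' exact parameterisation may differ; for SPD inputs the logarithm can be computed via eigendecomposition):

```python
import numpy as np

def spd_log(S):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def log_euclidean_rbf(S1, S2, sigma=1.0):
    """Gaussian RBF kernel under the Log-Euclidean Riemannian Metric:
    exp(-||log S1 - log S2||_F^2 / (2 sigma^2))."""
    d = np.linalg.norm(spd_log(S1) - spd_log(S2), 'fro')
    return np.exp(-d ** 2 / (2 * sigma ** 2))
```

Because this kernel is positive definite on the SPD manifold, kernel methods such as the LDA variant in the paper can operate in the induced RKHS without an explicit linear projection onto Euclidean space.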
Tensor dimensionality reduction (TDR) is an active research topic in machine learning: it learns data representations that preserve the original data structure while avoiding converting samples into vectors and mitigating the curse of dimensionality of tensor data. In this work, a novel TDR approach based on the mode product and the Hilbert-Schmidt independence criterion (HSIC) is proposed. The contributions of the authors' work are as follows: (1) HSIC measures the statistical correlation of two random variables. However, instead of measuring this correlation directly, HSIC first maps the two random variables into two reproducing kernel Hilbert spaces (RKHSs), and then measures the statistical correlation of the transformed variables using Hilbert-Schmidt operators between the two RKHSs. The exploitation of RKHSs increases the flexibility and applicability of HSIC. Although HSIC is widely used in machine learning, the authors have not seen it applied to dimensionality reduction (DR), except in their own previous work. (2) A novel HSIC-based TDR approach is proposed, which applies HSIC to capture the statistical information of a tensor data set for DR. The authors give the mathematical derivation of HSIC for tensor data and establish a TDR framework based on HSIC, named HSIC-TDR for short, which aims to improve the DR results for tensors by exploring and preserving the statistical information of the original data set. (3) Furthermore, to solve the out-of-sample problem, the authors learn an explicit mapping between the higher-dimensional tensors and the dimensionality-reduced tensors by introducing the mode product into HSIC-TDR.
Experimental comparisons between the proposed method and other state-of-the-art algorithms on various datasets demonstrate its good performance. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
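The building block behind this framework is the standard empirical HSIC estimator, HSIC = (n-1)^(-2) tr(K H L H), where K and L are Gram matrices of the two samples and H = I - (1/n) 11^T is the centring matrix. A minimal sketch for vector-valued samples (the authors' tensor version, which mode-multiplies projection matrices, is omitted here):

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    """Gaussian RBF Gram matrix of the rows of X."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    return np.exp(-sq / (2 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    """Empirical HSIC: (n-1)^-2 * tr(K H L H), with H the centring matrix.
    Larger values indicate stronger statistical dependence."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    K, L = rbf_gram(X, sigma), rbf_gram(Y, sigma)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

Maximising this quantity between the reduced representation and the original data (or labels) is what lets an HSIC-based reducer preserve the statistical information of the data set during DR.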