2020
DOI: 10.1109/tmi.2020.2967451

Fast Polynomial Approximation of Heat Kernel Convolution on Manifolds and Its Application to Brain Sulcal and Gyral Graph Pattern Analysis

Abstract: Heat diffusion has been widely used in brain imaging for surface fairing, mesh regularization and cortical data smoothing. Motivated by diffusion wavelets and convolutional neural networks on graphs, we present a new fast and accurate numerical scheme to solve heat diffusion on surface meshes. This is achieved by approximating the heat kernel convolution using high degree orthogonal polynomials in the spectral domain. We also derive the closed-form expression of the spectral decomposition of the Laplace-Beltrami…
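The core idea in the abstract — approximating the heat kernel convolution exp(−tL) with a high-degree polynomial of the Laplacian, so no full eigendecomposition is needed — can be illustrated on a toy graph Laplacian. The sketch below uses a Chebyshev fit as a generic stand-in (the paper derives its own orthogonal-polynomial scheme on the Laplace-Beltrami operator of a surface mesh); the path graph, diffusion time, and polynomial degree are arbitrary choices for illustration.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

# Path-graph Laplacian: a toy stand-in for a mesh Laplace-Beltrami matrix.
n = 50
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1.0        # degree-1 endpoints

t = 2.0        # diffusion time (illustrative)
lmax = 4.0     # upper bound on the path-Laplacian spectrum
deg = 30       # polynomial degree (illustrative)

# Chebyshev fit of the heat-kernel symbol exp(-t*lam) on [0, lmax].
cheb = Chebyshev.interpolate(lambda lam: np.exp(-t * lam), deg, domain=[0, lmax])
c = cheb.coef

f = np.zeros(n)
f[n // 2] = 1.0                          # delta heat source
Lhat = (2.0 / lmax) * L - np.eye(n)      # spectrum mapped into [-1, 1]

# Apply sum_k c_k T_k(Lhat) f via the three-term Chebyshev recurrence:
# only matrix-vector products, never an eigendecomposition.
Tkm1, Tk = f, Lhat @ f
out = c[0] * Tkm1 + c[1] * Tk
for k in range(2, deg + 1):
    Tkm1, Tk = Tk, 2 * (Lhat @ Tk) - Tkm1
    out = out + c[k] * Tk

# Reference answer from the explicit spectral decomposition exp(-t*L) f.
w, U = np.linalg.eigh(L)
exact = U @ (np.exp(-t * w) * (U.T @ f))
print(np.max(np.abs(out - exact)))       # small approximation error
```

The recurrence costs O(deg) sparse matrix-vector products, which is what makes polynomial schemes fast on large meshes compared with computing eigenvectors.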

Cited by 24 publications
(22 citation statements)
References 78 publications
“…In the AlexNet network structure, the size of the first layer of convolution kernel is 11 × 11, the size of the second layer of convolution kernel is 5 × 5, and the size of the third, fourth, and fifth layers of convolution kernel is 3 × 3. Related scholars pointed out that a large convolution kernel can be stacked by using multiple small convolution kernels [26,27]. Therefore, in order to simplify the design process, the convolution kernel size in the convolution layer used in this article is all 3 × 3. The size of the feature map is adjusted by the non-overlapping maximum pooling layer with a size of 2 × 2 and a step size of 2.…”
Section: Improved Network Architecturementioning
confidence: 99%
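The stacking observation quoted above (citations [26,27] of that paper) is easy to verify numerically: composing two 3 × 3 convolutions is equivalent to a single convolution whose support is 5 × 5, while using fewer weights. A minimal sketch, with arbitrary all-ones filters:

```python
import numpy as np
from scipy.signal import convolve2d

# Composing two 3x3 convolutions equals one convolution with their full
# 2-D convolution, whose support is 5x5 -- the stacked receptive field.
k3 = np.ones((3, 3))
effective = convolve2d(k3, k3, mode='full')
print(effective.shape)   # (5, 5)

# Parameter count per input/output channel pair:
# two 3x3 layers use 2 * 9 = 18 weights; a single 5x5 layer uses 25.
```

A third stacked 3 × 3 layer would extend the effective support to 7 × 7, which is the design logic behind uniform-3 × 3 architectures.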
“…Indentations in the outer surface of the cerebrum, known as sulci, are key phenotypical biomarkers for linking brain structure and function ( Armstrong et al, 1995 ; Cachia et al, 2008 ; De Winter et al, 2015 ; Huang et al, 2020 ; Le Goualher et al, 1999 ; Lyu et al, 2018a; 2018b; Mangin et al, 2004 ; Miller et al, 2020b ; Weiner et al, 2014 ; Welker, 1990 ; Zilles et al, 1988 ). It is well known that deep sulci, which emerge early in gestation, are key landmarks linking structure, function, and behavior in primary sensory cortices ( Armstrong et al, 1995 ; Ono et al, 1990 ; Sanides, 1964 ; Schwarzkopf and Rees, 2013 ; Welker, 1990 ).…”
Section: Introductionmentioning
confidence: 99%
“…Compared to topological data analysis methods, geometric methods are more adept at detecting localized signals in trees. Figure 2 displays the sulcal and gyral trees obtained from brain surface meshes ( Huang et al, 2020 ). Trees are treated as a heat source with value +1 on gyral trees and a heat sink with value −1 on sulcal trees.…”
Section: Lack Of Localizationmentioning
confidence: 99%
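The ±1 source/sink construction quoted above can be mimicked on a toy graph: assign +1 to "gyral" nodes, −1 to "sulcal" nodes, and apply the heat semigroup exp(−tL). The graph, labels, and diffusion time below are invented for illustration; the cited work runs this on sulcal/gyral trees extracted from cortical surface meshes.

```python
import numpy as np
from scipy.linalg import expm

# Toy path graph of 6 nodes: first half "gyral" (+1), second half "sulcal" (-1).
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A            # graph Laplacian

u0 = np.array([1., 1., 1., -1., -1., -1.])  # +1 heat source, -1 heat sink
u = expm(-0.5 * L) @ u0                      # diffused pattern at t = 0.5
print(u)
```

Diffusion conserves the total heat (here zero, since sources and sinks balance) and smooths the labels into a graded field, which is what makes the resulting maps comparable across subjects.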
“…The major advantage of this approach is that such maps can be easily compared across different subjects. In Huang et al (2020), a two-sample t-statistic is calculated at each mesh vertex and used to localize the sex difference (268 females, 176 males) near the temporal lobes of the brain. Such localized signal detection is not possible with many existing topological data analysis methods.…”
Section: Lack Of Localizationmentioning
confidence: 99%
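The vertex-wise two-sample t-statistic described in this last statement is straightforward with standard tools. The sketch below uses synthetic data with the cited group sizes (268 females, 176 males); the vertex count, effect size, and planted difference are made up for illustration.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n_vertices = 1000
females = rng.normal(0.0, 1.0, size=(268, n_vertices))
males = rng.normal(0.0, 1.0, size=(176, n_vertices))
males[:, :50] += 0.5   # planted group difference on the first 50 vertices

# One two-sample t-statistic per vertex (axis=0 runs over subjects),
# giving a t-map that can be thresholded to localize group differences.
tstat, pval = ttest_ind(females, males, axis=0)
print(tstat.shape)     # (1000,)
```

Thresholding such a t-map (with a multiple-comparisons correction across vertices) is what yields the kind of localized detection the passage contrasts with purely topological methods.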