2023
DOI: 10.1111/sjos.12660
Deep neural network classifier for multidimensional functional data

Abstract: We propose a new approach, called the functional deep neural network (FDNN), for classifying multidimensional functional data. Specifically, a deep neural network is trained on the principal components of the training data, which is then used to predict the class label of a future data function. Unlike the popular functional discriminant analysis approaches, which only work for one-dimensional functional data, the proposed FDNN approach applies to general non-Gaussian multidimensional functional data.
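The two-stage idea in the abstract (project curves onto principal components, then train a deep network on the scores) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the synthetic data, the use of ordinary PCA on grid values as a stand-in for functional PCA, and all names and parameters are assumptions.

```python
# Hypothetical sketch of the FDNN idea: project discretized functional
# observations onto principal components, then train a neural-network
# classifier on the resulting scores. Illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic one-dimensional functional data observed on a common grid:
# two classes with different mean functions plus noise.
grid = np.linspace(0, 1, 100)
n_per_class = 50
X0 = np.sin(2 * np.pi * grid) + 0.3 * rng.standard_normal((n_per_class, 100))
X1 = np.cos(2 * np.pi * grid) + 0.3 * rng.standard_normal((n_per_class, 100))
X = np.vstack([X0, X1])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Step 1: functional PCA, approximated here by ordinary PCA on grid values.
pca = PCA(n_components=5)
scores = pca.fit_transform(X)  # one row of FPC scores per curve

# Step 2: a deep neural network trained on the FPC scores.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(scores, y)

# A future curve is classified by projecting it onto the same components.
new_curve = np.sin(2 * np.pi * grid) + 0.3 * rng.standard_normal(100)
pred = clf.predict(pca.transform(new_curve.reshape(1, -1)))
```

For multidimensional functional data (e.g. images viewed as functions on a 2-D domain), the same pipeline applies after flattening each observation to a vector of grid values.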

Cited by 4 publications (9 citation statements)
References 42 publications (100 reference statements)
“…Hall et al. (2001), Bongiorno and Goia (2016), X. Dai et al. (2017), and Zhang and Sakhanenko (2019) employed kernel density estimation methods to obtain $\{\hat f_k(\boldsymbol{\xi}_J)\}_{k=1}^{K}$ under the assumption of independent FPCS, where $\hat f_k(\boldsymbol{\xi}_J) = \prod_{j=1}^{J} \hat f_{kj}(\xi_j)$ and the $\hat f_{kj}$ are density function estimators for the $j$-th FPCS in the $k$-th group. To overcome the restrictive independence assumption, S. Wang, Cao, and Shang (2023) additionally considered a broader scenario in which intricate $\{p_k\}_{k=1}^{K}$ are present; in particular, the $p_k$ incorporate convoluted interactions among the FPCS.…”
Section: Methodologies for Functional Data Classification
confidence: 99%
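The density-based classifier described in this citation statement — estimate each marginal density $\hat f_{kj}$ of the $j$-th FPC score in class $k$ by a kernel density estimator, multiply the marginals under the independence assumption, and assign a new observation to the class with the largest estimated density — can be sketched as below. The simulated scores and all names are illustrative assumptions, not drawn from the cited papers.

```python
# Illustrative kernel-density classifier on FPC scores under the
# independence assumption: f_k(xi) is approximated by the product of
# univariate KDEs over the J score dimensions, one set per class.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Simulated FPC scores: two classes (K = 2), J = 3 scores per observation,
# with well-separated class means for illustration.
scores_by_class = {
    0: rng.normal(loc=0.0, scale=1.0, size=(80, 3)),
    1: rng.normal(loc=2.0, scale=1.0, size=(80, 3)),
}

# One univariate KDE per class k and score dimension j (the f_kj hat).
kdes = {k: [gaussian_kde(s[:, j]) for j in range(s.shape[1])]
        for k, s in scores_by_class.items()}

def classify(xi):
    """Assign xi to argmax_k of prod_j f_kj(xi_j) (independent FPCS)."""
    dens = {k: np.prod([kdes[k][j](xi[j])[0] for j in range(len(xi))])
            for k in kdes}
    return max(dens, key=dens.get)

label = classify(np.array([0.1, -0.2, 0.3]))
```

Relaxing the product form — i.e. allowing the joint densities $p_k$ to carry interactions among the FPCS, as in S. Wang, Cao, and Shang (2023) — would replace the per-dimension product with a joint density model.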
“…Figure 3 portrays a representative sample of the MNIST dataset, emphasizing the varying intensity levels. S. Wang, Cao, and Shang (2023) developed a deep-neural-network-based classifier to conduct multi-class classification for this dataset.…”
Section: Motivation Examples
confidence: 99%