2024
DOI: 10.1109/tnnls.2022.3226772
PHNNs: Lightweight Neural Networks via Parameterized Hypercomplex Convolutions

Abstract: Hypercomplex neural networks have proven to reduce the overall number of parameters while ensuring valuable performance by leveraging the properties of Clifford algebras. Recently, hypercomplex linear layers have been further improved by involving efficient parameterized Kronecker products. In this article, we define the parameterization of hypercomplex convolutional layers and introduce the family of parameterized hypercomplex neural networks (PHNNs) that are lightweight and efficient large-scale models. Our …
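The core construction behind parameterized hypercomplex convolutions is a weight assembled as a sum of Kronecker products, W = Σᵢ Aᵢ ⊗ Fᵢ, where the n × n matrices Aᵢ encode the (learned) algebra multiplication rules and the Fᵢ are learnable filter blocks, cutting parameters roughly by a factor of n. Below is a minimal PyTorch sketch of this idea; the class and parameter names (PHConv2d, rule, filters, n) are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PHConv2d(nn.Module):
    """Sketch of a parameterized hypercomplex convolution (PHC) layer.

    The full convolution weight is built as W = sum_i A_i (x) F_i,
    where the n x n matrices A_i encode the algebra multiplication
    rules (learned from data) and the F_i are the filter blocks.
    """

    def __init__(self, in_channels, out_channels, kernel_size, n=4):
        super().__init__()
        assert in_channels % n == 0 and out_channels % n == 0
        # n rule matrices of size n x n, trainable instead of fixed.
        self.rule = nn.Parameter(torch.randn(n, n, n))
        # n filter blocks of shape (out/n, in/n, k, k).
        self.filters = nn.Parameter(
            torch.randn(n, out_channels // n, in_channels // n,
                        kernel_size, kernel_size) * 0.02
        )

    def forward(self, x):
        n, o, i, k, _ = self.filters.shape
        # Sum of Kronecker products: W[a*o+p, b*i+q] = sum_n A[n,a,b] * F[n,p,q].
        weight = torch.einsum('nab,noikl->aobikl', self.rule, self.filters)
        weight = weight.reshape(n * o, n * i, k, k)
        return F.conv2d(x, weight, padding=k // 2)

# Usage: maps 16 input channels to 32 output channels with n = 4.
y = PHConv2d(16, 32, kernel_size=3, n=4)(torch.randn(1, 16, 8, 8))
```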

Cited by 16 publications (19 citation statements)
References 57 publications (70 reference statements)
“…Future directions: As mentioned before, N-dimensional convolution has been applied to real-valued CNNs for processing multichannel inputs. In a similar way, the equations presented can be extended to 8-channel inputs using an octonion or hypercomplex algebra; see for example [65], [66], [67], [68]. For a larger number of inputs, a geometric algebra [69] representation can be applied.…”
Section: A. Quaternion Convolution Layers (mentioning)
confidence: 99%
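For concreteness, a quaternion convolution layer of the kind this citation refers to organizes four shared filter banks into the 4 × 4 block pattern of the Hamilton product, so 16 weight blocks reuse 4 filter banks. The sketch below is illustrative (the class name and initialization are assumptions, not code from the cited works):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuaternionConv2d(nn.Module):
    """Sketch of a quaternion convolution via the Hamilton product.

    Channels split into four components (r, i, j, k); the real-valued
    weight is the 4 x 4 block matrix of the Hamilton product, giving a
    roughly 4x parameter saving over a plain convolution.
    """

    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        assert in_channels % 4 == 0 and out_channels % 4 == 0
        shape = (out_channels // 4, in_channels // 4, kernel_size, kernel_size)
        self.r = nn.Parameter(torch.randn(shape) * 0.02)
        self.i = nn.Parameter(torch.randn(shape) * 0.02)
        self.j = nn.Parameter(torch.randn(shape) * 0.02)
        self.k = nn.Parameter(torch.randn(shape) * 0.02)

    def forward(self, x):
        r, i, j, k = self.r, self.i, self.j, self.k
        # Block rows follow the Hamilton product multiplication table.
        weight = torch.cat([
            torch.cat([r, -i, -j, -k], dim=1),
            torch.cat([i,  r, -k,  j], dim=1),
            torch.cat([j,  k,  r, -i], dim=1),
            torch.cat([k, -j,  i,  r], dim=1),
        ], dim=0)
        return F.conv2d(x, weight, padding=weight.shape[-1] // 2)
```

An octonion layer follows the same recipe with eight filter banks arranged in the 8 × 8 table of octonion multiplication.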
“…For a larger number of inputs, a geometric algebra [69] representation can be applied. To the best of our knowledge, this type of architecture has not been published to date, but a first approach in this direction can be found in [70], [71]. In these types of deep learning architectures (quaternion, hypercomplex, or geometric), a major concern is the selection of the signature of the algebra, which embeds data into different geometric spaces, so the processing takes on distinct meanings accordingly.…”
Section: A. Quaternion Convolution Layers (mentioning)
confidence: 99%
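As a small illustration of the signature concern, the fixed rule matrices below encode two-dimensional complex versus split-complex multiplication; flipping a single sign changes the geometry the layer imposes on the data. This is precisely the design choice PHNNs sidestep by leaving such matrices trainable. The notation is an assumption, matching the PHConv2d sketch after the abstract:

```python
import torch

# Multiplication-rule matrices A_i for two algebras of dimension n = 2.
# Complex numbers: i^2 = -1.  Split-complex (hyperbolic): j^2 = +1.
A_complex = torch.stack([
    torch.tensor([[1., 0.], [0., 1.]]),   # multiplication by 1
    torch.tensor([[0., -1.], [1., 0.]]),  # multiplication by i (i^2 = -1)
])
A_split = torch.stack([
    torch.tensor([[1., 0.], [0., 1.]]),   # multiplication by 1
    torch.tensor([[0., 1.], [1., 0.]]),   # multiplication by j (j^2 = +1)
])
```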
“…Jiang et al. [28] developed a method for network compression in super-resolution, which employed weight pruning and a multi-slicing information network to extract and integrate multi-scale features. Grassucci et al. [29] introduced a family of parameterized hypercomplex neural networks that capture convolution rules and filter organization directly from data, making them flexible in any user-defined or tuned domain. Cheng et al. [30] proposed a lightweight unified fusion network for multi-focus image fusion, which used guided filtering to separate the source image into base and detail layers and applied a gradient-perception strategy to handle the fusion problem.…”
Section: Introduction (mentioning)
confidence: 99%