Quaternion Generative Adversarial Networks
Preprint, 2021
DOI: 10.48550/arxiv.2104.09630

Abstract: Recent Generative Adversarial Networks (GANs) achieve outstanding results through large-scale training, employing models with millions of parameters that demand extensive computational resources. Building such huge models undermines their replicability and increases training instability. Moreover, multi-channel data, such as images or audio, are usually processed by real-valued convolutional networks that flatten and concatenate the input, losing any intra-channel spatial relation. To …

Cited by 1 publication (2 citation statements) | References 21 publications
“…Moreover, when processing multidimensional data with correlated channels, such as color images, rather than multichannel audio or multisensor signals, PHC layers bring benefits due to the weight sharing among different channels. This makes it possible to capture latent intra-channel relations that standard convolutional networks ignore because of the rigid structure of their weights (Grassucci et al., 2021a; Parcollet et al., 2019b).…”
Section: Figure
confidence: 99%
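To make the weight sharing this statement refers to concrete, here is a minimal NumPy sketch of a hypercomplex-style weight built as a sum of Kronecker products, in the spirit of parameterized hypercomplex (PHM/PHC) layers; the function name phm_weight and all shapes are illustrative assumptions, not the cited papers' API.

import numpy as np

def phm_weight(A, S):
    # Build a hypercomplex-style weight as a sum of Kronecker products.
    # A: (n, n, n) -- n small "algebra rule" matrices A_i, shared across channels
    # S: (n, m, k) -- n small filter blocks S_i
    # Returns W of shape (n*m, n*k). Parameters scale as ~(n^3 + n*m*k)
    # instead of the (n*m)*(n*k) = n^2 * m*k of a dense real layer,
    # i.e. roughly a 1/n reduction.
    return sum(np.kron(A[i], S[i]) for i in range(A.shape[0]))

rng = np.random.default_rng(0)
n, m, k = 4, 3, 5                 # n = 4 recovers the quaternion case
A = rng.standard_normal((n, n, n))
S = rng.standard_normal((n, m, k))
W = phm_weight(A, S)
print(W.shape)                    # (12, 20)

Because the A_i are shared across all channel blocks, the resulting weight ties the channels together rather than treating them as independent, which is the intra-channel relation the quoted passage contrasts with standard convolutions.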
“…Recent state-of-the-art convolutional models have achieved astonishing results in various fields of application by scaling up the overall number of parameters (Karras et al., 2020; d'Ascoli et al., 2021; Dosovitskiy et al., 2021). Simultaneously, quaternion neural networks (QNNs) have been shown to significantly reduce the number of parameters while attaining comparable performance (Parcollet et al., 2019c; Grassucci et al., 2021a; Tay et al., 2019). Quaternion models exploit hypercomplex algebra properties, including the Hamilton product, to carefully design the interactions among the imaginary units, thus involving 1/4 of the free parameters of comparable real-valued models.…”
Section: Introduction
confidence: 99%
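The 1/4 parameter count follows directly from the Hamilton product: the same four real weight matrices are reused, with different sign patterns, across all four output components, so a layer mapping 4n to 4m features needs 4·m·n weights instead of the 16·m·n of a dense real layer. Below is a minimal NumPy sketch; quaternion_linear and the toy shapes are illustrative assumptions, not the paper's code.

import numpy as np

def quaternion_linear(x, W_r, W_i, W_j, W_k):
    # Hamilton-product linear layer.
    # x: (..., 4*n) input, split into the four quaternion components.
    # Each W_*: (m, n) real matrix; the four matrices are shared across
    # all four output components via the Hamilton product sign pattern,
    # giving 4*m*n parameters instead of (4m)*(4n) = 16*m*n.
    x_r, x_i, x_j, x_k = np.split(x, 4, axis=-1)
    y_r = x_r @ W_r.T - x_i @ W_i.T - x_j @ W_j.T - x_k @ W_k.T
    y_i = x_r @ W_i.T + x_i @ W_r.T + x_j @ W_k.T - x_k @ W_j.T
    y_j = x_r @ W_j.T - x_i @ W_k.T + x_j @ W_r.T + x_k @ W_i.T
    y_k = x_r @ W_k.T + x_i @ W_j.T - x_j @ W_i.T + x_k @ W_r.T
    return np.concatenate([y_r, y_i, y_j, y_k], axis=-1)

# Toy usage: 8 quaternion inputs -> 4 quaternion outputs, batch of 2.
rng = np.random.default_rng(0)
n, m = 8, 4
Ws = [rng.standard_normal((m, n)) for _ in range(4)]
x = rng.standard_normal((2, 4 * n))
y = quaternion_linear(x, *Ws)
print(y.shape)  # (2, 16)

The same sign pattern carries over to quaternion convolutions by replacing the matrix products with convolutions, which is how QNNs keep the reduction while processing images.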