2021 IEEE/CVF International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv48922.2021.00665
Orthogonal Jacobian Regularization for Unsupervised Disentanglement in Image Generation

Cited by 32 publications (45 citation statements)
References 17 publications
“…The Jacobian orthogonal regularization search algorithm [ 28 ] exploits the gradient relationship between the generator network's input and its output feature maps, rather than the parameter information of any particular layer, which keeps the method independent of the model's internal architecture. Jacobian Decomposition GAN [ 29 ] showed that the eigenvectors obtained by decomposing the Jacobian matrix of the feature maps generated by the GAN generator network correspond to semantic attribute directions of the image.…”
Section: Methods
confidence: 99%
“…Hessian Penalty GAN [ 27 ] adopts the idea of the Jacobian matrix, restricting each dimension of the input latent space through orthogonal regularization so that different semantic attributes of the generated image are controlled by different dimensions of the latent space. However, the restriction that each dimension controls a single semantic is too strict, and the editing does not work well on models with a large latent-space dimension such as StyleGAN. OroJaRGAN [ 28 ] relaxes the orthogonal regularization constraint of Hessian Penalty GAN and achieves good results on large GAN models.…”
Section: Introduction
confidence: 99%
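The Hessian Penalty idea summarized in the excerpt above — forcing different latent dimensions not to interact in their effect on the output — can be illustrated with a small numerical sketch. Everything here (the toy scalar generator, the finite-difference Hessian) is an illustrative assumption, not the paper's implementation; the actual Hessian Penalty estimates off-diagonal Hessian terms stochastically rather than forming the Hessian explicitly.

```python
import numpy as np

def toy_generator(z):
    # Hypothetical entangled map from a 2-dim latent to a scalar output
    # (a stand-in for one output unit of a GAN generator).
    return np.tanh(z[0] * z[1] + 0.5 * z[0])

def hessian_fd(g, z, eps=1e-4):
    # Central finite-difference Hessian of scalar g at z:
    # H[i, j] = d^2 g / (dz_i dz_j).
    n = z.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = eps
            ej = np.zeros(n); ej[j] = eps
            H[i, j] = (g(z + ei + ej) - g(z + ei - ej)
                       - g(z - ei + ej) + g(z - ei - ej)) / (4 * eps ** 2)
    return H

def hessian_penalty(g, z):
    # Hessian-Penalty-style score: sum of squared off-diagonal Hessian
    # entries. It is zero iff changing one latent dimension never alters
    # the effect of another, i.e. the dimensions act independently.
    H = hessian_fd(g, z)
    off_diag = H - np.diag(np.diag(H))
    return np.sum(off_diag ** 2)

z = np.array([0.3, -0.2])
print(hessian_penalty(toy_generator, z))  # large: z0 and z1 interact
```

An additive map such as `z[0]**2 + z[1]**2`, where each latent dimension acts independently, drives this penalty to (numerically) zero, while the multiplicative cross term in `toy_generator` yields a large penalty.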
“…Wei et al [64] propose an orthogonal Jacobian regularization (OroJaR) to enforce disentanglement for generative models. They employ the Jacobian matrix of the output with respect to the input (i.e., latent variables for representation) to measure the output changes caused by the variations in the input.…”
Section: Generative Adversarial Network (GAN) Based Approaches
confidence: 99%
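The OroJaR mechanism described in this excerpt — measuring output changes through the Jacobian of the output with respect to the latent input, and encouraging the Jacobian columns of different latent dimensions to be orthogonal — can be sketched as follows. The toy generator and the finite-difference Jacobian are illustrative assumptions; the paper's implementation obtains Jacobian-vector products by backpropagating through the actual generator.

```python
import numpy as np

def toy_generator(z):
    # Hypothetical toy "generator": a fixed nonlinear map from a 3-dim
    # latent z to a 4-dim output, standing in for a GAN generator.
    W = np.array([[1.0, 0.0, 0.5],
                  [0.0, 1.0, 0.5],
                  [1.0, 1.0, 0.0],
                  [0.5, 0.5, 1.0]])
    return np.tanh(W @ z)

def jacobian_fd(f, z, eps=1e-5):
    # Central finite-difference Jacobian of f at z: J[i, j] = d f_i / d z_j.
    f0 = f(z)
    J = np.zeros((f0.size, z.size))
    for j in range(z.size):
        dz = np.zeros_like(z)
        dz[j] = eps
        J[:, j] = (f(z + dz) - f(z - dz)) / (2 * eps)
    return J

def orojar_penalty(f, z):
    # OroJaR-style penalty: sum of squared off-diagonal entries of J^T J,
    # i.e. squared inner products between Jacobian columns. It is zero iff
    # the columns are mutually orthogonal, so that each latent dimension
    # perturbs the output in an independent direction.
    J = jacobian_fd(f, z)
    G = J.T @ J
    off_diag = G - np.diag(np.diag(G))
    return np.sum(off_diag ** 2)

z = np.array([0.3, -0.2, 0.1])
print(orojar_penalty(toy_generator, z))  # > 0: columns are not orthogonal
```

For an identity map the Jacobian columns are exactly orthogonal and the penalty vanishes, which is the disentangled regime the regularizer pushes the generator toward.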
“…Vector-wise (two or more dimensions per factor): MAP-IVR [83], DRNET [84], DR-GAN [65], DRANet [21], Lee et al. [8], Liu et al. [22], Singh et al. [73] — each latent variable aligns to one coarse-grained semantic meaning; typically applied to real scenes. Dimension-wise (one dimension per factor): VAE-based methods, InfoGAN [9], IB-GAN [61], Zhu et al. [19], InfoGAN-CR [62], PS-SC GAN [63], Wei et al. [64], DNA-GAN [66] — each dimension aligns to one fine-grained semantic meaning; typically applied to synthetic and simple datasets. …empirical case and the desired case. Explicitness focuses on the coverage of latent representation with respect to generative factors.…”
Section: Dimension Of Each Latent Factor, Representative Work, Semantic...
confidence: 99%
“…Representation disentanglement is an important line of research in TST, which disentangles content and attribute representations (John et al., 2018b). Many disentanglement approaches have been proposed to minimize the dependence between these two representations, for example via mutual information (Yuan et al., 2020) or orthogonality (Wei et al., 2021). CTG controls the text generation of language models by careful prompt design (Li & Liang, 2021; Shin et al., 2020) or by training conditioned on controllable variables (Li et al., 2020; Hu & Li, 2021).…”
Section: Related Work
confidence: 99%