2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2018.00390
Wasserstein Introspective Neural Networks

Abstract: We present Wasserstein introspective neural networks (WINN) that are both a generator and a discriminator within a single model. WINN provides a significant improvement over the recent introspective neural networks (INN) method by enhancing INN's generative modeling capability. WINN has three interesting properties: (1) A mathematical connection between the formulation of the INN algorithm and that of Wasserstein generative adversarial networks (WGAN) is made. (2) The explicit adoption of the Wasserstein distance …
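To make the single-model idea concrete, the following is a minimal PyTorch sketch of an introspective training loop, under assumptions not taken from the paper: flattened 28×28 inputs, a toy two-layer discriminator `D`, and illustrative step sizes. It shows the mechanism (the classifier is also the generator, synthesizing pseudo-negatives by ascending its own score), not the authors' WINN implementation.

```python
# Minimal introspective-loop sketch (hypothetical network and hyperparameters).
import torch
import torch.nn as nn

D = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 1))
opt = torch.optim.Adam(D.parameters(), lr=1e-4)

def synthesize(batch=64, n_steps=30, step_size=0.01):
    """Generate pseudo-negatives by gradient ascent on the discriminator score."""
    x = torch.randn(batch, 784, requires_grad=True)
    for _ in range(n_steps):
        grad, = torch.autograd.grad(D(x).sum(), x)
        # Langevin-style update: climb the score, inject Gaussian noise.
        x = (x + step_size * grad + 0.01 * torch.randn_like(x)).detach().requires_grad_(True)
    return x.detach()

def train_step(real):
    fake = synthesize(batch=real.size(0))
    # Wasserstein-style critic objective: raise real scores, lower fake ones.
    loss = D(fake).mean() - D(real).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```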

Cited by 48 publications (62 citation statements, 2018–2023) · References 25 publications
“…Building on the pioneering work of [34], [35], [36], [37] have recently developed an introspective learning method to learn the energy-based model, where the energy function is discriminatively learned and the learned energy function is used to generate synthesized examples via Langevin dynamics. It could be interesting to combine introspective learning with the proposed cooperative learning method, which recruits a generator to jumpstart the Langevin sampling.…”
Section: Related Work
confidence: 99%
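The Langevin-dynamics sampling step this statement refers to can be sketched as follows; `energy`, `eps`, and the chain length are hypothetical placeholders rather than values from the cited works:

```python
# Hedged sketch of Langevin dynamics for an energy-based model.
import torch

def langevin_sample(energy, x0, n_steps=100, eps=0.01):
    """Approximately sample p(x) ∝ exp(-E(x)) by noisy gradient descent on E."""
    x = x0.detach().clone().requires_grad_(True)
    for _ in range(n_steps):
        grad, = torch.autograd.grad(energy(x).sum(), x)
        # Descend the energy; the noise term keeps the chain stochastic.
        x = (x - 0.5 * eps**2 * grad + eps * torch.randn_like(x)).detach().requires_grad_(True)
    return x.detach()
```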
“…Our model in (2) directly corresponds to a classifier, specifically a spatial-temporal discriminative ConvNet, in the following sense [22], [51], [52], [53], [54], [55]. Suppose we have C categories of video sequences.…”
Section: Energy-Based Spatial-Temporal Generative ConvNet
confidence: 99%
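A hedged sketch of the classifier correspondence this statement alludes to, in notation assumed here rather than taken from the cited paper: if each of the C categories tilts a shared reference distribution q(x) by a learned score f_c, Bayes' rule turns the ensemble of energy models into a softmax classifier, with q(x) cancelling out.

```latex
% f_c: learned score/energy for category c; Z_c: normalizing constant;
% q(x): reference distribution shared across categories (all assumed notation).
p(x \mid c) = \frac{1}{Z_c}\, e^{f_c(x)}\, q(x), \qquad c = 1, \dots, C,
\quad\Longrightarrow\quad
p(c \mid x) = \frac{e^{f_c(x) + b_c}}{\sum_{c'=1}^{C} e^{f_{c'}(x) + b_{c'}}},
\qquad b_c = \log p(c) - \log Z_c .
```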
“…Our paper is a generalization of [15,56] by applying alternating back-propagation (ABP) to train a conditional version of the generator model for feature-to-feature translation. The generative ConvNet [58] and the Wasserstein INN [23] are two one-piece models that learn energy-based generative models for data generation. Both [58] and [23] generate data via iterative MCMC sampling, while our model generates data via direct ancestral sampling, which is much more efficient.…”
Section: Related Work
confidence: 99%
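The efficiency gap between the two sampling regimes can be seen in a minimal sketch (hypothetical generator and shapes, not the cited models): ancestral sampling costs a single forward pass from the latent prior, whereas an MCMC sampler such as the Langevin chain sketched earlier needs tens to hundreds of gradient evaluations per sample.

```python
# Ancestral sampling from a latent-variable generator x = g(z), z ~ N(0, I).
import torch
import torch.nn as nn

generator = nn.Sequential(nn.Linear(100, 256), nn.ReLU(), nn.Linear(256, 784))

def ancestral_sample(batch=64):
    """One feed-forward pass through g -- no iterative refinement needed."""
    with torch.no_grad():
        z = torch.randn(batch, 100)  # draw latents from the prior
        return generator(z)
```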