2020
DOI: 10.1007/978-3-030-58526-6_46

On Diverse Asynchronous Activity Anticipation


Cited by 15 publications (22 citation statements) | References 37 publications
“…However, in the case of applying a GAN on discrete data, the discrete and stochastic fashion of the generator's output hampers direct differentiability, and has therefore been a suitable candidate for Gumbel-based gradient estimators. Applications of discrete GANs that have leveraged these Gumbel-based gradient estimators include (among others) text generation [50], [51], [52], [53], [54], [55], [56], fake user data creation for recommender systems [57], and action prediction [58]. Discrete GANs have also been combined with knowledge distillation frameworks.…”
Section: Discrete GANs
Mentioning, confidence: 99%
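The Gumbel-based gradient estimators this passage refers to replace non-differentiable categorical sampling with a temperature-controlled softmax over Gumbel-perturbed logits, so gradients can flow through the generator's discrete output. A minimal NumPy sketch of the Gumbel-Softmax relaxation (the function name and example values are illustrative, not taken from the cited works):

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Draw a relaxed (differentiable) one-hot sample from categorical logits."""
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1)
    g = -np.log(-np.log(rng.uniform(1e-10, 1.0, size=logits.shape)))
    y = (logits + g) / tau
    e = np.exp(y - y.max())          # stable softmax
    return e / e.sum()

# Low temperature -> sample concentrates near a single category,
# while remaining a smooth function of the logits.
probs = gumbel_softmax(np.array([2.0, 0.5, 0.1]), tau=0.5)
```

As tau approaches 0 the relaxed sample approaches a hard one-hot vector; in practice tau is annealed during training.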
“…Conversely, when GS is used in discrete GANs, entropy of the generator's (output) distribution is often maximized rather than minimized, in order to stabilize optimization and prevent mode collapse (which is the case when the generator outputs non-diverse samples) [40], [54], [96]. To deal with this diversity-quality dilemma in GANs, also distance regularization in latent space has been leveraged [52], [58], promoting closeness of embeddings, and therefore (indirectly) influencing the logits predicted by the generator.…”
Section: Initialization and Regularization
Mentioning, confidence: 99%
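The entropy-maximization regularizer described in this passage can be illustrated in a few lines: an entropy bonus (a hypothetical helper, not code from the cited papers) is added to the generator objective so that peaked, mode-collapsed output distributions are penalized relative to diverse ones:

```python
import numpy as np

def entropy_bonus(probs, eps=1e-10):
    """Shannon entropy of a categorical distribution; maximizing this
    term in the generator loss discourages mode collapse."""
    return -np.sum(probs * np.log(probs + eps))

uniform = np.full(4, 0.25)                      # diverse output
peaked = np.array([0.97, 0.01, 0.01, 0.01])     # near-collapsed output
# The diverse distribution receives the larger bonus.
```

In a full training loop this bonus would be weighted and subtracted from the generator loss, trading off sample quality against diversity.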
“…Discrete time format matches the original scheme naturally while continuous ones incur unnecessary decimals. Recent work succeeds the discrete time scheme and trains a conditional GAN model based on Gumbel discrete sampling to jointly enhance the accuracy and diversity in both future action semantics and times [183]; see Figure 3.9. Previous efforts tended to be challenged in producing realistic (accurate) and diverse predictions.…”
Section: Add_milk Crack_egg Add_butter
Mentioning, confidence: 99%
“…Some work aims at learning the mapping from high-level information, extracted from raw observations, to the future. These efforts usually depend on extracted/estimated activity labels and segmented temporal duration as input [1,2,61,105,183].…”
Section: Add_milk Crack_egg Add_butter
Mentioning, confidence: 99%
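The input format these label-level methods rely on, a sequence of activity labels with segmented durations, can be sketched as follows. The vocabulary and helper below are hypothetical, chosen only to match the action names in the section label above:

```python
import numpy as np

# Hypothetical action vocabulary mirroring the section label.
VOCAB = ["add_milk", "crack_egg", "add_butter"]

def encode_segments(segments):
    """Encode (label, duration-in-seconds) pairs as one-hot label rows
    plus a relative-duration column, a common input representation for
    label-level anticipation models."""
    total = sum(d for _, d in segments)
    rows = []
    for label, dur in segments:
        one_hot = np.eye(len(VOCAB))[VOCAB.index(label)]
        rows.append(np.append(one_hot, dur / total))
    return np.stack(rows)

x = encode_segments([("crack_egg", 12.0), ("add_milk", 6.0)])
# x has shape (2, 4): three one-hot label columns plus one duration column
```

A model consuming this representation predicts the next (label, duration) pairs rather than operating on raw video frames.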