Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2020
DOI: 10.1145/3394486.3403165

Retrospective Loss: Looking Back to Improve Training of Deep Neural Networks

Abstract: Deep neural networks (DNNs) are powerful learning machines that have enabled breakthroughs in several domains. In this work, we introduce a new retrospective loss to improve the training of deep neural network models by utilizing the prior experience available in past model states during training. Minimizing the retrospective loss, along with the task-specific loss, pushes the parameter state at the current training step towards the optimal parameter state while pulling it away from the parameter state at a previous training step.
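The abstract describes an auxiliary loss that pulls the current model's predictions toward the ground truth while pushing them away from the predictions of a past model snapshot. Below is a minimal PyTorch-style sketch of this idea; the function name `retrospective_loss`, the `margin` weighting, the L1 distance, and the snapshot-refresh schedule are illustrative assumptions, not the paper's exact formulation.

```python
import copy
import torch
import torch.nn.functional as F

def retrospective_loss(current_out, past_out, target, margin=4.0):
    """Retrospective-style auxiliary loss (assumed form): pull current
    predictions toward the target, push them away from the predictions
    of a frozen past model state."""
    pull = F.l1_loss(current_out, target)             # toward ground truth
    push = F.l1_loss(current_out, past_out.detach())  # away from past state
    return (1.0 + margin) * pull - margin * push

# Hypothetical training loop: keep a frozen snapshot of an earlier model
# state and add the retrospective term to the task-specific loss.
model = torch.nn.Linear(10, 1)
past_model = copy.deepcopy(model)  # snapshot of a past parameter state
opt = torch.optim.SGD(model.parameters(), lr=0.01)

x, y = torch.randn(8, 10), torch.randn(8, 1)
for step in range(100):
    pred = model(x)
    with torch.no_grad():
        past_pred = past_model(x)
    loss = F.mse_loss(pred, y) + 0.1 * retrospective_loss(pred, past_pred, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 20 == 0:  # periodically refresh the past snapshot
        past_model.load_state_dict(model.state_dict())
```

The snapshot interval and the weight on the retrospective term are hyperparameters in this sketch; the paper tunes its own variants of these choices.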

Cited by 3 publications (1 citation statement) · References 19 publications (37 reference statements)
“…These methods can work with existing architectures to improve performance. From another insight, retrospective loss (RL) [18] proposes to utilize the past model state during the training process. Take the red boxes as an example…”
Section: Related Work
confidence: 99%