2019 IEEE/CVF International Conference on Computer Vision (ICCV) 2019
DOI: 10.1109/iccv.2019.00641
Non-Local Recurrent Neural Memory for Supervised Sequence Modeling

Abstract: Typical methods for supervised sequence modeling are built upon recurrent neural networks to capture temporal dependencies. One potential limitation of these methods is that they only explicitly model information interactions between adjacent time steps in a sequence, hence the high-order interactions between non-adjacent time steps are not fully exploited. This greatly limits the capability of modeling long-range temporal dependencies, since one-order interactions cannot be maintained for a long term due t…
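
The contrast the abstract draws, adjacent-step recurrence versus direct interactions between non-adjacent time steps, can be illustrated with a minimal non-local (full pairwise attention) sketch over a temporal window. The function and tensor names below are illustrative assumptions for this page, not the paper's implementation.

```python
# Minimal sketch (assumed, not the paper's code): full pairwise ("non-local")
# interactions over a window of T time steps, so non-adjacent steps exchange
# information in a single operation rather than through step-by-step recurrence.
import torch
import torch.nn.functional as F

def non_local_window(x, w_q, w_k, w_v):
    """x: (T, d) features for T time steps in a window.
    Every step attends to every other step, so step 0 and step T-1
    interact directly instead of only through adjacent-step updates."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v                        # (T, d) each
    attn = F.softmax(q @ k.t() / k.shape[-1] ** 0.5, dim=-1)   # (T, T) pairwise weights
    return attn @ v                                            # (T, d) non-locally mixed features

T, d = 8, 16
x = torch.randn(T, d)
w_q, w_k, w_v = (torch.randn(d, d) * d ** -0.5 for _ in range(3))
out = non_local_window(x, w_q, w_k, w_v)
print(out.shape)  # torch.Size([8, 16])
```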

Cited by 10 publications (6 citation statements) | References 30 publications
“…Inspired by non-local methods (Wang et al. 2018; Fu et al. 2019), we design a non-local recurrent regularization network to model long-range dependencies along the depth dimension, which helps to regularize cost volumes by aggregating more context information. As illustrated in Fig.…”
Section: Non-local Recurrent Regularization
confidence: 99%
“…Non-local Neural Networks [22] present non-local operations as a generic family of building blocks for capturing long-range temporal dependencies. Meanwhile, to capture high-order interactions between non-adjacent time steps, Non-local Recurrent Neural Memory [23] performs non-local operations to learn full-order interactions within a sliding temporal block and models global interactions between blocks in a gated recurrent manner. In addition, Zhou et al. [24] propose Temporal Relation Network (TRN) to give CNNs a remarkable capability to discover temporal relations in video.…”
Section: Temporal Relation Modeling in Action Recognition
confidence: 99%
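
The block-wise scheme this citation describes, full-order (non-local) interactions inside a sliding temporal block with global information carried across blocks by a gated recurrent update, might be sketched as follows. The block size, memory layout, and GRU-based gating here are assumptions chosen for illustration, not the published NRNM architecture.

```python
# Illustrative sketch (assumptions, not the published NRNM design):
# self-attention inside each temporal block + a gated recurrent (GRU) update
# that carries a memory vector from one block to the next.
import torch
import torch.nn as nn

class BlockwiseNonLocalMemory(nn.Module):
    def __init__(self, d_model, block_size):
        super().__init__()
        self.block_size = block_size
        # non-local interactions within a block (full pairwise attention)
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        # gated recurrent update of the global memory between blocks
        self.memory_cell = nn.GRUCell(d_model, d_model)

    def forward(self, x):
        # x: (batch, seq_len, d_model); seq_len assumed divisible by block_size
        b, t, d = x.shape
        memory = x.new_zeros(b, d)
        outputs = []
        for start in range(0, t, self.block_size):
            block = x[:, start:start + self.block_size]   # (b, block_size, d)
            mixed, _ = self.attn(block, block, block)     # non-local within the block
            outputs.append(mixed)
            # summarize the block and fold it into the recurrent memory
            memory = self.memory_cell(mixed.mean(dim=1), memory)
        return torch.cat(outputs, dim=1), memory

model = BlockwiseNonLocalMemory(d_model=64, block_size=8)
seq = torch.randn(2, 32, 64)
out, mem = model(seq)
print(out.shape, mem.shape)  # torch.Size([2, 32, 64]) torch.Size([2, 64])
```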
“…An earlier conference version of this paper appeared in (Fu et al., 2019a). Compared to the prior version, this longer article is improved in three aspects.…”
Section: Introduction
confidence: 99%