Learning Sequence Representations by Non-local Recurrent Neural Memory

2022 · DOI: 10.1007/s11263-022-01648-y

Abstract: The key challenge of sequence representation learning is to capture the long-range temporal dependencies. Typical methods for supervised sequence representation learning are built upon recurrent neural networks (RNNs) to capture temporal dependencies. One potential limitation of these methods is that they only explicitly model one-order information interactions between adjacent time steps in a sequence, hence the high-order interactions between nonadjacent time steps are not fully exploited. This greatly limits the cap…
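To make the abstract's distinction concrete, below is a minimal sketch, not the authors' exact NRNM architecture: a standard LSTM only propagates information between adjacent time steps (one-order interactions), while a non-local memory block lets every time step within a sliding window attend directly to every other one (high-order interactions between nonadjacent steps). The class names `NonLocalMemory` and `SketchNRNM`, the window size, and all hyper-parameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

class NonLocalMemory(nn.Module):
    """Self-attention over a window of hidden states, so that
    non-adjacent time steps interact directly (hypothetical sketch)."""
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, window: torch.Tensor) -> torch.Tensor:
        # window: (batch, window_len, dim) of past hidden states
        out, _ = self.attn(window, window, window)  # every step attends to every step
        return self.norm(window + out)              # residual memory refinement

class SketchNRNM(nn.Module):
    """LSTM whose hidden states are refined block-wise by a non-local
    memory over a sliding window (a rough sketch of the idea, under
    assumed hyper-parameters)."""
    def __init__(self, input_dim: int, hidden_dim: int, window: int = 8):
        super().__init__()
        self.rnn = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.memory = NonLocalMemory(hidden_dim)
        self.window = window

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, _ = self.rnn(x)  # (batch, T, hidden): adjacent-step, one-order interactions
        refined = []
        for t in range(0, h.size(1), self.window):
            block = h[:, t:t + self.window]     # slide a window over the sequence
            refined.append(self.memory(block))  # non-local interactions inside the window
        return torch.cat(refined, dim=1)        # (batch, T, hidden)

x = torch.randn(2, 32, 16)             # 2 sequences, 32 steps, 16 features
model = SketchNRNM(input_dim=16, hidden_dim=64)
print(model(x).shape)                  # torch.Size([2, 32, 64])
```

In this sketch the attention inside each window is what supplies the interactions between nonadjacent time steps that a plain recurrent update cannot model explicitly.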

Cited by 0 publications · References 56 publications