“…Dealing with very long-term dependencies is an active area of research, and recent papers have introduced new variants that aim to fix this issue and improve on the historical models: IndRNN (Li et al., 2018) and RNNs with auxiliary losses (Trinh et al., 2018). Earlier works include the unitary RNN (uRNN) (Arjovsky et al., 2016), Quasi-Recurrent Neural Networks (QRNN) (Bradbury et al., 2016), Dilated RNN (Chang et al., 2017), Recurrent Additive Networks (Lee et al., 2017), ChronoNet (Roy et al., 2018), EUNN (Jing et al., 2016), Kronecker Recurrent Units (KRU) (Jose et al., 2017), and the Recurrent Weighted Average (Ostmeyer & Cowell, 2017).…”