“…

| Reference | Problem addressed | Methods/Techniques |
|---|---|---|
| … | … | Deep belief network (DBN) |
| [Mesnil et al 2013] | To explore RNNs | RNN |
| [Yao et al 2013] | To explore RNN-LMs | RNN-LM |
| [Yao et al 2014b] | Label dependencies, label bias problem | RNN, CRF |
| [Yao et al 2014a] | Gradient vanishing and exploding, label dependencies, label bias problem | LSTM, regression model, deep learning |
| [Liu and Lane 2015] | Label dependencies | RNN, sampling approach |
| [Mesnil et al 2015] | To explore RNNs | RNN |
| … | Vanishing and exploding gradients | RNN, external memory |
| [Kurata et al 2016] | Label dependencies | LSTM, encoder-labeler |
| … | To exploit past and future information | Bi-directional RNN, ranking loss function |
| [Vu 2016] | To explore CNNs | CNN |
| [Zhu and Yu 2017] | To explore attention mechanisms | Bi-directional LSTM, LSTM, encoder-decoder, focus mechanism |
| [Dai et al 2018] | Unseen slots | CRF |
| [Gong et al 2019] | To explore MTL | MTL, segment tagging, NER |
| [Louvan and Magnini 2018] | To explore MTL | MTL, NER, bi-LSTM, CRF |
| [Shin et al 2018] | Better labelling of common words | Encoder-decoder, attention, delexicalised sentence generation |
| [Wang et al 2018a] | Imbalanced data | DNN, reinforcement learning |
| [Zhao and Feng 2018] | Out-of-vocabulary (OOV) words | GRU, attention, pointer network |
| … | To extend original SLU to H2H (human-to-human) conversations | Bi-LSTM, different knowledge sources |
| [Shen et al 2019b] | … | … |

…”
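Several rows above attack the label-dependency and label-bias problems by putting a CRF layer on top of an RNN/LSTM encoder. The sketch below is a toy illustration (not any surveyed model's implementation): a linear-chain Viterbi decode over a hypothetical BIO slot-label set, with hand-set emission and transition scores. In the surveyed systems the emission scores would come from the learned encoder; the transition scores are what let the model forbid sequences like `O → I-city` that a per-token classifier cannot rule out.

```python
# Toy linear-chain CRF decoding over BIO slot labels.
# All scores are illustrative hand-set values, not learned parameters.

LABELS = ["O", "B-city", "I-city"]

# Emission scores: per-token score for each label (in a BiLSTM-CRF
# these would be the encoder's outputs; hand-set here).
emissions = [
    {"O": 2.0, "B-city": 0.5, "I-city": 0.1},   # "fly"
    {"O": 1.0, "B-city": 0.8, "I-city": 0.2},   # "to"
    {"O": 0.2, "B-city": 1.5, "I-city": 1.4},   # "new"
    {"O": 0.2, "B-city": 0.3, "I-city": 1.8},   # "york"
]

# Transition scores encode label dependencies: I-city may only
# follow B-city or I-city (O -> I-city is heavily penalised).
transitions = {
    ("O", "O"): 0.5, ("O", "B-city"): 0.5, ("O", "I-city"): -10.0,
    ("B-city", "O"): 0.3, ("B-city", "B-city"): -1.0, ("B-city", "I-city"): 1.0,
    ("I-city", "O"): 0.3, ("I-city", "B-city"): -1.0, ("I-city", "I-city"): 0.8,
}

def viterbi(emissions, transitions, labels):
    """Return the highest-scoring label sequence under the chain model."""
    # best[i][y] = best score of any label sequence ending in y at token i
    best = [dict(emissions[0])]
    back = []
    for i in range(1, len(emissions)):
        scores, ptrs = {}, {}
        for y in labels:
            prev = max(labels, key=lambda p: best[-1][p] + transitions[(p, y)])
            scores[y] = best[-1][prev] + transitions[(prev, y)] + emissions[i][y]
            ptrs[y] = prev
        best.append(scores)
        back.append(ptrs)
    # Follow back-pointers from the best final label.
    y = max(labels, key=lambda l: best[-1][l])
    path = [y]
    for ptrs in reversed(back):
        y = ptrs[y]
        path.append(y)
    return list(reversed(path))

print(viterbi(emissions, transitions, LABELS))
# → ['O', 'O', 'B-city', 'I-city']
```

Even though "york" on its own scores highest as `I-city` and "new" as `B-city`, the decision is made jointly: the transition scores force the well-formed span `B-city I-city` rather than whatever each token would get in isolation, which is exactly the dependency that motivates the RNN+CRF rows in the table.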