“…Word2Vec is one of the most popular word embeddings, used in [4, 25, 27, 91]. However, because Word2Vec cannot handle out-of-vocabulary words, researchers have instead exploited GloVe, BERT, XLNet, and other embeddings [26, 28, 92, 95, 96, 98, 110, 129]. FND-SCTI [4] exploits the hierarchical structure of a document, applying a Bi-LSTM at the word level (from words to sentences) and again at the sentence level (from sentences to the document) to capture long-term dependencies in the text.…”
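The two-level encoding described above can be sketched as follows. This is a minimal illustration, not the actual FND-SCTI implementation: the layer sizes are arbitrary, and mean pooling is assumed at both levels purely for brevity (the original model may use different pooling or attention).

```python
import torch
import torch.nn as nn

class HierarchicalBiLSTM(nn.Module):
    """Sketch of a hierarchical document encoder: a word-level Bi-LSTM
    builds sentence vectors, then a sentence-level Bi-LSTM builds the
    document vector. All dimensions here are illustrative assumptions."""

    def __init__(self, vocab_size=1000, emb_dim=32, hid_dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Word level: from words to a sentence representation.
        self.word_lstm = nn.LSTM(emb_dim, hid_dim,
                                 bidirectional=True, batch_first=True)
        # Sentence level: from sentences to a document representation.
        self.sent_lstm = nn.LSTM(2 * hid_dim, hid_dim,
                                 bidirectional=True, batch_first=True)

    def forward(self, doc):
        # doc: (num_sentences, num_words) tensor of token ids
        word_out, _ = self.word_lstm(self.embed(doc))   # (S, W, 2H)
        # Mean-pool word states into one vector per sentence (assumption).
        sent_vecs = word_out.mean(dim=1).unsqueeze(0)   # (1, S, 2H)
        sent_out, _ = self.sent_lstm(sent_vecs)         # (1, S, 2H)
        # Mean-pool sentence states into one document vector (assumption).
        return sent_out.mean(dim=1).squeeze(0)          # (2H,)

doc = torch.randint(0, 1000, (3, 7))  # toy document: 3 sentences x 7 words
vec = HierarchicalBiLSTM()(doc)
```

The bidirectional LSTMs let each word (and each sentence) see both preceding and following context, which is what allows the model to capture long-term dependencies across the document.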