“…DE was used for fine-tuning the Naïve Bayesian Classifier (NBC) for text classification in Diab and El Hindi (2017). A multiple partially observed view approach for multilingual text categorization (Amini et al., 2009), an iterative deep neighborhood model for text classification (Liu et al., 2020), the integration of bidirectional Long Short-Term Memory (LSTM) with 2D max pooling for text classification (Zhou et al., 2016), a Recurrent Neural Network (RNN) for text classification with multi-task learning (Liu et al., 2016), a Recurrent CNN for text classification (Lai et al., 2015), and a character-level convolutional network for text classification (Zhang et al., 2015) are some of the best-known deep learning works proposed in the literature. A ranking-based deep learning representation for efficient text classification (Zheng et al., 2018), a hierarchical neural network document representation approach for text classification using three different models (Kowsari et al., 2017), a C-LSTM neural network for text classification (Zhou, 2015), and a neural attention model that leverages contextual sentences for text classification (Yan, 2019) are further works that have greatly aided the research community in subsequent analyses.…”