“…Shahi & Pant, 2018; Singh, 2018; Subba, Paudel, & Shahi, 2019; Thakur & Singh, 2014; Wagle & Thapa, 2021) and sentiment analysis (Piryani, Piryani, Singh, & Pinto, 2020; Regmi, Bal, & Kultsova, 2017; T. Shahi, Sitaula, & Paudel, 2022; Sitaula, Basnet, Mainali, & Shahi, 2021; Tamrakar, Bal, & Thapa, 2020; Thapa & Bal, 2016). Several studies (Aggarwal, Chauhan, Kumar, Mittal, & Verma, 2020; Al-Yahya, Al-Khalifa, Al-Baity, AlSaeed, & Essam, 2021; Terechshenko et al., 2020) show that transformer models perform significantly better than the earlier approaches because their attention mechanisms allow them to attend to longer sequences of text.…”
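As a minimal sketch (an illustration of my own, not drawn from any of the cited works), the scaled dot-product attention at the core of transformer models can be expressed in a few lines of NumPy; the shapes and dimensions here are arbitrary toy values:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise token-to-token similarity scores
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted mix of all value vectors

# toy example: a sequence of 4 tokens with 8-dimensional representations
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Because every token's output mixes information from every other position in one step, the effective path between distant tokens is constant-length, which is the property the cited studies credit for the improved handling of long sequences.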