Among knowledge-based approaches, we took into account the extension of Lesk with word embeddings (Basile, Caputo, and Semeraro 2014, Lesk ext+emb), the extended version of UKB with gloss relations (Agirre, de Lacalle, and Soroa 2014, UKB gloss), and Babelfy (Moro, Raganato, and Navigli 2014). As for supervised systems, we considered an SVM-based classifier integrated with word embeddings (Iacobacci, Pilehvar, and Navigli 2016, IMS+emb), the Bi-LSTM with attention and a multi-task objective presented in Raganato, Delli Bovi, and Navigli (2017, Bi-LSTM), and the more recent supervised systems leveraging sense definitions, i.e., HCAN (Luo et al. 2018) and EWISE (Kumar et al. 2019). We also performed a comparison with the two LSTM-based architectures of Yuan et al. (2016, LSTM-LP) and context2vec (Melamud, Goldberger, and Dagan 2016), which learn representations of the annotated sentences in the training corpus.