“…However, static word vectors remain important in applications where word meaning has to be modelled in the absence of (sentence) context. For instance, static word vectors are needed for zero-shot image classification (Socher et al., 2013) and zero-shot entity typing (Ma et al., 2016), for ontology alignment (Kolyvakis et al., 2018) and completion (Li et al., 2019), taxonomy learning (Bordea et al., 2015, 2016), or for representing query terms in information retrieval systems (Nikolaev and Kotov, 2020). Moreover, Liu et al. (2020) recently found that static word vectors can complement CLMs by serving as anchors for contextualized vectors, while Alghanmi et al. (2020) found that incorporating static word vectors could improve the performance of BERT for social media classification.…”