“…Most common approaches adapt entity embeddings learned by models such as BERT by injecting additional knowledge from ontologies that define relations between entities. This can be done by using templates to convert the relations to text before finetuning the embeddings (Weissenborn et al., 2017; Lauscher et al., 2020), by combining relational information from knowledge graphs with text embeddings (Mihaylov and Frank, 2018; Chen et al., 2018; Zhang et al., 2019; Yang et al., 2019a), or by jointly learning knowledge-graph and textual embeddings (Peters et al., 2019; Feng et al., 2020). These ontologies are either generic, like WordNet (Miller, 1995), ConceptNet (Liu and Singh, 2004), and Wikidata (Vrandečić and Krötzsch, 2014), or specific to a particular domain, like the UMLS (Bodenreider, 2004).…”
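The first of these strategies, template-based verbalization of relations, can be sketched as follows. This is a minimal illustration, not the method of any cited paper: the relation names and template strings are hypothetical, and real systems use templates tailored to their ontology.

```python
# Illustrative sketch: verbalizing knowledge-graph triples into text with
# hand-written templates, so the resulting sentences can be appended to a
# language model's finetuning corpus. Relation names and templates below
# are invented for the example.
TEMPLATES = {
    "is_a": "{head} is a kind of {tail}.",
    "part_of": "{head} is a part of {tail}.",
    "treats": "{head} is used to treat {tail}.",
}

def verbalize(triple):
    """Convert a (head, relation, tail) triple into a natural-language sentence."""
    head, relation, tail = triple
    return TEMPLATES[relation].format(head=head, tail=tail)

triples = [
    ("aspirin", "treats", "headache"),
    ("wheel", "part_of", "car"),
]
sentences = [verbalize(t) for t in triples]
```

Each generated sentence then serves as ordinary training text, letting the model absorb the ontology's relational knowledge without any architectural change.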