Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.842

VeeAlign: Multifaceted Context Representation Using Dual Attention for Ontology Alignment

Abstract: Ontology Alignment is an important research problem applied to various fields such as data integration, data transfer, data preparation, etc. State-of-the-art (SOTA) Ontology Alignment systems typically use naive domain-dependent approaches with handcrafted rules or domain-specific architectures, making them unscalable and inefficient. In this work, we propose VeeAlign, a Deep Learning-based model that uses a novel dual-attention mechanism to compute the contextualized representation of a concept which, in turn…
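As a rough illustration of the dual-attention idea mentioned in the abstract, the sketch below (PyTorch) attends first over the nodes inside each context facet of a concept (e.g. its ancestor path, children, properties) and then over the facet summaries themselves. This is an assumed, minimal reading of "dual attention", not the authors' implementation; the module, its names, and the way the context is folded back into the label embedding are illustrative choices.

```python
# Minimal sketch (not the authors' code) of a dual-attention style contextualised
# concept representation. Assumptions: each concept has a label embedding of size
# `dim`, plus several context "facets" (e.g. ancestor path, children, properties),
# each given as a matrix of neighbour embeddings of shape (n_i, dim).
import torch
import torch.nn as nn
import torch.nn.functional as F
from typing import List

class DualAttentionContext(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.node_query = nn.Linear(dim, dim)   # scores nodes within one facet
        self.facet_query = nn.Linear(dim, dim)  # scores the facet summaries

    def forward(self, label_emb: torch.Tensor, facets: List[torch.Tensor]) -> torch.Tensor:
        # First attention: weight the nodes inside each facet against the label.
        q1 = self.node_query(label_emb)                        # (dim,)
        summaries = []
        for nodes in facets:                                   # nodes: (n_i, dim)
            weights = F.softmax(nodes @ q1, dim=0)             # (n_i,)
            summaries.append(weights @ nodes)                  # (dim,)
        facet_mat = torch.stack(summaries)                     # (num_facets, dim)

        # Second attention: weight the facet summaries themselves.
        q2 = self.facet_query(label_emb)                       # (dim,)
        facet_weights = F.softmax(facet_mat @ q2, dim=0)       # (num_facets,)
        context = facet_weights @ facet_mat                    # (dim,)

        # Contextualised representation: label embedding enriched with context.
        return label_emb + context

# Example usage with random embeddings for a concept and two context facets:
# model = DualAttentionContext(dim=300)
# rep = model(torch.randn(300), [torch.randn(4, 300), torch.randn(2, 300)])
```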

Cited by 11 publications (7 citation statements) · References 26 publications

Citation statements, ordered by relevance:
“…Machine learning-based methods for ontology alignment [16] and ontology matching [7] can be applied to automatically map concepts and terms from one ontology or vocabulary to another. These algorithms use techniques such as semantic similarity measures [25], graph-based methods [24], and deep learning models [4,10,11] to identify correspondences between concepts in different ontologies or vocabularies. The goal is to produce a mapping that enables data exchange between systems using different ontologies or vocabularies while preserving the meaning of the data.…”
Section: Ontology and Vocabulary Alignment
confidence: 99%
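One concrete instance of the "semantic similarity measures" mentioned in the excerpt above is to embed concept labels from both ontologies and keep pairs whose cosine similarity clears a threshold. The sketch below is a minimal illustration of that idea; the sentence-transformers checkpoint and the 0.8 threshold are assumptions, not parameters of any cited system.

```python
# Minimal sketch of similarity-based concept matching between two ontologies,
# assuming each concept is reduced to its label string. Model name and threshold
# are illustrative assumptions.
from sentence_transformers import SentenceTransformer, util

def match_concepts(labels_a, labels_b, threshold=0.8):
    model = SentenceTransformer("all-MiniLM-L6-v2")
    emb_a = model.encode(labels_a, convert_to_tensor=True, normalize_embeddings=True)
    emb_b = model.encode(labels_b, convert_to_tensor=True, normalize_embeddings=True)
    sims = util.cos_sim(emb_a, emb_b)                  # (|A|, |B|) cosine matrix
    mappings = []
    for i, label in enumerate(labels_a):
        j = int(sims[i].argmax())                      # best target candidate
        if float(sims[i][j]) >= threshold:
            mappings.append((label, labels_b[j], float(sims[i][j])))
    return mappings

# Example: match_concepts(["Author", "Paper"], ["Writer", "Article", "Review"])
```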
“…IDFSample (text similarity-based). This strategy introduces hard negative candidates that are ambiguous with the ground-truth class at the text level (i.e., with similar labels). We first build a sub-word inverted index [12] for the labels of all the classes of O′ using a sub-word tokenizer pre-trained on biomedical texts [1].…”
Section: Local Ranking
confidence: 99%
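The IDFSample strategy quoted above can be sketched as: index the target ontology's class labels by sub-word tokens, then rank candidate negatives by the summed IDF of the sub-words they share with the positive label. The tokenizer checkpoint and scoring details below are assumptions for illustration, not the cited paper's exact procedure (which uses a tokenizer pre-trained on biomedical texts).

```python
# Sketch of the IDFSample idea: build a sub-word inverted index over the target
# ontology's class labels, then pick text-ambiguous "hard" negatives that share
# rare (high-IDF) sub-words with the positive label.
import math
from collections import defaultdict
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # stand-in sub-word tokenizer

def build_inverted_index(labels):
    """Map each sub-word token to the set of label ids that contain it."""
    index = defaultdict(set)
    for i, label in enumerate(labels):
        for tok in set(tokenizer.tokenize(label)):
            index[tok].add(i)
    return index

def idf_sample(positive_label, labels, index, k=5):
    """Return up to k hard negatives: labels sharing rare sub-words with the positive."""
    n = len(labels)
    scores = defaultdict(float)
    for tok in set(tokenizer.tokenize(positive_label)):
        postings = index.get(tok, set())
        if not postings:
            continue
        idf = math.log(n / len(postings))   # rarer sub-words weigh more
        for j in postings:
            scores[j] += idf
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [labels[j] for j in ranked[:k] if labels[j] != positive_label]
```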
“…Recently, machine learning (ML)-based OM systems have become increasingly popular as they can go beyond surface-form string comparison by encoding ontology entities into vectors. For example, DeepAlignment [17] adopts counter-fitting to refine word embeddings for better representation of class labels; VeeAlign [14] proposes a dual encoder to encode both textual and path information of classes; and BERTMap [12] derives mappings through dynamic contextual text embeddings from the pre-trained language model BERT.…”
Section: Introduction
confidence: 99%
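As one concrete reading of "dynamic contextual text embeddings from the pre-trained language model BERT" in the excerpt above, a candidate mapping can be scored by feeding the two class labels jointly into a BERT sequence classifier. The checkpoint below is a placeholder and the classifier would need fine-tuning on synonym/non-synonym label pairs, so this is an assumed setup rather than BERTMap's published pipeline.

```python
# Assumed setup (not BERTMap's published pipeline): score a candidate mapping by
# classifying a pair of class labels with a BERT sequence classifier.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
clf = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

def mapping_score(label_src: str, label_tgt: str) -> float:
    """Probability (per the placeholder, untuned classifier) that two labels co-refer."""
    inputs = tok(label_src, label_tgt, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = clf(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Example: mapping_score("heart attack", "myocardial infarction")
```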
“…2. LogMap & AML. LogMap and AML are two classical OM systems based on lexical matching, mapping extension and repair.…”
confidence: 99%