Proceedings of the Web Conference 2021
DOI: 10.1145/3442381.3450128

Biomedical Vocabulary Alignment at Scale in the UMLS Metathesaurus

Cited by 17 publications (41 citation statements)
References 32 publications
“…Motivation. Clustering biomedical terms into concepts in the UMLS Metathesaurus was formalized as a vocabulary alignment problem, identified as UMLS Vocabulary Alignment (UVA) or the synonymy prediction task, by Nguyen et al. (2021). The UVA differs from other biomedical ontology alignment efforts by the Ontology Alignment Evaluation Initiative (OAEI) due to its extremely large problem size, requiring the pairwise comparison of 8.7M biomedical terms (as opposed to tens of thousands of pairs in OAEI datasets).…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
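
To make the scale in the quoted statement concrete: comparing 8.7M terms pairwise yields on the order of 10^13 candidate pairs, which is why the citing authors single out problem size. The 8.7M figure comes from the statement above; the back-of-the-envelope arithmetic below is ours:

```python
# Unordered pairs among n terms: n * (n - 1) / 2
n = 8_700_000
pairs = n * (n - 1) / 2
print(f"{pairs:.2e}")  # ~3.78e+13 candidate pairs, far beyond brute-force comparison
```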
“…The UVA is different from other biomedical ontology alignment efforts by the Ontology Alignment Evaluation Initiative (OAEI) due to the extremely large problem size of the UVA, which requires comparing 8.7M biomedical terms pairwise (as opposed to tens of thousands of pairs in OAEI datasets). The authors of (Nguyen et al., 2021) also introduced a scalable supervised learning approach based on the Siamese neural architecture, which leverages the lexical information present in the terms. Bidirectional Encoder Representations from Transformers (BERT) (Devlin et al., 2019) is a language model (LM) based on the multi-layer, bidirectional Transformer architecture (Vaswani et al., 2017).…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
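
As context for the Siamese approach mentioned in the statement above, here is a minimal sketch of a Siamese encoder for term-pair synonymy prediction. It is illustrative only: the BiLSTM encoder, the layer sizes, and the cosine-similarity scoring are assumptions made for this sketch, not details taken from (Nguyen et al., 2021).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseSynonymyModel(nn.Module):
    """Sketch of a Siamese encoder for term-pair synonymy prediction.

    Both terms pass through the SAME encoder (shared weights); their
    embeddings are compared to score synonymy. All hyperparameters are
    illustrative, not taken from the paper.
    """

    def __init__(self, vocab_size: int, embed_dim: int = 128, hidden_dim: int = 128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # A BiLSTM over token embeddings captures the lexical information
        # present in the term strings.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)

    def encode(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> term vector: (batch, 2 * hidden_dim)
        _, (h_n, _) = self.lstm(self.embedding(token_ids))
        # Concatenate the final forward and backward hidden states.
        return torch.cat([h_n[0], h_n[1]], dim=-1)

    def forward(self, term_a: torch.Tensor, term_b: torch.Tensor) -> torch.Tensor:
        # Shared weights: the same encoder processes both terms.
        vec_a, vec_b = self.encode(term_a), self.encode(term_b)
        # Cosine similarity in [-1, 1] serves as the synonymy score.
        return F.cosine_similarity(vec_a, vec_b, dim=-1)

# Illustrative usage with random token ids (real input would be tokenized terms).
model = SiameseSynonymyModel(vocab_size=10_000)
a = torch.randint(1, 10_000, (4, 12))   # batch of 4 terms, 12 tokens each
b = torch.randint(1, 10_000, (4, 12))
scores = model(a, b)                    # higher score -> more likely synonyms
```

In practice such a model would be trained on labeled synonym/non-synonym pairs with a contrastive or binary cross-entropy objective, and at UVA scale some candidate-generation (blocking) step would be needed rather than scoring all ~10^13 pairs; both points are general practice, not specifics from the cited work.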