2021 Second International Conference on Intelligent Data Science Technologies and Applications (IDSTA)
DOI: 10.1109/idsta53674.2021.9660801

Clustering and Network Analysis for the Embedding Spaces of Sentences and Sub-Sentences

Cited by 2 publications (6 citation statements). References 24 publications.

“…Gupta et al [48] find that naive clustering of high-dimensional contextual BERT embeddings produces deficient results. An et al [47] reinforce this finding by surveying embedding models' clustering ability using Spatial Histograms, which showed high-dimensional dynamic SBERT embeddings to be the least clusterable compared with low-dimensional static GloVe models. We argue that by reducing embedding dimensionality, and therefore clustering complexity, an increase in clustering performance can be observed.…”
Section: Dimensionality Reduction Algorithms Selection
confidence: 85%
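
The reduce-then-cluster argument in the statement above can be illustrated with a short sketch. This is not code from any of the cited papers; the encoder name, component count, and cluster count are illustrative assumptions. It embeds a few sentences with an SBERT-style model, runs k-means on both the raw and PCA-reduced vectors, and compares silhouette scores.

# Minimal sketch of reduce-then-cluster; model name and all parameters are assumptions.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.metrics import silhouette_score

sentences = [
    "Heavy rain is expected across the region tomorrow.",
    "Forecasters warn of storms moving in overnight.",
    "The stock market fell sharply on Monday.",
    "Equity prices dropped at the start of the week.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed SBERT-style encoder
X = model.encode(sentences)                      # high-dimensional contextual embeddings

for name, vectors in [("raw", X), ("PCA-2d", PCA(n_components=2).fit_transform(X))]:
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
    print(name, round(silhouette_score(vectors, labels), 3))
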
“…Bodrunova et al [46] use Hierarchical Agglomerative Clustering to group Universal Sentence Encoder embeddings, with the addition of a Markov stopping moment to choose the optimal number of clusters. Similarly, An et al [47] use a range of both static and dynamic sentence embeddings, which are clustered with k-means into a number of groups specified by Spatial Histogram analysis. Gupta et al [48] show that lowering embedding dimensionality prior to clustering, using an Encoder-Decoder model, improves clustering performance.…”
Section: Semantic Clustering Of Variables Into Domains
confidence: 99%
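
As a rough sketch of the agglomerative route described in [46], the snippet below groups a matrix of sentence embeddings with scikit-learn's AgglomerativeClustering. The random embeddings stand in for Universal Sentence Encoder vectors, and a fixed n_clusters stands in for the Markov-stopping-moment selection; both are assumptions for illustration only.

import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(20, 512))  # placeholder for 512-d USE-style sentence vectors

# Average-linkage HAC over cosine distances (scikit-learn >= 1.2 names this
# argument `metric`; older releases call it `affinity`).
hac = AgglomerativeClustering(n_clusters=4, metric="cosine", linkage="average")
labels = hac.fit_predict(embeddings)
print(labels)
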
“…In total, ontology-based applications were observed in 14 of the selected papers [28-30, 32-35, 37, 38, 40, 41, 50-52]. These applications include the reuse, combination, and extension of existing ontologies and semantic resources (Figure 3b,i,ii). An example is given by Li et al [33], who demonstrate the extension of two nanotechnology ontologies with new concepts and axioms in a two-step procedure.…”
Section: Ontology Application
confidence: 99%
“…An additional important application is the mapping or alignment of one ontology to another (Figure 3b,iii). An et al [50] illustrated this by introducing the two-component system OTMapOnto. In this process, terms in different ontologies are identified and mapped to each other using ontology embeddings and an optimal transport approach.…”
Section: Ontology Application
confidence: 99%
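
To make the optimal-transport idea concrete, here is a minimal sketch, not OTMapOnto itself, that couples two small sets of term embeddings with the POT library and reads a term-to-term mapping off the transport plan. The term lists and random embeddings are purely illustrative assumptions; a real pipeline would first encode the ontology labels with an embedding model.

import numpy as np
import ot  # POT: Python Optimal Transport

source_terms = ["nanoparticle", "coating", "surface charge"]
target_terms = ["nano-object", "shell material", "zeta potential", "solvent"]

rng = np.random.default_rng(0)
Xs = rng.normal(size=(len(source_terms), 64))  # placeholder source-term embeddings
Xt = rng.normal(size=(len(target_terms), 64))  # placeholder target-term embeddings

a = np.full(len(source_terms), 1.0 / len(source_terms))  # uniform term masses
b = np.full(len(target_terms), 1.0 / len(target_terms))
M = ot.dist(Xs, Xt, metric="cosine")                     # pairwise cost matrix

plan = ot.emd(a, b, M)          # exact optimal transport plan
for i, term in enumerate(source_terms):
    j = int(plan[i].argmax())   # heaviest coupling for this source term
    print(term, "->", target_terms[j])
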