Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval 2021
DOI: 10.1145/3404835.3463035
Entity Retrieval Using Fine-Grained Entity Aspects

Cited by 18 publications (8 citation statements)
References 28 publications
“…A word will have not only one static vector but also a dynamic vector that depends on the specific context in which it is used. This resolves terminology problems present in other word embedding models, which associate a single representation with each word (Pfeiffer et al., 2018; Nogueira et al., 2020; Zheng et al., 2020; Zhuang et al., 2021; Wu et al., 2021; Chatterjee and Dietz, 2022; Wang et al., 2022).…”
Section: Related Work
Citation type: mentioning
confidence: 99%
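The static-versus-contextual distinction described in this excerpt can be illustrated with a toy sketch (this is not any of the cited models): a static embedding maps each word to one fixed vector regardless of context, while a contextual embedding varies with the surrounding words.

```python
# Toy sketch contrasting static and contextual word embeddings.
# All vectors here are random stand-ins, not trained representations.
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["river", "bank", "money", "deposit", "water"]
STATIC = {w: rng.standard_normal(4) for w in VOCAB}  # one vector per word

def static_embed(word, context):
    # Context is ignored: "bank" always maps to the same vector.
    return STATIC[word]

def contextual_embed(word, context):
    # Minimal stand-in for a contextual model: mix the word's own
    # vector with the mean vector of its context words.
    ctx = np.mean([STATIC[w] for w in context if w in STATIC], axis=0)
    return 0.5 * STATIC[word] + 0.5 * ctx

s1 = ["river", "water"]    # "bank" as in riverbank
s2 = ["money", "deposit"]  # "bank" as in financial institution

same_static = np.allclose(static_embed("bank", s1), static_embed("bank", s2))
same_ctx = np.allclose(contextual_embed("bank", s1), contextual_embed("bank", s2))
print(same_static, same_ctx)  # True False: only the static vectors match
```

The static lookup cannot separate the two senses of "bank"; even this crude context mixing does, which is the property the excerpt attributes to contextual models.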
“…Past studies have shown that entity ranking improves by leveraging mentions in text passages to create a topic-specific text-entity graph [12]. Transformer-based embeddings have been shown to be a reasonable entity ranking baseline [6], with strong performance on related tasks such as entity linking [38]. Entity ranking closely relates to entity aspect linking, where the task is to identify the fine-grained semantics of the entity that relate to a mention in a contextual passage [27,31].…”
Section: Entity Ranking
Citation type: mentioning
confidence: 99%
“…Entity ranking closely relates to entity aspect linking, where the task is to identify the fine-grained semantics of the entity that relate to a mention in a contextual passage [27,31]. Incorporating entity aspects has also been shown to improve entity ranking [6].…”
Section: Entity Ranking
Citation type: mentioning
confidence: 99%
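The entity aspect linking task described in these excerpts can be sketched with a toy example (illustrative only, not the systems cited as [27,31]): given a mention's context passage, select the entity aspect whose description best matches it. Here the match score is plain word overlap; the cited work uses far richer ranking features. The entity and aspect names below are hypothetical.

```python
# Toy entity aspect linking: pick the aspect of an ambiguous entity
# whose description overlaps most with the mention's context words.
ASPECTS = {  # hypothetical aspects for the entity "Jaguar"
    "Jaguar (animal)": {"cat", "predator", "jungle", "prey"},
    "Jaguar (car)": {"car", "engine", "luxury", "vehicle"},
}

def link_aspect(context_words, aspects):
    # Score = |context ∩ aspect description|; return the best aspect.
    scores = {name: len(set(context_words) & words)
              for name, words in aspects.items()}
    return max(scores, key=scores.get)

mention_context = ["the", "jaguar", "stalked", "its", "prey", "in", "the", "jungle"]
print(link_aspect(mention_context, ASPECTS))  # Jaguar (animal)
```

The context words "prey" and "jungle" select the animal aspect, capturing the idea of resolving an entity mention to fine-grained semantics rather than to the entity as a whole.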
“…BERT is a deep learning model used to understand the linguistic structure and context of text data and to create embeddings of the words in texts (Büyük, 2023). Moreover, several studies have noted that this model is used to create query-specific entity embeddings and that this use yields better results (Chatterjee & Dietz, 2022). Consequently, the pre-trained BERT model, fine-tuned with only one additional output layer, can produce state-of-the-art models for a wide range of tasks such as question answering and language inference without requiring substantial task-specific architectural changes (Devlin et al.…”
Section: Introduction
Citation type: unclassified
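The "one additional output layer" idea in the excerpt above can be sketched as follows (a toy stand-in, not actual BERT): a frozen pre-trained encoder supplies features, and fine-tuning updates only the added linear head's parameters.

```python
# Toy sketch of fine-tuning with a single added output layer.
# The "encoder" is a frozen random lookup, standing in for BERT.
import numpy as np

rng = np.random.default_rng(1)
HIDDEN = 8                                  # stand-in for BERT's 768
TABLE = rng.standard_normal((100, HIDDEN))  # frozen "pre-trained" weights

def encode(token_ids):
    # Placeholder for the pre-trained encoder's pooled output.
    return TABLE[token_ids].mean(axis=0)

W = np.zeros((HIDDEN, 2))  # the added output layer (2 classes)
b = np.zeros(2)

def classify(token_ids):
    return int(np.argmax(encode(token_ids) @ W + b))

def finetune_step(token_ids, label, lr=0.1):
    # One softmax cross-entropy gradient step; only W and b change,
    # the encoder stays frozen.
    global W, b
    h = encode(token_ids)
    logits = h @ W + b
    p = np.exp(logits - logits.max())
    p /= p.sum()
    grad = p.copy()
    grad[label] -= 1.0            # d(loss)/d(logits)
    W -= lr * np.outer(h, grad)
    b -= lr * grad

finetune_step([3, 7, 42], label=1)
print(classify([3, 7, 42]))  # 1 after the update
```

Only `W` and `b` are trained, which mirrors the excerpt's point that no substantial task-specific architectural change is needed on top of the pre-trained model.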