2019
DOI: 10.1007/978-3-030-15712-8_19
Local and Global Query Expansion for Hierarchical Complex Topics

Abstract: In this work we study local and global methods for query expansion for multifaceted complex topics. We study word-based and entity-based expansion methods and extend these approaches to complex topics using fine-grained expansion on different elements of the hierarchical query structure. As a source of hierarchical complex topics we use the TREC Complex Answer Retrieval (CAR) benchmark collection. We find that leveraging the hierarchical topic structure is needed for both local and global expansion methods […]
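The abstract's fine-grained expansion idea is concrete enough to sketch. Below is a minimal illustration, assuming a CAR topic is represented as a heading path (e.g. title, heading, subheading) and assuming a hypothetical `expand_terms` function standing in for any local or global expansion model; the paper's actual expansion models are not reproduced here:

```python
from typing import Callable, List

def expand_hierarchical_topic(
    heading_path: List[str],
    expand_terms: Callable[[str], List[str]],
    terms_per_element: int = 5,
) -> str:
    """Expand each element of a hierarchical topic separately, then
    combine the original headings with their expansion terms.

    `expand_terms` is a stand-in for a local (feedback-based) or
    global (embedding-based) expansion model.
    """
    parts = []
    for element in heading_path:
        expansions = expand_terms(element)[:terms_per_element]
        parts.append(element)
        parts.extend(expansions)
    return " ".join(parts)

# Toy usage with a dummy expansion function:
dummy = lambda text: [w + "_exp" for w in text.split()]
print(expand_hierarchical_topic(["Chocolate", "Health effects"], dummy))
```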

Cited by 7 publications (7 citation statements). References 22 publications.
“…Embedding-based Expansion: Another approach for query expansion incorporates static embeddings (Pennington et al., 2014; Mikolov et al., 2013) to find terms relevant to the query, because embeddings promise to capture the semantic similarity between terms and are used in different ways to expand queries (Diaz et al., 2016; Kuzi et al., 2016; Zamani & Croft, 2016; Dalton et al., 2019; Roy et al., 2016; Naseri et al., 2018). These word embeddings, such as Word2Vec, GloVe, and others, learn a static embedding for each term regardless of its context.…”
Section: Background and Related Work
confidence: 99%
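As a concrete illustration of this kind of static-embedding expansion (a generic sketch, not the exact method of any paper cited above), one can take the nearest neighbours of each query term in a pre-trained Word2Vec or GloVe space; the model file below is a placeholder for any word2vec-format vectors:

```python
from gensim.models import KeyedVectors

# Placeholder path: any word2vec-format vector file works here.
vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

def expand_query(query: str, terms_per_word: int = 3) -> list:
    """Expand a query with the nearest neighbours of each term
    in the static embedding space."""
    expanded = []
    for term in query.lower().split():
        if term in vectors:  # skip out-of-vocabulary terms
            neighbours = vectors.most_similar(term, topn=terms_per_word)
            expanded.extend(word for word, _score in neighbours)
    return query.split() + expanded

print(expand_query("hierarchical query expansion"))
```

Because the embeddings are static, every occurrence of a term expands to the same neighbours regardless of the surrounding query, which is exactly the limitation the quoted passage points out.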
“…Complex Answer Retrieval (Table 2): We follow previous expansion work on CAR (Dalton et al., 2019) and use BenchmarkY1Tree with the root topic titles removed. This is the setup recommended by the CAR organizers and an updated version of the widely used hierarchical judgments (and therefore slightly different from reported hierarchical values).…”
Section: Contextualized Query Expansion
confidence: 99%
“…Another approach for query expansion incorporates static embeddings [26, 19] to find terms relevant to the query, because embeddings promise to capture the semantic similarity between terms and are used in different ways to expand queries [7, 12, 36, 37, 5, 31, 20]. These word embeddings, such as Word2Vec, GloVe, and others, learn a static embedding for each term regardless of its context.…”
Section: Embedding-based Expansion
confidence: 99%
“…It is clear that integrating PRF signals into deep language models implies a trade-off between effectiveness and efficiency. While current approaches have ignored efficiency, the majority still achieved only marginal improvements in effectiveness […] maintaining efficiency: (i) by concatenating the feedback passages with the original query to form new queries that contain the relevant signals, and (ii) by pre-generating passage collection embeddings and performing PRF in the vector space, because embeddings promise to capture the semantic similarity between terms [10, 13, 22, 36, 37, 45, 56, 57].…”
Section: Related Work
confidence: 99%
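Option (ii), PRF in the vector space, can be sketched as a Rocchio-style update over pre-computed passage embeddings. This is a generic illustration under stated assumptions (dot-product retrieval, a simple linear mixing weight), not the implementation of the systems the quote surveys:

```python
import numpy as np

def vector_prf(query_emb: np.ndarray,
               passage_embs: np.ndarray,
               k: int = 3,
               alpha: float = 0.7) -> np.ndarray:
    """Rocchio-style pseudo-relevance feedback in embedding space:
    score all passages by dot product, take the top-k as feedback,
    and mix their mean embedding into the query embedding."""
    scores = passage_embs @ query_emb          # (num_passages,)
    top_k = np.argsort(-scores)[:k]            # indices of top-k feedback passages
    feedback = passage_embs[top_k].mean(axis=0)
    return alpha * query_emb + (1.0 - alpha) * feedback

# Toy usage with random vectors standing in for a pre-encoded collection:
rng = np.random.default_rng(0)
passages = rng.normal(size=(100, 8))
query = rng.normal(size=8)
new_query = vector_prf(query, passages)
```

Because the collection embeddings are generated once offline, the feedback step adds only one extra nearest-neighbour search and a vector average at query time, which is why this route preserves efficiency.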