“…In the context of transformers, the general setup of ranking with dense representations involves learning transformer-based encoders that convert queries and texts into dense, fixed-size vectors. In the simplest approach, ranking becomes the problem of approximate nearest neighbor (ANN) search based on some simple metric such as cosine similarity (Xiong et al., 2020; Lu et al., 2020; Reimers and Gurevych, 2019; Gao et al., 2020b; Karpukhin et al., 2020; Qu et al., 2020; Hofstätter et al., 2020a; Lin et al., 2020b). However, recognizing that accurate ranking cannot be captured via simple metrics, researchers have explored using more complex machinery to compare dense representations (Humeau et al., 2020; Khattab and Zaharia, 2020).…”
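The setup described above can be sketched minimally as follows. This is an illustrative assumption, not any cited system's implementation: random NumPy vectors stand in for the outputs of transformer-based query and text encoders, and a brute-force dot product stands in for an ANN index (in practice one would use a library such as FAISS).

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for encoder outputs: in a real system these dense, fixed-size
# vectors would come from transformer-based bi-encoders.
query_vec = rng.standard_normal(128)         # one query, 128-d vector
doc_vecs = rng.standard_normal((1000, 128))  # 1000 candidate texts

# Normalize so that a dot product equals cosine similarity.
query_vec /= np.linalg.norm(query_vec)
doc_vecs /= np.linalg.norm(doc_vecs, axis=1, keepdims=True)

# Ranking reduces to nearest-neighbor search under cosine similarity;
# brute-force scoring over all candidates illustrates the idea.
scores = doc_vecs @ query_vec
top_k = np.argsort(-scores)[:10]  # indices of the 10 highest-scoring texts
```

The "more complex machinery" mentioned at the end of the passage replaces this single dot product with richer interactions between query and text representations (e.g., attention over multiple vectors per text).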