Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2019
DOI: 10.1145/3292500.3330658
Applying Deep Learning to Airbnb Search

Abstract: The application of deep learning to search ranking was one of the most impactful product improvements at Airbnb. But what comes next after you launch a deep learning model? In this paper we describe the journey beyond, discussing what we refer to as the ABCs of improving search: A for architecture, B for bias and C for cold start. For architecture, we describe a new ranking neural network, focusing on the process that evolved our existing DNN beyond a fully connected two-layer network. On handling positional bias…
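The abstract refers to a baseline "fully connected two layer network" for ranking. As a hedged illustration only, the forward pass of such a network can be sketched as below; the layer sizes, ReLU activation, and random weights are assumptions for the sketch, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def two_layer_score(x, W1, b1, W2, b2):
    """Score one listing's feature vector with a two-layer fully connected net.

    Hidden layer uses ReLU; the output is a single scalar relevance score.
    """
    h = np.maximum(0.0, x @ W1 + b1)  # hidden layer with ReLU activation
    return float(h @ W2 + b2)         # linear output: one relevance score

# Illustrative sizes: 8 input features, 16 hidden units (assumed, not from the paper).
n_features, hidden = 8, 16
W1 = rng.normal(scale=0.1, size=(n_features, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=hidden)
b2 = 0.0

x = rng.normal(size=n_features)   # one listing's feature vector
score = two_layer_score(x, W1, b1, W2, b2)
```

Listings would then be ranked by sorting on this scalar score.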


Cited by 74 publications (51 citation statements). References 23 publications.
“…For some models such as the autoencoder, we found that log based transformations are helpful in improving performance and speeding up training. This type of behavior for neural network models has been reported previously [17]. For ease in explanation, we denote the set A to be the indices of the six features that are the most important features in our pricing algorithm, e.g., price, cost, average historical price.…”
Section: Features
Confidence: 81%
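The citing authors note that log-based transformations of heavy-tailed inputs (such as price) improved performance and sped up training of their neural models. A minimal sketch of that idea, assuming hypothetical feature values:

```python
import numpy as np

# Hypothetical heavy-tailed feature: nightly prices with a large outlier.
prices = np.array([50.0, 120.0, 3500.0, 80.0])

# log1p (i.e., log(1 + x)) compresses the tail while keeping zero-valued
# features finite, which tends to make neural-network training easier.
log_prices = np.log1p(prices)
```

The transform preserves the ordering of values while shrinking the dynamic range the network must handle.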
“…However, the Mart-based models consume more memory. Nevertheless, uRank, uBoost, and urBoost models would have an advantage over Mart-based models by seamlessly linking to diverse deep learning embedding techniques when plain text features are present [19,29]. An interesting future direction would be leveraging the strengths of neural networks and trees.…”
Section: Results
Confidence: 99%
“…Organizational resources are becoming increasingly common across organizations as teams rely more heavily on ML [29,30,75]. We demonstrate that leveraging these auxiliary resources provides opportunity to train better models for related tasks of new modalities.…”
Section: Feature Generation
Confidence: 99%