Proceedings of the Recommender Systems Challenge 2017
DOI: 10.1145/3124791.3124794
Practical Lessons for Job Recommendations in the Cold-Start Scenario

Cited by 23 publications (19 citation statements)
References 14 publications
“…Due to the large number of possible category variables, the converted one-hot features are usually of high dimensionality but sparse [11], and simply using raw features rarely provides optimal results. On this occasion, the interactions among different features act as the winning formula for a wide range of data mining tasks [7], [12], [13]. The interactions among multiple raw features are usually termed as cross features [7] (a.k.a.…”
Section: Introduction
confidence: 99%
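The one-hot sparsity and cross features this excerpt describes can be illustrated with a short sketch. The field names, vocabularies, and record below are hypothetical, chosen only to mirror the (junior, engineer) example from the citing text; this is a minimal illustration, not the cited paper's pipeline.

```python
# Minimal sketch: one-hot encoding of categorical fields and an explicit
# cross feature. Field values and vocabularies are illustrative only.
from itertools import product

# Hypothetical categorical fields from a job-recommendation log.
seniority_vocab = ["junior", "senior"]
role_vocab = ["engineer", "lecturer", "analyst"]

def one_hot(value, vocab):
    """Return a one-hot vector: high-dimensional and sparse as |vocab| grows."""
    return [1 if v == value else 0 for v in vocab]

# Raw (first-order) features for one record.
record = {"seniority": "junior", "role": "engineer"}
raw = one_hot(record["seniority"], seniority_vocab) + one_hot(record["role"], role_vocab)

# A second-order cross feature enumerates pairs of field values, e.g.
# (junior, engineer); the cross vocabulary grows multiplicatively, which is
# why manual cross-feature engineering becomes costly.
cross_vocab = list(product(seniority_vocab, role_vocab))
cross = one_hot((record["seniority"], record["role"]), cross_vocab)

print(raw)    # [1, 0, 1, 0, 0]
print(cross)  # [1, 0, 0, 0, 0, 0]
```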
“…In short, there are two major trends of improvements over the plain FM. One is to make the model "deep" with multi-layer network structures in order to exhaustively extract useful information from feature interactions, e.g., the residual network in DeepCross [7], the pairwise product layer in PNN [21], and the compressed interaction network in xDeepFM [12]. The other is to make the model "wide" by considering multiple feature interactions in varied domains (usually coupled with "deep" structures), e.g., separately modelling user logs and texts with CoFM [15], or fusing shallow low-order output with dense high-order output via Wide&Deep [18], DeepFM [20] and xDeepFM [19].…”
Section: Introduction
confidence: 99%
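The "wide" fusion this excerpt mentions, combining a shallow low-order output with a dense high-order output, can be sketched as follows. The shapes, the single hidden layer, and summing the two logits before the sigmoid are simplifying assumptions; none of the cited architectures (Wide&Deep, DeepFM, xDeepFM) is reproduced exactly.

```python
# Sketch of the "wide + deep" fusion idea: a shallow linear term over raw
# sparse features is summed with a dense high-order term from a tiny MLP
# over feature embeddings. All sizes and weights are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_features, k = 5, 4                        # raw feature count, embedding size

x = np.array([1., 0., 1., 0., 0.])          # sparse one-hot record
w = rng.normal(size=n_features)             # wide part: linear weights
E = rng.normal(size=(n_features, k))        # embeddings for the deep part

wide_logit = w @ x                          # shallow, low-order signal

h = np.tanh(E.T @ x)                        # deep part: pooled embeddings -> hidden
v = rng.normal(size=k)                      # output weights of the tiny MLP
deep_logit = v @ h                          # dense, high-order signal

prob = 1.0 / (1.0 + np.exp(-(wide_logit + deep_logit)))  # joint prediction
print(prob)
```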
“…Due to the large number of possible category variables, the converted one-hot features are usually of high dimensionality but sparse [69], and simply using raw features rarely provides optimal results. On this occasion, the interactions among different features act as the winning formula for a wide range of data mining tasks [95]–[97]. […] information for user profiling with cross features, such as (junior, engineer) and (senior, lecturer). To avoid the high cost of task-specific manual feature engineering, factorization machines (FMs) [20] are proposed to embed raw features into a latent space, and model the interactions among features via the inner product of their embedding vectors.…”
Section: Predictive Analytics Under Sparsity
confidence: 99%
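The FM formulation this excerpt refers to can be made concrete. The sizes, random embeddings, and record below are assumptions for illustration; the O(nk) identity is the standard reformulation of the pairwise sum from Rendle's FM paper.

```python
# Minimal sketch of the FM second-order term: each raw feature i gets an
# embedding v_i, and the interaction weight for a pair (i, j) is the inner
# product <v_i, v_j>. All sizes and values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_features, k = 5, 4
x = np.array([1., 0., 1., 0., 0.])      # sparse one-hot record
V = rng.normal(size=(n_features, k))    # latent embeddings, one row per feature

# Naive O(n^2) form: sum over pairs of <v_i, v_j> * x_i * x_j.
pairwise = sum(
    (V[i] @ V[j]) * x[i] * x[j]
    for i in range(n_features) for j in range(i + 1, n_features)
)

# Equivalent O(n*k) reformulation:
# 0.5 * sum_f [ (sum_i v_if x_i)^2 - sum_i v_if^2 x_i^2 ]
fast = 0.5 * float(((V.T @ x) ** 2 - (V ** 2).T @ x ** 2).sum())

assert np.isclose(pairwise, fast)
print(pairwise)
```

The embedding trick is what lets FMs estimate weights for feature pairs never observed together: the pair weight is composed from two independently learned vectors rather than stored as its own parameter.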
“…However, these […] In short, there are two major trends of improvements over the plain FM. One is to make the model "deep" with multi-layer network structures in order to exhaustively extract useful information from feature interactions, e.g., the residual network in DeepCross [95], the pairwise product layer in PNN [99], and the compressed interaction network in xDeepFM [96]. The other is to make the model "wide" by considering multiple feature interactions in varied domains (usually coupled with "deep" structures), e.g., separately modelling user logs and texts with CoFM [84], or fusing shallow low-order output with dense high-order output via Wide&Deep [101], DeepFM [100] and xDeepFM [70].…”
Section: Evolution of Factorization Machines
confidence: 99%