2022
DOI: 10.48550/arxiv.2206.00267
Preprint

LPFS: Learnable Polarizing Feature Selection for Click-Through Rate Prediction

Abstract: In industry, feature selection is a standard but necessary step to search for an optimal set of informative feature fields for efficient and effective training of deep Click-Through Rate (CTR) models. Most previous works measure the importance of feature fields by using their corresponding continuous weights from the model, then remove the feature fields with small weight values. However, removing many features that correspond to small but not exact zero weights will inevitably hurt model performance and not b…
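The abstract contrasts pruning fields by the magnitude of their continuous weights with LPFS's learnable polarizing gate. The sketch below is a minimal illustration assuming a smoothed-ℓ0-style gate of the form w² / (w² + ε) with ε annealed toward zero so that gates polarize to 0 or 1; the exact function, parameterization, and annealing schedule used in LPFS may differ.

```python
import torch
import torch.nn as nn

class SmoothedL0Gate(nn.Module):
    """Field-level polarizing gate sketch (assumed form, not the exact LPFS function).

    Each feature field gets one learnable scalar w. The gate
    g = w^2 / (w^2 + eps) lies in (0, 1) and is pushed toward exactly
    0 or 1 as eps is annealed toward zero, so unselected fields end up
    with (near-)zero contribution instead of small-but-nonzero weights.
    """

    def __init__(self, num_fields: int, eps: float = 1.0):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(num_fields))  # one scalar per field
        self.eps = eps  # anneal toward 0 during training (schedule not shown)

    def forward(self, field_emb: torch.Tensor) -> torch.Tensor:
        # field_emb: (batch, num_fields, emb_dim)
        g = self.weight.pow(2) / (self.weight.pow(2) + self.eps)
        return field_emb * g.view(1, -1, 1)  # scale each field's embedding by its gate
```

Fields whose gates settle at (near) zero can then be dropped outright, avoiding the accuracy loss the abstract attributes to removing fields with small but nonzero weights.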

Cited by 2 publications (8 citation statements)
References 25 publications (57 reference statements)
“…When $g_i^k = 1$, feature $x_i^k$ is in the optimal feature set $\mathcal{X}_{\mathbf{g}}$ and vice versa. Notice that previous work [8, 30, 32] […]. The final prediction can be formulated as follows:…”
Section: Feature Selection
confidence: 99%
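The excerpt cuts off before the actual equation, and the citing paper's exact formulation is not reproduced here. As a hedged illustration only, a generic field-level gated CTR prediction of this kind could be written as

$$\hat{y} = \sigma\big(f_\theta(\mathbf{e}_1 g^1,\ \mathbf{e}_2 g^2,\ \ldots,\ \mathbf{e}_K g^K)\big), \qquad g^k \in \{0, 1\},$$

where $\mathbf{e}_k$ is the embedding of field $k$, $g^k$ is its gate, $f_\theta$ is the backbone CTR model, and $\sigma$ is the sigmoid; all symbol names here are illustrative assumptions, not the citing paper's notation.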
“…3.1.3 Baseline Methods and Backbone Models. We compare the proposed method OptFS with the following feature selection methods: (i) AutoField [32]: This baseline utilizes neural architecture search techniques [15] to select the informative features on a field level; (ii) LPFS [8]: This baseline designs a customized, smoothed-ℓ0-like function to select informative fields on a field level; (iii) AdaFS [13]: This baseline selects the most relevant features for each sample via a novel controller network. We apply the above baselines over the following mainstream backbone models: FM [26], DeepFM [7], DCN [31] and IPNN [24].…”
Footnotes:
1. https://www.kaggle.com/c/criteo-display-ad-challenge
2. http://www.kaggle.com/c/avazu-ctr-prediction
3. http://www.kddcup2012.org/c/kddcup2012-track2/data
Section: Metrics
confidence: 99%
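The excerpt lists field-level selection methods (AutoField, LPFS, AdaFS) applied on top of standard CTR backbones (FM, DeepFM, DCN, IPNN). The wrapper below is a hypothetical sketch, not code from any of the cited papers: it shows where such a selection module typically sits, between the embedding layer and an arbitrary backbone; `gate` could be, for example, the SmoothedL0Gate sketched earlier.

```python
import torch
import torch.nn as nn

class GatedCTRModel(nn.Module):
    """Hypothetical wrapper: embedding layer -> field-level selection gate -> backbone."""

    def __init__(self, emb_dim: int, vocab_sizes: list, gate: nn.Module, backbone: nn.Module):
        super().__init__()
        # one embedding table per feature field
        self.embeddings = nn.ModuleList([nn.Embedding(v, emb_dim) for v in vocab_sizes])
        self.gate = gate          # field-level selection module (e.g. a polarizing gate)
        self.backbone = backbone  # any CTR backbone taking (batch, num_fields, emb_dim) -> logit

    def forward(self, x_cat: torch.Tensor) -> torch.Tensor:
        # x_cat: (batch, num_fields) integer ids, one categorical value per field
        field_emb = torch.stack(
            [emb(x_cat[:, i]) for i, emb in enumerate(self.embeddings)], dim=1
        )                                    # (batch, num_fields, emb_dim)
        gated = self.gate(field_emb)         # suppress fields the gate drives to zero
        return torch.sigmoid(self.backbone(gated))  # predicted click probability
```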