2019
DOI: 10.1609/aaai.v33i01.33014139
Accurate and Interpretable Factorization Machines

Abstract: Factorization Machines (FMs), a general predictor that can efficiently model high-order feature interactions, have been widely used for regression, classification, and ranking problems. However, despite many successful applications of FMs, they have two main limitations: (1) FMs model interactions among input features using only a polynomial expansion, which fails to capture complex nonlinear patterns in the data. (2) Existing FMs do not provide interpretable predictions to users. In this paper, we …
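
For reference, the second-order FM predictor that the abstract alludes to is usually written as follows (standard FM notation, not necessarily the notation used in the paper itself):

    \hat{y}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle \, x_i x_j

where each feature i carries a latent vector v_i and every pairwise interaction weight is factorized as the inner product ⟨v_i, v_j⟩. This multilinear polynomial expansion is the form that limitation (1) refers to.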

Cited by 14 publications (20 citation statements); References 11 publications.
“…-NFM: Neural Factorization Machines (He and Chua, 2017), which combine FM with neural networks. -SEFM: addresses the expressiveness limitation of FM by applying one-hot encoding to each input feature (Lan and Geng, 2019). -Binarized FM: our proposed method.…”
Section: Methods (mentioning)
confidence: 99%
“…As shown in (1), the relationship of individual features and feature interactions to the class label in FM is only linear, and it therefore fails to capture highly nonlinear patterns in the data. Recently, SEFM (Lan and Geng, 2019) was proposed to address the expressiveness limitation of FM. The basic idea of SEFM is to apply one-hot encoding to each individual feature.…”
Section: Preliminaries (mentioning)
confidence: 99%
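
A minimal sketch of the encoding step described in the excerpt above: each continuous feature is discretized into a small number of bins and the bin membership is one-hot encoded, so that an FM over the encoded features can express piecewise-constant (nonlinear) effects. The equal-width binning, the bin count, and the function name below are illustrative assumptions rather than the paper's exact scheme.

    import numpy as np

    def one_hot_encode_features(X, n_bins=5):
        # Discretize every column of X into n_bins equal-width bins and one-hot
        # encode the bin index; output shape is (n_samples, n_features * n_bins).
        X = np.asarray(X, dtype=float)
        n_samples, n_features = X.shape
        encoded = np.zeros((n_samples, n_features * n_bins))
        for j in range(n_features):
            col = X[:, j]
            edges = np.linspace(col.min(), col.max(), n_bins + 1)
            # Use interior edges only, so bin indices fall in [0, n_bins - 1].
            idx = np.clip(np.digitize(col, edges[1:-1]), 0, n_bins - 1)
            encoded[np.arange(n_samples), j * n_bins + idx] = 1.0
        return encoded

    # Example: three samples, two continuous features -> 2 * 3 = 6 binary columns.
    X = np.array([[0.1, 3.0], [0.5, 1.0], [0.9, 2.0]])
    print(one_hot_encode_features(X, n_bins=3).shape)  # (3, 6)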
“…We now come to the optimization of the min-max objective function obj(·, ·) in Eq. (15). Note that if the parameter p is given, the problem is then cast as the minimization of a non-convex function obj(·, p).…”
Section: Optimization (mentioning)
confidence: 99%
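
As a rough illustration of the alternating structure this excerpt describes (fixing p reduces the min-max problem to a non-convex minimization), here is a generic sketch. The objective obj, its gradients, the update rules, and all hyperparameters are placeholders, since Eq. (15) itself is not reproduced in this excerpt.

    import numpy as np

    def alternating_min_max(grad_theta, grad_p, theta0, p0,
                            lr_theta=0.01, lr_p=0.01,
                            inner_steps=10, outer_steps=100):
        # With p held fixed, take gradient-descent steps on theta (the inner,
        # non-convex minimization); then take one gradient-ascent step on p.
        theta = np.array(theta0, dtype=float)
        p = np.array(p0, dtype=float)
        for _ in range(outer_steps):
            for _ in range(inner_steps):
                theta -= lr_theta * grad_theta(theta, p)
            p += lr_p * grad_p(theta, p)
        return theta, p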