2010 IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2010.164

One-Class Matrix Completion with Low-Density Factorizations

Abstract: Consider a typical recommendation problem. A company has historical records of products sold to a large customer base. These records may be compactly represented as a sparse customer-times-product "who-bought-what" binary matrix. Given this matrix, the goal is to build a model that provides recommendations for which products should be sold next to the existing customer base. Such problems may naturally be formulated as collaborative filtering tasks. However, this is a one-class setting, that is, the o…
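To make the one-class setup concrete, here is a minimal sketch of a common baseline for such a binary "who-bought-what" matrix: weighted alternating least squares, where observed purchases get full weight and the unobserved entries are treated as weak negatives. The function name, weights, and hyperparameters are illustrative assumptions; this is a generic one-class baseline, not the paper's low-density factorization.

```python
import numpy as np

def one_class_wals(R, rank=2, w_neg=0.05, n_iters=20, reg=0.1, seed=0):
    """Weighted ALS sketch for a binary purchase matrix R (users x items).

    Observed purchases (R[i, j] == 1) get weight 1; unobserved entries are
    treated as weak negatives with small weight w_neg. Illustrative only.
    """
    rng = np.random.default_rng(seed)
    m, n = R.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    W = np.where(R > 0, 1.0, w_neg)  # per-entry confidence weights
    I = reg * np.eye(rank)
    for _ in range(n_iters):
        # Solve each user's least-squares subproblem with items fixed.
        for i in range(m):
            Wi = np.diag(W[i])
            U[i] = np.linalg.solve(V.T @ Wi @ V + I, V.T @ (W[i] * R[i]))
        # Then each item's subproblem with users fixed.
        for j in range(n):
            Wj = np.diag(W[:, j])
            V[j] = np.linalg.solve(U.T @ Wj @ U + I, U.T @ (W[:, j] * R[:, j]))
    return U, V
```

Recommendations are then read off the reconstruction U @ V.T: unobserved entries with high predicted scores are the products to offer next.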

Cited by 73 publications (32 citation statements)
References 11 publications
“…We will include one ranking-loss approach in our empirical comparison. In [27], a variable is simultaneously optimized to decide if a missing entry is negative or not. To stay focused, in the current work we do not discuss this approach.…”
mentioning
confidence: 99%
“…In this paper, we focus on the uniform strategy, and will refer to this approach as iZAN in the following. Note that a similar idea was also implicitly explored in [31].…”
Section: Preliminaries
mentioning
confidence: 95%
“…Paquet and Koenigstein [25] propose a sampling-based method where the degree distributions of users/items are preserved. Sindhwani et al [31] propose to treat the unobserved data as optimization variables, which is essentially the imputation-based method.…”
Section: Related Work
mentioning
confidence: 99%
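The statement above describes the paper's idea of treating unobserved entries as optimization variables. The following is a loose sketch of that imputation-based view: missing targets Z are optimized jointly with the factors and nudged toward a low mean density rho. The penalty form, update rule, and hyperparameters are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def impute_and_factor(R, mask, rank=3, lr=0.05, lam=0.1, rho=0.05,
                      n_iters=300, seed=0):
    """Sketch: jointly fit factors U, V and imputed targets Z for missing cells.

    mask[i, j] == True marks observed entries of R; elsewhere the target is
    the variable Z, pulled toward a low density rho. Illustrative only.
    """
    rng = np.random.default_rng(seed)
    m, n = R.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    Z = np.full((m, n), rho)           # imputed targets for missing cells
    for _ in range(n_iters):
        T = np.where(mask, R, Z)       # current completed target matrix
        E = U @ V.T - T                # residual against the targets
        gU = E @ V + lam * U
        gV = E.T @ U + lam * V
        U -= lr * gU
        V -= lr * gV
        # Closed-form missing targets: argmin_z (x - z)^2 + lam (z - rho)^2,
        # clipped to the valid [0, 1] range for binary data.
        Z = np.clip((U @ V.T + lam * rho) / (1 + lam), 0.0, 1.0)
    return U, V, Z
```

The imputed matrix Z then serves double duty: it completes the training targets and directly scores unobserved entries for recommendation.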
“…Hanhuai and Arindam used a Gaussian prior with an arbitrary mean over u and v, and combined the correlated topic model and latent Dirichlet allocation work with their matrix model [6]. Vikas, Serhat, Jianying and Aleksandra introduced a new binary matrix, in which 0 means 'not purchased' and 1 means 'purchased', into the factor-learning process in non-negative Matrix Factorization (NMF) approach [7]. Compared to the above algorithms, the proposed new MF approach in this paper concentrates on transforming the original rating matrix and modifying already-predicted ratings using item attributes.…”
Section: Matrix Factorization
mentioning
confidence: 99%
“…Most MF research improves accuracy by taking the data (e.g. movie ratings) as the point of penetration, without considering the impact of item attributes [5], [6], [7]. Although these new MF approaches can provide more accurate ratings, an improvement from a different perspective will be necessary when they reach a bottleneck.…”
Section: Introduction
mentioning
confidence: 99%