2017 IEEE International Conference on Data Mining Workshops (ICDMW)
DOI: 10.1109/icdmw.2017.89
Forecasting of Commercial Sales with Large Scale Gaussian Processes

Abstract: This paper argues that there has not been enough discussion in the field of applications of Gaussian Processes for the fast-moving consumer goods industry. Yet this technique can be important: it can, e.g., provide automatic feature relevance determination, and the posterior mean can unlock insights into the data. Significant challenges are the large size and high dimensionality of commercial data at the point of sale. The study reviews approaches to Gaussian Process modeling for large data sets, evaluates th…

Cited by 14 publications (10 citation statements); references 41 publications.
“…∈ [0, 1] is an inverse link function that squashes f into the class probability space. Differently, the multi-class GPC (MGPC) [228] with y ∈ {1, …, C} is f^c ∼ GP(0, k^c), p(y|f) = Categorical(π(f)) (37), where {f^c}_{c=1}^{C} are independent latent functions for the C classes, and f = [f^1, …, f^C]^T : R^d → R^C. Due to the non-Gaussian likelihood, exact inference for GPC is intractable, thus requiring approximate inference, the key of which is approximating the non-Gaussian posterior p(f|y) ∝ p(y|f)p(f) with a Gaussian q(f|y) [227].…”
Section: Scalable GP Classification
confidence: 99%
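The categorical link described in this snippet can be sketched in a few lines. This is an illustration only, assuming a softmax-style inverse link π(f) (the cited works may use other links): C latent function values per input are squashed into class probabilities that sum to one.

```python
import numpy as np

def softmax_link(F):
    """F: (n, C) array of latent values f^c(x_i), one column per class.
    Returns an (n, C) array of class probabilities pi(f)."""
    Z = F - F.max(axis=1, keepdims=True)   # shift for numerical stability
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

# Hypothetical latent draws standing in for C = 3 independent GPs at 4 inputs.
rng = np.random.default_rng(0)
F = rng.normal(size=(4, 3))
P = softmax_link(F)
print(P.sum(axis=1))   # each row sums to 1
```

Because the likelihood p(y|f) built from these probabilities is non-Gaussian, the posterior over f is no longer Gaussian, which is exactly why the approximate-inference schemes discussed in the snippet are needed.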
“…This sidesteps the non-Gaussian likelihood. A more principled way, however, is adopting the GPCs in (36) and (37) and combining approximate inference, e.g., Laplace approximation, EP and VI, with the sparse strategies in Section III-C to derive scalable GPCs [90], [233]–[237]. The main challenges of scalable GPC, especially MGPC, are: (i) the intractable inference and posterior, and (ii) the high training complexity for a large C. For the first issue, the stochastic GPC derives the model evidence expressed as an integral over a one-dimensional Gaussian distribution, which can be adequately calculated using Gauss–Hermite quadrature [234], [238].…”
Section: Scalable GP Classification
confidence: 99%
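The one-dimensional Gaussian integral mentioned at the end of this snippet is a standard target for Gauss–Hermite quadrature. A minimal sketch, using a logistic likelihood term as a hypothetical stand-in for the per-point integrand:

```python
import numpy as np

def gauss_hermite_expectation(g, mu, sigma, order=32):
    """Approximate E[g(f)] for f ~ N(mu, sigma^2) via Gauss-Hermite quadrature."""
    # Nodes/weights for the weight function exp(-x^2) (physicists' convention).
    x, w = np.polynomial.hermite.hermgauss(order)
    f = mu + np.sqrt(2.0) * sigma * x      # change of variables to N(mu, sigma^2)
    return (w * g(f)).sum() / np.sqrt(np.pi)

logistic = lambda f: 1.0 / (1.0 + np.exp(-f))
approx = gauss_hermite_expectation(logistic, mu=0.0, sigma=1.0)
print(approx)   # close to 0.5, by symmetry of N(0, 1) and the logistic function
```

In the scalable-GPC setting, one such quadrature per data point replaces an otherwise intractable expectation inside the evidence bound, which keeps the per-point cost at O(order) regardless of the data-set size.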
“…The aim of this research is to present a data pre-processing technique accessible to non-technical business experts. Traditional forecasting methods are still the primary tool for over 40% of demand planners in the industry [10], and novel machine learning methods remain a promising area with little academic research and insufficient effort to expose practitioners to them [24], [25].…”
Section: Introduction
confidence: 99%