2019
DOI: 10.2139/ssrn.3335536
Autoencoder Asset Pricing Models

Cited by 58 publications (72 citation statements)
References 32 publications
“…36 Using stocks with missing characteristic information requires data imputation based on model assumptions. Gu et al (2019) replace a missing characteristic with the cross-sectional median of that characteristic during that month. However, this approach introduces an additional source of error and, because it ignores the dependency structure in the characteristic space, creates artificial time-series fluctuations in the characteristics, which we want to avoid.…”
Section: One Characteristic and One Macroeconomic State Process
confidence: 99%
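The imputation scheme this statement criticizes can be sketched in a few lines. The panel layout below (a `date` column identifying the month plus one column per characteristic) is an assumption for illustration, not the authors' actual data format:

```python
import pandas as pd

def impute_cross_sectional_median(panel: pd.DataFrame, char_cols: list[str]) -> pd.DataFrame:
    """Replace each missing characteristic with that month's cross-sectional median.

    `panel` is assumed to hold one row per stock-month, with a 'date' column
    marking the month; this layout is illustrative only.
    """
    out = panel.copy()
    for col in char_cols:
        # The median is taken across stocks within the same month (the cross-section).
        monthly_median = out.groupby("date")[col].transform("median")
        out[col] = out[col].fillna(monthly_median)
    return out
```

Note the effect the statement objects to: the fill value changes from month to month with the cross-section, so an otherwise-missing series acquires artificial time-series variation.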
“…Our approach also yields the conditional mean-variance efficient portfolio, but based on all stocks. Gu et al (2019) extend the linear conditional factor model of Kelly et al (2018) to a non-linear factor model using an autoencoder neural network.…”
Section: Introduction
confidence: 99%
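To make the extension concrete: in the linear model, loadings are a linear function of characteristics; the autoencoder version replaces that with a small neural network. A minimal NumPy sketch of the forward pass follows — the shapes, the single hidden layer, and all weights are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

N, P, K, H = 50, 10, 3, 16  # stocks, characteristics, latent factors, hidden units

# Hypothetical weights for the loadings network b(.): P -> H -> K.
W1, c1 = rng.normal(size=(P, H)) * 0.1, np.zeros(H)
W2, c2 = rng.normal(size=(H, K)) * 0.1, np.zeros(K)

def loadings(Z: np.ndarray) -> np.ndarray:
    """Non-linear loadings b(Z_{i,t-1}) via one hidden ReLU layer (illustrative)."""
    hidden = np.maximum(Z @ W1 + c1, 0.0)  # ReLU activation
    return hidden @ W2 + c2                # N x K matrix of factor loadings

Z = rng.normal(size=(N, P))  # lagged characteristics Z_{i,t-1}
f = rng.normal(size=K)       # latent factor realizations f_t
r_hat = loadings(Z) @ f      # model-implied returns: r_{i,t} = b(Z_{i,t-1})' f_t
```

In the linear special case the hidden layer disappears and `loadings` reduces to `Z @ W`, recovering a Kelly et al (2018)-style specification.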
“…Pelger and Xiong (2019) estimate model (21) by minimizing a local version of the least-squares criterion underlying PCA, where localization is implemented by kernel smoothing. In practice, the number of conditioning variables that we can accommodate is small. Among the nonparametric approaches, some recent work takes advantage of machine learning methods to achieve greater flexibility in the modeling of time-varying betas and to accommodate the large dimensionality of the set of potential characteristics and state variables. Gu et al (2019) consider the setting where the loadings are a nonparametric function of a large-dimensional vector of characteristics, b_{i,t} = b(Z_{i,t-1}), and use an autoencoder to estimate this relationship.…”
confidence: 99%
“…consider the setting where the loadings are a nonparametric function of a large-dimensional vector of characteristics, b_{i,t} = b(Z_{i,t-1}), and use an autoencoder to estimate this relationship. An autoencoder is a class of universal approximators in the realm of artificial neural networks (see Gu et al (2019) and references therein). Using L_b hidden layers and an activation function g, each component of the loadings vector is approximated as:…”
confidence: 99%
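The formula the quotation cuts off presumably refers to the standard feedforward composition of L_b hidden layers; a generic sketch of that recursion, with notation assumed rather than copied from the source, is:

```latex
% Generic feedforward approximation of one loading component b_k(Z_{i,t-1}),
% with L_b hidden layers and activation function g:
h^{(0)} = Z_{i,t-1}, \qquad
h^{(\ell)} = g\!\left( W^{(\ell)} h^{(\ell-1)} + c^{(\ell)} \right),
\quad \ell = 1, \dots, L_b, \qquad
b_k(Z_{i,t-1}) = w_k^{(L_b+1)\top} h^{(L_b)} + c_k^{(L_b+1)}.
```

Here each W^{(\ell)} is a weight matrix, each c^{(\ell)} a bias vector, and the final layer is affine; this is the generic form, not necessarily the exact parameterization in the cited paper.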