2019
DOI: 10.2139/ssrn.3465357

Large Dimensional Latent Factor Modeling with Missing Observations and Applications to Causal Inference

Abstract: This paper develops the inferential theory for latent factor models estimated from large dimensional panel data with missing observations. We estimate a latent factor model by applying principal component analysis to an adjusted covariance matrix estimated from partially observed panel data. We derive the asymptotic distribution for the estimated factors, loadings and the imputed values under a general approximate factor model. The key application is to estimate counterfactual outcomes in causal inference from…
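The abstract describes the estimation pipeline only at a high level: apply principal component analysis to an adjusted covariance matrix built from the partially observed panel, then recover factors, loadings, and imputed values. The Python sketch below illustrates that general workflow under simplifying assumptions; the function name fit_factor_model_missing, the pairwise-complete covariance adjustment, and the per-period least-squares step are illustrative choices, not the paper's exact estimator.

```python
import numpy as np

def fit_factor_model_missing(X, k):
    """Sketch: latent factor estimation from a partially observed T x N
    panel X (np.nan marks missing entries) with k factors.  Hypothetical
    helper, not the paper's exact estimator."""
    T, N = X.shape
    obs = ~np.isnan(X)                                  # observation indicator
    Xz = np.where(obs, X, 0.0)                          # zero-fill for products

    # Pairwise-complete covariance: average each cross-product only over
    # the periods in which both series are observed.
    overlap = obs.T.astype(float) @ obs.astype(float)   # N x N overlap counts
    cov = (Xz.T @ Xz) / np.maximum(overlap, 1.0)

    # Loadings: top-k eigenvectors of the adjusted covariance matrix.
    _, eigvec = np.linalg.eigh(cov)                     # ascending order
    Lam = eigvec[:, ::-1][:, :k] * np.sqrt(N)           # N x k loadings

    # Factors: regress each period's observed entries on the loadings.
    F = np.full((T, k), np.nan)
    for t in range(T):
        o = obs[t]
        if o.sum() >= k:
            F[t], *_ = np.linalg.lstsq(Lam[o], X[t, o], rcond=None)

    # Impute missing cells from the estimated common component.
    X_imputed = np.where(obs, X, F @ Lam.T)
    return F, Lam, X_imputed
```

This sketch treats every observed entry symmetrically; the paper's estimator additionally re-weights the data so that the probability of an observation being missing may depend on observables (see the citation statements below).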

Cited by 12 publications (8 citation statements)
References 65 publications
“…Our analysis is similar in objective to that of Su et al (2019) and Xiong and Pelger (2019) but differs in two ways. First, our results hold for arbitrary types of missing data and do not require assumptions about the missing data mechanism.…”
Section: Missing Data (mentioning)
confidence: 59%
“…A finding that emerges from Su et al (2019) is that imputation noise from the initial estimation will affect all subsequent factor estimates, a consequence of the fact that principal components are weighted averages of all data, including the imputed ones. Xiong and Pelger (2019) also initialize the missing values to zero but re-weight the data to remove bias while allowing the probability of missing data to depend on observables. However, the inferential theory assumes that the probabilities are known.…”
Section: Missing Data (mentioning)
confidence: 99%
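The statement above highlights that Xiong and Pelger (2019) zero-fill missing values and re-weight the data to remove bias, with the inferential theory assuming the observation probabilities are known. The sketch below, assuming known observation probabilities that are independent across series, illustrates that inverse-probability-weighting idea; ipw_adjusted_covariance and its exact weights are illustrative, not the paper's formulas.

```python
import numpy as np

def ipw_adjusted_covariance(X, p):
    """Sketch: inverse-probability-weighted covariance for a T x N panel X
    with np.nan marking missing entries.  p is a T x N matrix of KNOWN
    observation probabilities, assumed independent across series.
    Hypothetical helper, not the paper's exact estimator."""
    T, N = X.shape
    obs = (~np.isnan(X)).astype(float)
    Xz = np.where(obs > 0, X, 0.0)              # zero-fill missing entries

    # Off-diagonal entries (i != j): weight each cross-product by the
    # inverse probability that BOTH series are observed, p_ti * p_tj.
    Y = obs * Xz / p
    cov = Y.T @ Y / T

    # Diagonal entries: a squared term is observed with probability p_ti
    # (not p_ti**2), so the correct weight is 1 / p_ti.
    np.fill_diagonal(cov, (obs * Xz**2 / p).sum(axis=0) / T)
    return cov
```

Applying principal component analysis to this re-weighted matrix then yields loadings as in the earlier sketch; the requirement that the probabilities be known is exactly the limitation the quoted statement points out.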
“…Single imputation refers to the use of a specific number (i.e., a best guess) in place of each missing value; examples include k-nearest neighbor, mean/median imputation, smoothing, interpolation [21], and splines. Matrix completion and matrix factorization [5,6,7,24,20] are often used to impute panel data, as is latent factor modelling [10,2,33]. Recently, there has been a surge in the application of recurrent neural networks for imputation [9,11,22,8], as well as generative adversarial networks [34].…”
Section: Related Literature (mentioning)
confidence: 99%
“…Versions of this model have been considered by Hsiao, Steve Ching and Ki Wan (2012), Gobillon and Magnac (2016), Athey, Bayati, Doudchenko, Imbens and Khosravi (2017), Li and Bell (2017), Xu (2017), Li (2018), Bai and Ng (2019a), Xiong and Pelger (2019), and Chan and Kwok (2020). Example 1 is a special case with…”
Section: Model and Effects of Interest (mentioning)
confidence: 99%