2018
DOI: 10.48550/arxiv.1810.03180
Preprint

Simple Inference on Functionals of Set-Identified Parameters Defined by Linear Moments

Abstract: This paper considers uniformly valid (over a class of data generating processes) inference for linear functionals of partially identified parameters in cases where the identified set is defined by linear (in the parameter) moment inequalities. We propose a bootstrap procedure for constructing uniformly valid confidence sets for a linear functional of a partially identified parameter. The proposed method amounts to bootstrapping the value functions of a linear optimization problem, and subsumes subvector inference…
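To make the mechanics behind "bootstrapping the value functions of a linear optimization problem" concrete, the sketch below resamples the data, recomputes the sample moment inequalities, re-solves the two linear programs that bound a linear functional p'θ, and reads off percentile endpoints. It is a toy illustration under assumed notation (observation-level arrays A_obs and b_obs, a box constraint, scipy's linprog), not the authors' procedure, which adds the corrections needed for uniform validity.

# Minimal sketch, NOT the paper's procedure: it only illustrates bootstrapping
# the value functions of the linear programs that bound p'theta over the
# sample analog of {theta : E[A] theta <= E[b]}. Names and array layout are
# illustrative assumptions.
import numpy as np
from scipy.optimize import linprog


def lp_bounds(A_bar, b_bar, p, box):
    """Values of min / max p'theta subject to A_bar @ theta <= b_bar."""
    lo = linprog(c=p, A_ub=A_bar, b_ub=b_bar, bounds=box)
    hi = linprog(c=-p, A_ub=A_bar, b_ub=b_bar, bounds=box)
    return lo.fun, -hi.fun  # assumes both programs are feasible and bounded


def percentile_bootstrap(A_obs, b_obs, p, box, alpha=0.05, B=499, seed=0):
    """Naive percentile bootstrap of the two LP value functions.

    A_obs: (n, J, d) observation-level slopes, b_obs: (n, J) intercepts, so the
    sample moment inequalities are A_obs.mean(0) @ theta <= b_obs.mean(0).
    """
    rng = np.random.default_rng(seed)
    n = A_obs.shape[0]
    lo_hat, hi_hat = lp_bounds(A_obs.mean(0), b_obs.mean(0), p, box)
    draws = np.empty((B, 2))
    for b in range(B):
        idx = rng.integers(0, n, size=n)  # nonparametric resample of observations
        draws[b] = lp_bounds(A_obs[idx].mean(0), b_obs[idx].mean(0), p, box)
    # Percentile endpoints for the lower and upper bound, respectively.
    return (lo_hat, hi_hat), (np.quantile(draws[:, 0], alpha / 2),
                              np.quantile(draws[:, 1], 1 - alpha / 2))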

Cited by 2 publications (8 citation statements)
References 44 publications (65 reference statements)
“…Also, if J = d, then the PPHI assumptions imply LICQ. This clarifies relation to recent work by Cho and Russell (2019) and Gafarov (2019): Both effectively impose LICQ and benefit from this by being able to propose relatively simple inference. However, while stronger than assumptions in CHT, BCS, and certainly Kaido, Molinari, and Stoye (2019), the assumptions exceed those in PPHI only in the sense of excluding "overidentified" support points, i.e.…”
supporting
confidence: 76%
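For reference, since this statement and the ones below turn on LICQ: the following is the textbook Linear Independence Constraint Qualification written for an identified set defined by moment inequalities that are linear in the parameter; the notation (a_j, b_j, Θ_I) is ours, not taken from the quoted papers.

\[
  \Theta_I = \{\, \theta \in \mathbb{R}^d : a_j'\theta \le b_j,\ j = 1,\dots,J \,\},
  \qquad
  \text{LICQ holds at } \theta \in \Theta_I \ \Longleftrightarrow\ \{\, a_j : a_j'\theta = b_j \,\} \text{ is linearly independent.}
\]

Because the moments are linear in θ, the constraint gradients are the constant vectors a_j, so LICQ reduces to a restriction on which inequalities can bind jointly at a support point of the identified set.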
“…The literature on partial identification uses constraint qualifications in many ways: To ensure Hausdorff consistency or rates of convergence for simple estimators of identified sets (Chernozhukov, Hong, and Tamer, 2007; Yildiz, 2012), to justify inference for the full parameter vector θ (Chernozhukov, Hong, and Tamer, 2007) or subvectors (Cho and Russell, 2019; Gafarov, 2019; Pakes, Porter, Ho, and Ishii, 2011), or to justify efficiency bounds (Kaido and Santos, 2014). However, only some of these uses are explicit, making it difficult even for expert readers to compare assumptions.…”
Section: Discussion
mentioning
confidence: 99%
“…While these assumptions can be dispensed with when the researcher's goal is to obtain a confidence set for the partially identified parameter vector that is pointwise or uniformly consistent in level (e.g., Andrews and Soares, 2010), related assumptions reappear when the aim is to obtain a confidence interval for a smooth function of the partially identified parameter vector that is pointwise or uniformly consistent in level (e.g., Pakes et al., 2011, PPHI henceforth; Bugni, Canay, and Shi, 2017, BCS henceforth). Some more recent contributions (Cho and Russell, 2019; Gafarov, 2019) observe a connection to stochastic programming and show that inference becomes much more tractable under the so-called Linear Independence constraint qualification. Some obvious questions arise: How do all these assumptions relate?…”
Section: Introduction
mentioning
confidence: 99%