2019
DOI: 10.48550/arxiv.1911.02146
Preprint
Multi-Item Mechanisms without Item-Independence: Learnability via Robustness

Abstract: We study the sample complexity of learning revenue-optimal multi-item auctions. We obtain the first set of positive results that go beyond the standard but unrealistic setting of item-independence. In particular, we consider settings where bidders' valuations are drawn from correlated distributions that can be captured by Markov random fields or Bayesian networks, two of the most prominent graphical models. We establish parametrized sample complexity bounds for learning an up-to-ε optimal mechanism in both mod…

Cited by 5 publications (9 citation statements)
References 38 publications
“…Last year, Gonczarowski and Weinberg [25] showed that one can learn an almost revenue-optimal ε-BIC mechanism using poly(n, m, 1/ε) samples under the item-independence assumption, where n is the number of bidders and m is the number of items. Brustle et al. [10] generalize the result to settings where the item values are drawn from correlated but structured distributions that can be modeled by either Markov random fields or Bayesian networks. The mechanism they produce is still ε-BIC.…”
Section: Further Related Work
confidence: 95%
“…The mechanism they produce is still ε-BIC. Our transformation can certainly convert these mechanisms from [25, 10] into exactly BIC mechanisms, and the transformation requires poly(Σ_{i∈[n]} |T_i|, 1/ε) many samples. Unfortunately, each |T_i| is already exponential in m in their settings.…”
Section: Further Related Work
confidence: 99%
“…If the buyers' values for different items can be arbitrarily correlated, however, Dughmi et al. [11] proved that exponentially many samples are required. Recently, Brustle et al. [3] explored the regime between independent and arbitrarily correlated distributions and proved polynomial sample complexity bounds for certain families of structured correlated distributions. We leave as a future direction to explore the power of targeted samples in the multi-parameter setting.…”
Section: Related Work
confidence: 99%
“…In the general case when φ_i may not be monotone, we need another ingredient of Myerson's known as ironing. It can be interpreted as identifying an appropriate subset of values V_i for each buyer i, and constructing a distribution D̃_i by rounding values from D_i down to the closest value in V_i. The resulting D̃ is regular.…”
Section: Myerson's Optimal Auction
confidence: 99%
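The rounding step described in this excerpt can be sketched in a few lines: push the mass of a discrete value distribution D_i down onto a chosen grid V_i. This is an illustrative sketch only; the subset V_i and the example distribution are assumptions, not the paper's construction (which picks V_i via Myerson's ironing procedure).

```python
import bisect

def round_down(value, V):
    """Map `value` to the largest element of sorted list V that is <= value
    (values below min(V) are clamped to the smallest grid point)."""
    idx = bisect.bisect_right(V, value) - 1
    return V[max(idx, 0)]

def ironed_distribution(support, probs, V):
    """Round each point of a discrete distribution (support, probs) down
    onto the grid V, accumulating probability mass on the grid points."""
    V = sorted(V)
    ironed = {}
    for v, p in zip(support, probs):
        r = round_down(v, V)
        ironed[r] = ironed.get(r, 0.0) + p
    return ironed

# Example: uniform support {1, 3, 5, 7} rounded onto the grid V = {1, 5}
print(ironed_distribution([1, 3, 5, 7], [0.25, 0.25, 0.25, 0.25], [1, 5]))
# -> {1: 0.5, 5: 0.5}
```

Since every value is rounded down, the resulting distribution is stochastically dominated by the original, which is what makes the revenue comparison in Myerson's argument go through.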
“…A buy-one mechanism M defined by lotteries Λ satisfies the buy-many constraint if for every adaptive buying strategy A there exists a cheaper single lottery λ ∈ Λ dominating it. Revenue. Define Rev_D(M) to be the revenue achieved by mechanism M when the buyer's type is drawn from D. We also use Rev_v(M) to denote the payment of the buyer of type v in mechanism M. Let Opt(D) be the maximum revenue obtained by any truthful mechanism when the buyer's type is drawn from D. Let BuyManyRev(D) be the optimal revenue obtained by any buy-many mechanism for a buyer with type drawn from D.…”
Section: Preliminaries
confidence: 99%
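The revenue notation in this excerpt, Rev_D(M) as the expectation of the per-type payment Rev_v(M) over v ~ D, can be made concrete with a toy sketch. The posted-price mechanism and the two-point distribution below are placeholders of my own, not the paper's mechanisms.

```python
def rev_v(mechanism, v):
    """Rev_v(M): payment of a buyer of type v under a single posted price
    (the buyer purchases iff her value meets the price)."""
    price = mechanism["price"]
    return price if v >= price else 0.0

def rev_D(mechanism, D):
    """Rev_D(M): expected revenue when the type is drawn from the
    discrete distribution D, given as a dict {type: probability}."""
    return sum(p * rev_v(mechanism, v) for v, p in D.items())

D = {1.0: 0.5, 2.0: 0.5}   # toy type distribution
M = {"price": 2.0}         # toy posted-price mechanism
print(rev_D(M, D))         # -> 1.0 (only the type-2 buyer pays, with prob 1/2)
```

Opt(D) and BuyManyRev(D) would then be suprema of this quantity over truthful mechanisms and buy-many mechanisms respectively, which the toy example does not attempt to compute.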