2019
DOI: 10.48550/arxiv.1907.05638
Preprint

Learning Functions over Sets via Permutation Adversarial Networks

Abstract: In this paper, we consider the problem of learning functions over sets, i.e., functions that are invariant to permutations of input set items. Recent approaches of pooling individual element embeddings [34] can necessitate extremely large embedding sizes for challenging functions. We address this challenge by allowing standard neural networks like LSTMs to succinctly capture the function over the set. However, to ensure invariance with respect to permutations of set elements, we propose a novel architecture ca…
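The pooling approach the abstract contrasts against can be illustrated with a minimal sketch: each set element is embedded independently, the embeddings are summed (an order-invariant operation), and a readout network maps the pooled vector to the output. All names, shapes, and weights below are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
W_embed = rng.normal(size=(3, 8))   # per-element embedding weights
W_out = rng.normal(size=(8, 1))     # readout weights

def set_function(X):
    """X: (n_elements, 3) array representing a set of n 3-d items."""
    H = np.tanh(X @ W_embed)        # embed each element independently
    pooled = H.sum(axis=0)          # sum pooling: order-invariant
    return (np.tanh(pooled) @ W_out).item()

X = rng.normal(size=(5, 3))
X_perm = X[rng.permutation(5)]      # reorder the set's elements

# Permuting the input set leaves the output unchanged (up to float error).
assert np.isclose(set_function(X), set_function(X_perm))
```

Because the only interaction between elements is the sum, the output is invariant by construction; the cost, as the abstract notes, is that hard set functions can require very large embedding sizes.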

Cited by 3 publications (4 citation statements) | References 13 publications
“…In theory (Zaheer et al. 2017, Theorem 2), if COMB_{1,2} are given 'sufficient' hidden units, this set representation is universal. In practice, however, commutative-associative aggregation suffers from limited expressiveness (Pabbaraju and Jain 2019; Wagstaff et al. 2019; Garg, Jegelka, and Jaakkola 2020; Cohen-Karlik, David, and Globerson 2020), which degrades the quality of x_u and s(·, ·), as described below. Specifically, their expressiveness is constrained from two perspectives.…”
Section: GNNs and Their Limitations
confidence: 99%
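The limited expressiveness of commutative-associative aggregation that this snippet refers to can be seen in a tiny numeric example: with a 1-dimensional identity embedding, sum pooling maps two distinct multisets to the same pooled value, so no downstream network can tell them apart. Richer element embeddings are required to separate such inputs.

```python
# Two distinct multisets that collide under 1-d sum aggregation.
set_a = [0.0, 2.0]
set_b = [1.0, 1.0]

assert sum(set_a) == sum(set_b)          # pooled representations collide
assert sorted(set_a) != sorted(set_b)    # yet the multisets differ
```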
“…This has motivated the development and application of neural network architectures which are insensitive to the ordering of the jet constituents, where jets are viewed as point clouds: a set of data points in space. The representation of functions on sets from which neural network architectures may be modeled has been much studied recently in the ML community [40, 42-45]. Collider physics studies in this vein include the use of architectures based on Graph Neural Networks such as Ref.…”
Section: Energy Flow Network
confidence: 99%
“…In theory [52, Theorem 2], if COMB_{1,2} are given 'sufficient' hidden units, this set representation is universal. In practice, however, commutative-associative aggregation suffers from limited expressiveness [9, 30, 44], which degrades the quality of x_u and s(·, ·), as described below. Specifically, their expressiveness is constrained from two perspectives.…”
Section: GNNs and Their Limitations
confidence: 99%
“…Cuturi [8] exploited this to solve transportation problems approximately. It was soon realized [26, 30] that row and column scaling transform an arbitrary matrix into a near-permutation matrix, while allowing backpropagation. After the seminal deep sets work of Zaheer et al. [52], several efforts [5, 20, 38, 40, 41] were made to capture dependencies between set elements while retaining order invariance by design [27].…”
Section: Contents
confidence: 99%
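The row/column scaling this snippet describes (Sinkhorn normalization) can be sketched as follows: alternately normalizing the rows and columns of a positive matrix drives it toward a doubly-stochastic matrix, and because every step is differentiable, gradients can flow through it. Lowering `tau` sharpens the result toward a hard permutation matrix. Function and parameter names here are illustrative assumptions, not any paper's exact formulation.

```python
import numpy as np

def sinkhorn(logits, n_iters=100, tau=1.0):
    M = np.exp(logits / tau)                    # strictly positive matrix
    for _ in range(n_iters):
        M = M / M.sum(axis=1, keepdims=True)    # normalize rows
        M = M / M.sum(axis=0, keepdims=True)    # normalize columns
    return M

rng = np.random.default_rng(1)
P = sinkhorn(rng.normal(size=(4, 4)))

# Every row and column sums to (approximately) 1: doubly stochastic.
assert np.allclose(P.sum(axis=1), 1.0)
assert np.allclose(P.sum(axis=0), 1.0)
```

A small `tau` (e.g. 0.1) concentrates each row's mass on a single column, which is what makes the scaled matrix a differentiable stand-in for a permutation.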