2021
DOI: 10.1186/s40649-021-00086-z
Gumbel-softmax-based optimization: a simple general framework for optimization problems on graphs

Abstract: In computer science, there exist a large number of optimization problems defined on graphs, that is, to find the best node-state configuration or network structure such that a designed objective function is optimized under some constraints. However, these problems are notorious for their hardness to solve, because most of them are NP-hard or NP-complete. Although traditional general methods such as simulated annealing (SA), genetic algorithms (GA), and so forth have been applied to these hard problems, their…
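The abstract refers to the Gumbel-softmax trick as the basis for relaxing discrete node-state configurations into a differentiable form. A minimal sketch of that trick is below; the function name, NumPy implementation, and logits are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of the Gumbel-softmax relaxation (illustrative only;
# variable names and the NumPy implementation are our own assumptions).
import numpy as np

def gumbel_softmax_sample(logits, tau=1.0, rng=None):
    """Draw a differentiable, near-one-hot sample from a categorical
    distribution given by `logits`. Lower temperature `tau` pushes the
    sample closer to a discrete one-hot vector."""
    rng = rng or np.random.default_rng()
    # Gumbel(0, 1) noise via the inverse-CDF transform.
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = y - y.max()  # subtract max for numerical stability
    expy = np.exp(y)
    return expy / expy.sum()

# Example: relax a 4-state node variable; the output is a probability
# vector that concentrates on one state as tau shrinks.
sample = gumbel_softmax_sample(np.array([2.0, 0.5, 0.1, -1.0]), tau=0.5)
```

Because the sample is a smooth function of the logits, a graph objective evaluated on it can be optimized by ordinary gradient descent, which is the core idea the paper generalizes.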


Cited by 3 publications (2 citation statements)
References: 28 publications
“…The Gumbel Softmax is widely used for gradient-based optimization in differentiable neural networks. Additionally, it has applications in solving combinatorial problems and addressing inverse problems with discrete multivariate random variables [41], [42].…”
Section: Gumbel Softmax
Citation type: mentioning, confidence: 99%
“…In order to capture multiple modes of the posterior distribution, we optimize multiple Vs in parallel. To do this, we set up steps 1-3 such that x Âs are solved for in parallel [51], where x is equal to the sample size and is calculated according to the size of the inputs (Ã K C ). See Supplementary Information for further explanation.…”
Citation type: mentioning, confidence: 99%