2022
DOI: 10.1613/jair.1.13547

The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality

Abstract: Understanding the computational complexity of training simple neural networks with rectified linear units (ReLUs) has recently been a subject of intensive research. Closing gaps and complementing results from the literature, we present several results on the parameterized complexity of training two-layer ReLU networks with respect to various loss functions. After a brief discussion of other parameters, we focus on analyzing the influence of the dimension d of the training data on the computational complexity. …
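
For context, the training problem studied in this line of work is usually formalized as empirical risk minimization over a two-layer ReLU network. The following formulation is a sketch based on that standard convention rather than text from the paper itself; the number of hidden units k and the loss function ℓ are placeholders:

Given training points (x_1, y_1), …, (x_n, y_n) with x_i ∈ ℝ^d, find output weights a ∈ ℝ^k, hidden weight vectors w_1, …, w_k ∈ ℝ^d, and biases b ∈ ℝ^k solving

\min_{a, w, b} \; \sum_{i=1}^{n} \ell\Big( \sum_{j=1}^{k} a_j \max\big(0, \langle w_j, x_i \rangle + b_j\big), \; y_i \Big).

The parameter analyzed in the abstract is the data dimension d, i.e., how the complexity of this optimization problem scales when d is treated as a fixed parameter rather than part of the input size.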

Cited by 9 publications
References 25 publications