2010 International Conference on Machine Learning and Cybernetics
DOI: 10.1109/icmlc.2010.5580483
The iteration scheme with errors for zero point of m-accretive

Cited by 2 publications (2 citation statements)
References 8 publications
“…Consequently, this may lead to an overly pessimistic problem in practice. 41, 42, 43 Therefore, structural constraints must be introduced on the uncertainty set to overcome this pessimism. Moreover, we specified as the combination of the client index and some of the sensitive attribute(s), rather than using all the attributes.…”
Section: Results
Confidence: 99%
“…In distributionally robust optimisation (DRO) (Ben-Tal et al, 2013, Rahimian and Mehrotra, 2019), one aims to minimise the worst-case expected loss over an ‘uncertainty set’ of distributions. In the group DRO setting (Hu et al, 2018, Oren et al, 2019, Sagawa et al, 2020), this minimisation is simply over the (instantaneous) worst-performing group of examples. In the context of neural network optimisation, given training data already divided into groups, Sagawa et al (2020) minimise this empirical worst-group risk while demonstrating the importance of simultaneously enhancing generalisability through greater regularisation.…”
Section: Introduction
Confidence: 99%
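The group DRO objective described in the excerpt above — minimising the empirical risk of the worst-performing group — can be sketched as follows. This is a minimal illustration, not the cited authors' implementation; the per-example losses and group labels are hypothetical.

```python
import numpy as np

def worst_group_loss(losses, groups):
    """Group DRO objective: the largest mean loss over all groups.

    losses: per-example loss values
    groups: group label for each example
    """
    return max(losses[groups == g].mean() for g in np.unique(groups))

# Hypothetical data: four examples split into two groups.
losses = np.array([0.2, 0.4, 1.1, 0.9])
groups = np.array([0, 0, 1, 1])

# Group 0 has mean loss 0.3, group 1 has mean loss 1.0;
# the group DRO objective takes the worse of the two.
print(worst_group_loss(losses, groups))  # → 1.0
```

Minimising this quantity over model parameters (rather than the plain average loss) is what distinguishes group DRO from standard empirical risk minimisation.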