2019
DOI: 10.1111/sjos.12378

Scalable statistical inference for averaged implicit stochastic gradient descent

Abstract: Stochastic gradient descent (SGD) provides a scalable way to compute parameter estimates in applications involving large-scale data or streaming data. As an alternative, averaged implicit SGD (AI-SGD) has been shown to be more stable and more efficient. Although the asymptotic properties of AI-SGD have been well established, statistical inference based on it, such as interval estimation, remains unexplored. The bootstrap method is not computationally feasible because it requires repeatedly resampling fr…
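
As a rough illustration of the estimator the abstract describes, the sketch below runs averaged implicit SGD on a simple least-squares model. The closed-form implicit step, the learning-rate schedule gamma0 * i**(-alpha), and the function name ai_sgd_linear are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def ai_sgd_linear(X, y, gamma0=1.0, alpha=0.6):
    """Minimal sketch of averaged implicit SGD (AI-SGD) for least squares.

    For the squared-error loss the implicit update has a closed form:
    beta_i = beta_{i-1} + c_i * (y_i - x_i' beta_{i-1}) * x_i,
    with c_i = gamma_i / (1 + gamma_i * ||x_i||^2).  The Polyak-Ruppert
    average of the iterates is returned as the AI-SGD estimate.
    """
    n, p = X.shape
    beta = np.zeros(p)       # implicit SGD iterate
    beta_bar = np.zeros(p)   # running average of the iterates
    for i in range(1, n + 1):
        x_i, y_i = X[i - 1], y[i - 1]
        gamma_i = gamma0 * i ** (-alpha)          # assumed step-size schedule
        resid = y_i - x_i @ beta
        # implicit update, solved in closed form for the linear model
        beta = beta + gamma_i / (1.0 + gamma_i * (x_i @ x_i)) * resid * x_i
        # online Polyak-Ruppert averaging
        beta_bar += (beta - beta_bar) / i
    return beta_bar

# toy usage
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(size=5000)
print(ai_sgd_linear(X, y))  # should be close to beta_true
```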

Cited by 17 publications (19 citation statements) | References 18 publications (32 reference statements)

“…Toulis et al. (2014) proposed an AISGD algorithm that was shown to be more stable than the explicit SGD algorithm. Later, Fang (2019)…”
Section: Existing Methods
confidence: 98%
“…Toulis et al. (2014) proposed an AISGD algorithm that was shown to be more stable than the explicit SGD algorithm. Later, Fang (2019) extended AISGD by adding a random weight $W_i^{(s)}$ to the gradient, resulting in the following implicit SGD procedure:
$$\boldsymbol{\beta}_i^{(s),\mathrm{im}} = \boldsymbol{\beta}_{i-1}^{(s),\mathrm{im}} + \gamma_i W_i^{(s)}\, U\!\bigl(y_i;\, x_i,\, \boldsymbol{\beta}_i^{(s),\mathrm{im}}\bigr), \qquad \boldsymbol{\beta}_i^{(s),\mathrm{aim}} = \frac{1}{i}\sum_{k=1}^{i} \boldsymbol{\beta}_k^{(s),\mathrm{im}}, \quad i = 1,\dots,N_b.$$…”
Section: Existing Methods
confidence: 99%
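
To make the displayed recursion concrete, here is a small sketch of how such a random-weight (perturbation-based) AI-SGD resampling loop might look for a least-squares model. The unit-mean exponential weights, the step-size schedule, and the closed-form weighted implicit step are assumptions for illustration, not the cited paper's exact algorithm or settings.

```python
import numpy as np

def perturbed_ai_sgd(X, y, n_resamples=200, gamma0=1.0, alpha=0.6, rng=None):
    """Sketch of random-weight AI-SGD resampling for least squares.

    For each resample s, i.i.d. random weights W_i^(s) (assumed here to be
    unit-mean exponential) multiply the per-observation gradient.  The
    averaged iterates beta^{(s),aim} across resamples give a Monte Carlo
    approximation to the sampling distribution of the AI-SGD estimator,
    from which interval estimates can be read off via quantiles.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    betas = np.zeros((n_resamples, p))      # implicit iterates, one row per resample
    beta_bars = np.zeros((n_resamples, p))  # their running averages ("aim" iterates)
    for i in range(1, n + 1):
        x_i, y_i = X[i - 1], y[i - 1]
        gamma_i = gamma0 * i ** (-alpha)                  # assumed step-size schedule
        w = rng.exponential(scale=1.0, size=n_resamples)  # W_i^(s), E[W] = 1
        resid = y_i - betas @ x_i
        # weighted implicit update, closed form for the squared-error loss
        step = gamma_i * w / (1.0 + gamma_i * w * (x_i @ x_i))
        betas += step[:, None] * resid[:, None] * x_i
        beta_bars += (betas - beta_bars) / i
    return beta_bars  # rows ~ draws of beta^{(s),aim}; take quantiles for intervals
```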