2021
DOI: 10.1080/24754269.2021.1974158
A review of distributed statistical inference

Abstract: The rapid emergence of massive datasets in various fields poses a serious challenge to traditional statistical methods, while also providing opportunities for researchers to develop novel algorithms. Inspired by the idea of divide-and-conquer, various distributed frameworks for statistical estimation and inference have been proposed to deal with large-scale statistical optimization problems. This paper aims to provide a comprehensive review of the related literature. It includes parametric …
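The divide-and-conquer idea the abstract refers to can be illustrated by the simplest distributed estimator: partition the data across machines, fit the model locally on each partition, and average the local estimates. The sketch below, using a linear model with synthetic data, is purely illustrative (the number of machines, dimensions, and noise level are arbitrary assumptions, not taken from the reviewed papers).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a large dataset from a linear model: y = X @ beta + noise
n, d, k = 100_000, 5, 10            # total samples, features, machines
beta = np.arange(1.0, d + 1.0)      # true coefficients (1, 2, ..., 5)
X = rng.normal(size=(n, d))
y = X @ beta + rng.normal(size=n)

# Divide: partition the data across k machines
X_parts = np.array_split(X, k)
y_parts = np.array_split(y, k)

# Conquer: each machine solves its own local least-squares problem
local_estimates = [np.linalg.lstsq(Xi, yi, rcond=None)[0]
                   for Xi, yi in zip(X_parts, y_parts)]

# Combine: one-shot averaging of the k local estimators
beta_avg = np.mean(local_estimates, axis=0)
print(np.round(beta_avg, 2))
```

Only the k local d-dimensional estimates travel over the network, never the raw data, which is the communication saving that motivates this family of methods; the review surveys refinements that recover full-sample efficiency beyond this naive averaging.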

Cited by 25 publications (13 citation statements); references 43 publications (50 reference statements).
“…Both the MPLE and MLE of the ERGM are consistent for a growing number of networks observed from the same set of fixed nodes (Arnold & Strauss, 1991). The MPLE can be rapidly computed from large networks (Schmid & Desmarais, 2017), if necessary, using divide-and-conquer (Gao et al., 2022; Rosenblatt & Nadler, 2016; Minsker, 2019) or streaming methods (Luo & Song, 2019).…”
Section: Overview Of Network Data Modelingmentioning
confidence: 89%
“…Offline or Online Distributed (Parameter) Estimation. One of the extensively studied distributed estimation settings is the offline setting [22], where each agent obtains multiple i.i.d. samples at the beginning of the learning task.…”
Section: Related Workmentioning
confidence: 99%
“…The literature on distributed learning is flourishing and encompasses a variety of topics: M-estimation (Zhang et al., 2013; Shamir et al., 2014; Lee et al., 2017; Wang et al., 2017; Battey et al., 2018; Jordan et al., 2019; Fan et al., 2021), Principal Component Analysis (Garber et al., 2017; Fan et al., 2019; Chen et al., 2021), feature screening (Li et al., 2020), to name a few. We refer to Gao et al. (2022) for a recent review of the distributed learning literature.…”
Section: Introductionmentioning
confidence: 99%