2014
DOI: 10.48550/arxiv.1410.4307
Preprint

Social Learning and Distributed Hypothesis Testing

Cited by 7 publications (18 citation statements)
References 0 publications
“…The rules agents follow, as well as the structure of the environment, determine their ability to make decisions in a distributed manner. In this paper, we study the distributed non-Bayesian learning model where a group of agents tries to "learn" a hypothesis (from a parametrized family) that best explains some observed data [1], [2], [3], [4], [5], [6], [7], [8]. Observations are realizations of a vector random variable with unknown distribution.…”
Section: Introduction
mentioning
confidence: 99%
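The learning problem this excerpt describes is usually formalized as picking, from the parametrized family, the hypothesis whose distribution is closest in Kullback-Leibler divergence to the unknown distribution generating the observations. A minimal sketch of that standard formulation (the symbols below are illustrative and not taken from the indexed preprint):

\[
  \theta^{\star} \in \arg\min_{\theta \in \Theta}
  D_{\mathrm{KL}}\!\left( P \,\middle\|\, \ell(\cdot \mid \theta) \right),
\]

where $P$ is the unknown distribution of the observed random variable and $\{\ell(\cdot \mid \theta) : \theta \in \Theta\}$ is the parametrized family of candidate models; "best explains the observed data" is read as minimizing this divergence.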
“…learning structure of the problem [2], [4]. Non-asymptotic rates have been recently derived for fixed graphs [3] and time-varying directed graphs [5].…”
Section: Introduction
mentioning
confidence: 99%
“…In a simultaneous, independent effort, the authors in [17], [18] proposed a similar non-Bayesian learning algorithm where a local Bayes update is followed by a consensus step. In [17], a convergence result for fixed graphs is provided and large-deviation convergence rates are given, proving the existence of a random time after which the beliefs concentrate exponentially fast. In [18], similar probabilistic bounds on the rate of convergence are derived for fixed graphs, and comparisons with the centralized version of the learning rule are provided.…”
mentioning
confidence: 99%
“…In [11], local exponential rates of convergence for undirected gossip-like graphs are studied. The authors in [12], [13] proposed a non-Bayesian learning algorithm where a local Bayes update is followed by a consensus step. In [12], a convergence result for fixed graphs is provided and large-deviation convergence rates are given, proving the existence of a random time after which the beliefs concentrate exponentially fast.…”
Section: Introduction
mentioning
confidence: 99%
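The update these last two statements describe, a local Bayesian step followed by a consensus step on the agents' beliefs, can be illustrated with a short sketch. This is a generic rendering of that class of algorithm, assuming geometric (log-linear) averaging of neighbors' beliefs under a row-stochastic weight matrix; the function name, argument layout, and the choice of geometric averaging are assumptions for illustration, not details taken from the cited papers.

import numpy as np

def social_learning_step(beliefs, likelihoods, weights):
    """One round of distributed non-Bayesian learning (illustrative sketch).

    beliefs     : (n_agents, n_hypotheses) array; each row is an agent's current
                  belief over the hypotheses and sums to 1.
    likelihoods : (n_agents, n_hypotheses) array; likelihood of each agent's new
                  private observation under each hypothesis.
    weights     : (n_agents, n_agents) row-stochastic matrix encoding the
                  communication graph (weights[i, j] > 0 iff agent i listens to j).
    """
    # Local Bayes update: multiply the prior belief by the likelihood of the new
    # observation and renormalize each agent's row.
    bayes = beliefs * likelihoods
    bayes = np.clip(bayes, 1e-300, None)          # guard against log(0) below
    bayes /= bayes.sum(axis=1, keepdims=True)

    # Consensus step: geometric averaging of neighbors' updated beliefs, i.e.
    # arithmetic averaging in the log domain, followed by renormalization.
    log_beliefs = weights @ np.log(bayes)
    new_beliefs = np.exp(log_beliefs - log_beliefs.max(axis=1, keepdims=True))
    new_beliefs /= new_beliefs.sum(axis=1, keepdims=True)
    return new_beliefs

Iterating this step is what the quoted results analyze: the beliefs of all agents concentrate on the best-fitting hypotheses, and the exponential rates and random concentration times mentioned above quantify how fast that happens under the graph assumptions of each paper.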