2008 42nd Asilomar Conference on Signals, Systems and Computers
DOI: 10.1109/acssc.2008.5074720

The decentralized estimation of the sample covariance

Abstract: In this paper we consider the problem of estimating the eigenvectors of the sample covariance matrix of decentralized measurements in a distributed fashion. The need for a distributed scheme is motivated by the many moment-based methods that resort to the covariance of the data to extract information from the measurements. For large sensor networks, gathering the data at a central processor generates a communication bottleneck. Our algorithm is based on a combination of the so-called power method, that is used …
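The abstract describes combining the power method with distributed averaging to find covariance eigenvectors. As a centralized point of reference, a minimal sketch of power iteration on a sample covariance matrix (variable names are illustrative, not taken from the paper):

```python
import numpy as np

def top_eigvec_power(X, iters=200, seed=0):
    """Estimate the leading eigenvector of the sample covariance of X.

    X: (N, d) array of N measurements. This is the centralized baseline
    that a distributed scheme would reproduce without gathering X.
    """
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)            # center the measurements
    C = Xc.T @ Xc / X.shape[0]         # sample covariance, (d, d)
    q = rng.standard_normal(C.shape[0])
    q /= np.linalg.norm(q)             # random unit-norm start
    for _ in range(iters):
        q = C @ q                      # power step
        q /= np.linalg.norm(q)         # renormalize
    return q
```

In a sensor network, the bottleneck the abstract mentions is forming `C`: it requires all rows of `X` at one node, which is exactly what the distributed variant avoids.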

Cited by 58 publications (72 citation statements)
References 15 publications (16 reference statements)
“…However, relatively little attention has been paid to the problem of distributed learning of the geometric structure of data in a data-adaptive manner. Most notable exceptions to this include [7]- [11]. While our work as well as [7]- [11] rely on consensus averaging for computing the underlying geometric structure, we are explicit in our formulation that perfect consensus under arbitrary topologies cannot be achieved.…”
Section: Relationship To Previous Work
confidence: 98%
“…The setting studied in some of these works is that data is partitioned horizontally, with each distributed entity responsible for some dimensions of the data [6], [9]. Some of the other works in this direction focus on learning under the assumption of data lying near (linear) subspaces [7], [8], require extensive communications among the distributed entities [10], and ignore some of the technical details associated with processing among distributed entities having interconnections described by graphs of arbitrary topologies [7], [8], [10], [11].…”
Section: A Background
confidence: 99%
“…This method converges to the maximum eigenvector of Ĉ as long as the maximum eigenvalue of Ĉ is strictly greater than the other eigenvalues and the vector q(0), an initial random vector, has a non-zero component in the direction of the eigenvector associated with the largest eigenvalue [15]. For simplicity, let us assume that x̄ = 0.…”
Section: A Consensus On The Rotation And Translation
confidence: 99%
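The convergence condition quoted above (a strictly dominant eigenvalue and a starting vector with a non-zero component along its eigenvector) can be checked numerically. A small sketch with a diagonal matrix whose top eigenvector is the first coordinate axis; the random start has a non-zero first component almost surely:

```python
import numpy as np

rng = np.random.default_rng(1)
C = np.diag([3.0, 1.0, 0.5])   # dominant eigenvalue 3, eigenvector e1
q = rng.standard_normal(3)
q /= np.linalg.norm(q)          # random unit start
for _ in range(100):
    q = C @ q                   # power step
    q /= np.linalg.norm(q)
aligned = abs(q[0])             # |cosine| of the angle to e1
```

The error along the other directions shrinks geometrically, roughly like the ratio of the second-largest to the largest eigenvalue raised to the iteration count, so `aligned` approaches 1.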
“…Unlike [15], we perform the average consensus protocols for x_i(x_i · q_1(n)), i = 1, ..., N, instead of only the inner products and, in our algorithm, each robot does not need to know the number of robots in the team. Finally, we can obtain the estimate of q_1.…”
Section: A Consensus On The Rotation And Translation
confidence: 99%
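The quoted step, averaging x_i(x_i · q_1(n)) across nodes, amounts to computing Ĉ q_1(n) = (1/N) Σ_i x_i x_iᵀ q_1(n) without ever assembling the data centrally. A sketch of one such decentralized power step, with the consensus protocol idealized as an exact average (all function and variable names are hypothetical):

```python
import numpy as np

def distributed_power_step(local_x, q):
    """One decentralized power-method step.

    local_x: list of per-node measurement vectors x_i (assumed zero-mean,
    as in the quoted passage); q: current eigenvector estimate.
    Average consensus is idealized here as an exact mean.
    """
    # each node i forms x_i * (x_i . q) using only its local data ...
    local_terms = [x * (x @ q) for x in local_x]
    # ... and consensus averaging yields (1/N) sum_i x_i x_i^T q = C_hat q
    avg = np.mean(local_terms, axis=0)
    return avg / np.linalg.norm(avg)

def estimate_q1(local_x, iters=100, seed=0):
    """Iterate the distributed step to estimate the top eigenvector q_1."""
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(len(local_x[0]))
    q /= np.linalg.norm(q)
    for _ in range(iters):
        q = distributed_power_step(local_x, q)
    return q
```

Note that, consistent with the citation, averaging the full vectors x_i(x_i · q) rather than only the inner products means no node needs to know N: the consensus average already carries the 1/N factor.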