2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2015.7178400
Sparse sensing for distributed Gaussian detection

Abstract: An offline sampling design problem for Gaussian detection is considered in this paper. The sensing operation is modeled through a vector whose sparsity order is determined by the prescribed global error probability. Since numerical optimization of the error probability is difficult, equivalent simpler costs, viz., the Kullback-Leibler and Bhattacharyya distances, are optimized. The sensing problem is formulated and solved sub-optimally using convex optimization techniques. Furthermore, it is shown t…
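As a concrete illustration of the design problem in the abstract (a minimal sketch, not the paper's algorithm): assuming independent Gaussian measurements under a mean-shift hypothesis pair, the Kullback-Leibler distance is additive across sensors, so the Boolean selection problem relaxes to a linear program solved by sorting. All names below (theta, sigma2, k) are illustrative assumptions.

```python
import numpy as np

def select_sensors_kl(theta, sigma2, k):
    """Sketch: pick k sensors maximizing the Kullback-Leibler distance between
    H0: x_i ~ N(0, sigma2_i) and H1: x_i ~ N(theta_i, sigma2_i).

    With independent noise the KL distance of a selection w in {0,1}^m is
    additive, D(w) = sum_i w_i * theta_i**2 / (2 * sigma2_i); the relaxation
    w in [0,1]^m is a linear program whose optimum keeps the k largest terms.
    """
    d = theta ** 2 / (2.0 * sigma2)      # per-sensor KL contribution
    w = np.zeros(theta.shape[0])
    w[np.argsort(d)[-k:]] = 1.0          # activate the k most informative sensors
    return w, float(d @ w)               # selection vector, achieved KL distance
```

For correlated noise or covariance-shift hypotheses the objective is no longer linear in w, which is where the convex-relaxation machinery the abstract refers to becomes necessary.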

Cited by 14 publications (9 citation statements, 2015–2023) | References 12 publications
“…It is clear from (25) that when an inactive sensor is made active, the increase in Fisher information leads to an information gain in terms of the rank-one matrix given by (25). Such a phenomenon was also discovered in the calculation of sensor utility for adaptive signal estimation [29] and leader selection in stochastically forced consensus networks [12].…”
Section: B. Greedy Algorithm
Confidence: 89%
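The rank-one gain this statement refers to can be made concrete with the Sherman-Morrison identity: if activating sensor j adds c_j α_j α_jᵀ to the Fisher information J, the resulting decrease in tr(J⁻¹) has a closed form. A minimal sketch, with J_inv, alpha, and c as assumed placeholders for the quoted quantities:

```python
import numpy as np

def trace_gain(J_inv, alpha, c):
    """Decrease in tr(J^{-1}) when the rank-one term c * alpha alpha^T is
    added to the Fisher information J (Sherman-Morrison identity):
        tr(J^{-1}) - tr((J + c a a^T)^{-1})
            = c * ||J^{-1} a||^2 / (1 + c * a^T J^{-1} a).
    """
    v = J_inv @ alpha
    return c * (v @ v) / (1.0 + c * (alpha @ v))
```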
“…In particular, the Chernoff information [14,15] is asymptotically (in N) related to the exponential rate of P_e^(N). The Chernoff information turns out to be very useful in many practically important problems, for instance distributed sparse detection [16], sparse support recovery [17], energy detection [18], multi-input multi-output (MIMO) radar processing [19,20], network secrecy [21], the angular resolution limit in array processing [22], and detection performance for informed communication systems [23], just to name a few. In addition, the Chernoff information bound can be tight for a minimal s-divergence over the parameter s ∈ (0, 1).…”
Section: State-of-the-Art and Problem Statement
Confidence: 99%
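As a sketch of the s-divergence optimization mentioned at the end of this statement: for two multivariate Gaussians the Chernoff s-divergence has a closed form, so the Chernoff information follows from a scalar search over s ∈ (0, 1). Function and variable names below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def chernoff_information(mu0, S0, mu1, S1):
    """Chernoff information between N(mu0, S0) and N(mu1, S1), obtained by
    optimizing the Chernoff s-divergence over s in (0, 1):
        D_s = s(1-s)/2 * dm^T S_s^{-1} dm
              + 1/2 * log(|S_s| / (|S0|^{1-s} |S1|^s)),
    with S_s = (1-s) S0 + s S1 and dm = mu1 - mu0.
    """
    dm = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(S0)
    _, logdet1 = np.linalg.slogdet(S1)

    def neg_div(s):
        S = (1.0 - s) * S0 + s * S1
        quad = 0.5 * s * (1.0 - s) * dm @ np.linalg.solve(S, dm)
        _, logdet = np.linalg.slogdet(S)
        return -(quad + 0.5 * (logdet - (1.0 - s) * logdet0 - s * logdet1))

    res = minimize_scalar(neg_div, bounds=(1e-6, 1.0 - 1e-6), method="bounded")
    return -res.fun, res.x   # Chernoff information and the optimizing s
```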
“…We note that for Q = 1, Result 5 is similar to Result 1. However, when Q ≥ 2, the integrals in Equation (16) are not tractable in closed form. For instance, letting Q = 2, we consider the integral…”
Section: Remark
Confidence: 99%
“…The key idea of expressing (7) as an explicit function of w is to replace Φ_w with w based on their relationship given by (4). Consider a decomposition of the noise covariance matrix [25]…”
Section: B. Fisher Information J_w as an Explicit Function of w
Confidence: 99%
“…given w, enumerate all the inactive sensors in I to determine the j ∈ I for which the decrease tr(J_w^{-1}) − tr(J_{w'}^{-1}) in (26) is maximized; 3: update w by setting w_j = 1, and update J_w by adding c_j α_j α_j^T as in (25); 4: …”
Section: Algorithm 3, Greedy Algorithm for Sensor Selection
Confidence: 99%
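Piecing the quoted steps together with the rank-one update in (25), one plausible self-contained rendering of the greedy loop is sketched below. J0, alphas, c, and k stand in for the paper's quantities and are assumptions here; the Sherman-Morrison identity keeps both the candidate scoring and the inverse update cheap.

```python
import numpy as np

def greedy_selection(alphas, c, k, J0):
    """Hypothetical sketch of the quoted greedy step: repeatedly activate the
    inactive sensor j whose rank-one update c[j] * alphas[j] alphas[j]^T
    maximizes the decrease tr(J^{-1}) - tr((J + c_j a_j a_j^T)^{-1}),
    evaluated via Sherman-Morrison.  J0 is an (invertible) initial
    information matrix, alphas is (m, n), c is (m,).
    """
    m = len(c)
    w = np.zeros(m)
    J_inv = np.linalg.inv(J0)
    for _ in range(k):
        best_j, best_gain = -1, -np.inf
        for j in np.flatnonzero(w == 0):       # enumerate inactive sensors
            v = J_inv @ alphas[j]
            gain = c[j] * (v @ v) / (1.0 + c[j] * (alphas[j] @ v))
            if gain > best_gain:
                best_j, best_gain = j, gain
        w[best_j] = 1.0                        # activate the best sensor
        v = J_inv @ alphas[best_j]             # rank-one inverse update
        J_inv -= np.outer(v, v) * (c[best_j] / (1.0 + c[best_j] * (alphas[best_j] @ v)))
    return w
```

Scoring each candidate costs O(n²) instead of the O(n³) of refactorizing J, which is what makes the enumeration over all inactive sensors in step 2 affordable.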