Robust 1-bit Compressive Sensing Using Adaptive Outlier Pursuit
2012 · DOI: 10.1109/tsp.2012.2193397

Cited by 159 publications (120 citation statements: 1 supporting, 119 mentioning, 0 contrasting)
References 16 publications
“…Specifically in this experiment, when n = 1000, m = 1000, the average computational times are: 6.7 ms (Passive), 8 […]. Similar observations can be found in Fig. 2, where different noise levels are considered.…”
Section: […]larizations; Section 3 gives several such kinds of algorithms (supporting)
confidence: 81%
“…If the underlying signal is sparse, then sparsity-pursuit techniques can help signal recovery, as in regular compressive sensing. Therefore, since its proposal by [6], 1-bit CS has attracted much attention in both the signal processing community ([7,8,9,10]) and the machine learning community ([11,12,13,14]). Because the one-bit information cannot specify the magnitude of the original signal, we assume $\|x\|_2 = 1$ without loss of generality (there is also some work on norm estimation; see, e.g., [15]), and 1-bit CS can be posed as finding the sparsest vector on the unit sphere that coincides with the observed signs, i.e.,
$$\min_{x \in \mathbb{R}^n} \|x\|_0 \quad \text{subject to} \quad y_i = \operatorname{sgn}(u_i^\top x), \; \forall i = 1, 2, \dots$$ …”
Section: Introduction (mentioning)
confidence: 99%
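
To make this formulation concrete, below is a minimal sketch of one standard heuristic for it, binary iterative hard thresholding (BIHT, cited as [28] in an excerpt further down): take gradient-like steps toward sign consistency, hard-threshold to the assumed sparsity level, and normalize at the end since one-bit data carry no magnitude. The function name, step size, iteration count, and test dimensions are illustrative assumptions, not taken from the cited papers' code.

```python
import numpy as np

def biht(A, y, s, n_iter=200, tau=None):
    """Sketch of binary iterative hard thresholding for 1-bit CS.

    A      : (m, n) measurement matrix (rows u_i^T)
    y      : (m,) observed signs, y_i = sgn(u_i^T x)
    s      : assumed sparsity level of the target signal
    n_iter : number of iterations (illustrative choice)
    """
    m, n = A.shape
    tau = 1.0 / m if tau is None else tau   # common step-size heuristic
    x = np.zeros(n)
    for _ in range(n_iter):
        # step toward sign consistency: only measurements whose current
        # sign disagrees with the observed one-bit value contribute
        x = x + tau * (A.T @ (y - np.sign(A @ x)))
        # hard threshold: zero out all but the s largest-magnitude entries
        small = np.argsort(np.abs(x))[: n - s]
        x[small] = 0.0
    nrm = np.linalg.norm(x)
    # one-bit measurements cannot determine magnitude: return a unit vector
    return x / nrm if nrm > 0 else x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, s = 1000, 1000, 10            # dimensions echo the quoted experiment
    x_true = np.zeros(n)
    supp = rng.choice(n, size=s, replace=False)
    x_true[supp] = rng.standard_normal(s)
    x_true /= np.linalg.norm(x_true)
    A = rng.standard_normal((m, n))     # Gaussian measurements u_i as rows
    y = np.sign(A @ x_true)
    x_hat = biht(A, y, s)
    print("recovery error:", np.linalg.norm(x_hat - x_true))
```

Note that the hard-thresholding step needs the true sparsity level s as input, which matches the remark in the last excerpt below that BIHT, AOP, LP and PBAOP all require s to be specified.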
“…Extensions include recovering the norm of the target [32,3] and non-Gaussian measurement settings [1]. Many first-order methods [6,34,49,14] and greedy methods [35,5,29] have been developed to minimize the sparsity-promoting nonconvex objective function arising from either the unit-sphere constraint or the nonconvex regularizers. To address…”
(mentioning)
confidence: 99%
“…Now we compare our proposed model (1.2) and the PDASC algorithm with several state-of-the-art methods: BIHT [28] (http://perso.uclouvain.be/laurent.jacques/index.php/Main/BIHTDemo), AOP [49] and PBAOP [24] (both available at http://www.esat.kuleuven.be/stadius/ADB/huang/downloads/1bitCSLab.zip), and linear projection (LP) [47,42]. BIHT, AOP, LP and PBAOP all require the true sparsity level s to be specified. AOP and PBAOP additionally require the sign-flip probability q.…”
(mentioning)
confidence: 99%
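
As context for the sign-flip probability q mentioned in this excerpt: the robust 1-bit CS setting models each observed sign as flipped independently with probability q before recovery. A minimal sketch of that corruption model follows; the variable names and values are illustrative, not taken from the cited packages.

```python
import numpy as np

rng = np.random.default_rng(0)
m, q = 1000, 0.05                           # measurement count, flip rate (assumed)

y_clean = np.sign(rng.standard_normal(m))   # stand-in for y_i = sgn(u_i^T x)
flips = rng.random(m) < q                   # each sign flips independently w.p. q
y_noisy = np.where(flips, -y_clean, y_clean)
```

Knowing q tells a method roughly how many of the m signs (about q·m) to treat as outliers, which is how adaptive outlier pursuit decides how many measurements to discard while recovering the signal.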