Abstract-Model-based compressive sensing (CS) for signal-specific applications is of particular interest in sparse signal approximation. In this paper, we deal with a special class of sparse signals with binary entries. Unlike conventional CS approaches based on l1 minimization, we model the CS process with a bipartite graph. We design a novel sampling matrix with a unique-sum property, which can be universally applied to any binary signal. Moreover, we propose a novel binary CS decoding (BCS) algorithm based on the graph and a unique-sum table, which does not require a complex optimization process. The proposed method is verified and compared with existing solutions through mathematical analysis and numerical simulations.
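To make the unique-sum idea concrete, here is a toy illustration (not the paper's actual matrix construction): for a binary signal x in {0,1}^N, a single sampling row with entries 2^j has the unique-sum property, since every support set yields a distinct measurement, and decoding reduces to a table-free bit lookup rather than l1 minimization. All sizes and names below are illustrative assumptions.

```python
import numpy as np

N = 12
a = 2 ** np.arange(N)                 # sampling row: pairwise-distinct subset sums
rng = np.random.default_rng(0)
x = (rng.random(N) < 0.3).astype(int)  # sparse binary signal

y = int(a @ x)                         # single CS measurement
# Unique sums make decoding a lookup: here, simply reading off the bits of y.
x_hat = np.array([(y >> j) & 1 for j in range(N)])
```

Real constructions spread the measurements over several rows so that each row's dynamic range stays bounded; the single-row version above only shows why unique subset sums permit optimization-free decoding.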
In this paper, we propose an integration of compressive sensing (CS) and clustering in wireless sensor networks (WSNs), utilizing block diagonal matrices (BDMs) as the measurement matrices. Such an integration results in a significant reduction in the power consumption related to data collection. The main idea is to partition a WSN into clusters, where each cluster head (CH) collects the sensor readings within its cluster only once and then generates CS measurements to be forwarded to the base station (BS). We consider two methods to forward CS measurements from the CHs to the BS: (i) direct transmission and (ii) multi-hop routing through intermediate CHs. For the latter case, a distributed tree-based algorithm is utilized to relay CS measurements to the BS. The BS then applies a CS recovery process to the collected M CS measurements to reconstruct all N sensor readings, where M ≪ N. Under this novel framework, we formulate the total power consumption and discuss the effect of different sparsifying bases on the CS performance, as well as the optimal number of clusters for reaching the minimum power consumption.
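The measurement structure can be sketched as follows: each CH compresses only its own cluster's readings, so the network-wide measurement matrix is block diagonal. The cluster sizes and measurement counts below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 120, 4          # N sensor readings partitioned into K clusters (hypothetical)
n = N // K             # readings per cluster
m = 8                  # CS measurements per cluster head, so M = K*m << N

# Each CH applies its own dense random block to its local readings.
blocks = [rng.standard_normal((m, n)) / np.sqrt(m) for _ in range(K)]
Phi = np.zeros((K * m, N))
for i, B in enumerate(blocks):
    Phi[i * m:(i + 1) * m, i * n:(i + 1) * n] = B  # block-diagonal measurement matrix

x = np.zeros(N)
x[rng.choice(N, 5, replace=False)] = 1.0  # sparse sensory field
y = Phi @ x                               # M = 32 measurements forwarded to the BS
```

Because each block only touches one cluster's readings, a CH never needs data from outside its cluster, which is what enables the per-cluster power savings; the BS recovers x from y with a standard sparse-recovery solver.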
Augmenting deep neural networks with skip connections, as introduced in the so-called ResNet architecture, surprised the community by enabling the training of networks of more than 1000 layers with significant performance gains. It has been shown that identity skip connections eliminate singularities and improve the optimization landscape of the network. This paper deciphers ResNet by analyzing the effect of skip connections in the backward path and sets forth new theoretical results on the advantages of identity skip connections in deep neural networks. We prove that the skip connections in the residual blocks facilitate preserving the norm of the gradient and lead to well-behaved and stable back-propagation, which is a desirable feature from an optimization perspective. We also show that, perhaps surprisingly, as more residual blocks are stacked, the network becomes more norm-preserving. Traditionally, norm-preservation is enforced on the network only at the beginning of training, through initialization techniques. However, we show that identity skip connections retain norm-preservation during the training procedure. Our theoretical arguments are supported by extensive empirical evidence. Can we push for more norm-preservation? We answer this question by proposing zero-phase whitening of the fully-connected layer and adding norm-preserving transition layers. Our numerical investigations demonstrate that the learning dynamics and the performance of ResNets can be improved by making them even more norm-preserving through changing only a few blocks in very deep residual networks. Our results and the introduced modification of ResNet, referred to as Procrustes ResNets, can serve as a guide for studying more complex architectures such as DenseNet, training deeper networks, and inspiring new architectures.
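The norm-preservation claim can be illustrated with a minimal sketch, assuming linear residual branches F_i(x) = W_i x with small random weights: backpropagating through y = x + F(x) multiplies the gradient by (I + W)^T, whose norm stays near 1, while a plain stack multiplies by W^T and shrinks the gradient exponentially with depth. This is a toy simulation, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
d, depth = 64, 20
# Small random linear "residual branches" F_i(x) = W_i x.
Ws = [rng.standard_normal((d, d)) * 0.1 / np.sqrt(d) for _ in range(depth)]

g = rng.standard_normal(d)           # incoming gradient at the output
g_plain, g_skip = g.copy(), g.copy()
for W in Ws:
    g_plain = W.T @ g_plain          # plain net: backward Jacobian is W^T
    g_skip = g_skip + W.T @ g_skip   # residual block: backward Jacobian is (I + W)^T

ratio_plain = np.linalg.norm(g_plain) / np.linalg.norm(g)  # vanishes with depth
ratio_skip = np.linalg.norm(g_skip) / np.linalg.norm(g)    # stays near 1
```

The contrast grows with depth: each skip-free layer contracts the gradient by roughly the spectral norm of W, whereas the identity path keeps the backward map close to an isometry, which is the stable back-propagation behavior the abstract describes.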
Abstract-In this letter, we propose a new scheme to construct low-density parity-check (LDPC) codes that are suitable for unequal error protection (UEP). We derive UEP density evolution (UDE) formulas for the proposed ensemble over the binary erasure channel (BEC). Using the UDE formulas, high-performance UEP codes can be found. Simulation results show an improvement in the bit error rate of the more important bits in comparison with previous results on UEP-LDPC codes.
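For context, standard (non-UEP) density evolution on the BEC for a regular (dv, dc) ensemble tracks the erasure probability x of variable-to-check messages through the recursion x ← ε(1 − (1 − x)^(dc−1))^(dv−1); the letter's UDE formulas extend this kind of recursion to protection classes. The sketch below is the textbook single-class version, not the proposed UEP formulas.

```python
def de_bec(eps, dv=3, dc=6, iters=500):
    """Density evolution for a regular (dv, dc) LDPC ensemble on the BEC.

    eps is the channel erasure probability; returns the asymptotic
    variable-to-check message erasure probability after `iters` iterations.
    """
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
    return x
```

Running this recursion exposes the decoding threshold: for the (3, 6) ensemble it lies near ε ≈ 0.429, so below it the erasure probability is driven to zero while above it the recursion settles at a nonzero fixed point. Searching over ensembles with such recursions, per protection class, is how high-performance UEP codes are found in the letter.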