Abstract-We study the problem of broadcasting confidential messages to multiple receivers under an information-theoretic secrecy constraint. Two scenarios are considered: 1) all receivers are to obtain a common message; and 2) each receiver is to obtain an independent message. Moreover, two models are considered: parallel channels and fast-fading channels. For the case of reversely degraded parallel channels, one eavesdropper, and an arbitrary number of legitimate receivers, we determine the secrecy capacity for transmitting a common message, and the secrecy sum-capacity for transmitting independent messages. For the case of fast-fading channels, we assume that the channel state information of the legitimate receivers is known to all the terminals, while that of the eavesdropper is known only to itself. We show that, using a suitable binning strategy, a common message can be reliably and securely transmitted at a rate independent of the number of receivers. We also show that a simple opportunistic transmission strategy is optimal for the reliable and secure transmission of independent messages in the limit of a large number of receivers.
Abstract-Burnashev in 1976 gave an exact expression for the reliability function of a discrete memoryless channel (DMC) with noiseless feedback. A coding scheme that achieves this exponent needs, in general, to know the statistics of the channel. Suppose now that the coding scheme is designed knowing only that the channel belongs to a family of DMCs. Is there a coding scheme with noiseless feedback that achieves Burnashev's exponent uniformly over the family at a nontrivial rate? We answer the question in the affirmative for two families of channels (binary symmetric, and Z). For these families we show that, for any given fraction, there is a feedback coding strategy that, for any member of the family: i) guarantees this fraction of its capacity as its rate, and ii) guarantees the corresponding Burnashev exponent. Therefore, for these families, in terms of delay and error probability, the knowledge of the channel becomes asymptotically irrelevant in feedback code design: there are blind schemes that perform as well as the best coding scheme designed with foreknowledge of the channel under use. However, a converse result shows that, in general, even for families that consist of only two channels, such blind schemes do not exist.
Index Terms-Burnashev's exponent, error exponent, feedback, universal channel coding, variable-length coding.
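To make the quantities in this abstract concrete: for a binary symmetric channel with crossover probability p, Burnashev's exponent takes the closed form E(R) = C1 (1 - R/C), where C = 1 - h(p) is the capacity and C1 = (1 - 2p) log2((1-p)/p) is the largest divergence between the output distributions of two input letters. The Python sketch below is illustrative only; the function names are invented here.

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - h2(p)

def burnashev_exponent(p, R):
    """Burnashev's reliability function E(R) = C1 * (1 - R / C) for a
    BSC(p) with noiseless feedback, valid for 0 <= R <= C, where
    C1 = (1 - 2p) * log2((1 - p) / p) is the maximum divergence
    between the output distributions of the two input letters."""
    C = bsc_capacity(p)
    C1 = (1 - 2 * p) * math.log2((1 - p) / p)
    return C1 * (1 - R / C)
```

Note that E(0) = C1 is strictly positive and E(C) = 0, so the exponent decreases linearly from C1 to zero as the rate approaches capacity.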
We consider asynchronous communication over point-to-point discrete memoryless channels without feedback. The transmitter starts sending one block codeword at an instant that is uniformly distributed within a certain time period, which represents the level of asynchronism. The receiver, by means of a sequential decoder, must isolate the message without knowing when the codeword transmission starts, but being cognizant of the asynchronism level. We are interested in how quickly the receiver can isolate the sent message, particularly in the regime where the asynchronism level is exponentially larger than the codeword length, which we refer to as 'strong asynchronism.' This model of sparse communication might represent the situation of a sensor that remains idle most of the time and only occasionally transmits information to a remote base station which needs to quickly take action. Because of the limited amount of energy the sensor possesses, and assuming the same cost per transmitted symbol, it is of interest to consider minimum-size codewords for a given asynchronism level. The first result is an asymptotic characterization of the largest asynchronism level, in terms of the codeword length, for which reliable communication can be achieved: vanishing error probability can be guaranteed as the codeword length N tends to infinity while the asynchronism level grows as e^(Nα) if and only if α does not exceed the synchronization threshold, a constant that admits a simple closed-form expression and is at least as large as the capacity of the synchronized channel. The second result is the characterization of a set of achievable strictly positive rates in the regime where the asynchronism level is exponential in the codeword length, and where the rate is defined with respect to the expected (random) delay between the time information starts being emitted and the time the receiver makes a decision.
Interestingly, this achievability result is obtained by a coding strategy whose decoder not only operates asynchronously, but also has an almost universal decision rule, in the sense that it is almost independent of the channel statistics. As an application of the first result, we consider antipodal signaling over an additive Gaussian channel and derive a simple necessary condition relating blocklength, asynchronism level, and SNR for achieving reliable communication.
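The threshold condition of the first result can be checked numerically. The sketch below is a minimal illustration, assuming (as a working hypothesis, not something stated in the abstract) that the synchronization threshold of a DMC Q with a designated idle input takes the divergence form max over inputs x of D(Q(.|x) || Q(.|idle)); the function names and the example channel are invented for illustration.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats; assumes q[i] > 0
    wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def sync_threshold(channel, idle):
    """Assumed closed form of the synchronization threshold: the largest
    divergence between the output distribution induced by some input
    letter and that induced by the idle (no-transmission) symbol."""
    return max(kl_divergence(row, channel[idle]) for row in channel)

def is_reliable(N, A, channel, idle):
    """Check the abstract's condition: at asynchronism level A = e^(N*alpha),
    reliable communication requires alpha = ln(A) / N not to exceed the
    synchronization threshold."""
    return math.log(A) / N <= sync_threshold(channel, idle)

# Illustrative two-input channel; row 0 doubles as the idle symbol.
Q = [[0.9, 0.1], [0.1, 0.9]]
```

For this Q the threshold evaluates to 0.8 * ln(9), roughly 1.76 nats, so an asynchronism level growing like e^N (alpha = 1) is sustainable while one growing like e^(2N) is not.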
We consider the 'one-shot frame synchronization problem,' where a decoder wants to locate a sync pattern at the output of a channel on the basis of sequential observations. We assume that the sync pattern of length N starts being emitted at a random time within some interval of size A, which characterizes the asynchronism level between the transmitter and the receiver. We show that a sequential decoder can optimally locate the sync pattern, i.e., exactly, without delay, and with probability approaching one as N → ∞, if and only if the asynchronism level grows as O(e^(Nα)), with α below the synchronization threshold, a constant that admits a simple expression depending on the channel. This constant is the same as the one that characterizes the limit for reliable asynchronous communication, as was recently reported by the authors. If α exceeds the synchronization threshold, any decoder, sequential or non-sequential, locates the sync pattern with an error that tends to one as N → ∞. Hence, a sequential decoder can locate a sync pattern as well as the (non-sequential) maximum likelihood decoder that operates on output sequences of maximum length A + N − 1, but with far fewer observations.
The capacity per unit cost, or, equivalently, the minimum cost to transmit one bit, is a well-studied quantity under the assumption of full synchrony between the transmitter and the receiver. In many applications, such as sensor networks, transmissions are very bursty, with small amounts of bits arriving infrequently at random times. In such scenarios, the cost of acquiring synchronization is significant, and one is interested in the fundamental limits on communication without assuming a priori synchronization. In this paper, the minimum cost to transmit B bits of information asynchronously is shown to be equal to (B + H̄)k, where k is the synchronous minimum cost per bit and H̄ is a measure of timing uncertainty equal to the entropy of the arrival time for most reasonable arrival time distributions. This result holds when the transmitter can stay idle at no cost, and is a particular case of a general result that holds for arbitrary cost functions.
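The stated result has a simple multiplicative form that lends itself to a quick numeric check. The sketch below is illustrative only: the (B + H) * k form, the uniform-arrival entropy H = log2(A), and the function names are assumptions drawn from the abstract's wording, not a definitive implementation.

```python
import math

def timing_entropy_uniform(A):
    """Timing-uncertainty term for an arrival time uniform over A slots:
    the entropy log2(A), in bits (assumed here for illustration)."""
    return math.log2(A)

def async_min_cost(B, H, k_sync):
    """Minimum total cost to send B bits asynchronously, per the stated
    result: (B + H) * k_sync, where k_sync is the synchronous minimum
    cost per bit and H measures the timing uncertainty."""
    return (B + H) * k_sync
```

For example, sending 1000 bits with an arrival time uniform over 2^20 slots at a synchronous cost of 0.5 per bit would cost (1000 + 20) * 0.5 = 510, only 2% more than the synchronous cost, since the timing-uncertainty term is additive in the number of bits rather than multiplicative.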