Information theory addresses the fundamental problem of communication: that of reproducing at one point, either exactly or approximately, a message selected at another point. The design and development of synchronous communication systems, such as satellite, wireless, or optical-fiber systems, rely nowadays on the establishment of performance indicators such as the decoding error probability and the channel capacity. Further progress in the field of communications requires, among other things, the development of new, tight performance bounds that are capable of designating desirable characteristics for coding schemes. The asymptotic behavior of existing bounds is not always satisfactory, since they do not reveal practical codes of finite length for simple or more complicated (nonlinear) channel models. Furthermore, the treatment of the inherent nonlinear behavior of communication systems, as well as the calculation of the capacity of the corresponding channels, is crucial for enhancing system performance.

In this thesis, Gallager's upper bound, as well as its variations through the Duman-Salehi bound, is improved. The technique that yields this improvement relies on a new inverse exponential sum inequality. The resulting criterion is applied to linear codes transmitted over discrete, memoryless, linear symmetric channels, and it upper-bounds the word and bit maximum-likelihood decoding error probabilities. The analysis designates a new desirable characteristic for linear codes that is directly connected with the concept of list decoding.

The thesis also presents lower bounds on the capacity of nonlinear channels, combining the random coding technique with the theory of martingales. Upper bounds on the maximum-likelihood decoding error probability are obtained through the representation of a nonlinear channel by a Volterra series.
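For context, Gallager's classical random-coding bound, which the work above sharpens, may be sketched in its standard form for a discrete memoryless channel with transition probabilities $P(y\mid x)$ and input distribution $Q$ (this is the textbook statement, not the improved bound derived in the thesis):

```latex
P_e \;\le\; \exp\{-N\,E_r(R)\},
\qquad
E_r(R) \;=\; \max_{0 \le \rho \le 1}\bigl[\,E_0(\rho, Q) - \rho R\,\bigr],
```

```latex
E_0(\rho, Q) \;=\; -\ln \sum_{y} \Bigl( \sum_{x} Q(x)\, P(y \mid x)^{1/(1+\rho)} \Bigr)^{1+\rho},
```

where $N$ is the block length and $R$ the code rate in nats per channel use. The Duman-Salehi bound generalizes this expression by introducing an auxiliary tilting measure over the channel outputs.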
The proposed research follows the main ideas that dominate Shannon's foundational work and properly utilizes exponential martingale inequalities in order to bound the probabilities of erroneous decoding regions. The analysis is also applied to cases where the statistical characteristics of the noise (mean value, standard deviation) remain unknown.

A desirable characteristic of the proposed techniques, which provide tight bounds on the decoding error probability, is their ability to designate codes with optimal characteristics and with rates near channel capacity. The present work improves and extends the Shulman-Feder bound for the family of binary linear codes that are permutation invariant under list decoding. A new upper bound on the list-decoding error probability is presented that decreases double-exponentially with respect to the code's block length.
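As an illustration of the kind of exponential martingale inequality invoked above, the Azuma-Hoeffding inequality bounds the tail of a martingale $(X_k)_{k=0}^{n}$ with bounded differences $|X_k - X_{k-1}| \le d_k$ (a standard statement used here for orientation; the thesis applies such inequalities to the decoding-region probabilities):

```latex
\Pr\{X_n - X_0 \ge t\} \;\le\; \exp\!\Bigl(-\frac{t^2}{2\sum_{k=1}^{n} d_k^2}\Bigr), \qquad t > 0.
```

Because the bound depends only on the difference bounds $d_k$ and not on the full noise distribution, inequalities of this type remain applicable when the noise statistics are only partially known.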