A novel Decentralized Noisy Model Update Tracking Federated Learning algorithm (FedNMUT) is proposed that is tailored to function efficiently in the presence of noisy communication channels reflecting imperfect information exchange. The algorithm uses gradient tracking to minimize the impact of data heterogeneity while keeping communication overhead low. It incorporates noise into its parameter updates to mimic the conditions of noisy communication channels, thereby enabling consensus among clients over a communication graph topology even in such challenging environments. FedNMUT prioritizes parameter sharing and noise incorporation to increase the resilience of decentralized learning systems to noisy communication. We provide theoretical results for smooth non-convex objective functions and show that our algorithm achieves an ϵ-stationary solution at a rate of O(1/√T), where T is the total number of communication rounds. Additionally, through empirical validation, we demonstrate that FedNMUT outperforms existing state-of-the-art methods and conventional parameter-mixing approaches under imperfect information sharing. This confirms the ability of the proposed algorithm to counteract the negative effects of communication noise in a decentralized learning framework.

Imperfect information exchange, such as noisy or quantized communication, has been examined in the context of average consensus algorithms within distributed frameworks. Yet the ramifications of varying noise levels remain underexplored. Moreover, existing research, which focuses primarily on consensus problems, does not fully address the complex challenges encountered in contemporary decentralized optimization and learning paradigms [23], [24].
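The noisy average-consensus setting discussed above can be sketched as a single gossip round in which each client averages noise-corrupted copies of its neighbors' parameters. The function name and the single-perturbation-per-sender simplification below are our own illustrative choices, not the paper's algorithm:

```python
import numpy as np

def noisy_consensus_step(x, W, sigma, rng):
    """One gossip round of average consensus over noisy links (illustrative).

    x: (n, d) array holding each client's parameter vector.
    W: (n, n) doubly stochastic mixing matrix of the communication graph.
    sigma: std of zero-mean Gaussian noise corrupting each shared vector.
    """
    # Each client averages the noise-corrupted copies it receives from its
    # neighbors (one shared perturbation per sender, for brevity).
    return W @ (x + sigma * rng.standard_normal(x.shape))
```

With sigma = 0 this reduces to the standard consensus iteration x ← Wx, which, for a connected graph with doubly stochastic W, drives every client toward the average of the initial vectors; with sigma > 0 the noise perturbs each round, which is exactly the regime the consensus literature cited above studies.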
In contrast to Federated Learning (FL), where server assistance is common, Decentralized Federated Learning (DFL) operates without a central server: each client acts autonomously, running local Stochastic Gradient Descent (SGD) or one of its variants on its own data and communicating directly with neighboring clients.

In our previous paper [25], we performed a comparative study of three proposed DFL algorithms under imperfect communication conditions, typified by noisy channels. These algorithms, FedNDL1, FedNDL2, and FedNDL3, differ in how they handle noise and parameter sharing, and they demonstrate varying degrees of resilience to communication noise. In this paper, we propose a novel algorithm that employs the Gradient Tracking method in DFL and compare its performance against the previously mentioned algorithms.
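As a rough illustration of the kind of update such a scheme involves, the sketch below shows a generic decentralized gradient-tracking round with additive channel noise. It is not the exact FedNMUT or FedNDL recursion; `gt_round`, `grad_fn`, and all parameter names are our own illustrative choices:

```python
import numpy as np

def gt_round(x, y, grad_fn, W, lr, sigma, rng):
    """One decentralized gradient-tracking round over noisy links (illustrative).

    x: (n, d) client parameters; y: (n, d) gradient trackers.
    grad_fn: maps (n, d) parameters to the (n, d) stack of local gradients.
    W: (n, n) doubly stochastic mixing matrix; sigma: channel-noise std.
    """
    g_old = grad_fn(x)
    # Mix noise-corrupted neighbor parameters, then descend along the tracker,
    # which estimates the average gradient across all clients.
    x_new = W @ (x + sigma * rng.standard_normal(x.shape)) - lr * y
    # Tracker update: mix, then correct with the fresh local gradient change.
    y_new = W @ (y + sigma * rng.standard_normal(y.shape)) + grad_fn(x_new) - g_old
    return x_new, y_new
```

Initializing y to the initial local gradients preserves the tracking invariant (the average of y equals the average local gradient in the noiseless case), which is what lets gradient tracking cancel the client drift caused by data heterogeneity.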
C. Paper's Contributions

This paper introduces a novel algorithm that employs the Gradient Tracking method in DFL while accounting for the impact of communication noise. Previous studies have evaluated the effectiveness of two-time-scale methods in DFL with noisy channels. However, those investigations rely on restrictive assumptions, such as the strong convexity imposed in papers such as [26]-[29]. These assumptions are rarely satisfied in practice...