A fog-aided wireless network architecture is studied in which edge nodes (ENs), such as base stations, are connected to a cloud processor via dedicated fronthaul links, while also being endowed with caches. Cloud processing enables the centralized implementation of cooperative transmission strategies at the ENs, albeit at the cost of an increased latency due to fronthaul transfer. In contrast, the proactive caching of popular content at the ENs allows for the low-latency delivery of the cached files, but with generally limited opportunities for cooperative transmission among the ENs. The interplay between cloud processing and edge caching is addressed from an information-theoretic viewpoint by investigating the fundamental limits of a high Signal-to-Noise-Ratio (SNR) metric, termed normalized delivery time (NDT), which captures the worst-case coding latency for delivering any requested content to the users. The NDT is defined under the assumptions of either serial or pipelined fronthaul-edge transmission, and is studied as a function of fronthaul and cache capacity constraints. Placement and delivery strategies across both fronthaul and wireless, or edge, segments are proposed with the aim of minimizing the NDT. Information-theoretic lower bounds on the NDT are also derived. Achievability arguments and lower bounds are leveraged to characterize the minimal NDT in a number of important special cases, including systems with no caching capabilities, as well as to prove that the proposed schemes achieve optimality within a constant multiplicative factor of 2 for all values of the problem parameters.
Index Terms: Caching, Cloud Radio Access Network (C-RAN), Fog Radio Access Network, edge processing, 5G, degrees-of-freedom, latency, wireless networks, interference channel.

the fronthaul capacity is small. This is because, with pipelined transmission, the ENs need not wait for the fronthaul transmission to be completed before communicating to the users on the edge links. For the same reason, pipelined fronthaul-edge transmission generally improves the NDT compared to serial transmission. In particular, even with partial caching, that is, with µ < 1, the ideal NDT δ* = 1 is achievable with pipelined fronthaul-edge transmission, while this is not the case with serial transmission. More details can be found in Sections V-C and VI-D.

Related Work: The line of work pertaining to the information-theoretic analysis of cache-aided communication systems can be broadly classified into studies that consider caching at the end-users' devices or at the ENs. This research direction was initiated by [9], [10] for a set-up that consists of a multicast link with cache-aided receivers. That work demonstrates that coded multicasting enables global caching gains to be reaped, as opposed to the conventional local caching gains of uncoded transmission. Follow-up papers on related models with receiver-end caching include [11]-[24]. The present paper belongs instead to the parallel line of work that concerns caching at the ENs of a wireless n...
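The latency advantage of pipelining described above can be illustrated with a minimal sketch, not taken from the paper: under a simple two-phase block model, serial operation adds the fronthaul and edge latency contributions, while pipelined operation lets the two segments transmit concurrently so that (in this idealized model) only the slower segment dominates. The names `delta_f` and `delta_e`, denoting hypothetical per-segment latency contributions, are placeholders introduced here for illustration only.

```python
# Illustrative sketch (assumption, not the paper's exact NDT expressions):
# compare serial vs pipelined fronthaul-edge delivery latency under a
# simple two-phase block model.

def serial_ndt(delta_f: float, delta_e: float) -> float:
    """Serial operation: edge transmission starts only after the
    fronthaul transfer completes, so the latencies add."""
    return delta_f + delta_e


def pipelined_ndt(delta_f: float, delta_e: float) -> float:
    """Pipelined operation: fronthaul and edge segments are active
    simultaneously, so the slower segment sets the latency."""
    return max(delta_f, delta_e)


if __name__ == "__main__":
    # Hypothetical values: fronthaul contributes 0.5, edge contributes 1.0.
    print(serial_ndt(0.5, 1.0))     # serial latency: 1.5
    print(pipelined_ndt(0.5, 1.0))  # pipelined latency: 1.0
```

Under this toy model the pipelined latency is never worse than the serial one, matching the qualitative statement above that pipelined fronthaul-edge transmission generally improves the NDT.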