This article investigates complexity and approximability properties of combinatorial optimization problems arising from the notion of Shared Risk Resource Group (SRRG). SRRGs were introduced to capture network survivability issues in which a single failure may break a whole set of resources, and have been formalized as colored graphs, where each set of resources is represented by a set of edges sharing the same color. We consider here the analogues of classical problems, such as finding paths or cuts with the minimum number of colors, or finding color-disjoint paths. These optimization problems are much more difficult than their counterparts in classical graph theory. In particular, standard relationships such as the Max-Flow/Min-Cut equality no longer hold. In this article we identify cases where these problems are polynomial, for example when the edges of a given color form a connected subgraph, and otherwise give hardness and non-approximability results for these problems.
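The minimum-color path problem mentioned above can be illustrated with a small exact search. Since the problem is hard in general, this sketch simply enumerates color subsets by increasing size, which is exponential in the number of colors and only suitable for tiny instances; the function name and edge-list format are our own, not taken from the article:

```python
from itertools import combinations
from collections import deque

def min_color_path(edges, s, t):
    """Smallest set of colors whose edges contain an s-t path.
    edges: list of (u, v, color) triples for an undirected graph."""
    colors = sorted({c for _, _, c in edges})
    for k in range(len(colors) + 1):
        for subset in combinations(colors, k):
            allowed = set(subset)
            # build the subgraph restricted to the allowed colors
            adj = {}
            for u, v, c in edges:
                if c in allowed:
                    adj.setdefault(u, []).append(v)
                    adj.setdefault(v, []).append(u)
            # BFS to test s-t connectivity in that subgraph
            seen, q = {s}, deque([s])
            while q:
                x = q.popleft()
                if x == t:
                    return allowed
                for y in adj.get(x, []):
                    if y not in seen:
                        seen.add(y)
                        q.append(y)
    return None  # t unreachable even with all colors
```

Because subsets are tried in order of increasing size, the first success is optimal in the number of colors, mirroring the objective discussed in the abstract.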
This article investigates the problem of allocating modulation and coding schemes, subcarriers, and power to users in LTE. The proposed model achieves inter-cell interference mitigation through the dynamic and distributed self-organization of cells; there is therefore no need for any a priori frequency planning. Moreover, a two-level decomposition method able to find near-optimal solutions is proposed to solve the optimization problem. Finally, simulation results show that, compared to classic reuse schemes, the proposed approach is able to pack more users into the same bandwidth, decreasing the probability of user outage.
Air pollution has become a major issue in modern megacities because of industrial emissions and increasing urbanization, along with traffic jams and the heating and cooling of buildings. Monitoring urban air quality is therefore required by municipalities and civil society. Current monitoring systems rely on reference sensing stations that are precise but bulky, costly, and therefore sparsely deployed. In this paper, we focus on an alternative or complementary approach: a network of low-cost, autonomic wireless sensors, aiming at a finer spatiotemporal granularity of sensing. Generic deployment models from the literature are not adapted to the stochastic nature of pollution sensing. Our main contribution is to design integer linear programming models that compute sensor deployments capturing both the coverage of pollution under time-varying weather conditions and the connectivity of the infrastructure. We evaluate our deployment models on a real data set of Greater London. We analyze the performance of the proposed models and show that our joint coverage and connectivity formulation is tight and compact, with a reasonable execution time. We also conduct extensive simulations to derive engineering insights for effective deployments of air pollution sensors in an urban environment.
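The joint coverage-and-connectivity requirement can be illustrated with a toy brute-force search. This is not the paper's integer linear program, only a small combinatorial sketch of the same feasibility conditions, with invented names and inputs; it enumerates sensor subsets by increasing size, so it is only usable on tiny instances:

```python
from itertools import combinations

def joint_deploy(candidates, covers, comm, targets):
    """Smallest sensor set covering all targets whose communication
    graph is connected.
    covers[s]: set of targets covered by candidate location s.
    comm: set of frozenset pairs, the communication links."""
    names = list(candidates)
    for k in range(1, len(names) + 1):
        for sel in combinations(names, k):
            covered = set().union(*(covers[s] for s in sel))
            if not covered >= set(targets):
                continue  # coverage constraint violated
            # connectivity check: DFS over comm links within sel
            sel_set = set(sel)
            seen, stack = {sel[0]}, [sel[0]]
            while stack:
                u = stack.pop()
                for v in list(sel_set - seen):
                    if frozenset((u, v)) in comm:
                        seen.add(v)
                        stack.append(v)
            if seen == sel_set:
                return sel_set
    return None  # no feasible deployment
```

An ILP formulation replaces this enumeration with binary placement variables and linear coverage and flow-based connectivity constraints, which is what makes the paper's approach scale.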
We consider the problem of finding a lightpath assignment for a given set of communication requests on a multifiber WDM optical network with wavelength translators. Given such a network and w, the number of wavelengths available on each fiber, k, the number of fibers per link, and c, the number of partial wavelength translations available at each node, the problem consists in deciding whether a w-lightpath can be found for each request in the set such that no link carries more than k lightpaths using the same wavelength and no node performs more than c wavelength translations. Our main theoretical result is a formulation of this problem as a particular instance of integral multicommodity flow, hence integrating routing and wavelength assignment in the same model. We then provide three heuristics, mainly based on randomized rounding of fractional multicommodity flow, with enhancements that offer three different answers to the trade-off between efficiency and tightness of approximation, and discuss their practical performance on both theoretical and real-world instances.
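The core step of randomized rounding named above can be sketched in a few lines: given a fractional multicommodity-flow solution decomposed into per-request path flows, select one path per request with probability equal to its fractional value. This is a generic sketch, not the paper's specific heuristics, and the input format is an assumption:

```python
import random

def randomized_rounding(frac_paths, seed=0):
    """frac_paths: {request: [(path, fraction), ...]} where the
    fractions of each request sum to 1. Returns one chosen path
    per request, sampled proportionally to the fractions."""
    rng = random.Random(seed)  # seeded for reproducibility
    choice = {}
    for req, paths in frac_paths.items():
        r, acc = rng.random(), 0.0
        for path, f in paths:
            acc += f
            if r <= acc:
                choice[req] = path
                break
        else:
            choice[req] = paths[-1][0]  # guard against rounding error
    return choice
```

After rounding, a real heuristic must still check (and repair) the capacity constraints k and c, since the sampled paths may overload some link or node; that repair phase is where the three heuristics in the abstract would differ.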
Network measurement is essential for assessing performance and for identifying and locating problems. Two common strategies are the passive approach, which attaches dedicated devices to links in order to monitor the traffic passing through the network, and the active approach, which injects explicit control packets into the network for measurement purposes. One of the key issues in this domain is to minimize the overhead in terms of hardware, software, maintenance cost, and additional traffic. In this paper, we study the problem of assigning tap devices for passive monitoring and beacons for active monitoring. Minimizing the number of devices and finding optimal strategic locations is a key issue, mandatory for deploying scalable monitoring platforms. In this article, we present a combinatorial view of the problem, from which we derive complexity and approximability results, as well as efficient and versatile Mixed Integer Programming (MIP) formulations.
Emerging mobile network architectures (e.g., aerial networks, disaster relief networks) are disrupting the classical careful planning and deployment of mobile networks by requiring specific self-deployment strategies. Such networks, referred to as self-deployable, are formed by interconnected, rapidly deployable base stations that have no dedicated backhaul connection towards a traditional core network. Instead, an entity providing essential core network functionalities is co-located with one of the base stations. In this work, we tackle the problem of placing this core network entity within a self-deployable mobile network, i.e., we determine with which of the base stations it must be co-located. We propose a novel centrality metric, the flow centrality, which measures a node's capacity to receive the total amount of flows in the network. We show that in order to maximize the amount of exchanged traffic between the base stations and the core network entity, under certain capacity and load distribution constraints, the latter should be co-located with the base station having the maximum flow centrality. We first compare our proposed metric to other state-of-the-art centralities. Then, we highlight the significant traffic loss occurring when the core network entity is not placed on the node with the maximum flow centrality, which could reach 55% in some cases.
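One plausible reading of such a metric (this is our hedged interpretation, not necessarily the paper's exact definition) is the total flow a candidate node can receive when every other base station injects up to a fixed demand, subject to link capacities. That quantity is a max-flow computation from a super-source, sketched below with a plain Edmonds-Karp implementation; all names and the input format are assumptions:

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp on a dict-of-dicts capacity map (mutated in place)."""
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:          # BFS for an augmenting path
            u = q.popleft()
            for v, c in cap.get(u, {}).items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        v, b = t, float('inf')                # bottleneck along the path
        while parent[v] is not None:
            u = parent[v]
            b = min(b, cap[u][v])
            v = u
        v = t                                 # augment and update residuals
        while parent[v] is not None:
            u = parent[v]
            cap[u][v] -= b
            cap.setdefault(v, {})
            cap[v][u] = cap[v].get(u, 0) + b
            v = u
        flow += b

def flow_centrality(nodes, edges, v, demand):
    """Total flow node v can receive when each other node injects
    up to `demand` units. edges: (u, w, capacity), undirected."""
    cap = {}
    for a, b, c in edges:
        cap.setdefault(a, {})[b] = cap.get(a, {}).get(b, 0) + c
        cap.setdefault(b, {})[a] = cap.get(b, {}).get(a, 0) + c
    S = '__src__'  # super-source feeding every node except v
    for u in nodes:
        if u != v:
            cap.setdefault(S, {})[u] = demand
    return max_flow(cap, S, v)
```

Under this reading, the placement rule in the abstract amounts to co-locating the core network entity with the node maximizing `flow_centrality` over all candidates.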