The substantial amounts of energy consumed by large-scale computing and network systems, such as data centers and supercomputers, have been a major source of concern in a society increasingly reliant on information technology. To tackle this issue, the research community and industry have proposed a myriad of techniques to curb the energy consumed by IT systems. This article surveys techniques and solutions that aim to improve the energy efficiency of computing and network resources. It discusses methods to evaluate and model the energy consumed by these resources, and describes techniques that operate at the distributed system level, seeking to improve aspects such as resource allocation, scheduling, and network traffic management. This work aims to review the state of the art on energy efficiency and to foster research on schemes that make network and computing resources more energy efficient.