High-fidelity image synthesis is the process of computing images that are perceptually indistinguishable from the real-world scenes they portray. Achieving this level of fidelity requires accurately simulating the physical behavior of materials and light. Most computer graphics algorithms assume that light passes freely between surfaces in an environment. In many applications, however, we must also account for how light interacts with media between the surfaces, such as dust, smoke, and fog. The computational requirements for calculating the interaction of light with such participating media are substantial: the process can take many hours, and rendering effort is often spent on parts of the scene that the viewer may never perceive. In this paper, we present a novel perceptual strategy for physically based rendering of participating media. By combining a saliency map with our new extinction map (X map), we can significantly reduce rendering times for inhomogeneous media. The visual quality of the resulting images is validated using two objective difference metrics and a subjective psychophysical experiment. Although the average pixel errors reported by these metrics are all less than 1%, the subjective validation indicates that the degradation in quality is still noticeable for certain scenes. We therefore introduce and validate a novel light map (L map) that accounts for salient features caused by multiple light scattering around light sources.
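The central idea — steering rendering effort toward perceptually important regions by combining a saliency map with an extinction (X) map — can be sketched as follows. This is an illustrative sketch only: the multiplicative combination rule, the normalization, and the function `allocate_samples` are our own assumptions for exposition, not the paper's actual algorithm.

```python
import numpy as np

def allocate_samples(saliency, extinction, base_spp=4, max_spp=64):
    """Derive a per-pixel samples-per-pixel (spp) budget from a saliency
    map and an extinction map of the same shape.

    Hypothetical scheme: the two maps are multiplied into a single
    importance map, normalized to [0, 1], and used to interpolate
    between a minimum and maximum sample count. Pixels unlikely to be
    perceived (low saliency or low extinction) receive few samples.
    """
    importance = saliency * extinction
    rng = importance.max() - importance.min()
    if rng > 0:
        importance = (importance - importance.min()) / rng
    # Interpolate the sample budget between base_spp and max_spp.
    spp = base_spp + np.round(importance * (max_spp - base_spp)).astype(int)
    return spp

# Example: a pixel with high saliency inside dense medium gets the full
# budget; an unimportant pixel gets only the base budget.
spp = allocate_samples(np.array([[0.1, 0.9]]), np.array([[0.1, 0.9]]))
```

In a renderer, such a budget map would modulate how many ray-marching or light-scattering samples are evaluated per pixel, which is one plausible way a perceptual strategy can cut rendering time without visibly degrading the image.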