Mechanical circulatory support devices, such as total artificial hearts and left ventricular assist devices, rely on external energy sources for continuous operation. Clinically approved power supplies use percutaneous cables that connect an external energy source to the implanted device, with an associated risk of infection. One alternative, investigated in the 1970s and 1980s, employs a fully implanted nuclear power source: the heat generated by nuclear decay is converted into electricity to power the circulatory support device. Because conversion efficiencies are low, substantial waste heat is generated and must be dissipated to avoid tissue damage, heat stroke, and death. The present work computationally evaluates the ability of the blood flow in the descending aorta to remove the locally generated waste heat for subsequent full-body distribution and dissipation, with the specific aim of containing local peak temperatures within physiologically acceptable limits. To this end, coupled fluid–solid heat transfer models of the blood flow in the human aorta and of different heat exchanger architectures are developed. Particle tracking is used to evaluate the temperature histories of cells passing through the heat exchanger region. The blood flow in the descending aorta proves to be a viable heat sink for the removal of the waste heat loads considered. With the basic heat exchanger design, temperatures in the blood thermal boundary layer exceed 50 °C, potentially damaging blood cells and plasma proteins. Improved heat exchanger designs, incorporating fins and heat guides, yield drastically lower blood temperatures and thus a potentially more biocompatible implant. Maintaining blood temperatures at biologically compatible levels will ultimately allow the body-wide distribution, and subsequent dissipation, of the waste heat load with minimal effect on human physiology.
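The viability of the aorta as a heat sink can be illustrated with a simple energy balance: the bulk temperature rise of the blood is ΔT = Q / (ṁ·c_p), where Q is the waste heat load and ṁ the blood mass flow rate. The sketch below is not from the study; all numerical values (flow rate, blood properties, heat load) are assumptions chosen for illustration only, and the estimate says nothing about the local boundary-layer peaks the paper addresses.

```python
# Back-of-envelope energy balance for bulk blood heating.
# All values are illustrative assumptions, not data from the study.
rho = 1060.0        # blood density, kg/m^3 (assumed)
c_p = 3617.0        # blood specific heat, J/(kg K) (assumed)
flow_lpm = 3.0      # descending-aorta flow rate, L/min (assumed)
q_waste = 30.0      # dissipated waste heat load, W (assumed)

# Convert volumetric flow (L/min) to mass flow (kg/s).
m_dot = rho * flow_lpm / 1000.0 / 60.0

# Steady-state bulk temperature rise of the blood stream.
delta_T = q_waste / (m_dot * c_p)

print(f"mass flow = {m_dot:.4f} kg/s, bulk temperature rise = {delta_T:.3f} K")
```

Under these assumptions the bulk temperature rise is only a fraction of a kelvin, which is consistent with the paper's conclusion that the challenge lies in the local thermal boundary layer at the exchanger surface rather than in the total heat capacity of the aortic flow.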