Abstract-Electromigration (EM) is a growing reliability concern in sub-22nm technologies. Design teams must apply guard bands to meet EM lifetime requirements, at a cost in performance and power. However, EM lifetime analysis cannot ignore front-end reliability mechanisms such as bias temperature instability (BTI). Although the gate delay degradation due to BTI can be compensated by adaptive voltage scaling (AVS), any elevated supply voltage will accelerate EM degradation and reduce lifetime. Since BTI degradation is front-loaded, AVS increases electrical stress on metal wires relatively early in IC lifetime, which can significantly decrease electromigration reliability. In this paper, we study the "chicken-and-egg" interlock among BTI, AVS and EM, and quantify the timing and power costs of meeting EM lifetime requirements with full consideration of the BTI and AVS mechanisms. By applying existing statistical and physical EM models, we demonstrate that without such consideration, the inaccuracy (reduction) in EM lifetime due to an improper guard band against BTI at signoff can be as high as 30% in a 28nm FDSOI foundry technology. Furthermore, we provide signoff guidelines which suggest that the lifetime penalty can be compensated at a cost of up to 1.6% in area and 6% in power. We also demonstrate that a suboptimal choice of voltage step size and scheduling strategy can reduce EM lifetime by up to 1.5 years.