The modern semiconductor industry places a major focus on die size reduction and favours the use of multiple metal layers to increase gross margins. Designs with either very high or very low core utilisation suffer from routing congestion and voltage (IR) drop challenges. Owing to this, one of the major difficulties in large system-on-chip (SoC) design is the implementation of a robust power grid. It is essential to analyse IR drop correctly to guarantee the reliability of the power grid. This paper presents a comprehensive review of IR drop analysis for robust power grid design in semiconductor chips. The necessity of vector-based dynamic analysis and the limitations of vectorless analysis are addressed. The various techniques used to mitigate IR drop effects, including power grid modelling, the use of decoupling capacitors, and voltage drop analysis, are explored. The review concludes by identifying the most promising techniques for robust power grid design in semiconductor chips and providing recommendations for future research. The simulations are carried out using ANSYS RedHawk, and the analysis findings are obtained on a FinFET technology node.
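To make the notion of static IR drop concrete, the following is a minimal sketch, not the paper's flow or RedHawk's engine: a power grid reduces to a nodal conductance system G·v = i, solved here for a 1-D chain of grid nodes fed from a single pad. All numbers (VDD, segment resistance, load current) are illustrative assumptions.

```python
import numpy as np

# Illustrative static IR drop on a 1-D power-rail chain (assumed values):
# node 0 is tied to VDD through the pad; each of N internal nodes draws
# I_LOAD; each rail segment has resistance R_SEG.
VDD = 0.75      # supply rail [V] (assumed)
R_SEG = 0.1     # per-segment rail resistance [ohm] (assumed)
I_LOAD = 0.01   # current sink at each node [A] (assumed)
N = 5           # internal nodes past the pad

g = 1.0 / R_SEG
# Build the nodal (conductance) matrix G and current vector i for G @ v = i
G = np.zeros((N, N))
i = np.full(N, -I_LOAD)          # load sinks pull current out of each node
for k in range(N):
    G[k, k] += g                 # segment toward the pad / previous node
    if k + 1 < N:
        G[k, k] += g             # segment toward the next node
        G[k, k + 1] -= g
        G[k + 1, k] -= g
i[0] += g * VDD                  # fixed pad voltage folded into the RHS

v = np.linalg.solve(G, i)        # node voltages along the rail
ir_drop = VDD - v                # drop grows with distance from the pad
print(ir_drop)
```

With these values the cumulative drop rises monotonically from 5 mV at the node nearest the pad to 15 mV at the far end, illustrating why sign-off tools analyse worst-case drop at the instances farthest from the power pads.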