This paper presents the implementation of a digital calibration method for a variable-gain amplifier (VGA), where the amplifier input offset voltage is the calibration target. The whole system is implemented in a 130 nm CMOS technology with a supply voltage of 600 mV. First, the fundamentals of the calibration circuitry operation are described, followed by a detailed explanation of the proposed implementation. Then, possible undesired effects of the calibration circuitry on the amplifier characteristics are investigated. The whole calibration system was verified through simulation at various temperatures in the Cadence environment using the BSIM3v3 transistor model. The VGA was calibrated within the temperature range from −20 °C to 60 °C, in which the standard deviation of the input offset voltage remains below 802 µV and the mean value is at most 450 µV. With calibration, the standard deviation is reduced to 35% of that of the uncalibrated VGA, while the change in the mean value is negligible.