Reverse body biasing has been widely used in commercial memory chips since the mid-1970s to lower the risk of latchup and of memory data destruction, because high-density cell layouts lack substrate contacts. In logic chips, on the other hand, the substrate and wells are typically tied firmly to the ground and power potentials through plentiful substrate contacts, ensuring that no junctions become forward biased and that unexpected operation of random logic circuits cannot trigger latchup. Since the mid-1990s, however, reverse body biasing has been applied in logic chips for a different reason: power reduction.

CMOS power dissipation is increasing rapidly with device scaling [1]. Lowering the power supply voltage, VDD, is effective in reducing the power dissipation, but at the cost of increased propagation delay. To recover circuit speed, the transistor threshold voltage, VTH, must be lowered as well [2,3,4]. This approach, however, raises two problems.

The first problem is the rapid increase in subthreshold leakage in low-VTH devices: for every 0.1-V reduction of VTH, the subthreshold leakage current increases by about one decade. Unless this leakage current is reduced in a standby mode, battery life in portable equipment shortens. In quiescent power supply current (IDDQ) testing, it also becomes difficult to sort out defective chips, because the leakage current caused by a defect cannot be detected under cover of the increased subthreshold leakage. Since some kinds of defects are difficult to detect by means other than IDDQ testing [5], the rate of defective chips mixed into shipped product may increase. Without IDDQ testing, it is more difficult to develop test vectors for high test coverage as the integration level improves. It is also
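The two trends above can be sketched numerically. The snippet below is a minimal illustration, not a device model: it assumes an idealized subthreshold swing of 100 mV/decade (so each 0.1-V drop in VTH multiplies leakage by ten, as stated), and it uses the well-known alpha-power delay approximation, delay ∝ VDD/(VDD − VTH)^α, to show why lowering VDD slows a gate and lowering VTH recovers the speed. The constants I0 and α are illustrative assumptions.

```python
# Illustrative (not calibrated) models of the VDD/VTH trade-off.

def subthreshold_leakage(vth, i0=1e-9, swing=0.1):
    """Leakage current (A) at threshold voltage vth (V).

    Assumes an ideal subthreshold swing of 100 mV/decade, so leakage
    grows one decade for every 0.1 V reduction of VTH. i0 is an
    arbitrary normalization constant.
    """
    return i0 * 10 ** (-vth / swing)

def relative_delay(vdd, vth, alpha=1.3):
    """Relative gate delay under the alpha-power law approximation.

    delay ~ VDD / (VDD - VTH)**alpha; alpha ~ 1.3 is a typical value
    for short-channel devices (an assumption here).
    """
    return vdd / (vdd - vth) ** alpha

# One decade more leakage per 0.1 V of VTH reduction:
ratio = subthreshold_leakage(0.3) / subthreshold_leakage(0.4)
print(f"Leakage increase for 0.1 V lower VTH: {ratio:.1f}x")

# Lowering VDD increases delay; lowering VTH recovers speed:
d_nominal = relative_delay(vdd=1.2, vth=0.4)
d_low_vdd = relative_delay(vdd=0.9, vth=0.4)
d_low_both = relative_delay(vdd=0.9, vth=0.3)
print(f"Delay penalty from VDD 1.2 -> 0.9 V: {d_low_vdd / d_nominal:.2f}x")
print(f"After also lowering VTH 0.4 -> 0.3 V: {d_low_both / d_nominal:.2f}x")
```

Running the sketch shows the leakage ratio coming out at 10x per 0.1 V, and the delay penalty of a reduced VDD being largely recovered by the lower VTH, which is exactly the bargain the text describes and the reason the subthreshold leakage problem arises.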