Soft errors have emerged as an important reliability challenge for nanoscale very large-scale integration (VLSI) designs. In this paper, we present a fast and efficient soft error rate (SER) analysis methodology for combinational circuits. We first present a novel parametric waveform model based on the Weibull function to represent particle strikes at individual nodes in the circuit. We then describe the construction of a descriptor object that efficiently captures the correlation between the transient waveforms and their associated rate distribution functions. The proposed algorithm consists of operations to inject, propagate, and merge these descriptors while traversing forward through the gates of the circuit. The parameterized waveforms enable an efficient static approach to calculating the SER of a circuit. We exercise the proposed approach on a wide variety of combinational circuits and observe that our algorithm's runtime scales linearly with circuit size. The runtimes for soft error estimation were observed to be on the order of 1 s, compared to several minutes or even hours for previously proposed methods.
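As a rough illustration of such a parametric waveform model, the Python sketch below shapes a particle-strike glitch as the difference of two Weibull cumulative distribution functions. This is a minimal sketch under stated assumptions, not the paper's exact parameterization: the names transient_waveform, amplitude, t_rise, t_fall, the fixed shape parameter, and the difference-of-CDFs construction are all illustrative.

```python
import math

def weibull_cdf(t, scale, shape):
    """Weibull CDF: 1 - exp(-(t / scale)**shape) for t > 0, else 0."""
    if t <= 0.0:
        return 0.0
    return 1.0 - math.exp(-((t / scale) ** shape))

def transient_waveform(t, amplitude, t_rise, t_fall, shape=2.0):
    """Voltage glitch at time t, built as a fast rising Weibull CDF
    minus a slower falling one (an assumed construction): the pulse
    rises, peaks, and settles back to zero as both CDFs approach 1."""
    return amplitude * (weibull_cdf(t, t_rise, shape)
                        - weibull_cdf(t, t_fall, shape))

# Sample a hypothetical 0.8 V glitch over 200 ps (all values assumed).
pulse = [transient_waveform(ps * 1e-12, 0.8, t_rise=10e-12, t_fall=50e-12)
         for ps in range(0, 201, 10)]
```

Because the whole waveform is carried by a handful of scalar parameters rather than a sampled trace, propagating and merging such descriptors through gates stays cheap, which is what makes a static, full-circuit SER pass tractable.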
In this paper, we propose a new sensitivity-based, statistical gate sizing method. Since circuit optimization affects the entire shape of the circuit delay distribution, it is difficult to capture the quality of a distribution with a single metric. Hence, we first introduce a new objective function that provides an effective measure of the quality of a delay distribution for both ASIC and high-performance designs. We then propose an efficient and exact sensitivity-based pruning algorithm built on a newly proposed theory of perturbation bounds. Next, we introduce a heuristic approach for sensitivity computation that relies on efficient computation of statistical slack. Finally, we show how the pruning and statistical-slack-based approaches can be combined to obtain nearly identical results to those of the brute-force approach, with an average runtime improvement of up to 89x. We also compare the optimization results against those of a deterministic optimizer and show an improvement of up to 16% in the 99th-percentile circuit delay and of up to 31% in the standard deviation for the same circuit area.
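The Python sketch below illustrates the general shape of sensitivity-based sizing, not the paper's algorithm: it greedily upsizes the gate with the best objective gain per unit area. The evaluate callback is a hypothetical stand-in for an SSTA engine, the mean-plus-k-sigma objective is an assumed substitute for the paper's metric, and the perturbation-bound pruning that makes the real method fast is only noted in a comment.

```python
def objective(mean_delay, sigma_delay, k=3.0):
    """Scalar quality metric for a delay distribution; mean + k * sigma
    is an illustrative stand-in for the paper's objective function."""
    return mean_delay + k * sigma_delay

def greedy_statistical_sizing(gates, evaluate, area_budget):
    """Greedy sensitivity-based sizing sketch.

    gates       -- iterable of gate identifiers
    evaluate    -- hypothetical callback: evaluate(upsized_set) runs SSTA
                   and returns (mean_delay, sigma_delay, total_area)
    area_budget -- maximum allowed total area
    """
    upsized = set()
    mean, sigma, area = evaluate(upsized)
    best = objective(mean, sigma)
    while True:
        candidates = []
        # The paper's perturbation-bound pruning would discard most of
        # these candidates before the expensive evaluate() call; this
        # brute-force loop is kept only for clarity.
        for g in set(gates) - upsized:
            m, s, a = evaluate(upsized | {g})
            gain = best - objective(m, s)
            if gain > 0 and a <= area_budget:
                # Sensitivity: objective improvement per unit area cost.
                candidates.append((gain / max(a - area, 1e-12), g, m, s, a))
        if not candidates:
            return upsized
        _, g, m, s, a = max(candidates, key=lambda c: c[0])
        upsized.add(g)
        best, area = objective(m, s), a
```

The per-candidate evaluate() call is exactly the cost that exact pruning avoids: if an upper bound on a candidate's gain cannot beat the current best, the full statistical re-evaluation is skipped.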
Statistical static timing analysis (SSTA) has become a key method for analyzing the effect of process variation in aggressively scaled CMOS technologies. Much research has focused on the modeling of spatial correlation in SSTA. However, the vast majority of these works used artificially generated process data to test the proposed models. Hence, it is difficult to determine the actual effectiveness of these methods, the conditions under which they are necessary, and whether they lead to a significant increase in accuracy that warrants their increased runtime and complexity. In this paper, we study five different correlation models and their associated SSTA methods using 35,420 critical dimension (CD) measurements extracted from 23 reticles on 5 wafers in a 130 nm CMOS process. Based on the measured CD data, we analyze correlation as a function of distance and generate five distinct correlation models, ranging from simple models that incorporate one or two variation components to more complex models that utilize principal component analysis and quadtrees. We then study the accuracy of the different models and compare their SSTA results with the result of running STA directly on the extracted data. We also examine the trade-off between model accuracy and runtime, as well as the impact of die size on model accuracy. We show that, especially for small dies (smaller than 6.6 mm × 5.7 mm), the simple models provide accuracy comparable to that of the more complex ones while incurring significantly less runtime and implementation difficulty. The results of this study demonstrate that correlation models for SSTA must be carefully tested on actual process data and must be used judiciously.
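To make the modeling machinery concrete, the Python sketch below builds a distance-based correlation model, assembles the covariance matrix over a few gate locations, and eigendecomposes it into independent components, the same mechanism that principal-component-based SSTA models rely on. All numeric values (rho0, decay, sigma, coordinates) are assumptions for illustration, not the measured 130 nm data.

```python
import numpy as np

def correlation(distance, rho0=0.6, decay=2000.0):
    """Assumed distance-based correlation: rho0 * exp(-d / decay).
    rho0 and decay are illustrative, not fitted to the paper's data."""
    return rho0 * np.exp(-distance / decay)

def covariance_matrix(coords, sigma):
    """Covariance of CD variation across gate locations under the model."""
    n = len(coords)
    cov = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                cov[i, j] = sigma ** 2
            else:
                d = np.hypot(coords[i][0] - coords[j][0],
                             coords[i][1] - coords[j][1])
                cov[i, j] = sigma ** 2 * correlation(d)
    return cov

# Hypothetical gate locations (um) and CD sigma (nm).
coords = [(0, 0), (1000, 0), (0, 1500), (4000, 3000)]
cov = covariance_matrix(coords, sigma=2.5)

# Principal component form: each gate's CD variation becomes a weighted
# sum of independent standard normals z_k,
#   CD_i = sum_k eigvecs[i, k] * sqrt(eigvals[k]) * z_k,
# which lets SSTA propagate a few independent variables instead of a
# fully correlated vector.
eigvals, eigvecs = np.linalg.eigh(cov)
```

The trade-off the study quantifies is visible even here: the full covariance-plus-PCA route costs matrix construction and eigendecomposition, while a one- or two-component model (die-to-die plus independent random) needs none of it.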