We consider a type of nonnormal approximation of infinitely divisible distributions that incorporates compound Poisson, Gamma, and normal distributions. The approximation relies on matching cumulants to higher orders in order to obtain higher rates of approximation error decay. The parameters of the approximation are easy to fix. In many cases, the computational complexity of sampling from the approximating distribution is of the same order as that of normal approximation. Error bounds in terms of the total variation distance are derived. Both the univariate and the multivariate cases of the approximation are considered.

is 0.4785 ([28], p. 71). Also, in some symmetric cases, since the third cumulants of $X_r$ and the Gaussian random variable are 0, $|\kappa|_{3,X_r}$ in the bound can be more or less replaced with $|\kappa|_{4,X_r}$, while the power of $\kappa_{2,X_r}$ is raised to 2 [1]. From the pattern of the bound, one could guess that, if $X_r$ and some $Y_r$ have the same cumulants of order $1, \ldots, q-1$ with $q \ge 5$, then $X_r$ could be approximated by $Y_r$ with the error bounded by $C(r)(|\kappa|_{q,X_r} + |\kappa|_{q,Y_r})/\kappa_{2,X_r}^{q/2}$, where $C(r)$ is a near-constant, at least when $r$ is small, or, most ideally, a universal constant which may depend on the dimension of $X$ but is not very large. Elementary calculations indicate that in many cases the above bound vanishes at a genuinely higher rate than the bound for normal approximation as $r \to 0+$. Since the $q$th cumulant of a Gaussian random variable is 0, the above bound, if true, is consistent with the bound for normal approximation. Even by some rough analysis of characteristic functions, there is good reason to expect that the bound is true in both the univariate and the multivariate cases. However, before attempting to work out the details, perhaps one should first ask whether such an approximation can possibly be implemented easily. The meaning of the question is twofold.
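To see heuristically where the rate gain could come from, consider the following sketch (illustrative only; it assumes $X_r$ collects the jumps of size at most $r$ of a Lévy process with Lévy measure $\nu$ and no Gaussian part, and that $|\kappa|_{q,Y_r}$ is of the same order as $|\kappa|_{q,X_r}$):

```latex
% For q >= 2 the cumulants of X_r are integrals against the
% truncated Lévy measure, which yields a crude pointwise bound:
\[
  \kappa_{q,X_r} = \int_{|x|\le r} x^{q}\,\nu(\mathrm{d}x)
  \quad\Longrightarrow\quad
  |\kappa|_{q,X_r} \le r^{\,q-2}\,\kappa_{2,X_r},
\]
% so the conjectured bound is controlled by a power of r/sqrt(kappa_2):
\[
  \frac{|\kappa|_{q,X_r}}{\kappa_{2,X_r}^{q/2}}
  \le \Bigl(\frac{r}{\sqrt{\kappa_{2,X_r}}}\Bigr)^{q-2}.
\]
```

Under these assumptions the conjectured bound decays like $(r/\sqrt{\kappa_{2,X_r}})^{q-2}$, whereas the Berry–Esseen-type bound for normal approximation corresponds to the case $q = 3$ and decays like $r/\sqrt{\kappa_{2,X_r}}$; whenever $r/\sqrt{\kappa_{2,X_r}} \to 0$, larger $q$ gives a strictly faster rate.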
First, the distribution of the approximating random variable should be easy to identify; preferably, it is i.d. Second, the approximating random variable should be easy to sample; preferably, the computational complexity of the sampling is of the same order as that of normal approximation. If the answer to the question is positive, then the next question is how large $q$ can be. It can be anticipated that the larger $q$ is, the faster the error of approximation vanishes as $r \to 0+$. Once the answers to the two questions are in place, a wide range of available techniques can potentially be modified to establish the error bound (e.g. [3, 5, 8, 28]).

Clearly, cumulant matching is equivalent to moment matching. Indeed, our proof of the above type of bound will eventually be based on moment matching. However, thanks to the Lévy–Khintchine representation, it is more natural and convenient to work with cumulants than with moments. We shall show that it is fairly easy to construct approximating i.d. random variables with matching cumulants up to at least the fourth order; in other words, we can get at least $q = 5$. In many important cases, we can get $q = 6$, and in the symmetric ...
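To make cumulant matching with an i.d. family concrete, here is a small numerical sketch (ours for illustration, not the construction developed in this paper): a shifted Gamma variable $c + \Gamma(a, \theta)$ is i.d., trivial to sample, and has cumulants $\kappa_1 = c + a\theta$ and $\kappa_n = a\,(n-1)!\,\theta^n$ for $n \ge 2$, so its shift, shape, and scale can be solved in closed form to match any target $\kappa_1$, $\kappa_2 > 0$, $\kappa_3 > 0$.

```python
import math

def fit_shifted_gamma(k1, k2, k3):
    """Match the first three cumulants (k2 > 0, k3 > 0) with
    c + Gamma(shape=a, scale=t), whose cumulants are
    k1 = c + a*t,  k2 = a*t**2,  k3 = 2*a*t**3."""
    t = k3 / (2.0 * k2)   # scale, from k3/k2 = 2*t
    a = k2 / t**2         # shape, from k2 = a*t**2
    c = k1 - a * t        # shift, from k1 = c + a*t
    return c, a, t

def shifted_gamma_cumulants(c, a, t, n=4):
    """First n cumulants of c + Gamma(a, t):
    kappa_1 = c + a*t, kappa_j = a*(j-1)!*t**j for j >= 2."""
    ks = [c + a * t]
    for j in range(2, n + 1):
        ks.append(a * math.factorial(j - 1) * t**j)
    return ks

# Match target cumulants (0.1, 0.04, 0.02); the fourth cumulant
# of the fitted variable is then determined: 3*k3**2/(2*k2).
c, a, t = fit_shifted_gamma(0.1, 0.04, 0.02)
print(shifted_gamma_cumulants(c, a, t))  # [0.1, 0.04, 0.02, 0.015]
```

The first three cumulants are reproduced exactly, while the fourth is forced to $3\kappa_3^2/(2\kappa_2)$; matching a freely chosen fourth cumulant as well requires a richer i.d. family, which is the kind of construction discussed in the text.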