This paper presents an efficient approach to rendering refracting transmissive objects with multi-scale surface roughness under distant illumination. To correctly capture both fine-scale surface details and large-scale appearance, and to enable real-time processing at various viewing resolutions, we first divide the surface roughness into three levels: micro-scale, meso-scale, and macro-scale. Each scale of roughness is modeled and evaluated using a different strategy, and the overall roughness is approximated by their spherical convolution. This representation is then incorporated into a microfacet-based BTDF model, and multi-scale rough refractions are simulated on both the front and back sides of an object as light enters and exits it. In particular, non-linear filtering methods are applied to both macro-scale geometry and meso-scale bumps to reduce aliasing when the object is viewed across a range of distances. Finally, experimental results show that our approach produces resolution-dependent refraction effects that match supersampled ground truth, while achieving a speedup of several orders of magnitude with hardware acceleration.
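The abstract states that the overall roughness is approximated by the spherical convolution of the three per-scale roughness levels. One common way such a convolution is approximated in practice, when each scale's NDF is treated as a Gaussian-like lobe, is to add the angular variances of the lobes, so the effective roughness is the root-sum-square of the per-scale roughnesses. The sketch below illustrates that variance-addition idea only; the function name, the parameterization, and the Gaussian-lobe assumption are illustrative and not necessarily the paper's exact formulation.

```python
import math

def combined_roughness(micro: float, meso: float, macro: float) -> float:
    """Approximate the spherical convolution of three roughness lobes.

    Assumption (illustrative, not the paper's exact model): each scale's
    NDF is a Gaussian-like lobe whose angular variance is roughness**2,
    and convolving the lobes adds their variances.
    """
    return math.sqrt(micro**2 + meso**2 + macro**2)

# Example: a smooth macro surface with moderate micro- and meso-scale
# roughness yields an intermediate effective roughness.
r = combined_roughness(0.3, 0.4, 0.0)
print(r)  # 0.5
```

Under this approximation the combination is symmetric in the three scales and reduces to the single-scale roughness when the other two are zero, which is consistent with the abstract's claim that the per-scale models are evaluated separately and then merged.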