The paper presents a simulation study of the impact of long-range dependence (LRD) and short-range dependence (SRD) on a simple queueing system. It shows that, while an analytically tractable SRD model may suffice to capture the long-term loss probability for short buffers, the effect of LRD can be significant for systems with large buffers. The paper's main contribution is to highlight this effect through simulations driven by synthetic traces (generated by traditional autoregressive (AR) models and by an LRD chaotic-map generator), by measured data, and by shuffled versions of both.
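The comparison described above can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the paper's actual experimental setup: it generates an ON/OFF arrival trace from a one-dimensional intermittency map (a common chaotic-map LRD source; the parameters d, m, the buffer sizes, and the 0.9 target utilization are all assumptions), feeds it to a slotted finite-buffer fluid queue, and compares the loss probability against a shuffled copy of the same trace, which preserves the marginal load but destroys the temporal correlations.

```python
import random

def chaotic_map_trace(n, d=0.7, m=1.8, seed=1):
    """ON/OFF arrival trace from a one-dimensional intermittency map.

    For m in (1.5, 2) the sojourn times in the laminar (OFF) region
    are heavy-tailed, which makes the trace long-range dependent.
    The source emits one packet per slot whenever the state exceeds d.
    """
    rng = random.Random(seed)
    x = rng.random()
    c = (1.0 - d) / d ** m            # keeps the map inside (0, 1)
    trace = []
    for _ in range(n):
        if x < d:
            x = x + c * x ** m        # laminar region: slow drift upward
        else:
            x = (x - d) / (1.0 - d)   # reinjection back into (0, 1)
        x = min(max(x, 1e-12), 1.0 - 1e-12)  # avoid the fixed point at 0
        trace.append(1 if x > d else 0)
    return trace

def loss_probability(arrivals, buffer, service):
    """Long-term loss probability of a slotted finite-buffer fluid queue.

    Each slot drains `service` units of work, then the slot's arrivals
    join the queue; anything above `buffer` overflows and is lost.
    """
    q = lost = offered = 0.0
    for a in arrivals:
        offered += a
        q = max(0.0, q - service) + a
        if q > buffer:
            lost += q - buffer
            q = buffer
    return lost / offered if offered else 0.0

# Compare the original (correlated) trace against a shuffled copy that
# keeps the same marginal load but destroys the temporal correlations.
arr = chaotic_map_trace(100_000)
service = (sum(arr) / len(arr)) / 0.9     # target utilization ~0.9 (assumed)
shuffled = arr[:]
random.Random(2).shuffle(shuffled)
for buf in (5, 50, 500):
    print(buf, loss_probability(arr, buf, service),
          loss_probability(shuffled, buf, service))
```

For small buffers the two loss estimates tend to be close, while for large buffers the correlated trace typically keeps losing packets long after the shuffled one has stopped, which is the qualitative effect the paper demonstrates.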