With the aim of explaining the formal development behind the chaos-based modeling of network traffic and other similar phenomena, we generalize here the tools presented in the companion paper (Setti et al., 2002).
Manuscript received July 3, 2001; revised November 25, 2001.
R. Rovatti is with CEG-ARCES, University of Bologna, 40136 Bologna, Italy (e-mail: r.rovatti@chaos.cc).
G. Mazzini and G. Setti are with DEC-DI, University of Ferrara, 44100 Ferrara, Italy (e-mail: g.mazzini@ieee.org; gsetti@ing.unife.it), and also with CEG-ARCES, University of Bologna, 40136 Bologna, Italy.
A. Giovanardi is with CED-DI, University of Ferrara, 44100 Ferrara, Italy (e-mail: agiovanardi@ing.unife.it).
Publisher Item Identifier S 0018-9219(02)05243-X.

I. SCENARIOS FOR ADVANCED APPLICATIONS

Modern information engineering deals with systems made of a population of active and often intelligent units that are deeply interconnected and interacting. In fact, this paradigm applies to many of the most fundamental technologies supporting today's information society, as well as to many of the most advanced proposals for its evolution: from Ethernet-based local area networks (LANs) to frame-relay or asynchronous transfer mode (ATM) geographic links, from the Internet and its protocols to wireless sensor networks, and from distributed memory or processing hierarchies to distributed cooperative computation.

The complexity of the activities carried out in these systems, and the number of independent and often unpredictable entities interacting with them, forces the extensive use of statistical tools for their modeling and design.
Hence, the future successful development of the related technologies will depend on our ability to characterize stochastic processes of increasing complexity.

One of the most significant examples comes from digital communication networks, where the discovery of self-similar or fractal processes opened a definite chasm between classical traffic and queuing analysis and the most recent trends and methods. As discussed in [1], modeling techniques in this field were forced to evolve from classical Poisson processes, to renewal processes with finite variance of interarrival times, to Markov chains administering the transitions between some of those renewal processes. The present state of this evolution is the widely accepted second-order self-similar model [1], [2], from which most recent network traffic analyses originate.

A key step in this analysis was the discovery that complex self-similar behaviors can be interpreted as the superposition of elementary binary (ON-OFF) processes featuring heavy-tailed (i.e., very slowly decaying) distributions of ON or OFF times [3], [4]. Researchers interested in an intuitive explanation of the nontrivial features of aggregated traffic identify each ON-OFF process with a different session-based network activity. Here, we are more interested in the mathematical structure that these processes provide to the overall process. In fact, we will describe how such an observation can be at least partially systematized in a gen...
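To make the ON-OFF construction of [3], [4] concrete, the following Python sketch superposes independent binary sources whose ON and OFF durations are drawn from a heavy-tailed Pareto distribution (tail index alpha < 2 gives infinite variance, the regime that yields self-similar aggregates). All function names and parameter values here are illustrative choices of ours, not notation from the papers cited above.

```python
import random

def aggregate_traffic(n_sources, horizon, alpha=1.5, seed=0):
    """Superpose n_sources independent ON-OFF processes whose ON and OFF
    durations are Pareto(alpha, x_m=1) distributed (heavy-tailed for
    alpha < 2). Returns the number of simultaneously active (ON) sources
    at each of `horizon` unit time steps."""
    rng = random.Random(seed)
    load = [0] * horizon
    for _ in range(n_sources):
        t = 0.0
        on = rng.random() < 0.5  # random initial state for each source
        while t < horizon:
            # Inverse-transform sampling of a Pareto duration (x_m = 1):
            # dur = (1 - U)^(-1/alpha) >= 1, with P(dur > d) = d^(-alpha)
            dur = (1.0 - rng.random()) ** (-1.0 / alpha)
            if on:
                # mark every unit time slot covered by this ON interval
                for k in range(int(t), min(horizon, int(t + dur))):
                    load[k] += 1
            t += dur
            on = not on
    return load

# Example: 50 sources over 1000 time steps; the resulting load series
# exhibits the bursty, long-range-dependent behavior discussed above.
load = aggregate_traffic(50, 1000, alpha=1.5, seed=1)
```

Plotting (or computing the variance of) block averages of `load` over growing block sizes is one informal way to observe the slow variance decay characteristic of second-order self-similarity.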