Entropy is used in physics to measure the disorder of matter (see [Feynman et al., 1989]): the second law of thermodynamics states that the entropy of an isolated system always increases or remains constant. For instance, if you lock a gas in a room, it is first in a disorganized state (particles may be accumulated in some places); as time passes, the gas tends to organize itself more uniformly. During this type of relaxation, the entropy is a physical measurement that changes: it increases.

Entropy has also been used in mathematics and information theory to measure the order of any mapping or message (see [Arnold, 2011] or [Shannon, 1948]). It is now used in statistics (see [Billingsley, 1978]) to discover relationships between variables, using the fact that when you associate a variable X to a variable Y, the system (Y, X) is less disorganized than Y alone, meaning that X can be used to explain Y (for instance in a regression, see Section A.10).

The analogy between gas in a room and liquidity in a market is not difficult to draw: if the microstructure followed the laws of matter, the liquidity, like the gas, would naturally spread over all available venues and fill the room uniformly. Comparing the current entropy of the liquidity to its maximum then gives clues about how far the microstructure is from a 100% relaxed state.

The formula for entropy used to define the FEI (Fragmentation Efficiency Index) models the microstructure as N trading venues, assuming that in a relaxed state the liquidity could find its way to
any of them at the same cost. The "ideal" repartition of liquidity over such N pools should then be 1/N of it in each of them.

Once the repartition of liquidity is measured in each trading venue by quantities $q_n$ summing to one (for instance using the market share $M(n)$ defined in Section 1.1.1 of Chapter 1), the formula of the entropy of the configuration $(q_1, \ldots, q_N)$ can be applied:
$$\mathcal{E}(q_1, \ldots, q_N) = -\sum_{1 \le n \le N} q_n \log q_n \qquad \text{(A.1)}$$
(by convention we take $0 \cdot \log 0 = 0$). It can easily be seen that:
- when only one liquidity pool is available, the entropy of the system is zero;
- when all the liquidity is concentrated into one pool only (i.e. for some $n$, $q_n = 1$ and for the others $q_n = 0$), then the entropy of the system is zero;
- when the same amount of liquidity is in each pool (i.e. for any $n$, $q_n = 1/N$), then
$$\mathcal{E}(1/N, \ldots, 1/N) = -\sum_{1 \le n \le N} \frac{1}{N} \log \frac{1}{N} = \log N. \qquad \text{(A.2)}$$

The repartition maximizing the entropy can also be found, for instance, by solving the following maximization program:
$$\text{Maximize } -\sum_{1 \le n \le N} q_n \log q_n,$$
$$\text{Variables: } (q_1, \ldots, q_N), \qquad \text{Constraint: } \sum_n q_n = 1.$$
Using a Lagrange multiplier $\lambda$ to express the fact that at the extremum the slope of the criterion to maximize is tangent to the constraint, it gives $\forall n,\ -\log q_n - 1 + \lambda = 0$, i.e. $\log q_n = \lambda - 1$. All the $q_n$ being equal implies that their value is $q_n = 1/N$ for all of them. Added to this, Equation (A.2) says that the maximum possible entropy is $\log N$. This allows us to define the FEI as the entropy of the repartition normalized by this maximum:
$$\mathrm{FEI}(q_1, \ldots, q_N) = \frac{-\sum_{1 \le n \le N} q_n \log q_n}{\log N}.$$
The FEI value is in [0, 1], with a minimum value of 0 when the liquidity is highly concentrated on few pools, and 1 when it is spread uniformly over all of them.
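As a sketch of the computations above, the entropy of Equation (A.1) and the index built from it can be written in a few lines of Python. The function names `entropy` and `fei` are illustrative choices, and the normalization by log N is this sketch's reading of the text (which pins the FEI down through its range [0, 1] and the maximum-entropy argument):

```python
import math

def entropy(q):
    """Entropy of a liquidity repartition (q_1, ..., q_N), Equation (A.1),
    with the convention 0 * log 0 = 0 (zero shares are simply skipped)."""
    return -sum(q_n * math.log(q_n) for q_n in q if q_n > 0.0)

def fei(q):
    """Fragmentation Efficiency Index: the entropy of the repartition
    normalized by its maximum log N, so that the value lies in [0, 1].
    (Normalizing by log N is an assumption read off the surrounding text.)"""
    N = len(q)
    if N == 1:
        return 0.0  # a single pool: no fragmentation to measure
    return entropy(q) / math.log(N)

# All the liquidity concentrated in one of four pools: zero entropy, FEI = 0.
concentrated = [1.0, 0.0, 0.0, 0.0]
assert entropy(concentrated) == 0.0 and fei(concentrated) == 0.0

# The same amount of liquidity in each pool: entropy log N, FEI = 1.
uniform = [0.25, 0.25, 0.25, 0.25]
assert math.isclose(entropy(uniform), math.log(4))
assert math.isclose(fei(uniform), 1.0)

# An intermediate repartition of market shares lies strictly in between.
assert 0.0 < fei([0.6, 0.2, 0.1, 0.1]) < 1.0
```

In practice the shares q_n would come from observed market shares M(n) per venue; the assertions above simply check the three limiting cases listed in the text.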