“…This has been highlighted perhaps most prominently in recent work on neural network models, in which the model complexity and the data size grow together. For this reason, the double asymptotic regime where n, N → ∞ with N/n → c, a constant, is a particularly interesting (and likely more realistic) limit, despite being technically more challenging [48,51,21,15,37,32,5]. In particular, working in this regime allows for a finer quantitative assessment of machine learning systems as a function of their relative complexity N/n, as well as for a precise description of the under- to over-parameterized “phase transition” (which does not appear in an analysis where N → ∞ alone).…”