While the equilibrium properties, states, and phase transitions of interacting systems are well described by statistical mechanics, the lack of suitable state parameters has hindered the understanding of nonequilibrium phenomena in diverse settings, from glasses to driven systems to biology. The length of a losslessly compressed data file is a direct measure of its information content: The more ordered the data file is, the lower its information content and the shorter the length of its encoding can be made. Here, we describe how data compression enables the quantification of order in nonequilibrium and equilibrium many-body systems, both discrete and continuous, even when the underlying form of order is unknown. We consider absorbing state models on and off lattice, as well as a system of active Brownian particles undergoing motility-induced phase separation. The technique reliably identifies nonequilibrium phase transitions, determines their character, quantitatively predicts certain critical exponents without prior knowledge of the order parameters, and reveals previously unknown ordering phenomena. This technique should provide a quantitative measure of organization in condensed matter and other systems exhibiting collective phase transitions in and out of equilibrium.
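The central idea, that the length of a losslessly compressed configuration is a parameter-free proxy for order, can be illustrated in a few lines. This is a toy sketch using `zlib` on a binary lattice configuration, not the authors' actual analysis pipeline:

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)

def compressed_size(config):
    """Length in bytes of the zlib-compressed byte string of a configuration."""
    return len(zlib.compress(config.astype(np.uint8).tobytes(), level=9))

n_sites = 4096
ordered = np.zeros(n_sites, dtype=np.uint8)                     # fully ordered lattice
disordered = rng.integers(0, 2, size=n_sites, dtype=np.uint8)   # random occupancy

size_ordered = compressed_size(ordered)
size_disordered = compressed_size(disordered)
# The ordered configuration compresses to far fewer bytes: compressed
# length tracks disorder without knowing the order parameter in advance.
```

In a phase-transition study one would track this compressed size across a control parameter (density, drive amplitude) and look for the kink or singularity that marks the transition.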
Machine learning techniques are being increasingly used as flexible non-linear fitting and prediction tools in the physical sciences. Fitting functions that exhibit multiple solutions as local minima can be analysed in terms of the corresponding machine learning landscape. Methods to explore and visualise molecular potential energy landscapes can be applied to these machine learning landscapes to gain new insight into the solution space involved in training and the nature of the corresponding predictions. In particular, we can define quantities analogous to molecular structure, thermodynamics, and kinetics, and relate these emergent properties to the structure of the underlying landscape. This Perspective aims to describe these analogies with examples from recent applications, and suggest avenues for new interdisciplinary research.
Electron transfer from TiO2 to iodine/iodide electrolytes proceeds via reduction of either I3− or uncomplexed I2 (free iodine), but which route predominates has not previously been determined. By measuring the electron lifetime while independently varying the free-iodine and I3− concentrations, we find that the lifetime is correlated with the free-iodine concentration and independent of the I3− concentration. This trend supports the hypothesis that electron recombination to the electrolyte occurs predominantly by reduction of free iodine rather than of triiodide.
The theoretical analysis of many problems in physics, astronomy and applied mathematics requires an efficient numerical exploration of multimodal parameter spaces that exhibit broken ergodicity. Monte Carlo methods are widely used to deal with these classes of problems, but such simulations suffer from a ubiquitous sampling problem: the probability of sampling a particular state is proportional to its entropic weight. Devising an algorithm capable of efficiently sampling the full phase space is a long-standing problem. Here we report a new hybrid method for the exploration of multimodal parameter spaces exhibiting broken ergodicity. Superposition enhanced nested sampling (SENS) combines the strengths of global optimization with the unbiased/athermal sampling of nested sampling, greatly enhancing its efficiency with no additional parameters. We report extensive tests of this new approach for atomic clusters that are known to have energy landscapes for which conventional sampling schemes suffer from broken ergodicity. We also introduce a novel parallelization algorithm for nested sampling.
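The nested sampling component that SENS builds on can be sketched minimally. This toy version uses a uniform prior on a box and rejection sampling for the constrained draw; production codes (including SENS, which additionally exploits known minima) use constrained MCMC walks instead. The quadratic energy function is an illustrative choice, not one from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def nested_sampling(energy, dim, n_live=100, n_iter=500):
    """Bare-bones nested sampling inside the uniform prior [-1, 1]^dim.
    Each iteration discards the highest-energy live point and replaces it
    with a fresh point drawn below that energy ceiling, so the sequence of
    ceilings contracts monotonically toward the global minimum."""
    live = rng.uniform(-1.0, 1.0, size=(n_live, dim))
    energies = np.array([energy(x) for x in live])
    ceilings = []
    for _ in range(n_iter):
        worst = np.argmax(energies)
        e_max = energies[worst]
        ceilings.append(e_max)
        # Rejection sampling from the energy-constrained prior; this is the
        # step that real implementations replace with smarter moves.
        while True:
            x = rng.uniform(-1.0, 1.0, size=dim)
            e = energy(x)
            if e < e_max:
                break
        live[worst], energies[worst] = x, e
    return np.array(ceilings)

ceilings = nested_sampling(lambda x: float(np.sum(x**2)), dim=2)
```

Each discarded ceiling corresponds to a prior-volume shrinkage factor of roughly `n_live/(n_live+1)`, which is what makes the sampling athermal and lets one reconstruct the density of states afterwards.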
We present a numerical calculation of the total number of disordered jammed configurations Ω of N repulsive, three-dimensional spheres in a fixed volume V. To make these calculations tractable, we increase the computational efficiency of the approach of Xu et al. [Phys. Rev. Lett. 106, 245502 (2011)] and Asenjo et al. [Phys. Rev. Lett. 112, 098002 (2014)] and we extend the method to allow computation of the configurational entropy as a function of pressure. The approach that we use computes the configurational entropy by sampling the absolute volume of basins of attraction of the stable packings in the potential energy landscape. We find a surprisingly strong correlation between the pressure of a configuration and the volume of its basin of attraction in the potential energy landscape. This relation is well described by a power law. Our methodology to compute the number of minima in the potential energy landscape should be applicable to a wide range of other enumeration problems in statistical physics, string theory, cosmology, and machine learning that aim to find the distribution of the extrema of a scalar cost function that depends on many degrees of freedom.
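The notion of sampling basin-of-attraction volumes can be demonstrated on a toy one-dimensional landscape: minimize from uniformly drawn starting points and count how often each minimum is reached. This is a hypothetical double-well example with plain gradient descent, far simpler than the high-dimensional sphere-packing machinery of the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

def descend(x, grad, step=1e-2, n_steps=2000):
    """Plain gradient descent to the local minimum whose basin contains x."""
    for _ in range(n_steps):
        x = x - step * grad(x)
    return x

# Toy energy landscape E(x) = x^4 - 2x^2 + 0.5x on [-2, 2]: an asymmetric
# double well with minima near x = -1.06 and x = +0.94.
grad = lambda x: 4 * x**3 - 4 * x + 0.5   # dE/dx

starts = rng.uniform(-2.0, 2.0, size=2000)
minima = np.array([descend(x, grad) for x in starts])

# The fraction of starting points that flow into each minimum estimates the
# relative volume of its basin of attraction.
frac_left = np.mean(minima < 0)
```

The paper's contribution is computing *absolute* basin volumes in very high dimension, where such naive frequency counting is no longer sufficient, but the frequency picture above is the conceptual starting point.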
In the late 1980s, Sam Edwards proposed a possible statistical-mechanical framework to describe the properties of disordered granular materials 1. A key assumption underlying the theory was that all jammed packings are equally likely. In the intervening years it has never been possible to test this bold hypothesis directly. Here we present simulations that provide direct evidence that at the unjamming point, all packings of soft repulsive particles are equally likely, even though, generically, jammed packings are not. Typically, jammed granular systems are observed precisely at the unjamming point, since grains are not very compressible. Our results therefore support Edwards' original conjecture. We also present evidence that at unjamming the configurational entropy of the system is maximal.

In science, most breakthroughs cannot be derived from known physical laws: they are based on inspired conjectures 2. Comparing the predictions based on such a hypothesis with experiment allows us to eliminate conjectures that are clearly wrong. However, there is a distinction between testing the consequences of a conjecture and testing the conjecture itself. A case in point is Edwards' theory of granular media. In the late 1980s, Edwards and Oakeshott 1 proposed that many of the physical properties of granular materials ('powders') could be predicted using a theoretical framework that was based on the assumption that all distinct packings of such a material are equally likely to be observed. The logarithm of the number of such packings was postulated to play the same role as entropy does in Gibbs' statistical-mechanical description of the thermodynamic properties of equilibrium systems. However, statistical-mechanical entropy and granular entropy are very different objects.
Until now, the validity of Edwards' hypothesis could not be tested directly, mainly because the number of packings involved is so large that direct enumeration is utterly infeasible, and, as a consequence, the debate about the Edwards hypothesis has focused on its consequences rather than on its assumptions. Here we present results that show that it is now, at last, possible to test Edwards' hypothesis directly by numerical simulation. To our own surprise, we find that the hypothesis appears to be correct precisely at the point where a powder is just at the (un)jamming threshold. However, at higher densities, the hypothesis fails. At the unjamming transition, the configurational entropy of jammed states appears to be at a maximum.

The concept of 'ensembles' plays a key role in equilibrium statistical mechanics, as developed by J. Willard Gibbs well over a century ago 3. The crucial assumption that Gibbs made to arrive at a tractable theoretical framework describing the equilibrium properties of gases, liquids and solids was that, at a fixed total energy, every state of the system is equally likely to be observed. The distinction between, say, a liquid at thermal equilibrium and a granular material is that in a liquid, atoms undergo thermal motion whereas...
We propose an efficient Monte Carlo method for the computation of the volumes of high-dimensional bodies with arbitrary shape. We start with a region of known volume within the interior of the body and then use the multi-state Bennett acceptance-ratio method to compute the dimensionless free-energy difference between a series of equilibrium simulations performed within this object. The method produces results that are in excellent agreement with thermodynamic integration, as well as a direct estimate of the associated statistical uncertainties. The histogram method also allows us to directly obtain an estimate of the interior radial probability density profile, thus yielding useful insight into the structural properties of such a high-dimensional body. We illustrate the method by analysing the effect of structural disorder on the basins of attraction of mechanically stable packings of soft repulsive spheres.
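The strategy of anchoring a volume estimate on an interior region of known volume can be shown in its simplest, single-stage form: estimate the volume of a 5-dimensional ball from the fraction of uniform samples that land inside an inscribed cube whose volume is known exactly. This is a conceptual stand-in; the paper chains many intermediate states together via the multi-state Bennett acceptance-ratio method, which this sketch does not implement:

```python
import math
import numpy as np

rng = np.random.default_rng(2)

def sample_unit_ball(n, d):
    """Uniform samples inside the d-dimensional unit ball
    (Gaussian direction, radius drawn as U^(1/d))."""
    x = rng.normal(size=(n, d))
    x /= np.linalg.norm(x, axis=1, keepdims=True)
    return x * rng.uniform(size=(n, 1)) ** (1.0 / d)

d = 5
side = 2.0 / math.sqrt(d)       # largest axis-aligned cube inscribed in the ball
known_volume = side ** d        # the interior region whose volume we know exactly
pts = sample_unit_ball(200_000, d)
frac = np.mean(np.all(np.abs(pts) <= side / 2.0, axis=1))
estimate = known_volume / frac  # V(ball) = V(cube) / P(sample lands in cube)

exact = math.pi ** (d / 2) / math.gamma(d / 2 + 1)
```

In genuinely high dimension the overlap between a single known region and the body becomes exponentially small, which is exactly why the paper stages the calculation through a series of simulations rather than relying on one ratio.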
Proteins require high developability—quantified by expression, solubility, and stability—for robust utility as therapeutics, diagnostics, and in other biotechnological applications. Measuring traditional developability metrics is low throughput in nature, often slowing the developmental pipeline. We evaluated the ability of 10 variations of three high-throughput developability assays to predict the bacterial recombinant expression of paratope variants of the protein scaffold Gp2. Enabled by a phenotype/genotype linkage, assay performance for 10^5 variants was calculated via deep sequencing of populations sorted by proxied developability. We identified the most informative assay combination via cross-validation accuracy and correlation feature selection and demonstrated the ability of machine learning models to exploit nonlinear mutual information to increase the assays’ predictive utility. We trained a random forest model that predicts expression from assay performance that is 35% closer to the experimental variance and trains 80% more efficiently than a model predicting from sequence information alone. Utilizing the predicted expression, we performed a site-wise analysis and predicted mutations consistent with enhanced developability. The validated assays offer the ability to identify developable proteins at unprecedented scales, reducing the bottleneck of protein commercialization.
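The claim that models can exploit nonlinear mutual information between assay scores can be illustrated with entirely synthetic data: when the target depends on an interaction between two features, a model with access to that interaction term explains variance that a purely additive model cannot. The assay scores, expression values, and the least-squares stand-in for the paper's random forest are all hypothetical here:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical setup: two assay scores per variant, and an "expression"
# level driven by their product (a nonlinear interaction) plus noise.
n = 1000
assay1 = rng.uniform(size=n)
assay2 = rng.uniform(size=n)
expression = assay1 * assay2 + 0.05 * rng.normal(size=n)

def r_squared(features, target):
    """Coefficient of determination of an ordinary least-squares fit."""
    coef, *_ = np.linalg.lstsq(features, target, rcond=None)
    resid = target - features @ coef
    return 1.0 - resid.var() / target.var()

additive = np.column_stack([np.ones(n), assay1, assay2])
with_interaction = np.column_stack([np.ones(n), assay1, assay2, assay1 * assay2])

r2_additive = r_squared(additive, expression)
r2_interaction = r_squared(with_interaction, expression)
# The interaction feature captures variance the additive model misses,
# which is what tree ensembles like random forests learn automatically.
```

A random forest discovers such interactions without hand-crafted features, which is why the paper's forest outperforms models that treat the assays independently.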