Abstract. We consider the stationary state of a Markov process on a bipartite system from the perspective of stochastic thermodynamics. One subsystem is used to extract work from a heat bath while being affected by the second subsystem. We show that the latter allows for a transparent and thermodynamically consistent interpretation of a Maxwell's demon. Moreover, we obtain an integral fluctuation theorem involving the transfer entropy from one subsystem to the other. Comparing three different inequalities, we show that the entropy decrease of the first subsystem provides a tighter bound on the rate of extracted work than both the rate of transfer entropy from this subsystem to the demon and the heat dissipated through the dynamics of the demon. The latter two rates cannot be ordered by an inequality, as we show with the illustrative example of a four-state system.
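The four-state bipartite setting above can be sketched numerically. The snippet below is a minimal sketch, not the paper's model: the rates are illustrative random placeholders. It builds a four-state bipartite rate matrix (each jump changes only one subsystem), finds the stationary distribution, and evaluates the total entropy production rate, which the second law requires to be non-negative.

```python
import numpy as np

# Minimal sketch (rates are illustrative placeholders, not the paper's model):
# four states (x, y) with x the working subsystem and y the demon; the
# bipartite structure means every jump changes x or y, never both.
rng = np.random.default_rng(1)
idx = lambda x, y: 2 * x + y
W = np.zeros((4, 4))                              # W[a, b]: rate of jump a -> b
x_jumps = [((0, 0), (1, 0)), ((0, 1), (1, 1))]    # y held fixed
y_jumps = [((0, 0), (0, 1)), ((1, 0), (1, 1))]    # x held fixed
for a, b in x_jumps + y_jumps:
    W[idx(*a), idx(*b)] = rng.uniform(0.5, 2.0)
    W[idx(*b), idx(*a)] = rng.uniform(0.5, 2.0)
for s in range(4):
    W[s, s] = -W[s].sum()

# stationary distribution: left null vector of W
evals, evecs = np.linalg.eig(W.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals))])
pi /= pi.sum()

# total entropy production rate (in units of k_B); second law: sigma >= 0
sigma = sum(pi[a] * W[a, b] * np.log(pi[a] * W[a, b] / (pi[b] * W[b, a]))
            for a in range(4) for b in range(4) if a != b and W[a, b] > 0)
print(f"stationary entropy production rate: {sigma:.4f}")
```

Keeping the x- and y-jumps in separate lists makes the bipartite constraint explicit; the partial sums over each list give the subsystem contributions discussed in the abstract.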
We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor, we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell, since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show, inter alia, that a bacterium in an environment that changes on a very slow time scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that, while the activity learns about the external signal, the option of changing the methylation level increases the concentration range for which the learning rate is substantial.
For sensory networks, we determine the rate at which they acquire information about the changing external conditions. Comparing this rate with the thermodynamic entropy production that quantifies the cost of maintaining the network, we find that there is no universal bound restricting the rate of obtaining information to be less than this thermodynamic cost. These results are obtained within a general bipartite model consisting of a stochastically changing environment that affects the instantaneous transition rates within the system. Moreover, they are illustrated with a simple four-state model motivated by cellular sensing. On the technical level, we obtain an upper bound on the rate of mutual information analytically and calculate this rate with a numerical method that estimates the entropy of a time series generated with a simulation.
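The numerical recipe can be sketched as follows; the rate matrix, time step, and trajectory length below are illustrative choices, not those of the paper. One simulates the bipartite dynamics, discretizes in time, and combines plug-in conditional-entropy estimates of the two marginal time series and of the joint series into an estimate of the mutual information rate.

```python
import numpy as np

# Sketch of the numerical method (all parameters illustrative): estimate the
# rate of mutual information as (h_x + h_y - h_xy) / dt, where each h is a
# first-order plug-in conditional-entropy estimate from a simulated series.
rng = np.random.default_rng(0)

# 4 states s = 2*x + y; transitions change x or y only (bipartite).
W = np.zeros((4, 4))
for a, b in [(0, 2), (1, 3), (0, 1), (2, 3)]:     # x-flips, then y-flips
    W[a, b], W[b, a] = rng.uniform(0.5, 2.0, 2)
for s in range(4):
    W[s, s] = -W[s].sum()

dt, n = 0.05, 200_000
P = np.eye(4) + dt * W          # first-order propagator, valid for small dt
P = np.clip(P, 0, None); P /= P.sum(1, keepdims=True)

traj = np.empty(n, dtype=int); traj[0] = 0
for t in range(1, n):
    traj[t] = rng.choice(4, p=P[traj[t - 1]])

def cond_entropy_rate(series):
    """Plug-in estimate of H(s_{t+1} | s_t) from a discrete series."""
    pairs = series[:-1] * (series.max() + 1) + series[1:]
    joint = np.bincount(pairs).astype(float); joint /= joint.sum()
    cur = np.bincount(series[:-1]).astype(float); cur /= cur.sum()
    H = lambda q: -np.sum(q[q > 0] * np.log(q[q > 0]))
    return H(joint) - H(cur)

h_xy = cond_entropy_rate(traj)
h_x  = cond_entropy_rate(traj // 2)      # x-marginal series
h_y  = cond_entropy_rate(traj % 2)       # y-marginal series
mi_rate = (h_x + h_y - h_xy) / dt        # nats per unit time
print(f"estimated mutual information rate: {mi_rate:.4f}")
```

By subadditivity of conditional entropy, the estimate is non-negative up to floating-point error even at finite sample size; longer trajectories and higher-order blocks reduce its bias.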
For a general sensory system following an external stochastic signal, we introduce the sensory capacity. This quantity characterizes the performance of a sensor: sensory capacity is maximal if the instantaneous state of the sensor has as much information about a signal as the whole time series of the sensor. We show that adding a memory to the sensor increases the sensory capacity. This increase quantifies the improvement of the sensor with the addition of the memory. Our results are obtained within the framework of stochastic thermodynamics of bipartite systems, which allows for the definition of an efficiency that relates the rate at which the sensor learns about the signal to the energy dissipated by the sensor, which is given by the thermodynamic entropy production. We demonstrate a general trade-off between sensory capacity and efficiency: if the sensory capacity is equal to its maximal value of 1, then the efficiency must be less than 1/2. As a physical realization of a sensor we consider a two-component cellular network estimating a fluctuating external ligand concentration as signal. This model leads to coupled linear Langevin equations that allow us to obtain explicit analytical results.
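A minimal sketch of such coupled linear Langevin equations, with hypothetical parameter values: the signal is an Ornstein–Uhlenbeck process and the sensor relaxes linearly toward it; Euler–Maruyama integration then gives the stationary signal–sensor correlation.

```python
import numpy as np

# Hedged sketch (parameters are illustrative, not the paper's): coupled pair
#   ds = -(s / tau) dt + sqrt(2 Ds) dW1     (signal: Ornstein-Uhlenbeck)
#   dx = -k (x - s) dt + sqrt(2 Dx) dW2     (sensor tracks the signal)
rng = np.random.default_rng(2)
tau, k, Ds, Dx = 1.0, 5.0, 1.0, 0.2
dt, n = 1e-3, 400_000

s = np.empty(n); x = np.empty(n); s[0] = x[0] = 0.0
xi1, xi2 = rng.standard_normal((2, n - 1)) * np.sqrt(dt)
for t in range(n - 1):                      # Euler-Maruyama integration
    s[t + 1] = s[t] - (s[t] / tau) * dt + np.sqrt(2 * Ds) * xi1[t]
    x[t + 1] = x[t] - k * (x[t] - s[t]) * dt + np.sqrt(2 * Dx) * xi2[t]

burn = n // 10                              # discard transient
corr = np.corrcoef(s[burn:], x[burn:])[0, 1]
print(f"steady-state signal-sensor correlation: {corr:.3f}")
```

Because the system is linear, the stationary covariances also follow analytically from a Lyapunov equation, which makes this a convenient test bed for the trade-off discussed above.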
Relaxation and first passage processes are the pillars of kinetics in condensed-matter, polymeric, and single-molecule systems. Yet an explicit connection between relaxation and first passage time scales has so far remained elusive. Here we prove a duality between them in the form of an interlacing of spectra. In its basic form the duality holds for first passage of reversible Markov processes to effectively one-dimensional targets. The exploration of a triple-well potential is analyzed to demonstrate how the duality allows for an intuitive understanding of first passage trajectories in terms of relaxational eigenmodes. More generally, we provide a comprehensive explanation of the full statistics of reactive trajectories in rugged potentials, including the so-called 'few-encounter limit'. Our results are required for explaining quantitatively the occurrence of diseases triggered by protein misfolding.
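The interlacing structure can be sketched for a discretized triple-well landscape; the potential and discretization below are chosen purely for illustration. Making the target site absorbing removes one row and column of the symmetrized generator, so Cauchy interlacing of the eigenvalues of a symmetric matrix and its principal submatrix produces the spectral interlacing between relaxation and first-passage rates.

```python
import numpy as np

# Illustrative sketch: a reversible birth-death chain discretizing diffusion
# in a triple-well potential (beta = 1). The relaxation spectrum consists of
# the eigenvalues of the symmetrized generator; making the rightmost site
# absorbing keeps a principal submatrix, whose eigenvalues (the first-passage
# rates) interlace the relaxation rates by the Cauchy interlacing theorem.
N = 150
x = np.linspace(-3.0, 3.0, N)
U = 0.05 * x**2 * (x**2 - 4.0)**2            # three minima: x = 0, +-2

up = np.exp(-(U[1:] - U[:-1]) / 2)           # rate i -> i+1
dn = np.exp(-(U[:-1] - U[1:]) / 2)           # rate i+1 -> i

H = np.zeros((N, N))                         # minus the symmetrized generator
for i in range(N - 1):
    H[i, i + 1] = H[i + 1, i] = -np.sqrt(up[i] * dn[i])
    H[i, i]         += up[i]
    H[i + 1, i + 1] += dn[i]

relax = np.sort(np.linalg.eigvalsh(H))             # relaxation rates, >= 0
fp = np.sort(np.linalg.eigvalsh(H[:-1, :-1]))      # target site absorbing

interlaced = all(relax[k] - 1e-10 <= fp[k] <= relax[k + 1] + 1e-10
                 for k in range(N - 1))
print("spectra interlace:", interlaced)      # prints: spectra interlace: True
```

The smallest relaxation rate is zero (the stationary mode), while the smallest first-passage rate is strictly positive, consistent with the interlacing ordering.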