Article

Fluctuation Theorem of Information Exchange between Subsystems that Co-Evolve in Time

Department of Mathematics, Kwangwoon University, Seoul 01897, Korea
Symmetry 2019, 11(3), 433; https://doi.org/10.3390/sym11030433
Submission received: 27 February 2019 / Revised: 21 March 2019 / Accepted: 22 March 2019 / Published: 22 March 2019
(This article belongs to the Special Issue Current Trends in Symmetric Polynomials with their Applications)

Abstract

Sagawa and Ueda established a fluctuation theorem of information exchange by revealing the role of correlations in stochastic thermodynamics, and thereby unified the non-equilibrium thermodynamics of measurement and feedback control. They considered a process in which a non-equilibrium system exchanges information with other degrees of freedom, such as an observer or a feedback controller. They proved the fluctuation theorem of information exchange under the assumption that the state of the other degrees of freedom that exchange information with the system does not change over time while the state of the system evolves. Here we relax this constraint and prove that the same form of the fluctuation theorem holds even if both subsystems co-evolve during the information exchange process. This result may extend the applicability of the fluctuation theorem of information exchange to a broader class of non-equilibrium processes, such as dynamic coupling in biological systems, where subsystems that exchange information interact with each other.

1. Introduction

Biological systems possess information processing mechanisms for their survival and heredity [1,2,3]. They, for example, sense external ligand concentrations [4,5], transmit information through signaling networks [6,7,8], and coordinate gene expression [9] by secreting and sensing signaling molecules [10]. Cells even implement time integration by copying states of the environment into molecular states inside the cells to reduce their sensing errors [11,12]. It is therefore crucial to reveal the role of information in thermodynamics in order to properly understand complex biological information processes.
Historically, information entered the realm of thermodynamics under the name of Maxwell’s demon. The demon observes the speed of molecules in a box that is divided into two portions by a partition containing a small hole, and, by opening/closing the hole without expenditure of work, lets only the fast particles pass from the lower half of the box to the upper half and only the slow particles pass from the upper half to the lower half (see Figure 1a). This raises the temperature of the upper half of the box and lowers that of the lower half, indicating that the second law of thermodynamics, which implies that heat flows spontaneously from hotter to colder places, might hypothetically be violated [13]. This paradox shows that information can affect the thermodynamics of a physical system, or, in other words, that information is a physical element [14].
Szilard devised a much simpler model that captures the essential role of information in Maxwell’s thought experiment. The Szilard engine consists of a single particle in a box surrounded by a heat reservoir of constant temperature. A cycle of the engine begins with inserting a partition in the middle of the box. Depending on whether the particle is in the left half or the right half of the box, one controls a lever such that a weight can be lifted while the partition moves quasi-statically in the direction that the particle pushes (see Figure 1b). When the partition reaches an end of the box, it is removed, and a new cycle begins again with inserting a partition at the center. Since the energy required for lifting the weight comes from the heat reservoir, this engine corresponds to a perpetual-motion machine of the second kind, in which the single heat reservoir is spontaneously cooled and the corresponding thermal energy is converted into mechanical work cyclically, which is prohibited by the second law of thermodynamics [15].
Szilard interpreted the coupling between the location of the particle and the direction of the lever as a sort of memory faculty and pointed out that this coupling is what enables work to be extracted from the heat reservoir. He inferred, therefore, that establishing the coupling must be accompanied by a production of entropy (dissipation of heat into the environment) that compensates for the heat lost from the reservoir. In [16], Sagawa and Ueda proved this idea in the form of a fluctuation theorem of information exchange, generalizing the second law of thermodynamics by taking information into account:
$$\left\langle e^{-\sigma + \Delta I} \right\rangle = 1, \qquad (1)$$
where σ is the entropy production of a system X, and ΔI is the change in mutual information between system X and another system Y, such as a demon, during a process λ_t for 0 ≤ t ≤ τ. Here the angled brackets ⟨·⟩ denote the ensemble average over all microscopic trajectories of X and over all states of Y. By Jensen’s inequality [17], Equation (1) implies
$$\langle \sigma \rangle \ge \langle \Delta I \rangle. \qquad (2)$$
This indeed tells us that establishing a correlation between the two subsystems, ⟨ΔI⟩ > 0, accompanies an entropy production, ⟨σ⟩ > 0, and that expenditure of this correlation, ⟨ΔI⟩ < 0, serves as a source of entropy decrease, ⟨σ⟩ < 0. In proving this theorem, they assumed that the state of system Y does not evolve in time. This assumption causes no problem for simple models of measurement and feedback control. In biological systems, however, it is not unusual for both subsystems that exchange information to co-evolve in time. For example, transmembrane receptor proteins transmit signals through thermodynamic coupling between extracellular ligands and the conformation of the intracellular parts of the receptors during a dynamic allosteric transition [18,19]. In this paper, we relax the constraint that Sagawa and Ueda assumed, and generalize the fluctuation theorem of information exchange to more involved situations, where the two subsystems can influence each other so that the states of both systems co-evolve in time.

2. Results

2.1. Theoretical Framework

We consider a finite classical stochastic system composed of subsystems X and Y that are in contact with a heat reservoir of inverse temperature β ≡ 1/(k_B T), where k_B is the Boltzmann constant and T is the temperature of the reservoir. We allow both systems X and Y to be driven far from equilibrium by changing an external parameter λ_t during time 0 ≤ t ≤ τ [20,21,22]. We assume that the time evolutions of subsystems X and Y are described by classical stochastic dynamics from t = 0 to t = τ along trajectories {x_t} and {y_t}, respectively, where x_t (y_t) denotes a specific microstate of X (Y) at time t on each trajectory. Since both trajectories fluctuate, we repeat the process λ_t with an appropriate initial joint probability distribution p_0(x, y) over all microstates (x, y) of systems X and Y. The joint probability distribution p_t(x, y) then evolves for 0 ≤ t ≤ τ. Let p_t(x) := ∫ p_t(x, y) dy and p_t(y) := ∫ p_t(x, y) dx be the corresponding marginal probability distributions. We assume
$$p_0(x, y) \neq 0 \quad \text{for all } (x, y), \qquad (3)$$
so that we have p_t(x, y) ≠ 0, p_t(x) ≠ 0, and p_t(y) ≠ 0 for all x and y during 0 ≤ t ≤ τ.
Now, the entropy production σ during the process λ_t for 0 ≤ t ≤ τ is given by
$$\sigma := \Delta s + \beta Q_b, \qquad (4)$$
where Δs is the sum of the changes in stochastic entropy along {x_t} and {y_t}, and Q_b is the heat dissipated into the reservoir (the entropy production in the reservoir) [23,24]. In detail, we have
$$\Delta s := \Delta s_x + \Delta s_y, \quad \Delta s_x := -\ln p_\tau(x_\tau) + \ln p_0(x_0), \quad \Delta s_y := -\ln p_\tau(y_\tau) + \ln p_0(y_0). \qquad (5)$$
We note that the stochastic entropy s[p_t(∘)] := −ln p_t(∘) of microstate ∘ at time t can be interpreted as the uncertainty of occurrence of ∘ at time t: the greater the probability that state ∘ occurs, the smaller the uncertainty of its occurrence.
Now we consider situations where system X exchanges information with system Y during the process λ_t. By this we mean that the trajectory {x_t} of system X evolves depending on the trajectory {y_t} of system Y. The information I_t at time t between x_t and y_t is then characterized by the reduction in the uncertainty of x_t given y_t [16]:
$$I_t(x_t, y_t) := s[p_t(x_t)] - s[p_t(x_t \mid y_t)] = \ln \frac{p_t(x_t, y_t)}{p_t(x_t)\, p_t(y_t)}, \qquad (6)$$
where p_t(x_t | y_t) is the conditional probability of x_t given y_t. We note that this quantity is called the (time-dependent form of the) thermodynamic coupling function [19]. The larger the value of I_t(x_t, y_t), the more information is shared between x_t and y_t for their occurrence. We note that I_t(x_t, y_t) vanishes if x_t and y_t are independent at time t, and that the average of I_t(x_t, y_t) with respect to p_t(x_t, y_t) over all microstates is the mutual information between the two subsystems, which is greater than or equal to zero [17].
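For a discrete joint distribution, Equation (6) is straightforward to evaluate numerically. The following minimal sketch (the two-state joint distribution is hypothetical, chosen only for illustration) computes the pointwise values I_t(x_t, y_t) and their average, the mutual information:

```python
import numpy as np

# Hypothetical joint distribution p_t(x, y) of two binary subsystems at time t.
p_xy = np.array([[0.40, 0.10],
                 [0.15, 0.35]])

p_x = p_xy.sum(axis=1)  # marginal p_t(x)
p_y = p_xy.sum(axis=0)  # marginal p_t(y)

# Pointwise information I_t(x, y) = ln[ p_t(x, y) / (p_t(x) p_t(y)) ], Equation (6).
I_t = np.log(p_xy / np.outer(p_x, p_y))
print(I_t)  # individual entries may be negative

# The p_t-weighted average of I_t is the mutual information, which is >= 0 [17].
print((p_xy * I_t).sum())
```

Note that while individual entries of I_t can be negative, their average is always non-negative, consistent with the remark above.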

2.2. Proof of Fluctuation Theorem of Information Exchange

Now we are ready to prove the fluctuation theorem of information exchange in this general setup. We define the reverse process λ′_t := λ_{τ−t} for 0 ≤ t ≤ τ, where the external parameter is time-reversed [25,26]. We set the initial probability distribution p′_0(x, y) of the reverse process to be the final (time t = τ) probability distribution of the forward process, p_τ(x, y), so that we have
$$p'_0(x) = \int p'_0(x, y)\, dy = \int p_\tau(x, y)\, dy = p_\tau(x), \quad p'_0(y) = \int p'_0(x, y)\, dx = \int p_\tau(x, y)\, dx = p_\tau(y). \qquad (7)$$
Then, by Equation (3), we have p′_t(x, y) ≠ 0, p′_t(x) ≠ 0, and p′_t(y) ≠ 0 for all x and y during 0 ≤ t ≤ τ. We also consider the time-reversed conjugate of each {x_t} and {y_t} for 0 ≤ t ≤ τ as follows:
$$\{x'_t\} := \{x^{*}_{\tau - t}\}, \quad \{y'_t\} := \{y^{*}_{\tau - t}\}, \qquad (8)$$
where * denotes momentum reversal. The microscopic reversibility condition connects the time-reversal symmetry of the microscopic dynamics to non-equilibrium thermodynamics, and reads in this framework as follows [23,27,28,29]:
$$\frac{p(\{x_t\}, \{y_t\} \mid x_0, y_0)}{p'(\{x'_t\}, \{y'_t\} \mid x'_0, y'_0)} = e^{\beta Q_b}, \qquad (9)$$
where p({x_t}, {y_t} | x_0, y_0) is the joint probability of the paths {x_t} and {y_t} conditioned on the initial microstates x_0 and y_0, and p′({x′_t}, {y′_t} | x′_0, y′_0) is the corresponding quantity for the reverse process. Now we have the following:
$$\frac{p(\{x_t\}, \{y_t\})}{p'(\{x'_t\}, \{y'_t\})} = \frac{p(\{x_t\}, \{y_t\} \mid x_0, y_0)}{p'(\{x'_t\}, \{y'_t\} \mid x'_0, y'_0)} \cdot \frac{p_0(x_0, y_0)}{p'_0(x'_0, y'_0)} \qquad (10)$$

$$= \frac{p(\{x_t\}, \{y_t\} \mid x_0, y_0)}{p'(\{x'_t\}, \{y'_t\} \mid x'_0, y'_0)} \cdot \frac{p_0(x_0, y_0)}{p_0(x_0)\, p_0(y_0)} \cdot \frac{p'_0(x'_0)\, p'_0(y'_0)}{p'_0(x'_0, y'_0)} \cdot \frac{p_0(x_0)}{p'_0(x'_0)} \cdot \frac{p_0(y_0)}{p'_0(y'_0)} \qquad (11)$$

$$= \exp\left\{ \beta Q_b - I_\tau(x_\tau, y_\tau) + I_0(x_0, y_0) + \Delta s_x + \Delta s_y \right\} \qquad (12)$$

$$= \exp\left\{ \sigma - \Delta I \right\}. \qquad (13)$$
To obtain Equation (11) from Equation (10), we multiply Equation (10) by p_0(x_0) p_0(y_0)/[p_0(x_0) p_0(y_0)] and p′_0(x′_0) p′_0(y′_0)/[p′_0(x′_0) p′_0(y′_0)], both of which equal 1. We obtain Equation (12) by applying Equations (5)–(7) and (9) consecutively to Equation (11). Finally, we set ΔI := I_τ(x_τ, y_τ) − I_0(x_0, y_0) and use Equation (4) to obtain Equation (13) from Equation (12).
We note that Equation (13) generalizes to co-evolving subsystems the detailed fluctuation theorem in the presence of information exchange that was proved in [16]. Now we obtain the generalized version of Equation (1) by using Equation (13) as follows:
$$\left\langle e^{-\sigma + \Delta I} \right\rangle = \iint e^{-\sigma + \Delta I}\, p(\{x_t\}, \{y_t\})\, d\{x_t\}\, d\{y_t\} = \iint p'(\{x'_t\}, \{y'_t\})\, d\{x'_t\}\, d\{y'_t\} = 1. \qquad (14)$$
Here we use the fact that there is a one-to-one correspondence between the forward and reverse paths due to the time-reversal symmetry of the underlying microscopic dynamics, so that d{x_t} = d{x′_t} and d{y_t} = d{y′_t} [30].
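The last equality in Equation (14) is, at bottom, a statement of normalization: by Equation (13), e^{−σ+ΔI} equals the ratio p′/p path by path, so its forward average re-sums the reverse path measure. The following toy check illustrates this mechanism with arbitrary, hypothetical discrete trajectory ensembles (a sketch abstracting away the physics, under the stated one-to-one correspondence):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward and reverse path probabilities over a finite set of
# trajectory pairs ({x_t}, {y_t}); both normalized and nowhere zero,
# mirroring the assumption in Equation (3).
n = 1000
p_fwd = rng.random(n) + 1e-12
p_fwd /= p_fwd.sum()
p_rev = rng.random(n) + 1e-12
p_rev /= p_rev.sum()

# By Equation (13), exp(-sigma + Delta I) equals p'/p path by path.
ratio = p_rev / p_fwd

# Averaging over the forward ensemble just re-sums the reverse one:
# <exp(-sigma + Delta I)> = sum_i p_fwd[i] * (p_rev[i] / p_fwd[i]) = 1.
print(np.sum(p_fwd * ratio))  # 1.0 up to floating-point rounding
```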

2.3. Corollary

Before discussing a corollary, we remark on one point: we have used notation similar to that of Sagawa and Ueda in [16], but there is an important difference. Their entropy production σ_su reads as follows:
$$\sigma_{\mathrm{su}} := \Delta s_{\mathrm{su}} + \beta Q_b,$$
where Δs_su := Δs_x. In [16], system X is in contact with the heat reservoir but system Y is not, and system Y does not evolve over time; thus they considered the entropy production in system X and the bath only. In this paper, both systems X and Y are in contact with the reservoir, and system Y also evolves in time, so both subsystems as well as the heat bath contribute to the entropy production, as expressed in Equations (4) and (5). Keeping this difference in mind, we apply Jensen’s inequality to Equation (14) to obtain
$$\langle \sigma \rangle \ge \langle \Delta I \rangle. \qquad (15)$$
This tells us, firstly, that establishing a correlation between X and Y accompanies entropy production and, secondly, that an established correlation serves as a source of entropy decrease.
Now, as a corollary, we refine the generalized fluctuation theorem in Equation (14) by including energetic terms. To this end, we define the local free energy F_x of system X at x_t and F_y of system Y at y_t as follows:
$$F_x(x_t, t) := E_x(x_t, t) - T\, s[p_t(x_t)], \quad F_y(y_t, t) := E_y(y_t, t) - T\, s[p_t(y_t)], \qquad (16)$$
where E_x and E_y are the internal energies of systems X and Y, respectively, and s[p_t(∘)] := −ln p_t(∘) is the stochastic entropy [23,24]. Here T is the temperature of the heat bath, and the argument t indicates the dependency of each term on the external parameter λ_t. During the process λ_t, the work done on the systems is expressed by the first law of thermodynamics as follows:
$$W := \Delta E + Q_b, \qquad (17)$$
where ΔE is the change in the internal energy of the systems. If we assume that systems X and Y are weakly coupled, in the sense that the interaction energy between X and Y is negligible compared to their internal energies, we may write
$$\Delta E := \Delta E_x + \Delta E_y, \qquad (18)$$
where ΔE_x := E_x(x_τ, τ) − E_x(x_0, 0) and ΔE_y := E_y(y_τ, τ) − E_y(y_0, 0) [31]. We rewrite Equation (12) by adding and subtracting the changes in internal energy, ΔE_x of X and ΔE_y of Y, as follows:
$$\frac{p(\{x_t\}, \{y_t\})}{p'(\{x'_t\}, \{y'_t\})} = \exp\left\{ \beta (Q_b + \Delta E_x + \Delta E_y) - \Delta I - (\beta \Delta E_x - \Delta s_x) - (\beta \Delta E_y - \Delta s_y) \right\} \qquad (19)$$

$$= \exp\left\{ \beta (W - \Delta F_x - \Delta F_y) - \Delta I \right\}, \qquad (20)$$
where we have applied Equations (16)–(18) consecutively to Equation (19) to obtain Equation (20). Here ΔF_x := F_x(x_τ, τ) − F_x(x_0, 0) and ΔF_y := F_y(y_τ, τ) − F_y(y_0, 0); by Equations (5) and (16), βΔF_x = βΔE_x − Δs_x and βΔF_y = βΔE_y − Δs_y, which is what turns Equation (19) into Equation (20). Now we obtain the fluctuation theorem of information exchange with energetic terms as follows:
$$\left\langle e^{-\beta (W - \Delta F_x - \Delta F_y) + \Delta I} \right\rangle = \iint e^{-\beta (W - \Delta F_x - \Delta F_y) + \Delta I}\, p(\{x_t\}, \{y_t\})\, d\{x_t\}\, d\{y_t\} = \iint p'(\{x'_t\}, \{y'_t\})\, d\{x'_t\}\, d\{y'_t\} = 1, \qquad (21)$$
which generalizes known relations in the literature [31,32,33,34,35,36]. We note that Equation (21) holds under the weak-coupling assumption between systems X and Y during the process λ_t. By Jensen’s inequality, Equation (21) implies
$$\langle W \rangle \ge \langle \Delta F_x \rangle + \langle \Delta F_y \rangle + \frac{\langle \Delta I \rangle}{\beta}. \qquad (22)$$
We remark that ⟨ΔF_x⟩ + ⟨ΔF_y⟩ in Equation (22) is a difference of non-equilibrium free energies, which is different from the change in equilibrium free energy that appears in similar relations in the literature [32,33,34,35,36].

3. Examples

3.1. Measurement

Let X be a device (or a demon) that measures the state of another system, and let Y be the measured system, both of which are in contact with a heat bath of inverse temperature β (see Figure 2a). We consider a dynamic measurement process, described as follows: X and Y are prepared separately in equilibrium such that they are not correlated initially, i.e., I_0(x_0, y_0) = 0 for all x_0 and y_0. At time t = 0, device X is put in contact with system Y, and the coupling of X and Y builds up through their (weak) interaction until time t = τ, at which point a single measurement process finishes. We note that system Y is allowed to evolve in time during the process. Since each process fluctuates, we repeat the measurement many times to obtain the probability distribution p_t(x, y) for 0 ≤ t ≤ τ.
A distinguishing feature of the framework in this paper is that the mutual information I_t(x_t, y_t) in Equation (6) yields the time-varying amount of information established during the dynamic coupling process, unlike other approaches, which either provide the amount of information at a fixed time [31,36,37] or keep one of the systems fixed during the coupling process [16]. For example, let us assume that the probability distribution p_t(x_t, y_t) at an intermediate time t is as shown in Table 1.
Then we have the following:
$$I_t(x_t = 0, y_t = 0) = \ln\frac{1/3}{(1/2) \cdot (1/2)} = \ln(4/3), \quad I_t(x_t = 0, y_t = 1) = \ln\frac{1/6}{(1/2) \cdot (1/2)} = \ln(2/3),$$
$$I_t(x_t = 1, y_t = 0) = \ln\frac{1/6}{(1/2) \cdot (1/2)} = \ln(2/3), \quad I_t(x_t = 1, y_t = 1) = \ln\frac{1/3}{(1/2) \cdot (1/2)} = \ln(4/3), \qquad (23)$$
so that ⟨ΔI⟩ = (1/3) ln(4/3) + (1/6) ln(2/3) + (1/6) ln(2/3) + (1/3) ln(4/3) ≈ ln(1.06). Thus, by Equation (15), we obtain the lower bound on the average entropy production for the coupling that has been established up to time t from the uncorrelated initial state: ⟨σ⟩ ≥ ⟨ΔI⟩ ≈ ln 1.06. If there is no measurement error at the final time τ, such that p_τ(x_τ = 0, y_τ = 1) = p_τ(x_τ = 1, y_τ = 0) = 0 and p_τ(x_τ = 0, y_τ = 0) = p_τ(x_τ = 1, y_τ = 1) = 1/2, then we may have ⟨σ⟩ ≥ ⟨ΔI⟩ = ln 2, which is greater than ln 1.06.
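These numbers are easy to reproduce; the following sketch (the helper `avg_info` is ours, introduced only for this check) evaluates Equation (23) and both entropy bounds:

```python
import numpy as np

def avg_info(p):
    """p_t-weighted average of the pointwise information for a joint table p."""
    px, py = p.sum(axis=1), p.sum(axis=0)
    mask = p > 0                      # zero-probability entries contribute nothing
    indep = np.outer(px, py)
    return (p[mask] * np.log(p[mask] / indep[mask])).sum()

# Joint distribution at the intermediate time t (Table 1); marginals are (1/2, 1/2).
p_t = np.array([[1/3, 1/6],
                [1/6, 1/3]])
print(np.log(p_t / 0.25))           # pointwise values of Equation (23)
print(avg_info(p_t), np.log(1.06))  # <Delta I> ~ 0.0566 ~ ln(1.06), since I_0 = 0

# Error-free measurement at the final time tau: all weight on the diagonal.
p_tau = np.array([[1/2, 0.0],
                  [0.0, 1/2]])
print(avg_info(p_tau), np.log(2))   # both ~0.6931, hence <sigma> >= ln 2
```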

3.2. Feedback Control

Unlike the case in [16], we need not exchange subsystems X and Y to consider feedback control after the measurement. Thus we proceed to feedback control immediately after each measurement process ends at time τ (see Figure 2b). We assume that the correlation I_τ(x_τ, y_τ) at time τ is given by the values in Equation (23) and that the final correlation at a later time τ′ is zero, i.e., I_τ′(x_τ′, y_τ′) = 0. By feedback control we mean that the external parameter λ_t for τ ≤ t ≤ τ′ is manipulated in a pre-determined manner [16] while systems X and Y co-evolve in time, such that the established correlation is used as a source of work and I_t(x_t, y_t) for τ ≤ t ≤ τ′ decreases, not necessarily monotonically. Equation (21) provides an exact relation for the energetics of this process. We rewrite its corollary, Equation (22), in terms of the extractable work W_ext := −W, as follows:
$$\langle W_{\mathrm{ext}} \rangle \le -\left( \langle \Delta F_x \rangle + \langle \Delta F_y \rangle \right) - \frac{\langle \Delta I \rangle}{\beta}. \qquad (24)$$
Then the work extractable on top of the conventional bound, −(⟨ΔF_x⟩ + ⟨ΔF_y⟩), is additionally given by −⟨ΔI⟩/β = ln(1.06)/β, which comes from the consumption of the established correlation.
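To put this extra bound in physical units, −⟨ΔI⟩/β = k_B T ln(1.06); a back-of-the-envelope evaluation (the temperature T = 300 K is an assumed value for illustration, not from the paper):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant in J/K
T = 300.0           # assumed bath temperature in K (illustrative)

# Extra extractable work from consuming the correlation of Section 3.1,
# on top of the conventional free-energy bound in Equation (24).
extra = k_B * T * log(1.06)
print(f"{extra:.2e} J")  # ~2.4e-22 J per measurement-feedback cycle
```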

4. Conclusions

We have proved the fluctuation theorem of information exchange, Equation (14), which holds even when the two systems that exchange information co-evolve in time. Equation (14) tells us that establishing a correlation between two systems necessarily accompanies entropy production, to which both systems and the heat reservoir contribute, as expressed in Equations (4) and (5). We have also proved, as a corollary of Equation (14), the fluctuation theorem of information exchange with energetic terms, Equation (21), under the assumption of weak coupling between the two subsystems. Equation (21) reveals the exact relationship between the non-equilibrium free energies of both subsystems and the mutual information that is established/consumed through their interaction. This framework, more general than that in [16], enables us to apply the thermodynamics of information to biological systems, where molecules generate/consume correlations through their information-processing mechanisms [4,5,6]. Since the new framework is applicable to fully non-equilibrium situations, thermodynamic coupling during a dynamic allosteric transition, for example, may be analyzed within this theoretical framework, beyond the current equilibrium thermodynamic approach [18,19].

Funding

L.J. was supported by the National Research Foundation of Korea grant funded by the Korean Government (NRF-2010-0006733, NRF-2012R1A1A2042932, NRF-2016R1D1A1B02011106), and in part by Kwangwoon University Research Grant in 2016.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Hartwell, L.H.; Hopfield, J.J.; Leibler, S.; Murray, A.W. From molecular to modular cell biology. Nature 1999, 402, C47.
  2. Crofts, A.R. Life, information, entropy, and time: Vehicles for semantic inheritance. Complexity 2007, 13, 14–50.
  3. Cheong, R.; Rhee, A.; Wang, C.J.; Nemenman, I.; Levchenko, A. Information transduction capacity of noisy biochemical signaling networks. Science 2011, 334, 354–358.
  4. McGrath, T.; Jones, N.S.; ten Wolde, P.R.; Ouldridge, T.E. Biochemical Machines for the Interconversion of Mutual Information and Work. Phys. Rev. Lett. 2017, 118, 028101.
  5. Ouldridge, T.E.; Govern, C.C.; ten Wolde, P.R. Thermodynamics of Computational Copying in Biochemical Systems. Phys. Rev. X 2017, 7, 021004.
  6. Becker, N.B.; Mugler, A.; ten Wolde, P.R. Optimal Prediction by Cellular Signaling Networks. Phys. Rev. Lett. 2015, 115, 258103.
  7. Cheng, F.; Liu, C.; Shen, B.; Zhao, Z. Investigating cellular network heterogeneity and modularity in cancer: A network entropy and unbalanced motif approach. BMC Syst. Biol. 2016, 10, 65.
  8. Whitsett, J.A.; Guo, M.; Xu, Y.; Bao, E.L.; Wagner, M. SLICE: Determining cell differentiation and lineage based on single cell entropy. Nucleic Acids Res. 2016, 45, e54.
  9. Statistical Dynamics of Spatial-Order Formation by Communicating Cells. iScience 2018, 2, 27–40.
  10. Maire, T.; Youk, H. Molecular-Level Tuning of Cellular Autonomy Controls the Collective Behaviors of Cell Populations. Cell Syst. 2015, 1, 349–360.
  11. Mehta, P.; Schwab, D.J. Energetic costs of cellular computation. Proc. Natl. Acad. Sci. USA 2012, 109, 17978–17982.
  12. Govern, C.C.; ten Wolde, P.R. Energy dissipation and noise correlations in biochemical sensing. Phys. Rev. Lett. 2014, 113, 258102.
  13. Leff, H.S.; Rex, A.F. Maxwell’s Demon: Entropy, Information, Computing; Princeton University Press: Princeton, NJ, USA, 2014.
  14. Landauer, R. Information is physical. Phys. Today 1991, 44, 23–29.
  15. Szilard, L. On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings. Behav. Sci. 1964, 9, 301–310.
  16. Sagawa, T.; Ueda, M. Fluctuation theorem with information exchange: Role of correlations in stochastic thermodynamics. Phys. Rev. Lett. 2012, 109, 180602.
  17. Cover, T.M.; Thomas, J.A. Elements of Information Theory; John Wiley & Sons: Hoboken, NJ, USA, 2012.
  18. Tsai, C.J.; Nussinov, R. A unified view of how allostery works. PLoS Comput. Biol. 2014, 10, e1003394.
  19. Cuendet, M.A.; Weinstein, H.; LeVine, M.V. The allostery landscape: Quantifying thermodynamic couplings in biomolecular systems. J. Chem. Theory Comput. 2016, 12, 5758–5767.
  20. Jarzynski, C. Equalities and inequalities: Irreversibility and the second law of thermodynamics at the nanoscale. Annu. Rev. Condens. Matter Phys. 2011, 2, 329–351.
  21. Seifert, U. Stochastic thermodynamics, fluctuation theorems and molecular machines. Rep. Prog. Phys. 2012, 75, 126001.
  22. Spinney, R.; Ford, I. Fluctuation Relations: A Pedagogical Overview. In Nonequilibrium Statistical Physics of Small Systems; Wiley-VCH Verlag GmbH & Co. KGaA: Weinheim, Germany, 2013; pp. 3–56.
  23. Crooks, G.E. Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences. Phys. Rev. E 1999, 60, 2721–2726.
  24. Seifert, U. Entropy production along a stochastic trajectory and an integral fluctuation theorem. Phys. Rev. Lett. 2005, 95, 040602.
  25. Ponmurugan, M. Generalized detailed fluctuation theorem under nonequilibrium feedback control. Phys. Rev. E 2010, 82, 031129.
  26. Horowitz, J.M.; Vaikuntanathan, S. Nonequilibrium detailed fluctuation theorem for repeated discrete feedback. Phys. Rev. E 2010, 82, 061120.
  27. Kurchan, J. Fluctuation theorem for stochastic dynamics. J. Phys. A Math. Gen. 1998, 31, 3719.
  28. Maes, C. The fluctuation theorem as a Gibbs property. J. Stat. Phys. 1999, 95, 367–392.
  29. Jarzynski, C. Hamiltonian derivation of a detailed fluctuation theorem. J. Stat. Phys. 2000, 98, 77–102.
  30. Goldstein, H.; Poole, C., Jr.; Safko, J.L. Classical Mechanics, 3rd ed.; Pearson: London, UK, 2001.
  31. Parrondo, J.M.; Horowitz, J.M.; Sagawa, T. Thermodynamics of information. Nat. Phys. 2015, 11, 131–139.
  32. Kawai, R.; Parrondo, J.M.R.; den Broeck, C.V. Dissipation: The phase-space perspective. Phys. Rev. Lett. 2007, 98, 080602.
  33. Generalization of the second law for a transition between nonequilibrium states. Phys. Lett. A 2010, 375, 88–92.
  34. Generalization of the second law for a nonequilibrium initial state. Phys. Lett. A 2010, 374, 1001–1004.
  35. Esposito, M.; Van den Broeck, C. Second law and Landauer principle far from equilibrium. Europhys. Lett. 2011, 95, 40004.
  36. Sagawa, T.; Ueda, M. Generalized Jarzynski equality under nonequilibrium feedback control. Phys. Rev. Lett. 2010, 104, 090602.
  37. Horowitz, J.M.; Parrondo, J.M. Thermodynamic reversibility in feedback processes. Europhys. Lett. 2011, 95, 10005.
Figure 1. Paradox in thermodynamics of information. (a) Maxwell’s demon (orange cat) uses information on the speed of the particles in the box: he opens/closes the small hole (orange line) without expenditure of energy, such that fast particles (red filled circles) gather in the upper half of the box and slow particles (blue filled circles) gather in the lower half. Since temperature reflects the average kinetic energy of the particles, the demon’s action results in a spontaneous flow of heat from colder places to hotter places, which violates the second law of thermodynamics. (b) A cycle of Szilard’s engine. A lever (green curved arrow) is controlled such that a weight can be lifted while the partition moves quasi-statically in the direction that the particle pushes. This engine harnesses heat from the heat reservoir (yellow region around each box) and converts it into mechanical work, cyclically, and thus corresponds to a perpetual-motion machine of the second kind, which is prohibited by the second law of thermodynamics.
Figure 2. Measurement and feedback control: system X is, for example, a measuring device and system Y is a measured system. X and Y co-evolve, interacting weakly, along trajectories {x_t} and {y_t}, respectively. (a) Coupling is established during the measurement process so that I_t(x_t, y_t) for 0 ≤ t ≤ τ may increase (not necessarily monotonically). (b) The established correlation is used as a source of work through the external parameter λ_t so that I_t(x_t, y_t) for τ ≤ t ≤ τ′ may decrease (not necessarily monotonically).
Table 1. The joint probability distribution of x and y at an intermediate time t. Here we assume for simplicity that both systems X and Y have two states, 0 (left) and 1 (right).
X \ Y        0 (Left)    1 (Right)
0 (Left)     1/3         1/6
1 (Right)    1/6         1/3

