Article

Relaxation Processes and the Maximum Entropy Production Principle

by Paško Županović 1,*, Srećko Botrić 2, Davor Juretić 1 and Domagoj Kuić 1

1 Faculty of Science, University of Split, Teslina 12, 21000 Split, Croatia
2 Faculty of Electrical Engineering, Mechanical Engineering and Naval Architecture, University of Split, R. Boškovića b.b., 21000 Split, Croatia
* Author to whom correspondence should be addressed.
Entropy 2010, 12(3), 473-479; https://doi.org/10.3390/e12030473
Submission received: 19 February 2010 / Accepted: 9 March 2010 / Published: 11 March 2010
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)

Abstract: Spontaneous transitions of an isolated system from one macroscopic state to another (relaxation processes) are accompanied by a change of entropy. Following Jaynes’ MaxEnt formalism, it is shown that practically all the possible microscopic developments of a system, within a fixed time interval, are accompanied by the maximum possible entropy change. In other words, relaxation processes are accompanied by maximum entropy production.
Classification: PACS 05.70.Ln, 65.40.Gr

1. Introduction

E.T. Jaynes, claiming that our knowledge of the initial microscopic state of a system is not a matter of physics but of information theory, introduced information entropy to describe the behavior of macroscopic systems [1,2]. Keeping in mind that the macroscopic development of a system depends on its constraints rather than on the initial microscopic state, Jaynes used such constraints as the pillars of his theory. He proposed that the probabilities of states be assigned so as to make the information entropy, under the given constraints, the maximum possible. This procedure is known as the MaxEnt formalism [3].
MaxEnt can be equally applied to equilibrium and nonequilibrium processes. It was applied by a number of authors [4,5,6] to nonequilibrium processes in order to determine the physical principle relevant for nonequilibrium thermodynamics. We single out Dewar’s [4] and Niven’s [5] approaches.
Dewar considered the ensemble of trajectories (paths) in phase space. He maximized the information entropy over paths with macroscopic quantities as constraints (MaxEnt) and found that the path probability is proportional to the exponential of the entropy production, when entropy production is expressed as the sum of products of fluxes and conjugate forces. Thus, the most probable evolution of a system is accompanied by the maximum possible entropy production. Some criticism of Dewar’s work can be found in references [7,8,9].
Niven [5] applied MaxEnt to stationary processes in a different way from Dewar. Values of fluxes are the main elements of his approach: he considered the mean values of fluxes as constraints. Assuming local equilibrium, he found, under certain additional assumptions, that entropy production must attain its maximum value.
Both Dewar and Niven assumed the existence of a stationary state, local equilibrium, and entropy production defined as the sum of the products of thermodynamic forces and conjugate fluxes.
We noted in paper II that, in a stationary nonequilibrium state, the subject (experimenter) must interfere with a system in order to maintain such a state. In this paper, paper III of the series, we apply the MaxEnt formalism to relaxation processes. Our approach is free of the above-mentioned assumptions. Assuming that changes of entropy, within a fixed interval of time, are distributed so as to achieve maximum information entropy, we find that the evolution of an isolated system is accompanied by the maximum possible entropy production.
The paper is organized in the following manner. Section 2 is devoted to Jaynes’ principle of maximum information entropy. In Section 3, we apply this principle to the change of entropy of an isolated system. We find that the development of an isolated system (relaxation) is accompanied by the maximum production of entropy. In the Conclusions section we summarize our results.

2. Information Entropy and MaxEnt Formalism

E.T. Jaynes [1,2], having in mind that initial microscopic states cannot be determined, proposed that the statistics of any microscopic physical quantity describing the time evolution of a system can be obtained by making use of the information entropy [10]:
$$S_I = -\sum_i w_i \ln w_i \qquad (1)$$
Here, $w_i$ is the probability of the $i$-th value of the considered microscopic quantity. Taking into account a definite number of constraints (conservation laws and/or the mean values of measured quantities), the probability distribution is obtained by the maximization of the entropy $S_I$. The constrained maximization of the entropy $S_I$ is well known as the MaxEnt formalism. This maximum-entropy inference yields the probability distribution that is consistent with the constraints and otherwise maximally noncommittal, i.e., it reflects the amount of uncertainty. Jaynes [1,2] has pointed out that the very existence of definite macroscopic properties (states) is essentially related to sharp distributions: the overwhelming majority of microscopic evolutions correspond to the same macroscopic (deterministic) evolution.
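To make the formalism concrete, here is a minimal numerical sketch (our addition, not part of the original paper) that maximizes the information entropy of Eq. (1) under the normalization constraint and one assumed mean-value constraint; the discrete values q and the prescribed mean q_mean are purely illustrative. MaxEnt predicts an exponential distribution, so the logarithms of the resulting probabilities should be linear in q.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative microscopic quantity with five discrete values (assumed).
q = np.arange(5.0)
q_mean = 1.5  # prescribed mean value <Q>, playing the role of a constraint

def neg_entropy(w):
    """Negative information entropy: -S_I = sum_i w_i ln w_i."""
    w = np.clip(w, 1e-12, None)  # avoid log(0)
    return np.sum(w * np.log(w))

constraints = [
    {"type": "eq", "fun": lambda w: np.sum(w) - 1.0},        # normalization
    {"type": "eq", "fun": lambda w: np.sum(w * q) - q_mean}  # mean value
]
w0 = np.full(q.size, 1.0 / q.size)  # uniform starting guess
res = minimize(neg_entropy, w0, bounds=[(0.0, 1.0)] * q.size,
               constraints=constraints)

print("MaxEnt probabilities:", np.round(res.x, 4))
# For an exponential distribution w_i = C^-1 exp(-mu * q_i),
# successive log-ratios are constant (equal to -mu):
print("log-ratios:", np.round(np.diff(np.log(res.x)), 4))
```

The constant spacing of the log-ratios is the numerical counterpart of the Lagrange multiplier that appears analytically in Section 3.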
We note that information entropy is a property of any probability distribution and has, in principle, nothing to do with thermodynamic entropy, which is a property of the state of a macroscopic system (see page 351 in reference [10]).

3. Relaxation Processes, Information Entropy and the MEP Principle

A relaxation process is the spontaneous approach of an isolated system toward its equilibrium state.
The initial problem in applying MaxEnt is the choice of events of interest. In the case of an isolated system, the initial microscopic state of the system defines the “phase space trajectory” (path). Assuming that initial microscopic states do not share a common path, there is a one-to-one correspondence between initial states and paths.
The time evolution of the macroscopic state of a system is described by a set $\{Q_i\}$ of mutually independent variables. To be more specific, these variables are fluxes, the main characteristic of non-equilibrium processes, or quantities related to fluxes.
We can define the entropy of an isolated system by means of the microcanonical ensemble, using either Boltzmann’s definition, $S = k_B \ln W$, or Gibbs’ definition, $S = -k_B \sum_i p_i \ln p_i$. Here, $k_B$, $W$ and $p_i$ are Boltzmann’s constant, the statistical weight of the macroscopic state and the probability of the $i$-th microscopic state, respectively. The unsatisfactory feature of both definitions, in the case of isolated systems, comes from Liouville’s theorem, which states that the number of microscopic states does not change in the course of time. In particular, one initial microstate gives just one final microstate, i.e., there is no increase in the number of microscopic states. Yet, according to the second law, the number of states should increase in the course of time.
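A one-line check, added here for completeness, shows that the two definitions agree on the microcanonical ensemble, where all $W$ microstates are equally probable, $p_i = 1/W$:

$$S = -k_B \sum_{i=1}^{W} \frac{1}{W} \ln \frac{1}{W} = k_B \ln W$$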
In order to resolve this paradox, Zurek [11] introduced the notion of algorithmic entropy (randomness) as an objective property of the microscopic state. Algorithmic entropy is equal to the number of digits $s$ of the shortest program that generates output containing sufficient information, within the required accuracy, about the microscopic state of the system. The momenta and positions of all particles are an example of such output in the case of systems described within classical mechanics. When $s$ is interpreted as a binary representation of an integer, the algorithmic entropy is $K = \log_2 s$. Zurek has shown that an old problem of statistical mechanics, the fundamental incompatibility between the invariance of physical laws under the change of the sign of time (reversibility) and the second law of thermodynamics, could be overcome by means of algorithmic entropy. However, since algorithmic entropy is tightly connected with the determination of the microscopic state of a system of classical particles, it does not include the principle of particle indistinguishability. Zurek was forced to modify the starting algorithm (see Appendix A in [11]) in order to get the Sackur-Tetrode relation, the correct expression for the entropy of a free gas. It is interesting to note that, in order to find the correct expression for the entropy of ideal gases, Sackur and Tetrode were forced to use quantum-mechanical concepts, the principle of particle indistinguishability and the definition of the elementary volume $h^3$ in phase space, in 1912, long before the formulation of quantum mechanics in 1925. In standard textbooks [12,13], one exploits the uncertainty relation to define $h^3$ as the elementary volume of phase space. This process, known as coarse graining, enables one to cross from continuous to discrete microscopic states. The introduction of discrete microscopic states enables one to define ensembles and the corresponding statistics.
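For the reader’s convenience (the original cites the relation without displaying it), the Sackur-Tetrode expression for the entropy of an ideal monatomic gas of $N$ indistinguishable particles of mass $m$ in a volume $V$ at temperature $T$ is

$$S = N k_B \left[ \ln\!\left( \frac{V}{N} \left( \frac{2 \pi m k_B T}{h^2} \right)^{3/2} \right) + \frac{5}{2} \right]$$

where the $1/N$ inside the logarithm stems from particle indistinguishability and the factor $h^{-3}$ from the elementary phase-space volume.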
Each measuring process is defined by the measuring device and the system, whose state is determined by the value of the measured quantity. A measurement of any physical quantity is accompanied by uncertainty. The resulting uncertainty is a superposition of the uncertainties introduced by the measuring device and by the system. The uncertainty due to the measuring instrument is within its resolution. We too exploit the coarse-graining approach, but in a different way from the one described above. We take the resolutions of the measuring devices $\{\delta Q_i\}$ as the quantization steps of the variables $\{Q_i\}$. The macroscopic state is defined by the values of the independent physical quantities in the intervals $\{Q_i, Q_i + \delta Q_i\}$. We follow the Boltzmann definition of entropy. The number of microstates $W$ that have values of the independent physical quantities within the intervals $\{Q_i, Q_i + \delta Q_i\}$ is the statistical weight of the state defined by $\{Q_i\}$. The entropy of this state is $S = k_B \ln W$. We assign this entropy to each microscopic state characterized by values of the physical quantities in the interval $\{Q_i, Q_i + \delta Q_i\}$. In the following we will write $\{Q_i\}$ for $\{Q_i, Q_i + \delta Q_i\}$.
The initial state is defined by a subensemble with fixed values $\{Q_i(t)\}$. During a small fixed macroscopic interval of time $\Delta t$, the state $\{Q_i(t)\}$ evolves into the state $\{Q_i(t+\Delta t)\}$, where $\{Q_i(t+\Delta t)\}$ are just discrete values of the variables $\{Q_i\}$. There is an essential difference between the initial and final states. The microscopic states of the initial macroscopic state are elements of the subensemble with the variables $\{Q_i\}$ in the interval $Q_i(t) < Q_i < Q_i(t) + \delta Q_i$. On the other hand, there is an additional uncertainty of the variables $\{Q_i\}$ in the final state due to the different evolutions (paths) of the elements of the initial subensemble. The uncertainty of the physical variables in the final state can be larger than the resolution of the measuring device. We assign this excess of uncertainty to the fluctuations of the measured quantities $\{Q_i(t)\}$. To be more specific, the elements of the starting subensemble defined by $\{Q_i(t), Q_i(t) + \delta Q_i\}$ can end in different subensembles $\{Q_i(t+\Delta t) + j\,\delta Q_i, Q_i(t+\Delta t) + (j+1)\,\delta Q_i\}$. Here, $j$ assumes integer values. This additional uncertainty ($j \neq 0$) is due to the different time evolutions of the elements of the initial subensemble and can in principle be detected by the measuring instrument. Note that there is no fluctuation of entropy due to the uncertainty introduced by the measuring device.
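The following toy simulation is our illustration of this picture, not part of the original paper; the resolution, drift, noise strength and ensemble size are all assumed values. An initial subensemble prepared within a single resolution cell is propagated for a time $\Delta t$ along slightly different paths, and the final values are coarse-grained back onto the measurement grid, where they spread over several neighboring cells $j$.

```python
import numpy as np

rng = np.random.default_rng(0)

dQ = 0.1        # resolution of the measuring device (assumed)
dt = 1.0        # fixed macroscopic time interval (assumed)
n = 100_000     # size of the initial subensemble (assumed)

# Initial subensemble: all members inside one resolution cell [Q0, Q0 + dQ).
Q0 = 2.0
Q_init = Q0 + dQ * rng.random(n)

# Toy microscopic evolution: a common drift plus a member-dependent part,
# standing in for the different phase-space paths of the ensemble members.
drift = 0.7
Q_final = Q_init + drift * dt + 0.05 * rng.standard_normal(n) * dt

# Coarse-grain the final values: index j of the resolution cell relative
# to the nominal final cell containing Q0 + drift*dt.
j = np.floor((Q_final - (Q0 + drift * dt)) / dQ).astype(int)

cells, counts = np.unique(j, return_counts=True)
for c, k in zip(cells, counts):
    print(f"cell j = {c:+d}: fraction {k / n:.3f}")
# The spread over cells j != 0 is the excess uncertainty attributed to
# fluctuations, beyond the instrumental resolution dQ.
```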
The mean value of entropy is the result of averaging its value over the subensembles defined by $\{Q_i(t)\}$. If the relaxation is close to equilibrium, the mean value of the entropy change is equal to the change of the locally defined thermodynamic entropy. The latter is defined in the same way as in equilibrium thermodynamics [14,15].
We assume that the changes of entropy are distributed according to MaxEnt. Then the measured (mean) value of the change of entropy is the only constraint in addition to the normalization of probabilities:
$$\overline{\Delta S} = \overline{S(t+\Delta t)} - S(t) \qquad (2)$$
Here $S(t)$ is the entropy of the (microscopic or macroscopic) initial state, while $\overline{S(t+\Delta t)}$ and $\overline{\Delta S}$ are the mean entropy of the final state of a system and the mean change of entropy, respectively. Due to the evolution of a system, the entropy of the final microscopic states can acquire different values, although the entropy of the initial microscopic states has the same value.
The information entropy of Eq. (1) then becomes
$$S_I = -\sum_{\Delta S = (\Delta S)_{min}}^{(\Delta S)_{max}} w(\Delta S) \ln w(\Delta S) \qquad (3)$$
Here $w(\Delta S)$ is the probability of the entropy change $\Delta S$, and $(\Delta S)_{min}$ and $(\Delta S)_{max}$ are the minimum and maximum possible entropy changes, respectively. These extreme values are imposed by the dynamics of the system.
The constrained maximization of the information entropy (3) with respect to $w(\Delta S)$ is performed according to the standard procedure [16], by introducing Lagrange multipliers and looking for the maximum of the Lagrangian function
$$F = -\sum_{\Delta S} w(\Delta S) \ln w(\Delta S) + \lambda \left[ \sum_{\Delta S} w(\Delta S) - 1 \right] + \mu \left[ \sum_{\Delta S} w(\Delta S)\,\Delta S - \overline{S(t+\Delta t)} + S(t) \right] \qquad (4)$$
The outcome of the maximization of F is
$$w(\Delta S) = C^{-1} \exp(\mu\,\Delta S) \qquad (5)$$
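The intermediate step, left implicit in the original, is the stationarity condition of Eq. (4) with respect to each $w(\Delta S)$:

$$\frac{\partial F}{\partial w(\Delta S)} = -\ln w(\Delta S) - 1 + \lambda + \mu\,\Delta S = 0 \quad\Longrightarrow\quad w(\Delta S) = e^{\lambda - 1}\, e^{\mu\,\Delta S} \equiv C^{-1} e^{\mu\,\Delta S}$$

with $\lambda$ (i.e., $C$) fixed by normalization and $\mu$ fixed by the mean-value constraint (2).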
This result can be related to Einstein’s formula for the probability of fluctuations [17,18]. He proposed that the probability of a fluctuation from equilibrium, in an isolated system, is $w(\Delta S) = C^{-1} \exp(\Delta S / k_B)$. Here, $\Delta S < 0$ is the decrease of the entropy from its maximum value. Comparing Einstein’s formula with Eq. (5), we get $\mu = 1/k_B$.
We return to the relaxation of the system. It follows from the foregoing expression that the maximum possible change of entropy is the most probable one. For a very short interval of time $\Delta t$, many different developments of the system are possible, with comparable probabilities. But as the time interval increases, $\exp(\Delta S / k_B)$ becomes an extremely sharp function, peaked at the value of the maximum possible entropy change. This result is in accordance with the fluctuation theorem [19,20]. In the case of large time intervals we can say that, apart from relatively small fluctuations, the relaxation of the system is accompanied by the maximum possible change of entropy. In other words, the most probable time evolution of the isolated system is accompanied by the maximum entropy production (MEP). Martyushev and Seleznev [7] provide a plausible derivation of the MEP principle in the case of an isolated system. They argue that an isolated system reaches the state of maximum entropy within the relaxation time (or within the order of the relaxation time), according to the second law of thermodynamics. Therefore, the change of entropy within this time interval is the maximum possible. The MaxEnt formalism applied to the change of entropy of an isolated system supports this plausible derivation of the MEP principle by Martyushev and Seleznev.
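A short numerical check, added by us with an arbitrary choice of entropy ranges, illustrates this sharpening. Working in units of $k_B$ (set $\sigma = \Delta S / k_B$) with $w(\sigma) \propto e^{\sigma}$ on an allowed range $[0, \sigma_{max}]$, the fraction of probability in the top one percent of the range approaches unity as $\sigma_{max}$ (and hence $\Delta t$) grows:

```python
import numpy as np

# sigma = Delta S / k_B; the allowed range [0, sigma_max] widens with Delta t.
for sigma_max in (10.0, 100.0, 1000.0):    # illustrative values (assumed)
    sigma = np.linspace(0.0, sigma_max, 400_001)
    w = np.exp(sigma - sigma_max)          # w ~ exp(sigma), shifted for stability
    w /= w.sum()                           # discrete normalization
    p_top = w[sigma >= 0.99 * sigma_max].sum()
    print(f"sigma_max = {sigma_max:7.1f}: P(top 1% of range) = {p_top:.5f}")
# Output: roughly 0.095, 0.632, 0.99995 -- the probability piles up
# ever more sharply at the maximum possible entropy change.
```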
In short, relaxation processes in isolated systems are in accordance with the MEP principle. An isolated system will develop in such a way as to produce the maximum possible entropy. This is an addition to the second law of thermodynamics. While the second law states only that an isolated system will approach the state of largest entropy, the MEP principle states that the system will approach that state with the maximum possible speed [4,5,7,19,21,22].

4. Conclusions

The basic assumption of this paper is that the entropy change in relaxation processes obeys the MaxEnt formalism. Therefore, the largest possible increase in entropy is also the most probable one. This means that a system evolves with maximum entropy production. In this way, the MEP principle becomes an addition to the second law of thermodynamics. While the second law requires that an isolated system end up in the state of maximum entropy, the MEP principle asserts that the system reaches this state as quickly as possible.

Acknowledgements

The present work was supported by the bilateral research project of Slovenia-Croatia Cooperation in Science and Technology, 2009-2010, and by Croatian Ministry of Science grant No. 177-1770495-0476 to DJ.

References

1. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620–630.
2. Jaynes, E.T. Information theory and statistical mechanics. II. Phys. Rev. 1957, 108, 171–190.
3. Jaynes, E.T. On the rationale of maximum-entropy methods. Proc. IEEE 1982, 70, 939–952.
4. Dewar, R. Information theory explanation of the fluctuation theorem, maximum entropy production and self-organized criticality in non-equilibrium stationary states. J. Phys. A: Math. Gen. 2003, 36, 631–641.
5. Niven, R.K. Steady state of a dissipative flow-controlled system and the maximum entropy production principle. Phys. Rev. E 2009, 80, 021113:1–021113:15.
6. Jones, W. Variational principles for entropy production and predictive statistical mechanics. J. Phys. A: Math. Gen. 1983, 16, 3629–3634.
7. Martyushev, L.M.; Seleznev, V.D. Maximum entropy production principle in physics, chemistry and biology. Phys. Rep. 2006, 426, 1–45.
8. Bruers, S. A discussion on maximum entropy production and information theory. J. Phys. A: Math. Theor. 2007, 40, 7441–7450.
9. Grinstein, G.; Linsker, R. Comments on a derivation and application of the “maximum entropy production” principle. J. Phys. A: Math. Theor. 2007, 40, 9717–9720.
10. Jaynes, E.T. Probability Theory: The Logic of Science; Cambridge University Press: Cambridge, UK, 2003.
11. Zurek, W.H. Algorithmic randomness and physical entropy. Phys. Rev. A 1989, 40, 4731–4751.
12. Gibbs, J.W. Elementary Principles in Statistical Mechanics; Yale University Press: London, UK, 1938.
13. Landau, L.D.; Lifshitz, E.M. Statistical Physics, Part 1; Pergamon Press: Oxford, UK, 1980.
14. De Groot, S.R.; Mazur, P. Non-Equilibrium Thermodynamics; North-Holland Pub. Co.: Amsterdam, The Netherlands, 1962.
15. Evans, D.J.; Morriss, G.P. Statistical Mechanics of Nonequilibrium Liquids; Academic Press: London, UK, 1990.
16. Krasnov, M.I.; Makarenko, G.I.; Kiselev, A.I. Problems and Exercises in the Calculus of Variations; Mir Publishers: Moscow, Russia, 1975.
17. Einstein, A. Theorie der Opaleszenz von homogenen Flüssigkeiten und Flüssigkeitsgemischen in der Nähe des kritischen Zustandes. Ann. Phys. 1910, 33, 1275–1298.
18. Županović, P.; Botrić, S.; Juretić, D. Relaxation processes, MaxEnt formalism and Einstein’s formula for the probability of fluctuations. Croat. Chem. Acta 2006, 79, 335–338.
19. Dewar, R.C. Maximum entropy production and the fluctuation theorem. J. Phys. A: Math. Gen. 2005, 38, L371–L381.
20. Evans, D.J.; Searles, D.J. The fluctuation theorem. Adv. Phys. 2002, 51, 1529–1585.
21. Lorenz, R. Full steam ahead—probably. Science 2003, 299, 837–838.
22. Dewar, R.C. Maximum entropy production and non-equilibrium statistical mechanics. In Non-Equilibrium Thermodynamics and the Production of Entropy: Life, Earth, and Beyond; Kleidon, A., Lorenz, R.D., Eds.; Springer-Verlag: Berlin, Germany, 2005; pp. 41–55.
