Article

Generalized Entropy Generation Expressions in Gases

by
Michael H. Peters
Department of Chemical and Life Science Engineering, Virginia Commonwealth University, 601 West Main St., Richmond, VA 23284, USA
Entropy 2019, 21(4), 330; https://doi.org/10.3390/e21040330
Submission received: 23 January 2019 / Revised: 20 March 2019 / Accepted: 22 March 2019 / Published: 27 March 2019
(This article belongs to the Special Issue Entropy Generation Minimization II)

Abstract
In this study, we generalize our previous methods for obtaining entropy generation in gases without the need to carry through a specific expansion method, such as the Chapman–Enskog method. The generalization, which is based on a scaling analysis, allows for the study of entropy generation in gases for any arbitrary state of the gas and consistently across the conservation equations of mass, momentum, energy, and entropy. Thus, it is shown that it is theoretically possible to alter specific expressions and associated physical outcomes for entropy generation by changing the operating process gas state to regions significantly different than the perturbed, local equilibrium or Chapman–Enskog type state. Such flows could include, for example, hypersonic flows or flows that may be generally called hyper-equilibrium state flows. Our formal scaling analysis also provides partial insight into the nature of entropy generation from an informatics perspective, where we specifically demonstrate the association of entropy generation in gases with uncertainty generated by the approximation error associated with density function expansions.

1. Introduction

In general, entropy conservation, including its generation and the concomitant Second Law of Thermodynamics, represents a critical foundation in the analysis of energy efficiency and, more broadly, in dictating both the direction of natural processes and the limitations on their behavior. The specific irreversible nature or "time's arrow" of nonequilibrium systems is entailed in the so-called phenomenological laws, i.e., Fick's Law of Diffusion, Newton's Law of Viscosity, Fourier's Law of Heat Conduction, and the Law of Entropy Increase. From classical molecular theory, the phenomenological laws of mass, momentum, and energy have been shown to follow from the condition of small perturbations from local equilibrium states [1]. Molecular-based derivations of the conservation equations for mass, momentum, and energy and the associated phenomenological laws have been well developed based on Irving and Kirkwood's [2] paper on the "classical" statistical mechanics of transport processes. However, a classical molecular theory derivation of the nonequilibrium entropy conservation equation and the Law of Entropy Increase has remained somewhat elusive or, at best, lacking comparable clarity and consistency with its sister transport processes. As a disclaimer, no attempt is made here to review the multitude of approaches to the description of nonequilibrium entropy outside the realm of classical statistical mechanics or molecular theoretic approaches, such as coarse graining, quantum approaches, semi-empirical efforts, probability and informatics modeling, phenomenology, etc., which may provide alternative and complementary paths of analysis. Here we stick to the classical molecular theory approaches that have formed the theoretical foundations of transport phenomena.
Previously [3], we derived the entropy conservation equation in gases following the Irving and Kirkwood approach and using a generalized definition of entropy, which recovers Boltzmann's entropy in the lowest-ordered gas limit and Gibbs' entropy in the high density limit. Then, using a Chapman and Enskog expansion method [4], we obtained an expression for the entropy generation and flux in gases equivalent to that obtained from phenomenology and to Boltzmann's celebrated expression for entropy generation [3].
In the current study, we refine our approach by obtaining the correct form of entropy generation and flux in gases without resort to the specifics of the Chapman–Enskog expansion method. The formal analysis allows for a general approach or paradigm to the prediction of entropy generation in gases with any arbitrary gas state. This, in turn, allows for the study of entropy generation minimization schemes based on alterations of the gas state beyond the equilibrium perturbation states associated with phenomenology or the Chapman–Enskog expansion type states.
Formal statistical mechanical derivations of the conservation laws begin with the Liouville equation, or the general equation of classical statistical mechanics; so, we must begin there. For the sake of simplicity, we consider only pure component systems here, but the results given are readily extended to multicomponent systems as well.
As noted previously [3], the N-particle Liouville equation formally cannot support finite entropy generation [5,6]. It would be appropriate to refer to this result as the "Liouville entropy paradox". In information theory and statistics, entropy is formally defined as a measure of "uncertainty" [7,8,9]. As presented by Jaynes [10], deliberate truncations or approximations to the solution of the Liouville equation are necessary to obtain finite positive entropy generation. At a minimum, then, any approximations must clearly be consistent across the transport equations in order to obtain meaningful results. To be clear, the exact origins of the Second Law are not considered here. Rather, we seek to develop general expressions for entropy flux and generation in gases based on generalities of perturbation expansions whose approximations lead to finite entropy generation. Our results are shown to be consistent across the transport equations and consistent with phenomenology, and they lead to a general approach for entropy analysis of systems far outside of local equilibrium states. Our formality in scaling and expansions, however, does allow us to partially address the Liouville entropy paradox, as will be shown below.

2. Materials and Methods

2.1. Liouville Equation and Reduced Forms

In Cartesian vector notation, the Liouville equation for particle-number-preserving systems reads [11]
\[
\frac{\partial f_N}{\partial t} = -\sum_{i=1}^{N}\left[\frac{\mathbf{p}_i}{m_i}\cdot\frac{\partial f_N}{\partial \mathbf{r}_i} + \mathbf{F}_i(\mathbf{r}^N)\cdot\frac{\partial f_N}{\partial \mathbf{p}_i}\right] \tag{1}
\]
where $f_N$ is the $N$-particle density function, $\mathbf{r}_i$ and $\mathbf{p}_i$ are the position and momentum coordinates for the $i$th particle of mass $m_i$, and we have assumed that the force acting on particle $i$ is only a function of the position coordinates $\mathbf{r}^N \equiv (\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_N)$.
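The incompressibility of the phase-space flow underlying the Liouville equation (Liouville's theorem) can be illustrated numerically. The sketch below is our own illustration, not from the paper: it uses an exact harmonic-oscillator Hamiltonian flow (m = k = 1, so the flow is a rigid rotation of the phase plane) and checks that the phase-space area spanned by three neighboring phase points is preserved, which is why the density is constant along trajectories:

```python
import math

def harmonic_flow(q, p, t):
    """Exact phase-space flow for H = p^2/2 + q^2/2 (m = k = 1):
    a rigid rotation of the (q, p) plane by angle t."""
    c, s = math.cos(t), math.sin(t)
    return q * c + p * s, -q * s + p * c

def area(a, b, c):
    """Area of the phase-space triangle spanned by points a, b, c."""
    return 0.5 * abs((b[0] - a[0]) * (c[1] - a[1])
                     - (c[0] - a[0]) * (b[1] - a[1]))

# Three neighboring phase points: a small element of the phase "fluid"
pts = [(1.0, 0.0), (1.01, 0.0), (1.0, 0.01)]
a0 = area(*pts)
evolved = [harmonic_flow(q, p, 2.7) for (q, p) in pts]
a1 = area(*evolved)

# Liouville's theorem: the element is rotated/sheared but its area is
# invariant, so the density f_N is constant along trajectories.
assert abs(a1 - a0) < 1e-12
```

The same area-preservation property holds for any Hamiltonian flow; the harmonic case is used here only because its flow map is available in closed form.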
Now consider a reduced form of the Liouville equation for a set of molecules $\{s\} = \{1, 2, 3, \ldots, s\}$ that can be obtained by integrating the Liouville equation over the phase space of the other $\{N-s\}$ molecules. Following standard procedures [11], we can integrate Equation (1) over $d\mathbf{r}^{N-s}\, d\mathbf{p}^{N-s}$ space to obtain the reduced Liouville equation
\[
\frac{\partial f_s}{\partial t} + \sum_{i=1}^{s}\frac{\mathbf{p}_i}{m_i}\cdot\frac{\partial f_s}{\partial \mathbf{r}_i} + \frac{1}{(N-s)!}\sum_{i=1}^{s}\frac{\partial}{\partial \mathbf{p}_i}\cdot\int \mathbf{F}_i f_N\, d\mathbf{r}^{N-s}\, d\mathbf{p}^{N-s} = 0 \tag{2}
\]
The force acting on the $i$th molecule can be written as the gradient of a potential,
\[
\mathbf{F}_i = -\frac{\partial \Phi_N(\mathbf{r}^N)}{\partial \mathbf{r}_i} \tag{3}
\]
and for the purposes of the specific applications below, the potential will be approximated by a sum over the pair interaction potential ("pairwise additivity"),
\[
\mathbf{F}_i = -\frac{\partial \Phi_N(\mathbf{r}^N)}{\partial \mathbf{r}_i} \approx -\sum_{\substack{j=1 \\ j\neq i}}^{N}\frac{\partial \phi(r_{ij})}{\partial \mathbf{r}_i} \tag{4}
\]
where the pair potential $\phi(r_{ij})$ is the interaction potential between any two molecules in the system, i.e., the effects of three- or more-body interactions on the pair potential expression are neglected. Please note that this assumption does not limit the generality of the results given below.
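For concreteness, the pairwise force sum can be checked numerically. The sketch below is an illustration of ours (a Lennard-Jones pair potential and arbitrary positions, neither taken from the paper): it computes the force on each of three particles as a sum over pair interactions and verifies that the internal forces sum to zero, the Newton's-Third-Law property invoked later when simplifying the entropy generation term:

```python
import math

def lj_force_mag(r, sigma=1.0, eps=1.0):
    """Radial factor of the Lennard-Jones pair force, f(r) = -dphi/dr,
    for phi(r) = 4*eps*[(sigma/r)^12 - (sigma/r)^6]."""
    return 4.0 * eps * (12.0 * sigma**12 / r**13 - 6.0 * sigma**6 / r**7)

def pair_forces(positions):
    """F_i = -sum_{j != i} d(phi(r_ij))/dr_i for all particles."""
    n = len(positions)
    forces = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if j == i:
                continue
            rij = [positions[i][k] - positions[j][k] for k in range(3)]
            r = math.sqrt(sum(x * x for x in rij))
            f = lj_force_mag(r)
            for k in range(3):
                forces[i][k] += f * rij[k] / r  # repulsive when f > 0
    return forces

pos = [(0.0, 0.0, 0.0), (1.2, 0.0, 0.0), (0.3, 1.1, 0.0)]
F = pair_forces(pos)

# Internal pairwise forces are antisymmetric in (i, j), so they sum to zero:
total = [sum(F[i][k] for i in range(3)) for k in range(3)]
assert all(abs(t) < 1e-12 for t in total)
```

The exact cancellation is the discrete statement of Newton's Third Law for pairwise additive, central forces.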
Substituting Equation (4) into Equation (2) and integrating on the right-hand side over all of the $(\mathbf{r}^{N-s}, \mathbf{p}^{N-s})$ space except that of the $(s+1)$th molecule gives
\[
\frac{\partial f_s}{\partial t} + \sum_{i=1}^{s}\frac{\mathbf{p}_i}{m_i}\cdot\frac{\partial f_s}{\partial \mathbf{r}_i} - \sum_{i=1}^{s}\sum_{\substack{j=1 \\ j\neq i}}^{s}\frac{\partial \phi(r_{ij})}{\partial \mathbf{r}_i}\cdot\frac{\partial f_s}{\partial \mathbf{p}_i} = \sum_{i=1}^{s}\int \frac{\partial \phi(r_{i,s+1})}{\partial \mathbf{r}_i}\cdot\frac{\partial f_{s+1}}{\partial \mathbf{p}_i}\, d\mathbf{r}_{s+1}\, d\mathbf{p}_{s+1} \tag{5}
\]
Equation (5) is the reduced Liouville equation for pairwise additive interaction forces. It is an integro-differential equation in which the evolution of the $f_s$ density depends on the next higher-order density, $f_{s+1}$. This nonhomogeneous dimensionality feature is the so-called BBGKY hierarchy, named after its originators: Bogoliubov, Born, Green, Kirkwood, and Yvon [11].

2.2. Entropy Conservation

Irving and Kirkwood derived the mass, momentum, and energy conservation equations from the Liouville equation, and their paradigm will be followed here unless otherwise noted. For more details the reader is referred to the original manuscript [2]. To obtain an entropy conservation equation following the IK (Irving and Kirkwood) approach, we define a general quantity $\alpha$ as [3]
\[
\alpha = -k\sum_{s}\frac{N!}{s!\,(N-s)!}\,\frac{1}{s}\sum_{j=1}^{s}\delta(\mathbf{r}_j-\mathbf{r})\, z_s(\mathbf{r}^s, \mathbf{p}^s, t) \tag{6}
\]
where, following Green [12], $z_s$ is the natural logarithm of an $s$th-ordered, normalized density function that depends on the multiparticle expansion method [12,13], and $k$ is Boltzmann's constant. Please note that $\mathbf{r}$ is an arbitrary locator vector in the system. The phase-space average of $\alpha$, $\langle\alpha\rangle$, is computed by integrating over all $s$ phase-space coordinates as
\[
\langle\alpha\rangle = n\bar{S}_{BG} = -k\sum_{s}\frac{1}{s!}\int \frac{1}{s}\sum_{j=1}^{s}\delta(\mathbf{r}_j-\mathbf{r})\, f_s z_s\, d\mathbf{r}^s\, d\mathbf{p}^s \tag{7}
\]
We call this average the Boltzmann–Gibbs entropy, where $\bar{S}_{BG}$ is the entropy per molecule and $n$ is the local molecular number density. For $s = 1$, $z_1 = \ln(\hbar^3 f_1)$ and we recover Boltzmann's definition [3,4]
\[
n\bar{S}_B(\mathbf{r}, t) = -k\int f_1(\mathbf{r}, \mathbf{p}, t)\ln\left(\hbar^3 f_1\right) d\mathbf{p} \tag{8}
\]
where $\hbar$ is Planck's constant, and in the limit $s = N$ it can be shown that we recover Gibbs' entropy [12]
\[
n\bar{S}_G(\mathbf{r}, t) = -\frac{k}{N!}\int \frac{1}{N}\sum_{j=1}^{N}\delta(\mathbf{r}_j-\mathbf{r})\, f_N \ln\left(\hbar^{3N} f_N\right) d\mathbf{r}^N\, d\mathbf{p}^N \tag{9}
\]
We note that for systems at global equilibrium $f_N$ is independent of the locator vector $\mathbf{r}$ and, using the fact that $n \equiv N/V = N/\int d\mathbf{r}$ at global equilibrium, Equation (9) becomes equal to Gibbs' equilibrium entropy function.
Extensions to other orders can also be shown. Specifically, the nonequilibrium, two-space Green’s entropy would be defined according to Equation (7) as [14]
\[
n\bar{S}_2(\mathbf{r}, t) = -\frac{k}{4}\int \sum_{j=1}^{2}\delta(\mathbf{r}_j-\mathbf{r})\, f_2 z_2\, d\mathbf{r}^2\, d\mathbf{p}^2 \tag{10}
\]
where $z_2 = \ln\left[f_2(\mathbf{r}^2, \mathbf{p}^2, t)/f_1(\mathbf{r}_1, \mathbf{p}_1, t) f_1(\mathbf{r}_2, \mathbf{p}_2, t)\right]$, which again under global equilibrium conditions can readily be shown to reduce to Green's two-particle equilibrium expression [14]. Please note that various multiparticle expansion or closure methods for the $s$-order density functions have been recently studied and reviewed by Singer [13].
Now, the dynamical variable for entropy given above depends explicitly on time through the s-order density functions, whereas the dynamical variables for mass, momentum, and energy introduced by IK have no explicit time dependence. So, to obtain an entropy conservation equation following the IK approach, we can work with the Liouville equation and dynamic variable modified to include the explicit time dependence in α , or it is somewhat easier and equivalent to work directly with the reduced Liouville equation, Equation (6); we choose the latter approach.
Multiplying Equation (5) by
\[
-k\sum_{s}\frac{1}{s!}\,\frac{1}{s}\sum_{j=1}^{s} z_s\,\delta(\mathbf{r}_j-\mathbf{r})
\]
and integrating over all $(\mathbf{r}^s, \mathbf{p}^s)$ space gives the following entropy conservation equation [3]
\[
n\frac{\partial \bar{S}_{BG}}{\partial t} + n\mathbf{v}_0\cdot\frac{\partial \bar{S}_{BG}}{\partial \mathbf{r}} = -\frac{\partial}{\partial \mathbf{r}}\cdot\mathbf{s} + n\bar{s}_g \tag{11}
\]
where n is the local number density and the bulk or mass average velocity v 0 is defined by
\[
\mathbf{v}_0(\mathbf{r}, t) \equiv \frac{1}{n}\int \left(\frac{\mathbf{p}}{m}\right) f_1(\mathbf{r}, \mathbf{p}, t)\, d\mathbf{p} \tag{12}
\]
and the entropy flux vector $\mathbf{s}$ also follows as
\[
\mathbf{s}(\mathbf{r}, t) \equiv -k\sum_{s}\frac{1}{s!}\int \sum_{i=1}^{s}\frac{\bar{\mathbf{p}}_i}{m_i}\, f_s z_s\, d\mathbf{r}^{s-1}\, d\mathbf{p}^s \tag{13}
\]
which represents the flux of entropy relative to the bulk average velocity, where $\bar{\mathbf{p}}_i/m_i \equiv \mathbf{p}_i/m_i - \mathbf{v}_0$.
Now, the last term on the right-hand side represents the entropy generation, specifically
\[
n\bar{s}_g \equiv -k\sum_{s}\frac{1}{s!}\int \sum_{i=1}^{s}\frac{1}{s}\sum_{j=1}^{s} z_s\,\delta(\mathbf{r}_j-\mathbf{r})\,\frac{\partial \phi(r_{i,s+1})}{\partial \mathbf{r}_i}\cdot\frac{\partial f_{s+1}}{\partial \mathbf{p}_i}\, d\mathbf{r}^s\, d\mathbf{r}_{s+1}\, d\mathbf{p}^s\, d\mathbf{p}_{s+1} \tag{14}
\]
Please note that for $s = 1$, we obtain the well-known Boltzmann entropy generation term [11],
\[
n\bar{s}_g(\mathbf{r}, t) = k\int \ln\left(\hbar^3 f_1(\mathbf{r}, \mathbf{p}_1, t)\right)\frac{\partial \phi(\mathbf{r}_2, \mathbf{r})}{\partial \mathbf{r}_2}\cdot\frac{\partial f_2(\mathbf{r}, \mathbf{r}_2, \mathbf{p}_1, \mathbf{p}_2, t)}{\partial \mathbf{p}_1}\, d\mathbf{r}_2\, d\mathbf{p}_1\, d\mathbf{p}_2 \tag{15}
\]
where we have used Newton’s Third Law in writing this last expression. Please note that further simplification of Equation (14) is possible, but our focus here will be on the s = 1 generation term and its particular forms.

2.3. Asymptotic Expansions

We now look at the specific expressions for the entropy flux and entropy generation terms based on asymptotic expansions. The truncation of these expansions will be clearly shown to result in finite entropy generation; consequently, the truncations must be consistent across the transport equations for any given system. For the sake of generality, we introduce the following dimensionless variables, denoted by an asterisk, into the reduced Liouville equation: $t^* = t/(l_0/v_0)$; $\mathbf{r}_i^* = \mathbf{r}_i/l_0$; $\mathbf{p}_i^* = \mathbf{p}_i/p_0$; $f_s^* = f_s\, p_0^{3s} n_0^{-s}$; $\mathbf{F}_i^* = \mathbf{F}_i/F_0 = \mathbf{F}_i\lambda_0/\phi_0$; where $l_0$, $p_0\ (\equiv m v_0)$, and $F_0\ (\equiv \phi_0/\lambda_0)$ are characteristic values of length, momentum, and intermolecular interaction force, respectively, and the characteristic force is expressed as a characteristic interaction energy, $\phi_0$, divided by a characteristic interaction length scale, $\lambda_0$.
Substituting these dimensionless quantities into Equation (5) leads to the non-dimensional, reduced Liouville equation [14]
\[
\frac{\partial f_s^*}{\partial t^*} + \sum_{i=1}^{s}\mathbf{p}_i^*\cdot\frac{\partial f_s^*}{\partial \mathbf{r}_i^*} + \frac{1}{\epsilon}\sum_{i=1}^{s}\sum_{\substack{j=1 \\ j\neq i}}^{s}\mathbf{F}_i^*(\mathbf{r}_{ij}^*)\cdot\frac{\partial f_s^*}{\partial \mathbf{p}_i^*} = -\frac{N_\beta}{\epsilon}\sum_{i=1}^{s}\int \mathbf{F}_i^*(\mathbf{r}_{i,s+1})\cdot\frac{\partial f_{s+1}^*}{\partial \mathbf{p}_i^*}\, d\mathbf{r}_{s+1}^*\, d\mathbf{p}_{s+1}^* \tag{16}
\]
where
\[
\epsilon = \left(\frac{m v_0^2}{\phi_0}\right)\frac{\lambda_0}{l_0} \tag{17}
\]
and
\[
N_\beta = n_0\lambda_0^3 \tag{18}
\]
are dimensionless groups.
For most problems, the characteristic velocity would be the molecular velocity and, thus, $m v_0^2 \sim k T_0$, where $T_0$ is a characteristic temperature. Also, the characteristic intermolecular potential, $\phi_0$, is usually on the order of magnitude of $k T_0$, so we can select $\phi_0 = k T_0$ and $m v_0^2 = k T_0$, giving
\[
\epsilon = \frac{\lambda_0}{l_0} \tag{19}
\]
Specifically, in Equation (16) above, $l_0$ is the length scale over which an appreciable change in $f_s$ with respect to $\mathbf{r}_i$ occurs, and $\lambda_0$ is the length scale over which an appreciable change in $\mathbf{F}_i$ with respect to $\mathbf{r}_{ij}$ occurs. For example, in most gas and liquid systems $\lambda_0 \ll l_0$, and this disparity is used to obtain approximate asymptotic solutions to the reduced Liouville equation. Since appreciable changes in $f_s$ are concomitant with changes in its moments (or the transport properties), $l_0$ is also referred to as the characteristic macroscopic length scale. We will also assume $N_\beta$ to be of order one for a moderate-density gas. The characteristic time is normally associated with the macroscopic scale, and the corresponding dimensionless time group $N_t$ may be taken to be unity for our purposes here. Please note that the scaling analysis given above is a generalization of that given by Frieman [15].
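To get a feel for the magnitudes involved, the short sketch below evaluates the two dimensionless groups, ε = λ0/l0 and N_β = n0 λ0³, for illustrative numbers of our own choosing (a molecular diameter of a few ångströms, a millimeter-scale macroscopic gradient length, and two number densities; none of these values are from the paper):

```python
lam0 = 3.4e-10       # interaction length scale lambda_0 [m], ~ a molecular diameter
l0 = 1.0e-3          # macroscopic gradient length scale l_0 [m]
eps = lam0 / l0      # smallness parameter eps = lambda_0 / l_0

n_dilute = 2.5e25    # number density of a gas at ~1 atm, 300 K [m^-3]
n_dense = 2.0e28     # liquid-like number density [m^-3]
nbeta_dilute = n_dilute * lam0**3   # N_beta = n_0 * lambda_0^3
nbeta_dense = n_dense * lam0**3

print(f"eps            = {eps:.1e}")   # tiny: the expansion parameter
print(f"N_beta, dilute = {nbeta_dilute:.1e}")   # well below one
print(f"N_beta, dense  = {nbeta_dense:.1e}")    # order one, as assumed in the text
```

With these numbers ε is roughly 3 × 10⁻⁷, while N_β moves from about 10⁻³ for the dilute gas toward order one at liquid-like densities, consistent with the moderate-density assumption in the text.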
In both gases and liquids $\epsilon$ is a smallness parameter. Under these conditions, the solution to Equation (16) is written as the following regular or Poincaré perturbation expansion
\[
f_s^* = f_s^{*(0)} + \epsilon f_s^{*(1)} + \epsilon^2 f_s^{*(2)} + \cdots \tag{20}
\]
and it follows that
\[
\ln f_s^* = \ln f_s^{*(0)} + \epsilon\,\frac{f_s^{*(1)}}{f_s^{*(0)}} + O(\epsilon^2) \tag{21}
\]
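The O(ε²) remainder in the logarithm expansion is exactly the "approximation error" invoked later as the source of finite entropy generation. A quick numerical check (with arbitrary illustrative values f⁽⁰⁾ = 2 and f⁽¹⁾/f⁽⁰⁾ = 0.7, our own choices) confirms that the truncation error shrinks quadratically with ε:

```python
import math

f0 = 2.0   # leading-order density value (illustrative)
g = 0.7    # ratio f^(1)/f^(0) (illustrative)

def truncation_error(eps):
    """|ln(f0*(1 + eps*g)) - [ln f0 + eps*g]|: the O(eps^2) remainder
    of the truncated logarithm expansion."""
    exact = math.log(f0 * (1.0 + eps * g))
    truncated = math.log(f0) + eps * g
    return abs(exact - truncated)

e1 = truncation_error(1e-2)
e2 = truncation_error(5e-3)

# Halving eps should quarter the error if the remainder is O(eps^2):
assert 3.8 < e1 / e2 < 4.2
```

Since ln(1 + x) = x − x²/2 + O(x³), the remainder is ≈ ε²g²/2, so the error ratio approaches 4 as ε → 0.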
Using the Chapman–Enskog expansion method, it was previously shown that the entropy flux and generation expressions can be obtained from their definitions given above [3]. The expressions include only the so-called kinetic contributions, which is consistent with the $s = 1$ level of analysis. Here we demonstrate a more general method for obtaining the entropy flux and generation expressions. For completeness, we also note that in the case of rarefied and dilute gases the expansion is not "regular", but it can still be readily treated, as originally shown by Frieman [14,15]. Also, note that a complete analysis for gases requires both an $s = 1$ and an $s = 2$ analysis, the latter of which includes two-body interactions and leads to the "potential part" of the flux relationships [11,14]. However, the paradigm given below is readily extended to a complete $s = 2$ analysis, which will not be considered here in order to simplify the presentation and results.
For $s = 1$, substitution of Equation (20) into Equation (16) leads to a hierarchy of order equations [14]:
$O(1)$:
\[
0 = -\int \mathbf{F}_1^*(\mathbf{r}_{12}^*)\cdot\frac{\partial f_2^{*(0)}}{\partial \mathbf{p}_1^*}\, d\mathbf{r}_{12}^*\, d\mathbf{p}_2^* \tag{22}
\]
$O(\epsilon)$:
\[
\frac{\partial f_1^{*(0)}}{\partial t^*} + \mathbf{p}_1^*\cdot\frac{\partial f_1^{*(0)}}{\partial \mathbf{r}_1^*} = -\int \mathbf{F}_1^*(\mathbf{r}_{12}^*)\cdot\frac{\partial f_2^{*(1)}}{\partial \mathbf{p}_1^*}\, d\mathbf{r}_{12}^*\, d\mathbf{p}_2^* \tag{23}
\]
Now, through a straightforward analysis of the $s = 2$ expressions, it can be readily shown that the leading-order solution to Equation (22) follows the local equilibrium density [14], given in dimensionless terms as
\[
f_2^{*(0)} = \frac{n_2^{*(0)}(\mathbf{r}^*, \mathbf{r}_{12}^*, t^*)}{\left[2\pi T^*(\mathbf{r}^*, t^*)\right]^{3}}\exp\left(-\sum_{j=1}^{2}\frac{\bar{p}_j^{*2}}{2T^*(\mathbf{r}^*, t^*)}\right) \tag{24}
\]
where $\bar{\mathbf{p}}_j^*$ is the dimensionless momentum of molecule $j$ relative to the bulk motion and $n_2^{*(0)}(\mathbf{r}^*, \mathbf{r}_{12}^*, t^*)$ is the dimensionless two-particle density function [14]. The relations given above are all that are needed to derive general forms of the entropy flux and entropy generation in gases, as we will now show.
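As a consistency check on the local-equilibrium Maxwellian form above, the sketch below (dimensionless units, simple trapezoidal quadrature; an illustrative check of our own, not part of the paper's derivation) verifies the per-component normalization and the second moment ⟨p*²⟩ = T* of each Gaussian momentum factor:

```python
import math

def gauss_moment(k, T, a=14.0, n=4001):
    """Trapezoidal estimate of the 1D Maxwellian moment
    integral of p^k * (2*pi*T)^(-1/2) * exp(-p^2/(2T)) over [-a, a]."""
    h = 2.0 * a / (n - 1)
    total = 0.0
    for i in range(n):
        p = -a + i * h
        w = h if 0 < i < n - 1 else 0.5 * h  # trapezoid endpoint weights
        total += w * p**k * math.exp(-p * p / (2.0 * T)) \
                 / math.sqrt(2.0 * math.pi * T)
    return total

T = 1.3  # illustrative dimensionless local temperature
assert abs(gauss_moment(0, T) - 1.0) < 1e-9   # normalization per component
assert abs(gauss_moment(2, T) - T) < 1e-8     # <p*^2> = T* per component
assert abs(gauss_moment(1, T)) < 1e-12        # odd moments vanish
```

Because the full 3D (and two-particle) Maxwellian factorizes into such 1D Gaussians, these three 1D facts are all that is needed to evaluate the higher momentum moments used below.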

3. Results

Entropy Flux and Entropy Generation

First, we return to dimensional variables and expand the logarithm term from Equation (21) for $s = 1$, yielding
\[
f_1\ln\left(\hbar^3 f_1\right) = f_1^{(0)}\ln\left(\hbar^3 f_1^{(0)}\right) + f_1^{(1)}\left[1 + \ln\left(\hbar^3 f_1^{(0)}\right)\right] + O(\epsilon^2) \tag{25}
\]
Next, we consider the $s = 1$ entropy flux term $\mathbf{s}_1$ from Equation (13), using the expansion Equation (25) and the local equilibrium solution in dimensional terms,
\[
f_1^{(0)} = \frac{n(\mathbf{r}, t)}{(2\pi m k T)^{3/2}}\exp\left(-\frac{\bar{p}_1^2}{2 m k T}\right) \tag{26}
\]
Now, using the auxiliary conditions [3]
\[
\int f_1^{(1)}\, d\mathbf{p}_1 = 0 \tag{27}
\]
\[
\int \bar{\mathbf{p}}_1 f_1^{(1)}\, d\mathbf{p}_1 = 0 \tag{28}
\]
\[
\int \bar{p}_1^2 f_1^{(1)}\, d\mathbf{p}_1 = 0 \tag{29}
\]
the entropy flux term can be easily shown to be given by
\[
\mathbf{s}_1(\mathbf{r}, t) = \int \bar{\mathbf{p}}_1 f_1^{(1)}\left(\frac{\bar{p}_1^2}{2 m^2 T}\right) d\mathbf{p}_1 \tag{30}
\]
where the leading-order entropy flux term vanishes, as required for local equilibrium flows [3]. Noting that the kinetic part of the energy flux relative to the mass average velocity is given to first order in $\epsilon$ by [4,11]
\[
\mathbf{q}_k \equiv \frac{1}{2 m^2}\int f_1^{(1)}(\mathbf{r}, \mathbf{p}_1, t)\, \bar{p}_1^2\, \bar{\mathbf{p}}_1\, d\mathbf{p}_1 \tag{31}
\]
we see that to first order in $\epsilon$ we obtain
\[
\mathbf{s}_1 = \frac{\mathbf{q}_k}{T} \tag{32}
\]
i.e., the first-order entropy flux is equal to the kinetic contribution to the energy flux divided by temperature, as required for this order of approximation.
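The identity s₁ = q_k/T can be exercised numerically. The sketch below works in natural units m = k = T = 1 and assumes a hypothetical heat-flux-type perturbation, f₁⁽¹⁾ = A p̄ₓ(p̄²/2 − 5/2) f₁⁽⁰⁾, of the shape a Chapman–Enskog analysis produces; the specific form and amplitude are our own illustration, not taken from the paper. It checks the auxiliary conditions and the flux relation by assembling 3D Gaussian moments from 1D trapezoidal quadrature:

```python
import math

def gmom(k, a=14.0, n=4001):
    """Trapezoidal 1D standard-Gaussian moment <x^k>."""
    h = 2.0 * a / (n - 1)
    s = 0.0
    for i in range(n):
        x = -a + i * h
        w = h if 0 < i < n - 1 else 0.5 * h
        s += w * x**k * math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    return s

g1, g2, g3, g4, g6 = gmom(1), gmom(2), gmom(3), gmom(4), gmom(6)

# 3D isotropic-Gaussian moments assembled from 1D factors:
e_x2p2 = g4 + 2.0 * g2 * g2                 # <p_x^2 p^2>  (analytically 5)
e_x2p4 = g6 + 6.0 * g2 * g4 + 2.0 * g2**3   # <p_x^2 p^4>  (analytically 35)

A = 1e-3  # small amplitude: f1^(1) = A * p_x * (p^2/2 - 5/2) * f1^(0)

# Auxiliary conditions: mass and x-momentum moments of f1^(1) vanish
aux_mass = 0.5 * (g3 + 2.0 * g1 * g2) - 2.5 * g1
aux_momx = 0.5 * e_x2p2 - 2.5 * g2
assert abs(aux_mass) < 1e-9 and abs(aux_momx) < 1e-9
# (the energy moment vanishes by odd symmetry in p_x)

# Kinetic energy flux and entropy flux, x-components, with m = k = T = 1:
q_x = 0.5 * A * (0.5 * e_x2p4 - 2.5 * e_x2p2)
s_x = A * (0.25 * e_x2p4 - 1.25 * e_x2p2)
T = 1.0
assert abs(s_x - q_x / T) < 1e-12   # entropy flux = energy flux / T
```

Both fluxes evaluate to 2.5 A in these units, so the first-order entropy flux is indeed the kinetic energy flux divided by temperature for this perturbation.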
Now, returning to the entropy generation term, Equation (15), we can substitute the expansions Equations (20) and (21) for $s = 1$ to obtain
\[
n\bar{s}_{g1}(\mathbf{r}, t) = k\int \left[\ln\left(\hbar^3 f_1^{(0)}\right) + \frac{f_1^{(1)}}{f_1^{(0)}} + \cdots\right]\frac{\partial \phi(\mathbf{r}_2, \mathbf{r})}{\partial \mathbf{r}_2}\cdot\frac{\partial}{\partial \mathbf{p}_1}\left[f_2^{(0)} + f_2^{(1)} + \cdots\right] d\mathbf{r}_2\, d\mathbf{p}_1\, d\mathbf{p}_2 \tag{33}
\]
Now, we can use the $O(\epsilon)$ equation, Equation (23), in dimensional terms,
\[
\frac{\partial f_1^{(0)}}{\partial t} + \frac{\mathbf{p}_1}{m}\cdot\frac{\partial f_1^{(0)}}{\partial \mathbf{r}} = \int \frac{\partial \phi(\mathbf{r}_2, \mathbf{r})}{\partial \mathbf{r}_2}\cdot\frac{\partial f_2^{(1)}}{\partial \mathbf{p}_1}\, d\mathbf{r}_2\, d\mathbf{p}_2 \tag{34}
\]
for part of the integral term in Equation (33). It can also be readily shown that the leading order terms of Equation (33) all vanish as required for local equilibrium flows [14]. For the derivative terms in Equation (34), we have
\[
\frac{\partial f_1^{(0)}}{\partial t} = f_1^{(0)}\frac{\partial \ln n_1}{\partial t} + \frac{\bar{\mathbf{p}}_1}{k T} f_1^{(0)}\cdot\frac{\partial \mathbf{v}_0}{\partial t} + f_1^{(0)}\frac{1}{T}\left(\frac{\bar{p}_1^2}{2 m k T} - \frac{3}{2}\right)\frac{\partial T}{\partial t} \tag{35}
\]
and
\[
\frac{\mathbf{p}_1}{m}\cdot\frac{\partial f_1^{(0)}}{\partial \mathbf{r}} = f_1^{(0)}\frac{\mathbf{p}_1}{m}\cdot\frac{\partial \ln n_1}{\partial \mathbf{r}} + f_1^{(0)}\frac{\mathbf{p}_1\bar{\mathbf{p}}_1}{m k T}:\frac{\partial \mathbf{v}_0}{\partial \mathbf{r}} + f_1^{(0)}\frac{1}{T}\left(\frac{\bar{p}_1^2}{2 m k T} - \frac{3}{2}\right)\frac{\mathbf{p}_1}{m}\cdot\frac{\partial T}{\partial \mathbf{r}} \tag{36}
\]
Now, for the $f_1^{(0)}$ gradients, we can use the energy transport equation for local equilibrium flows [14] to express the temperature derivatives as
\[
\frac{3}{2} k n_1\left[\frac{\partial T}{\partial t} + \mathbf{v}_0\cdot\frac{\partial T}{\partial \mathbf{r}}\right] = -p_k\,\frac{\partial}{\partial \mathbf{r}}\cdot\mathbf{v}_0 \tag{37}
\]
where $p_k$ is the ideal gas pressure. Finally, then, substituting Equation (34) into Equation (33), and using Equations (35)–(37) and the auxiliary relations, we obtain the Boltzmann-level entropy generation term as
\[
n\bar{s}_{g1}(\mathbf{r}, t) = -\frac{1}{T}\frac{\partial \mathbf{v}_0}{\partial \mathbf{r}}:\boldsymbol{\tau}_k^{(1)} - \frac{1}{T^2}\frac{\partial T}{\partial \mathbf{r}}\cdot\mathbf{q}_k \tag{38}
\]
where
\[
\mathbf{P}_k = \int \frac{\bar{\mathbf{p}}_1\bar{\mathbf{p}}_1}{m}\, f_1\, d\mathbf{p}_1 = \mathbf{P}_k^{(0)} + \boldsymbol{\tau}_k^{(1)} \tag{39}
\]
is the kinetic contribution to the total pressure tensor,
\[
\mathbf{P}_k^{(0)} = \int \frac{\bar{\mathbf{p}}_1\bar{\mathbf{p}}_1}{m}\, f_1^{(0)}\, d\mathbf{p}_1 = p_k\mathbf{I} \tag{40}
\]
is the ideal gas contribution, and the kinetic contribution to the shear stress tensor is defined as [4,11]
\[
\boldsymbol{\tau}_k^{(1)} = \int \frac{\bar{\mathbf{p}}_1\bar{\mathbf{p}}_1}{m}\, f_1^{(1)}\, d\mathbf{p}_1 \tag{41}
\]
Thus, Boltzmann's entropy generation term recovers the phenomenological results [16], where the energy flux $\mathbf{q}$ and momentum flux tensor $\mathbf{P}$ are restricted to their kinetic contributions, as required to this order, $s = 1$. For completeness, we note that the nonequilibrium work expression follows from the energy conservation equation originally given by IK.

4. Discussion

We have shown that we can readily obtain the specific $s = 1$ order entropy flux and generation terms through a simple ordering analysis, without regard to the specifics of the Chapman–Enskog method. The results are also shown to be in agreement with phenomenology. The ordering paradigm can be applied to any type of gas flow and therefore opens the possibility of treating systems far removed from local equilibrium states.
It is also enlightening to examine the order of the terms appearing in the entropy generation expression obtained. The only surviving term from Equation (33) involves the product of the $f_1^{(1)}/f_1^{(0)}$ term with the $f_2^{(1)}$ term, both of which are $O(\epsilon)$. Thus, the only surviving generation term is $O(\epsilon^2)$, and it represents the uncertainty generated through the errors in the approximation. The resulting physical gradients, as reflected phenomenologically, are likewise of second order, reflecting the higher-order character of the generation term. If there were no errors associated with the $O(\epsilon)$ expansion, the generation term would be zero; this latter case would constitute locally reversible gas flows. In the $N$-particle Liouville analysis, it is assumed that $f_N$ is known exactly, and hence there is no entropy generation involved. It would be interesting to carry the above analysis for $s = 1$ through to $O(\epsilon^2)$, with the errors in the approximation being necessarily of higher order. Indeed, Burnett carried out a partial analysis of the $O(\epsilon^2)$ expansion following the Chapman–Enskog method, as reviewed by Chapman and Cowling [4]. Burnett's results are believed to be applicable to hypersonic flows, and several different molecular theory approaches have also been developed [4], but an entropy analysis is lacking.

5. Conclusions

A more general paradigm for predicting entropy flux and generation in fluids has been proposed and examined for gases, which opens the door to the treatment of fluids far removed from an equilibrium state. What practical results can be obtained by attempting to operate fluids in hyper-equilibrium (far-from-equilibrium) states remains an open question. Current studies are directed at the $s = 2$ level of analysis, as well as a full accounting of the $O(\epsilon^2)$ expansions, where we seek to understand how the approximations play out at higher order.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Prigogine, I. From Being to Becoming. Time and Complexities in the Physical Sciences; W.H. Freeman Co.: New York, NY, USA, 1980. [Google Scholar]
  2. Irving, J.H.; Kirkwood, J.G. The Statistical Mechanical Theory of Transport Processes. IV. The Equations of Hydrodynamics. J. Chem. Phys. 1950, 18, 817–828. [Google Scholar] [CrossRef]
  3. Peters, M.H. Completing Irving and Kirkwood’s Molecular Theory of Transport Processes: Nonequilibrium Entropy Generation. Ind. Eng. Chem. Res. 2009, 48, 166–171. [Google Scholar] [CrossRef]
  4. Chapman, S.; Cowling, T.G. The Mathematical Theory of Non-Uniform Gases; Cambridge University Press: Cambridge, UK, 1970. [Google Scholar]
  5. Gaspard, P. Entropy Production in Open Volume-Preserving Systems. J. Stat. Phys. 1997, 88, 1215–1240. [Google Scholar] [CrossRef]
  6. Schack, R.; Caves, C.M. Chaos for Liouville Probability Densities. Phys. Rev. E 1996, 53, 3387–3401. [Google Scholar] [CrossRef]
  7. Jaynes, E.T. Information Theory and Statistical Mechanics. Part I. Phys. Rev. 1957, 106, 620–630. [Google Scholar] [CrossRef]
  8. Jaynes, E.T. Information Theory and Statistical Mechanics. Part II. Phys. Rev. 1957, 108, 171–190. [Google Scholar] [CrossRef]
  9. Katz, A. Principles of Statistical Mechanics: The Information Theory Approach; W.H. Freeman: San Francisco, CA, USA, 1967. [Google Scholar]
  10. Jaynes, E.T. Gibbs vs. Boltzmann Entropies. Am. J. Phys. 1965, 33, 391–398. [Google Scholar] [CrossRef]
  11. Hirschfelder, J.O.; Curtiss, C.F.; Bird, R.B. Molecular Theory of Gases and Liquids; Wiley: New York, NY, USA, 1964. [Google Scholar]
  12. Green, H.S. Molecular Theory of Fluids; North-Holland: Amsterdam, The Netherlands, 1952. [Google Scholar]
  13. Singer, A. Maximum Entropy Formulation of the Kirkwood Superposition Approximation. J. Chem. Phys. 2004, 121, 3657–3666. [Google Scholar] [CrossRef] [PubMed]
  14. Peters, M.H. Molecular Thermodynamics and Transport Phenomena. Complexities of Scales in Space and Time; McGraw-Hill: New York, NY, USA, 2005. [Google Scholar]
  15. Frieman, E.A. On a New Method in the Theory of Irreversible Processes. J. Math. Phys. 1963, 4, 410–418. [Google Scholar] [CrossRef]
  16. Bird, R.B.; Stewart, W.E.; Lightfoot, E.N. Transport Phenomena; Wiley: New York, NY, USA, 1960. [Google Scholar]
