Article

Time Evolution of Relative Entropies for Anomalous Diffusion

1 Institut für Physik, Technische Universität Chemnitz, D-09107 Chemnitz, Germany
2 Department of Applied Mathematics, The University of Western Ontario, N6A 5B7 London, Canada
* Author to whom correspondence should be addressed.
Entropy 2013, 15(8), 2989-3006; https://doi.org/10.3390/e15082989
Submission received: 26 June 2013 / Revised: 17 July 2013 / Accepted: 18 July 2013 / Published: 26 July 2013
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)

Abstract:
The entropy production paradox for anomalous diffusion processes describes the surprising phenomenon that one-parameter families of dynamical equations, falling between the diffusion and wave equations, have entropy production rates (Shannon, Tsallis or Renyi) that increase toward the wave equation limit. Equally surprisingly, the entropy itself does not order the bridging regime between diffusion and waves at all. However, it has been found that relative entropies, with an appropriately chosen reference distribution, do. Relative entropies thus provide a physically sensible way of deciding which process is “nearer” to pure diffusion than another, placing pure wave propagation, desirably, “furthest” from pure diffusion. We examine here the time behavior of the relative entropies under the evolution dynamics of the underlying one-parameter family of dynamical equations based on space-fractional derivatives.
PACS Classification: 05.70.-a; 05.70.Ln; 05.40.Fb; 05.40.Jc
MSC Classification: 34A08; 60E07

1. Introduction

Relative entropies, like Kullback-Leibler [1,2] or the Tsallis relative entropy [3,4,5,6], provide a means to discuss (directed) distances between different probability distributions. These measures are usually not symmetric, so they do not provide a metric on the distributions. Nonetheless, relative entropies allow the comparison of distributions occurring in a variety of different contexts [7,8,9,10,11,12,13,14]. Here, we will use these measures to analyze certain features of distributions describing the dynamics of physical systems.
In particular, our motivation is to study anomalous diffusion processes and the corresponding distribution functions of the dispersing particles. Anomalous diffusion processes differ from classical diffusion in that the dispersion of particles proceeds faster (superdiffusion) or slower (subdiffusion) than for the regular case. These anomalous diffusion processes do, for instance, occur in biological tissues [15,16] or in chemical systems [17]. They can also be observed in porous media [18,19] or turbulent diffusion [20,21]. Such processes are also important in other areas, like target search [22,23,24] or the design of optical materials in which light waves perform a Lévy flight [25].
The theoretical treatment of such processes has led to the study of evolution equations using non-linear dependencies on the probability density functions (PDFs) [26,27] or employing fractional derivatives [28,29,30,31,32,33,34,35,36].
Here, we focus on the space-fractional diffusion equation
$$\frac{\partial}{\partial t}P(x,t) = D\,\frac{\partial^{\alpha}}{\partial x^{\alpha}}P(x,t) \tag{1}$$
not as a modeling tool for an interesting class of superdiffusion processes with remarkable features [37,38,39], but as a bridge linking the usually unrelated classical diffusion equation (iconic irreversibility) to the wave equation (iconic reversibility). For that, the parameter α must vary between one (the (half) wave case) and two (the diffusion case). This bridging regime has been analyzed from different perspectives [40,41,42] and has shown unexpected features. The space-fractional diffusion equation represents a family of processes in the bridge regime that can be ordered by the parameter α, which will be called the bridge ordering.
In [43], it was shown that the relative entropies (e.g., Kullback-Leibler), in contrast to the regular entropies, order the PDFs from Equation (1), because of a monotonic relationship in α, placing the wave and diffusion limits “farthest” from each other, even if not in a metrical sense. This establishes relative entropies as a natural measure for the bridging regime. However, [43] considered circumstances at one particular time. This paper extends the previous work by asking whether this ordering is preserved over all time. The question is addressed by direct computation and by deducing asymptotic expressions valid for long times, based on a saddle point approach.
After briefly setting up the formalism of the known probability densities that solve Equation (1) and the asymptotic methods that are employed, the time dependence of the Kullback-Leibler entropy is discussed, and the cross entropy is introduced. A direct computational analysis of the relative entropy is presented, confirming that the desired monotonic behavior persists at a number of widely ranging times. Then, the asymptotic form for large time is computed using the saddle point method applied to the cross entropy, which yields an asymptotic form valid for the entire relative entropy. By examining the asymptotic form of the derivative with respect to α, the sign is shown to be preserved, confirming monotonicity for large times. Comparisons between the direct computational analysis and the resulting asymptotic forms show good agreement.
Completing the picture, a similar treatment is performed for the Tsallis relative entropies (q ≠ 1), with similar outcomes: monotonicity is preserved at finite times and asymptotically. An interesting feature is noted, namely that the Tsallis relative entropies remain bounded, which is not so in the Kullback-Leibler case.

2. Stable Distributions

The solution of the space-fractional diffusion equation can be expressed in terms of a stable distribution, $S(x \mid \alpha, \beta, \gamma, \delta_n; n)$ [37,43,44], given as
$$P_\alpha(x,t) = S\!\left(x \mid \alpha, 1, (D_\alpha t)^{1/\alpha}, 0; 1\right) = \frac{1}{(D_\alpha t)^{1/\alpha}}\, S\!\left(\frac{x}{(D_\alpha t)^{1/\alpha}} \,\Big|\, \alpha, 1, 1, 0; 1\right) \tag{2}$$
where $D_\alpha = -D\cos\frac{\alpha\pi}{2}$ and the parameters are chosen appropriately with
$$\beta = 1 \tag{3}$$
$$\gamma = \left(-Dt\cos\frac{\alpha\pi}{2}\right)^{1/\alpha} = (D_\alpha t)^{1/\alpha} \tag{4}$$
$$\delta_1 = 0 \tag{5}$$
$$n = 1 \tag{6}$$
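The parametrization above can be evaluated with standard software. As a minimal sketch (not the authors' code), SciPy's stable distribution can be used; we assume that `scipy.stats.levy_stable` in its default "S1" parameterization matches the 1-parameterization used here, with β = 1 and scale γ = (D_α t)^{1/α}:

```python
# Hedged sketch: evaluating P_alpha(x, t) of Equation (2) with SciPy's stable
# distribution. Assumes scipy.stats.levy_stable in its default "S1"
# parameterization matches the 1-parameterization used in the text.
import numpy as np
from scipy.stats import levy_stable

def p_alpha(x, t, alpha, D=1.0):
    """PDF of the space-fractional diffusion equation at time t, Equation (2)."""
    D_alpha = -D * np.cos(alpha * np.pi / 2.0)  # D_alpha > 0 for 1 < alpha < 2
    gamma = (D_alpha * t) ** (1.0 / alpha)      # scale parameter, Equation (4)
    return levy_stable.pdf(x, alpha, 1.0, loc=0.0, scale=gamma)
```

For α = 2 this reduces to the Gaussian N(0, 2Dt) of the diffusion limit, which provides a quick sanity check of the parameter mapping.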
An important feature of stable distributions is their so-called fat tails, i.e., the probability density in the tails falls off with a power law rather than exponentially, thus leading to the non-existence of higher moments. Fat tails are responsible for the non-existence of the Kullback-Leibler entropy for certain combinations of stable distributions.
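The power-law tail can be probed numerically. The following sketch (our illustration; the grid value x = 50 is an arbitrary choice) uses the expectation that, for β = 1 and 1 < α < 2, the standard density decays like x^(−(1+α)) for x → +∞, so doubling x should scale the density by roughly 2^(−(1+α)):

```python
# Hedged sketch: fat-tail scaling of a standard stable density (beta = 1).
# For large x the pdf behaves like C * x**-(1 + alpha), so pdf(2x)/pdf(x)
# should be close to 2**-(1 + alpha). x = 50 is an illustrative choice.
import numpy as np
from scipy.stats import levy_stable

alpha = 1.5
x = 50.0
ratio = levy_stable.pdf(2.0 * x, alpha, 1.0) / levy_stable.pdf(x, alpha, 1.0)
expected = 2.0 ** (-(1.0 + alpha))
```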
The following presents properties of stable distributions needed for the calculations below. The stable distribution rescales as
$$S(x \mid \alpha, \beta, \gamma, \delta_n; n) = \frac{1}{\gamma}\, S\!\left(\frac{x - \delta_n}{\gamma} \,\Big|\, \alpha, \beta, 1, 0; n\right) \tag{7}$$
For the parameters discussed here, the support of the stable distributions is the full real line. Apart from special cases, there are no general closed representations of the stable distribution in terms of elementary functions. However, the density at a certain point x can be expressed as an integral
$$S(x \mid \alpha, \beta, 1, 0; 1) = \frac{1}{\pi}\int_0^{\infty} e^{-u^{\alpha}} \cos\!\left(xu + \zeta u^{\alpha}\right) du \tag{8}$$
where $\zeta = -\beta\tan\frac{\alpha\pi}{2}$. For $x = 0$, the integral can be evaluated in closed form, leading to
$$S(0 \mid \alpha, \beta, 1, 0; 1) = \frac{1}{\alpha\pi}\,\Gamma\!\left(\frac{1}{\alpha}\right) \cos\theta_0 \left[\cos(\alpha\theta_0)\right]^{1/\alpha} \tag{9}$$
where θ 0 = - arctan ( ζ ) / α and Γ ( x ) is the Gamma function. Similarly, there are expressions for the derivatives of the density based on Equation (8). The 2 k -th and the ( 2 k + 1 ) -th derivatives with respect to x are
$$S^{(2k)}(x \mid \alpha, \beta, 1, 0; 1) = \frac{(-1)^{k}}{\pi}\int_0^{\infty} u^{2k}\, e^{-u^{\alpha}} \cos\!\left(xu + \zeta u^{\alpha}\right) du \tag{10}$$
$$S^{(2k+1)}(x \mid \alpha, \beta, 1, 0; 1) = \frac{(-1)^{k+1}}{\pi}\int_0^{\infty} u^{2k+1}\, e^{-u^{\alpha}} \sin\!\left(xu + \zeta u^{\alpha}\right) du \tag{11}$$
We will need the first two derivatives at x = 0 , which can again be expressed in closed form as
$$S^{(1)}(0 \mid \alpha, \beta, 1, 0; 1) = -\frac{1}{\pi}\,\Gamma\!\left(\frac{3}{\alpha}\right) \left[\cos(\alpha\theta_0)\right]^{2/\alpha} \sin(2\theta_0) \tag{12}$$
$$S^{(2)}(0 \mid \alpha, \beta, 1, 0; 1) = -\frac{1}{\pi}\,\Gamma\!\left(\frac{4}{\alpha}\right) \left[\cos(\alpha\theta_0)\right]^{3/\alpha} \cos(3\theta_0) \tag{13}$$
For $\beta = 1$ and angles in $(-\pi/2, \pi/2)$,
$$\theta_0 = -\frac{1}{\alpha}\arctan\!\left(\tan\frac{(2-\alpha)\pi}{2}\right) = \frac{\pi}{2} - \frac{\pi}{\alpha} \tag{14}$$
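The integral representation and the value at x = 0 can be checked against each other numerically. The sketch below (our illustration) implements the quadrature of the integral representation and the closed form at the origin, using the definitions of ζ and θ0 above; agreement of the two is a useful self-consistency test:

```python
# Hedged sketch: the integral representation by direct quadrature vs. the
# closed form at x = 0 (standard scale gamma = 1, location 0, beta = 1).
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as Gamma

def stable_pdf_integral(x, alpha, beta=1.0):
    # 1/pi * Integral_0^inf exp(-u**alpha) * cos(x*u + zeta*u**alpha) du
    zeta = -beta * np.tan(alpha * np.pi / 2.0)
    integrand = lambda u: np.exp(-u**alpha) * np.cos(x * u + zeta * u**alpha)
    val, _ = quad(integrand, 0.0, np.inf)
    return val / np.pi

def stable_pdf_at_zero(alpha, beta=1.0):
    # closed form at the origin, with theta0 = -arctan(zeta)/alpha
    zeta = -beta * np.tan(alpha * np.pi / 2.0)
    theta0 = -np.arctan(zeta) / alpha
    return (Gamma(1.0 / alpha) / (alpha * np.pi)
            * np.cos(theta0) * np.cos(alpha * theta0) ** (1.0 / alpha))
```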
The abbreviations
$$S_\alpha^{\,n}(x) = S^{(n)}(x \mid \alpha, 1, 1, 0; 1), \qquad \Gamma_n = \frac{\Gamma\!\left(\frac{n}{\alpha}\right)}{\Gamma\!\left(\frac{1}{\alpha}\right)}, \qquad \Psi_n = \Psi\!\left(\frac{n}{\alpha}\right) = \frac{\Gamma'\!\left(\frac{n}{\alpha}\right)}{\Gamma\!\left(\frac{n}{\alpha}\right)} \tag{15}$$
will be used in the following. Ψ is the digamma function. Thus, Equations (12) and (13) become
$$S_\alpha^{1}(0) = -\alpha\,\Gamma_3 \sin\frac{2\pi}{\alpha}\left(-\cos\frac{\alpha\pi}{2}\right)^{1/\alpha} S_\alpha^{0}(0) \tag{16}$$
$$S_\alpha^{2}(0) = -\alpha\,\Gamma_4 \sin\frac{3\pi}{\alpha}\left(-\cos\frac{\alpha\pi}{2}\right)^{1/\alpha} S_\alpha^{0}(0) \tag{17}$$
For α = 2 , Equation (1) reduces to the classical diffusion equation. That is reflected in its solution P D ( x , t ) , which is the well-known Gaussian. That solution can also be expressed in terms of a stable distribution for α = 2 as
$$P_D(x,t) = S\!\left(x \mid 2, 1, \sqrt{Dt}, 0; 1\right) = N(0, 2Dt) \tag{18}$$
$$= \frac{1}{\sqrt{4\pi Dt}}\, \exp\!\left(-\frac{x^2}{4Dt}\right) \tag{19}$$
where $N(\mu, \sigma^2)$ is the normal distribution with mean $\mu = 0$ and variance $\sigma^2 = 2Dt$ [44].
In the limit $\alpha \to 1$, the scale parameter $\gamma = \left(-Dt\cos\frac{\alpha\pi}{2}\right)^{1/\alpha}$ of the stable distribution goes to zero, and the mode (i.e., the maximum of the distribution) $\hat{x}_\alpha \to -Dt$. This constitutes a time-moving δ-distribution centered at $-Dt$, representing the one-sided solution of the wave equation with an initial δ-distribution.
Writing Equation (1) in the $\alpha = 1$ limit gives
$$\left(\frac{\partial}{\partial t} - D\frac{\partial}{\partial x}\right) P(x,t) = 0 \tag{20}$$
When the operator $\partial_t + D\,\partial_x$ is applied to Equation (20), the standard wave equation is recovered, ensuring that solutions of Equation (20) are also solutions of the full wave equation. Equation (20) thus reflects one of the two operator factors of the wave equation and is known, accordingly, as the half wave equation. The classical advection equation also reduces to the half wave equation in the one-dimensional solenoidal case.

3. Relative Entropies

The Kullback-Leibler entropy (“Kullback” for short) between two probability density functions, $P_a(x,t)$ and $P_b(x,t)$, is defined in this context as
$$K(P_a(x,t), P_b(x,t)) = \int_{-\infty}^{\infty} P_a(x,t) \ln\frac{P_a(x,t)}{P_b(x,t)}\, dx \tag{21}$$
which is the $q \to 1$ limit of the Tsallis relative entropy
$$T_q(P_a(x,t), P_b(x,t)) = \frac{1}{q-1}\left[\int_{-\infty}^{\infty} P_a^{\,q}(x,t)\, P_b^{\,1-q}(x,t)\, dx - 1\right] \tag{22}$$
These can both be regarded as mean values of appropriately chosen functions: $K(P_a, P_b)$ can be regarded as the mean of $\ln(P_a/P_b)$, while $T_q(P_a, P_b)$ can be seen as the shifted mean of $(P_a/P_b)^{q-1}$. In the following, we shall see that in the large time limit, the density function $P_a$, with respect to which these averages are taken, becomes sharply peaked, making the saddle point method a natural way [45] to obtain asymptotic expressions for long times.
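As a concrete illustration of the Kullback as a mean of ln(P_a/P_b) (our example, not taken from the paper), it can be evaluated by quadrature for two zero-mean Gaussians and checked against the standard textbook closed form K = ln(s_b/s_a) + s_a²/(2 s_b²) − 1/2:

```python
# Hedged sketch: K(P_a, P_b) of Equation (21) computed by quadrature for two
# zero-mean Gaussians and checked against the standard closed form (a
# textbook identity, not a result of the paper).
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def kl_numeric(pa, pb, lo, hi):
    # mean of ln(P_a / P_b) with respect to P_a
    integrand = lambda x: pa.pdf(x) * (pa.logpdf(x) - pb.logpdf(x))
    val, _ = quad(integrand, lo, hi, limit=200)
    return val

s_a, s_b = 1.0, 2.0
pa, pb = norm(scale=s_a), norm(scale=s_b)
kl_closed = np.log(s_b / s_a) + s_a**2 / (2.0 * s_b**2) - 0.5
kl_num = kl_numeric(pa, pb, -30.0, 30.0)
```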

4. Saddle Point Asymptotic Expansion

In general, the saddle point approach [46] can be used to provide asymptotic expansions of integrals of the form:
$$I(A) = \int_{y_1}^{y_2} f(y)\, e^{A g(y)}\, dy \tag{23}$$
where $f(y)$ and $g(y)$ are real functions, and we consider large positive values of A. If $g(y)$ is unimodal with its maximum at $y_0$, then with increasing A, the function $e^{Ag(y)}$ becomes more and more peaked, so that only values of $f(y)$ close to $y_0$ contribute significantly to the integral. This induces an asymptotic series expansion in powers of $1/A$. Setting $z = \sqrt{A}\,(y - y_0)$,
$$I(A) = \frac{f(y_0)\, e^{A g(y_0)}}{\sqrt{A}} \int_{-\infty}^{\infty} e^{z^2 g''(y_0)/2} \left[1 + \sum_{n=1}^{\infty} A^{-n/2} P_n(z)\right] dz \tag{24}$$
$$= f(y_0)\, e^{A g(y_0)} \sqrt{\frac{2\pi}{-A\, g''(y_0)}} \left[1 + \sum_{n=1}^{\infty} \frac{C_{2n}}{A^n}\right] \tag{25}$$
Here, P n ( z ) are polynomials in z, and C 2 n contain derivatives of the functions f ( y ) and g ( y ) , evaluated at y 0 . In particular, one finds
$$C_2 = -\frac{f_2}{4 f_0 g_2} + \frac{f_1 g_3}{8 f_0 g_2^2} + \frac{g_4}{32 g_2^2} - \frac{5 g_3^2}{192 g_2^3}, \tag{26}$$
where $f_i = \frac{\partial^i f}{\partial y^i}\big|_{y=y_0}$ and $g_i = \frac{\partial^i g}{\partial y^i}\big|_{y=y_0}$. While higher orders can be determined, we will use the expansion only up to order $1/A$ here.
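A minimal sketch of this machinery (our example, with an integrand chosen for checkability): for f(y) = cos y and g(y) = −y²/2, the leading term of Equation (25) is √(2π/A), while the exact value of the full-line integral is √(2π/A)·e^(−1/(2A)), so the relative error of the leading term should shrink like 1/(2A):

```python
# Hedged sketch: leading-order saddle point (Laplace) approximation,
# Equation (25) without the C_2n corrections, tested on f(y) = cos(y),
# g(y) = -y**2/2. Integration limits (-2, 2) are wide enough that the
# truncated Gaussian tails are negligible for the values of A used here.
import numpy as np
from scipy.integrate import quad

def laplace_leading(f0, g2, A):
    # f(y0) * sqrt(2*pi / (-A * g''(y0)))
    return f0 * np.sqrt(2.0 * np.pi / (-A * g2))

errs = []
for A in (10.0, 100.0, 1000.0):
    exact, _ = quad(lambda y: np.cos(y) * np.exp(-A * y * y / 2.0), -2.0, 2.0)
    approx = laplace_leading(np.cos(0.0), -1.0, A)
    errs.append(abs(approx - exact) / exact)
```

The decreasing sequence of relative errors illustrates why the expansion becomes reliable exactly when A is large, which is the regime used throughout the rest of the paper.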

5. Kullback-Leibler Entropy Time Dependence by Direct Computational Analysis

In [43], on which this paper builds, it was shown that the Kullback and Tsallis relative entropies can serve to order solutions of the space-fractional diffusion Equation (1) at a particular time. Here, we show that this property extends over time. One complication dealt with previously is that K(P_α, P_D) does not exist, due to the fat tails of P_α. Thus, the focus was on K(P_D, P_α), which does exist.
The Kullback is given by
$$K(P_D, P_\alpha) = \int_{-\infty}^{\infty} P_D(x,t) \ln\frac{P_D(x,t)}{P_\alpha(x,t)}\, dx = \int_{-\infty}^{\infty} P_D(x,t) \ln P_D(x,t)\, dx - \int_{-\infty}^{\infty} P_D(x,t) \ln P_\alpha(x,t)\, dx = -\frac{1}{2}\left(1 + \ln 4\pi Dt\right) + K_\times(P_D, P_\alpha) \tag{27}$$
where $K_\times(P_D, P_\alpha) = -\int_{-\infty}^{\infty} P_D(x,t) \ln P_\alpha(x,t)\, dx$ is known as the cross entropy.
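The Gaussian term in Equation (27) is (minus) the differential entropy of P_D. A quick numerical sketch of this identity (our illustration, with arbitrary example values of D and t):

```python
# Hedged sketch: for the Gaussian P_D with variance 2*D*t, the integral of
# P_D * ln(P_D) equals -(1/2)*(1 + ln(4*pi*D*t)), as used in Equation (27).
import numpy as np
from scipy.stats import norm

D, t = 1.0, 2.5                       # illustrative values
sigma = np.sqrt(2.0 * D * t)
x = np.linspace(-12.0 * sigma, 12.0 * sigma, 200001)
pd = norm.pdf(x, scale=sigma)
lhs = np.trapz(pd * np.log(pd), x)    # integral of P_D ln P_D
rhs = -0.5 * (1.0 + np.log(4.0 * np.pi * D * t))
```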
Figure 1. The Kullback-Leibler entropy, K ( P D , P α ) , is plotted over α for different times t. One can see that for all times, K ( P D , P α ) exhibits a monotonic decreasing behavior, thus confirming the bridge ordering property of K ( P D , P α ) .
The previous analysis of K(P_D, P_α) at t = 1 showed that the Kullback was compatible with the bridge ordering. Here, in Figure 1, we show K(P_D, P_α) as a function of α for a wide range of different times (t = 10^0, 10^2, 10^4, 10^6). The label DCA stands for “Direct Computational Analysis”. For the analysis, K(P_D, P_α) is obtained by a numerical scheme that makes use of the fact that the tail behavior of the stable distributions is known analytically. Here and in all other figures, we set D = 1. As α → 1, P_α approaches a δ-distribution; thus, our direct computations were restricted in practice to α ∈ [1.01, 2].
In Figure 1, for all cases, K(P_D, P_α) falls off quickly for α close to one, but more slowly near α = 2. The monotonic relation between K(P_D, P_α) and α confirms that the bridge ordering found at t = 1 is maintained across a wide interval of times. Note that the graph of K(P_D, P_α) for t = 1 crosses those at later times. Thus, while the curves are monotonic in α for fixed t, they are not generally monotonic in t for fixed α.
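A simplified sketch of such a direct computational analysis (our illustration): the paper's scheme treats the analytic fat tails explicitly, whereas the version below simply truncates the integral on a grid, which suffices for a qualitative check; the grid sizes are illustrative choices.

```python
# Hedged sketch of a "direct computational analysis" of K(P_D, P_alpha),
# Equation (21), by trapezoidal quadrature on a truncated grid. Assumes
# scipy.stats.levy_stable ("S1" parameterization) for the stable density.
import numpy as np
from scipy.stats import levy_stable, norm

def kullback_dca(alpha, t, D=1.0, half_width=15.0, n=1201):
    sigma = np.sqrt(2.0 * D * t)                              # Gaussian width
    gamma = (-D * np.cos(alpha * np.pi / 2.0) * t) ** (1.0 / alpha)
    x = np.linspace(-half_width * sigma, half_width * sigma, n)
    pd = norm.pdf(x, scale=sigma)
    pa = levy_stable.pdf(x, alpha, 1.0, scale=gamma)
    return np.trapz(pd * np.log(pd / pa), x)
```

For fixed t, the returned values should decrease monotonically in α, mirroring the behavior in Figure 1.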

6. Kullback-Leibler Entropy for Long Times

Though the ordering is preserved for a wide range of times, that does not by itself mean that it is preserved for arbitrarily long times. Thus, the long-time behavior of K(P_D, P_α) is deduced in the following.
The cross entropy from Equation (27) becomes
$$K_\times(P_D, P_\alpha) = -\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{x^2}{2\sigma^2}} \ln S(x \mid \alpha, 1, \gamma, 0; 1)\, dx = \ln\gamma - \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{x^2}{2\sigma^2}} \ln S_\alpha^0\!\left(\frac{x}{\gamma}\right) dx \tag{28}$$
where we have made use of Equation (7) and the normalization of the Gaussian. Using the substitution $y = x/\gamma$, we then get
$$K_\times(P_D, P_\alpha) = \ln\gamma - \frac{\gamma}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} e^{-\frac{y^2\gamma^2}{2\sigma^2}} \ln S_\alpha^0(y)\, dy \tag{29}$$
Note that, due to $\gamma = (D_\alpha t)^{1/\alpha}$ and $\sigma = \sqrt{2Dt}$,
$$\frac{\sigma}{\gamma} = \frac{\sqrt{2Dt}}{(D_\alpha t)^{1/\alpha}} = \frac{\sqrt{2D}}{D_\alpha^{1/\alpha}}\, t^{\,1/2 - 1/\alpha} \tag{30}$$
is an α-dependent quantity in which the exponent of t is always negative for 1 < α < 2. It follows that for t → ∞, the inverse ratio γ/σ diverges; for large t, it is already large, and thus, we can make use of the saddle point method.
Using the results of Section 4 together with the following definitions
$$A = \left(\frac{\gamma}{\sigma}\right)^2 \tag{31}$$
$$f(y) = \ln S_\alpha^0(y) \tag{32}$$
$$g(y) = -\frac{y^2}{2} \tag{33}$$
and noting that the maximum of $g(y)$ is at $y_0 = 0$, we obtain an asymptotic expansion for the cross entropy
$$K_\times(P_D, P_\alpha) = \ln\gamma - \ln S_\alpha^0(0)\, \frac{\gamma}{\sqrt{2\pi}\,\sigma} \sqrt{\frac{2\pi}{(\gamma/\sigma)^2}} \left[1 + \sum_{n=1}^{\infty} \frac{C_{2n}}{A^n}\right] \tag{34}$$
$$= \ln\gamma - \ln S_\alpha^0(0) \left[1 + \frac{C_2}{A} + O\!\left(\frac{1}{A^2}\right)\right] \tag{35}$$
Combining Equations (27) and (35), we conclude
$$K(P_D, P_\alpha) = -\frac{1}{2}\left(1 + \ln 4\pi Dt\right) + \frac{\ln(D_\alpha t)}{\alpha} - \ln S_\alpha^0(0) \left[1 + \frac{C_2}{A} + O\!\left(\frac{1}{A^2}\right)\right] \tag{36}$$
$$\equiv K_1 + K_2(t) + \left(\frac{1}{\alpha} - \frac{1}{2}\right) \ln t \tag{37}$$
where $K_1$ is constant in time and, for 1 < α < 2, the factor multiplying ln t is positive. $K_2(t)$ decays with time, and thus, for large times, K(P_D, P_α) diverges.
The constant $K_1$ and the time-dependent $K_2(t)$ can be evaluated further using the properties of the stable distribution. We note that $C_2$ simplifies considerably, as $g^{(\nu)}(y) = 0$ for $\nu > 2$. Using
$$C_2 = \frac{1}{4 \ln S_\alpha^0(0)} \left[\frac{S_\alpha^2(0)}{S_\alpha^0(0)} - \left(\frac{S_\alpha^1(0)}{S_\alpha^0(0)}\right)^2\right] \tag{38}$$
we find
$$K_1 = \frac{1}{\alpha}\ln D - \frac{1}{2}\ln 4\pi D - \ln\!\left[\frac{\Gamma\!\left(\frac{1}{\alpha}\right)}{\alpha\pi} \sin\frac{\pi}{\alpha}\right] - \frac{1}{2} \tag{39}$$
and
$$K_2(t) = \frac{(Dt)^{1 - \frac{2}{\alpha}}}{2\sin\frac{\pi}{\alpha}} \left[\left(\alpha\Gamma_4 + \alpha^2\Gamma_3^2\right)\sin\frac{3\pi}{\alpha} + \alpha^2\Gamma_3^2 \sin\frac{\pi}{\alpha}\right] \tag{40}$$
Figure 2. A comparison between the direct numerical calculation of the Kullback-Leibler entropy, K ( P D , P α ) (DCA), the saddle point method of the zeroth order (SP0) and the saddle point method of first order (SP1) is shown over logarithmic time t for two different values of α. One observes that the approximations approach the DCA data points for large times and fit the data quite well already for t > 1 .
In Figure 2, a comparison between the numerical evaluation of K(P_D, P_α) (DCA) and its approximations SP0 and SP1, based on the saddle point method, is displayed. Here, SP0 is the approximation excluding terms of O(1/A), while SP1 excludes terms of O(1/A²). One sees how the approximations approach the data points obtained by direct numerical calculation for large times. Especially for α = 1.7, the gain in quality from using the higher order SP1 is obvious; in that case, SP1 covers nearly all times t > 1. This is not so for α = 1.3, where one sees a sizable deviation for t < 10. However, all of the asymptotic forms produce good agreement with the DCA values for t > 10^6.
While the long time behavior is thus understood, the short time behavior shows a surprising feature. For short times, K ( P D , P α ) shows an initial decay until a minimum is reached. Only then does the Kullback start to grow and approach its long time behavior.
In order to understand this effect, the sequence of graphs in Figure 3 shows how the distributions P_D and ln P_α change in time. The plots are for t = 10^0, 10^2, 10^4, 10^6, with α = 1.3. We see that the Gaussian P_D acts like a window that suppresses ln P_α where P_D is exponentially small. For small times, ln P_α varies considerably in that window, and thus, it cannot be approximated reasonably by its value and its first two derivatives at the peak position of the Gaussian. This changes as the time increases: ln P_α becomes flatter and flatter, and for t = 10^6, the approximation works very well.
Figure 3. Four plots of P_D and ln P_α over x are given for t = 1 (a), t = 10^2 (b), t = 10^4 (c) and t = 10^6 (d) for α = 1.3. It can be seen that within the width of P_D, the function ln P_α becomes flatter with increasing time. Thus, ln P_α can be approximated well by its function value and its first two derivatives.
Based on the asymptotic time behavior of K(P_D, P_α), we can now see how K(P_D, P_α) changes with α at a given time t. We do so by taking the derivative of the asymptotic form of K(P_D, P_α) with respect to α:
$$\frac{\partial}{\partial\alpha} K(P_D, P_\alpha)(t) \approx \frac{\partial K_1}{\partial\alpha} + \frac{\partial K_2(t)}{\partial\alpha} - \frac{1}{\alpha^2}\ln t \tag{41}$$
For the different terms in Equation (41), we get
$$\frac{\partial K_1}{\partial\alpha} = \frac{1}{\alpha} + \frac{1}{\alpha^2}\left[\Psi_1 + \pi\cot\frac{\pi}{\alpha} - \ln D\right] \tag{42}$$
$$\frac{\partial K_2(t)}{\partial\alpha} = \frac{(Dt)^{1 - \frac{2}{\alpha}}}{2}\left(\frac{F_1}{\alpha} + \frac{F_2}{\alpha^2}\right) + \frac{2 K_2(t)}{\alpha^2}\ln t \tag{43}$$
where
$$F_1 \equiv \Gamma_4 \sin\frac{3\pi}{\alpha} + 4\,\Gamma_3^2 \cos\frac{\pi}{\alpha} \sin\frac{2\pi}{\alpha} \tag{44}$$
$$F_2 \equiv \Gamma_4 \sin\frac{3\pi}{\alpha}\left[\Psi_1 - 4\Psi_4 + \pi\cot\frac{\pi}{\alpha} + 3\pi\cot\frac{3\pi}{\alpha} + \ln D\right] + 2\,\Gamma_3^2 \sin\frac{2\pi}{\alpha}\cos\frac{\pi}{\alpha}\left[2\Psi_1 - 6\Psi_3 + \pi\cot\frac{\pi}{\alpha} + \ln D\right] + 2\,\Gamma_3^2 \cos\frac{\pi}{\alpha}\left[\pi - 3\pi\cos\frac{2\pi}{\alpha}\right] \tag{45}$$
Inserting Equations (42)–(45) into Equation (41) and collecting all time-dependent terms leads to
$$\frac{\partial}{\partial\alpha} K(P_D, P_\alpha)(t) = \frac{2 K_2(t) - 1}{\alpha^2}\ln t + \left(\frac{F_1}{\alpha} + \frac{F_2}{\alpha^2}\right)\frac{(Dt)^{1 - \frac{2}{\alpha}}}{2} + \frac{1}{\alpha} + \frac{\Psi_1 + \pi\cot\frac{\pi}{\alpha} - \ln D}{\alpha^2} \tag{46}$$
For large times, this simplifies to:
$$\frac{\partial}{\partial\alpha} K(P_D, P_\alpha)(t) \approx \frac{2 K_2(t) - 1}{\alpha^2}\ln t \approx -\frac{\ln t}{\alpha^2} < 0 \qquad (t \to \infty) \tag{47}$$
From Equation (40), we see that |K_2(t)| → 0 in the long-time limit. Thus, the Kullback is a monotonically decreasing function of α in the large time limit, ensuring that it retains its ordering property in α for large times. This result anchors at infinity the ordering property already established in the preceding sections for finite times.

7. The Tsallis Relative Entropy Time Dependence by Direct Computational Analysis

The Tsallis relative entropy provides another means to establish an ordering between P_D and P_α. Contrary to the Kullback case, both T_q(P_α, P_D) and T_q(P_D, P_α) exist. In [43], we showed that these two Tsallis relative entropies are defined for 0 ≤ q < 1. Based on this insight, one finds an interesting relation between the two:
$$T_q(P_\alpha, P_D) = \frac{1}{q-1}\left[\int_{-\infty}^{\infty} P_\alpha^{\,q}(x,t)\, P_D^{\,1-q}(x,t)\, dx - 1\right] \tag{48}$$
and
$$T_{1-q}(P_D, P_\alpha) = \frac{1}{(1-q)-1}\left[\int_{-\infty}^{\infty} P_D^{\,1-q}(x,t)\, P_\alpha^{\,q}(x,t)\, dx - 1\right] \tag{49}$$
from which we obtain
$$(-q)\, T_{1-q}(P_D, P_\alpha) = (q-1)\, T_q(P_\alpha, P_D) \tag{50}$$
It thus suffices to analyze the time behavior of one of the two. Here, we choose T_q(P_D, P_α), with
$$T_q(P_D, P_\alpha) = \frac{1}{q-1}\left[\int_{-\infty}^{\infty} \left(\frac{e^{-\frac{x^2}{2\sigma^2}}}{\sqrt{2\pi}\,\sigma}\right)^{q} S(x \mid \alpha, 1, \gamma, 0; 1)^{1-q}\, dx - 1\right] = \frac{1}{q-1}\left[\frac{\gamma^{\,q-1}}{(\sqrt{2\pi}\,\sigma)^{q}} \int_{-\infty}^{\infty} \exp\!\left(-\frac{q x^2}{2\sigma^2}\right) S_\alpha^0\!\left(\frac{x}{\gamma}\right)^{1-q} dx - 1\right] \tag{51}$$
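The duality in Equation (50) holds for any admissible pair of densities and is easy to check numerically. The sketch below (our example) uses two zero-mean Gaussians rather than P_D and P_α, purely to keep the computation fast:

```python
# Hedged sketch: numerical check of the duality Equation (50),
# (-q) T_{1-q}(P_b, P_a) = (q - 1) T_q(P_a, P_b), for an example pair of
# densities (two zero-mean Gaussians; the identity is not specific to them).
import numpy as np
from scipy.stats import norm

def tsallis(pa, pb, q, grid):
    # T_q(P_a, P_b) of Equation (22) by trapezoidal quadrature
    integral = np.trapz(pa(grid)**q * pb(grid)**(1.0 - q), grid)
    return (integral - 1.0) / (q - 1.0)

grid = np.linspace(-40.0, 40.0, 20001)
pa = lambda x: norm.pdf(x, scale=1.0)
pb = lambda x: norm.pdf(x, scale=3.0)
q = 0.3
lhs = (-q) * tsallis(pb, pa, 1.0 - q, grid)
rhs = (q - 1.0) * tsallis(pa, pb, q, grid)
```

Both sides reduce to the same integral, so they agree to machine precision; the Tsallis relative entropy itself is positive for distinct densities.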
T_q(P_D, P_α) was determined for four different times (t = 10^0, 10^2, 10^4, 10^6) by direct computational analysis. As in the Kullback case, the data were obtained by a numerical integration procedure, which for the Tsallis case also requires a precise treatment of the known fat tail behavior of the stable distributions. Figure 4 depicts the results for q = 0.5. One can see that T_q(P_D, P_α) decays monotonically with α, thus showing that it provides an ordering compatible with the bridge ordering for the times presented. The figure also seems to indicate that T_q(P_D, P_α) increases monotonically with time for fixed α. That, however, is not true. For short times, the DCA data in Figure 6 show that T_{0.5}(P_D, P_α) does not increase monotonically in time. Instead, it decreases first and then, around t = 1, starts to increase.
Figure 4. For the case of q = 0.5, the Tsallis relative entropy, T_{0.5}(P_D, P_α), is given over α for different times (t = 10^0, 10^2, 10^4, 10^6). One can observe that with increasing time, the monotonically decreasing behavior is preserved, and thus, the bridge ordering property of T_q(P_D, P_α) is confirmed.

8. Tsallis Relative Entropy for Long Times

As in the Kullback case, we want to analyze the time dependence further, to show that the bridge ordering is preserved at least for large enough times. The method used is again a saddle point approximation. Using the substitution $y = x/\gamma$, we get
$$T_q(P_D, P_\alpha) = \frac{1}{q-1}\left[\left(\frac{\gamma}{\sqrt{2\pi}\,\sigma}\right)^{q} \int_{-\infty}^{\infty} e^{-\frac{q y^2 \gamma^2}{2\sigma^2}} \left[S_\alpha^0(y)\right]^{1-q} dy - 1\right] \tag{52}$$
The ratio γ/σ is the same as in the Kullback section and, thus, diverges for large times. Based on Equation (25), we make the following definitions:
$$A = \left(\frac{\gamma}{\sigma}\right)^2 \tag{53}$$
$$f(y) = \left[S_\alpha^0(y)\right]^{1-q} \tag{54}$$
$$g(y) = -\frac{q y^2}{2} \tag{55}$$
Again, the maximum of $g(y)$ is at $y_0 = 0$, and by utilizing Equation (25), we find, up to $O(1/A^2)$,
$$T_q(P_D, P_\alpha) = \frac{1}{q-1}\, \frac{1}{\sqrt{q}} \left[\sqrt{2\pi}\, \frac{\sigma}{\gamma}\, S_\alpha^0(0)\right]^{1-q} \left[1 + \frac{C_2}{A} + O\!\left(\frac{1}{A^2}\right)\right] - \frac{1}{q-1} \tag{56}$$
Here, $C_2$ depends on q and takes the form
$$C_{2,q} = \frac{1-q}{4q}\left[\frac{S_\alpha^2(0)}{S_\alpha^0(0)} - q\left(\frac{S_\alpha^1(0)}{S_\alpha^0(0)}\right)^2\right] \tag{57}$$
Collecting terms, we find the asymptotic time behavior of the Tsallis relative entropy for large t:
$$T_q(P_D, P_\alpha) \approx \frac{\left[\sqrt{2\pi}\, S_\alpha^0(0)\right]^{1-q}}{\sqrt{q}\,(q-1)} \left[\left(\frac{\sigma}{\gamma}\right)^{1-q} + C_{2,q}\left(\frac{\sigma}{\gamma}\right)^{3-q}\right] - \frac{1}{q-1} \tag{58}$$
Unlike the Kullback case, this asymptotic form has a finite limit for t → ∞. The limit is 1/(1−q), because lim_{t→∞} σ/γ = 0 and both exponents are positive, due to 0 ≤ q < 1. This can be seen in the unapproximated form, too:
$$T_q(P_D, P_\alpha) = \frac{1}{q-1}\left[\left(\frac{\gamma}{\sqrt{2\pi}\,\sigma}\right)^{q} \int_{-\infty}^{\infty} e^{-\frac{q y^2 \gamma^2}{2\sigma^2}} \left[S_\alpha^0(y)\right]^{1-q} dy - 1\right] = \frac{1}{q-1}\left[\frac{(2\pi)^{\frac{1-q}{2}}}{\sqrt{q}}\left(\frac{\sigma}{\gamma}\right)^{1-q} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\Sigma}\, e^{-\frac{y^2}{2\Sigma^2}} \left[S_\alpha^0(y)\right]^{1-q} dy - 1\right] \tag{59}$$
where $\Sigma = \frac{1}{\sqrt{q}}\frac{\sigma}{\gamma}$. We see that in the t → ∞ limit, the Gaussian becomes a δ-function, which picks out the value of $[S_\alpha^0(y)]^{1-q}$ at y = 0, making the integral finite. The prefactor decays towards zero, and thus, the approach of T_q(P_D, P_α) to the limiting value 1/(1−q) is confirmed.
Figure 5. A comparison of direct numerical calculation of the Tsallis relative entropy, T q ( P D , P 1 . 3 ) (DCA), and the saddle point method of the first order (SP1) is given over logarithmic time t for different values of q.
Figure 6. A comparison of direct numerical calculation of the Tsallis relative entropy, T 0 . 5 ( P D , P α ) (DCA), and the saddle point method of the first order (SP1) is given over logarithmic time t for different values of α. Note that T q ( P D , P α ) is not monotonic in time for larger values of α, but has a clear minimum around t = 1 .
The quality of the saddle point approximation is shown in Figure 5 and Figure 6, where we compare the saddle point approximation SP1 of T_q(P_D, P_α) with the numerically obtained values (DCA) as a function of time. How quickly the approximation approaches the numerical data depends strongly on the choice of the parameters α and q. For small values of α and q, the approximation fits the numerical data relatively early, as shown in Figure 5. On the other hand, Figure 6 shows that the approximation works less well for larger α.
It can also be seen in Figure 5 and Figure 6 that for times t < 1, the asymptotic approach exhibits different curve shapes, depending on the choice of α and q. This can be understood by looking at Equation (58) after applying Equations (9)–(13) and Equation (57), which leads to
$$T_q(P_D, P_\alpha) \approx \frac{\left[\frac{2\sqrt{\pi}\, S_\alpha^0(0)}{\left(-\cos\frac{\alpha\pi}{2}\right)^{1/\alpha}}\right]^{1-q}}{\sqrt{q}\,(q-1)}\, (Dt)^{\left(\frac{1}{2} - \frac{1}{\alpha}\right)(1-q)} \left[1 + H_{2,q}\,(Dt)^{1 - \frac{2}{\alpha}}\right] - \frac{1}{q-1} \tag{60}$$
with
$$H_{2,q} = \frac{2\alpha(q-1)}{4q}\left[\frac{\Gamma_4 \sin\frac{3\pi}{\alpha}}{\left(-\cos\frac{\alpha\pi}{2}\right)^{1/\alpha}} + q\,\alpha\,\Gamma_3^2 \sin^2\frac{2\pi}{\alpha}\right] \tag{61}$$
The first factor of Equation (60) is always positive, but because of the factor $\sin\frac{3\pi}{\alpha}$, $H_{2,q}$ can change sign. This depends on the parameters α and q and is insignificant for large t.
Finally, the asymptotic form shows that T_q(P_D, P_α) decreases monotonically with α for large times, because
$$\frac{\partial}{\partial\alpha} T_q \approx \left[G_{1,q} + G_{2,q} \ln t\right] t^{\left(\frac{1}{2} - \frac{1}{\alpha}\right)(1-q)} + \left[G_{3,q} + G_{4,q} \ln t\right] t^{\left(\frac{1}{2} - \frac{1}{\alpha}\right)(3-q)} \tag{62}$$
where the $G_{i,q}$ are constants in t. On 1 < α < 2 and 0 < q < 1, the inequalities
$$\left(\frac{1}{2} - \frac{1}{\alpha}\right)(3-q) < \left(\frac{1}{2} - \frac{1}{\alpha}\right)(1-q) < 0 \tag{63}$$
hold. Thus, $\frac{\partial}{\partial\alpha} T_q$ will be dominated by the $G_{2,q}$ term in Equation (62) for long times, as it decays most slowly. Since
$$G_{2,q} = -\frac{\left[\sqrt{2\pi}\, S_\alpha^0(0)\right]^{1-q}}{\sqrt{q}\,\alpha^2} \left(\frac{\sqrt{2D}}{D_\alpha^{1/\alpha}}\right)^{1-q} < 0 \tag{64}$$
it follows that
$$\frac{\partial}{\partial\alpha} T_q(P_D, P_\alpha) \approx G_{2,q}\, \ln(t)\, t^{\left(\frac{1}{2} - \frac{1}{\alpha}\right)(1-q)} < 0 \tag{65}$$
for large enough times. Thus, like the Kullback case, the Tsallis relative entropy remains monotonically decreasing in the infinite time limit, confirming its ordering property at both finite times and infinity. However, while signs remain fixed, the Tsallis case differs from the Kullback case in that the α-slope goes to zero in the long-time limit, while that slope diverges in the Kullback case. Moreover, the Kullback also diverges with t, while T q ( P D , P α ) approaches a constant, 1 / ( 1 - q ) , for large t.

9. Summary

This paper is a sequel to [43], in which relative entropies were introduced to address the physical issue of irreversibility in the mathematical context of a one-parameter bridging regime between the diffusion and wave equations. This peculiar regime constitutes a definitive family of processes between diffusion and waves, providing a direct formal mathematical test of our understanding of the differences between the reversible and the irreversible. Not only does this regime exhibit paradoxical entropy production behavior (i.e., entropy production rates increase toward the wave limit), but the entropy has a maximum in the interval, with the consequence that one cannot determine by such means which system is relatively “closer to” or “further from” the pure diffusion limit. However, the preceding paper showed that relative entropies could do what neither entropy nor entropy production rates could: provide an intuitively sensible ordering, in which diffusion stands at a maximum and all other bridge processes become well ordered, due to a monotonically decreasing relationship with the parameter α.
The goal of this paper was to extend the treatment in [43] by addressing the issue of time. This paper asked whether the well-defined ordering produced by relative entropies was just a feature of a single time, or whether the monotonic structure persists over all time. The relevant formalism was briefly set up in two parts. First, the necessary features of stable distributions, which are solutions to the dynamics in Equation (1), were presented. Then, the appropriate asymptotic representations of integrals in terms of the saddle point method were put into place. From this foundation, the two cases of the Kullback-Leibler and Tsallis relative entropies were explored for long times. Direct numerical computation showed that both types of relative entropies preserved the α-ordering for a wide range of time-scales. Then, asymptotic methods were compared to these direct computations, confirming that they agree well with each other for long times. It was then shown that the ordering did persist in the long-time limit in both cases. However, the Tsallis and Kullback-Leibler cases differ in that the former reached a finite limit, 1/(1−q), and its α-derivative vanished in the limit of long times, while the latter diverged in both relative entropy and its α-derivative.
Time evolution of relative entropies under a single process, particularly the Kullback-Leibler entropy, is well known to be connected to H-theorems. Thus, it would not be surprising if the time evolution of probability densities might make one think, at least initially, in terms of H-theorem questions; so, it is worth noting parenthetically that this is not an H-theorem scenario. The time evolution in this context is between pairs of processes belonging to the bridging families and not to a single process. There is no reason to expect that pairs of densities would relax to each other when subject to different processes. However, it brings back the issue of different internal quicknesses [40,41,42], although now articulated from the standpoint of relative entropies. Moreover, this paper focuses on the ordering property, which ultimately is about the differences between the densities. Nonetheless, we could consider nearby processes with infinitesimal differences in families and ask whether an H-theorem-like result might hold then, but that is a topic for future work.

Acknowledgements

Financial support by the German Research Foundation (DFG) in the form of Sachmittelbeihilfe (HO901/7-1 and HO901/7-2), part of the project PAK 97, is gratefully acknowledged by Karl Heinz Hoffmann, Frank Boldt and Janett Prehl.
The publication costs of this article were funded by the German Research Foundation (DFG, Geschäftszeichen INST 270/219-1) and the Technische Universität Chemnitz in the funding program Open Access Publishing.

Conflict of Interest

The authors declare no conflict of interest.

References

  1. Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Stat. 1951, 22, 79–86. [Google Scholar] [CrossRef]
  2. Schlögl, F. Probability and Heat; Vieweg: Braunschweig, Germany, 1989. [Google Scholar]
  3. Tsallis, C. Generalized entropy-based criterion for consistent testing. Phys. Rev. E 1998, 58, 1442–1445. [Google Scholar] [CrossRef]
  4. Borland, L.; Plastino, A.R.; Tsallis, C. Information gain within nonextensive thermostatics. J. Math. Phys. 1998, 39, 6490–6501. [Google Scholar] [CrossRef]
  5. Borland, L.; Plastino, A.R.; Tsallis, C. Erratum: “Information gain within generalized thermostatistics” [J. Math. Phys. 39, 6490 (1998)]. J. Math. Phys. 1999, 40, 2196. [Google Scholar] [CrossRef]
  6. Furuichi, S.; Yanagi, K.; Kuriyama, K. Fundamental properties of Tsallis relative entropy. J. Math. Phys. 2004, 45, 4868–4877. [Google Scholar]
  7. Yamano, T. A possible extension of Shannon’s information theory. Entropy 2001, 3, 280–292. [Google Scholar] [CrossRef]
  8. Yu, S.; Mehta, P.G. The Kullback-Leibler Rate Metric for Comparing Dynamical Systems. In Proceedings of IEEE Conference on Decision and Control, joint with 28th Chinese Control Conference, Shanghai, China, 16–18 December 2009; pp. 8363–8368.
  9. Anastasiadis, A. Special issue: Tsallis entropy. Entropy 2012, 14, 174–176. [Google Scholar] [CrossRef]
  10. Zhang, Y.; Wu, L. Optimal multi-level thresholding based on maximum Tsallis entropy via an artificial bee colony approach. Entropy 2011, 13, 841–859. [Google Scholar] [CrossRef]
  11. Nelson, K.P.; Scannell, B.J.; Landau, H. A risk profile for information fusion algorithms. Entropy 2011, 13, 1518–1532. [Google Scholar] [CrossRef]
  12. Balasis, G.; Daglis, I.A.; Papadimitriou, C.; Anastasiadis, A.; Sandberg, I.; Eftaxias, K. Quantifying dynamical complexity of magnetic storms and solar flares via nonextensive Tsallis entropy. Entropy 2011, 13, 1865–1881. [Google Scholar] [CrossRef]
  13. Telesca, L. Tsallis-based nonextensive analysis of the Southern California seismicity. Entropy 2011, 13, 1267–1280. [Google Scholar] [CrossRef]
  14. Ribeiro, M.S.; Nobre, F.D.; Curado, E.M.F. Classes of n-dimensional nonlinear Fokker-Planck equations associated to Tsallis entropy. Entropy 2011, 13, 1928–1944. [Google Scholar] [CrossRef]
  15. Weiss, M.; Elsner, M.; Kartberg, F.; Nilsson, T. Anomalous subdiffusion is a measure for cytoplasmic crowding in living cells. Biophys. J. 2004, 87, 3518–3524. [Google Scholar] [CrossRef] [PubMed]
  16. Banks, D.S.; Fradin, C. Anomalous diffusion of proteins due to molecular crowding. Biophys. J. 2005, 89, 2960–2971. [Google Scholar] [PubMed]
  17. Yuste, S.B.; Lindberg, K. Subdiffusion-limited reactions. Chem. Phys. 2002, 284, 169–180. [Google Scholar] [CrossRef]
  18. Havlin, S.; Ben-Avraham, D. Diffusion in disordered media. Adv. Phys. 1987, 36, 695–798. [Google Scholar] [CrossRef]
  19. Schirmacher, W.; Prem, M.; Suck, J.B.; Heidemann, A. Anomalous diffusion of hydrogen in amorphous metals. Europhys. Lett. 1990, 13, 523–529. [Google Scholar]
  20. Solomon, T.R.; Weeks, E.R.; Swinney, H.L. Observation of anomalous diffusion and Lévy flights in a two-dimensional rotating flow. Phys. Rev. Lett. 1993, 71, 3975–3978. [Google Scholar] [CrossRef] [PubMed]
  21. Hansen, A.E.; Jullien, M.C.; Paret, J.; Tabeling, P. Dispersion in Freely Decaying and Forced 2D Turbulence. In Anomalous Diffusion: From Basics to Applications; Pekalski, A., Kutner, R., Eds.; Lecture Notes in Physics; Springer-Verlag: Berlin, Germany, 1999; pp. 186–196. [Google Scholar]
  22. Bénichou, O.; Coppey, M.; Moreau, M.; Suet, P.H.; Voituriez, R. Optimal search strategies for hidden targets. Phys. Rev. Lett. 2005, 94, 198101. [Google Scholar] [CrossRef] [PubMed]
  23. Bénichou, O.; Loverdo, C.; Moreau, M.; Voituriez, R. Two-dimensional intermittent search processes: An alternative to Lévy flight strategies. Phys. Rev. E 2006, 74, 020102. [Google Scholar] [CrossRef]
  24. Shlesinger, M.F. Mathematical physics: Search research. Nature 2006, 443, 281–282. [Google Scholar] [CrossRef] [PubMed]
  25. Barthelemy, P.; Bertolotti, J.; Wiersma, D.S. A Lévy flight for light. Nature 2008, 453, 495–498. [Google Scholar] [CrossRef] [PubMed]
  26. Malacarne, L.C.; Mendes, R.S.; Pedron, I.T.; Lenzi, E.K. Nonlinear equation for anomalous diffusion: Unified power-law and stretched exponential exact solution. Phys. Rev. E 2001, 63, 030101. [Google Scholar] [CrossRef]
  27. Pedron, I.T.; Mendes, R.S.; Buratta, T.J.; Malacarne, L.C.; Lenzi, E.K. Logarithmic diffusion and porous media equations: A unified description. Phys. Rev. E 2005, 72, 031106. [Google Scholar] [CrossRef]
  28. Schneider, W.R.; Wyss, W. Fractional diffusion and wave equation. J. Math. Phys. 1989, 30, 134–144. [Google Scholar] [CrossRef]
  29. Giona, M.; Roman, H.E. Fractional diffusion equation for transport phenomena in random media. Physica A 1992, 185, 87–97. [Google Scholar] [CrossRef]
  30. Metzler, R.; Glöckle, W.G.; Nonnenmacher, T.F. Fractional model equation for anomalous diffusion. Physica A 1994, 211, 13–24. [Google Scholar] [CrossRef]
  31. Metzler, R.; Klafter, J. The random walk’s guide to anomalous diffusion: A fractional dynamics approach. Phys. Rep. 2000, 339, 1–77. [Google Scholar] [CrossRef]
  32. Hilfer, R. Applications of Fractional Calculus in Physics; World Scientific Publishing: Singapore, Singapore, 2000. [Google Scholar]
  33. del Castillo-Negrete, D. Fractional Diffusional Models of Anomalous Transport. In Anomalous Transport: Foundations and Applications, 1st ed.; Klages, R., Radons, G., Sokolov, I.M., Eds.; Wiley-VCH: Weinheim, Germany, 2008; chapter 6; pp. 163–212. [Google Scholar]
  34. Schulzky, C.; Essex, C.; Davison, M.; Franz, A.; Hoffmann, K.H. The similarity group and anomalous diffusion equations. J. Phys. A: Math. Gen. 2000, 33, 5501–5511. [Google Scholar] [CrossRef]
  35. Fischer, A.; Seeger, S.; Hoffmann, K.H.; Essex, C.; Davison, M. Modeling anomalous superdiffusion. J. Phys. A: Math. Gen. 2007, 40, 11441–11452. [Google Scholar] [CrossRef]
  36. Hoffmann, K.H.; Prehl, J. Anomalous Transport in Disordered Fractals. In Anomalous Transport: Foundations and Applications, 1st ed.; Klages, R., Radons, G., Sokolov, I.M., Eds.; Wiley-VCH: Weinheim, Germany, 2008. [Google Scholar]
  37. Li, X.; Essex, C.; Davison, M.; Hoffmann, K.H.; Schulzky, C. Fractional diffusion, irreversibility and entropy. J. Non-Equilib. Thermodyn. 2003, 28, 279–291. [Google Scholar] [CrossRef]
  38. Dybiec, B.; Gudowska-Nowak, E.; Hänggi, P. Lévy-Brownian motion on finite intervals: Mean first passage time analysis. Phys. Rev. E 2006, 73, 046104. [Google Scholar] [CrossRef]
  39. Prehl, J.; Essex, C.; Hoffmann, K.H. The superdiffusion entropy production paradox in the space-fractional case for extended entropies. Physica A 2010, 389, 215–224. [Google Scholar] [CrossRef]
  40. Hoffmann, K.H.; Essex, C.; Schulzky, C. Fractional diffusion and entropy production. J. Non-Equilib. Thermodyn. 1998, 23, 166–175. [Google Scholar] [CrossRef]
  41. Essex, C.; Schulzky, C.; Franz, A.; Hoffmann, K.H. Tsallis and Rényi entropies in fractional diffusion and entropy production. Physica A 2000, 284, 299–308. [Google Scholar] [CrossRef]
  42. Hoffmann, K.H.; Essex, C.; Prehl, J. A unified approach to resolving the entropy production paradox. J. Non-Equilib. Thermodyn. 2012, 37, 393–412. [Google Scholar] [CrossRef]
  43. Prehl, J.; Essex, C.; Hoffmann, K.H. Tsallis relative entropy and anomalous diffusion. Entropy 2012, 14, 701–716. [Google Scholar] [CrossRef]
  44. Nolan, J.P. Stable Distributions—Models for Heavy Tailed Data; Birkhäuser: Boston, MA, USA, 2012; in progress. Chapter 1 available online: http://academic2.american.edu/~jpnolan/stable/stable.html (accessed on 30 March 2012); Chapter 3, private communication.
  45. Bender, C.M.; Orszag, S.A. Advanced Mathematical Methods for Scientists and Engineers: Asymptotic Methods and Perturbation Theory; McGraw Hill: New York, NY, USA, 1978. [Google Scholar]
  46. Daniels, H.E. Saddlepoint approximations in statistics. Ann. Math. Stat. 1954, 25, 631–650. [Google Scholar] [CrossRef]

Share and Cite

MDPI and ACS Style

Prehl, J.; Boldt, F.; Essex, C.; Hoffmann, K.H. Time Evolution of Relative Entropies for Anomalous Diffusion. Entropy 2013, 15, 2989-3006. https://doi.org/10.3390/e15082989

