Article

Investigating Information Geometry in Classical and Quantum Systems through Information Length

Eun-jin Kim
School of Mathematics and Statistics, University of Sheffield, Sheffield S3 7RH, UK
Entropy 2018, 20(8), 574; https://doi.org/10.3390/e20080574
Submission received: 19 July 2018 / Revised: 1 August 2018 / Accepted: 1 August 2018 / Published: 3 August 2018
(This article belongs to the Special Issue Entropy: From Physics to Information Sciences and Geometry)

Abstract

Stochastic processes are ubiquitous in nature and laboratories, and play a major role across traditional disciplinary boundaries. These stochastic processes are described by different variables and are thus very system-specific. In order to elucidate the underlying principles governing different phenomena, it is extremely valuable to utilise a mathematical tool that is not specific to a particular system. We provide such a tool based on information geometry: the similarity and disparity between Probability Density Functions (PDFs) are quantified by a metric such that the distance between two PDFs increases with the disparity between them. Specifically, we invoke the information length $L(t)$ to quantify the information change associated with a time-dependent PDF. $L(t)$ is uniquely defined as a function of time for a given initial condition. We demonstrate the utility of $L(t)$ in understanding information change and attractor structure in classical and quantum systems.

1. Introduction

Stochastic processes are ubiquitous in nature and laboratories, and play a major role across traditional disciplinary boundaries. Due to the randomness associated with stochasticity, the evolution of these systems is not deterministic but probabilistic. Furthermore, these stochastic processes are described by different variables and are thus very system-specific. This system-specificity makes direct comparison among different processes difficult. In order to understand universality or the underlying principles governing different phenomena, it is extremely valuable to utilise a mathematical tool that is not specific to a particular system. This is especially indispensable given the diversity of stochastic processes and the growing amount of data.
Information geometry provides a powerful methodology to achieve this goal. Specifically, the similarity and disparity between Probability Density Functions (PDFs) is quantified by a metric [1] such that the distance between two PDFs increases with the disparity between them. This was the very idea behind a statistical distance [2] based on the Fisher (or Fisher–Rao) metric [3] which represents the total number of statistically different states between two PDFs in Hilbert space for quantum systems. The analysis in [2] was extended to impure (mixed-state) quantum systems using a density operator by [4]. Other related work includes [5,6,7,8,9,10,11,12]. For Gaussian PDFs, a statistically different state is attained when the physical distance exceeds the resolution set by the uncertainty (PDF width).
This paper presents a method to define such a distance for a PDF which changes continuously in time, as is often the case in non-equilibrium systems. Specifically, we invoke the information length $L(t)$, defined as the total number of statistically different states that a system evolves through in time. $L(t)$ is uniquely defined as a function of time for a given initial condition. We demonstrate the utility of $L(t)$ in understanding information change and attractor structure in classical and quantum systems [13,14,15,16,17,18,19,20,21].
This paper is structured as follows: Section 2 discusses information length and Section 3 investigates attractor structure. Section 4 and Section 5 present the analysis of classical music and quantum systems, respectively. Conclusions are found in Section 6.

2. Information Length

Intuitively, we define the information length L by computing how quickly information changes in time and then measuring the elapsed time in units of that time scale. Specifically, the time scale of information change, τ, can be computed as the correlation time of a time-dependent PDF, say $p(x,t)$, as follows:
$$\frac{1}{\tau^2} = \int dx\, \frac{1}{p(x,t)} \left[ \frac{\partial p(x,t)}{\partial t} \right]^2. \qquad (1)$$
From Equation (1), $\tau = \tau(t)$ has the dimension of time and serves as a dynamical time unit for information change. $L(t)$ is then the total information change between time 0 and t:
$$L(t) = \int_0^t \frac{dt_1}{\tau(t_1)} = \int_0^t dt_1 \sqrt{\int dx\, \frac{1}{p(x,t_1)} \left[ \frac{\partial p(x,t_1)}{\partial t_1} \right]^2}. \qquad (2)$$
In principle, $\tau(t)$ in Equation (1) can depend on time, so we need the integral for L in Equation (2). As an analogy, consider an oscillator with period $\tau = 2$ s: within a clock time of 10 s, there are five oscillations. When the period τ changes with time, we instead need to integrate $dt/\tau$ over the time interval.
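Equations (1) and (2) can be evaluated numerically for any PDF sampled on a grid. The following sketch (the grid sizes and the drifting-Gaussian test case are illustrative choices of this summary, not from the original analysis) discretises both equations and checks the result against the exact value $L = \sqrt{2\beta}\,v\,t$ for a Gaussian of fixed inverse temperature β whose mean moves at constant speed v (cf. Equation (18)):

```python
import numpy as np

def information_length(ps, dt, dx):
    """Discretise Eqs. (1)-(2): ps[k] is the PDF p(x, k*dt) on a uniform x-grid."""
    dpdt = np.gradient(ps, dt, axis=0)                      # d p / d t
    # Eq. (1): 1/tau^2 = int dx (dp/dt)^2 / p  (guard against underflowed p)
    inv_tau_sq = np.sum(dpdt**2 / np.maximum(ps, 1e-30), axis=1) * dx
    # Eq. (2): L(t) = int dt / tau
    return np.sum(np.sqrt(inv_tau_sq)) * dt

# test case: Gaussian of fixed width whose mean moves at constant speed v
beta, v = 1.0, 0.5
x = np.linspace(-15.0, 15.0, 3001)
ts = np.linspace(0.0, 4.0, 2001)
ps = np.array([np.sqrt(beta / np.pi) * np.exp(-beta * (x - v * t)**2) for t in ts])
L_num = information_length(ps, ts[1] - ts[0], x[1] - x[0])
L_exact = np.sqrt(2.0 * beta) * v * ts[-1]
```

Because $1/\tau$ is constant for this test case, L grows linearly in time, in line with the oscillator analogy above.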
We now recall how $\tau(t)$ and $L(t)$ in Equations (1) and (2) are related to the relative entropy (Kullback–Leibler divergence) [15,16]. We consider two nearby PDFs $p_1 = p(x,t_1)$ and $p_2 = p(x,t_2)$ at times $t_1$ and $t_2$, and take the limit of very small $\delta t = t_2 - t_1$ to Taylor-expand $D[p_1,p_2] = \int dx\, p_2 \ln(p_2/p_1)$, using
$$\frac{\partial}{\partial t_1} D[p_1,p_2] = -\int dx\, p_2\, \frac{\partial_{t_1} p_1}{p_1}, \qquad (3)$$
$$\frac{\partial^2}{\partial t_1^2} D[p_1,p_2] = \int dx\, p_2 \left[ \frac{(\partial_{t_1} p_1)^2}{p_1^2} - \frac{\partial_{t_1}^2 p_1}{p_1} \right], \qquad (4)$$
$$\frac{\partial}{\partial t_2} D[p_1,p_2] = \int dx \left[ \partial_{t_2} p_2 + \partial_{t_2} p_2 \left( \ln p_2 - \ln p_1 \right) \right], \qquad (5)$$
$$\frac{\partial^2}{\partial t_2^2} D[p_1,p_2] = \int dx \left[ \partial_{t_2}^2 p_2 + \frac{(\partial_{t_2} p_2)^2}{p_2} + \partial_{t_2}^2 p_2 \left( \ln p_2 - \ln p_1 \right) \right]. \qquad (6)$$
In the limit $t_2 \to t_1 = t$ (so that $p_2 \to p_1 = p$), Equations (3)–(6) give us
$$\lim_{t_2 \to t_1} \frac{\partial}{\partial t_1} D[p_1,p_2] = \lim_{t_2 \to t_1} \frac{\partial}{\partial t_2} D[p_1,p_2] = 0, \qquad \lim_{t_2 \to t_1} \frac{\partial^2}{\partial t_1^2} D[p_1,p_2] = \lim_{t_2 \to t_1} \frac{\partial^2}{\partial t_2^2} D[p_1,p_2] = \int dx\, \frac{(\partial_t p)^2}{p} = \frac{1}{\tau^2}, \qquad (7)$$
where the first derivatives reduce to $\pm \int dx\, \partial_t p = 0$ because the normalisation $\int dx\, p(x,t) = 1$ is conserved.
Up to $O((dt)^2)$ ($dt = t_2 - t_1$), Equation (7) and $D[p_1,p_1] = 0$ lead to
$$D[p_1,p_2] = \frac{1}{2} \int dx\, \frac{[\partial_t p(x,t)]^2}{p(x,t)}\, (dt)^2, \qquad (8)$$
and thus the infinitesimal distance $dl(t_1)$ between $t_1$ and $t_1 + dt$ as
$$dl(t_1) = \sqrt{D[p_1,p_2]} = \sqrt{\frac{1}{2} \int dx\, \frac{[\partial_{t_1} p(x,t_1)]^2}{p(x,t_1)}}\; dt. \qquad (9)$$
By summing $dl(t_i)$ for $i = 0, 1, 2, \dots, n-1$ (where $n = t/dt$) in the limit $dt \to 0$, we have
$$\lim_{dt \to 0} \sum_{i=0}^{n-1} dl(i\,dt) = \lim_{dt \to 0} \sum_{i=0}^{n-1} \sqrt{D[p(x, i\,dt), p(x, (i+1)dt)]} \propto \int_0^t dt_1 \sqrt{\int dx\, \frac{[\partial_{t_1} p(x,t_1)]^2}{p(x,t_1)}} = L(t), \qquad (10)$$
where $L(t)$ is the information length. Thus, L is given by the accumulation of infinitesimal relative entropies (up to the constant factor $1/\sqrt{2}$ in Equation (9)). It cannot be overemphasised that L is a Lagrangian distance between the PDFs at time 0 and t: it depends sensitively on the particular path that the system passes through in reaching the final state. In contrast, the relative entropy $D[p(x,0), p(x,t)]$ depends only on the PDFs at times 0 and t, and thus tells us nothing about the intermediate states between the initial and final states.
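The quadratic relation (8) between the relative entropy of two nearby PDFs and $dt/\tau$ is straightforward to verify numerically. Below is a minimal check (parameter values are arbitrary illustrative choices) using two equal-width Gaussians whose means are separated by $v\,dt$, for which Equation (18) gives $1/\tau^2 = 2\beta v^2$:

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
beta, v, t1, dt = 1.0, 0.5, 1.0, 1e-3

def p(t):
    # Gaussian of fixed inverse temperature beta, mean moving at speed v
    return np.sqrt(beta / np.pi) * np.exp(-beta * (x - v * t)**2)

p1, p2 = p(t1), p(t1 + dt)
D = np.sum(p2 * np.log(p2 / p1)) * dx      # relative entropy D[p1, p2]
inv_tau_sq = 2.0 * beta * v**2             # Eq. (18) with d(beta)/dt = 0
# Eq. (8): D ~ (1/2) * (dt/tau)^2 for small dt
```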

3. Attractor Structure

Since $L(t)$ represents the accumulated change in information (due to the change in the PDF) up to time t, $L(t)$ settles to a constant value $L_\infty$ when the PDF reaches its final equilibrium. The smaller $L_\infty$, the fewer states the initial PDF passes through to reach the final equilibrium. Therefore, $L_\infty$ provides a unique, path-dependent, Lagrangian measure of the distance between a given initial and final PDF. We utilise this property to map out the attractor structure by considering a narrow initial PDF peaked at different positions $y_0$ and measuring $L_\infty$ against $y_0$. We are particularly interested in how the behaviour of $L_\infty$ against $y_0$ depends on whether the system has a stable equilibrium point or is chaotic.

3.1. Linear vs. Cubic Forces

We first consider the case where a system has a stable equilibrium point when there is no stochastic noise and investigate how L is affected by different deterministic forces [15,16]. We consider the following Langevin equation [22] for a variable x:
$$\frac{dx}{dt} = F(x) + \xi. \qquad (11)$$
Here, ξ is a short-correlated (delta-correlated) stochastic noise with strength D:
$$\langle \xi(t)\xi(t') \rangle = 2D\, \delta(t - t'), \qquad (12)$$
where the angular brackets denote the average over ξ and $\langle \xi \rangle = 0$. We consider two types of F, both of which have a stable equilibrium point at x = 0. The first is the linear force $F = -\gamma x$ ($\gamma > 0$ is the frictional constant), which gives the familiar Ornstein–Uhlenbeck (O-U) process, a popular model for a noisy relaxation system (e.g., [23]). The second is the cubic force $F = -\mu x^3$, where μ represents the frictional constant. Note that, in these models, the dimensions of $\gamma$ ($\mathrm{s}^{-1}$) and $\mu$ ($\mathrm{s}^{-1}\,\mathrm{m}^{-2}$) are different.
Equivalent to the Langevin equation governed by Equations (11) and (12) is the Fokker–Planck equation [22]
$$\frac{\partial}{\partial t} p(x,t) = \frac{\partial}{\partial x}\left[ -F(x)\, p(x,t) + D \frac{\partial}{\partial x} p(x,t) \right]. \qquad (13)$$
As an initial PDF, we consider a Gaussian PDF
$$p(x_0, 0) = \sqrt{\frac{\beta_0}{\pi}}\; e^{-\beta_0 (x_0 - y_0)^2}. \qquad (14)$$
Then, for the O-U process, the PDF remains Gaussian for all time, with the following form [15,16]:
$$p(x,t) = \sqrt{\frac{\beta(t)}{\pi}}\; e^{-\beta(t)\, (x - \langle x \rangle)^2}. \qquad (15)$$
In Equations (14) and (15), $\langle x \rangle = y_0 e^{-\gamma t}$ is the mean position and $y_0$ its initial value; $\beta_0$ is the inverse temperature at t = 0, related to the variance at t = 0 as $\langle (x_0 - y_0)^2 \rangle = \frac{1}{2\beta_0} = \frac{D_0}{\gamma}$. The fluctuation level (variance) changes with time, with the time-dependent $\beta(t)$ given by
$$\langle (x - \langle x \rangle)^2 \rangle = \frac{1}{2\beta(t)} = \frac{D\left(1 - e^{-2\gamma t}\right)}{\gamma} + \frac{e^{-2\gamma t}}{2\beta_0}. \qquad (16)$$
Note that, when $D = D_0$, $\beta(t) = \beta_0 = \frac{\gamma}{2D}$ for all t, so the PDF maintains the same width for all time.
For this Gaussian process, β and $\langle x \rangle$ constitute a parameter space on which the distance is defined through the Fisher metric tensor [3] $g_{ij}$ ($i, j = 1, 2$) as [16]
$$g_{ij} = \int dx\, \frac{1}{p(x,t)} \frac{\partial p}{\partial z^i} \frac{\partial p}{\partial z^j} = \begin{pmatrix} \dfrac{1}{2\beta^2} & 0 \\ 0 & 2\beta \end{pmatrix}, \qquad (17)$$
where $z^1 = \beta$ and $z^2 = \langle x \rangle$. This enables us to recast $\frac{1}{\tau^2}$ in Equation (1) in terms of $g_{ij}$ as
$$\frac{1}{\tau^2} = \frac{1}{2\beta^2}\left(\frac{d\beta}{dt}\right)^2 + 2\beta \left(\frac{d\langle x \rangle}{dt}\right)^2 = \sum_{i,j} g_{ij}\, \frac{dz^i}{dt} \frac{dz^j}{dt}. \qquad (18)$$
The derivation of the first relation in Equation (18) is provided in Appendix A (see Equation (A2)). Using Equations (2) and (18), we can calculate L analytically for this O-U process (see also Appendix A).
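For instance, in the equal-temperature case $D = D_0$, β stays constant and only the mean $\langle x \rangle = y_0 e^{-\gamma t}$ moves, so Equation (18) reduces to $1/\tau = \sqrt{2\beta_0}\,|d\langle x \rangle/dt|$; integrating this recovers the closed form $L_\infty = y_0/\sqrt{D/\gamma}$ (cf. Equation (A10)). A quick numerical sketch (parameter values are illustrative):

```python
import numpy as np

gamma, D, y0 = 1.0, 0.1, 2.0
beta0 = gamma / (2.0 * D)                  # D0 = D, so beta(t) = beta0 for all t
ts = np.linspace(0.0, 20.0, 200001)
dt = ts[1] - ts[0]
mean = y0 * np.exp(-gamma * ts)            # <x>(t) for the O-U process
dmean = np.gradient(mean, dt)
inv_tau = np.sqrt(2.0 * beta0 * dmean**2)  # Eq. (18) with d(beta)/dt = 0
# trapezoidal rule for Eq. (2), integrated out to t >> 1/gamma
L_inf = np.sum(0.5 * (inv_tau[1:] + inv_tau[:-1])) * dt
L_exact = y0 / np.sqrt(D / gamma)
```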
In comparison, for the cubic process, theoretical analysis can be carried out only in limiting cases such as small and large times [17,24]. In particular, the stationary PDF at large time is readily obtained as
$$p(x) = \frac{2\beta_c^{1/4}}{\Gamma(1/4)}\; e^{-\beta_c x^4}, \qquad (19)$$
where $\beta_c = \frac{\mu}{4D}$. For the exact calculation of $L(t)$, Equation (13) has to be solved numerically.
To summarise, due to the restoring force F, the equilibrium is given by a PDF centred around x = 0: Gaussian for the linear force and quartic-exponential for the cubic force. If we pick any point $y_0$ in x, how close is $y_0$ to the equilibrium, and how does $F(x)$ affect this? To determine this, we place a narrow PDF around $x = y_0$ at t = 0 (see Figure 1), let it relax, and measure $L_\infty$. The question is how this $L_\infty$ depends on $y_0$. We repeat the same procedure for the cubic process, as shown in Figure 1, and examine how $L_\infty$ depends on $y_0$.
$L_\infty$ as a function of $y_0$ is shown for both the linear (red dotted line) and cubic (blue solid line) processes in Figure 2. In the linear case, we see a clear linear relation between $y_0$ and $L_\infty$: the information length preserves the linearity of the system. This linear relationship holds for all D and $D_0$. In particular, when $D = D_0$, we can show that $L_\infty = \frac{y_0}{\sqrt{D/\gamma}}$ by taking the limit $t \to \infty$ ($y \to 0$) in Equation (A10).
In contrast, for the cubic process, the relation is not linear, and the log-log plot on the right of Figure 2 shows a power-law dependence with power-law index p. This index varies between 1.52 and 1.91 and depends on the width ($\propto D_0^{1/2}$) of the initial PDF and the stochastic forcing amplitude D, as shown in [16]. This indicates that a nonlinear force breaks the linear scaling of the geometric structure and changes it to power-law scalings. In either case, $L_\infty$ varies smoothly with $y_0$, with its minimum at $y_0 = 0$, since the equilibrium point x = 0 is stable. This will be contrasted with the behaviour of chaotic systems in Section 3.2.

3.2. Chaotic Attractor

Section 3.1 demonstrates that the minimum value of $L_\infty$ occurs at a stable equilibrium point [15,16]. We now show that, in contrast, for a chaotic attractor the minimum value of $L_\infty$ occurs at an unstable point [13]. To this end, we consider the chaotic attractor of the logistic map [13]. The latter is simply a rule for updating the value of x at time t + 1 from its value at time t [25]:
$$x_{t+1} = 1 - a x_t^2, \qquad (20)$$
where $x \in [-1, 1]$ and a is a parameter controlling the stability of the system.
As we are interested in a chaotic attractor, we chose the value a = 2 so that any initial value $x_0$ evolves to a chaotic attractor given by an invariant density (shown in the right panel of Figure 3). A key question is then whether all values of $x_0$ are similar, given that they all evolve to the same invariant density in the long-time limit. To address how close a particular point $x_0$ is to equilibrium, we (i) consider a narrow initial PDF around $x_0$ at t = 0; (ii) evolve it until it reaches the equilibrium distribution; (iii) measure $L_\infty$ between the initial and final PDFs; and (iv) repeat steps (i)–(iii) for many different values of $x_0$. For example, for $x_0 = 0.7$, the initial PDF is shown on the left and the final PDF on the right of Figure 3. We show $L_\infty$ against $x_0$ in Figure 4. A striking feature of Figure 4 is the abrupt change in $L_\infty$ for a small change in $x_0$: the distance between $x_0$ and the final chaotic attractor depends sensitively on $x_0$. This sensitive dependence of $L_\infty$ on $x(t=0)$ means that a small change in the initial condition $x_0$ causes a large difference in the path that the system evolves through, and thus in $L_\infty$. This is a good illustration of a chaotic equilibrium and is quite similar to the sensitive dependence of the Lyapunov exponent on the initial condition [25]; that is, our $L_\infty$ provides a new methodology to test for chaos. Another interesting feature of Figure 4 is the presence of several points with small values of $L_\infty$, shown by red circles. In particular, $x_0 = 0.5$ has the smallest value of $L_\infty$, indicating that this unstable fixed point is closest to the chaotic attractor. That is, an unstable point is most similar to the chaotic attractor and thus minimises $L_\infty$.
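The sensitivity of $L_\infty$ to $x_0$ can be reproduced with a small ensemble experiment. The sketch below is my own illustrative construction: the ensemble size, bin count, initial width and the midpoint-regularised discrete distance are all ad hoc choices, not the exact discrete formula of [14]. It evolves a narrow initial PDF under the a = 2 map and accumulates a discrete analogue of Equation (2); the unstable fixed point $x_0 = 0.5$ yields a markedly smaller value than a generic point such as $x_0 = 0.7$:

```python
import numpy as np

def discrete_L(x0, n_ens=200_000, n_steps=30, n_bins=100, width=0.01, seed=0):
    """Evolve a narrow PDF around x0 under x -> 1 - 2x^2 and sum a discrete,
    midpoint-regularised analogue of the infinitesimal distance in Eq. (2)."""
    rng = np.random.default_rng(seed)
    x = np.clip(x0 + width * rng.standard_normal(n_ens), -1.0, 1.0)
    edges = np.linspace(-1.0, 1.0, n_bins + 1)
    p_old = np.histogram(x, bins=edges)[0] / n_ens
    L = 0.0
    for _ in range(n_steps):
        x = 1.0 - 2.0 * x**2                     # logistic map, a = 2
        p_new = np.histogram(x, bins=edges)[0] / n_ens
        mid = 0.5 * (p_new + p_old)
        mask = mid > 0
        L += np.sqrt(np.sum((p_new[mask] - p_old[mask])**2 / mid[mask]))
        p_old = p_new
    return L

L_unstable, L_generic = discrete_L(0.5), discrete_L(0.7)
```

A PDF started at the unstable fixed point stays centred there while it slowly spreads, whereas a PDF started at a generic point jumps around the interval from the very first iterations, accumulating distance much faster.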

4. Music: Can We See the Music?

Our methodology is not system-specific and is applicable to any stochastic process. In particular, given any time-dependent PDFs computed from a theory, from simulations, or from data, we can compute $L(t)$ to understand information change. As an example, we apply our theory to music data and discuss the information change associated with different pieces of classical music. In particular, we are interested in understanding differences among famous pieces of classical music in view of information change. To gain insight, we used MIDI files [26], computed time-dependent PDFs, and evaluated the information length as a function of time [14].
Specifically, a MIDI file stores music by MIDI number, covering 12 different notes (C, C#, D, D#, E, F, F#, G, G#, A, A#, B) over 11 octaves, with a typical time Δt between two adjacent notes of order $\Delta t \sim 10^{-3}$ s. To construct a PDF, we specify 129 statistically different states according to the MIDI number, plus one extra rest state (see Table 1 in [14]), and calculate an instantaneous PDF (see Figure S1 in [14]) from an orchestral piece by measuring the frequency (the total number of times) that a particular state is played by all instruments at a given time. The time-dependent PDFs are thus defined in discrete time steps $\Delta t \sim 10^{-3}$ s, and the discrete version of L (Equation (7) in [14]) is used in the numerical computation. Figure 5 shows $L(t)$ against time for Vivaldi's Summer, Mozart, Tchaikovsky's 1812 Overture, and Beethoven's Ninth Symphony (2nd movement). We observe differences among composers; in particular, the more classical the piece, the more subtle the information change. We then look at the rate of information change against time by calculating the gradient of L ($\frac{dL}{dt} = 1/\tau$) in Figure 6, which likewise shows the most subtle change in information for Vivaldi and Mozart.

5. Quantum Systems

Finally, we examine quantum effects on the information length [21]. In Quantum Mechanics (QM), the uncertainty relation $\Delta x \Delta P \geq \hbar/2$ between position x and momentum P produces an effect quite similar to stochastic noise. (We use P to denote momentum, to distinguish it from the PDF $p(x,t)$.) For instance, the trajectory of a particle in the x–P phase space is random and not smooth. Furthermore, the phase volume $\hbar$ plays the role of the resolution in phase space, with one unit of information given by the phase volume $\hbar$; the total number of states is thus given by the total phase volume divided by $\hbar$. This observation points to a potentially different role for the width of a PDF in QM compared with a classical system, since a wider PDF in QM occupies a larger region of x in phase space, with the possibility of increasing the information.
To investigate this, for simplicity, we consider a particle of mass m under a constant force F and assume an initial Gaussian wave function centred around x = 0 [21]:
$$\psi(x, 0) = \left(\frac{2\beta_0}{\pi}\right)^{1/4} e^{-\beta_0 x^2 + i k_0 x}, \qquad (21)$$
where $k_0 = P_0/\hbar$ is the wave number at t = 0, $D_x = (2\beta_0)^{-1/2}$ is the width of the initial wave function, and $P_0$ is the initial momentum. The time-dependent PDF $p(x,t)$ is then found as (e.g., see [21,27]):
$$p(x,t) = |\psi(x,t)|^2 = \sqrt{\frac{\beta(t)}{\pi}}\; e^{-\beta(t)\,(x - \langle x \rangle)^2}. \qquad (22)$$
Here,
$$\beta(t) = \frac{2\beta_0 m^2}{m^2 + (2\hbar\beta_0 t)^2}, \qquad \langle x \rangle = \frac{\hbar k_0 t}{m} + \frac{F t^2}{2m}. \qquad (23)$$
Equation (22) clearly shows that the PDF is Gaussian, with mean $\langle x \rangle = \frac{\hbar k_0 t}{m} + \frac{F t^2}{2m}$ and variance
$$\mathrm{Var}(t) = \langle (x - \langle x \rangle)^2 \rangle = \frac{1}{2\beta(t)} = \frac{1}{4\beta_0} + \frac{\hbar^2 \beta_0 t^2}{m^2} = \mathrm{Var}(0) + \frac{\hbar^2 t^2}{4\,\mathrm{Var}(0)\, m^2}. \qquad (24)$$
In Equation (24), $\mathrm{Var}(0) = \langle (x(0) - \langle x(0) \rangle)^2 \rangle = \frac{1}{4\beta_0} = \frac{D_x^2}{2}$ is the initial variance. We note that the last term in Equation (24) increases quadratically with time t due to the quantum effect: the width of the wave function grows over time. Obviously, this effect vanishes as $\hbar \to 0$.
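The variance law (24) can be checked directly by propagating the initial wave packet (21) with a Fourier-space free propagator. The spectral method, the grid, and the natural units $\hbar = m = 1$ here are my own choices for illustration (with $k_0 = 0$ and $F = 0$, neither of which affects Equation (24)):

```python
import numpy as np

hbar = m = 1.0
beta0, t = 0.5, 3.0
x = np.linspace(-60.0, 60.0, 8192)
dx = x[1] - x[0]
k = 2.0 * np.pi * np.fft.fftfreq(x.size, dx)

psi0 = (2.0 * beta0 / np.pi)**0.25 * np.exp(-beta0 * x**2)   # Eq. (21), k0 = 0
# free evolution is exact in Fourier space: multiply by exp(-i hbar k^2 t / 2m)
psi_t = np.fft.ifft(np.fft.fft(psi0) * np.exp(-1j * hbar * k**2 * t / (2.0 * m)))
p = np.abs(psi_t)**2

mean = np.sum(p * x) * dx
var_num = np.sum(p * (x - mean)**2) * dx
var_exact = 1.0 / (4.0 * beta0) + hbar**2 * beta0 * t**2 / m**2   # Eq. (24)
```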
Since the PDF in Equation (22) is Gaussian, we can use Equation (18) to find (e.g., see [16])
$$\frac{1}{\tau^2} = \frac{2 t^2}{(T^2 + t^2)^2} + \frac{2\beta_0 T^2}{T^2 + t^2}\; v_0^2 \left(1 + \frac{F t}{\hbar k_0}\right)^2, \qquad (25)$$
where $T = \frac{m}{2\hbar\beta_0}$ is the time scale for the broadening of the initial wave function and $v_0 = \hbar k_0/m$ [21]. It is interesting to note that, when there is no external constant force F, both terms in Equation (25) decrease at large time t, making τ large. The situation changes dramatically in the presence of F, as the second term in Equation (25) approaches a constant value at large time. A region of constant τ signifies that the rate of change of information is constant in time, which was argued to be an optimal path minimising the irreversible dissipation (e.g., [16]). Physically, this geodesic arises when the broadening of the PDF is compensated by the momentum Ft, which increases with time. Mathematically, the limit $t \to \infty$ reduces Equation (25), and thus L, to
$$\frac{1}{\tau} \approx \frac{F D_x}{\hbar}, \qquad L \approx \frac{(F t)\, D_x}{\hbar}. \qquad (26)$$
Since $F t = P$ and $D_x = (2\beta_0)^{-1/2}$ is the width of the wave function at t = 0, $F t D_x$ in Equation (26) represents the volume in the P–x phase space spanned by this wave function. L thus reflects the information change associated with the coverage of phase volume in units of $\hbar$. Interestingly, similar results are also obtained in the momentum representation, where L is computed from the PDF $p(P,t)$ in momentum space:
$$p(P,t) = \sqrt{\frac{\lambda}{\pi}}\; e^{-\lambda\, \left(P - (m v_0 + F t)\right)^2}, \qquad \frac{1}{\tau^2} = 2\lambda F^2, \qquad L = \sqrt{2\lambda}\, F t, \qquad (27)$$
where $\lambda = \frac{1}{2\hbar^2 \beta_0}$. In Equation (27), τ is manifestly constant, and L increases linearly with time t. We can see a strong similarity between Equations (27) and (26) as $t \to \infty$, using $L \approx \sqrt{2\lambda}\, F t \propto (F t) D_x/\hbar$. In view of the complementary relation between position and momentum in quantum systems, the similar result for L in the momentum and position representations highlights the robustness of the geodesic.
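The approach of $1/\tau$ in Equation (25) to the constant $F D_x/\hbar$ of Equation (26) can be seen by direct evaluation (the natural units $\hbar = m = 1$ and the parameter values are arbitrary illustrative choices):

```python
import numpy as np

hbar, m, beta0, F, k0 = 1.0, 1.0, 0.5, 2.0, 1.0
T = m / (2.0 * hbar * beta0)       # broadening time scale of the wave packet
v0 = hbar * k0 / m
Dx = (2.0 * beta0)**-0.5           # initial width of the wave function

t = np.linspace(0.0, 200.0, 100001)
inv_tau = np.sqrt(2.0 * t**2 / (T**2 + t**2)**2
                  + (2.0 * beta0 * T**2 / (T**2 + t**2))
                  * v0**2 * (1.0 + F * t / (hbar * k0))**2)   # Eq. (25)
limit = F * Dx / hbar              # Eq. (26)
```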

6. Conclusions

We investigated the information geometry associated with stochastic processes in classical and quantum systems. Specifically, we introduced $\tau(t)$ as a dynamical time scale quantifying information change and calculated $L(t)$ by measuring the total clock time t in units of τ. As a unique Lagrangian measure of information change, L was demonstrated to be a novel diagnostic for mapping out an attractor structure. In particular, L was shown to capture the effect of different deterministic forces through the scaling of $L_\infty$ against the peak position of a narrow initial PDF. For a stable equilibrium, the minimum value of $L_\infty$ occurs at the equilibrium point; in the case of a chaotic attractor, $L_\infty$ exhibits a sensitive dependence on initial conditions, like a Lyapunov exponent. We then showed the application of our method to characterising the information change associated with classical music [14]. Finally, we elucidated the effect of the width of a PDF on the information length in quantum systems. Extension of this work to impure (mixed-state) quantum systems and investigation of Riemannian geometry on the space of density operators would be of particular interest for future work.

Funding

This research received no external funding.

Acknowledgments

This paper is a review and summary of work carried out over several years with numerous co-authors. Among these, I am particularly grateful to Rainer Hollerbach, Schuyler Nicholson and James Heseltine for their contributions and valuable discussions regarding different aspects of information length.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A. Derivation of L for the O-U Process

To make this paper self-contained, we provide here the main steps in the derivation of L for the O-U process [15,16]. We use $y \equiv \langle x \rangle = y_0 e^{-\gamma t}$ in $p(x,t)$ in Equation (15) and differentiate it to find
$$\frac{\partial p}{\partial t} = \left[ \dot\beta \left( \frac{1}{2\beta} - (x - y)^2 \right) + 2\beta (x - y)\, \dot y \right] p. \qquad (A1)$$
Substituting Equation (A1) into Equation (1) and using the Gaussian moments $\langle (x-y)^2 \rangle = \frac{1}{2\beta}$ and $\langle (x-y)^4 \rangle = 3 \langle (x-y)^2 \rangle^2$ (odd moments vanish) lead to
$$\frac{1}{\tau^2} = \frac{1}{2\beta(t)^2}\left(\frac{d\beta}{dt}\right)^2 + 2\beta \left(\frac{dy}{dt}\right)^2. \qquad (A2)$$
We express β in Equation (16) in terms of $T = 2\beta_0 D \left(e^{2\gamma t} - 1\right) + \gamma$ as $\beta = \frac{\gamma \beta_0 e^{2\gamma t}}{T}$. Differentiating this and using $r = 2\beta_0 D - \gamma$ then give
$$\frac{\dot\beta^2}{2\beta^2} = \frac{2\gamma^2 r^2}{T^2}. \qquad (A3)$$
Similarly, using $\frac{dy}{dt} = -\gamma y_0 e^{-\gamma t}$ and $q = \beta_0 \gamma y_0^2$, we obtain
$$2\beta \dot y^2 = \frac{2 q \gamma^2}{T}. \qquad (A4)$$
Substituting Equations (A3) and (A4) into (A2) gives us
$$\frac{1}{\tau^2} = \frac{1}{2\beta^2}\left(\frac{d\beta}{dt}\right)^2 + 2\beta \left(\frac{dy}{dt}\right)^2 = \frac{2\gamma^2}{T^2}\left(r^2 + q T\right). \qquad (A5)$$
Again, in Equation (A5), $q = \beta_0 \gamma y_0^2$, $r = 2\beta_0 D - \gamma$, and $T = 2\beta_0 D \left(e^{2\gamma t} - 1\right) + \gamma$ [15,16,17]. It is worth noting that q and r arise, respectively, from the difference in mean position between t = 0 and t (i.e., $y_0 \ne y(t)$) and from the difference in PDF width (i.e., $D_0 \ne D$). Thus, the first and second terms in Equation (A5) represent the information change due to the change in PDF width and due to the movement of the PDF, respectively. Using $D_0 = \frac{\gamma}{2\beta_0}$, we can express q, r and T in Equation (A5) as $q = \frac{\gamma^2 y_0^2}{2 D_0}$, $r = \gamma\left(\frac{D}{D_0} - 1\right)$, $T = \gamma\left[\frac{D}{D_0}\left(e^{2\gamma t} - 1\right) + 1\right]$. Equations (A5) and (2) then give us
$$L = \frac{1}{\sqrt{2}} \int_{T_i}^{T_f} \frac{1}{T}\, \frac{1}{T + r}\, \sqrt{r^2 + q T}\; dT, \qquad (A6)$$
where $T_i = T(t=0)$ and $T_f = T(t)$. To compute Equation (A6) for $r \ne 0$, we change variables to $Y = \sqrt{r^2 + q T}$ and integrate:
$$L = \frac{1}{\sqrt{2}}\left[\ln\left|\frac{Y - r}{Y + r}\right|\right]_{Y_i}^{Y_f} + \sqrt{2}\,(q - r)\, H, \qquad H = \int_{Y_i}^{Y_f} \frac{dY}{Y^2 + q r - r^2}, \qquad (A7)$$
where $Y_i = Y(t=0)$ and $Y_f = Y(t)$. To calculate H in Equation (A7), we need to consider the two cases $q > r$ and $q < r$. First, when $q > r$, we use the change of variable $Y = \sqrt{q r - r^2}\, \tan\theta$ to find
$$H = \frac{1}{\sqrt{q r - r^2}}\left[\tan^{-1}\left(\frac{Y}{\sqrt{q r - r^2}}\right)\right]_{Y_i}^{Y_f}. \qquad (A8)$$
When $q < r$, we let $Y = \sqrt{r^2 - q r}\, \sec\theta$ and find
$$H = \frac{1}{2\sqrt{r^2 - q r}}\left[\ln\left|\frac{Y - \sqrt{r^2 - q r}}{Y + \sqrt{r^2 - q r}}\right|\right]_{Y_i}^{Y_f}. \qquad (A9)$$
When $D = D_0$ ($r = 0$), $\beta(t) = \beta_0$ for all t. Thus, Equation (2) can easily be calculated directly from Equation (A5), with the result
$$L = \frac{1}{\sqrt{2}} \int_{T_i}^{T_f} \sqrt{q}\; T^{-3/2}\, dT = \sqrt{2 q} \left[ \frac{-1}{\sqrt{T}} \right]_{T_i}^{T_f} = \frac{1}{\sqrt{D/\gamma}}\, |y_0 - y|, \qquad (A10)$$
where again $y = \langle x \rangle = y_0 e^{-\gamma t}$.
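As a consistency check, the closed form (A7) with (A8) can be compared against direct numerical quadrature of (A6); the parameter values below (giving $q = 4 > r = 3$) are arbitrary illustrative choices, with $T_f$ a finite proxy for $t \to \infty$:

```python
import numpy as np

gamma, beta0, D, y0 = 1.0, 1.0, 2.0, 2.0
q = beta0 * gamma * y0**2          # q = 4
r = 2.0 * beta0 * D - gamma        # r = 3, so q > r: use Eq. (A8) for H
Ti, Tf = gamma, 1.0e8              # T(0) = gamma; Tf approximates t -> infinity

# direct trapezoidal quadrature of Eq. (A6) on a log-spaced grid
T = np.geomspace(Ti, Tf, 1_000_001)
f = np.sqrt(r**2 + q * T) / (T * (T + r))
L_direct = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(T)) / np.sqrt(2.0)

# closed form, Eqs. (A7)-(A8)
Yi, Yf = np.sqrt(r**2 + q * Ti), np.sqrt(r**2 + q * Tf)
b = np.sqrt(q * r - r**2)
H = (np.arctan(Yf / b) - np.arctan(Yi / b)) / b
L_closed = (np.log((Yf - r) / (Yf + r)) - np.log((Yi - r) / (Yi + r))) / np.sqrt(2.0) \
           + np.sqrt(2.0) * (q - r) * H
```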

References

1. Gibbs, A.L.; Su, F.E. On choosing and bounding probability metrics. Int. Stat. Rev. 2002, 70, 419–435.
2. Wootters, W.K. Statistical distance and Hilbert space. Phys. Rev. D 1981, 23, 357.
3. Frieden, B.R. Science from Fisher Information; Cambridge University Press: Cambridge, UK, 2000.
4. Braunstein, S.L.; Caves, C.M. Statistical distance and the geometry of quantum states. Phys. Rev. Lett. 1994, 72, 3439.
5. Feng, E.H.; Crooks, G.E. Far-from-equilibrium measurements of thermodynamic length. Phys. Rev. E 2009, 79, 012104.
6. Ruppeiner, G. Thermodynamics: A Riemannian geometric model. Phys. Rev. A 1979, 20, 1608–1613.
7. Schlögl, F. Thermodynamic metric and stochastic measures. Z. Phys. B Condens. Matter 1985, 59, 449–454.
8. Nulton, J.; Salamon, P.; Andresen, B.; Anmin, Q. Quasistatic processes as step equilibrations. J. Chem. Phys. 1985, 83, 334–338.
9. Sivak, D.A.; Crooks, G.E. Thermodynamic metrics and optimal paths. Phys. Rev. Lett. 2012, 108, 190602.
10. Plastino, A.R.; Casas, M.; Plastino, A. Fisher's information, Kullback's measure, and H-theorems. Phys. Lett. A 1998, 246, 498–504.
11. Polettini, M.; Esposito, M. Nonconvexity of the relative entropy for Markov dynamics: A Fisher information approach. Phys. Rev. E 2013, 88, 012112.
12. Naudts, J. Quantum statistical manifolds. Entropy 2018, 20, 472.
13. Nicholson, S.B.; Kim, E. Investigation of the statistical distance to reach stationary distributions. Phys. Lett. A 2015, 379, 83–88.
14. Nicholson, S.B.; Kim, E. Structures in sound: Analysis of classical music using the information length. Entropy 2016, 18, 258.
15. Heseltine, J.; Kim, E. Novel mapping in non-equilibrium stochastic processes. J. Phys. A Math. Theor. 2016, 49, 175002.
16. Kim, E.; Lee, U.; Heseltine, J.; Hollerbach, R. Geometric structure and geodesic in a solvable model of nonequilibrium process. Phys. Rev. E 2016, 93, 062127.
17. Kim, E.; Hollerbach, R. Signature of nonlinear damping in geometric structure of a nonequilibrium process. Phys. Rev. E 2017, 95, 022137.
18. Hollerbach, R.; Kim, E. Information geometry of non-equilibrium processes in a bistable system with a cubic damping. Entropy 2017, 19, 268.
19. Kim, E.; Tenkès, L.M.; Hollerbach, R.; Radulescu, O. Far-from-equilibrium time evolution between two gamma distributions. Entropy 2017, 19, 511.
20. Tenkès, L.M.; Hollerbach, R.; Kim, E. Time-dependent probability density functions and information geometry in stochastic logistic and Gompertz models. J. Stat. Mech. Theory Exp. 2017, 2017, 123201.
21. Kim, E.; Lewis, P. Information length in quantum system. J. Stat. Mech. Theory Exp. 2018, 2018, 043106.
22. Risken, H. The Fokker–Planck Equation: Methods of Solutions and Applications; Springer: Berlin, Germany, 2013.
23. Klebaner, F. Introduction to Stochastic Calculus with Applications; Imperial College Press: London, UK, 2012.
24. Kim, E.; Hollerbach, R. Time-dependent probability density function in cubic stochastic processes. Phys. Rev. E 2016, 94, 052118.
25. Ott, E. Chaos in Dynamical Systems; Cambridge University Press: Cambridge, UK, 2002.
26. Kern Scores. Available online: http://kernscores.stanford.edu/ (accessed on 12 July 2016).
27. Andrews, M. Quantum mechanics with uniform forces. Am. J. Phys. 2018, 78, 1361–1364.
Figure 1. Initial (red) and final (blue) Probability Density Functions (PDFs) for the O-U process in (a) and the cubic process in (b).
Figure 2. (a): L against x ( t = 0 ) = y 0 for the linear process in red dashed line and for the cubic process in blue solid line; (b): L against x ( t = 0 ) = y 0 for the cubic process on log-log scale (data from [17]).
Figure 3. (a): an initial narrow PDF at the peak x 0 = 0.7 ; (b): the invariant density of a logistic map.
Figure 4. L against the peak position x = x 0 of an initial PDF in the chaotic regime of a logistic map (Reprinted from Physics Letters A, 379, S.B. Nicholson & E. Kim, Investigation of the statistical distance to reach stationary distributions, 83-88, Copyright (2015), with permission from Elsevier).
Figure 5. L ( t ) against time T for different composers (from [14]).
Figure 6. 1 τ = d L d t for different composers shown in Figure 5 (from [14]).

Cite as: Kim, E.-j. Investigating Information Geometry in Classical and Quantum Systems through Information Length. Entropy 2018, 20, 574. https://doi.org/10.3390/e20080574