Article

Associating an Entropy with Power-Law Frequency of Events

by Evaldo M. F. Curado 1, Fernando D. Nobre 1,* and Angel Plastino 2

1 Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems, Rua Xavier Sigaud 150, Rio de Janeiro 22290-180, Brazil
2 La Plata National University and Argentina’s National Research Council (IFLP-CCT-CONICET)-C. C. 727, La Plata 1900, Argentina
* Author to whom correspondence should be addressed.
Entropy 2018, 20(12), 940; https://doi.org/10.3390/e20120940
Submission received: 3 October 2018 / Revised: 19 November 2018 / Accepted: 23 November 2018 / Published: 6 December 2018
(This article belongs to the Special Issue Nonadditive Entropies and Complex Systems)

Abstract

Events occurring with a frequency described by power laws, within a certain range of validity, are very common in natural systems. In many of them, it is possible to associate an energy spectrum, and one can show that these types of phenomena are intimately related to Tsallis entropy $S_q$. The relevant parameters become: (i) the entropic index $q$, which is directly related to the power of the corresponding distribution; (ii) the ground-state energy $\epsilon_0$, in terms of which all energies are rescaled. One verifies that the corresponding processes take place at a temperature $T_q$ with $kT_q \propto \epsilon_0$ (i.e., isothermal processes, for a given $q$), in analogy with those in the class of self-organized criticality, which are known to occur at fixed temperatures. Typical examples are analyzed, like earthquakes, avalanches, and forest fires, and in some of them, the entropic index $q$ and the value of $T_q$ are estimated. The knowledge of the associated entropic form opens the possibility for a deeper understanding of such phenomena, particularly by using information theory and optimization procedures.

1. Introduction

Power laws are ubiquitous in many areas of knowledge, emerging in economics and the natural and social sciences, among others [1]. In recent years, particular interest has been devoted to the frequency of events, which very often follows power laws: (i) in the humanities, Zipf’s law states that the relative frequency of a word in a given text is inversely proportional to its rank (defined as its position in a ranking of decreasing frequency); (ii) in the natural sciences, the frequency of earthquakes with a magnitude larger than a certain value m, plotted versus m, leads to the Gutenberg–Richter law [2]; furthermore, the frequencies of avalanches, as well as of forest fires, of a given size l, plotted versus l, yield power laws [1]. Simple dynamic scale-free models, without tuning of a control parameter, but sharing many features of the critical point of a standard phase transition, like long-range correlations, have been introduced to approach theoretically the types of phenomena in examples (ii). For these reasons, the term self-organized criticality (SOC) [3] was coined to describe the main characteristic exhibited by these models; since then, a vast literature has appeared in this area (for reviews, see References [4,5,6,7]). Although stationary states may occur in SOC models, they are essentially characterized by out-of-equilibrium states, and in many cases jumps between different states occur due to energy changes; consequently, equilibrium thermodynamics does not apply to these models. Moreover, one of the most curious aspects concerns the fact that a critical state is approached without a temperature-like control parameter, and one of the most relevant questions concerns which real systems are well described by SOC models, and under what conditions SOC applies [7].
Recently, a wide variety of entropic forms have been considered in the literature, either in the context of information theory or for approaching real phenomena (see, e.g., References [8,9,10,11,12,13,14,15,16,17,18]). Many of these proposals recover the well-known Boltzmann–Gibbs entropy [19,20] in particular limits, and they are usually known as generalized entropic forms. In the present work we show a connection between phenomena following a power-law frequency of events and the Tsallis entropy $S_q$ [9,10,11]. For this purpose, we develop a framework that could be relevant for some of the phenomena described in the previous paragraph. In this proposal we assume the existence of equilibrium (or long-lived metastable) states, characterized by an energy spectrum $\{\epsilon_i\}$, which represents a notable difference with respect to the SOC models. The main motivation is that in many cases it is possible to define an energy-like variable, related in some way to one of the relevant parameters of the system; e.g., the magnitude of an earthquake, or the size of an avalanche, should be associated with some quantity of energy released. Since these parameters obey power laws, one expects that their corresponding energies should also be distributed according to a power law, leading to an energy probability distribution $p(\epsilon) \propto \epsilon^{-\gamma}$, with $\gamma > 1$, for reasons that will become clear later.
Then, from the distribution p ( ϵ ) we follow previous works, where a procedure for calculating fundamental quantities (like the partition function) was developed, by combining information theory and a key thermodynamical relation (see, e.g., References [21,22,23,24]). More precisely, we calculate the internal, or more generally, average energy U and define a general entropic form satisfying basic requirements [19,20], like being a functional that depends only on the set of probabilities. Furthermore, imposing the fundamental relation of thermodynamics,
$$ dU = T\,dS , \tag{1} $$
we obtain the associated entropy and verify that the temperature should be constant, for consistency. Curiously, the distribution $p(\epsilon)$ turns out to be temperature-independent, and consequently, all average values calculated from this probability distribution become independent of the temperature. Hence, similarly to what happens in SOC models, in the present approach the temperature does not play a crucial role for these types of phenomena.
In the next section we review some results of References [21,22,23,24], and especially how to combine general concepts of information theory with the fundamental relation of Equation (1), with the purpose of deriving an equation for obtaining the entropic form from a given energy spectrum. In Section 3 we discuss energy power-law distributions, and show a peculiar behavior, namely, that through the normalization procedure its dependence on the temperature disappears. Consequently, all quantities derived from these distributions, like average values, do not depend on the temperature. In Section 4 we analyze data of events within the present framework, by associating the corresponding power-law distributions with the energy distributions discussed in Section 3. Finally, in Section 5 we present our main conclusions.

2. Combining Information Theory and Thermodynamics

Herein we review some basic results of References [21,22,23,24], which were derived by considering a nondegenerate energy spectrum $\{\epsilon_i\}$. Hence, a discrete index $i$ will identify uniquely a state with an energy $\epsilon_i$, occurring with a probability $p_i$, in such a way that the internal energy is defined as
$$ U = \sum_i \epsilon_i\, p_i . \tag{2} $$
Moreover, let $g(p_i)$ be an arbitrary concave smooth function of $p_i$; we assume that the entropic functional may be written in the form [19,20]
$$ S(\{p_i\}) = k \sum_i g(p_i) \qquad [\,g(p_i) = 0, \ \text{if} \ p_i = 0 \ \text{or} \ p_i = 1\,], \tag{3} $$
where $k$ is a positive constant with entropy dimensions.
Let us now consider a small change in the level populations (which may occur, e.g., due to an infinitesimal exchange of heat); then, the probabilities $\{p_i\}$ will vary according to
$$ p_i \to p_i + dp_i , \qquad \text{with} \quad \sum_i dp_i = 0 , \tag{4} $$
with the last condition resulting from normalization ($\sum_i p_i = 1$). This procedure will in turn generate infinitesimal changes in the entropy and internal energy, and we impose the fundamental relation of Equation (1). One obtains (up to first order in $dp_i$) [21]
$$ \sum_i \left[ \epsilon_i - kT\, g'(p_i) \right] dp_i \equiv \sum_i K_i\, dp_i = 0 , \tag{5} $$
where the prime indicates a derivative with respect to $p_i$. As shown in Reference [21], Equations (4) and (5) lead to just one expression for the $p_i$ and, further, that all $K_i$ should be equal. The resulting value $K$ is found through the normalization condition on the ensuing probability distribution ($K$ is, in fact, related to the partition function), to be determined by the relation
$$ K = \epsilon_i - kT\, g'(p_i) \ \Longrightarrow\ g'(p_i) = \beta\, (\epsilon_i - K) \qquad (\beta \equiv 1/kT). \tag{6} $$
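As a consistency check (our addition, relying only on the standard Boltzmann–Gibbs form), choosing $g(p) = -p \ln p$ in Equation (6) recovers the canonical distribution:

```latex
% Boltzmann-Gibbs choice: g(p) = -p\ln p, hence g'(p) = -\ln p - 1.
% Equation (6) then reads
%   -\ln p_i - 1 = \beta(\epsilon_i - K),
% whose solution is
\[
  p_i = e^{\beta K - 1}\, e^{-\beta \epsilon_i} \;\propto\; e^{-\beta \epsilon_i},
\]
% with the constant K fixed by \sum_i p_i = 1, so that e^{1 - \beta K}
% plays the role of the canonical partition function.
```

Notice also that $g(0) = 0$ and $g(1) = 0$ hold for this choice, consistent with the conditions of Equation (3).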
From now on we will consider, for simplicity, a continuous energy spectrum represented by an energy probability distribution $p(\epsilon)$, defined in a given range of energies between a minimum value $\epsilon_0$ and a maximum $\epsilon_m$. Although the events to be studied herein are expressed in terms of discrete sets of data, we will associate with them continuous distributions, which result from fittings of these data in such a range, as will be defined appropriately in the following sections. In the next section we define the probability distribution $p(\epsilon)$ of interest for the present work and calculate relevant quantities; moreover, we consider the continuous form of Equation (6) to obtain the associated entropic form.

3. Power-Law Distributions and Associated Entropy

Power-law distributions frequently appear to be valid for certain ranges of their parameters, in variegated empirical settings pertaining to diverse disciplines [1,2,3,4,5,6,7]. We enlarge the scope of our methodology by considering systems for which a strict underlying thermodynamics does not exist, the inverse temperature $\beta$ being just a measure of the probability distribution's "spread". Let us then consider an energy spectrum following a power-law distribution, defined in a given range of energies between a minimum value $\epsilon_0$ and a maximum $\epsilon_m$,
$$ p(\epsilon) = \frac{1}{Z}\, (\beta \epsilon)^{-\gamma} \qquad (\gamma > 1; \ \epsilon_0 \le \epsilon \le \epsilon_m), \tag{7} $$
with a non-negative ground-state energy, $\epsilon_0 \ge 0$. The normalization condition,
$$ \int_{\epsilon_0}^{\epsilon_m} d\epsilon\; p(\epsilon) = 1 , \tag{8} $$
yields
$$ Z = \int_{\epsilon_0}^{\epsilon_m} d\epsilon\, (\beta \epsilon)^{-\gamma} = \frac{\epsilon_0}{\gamma - 1}\, (\beta \epsilon_0)^{-\gamma} \left[ 1 - \left( \frac{\epsilon_0}{\epsilon_m} \right)^{\gamma - 1} \right], \tag{9} $$
leading to
$$ p(\epsilon) = \frac{\gamma - 1}{\epsilon_0}\, \frac{1}{1 - (\epsilon_0/\epsilon_m)^{\gamma - 1}} \left( \frac{\epsilon}{\epsilon_0} \right)^{-\gamma}, \tag{10} $$
which does not depend upon $\beta$.
One should notice that, in order to obtain an appropriate power-law decay from the distribution above, one should have $\gamma > 1$. Furthermore, $p(\epsilon)$ has dimensions of $[\mathrm{energy}]^{-1}$, as required by Equation (8).
One curious aspect of $p(\epsilon)$ in Equation (10) concerns its non-dependence on the parameter $\beta$, which, although introduced in Equation (7), is cancelled by imposing normalization; later on, it will be shown that the parameter $\beta$ takes a constant value, for consistency. Consequently, all properties derived from the probability distribution of Equation (10) will not allow variations of the temperature; as an example, one has the average energy,
$$ U = \int_{\epsilon_0}^{\epsilon_m} d\epsilon\; \epsilon\, p(\epsilon) = \epsilon_0\, \frac{(\gamma - 1) \left[ 1 - (\epsilon_0/\epsilon_m)^{\gamma - 2} \right]}{(\gamma - 2) \left[ 1 - (\epsilon_0/\epsilon_m)^{\gamma - 1} \right]} . \tag{11} $$
As mentioned before, the present approach holds for any $\gamma > 1$; the particular limit $\gamma \to 2$ of the internal energy above may be obtained through l'Hôpital's rule,
$$ U = \lim_{\gamma \to 2} \epsilon_0\, \frac{(\gamma - 1) \left[ 1 - e^{(\gamma - 2) \ln(\epsilon_0/\epsilon_m)} \right]}{(\gamma - 2) \left[ 1 - (\epsilon_0/\epsilon_m)^{\gamma - 1} \right]} = -\,\epsilon_0\, \frac{\ln(\epsilon_0/\epsilon_m)}{1 - \epsilon_0/\epsilon_m} . \tag{12} $$
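The closed forms above are easy to check numerically. The following sketch (our illustration, not part of the paper; the spectrum bounds $\epsilon_0 = 1$ and $\epsilon_m = 10^3$ are hypothetical values) verifies the normalization of Equation (10) and compares the internal energy of Equation (11), together with its $\gamma \to 2$ limit of Equation (12), against direct numerical integration:

```python
import math

def p(eps, gamma, e0, em):
    """Normalized power-law energy distribution of Eq. (10)."""
    norm = (gamma - 1.0) / (e0 * (1.0 - (e0 / em) ** (gamma - 1.0)))
    return norm * (eps / e0) ** (-gamma)

def trapz(f, a, b, n=100000):
    """Plain trapezoidal rule; accurate enough for these smooth integrands."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

e0, em = 1.0, 1.0e3              # hypothetical spectrum bounds
gamma = 5.0 / 3.0                # the earthquake exponent quoted in Section 4

Z = trapz(lambda e: p(e, gamma, e0, em), e0, em)           # Eq. (8): should equal 1
U_num = trapz(lambda e: e * p(e, gamma, e0, em), e0, em)   # numerical mean energy
r = e0 / em
U_eq11 = e0 * (gamma - 1.0) * (1.0 - r ** (gamma - 2.0)) / (
    (gamma - 2.0) * (1.0 - r ** (gamma - 1.0)))            # closed form, Eq. (11)

# gamma -> 2 limit: compare the numerical integral at gamma = 2 with Eq. (12)
U2_num = trapz(lambda e: e * p(e, 2.0, e0, em), e0, em)
U_eq12 = -e0 * math.log(r) / (1.0 - r)

print(Z, U_num, U_eq11, U2_num, U_eq12)
```

With these parameters the numerical and closed-form values agree closely, as expected for smooth integrands.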
In order to deal appropriately with the continuous form of Equation (6), we define the dimensionless quantities,
$$ \tilde{p}(\tilde{\epsilon}) = p(\epsilon)\, \epsilon_0 ; \qquad \tilde{\epsilon} = \frac{\epsilon}{\epsilon_0} \quad (\tilde{\epsilon} \ge 1), \tag{13} $$
so that Equation (10) may be expressed as
$$ \tilde{p}(\tilde{\epsilon}) = \frac{\tilde{\epsilon}^{-\gamma}}{B} ; \qquad B = \frac{1}{\gamma - 1} \left[ 1 - \left( \frac{\epsilon_0}{\epsilon_m} \right)^{\gamma - 1} \right] = \frac{1}{\gamma - 1} \left[ 1 - \frac{1}{\tilde{\epsilon}_m^{\,\gamma - 1}} \right], \tag{14} $$
whereas the normalization condition becomes
$$ \int_{\epsilon_0}^{\epsilon_m} d\epsilon\; p(\epsilon) = \int_{1}^{\tilde{\epsilon}_m} d\tilde{\epsilon}\; \tilde{p}(\tilde{\epsilon}) = 1 . \tag{15} $$
The continuous form of Equation (6) becomes
$$ g'(\tilde{p}) = \beta \left[ \epsilon_0\, \tilde{\epsilon}(\tilde{p}) - K \right], \tag{16} $$
where we are using the fact that $\tilde{p}(\tilde{\epsilon})$ is monotonically decreasing, so that it can be inverted, yielding a function $\tilde{\epsilon}(\tilde{p})$. Notice that Equation (16) is a first-order differential equation for $g(\tilde{p})$, in fact a Bernoulli equation of zeroth order; its solution reads
$$ g(\tilde{p}) = \beta \int \left[ \epsilon_0\, \tilde{\epsilon}(\tilde{p}) - K \right] d\tilde{p} + C . \tag{17} $$
Now, one can invert Equation (14), so that $\tilde{\epsilon}(\tilde{p}) = B^{-1/\gamma}\, \tilde{p}^{-1/\gamma}$, and substitute this result in Equation (17), leading to
$$ g[\tilde{p}(\tilde{\epsilon})] = -\beta K\, \tilde{p}(\tilde{\epsilon}) + \beta \epsilon_0\, B^{-1/\gamma}\, \frac{[\tilde{p}(\tilde{\epsilon})]^{1 - 1/\gamma}}{1 - 1/\gamma} + C . \tag{18} $$
Using the conditions of Equation (3), i.e., $g[\tilde{p}(\tilde{\epsilon})] = 0$ for $\tilde{p}(\tilde{\epsilon}) = 0$ and $\tilde{p}(\tilde{\epsilon}) = 1$, one obtains $C = 0$ and
$$ K = \frac{B^{-1/\gamma}}{1 - 1/\gamma}\, \epsilon_0 , \tag{19} $$
showing that $K$ is indeed related to the normalization of the probability distribution. Hence, Equation (18) becomes
$$ g[\tilde{p}(\tilde{\epsilon})] = \beta \epsilon_0\, \frac{B^{-1/\gamma}}{1 - 1/\gamma} \left\{ -\tilde{p}(\tilde{\epsilon}) + [\tilde{p}(\tilde{\epsilon})]^{1 - 1/\gamma} \right\}, \tag{20} $$
leading to
$$ S[\tilde{p}] = k \int d\tilde{\epsilon}\; g[\tilde{p}(\tilde{\epsilon})] = k \beta \epsilon_0\, \frac{B^{-1/\gamma}}{1 - 1/\gamma} \int d\tilde{\epsilon} \left\{ -\tilde{p}(\tilde{\epsilon}) + [\tilde{p}(\tilde{\epsilon})]^{1 - 1/\gamma} \right\} . \tag{21} $$
By recourse to the exact mapping detailed below, the expression above may be identified with Tsallis entropy [9,10,11],
$$ S_q[\tilde{p}] = \frac{k}{q - 1} \left( 1 - \int d\tilde{\epsilon}\; \tilde{p}^{\,q} \right), \tag{22} $$
through
$$ 1 - \frac{1}{\gamma} = q ; \qquad \beta \epsilon_0\, \frac{B^{-1/\gamma}}{1 - 1/\gamma} = \frac{1}{1 - q} , \tag{23} $$
where $q$ represents the usual entropic index. This is of practical utility because we now have at our disposal the large set of useful recipes developed since 1988 with regard to Tsallis' measure. Now, manipulating Equations (14) and (23), we obtain
$$ \beta \epsilon_0 = \frac{q}{1 - q} \left\{ \frac{1 - q}{q} \left[ 1 - \left( \frac{\epsilon_0}{\epsilon_m} \right)^{q/(1-q)} \right] \right\}^{1 - q} , \tag{24} $$
showing that the parameter $\beta$ should assume a real constant value, for a given value of $q$ ($0 < q < 1$). Hence, defining a fixed pseudo-temperature $T_q$, such that the spread $\beta = 1/(kT_q)$, one finds
$$ kT_q = \epsilon_0\, \frac{1 - q}{q} \left\{ \frac{1 - q}{q} \left[ 1 - \left( \frac{\epsilon_0}{\epsilon_m} \right)^{q/(1-q)} \right] \right\}^{q - 1} . \tag{25} $$
In this way, the probability distribution of Equation (10), which is indeed a power law, may be expressed in terms of the entropic index $q$,
$$ p(\epsilon) = \frac{q}{(1 - q)\, \epsilon_0}\, \frac{1}{1 - (\epsilon_0/\epsilon_m)^{q/(1-q)}} \left( \frac{\epsilon_0}{\epsilon} \right)^{1/(1-q)} , \tag{26} $$
being defined for $0 < q < 1$ only; notice that this restriction is equivalent to $\gamma > 1$ (cf. Equation (23)).
For several of the examples to be considered below, the associated energy spectra will be characterized by $\epsilon_m \gg \epsilon_0$, so that Equation (25) may be expanded in a power series, e.g.,
$$ kT_q = \epsilon_0 \left( \frac{1 - q}{q} \right)^{q} \left[ 1 + (1 - q) \left( \frac{\epsilon_0}{\epsilon_m} \right)^{q/(1-q)} + \cdots \right], \tag{27} $$
whereas for the probability distribution one has the approximate expression
$$ p(\epsilon) = \frac{q}{(1 - q)\, \epsilon_0} \left( \frac{\epsilon_0}{\epsilon} \right)^{1/(1-q)} \left[ 1 + \left( \frac{\epsilon_0}{\epsilon_m} \right)^{q/(1-q)} + \cdots \right], \tag{28} $$
which is not a q-exponential.
The expansions of Equations (27) and (28) show that the maximum energy value $\epsilon_m$ appears only in higher-order corrections to $T_q$ and $p(\epsilon)$. In such cases, the most relevant parameters in Equation (10) become the exponent $\gamma$ [directly related to $q$ through Equation (23)] and the ground-state energy $\epsilon_0$.
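Under the identification of Equation (23), the mapping between $\gamma$, $q$, and $T_q$ is elementary to evaluate. The sketch below (our illustration, not part of the paper) computes $q$ from $\gamma$ and checks that, for $\epsilon_0/\epsilon_m \ll 1$, the exact temperature of Equation (25) is well approximated by the leading term of Equation (27):

```python
def q_from_gamma(gamma):
    """Eq. (23): q = 1 - 1/gamma."""
    return 1.0 - 1.0 / gamma

def kTq_over_e0(q, ratio):
    """Exact dimensionless temperature of Eq. (25); ratio = eps_0 / eps_m."""
    inner = ((1.0 - q) / q) * (1.0 - ratio ** (q / (1.0 - q)))
    return ((1.0 - q) / q) * inner ** (q - 1.0)

def kTq_leading(q):
    """Leading-order term of Eq. (27): [(1 - q)/q]^q, valid for eps_m >> eps_0."""
    return ((1.0 - q) / q) ** q

q = q_from_gamma(5.0 / 3.0)      # earthquake exponent gives q = 2/5
print(q, kTq_over_e0(q, 1.0e-6), kTq_leading(q))
```

For $\gamma = 5/3$ this reproduces $q = 2/5$ and $kT_q/\epsilon_0 \simeq 1.18$, the values quoted in Sections 4 and 5.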
One should focus attention upon the curious result obtained in this effort: we were able to relate the power-law distribution of Equation (7) (not the usual q-exponential distribution) with Tsallis entropy. In fact, the equilibrium distribution that arises out of the extremization procedure for a given entropic form depends directly on the constraints imposed and on the choices made for the corresponding Lagrange multipliers [10]. As shown in Reference [24], the distribution of Equation (7) may be obtained from an extremization procedure effected on the Tsallis entropy of Equation (22), by considering the usual constraints of probability normalization (associated Lagrange multiplier $\tilde{\alpha}$) and of the internal-energy definition of Equation (2) (corresponding Lagrange multiplier $\beta$), choosing appropriately the first Lagrange multiplier, i.e.,
$$ \tilde{\alpha} = \frac{Z^{-1/\gamma}}{1 - 1/\gamma} . \tag{29} $$
In the following section we will analyze examples of real systems governed by a power-law frequency of events.

4. Typical Examples in Natural Systems: From Data of Events to Energy Spectrum

Next, we describe some examples, chosen from the literature, of power-law distributions found in natural systems. In order to associate these examples with the theoretical approach of the previous sections, we will assume that: (i) the relevant variable of each distribution may be related in some way to the energy $\epsilon$; (ii) the fittings describing each class of phenomena may be associated with the continuous probability distribution of Equation (10), defined in the range between its minimum and maximum values ($\epsilon_0$ and $\epsilon_m$, respectively). We discuss separately two types of phenomena: (i) systems presenting energy power-law distributions that can be directly related to the distribution of Equation (10); in such cases, we calculate, from the corresponding data, important quantities like the entropic index $q$, the dimensionless ratio $\epsilon_0/\epsilon_m$, and the fixed value of the temperature $T_q$; (ii) systems presenting power-law distributions $P(x)$, depending on a parameter $x$ that can be related to the energy $\epsilon$ through some invertible monotonic function; for these cases, we propose a procedure for calculating the quantities of interest.

4.1. Systems Exhibiting Energy Power-Law Distributions

Certainly, one of the most paradigmatic power-law distributions is the Gutenberg–Richter law, which measures the frequency of earthquakes with a magnitude larger than a certain value m [2]. The magnitude m may be related to the seismic energy (or energy released) E [25], so that the Gutenberg–Richter law is sometimes expressed in a form similar to Equation (10),
$$ p(E) \propto E^{-\gamma'} . \tag{30} $$
In fact, as pointed out in Reference [26], the distribution above was proposed previously by Wadati (1932), in a paper written in Japanese [27]. By analyzing earthquakes around the Tokyo station, Wadati obtained two different estimates for the exponent $\gamma'$, respectively $\gamma' = 1.7$ and $\gamma' = 2.1$, under different assumptions for the distributions of hypocenters. One should notice that the first estimate is very close to $\gamma' = 5/3$, which is nowadays generally accepted for the index of the power-law distribution of seismic energies [26,28]. For earthquakes, one can assume that the seismic energy $E$ is related to the energy $\epsilon$ in a simple way, e.g., at most apart from a proportionality constant, $\epsilon \propto E$, so that Equation (30) can be associated with the probability distribution of Equation (10). Under this assumption one has $\gamma' = \gamma$, and using Equation (23) one obtains the entropic index $q = 2/5$ for earthquakes.
Recently, the possibility of investigating seismic phenomena by means of laboratory experiments has gained considerable momentum after the identification of deep associations between earthquakes and the fracture of materials [29,30]. As examples, one may mention experiments of compression on porous glasses [31,32], as well as on small wood samples [33]. This connection is based on the idea of crackling noise, where systems under slow perturbations may respond through discrete events covering a wide range of amplitudes. By recording the amplitudes of these crackling-noise events, one can compute the associated energies, which may be suitably normalized to produce energy probability distributions. Inspired by those, further experiments have been carried out with different apparatus, e.g., without compression, through the analysis of the acoustic emission in a variety of systems, like crumpled plastic sheets [34] or ethanol-dampened charcoal [35].
The two examples presented in Figure 1 follow these procedures, where the energy probability distribution $P(E)$ is represented versus $E$ for two distinct experiments. Results from the crackling noise produced by charcoal samples, when dampened with ethanol, are presented in Figure 1a; through their experiments, the authors have shown that the most fundamental seismic laws ruling earthquakes could be reproduced [35]. In an analogous way, avalanches were observed recently by means of acoustic emission in small wood samples under compression; these avalanches show results very similar to earthquakes and to crackling noise in rocks and laboratory tests on brittle materials [33]. The distributions of energies are shown in Figure 1b, where data from different experimental conditions, i.e., constant strain rate, constant stress rate, and distinct event rates $r(t)$ (defined as the number of events in a time interval divided by the interval length), all collapse onto a universal probability distribution $P(E)$. As done before for natural earthquakes, in both cases one can identify the liberated energy $E$ directly with $\epsilon$, i.e., $\epsilon \propto E$, so that the probability distribution of Equation (10) can be related to the fitted distributions $P(E)$ shown in Figure 1a,b. In this way, these examples correspond, respectively, to $\gamma = 1.3$ and $\gamma = 1.4$, i.e., smaller values than the $\gamma = 5/3$ generally accepted for earthquakes. From Equation (23) one obtains the entropic indices $q \simeq 0.23$ (Figure 1a) and $q \simeq 0.29$ (Figure 1b). Moreover, in the plots of Figure 1 one has very small values of $\epsilon_0/\epsilon_m$ (typically, $\epsilon_0/\epsilon_m < 10^{-4}$), so that the expansions of Equations (27) and (28) are well approximated by their leading-order contributions.
In particular, the dimensionless temperature of Equation (27) becomes $(kT_q/\epsilon_0) \simeq [(1-q)/q]^{q}$, so that the two examples of Figure 1 can be associated with fixed values of the dimensionless temperature, $(kT_q/\epsilon_0) \simeq 1.32$ (Figure 1a) and $(kT_q/\epsilon_0) \simeq 1.30$ (Figure 1b). One notices that the estimates of $q$ and $T_q$ are very close to one another in these two experiments.
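The paper takes the exponents $\gamma$ from fits published with the data in Figure 1; as a complement (our illustration, not part of the paper), the $\gamma$ of an event catalogue can also be estimated by maximum likelihood with a Hill-type estimator. The sketch below tests this on synthetic energies drawn from an untruncated power law ($\epsilon_m \to \infty$, an assumption made for simplicity):

```python
import math
import random

def sample_energies(n, gamma, e0, rng):
    """Inverse-CDF sampling of p(eps) proportional to eps^(-gamma) on [e0, infinity)."""
    return [e0 * (1.0 - rng.random()) ** (-1.0 / (gamma - 1.0)) for _ in range(n)]

def gamma_mle(energies, e0):
    """Hill/maximum-likelihood estimate: gamma = 1 + n / sum(ln(eps/e0))."""
    return 1.0 + len(energies) / sum(math.log(e / e0) for e in energies)

rng = random.Random(42)                       # fixed seed for reproducibility
events = sample_energies(20000, 5.0 / 3.0, 1.0, rng)
g_hat = gamma_mle(events, 1.0)
q_hat = 1.0 - 1.0 / g_hat                     # Eq. (23)
print(g_hat, q_hat)
```

For the seismic value $\gamma = 5/3$, the estimator recovers $\gamma$ and hence $q \simeq 2/5$ to within a few percent at this sample size.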

4.2. Systems Exhibiting General Power-Law Distributions: Identifying Relevant Variables with Energy

Let us now analyze systems characterized by a given parameter $x$ and its associated power-law distribution $P(x)$; contrary to the examples shown in Figure 1, the relation between $x$ and $\epsilon$ does not follow straightforwardly. Two typical examples in this class are shown in Figure 2. In Figure 2a the forest-fire frequency density per year is represented versus the burned forest area $A_F$. The straight line yields a frequency-versus-area power-law distribution with an exponent $1.38$; the data correspond to Ontario, Canada, during the period 1976–1996 [36]. Results from experiments carried out on a NbTi (conventional superconductor) sample, at the Bean critical state, are exhibited in Figure 2b [37]. For hard superconductors, the Bean critical state corresponds to a marginally stable state, where the Lorentz force acting on each vortex equals the maximum pinning force. A sketch of the experimental arrangement is represented in the inset, where one has a tubular NbTi sample and the pickup coil. An external magnetic field enters the interior of the tube, inducing a voltage on the pickup coil; large variations of the voltage in the pickup coil are associated with avalanches. The corresponding probability density for measuring an avalanche of $s$ vortices is represented versus $s$ (cf. Figure 2b), for three different values of the magnetic field (the exponent of the power-law distribution is field-dependent); one notices that for the highest value of the magnetic field (7.55 kG), one gets avalanches of up to 5000 vortices. In both examples shown in Figure 2, one expects the variable $\epsilon$ of the previous section to be an increasing function of the relevant variable, i.e., of the burned area $A_F$ (Figure 2a), as well as of the energy required for producing $s$ vortices in a given avalanche (Figure 2b).
In order to relate probability distributions associated with these types of events to the approach of the previous sections, let us consider a given set of discrete data $\{x_i\}$, given by $m+1$ values $(x_0, x_1, x_2, \ldots, x_m)$, ordered in such a way that $0 \le x_0 \le x_1 \le x_2 \le \cdots \le x_{m-1} \le x_m$. Moreover, each quantity $x_i$ occurs with a frequency $c_i$, following
$$ \sum_{i=0}^{m} c_i = C ; \qquad c_0 \ge c_1 \ge c_2 \ge \cdots \ge c_{m-1} \ge c_m . \tag{31} $$
Rescaling the set of variables by its minimum value $x_0$, one gets a discrete set of dimensionless data $\{\tilde{x}_i\}$, $1 \le \tilde{x}_1 \le \tilde{x}_2 \le \cdots \le \tilde{x}_{m-1} \le \tilde{x}_m$, each $\tilde{x}_i$ occurring with a probability $P_i(\tilde{x}_i)$ [$P_i(\tilde{x}_i) = c_i/C$ $(i = 0, 1, 2, \ldots, m)$ representing a set of decreasing probabilities], so that
$$ \sum_{i=0}^{m} P_i(\tilde{x}_i) = 1 . \tag{32} $$
Herein we will be interested in the kind of phenomena illustrated in Figure 2, which are well-fitted by continuous power-law distributions; furthermore, we define dimensionless quantities similarly to those of Equations (13) and (14), i.e.,
$$ \tilde{P}(\tilde{x}) = \frac{1}{A}\, \tilde{x}^{-\alpha} \qquad (\alpha > 1; \ 1 \le \tilde{x} \le \tilde{x}_m), \tag{33} $$
where now $\tilde{x}$ corresponds to the continuous representation of the discrete variables $\{\tilde{x}_i\}$, whereas $\tilde{P}(\tilde{x})$ denotes a dimensionless probability distribution. Moreover, the normalization condition,
$$ \int_{1}^{\tilde{x}_m} d\tilde{x}\; \tilde{P}(\tilde{x}) = 1 , \tag{34} $$
requires
$$ A = \frac{1}{\alpha - 1} \left[ 1 - \left( \frac{x_0}{x_m} \right)^{\alpha - 1} \right] = \frac{1}{\alpha - 1} \left[ 1 - \tilde{x}_m^{\,1 - \alpha} \right] . \tag{35} $$
Accordingly, one can also calculate the average value,
$$ \langle \tilde{x} \rangle = \int_{1}^{\tilde{x}_m} d\tilde{x}\; \tilde{P}(\tilde{x})\, \tilde{x} = \frac{(\alpha - 1) \left[ 1 - \tilde{x}_m^{\,2 - \alpha} \right]}{(\alpha - 2) \left[ 1 - \tilde{x}_m^{\,1 - \alpha} \right]} . \tag{36} $$
One should notice the resemblance of the probability distribution $\tilde{P}(\tilde{x})$ of Equation (33) to the energy distribution $\tilde{p}(\tilde{\epsilon})$ of Equation (14), as well as of the average value $\langle \tilde{x} \rangle$ to the internal energy of Equation (11). Such similarities suggest that $\tilde{\epsilon}$ and $\tilde{x}$ should be directly related to one another; herein, we propose
$$ \tilde{\epsilon} = \Lambda(\tilde{x}) \quad \Longrightarrow \quad \epsilon = \epsilon_0\, \Lambda\!\left( \frac{x}{x_0} \right), \tag{37} $$
where Λ ( y ) represents an invertible and monotonically increasing function of y, such that Λ ( 1 ) = 1 . The normalization condition on both distributions P ˜ ( x ˜ ) and p ˜ ( ϵ ˜ ) requires that
$$ \tilde{P}(\tilde{x})\, d\tilde{x} = \tilde{p}(\tilde{\epsilon})\, d\tilde{\epsilon} \quad \Longrightarrow \quad \tilde{P}(\tilde{x})\, \frac{d\tilde{x}}{d\tilde{\epsilon}} = \tilde{p}(\tilde{\epsilon}), \tag{38} $$
which implies that $\tilde{\epsilon}$ and $\tilde{x}$ should be related through a power, i.e., $\Lambda(y) = y^{\nu}$, with $\nu$ a positive real number. In this way, one obtains the relation between the two variables,
$$ \tilde{\epsilon} = \tilde{x}^{\,\nu} . \tag{39} $$
Therefore, the internal energy of Equation (11) may be written as
$$ U = \epsilon_0\, \langle \tilde{\epsilon} \rangle = \epsilon_0\, \langle \tilde{x}^{\,\nu} \rangle = \epsilon_0 \int_{1}^{\tilde{x}_m} d\tilde{x}\; \tilde{x}^{\,\nu}\, \tilde{P}(\tilde{x}) = \epsilon_0\, \frac{\alpha - 1}{\alpha - 1 - \nu}\, \frac{1 - \tilde{x}_m^{\,\nu}\, \tilde{x}_m^{\,1 - \alpha}}{1 - \tilde{x}_m^{\,1 - \alpha}} , \tag{40} $$
which recovers the result of Equation (11) by using $\tilde{\epsilon}_m = \tilde{x}_m^{\,\nu}$ and imposing the relation
$$ \frac{\alpha - 1}{\nu} = \gamma - 1 . \tag{41} $$
Hence, for systems exhibiting power-law distributions depending on a general parameter $x$ and characterized by an exponent $\alpha$ according to Equation (33), the entropic form of Equation (22) still applies. In order to identify the entropic index $q$, one should carry out the following procedure: (i) obtain the exponent $\nu$ relating the energy $\epsilon$ to the relevant parameter $x$ through Equation (39); (ii) take the exponent $\alpha$ directly from the data, like those in Figure 2 (e.g., $\alpha = 1.38$ in the case of forest fires, Figure 2a), and use Equation (41) to calculate the exponent $\gamma$ of the corresponding energy distribution; (iii) calculate the entropic index $q$ by means of Equation (23). In many cases, step (i) may be the most difficult task, since obtaining an energy distribution from a given set of data of natural systems may not be obvious.
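Steps (ii) and (iii) reduce to two one-line formulas; the sketch below (our illustration) chains them together, with the exponent $\nu$ of step (i) treated as an external input. The $\nu = 1$ used in the forest-fire line is a purely hypothetical assumption (energy proportional to burned area), not a claim of the paper:

```python
def gamma_from_alpha_nu(alpha, nu):
    """Step (ii), Eq. (41): (alpha - 1)/nu = gamma - 1."""
    return 1.0 + (alpha - 1.0) / nu

def q_from_gamma(gamma):
    """Step (iii), Eq. (23): q = 1 - 1/gamma."""
    return 1.0 - 1.0 / gamma

# Forest fires of Figure 2a: alpha = 1.38 is read off the data (step (ii));
# nu = 1 below is a hypothetical choice standing in for step (i).
alpha, nu = 1.38, 1.0
gamma = gamma_from_alpha_nu(alpha, nu)
q = q_from_gamma(gamma)
print(gamma, q)
```

Any other choice of $\nu$ simply rescales $\gamma - 1$, so the whole uncertainty of the procedure is concentrated in step (i), as noted above.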

5. Conclusions

We have analyzed events that occur with a frequency following power laws, within a certain range of validity of their relevant parameters. These types of phenomena are very common in natural systems and are usually associated with self-organized criticality. In many such cases it is possible to introduce an energy spectrum, defined in a given interval of energies between a minimum value $\epsilon_0$ and a maximum $\epsilon_m$, so that an internal energy may be calculated. Based on this, we have assumed the validity of the fundamental relation $dU = T\,dS$, and have calculated important quantities, like the associated entropic form and temperature. As a curious aspect, the power-law probability distribution is temperature-independent, in agreement with self-organized criticality; however, we have shown that these phenomena occur at a constant temperature and follow Tsallis entropy $S_q$, with an entropic index $0 < q < 1$; from the thermodynamical point of view, these phenomena could be identified as isothermal processes. In cases where $(\epsilon_m/\epsilon_0) \gg 1$, the relevant parameters within this procedure become the entropic index $q$, which is directly related to the power of the corresponding distribution, and the ground-state energy $\epsilon_0$, in terms of which all energies are rescaled. In particular, the corresponding processes take place at a temperature $T_q$ with $(kT_q/\epsilon_0) \simeq [(1-q)/q]^{q}$.
Typical examples were analyzed, like earthquakes, avalanches, and forest fires, and in some of them, the entropic index $q$ and the value of $T_q$ were estimated. Especially for earthquakes, we obtained $q = 2/5$ and $(kT_q/\epsilon_0) \simeq 1.18$. It should be mentioned that an analysis of probability distributions of energy differences (returns) of data from the Northern California earthquake catalogue has led to q-Gaussian distributions with $q = 1.75 \pm 0.15$ [38]. Although the power-law distributions considered herein are very different from the q-Gaussian distribution of Reference [38], both are associated in some way with Tsallis entropy $S_q$; curiously, our estimate for the entropic index $q$ agrees, within the error bars, with the result of Reference [38] under the usual correspondence $q \to 2 - q$.
The main contribution of the present work concerns the association of events occurring with a frequency following power laws with the entropy S q , and that distinct types of events should be characterized by different values of q. Furthermore, the identification of an associated entropic form opens the possibility for a deeper understanding of such important natural phenomena, particularly by using information theory and optimization procedures.

Author Contributions

All three authors contributed in all stages of the paper.

Funding

This research received no external funding.

Acknowledgments

We thank Constantino Tsallis for fruitful conversations. The partial financial support from CNPq and FAPERJ (Brazilian funding agencies) is acknowledged. We acknowledge Physical Review Letters (American Physical Society) for permissions to reuse Figure 1a,b and Figure 2b, as well as Physica A (Elsevier) for permission to reuse Figure 2a.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Schroeder, M. Fractals, Chaos, Power Laws; W. H. Freeman and Company: New York, NY, USA, 1991.
  2. Gutenberg, B.; Richter, C. Seismicity of the Earth and Associated Phenomena; Princeton University Press: Princeton, NJ, USA, 1954.
  3. Bak, P.; Tang, C.; Wiesenfeld, K. Self-organized criticality. Phys. Rev. A 1988, 38, 364–374.
  4. Bak, P. How Nature Works: The Science of Self-Organized Criticality; Copernicus: New York, NY, USA, 1996.
  5. Jensen, H.J. Self-Organized Criticality: Emergent Complex Behavior in Physical and Biological Systems; Cambridge University Press: Cambridge, UK, 1998.
  6. Turcotte, D.L. Self-Organized Criticality. Rep. Prog. Phys. 1999, 62, 1377–1429.
  7. Pruessner, G. Self-Organised Criticality: Theory, Models and Characterisation; Cambridge University Press: Cambridge, UK, 2012.
  8. Renyi, A. Probability Theory; North-Holland: Amsterdam, The Netherlands, 1970.
  9. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
  10. Tsallis, C. Introduction to Nonextensive Statistical Mechanics; Springer: New York, NY, USA, 2009.
  11. Tsallis, C. An introduction to nonadditive entropies and a thermostatistical approach to inanimate and living matter. Contemp. Phys. 2014, 55, 179–197.
  12. Borges, E.P.; Roditi, I. A family of nonextensive entropies. Phys. Lett. A 1998, 246, 399–402.
  13. Kaniadakis, G. Non-linear kinetics underlying generalized statistics. Physica A 2001, 296, 405–425.
  14. Kaniadakis, G. Statistical mechanics in the context of special relativity. Phys. Rev. E 2002, 66, 056125.
  15. Csiszár, I. Axiomatic characterization of information measures. Entropy 2008, 10, 261–273.
  16. Hanel, R.; Thurner, S. A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions. Europhys. Lett. 2011, 93, 20006.
  17. Hanel, R.; Thurner, S. Generalized (c,d)-entropy and aging random walks. Entropy 2013, 15, 5324–5337.
  18. Guariglia, E. Entropy and Fractal Antennas. Entropy 2016, 18, 84.
  19. Khinchin, A.I. Mathematical Foundations of Information Theory; Dover Publications: New York, NY, USA, 1957.
  20. Balian, R. From Microphysics to Macrophysics; Springer: Berlin, Germany, 1991; Volumes I–II.
  21. Plastino, A.; Curado, E.M.F. Equivalence between maximum entropy principle and enforcing dU=TdS. Phys. Rev. E 2005, 72, 047103.
  22. Curado, E.M.F.; Plastino, A. Information theory link between MaxEnt and a key thermodynamic relation. Physica A 2007, 386, 155–166.
  23. Curado, E.M.F.; Nobre, F.D.; Plastino, A. Computation of energy exchanges by combining information theory and a key thermodynamic relation: Physical applications. Physica A 2010, 389, 970–980.
  24. Plastino, A.; Curado, E.M.F.; Nobre, F.D. Deriving partition functions and entropic functionals from thermodynamics. Physica A 2014, 403, 13–20.
  25. Gutenberg, B.; Richter, C. Magnitude and Energy of Earthquakes. Nature 1955, 176, 795.
  26. Utsu, T. Representation and Analysis of the Earthquake Size Distribution: A Historical Review and Some New Approaches. Pure Appl. Geophys. 1999, 155, 509–535.
  27. Wadati, K. On the Frequency Distribution of Earthquakes. Kishoshushi (J. Meteorol. Soc. Jpn.) 1932, 10, 559–568. (In Japanese)
  28. Vallianatos, F.; Papadakis, G.; Michas, G. Generalized statistical mechanics approaches to earthquakes and tectonics. Proc. R. Soc. A 2016, 472, 20160497.
  29. Sethna, J.P.; Dahmen, K.A.; Myers, C.R. Crackling noise. Nature 2001, 410, 242–250.
  30. Salje, E.K.H.; Dahmen, K.A. Crackling noise in disordered materials. Annu. Rev. Condens. Matter Phys. 2014, 5, 233–254.
  31. Navas-Portella, V.; Corral, A.; Vives, E. Avalanches and force drops in displacement-driven compression of porous glasses. Phys. Rev. E 2016, 94, 033005.
  32. Navas-Portella, V.; Serra, I.; Corral, A.; Vives, E. Increasing power-law range in avalanche amplitude and energy distributions. Phys. Rev. E 2018, 97, 022134.
  33. Mäkinen, T.; Miksic, A.; Ovaska, M.; Alava, M.J. Avalanches in Wood Compression. Phys. Rev. Lett. 2015, 115, 055501.
  34. Mendes, R.S.; Malacarne, L.C.; Santos, R.P.B.; Ribeiro, H.V.; Picoli, S. Earthquake-like patterns of acoustic emission in crumpled plastic sheets. Europhys. Lett. 2010, 92, 29001.
  35. Ribeiro, H.V.; Costa, L.S.; Alves, L.G.A.; Santoro, P.A.; Picoli, S.; Lenzi, E.K.; Mendes, R.S. Analogies Between the Cracking Noise of Ethanol-Dampened Charcoal and Earthquakes. Phys. Rev. Lett. 2015, 115, 025503.
  36. Turcotte, D.L.; Malamud, B.D. Landslides, forest fires, and earthquakes: Examples of self-organized critical behavior. Physica A 2004, 340, 580–589.
  37. Field, S.; Witt, J.; Nori, F.; Ling, X. Superconducting Vortex Avalanches. Phys. Rev. Lett. 1995, 74, 1206–1209.
  38. Caruso, F.; Pluchino, A.; Latora, V.; Vinciguerra, S.; Rapisarda, A. Analysis of self-organized criticality in the Olami-Feder-Christensen model and in real earthquakes. Phys. Rev. E 2007, 75, 055101.
Figure 1. Typical energy power-law distributions found in experiments. (a) Energy distribution P ( E ) versus E, obtained from the cracking noise produced by charcoal samples dampened with ethanol (from Reference [35]). (b) Energy distribution P ( E ) versus E, obtained from acoustic emission in small wood samples under compression. Data from different experimental conditions, i.e., constant strain rate, constant stress rate, and various event rates r ( t ) (defined as the number of events in a time interval divided by the interval length), all collapse onto a universal probability distribution (from Reference [33]). In both cases, the variable E is properly normalized and defined as a dimensionless quantity; within the present approach (cf. Equation (10)), these examples correspond to γ = 1.3 (case (a)) and γ = 1.4 (case (b)).
Figure 2. Typical power-law distributions found in natural systems. (a) Forest-fire frequency density per year versus forest burned area A F ; the data correspond to the period 1976–1996 in Ontario, Canada (from Reference [36]). (b) Probability density D ( s ) for measuring an avalanche of s vortices in a hard superconductor versus s, for three different values of the magnetic field. The inset sketches the experimental arrangement, consisting of a tubular NbTi sample and a pickup coil; large variations of the voltage measured in the pickup coil are associated with avalanches (from Reference [37]).

