1. Introduction
One of the principal features of complex self-organizing systems is the multitude of a priori available states [1]. This confers on their evolution an element of unexpectedness, reflected in the ability to choose among several outcomes and in the concomitant difficulty for an observer to localize the actual state in state space. This is reminiscent of a central problem of information and communication theories [2], namely, how to recognize a particular signal blurred by noise among the multitude of signals emitted by a source.
The connection between self-organization and information finds its origin in the pioneering works of Haken and Nicolis [3,4]. In the present work, we explore this connection in a class of multistable systems subjected to stochastic variability generated by fluctuations of intrinsic or environmental origin, as well as to a systematic nonequilibrium constraint in the form of a weak external periodic forcing. As is well known, stochasticity typically induces transitions between the states [5]. Furthermore, under appropriate conditions, one witnesses a sharp, stochasticity-induced amplification of the response to the periodic forcing, referred to as stochastic resonance [6]. Our objective is to relate these phenomena to information processing.
A general formulation of the stochastic dynamics of a one-variable multistable system in the presence of a periodic forcing is presented in Section 2, where, building on previous work by one of the present authors [7], the classical linear response theory of stochastic resonance in bistable systems is extended to the case of an arbitrary number of simultaneously stable states. In Section 3, a set of entropy-like quantities characterizing the complexity, variability and predictability of the system viewed as an information processor is introduced. Their dynamics, as induced by the dynamics of the underlying multistable system, is analyzed in Section 4. It is shown that by varying some key parameters the system can attain states of optimal response and predictability. The main conclusions are summarized in Section 5.
2. Self-Organization and Stochastic Resonance in a Periodically-Forced Multistable System
Consider a one-variable nonlinear system subjected to additive periodic and stochastic forcings. The evolution of such a system can be cast in a potential form [1,5], where x is the state variable and the stochastic forcing is assimilated to a Gaussian white noise whose variance measures the strength of the fluctuations.
We decompose the generalized potential U into the potential in the absence of the periodic forcing and a modulation term, where ϵ and ω stand for the amplitude and the frequency of the forcing, respectively. In the classical setting of stochastic resonance, the unforced potential possesses two minima (associated with two stable steady states of the system) separated by a maximum. In the present work, this setting is extended by allowing for an arbitrary number n of stable steady states and, thus, for an unforced potential possessing n minima separated by intermediately situated maxima. Furthermore, we stipulate that the leftmost and rightmost minima, one and n, are separated from the environment by impermeable boundaries, such that there are no probability fluxes directed from these states to the environment [7].
A simple implementation of this setting amounts to choosing the unforced potential in such a way that the successive minima and maxima are equidistant and of equal depth and height, respectively. These conditions become increasingly difficult to fulfill for increasing n if the potential has a polynomial form. For the sake of simplicity, we therefore adopt a non-polynomial model whose stable and unstable states are located at equally spaced positions.
Equations (1) and (2) describe a composite motion combining small-scale diffusion around each of the stable states with large-scale transitions between neighboring stable states across the intermediate unstable state. The latter is an activated process whose rate depends sensitively on the potential barrier separating the states (cf. Equation (2b)). As long as the noise is sufficiently weak, in the sense that its variance is small compared with this barrier, the characteristic time scale of these transitions is much slower than the characteristic time of diffusion around a given stable state, the corresponding rate being given by Kramers’ formula [1,5], where the accents denote derivatives with respect to x.
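To make the setting concrete, the following sketch integrates a Langevin equation of the type discussed above by the Euler–Maruyama scheme and evaluates the corresponding Kramers rate. The cosine-shaped potential U0, the modulation term eps*x*cos(omega*t) and the noise convention are illustrative assumptions of this sketch, not necessarily the forms adopted in the text.

    import numpy as np

    # Illustrative (assumed) multistable potential with equally spaced minima of
    # equal depth; this cosine form is a placeholder, not the model of the text.
    def U0(x):
        return -np.cos(2.0 * np.pi * x)              # minima near x = 0, 1, 2, ...

    def dU0(x):
        return 2.0 * np.pi * np.sin(2.0 * np.pi * x)

    def simulate(eps=0.05, omega=0.1, q2=0.3, dt=1e-3, n_steps=200_000, x0=2.0, seed=0):
        """Euler-Maruyama integration of dx/dt = -dU/dx + F(t), with
        U(x, t) = U0(x) - eps * x * cos(omega * t) (assumed form of the forcing)
        and F(t) a Gaussian white noise of variance q2 (assumed convention
        <F(t) F(t')> = q2 * delta(t - t'))."""
        rng = np.random.default_rng(seed)
        x = np.empty(n_steps)
        x[0] = x0
        for k in range(1, n_steps):
            t = k * dt
            drift = -dU0(x[k - 1]) + eps * np.cos(omega * t)
            x[k] = x[k - 1] + drift * dt + np.sqrt(q2 * dt) * rng.standard_normal()
        return x

    def kramers_rate(curv_min, curv_max, barrier, q2):
        """Kramers escape rate over a barrier of height `barrier`, with the
        curvatures of U0 at the minimum and at the maximum; valid for weak
        noise, q2 << barrier, under the noise convention assumed above."""
        return (np.sqrt(curv_min * abs(curv_max)) / (2.0 * np.pi)) * np.exp(-2.0 * barrier / q2)

For this particular potential the curvatures at the minima and at the maxima are both (2π)² and the barrier height is 2, so kramers_rate((2*np.pi)**2, (2*np.pi)**2, 2.0, q2) estimates the unperturbed hopping rate between neighboring wells.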
Placing ourselves in this limit, we can map Equation (1) into a discrete-state process [5,7] describing the transfer of the probability masses contained in the attraction basins of the stable states i (i = 1, ..., n).
The corresponding kinetic equations, given in Equation (5) following [7], involve the conditional probabilities per unit time to reach state i starting from state j. The transfer operator T appearing in these equations is a tridiagonal matrix satisfying a normalization condition that ensures the conservation of the total probability, and its structure can be summarized as follows:
Elements along the principal diagonal:
Elements along the upper sub-diagonal:
Elements along the lower sub-diagonal:
The rate constants can be evaluated from Equations (4a) and (4b).
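As an illustration of this discrete-state description, the sketch below assembles a tridiagonal rate matrix with nearest-neighbor transfers and reflecting boundary states, and integrates the resulting master equation under a periodically modulated rate. The equal baseline rate k and the linear modulation of the forward and backward rates are placeholder assumptions; they are not the expressions of Equations (6) and (7).

    import numpy as np
    from scipy.integrate import solve_ivp

    def rate_matrix(k_fw, k_bw):
        """Tridiagonal generator for nearest-neighbor hopping among n states.
        k_fw[i] is the rate of the transition i -> i+1 and k_bw[i] that of
        i+1 -> i (0-based indices); every column sums to zero, so the total
        probability is conserved and the end states act as reflecting boundaries."""
        n = len(k_fw) + 1
        T = np.zeros((n, n))
        for i in range(n - 1):
            T[i + 1, i] += k_fw[i]       # gain of state i+1 from state i
            T[i, i] -= k_fw[i]           # matching loss of state i
            T[i, i + 1] += k_bw[i]       # gain of state i from state i+1
            T[i + 1, i + 1] -= k_bw[i]   # matching loss of state i+1
        return T

    def evolve(P0, k, eps_k, omega, t_span, t_eval):
        """Integrate dP/dt = T(t) P with rates k*(1 +/- eps_k*cos(omega*t));
        this linear modulation of the forward and backward rates is an assumed,
        illustrative form of the effect of the forcing."""
        n = len(P0)
        def rhs(t, P):
            mod = eps_k * np.cos(omega * t)
            T = rate_matrix(k * (1.0 + mod) * np.ones(n - 1),
                            k * (1.0 - mod) * np.ones(n - 1))
            return T @ P
        return solve_ivp(rhs, t_span, P0, t_eval=t_eval, rtol=1e-8, atol=1e-10)

For instance, evolve(np.full(5, 0.2), k=1.0, eps_k=0.1, omega=0.05, t_span=(0.0, 2000.0), t_eval=np.linspace(0.0, 2000.0, 4001)) returns an object whose .y attribute contains the five probabilities on the requested time grid.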
Equation (5) constitutes a linear system with time-periodic coefficients. In what follows, we focus on the linear response, which will provide us with both qualitative and quantitative insights into the role of the principal parameters involved in the problem.
The starting point is to expand Equation (7a) in ϵ. This induces a decomposition of the transfer operator T and of the probability vector in Equation (5) into an unperturbed, forcing-free part and a first-order, forcing-induced part. The unperturbed operator and the first-order correction Δ are again tridiagonal matrices with elements given by Equations (6a)–(6c), in which the full rates are replaced by their unperturbed and first-order parts, respectively. The unperturbed part of the probability vector is the invariant probability in the absence of the periodic forcing, and the first-order part is the forcing-induced response; they are normalized to unity and to zero, respectively. Furthermore, since in the absence of the forcing all k’s are equal (cf. Equations (7c) and (3a)), the corresponding invariant probabilities are uniform, each equal to 1/n.
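As a quick consistency check of the last statement, the stationary distribution of any rate matrix of the above form can be computed numerically as its null eigenvector; with all forward and backward rates equal, the result is indeed uniform. The helper below assumes the master-equation convention dP/dt = T P used in the previous sketch.

    import numpy as np

    def stationary_distribution(T):
        """Invariant probability of dP/dt = T P, i.e. the normalized right null
        vector of the rate matrix T; with equal forward and backward rates
        (no forcing), the result is the uniform distribution P_i = 1/n."""
        eigvals, eigvecs = np.linalg.eig(T)
        P = np.abs(np.real(eigvecs[:, np.argmin(np.abs(eigvals))]))
        return P / P.sum()

For example, stationary_distribution(rate_matrix(np.ones(n - 1), np.ones(n - 1))) returns n entries all equal to 1/n, up to numerical error.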
Substituting Equation (9) into Equation (5) and adopting for compactness a vector notation, we obtain, to the first order in ϵ, a linear evolution equation for the forcing-induced response (Equation (10)). The solution of this equation in the long time limit is periodic in time (Equation (11)), its components determining the amplitudes and the phases of the responses of the individual probabilities with respect to the periodic forcing.
Substituting Equation (11) into Equation (10) and identifying the coefficients of the sine and cosine terms, one obtains, following the lines of [7], explicit expressions for these amplitudes and phases (Equation (12)), written in terms of the forcing frequency and of the common value of the unperturbed rates.
Figure 1a–c depicts the maxima and the phases of the responses as a function of the state index i, keeping n and the remaining parameters fixed, as obtained by numerical evaluation of the analytic expressions (12). The plot in Figure 1b shows that the corresponding coefficient undergoes several changes of sign as i is varied. This entails that the corresponding response (Equation (11b)) will be subjected to an additional phase shift of π in the regions where the coefficient is negative. As can be seen, the maximal response is obtained for the boundary states one and n. This is due to the fact that, while the intermediate states are depleted by transferring probability masses to both of their neighbors, for the boundary states the depletion is asymmetric. Furthermore, for a given noise strength, the response is more pronounced in the range of low frequencies, as expected in stochastic resonance. Note that for n odd, the response of the middle state is strictly zero. Finally, varying the noise strength for fixed ω provides an optimal value for which the amplitude of the response is maximized.
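The amplitudes and phases shown in Figure 1 can also be estimated directly from a numerical solution of the master equation, without using the analytic expressions (12), by projecting each probability onto the two quadratures of the forcing once the transients have died out. The sketch below is such a post-processing step; the requirement of an integer number of forcing periods is an implementation detail of this sketch, not a statement from the text.

    import numpy as np

    def first_harmonic(P_t, t, omega):
        """Amplitude and phase of the component of each P_i(t) oscillating at the
        forcing frequency, obtained by projecting onto cos(omega*t) and sin(omega*t)
        over an integer number of forcing periods (transients discarded beforehand).
        P_t has shape (n_states, len(t)), e.g. the .y output of the previous sketch."""
        span = t[-1] - t[0]
        c = 2.0 * np.trapz(P_t * np.cos(omega * t), t, axis=1) / span
        s = 2.0 * np.trapz(P_t * np.sin(omega * t), t, axis=1) / span
        return np.hypot(c, s), np.arctan2(-s, c)

Repeating this procedure for a range of baseline rates (or, through Kramers’ formula, of noise strengths) at fixed ω provides a numerical way of locating the optimum of the response discussed above.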
3. Multistability, Information Entropy, Information Production and Information Transfer
In this section, we introduce a set of quantities serving as measures of the choice and unexpectedness associated with self-organization and, in particular, with the multiplicity of available states of the system introduced in Section 2. We start with the information (Shannon) entropy [1,2,3,4]:
where the probabilities of the various states are defined by Equations (5), (9b) and (11). As a reference, we notice that in the absence of the nonequilibrium constraint provided by the external forcing, the probabilities are uniform and equal to 1/n, and the information entropy in this state of full randomness attains its maximum value, equal to the logarithm of the number of states n. The deviation from full randomness, and thus the ability to reduce errors, is conveniently measured by the redundancy, which quantifies the relative deviation of the information entropy from its maximum value.
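In computational terms, both quantities are straightforward to evaluate from a probability vector. The sketch below uses the standard Shannon expressions, with the redundancy taken as the relative deviation of the entropy from its maximum; the use of the natural logarithm, which only fixes the unit, is an assumption of this sketch.

    import numpy as np

    def shannon_entropy(P):
        """S_I = -sum_i P_i ln P_i (natural logarithm; zero-probability states contribute nothing)."""
        P = np.asarray(P, dtype=float)
        nz = P > 0.0
        return -np.sum(P[nz] * np.log(P[nz]))

    def redundancy(P):
        """R = 1 - S_I / S_I,max, with S_I,max = ln n the entropy of the uniform (fully random) state."""
        return 1.0 - shannon_entropy(P) / np.log(len(P))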
We come next to the link with dynamics. Differentiating both sides of Equation (14) with respect to time and utilizing Equation (5), we obtain a balance equation for the rate of change of the information entropy, which can be rewritten in a more suggestive form in which the information entropy production and the associated flux, as defined in [8,9], appear explicitly.
We notice the bilinear structure of the entropy production, in which the factors within the sum can be viewed as generalized (probability) fluxes and their associated generalized forces. This is reminiscent of the expression of the entropy production of classical irreversible thermodynamics [10]. As a reference, in the state of equipartition realized in the absence of the external forcing, the probabilities are uniform and the transition rates symmetric, and the entropy production vanishes along with all individual generalized fluxes and forces. This property of detailed balance, characteristic of thermodynamic equilibrium, breaks down in the presence of the forcing, which introduces a differentiation among the probabilities and an asymmetry in the transition rates. The entropy production therefore measures the distance between equilibrium and nonequilibrium, on the one side, and between the direct (i to j) and reverse (j to i) processes, on the other. In this latter context, it is also closely related to the Kullback information.
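For a master equation with transition rates w[i, j] from state j to state i, the entropy production and the associated flux can be evaluated numerically in the bilinear flux-times-force form described above. The decomposition used in the sketch below is the standard one for Markovian master equations and is assumed, rather than shown, to coincide with the definitions adopted in the text.

    import numpy as np

    def entropy_balance(w, P):
        """Information entropy production and flux for dP_i/dt = sum_j (w[i, j] P_j - w[j, i] P_i),
        written in the standard bilinear flux-times-force form. Pairs with a vanishing
        rate or probability are skipped for simplicity."""
        n = len(P)
        production, flux = 0.0, 0.0
        for i in range(n):
            for j in range(n):
                a, b = w[i, j] * P[j], w[j, i] * P[i]
                if i == j or a <= 0.0 or b <= 0.0:
                    continue
                J = a - b                                   # net probability flux from j to i
                production += 0.5 * J * np.log(a / b)       # flux times generalized force
                flux += 0.5 * J * np.log(w[i, j] / w[j, i])
        return production, flux

With uniform probabilities and symmetric rates every term vanishes, in agreement with the detailed-balance limit discussed above.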
Finally, in an information-theory perspective, one is led to consider the information transfer between a part of the system playing the role of a “transmitting set” X and a “receiver set” Y separated by a “noisy channel” [2,4]. In the dynamical perspective developed in this work, the analogs are two states, say i and j, and a conditional probability matrix. The information transfer is then simply the sum of the Shannon entropy and the Kolmogorov–Sinai entropy [1,4]:
To relate h to the quantities governing the evolution of our multistable system, we need to relate the transition probabilities to the transition rates (probabilities per unit time) featured in Equations (6) and (17). This requires, in turn, mapping the continuous-time process of the previous section onto a discrete-time Markov chain. To this end, we introduce the discretized expression of the time derivative in Equation (5), utilize Equation (17) and choose the time step as a fraction of the Kramers time associated with the passage over the potential barriers, as discussed in Section 2:
This leads to a discrete master-type equation in which the stochastic matrix W is defined in terms of the transition rates and of the time step. Introducing again as a reference the state in the absence of the forcing, one sees straightforwardly that the invariant probabilities remain uniform and that the elements of W take simple forms that differ only between the intermediate states and the boundary states one and n. Expression (19) then reduces to a form in which the two terms on the right-hand side account, respectively, for the contributions of the intermediate states and of the boundary states one and n.
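A minimal sketch of this construction follows: the stochastic matrix is obtained from the rate matrix by a single Euler step of size dt, and the Kolmogorov–Sinai entropy of the resulting chain is then evaluated. The choice of dt as a specific fraction of the Kramers time and the precise combination entering Expression (19) are not reproduced here; only the standard chain entropy is shown.

    import numpy as np

    def stochastic_matrix(T, dt):
        """Discrete-time stochastic matrix W = I + dt * T obtained by an Euler
        discretization of the master equation dP/dt = T P; dt is meant to be a
        small fraction of the Kramers time, so that all entries stay non-negative."""
        W = np.eye(T.shape[0]) + dt * T
        assert np.all(W >= 0.0) and np.allclose(W.sum(axis=0), 1.0)
        return W

    def ks_entropy(W, P):
        """Kolmogorov-Sinai entropy of the chain, h = -sum_j P_j sum_i W[i, j] ln W[i, j],
        with W[i, j] the probability of the transition j -> i in one time step."""
        h = 0.0
        for j, pj in enumerate(P):
            col = W[:, j]
            nz = col > 0.0
            h -= pj * np.sum(col[nz] * np.log(col[nz]))
        return h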
It is worth noting that for n even, the Markov chain associated with the forcing-free system is lumpable [11], in the sense that upon grouping the original states one can reduce Equation (21) to a system of just two states, a and b. In other words, in the absence of the nonequilibrium constraint, the intermediate states play no role. The presence of the forcing will change this situation radically by inducing non-trivial correlations and information exchanges within the system.
5. Conclusions
In this work, we considered a nonlinear system subjected to a nonequilibrium constraint in the form of a periodic forcing and displaying complex behavior, namely fluctuation-induced transitions between multiple steady states and stochastic resonance. Mapping the dynamics into a discrete-state Markov process allowed us to view the system as an information processor. Subsequently, the link between the dynamics, and in particular the self-organization induced by the nonequilibrium constraint, on the one side, and quantities of interest in information theory, on the other side, was addressed. It was shown that the nonequilibrium constraint leaves a clear-cut signature on these quantities by reducing randomness and by enhancing predictability, which is maximized under conditions of stochastic resonance. Of special interest is the a priori unexpected existence of an optimum for a particular number of simultaneously stable states, suggesting the existence of optimal alphabets on which information is to be generated and transmitted.
In summary, it appears that, when viewed in a dynamical perspective, generalized entropy-like quantities as used in information theory can provide useful characterizations of self-organizing systems led to choose among a multiplicity of possible outcomes. Conversely, and in line with the pioneering work in [3,4], information constitutes in turn one of the basic attributes emerging out of the dynamics of wide classes of self-organizing systems and conveying to them their specificity.