Statistical Theory and Modeling of Rare, Extreme Events: Entropy and Information Theory

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Statistical Physics".

Deadline for manuscript submissions: closed (20 August 2022) | Viewed by 14615

Special Issue Information

Dear Colleagues,

Rare, extreme events are those that occur infrequently but have a significant impact when they do. Examples include the sudden outbreak of devastating infectious diseases, solar flares, extreme weather, floods, and forest fires. Such events often result from strong non-linear interactions across a wide range of length and time scales, which limits the utility of conventional perturbative methods. Furthermore, rare events contribute to the tails of a probability distribution function while hardly affecting the mean or variance. These features present the main challenges in the theory and modeling of rare, extreme events.

This Special Issue aims to present different approaches to addressing this challenge across a broad range of disciplines from the perspective of entropy and information theory. Contributions may include the development of new theory, applications to phenomena in nature, industry, society, medicine, and finance, as well as data analysis from these systems.

Dr. Eun-jin Kim
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Rare extreme event
  • Entropy
  • Information theory
  • Transfer entropy
  • Wavelet entropy
  • Multiscale entropy
  • Statistical physics
  • Large deviation theory
  • Extreme value theory
  • Probability distribution
  • Extreme principle
  • Entropy production
  • Turbulence
  • Quantum systems
  • Self-organization
  • Avalanches
  • Geometric frustration
  • Machine learning
  • Neural network
  • Deep learning
  • Spin systems
  • Magnetization

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (5 papers)

Research

21 pages, 548 KiB  
Article
Testing for Serial Correlation in Autoregressive Exogenous Models with Possible GARCH Errors
by Hanqing Li, Xiaohui Liu, Yuting Chen and Yawen Fan
Entropy 2022, 24(8), 1076; https://doi.org/10.3390/e24081076 - 4 Aug 2022
Cited by 2 | Viewed by 2223
Abstract
Autoregressive exogenous (hereafter ARX) models are widely adopted in time-series-related domains, as they can be regarded as the combination of an autoregressive process and a predictive regression. Within this more complex structure, existing diagnostic checking methods have difficulty remaining valid under many conditions that arise in real applications, such as heteroscedasticity and error correlations between the ARX model itself and its exogenous processes. For these reasons, we propose a new serial correlation test based on the profile empirical likelihood. Simulation results, as well as two real-data examples, show that our method performs well under all of the conditions mentioned. Full article
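As a rough illustration of the setting in this abstract, the following Python sketch simulates an ARX(1) process with GARCH(1,1) errors and applies a plain Ljung-Box check to the fitted residuals. It is not the profile-empirical-likelihood test proposed by the authors, and all parameter values and helper names are illustrative assumptions.

```python
# Minimal sketch: simulate an ARX(1) model with GARCH(1,1) errors and apply an
# ordinary Ljung-Box check to the fitted residuals.  This is NOT the profile
# empirical likelihood test of the paper; all values below are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 2000

# Exogenous regressor and GARCH(1,1) innovations: h_t = w + a*e_{t-1}^2 + b*h_{t-1}
x = rng.normal(size=n)
w, a, b = 0.1, 0.1, 0.8
e = np.zeros(n)
h = np.full(n, w / (1 - a - b))          # start at the unconditional variance
for t in range(1, n):
    h[t] = w + a * e[t - 1] ** 2 + b * h[t - 1]
    e[t] = np.sqrt(h[t]) * rng.normal()

# ARX(1): y_t = phi*y_{t-1} + beta*x_t + e_t
phi, beta = 0.5, 1.0
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + beta * x[t] + e[t]

# Least-squares fit of the ARX(1) regression and its residuals
X = np.column_stack([y[:-1], x[1:]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
resid = y[1:] - X @ coef

# Textbook Ljung-Box statistic on the residuals (a heteroscedasticity-robust
# test, like the paper's, would adjust this)
def ljung_box(r, max_lag=10):
    m = len(r)
    r = r - r.mean()
    acf = np.array([np.sum(r[k:] * r[:-k]) for k in range(1, max_lag + 1)]) / np.sum(r**2)
    q = m * (m + 2) * np.sum(acf**2 / (m - np.arange(1, max_lag + 1)))
    return q, stats.chi2.sf(q, df=max_lag)

q, pval = ljung_box(resid)
print(f"ARX(1) estimates: phi={coef[0]:.3f}, beta={coef[1]:.3f}")
print(f"Ljung-Box Q(10)={q:.2f}, p-value={pval:.3f}")
```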

23 pages, 5078 KiB  
Article
A Model of Interacting Navier–Stokes Singularities
by Hugues Faller, Lucas Fery, Damien Geneste and Bérengère Dubrulle
Entropy 2022, 24(7), 897; https://doi.org/10.3390/e24070897 - 29 Jun 2022
Viewed by 2193
Abstract
We introduce a model of interacting singularities of the Navier–Stokes equations, named pinçons. They follow non-equilibrium dynamics, obtained from the condition that the velocity field around these singularities locally obeys the Navier–Stokes equations. This model can be seen as a generalization of the vorton model of Novikov, which was derived for the Euler equations. When immersed in a regular field, the pinçons are further transported and sheared by the regular field, while applying a stress onto the regular field that becomes dominant at scales smaller than the Kolmogorov length. We apply this model to compute the motion of a pair of pinçons. A pinçon dipole is intrinsically repelling, and the pinçons generically run away from each other in the early stage of their interaction. At late times, dissipation takes over, and the dipole dies over a viscous time scale. In the presence of stochastic forcing, the dipole tends to orient itself so that its components are perpendicular to their separation, and it can then follow, during a transient time, a near out-of-equilibrium state, with forcing balancing dissipation. In the general case where the pinçons have arbitrary intensity and orientation, we observe three generic dynamics in the early stage: a collapse with infinite dissipation, and two expansion modes, the dipolar anti-aligned runaway and an anisotropic aligned runaway. The collapse of a pair of pinçons reproduces several characteristics of the reconnection between two vortex rings, including the scaling of the distance between the two components, which follows the Leray scaling √(t_c − t). Full article
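As a side note on the Leray scaling quoted in the abstract, the sketch below (illustrative only, not the pinçon dynamics of the paper) checks that fitting δ² against t on synthetic separation data recovers the collapse time t_c, since δ(t) ∝ √(t_c − t) implies δ² is linear in t.

```python
# Toy illustration of the Leray scaling delta(t) ~ sqrt(t_c - t): for synthetic
# separation data, delta^2 is linear in t, so a linear fit of delta^2 against t
# recovers the collapse time t_c.  Purely illustrative synthetic data.
import numpy as np

rng = np.random.default_rng(1)
t_c_true = 1.0
t = np.linspace(0.0, 0.95, 200)
delta = np.sqrt(t_c_true - t) * (1 + 0.02 * rng.normal(size=t.size))  # noisy separation

# delta^2 = t_c - t  =>  the slope should be -1 and the intercept estimates t_c
slope, intercept = np.polyfit(t, delta**2, 1)
print(f"fitted slope = {slope:.3f} (expected -1)")
print(f"estimated collapse time t_c = {intercept:.3f} (true {t_c_true})")
```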

20 pages, 2340 KiB  
Article
Causal Information Rate
by Eun-jin Kim and Adrian-Josue Guel-Cortez
Entropy 2021, 23(8), 1087; https://doi.org/10.3390/e23081087 - 21 Aug 2021
Cited by 10 | Viewed by 2976
Abstract
Information processing is common in complex systems, and information geometric theory provides a useful tool to elucidate the characteristics of non-equilibrium processes, such as rare, extreme events, from the perspective of geometry. In particular, their time evolution can be characterized by the rate (the information rate) at which new information is revealed (a new statistical state is accessed). In this paper, we extend this concept and develop a new information-geometric measure of causality by calculating the effect of one variable on the information rate of the other variable. We apply the proposed causal information rate to the Kramers equation and compare it with the entropy-based causality measure (information flow). Overall, the causal information rate is a sensitive method for identifying causal relations. Full article
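For readers unfamiliar with the information rate underlying this work, the following minimal sketch evaluates Γ(t)² = ∫ dx (∂_t p)²/p numerically for a drifting Gaussian PDF and compares it with the closed form |μ′(t)|/σ. It only illustrates the base quantity, not the causal measure introduced in the paper; the mean trajectory is an arbitrary choice.

```python
# Minimal numerical sketch of the information rate Gamma(t), with
# Gamma^2 = integral of (d_t p)^2 / p over x, for a drifting Gaussian PDF.
# For a Gaussian with mean mu(t) and fixed sigma this reduces to |mu'(t)|/sigma,
# which the finite-difference estimate below reproduces.
import numpy as np

x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]
sigma = 1.0
mu = lambda t: np.tanh(t)                      # an arbitrary smooth mean trajectory

def pdf(t):
    return np.exp(-(x - mu(t))**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

def info_rate(t, dt=1e-4):
    p = pdf(t)
    dp_dt = (pdf(t + dt) - pdf(t - dt)) / (2 * dt)   # central difference in time
    return np.sqrt(np.sum(dp_dt**2 / p) * dx)

t0 = 0.5
print(f"numerical  Gamma({t0}) = {info_rate(t0):.4f}")
print(f"analytical |mu'(t)|/sigma = {1 / np.cosh(t0)**2 / sigma:.4f}")
```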

14 pages, 1942 KiB  
Article
Extreme Value Theory in Application to Delivery Delays
by Marcin Fałdziński, Magdalena Osińska and Wojciech Zalewski
Entropy 2021, 23(7), 788; https://doi.org/10.3390/e23070788 - 22 Jun 2021
Cited by 3 | Viewed by 2337
Abstract
This paper uses the Extreme Value Theory (EVT) to model the rare events that appear as delivery delays in road transport. Transport delivery delays occur stochastically. Therefore, modeling such events should be done using appropriate tools due to the economic consequences of these extreme events. Additionally, we provide the estimates of the extremal index and the return level with the confidence interval to describe the clustering behavior of rare events in deliveries. The Generalized Extreme Value Distribution (GEV) parameters are estimated using the maximum likelihood method and the penalized maximum likelihood method for better small-sample properties. The findings demonstrate the advantages of EVT-based prediction and its readiness for application. Full article
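A hedged sketch of the general workflow described here: fit a GEV distribution to block maxima and read off a return level. It uses synthetic data and scipy's plain maximum-likelihood fit rather than the penalized likelihood of the paper; note that scipy's shape parameter c equals minus the usual GEV shape ξ.

```python
# Sketch: fit a GEV distribution to block maxima of synthetic "delay" data and
# compute a return level.  Illustrative only; not the paper's data or its
# penalized-likelihood estimator.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)

# Synthetic daily delays, grouped into 100 blocks of 30 days
delays = rng.gamma(shape=2.0, scale=1.5, size=3000)
block_maxima = delays.reshape(100, 30).max(axis=1)

# Maximum-likelihood GEV fit to the block maxima
c, loc, scale = genextreme.fit(block_maxima)
print(f"GEV fit: shape c={c:.3f} (xi={-c:.3f}), loc={loc:.3f}, scale={scale:.3f}")

# 50-block return level: the delay exceeded on average once every 50 blocks
T = 50
return_level = genextreme.isf(1.0 / T, c, loc=loc, scale=scale)
print(f"{T}-block return level: {return_level:.2f}")
```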

23 pages, 5917 KiB  
Article
Information Geometric Theory in the Prediction of Abrupt Changes in System Dynamics
by Adrian-Josue Guel-Cortez and Eun-jin Kim
Entropy 2021, 23(6), 694; https://doi.org/10.3390/e23060694 - 31 May 2021
Cited by 16 | Viewed by 3235
Abstract
Detection and measurement of abrupt changes in a process can provide us with important tools for decision making in systems management. In particular, it can be utilised to predict the onset of a sudden event, such as a rare, extreme event, which causes an abrupt dynamical change in the system. Here, we investigate the prediction capability of information theory by focusing on how sensitive the information-geometric theory (information length diagnostics) and an entropy-based information-theoretical method (information flow) are to abrupt changes. To this end, we utilise a non-autonomous Kramers equation, including a sudden perturbation to the system to mimic the onset of a sudden event, and calculate time-dependent probability density functions (PDFs) and various statistical quantities with the help of numerical simulations. We show that the information length diagnostics predict the onset of a sudden event better than the information flow. Furthermore, it is explicitly shown that the information flow, like other entropy-based measures, has limitations in measuring perturbations that do not affect entropy. Full article
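To make the information-length idea concrete, the sketch below considers an OU-like Gaussian process whose equilibrium position is shifted abruptly: the information rate Γ = |dμ/dt|/σ spikes at the kick, and the cumulative information length records the event. The parameter values are illustrative, and this is not the non-autonomous Kramers setup of the paper.

```python
# Minimal sketch of the information-length idea: for an OU process whose
# equilibrium position is suddenly shifted at t = t_kick, the Gaussian PDF keeps
# variance D/gamma while its mean relaxes exponentially, so the information rate
# Gamma = |d(mean)/dt| / sigma spikes at the kick and the information length
# L(t) = cumulative integral of Gamma records the event.  Illustrative values only.
import numpy as np

gamma_, D = 1.0, 0.1
sigma = np.sqrt(D / gamma_)          # steady-state standard deviation
t_kick, shift = 2.0, 1.0             # time and size of the abrupt perturbation

t = np.linspace(0.0, 10.0, 5001)
dt = t[1] - t[0]

# Mean of the OU Gaussian: at rest until the kick, then exponential relaxation
mu = np.where(t < t_kick, 0.0, shift * (1.0 - np.exp(-gamma_ * (t - t_kick))))

gamma_rate = np.abs(np.gradient(mu, dt)) / sigma      # information rate Gamma(t)
L = np.cumsum(gamma_rate) * dt                        # information length L(t)

print(f"peak information rate: {gamma_rate.max():.3f} at t = {t[np.argmax(gamma_rate)]:.2f}")
print(f"total information length: {L[-1]:.3f} (analytic shift/sigma = {shift / sigma:.3f})")
```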
