
Information and Self-Organization II

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (28 February 2022) | Viewed by 20127

Special Issue Editors


Prof. Hermann Haken
Guest Editor
Institute for Theoretical Physics, Center of Synergetics, Pfaffenwaldring 57/4, Stuttgart University, D-70550 Stuttgart, Germany
Interests: self-organization/synergetics; information theory; pattern recognition; quantum information

Prof. Juval Portugali
Guest Editor
Head of the City Center (Tel Aviv University Research Center for Cities and Urbanism), Tel Aviv University, Tel Aviv 69978, Israel
Interests: complexity theories; urban dynamics; information theory; spatial cognition

Special Issue Information

Dear Colleagues,

The process of “self-organization” invokes a far-from-equilibrium property and refers to open, complex systems that acquire spatiotemporal or functional structures without specific ordering instructions from the outside. In domains such as physics, chemistry, or biology, the phrase “far from equilibrium” refers to systems that are far from thermal equilibrium, while in other disciplines the term refers to being “away from the resting state”. Such systems are “complex” in the sense that they are composed of many interacting components, parts, and elements, and “open” in the sense that they exchange matter, energy, and information with their environment. Here, “information” may mean Shannon information, as a measure of the capacity of a channel through which a message passes; pragmatic information, as the impact of a message on its recipients; or semantic information, as the meaning per se conveyed by a message.
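
As a point of reference for the first of these notions (a textbook definition, not something specific to this call): for a source that emits symbols x with probabilities p(x), Shannon's measure of information is the entropy

    H(X) = -\sum_{x} p(x) \log_2 p(x)    (bits per symbol),

which bounds the rate at which messages from that source can be transmitted over a channel. Pragmatic and semantic information, by contrast, have no single agreed-upon formula.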

In the first Special Issue, “Information and Self-Organization” (Haken and Portugali 2016), the aim was to deal with the different ways in which processes of self-organization are linked with the various forms of information. A prominent example was the concept of information adaptation, whereby Shannon information and semantic information condition each other. A study of such links has consequences for a number of research domains, ranging from physics and chemistry, through the life sciences and cognitive science, including human behavior and action, to our understanding of society, economics, and the dynamics of cities and urbanization.

In the present Special Issue, “Information and Self-Organization II”, the aim is to focus on studies exploring specific aspects of information and self-organization, such as principles of self-organization based on information theory, the ‘free energy principle’, social neurology and coordination dynamics, or their conjunction. Where relevant, we urge contributors to refer in their papers to the “lessons from corona” by adding a few notes on the issue as seen from the paper’s perspective.

Prof. Hermann Haken
Prof. Juval Portugali
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Self-organization in nature and society
  • Synergetics
  • Information theory
  • Information adaptation (IA)
  • Coordination dynamics
  • Action
  • Cities
  • Language
  • Society
  • Economy

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.


Published Papers (4 papers)


Research

27 pages, 3083 KiB  
Article
Information and Self-Organization II: Steady State and Phase Transition
by Hermann Haken and Juval Portugali
Entropy 2021, 23(6), 707; https://doi.org/10.3390/e23060707 - 2 Jun 2021
Cited by 15 | Viewed by 3983
Abstract
This paper starts from Schrödinger’s famous question “what is life?” and elucidates answers that invoke, in particular, Friston’s free energy principle and its relation to the method of Bayesian inference and to the second foundation of Synergetics, which utilizes Jaynes’ maximum entropy principle. Our presentation reflects the shift in emphasis from physical principles to principles of information theory and Synergetics. In view of the expected general audience of this issue, we have chosen a somewhat tutorial style that does not require special knowledge of physics but familiarizes the reader with concepts rooted in information theory and Synergetics.
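
As a concrete, self-contained illustration of the maximum entropy principle mentioned above (this example is not taken from the paper; the target mean, the use of SciPy, and the function names are choices made here for illustration), the following Python sketch recovers Jaynes' classic "loaded die" distribution: among all distributions on {1, ..., 6} with a prescribed mean, the entropy-maximizing one is exponential in the outcome, with its parameter fixed by the constraint.

    import numpy as np
    from scipy.optimize import brentq

    # Jaynes' "loaded die": maximize Shannon entropy over p_1..p_6 subject to a given mean.
    # The solution has the exponential form p_k proportional to exp(-lam * k), with lam
    # chosen so that the mean constraint holds.
    values = np.arange(1, 7)
    target_mean = 4.5          # illustrative constraint, not a value from the paper

    def mean_given_lambda(lam):
        p = np.exp(-lam * values)
        p /= p.sum()
        return p @ values

    # Solve for the Lagrange multiplier that satisfies the mean constraint
    lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -5.0, 5.0)
    p = np.exp(-lam * values)
    p /= p.sum()
    entropy = -(p @ np.log(p))
    print("maxent distribution:", p.round(3), "mean:", round(p @ values, 3), "H:", round(entropy, 3))

According to the abstract, it is this principle, applied to macroscopic constraints rather than die faces, that the second foundation of Synergetics builds on.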

19 pages, 1770 KiB  
Article
Unifying Large- and Small-Scale Theories of Coordination
by J. A. Scott Kelso
Entropy 2021, 23(5), 537; https://doi.org/10.3390/e23050537 - 27 Apr 2021
Cited by 39 | Viewed by 7211
Abstract
Coordination is a ubiquitous feature of all living things. It occurs by virtue of informational coupling among component parts and processes and can be quite specific (as when cells in the brain resonate to signals in the environment) or nonspecific (as when simple diffusion creates a source–sink dynamic for gene networks). Existing theoretical models of coordination—from bacteria to brains to social groups—typically focus on systems with very large numbers of elements (N→∞) or systems with only a few elements coupled together (typically N = 2). Though sharing a common inspiration in Nature’s propensity to generate dynamic patterns, both approaches have proceeded largely independently of each other. Ideally, one would like a theory that applies to phenomena observed on all scales. Recent experimental research by Mengsen Zhang and colleagues on intermediate-sized ensembles (in between the few and the many) proves to be the key to uniting large- and small-scale theories of coordination. Disorder–order transitions, multistability, order–order phase transitions, and especially metastability are shown to figure prominently on multiple levels of description, suggestive of a basic Coordination Dynamics that operates on all scales. This unified coordination dynamics turns out to be a marriage of two well-known models of large- and small-scale coordination: the former based on statistical mechanics (Kuramoto) and the latter based on the concepts of Synergetics and nonlinear dynamics (extended Haken–Kelso–Bunz or HKB). We show that models of the many and the few, previously quite unconnected, are thereby unified in a single formulation. The research has led to novel topological methods to handle the higher-dimensional dynamics of coordination in complex systems and has implications not only for understanding coordination but also for the design of (biorhythm-inspired) computers.
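
For readers unfamiliar with the two model families named above, here is a minimal numerical sketch (illustrative only: the ensemble size, coupling strengths, and step sizes are arbitrary choices, not values from the paper, and the HKB noise term is omitted). It integrates a mean-field Kuramoto ensemble of many oscillators side by side with the extended HKB equation for a single relative phase.

    import numpy as np

    def kuramoto_step(theta, omega, K, dt):
        # Mean-field Kuramoto model: dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)
        coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
        return theta + dt * (omega + K * coupling)

    def hkb_step(phi, domega, a, b, dt):
        # Extended HKB relative-phase equation (deterministic part):
        # dphi/dt = domega - a*sin(phi) - 2*b*sin(2*phi)
        return phi + dt * (domega - a * np.sin(phi) - 2 * b * np.sin(2 * phi))

    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 2 * np.pi, size=50)   # "the many": 50 coupled oscillators
    omega = rng.normal(1.0, 0.1, size=50)
    phi = 0.5                                    # "the few": one relative phase (N = 2)
    for _ in range(10_000):
        theta = kuramoto_step(theta, omega, K=0.8, dt=0.01)
        phi = hkb_step(phi, domega=0.0, a=1.0, b=1.0, dt=0.01)

    r = abs(np.exp(1j * theta).mean())           # Kuramoto order parameter: 0 = disorder, 1 = sync
    print(f"order parameter r = {r:.2f}, HKB relative phase = {phi:.2f}")

The order parameter r grows toward 1 as the ensemble synchronizes, while the relative phase settles into one of its stable coordination modes; the intermediate-sized ensembles studied by Zhang and colleagues sit between these two descriptions.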

20 pages, 1918 KiB  
Article
Neural Dynamics under Active Inference: Plausibility and Efficiency of Information Processing
by Lancelot Da Costa, Thomas Parr, Biswa Sengupta and Karl Friston
Entropy 2021, 23(4), 454; https://doi.org/10.3390/e23040454 - 12 Apr 2021
Cited by 23 | Viewed by 5696
Abstract
Active inference is a normative framework for explaining behaviour under the free energy principle—a theory of self-organisation originating in neuroscience. It specifies neuronal dynamics for state-estimation in terms of a descent on (variational) free energy—a measure of the fit between an internal (generative) model and sensory observations. The free energy gradient is a prediction error—plausibly encoded in the average membrane potentials of neuronal populations. Conversely, the expected probability of a state can be expressed in terms of neuronal firing rates. We show that this is consistent with current models of neuronal dynamics and establish face validity by synthesising plausible electrophysiological responses. We then show that these neuronal dynamics approximate natural gradient descent, a well-known optimisation algorithm from information geometry that follows the steepest descent of the objective in information space. We compare the information length of belief updating in both schemes, a measure of the distance travelled in information space that has a direct interpretation in terms of metabolic cost. We show that neural dynamics under active inference are metabolically efficient and suggest that neural representations in biological agents may evolve by approximating steepest descent in information space towards the point of optimal inference.
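
To make "a descent on (variational) free energy" concrete, here is a deliberately simplified toy sketch (not the continuous-time scheme analysed in the paper; the generative model, step size, and parameterisation are choices made here). For a single categorical hidden state with a known generative model, beliefs parameterised by a softmax follow a gradient descent on free energy and converge to the exact Bayesian posterior.

    import numpy as np

    def softmax(v):
        e = np.exp(v - v.max())
        return e / e.sum()

    def free_energy(v, log_joint):
        # F(q) = E_q[ln q(s) - ln p(o, s)] for a categorical posterior q = softmax(v)
        q = softmax(v)
        return float(q @ (np.log(q) - log_joint))

    def free_energy_grad(v, log_joint):
        # Analytic gradient of F with respect to the sufficient statistics v
        q = softmax(v)
        err = np.log(q) - log_joint        # state-wise "prediction error"
        return q * (err - q @ err)

    # Toy generative model: three hidden states, one observation already made
    log_prior = np.log(np.array([0.5, 0.3, 0.2]))
    log_lik = np.log(np.array([0.1, 0.7, 0.2]))  # p(o | s) for the observed o
    log_joint = log_lik + log_prior              # ln p(o, s)

    v = np.zeros(3)                              # initial beliefs (uniform)
    for _ in range(200):
        v -= 0.5 * free_energy_grad(v, log_joint)  # descent on variational free energy

    exact = np.exp(log_joint) / np.exp(log_joint).sum()
    print("q(s):", softmax(v).round(3), "exact posterior:", exact.round(3),
          "final F:", round(free_energy(v, log_joint), 3))

Replacing this plain gradient with the natural gradient (preconditioning by the inverse Fisher information) is the refinement that the abstract links to efficient, biologically plausible belief updating.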

17 pages, 3542 KiB  
Article
Adaptive Information Sharing with Ontological Relevance Computation for Decentralized Self-Organization Systems
by Wei Liu, Weizhi Ran, Sulemana Nantogma and Yang Xu
Entropy 2021, 23(3), 342; https://doi.org/10.3390/e23030342 - 14 Mar 2021
Cited by 2 | Viewed by 2035
Abstract
Decentralization is a defining characteristic of self-organizing systems such as swarm intelligence systems, which function as complex, collectively responsive systems without central control and operate through contextual local coordination among relatively simple individual systems. The decentralized nature of self-organizing systems lies in their capacity to respond spontaneously and cooperatively to environmental changes without external control. However, if members cannot observe the state of the whole team and its environment, they have to share their knowledge and policies with each other through communication in order to adapt to the environment appropriately. In this paper, we propose an information sharing mechanism, implemented as an independent decision phase, to improve the members’ joint adaptation to the world and thereby achieve optimal self-organization in general. We design the information sharing decision by analogy with human information sharing mechanisms: information is shared among individual members by evaluating its semantic relationship to an ontology graph and to their local knowledge. When an individual member collects more relevant information, that information is used to update its local knowledge and to improve the sharing of relevant information by measuring ontological relevance. This enables more related information to be acquired, so that the members’ models are reinforced for more precise information sharing. Our simulations and experimental results show that this design can share information efficiently to achieve optimal adaptive self-organizing systems.
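
The core mechanism, sharing a piece of information only when it is ontologically relevant to what a teammate already knows, can be sketched as follows (a hypothetical illustration: the toy ontology, the path-length relevance measure, and the threshold are stand-ins chosen here, not the relevance computation defined in the paper).

    import networkx as nx

    # Toy ontology graph relating concepts a team of agents might reason about
    ontology = nx.Graph()
    ontology.add_edges_from([
        ("vehicle", "uav"), ("vehicle", "ugv"),
        ("uav", "quadrotor"), ("uav", "sensor"), ("sensor", "camera"),
    ])

    def relevance(concept_a, concept_b, graph):
        # Shorter ontological path -> higher relevance, scaled into (0, 1]
        try:
            d = nx.shortest_path_length(graph, concept_a, concept_b)
        except nx.NetworkXNoPath:
            return 0.0
        return 1.0 / (1.0 + d)

    def should_share(message_concept, teammate_knowledge, graph, threshold=0.4):
        # Share only if the message is sufficiently relevant to something the teammate already knows
        return any(relevance(message_concept, known, graph) >= threshold
                   for known in teammate_knowledge)

    print(should_share("quadrotor", {"uav"}, ontology))     # True: one hop away in the ontology
    print(should_share("quadrotor", {"camera"}, ontology))  # False: too distant to be worth sharing

In this spirit, each shared item that clears the threshold would also be folded back into the receiver's local knowledge, which is what lets subsequent relevance judgements become more precise.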
