
Information and Self-Organization

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 October 2015) | Viewed by 78536

Special Issue Editors


Prof. Dr. Hermann Haken
Guest Editor
Institute for Theoretical Physics, Center of Synergetics, Pfaffenwaldring 57/4, Stuttgart University, D-70550 Stuttgart, Germany
Interests: self-organization/synergetics, information theory, pattern recognition, quantum information

Prof. Dr. Juval Portugali
Guest Editor
ESLab (Environmental simulation lab), Department of Geography and the Human Environment, Tel Aviv University, Tel Aviv 69978, Israel
Interests: complexity theories, urban dynamics, information theory, spatial cognition

Special Issue Information

Dear Colleagues,

The process of “self-organization” is a far-from-equilibrium phenomenon: it refers to open, complex systems that acquire spatio-temporal or functional structures without specific ordering instructions from the outside. In domains such as physics, chemistry, or biology, the phrase “far from equilibrium” refers to systems that are far from thermal equilibrium, while in other disciplines it refers to the property of being away from the resting state. Such systems are “complex” in the sense that they are composed of many interacting components, parts, or elements, and “open” in the sense that they exchange matter, energy, and information with their environment. Here, “information” may mean Shannon information, a measure of the capacity of a channel through which a message passes; pragmatic information, the impact of a message on its recipients; or semantic information, the meaning conveyed by a message. This Special Issue aims to explore the different ways in which processes of self-organization are linked with these various forms of information. A prominent example is the concept of information adaptation, whereby Shannon information and semantic information condition each other. The study of such links has consequences for a range of research domains, from physics and chemistry, through the life sciences and cognitive science, including human behavior and action, to our understanding of society, economics, and the dynamics of cities and urbanization.

Prof. Dr. Hermann Haken
Prof. Dr. Juval Portugali
Guest Editors
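
To make the quantitative sense of “information” above concrete, here is a minimal sketch (not taken from any contributed paper) computing Shannon information for two toy four-symbol sources; the measure captures how many bits a channel must carry, irrespective of what the symbols mean.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two illustrative four-symbol sources: Shannon information quantifies
# channel-capacity requirements, not the meaning of the symbols.
uniform = [0.25, 0.25, 0.25, 0.25]  # maximally uncertain source
skewed = [0.85, 0.05, 0.05, 0.05]   # highly predictable source

print(shannon_entropy(uniform))  # 2.0 bits
print(shannon_entropy(skewed))   # about 0.85 bits
```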

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • self-organization in nature and society
  • synergetics, information theory
  • information adaptation (IA)
  • action
  • cities

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (10 papers)


Editorial


Editorial
Information and Self-Organization
by Hermann Haken and Juval Portugali
Entropy 2017, 19(1), 18; https://doi.org/10.3390/e19010018 - 31 Dec 2016
Cited by 24 | Viewed by 8616
Abstract
The process of “self-organization” takes place in open and complex systems that acquire spatio-temporal or functional structures without specific ordering instructions from the outside. [...] Full article
(This article belongs to the Special Issue Information and Self-Organization)

Research


Article
On Macrostates in Complex Multi-Scale Systems
by Harald Atmanspacher
Entropy 2016, 18(12), 426; https://doi.org/10.3390/e18120426 - 29 Nov 2016
Cited by 20 | Viewed by 7152
Abstract
A characteristic feature of complex systems is their deep structure, meaning that the definition of their states and observables depends on the level, or the scale, at which the system is considered. This scale dependence is reflected in the distinction of micro- and macro-states, referring to lower and higher levels of description. There are several conceptual and formal frameworks to address the relation between them. Here, we focus on an approach in which macrostates are contextually emergent from (rather than fully reducible to) microstates and can be constructed by contextual partitions of the space of microstates. We discuss criteria for the stability of such partitions, in particular under the microstate dynamics, and outline some examples. Finally, we address the question of how macrostates arising from stable partitions can be identified as relevant or meaningful. Full article
(This article belongs to the Special Issue Information and Self-Organization)
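
A minimal sketch of the partition idea, using strong lumpability of a Markov chain as one stand-in stability criterion (the transition matrix and the partition below are invented for illustration): a candidate partition of microstates into macrostates is consistent with the microstate dynamics when all microstates within a cell carry equal total transition probability into every cell.

```python
import numpy as np

def is_lumpable(P, partition, tol=1e-9):
    """Check strong lumpability: within each cell, every microstate must
    have identical total transition probability into every cell."""
    for cell in partition:
        for target in partition:
            mass = P[np.ix_(cell, target)].sum(axis=1)  # per-microstate mass into target cell
            if np.ptp(mass) > tol:                      # rows disagree -> partition not stable
                return False
    return True

# Illustrative 4-microstate chain with a two-cell candidate partition.
P = np.array([[0.6, 0.2, 0.1, 0.1],
              [0.2, 0.6, 0.1, 0.1],
              [0.1, 0.1, 0.5, 0.3],
              [0.1, 0.1, 0.3, 0.5]])
print(is_lumpable(P, [[0, 1], [2, 3]]))  # True: the coarse 2-state description is self-consistent
```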

Article
Potential of Entropic Force in Markov Systems with Nonequilibrium Steady State, Generalized Gibbs Function and Criticality
by Lowell F. Thompson and Hong Qian
Entropy 2016, 18(8), 309; https://doi.org/10.3390/e18080309 - 18 Aug 2016
Cited by 6 | Viewed by 5213
Abstract
In this paper, we revisit the notion of the “minus logarithm of stationary probability” as a generalized potential in nonequilibrium systems and attempt to illustrate its central role in an axiomatic approach to stochastic nonequilibrium thermodynamics of complex systems. It is demonstrated that this quantity arises naturally through both monotonicity results of Markov processes and as the rate function when a stochastic process approaches a deterministic limit. We then undertake a more detailed mathematical analysis of the consequences of this quantity, culminating in a necessary and sufficient condition for the criticality of stochastic systems. This condition is then discussed in the context of recent results about criticality in biological systems. Full article
(This article belongs to the Special Issue Information and Self-Organization)
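
A minimal numerical sketch of the paper's central quantity (the three-state transition matrix is invented for illustration): compute the stationary distribution pi of a Markov chain and take phi = -ln(pi) as the generalized potential.

```python
import numpy as np

# Transition matrix of an illustrative irreducible 3-state Markov chain
# (rows sum to 1); detailed balance need not hold, so the steady state
# is in general a nonequilibrium one.
P = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4]])

# Stationary distribution: left eigenvector of P with eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

phi = -np.log(pi)  # "minus logarithm of stationary probability" as a generalized potential
print(pi, phi)
```
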
Article
Information and Selforganization: A Unifying Approach and Applications
by Hermann Haken and Juval Portugali
Entropy 2016, 18(6), 197; https://doi.org/10.3390/e18060197 - 14 Jun 2016
Cited by 35 | Viewed by 8837
Abstract
Selforganization is a process by which the interaction between the parts of a complex system gives rise to the spontaneous emergence of patterns, structures or functions. In this interaction the system elements exchange matter, energy and information. We focus our attention on the relations between selforganization and information in general and the way they are linked to cognitive processes in particular. We do so from the analytical and mathematical perspective of the “second foundation of synergetics” and its “synergetic computer” and with reference to several forms of information: Shannon’s information that deals with the quantity of a message irrespective of its meaning, semantic and pragmatic forms of information that deal with the meaning conveyed by messages and information adaptation that refers to the interplay between Shannon’s information and semantic or pragmatic information. We first elucidate the relations between selforganization and information theoretically and mathematically and then by means of specific case studies. Full article
(This article belongs to the Special Issue Information and Self-Organization)
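
As a hedged sketch of the “synergetic computer” mentioned in the abstract, the snippet below Euler-integrates a standard form of Haken's order-parameter competition equations; the parameter values and initial prototype overlaps are invented for illustration.

```python
import numpy as np

def synergetic_competition(xi0, lam=1.0, B=1.0, C=1.0, dt=0.01, steps=4000):
    """Euler-integrate the order-parameter equations
    d(xi_k)/dt = xi_k * (lam - B * sum_{k' != k} xi_{k'}^2 - C * sum_{k'} xi_{k'}^2):
    a winner-take-all competition among stored prototype patterns."""
    xi = np.array(xi0, dtype=float)
    for _ in range(steps):
        total = np.sum(xi ** 2)
        xi += dt * xi * (lam - B * (total - xi ** 2) - C * total)
    return xi

# Overlaps of a test pattern with three stored prototypes (invented numbers):
print(synergetic_competition([0.6, 0.5, 0.3]))  # -> roughly [1, 0, 0]: the pattern is recognized
```

The prototype with the largest initial overlap suppresses the others, which is how pattern recognition appears here as a self-organized competition.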

Article
Entropy and the Self-Organization of Information and Value
by Rainer Feistel and Werner Ebeling
Entropy 2016, 18(5), 193; https://doi.org/10.3390/e18050193 - 19 May 2016
Cited by 24 | Viewed by 7418
Abstract
Adam Smith, Charles Darwin, Rudolf Clausius, and Léon Brillouin considered certain “values” as key quantities in their descriptions of market competition, natural selection, thermodynamic processes, and information exchange, respectively. None of those values can be computed from elementary properties of the particular object they are attributed to, but rather values represent emergent, irreducible properties. In this paper, such values are jointly understood as information values in certain contexts. For this aim, structural information is distinguished from symbolic information. While the first can be associated with arbitrary physical processes or structures, the latter requires conventions which govern encoding and decoding of the symbols which form a message. As a value of energy, Clausius’ entropy is a universal measure of the structural information contained in a thermodynamic system. The structural information of a message, in contrast to its meaning, can be evaluated by Shannon’s entropy of communication. Symbolic information is found only in the realm of life, such as in animal behavior, human sociology, science, or technology, and is often cooperatively valuated by competition. Ritualization is described here as a universal scenario for the self-organization of symbols by which symbolic information emerges from structural information in the course of evolution processes. Emergent symbolic information exhibits the novel fundamental code symmetry which prevents the meaning of a message from being reducible to the physical structure of its carrier. While symbols turn arbitrary during the ritualization transition, their structures preserve information about their evolution history. Full article
(This article belongs to the Special Issue Information and Self-Organization)
Article
Stochastic Resonance, Self-Organization and Information Dynamics in Multistable Systems
by Grégoire Nicolis and Catherine Nicolis
Entropy 2016, 18(5), 172; https://doi.org/10.3390/e18050172 - 4 May 2016
Cited by 14 | Viewed by 5900
Abstract
A class of complex self-organizing systems subjected to fluctuations of environmental or intrinsic origin and to nonequilibrium constraints in the form of an external periodic forcing is analyzed from the standpoint of information theory. Conditions under which the response of information entropy and related quantities to the nonequilibrium constraint can be optimized via a stochastic resonance-type mechanism are identified, and the role of key parameters is assessed. Full article
(This article belongs to the Special Issue Information and Self-Organization)
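
A minimal sketch of the stochastic resonance mechanism, using the textbook overdamped double-well model with weak periodic forcing as a stand-in for the paper's more general setting (all parameter values are illustrative): the response at the driving frequency typically peaks at an intermediate noise intensity.

```python
import numpy as np

rng = np.random.default_rng(0)

def response_amplitude(D, A=0.2, omega=0.1, dt=0.01, T=2000.0):
    """Simulate dx = (x - x^3 + A*cos(omega*t)) dt + sqrt(2*D*dt)*N(0,1)
    and return the output amplitude at the driving frequency."""
    n = int(T / dt)
    t = np.arange(n) * dt
    x = np.empty(n)
    x[0] = 1.0
    noise = rng.normal(0.0, np.sqrt(2 * D * dt), n)
    for i in range(1, n):
        x[i] = x[i - 1] + (x[i - 1] - x[i - 1] ** 3 + A * np.cos(omega * t[i - 1])) * dt + noise[i]
    return 2 * abs(np.mean(x * np.exp(-1j * omega * t)))  # Fourier amplitude at omega

for D in [0.02, 0.05, 0.1, 0.2, 0.4, 0.8]:
    print(f"D={D:4.2f}  response={response_amplitude(D):.3f}")
# The response typically peaks at intermediate noise: stochastic resonance.
```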

Article
An Evolutionary Game Theoretic Approach to Multi-Sector Coordination and Self-Organization
by Fernando P. Santos, Sara Encarnação, Francisco C. Santos, Juval Portugali and Jorge M. Pacheco
Entropy 2016, 18(4), 152; https://doi.org/10.3390/e18040152 - 20 Apr 2016
Cited by 19 | Viewed by 9700
Abstract
Coordination games provide ubiquitous interaction paradigms to frame human behavioral features, such as information transmission, conventions and languages as well as socio-economic processes and institutions. By using a dynamical approach, such as Evolutionary Game Theory (EGT), one is able to follow, in detail, the self-organization process by which a population of individuals coordinates into a given behavior. Real socio-economic scenarios, however, often involve the interaction between multiple co-evolving sectors, with specific options of their own, that call for generalized and more sophisticated mathematical frameworks. In this paper, we explore a general EGT approach to deal with coordination dynamics in which individuals from multiple sectors interact. Starting from a two-sector, consumer/producer scenario, we investigate the effects of including a third co-evolving sector that we call public. We explore the changes in the self-organization process of all sectors, given the feedback that this new sector imparts on the other two. Full article
(This article belongs to the Special Issue Information and Self-Organization)
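
A minimal two-sector sketch of the approach, assuming invented payoff matrices for hypothetical “consumer” and “producer” sectors: two-population replicator dynamics for a 2x2 coordination game, which the paper generalizes to a third, co-evolving public sector.

```python
import numpy as np

# Invented coordination payoffs: strategy 1 yields 2 when matched, strategy 2 yields 1.
A = np.array([[2.0, 0.0],   # consumers' payoffs against producers' strategies
              [0.0, 1.0]])
B = np.array([[2.0, 0.0],   # producers' payoffs against consumers' strategies
              [0.0, 1.0]])

def simulate(x0, y0, dt=0.01, steps=10000):
    """x, y: fractions of each sector playing strategy 1, under
    two-population replicator dynamics."""
    x, y = x0, y0
    for _ in range(steps):
        fx = A @ np.array([y, 1 - y])   # consumers' expected payoffs
        fy = B @ np.array([x, 1 - x])   # producers' expected payoffs
        x += dt * x * (1 - x) * (fx[0] - fx[1])
        y += dt * y * (1 - y) * (fy[0] - fy[1])
    return x, y

print(simulate(0.40, 0.45))  # both sectors self-organize onto the same convention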

Article
Self-Organization with Constraints—A Mathematical Model for Functional Differentiation
by Ichiro Tsuda, Yutaka Yamaguti and Hiroshi Watanabe
Entropy 2016, 18(3), 74; https://doi.org/10.3390/e18030074 - 26 Feb 2016
Cited by 7 | Viewed by 8053
Abstract
This study proposes mathematical models for functional differentiations that are viewed as self-organization with external constraints. From the viewpoint of system development, the present study investigates how system components emerge under the presence of constraints that act on a whole system. Cell differentiation in embryos and functional differentiation in cortical modules are typical examples of this phenomenon. In this paper, as case studies, we deal with three mathematical models that yielded components via such global constraints: the genesis of neuronal elements, the genesis of functional modules, and the genesis of neuronal interactions. The overall development of a system may follow a certain variational principle. Full article
(This article belongs to the Special Issue Information and Self-Organization)

Article
Measuring the Complexity of Continuous Distributions
by Guillermo Santamaría-Bonfil, Nelson Fernández and Carlos Gershenson
Entropy 2016, 18(3), 72; https://doi.org/10.3390/e18030072 - 26 Feb 2016
Cited by 10 | Viewed by 9623
Abstract
We extend previously proposed measures of complexity, emergence, and self-organization to continuous distributions using differential entropy. Given that the measures were based on Shannon’s information, the novel continuous complexity measures describe how a system’s predictability changes in terms of the probability distribution parameters. This allows us to calculate the complexity of phenomena for which distributions are known. We find that a broad range of common parameters found in Gaussian and scale-free distributions present high complexity values. We also explore the relationship between our measure of complexity and information adaptation. Full article
(This article belongs to the Special Issue Information and Self-Organization)
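
A heavily hedged sketch of the idea for a Gaussian, adapting the discrete definitions E = H/H_max (emergence), S = 1 - E (self-organization), and C = 4*E*S (complexity) with the closed-form differential entropy h = 0.5*ln(2*pi*e*sigma^2); the normalizing bound h_max below is an arbitrary illustrative choice, not the paper's normalization.

```python
import math

def gaussian_diff_entropy(sigma):
    """Differential entropy of N(mu, sigma^2): 0.5 * ln(2*pi*e*sigma^2), in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def complexity(sigma, h_max=4.0):
    """Illustrative continuous analogue of E = H/H_max, S = 1 - E, C = 4*E*S;
    h_max is an invented normalizing bound, clamped to keep E in [0, 1]."""
    E = min(max(gaussian_diff_entropy(sigma) / h_max, 0.0), 1.0)  # "emergence"
    S = 1.0 - E                                                   # "self-organization"
    return 4.0 * E * S                                            # maximal when E = S = 0.5

for sigma in [0.1, 0.5, 1.0, 2.0, 10.0]:
    print(f"sigma={sigma:5.1f}  C={complexity(sigma):.3f}")
# Complexity is highest at intermediate predictability, low for both
# near-deterministic (small sigma) and near-uniform (large sigma) cases.
```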

Review


Review
Increase in Complexity and Information through Molecular Evolution
by Peter Schuster
Entropy 2016, 18(11), 397; https://doi.org/10.3390/e18110397 - 14 Nov 2016
Cited by 11 | Viewed by 6448
Abstract
Biological evolution progresses by essentially three different mechanisms: (I) optimization of properties through natural selection in a population of competitors; (II) development of new capabilities through cooperation of competitors caused by catalyzed reproduction; and (III) variation of genetic information through mutation or recombination. Simplified evolutionary processes combine two out of the three mechanisms: Darwinian evolution combines competition (I) and variation (III) and is represented by the quasispecies model, major transitions involve cooperation (II) of competitors (I), and the third combination, cooperation (II) and variation (III) provides new insights in the role of mutations in evolution. A minimal kinetic model based on simple molecular mechanisms for reproduction, catalyzed reproduction and mutation is introduced, cast into ordinary differential equations (ODEs), and analyzed mathematically in form of its implementation in a flow reactor. Stochastic aspects are investigated through computer simulation of trajectories of the corresponding chemical master equations. The competition-cooperation model, mechanisms (I) and (II), gives rise to selection at low levels of resources and leads to symbiontic cooperation in case the material required is abundant. Accordingly, it provides a kind of minimal system that can undergo a (major) transition. Stochastic effects leading to extinction of the population through self-enhancing oscillations destabilize symbioses of four or more partners. Mutations (III) are not only the basis of change in phenotypic properties but can also prevent extinction provided the mutation rates are sufficiently large. Threshold phenomena are observed for all three combinations: The quasispecies model leads to an error threshold, the competition-cooperation model allows for an identification of a resource-triggered bifurcation with the transition, and for the cooperation-mutation model a kind of stochastic threshold for survival through sufficiently high mutation rates is observed. The evolutionary processes in the model are accompanied by gains in information on the environment of the evolving populations. In order to provide a useful basis for comparison, two forms of information, syntactic or Shannon information and semantic information are introduced here. Both forms of information are defined for simple evolving systems at the molecular level. Selection leads primarily to an increase in semantic information in the sense that higher fitness allows for more efficient exploitation of the environment and provides the basis for more progeny whereas understanding transitions involves characteristic contributions from both Shannon information and semantic information. Full article
(This article belongs to the Special Issue Information and Self-Organization)
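
A minimal sketch of the error-threshold phenomenon named in the abstract, assuming a uniform single-parameter mutation kernel and invented fitness values: Euler integration of the quasispecies equation dx_i/dt = sum_j Q_ij * f_j * x_j - x_i * Phi, where Phi is the mean fitness.

```python
import numpy as np

def quasispecies_stationary(f, mu, steps=20000, dt=0.01):
    """Euler-integrate dx_i/dt = sum_j Q_ij f_j x_j - x_i * Phi on the simplex,
    with a uniform copying-error kernel Q parameterized by mutation rate mu."""
    n = len(f)
    Q = np.full((n, n), mu / (n - 1))   # probability of mutating into each other type
    np.fill_diagonal(Q, 1 - mu)         # probability of correct copying
    x = np.full(n, 1.0 / n)
    for _ in range(steps):
        growth = Q @ (f * x)
        phi = f @ x                     # mean fitness keeps total population constant
        x += dt * (growth - x * phi)
    return x

f = np.array([10.0, 1.0, 1.0, 1.0, 1.0])   # one "master" sequence, four mutants (invented)
for mu in [0.01, 0.2, 0.5, 0.8]:
    print(f"mu={mu:4.2f}  master fraction={quasispecies_stationary(f, mu)[0]:.3f}")
# Beyond a critical error rate the master's dominance is lost: the error threshold.
```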
