Information in Dynamical Systems and Complex Systems
A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Complexity".
Deadline for manuscript submissions: closed (28 February 2014) | Viewed by 70971
Special Issue Editors
Interests: dynamical systems; chaos theory; control of chaos; time-series analysis; Frobenius-Perron operators; stochastic dynamical systems; measurable dynamics; symbol dynamics; connections to information theory; image processing; data assimilation; connections between models and observed data; complex and networked coupled systems
Interests: dynamical systems; complex networks; information theory; time series analysis
Special Issue Information
Dear Colleagues,
On July 18-19, 2013, a workshop entitled "Information in Dynamical Systems and Complex Systems, Summer 2013 Workshop" was held in Burlington, VT, organized by Erik M. Bollt and Jie Sun (Clarkson University). The invited attendees were Erik Bollt, Ned Corron (U.S. Army), James Crutchfield (University of California, Davis), David Feldman (College of the Atlantic), Adom Giffin (Clarkson University), Kevin Knuth (University at Albany, SUNY), Ying-Cheng Lai (Arizona State University), John Mahoney (University of California, Merced), Konstantin Mischaikow (Rutgers University), Edward Ott (University of Maryland, College Park), Milan Palus (Academy of Sciences of the Czech Republic), Shawn Pethel (U.S. Army), Maurizio Porfiri (Polytechnic Institute of New York University), Samuel Stanton (U.S. Army), Jie Sun, and James Yorke (University of Maryland, College Park).
This special issue of Entropy offers a venue to collect some of the synergy, consensus, and collective thoughts on the topical themes stated for the workshop. The topics and themes of the workshop are listed below, and participants are invited to submit papers summarizing the collective discussions presented.
To that end, submissions should take one of the following forms:
- Research articles related to presentations given at the workshop.
- Research articles related to the themes of the workshop stated below.
- Commentaries on future directions of information and complexity in large-scale systems as related to some of the themes below.
- Commentaries regarding connections and contrasts in the directions below.
- Commentaries on wisdom, from experience and theory, related to misuses of concepts and tools from this area (the so-called "stop the insanity" thoughts).
- Discussions of other related themes connecting to the broader themes, such as connections between observer-based and intrinsic information and its flow.
Workshop themes were stated as follows. Given the modern focus of dynamical systems on coupled oscillators that form complex networks, it is important to move forward and explore these problems from the perspective of information content, flow and causality. The following general areas are central in this endeavor:
Information flow. Transfer entropy in particular has gained a great deal of interest in recent years: future states may be conditioned on past states, both with and without access to other stochastic processes, as a test based on the Kullback-Leibler divergence. Recent work, however, suggests that transfer entropy can be misinterpreted when used for causality inference.
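To make the quantity concrete, a commonly used formulation of transfer entropy (following Schreiber) from a process X to a process Y, with past-state embedding lengths k and l, can be sketched as follows; the notation is illustrative and not taken from the workshop materials:

T_{X \to Y} = \sum p\left(y_{t+1}, y_t^{(k)}, x_t^{(l)}\right) \log \frac{p\left(y_{t+1} \mid y_t^{(k)}, x_t^{(l)}\right)}{p\left(y_{t+1} \mid y_t^{(k)}\right)},

where y_t^{(k)} = (y_t, \ldots, y_{t-k+1}) and x_t^{(l)} = (x_t, \ldots, x_{t-l+1}). This is the expected Kullback-Leibler divergence between the transition probabilities of Y conditioned on its own past alone and conditioned on the pasts of both Y and X; it vanishes when X adds no predictive information beyond Y's own past, which is why a nonzero value is often, though not always safely, read as evidence of directed influence.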
Causality, and information signatures of causation. A central question in science is what causes outcomes of interest; for affecting and controlling outcomes, this question is even more important. From the perspective of information flow, causation and causal inference become particularly pertinent.
Symmetries and reversibility may be exploited in special circumstances to enlighten our understanding of causality, structure, and clustering.
Scales and hierarchies lead to an understanding of relevance and of nested topological partitions when defining the scale at which information symbols are defined. Variational and optimization principles of information in physics, in particular maximum entropy principles, lead to an understanding of underlying laws.
Randomness, structure and causality. In some sense, randomness may be described as external and unmodelled effects, which we may interpret in the present context as "unknown information." This leads to:
Hidden states and hidden processes, including methods such as hidden Markov models and, more generally, Bayesian inference methods. In the context of the information content of a dynamical system, such a perspective should yield a better understanding.
Measures and metrics of complexity and information content. The phrase "complexity" is commonly used for a wide variety of systems, behaviors, and processes, and yet a commonly agreed-upon description of what the phrase means is lacking.
Physical laws as information filters or algorithms. Physical laws lead to evolution equations, which from the perspective of this discussion define the evolution from one information state to a new information state; physical laws may therefore be described either as algorithms or as information filters that translate states.
Some questions to consider:
Can we develop a general mechanistic description of what renders a real complex system different from a large but perhaps simpler system (particularly from an information theory perspective)?
Can physical laws be defined in an algorithmic and information theoretic manner?
Identify engineering applications, especially those that benefit directly from an information-theoretic perspective and methods.
How can this perspective impact design?
Can specific control methods be developed that benefit from this perspective?
Are piecewise impulsive systems from mechanical as well as electronic engineering design particularly well suited?
Can group behaviors and cooperative behaviors, such as those of animals and humans, be better understood in terms of information-theoretic descriptions? What role do hierarchical structures play?
Can synchronization be understood as the counterpoint to complex behavior?
Can methods designed to identify causal influences be adapted to further adjust and define control strategies for complex systems in biological, social, physical and engineering contexts?
Is there a minimal information description of a dynamical system that will facilitate engineering design? Does an approximate description of the formal language suffice for approximate modeling, leading to faster and easier design?
Discuss the validity of popular approaches that use information and entropy measures for system probing, change detection, damage detection, and system health monitoring.
Prof. Dr. Erik M. Bollt
Dr. Jie Sun
Guest Editors