

Integrated Information Theory

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (28 February 2019) | Viewed by 109,546

Special Issue Editors


Dr. Larissa Albantakis
Guest Editor
Department of Psychiatry, School of Medicine and Public Health, University of Wisconsin–Madison, 6001 Research Park Blvd, Madison, WI 53719, USA
Interests: causation and causal analysis; information; complex system science; (artificial) neural networks; machine learning; computational neuroscience; cognition; decision-making; artificial life/intelligence

Prof. Giulio Tononi
Guest Editor
Department of Psychiatry, School of Medicine and Public Health, University of Wisconsin–Madison, 6001 Research Park Blvd, Madison, WI 53719, USA
Interests: consciousness; sleep

Special Issue Information

Dear Colleagues,

Originally developed to address the problem of consciousness and its physical substrate, integrated information theory (IIT), in its latest version (“IIT 3.0”), provides a quantitative framework to analyze the compositional causal structure of (discrete) dynamical systems. In particular, IIT’s formalism is based on a notion of information that is physical and intrinsic (observer-independent), and a set of causal principles (“postulates”), including causal composition, specificity (information), irreducibility (integration), and causal exclusion.
IIT’s main quantity, a system’s amount of integrated information (Φ, “Phi”), has been employed as a general measure of complexity that captures the extent to which a system is both differentiated and integrated. What is more, the IIT analysis can reveal a system’s causal borders and, applied across macro and micro spatiotemporal scales, makes it possible to identify organizational levels at which the system exhibits strong causal constraints.
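
As a concrete entry point, Φ can be computed for small discrete systems with PyPhi, the openly available Python implementation of IIT 3.0 (Mayner et al., 2018). The sketch below builds a toy three-gate network and computes its system-level integrated information; the gates, wiring, and state are illustrative, not drawn from any particular study.

```python
# Minimal IIT 3.0 analysis of a toy three-gate network using PyPhi.
# The OR/AND/XOR wiring below is illustrative only.
import numpy as np
import pyphi

N = 3  # nodes: 0 = OR, 1 = AND, 2 = XOR

def update(state):
    a, b, c = state
    return (b | c, a & c, a ^ b)  # each gate reads the other two nodes

# State-by-node TPM in PyPhi's little-endian state ordering
# (the state of node 0 varies fastest across rows).
tpm = np.array([update(tuple((i >> n) & 1 for n in range(N)))
                for i in range(2 ** N)])

network = pyphi.Network(tpm, node_labels=("A", "B", "C"))
state = (1, 0, 0)                        # current state of nodes A, B, C
subsystem = pyphi.Subsystem(network, state, (0, 1, 2))
print(pyphi.compute.phi(subsystem))      # system-level integrated information
```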

Applying IIT’s causal measures rigorously, however, is only possible for rather small, discrete or discretized systems, due to combinatorial explosion. Moreover, the proposed mathematical framework may not be unique as a translation of IIT’s causal postulates, and relations to other proposed measures of complexity, (macro) causation, and biological information often remain vague.
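
The combinatorial explosion is easy to make concrete. Counting only the coarsest ingredients of the analysis for n binary elements (the full IIT 3.0 computation additionally searches over purviews and mechanism partitions), a rough tally looks like this:

```python
# Rough lower bounds on the size of an IIT 3.0 analysis for n binary elements:
# 2^n system states, 2^n - 1 candidate mechanisms (non-empty subsets), and
# 2^(n-1) - 1 system bipartitions; purviews and mechanism partitions multiply
# these numbers further.
for n in (4, 8, 16, 32):
    states = 2 ** n
    mechanisms = 2 ** n - 1
    bipartitions = 2 ** (n - 1) - 1
    print(f"n={n:2d}: states={states:,} mechanisms={mechanisms:,} "
          f"bipartitions={bipartitions:,}")
```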

For this special issue, we invite contributions that apply, discuss, compare, or extend the theoretical framework of integrated information theory, specifically its latest version, IIT 3.0. Submissions proposing approximations, practical measures, or alternative formulations of (parts of) the IIT formalism are also welcome, as are studies addressing causal composition and physical, intrinsic information in general.

Dr. Larissa Albantakis
Prof. Giulio Tononi
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • physical information
  • causal composition and higher order interactions
  • complexity
  • identifying causal/informational boundaries
  • informational/causal measures of autonomy
  • causal exclusion and emergence
  • practical approximations of integrated information
  • applications

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (16 papers)


Research

Jump to: Review

24 pages, 595 KiB  
Article
Scaling Behaviour and Critical Phase Transitions in Integrated Information Theory
by Miguel Aguilera
Entropy 2019, 21(12), 1198; https://doi.org/10.3390/e21121198 - 5 Dec 2019
Cited by 15 | Viewed by 3982
Abstract
Integrated Information Theory proposes a measure of conscious activity (Φ), characterised as the irreducibility of a dynamical system to the sum of its components. Due to its computational cost, current versions of the theory (IIT 3.0) are difficult to apply to systems larger than a dozen units, and, in general, it is not well known how integrated information scales as systems grow larger in size. In this article, we propose to study the scaling behaviour of integrated information in a simple model of a critical phase transition: an infinite-range kinetic Ising model. In this model, we assume a homogeneous distribution of couplings to simplify the computation of integrated information. This simplified model allows us to critically review some of the design assumptions behind the measure and connect its properties with well-known phenomena in phase transitions in statistical mechanics. As a result, we point to some aspects of the mathematical definitions of IIT 3.0 that fail to capture critical phase transitions and propose a reformulation of the assumptions made by integrated information measures. Full article
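
For orientation, a minimal simulation of the model class studied in this paper, an infinite-range kinetic Ising system with homogeneous couplings under Glauber dynamics, might look as follows; the parameter values are illustrative, not taken from the paper.

```python
# Infinite-range kinetic Ising model with homogeneous couplings J/N,
# updated with single-spin Glauber dynamics. Illustrative parameters.
import numpy as np

rng = np.random.default_rng(0)
N, J, beta, T = 512, 1.0, 1.1, 100_000  # spins, coupling, inverse temperature, steps
s = rng.choice([-1, 1], size=N)

m_trace = []
for _ in range(T):
    i = rng.integers(N)
    h = J * (s.sum() - s[i]) / N                  # mean field from all other spins
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))  # Glauber flip probability
    s[i] = 1 if rng.random() < p_up else -1
    m_trace.append(s.mean())

print("mean magnetisation:", np.mean(m_trace[T // 2:]))
```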

44 pages, 9400 KiB  
Article
Why Does Space Feel the Way it Does? Towards a Principled Account of Spatial Experience
by Andrew Haun and Giulio Tononi
Entropy 2019, 21(12), 1160; https://doi.org/10.3390/e21121160 - 27 Nov 2019
Cited by 93 | Viewed by 15210
Abstract
There must be a reason why an experience feels the way it does. A good place to begin addressing this question is spatial experience, because it may be more penetrable by introspection than other qualities of consciousness such as color or pain. Moreover, much of experience is spatial, from that of our body to the visual world, which appears as if painted on an extended canvas in front of our eyes. Because it is ‘right there’, we usually take space for granted and overlook its qualitative properties. However, we should realize that a great number of phenomenal distinctions and relations are required for the canvas of space to feel ‘extended’. Here we argue that, to be experienced as extended, the canvas of space must be composed of countless spots, here and there, small and large, and these spots must be related to each other in a characteristic manner through connection, fusion, and inclusion. Other aspects of the structure of spatial experience follow from extendedness: every spot can be experienced as enclosing a particular region, with its particular location, size, boundary, and distance from other spots. We then propose an account of the phenomenal properties of spatial experiences based on integrated information theory (IIT). The theory provides a principled approach for characterizing both the quantity and quality of experience by unfolding the cause-effect structure of a physical substrate. Specifically, we show that a simple simulated substrate of units connected in a grid-like manner yields a cause-effect structure whose properties can account for the main properties of spatial experience. These results uphold the hypothesis that our experience of space is supported by brain areas whose units are linked by a grid-like connectivity. They also predict that changes in connectivity, even in the absence of changes in activity, should lead to a warping of experienced space. To the extent that this approach provides an initial account of phenomenal space, it may also serve as a starting point for investigating other aspects of the quality of experience and their physical correspondents. Full article
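
The substrate analysed in the paper is grid-like; purely as an illustration, a nearest-neighbour connectivity matrix for a small 2D lattice can be built as below (such a matrix could, for instance, be passed as the cm argument of pyphi.Network together with a TPM).

```python
# Illustrative grid-like connectivity: units on a 4x4 lattice, each with
# directed links to its nearest neighbours.
import numpy as np

side = 4
n = side * side
cm = np.zeros((n, n), dtype=int)

def idx(r, c):
    return r * side + c

for r in range(side):
    for c in range(side):
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < side and 0 <= cc < side:
                cm[idx(r, c), idx(rr, cc)] = 1

print(cm.sum(axis=1))  # 2-4 outgoing connections per unit, depending on position
```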

14 pages, 761 KiB  
Article
Integrated Information Theory and Isomorphic Feed-Forward Philosophical Zombies
by Jake R. Hanson and Sara I. Walker
Entropy 2019, 21(11), 1073; https://doi.org/10.3390/e21111073 - 2 Nov 2019
Cited by 7 | Viewed by 4595
Abstract
Any theory amenable to scientific inquiry must have testable consequences. This minimal criterion is uniquely challenging for the study of consciousness, as we do not know if it is possible to confirm via observation from the outside whether or not a physical system knows what it feels like to have an inside—a challenge referred to as the “hard problem” of consciousness. To arrive at a theory of consciousness, the hard problem has motivated development of phenomenological approaches that adopt assumptions of what properties consciousness has based on first-hand experience and, from these, derive the physical processes that give rise to these properties. A leading theory adopting this approach is Integrated Information Theory (IIT), which assumes our subjective experience is a “unified whole”, subsequently yielding a requirement for physical feedback as a necessary condition for consciousness. Here, we develop a mathematical framework to assess the validity of this assumption by testing it in the context of isomorphic physical systems with and without feedback. The isomorphism allows us to isolate changes in Φ without affecting the size or functionality of the original system. Indeed, the only mathematical difference between a “conscious” system with Φ > 0 and an isomorphic “philosophical zombie” with Φ = 0 is a permutation of the binary labels used to internally represent functional states. This implies Φ is sensitive to functionally arbitrary aspects of a particular labeling scheme, with no clear justification in terms of phenomenological differences. In light of this, we argue any quantitative theory of consciousness, including IIT, should be invariant under isomorphisms if it is to avoid the existence of isomorphic philosophical zombies and the epistemological problems they pose. Full article
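
The construction at the heart of the argument can be sketched schematically: permuting the labels of a system's internal states yields an isomorphic system with identical input-output behaviour, while the binary, node-level description from which Φ is computed changes. Toy code, not the authors' implementation:

```python
# Relabelling the states of a deterministic state-by-state TPM produces an
# isomorphic system: same dynamics up to renaming, different binary encoding.
import numpy as np

rng = np.random.default_rng(1)
n_states = 8                                      # e.g. three binary elements

P = np.eye(n_states)[rng.permutation(n_states)]   # deterministic TPM

pi = rng.permutation(n_states)                    # new label pi[s] for old state s
Pmat = np.zeros((n_states, n_states))
Pmat[pi, np.arange(n_states)] = 1.0               # permutation matrix of pi

P_iso = Pmat @ P @ Pmat.T                         # P_iso[pi[s], pi[t]] = P[s, t]

# Still a valid deterministic TPM; only the naming of states has changed.
# Deriving a state-by-node TPM from each version can nonetheless yield
# different values of Phi, which is the label-sensitivity at issue.
assert np.allclose(P_iso.sum(axis=1), 1.0)
```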

29 pages, 2689 KiB  
Article
Causal Composition: Structural Differences among Dynamically Equivalent Systems
by Larissa Albantakis and Giulio Tononi
Entropy 2019, 21(10), 989; https://doi.org/10.3390/e21100989 - 11 Oct 2019
Cited by 23 | Viewed by 6088
Abstract
The dynamical evolution of a system of interacting elements can be predicted in terms of its elementary constituents and their interactions, or in terms of the system’s global state transitions. For this reason, systems with equivalent global dynamics are often taken to be equivalent for all relevant purposes. Nevertheless, such systems may still vary in their causal composition—the way mechanisms within the system specify causes and effects over different subsets of system elements. We demonstrate this point based on a set of small discrete dynamical systems with reversible dynamics that cycle through all their possible states. Our analysis elucidates the role of composition within the formal framework of integrated information theory. We show that the global dynamical and information-theoretic capacities of reversible systems can be maximal even though they may differ, quantitatively and qualitatively, in the information that their various subsets specify about each other (intrinsic information). This can be the case even for a system and its time-reversed equivalent. Due to differences in their causal composition, two systems with equivalent global dynamics may still differ in their capacity for autonomy, agency, and phenomenology. Full article
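
As a sketch of the system class in question: dynamics that cycle through all states correspond to a single-cycle permutation of the state space, and the time-reversed equivalent is simply the inverse permutation (illustrative code below).

```python
# Reversible dynamics cycling through all states: a single-cycle permutation
# TPM and its time-reversed (inverse) counterpart.
import numpy as np

n_states = 8                                  # e.g. three binary elements
cycle = np.roll(np.arange(n_states), -1)      # state i -> state i+1 (mod 2^n)

P = np.zeros((n_states, n_states))
P[np.arange(n_states), cycle] = 1.0           # forward dynamics
P_rev = P.T                                   # time-reversed dynamics

assert np.allclose(P @ P_rev, np.eye(n_states))  # reversibility
```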

16 pages, 1910 KiB  
Article
Criticality as a Determinant of Integrated Information Φ in Human Brain Networks
by Hyoungkyu Kim and UnCheol Lee
Entropy 2019, 21(10), 981; https://doi.org/10.3390/e21100981 - 8 Oct 2019
Cited by 33 | Viewed by 4670
Abstract
Integrated information theory (IIT) describes consciousness as information integrated across highly differentiated but irreducible constituent parts in a system. However, in a complex dynamic system such as the brain, the optimal conditions for large integrated information systems have not been elucidated. In this study, we hypothesized that network criticality, a balanced state between a large variation in functional network configuration and a large constraint on structural network configuration, may be the basis of the emergence of a large Φ, a surrogate of integrated information. We also hypothesized that as consciousness diminishes, the brain loses network criticality and Φ decreases. We tested these hypotheses with a large-scale brain network model and high-density electroencephalography (EEG) acquired during various levels of human consciousness under general anesthesia. In the modeling study, maximal criticality coincided with maximal Φ. The EEG study demonstrated an explicit relationship between Φ, criticality, and level of consciousness. The conscious resting state showed the largest Φ and criticality, whereas the balance between variation and constraint in the brain network broke down as the response rate dwindled. The results suggest network criticality as a necessary condition of a large Φ in the human brain. Full article

26 pages, 632 KiB  
Article
Integrated Information in Process-Algebraic Compositions
by Tommaso Bolognesi
Entropy 2019, 21(8), 805; https://doi.org/10.3390/e21080805 - 17 Aug 2019
Cited by 2 | Viewed by 2780
Abstract
Integrated Information Theory (IIT) is most typically applied to Boolean nets, a state transition model in which system parts cooperate by sharing state variables. By contrast, in Process Algebra, whose semantics can also be formulated in terms of (labeled) state transitions, system parts—“processes”—cooperate by sharing transitions with matching labels, according to interaction patterns expressed by suitable composition operators. Despite this substantial difference, questioning how much additional information is provided by the integration of the interacting partners above and beyond the sum of their independent contributions appears perfectly legitimate with both types of cooperation. In fact, we collect statistical data about ϕ—integrated information—relative to pairs of Boolean nets that cooperate by three alternative mechanisms: shared variables—the standard choice for Boolean nets—and two forms of shared transition, inspired by two process algebras. We name these mechanisms α, β, and γ. Quantitative characterizations of all of them are obtained by considering three alternative execution modes, namely synchronous, asynchronous and “hybrid”, by exploring the full range of possible coupling degrees in all three cases, and by considering two possible definitions of ϕ based on two alternative notions of distribution distance. Full article
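
The transition-sharing idea can be illustrated with a toy synchronous product of two labelled transition systems. The sketch below is generic process-algebra-style composition (synchronise on shared labels, interleave on private ones), not the paper's α, β, or γ mechanisms.

```python
# Toy parallel composition of two labelled transition systems (LTSs):
# shared labels synchronise, private labels interleave.

def compose(lts1, lts2, shared):
    """Each LTS is a set of (state, label, next_state) triples."""
    states1 = {s for s, _, _ in lts1} | {t for _, _, t in lts1}
    states2 = {s for s, _, _ in lts2} | {t for _, _, t in lts2}
    trans = set()
    for s1, a, t1 in lts1:
        if a in shared:
            for s2, b, t2 in lts2:
                if b == a:                     # synchronise on shared label
                    trans.add(((s1, s2), a, (t1, t2)))
        else:
            for s2 in states2:                 # interleave private move
                trans.add(((s1, s2), a, (t1, s2)))
    for s2, b, t2 in lts2:
        if b not in shared:
            for s1 in states1:
                trans.add(((s1, s2), b, (s1, t2)))
    return trans

p = {("p0", "work", "p1"), ("p1", "sync", "p0")}
q = {("q0", "idle", "q1"), ("q1", "sync", "q0")}
for t in sorted(compose(p, q, shared={"sync"})):
    print(t)
```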

23 pages, 2884 KiB  
Article
Evaluating Approximations and Heuristic Measures of Integrated Information
by André Sevenius Nilsen, Bjørn Erik Juel and William Marshall
Entropy 2019, 21(5), 525; https://doi.org/10.3390/e21050525 - 24 May 2019
Cited by 14 | Viewed by 6060
Abstract
Integrated information theory (IIT) proposes a measure of integrated information, termed Phi (Φ), to capture the level of consciousness of a physical system in a given state. Unfortunately, calculating Φ itself is currently possible only for very small model systems and far from computable for the kinds of system typically associated with consciousness (brains). Here, we considered several proposed heuristic measures and computational approximations, some of which can be applied to larger systems, and tested if they correlate well with Φ. While these measures and approximations capture intuitions underlying IIT and some have had success in practical applications, it has not been shown that they actually quantify the type of integrated information specified by the latest version of IIT and, thus, whether they can be used to test the theory. In this study, we evaluated these approximations and heuristic measures considering how well they estimated the Φ values of model systems and not on the basis of practical or clinical considerations. To do this, we simulated networks consisting of 3–6 binary linear threshold nodes randomly connected with excitatory and inhibitory connections. For each system, we then constructed the system’s state transition probability matrix (TPM) and generated observed data over time from all possible initial conditions. We then calculated Φ, approximations to Φ, and measures based on state differentiation, coalition entropy, state uniqueness, and integrated information. Our findings suggest that Φ can be approximated closely in small binary systems by using one or more of the readily available approximations (r > 0.95) but without major reductions in computational demands. Furthermore, the maximum value of Φ across states (a state-independent quantity) correlated strongly with measures of signal complexity (LZ, rs = 0.722), decoder-based integrated information (Φ*, rs = 0.816), and state differentiation (D1, rs = 0.827). These measures could allow for the efficient estimation of a system’s capacity for high Φ or function as accurate predictors of low- (but not high-)Φ systems. While it is uncertain whether the results extend to larger systems or systems with other dynamics, we stress the importance that measures aimed at being practical alternatives to Φ be, at a minimum, rigorously tested in an environment where the ground truth can be established. Full article
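
The model systems are straightforward to reproduce in outline: a random network of binary linear threshold units determines a deterministic state-by-node TPM over all 2^n states. A sketch with illustrative parameters:

```python
# Random binary linear threshold network and its state-by-node TPM,
# of the kind used as ground-truth model systems here. Illustrative sizes.
import numpy as np

rng = np.random.default_rng(2)
n = 4
W = rng.choice([-1.0, 1.0], size=(n, n)) * rng.random((n, n))  # exc./inh. weights
np.fill_diagonal(W, 0.0)
theta = np.zeros(n)                                # firing thresholds

# All 2^n states, little-endian (node 0 varies fastest).
states = np.array([[(i >> k) & 1 for k in range(n)] for i in range(2 ** n)])
tpm = (states @ W.T > theta).astype(int)           # next state of every node
print(tpm.shape)                                   # (2**n, n)
```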

15 pages, 1006 KiB  
Article
The Evolution of Neuroplasticity and the Effect on Integrated Information
by Leigh Sheneman, Jory Schossau and Arend Hintze
Entropy 2019, 21(5), 524; https://doi.org/10.3390/e21050524 - 24 May 2019
Cited by 3 | Viewed by 4814
Abstract
Information integration theory has been developed to quantify consciousness. Since conscious thought requires the integration of information, the degree of this integration can be used as a neural correlate (Φ) with the intent to measure degree of consciousness. Previous research has shown that the ability to integrate information can be improved by Darwinian evolution. The value Φ can change over many generations, and complex tasks require systems with at least a minimum Φ. This work was done using simple animats that were able to remember previous sensory inputs, but were incapable of fundamental change during their lifetime: actions were predetermined or instinctual. Here, we are interested in changes to Φ due to lifetime learning (also known as neuroplasticity). During lifetime learning, the system adapts to perform a task and necessitates a functional change, which in turn could change Φ. One can find arguments to expect one of three possible outcomes: Φ might remain constant, increase, or decrease due to learning. To resolve this, we need to observe systems that learn, but also improve their ability to learn over the many generations that Darwinian evolution requires. Quantifying Φ over the course of evolution, and over the course of their lifetimes, allows us to investigate how the ability to integrate information changes. To measure Φ, the internal states of the system must be experimentally observable. However, these states are notoriously difficult to observe in a natural system. Therefore, we use a computational model that not only evolves virtual agents (animats), but evolves animats to learn during their lifetime. We use this approach to show that a system that improves its performance due to feedback learning increases its ability to integrate information. In addition, we show that a system’s ability to increase Φ correlates with its ability to increase in performance. This suggests that systems that are very plastic regarding Φ learn better than those that are not. Full article

21 pages, 1705 KiB  
Article
Informational Structures and Informational Fields as a Prototype for the Description of Postulates of the Integrated Information Theory
by Piotr Kalita, José A. Langa and Fernando Soler-Toscano
Entropy 2019, 21(5), 493; https://doi.org/10.3390/e21050493 - 14 May 2019
Cited by 12 | Viewed by 4691
Abstract
Informational Structures (IS) and Informational Fields (IF) have been recently introduced to deal with a continuous dynamical systems-based approach to Integrated Information Theory (IIT). IS and IF contain all the geometrical and topological constraints in the phase space. This allows one to characterize all the past and future dynamical scenarios for a system in any particular state. In this paper, we develop further steps in this direction, describing a proper continuous framework for an abstract formulation, which could serve as a prototype of the IIT postulates. Full article
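
As a toy illustration of the continuous setting, one can numerically recover the attractors around which an informational structure is organised. The sketch below integrates a small cooperative Lotka-Volterra system from many initial conditions; the model family and parameters are assumed here purely for illustration.

```python
# Locating the attracting equilibria of a small continuous system by
# integrating from random initial conditions. Illustrative toy model.
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[0.0, 0.3], [0.3, 0.0]])   # cooperative couplings
r = np.array([1.0, 1.0])                 # intrinsic growth rates

def lv(t, x):
    return x * (r - x + A @ x)           # Lotka-Volterra with self-limitation

attractors = set()
rng = np.random.default_rng(3)
for _ in range(20):
    x0 = rng.random(2) * 2.0
    sol = solve_ivp(lv, (0.0, 200.0), x0, rtol=1e-8)
    attractors.add(tuple(np.round(sol.y[:, -1], 3)))

print(attractors)   # the invariant sets that anchor the informational structure
```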

48 pages, 1647 KiB  
Article
What Caused What? A Quantitative Account of Actual Causation Using Dynamical Causal Networks
by Larissa Albantakis, William Marshall, Erik Hoel and Giulio Tononi
Entropy 2019, 21(5), 459; https://doi.org/10.3390/e21050459 - 2 May 2019
Cited by 40 | Viewed by 13208
Abstract
Actual causation is concerned with the question: “What caused what?” Consider a transition between two states within a system of interacting elements, such as an artificial neural network or a biological brain circuit. Which combination of synapses caused the neuron to fire? Which image features caused the classifier to misinterpret the picture? Even detailed knowledge of the system’s causal network, its elements, their states, connectivity, and dynamics does not automatically provide a straightforward answer to the “what caused what?” question. Counterfactual accounts of actual causation, based on graphical models paired with system interventions, have demonstrated initial success in addressing specific problem cases, in line with intuitive causal judgments. Here, we start from a set of basic requirements for causation (realization, composition, information, integration, and exclusion) and develop a rigorous, quantitative account of actual causation that is generally applicable to discrete dynamical systems. We present a formal framework to evaluate these causal requirements based on system interventions and partitions, which considers all counterfactuals of a state transition. This framework is used to provide a complete causal account of the transition by identifying and quantifying the strength of all actual causes and effects linking the two consecutive system states. Finally, we examine several exemplary cases and paradoxes of causation and show that they can be illuminated by the proposed framework for quantifying actual causation. Full article
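
PyPhi ships an actual-causation module implementing this framework. The sketch below follows its documented 1.x API (pyphi.actual.Transition and pyphi.actual.account; treat the exact calls as an assumption) on a toy two-gate transition.

```python
# Actual-causation analysis of a toy OR/AND transition, based on PyPhi's
# `actual` module. Network and transition are illustrative.
import numpy as np
import pyphi

def update(state):
    a, b = state
    return (a | b, a & b)        # node 0 = OR, node 1 = AND, full connectivity

# State-by-node TPM, little-endian state ordering.
tpm = np.array([update(((i >> 0) & 1, (i >> 1) & 1)) for i in range(4)])
network = pyphi.Network(tpm, node_labels=("OR", "AND"))

before, after = (1, 0), (1, 0)   # transition (OR=1, AND=0) -> (OR=1, AND=0)
transition = pyphi.actual.Transition(network, before, after, (0, 1), (0, 1))
print(pyphi.actual.account(transition))  # actual causes and effects, with strengths
```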

12 pages, 219 KiB  
Article
Exclusion and Underdetermined Qualia
by Kyumin Moon
Entropy 2019, 21(4), 405; https://doi.org/10.3390/e21040405 - 16 Apr 2019
Cited by 10 | Viewed by 3835
Abstract
Integrated information theory (IIT) asserts that both the level and the quality of consciousness can be explained by the ability of physical systems to integrate information. Although the scientific content and empirical prospects of IIT have attracted interest, this paper focuses on another aspect of IIT, its unique theoretical structure, which relates the phenomenological axioms with the ontological postulates. In particular, the relationship between the exclusion axiom and the exclusion postulate is unclear. Moreover, the exclusion postulate leads to a serious problem in IIT: the qualia underdetermination problem. Therefore, in this paper, I will explore answers to the following three questions: (1) How does the exclusion axiom lead to the exclusion postulate? (2) How does the exclusion postulate cause the qualia underdetermination problem? (3) Is there a solution to this problem? I will provide proposals and arguments for each question. If successful, IIT can be confirmed with respect not only to its theoretical foundation, but also to its practical application. Full article
18 pages, 3267 KiB  
Article
Integrated Information as a Measure of Cognitive Processes in Coupled Genetic Repressilators
by Luis Abrego and Alexey Zaikin
Entropy 2019, 21(4), 382; https://doi.org/10.3390/e21040382 - 10 Apr 2019
Cited by 4 | Viewed by 5501
Abstract
Intercellular communication and its coordination allow cells to exhibit multistability as a form of adaptation. This conveys information processing from intracellular signaling networks, enabling self-organization among cells, and typically involves mechanisms associated with cognitive systems. How information is integrated in a functional manner, and its relationship with the different cell fates, is still unclear. In parallel, drawing originally on studies in neuroscience, integrated information proposes an approach to quantify the balance between integration and differentiation in the causal dynamics among the elements of any interacting system. In this work, such an approach is employed to study the dynamical complexity in a genetic network of repressilators coupled by quorum sensing. Several attractors under different conditions are identified and related to proposed measures of integrated information, to gain insight into the collective interaction and functional differentiation in cells. This research speaks in particular to the open question of coding and information transmission in genetic systems. Full article
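
The building block of the model, the repressilator, can be simulated in a few lines. The sketch below integrates the reduced protein-only equations with illustrative parameters, omitting the quorum-sensing coupling and the integrated-information analysis of the paper.

```python
# Reduced repressilator: three proteins in a cyclic repression loop.
# Sustained oscillations arise for sufficiently strong, steep repression.
import numpy as np
from scipy.integrate import solve_ivp

alpha, n_hill = 50.0, 4.0    # illustrative promoter strength and Hill coefficient

def repressilator(t, p):
    p1, p2, p3 = p
    return [alpha / (1 + p3 ** n_hill) - p1,
            alpha / (1 + p1 ** n_hill) - p2,
            alpha / (1 + p2 ** n_hill) - p3]

sol = solve_ivp(repressilator, (0.0, 100.0), [1.0, 1.5, 2.0], max_step=0.1)
print(sol.y[:, -1])          # oscillating protein levels at the final time
```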

30 pages, 1330 KiB  
Article
Measuring Integrated Information: Comparison of Candidate Measures in Theory and Simulation
by Pedro A.M. Mediano, Anil K. Seth and Adam B. Barrett
Entropy 2019, 21(1), 17; https://doi.org/10.3390/e21010017 - 25 Dec 2018
Cited by 70 | Viewed by 12102
Abstract
Integrated Information Theory (IIT) is a prominent theory of consciousness that has at its centre measures that quantify the extent to which a system generates more information than the sum of its parts. While several candidate measures of integrated information (“Φ”) now exist, little is known about how they compare, especially in terms of their behaviour on non-trivial network models. In this article, we provide clear and intuitive descriptions of six distinct candidate measures. We then explore the properties of each of these measures in simulation on networks consisting of eight interacting nodes, animated with Gaussian linear autoregressive dynamics. We find a striking diversity in the behaviour of these measures—no two measures show consistent agreement across all analyses. A subset of the measures appears to reflect some form of dynamical complexity, in the sense of simultaneous segregation and integration between system components. Our results help guide the operationalisation of IIT and advance the development of measures of integrated information and dynamical complexity that may have more general applicability. Full article
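
For a flavour of the setting, the simplest member of the family, a whole-minus-sum measure over the atomic partition, can be computed in closed form for Gaussian vector-autoregressive dynamics from stationary covariances. The coupling matrix below is illustrative, not one of the paper's networks.

```python
# Whole-minus-sum integrated information for a stable Gaussian VAR(1) system,
# x_{t+1} = A x_t + noise, computed analytically from covariances.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def gaussian_mi(cov_joint, dim_x):
    cx = cov_joint[:dim_x, :dim_x]
    cy = cov_joint[dim_x:, dim_x:]
    return 0.5 * np.log(np.linalg.det(cx) * np.linalg.det(cy)
                        / np.linalg.det(cov_joint))

A = np.array([[0.2, 0.16], [-0.16, 0.2]])     # stable coupling matrix
S = solve_discrete_lyapunov(A, np.eye(2))     # stationary covariance
lag = A @ S                                   # Cov(x_{t+1}, x_t)

joint = np.block([[S, lag.T], [lag, S]])      # Cov of (x_t, x_{t+1})
mi_whole = gaussian_mi(joint, 2)

mi_parts = sum(
    gaussian_mi(np.array([[S[i, i], lag[i, i]],
                          [lag[i, i], S[i, i]]]), 1)
    for i in range(2))

print("whole-minus-sum phi:", mi_whole - mi_parts)
```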

18 pages, 298 KiB  
Article
What Does ‘Information’ Mean in Integrated Information Theory?
by Olimpia Lombardi and Cristian López
Entropy 2018, 20(12), 894; https://doi.org/10.3390/e20120894 - 22 Nov 2018
Cited by 12 | Viewed by 4923
Abstract
Integrated Information Theory (IIT) intends to provide a principled theoretical approach able to characterize consciousness both quantitatively and qualitatively. Starting off by identifying the fundamental properties of experience itself, IIT develops a formal framework that relates those properties to the physical substratum of consciousness. One of the central features of IIT is the role that information plays in the theory. On the one hand, one of the self-evident truths about consciousness is that it is informative. On the other hand, mechanisms and systems of mechanisms can contribute to consciousness only if they specify a system’s intrinsic information. In this paper, we will conceptually analyze the notion of information underlying IIT. Following previous work on the matter, we will particularly argue that information within IIT should be understood in the light of a causal-manipulabilist view of information (López and Lombardi 2018), according to which information is an entity that must be involved in causal links in order to be precisely defined. Those causal links are brought to light by means of interventionist procedures following Woodward’s and Pearl’s versions of the manipulability theories of causation. Full article

Review

Jump to: Research

21 pages, 2234 KiB  
Review
Mathematics and the Brain: A Category Theoretical Approach to Go Beyond the Neural Correlates of Consciousness
by Georg Northoff, Naotsugu Tsuchiya and Hayato Saigo
Entropy 2019, 21(12), 1234; https://doi.org/10.3390/e21121234 - 17 Dec 2019
Cited by 18 | Viewed by 9268
Abstract
Consciousness is a central issue in neuroscience; however, we still lack a formal framework that can address the nature of the relationship between consciousness and its physical substrates. In this review, we provide a novel mathematical framework of category theory (CT), in which we can define and study the sameness between different domains of phenomena such as consciousness and its neural substrates. CT was designed and developed to deal with the relationships between various domains of phenomena. We introduce three concepts of CT which include (i) category; (ii) inclusion functor and expansion functor; and, most importantly, (iii) natural transformation between the functors. Each of these mathematical concepts is related to specific features in the neural correlates of consciousness (NCC). In this novel framework, we will examine two of the major theories of consciousness, integrated information theory (IIT) of consciousness and temporospatial theory of consciousness (TTC). We conclude that CT, especially the application of the notion of natural transformation, highlights that we need to go beyond NCC and unravels questions that need to be addressed by any future neuroscientific theory of consciousness. Full article

30 pages, 362 KiB  
Review
Dynamic Computation in Visual Thalamocortical Networks
by Roy Moyal and Shimon Edelman
Entropy 2019, 21(5), 500; https://doi.org/10.3390/e21050500 - 16 May 2019
Cited by 8 | Viewed by 4034
Abstract
Contemporary neurodynamical frameworks, such as coordination dynamics and winnerless competition, posit that the brain approximates symbolic computation by transitioning between metastable attractive states. This article integrates these accounts with electrophysiological data suggesting that coherent, nested oscillations facilitate information representation and transmission in thalamocortical networks. We review the relationship between criticality, metastability, and representational capacity, outline existing methods for detecting metastable oscillatory patterns in neural time series data, and evaluate plausible spatiotemporal coding schemes based on phase alignment. We then survey the circuitry and the mechanisms underlying the generation of coordinated alpha and gamma rhythms in the primate visual system, with particular emphasis on the pulvinar and its role in biasing visual attention and awareness. To conclude the review, we begin to integrate this perspective with longstanding theories of consciousness and cognition. Full article
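
A basic phase-alignment statistic of the kind surveyed here is the phase-locking value (PLV). The sketch below computes it for two synthetic narrowband signals via the Hilbert transform; real EEG/LFP data would first be band-pass filtered.

```python
# Phase-locking value between two noisy 10 Hz signals with a fixed phase lag.
import numpy as np
from scipy.signal import hilbert

fs = 1000
t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(4)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.8) + 0.5 * rng.standard_normal(t.size)

phase_x = np.angle(hilbert(x))
phase_y = np.angle(hilbert(y))
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print("PLV:", plv)   # close to 1 for consistently phase-locked signals
```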