
Noise Enhancement of Neural Information Processing

CNRS, Paris-Saclay Institute of Neuroscience (NeuroPSI), Paris-Saclay University, 91400 Saclay, France
Entropy 2022, 24(12), 1837; https://doi.org/10.3390/e24121837
Submission received: 7 November 2022 / Revised: 9 December 2022 / Accepted: 13 December 2022 / Published: 16 December 2022

Abstract

Cortical neurons in vivo function in highly fluctuating and seemingly noisy conditions, and our understanding of how information is processed in such complex states is still incomplete. In this perspective article, we first review how an intense “synaptic noise” was initially measured in single neurons, and how computational models were built based on these measurements. Recent progress in recording techniques has enabled the measurement of highly complex activity in large numbers of neurons in animals and human subjects, and models were also built to account for these complex dynamics. Here, we attempt to link these cellular and population aspects, where the complexity of network dynamics in awake cortex appears related to the synaptic noise seen in single cells. We show that noise in single cells, noise in networks, and structural noise all act to enhance responsiveness and boost the propagation of information. We propose that such noisy states are fundamental in providing favorable conditions for information processing at large-scale levels in the brain, and may be involved in sensory perception.

1. Introduction

Brain activity is subject to various sources of variability and noise, from the thermal noise present in ion channels, to noisy membrane potential activity in single neurons, to networks and circuits displaying highly irregular dynamics, up to the whole brain, where global electric and magnetic signals also display considerable amounts of noise. In this perspective article, we explore internal sources of noise that are present at different scales, and the possible roles they could play in neuronal computations.
At the single-cell level, neurons in vivo are subject to highly fluctuating and seemingly noisy conditions. This fact has been noted since the early days of recording brain electrical activity. The first intracellular recordings of cortical neurons in awake animals [1,2,3] showed that neurons are depolarized and subject to an intense and irregular synaptic bombardment of both excitatory and inhibitory inputs. This led to the proposal that cortical neurons function in a “high-conductance state”, a proposal inspired by early theoretical studies [4,5,6,7,8] (reviewed in [9]). The high-conductance state was measured in a series of early experiments [6,10] which quantified the mean membrane potential (Vm), the amount of Vm fluctuations, and the total conductance of synaptic inputs, three of the fundamental parameters necessary to describe high-conductance states.
At the network level, more recent and spectacular progress in techniques for recording or imaging large numbers of neurons has established that the distributed activity of neurons in awake cortex is sustained, irregular and highly complex [11,12,13]. Such observations corroborate studies based on single-neuron measurements and models. In the present perspective, we would like to link these two levels, and propose that the synaptic “noise” present in neurons, together with the highly complex network activity and structure, combine to confer interesting computational properties on these active networks.

2. Noisy Neurons

The first detailed measurements of synaptic noise in cortical neurons in vivo were performed in cats [6,10], and are summarized in Figure 1. The three parameters mentioned above were measured: the mean Vm (μV), the level of Vm fluctuations (σV) and the effective membrane time constant (τV). Note that the effective membrane time constant is given by the capacitance divided by the total conductance, so measuring it is equivalent to measuring the total membrane conductance (or input resistance), as done in [6,10]. These measurements were performed in neurons recorded during intact network activity in vivo, and compared to the same neurons after total suppression of network activity by microdialysis of tetrodotoxin (TTX) [10]. The measurements shown in Figure 1 were obtained by comparing periods of network activity (Up states) with the resting state of the same cell after TTX [6].
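The relation between total conductance and effective time constant can be made concrete with a few lines of arithmetic. The values below are illustrative orders of magnitude only, not the measured data of the experiments cited above:

```python
# Effective membrane time constant: tau_eff = C / g_total.
# Illustrative order-of-magnitude values, not the measured data of [6,10].
C = 0.35e-9            # total membrane capacitance (farads)
g_leak = 16e-9         # leak conductance (siemens)
g_syn = 4 * g_leak     # synaptic conductance during network activity
                       # (an illustrative ~5-fold total-conductance increase)

tau_quiet = C / g_leak             # after TTX: only the leak remains
tau_active = C / (g_leak + g_syn)  # during intact network activity

print(f"tau (after TTX):      {tau_quiet * 1e3:.1f} ms")
print(f"tau (active network): {tau_active * 1e3:.1f} ms")
```

With these numbers, suppressing network activity lengthens the effective time constant five-fold, which is why comparing active and quiescent periods directly reveals the synaptic conductance.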
These measurements quantified, for the first time, that cortical neurons in vivo are subjected to intense synaptic activity, responsible for a depolarized Vm, intense Vm fluctuations and a short time constant (high conductance), and that these conditions have a strong impact on single neurons. The measurements were incorporated in detailed computational models of neocortical pyramidal neurons [6], which showed that sustained release at excitatory and inhibitory synapses in soma and dendrites could replicate all measurements, but only if a low level of correlation was introduced between synaptic release events. This low correlation matched the measurements made between pairs of neurons in awake animals [15].
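A minimal way to capture this regime in a model is a single passive compartment driven by fluctuating excitatory and inhibitory conductances, each described by an Ornstein-Uhlenbeck (OU) process. The sketch below uses representative parameter values (they are assumptions for illustration, not the fitted values of the studies cited above) and simply checks that such synaptic noise produces a depolarized, fluctuating Vm and a much shorter effective time constant:

```python
import numpy as np

rng = np.random.default_rng(0)

# Single passive compartment driven by fluctuating excitatory and inhibitory
# conductances, each an Ornstein-Uhlenbeck (OU) process -- a "point-conductance"
# sketch of synaptic noise. Parameter values are representative only
# (units: nF, uS, mV, ms), not fitted to the measurements discussed in the text.
C = 0.346                              # capacitance (nF)
gL, EL = 0.0157, -80.0                 # leak conductance (uS) and reversal (mV)
ge0, sige, taue = 0.012, 0.003, 2.7    # excitatory: mean, SD (uS), corr. time (ms)
gi0, sigi, taui = 0.057, 0.0066, 10.5  # inhibitory
Ee, Ei = 0.0, -75.0                    # synaptic reversal potentials (mV)

dt, T = 0.05, 2000.0                   # time step and duration (ms)
n = int(T / dt)
V = np.empty(n)
V[0], ge, gi = EL, ge0, gi0
for k in range(1, n):
    # exact update of the OU conductances over one time step
    ge = ge0 + (ge - ge0) * np.exp(-dt / taue) \
         + sige * np.sqrt(1.0 - np.exp(-2.0 * dt / taue)) * rng.standard_normal()
    gi = gi0 + (gi - gi0) * np.exp(-dt / taui) \
         + sigi * np.sqrt(1.0 - np.exp(-2.0 * dt / taui)) * rng.standard_normal()
    gep, gip = max(ge, 0.0), max(gi, 0.0)   # conductances cannot go negative
    dV = (-gL * (V[k-1] - EL) - gep * (V[k-1] - Ee) - gip * (V[k-1] - Ei)) / C
    V[k] = V[k-1] + dt * dV

Vss = V[n // 4:]                    # discard the initial transient
mu_V, sigma_V = Vss.mean(), Vss.std()
tau_rest = C / gL                   # membrane time constant at rest (ms)
tau_eff = C / (gL + ge0 + gi0)      # effective time constant under noise (ms)
print(f"mean Vm = {mu_V:.1f} mV, sigma_V = {sigma_V:.1f} mV")
print(f"tau: {tau_rest:.1f} ms at rest -> {tau_eff:.1f} ms under synaptic noise")
```

Because the inhibitory conductance dominates, the mean Vm settles well above the leak reversal but below the spike threshold, with fluctuations of a few millivolts, reproducing the qualitative signature of the high-conductance state.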
It must be noted that not all measurements agree with these values. While some measurements in awake animals [16] or under anesthesia [6,10,17] do suggest a high-conductance state, some measurements in awake animals find that the conductance due to background activity is negligible, or even negative, compared to states of quiescent activity (Down states) [18]. These differences were attributed to the presence of strong rectifying currents which mask the conductance of synaptic activity. Such a rectifying current was not present in the previous measurements, as shown by their linear V–I relations [6,16]. Note also that no computational model has been proposed for the zero or negative conductance measurements, so this issue still requires further study.
Computational models were not only used to find plausible synaptic release conditions explaining the conductance measurements; they can also be used to infer the computational properties of neurons in the presence of synaptic noise. A number of interesting properties have been found, and perhaps the most consequential is that, with synaptic noise, cortical neurons present an enhanced responsiveness to synaptic inputs [14,19,20]. Due to noise, the response to inputs becomes non-deterministic, or probabilistic, and the probability of evoking spikes with a given input can be boosted in the presence of noise. This phenomenon was also identified in other studies, where it was called “gain modulation” [19]. A detailed series of dynamic-clamp experiments [20] showed the differential effect of the three parameters μV, σV and τV. One of the main findings was that the membrane fluctuations, σV, were most effective in regulating the gain of the neural response, consistent with model predictions. It was also shown that synaptic noise modulates complex intrinsic properties such as bursting in thalamic neurons [21], resulting in a form of boosting of evoked responses. These findings motivated the development of theoretical models and further experiments to investigate this fluctuation-driven regime [17,22,23].
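The noise-boosted responsiveness can be illustrated with a deliberately simple leaky integrate-and-fire sketch (a stand-in for the detailed models cited above; all parameter values are illustrative assumptions): a subthreshold input that never fires a quiescent neuron evokes spikes with some probability once voltage noise is present.

```python
import numpy as np

rng = np.random.default_rng(1)

# Leaky integrate-and-fire (LIF) neuron receiving a brief subthreshold input,
# with and without background voltage noise. All parameters are illustrative.
tau, EL, Vth, Vreset = 20.0, -70.0, -50.0, -70.0   # ms, mV
dt, T = 0.1, 250.0                                  # ms
kick, t_kick = 15.0, 150.0       # 15 mV input: subthreshold (Vth - EL = 20 mV)

def trial(noise_sd, rng):
    """Return True if a spike occurs within 50 ms of the input."""
    V, spike = EL, False
    for k in range(int(T / dt)):
        t = k * dt
        if abs(t - t_kick) < dt / 2:
            V += kick                          # the (subthreshold) input
        # OU-like voltage noise with stationary SD ~ noise_sd
        V += dt * (EL - V) / tau \
             + noise_sd * np.sqrt(2.0 * dt / tau) * rng.standard_normal()
        if V >= Vth:
            if t_kick <= t <= t_kick + 50.0:
                spike = True
            V = Vreset
    return spike

n_trials = 100
p_quiet = np.mean([trial(0.0, rng) for _ in range(n_trials)])
p_noise = np.mean([trial(4.0, rng) for _ in range(n_trials)])
print(f"P(spike | quiescent)      = {p_quiet:.2f}")
print(f"P(spike | synaptic noise) = {p_noise:.2f}")
```

Without noise, the input depolarizes the cell to 5 mV below threshold on every trial and the response probability is exactly zero; with 4 mV fluctuations, the same input evokes spikes on a sizable fraction of trials, which is the probabilistic response described in the text.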

3. Noisy Networks

Multi-electrode and calcium imaging techniques have enabled the simultaneous recording of large populations of neurons, from animals to humans. When recordings are performed during quiet wakefulness (under non-stimulated conditions), the distributed network activity appears sustained, irregular, and with very low levels of apparent synchrony. This network activity state is called “asynchronous irregular” (AI). This condition is illustrated in Figure 2A for awake human subjects recorded using multi-electrode Utah arrays [12].
These conditions can be replicated by simple network models. The AI state was in fact first identified in computational models [26]. In that study, sparsely connected networks of excitatory and inhibitory integrate-and-fire neurons were capable of displaying AI states, in addition to various other states such as synchronized oscillations. AI states were also found in even simpler networks of binary neurons [27], and in more realistic networks of excitatory and inhibitory neurons matching the “regular-spiking” (RS) and “fast-spiking” (FS) intrinsic properties seen experimentally in cortical neurons (Figure 2B) [24]. In the latter case, more realistic conductance-based synaptic interactions were used, which allowed the model to be compared to the conductance measurements. It was found that networks of RS and FS neurons with sparse connectivity can generate AI states with membrane conductances consistent with in vivo measurements [24].
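A small sparse excitatory-inhibitory network suffices to reproduce the qualitative AI signature. The sketch below follows the spirit of the sparsely connected integrate-and-fire models [26], but with current-based synapses for simplicity (the RS-FS networks discussed above are conductance-based) and illustrative parameter values; firing should come out sustained and irregular, with a high coefficient of variation (CV) of interspike intervals.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sparsely connected network of excitatory (E) and inhibitory (I)
# integrate-and-fire neurons, in the spirit of Brunel-type models.
# Current-based synapses are used here for simplicity; all values are
# illustrative (mV, ms; voltage measured relative to rest).
NE, NI = 400, 100
N = NE + NI
eps = 0.1                          # connection probability (sparse)
J, g = 0.2, 8.0                    # EPSP size (mV), relative inhibition strength
tau, theta, Vr, t_ref = 20.0, 20.0, 10.0, 2.0
dt, T = 0.1, 1000.0

# Random sparse weights (rows: targets, columns: sources)
W = (rng.random((N, N)) < eps).astype(float)
W[:, :NE] *= J
W[:, NE:] *= -g * J
np.fill_diagonal(W, 0.0)

# External Poisson drive, slightly above the rate needed to reach threshold
nu_ext = 1.2 * theta / (J * eps * NE * tau * 1e-3)   # Hz
ext_mean = eps * NE * nu_ext * dt * 1e-3             # mean spike count per step

V = rng.uniform(0.0, theta, N)
refr = np.zeros(N)
fired = np.zeros(N)
spike_times = [[] for _ in range(N)]
for k in range(int(T / dt)):
    rec = W @ fired                     # recurrent input, one-step delay
    ext = J * rng.poisson(ext_mean, N)  # external Poisson input
    V += dt * (-V / tau)
    V += rec + ext
    V[refr > 0.0] = Vr                  # refractory neurons are clamped
    fired = ((refr <= 0.0) & (V >= theta)).astype(float)
    idx = np.nonzero(fired)[0]
    V[idx] = Vr
    refr[idx] = t_ref
    refr -= dt
    for i in idx:
        spike_times[i].append(k * dt)

rates = np.array([len(s) for s in spike_times]) / (T * 1e-3)
cvs = [np.diff(s).std() / np.diff(s).mean() for s in spike_times if len(s) >= 5]
print(f"mean rate: {rates.mean():.1f} Hz, mean ISI CV: {np.mean(cvs):.2f}")
```

Because inhibition dominates the recurrent feedback, the network settles into sustained low-rate firing driven by input fluctuations rather than by the mean drive, which is what makes the individual spike trains irregular.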
Again, computational models can be used to infer what advantageous properties AI states can offer. It is presently unclear whether the distinction between current-based and conductance-based interactions is important here, but in conductance-based networks of RS-FS neurons, it was shown that AI states can indeed confer advantageous properties compared to other network states such as synchronized oscillations. In [25], it was shown that AI states provide enhanced responsiveness to external inputs, and that this property is responsible for an enhanced propagation of information in multi-layer networks (Figure 2C,D). This enhanced responsiveness at the network level is reminiscent of the enhanced response found at the cellular level (Figure 1C).

4. Structural Noise

Noise does not only apply to the activity of neurons or networks; it can also be present in their structure. This structural noise, similar to the notion of “quenched disorder” in physics [28,29], is also present in the cerebral cortex, which contains cells of very diverse shapes and sizes [30]. Figure 3 shows simulations of heterogeneous neural systems. In Figure 3A, networks of neurons were designed with different levels of cell-to-cell heterogeneity (in properties such as size, resting membrane potential or threshold). One can see that the homogeneous network is not the most responsive to external inputs: here also, the presence of noise in the structure of the network is not detrimental but seems to boost responsiveness [31,32]. Remarkably, the optimal responsiveness occurs at the level of heterogeneity measured experimentally in different preparations (Figure 3B). It was also found that there is a form of resonance with respect to the level of heterogeneity (which can be seen in Figure 3B).
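The resonance with respect to heterogeneity can be reproduced in a deliberately minimal toy model (not the AdEx networks of the study above; all values are illustrative assumptions): with no heterogeneity a subthreshold stimulus recruits almost no units, while with too much heterogeneity spontaneous crossings wash out the evoked response, so the evoked-minus-spontaneous response peaks at an intermediate level.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model of structural noise: a population of threshold units whose
# firing thresholds have cell-to-cell standard deviation `het` (the level
# of heterogeneity). Each unit also receives independent trial-to-trial
# input noise. "Responsiveness" is the evoked minus spontaneous fraction
# of units crossing threshold. Illustrative units (mV); this is not the
# AdEx network model discussed in the text.
N, trials = 2000, 200
theta_mean, stim, noise_sd = 10.0, 6.0, 2.0    # the stimulus is subthreshold

def responsiveness(het, rng):
    theta = theta_mean + het * rng.standard_normal(N)    # heterogeneous thresholds
    noise = noise_sd * rng.standard_normal((trials, N))  # per-trial input noise
    p_stim = np.mean(noise + stim > theta)   # stimulus trials
    p_spont = np.mean(noise > theta)         # spontaneous (no-stimulus) trials
    return p_stim - p_spont

levels = [0.0, 2.0, 8.0, 30.0]
resp = {h: responsiveness(h, rng) for h in levels}
for h in levels:
    print(f"heterogeneity {h:4.1f} mV -> responsiveness {resp[h]:.3f}")
```

The non-monotonic dependence on `het` is the toy analogue of the resonance seen in Figure 3B: a moderate amount of structural disorder maximizes the population's sensitivity to a weak input.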
To investigate the impact of such heterogeneity at large scales, heterogeneous mean-field models were designed, which could capture the responsiveness properties of heterogeneous networks [31]. Using these heterogeneous mean-fields, large-scale networks can be built, and in such networks, systems with moderate levels of heterogeneity were found to propagate information better than homogeneous systems (Figure 3C). This shows that microscopic, cell-to-cell heterogeneity can have notable consequences at the large-scale level.

5. Discussion

In this perspective, we have shown three examples illustrating the strong impact that internal noise sources can have at different scales. At the cellular scale (Figure 1), we have shown the amount and impact of synaptic noise as seen from single neurons. In a sense, this noise can be seen as a feedback from network-level activity onto single cells, although of course, single cells collectively participate in setting up this network activity. At this single-neuron level, the experimental characterization [6,10] showed that synaptic noise is significant, and has a strong impact on neuronal parameters such as the mean membrane potential (μV), the voltage fluctuations (σV) and the effective membrane time constant (τV), which is linked to the total conductance of the cell. Varying these parameters can change the position and the slope of the transfer function (Figure 1E). A particularly striking effect is that, for some input amplitudes, the response in the presence of noise is amplified compared to quiescent conditions (* in Figure 1E). This phenomenon of noise amplification bears some similarity to stochastic resonance phenomena [33], although here the noise originates from network activity, and is therefore “internal” to the system considered as a whole.
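The analogy with stochastic resonance can be made explicit with the classic threshold-detector toy associated with that literature [33] (parameters here are illustrative): a subthreshold periodic signal plus noise is passed through a hard threshold, and the correlation between signal and detector output peaks at an intermediate noise level.

```python
import numpy as np

rng = np.random.default_rng(4)

# Classic stochastic-resonance toy: a subthreshold periodic signal plus
# noise is passed through a hard threshold; the signal/output correlation
# peaks at an intermediate noise level. Parameters are illustrative.
n = 200_000
t = np.arange(n)
signal = 0.5 * np.sin(2.0 * np.pi * t / 200.0)   # amplitude 0.5 < threshold
theta = 1.0

def signal_output_corr(noise_sd, rng):
    x = signal + noise_sd * rng.standard_normal(n)
    out = (x > theta).astype(float)        # threshold detector output
    if out.std() == 0.0:                   # no crossings at all
        return 0.0
    return float(np.corrcoef(signal, out)[0, 1])

corrs = {sd: signal_output_corr(sd, rng) for sd in (0.05, 0.5, 5.0)}
for sd, c in corrs.items():
    print(f"noise SD {sd:4.2f} -> signal/output correlation {c:.3f}")
```

Too little noise yields no threshold crossings at all, and too much noise makes the output nearly random; only intermediate noise lets the subthreshold signal shape the output, which is the resonance.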
At the network level (Figure 2), we have illustrated that AI states are seen in awake human subjects (Figure 2A) and in awake animals [12], as well as in computational models of sparsely connected spiking networks, where AI states occupy a large portion of the parameter space [26]. Remarkably, networks in AI states can also be more responsive to external inputs [25], similarly to the enhanced responsiveness seen at the cellular level. This enhanced responsiveness can also be seen by connecting multilayer networks (Figure 2C), where AI states can support the propagation of evoked activity across layers (Figure 2D). This also suggests that noise can be beneficial, where the “noise” here is the internal AI state exhibited by the system.
We also illustrated that noise does not need to be dynamical, but can also be structural (Figure 3). Here, networks made of neurons exhibiting cell-to-cell heterogeneity, for example in their intrinsic properties (size, excitability, resting level, etc.), can also be optimally responsive at non-zero levels of heterogeneity (Figure 3A). Interestingly, the levels of heterogeneity measured experimentally fall in the region of maximal predicted responsiveness (Figure 3B). This cellular-level heterogeneity can have consequences at large scales, as shown by large networks of heterogeneous mean-field models (Figure 3C) [31]. Here again, the presence of noise at the structural level can lead to enhanced responsiveness and facilitate information propagation.
Thus, the picture that emerges from these observations is that, whether one looks at single-cell dynamics, network dynamics, or network structure, models predict that these different sources of noise tend to enhance the response of the system to external inputs and to boost the propagation of information. This was proposed as a basis for explaining why the awake and conscious brain is systematically associated with asynchronous and irregular activity states [25]. One may go a step further and propose that brain connectivity and cellular diversity are tuned to produce AI states of maximal responsiveness. Indeed, it was found that the human brain presents such an enhanced responsiveness in the waking state, compared to sleep or anesthesia [34,35,36]. Simulations of large-scale networks based on the human connectome showed that, indeed, when the brain model is put in an asynchronous-irregular mode, it has an enhanced responsiveness compared to simulated slow waves [37,38].
Thus, the cerebral cortex, with its extremely high cellular diversity and heterogeneity combined with sparse and random connections [30], seems entirely consistent with what would be necessary to produce asynchronous-irregular states of optimal responsiveness. We have overviewed here how such states may provide enhanced responsiveness to external inputs, which is consistent with a role in sensory perception. However, to investigate whether they are implicated in sensory awareness, or even consciousness, additional properties seem necessary. It was proposed that AI states can have the property that information about an external input is immediately available to the whole network [25]. Such properties may provide interesting directions for exploring why AI states are so systematically seen in the awake and active brain. Such complex activity states still have a lot to reveal, and understanding their properties and underlying mechanisms will require experiments at high spatial and temporal resolution, to characterize how AI states detect, propagate and communicate information across large scales and different brain areas. It will also require theoretical models to understand how information is represented in AI states, and how the interplay of the different scales organizes the underlying neuronal computations.

Funding

Research supported by the CNRS and the European Union (Human Brain Project H2020-785907, H2020-945539).

Acknowledgments

I thank Thierry Bal, Frederic Chavane, Nima Dehghani, Matteo di Volo, Sami El Boustani, Nicolas Hô, Denis Paré and Yann Zerlaut for support and collaboration.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Baranyi, A.; Szente, M.B.; Woody, C.D. Electrophysiological characterization of different types of neurons recorded in vivo in the motor cortex of the cat. II. Membrane parameters, action potentials, current-induced voltage responses and electrotonic structures. J. Neurophysiol. 1993, 69, 1865–1879.
  2. Matsumura, M.; Cope, T.; Fetz, E.E. Sustained excitatory synaptic input to motor cortex neurons in awake animals revealed by intracellular recording of membrane potentials. Exp. Brain Res. 1988, 70, 463–469.
  3. Steriade, M.; Timofeev, I.; Grenier, F. Natural waking and sleep states: A view from inside neocortical neurons. J. Neurophysiol. 2001, 85, 1969–1985.
  4. Barrett, J.N. Motoneuron dendrites: Role in synaptic integration. Fed. Proc. 1975, 34, 1398–1407.
  5. Bernander, O.; Douglas, R.J.; Martin, K.A.; Koch, C. Synaptic background activity influences spatiotemporal integration in single pyramidal cells. Proc. Natl. Acad. Sci. USA 1991, 88, 11569–11573.
  6. Destexhe, A.; Paré, D. Impact of network activity on the integrative properties of neocortical pyramidal neurons in vivo. J. Neurophysiol. 1999, 81, 1531–1547.
  7. Holmes, W.R.; Woody, C.D. Effects of uniform and non-uniform synaptic ‘activation-distributions’ on the cable properties of modeled cortical pyramidal neurons. Brain Res. 1989, 505, 12–22.
  8. Rudolph, M.; Destexhe, A. Tuning neocortical pyramidal neurons between integrators and coincidence detectors. J. Comput. Neurosci. 2003, 14, 239–251.
  9. Destexhe, A.; Rudolph, M.; Paré, D. The high-conductance state of neocortical neurons in vivo. Nat. Rev. Neurosci. 2003, 4, 739–751.
  10. Paré, D.; Shink, E.; Gaudreau, H.; Destexhe, A.; Lang, E.J. Impact of spontaneous synaptic activity on the resting properties of cat neocortical pyramidal neurons in vivo. J. Neurophysiol. 1998, 79, 1450–1460.
  11. Buzsáki, G. Large-scale recording of neuronal ensembles. Nat. Neurosci. 2004, 7, 446–451.
  12. Dehghani, N.; Peyrache, A.; Telenczuk, B.; Le Van Quyen, M.; Halgren, E.; Cash, S.S.; Hatsopoulos, N.G.; Destexhe, A. Dynamic balance of excitation and inhibition in human and monkey neocortex. Sci. Rep. 2016, 6, 23176.
  13. Lin, L.; Chen, G.; Xie, K.; Zaia, K.A.; Zhang, S.; Tsien, J.Z. Large-scale neural ensemble recording in the brains of freely behaving mice. J. Neurosci. Methods 2006, 155, 28–38.
  14. Hô, N.; Destexhe, A. Synaptic background activity enhances the responsiveness of neocortical pyramidal neurons. J. Neurophysiol. 2000, 84, 1488–1496.
  15. Zohary, E.; Shadlen, M.N.; Newsome, W.T. Correlated neuronal discharge rate and its implications for psychophysical performance. Nature 1994, 370, 140–143.
  16. Rudolph, M.; Pospischil, M.; Timofeev, I.; Destexhe, A. Inhibition determines membrane potential dynamics and controls action potential generation in awake and sleeping cat cortex. J. Neurosci. 2007, 27, 5280–5290.
  17. Reig, R.; Zerlaut, Y.; Vergara, R.; Destexhe, A.; Sanchez-Vives, M.V. Gain modulation of synaptic inputs by network state in auditory cortex in vivo. J. Neurosci. 2015, 35, 2689–2702.
  18. Waters, J.; Helmchen, F. Background synaptic activity is sparse in neocortex. J. Neurosci. 2006, 26, 8267–8277.
  19. Chance, F.S.; Abbott, L.F.; Reyes, A.D. Gain modulation from background synaptic input. Neuron 2002, 35, 773–782.
  20. Shu, Y.; Hasenstaub, A.; Badoual, M.; Bal, T.; McCormick, D.A. Barrages of synaptic activity control the gain and sensitivity of cortical neurons. J. Neurosci. 2003, 23, 10388–10401.
  21. Wolfart, J.; Debay, D.; Le Masson, G.; Destexhe, A.; Bal, T. Synaptic background activity controls spike transfer from thalamus to cortex. Nat. Neurosci. 2005, 8, 1760–1767.
  22. Kuhn, A.; Aertsen, A.; Rotter, S. Neuronal integration of synaptic input in the fluctuation-driven regime. J. Neurosci. 2004, 24, 2345–2356.
  23. Zerlaut, Y.; Teleńczuk, B.; Deleuze, C.; Bal, T.; Ouanounou, G.; Destexhe, A. Heterogeneous firing rate response of mouse layer V pyramidal neurons in the fluctuation-driven regime. J. Physiol. 2016, 594, 3791–3808.
  24. Zerlaut, Y.; Chemla, S.; Chavane, F.; Destexhe, A. Modeling mesoscopic cortical dynamics using a mean-field model of conductance-based networks of adaptive exponential integrate-and-fire neurons. J. Comput. Neurosci. 2018, 44, 45–61.
  25. Zerlaut, Y.; Destexhe, A. Enhanced responsiveness and low-level awareness in stochastic network states. Neuron 2017, 94, 1002–1009.
  26. Brunel, N. Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. J. Comput. Neurosci. 2000, 8, 183–208.
  27. van Vreeswijk, C.; Sompolinsky, H. Chaos in neuronal networks with balanced excitatory and inhibitory activity. Science 1996, 274, 1724–1726.
  28. Balog, I.; Uzelac, K. Quenched disorder: Demixing thermal and disorder fluctuations. Phys. Rev. E 2012, 86, 061124.
  29. Radzihovsky, L. Introduction to Quenched Disorder; University of Colorado: Boulder, CO, USA, 2015.
  30. Braitenberg, V.; Schüz, A. Cortex: Statistics and Geometry of Neuronal Connectivity; Springer: Berlin/Heidelberg, Germany, 1998.
  31. Di Volo, M.; Destexhe, A. Optimal responsiveness and information flow in networks of heterogeneous neurons. Sci. Rep. 2021, 11, 17611.
  32. Mejias, J.; Longtin, A. Optimal heterogeneity for coding in spiking neural networks. Phys. Rev. Lett. 2012, 108, 228102.
  33. Wiesenfeld, K.; Moss, F. Stochastic resonance and the benefits of noise: From ice ages to crayfish and SQUIDs. Nature 1995, 373, 33–36.
  34. Casali, A.G.; Gosseries, O.; Rosanova, M.; Boly, M.; Sarasso, S.; Casali, K.R.; Casarotto, S.; Bruno, M.A.; Laureys, S.; Tononi, G.; et al. A theoretically based index of consciousness independent of sensory processing and behavior. Sci. Transl. Med. 2013, 5, 198ra105.
  35. Massimini, M.; Ferrarelli, F.; Huber, R.; Esser, S.K.; Singh, H.; Tononi, G. Breakdown of cortical effective connectivity during sleep. Science 2005, 309, 2228–2232.
  36. Sarasso, S.; Boly, M.; Napolitani, M.; Gosseries, O.; Charland-Verville, V.; Casarotto, S.; Rosanova, M.; Casali, A.G.; Brichant, J.F.; Boveroux, P.; et al. Consciousness and complexity during unresponsiveness induced by propofol, xenon, and ketamine. Curr. Biol. 2015, 25, 3099–3105.
  37. Goldman, J.S.; Kusch, L.; Yalcinkaya, B.H.; Depannemaecker, D.; Nghiem, T.A.E.; Jirsa, V.; Destexhe, A. Brain-scale emergence of slow-wave synchrony and highly responsive asynchronous states based on biologically realistic population models simulated in The Virtual Brain. bioRxiv 2020.
  38. Goldman, J.S.; Kusch, L.; Yalcinkaya, B.H.; Nghiem, T.A.E.; Jirsa, V.; Destexhe, A. A comprehensive neural simulation of slow-wave sleep and highly responsive wakefulness dynamics. bioRxiv 2021.
Figure 1. Synaptic noise in single neurons and enhanced responsiveness. (A) First measurements of synaptic noise in neurons. Neurons were recorded intracellularly in vivo, before and after microdialysis of tetrodotoxin (TTX). (B) Comparison of the membrane properties before and after TTX. Modified from [6,10]. (C) Measurements of mean voltage, voltage fluctuations (σV) and relative change of membrane input resistance (Rin). Modified from [6]. (D) Model of synaptic noise showing the probabilistic aspect of the spike response (two stimulus amplitudes shown, 40 trials each). (E) Effect of synaptic noise on the transfer function of the neuron. (D,E) modified from [14].
Figure 2. Asynchronous and irregular activity in cortical networks better propagate information. (A) Asynchronous-Irregular (AI) activity in an awake human subject recorded with a multi-electrode array. Excitatory (RS, blue) and inhibitory (FS, red) cells are shown, together with their average rate at the bottom. Modified from [12]. (B) AI states generated by AdEx networks, with conductance-based synapses. The bottom trace shows the total excitatory (green) and inhibitory (red) conductances in three example RS cells (superimposed traces). Modified from [24]. (C) Scheme of a multilayer arrangement of AdEx networks with excitatory inter-connections (dotted lines) and receiving input (top). (D) Input propagation across layers in AI states (blue curves), compared to non-propagation in quiescent states (brown). Modified from [25].
Figure 3. Heterogeneous networks can be highly responsive. (A) AdEx networks with different levels of heterogeneity, submitted to the same external input. A moderate level of heterogeneity presents the maximal response (middle). (B) Responsiveness in the plane of excitatory and inhibitory neuron heterogeneity. The maximum responsiveness (warm colors) occurs for intermediate levels of heterogeneity, and corresponds to the level of heterogeneity measured experimentally in cerebral cortex (white symbols). (C) Large-scale networks of heterogeneous mean-field units. When the units were based on heterogeneous networks (bottom), the propagating response was maximal. Modified from [31].

Share and Cite

MDPI and ACS Style

Destexhe, A. Noise Enhancement of Neural Information Processing. Entropy 2022, 24, 1837. https://doi.org/10.3390/e24121837

