Article

Information-Theoretical Analysis of the Cycle of Creation of Knowledge and Meaning in Brains under Multiple Cognitive Modalities

by Joshua J. J. Davis 1, Florian Schübeler 2 and Robert Kozma 3,4,5,*

1 Dodd-Walls Centre for Photonics and Quantum Technologies, Department of Physics & Ian Kirk’s Lab., Centre for Brain Research, The University of Auckland, Auckland 1142, New Zealand
2 The Embassy of Peace, Whitianga, Coromandel 3591, New Zealand
3 Department of Mathematics, University of Memphis, Memphis, TN 38152, USA
4 School of Informatics, Obuda University, H-1034 Budapest, Hungary
5 Kozmos Research Laboratories, Boston, MA 02215, USA
* Author to whom correspondence should be addressed.
Sensors 2024, 24(5), 1605; https://doi.org/10.3390/s24051605
Submission received: 25 January 2024 / Revised: 21 February 2024 / Accepted: 27 February 2024 / Published: 29 February 2024

Abstract:
It is of great interest to develop advanced sensory technologies allowing non-invasive monitoring of neural correlates of cognitive processing in people performing everyday tasks. Considerable progress has been reported in recent years in this research area using scalp EEG arrays, but the high level of noise in the electrode signals poses significant challenges. This study presents the results of a detailed statistical analysis of experimental data on the cycle of creation of knowledge and meaning in human brains under multiple cognitive modalities. We measure brain dynamics using a HydroCel Geodesic Sensor Net, a 128-electrode dense-array electroencephalography (EEG) system. We compute a pragmatic information (PI) index derived from the analytic amplitude and phase obtained by Hilbert transforming the EEG signals of 20 participants in six modalities, which combine various audiovisual stimuli leading to different mental states, including relaxed and cognitively engaged conditions. We derive several relevant measures to classify different brain states based on the PI indices. We demonstrate significant differences between engaged brain states, which require sensory information processing to create meaning and knowledge for intentional action, and relaxed-meditative brain states with less demand on psychophysiological resources. We also point out that different kinds of meanings may lead to different brain dynamics and behavioral responses.

1. Introduction

Intensive research has been conducted in recent years to develop advanced sensory technologies allowing non-invasive monitoring and classification of neural correlates of cognitive processing in human brains via EEG, for example, with the aid of convolutional neural networks [1]. Some studies have explored neural correlates of learning and working-memory tasks, for example, using EEG to compare several classifiers [2]. As indicated by [3], EEG technologies have notable advantages and applications in Developmental Cognitive Neuroscience, while new trends in quantitative analysis are emerging across a broad spectrum of applications, including medical ones [4]. One important area of research is the study of brain dynamics in relation to motor tasks and rehabilitation [5].
Significant progress has been made using scalp EEG arrays for cognitive state identification of people performing everyday tasks, including sport [6] and meditation [7], and for mental state detection in pilots for accident prevention [8]. Studies have also tested participants with closed vs. open eyes in visual and auditory recognition tasks, as well as motor tasks [9,10], including emotion recognition in human–computer interaction [11]. Several challenges in different areas, including safety and health, are reviewed in [12], for example, via measuring physiological signals such as EEG. Other authors have explored the challenges of brain–computer interfaces for emotion recognition [13,14] using machine learning and other models [15], for example, in planning and rational decision making [16], which requires robust classification methods for EEG signals in emotion recognition, as discussed in [17]. Another relevant field of study concerns the characterization of mental stress and fatigue from various physiological signals, as described in studies like [18], where electrodermal activity is measured to classify calm vs. distress conditions. This has also been discussed with the use of EEG in cooperative game theory and fatigue [19], as well as mental stress and its influence on neural coordination [20].
Intentional action and values-based decision making necessitate meaningful mental frameworks [21,22]. Therefore, in considering meaning and values, we must take into consideration the interaction of the brain and the mind in creating semantic, visual, aesthetic, scientific, musical, and spiritual meanings, for example. When we value someone or something, this person or thing becomes meaningful to us. It is of great importance to study the neural processes underlying the creation of meaning in the human cortex.
Intentional neurodynamics is a key component of the emergence of meaning in human brains, as described by Freeman [23]. An important aspect of intentionality is the unity of brain, mind, and body, which has been described in the early works of Thomas Aquinas [24]. The traditions of Aquinas were followed by Brentano [25], describing the creation of meaning in spiritual experiences; see also [26,27]. Considering the unity of brain, mind, and body, the energy consumption of the brain during the creation of meaning becomes an important research area [28,29]. Freeman’s pioneering experimental studies on spatio-temporal oscillations in brains produced several key findings, including the identification of sequences of metastable amplitude-modulated activity patterns (wave packets), which are associated with the meaning of the sensory experiences and produce consecutive meaningful decisions and actions [30,31,32]. The formation of the sequence of such brain activity patterns impacts the energy utilization of the brain, which in turn provides a potential tool to monitor cognitive processing.
Modern brain imaging techniques allow monitoring the role of various brain areas during cognitive processing. Depending on the nature of the cognitive activities, the measured patterns of brain oscillations change. For example, markedly different patterns are expected during relaxed states as compared with more active states, such as solving mathematical tasks or identifying visual cues. More engaged cognitive states likely require more metabolic energy than relaxed ones. From our perspective, cognitive activities that use various sensory stimuli and lead to intentional actions are especially important, as they are related to the meaning of the stimuli in the context of the actual physical and mental state of the individual.
It is important to extend the results of previous studies, which used invasive electrocorticogram (ECoG) arrays in animal brains [33], to the domain of non-invasive scalp monitoring in humans. Success in these efforts would produce crucial progress in human brain studies by providing powerful tools to quantify the creation of meaning and knowledge in human brains using statistical and information-theoretical indices. The challenges are daunting, as the extraction of meaning from scalp EEG signals appears to be intractable due, for example, to the drastic deterioration of the brain's electric signals after passing through the skull, amongst other difficulties. Nonetheless, a breakthrough in this problem is possible, based on advanced signal processing and information-theoretical analysis [34,35,36].
In this paper, we introduce the action–perception cycle, by which a stimulus becomes meaningful and is selected by the subject's cerebral cortex, which creates the network structures and dynamics required for decision making, intentional action, and behavior [23,24]. We expand the preliminary analysis of EEG measurements reported in [37]. We employ the Hilbert transform to obtain the analytic amplitude (AA), analytic phase (AP), and instantaneous frequency (IF) of the highly nonlinear, nonstationary EEG signals, and derive pragmatic information (PI) indices, which were originally introduced for lasers [38,39] and later extended to brain dynamics [33,34,40,41]. Next, we present EEG experiments with human participants in different modalities, collected in Ian J. Kirk's Lab, Centre for Brain Research at The University of Auckland in New Zealand [42,43]. Previously, we reported results of EEG measurements with two modalities, meditation and video watching [36], which are now extended to six modalities [37], including meditation, scrambled words, video watching, ambiguous images, math mind, and sentences, for a comprehensive and robust analysis. We apply mathematical and statistical analysis using pragmatic information indices to evaluate the EEG experimental data. The obtained results demonstrate that the behavior of these indices correlates with the formation of meaning in neural structures and energy consumption levels in the brain. Finally, we outline perspectives and directions for future research.

2. Materials and Methods

2.1. Data Acquisition

Experiments were conducted using Electrical Geodesics Inc. (EGI, Eugene, OR, USA) 128-channel scalp EEG electrode arrays [44], following the basic methodology for experimental design and data acquisition described in [36], with 20 participants. Expanding on the previous results, which used only 2 modalities (meditation and video watching), here we describe the outcomes corresponding to 6 modalities:
Meditation (MED): Participants were requested to conduct meditation of their choice for 7 min. In the case of participants with little or no meditation experience, they were asked to relax with their eyes closed for 7 min.
Scrambled Words (WORDS): Participants were presented twenty (20) scrambled words that pertained to one (1) of the two (2) categories, either a value, such as Love or Truth, for example, or an object from nature, such as Sand or Pebble. The participants were requested to try to identify the original word (unscramble it). Once they had established the category of the word, they were asked to press number 1 on the numerical keypad when they identified the word describing a “Value” or to press number 2 on the numerical keypad when the word was describing an “Object”.
Ambiguous Images (IMG): We presented twelve (12) images with ambiguous content to the participants and asked them to specify how many individual images they could find by pressing a number between 1 and 9 on the numerical keypad to identify the number of images found; see Figure 1 with two (2) different images in it, for an example.
Math Mind (MM): For this modality, we recorded a pleasant female voice presenting twenty-eight (28) simple arithmetic operations that the participants were asked to resolve in their mind by applying some rules, until they reached a solution with a single digit between one (1) and nine (9) for each arithmetic operation. For example, the participant would hear "7 times 7". This equates to 49, a two-digit result. The participant would then add these two digits together (i.e., 4 + 9 = 13) and repeat this step until reaching a single-digit result; in this case, the answer would be "4", after the final computation of "1 + 3" was performed (see the short sketch after the modality descriptions).
Sentences (SENT): Twenty (20) sentences were recorded with the same female voice as for the MM modality. Each sentence was either a positive statement, such as "You are caring and kind", or a gibberish sentence, such as "briggy tublish tuchty". The participants were asked to categorize the sentence they heard as meaningless, meaningful, or meaningful and pleasant, and to press the corresponding number 1, 2, or 3 on the numerical keypad.
Video (VDO): In this sensory modality, ambiguous images were displayed sequentially (see example as illustrated in Figure 1), while the song “Imagine” by John Lennon was played in the background. Like meditation, this was a passive task that required no active response from the participants.
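For illustration of the MM reduction rule, the following minimal Python sketch (with the hypothetical helper name mm_digit_reduction; the actual set of arithmetic operations presented to participants may have included more than multiplication) reproduces the "7 times 7" example described above:

```python
def mm_digit_reduction(a: int, b: int) -> int:
    """Repeatedly sum the digits of a*b until a single digit (1-9) remains.

    Hypothetical helper illustrating the reduction rule described for the
    Math Mind (MM) modality; the exact rules given to participants may differ.
    """
    result = a * b
    while result > 9:
        result = sum(int(d) for d in str(result))
    return result

# Example from the text: "7 times 7" -> 49 -> 4 + 9 = 13 -> 1 + 3 = 4
assert mm_digit_reduction(7, 7) == 4
```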
The experiments were conducted inside a Faraday chamber, while the participants were sitting comfortably on a chair, in front of a computer with a keyboard. To minimize artifacts while answering the experimental questions via the space bar and the numerical keypad, participants were asked to attenuate eye, head, and body movements. Three (3) checks were conducted by the experimenters as follows:
  • An impedance check prior to the beginning of the experiment.
  • A second impedance check after the first two modalities, MED and WORDS (block 1).
  • A third and final impedance check after modalities IMG and MM (block 2).
  • The experiment concluded with the modalities SENT and VDO (block 3).
We measured twenty (20) healthy participants, eleven (11) males and nine (9) females, between twenty-three (23) and sixty-four (64) years of age. Amongst the participants were eleven (11) meditators varying in expertise and nine (9) non-meditators. The present work excludes the analysis of the significance of meditation experience among the participants; for meditation-related results, see [36]. Here we group meditation and relaxation in the modality MED as described above.

2.2. Preprocessing

Data were collected at a sampling frequency of 1000 Hz, using the 128-channel EEG array. The vector of 128 data points at a given time was converted into a 12 × 12 square matrix, corresponding to cortical areas. In order to fill in the open positions in the matrix, some of the data channels were duplicated. Two locations on the prefrontal cortex, A(12,3) and A(12,10), were used as reference positions and left open; see [36] for experimental details.
We used standard preprocessing techniques, including a notch filter at a frequency of 50 Hz and a detrending filter to reduce movement artifacts [34]. These approaches produced data over the frequency band of 2 Hz to 48 Hz, which served as a starting point for further detailed analysis.
Finally, we mention that a 22 ms adjustment was introduced to align the EEG recordings with the time instances of the recorded events and actions, e.g., a participant pressing a key. The 22 ms adjustment was required in part due to the delay of the anti-aliasing filter (8 ms) and in part due to the screen refresh rate (14 ms).
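A minimal Python sketch of such a preprocessing pipeline is given below, using SciPy filters for illustration; the explicit 2–48 Hz band-pass step, the filter types, and their orders are assumptions, since the text only states a 50 Hz notch filter and a detrending step yielding the 2–48 Hz band:

```python
import numpy as np
from scipy import signal

FS = 1000  # sampling frequency in Hz, as reported in the text

def preprocess(eeg, fs=FS):
    """Minimal preprocessing sketch: 50 Hz notch, detrend, 2-48 Hz band-pass.

    `eeg` is an (n_channels, n_samples) array. The band-pass step and the
    filter orders are assumptions made for this illustration.
    """
    # Notch filter at the 50 Hz mains frequency
    b_notch, a_notch = signal.iirnotch(w0=50.0, Q=30.0, fs=fs)
    out = signal.filtfilt(b_notch, a_notch, eeg, axis=-1)

    # Remove slow linear drifts (movement artifacts)
    out = signal.detrend(out, axis=-1, type="linear")

    # Band-pass to the 2-48 Hz range used for the subsequent analysis
    sos = signal.butter(4, [2.0, 48.0], btype="bandpass", fs=fs, output="sos")
    return signal.sosfiltfilt(sos, out, axis=-1)
```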

2.3. Hilbert Analysis

In this section, we briefly summarize the signal-processing methodology using the Hilbert transform, which allows studying the characteristics of rapidly changing signals.
The Hilbert transform methodology has the advantage that it allows for targeting significantly nonstationary signals with possibly nonlinear characteristics in the time domain. Brain signals exhibit strongly nonstationary and nonlinear properties, and Hilbert transform-based methods have been successfully applied to analyze spatio-temporal brain dynamics in past decades [40]. The Hilbert transform relies significantly on band-pass filtering of the measured data, and the proper design of the filters is an essential component of the approach. Clearly, Hilbert analysis ought to be complemented with additional approaches and information-theoretic measures based on Fourier analysis and spectral densities, e.g., [36].
By applying the Hilbert transform, each EEG electrode signal s(t) is transformed to S(t) as follows:
$S(t) = s(t) + i\, s(t)^{*}$

$S(t) = AA(t)\, e^{i\, AP(t)}$

where $s(t)^{*} = \frac{1}{\pi}\,\mathrm{p.v.}\!\int_{-\infty}^{+\infty}\frac{s(t')}{t - t'}\,dt'$, and p.v. is the Cauchy principal value.
Here, AA(t), AP(t), and IF(t) stand for the analytic amplitude, analytic phase, and instantaneous frequency, respectively. They are defined as:
$AA(t) = \sqrt{s(t)^2 + \left(s(t)^{*}\right)^2}; \qquad AP(t) = \mathrm{atan}\!\left(\frac{s(t)^{*}}{s(t)}\right)$

$IF(t) = \frac{1}{2\pi}\,\frac{\partial AP(t)}{\partial t} \approx \frac{1}{2\pi}\,\frac{AP(t) - AP(t - \Delta t)}{\Delta t}$
Figure 2 illustrates the applied Hilbert transform-based signal-processing algorithm [40,45]. As an example, let us define s(t) = sin(ωt); then s(t)* = sin(ωt + π/2) for ω < 0 and s(t)* = sin(ωt − π/2) for ω > 0, and we can visualize AA(t) and AP(t) as shown in Figure 2b,c. We used the MATLAB "Hilbert" function to compute the imaginary part s(t)* from the real-valued signal s(t). In Figure 2d–g, we show an example for one of the 144 channels, channel 2, measured via the EEG net.
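For readers who wish to reproduce this step, a minimal Python sketch is shown below; the original study used MATLAB's "Hilbert" function, and the SciPy call, the phase unwrapping, and the first-difference approximation of IF(t) are choices made here for illustration:

```python
import numpy as np
from scipy.signal import hilbert

def hilbert_features(x, fs=1000):
    """Compute analytic amplitude AA(t), analytic phase AP(t), and
    instantaneous frequency IF(t) for each channel of a band-passed
    EEG array `x` with shape (n_channels, n_samples).

    Illustrative sketch mirroring the AA/AP/IF definitions above; the
    original analysis used MATLAB rather than SciPy.
    """
    analytic = hilbert(x, axis=-1)                # S(t) = s(t) + i s(t)*
    aa = np.abs(analytic)                          # AA(t)
    ap = np.unwrap(np.angle(analytic), axis=-1)    # AP(t), unwrapped over time
    # IF(t) = (1 / 2*pi) * dAP/dt, approximated by a first difference
    inst_freq = np.diff(ap, axis=-1) * fs / (2.0 * np.pi)
    return aa, ap, inst_freq

# Sanity check with s(t) = sin(2*pi*10*t): AA is ~1 and IF is ~10 Hz
t = np.arange(0, 1, 1 / 1000)
aa, ap, inst_freq = hilbert_features(np.sin(2 * np.pi * 10 * t)[None, :])
```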

2.4. Computation of the Pragmatic Information Index

Pragmatic information (PI) was introduced originally to describe collective oscillations in laser systems [38]. It is a fundamental extension of the Shannon information, and it has been applied to brain dynamics [39]. Freeman observed that pragmatic information is applicable to describe the sequences of brain amplitude modulation patterns, as order parameters [45]. Pragmatic information, as applied to brain oscillatory patterns, can serve as a biomarker of intentionality and the creation of meaning, and it can be interpreted in terms of many-body field dynamics [41]. In practical terms, the pragmatic information index is given as the ratio of the dissipation of free energy and the rate of change in the indicated order parameter [45]. Let He(t) denote the pragmatic information index, which is expressed as follows:
$H_e(t) = \left\langle AA^2(t) \right\rangle / D_e(t)$
Here, De(t) is the distance between consecutive AA(t) patterns, while 〈 〉 is the ensemble average across space. Instead of using the distance between AA(t) patterns to derive He(t), we can use AP(t) to arrive at an alternative version of the pragmatic information index.
In the case of the EEG experiments, the AA(t) vector has 128 components, and the distance between consecutive patterns can be simply calculated using the Euclidean measure as follows:
$D_e(t) = \left\| AA^2(t) - AA^2(t - 1) \right\|^2$
In an alternative definition of De(t), the distance between consecutive analytic phase patterns AP(t) can be employed, as it is described in Equations (7)–(11). Here, the time index runs through t = 1, 2, …, t*, …, T, while the index for the spatial location of the electrodes will be i = 1, 2, …, 128:

$\Delta AP(t^*)_i = AP(t^*)_i - AP(t^*)_{i-1}, \quad i = 2, \ldots, N_{ch}, \ \ \text{for } t = t^*$

$\Delta AP(t^*) = \left[ \Delta AP(t^*)_2,\ \Delta AP(t^*)_3,\ \ldots,\ \Delta AP(t^*)_{N_{ch}} \right]$

$D_e(t^*) = \left\| \Delta AP(t^*) \right\|^2, \quad i = 2, \ldots, N_{ch}$

$D_e(t^*) = \sum_{i=2}^{N_{ch}} \Delta AP(t^*)_i^{\,2}$

$D_e(t) = \left[ D_e(1),\, D_e(2),\, \ldots,\, D_e(t^*),\, \ldots,\, D_e(T) \right], \quad t = 1, \ldots, T$
where i is a particular channel, t* is a particular point in time, Nch is the total number of channels (electrodes), and T is the time length of the signal or last temporal point.
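The following minimal Python sketch illustrates one possible implementation of the two PI variants described above (amplitude-based and phase-based De(t)); the small regularization constant and the array layout are assumptions not stated in the text:

```python
import numpy as np

def pragmatic_information(aa, ap=None, eps=1e-12):
    """Sketch of the pragmatic information index He(t) = <AA^2(t)> / De(t).

    `aa` and `ap` are (n_channels, n_samples) arrays of analytic amplitude
    and analytic phase. If `ap` is None, De(t) is the squared Euclidean
    distance between consecutive AA^2 patterns (amplitude-based variant);
    otherwise De(t) is the sum of squared differences between adjacent
    channels of AP(t) at each time point (phase-based variant). The small
    constant `eps` avoiding division by zero is an assumption.
    """
    mean_power = np.mean(aa ** 2, axis=0)          # <AA^2(t)> over channels
    if ap is None:
        # Amplitude-based distance between consecutive spatial patterns
        diff = aa[:, 1:] ** 2 - aa[:, :-1] ** 2
        de = np.sum(diff ** 2, axis=0)
        he = mean_power[1:] / (de + eps)
    else:
        # Phase-based distance across adjacent channels at each time point
        de = np.sum(np.diff(ap, axis=0) ** 2, axis=0)
        he = mean_power / (de + eps)
    return he
```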
The concept of pragmatic information has been applied to general complex systems [46]. Pragmatic information has a unique potential to characterize meaning and provide a measure to describe brain dynamics [47]. This is especially remarkable as the very foundation of pragmatic information goes back to the work by von Weizsäcker, making a distinction between the Shannonian syntactic information, and the general semantic and pragmatic information [48,49].
Using the pragmatic information index to describe brain dynamics based on measuring EEG signals, we may gain insight into the electrical and metabolic processes underlying the creation of meaning in brains. This view is supported by results obtained by ECoG experiments with rabbits trained by the classical conditioning paradigm [21,23]. Detailed re-evaluation of the experiments conducted in the 1990s showed that the pragmatic information index exhibited distinct peaks following an approximately 1 s post-stimulus period. This is illustrated in Figure 3, previously published in [37], where the stimulus happens at time instant 3 s.
During the 1 s post-stimulus period, the following distinct cognitive processing steps are indicated: Awe, Chaotic Exploration, Aha (also called the Eureka moment of sudden insight), Chaotic Exploration, and returning to Background Activity. These steps are hypothesized as parts of the cycle of the creation of knowledge and meaning (CCKM) [50].
In the next section, we present our findings and analysis of human brain dynamics, which show greater complexity than animal brain dynamics. In humans, meaning creation has several levels, which manifest more complexity than the case of perceiving a single salient stimulus in a time window of 1 s, which has been illustrated based on the rabbit ECoG studies outlined in Figure 3.

3. Results

3.1. Overview of the Multimodal Experiments

This section provides a comprehensive description of the experimental results obtained with six modalities, using the 128-electrode EEG array placed over the scalp of the participants. Details of the measured EEG signals, their analytic amplitudes and phases, as well as the derived pragmatic information, are given.
Figure 4 illustrates the analytic amplitude (AA(t)), signal amplitude (SA(t)), and analytic frequency (IF(t)) over a 3.5 s observation period, for the Theta frequency band, for Participant 7. Specifically, Figure 4a,b display the AA(t) and SA(t) signals for all 128 electrodes, while Figure 4c,d show the average 〈AA(t)〉 and 〈SA(t)〉 signals calculated across the EEG array, together with the shaded upper and lower limits, respectively. Over the ~2000 ms to 2300 ms time window, we observe significant peaks in AA(t) and 〈AA(t)〉, indicating that the power of the signal, and therefore its energy, increased in that time window.
Figure 4e displays the analytic frequency IF(t) for the 128 EEG electrodes. The curves show many spikes, which indicate rapid changes in the analytic frequency at various time instances. For our later discussion, it is important to point out that the IF(t) values are consistently low during the ~2000 ms to 2300 ms time interval, showing that the signals maintain a rather constant phase during this period, leading to only small variations in the analytic frequency. This means that the signals are highly synchronized during this time window, exactly where AA(t) has a significant peak, indicating that the power of the signal, and therefore its energy, increased at that time of high synchronization. This observation has important implications for the behavior of the pragmatic information index, as elaborated later in Figure 5.
Figure 5 provides a graphical illustration of the evaluation of the pragmatic information index, He(t), as described above. In order to characterize where and when knowledge and meanings are created to conduct an intentional action, the pragmatic information index, He(t), was evaluated during the 3.5 s response period depicted in Figure 5. We can appreciate the significant changes in the pragmatic information, He(t), as a result of computing the ratio ⟨AA²(t)⟩/De(t), where De(t) and AA²(t) show relevant features shaping the He(t) dynamics. We display two versions of PI, namely He(t)1 and He(t)2, where De(t) is based on amplitude and phase, respectively [33].
We produced similar plots for all participants, for all modalities, for all stimuli, in every frequency band of interest, as shown in Table 1. From this first qualitative analysis and based on Freeman’s findings [45], we expect that the most relevant frequency bands with potential impact on the creation of knowledge and meaning are High Gamma (35–48 Hz), and Alpha (8–12 Hz), which is proposed to be acting as a gating band. This is in line with the proposal of Freeman and Kozma and others, suggesting a Gamma–Alpha and Gamma–Theta link [51,52,53,54].
Figure 6 illustrates the EEG in subplots (a,b), and an example of the distribution of the pragmatic information index across various brain areas during a 3.5 s experimental period. The PI plot corresponds to participant 7 (P7), in the modality WORDS, on stimulus 9 (S9), in the Theta band. This example illustrates that the creation of knowledge and meaning is manifested in different combinations of brain areas and frequency bands. These computations of He(t) use a specific implementation of the PI evaluation, where the evaluation is performed in time windows of length Δt, as He(t − Δt).

3.2. Evaluation of Pragmatic Information Variables and Parameters

Next, we introduce a set of variables to characterize the dynamics of He(t) during the time period necessary to process a particular stimulus [37], which are given as follows:
  • NPS: the total number of peaks per unit time (s);
  • TBP: the time spent between peaks, describing the quiet periods;
  • TOP: the time describing the duration of peaks, measuring intensive periods;
  • QPT: the total quiet processing time;
  • IPT: the total intensive processing time.
The number of peaks will be described by the aggregate quantity $\gamma_{b,e}^{\,p,m,s}$, where the indices stand for stimulus (s), participant (p), modality (m), frequency band (b), and electrode or channel (e), and where p, m, b, e, and s can take values as shown in Table 2. Table 2 shows that there are different numbers of stimuli in different experiments, as specified by $N_s^m$ in Table 3.
Integrating over all channels, Nch = 128, we obtain the total number of peaks in an experiment with the following parameters: stimulus (s), participant (p), modality (m), and frequency band (b):
$\tilde{\gamma}_{b}^{\,p,m,s} = \sum_{e=1}^{N_{ch}=128} \gamma_{b,e}^{\,p,m,s}$
From $\tilde{\gamma}_{b}^{\,p,m,s}$, NPS can be derived as the number of peaks per second, for a given stimulus (s), participant (p), modality (m), and frequency band (b), based on the total duration:
$\dot{\gamma}_{b}^{\,p,m,s} = \tilde{\gamma}_{b}^{\,p,m,s} \,/\, t_{s}^{\,p,m}$
Considering all the combinations of frequency bands (b) and modalities (m), there are 36 possibilities; we can evaluate the mean values over all stimuli, $\bar{\dot{\gamma}}_{b}^{\,p,m}$, as follows:
$\bar{\dot{\gamma}}_{b}^{\,p,m} = \sum_{s=1}^{N_s^m} \dot{\gamma}_{b}^{\,p,m,s} \,/\, N_s^m$
From $\bar{\dot{\gamma}}_{b}^{\,p,m}$ we can derive (a) the number of peaks per second for participant p, in modality m, over all frequency bands, $\bar{\dot{\gamma}}^{\,p,m}$, and (b) the number of peaks per second for participant p, in frequency band b, over all modalities, $\bar{\dot{\gamma}}_{b}^{\,p}$, as follows:
$\bar{\dot{\gamma}}^{\,p,m} = \sum_{b=1}^{6} \bar{\dot{\gamma}}_{b}^{\,p,m} \,/\, 6$

$\bar{\dot{\gamma}}_{b}^{\,p} = \sum_{m=1}^{6} \bar{\dot{\gamma}}_{b}^{\,p,m} \,/\, 6$
Similarly, we compute confidence intervals based on the standard deviation $S_{\dot{\gamma}}^{\,p,m}$ for the values of $\bar{\dot{\gamma}}^{\,p,m}$, as well as confidence intervals based on the standard deviation $S_{\dot{\gamma}_b}^{\,p}$ for the values of $\bar{\dot{\gamma}}_{b}^{\,p}$, as follows:
$\bar{\dot{\gamma}}^{\,p,m} \pm t_{0.05,\,6}\; S_{\dot{\gamma}}^{\,p,m} / \sqrt{6}$

$\bar{\dot{\gamma}}_{b}^{\,p} \pm t_{0.05,\,6}\; S_{\dot{\gamma}_b}^{\,p} / \sqrt{6}$
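As an illustration of these aggregation steps, a minimal Python sketch (with the hypothetical helper name mean_nps_and_ci) is given below; the exact divisor and degrees of freedom used for the published confidence intervals are assumptions here, and a standard t-based interval is used instead:

```python
import numpy as np
from scipy import stats

def mean_nps_and_ci(nps_values, alpha=0.05):
    """Mean NPS and a confidence half-width across a set of values.

    `nps_values` is a 1-D array, e.g. the six per-band mean NPS values for a
    given participant and modality. A standard t-based interval is used;
    whether the original study divided by sqrt(n) or by n, and the exact
    degrees of freedom, are assumptions of this sketch.
    """
    x = np.asarray(nps_values, dtype=float)
    n = x.size
    mean = x.mean()
    s = x.std(ddof=1)
    t_crit = stats.t.ppf(1.0 - alpha / 2.0, df=n - 1)
    half_width = t_crit * s / np.sqrt(n)
    return mean, half_width

# Example: hypothetical per-band mean NPS values for one participant/modality
mean, hw = mean_nps_and_ci([1.2, 0.9, 1.4, 1.1, 1.6, 1.3])
```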
Similar formulas can be derived over brain areas, when the number of electrodes over the specific area is used. For example, the following expression is valid for the prefrontal region Fp:

$\tilde{\gamma}_{b,F_p}^{\,p,m,s} = \gamma_{b,1}^{\,p,m,s} + \gamma_{b,2}^{\,p,m,s} + \gamma_{b,8}^{\,p,m,s} + \cdots\,, \quad e \in F_p$
Considering all areas, we write:

$\tilde{\gamma}_{b,B_a}^{\,p,m,s} = \gamma_{b,n_1}^{\,p,m,s} + \gamma_{b,n_2}^{\,p,m,s} + \cdots + \gamma_{b,n_n}^{\,p,m,s}, \quad e = n_1, n_2, \ldots, n_n \in B_a$

where $B_a \in \{F_p, F, T, C, P, O, \mathrm{Facial}\}$.
Following the approach described above, formulas can be derived for the other statistical quantities described earlier, such as TBP, TOP, QPT, and IPT. When needed, the corresponding formulas will be provided in the section of analysis.
When characterizing He(t), the following specifications are used:
  • A threshold signal value is determined heuristically; a specific threshold is set at 0.1 in these studies.
  • Time of peak (TOP) for each significant peak is defined as the duration of the signal continuously above such threshold.
  • Time between peaks (TBP) is defined as the duration of the signal continuously below the threshold.
  • A minimum TBP is selected (≤11 ms in this study); two consecutive peaks separated by less than this value are joined and counted as one peak.
  • Isolated peaks that are too short, here shorter than 50 ms, are rejected.
We start by setting a threshold value for the pragmatic information index [29]. Most of the time PI has sub-threshold values, but at some moments it may cross to super-threshold values. Such crossing events are used as markers for the postulated emergence of meaning related to the specific sensory stimuli in the context of the participants’ physical and cognitive state, and their intentions and desired actions. These are precisely the moments that may be considered to consume the most energy. Details of the energy consumption during the cognitive cycle require further studies and are outside the scope of this work; see, e.g., [29].
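A minimal Python sketch of this peak bookkeeping is shown below; the exact boundary conventions, and the assumption that merging is applied before short-peak rejection, are choices made for illustration:

```python
import numpy as np

def pi_peak_stats(he, fs=1000, threshold=0.1, merge_gap_ms=11, min_peak_ms=50):
    """Sketch of the peak bookkeeping for the PI index He(t).

    Returns the number of peaks, their durations (TOP, in ms) and the gaps
    between them (TBP, in ms), after merging peaks separated by gaps of at
    most `merge_gap_ms` and rejecting peaks shorter than `min_peak_ms`.
    Boundary conventions (inclusive vs. exclusive) are assumptions.
    """
    above = np.asarray(he) > threshold
    # Find contiguous runs above threshold as (start, end) sample indices
    edges = np.diff(above.astype(int))
    starts = list(np.where(edges == 1)[0] + 1)
    ends = list(np.where(edges == -1)[0] + 1)
    if above[0]:
        starts.insert(0, 0)
    if above[-1]:
        ends.append(len(above))

    # Merge peaks whose gap (TBP) is at most merge_gap_ms
    merged = []
    for s, e in zip(starts, ends):
        if merged and (s - merged[-1][1]) * 1000 / fs <= merge_gap_ms:
            merged[-1] = (merged[-1][0], e)
        else:
            merged.append((s, e))

    # Reject isolated peaks shorter than min_peak_ms
    peaks = [(s, e) for s, e in merged if (e - s) * 1000 / fs >= min_peak_ms]

    top = [(e - s) * 1000 / fs for s, e in peaks]                        # ms
    tbp = [(s2 - e1) * 1000 / fs for (_, e1), (s2, _) in zip(peaks, peaks[1:])]
    return len(peaks), top, tbp
```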
Figure 7 shows an example with a PI threshold set at 0.1, for frequency bands (a) Alpha (b = 2) and (b) High Gamma (b = 6); these examples correspond to the fourth stimulus (s = 4) in the WORDS modality (m = 2). By visual inspection, we observe 10 peaks in the case of the Alpha band and 11 peaks for the H-Gamma band. Using the compact index notation, this is expressed as $\tilde{\gamma}_{b=2}^{\,p=1,m=2,s=4} = 10$ and $\tilde{\gamma}_{b=6}^{\,p=1,m=2,s=4} = 11$. Several peaks may be merged when counting their numbers if they are very close to each other, or rejected when they are individually very short. Details of the practical implementation are given in the next section.

3.3. Results Based on NPS

An important statistical index derived from the pragmatic information index is the number of peaks per second (NPS). In Figure 8a we display the mean NPS per modality and frequency band for participant P1. We observe that the mean NPS, $\bar{\dot{\gamma}}_{b}^{\,1,m}$, varies across modalities and frequency bands. For example, the modality SENT shows more peaks per second than the other modalities. In the case of the MED modality, there is a general tendency toward higher NPS over the Theta and Alpha bands, as compared to the rest of the bands. Visual inspection indicates remarkable differences between various participants, and also commonalities between participants across modalities and frequency bands. Detailed statistical analysis can reveal the significance of such observations, as described next.
In order to illustrate the difference in peaks per second for each modality, $\bar{\dot{\gamma}}^{\,p,m}$, we produce a bar graph in Figure 8c, after summing over all frequencies; see Equation (17) for details. Figure 8b displays the mean peaks per second, $\bar{\dot{\gamma}}_{b}^{\,p}$, for each frequency band, after summing over all modalities, according to Equation (18).
The results regarding the NPS values combined over all participants are shown in Figure 9, which shows the differences with respect to modalities, as well as frequencies. The observations are summarized as follows:
  • In Figure 9c, when we compute the mean and confidence intervals for each modality in each frequency band, the Alpha and H-Gamma frequency bands show the highest mean NPS values, for all H-Gamma and most Alpha values across participants. This points to the Alpha and H-Gamma linkage established in other studies [6,55].
  • Except for Theta and Alpha frequency bands, modality MED shows the lowest mean NPS values. This should be expected since the Theta and particularly the Alpha frequency bands have been shown to be the dominant frequencies in meditative states in previous studies [36]. However, this needs more investigation.
At the next level of granulation, we combine either all frequency bands together in each modality, Figure 9b, or all modalities in each frequency band, Figure 9d. Main observations are as follows:
  • Modalities MED and IMG show lower NPS values when compared to all other modalities; Figure 9b.
  • H-Gamma and Alpha frequency bands show the highest NPS values, see Figure 9d.
  • The NPS measure, it seems to us, is a good candidate for estimating intensive processing periods, and it will likely be suitable for differentiating between high- and low-energy-consumption modalities.
Figure 9. Results of NPS evaluations: (a) mean NPS for various frequencies, clustered according to the six modalities; (c) mean NPS for various modalities, clustered according to the frequencies. Mean NPS with error bars across participants for (b) modalities and (d) frequency bands.

3.4. Results Obtained with TBP and TOP

Another set of relevant variables is described by TBP, which is the time between peaks, and it is a measure of quiet processing. We also introduce results for TOP, which is the time of peaks, and it corresponds to time periods with intensive processing; see Figure 10 for illustrations.
TOP and TBP are used to describe the intensive and the quiet processing periods, respectively. At this time, no systematic hypothesis tests were performed regarding TOP and TBP values across modalities. Nonetheless, several observations are useful. For example, when analyzing the cumulative distributions for TOP, as shown in Figure 11, for each modality, in the Alpha (a) and H-Gamma (b) frequency band for all participants, we observe that the TOP values for the modalities of SENT and WORDS seem to behave differently from other modalities, and appear to be greater than the values for the modalities of MED and IMG for the Alpha band. Regarding the H-Gamma frequency band, the values for the modalities of MED and IMG seem to be significantly lower only in relation to the SENT modality, while the modality WORDS behaves more similarly to the modalities of MM and VDO.
The cumulative distribution for the Alpha frequency band (a) may indicate a bimodal or possibly multimodal distribution, which could be due to the fact that some stimuli last longer than others for different participants in a particular modality, meaning that some participants may be slower to resolve a task than others. This bimodal or multimodal behavior seems to be absent for most of the modalities in the H-Gamma frequency band. The bimodal or multimodal behavior observed in the Alpha band for TOP values is also observed in each frequency band for most modalities, and it could also be associated with the behavior of different brain regions. The study of the behavior of different regions will be the objective of future studies. In general, the mean TOP for each participant shows similar values between the Alpha (c) and the H-Gamma (d) frequency bands, with slightly more variability in the H-Gamma frequency band.
The cumulative distributions for TBP are shown in the lower row (e–h) in Figure 11. Qualitative observations indicate that the distributions of TBP in all modalities for all participants seem to exhibit smaller values for H-Gamma (f) than for Alpha (e), with the exception of the modality MED. In general, several of the mean TBP values are higher for the Alpha (g) than for the H-Gamma (h) frequency band, except for some exceptions like participants 19 and 20, for example. Note that the H-Gamma frequency band displays more variability.
Finally, in Figure 12 we summarize the above analysis by showing the mean TOP (a,b) and TBP (c,d) for each modality, for all participants, in the Alpha (a,c) and H-Gamma (b,d) frequency bands. We can observe that, for TOP, the values are smaller for the modalities MED and IMG, while SENT and WORDS display the largest values for both the Alpha and the H-Gamma frequency bands. For TBP, we find a more diverse range of values across modalities. IMG displays large values for both frequency bands. However, MED, SENT, and VDO have the lowest values for the Alpha frequency band, while MM and SENT display the smallest values for the H-Gamma frequency band. Again, these qualitative visual observations must be evaluated using systematic hypothesis tests in the future.
Regarding the mean NPS, some interesting observations can be made when contrasting the Alpha and H-Gamma frequency bands, as shown in Figure 13 and Table 4. Table 4 uses data from [37], but the modalities have been regrouped to emphasize the significance of the observations on low NPS values in the IMG, MED, and MM modalities.
In the Alpha frequency band, we can clearly identify the two (2) minimum mean NPS values associated with the modalities IMG and MM, while for the H-Gamma frequency band, the two (2) minimum values are identified for the modalities MED and IMG. The common minimum of the two frequency bands is found in the modality IMG. In the IMG task, the participants are less in control of the process of knowledge creation, until the moment of sudden insight, the “aha” moment, when they identify the meaning of the earlier hidden image. This, it seems to us, would require less total intensive processing time, which is a combination of the mean TOP and the mean NPS, as previously explained in Figure 10, if we add all the TOP values (red) of each peak above the threshold.
We expected that the TOP values for the modality MED would be smaller than for the rest of the modalities, since meditation practices are known to reduce information processing in the brain, and therefore the creation of knowledge and meaning would also be reduced, more likely leading to less NPS and shorter TOP. What is interesting, however, is that we observe a similar behavior for IMG, which suggests that between the sudden discoveries of meaningful images, there is little processing of semantic information, which coincides with longer periods of less intensive processing, as seen in TBP values for both the Alpha and H-Gamma frequency bands.
Based on the parameters (mean and standard deviation) used to compute the confidence intervals displayed in Table 4, we performed a set of unequal-variance t-tests, where H0: μ1 = μ2 was tested for the means between all modalities in the Alpha and H-Gamma bands. The results are displayed in Table 5. We also tested the means between the Alpha and H-Gamma bands for each modality. It is important to note that we accept H0 for p-values greater than or equal to 0.05, while we reject it for values smaller than 0.05. An acceptance is labeled as {0} and a rejection as {1}, while {0*} indicates a borderline acceptance with a p-value marginally greater than 0.05, such as 0.06.
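A minimal Python sketch of this testing procedure, using SciPy's Welch (unequal-variance) t-test, is shown below; the numerical cut-off used for the borderline {0*} label is an assumption, since the text only gives 0.06 as an example of a marginal acceptance:

```python
from scipy import stats

def compare_modalities(nps_a, nps_b, alpha=0.05):
    """Unequal-variance (Welch) t-test of H0: mu1 = mu2 between two
    modalities, returning the coding used in Table 5: 0 = accept,
    1 = reject, '0*' = borderline acceptance (p just above alpha).

    `nps_a` and `nps_b` are sequences of NPS values for the two modalities.
    The 0.10 upper bound for the borderline label is an assumption.
    """
    t_stat, p = stats.ttest_ind(nps_a, nps_b, equal_var=False)
    if p < alpha:
        return 1, p
    if p < alpha + 0.05:      # marginally above 0.05, e.g. 0.06
        return "0*", p
    return 0, p
```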
From Table 5, it is clear that the modality MED shows a significant number of rejections (1) and borderline acceptances (0*), particularly for the H-Gamma band, when compared with the rest of the modalities. For example, we observe that the NPS means for the modality WORDS are, statistically speaking, the same as those of the rest of the modalities in both bands, except for modality MED in the H-Gamma band.
When taking the results of the Alpha and H-Gamma bands together, we reject H0 when comparing the mean NPS values of modality MED with the rest of the modalities, except for the modality VDO where H0 is accepted (0) and MM is just accepted (0*) when compared with MED. In the case of modality WORDS, H0 is accepted (0) for all but modality MED. For modality MM, H0 is accepted (0) when compared with most modalities and only rejected (1) for modalities SENT and VDO in the Alpha band. Additionally, H0 is rejected when contrasting the modalities of IMG with SENT and VDO for the Alpha band.
In H-Gamma, all other combinations of means hypothesis testing return a value of 0, accepting that means between modalities are equal, except when contrasted with modality MED.
When we test H0: μ1 = μ2 for a comparison between the Alpha and H-Gamma bands, we accept H0 for p-values greater than or equal to 0.05; otherwise, H0 is rejected. It is important to note that we rejected H0 for the mean NPS values of all modalities except for modality MED.
In order to have an idea of the percentage of time that the brain is engaged in intensive processing, as well as the percentage of time that it is operating in quiet processing mode, we computed a pair of variables, as follows.
Let us define (1) $T_b^{\,p,m,s}$ as the duration of stimulus s, for modality m, for participant p, in frequency band b; (2) $IPT_b^{\,p,m,s}$ as the total intensive processing time in stimulus s, for modality m, for participant p, in frequency band b; and (3) $NP_s$ as the number of peaks in stimulus s, for modality m, for participant p, for a particular frequency band b*. It follows that:

$IPT_{b^*}^{\,p,m,s} = \sum_{k=1}^{NP_s} TOP_{b^*,k}^{\,p,m,s}$

$PIPT_{b^*}^{\,p,m,s} = IPT_{b^*}^{\,p,m,s} \,/\, T_{b^*}^{\,p,m,s}$

where $PIPT_{b^*}^{\,p,m,s}$ is the percentage of intensive processing time in one stimulus, for modality m, participant p, in a particular frequency band b*.
We then compute a vector $PIPT_{b^*}^{\,m}$, where each particular $PIPT_{b^*}^{\,p,m,s}$ becomes an element of this vector, for modality m, in a particular frequency band b*, and we use the values in this vector to compute and plot the cumulative probability distributions of $PIPT_{b^*}^{\,m}$ for each modality m.

Similarly, we compute the percentage of quiet processing time for each modality m, for a particular frequency band b*, as a vector $PQPT_{b^*}^{\,m}$, to produce further statistical analysis.
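A minimal Python sketch of the PIPT/PQPT computation for a single stimulus is given below; treating PQPT as the complement of PIPT is an assumption of this sketch, consistent with the near mirror-image distributions reported below in Figure 14:

```python
import numpy as np

def pipt_pqpt(top_ms, stimulus_duration_ms):
    """Percentage of intensive (PIPT) and quiet (PQPT) processing time for
    one stimulus, given the list of TOP durations (ms) of its PI peaks and
    the total stimulus duration (ms). PQPT is computed here as the
    complement of PIPT, which is an assumption of this illustration.
    """
    ipt = float(np.sum(top_ms))            # total intensive processing time
    pipt = ipt / stimulus_duration_ms
    pqpt = 1.0 - pipt
    return pipt, pqpt

# Example: three peaks of 60, 120 and 80 ms within a 4000 ms stimulus
pipt, pqpt = pipt_pqpt([60, 120, 80], 4000)   # -> 0.065, 0.935
```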
In Figure 14a,b,e,f we show the empirical cumulative probability distributions of both $PIPT_{b^*}^{\,m}$ and $PQPT_{b^*}^{\,m}$ for b* = [Alpha, H-Gamma], for each modality m. As expected, the cumulative probability distributions of $PIPT_{b^*}^{\,m}$ (a,b) and $PQPT_{b^*}^{\,m}$ (e,f) are almost mirror images of each other for all modalities in both bands.

We also observe that the $PIPT_{b^*}^{\,m}$ values for the modalities MED and IMG are smaller than the values for the modality VDO, which in turn shows values smaller than the modalities SENT and WORDS, for both bands. The modality MED behaves like MM in the Alpha frequency band (a), while MED follows more closely the behavior of IMG in H-Gamma (b). It is also clear that the modalities are more clustered in Alpha than in H-Gamma, making H-Gamma a better band for discriminating between modalities.

We also observe in Figure 14c,d,g, for the $PIPT_{b^*}^{\,m}$ values for b* = [L-Beta, H-Beta, L-Gamma], how clearly the cumulative probability distributions show the difference between modalities, with values ordered from smallest to largest as MED < IMG < MM < WORDS < VDO < SENT, with some small variability for L-Beta. These frequency bands meet the expectation that the modality MED would show the lowest processing times, since participants are expected to be in a calmer and quieter information-processing state of mind.

In Figure 15, we show the mean and error bars of the $PIPT_{b^*}^{\,m}$ values, for all modalities and frequency bands, and identify an order from the least to the most intensive processing time, as follows: MED, {MM, IMG}, VDO, {SENT, WORDS}.
Here we identify the smallest and largest PIPT values per modality and per band as displayed in Table 6.
From Table 6, we observe that MED displays the lowest values of PIPT for most bands, apart from Alpha. This could be because, in MED, Alpha is the most dominant and least entropic frequency band for most participants, as shown in [36]. Note that the modalities MED, IMG, and MM show the smallest PIPT values for all frequency bands, in contrast with the modalities WORDS, SENT, and VDO, presumably indicating that the first group is the least time- and energy-consuming of all modalities.
Finally, we present in Figure 16 the big picture of TBP, TOP, and NPS and their relationship to one another for all participants, for all modalities, in the Alpha and H-Gamma frequency bands. As important as they might be in giving us a fuller picture of brain dynamics, we leave the study of the other frequency bands for future research, since (a) we have still to gain more insight into the Alpha–H-Gamma linkage and (b) it would make this paper exceed the acceptable length.
As we can appreciate in Figure 16a,d, generally speaking, the mean values of TOP drop in H-Gamma when compared to those in Alpha for all modalities except for VDO, while the values for NPS increase significantly for all modalities (except for MED showing a slight increase) in H-Gamma when compared to the Alpha frequency band.
On the other hand, when we observe the mean values of TBP (b, e) they show a diverse behavior between modalities. Modality MED shows a significant increase and clusters together with modality IMG for H-Gamma when compared to the Alpha frequency band. However, modalities SENT, MM, VDO, and WORDS show a large decrease in H-Gamma when compared to the Alpha frequency band.
In Figure 16c,f we note an inverse relationship between the mean TOP and the mean TBP values, with a larger variability in Alpha, where we also see three (3) clusters of modalities as follows: {MED, IMG}, {MM, WORDS, VDO} and {SENT}.
Based on the above, we constructed Table 7 in order to provide a concise picture of the behavior of the different modalities in the Alpha and H-Gamma frequency bands in terms of mean TOP (MTOP), mean TBP (MTBP), and NPS. This allows us to qualitatively characterize these variables, from the smallest to the largest values, that influence the percentages of intensive and quiet processing, PIPT and PQPT, respectively.
Taken together, the Alpha and H-Gamma frequency bands show MTOP values displaying an increasing order from the modalities of MED and IMG (1st), VDO (2nd), MM (3rd), WORDS (4th), and SENT (5th). However, when we look at the MTBP values, the different modalities behave in a different manner when contrasting Alpha and H-Gamma, as follows: (a) SENT (1st), VDO (2nd), MED (3rd), MM (4th), WORDS (5th), and IMG (6th) for Alpha, and (b) SENT (1st), MM (2nd), VDO (3rd), WORDS (4th), MED (5th), and IMG (6th) for H-Gamma.
Note that modalities MM and MED show the larger changes in mean TBP values when contrasting their values for the Alpha and H-Gamma frequency bands.
For the NPS, the order is as follows: (a) IMG, MM, WORDS, MED, VDO, and SENT for the Alpha frequency band; and (b) MED, IMG, VDO, MM, SENT, and WORDS for the H-Gamma frequency band. Note that both frequency bands show very distinct orders.
We observe a general tendency of inverse correlation between the variables MTBP and NPS, with some exceptions in the H-Gamma band, particularly for modality WORDS.
These variables describe very particular aspects of brain dynamics related to the velocity of processing per stimulus and the mean intensive processing time per stimulus, which can be derived from $IPT_{b^*}^{\,p,m,s}$ in Equation (21), and which can also be estimated as the mean TOP multiplied by the mean NP per stimulus for all participants, for all modalities, for each band. This analysis could be extended to differentiate between experienced meditators and participants with little or no meditation experience. This issue is not addressed in the present study.
Note that this analysis is based on mean values. In the future, this study should be complemented following the results obtained in Table 5, and further statistical analysis, where a test is performed for H0: μ1 < μ2 or H0: μ1 > μ2, for NPS and other variables.

4. Discussion

Based on a comprehensive evaluation of EEG measurements conducted with the participation of 20 volunteers, we explored the brain response to a range of sensory and task modalities, following the pioneering work by Freeman on intentional neurodynamics and the creation of meaning [23,30]. The results presented in this work help better understand the energy aspects of brain dynamics that may lead to a healthy and meaningful life. Several main conclusions are as follows:
  • Several statistical indices were introduced based on pragmatic information (PI) to characterize brain dynamics over the Theta, Alpha, Low Beta, High Beta, Low Gamma, and High Gamma bands. We defined the following variables over each band and each modality: number of PI peaks per second (NPS), time between peaks (TBP), time of peak (TOP), quiet processing time (QPT), and intensive processing time (IPT). We conducted a thorough statistical analysis of these variables and found important differences and similarities between modalities and bands.
  • The analysis showed that H-Gamma and Alpha frequency bands demonstrate high NPS across the pool of 20 participants. Brain dynamics variables, in both the Alpha and H-Gamma frequency bands, served as classifiers for the different behaviors observed between modalities. This result provides a novel quantitative support to the previously established relationship between Alpha and H-Gamma bands using alternative approaches.
  • Except for Theta and Alpha frequency bands, the meditation (MED) modality shows the lowest mean NPS values. This observation is in accordance with other studies showing that meditative states significantly rely on processes over the Theta and Alpha bands. This topic requires further detailed investigation and rigorous statistical hypothesis testing, which are beyond our present work.
  • A significant, novel aspect of our PI-based statistical analysis is that the derived information-theoretical indices can be considered as promising candidates for the estimation of intensive processing periods in brains, potentially suitable to differentiate between high and low energy consumption modalities. To compare modalities and characterize their behavior, we may use complementary measures such as information and entropy indices.
  • Having a robust experimental tool to non-invasively monitor the energy consumption of brain operational modalities will be very useful for the analysis of cognitive processing in healthy brains, with minimal interference in the person’s daily activities. Moreover, deviations from well-established patterns of activities may help to identify and rectify potential pathological conditions.
In recent years, the important role of rhythmic breathing on psychophysiological coherence has been clearly identified [56,57], including the respiratory modulation of cognitive functions in epileptic patients [58]. The delicate balance of various rhythms in the human body and the relationship to general well-being [59] has been documented, including a comprehensive study on heart rhythm synchronization with the Earth’s time-varying magnetic field [60]. The insights gained in our present study using EEG measurements involving multiple cognitive modalities indicate potential applications to address various cognitive neuroscience challenges, including brain studies associated with mental conditions and brain diseases. Intentions and decisions can shape our dietary, exercise, and relaxation habits, with lasting consequences on health in general. Having a clear understanding of the effect of such intentions, decisions, and habits can help in preventing and, perhaps, even correcting, illness and traumatic experiences.
The main focus of the present study is on meditative states and their potential positive influence on general well-being, closely related to our values and meaningful intentional actions. When we value someone or something, this person or thing becomes meaningful to us. In turn, we value what is meaningful to us, like a spiritual experience, the love of our lives, or our general well-being. Spiritual or universal values like Love, Truth, and Unity seem to be very meaningful as abstractions and experiences that we reflect in social relationships and contracts like marriages and constitutions. Together with inner transformations, new insights, and higher meanings, we can refine our intentions and continuously redefine our goals, in order to improve ourselves in many dimensions.
Meanings are veiled to brain dynamics measurements, yet they are very salient and revealed in human experience. This forces us to consider that the creation and valuation of our most precious meanings ought to be understood by first- and third-person perspective science, which means the study of brain dynamics, hand-in-hand with personal reports and anecdotal veridical evidence based on subjective experience. We hope these studies help to improve the conditions for enhancing human potential, well-being, and meaningful behavioral responses, something that could be explored both experientially and scientifically from an early age.
We dedicate this work to the memory and the spirit of our beloved friend and collaborator Walter J. Freeman.

Author Contributions

Conceptualization, J.J.J.D., F.S. and R.K.; methodology, J.J.J.D. and R.K.; software, J.J.J.D. and F.S.; validation, F.S.; data curation, J.J.J.D.; writing—original draft preparation, J.J.J.D.; writing—review and editing, R.K. and F.S.; visualization, F.S.; supervision, J.J.J.D. and R.K.; project administration, J.J.J.D. All authors have read and agreed to the published version of the manuscript.

Funding

Funding for this study has been provided by University of Auckland Postgraduate Research Student Support Account.

Institutional Review Board Statement

The study protocol involving human participants was reviewed and approved by the University of Auckland Human Participants Ethics Committee (Reference Number 016873, dated 13 March 2020, for a 3-year period). The participants provided their written informed consent to participate in this study.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The EEG datasets generated for this study may be available upon request, provided that the proper approval is granted by Professor Ian J. Kirk, Head of Ian J. Kirk’s Lab at the Centre for Brain Research at The University of Auckland in New Zealand.

Acknowledgments

This work has been the result of a collaborative effort between The Embassy of Peace in New Zealand and Ian J. Kirk’s Lab, Centre for Brain Research at The University of Auckland in New Zealand. Both The Embassy of Peace team and the main author, J.J.J. Davis, remain in deep gratitude to Veema Lodhia and Ian Kirk for their continuous support and kindness, as well as the fellow PhD students with whom we shared the lab and many good and meaningful moments.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Zafar, R.; Dass, S.C.; Malik, A. Electroencephalogram-based decoding cognitive states using convolutional neural network and likelihood ratio-based score fusion. PLoS ONE 2017, 12, e0178410. [Google Scholar] [CrossRef]
  2. Suhail, T.; Indiradevi, K.; Suhara, E.; Poovathinal, S.; Ayyappan, A. Distinguishing cognitive states using elec-troencephalography local activation and functional connectivity patterns. Biomed. Signal Process Control 2022, 77, 103742. [Google Scholar] [CrossRef]
  3. Buzzell, G.; Morales, S.; Valadez, E.; Hunnius, S.; Fox, N. Maximizing the potential of EEG as a developmental neuroscience tool. Dev. Cogn. Neurosci. 2023, 60, 101201. [Google Scholar] [CrossRef]
  4. Gavaret, M.; Iftimovici, A.; Pruvost-Robieux, E. EEG: Current relevance and promising quantitative analyses. Rev. Neurol. 2023, 179, 352–360. [Google Scholar] [CrossRef]
  5. Oliveira Filho, F.; Ribeiro, F.; Leyva Cruz, J.; Nunes de Castro, A.; Zebende, G. Statistical study of the EEG in motor tasks (real and imaginary). Phys. A Stat. Mech. 2023, 622, 128802. [Google Scholar] [CrossRef]
  6. Büchel, D.; Sandbakk, Ø.; Baumeister, J. Exploring intensity-dependent modulations in EEG resting-state network efficiency induced by exercise. Eur. J. Appl. Physiol. 2021, 121, 2423–2435. [Google Scholar] [CrossRef]
  7. Walter, N.; Hinterberger, T. Determining states of consciousness in the electroencephalogram based on spectral, complexity, and criticality features. Neurosci. Conscious. 2022, 8, niac008. [Google Scholar] [CrossRef]
  8. Alreshidi, I.; Moulitsas, I.; Jenkins, K. Multimodal Approach for Pilot Mental State Detection Based on EEG. Sensors 2023, 23, 7350. [Google Scholar] [CrossRef] [PubMed]
  9. Parameshwaran, D.; Thiagarajan, T. High Variability Periods in the EEG Distinguish Cognitive Brain States. Brain Sci. 2023, 13, 1528. [Google Scholar] [CrossRef] [PubMed]
  10. Dvorak, D.; Shang, A.; Abdel-Baki, S.; Suzuki, W.; Fenton, A. Cognitive behavior classification from scalp EEG signals. IEEE Trans. Neural. Syst. Rehabil Eng. 2018, 26, 729–739. [Google Scholar] [CrossRef] [PubMed]
  11. Zhang, J.; Chen, M.; Hu, S.; Cao, Y.; Kozma, R. PNN for EEG-based Emotion Recognition. In Proceedings of the 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Budapest, Hungary, 9–12 October 2016. [Google Scholar] [CrossRef]
  12. Shu, L.; Xie, J.; Yang, M.; Li, Z.; Li, Z.; Liao, D.; Xu, X.; Yang, X. A review of emotion recognition using physiological signals. Sensors 2018, 18, 2074. [Google Scholar] [CrossRef]
  13. Topic, A.; Russo, M.; Stella, M.; Saric, M. Emotion recognition using a reduced set of EEG channels based on holographic feature maps. Sensors 2022, 22, 3248. [Google Scholar] [CrossRef]
  14. Yuvaraj, R.; Thagavel, P.; Thomas, J.; Fogarty, J.; Ali, F. Comprehensive Analysis of Feature Extraction Methods for Emotion Recognition from Multichannel EEG Recordings. Sensors 2023, 23, 915. [Google Scholar] [CrossRef] [PubMed]
  15. Akhand, M.A.H.; Maria, M.A.; Kamal, M.A.S.; Murase, K. Improved EEG-based emotion recognition through information enhancement in connectivity feature map. Sci. Rep. 2023, 13, 13804. [Google Scholar] [CrossRef] [PubMed]
  16. Jafari, M.; Shoeibi, A.; Khodatars, M.; Bagherzadeh, S.; Shalbaf, A.; García, D.L.; Gorriz, J.; Acharya, U. Emotion recognition in EEG signals using deep learning methods: A review. Comput. Biol. Med. 2023, 165, 107450. [Google Scholar] [CrossRef] [PubMed]
  17. Li, X.; Song, D.; Zhang, P.; Zhang, Y.; Hou, Y.; Hu, B. Exploring EEG Features in Cross-Subject Emotion Recognition. Front. Neurosci. 2018, 12, 162. [Google Scholar] [CrossRef] [PubMed]
  18. Zangróniz, R.; Martínez-Rodrigo, A.; Pastor, J.; López, M.; Fernández-Caballero, A. Electrodermal Activity Sensor for Classification of Calm/Distress Condition. Sensors 2017, 17, 2324. [Google Scholar] [CrossRef] [PubMed]
  19. Liu, Z.; Xiao, F.; Lin, C.T.; Cao, Z. A Robust Evidential Multisource Data Fusion Approach Based on Cooperative Game Theory and Its Application in EEG. IEEE Trans. Syst. Man Cybern. Syst. 2024, 54, 729–740. [Google Scholar] [CrossRef]
  20. Saffari, F.; Norouzi, K.; Bruni, L.E.; Zarei, S.; Ramsøy, T.Z. Impact of varying levels of mental stress on phase information of EEG Signals: A study on the Frontal, Central, and parietal regions. Biomed. Signal Process. Control. 2023, 86, 105236. [Google Scholar] [CrossRef]
  21. Freeman, W.J. How and Why Brains Create Meaning from Sensory Information. Int. J. Bifurcat. Chaos. 2004, 14, 515–530. [Google Scholar] [CrossRef]
  22. Buzsaki, G. The Brain from Inside Out; Oxford University Press: New York, NY, USA, 2019. [Google Scholar]
  23. Freeman, W.J. A Neurobiological Theory of Meaning in Perception. Part 1. Information and Meaning in Nonconvergent and Nonlocal Brain Dynamics. Int. J. Bifurcat. Chaos 2003, 13, 2493–2511. [Google Scholar] [CrossRef]
  24. Freeman, W.J. Nonlinear Brain Dynamics and Intention According to Aquinas. Mind Matter 2008, 6, 207–234. [Google Scholar]
  25. Davis, J.J.; Gillett, G. Brentano & Panikkar’s Beautiful Anticipation of the Cognitive Cycle & the Emergence of Knowledge & Meaning: Towards a Broader Understanding of Transcendence. J. Conscious. Explor. Res. 2023, 14, 372–386. [Google Scholar]
  26. Panikkar, R. The Experience of God—Icons of the Mystery; Fortress Press: Minneapolis, MN, USA, 2006. [Google Scholar]
  27. Bergman, R.S. Celebrar la Diferencia—Unidad en la Diversidad; Ediciones B: Buenos Aires, Argentina, 2009. [Google Scholar]
  28. Freeman, W.; Vitiello, G. Matter and Mind are Entangled in Two Streams of Images Guiding Behavior and Informing the Subject Through Awareness. Mind Matter 2016, 14, 7–24. [Google Scholar]
  29. Noack, R.; Davis, J.J.J.; Manjesh, C.; Kozma, R. Neuro-energetic aspects of cognition—The role of pulse-wave-pulse conversion in the interpretation of brain imaging data. In Proceedings of the 2017 IEEE Symposium Series on Computational Intelligence (SSCI), Honolulu, HI, USA, 27 November–1 December 2017. [Google Scholar] [CrossRef]
  30. Freeman, W.J. The Wave Packet: An Action Potential for the 21st Century. J. Integr. Neurosci. 2003, 2, 3–30. [Google Scholar] [CrossRef] [PubMed]
  31. Freeman, W.J.; Quiroga, R.Q. Imaging Brain Function with EEG—Advanced Temporal and Spatial Analysis of Electroencephalographic Signals; Springer: New York, NY, USA, 2013. [Google Scholar]
  32. Freeman, W.J. Mechanism and significance of global coherence in scalp EEG. Curr. Opin. Neurobiol. 2015, 31, 199–205. [Google Scholar] [CrossRef]
  33. Davis, J.J.J.; Kozma, R. Creation of Knowledge & Meaning Manifested via Cortical Singularities in Cognition towards a Methodology to Understand Intentionality and Critical Behavior in Neural Correlates of Awareness. In Proceedings of the 2013 IEEE Symposium on Computational Intelligence, Cognitive Algorithms, Mind, and Brain (CCMB), Singapore, 16–19 April 2013. [Google Scholar]
  34. Khoshnevis, S.; Sankar, R. Applications of higher order statistics in electroencephalography signal processing: A comprehensive survey. IEEE Rev. Biomed. Eng. 2020, 13, 169–183. [Google Scholar] [CrossRef] [PubMed]
  35. Demirer, R.; Kesebir, S. The entropy of chaotic transitions of EEG phase growth in bipolar disorder with lithium carbonate. Sci. Rep. 2021, 11, 11888. [Google Scholar] [CrossRef]
  36. Davis, J.J.J.; Kozma, R.; Schübeler, F. Analysis of Meditation vs. Sensory Engaged Brain States Using Shannon Entropy and Pearson’s First Skewness Coefficient Extracted from EEG Data. Sensors 2023, 23, 1293. [Google Scholar] [CrossRef]
  37. Davis, J.J.J.; Schübeler, F.; Kozma, R. Brain Dynamics in Engaged and Relaxed Psychophysiological States Reflecting the Creation of Knowledge and Meaning. In Proceedings of the 2023 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Honolulu, HI, USA, 1–4 October 2023. [Google Scholar] [CrossRef]
  38. Atmanspacher, H.; Scheingraber, H. Pragmatic information and dynamical instabilities in a multimode continuous-wave dye laser. Can. J. Phys. 1990, 68, 728–737. [Google Scholar] [CrossRef]
  39. beim Graben, P. Pragmatic Information in Dynamic Semantics. Mind Matter 2006, 4, 169–193. [Google Scholar]
  40. Freeman, W.J. Origin, structure, and role of background EEG activity. Part 2. Analytic phase. Clin. Neurophysiol. 2004, 115, 2089–2107. [Google Scholar] [CrossRef] [PubMed]
  41. Freeman, W.J.; Vitiello, G. Nonlinear brain dynamics as macroscopic manifestation of underlying many-body field dynamics. Phys. Life Rev. 2006, 3, 93–118. [Google Scholar] [CrossRef]
  42. Kirk, I.; Spriggs, M.; Sumner, R. Human EEG and the mechanisms of memory: Investigating long-term potentiation (LTP) in sensory-evoked potentials. J. R. Soc. N. Z. 2021, 51, 24–40. [Google Scholar] [CrossRef]
  43. Zheng, Y.; Kirk, I.; Chen, T.; O’Hagan, M.; Waldie, K. Task-Modulated Oscillation Differences in Auditory and Spoken Chinese-English Bilingual Processing: An Electroencephalography Study. Front. Psychol. 2022, 13, 823700. [Google Scholar] [CrossRef] [PubMed]
  44. Electrical Geodesics, Inc. Geodesic Sensor Net Technical Manual. Available online: https://philipsproductcontent.blob.core.windows.net/assets/20180705/6f388e7ade4d41e38ad5a91401755b6f.pdf (accessed on 24 January 2024).
  45. Freeman, W.J. Origin, structure, and role of background EEG activity. Part 1. Analytic amplitude. Clin. Neurophysiol. 2004, 115, 2077–2088. [Google Scholar] [CrossRef] [PubMed]
  46. beim Graben, P.; Atmanspacher, H. Editorial. Mind Matter 2006, 4, 131–139. [Google Scholar]
  47. Peirce, C.S. The Collected Papers of Charles Sanders Peirce, Volumes V and VI: Pragmatism and Pragmaticism and Scientific Metaphysics; Hartshorne, C., Weiss, P., Eds.; Harvard University Press: Cambridge, MA, USA, 1935. [Google Scholar]
  48. von Weizsäcker, E.; von Weizsäcker, C. Wiederaufnahme der begrifflichen Frage: Was ist Information? Nova Acta Leopold. 1972, 37, 536–555. [Google Scholar]
  49. von Weizsäcker, E. Erstmaligkeit und Bestätigung als Komponenten der pragmatischen Information. Offene Syst. 1974, 1, 82–111. [Google Scholar]
  50. Davis, J.J.J.; Kozma, R.; Freeman, W.J. Neurophysiological evidence of the cognitive cycle and the emergence of awareness. In Proceedings of the 2013 iCAST & UMEDIA, Aizuwakamatsu, Japan, 2–4 November 2013. [Google Scholar]
  51. Freeman, W.J.; Rogers, L.J.; Holmes, M.D.; Silbergeld, D.L. Spatial spectral analysis of human electrocorticograms including the alpha and gamma bands. J. Neurosci. Methods 2000, 95, 111–121. [Google Scholar] [CrossRef]
  52. Buzsáki, G. Rhythms of the Brain; Oxford University Press: New York, NY, USA, 2006. [Google Scholar]
  53. Freeman, W.J. Neurodynamics: An Exploration in Mesoscopic Brain Dynamics; Springer: London, UK, 2012. [Google Scholar]
  54. Freeman, W.J. Mass Action in the Nervous System—Examination of The Neurophysiological Basis of Adaptive Behaviour through the EEG; Academic Press: New York, NY, USA, 1975. [Google Scholar]
  55. Jensen, O.; Mazaheri, A. Shaping Functional Architecture by Oscillatory Alpha Activity: Gating by Inhibition. Front. Hum. Neurosci. 2010, 4, 186. [Google Scholar] [CrossRef] [PubMed]
  56. McCraty, R.; Atkinson, M.; Tomasino, D.; Bradley, R. The coherent heart: Heart-brain interactions, psychophysiological coherence, and the emergence of system-wide order. Integral Rev. 2009, 5, 10–115. [Google Scholar]
  57. Zelano, C.; Jiang, H.; Zhou, G.; Arora, N.; Schuele, S.; Rosenow, J.; Gottfried, J.A. Nasal Respiration Entrains Human Limbic Oscillations and Modulates Cognitive Function. J. Neurosci. 2016, 36, 12448–12467. [Google Scholar] [CrossRef] [PubMed]
  58. Kozma, R.; Davis, J.J.J.; Schübeler, F.; McAfee, S.; Wheless, J.W.; Heck, D. Respiratory Modulation of Cortical Rhythms - Testing The Phase Transition Hypothesis. In Proceedings of the 2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Prague, Czech Republic, 8–12 October 2022. [Google Scholar] [CrossRef]
  59. Davis, J.J.J.; Schübeler, F.; Kozma, R. Psychophysiological Coherence in Community Dynamics—A Comparative Analysis between Meditation and Other Activities. OBM Integr. Complement. Med. 2019, 4, 1901015. [Google Scholar] [CrossRef]
  60. Timofejeva, I.; McCraty, R.; Atkinson, M.; Alabdulgader, A.; Vainoras, A.; Landauskas, M.; Šiaučiūnaitė, V.; Ragulskis, M. Global Study of Human Heart Rhythm Synchronization with the Earth’s Time Varying Magnetic Field. Appl. Sci. 2021, 11, 2935. [Google Scholar] [CrossRef]
Figure 1. An example of the ambiguous images shown to the participants. (My Wife and My Mother-In-Law, W. E. Hill, 1915).
Figure 2. (a) Schematics of the Hilbert transform-based methodology: a narrow band-pass filter is applied to the EEG signal (Xt), producing the filtered signal Yt, which is then Hilbert transformed. This leads to the complex-valued signal Zt. Considering polar coordinates, the signal is described by its amplitude and phase: the modulus of Zt gives the analytic amplitude (AA), while the angle produces the analytic phase (AP). (b) Resulting signals after the Hilbert transform is applied to a sine time series, showing the real and imaginary parts of the complex signal. (c) Analytic amplitude (AA) and phase (AP) derived from the resulting signals after the Hilbert transform is applied. Examples of the different indices that were computed after Hilbert transforming the signal amplitude for EEG Channel 2: (d) AA(t), (e) IF(t), (f) AP(t), (g) SA(t).
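To make the pipeline in Figure 2 concrete, below is a minimal sketch, not the authors' code, of band-pass filtering one EEG channel and Hilbert transforming it to obtain SA(t), AA(t), AP(t), and IF(t). The array x, the sampling rate fs, the filter order, and the use of SciPy's butter/filtfilt/hilbert are assumptions made purely for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def analytic_indices(x, fs, band=(4.0, 7.0), order=4):
    """Band-pass filter x, Hilbert transform it, and return AA, AP, IF, SA."""
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="band")
    sa = filtfilt(b, a, x)                      # filtered signal SA(t) = Y(t)
    z = hilbert(sa)                             # complex analytic signal Z(t)
    aa = np.abs(z)                              # analytic amplitude AA(t)
    ap = np.unwrap(np.angle(z))                 # unwrapped analytic phase AP(t)
    inst_freq = np.diff(ap) * fs / (2 * np.pi)  # instantaneous frequency IF(t), in Hz
    return aa, ap, inst_freq, sa

# Synthetic stand-in for one EEG channel (Theta-band example):
fs = 250.0
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 5.5 * t) + 0.3 * np.random.randn(t.size)
aa, ap, inst_freq, sa = analytic_indices(x, fs, band=(4, 7))
```

A zero-phase filter (filtfilt) is used here so that the analytic phase is not distorted by filter delay; the filter design actually used in the study may differ.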
Figure 3. Illustration of the cycle of creation of knowledge and meaning. A visual stimulus is presented to the animal at time instant 3 s. The stimulus is processed and resolved in the 1 s window following stimulus presentation [37].
Figure 4. Examples of the different indices that were computed after Hilbert transforming the signal amplitude for each of the 128 electrodes (plotted in different colors in a, b and e): (a) analytic amplitude A(t) or AA(t), (b) signal amplitude S(t) or SA(t), (c) spatial ensemble averages 〈AA(t)〉 with 3-sigma band, (d) spatial ensemble averages 〈SA(t)〉 with 3-sigma band, (e) analytic frequency IF(t).
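Under the assumption that the analytic amplitudes of all channels are stacked into a (channels × samples) array AA, the spatial ensemble average ⟨AA(t)⟩ and the 3-sigma band plotted in Figure 4c can be obtained with a few lines; the same applies to ⟨SA(t)⟩ in Figure 4d. The array name and function are illustrative only.

```python
import numpy as np

def ensemble_stats(AA):
    """AA: (n_channels, n_samples) array; returns <AA(t)> and its 3-sigma band."""
    mean_t = AA.mean(axis=0)        # spatial ensemble average at each time step
    std_t = AA.std(axis=0)          # spatial standard deviation at each time step
    return mean_t, mean_t - 3 * std_t, mean_t + 3 * std_t
```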
Figure 5. Pragmatic information illustration; He(t) is the result of the ratio AA2(t)/De(t), where De(t) and AA2(t) are shown in plots (a–c). (d) displays He(t)1 and (e) He(t)2; these are pragmatic information indices where De(t) is based on amplitude and phase, respectively.
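As a hedged illustration of the ratio He(t) = AA2(t)/De(t) in Figure 5, the sketch below assumes, purely for demonstration, that De(t) measures the change between successive normalized spatial patterns (of amplitude or phase) across electrodes; the exact definition of De(t) used in the study may differ, so this should be read as a schematic rather than the authors' computation.

```python
import numpy as np

def pragmatic_information(AA, pattern, eps=1e-12):
    """AA: (n_ch, n_t) analytic amplitudes; pattern: (n_ch, n_t) amplitude or phase
    values whose rate of spatial change is used here as a stand-in for De(t)."""
    aa2 = (AA ** 2).mean(axis=0)                              # mean squared amplitude AA2(t)
    unit = pattern / (np.linalg.norm(pattern, axis=0) + eps)  # unit-norm spatial patterns
    de = np.linalg.norm(np.diff(unit, axis=1), axis=0) + eps  # assumed De(t): pattern change
    return aa2[1:] / de                                       # He(t), aligned with diff'ed axis
```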
Figure 6. (a) Example of the positioning of the EGI EEG array (128 electrodes) on a participant’s scalp. (b) Brain areas color coded and represented in a matrix. (c) Contour plot of the pragmatic information index He(t) during the 3.5 s response time across pre-frontal, frontal, central, and occipital brain areas, as displayed on subplot (b); see also [37]. (d) The He(t) signals for the same brain areas and time windows Δt. Both graphs (c,d) display results for participant P7, stimulus S9, in modality WORDS, for the Theta band, where the stimulus presentation (LVEO) coincides with start time 0, and the pressing of the button to provide an answer takes place at the end of stimulus processing, at time 3.5 s.
Figure 7. Pragmatic information index He(t) in (a) the Alpha and (b) the H-Gamma frequency bands, illustrating peaks above the 0.1 threshold, with selection rules considering peak duration and time between peaks.
Figure 8. (a) Comprehensive illustration of the NPS index (mean number of peaks per second) for each frequency band and modality, in the case of participant 1. The modalities are in the same order as introduced at the beginning, e.g., Meditation (M) first in dark blue and MathMind (MM) fourth in yellow. (b) Mean peaks/second averaged across modalities, for the six frequency bands; (c) mean peaks/second averaged across frequency bands, for the six modalities.
Figure 10. Illustration of the quantities time between peaks (TBP) in green, and time or duration of a peak (TOP), in red. The blue line shows the computed PI index He(t)2.
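The quantities in Figure 10, TOP and TBP, together with the NPS index, can be read off a thresholded He(t) trace. The following sketch applies the 0.1 threshold of Figure 7 but omits the study's additional selection rules on peak duration and time between peaks, so it is an approximation; the names he, fs, and peak_statistics are illustrative only.

```python
import numpy as np

def peak_statistics(he, fs, threshold=0.1):
    """he: 1-D PI index He(t); fs: sampling rate in Hz."""
    above = he > threshold
    edges = np.diff(above.astype(int))
    starts = np.flatnonzero(edges == 1) + 1     # rising edges of supra-threshold runs
    ends = np.flatnonzero(edges == -1) + 1      # falling edges (exclusive end indices)
    if above[0]:
        starts = np.r_[0, starts]               # signal starts above threshold
    if above[-1]:
        ends = np.r_[ends, above.size]          # signal ends above threshold
    top = (ends - starts) / fs                  # TOP: duration of each peak, in seconds
    tbp = (starts[1:] - ends[:-1]) / fs         # TBP: time between consecutive peaks
    nps = starts.size / (above.size / fs)       # NPS: number of peaks per second
    return top, tbp, nps
```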
Figure 11. Upper row shows CDF plots for TOP values in (a) the Alpha and (b) the H-Gamma frequency band for all participants in each modality; (c) shows 3D bar graphs of mean TOP values for each participant (x-axis) and each modality (y-axis) for the Alpha frequency band and (d) for the H-Gamma frequency band. Lower row shows CDF plots for TBP values in (e) the Alpha and (f) the H-Gamma frequency band for all participants in each modality; (g) shows 3D bar graphs of mean TBP values for each participant (x-axis) and each modality (y-axis) for the Alpha frequency band and (h) for the H-Gamma frequency band. The different colors in (c,d,g,h) represent the 20 participants from P1 (dark blue) to P20 (dark red).
Figure 12. Mean TOP (a,b) and mean TBP (c,d) values with corresponding error bars for all participants and each modality in the Alpha frequency band (a,c) and the H-Gamma frequency band (b,d).
Figure 13. Mean NPS with error bars for all participants in each modality for the Alpha and the H-Gamma frequency bands.
Figure 14. CDF for PIPT, considering all modalities, in Alpha (a), H-Gamma (b), L-Gamma (e), L-Beta (f) and H-Beta (g), and for PQPT in Alpha (c) and H-Gamma (d) frequency bands.
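The CDF curves in Figures 11 and 14 are empirical cumulative distribution functions of the pooled TOP, TBP, or PIPT values; a generic sketch, assuming x is a 1-D array of such values, is:

```python
import numpy as np

def ecdf(x):
    """Return sorted values and cumulative probabilities of an empirical CDF."""
    xs = np.sort(np.asarray(x))
    ps = np.arange(1, xs.size + 1) / xs.size
    return xs, ps
```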
Figure 15. PIPT mean values with error bars, for all modalities in each frequency band.
Figure 16. The relationships between mean TOP, NPS, and TBP: mean TOP vs. NPS values for Alpha (a) and H-Gamma (d); TBP vs. NPS values for Alpha (b) and H-Gamma (e); and TOP vs. TBP (in red) for Alpha (c) and H-Gamma (f).
Table 1. Relevant frequency ranges and bands studied.
Frequency Range    Frequency Band
4–7 Hz             Theta
8–12 Hz            Alpha
13–17 Hz           Low Beta
18–25 Hz           High Beta
26–34 Hz           Low Gamma
35–48 Hz           High Gamma
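For reference, the bands of Table 1 can be written as a small configuration mapping, a plausible representation rather than the authors' code, which plugs directly into the band-pass filtering sketch given under Figure 2.

```python
# Frequency bands of Table 1, in Hz (illustrative constant, not taken from the paper's code)
BANDS_HZ = {
    "Theta":   (4, 7),
    "Alpha":   (8, 12),
    "L-Beta":  (13, 17),
    "H-Beta":  (18, 25),
    "L-Gamma": (26, 34),
    "H-Gamma": (35, 48),
}
```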
Table 2. Values for p, m, b, e, and s as used in the computations.
Participants: P1, P2, …, P20              pmax = 20
Modalities: MED, WORDS, …, VDO            mmax = 6
Frequency Bands: Theta, …, H-Gamma        bmax = 6
Electrodes/Channels: 1, 2, 3, …, 128      emax = 128
Stimuli per modality: 1, 2, 3, …, Nsm     smax = Nsm
Table 3. Number of stimuli per modality, Nsm.
Modality   MED   WORDS   IMG   MM   SENT   VDO
Nsm        20    20      12    28   20     10
Table 4. NPS values averaged over all participants and all modalities; frequency bands Alpha and High Gamma.
Modality            Significance of Band   Alpha Band       H-Gamma Band
Images (IMG)        Alpha, H-Gamma         1.64 +/− 0.21    2.28 +/− 0.59
Meditation (MED)    H-Gamma                2.00 +/− 0.23    2.07 +/− 0.62
Math Mind (MM)      Alpha                  1.71 +/− 0.21    2.84 +/− 0.51
Words (WORDS)       None                   1.99 +/− 0.33    3.02 +/− 0.58
Sentences (SENT)    None                   2.11 +/− 0.20    3.03 +/− 0.54
Video (VDO)         None                   2.02 +/− 0.20    2.77 +/− 0.55
Table 5. Results of the hypothesis tests between modalities. Each entry is 0 (accept) or 1 (reject) for the null hypothesis H0: μ1 = μ2 of an unequal-variance t-test comparing a pair of modalities, based on the NPS index, for the Alpha band (above the matrix diagonal, color yellow) and the H-Gamma band (below the diagonal, color blue). The symbol “*” indicates a p-value very close to α = 0.05 (0* means accepted, just).
Vs.      MED   MM   WORDS   SENT   IMG   VDO
MED      .     0*   0       0      1     0
MM       0*    .    0       1      0     1
WORDS    1     0    .       0      0     0
SENT     1     0    0       .      1     0
IMG      0     0    0       0*     .     1
VDO      0*    0    0       0      0     .
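The entries of Table 5 are decisions from unequal-variance (Welch) t-tests of H0: μ1 = μ2 between pairs of modalities based on the NPS index. A minimal sketch using SciPy is shown below; nps_med and nps_mm are placeholder samples standing in for the 20 per-participant NPS values of two modalities, not the study's data.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
nps_med = rng.normal(2.00, 0.23, size=20)   # placeholder sample for MED (Alpha band)
nps_mm = rng.normal(1.71, 0.21, size=20)    # placeholder sample for MM (Alpha band)

t_stat, p_value = ttest_ind(nps_med, nps_mm, equal_var=False)  # Welch's t-test
reject_h0 = int(p_value < 0.05)             # 1 = reject H0: mu1 = mu2, as coded in Table 5
```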
Table 6. The modalities with the smallest and largest PIPT values in each frequency band.
           Theta        Alpha         L-Beta        H-Beta       L-Gamma     H-Gamma
Smallest   MM, MED      IMG, MM       MED, MM       MED, IMG     MED, IMG    MED, IMG
Largest    WORDS, VDO   SENT, WORDS   WORDS, SENT   WORDS, VDO   SENT, VDO   SENT, WORDS
Table 7. Comparison of modalities and frequency bands for several indices. Each entry gives the ordering index, from the smallest to the largest, of the mean TOP (MTOP), mean TBP (MTBP), and NPS values derived from Figure 16, per modality for the Alpha (A) and H-Gamma (G) frequency bands.
MOD     MED    WORDS   IMG    MM     SENT   VDO
BAND    A  G   A  G    A  G   A  G   A  G   A  G
MTOP    1  2   5  5    2  1   4  4   6  6   3  3
MTBP    3  5   5  4    6  6   4  2   1  1   2  3
NPS     4  1   3  6    1  2   2  4   6  5   5  3
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
