Article

Neural Chaotic Oscillation: Memristive Feedback, Symmetrization, and Its Application in Image Encryption

Keyu Huang, Chunbiao Li *, Yongxin Li, Tengfei Lei and Haiyan Fu

1 School of Electronics and Information Engineering, Nanjing University of Information Science and Technology, Nanjing 210044, China
2 School of Artificial Intelligence, Nanjing University of Information Science and Technology, Nanjing 210044, China
3 Collaborative Innovation Center of Memristive Computing Application (CICMCA), Qilu Institute of Technology, Jinan 250200, China
* Author to whom correspondence should be addressed.
Electronics 2024, 13(11), 2138; https://doi.org/10.3390/electronics13112138
Submission received: 10 April 2024 / Revised: 29 May 2024 / Accepted: 29 May 2024 / Published: 30 May 2024

Abstract

The symmetry of neuron discharge is related to the electrophysiological characteristics and dynamic behavior of a neuron, and is closely linked to the symmetry of ion channels, current balance, neuron type, synaptic transmission, and network effects. Among these factors, the feedback and interactions within the network have a particularly direct impact on the symmetrical discharge of a neuron. This work introduces a memristor as a synapse into a neuron cell, feeding the membrane potential back to the ion channels, and various symmetric firing behaviors of the Hindmarsh–Rose (HR) neuron are thereby observed, including chaos and various periodic firings. By further adjusting the feedback, coexisting symmetrical discharges of the neuron are achieved. Furthermore, the impact of frequency variation on the memristor synapse is analyzed, and the memristor and resistor operating regimes are classified and discussed. Circuit simulations confirm the neural chaotic firings along with their symmetrized discharging processes, demonstrating the effectiveness of symmetrical control of chaotic discharge. Finally, applying the symmetrical system to DNA-based image encryption effectively protects the security of images.

1. Introduction

The brain’s activity is closely associated with the dynamic behavior of neurons and neuronal networks [1,2,3]. The electrical interactions among neurons commonly exhibit intricate chaotic dynamics [4,5,6], and exploring the chaotic dynamics of neuron systems contributes to a deeper understanding of the mechanisms underlying neuronal diseases [7,8,9]. The brain, a labyrinth of billions of interconnected neurons, is woven together by synapses into an expansive and elaborate neural network. Synapses both establish physical connections within the network and serve as conduits for the transmission of information between neurons; they include synapses between neurons as well as synapses within individual neurons, and synapses between neurons act as bridges that connect two neurons and facilitate information transfer. A memristor [10,11,12], recognized as the fourth fundamental electronic component alongside the resistor, capacitor, and inductor, exhibits a unique characteristic: its resistance varies in response to the charge passing through it and is retained even after the current ceases. Owing to this biomimetic synaptic plasticity, the memristor can emulate the functions of neural synapses and effectively reproduce the complex dynamic characteristics of biological neural systems [13,14]. Memristive synapses are characterized by their plasticity: their resistance can be altered by input signals, which enables them to store and process information, akin to the synaptic plasticity observed in biological neurons. By adjusting the resistance values within memristors, connections between neurons can be strengthened or weakened, thereby influencing the behavior and learning processes of neural networks. This plasticity makes memristors essential building blocks for artificial intelligence and neuromorphic computing. Compared with traditional resistor synapses, memristors used to model biological synapses between neurons not only offer adjustable weights but also generate more complex dynamic behaviors, such as synchronous oscillations and chaotic phenomena. Using memristors as neuronal synapses, researchers have developed various memristive neuron models built upon traditional neuron models [15,16,17]. For instance, utilizing a locally active memristor model, Li simulated the synaptic connection between 2D-HR and 2D-FHN neurons, leading to a dual-neuron model with memristive synaptic coupling; the results revealed the coexistence of periodic and chaotic burst firing patterns, alongside the simultaneous presence of two periodic firing patterns with distinct topological structures, and the phase synchronization of neurons coupled through memristor synapses was also discussed [18]. Compared with previously proposed memristive neuron systems [19,20,21], the memristive HR system designed in this paper can control the number of spikes by altering the synaptic strength of the memristor, achieving single-spike, two-spike, three-spike, four-spike, and chaotic firing.
The action potential of a neuron can be divided into four phases: polarization [22], depolarization [23], repolarization [24], and hyperpolarization [25]. When excitatory neurotransmitters cause the opening of sodium channels in the receiving neuron, the influx of sodium ions leads to an increase in positive charge inside the neuron, resulting in depolarization, and the membrane potential abruptly rises to a positive level. On the other hand, the transmission of inhibitory neurotransmitters causes the opening of potassium and chloride channels in the receiving neuron. The efflux of potassium ions and influx of chloride ions lead to an increase in negative charge inside the neuron, resulting in hyperpolarization, and the membrane potential decreases to a negative level, as shown in Figure 1. In the electrical activity of neurons, the phenomenon of multistability is often observed, representing an important characteristic of neural networks and simultaneously reflecting the diversity inherent in the brain [26,27]. Electrophysiological experiments indicate the presence of multistability in the discharge activity of biological neurons [28,29]. For instance, studies suggest that neural networks with multistability features can simulate human multistable visual effects by switching between different attractors [30]. Therefore, exploring the characteristics of multistability not only enhances our comprehension of its role in brain information processing but also provides a valuable avenue for deeper investigation into the intricacies of brain function. The modulation of neuronal membrane potential predominantly hinges on the activity of ion channels, which regulate their opening and closing dynamics. The symmetry of certain ion channels may lead to a symmetric variation in membrane potential during excitatory and inhibitory phases, resulting in depolarization and hyperpolarization processes. The organization and functionality of neurons exhibit a considerable amount of symmetry, as illustrated in Figure 2. Therefore, symmetry plays a significant role in the study of neurons. In this work, the technology of attractor doubling [31] is utilized to construct coexisting attractors, simulating the states of neurons during excitatory and inhibitory phases, where the membrane potential undergoes depolarization and hyperpolarization processes, exhibiting symmetry over time. This provides a new avenue for further exploration of the diversity and symmetry within the brain itself.
Image encryption integrates research from image communication, image processing, and cryptography, and represents a new development in information security in the era of AI. Chaos possesses many desirable characteristics, such as sensitivity to initial values and parameters, unpredictability of trajectories, and topological transitivity. Sequences generated by chaotic maps are pseudo-random and can serve as random sequences, so chaotic maps with good chaotic behavior make excellent random number generators suitable for image encryption. Accordingly, scholars have combined chaotic maps with image encryption and proposed chaos-based image encryption algorithms [32,33]. This paper proposes a three-dimensional memristive HR neuron system and integrates it with a DNA encryption algorithm. The output chaotic sequence is subjected to NIST and TestU01 testing to verify its randomness, and the algorithm’s speed and security are evaluated. The proposed scheme offers fast encryption and enhanced security.
This paper brings to light several significant findings, including:
  • A memristor is introduced into the HR neuron for firing generation, producing single-spike, two-spike, three-spike, four-spike, and chaotic firing depending on the intensity of the synaptic strength. The impact of frequency variations on the memristor synapse is analyzed, and the memristor and resistor operating regimes are distinguished;
  • Attractor doubling is implemented independently in the x-dimension and x-y-dimension, leading to the formation of unique coexisting attractors and pseudo-multi-vortex attractors. The discharging states of neurons are simulated when neurons are in the depolarization and hyperpolarization phases through the method of attractor doubling;
  • The symmetric chaotic system designed is applied in chaotic encryption, yielding excellent confidentiality effects.
The subsequent sections of this article are structured in the following manner: In Section 2, a model of a memristor is introduced and incorporated into the HR neuron, creating the memristive HR neuron model. In Section 3, the impact of varying intensities of synaptic strength on the firing behavior of the HR neuron is analyzed. In Section 4, an analysis is conducted on the changes in system dynamics following attractor doubling. In Section 5, circuit simulations are performed using Multisim to validate the accuracy of the theoretical research. In Section 6, the designed symmetrical system is applied to image encryption and the confidentiality effectiveness is analyzed.

2. Model of the Memristive HR Neuron and the Synaptic Plasticity

The plasticity of memristive synapses refers to their ability to adjust resistance in response to variations in external input signal parameters such as frequency, intensity, and duration. This plasticity enables memristive synapses to emulate the learning and memory processes observed in biological neural systems. By modulating the voltage signals applied to the input terminals of memristive synapses, changes in synaptic resistance can be induced, thereby impacting the connectivity and functionality of neural networks. Similar to the plasticity observed in biological neural synapses, the plasticity of memristive synapses provides a flexible and effective mechanism for artificial neural networks to simulate and implement intelligent behaviors such as learning, adaptation, and memory.
The HR neuron model was proposed in 1984 by Hindmarsh and Rose, based on extensive data from voltage-clamp experiments on snail neurons [34]. The model aims to describe the excitability and firing behavior of neurons and can replicate various dynamic characteristics of biological neurons, such as periodic firing, chaotic behavior, and transitions between excited states. The model is as follows:
\begin{cases}
\dot{x} = y - a x^{3} + b x^{2} + I \\
\dot{y} = c - d x^{2} - y
\end{cases}
where x symbolizes the membrane potential, y denotes the recovery variable, and I is the external direct current stimulus. Constants a, b, c, and d hold specific values.
Introducing a memristor into the two-dimensional HR neuron, the newly constructed memristive neural system is as follows:
\begin{cases}
\dot{x} = y - a x^{3} + b x^{2} + I - k\,(0.5 - 2\tanh(z))\,x \\
\dot{y} = c - d x^{2} - y \\
\dot{z} = 0.5 x - 0.1 z
\end{cases}
where parameter k is the synaptic strength. For instance, when a = 1, b = 3, c = 1, d = 5, I = 0, and k = 1, with an initial condition IC of (0.1, 0.1, 0.1), the system manifests chaotic behavior, as depicted in Figure 3. The system has no equilibria. The Lyapunov exponents of this attractor are (0.0442, 0, −3.8438), and its Kaplan–Yorke dimension is 2.0115.
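A minimal numerical sketch of system (2) is given below, assuming Python with NumPy/SciPy (the paper itself does not prescribe a language, and SciPy's solve_ivp merely stands in for a generic ODE solver); the parameter values and initial condition follow the text above.

```python
# Sketch: integrate the memristive HR neuron of system (2) as reconstructed above.
# Assumes Python + NumPy/SciPy; a = 1, b = 3, c = 1, d = 5, I = 0, k = 1, IC = (0.1, 0.1, 0.1).
import numpy as np
from scipy.integrate import solve_ivp

a, b, c, d, I, k = 1.0, 3.0, 1.0, 5.0, 0.0, 1.0

def memristive_hr(t, s):
    x, y, z = s
    W = 0.5 - 2.0 * np.tanh(z)                 # memristive synapse term of Eq. (2)
    dx = y - a * x**3 + b * x**2 + I - k * W * x
    dy = c - d * x**2 - y
    dz = 0.5 * x - 0.1 * z
    return [dx, dy, dz]

sol = solve_ivp(memristive_hr, (0, 500), [0.1, 0.1, 0.1], max_step=0.01)
x, y, z = sol.y
print("x range:", x.min(), x.max())            # chaotic oscillation of the membrane potential
```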
Introducing a frequency factor f for the system variables into system (2) yields Equation (3). Figure 4 illustrates the bifurcation diagram and Lyapunov exponents over the range of f from 0 to 5. When f lies in the intervals (0, 0.84) and (3.15, 5), the system is in a period-1 state. In the interval (0.84, 2.05), the system exhibits chaotic behavior. For f in (2.05, 2.23), the system is in a period-4 state, and for f in (2.23, 3.15), it is in a period-2 state. The phase trajectories for different values of f are illustrated in Figure 5, and the system’s state with varying frequency f is summarized in Table 1. Meanwhile, Figure 4 and Figure 5 show that as the frequency increases, the system’s nonlinear feedback weakens while the linear feedback strengthens.
\begin{cases}
\dot{x} = f\,\big( y - a x^{3} + b x^{2} + I - k\,(0.5 - 2\tanh(z))\,x \big) \\
\dot{y} = f\,( c - d x^{2} - y ) \\
\dot{z} = 0.5 x - 0.1 z
\end{cases}

3. The Impact of Synaptic Coupling Strength

Under different electrophysiological stimulations, activation of synaptic function can induce self-adaptation, thereby developing synaptic plasticity in biological neurons [35,36,37]. In fact, controllability and effective energy exchange imply that the coupling strength should be adjustable so that external energy can be absorbed effectively. The synaptic coupling strength is represented by k in Equation (2); when it is altered, the firing behavior of the neuron changes.
The parameters are held constant as a = 1, b = 3, c = 1, and d = 5, with initial conditions (0.1, 0.1, 0.1). The synaptic coupling strength k varies within the range [0, 1.4]. Figure 6 presents the bifurcation diagram and the Lyapunov exponents. When k is varied in (0.65, 1.1), the positive maximum Lyapunov exponent signifies the chaotic state of the system, in agreement with the bifurcation diagram. As the synaptic strength varies, the firing pattern of the system changes. When k = 0.1, as depicted in Figure 7a, single-spike firing is exhibited. When k = 0.4, as illustrated in Figure 7b, a two-spike firing pattern emerges. When k = 0.6, as shown in Figure 7c, a four-spike firing pattern emerges. When k = 1, as shown in Figure 7d, chaotic firing is observed. When k = 1.1, as seen in Figure 7e, three-spike firing is presented. Finally, when k = 1.4, as displayed in Figure 7f, single-spike firing is exhibited again.
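The bifurcation behavior summarized in Figure 6 can be reproduced in outline with a sketch like the one below (Python/SciPy assumed; sampling the local maxima of x after a transient is one common way to build such a diagram and is an illustrative choice, not necessarily the authors' exact procedure).

```python
# Sketch: bifurcation of system (2) versus synaptic strength k, from local maxima of x.
import numpy as np
from scipy.integrate import solve_ivp

a, b, c, d, I = 1.0, 3.0, 1.0, 5.0, 0.0

def rhs(t, s, k):
    x, y, z = s
    W = 0.5 - 2.0 * np.tanh(z)
    return [y - a*x**3 + b*x**2 + I - k*W*x, c - d*x**2 - y, 0.5*x - 0.1*z]

for k in np.linspace(0.0, 1.4, 29):
    sol = solve_ivp(rhs, (0, 300), [0.1, 0.1, 0.1], args=(k,), max_step=0.02)
    x = sol.y[0][sol.t > 150]                                  # discard the transient
    peaks = x[1:-1][(x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])]    # local maxima of x(t)
    # one distinct peak value -> period-1, two -> period-2, a spread of values -> chaos
    print(round(k, 2), np.unique(np.round(peaks, 2)))
```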

4. Dynamic Symmetrization

The introduction of the offset function shifts the chaotic attractor in phase space. Since |x| − e = |−x| − e, the absolute value function can increase the number of attractors. To ensure that the dynamical characteristics of the chaotic attractors in the negative-x region remain consistent with those in the positive-x region, the signum function is introduced for polarity balance, which enables attractor doubling. Let x → |x| − e and introduce sgn(x) into the first dimension; system (2) is then modified to:
\begin{cases}
\dot{x} = \operatorname{sgn}(x)\,\big( y - a(|x| - e)^{3} + b(|x| - e)^{2} + I - (0.5 - 2\tanh(z))(|x| - e) \big) \\
\dot{y} = c - d(|x| - e)^{2} - y \\
\dot{z} = 0.5(|x| - e) - 0.1 z
\end{cases}
When the offset distance e is large, the two coexisting attractors lie far apart from each other, and a smaller e brings them closer together. When e is sufficiently small, less than or equal to 1.4, the two coexisting attractors merge. From Figure 8b, it can be observed that the density of phase trajectories in the positive and negative half-spaces is unbalanced, indicating that this attractor is actually a pseudo-double-scroll attractor. After attractor doubling, the system has an equilibrium point (0, 1 − 5e², −5e). When a = 1, b = 3, c = 1, d = 5, I = 0, e = 1.4, and IC = (0.1, 0.1, 0.1), a pseudo-multi-scroll attractor is generated, as shown in Figure 8a, and the corresponding membrane potential is shown in Figure 8b. When e = 2, the pseudo-multi-scroll attractor separates into two independent attractors, as shown in Figure 8c, and the corresponding membrane potential is shown in Figure 8d, where the green waveform represents the depolarization state of the neuron discharge and the yellow waveform represents the hyperpolarization state.
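The doubling construction of Equation (4) can be checked numerically with a sketch of the following kind (again Python/SciPy assumed); the two mirrored initial conditions correspond to those used for Figure 8.

```python
# Sketch: coexisting attractors of the symmetrized system (4);
# sgn(x) and |x| - e implement the polarity balance and offset described above.
import numpy as np
from scipy.integrate import solve_ivp

a, b, c, d, I, e = 1.0, 3.0, 1.0, 5.0, 0.0, 2.0    # e = 2: two separate coexisting attractors

def doubled_hr(t, s):
    x, y, z = s
    u = abs(x) - e                                  # offset-shifted membrane potential
    W = 0.5 - 2.0 * np.tanh(z)
    return [np.sign(x) * (y - a*u**3 + b*u**2 + I - W*u),
            c - d*u**2 - y,
            0.5*u - 0.1*z]

for ic in ([0.1, 0.1, 0.1], [-0.1, 0.1, 0.1]):      # symmetric pair of initial conditions
    sol = solve_ivp(doubled_hr, (0, 300), ic, max_step=0.01)
    # the two trajectories should occupy opposite half-spaces (cf. Figure 8c,d)
    print("IC", ic, "-> mean x:", sol.y[0][sol.t > 100].mean())
```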
Attractor doubling can also be achieved simultaneously in the x and y dimensions. In this case, two offset boosting knobs e and f are required, and the system is transformed into:
\begin{cases}
\dot{x} = \operatorname{sgn}(x)\,\big( (|y| - f) - a(|x| - e)^{3} + b(|x| - e)^{2} + I - (0.5 - 2\tanh(z))(|x| - e) \big) \\
\dot{y} = \operatorname{sgn}(y)\,\big( c - d(|x| - e)^{2} - (|y| - f) \big) \\
\dot{z} = 0.5(|x| - e) - 0.1 z
\end{cases}
After attractor doubling, the system has the equilibrium point (0, 0, −5e). In the x and y dimensions, the newly introduced signum functions, in conjunction with two offset boosting parameters, collectively contribute to the formation of a pseudo-four-scroll attractor. Reducing parameter e induces the connection of attractors along the x dimension, yielding two pseudo-double-scroll attractors. Similarly, a decrease in parameter f facilitates the linking of the two pseudo-double-scroll attractors, culminating in a pseudo-four-scroll attractor, exemplified in Figure 9a. The corresponding waveforms are shown in Figure 9d–f. The phenomenon of attractor doubling hinges on restructuring the system. By incorporating two signum functions and two offset boosters, it becomes possible to engender up to four concurrent attractors, as depicted in Figure 9c. To generate more coexisting attractors, it is necessary to repetitively introduce signum functions and substitute absolute value functions.

5. Circuit Simulation

To verify the accuracy of the attractor doubling design, further validation is conducted through circuit simulation based on Multisim. To realize system (4), we devise a circuit configuration illustrated in Figure 10, accompanied by the corresponding circuit equations outlined below:
\begin{cases}
\dot{x} = \dfrac{1}{R_{22}C_{2}}\operatorname{sgn}(x)\left( \dfrac{R_{21}}{R_{17}}\,y - \dfrac{R_{21}}{R_{18}}(|x| - m)^{3} + \dfrac{R_{21}}{R_{19}}(|x| - m)^{2} - \dfrac{R_{21}}{R_{20}}\left( V_{1} - \dfrac{R_{16}}{R_{14}}\tanh(z) \right)(|x| - m) \right) \\
\dot{y} = V_{2} - \dfrac{1}{R_{26}C_{3}}(|x| - m)^{2} - \dfrac{1}{R_{27}C_{3}}\,y \\
\dot{z} = \dfrac{1}{R_{1}C_{1}}(|x| - m) - \dfrac{1}{R_{2}C_{1}}\,z
\end{cases}
The circuit in Figure 10 incorporates memristor modules and system construction modules. The component parameters are determined based on the time scale of the attractor displayed on the oscilloscope: C1 = C2 = C3 = 10 nF; R1 = 20 kΩ; R2 = R23 = R24 = R28 = R29 = R30 = R31 = R32 = R33 = R34 = R38 = R39 = R42 = R43 = 100 kΩ; R3 = R10 = R11 = R12 = R13 = R15 = R16 = R17 = R18 = R20 = R21 = R22 = R25 = R27 = R35 = R36 = R37 = R41 = 10 kΩ; R4 = 520 Ω; R5 = R6 = 1 kΩ; R7 = R8 = R26 = 2 kΩ; R9 = 9.8 kΩ; R14 = 7 kΩ; R19 = 3 kΩ; R40 = 140 kΩ; V1 = V2 = 1 V. The circuit realization for attractor doubling in the x dimension is displayed in Figure 11.
To instantiate system (5), we devise a circuit configuration depicted in Figure 12, along with the corresponding circuit equations presented below:
\begin{cases}
\dot{x} = \dfrac{1}{R_{22}C_{2}}\operatorname{sgn}(x)\left( \dfrac{R_{21}}{R_{17}}(|y| - n) - \dfrac{R_{21}}{R_{18}}(|x| - m)^{3} + \dfrac{R_{21}}{R_{19}}(|x| - m)^{2} - \dfrac{R_{21}}{R_{20}}\left( V_{1} - \dfrac{R_{16}}{R_{14}}\tanh(z) \right)(|x| - m) \right) \\
\dot{y} = \dfrac{1}{R_{27}C_{3}}\operatorname{sgn}(y)\left( \dfrac{R_{26}}{R_{23}} - \dfrac{R_{26}}{R_{24}}(|x| - m)^{2} - \dfrac{R_{26}}{R_{25}}(|y| - n) \right) \\
\dot{z} = \dfrac{1}{R_{1}C_{1}}(|x| - m) - \dfrac{1}{R_{2}C_{1}}\,z
\end{cases}
The circuit in Figure 12 incorporates memristor modules and system construction modules. The component parameters are determined based on the time scale of the attractor displayed on the oscilloscope: C1 = C2 = C3 = 10 nF; R1 = 20 kΩ; R2 = R28 = R29 = R30 = R31 = R32 = R36 = R37 = R40 = R41 = R42 = R43 = R44 = R45 = R46 = R50 = R51 = R54 = R55 = 100 kΩ; R3 = R10 = R11 = R12 = R13 = R15 = R16 = R17 = R18 = R20 = R21 = R22 = R23 = R25 = R26 = R27 = R33 = R34 = R35 = R39 = R47 = R48 = R49 = R53 = 10 kΩ; R4 = 520 Ω; R5 = R6 = 1 kΩ; R7 = R8 = R24 = 2 kΩ; R9 = 9.8 kΩ; R14 = 6.5 kΩ; R19 = 3.33 kΩ; R38 = R52 = 140 kΩ; V1 = V2 = 1 V. The circuit realization for attractor doubling in the x–y dimensions is displayed in Figure 13.

6. The Application in Image Encryption

6.1. DNA Coding

In this article, color images are encoded using DNA coding rules. Leveraging the extensive parallelism and remarkable information density inherent in DNA molecules, integration with chaos techniques enables the creation of robust and efficient encryption protocols with enhanced security. DNA molecules consist of four nucleic acid bases: adenine (A), guanine (G), cytosine (C), and thymine (T), with A pairing with T and G pairing with C. Correspondingly, in binary notation, 0 and 1 are complementary, where 00 pairs with 11 and 01 pairs with 10. By assigning A, T, C, and G to encode 00, 01, 10, and 11, respectively, there are twenty-four encoding combinations. However, only eight of these combinations adhere to the Watson–Crick complementary pairing principle, as delineated in Table 2.
According to DNA coding rules, image pixels are processed, and chaos sequences are used to determine the rules for encoding and decoding.
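As an illustration of how a pixel byte maps to a DNA strand under one of the eight rules, a short sketch follows (Python assumed; the rule table is transcribed from Table 2).

```python
# Sketch: DNA-encode/decode one 8-bit pixel value under a rule from Table 2.
# Each rule assigns the 2-bit pairs 00/01/10/11 to the bases A, T, G, C.
DNA_RULES = {                       # rule number -> {2-bit pair: base}
    1: {'00': 'A', '11': 'T', '10': 'G', '01': 'C'},
    2: {'00': 'A', '11': 'T', '01': 'G', '10': 'C'},
    3: {'11': 'A', '00': 'T', '10': 'G', '01': 'C'},
    4: {'11': 'A', '00': 'T', '01': 'G', '10': 'C'},
    5: {'01': 'A', '10': 'T', '00': 'G', '11': 'C'},
    6: {'10': 'A', '01': 'T', '00': 'G', '11': 'C'},
    7: {'01': 'A', '10': 'T', '11': 'G', '00': 'C'},
    8: {'10': 'A', '01': 'T', '11': 'G', '00': 'C'},
}

def dna_encode(pixel: int, rule: int) -> str:
    bits = format(pixel, '08b')                                    # 8-bit binary string
    return ''.join(DNA_RULES[rule][bits[i:i + 2]] for i in range(0, 8, 2))

def dna_decode(strand: str, rule: int) -> int:
    inv = {base: pair for pair, base in DNA_RULES[rule].items()}   # invert the chosen rule
    return int(''.join(inv[b] for b in strand), 2)

print(dna_encode(180, 1), dna_decode(dna_encode(180, 1), 1))       # 'GTCA' 180 (round trip)
```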

6.2. Encryption Process

The specific encryption steps are outlined below:
Step 1: The original image is padded and segmented. The image is padded in both its rows and columns so that its dimensions satisfy the conditions of Equations (7) and (8), with the padded pixels assigned a grayscale value of 0. The image is then uniformly segmented into blocks of size d × d and combined with a random matrix.
\mathrm{mod}(M, t) = M_{0}
\mathrm{mod}(N, t) = N_{0}
Step 2: Based on the characteristics of the logistic map, the parameter is set to µ = 3.9999, at which the map is in a near-optimal chaotic regime. The initial value ε is computed according to Equation (10). Continuous iteration of the logistic map then yields a one-dimensional sequence {pi} (i = 1, 2, …, M × N + 1000) of length M × N + 1000; to enhance the randomness of the sequence, the first 1000 elements are discarded, leaving the subsequence {pi} (i = 1001, 1002, …, M × N + 1000).
\varepsilon = \frac{\sum_{x,y}\big( I_{1}(x,y) + I_{2}(x,y) \big)}{255 \times M \times N \times 2}
Step 3: The one-dimensional sequence is transformed into a two-dimensional random matrix. All sequence elements are normalized to lie within [0, 255] using Equation (11), and Equation (12) is employed to convert the one-dimensional sequence into an M × N order two-dimensional random matrix Q.
p_{i}' = \mathrm{mod}\big( \mathrm{round}(p_{i} \times 10^{4}),\ 256 \big)
Q = \mathrm{reshape}(p_{i}',\ N,\ M)
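Steps 2 and 3 can be sketched as follows (Python/NumPy assumed). The iteration count and the discarding of the first 1000 values follow the text; the logistic map form p_{i+1} = μ p_i (1 − p_i) is the standard one the text refers to, and the row-major reshape is an illustrative simplification of Equation (12).

```python
# Sketch of Steps 2-3: build the random matrix Q from a logistic-map sequence.
import numpy as np

def random_matrix(mu: float, eps: float, M: int, N: int) -> np.ndarray:
    n = M * N + 1000
    p = np.empty(n)
    p[0] = eps
    for i in range(1, n):
        p[i] = mu * p[i - 1] * (1.0 - p[i - 1])    # logistic map iteration
    p = p[1000:]                                   # discard the first 1000 values
    p = np.mod(np.round(p * 1e4), 256)             # Eq. (11): map the sequence into [0, 255]
    return p.reshape(N, M).astype(np.uint8)        # Eq. (12): reshape into a 2D matrix Q

Q = random_matrix(3.9999, 0.5764, 512, 512)        # mu and eps as listed in Table 3
print(Q.shape, Q.min(), Q.max())
```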
Step 4: The chaotic system’s initial values, denoted as x0, y0, and z0, are established, after which the system’s equations are solved utilizing the ode45 algorithm. This yields three chaotic sequences, from which the first 3001 items are removed to enhance randomness, resulting in {xi}, {yi}, {zi}.
Step 5: The DNA encoding methodology is defined through a selection process. The sets {xi} and {yi} individually specify the DNA encoding techniques for the image matrix and the random matrix blocks.
Step 6: The choice of DNA operations is guided by {zi}, which regulates the operations between the image matrix and the random matrix. The method for selecting DNA operations unfolds as follows:
z_{i}' = \mathrm{mod}\big( \mathrm{round}(z_{i} \times 10^{4}),\ 4 \big)
The DNA operation applied between corresponding pixels in the ith block of the image matrix and the ith block of the random matrix is selected as follows: z′i = 0, 1, 2, and 3 correspond to addition, subtraction, XOR, and XNOR, respectively.
Step 7: The association between the encryption of the current image block and that of the preceding image block is also determined by the chaotic sequence {zi}. As an illustration, when z′i = 0, the encryption outcome ci of the ith block is expressed as follows:
c_{i} = c_{i-1} + I_{i} + R_{i}
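A compact sketch of Steps 6 and 7 is shown below (Python/NumPy assumed). DNA addition and subtraction are modeled here as modulo-4 arithmetic on the 2-bit base codes, and XOR/XNOR as bitwise operations on those codes; this is a common convention used for illustration, not necessarily the paper's exact operation tables.

```python
# Sketch of Steps 6-7: pick a DNA operation per block from {z_i} and chain the blocks.
import numpy as np

def dna_op(code: int, A: np.ndarray, B: np.ndarray) -> np.ndarray:
    if code == 0:                      # addition
        return (A + B) % 4
    if code == 1:                      # subtraction
        return (A - B) % 4
    if code == 2:                      # XOR of 2-bit codes
        return A ^ B
    return (A ^ B) ^ 3                 # XNOR of 2-bit codes

def encrypt_blocks(image_blocks, random_blocks, z):
    codes = np.mod(np.round(np.asarray(z) * 1e4), 4).astype(int)   # z'_i selects the operation
    cipher, prev = [], np.zeros_like(image_blocks[0])
    for i, (I_i, R_i) in enumerate(zip(image_blocks, random_blocks)):
        # e.g. for code 0 this realizes c_i = c_{i-1} + I_i + R_i, as in the equation above
        c_i = dna_op(codes[i], dna_op(codes[i], I_i, R_i), prev)
        cipher.append(c_i)
        prev = c_i
    return cipher

rng = np.random.default_rng(1)
blocks = [rng.integers(0, 4, (4, 4)) for _ in range(3)]            # toy 2-bit-coded blocks
print(encrypt_blocks(blocks, blocks, [0.0123, 0.4567, 0.8911])[0])
```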
Figure 14a displays the original image (512 × 512) before encryption. The control parameters and initial value of the logistic system are μ = 3.9999 and ε = 0.5764. The parameters of system (4) are set as a = 1, b = 3, c = 1, d = 5, I = 0, e = 1.4, with initial values (0.1, 0.1, 0.1). The complete key is listed in Table 3. The encrypted image is shown in Figure 14b, with pixel values distributed in a noise-like manner after encryption. Figure 14c shows the decrypted image, which is identical to the original image with zero pixel-wise difference, confirming that decryption is lossless.

6.3. Performance Analysis

6.3.1. Encryption Efficiency Analysis

Encryption time measures the duration of the encryption algorithm’s execution and is an important metric for evaluating encryption algorithms. While maintaining security, improving encryption speed is crucial for image encryption. The actual encryption rate of different algorithms is influenced by numerous factors, such as the number of iterations, the programming method, and the software environment. Table 4 presents the measured encryption times for images of different sizes. The comparison of encryption time between the proposed algorithm and other encryption algorithms is shown in Table 5, demonstrating that the proposed algorithm has better encryption efficiency.

6.3.2. National Institute of Standards and Technology (NIST) Test

The NIST suite is a rigorous, standardized set of tests used to assess the statistical properties of pseudo-random sequences generated by chaotic systems. A sequence is evaluated against the 15 test criteria defined by NIST; passing them indicates good randomness and offers a certain level of security for encryption purposes. A 10^8-bit sequence is selected for testing; in the test results, a p-value above 0.01 means the sequence passes the corresponding test, and otherwise it fails. The NIST tests were applied to the chaotic sequences used in the image encryption algorithm, and the results are shown in Table 6. The results indicate that the proposed chaotic sequences pass all tests in the NIST suite, implying that the chaotic pseudo-random sequences exhibit excellent statistical randomness.

6.3.3. TESTU01 Test

Compared with NIST testing, TESTU01 provides a more stringent statistical assessment. TESTU01 comprises seven test suites: SmallCrush, Crush, BigCrush, Alphabit, Rabbit, PseudoDIEHARD, and FIPS-140-2. The data lengths are about 6 Gb for SmallCrush, 1 Tb for Crush, 10 Tb for BigCrush, 1 Gb each for Alphabit and Rabbit, 5 Gb for PseudoDIEHARD, and 19 Kb for FIPS-140-2. The test results are shown in Table 7. Under this more stringent testing, the proposed random number sequences passed SmallCrush, Alphabit, and FIPS-140-2 completely, while a small portion of the Crush, BigCrush, Rabbit, and PseudoDIEHARD tests did not pass. This indicates that the proposed chaotic sequences exhibit good randomness, although there is still room for improvement.

6.4. Security Performance Analysis

6.4.1. Key Space

The key space is the total number of possible keys. A good encryption scheme’s key space should be sufficiently large to resist brute-force attacks: the larger the key space, the longer a brute-force attack takes, and the more effectively the encryption algorithm is protected. This algorithm uses the key {μ, ε, x0, y0, z0, M0, N0, xx0, xx1}, and on a 64-bit computer the floating-point precision is 10^−16. Therefore, the key space can reach (10^16)^9 = 10^144. Compared with other schemes, the key space of this algorithm is larger, providing higher security, as shown in Table 8. Therefore, this algorithm has strong resistance against brute-force attacks.

6.4.2. Key Sensitivity Analysis

Key sensitivity refers to the degree to which an encryption algorithm is affected by changes in the key. An excellent encryption scheme shows significant differences in the encryption result even for extremely slight changes in the key, and even minor alterations to the initial key will affect the final encryption and decryption results. The higher the key sensitivity, the greater the difference between the ideal image and the actual decrypted image; only the fully matching key sequence can restore the plaintext image completely. Figure 15 shows the key sensitivity experiment results of the proposed encryption scheme. Each initial key value is perturbed by a slight change of 10^−16, and the resulting decryption results are shown in Figure 15b–j. The encryption system is highly sensitive to the key, as the original image cannot be recovered after these minor modifications, demonstrating strong key sensitivity.

6.4.3. Histogram

The histogram provides a direct view of the pixel counts and grayscale distribution within an image; the original image’s histogram is markedly uneven. To thwart attempts at extracting original image details via statistical analysis, the histogram distribution must be altered so that no image-related information is inadvertently disclosed. Hence, the histogram of the encrypted image should be made as uniform as possible to guarantee the confidentiality and security of the image data. Figure 16a shows the histograms of the original color image, while Figure 16b shows those of the encrypted version. The histograms of the encrypted image are distributed much more uniformly, offering protection against statistical theft of image information.

6.4.4. Information Entropy

The resistance of the encryption system against attacks can be effectively assessed based on the value of information entropy. When information entropy rises, the color image’s pixel distribution becomes more evenly spread, enhancing its resilience against attacks. If we denote the information source as x, the formula for computing information entropy G(x) is expressed as follows:
G(x) = -\sum_{j=1}^{M} p(x_{j}) \log_{2} p(x_{j})
where M denotes the number of grayscale levels in the image, xj signifies a specific grayscale value, and p(xj) is the frequency with which that grayscale value occurs in the image. For an image in which all 256 grayscale values occur with equal frequency p(xj) = 1/256, the information entropy is 8 bits. Accordingly, if the information entropy of an encrypted color image approaches 8, the encryption applied to the color image is highly effective. Table 9 presents the information entropy values of the image before and after encryption using system (4). Notably, the entropy after encryption approaches 8, and the comparison of the information entropy between this algorithm and other algorithms is shown in Table 10. This demonstrates the effectiveness of this encryption method in resisting external attacks.
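A short sketch of this entropy computation follows (Python/NumPy assumed), applied to a single image channel.

```python
# Sketch: information entropy G(x) of one image channel, following the formula above.
import numpy as np

def channel_entropy(channel: np.ndarray) -> float:
    hist = np.bincount(channel.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                      # frequency p(x_j) of each grey level
    p = p[p > 0]                               # empty bins contribute 0 to the sum
    return float(-(p * np.log2(p)).sum())      # G(x) = -sum_j p(x_j) log2 p(x_j)

rng = np.random.default_rng(0)
noise = rng.integers(0, 256, (512, 512), dtype=np.uint8)
print(channel_entropy(noise))                  # close to the ideal value of 8
```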

6.4.5. Correlation

In standard images, there tends to be a significant correlation between pixels and their adjacent pixels in various directions—be it horizontal, vertical, or diagonal. If adjacent pixels in an image exhibit high correlation, the image is susceptible to statistical attacks. This can be exploited for analyzing cryptographic systems. Hence, robust encryption schemes must nullify the correlation among neighboring pixels in images, thus fortifying encryption security through diminished correlation coefficients between the original and encrypted images. To accomplish this task, we randomly select 1000 pairs of adjacent pixels from both the original and encrypted images. Subsequently, we calculate the correlation between adjacent pixels before and after encryption individually in horizontal, vertical, and diagonal directions, following this approach:
H(x) = \frac{1}{M}\sum_{j=1}^{M} x_{j}
B(x) = \frac{1}{M}\sum_{j=1}^{M} \big( x_{j} - H(x) \big)^{2}
\mathrm{cov}(x, y) = \frac{1}{M}\sum_{j=1}^{M} \big( x_{j} - H(x) \big)\big( y_{j} - H(y) \big)
\rho_{xy} = \frac{\mathrm{cov}(x, y)}{\sqrt{B(x)}\,\sqrt{B(y)}}
Table 11 displays the computation results, where cov(x, y) denotes the covariance and B(x) is the mean square deviation. In standard images, the correlation coefficient is ordinarily close to 1. However, the data in the table show that the correlation between adjacent pixels in the encrypted image is drastically reduced, approaching zero after applying the memristive neuron system encryption technique, so the method demonstrates excellent encryption capability. Figure 17 and Figure 18 illustrate correlation plots for the R, G, and B channels of the original and encrypted images in the horizontal, vertical, and diagonal directions. The original image shows noticeable pixel correlation, whereas the encrypted image shows a uniform pixel distribution, indicating that the adjacent-pixel correlations in all three directions approach zero. The encryption method extensively modifies the positions and values of the pixels of the original image, showing strong resistance against statistical attacks.
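The correlation test itself reduces to a few lines; the sketch below (Python/NumPy assumed) samples 1000 horizontally adjacent pixel pairs and evaluates the coefficient defined by H(x), B(x), cov(x, y), and ρxy above. The vertical and diagonal directions only change how the neighbour is chosen.

```python
# Sketch: correlation of 1000 randomly chosen horizontally adjacent pixel pairs.
import numpy as np

def adjacent_correlation(channel: np.ndarray, pairs: int = 1000) -> float:
    rng = np.random.default_rng(0)
    rows = rng.integers(0, channel.shape[0], pairs)
    cols = rng.integers(0, channel.shape[1] - 1, pairs)    # leave room for the right neighbour
    x = channel[rows, cols].astype(float)
    y = channel[rows, cols + 1].astype(float)              # horizontal neighbour
    cov = ((x - x.mean()) * (y - y.mean())).mean()         # cov(x, y)
    return cov / np.sqrt(x.var() * y.var())                # rho = cov / (sqrt(B(x)) sqrt(B(y)))

gradient = np.tile(np.arange(256, dtype=np.uint8), (256, 1))   # strongly correlated test image
print(adjacent_correlation(gradient))                          # close to 1 for a plain image
```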

7. Conclusions

In the HR neuron system, the introduction of the memristor produces various firing patterns, including single-spike, two-spike, three-spike, four-spike, and chaotic firing, effectively simulating the impact of synaptic strength of different intensities on the neuron. The impact of memristor frequency variations on the system’s dynamical behavior is analyzed; an increase in frequency leads to the disappearance of the memristive characteristics, so the memristor and resistor operating regimes of the system are distinguished as the frequency varies. The attractor doubling method is employed to construct coexisting attractors, enriching the dynamic behavior of the neuron system and simulating the discharge states of neurons during the depolarization and hyperpolarization processes. During the excitatory and inhibitory phases of a neuron, when the changes in input current and membrane potential are balanced, the membrane potential exhibits symmetry in its temporal evolution, and the attractor doubling method reproduces this symmetric behavior. The various symmetries present in the organization and function of neurons highlight the significance of studying neuron symmetry, and the application of attractor doubling to neurons provides a new avenue for exploring the diversity and symmetry within the brain itself. Applying the symmetrical system to image encryption achieves excellent encryption performance. In this paper, we conducted analyses and comparisons with existing algorithms in terms of running speed, randomness, key space, key sensitivity, and statistical properties. The results show that the algorithm performs well in encryption and ensures the security of image encryption. However, the chaotic sequences designed in this paper have not yet achieved a 100% pass rate in the stringent TESTU01 tests, and further study should be undertaken to improve this performance.

Author Contributions

K.H. performed the numerical simulations and the experiments, as well as most of the analysis, data acquisition and processing, and writing. C.L. supported and conceived the experiments and correspondence to this article. Y.L., T.L. and H.F. contributed to parts of the analysis and discussion. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the National Natural Science Foundation of China under Grant 62371242, and the Postgraduate Research & Practice Innovation Program of Jiangsu Province under Grant KYCX24_1497.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Carro-Pérez, I.; Sánchez-López, C.; González-Hernández, H.G. Experimental verification of a memristive neural network. Nonlinear Dyn. 2018, 93, 1823–1840. [Google Scholar] [CrossRef]
  2. Zhang, L.; Jin, W.; An, X. Energy evolution in function neuronal network under different coupling channels. Nonlinear Dyn. 2024, 112, 8581–8602. [Google Scholar] [CrossRef]
  3. Sun, J.; Zhai, Y.; Liu, P.; Wang, Y. Memristor-Based Neural Network Circuit of Associative Memory with Overshadowing and Emotion Congruent Effect. IEEE Trans. Neural Netw. Learn. Syst. 2024, 1–13. [Google Scholar] [CrossRef] [PubMed]
  4. Lai, Q.; Lai, C.; Zhang, H.; Li, C. Hidden coexisting hyperchaos of new memristive neuron model and its application in image encryption. Chaos Solitons Fractals 2022, 158, 112017. [Google Scholar] [CrossRef]
  5. Bao, B.; Hu, A.; Xu, Q.; Bao, H.; Wu, H.; Chen, M. AC-induced coexisting asymmetric bursters in the improved Hindmarsh–Rose model. Nonlinear Dyn. 2018, 92, 1695–1706. [Google Scholar] [CrossRef]
  6. Lin, H.; Wang, C.; Sun, Y.; Yao, W. Firing multistability in a locally active memristive neuron model. Nonlinear Dyn. 2020, 100, 3667–3683. [Google Scholar] [CrossRef]
  7. Li, F.; Chen, Z.; Zhang, Y.; Bai, L.; Bao, B. Cascade tri-neuron hopfield neural network: Dynamical analysis and analog circuit implementation. AEU-Int. J. Electron. Commun. 2024, 174, 155037. [Google Scholar] [CrossRef]
  8. Wan, Q.; Chen, S.; Yang, Q.; Liu, J.; Sun, K. Grid multi-scroll attractors in memristive Hopfield neural network under pulse current stimulation and multi-piecewise memristor. Nonlinear Dyn. 2023, 111, 18505–18521. [Google Scholar] [CrossRef]
  9. Lin, H.; Wang, C.; Cui, L.; Sun, Y.; Xu, C.; Yu, F. Brain-like initial-boosted hyperchaos and application in biomedical image encryption. IEEE Trans. Ind. Inform. 2022, 18, 8839–8850. [Google Scholar] [CrossRef]
  10. Sun, J.; Li, C.; Wang, Z.; Wang, Y. A memristive fully connect neural network and application of medical image encryption based on central diffusion algorithm. IEEE Trans. Ind. Inform. 2023, 20, 3778–3788. [Google Scholar] [CrossRef]
  11. Bao, B.; Hu, J.; Bao, H.; Xu, Q.; Chen, M. Memristor-coupled dual-neuron mapping model: Initials-induced coexisting firing patterns and synchronization activities. Cogn. Neurodyn. 2023, 18, 539–555. [Google Scholar] [CrossRef]
  12. Sun, J.; Wang, Y.; Liu, P.; Wen, S. Memristor-based circuit design of PAD emotional space and its application in mood congruity. IEEE Internet Things J. 2023, 10, 16332–16342. [Google Scholar] [CrossRef]
  13. Yuan, F.; Yu, X.; Deng, Y.; Li, Y.; Chen, G. A Cu-Doped TiO2−x Nanoscale Memristor with Application to Heterogeneous Coupled Neurons. IEEE Trans. Ind. Electron. 2023, 71, 9480–9488. [Google Scholar] [CrossRef]
  14. Lai, Q.; Yang, L.; Hu, G.; Guan, Z.H.; Iu, H.H.C. Constructing Multiscroll Memristive Neural Network with Local Activity Memristor and Application in Image Encryption. IEEE Trans. Cybern. 2024, 1–10. [Google Scholar] [CrossRef] [PubMed]
  15. Deng, Z.; Wang, C.; Lin, H.; Sun, Y. A memristive spiking neural network circuit with selective supervised attention algorithm. IEEE Trans. Comput.-Aided Des. Integr. Circuits Syst. 2023, 42, 2604–2617. [Google Scholar] [CrossRef]
  16. Lai, Q.; Wan, Z.; Zhang, H.; Chen, G. Design and analysis of multiscroll memristive hopfield neural network with adjustable memductance and application to image encryption. IEEE Trans. Neural Netw. Learn. Syst. 2023, 34, 7824–7837. [Google Scholar] [CrossRef] [PubMed]
  17. Li, Y.; Li, C.; Lei, T.; Yang, Y.; Chen, G. Offset Boosting-Entangled Complex Dynamics in the Memristive Rulkov Neuron. IEEE Trans. Ind. Electron. 2023, 71, 9569–9579. [Google Scholar] [CrossRef]
  18. Bao, B.; Tang, H.; Su, Y.; Bao, H.; Chen, M.; Xu, Q. Two-Dimensional Discrete Bi-Neuron Hopfield Neural Network with Polyhedral Hyperchaos. IEEE Trans. Circuits Syst. I Regul. Pap. 2024, 1–12. [Google Scholar] [CrossRef]
  19. Vijay, S.D.; Thamilmaran, K.; Ahamed, A.I. Superextreme spiking oscillations and multistability in a memristor-based Hindmarsh–Rose neuron model. Nonlinear Dyn. 2023, 111, 789–799. [Google Scholar] [CrossRef]
  20. Innocenti, G.; Morelli, A.; Genesio, R.; Torcini, A. Dynamical phases of the Hindmarsh-Rose neuronal model: Studies of the transition from bursting to spiking chaos. Chaos Interdiscip. J. Nonlinear Sci. 2007, 17, 043128. [Google Scholar] [CrossRef] [PubMed]
  21. Usha, K.; Subha, P.A. Hindmarsh-Rose neuron model with memristors. Biosystems 2019, 178, 1–9. [Google Scholar]
  22. Sakakibara, A.; Hatanaka, Y. Neuronal polarization in the developing cerebral cortex. Front. Neurosci. 2015, 9, 116. [Google Scholar] [CrossRef]
  23. Kim, C.M.; Nykamp, D.Q. The influence of depolarization block on seizure-like activity in networks of excitatory and inhibitory neurons. J. Comput. Neurosci. 2017, 43, 65–79. [Google Scholar] [CrossRef] [PubMed]
  24. Alexander, T.D.; Muqeem, T.; Zhi, L.; Tymanskyj, S.R.; Covarrubias, M. Tunable action potential repolarization governed by Kv3.4 channels in dorsal root ganglion neurons. J. Neurosci. 2022, 42, 8647–8657. [Google Scholar] [CrossRef] [PubMed]
  25. Liu, C.Y.; Xiao, C.; Fraser, S.E.; Lester, H.A.; Koos, D.S. Electrophysiological characterization of Grueneberg ganglion olfactory neurons: Spontaneous firing, sodium conductance, and hyperpolarization-activated currents. J. Neurophysiol. 2012, 108, 1318–1334. [Google Scholar] [CrossRef] [PubMed]
  26. Innocenti, G.; Marco, M.D.; Tesi, A.; Forti, M. Memristor circuits for simulating neuron spiking and burst phenomena. Front. Neurosci. 2021, 15, 681035. [Google Scholar] [CrossRef] [PubMed]
  27. Li, C.; Gao, Y.; Lei, T.; Li, R.; Xu, Y. Two Independent Offset Controllers in a Three-Dimensional Chaotic System. Int. J. Bifurc. Chaos 2024, 34, 2450008. [Google Scholar] [CrossRef]
  28. Zhang, S.; Zheng, J.; Wang, X.; Zeng, Z. Multi-scroll hidden attractor in memristive HR neuron model under electromagnetic radiation and its applications. Chaos Interdiscip. J. Nonlinear Sci. 2021, 31, 011101. [Google Scholar] [CrossRef] [PubMed]
  29. Vivekanandhan, G.; Natiq, H.; Merrikhi, Y.; Rajagopal, K.; Jafari, S. Dynamical analysis and synchronization of a new memristive Chialvo neuron model. Electronics 2023, 12, 545. [Google Scholar] [CrossRef]
  30. Eagleman, D.M. Visual illusions and neurobiology. Nat. Rev. Neurosci. 2001, 2, 920–926. [Google Scholar] [CrossRef]
  31. Li, C.; Lu, T.; Chen, G.; Xing, H. Doubling the coexisting attractors. Chaos Interdiscip. J. Nonlinear Sci. 2019, 29, 051102. [Google Scholar] [CrossRef]
  32. Tlelo-Cuautle, E.; Díaz-Muñoz, J.D.; González-Zapata, A.M.; Li, R.; León-Salas, W.D.; Fernández, F.V.; Guillén-Fernández, O.; Cruz-Vega, I. Chaotic image encryption using hopfield and hindmarsh–rose neurons implemented on FPGA. Sensors 2020, 20, 1326. [Google Scholar] [CrossRef]
  33. Yildirim, M. DNA encoding for RGB image encryption with memristor based neuron model and chaos phenomenon. Microelectron. J. 2020, 104, 104878. [Google Scholar] [CrossRef]
  34. Hindmarsh, J.L.; Rose, R.M. A model of neuronal bursting using three coupled first order differential equations. Proc. R. Soc. Lond. Ser. B Biol. Sci. 1984, 221, 87–102. [Google Scholar]
  35. Yang, F.; Ma, J. Creation of memristive synapse connection to neurons for keeping energy balance. Pramana 2023, 97, 55. [Google Scholar] [CrossRef]
  36. Yang, F.; An, X. A new discrete chaotic map application in image encryption algorithm. Phys. Scr. 2022, 97, 035202. [Google Scholar] [CrossRef]
  37. Yang, F.; Wang, Y.; Ma, J. An adaptive synchronization approach in a network composed of four neurons with energy diversity. Indian J. Phys. 2023, 97, 2125–2137. [Google Scholar] [CrossRef]
  38. Luo, Y.; Zhou, R.; Liu, J.; Qiu, S.; Cao, Y. An efficient and self-adapting colour-image encryption algorithm based on chaos and interactions among multiple layers. Multimed. Tools Appl. 2018, 77, 26191–26217. [Google Scholar] [CrossRef]
  39. Wang, X.; Guan, N.; Liu, P. A selective image encryption algorithm based on a chaotic model using modular sine arithmetic. Optik 2022, 258, 168955. [Google Scholar] [CrossRef]
  40. Zang, H.; Tai, M.; Wei, X. Image encryption schemes based on a class of uniformly distributed chaotic systems. Mathematics 2022, 10, 1027. [Google Scholar] [CrossRef]
  41. Zhu, S.; Deng, X.; Zhang, W.; Zhu, C. Image encryption scheme based on newly designed chaotic map and parallel DNA coding. Mathematics 2023, 11, 231. [Google Scholar] [CrossRef]
  42. Zhang, Z.; Zhang, J. Parallel multi-image encryption based on cross-plane DNA manipulation and a novel 2D chaotic system. Vis. Comput. 2024. [Google Scholar] [CrossRef]
Figure 1. The processes of cellular depolarization and hyperpolarization.
Figure 2. The symmetry in the organization and functionality of neurons.
Figure 3. Projections on different planes of the HR neuron trajectories and waveforms of system (2) with a = 1, b = 3, c = 1, d = 5 and initial condition (0.1, 0.1, 0.1): (a) y-x plane; (b) x-z plane; (c) signal x(t); (d) signal y(t).
Figure 4. Dynamical evolution of system (3) with a = 1, b = 3, c = 1, d = 5, I = 0, k = 1 and initial condition (0.1, 0.1, 0.1): (a) Bifurcation diagram; (b) Lyapunov exponents.
Figure 5. Different phase trajectories with a = 1, b = 3, c = 1, d = 5, I = 0, k = 1 and initial condition (0.1, 0.1, 0.1): (a) f = 0.5 (1-period state); (b) f = 1 (chaotic state); (c) f = 2 (chaotic state); (d) f = 2.1 (4-period state); (e) f = 2.5 (2-period state); (f) f = 5 (1-period state).
Figure 6. Dynamical evolution of system (2) with a = 1, b = 3, c = 1, d = 5 and initial condition (0.1, 0.1, 0.1): (a) Bifurcation diagram; (b) Lyapunov exponents.
Figure 7. Different firing patterns with a = 1, b = 3, c = 1, d = 5 and initial condition (0.1, 0.1, 0.1): (a) k = 0.1 (single-spike firing); (b) k = 0.4 (two-spike firing); (c) k = 0.6 (four-spike firing); (d) k = 1 (chaotic firing); (e) k = 1.1 (three-spike firing); (f) k = 1.4 (single-spike firing).
Figure 8. Doubled attractors and waveforms in system (4) with a = 1, b = 3, c = 1, d = 5, I = 0 under various control gates: (a) e = 1.4 (pseudo-double-scroll attractor); (b) x(t) (e = 1.4); (c) e = 2 (double coexisting attractors); (d) x(t) (e = 2). Here IC = (0.1, 0.1, 0.1) is green, and IC = (−0.1, 0.1, 0.1) is yellow.
Figure 9. Doubled attractors in system (5) with a = 1, b = 3, c = 1, d = 5, I = 0: (a) e = 1.3, f = 8.5 (pseudo-four-scroll attractor); (b) e = 2, f = 8.5 (two coexisting attractors); (c) e = 2, f = 10 (four coexisting attractors); (d) waveform of signal x(t) (e = 1.3, f = 8.5); (e) waveform of signal x(t) (e = 2, f = 10); (f) waveform of signal y(t) (e = 2, f = 10). Here IC = (0.1, 0.1, 0.1) is green, IC = (−0.1, 0.1, 0.1) is yellow, IC = (−0.1, −0.1, 0.1) is purple, IC = (0.1, −0.1, 0.1) is blue.
Figure 10. Circuit schematic of system (6).
Figure 11. Chaotic attractors and waveforms of system (6): (a) V3 = 1.4 V (pseudo-double-scroll attractor); (b) V3 = 2 V (double coexisting attractors); (c) signal x(t) (V3 = 1.4 V); (d) x(t) (V3 = 2 V).
Figure 12. Circuit schematic of system (7).
Figure 13. Circuit realization of system (7): (a) V3 = 1.3 V, V4 = 8.5 V (pseudo-four-scroll attractor); (b) V3 = 2 V, V4 = 8.5 V (two coexisting attractors); (c) V3 = 2 V, V4 = 10 V (four coexisting attractors); (d) waveform of signal x(t) (V3 = 1.3 V, V4 = 8.5 V); (e) waveforms of signal x(t) (V3 = 2 V, V4 = 10 V); (f) waveforms of signal y(t) (V3 = 2 V, V4 = 10 V). Here IC = (0.1, 0.1, 0.1) is green, IC = (−0.1, 0.1, 0.1) is yellow, IC = (−0.1, −0.1, 0.1) is purple, IC = (0.1, −0.1, 0.1) is blue.
Figure 14. Encryption results: (a) original image; (b) encrypted image; (c) decrypted image.
Figure 15. Image key sensitivity analysis: (a) plaintext image; (b) μ + 10^−16; (c) ε + 10^−16; (d) x0 + 10^−16; (e) y0 + 10^−16; (f) z0 + 10^−16; (g) M0 + 10^−16; (h) N0 + 10^−16; (i) xx0 + 10^−16; (j) xx1 + 10^−16.
Figure 16. Histograms: (a) histograms of the R, G, B channels of the original image; (b) histograms of the R, G, B channels of the encrypted image.
Figure 17. Correlation of adjacent pixels in different directions before encryption in the R, G, B channels of the image: (a–c) horizontally adjacent element correlation plots of the original image’s R, G, B channels; (d–f) vertically adjacent element correlation plots of the original image’s R, G, B channels; (g–i) diagonally adjacent element correlation plots of the original image’s R, G, B channels.
Figure 18. Correlation of adjacent pixels in different directions after encryption in the R, G, B channels of the image: (a–c) correlation plots of horizontally adjacent elements in the encrypted image’s R, G, B channels; (d–f) correlation plots of vertically adjacent elements in the encrypted image’s R, G, B channels; (g–i) correlation plots of diagonally adjacent elements in the encrypted image’s R, G, B channels.
Table 1. The distribution of memristor and resistor operating regimes.

Regime              Range of f         Attractor
Memristor synapse   0 < f < 0.84       period-1
                    0.84 < f < 2       chaotic
Resistor synapse    2 < f < 2.05       chaotic
                    2.05 < f < 2.23    period-4
                    2.23 < f < 3.15    period-2
                    f > 3.15           period-1
Table 2. The DNA encoding method.

Rule   1    2    3    4    5    6    7    8
A      00   00   11   11   01   10   01   10
T      11   11   00   00   10   01   10   01
G      10   01   10   01   00   00   11   11
C      01   10   01   10   11   11   00   00
Table 3. Algorithm key.

Secret key   μ        ε        x0       y0       z0       M0   N0   xx0      xx1
Value        3.9999   0.5764   0.5498   0.4307   0.5847   0    0    0.5908   0.5913
Table 4. Encryption time for images of different sizes.

Image size (pixels)   128 × 128   256 × 256   512 × 512   1024 × 1024
Encryption time (s)   0.50652     0.66031     1.1726      3.5201
Table 5. Comparison of encryption efficiency.

Encryption algorithm   Image size (pixels)   Encryption time (s)
[38]                   512 × 512             1.8876
[39]                   512 × 512             1.3105
[40]                   512 × 512             3.893
This work              512 × 512             1.1726
Table 6. NIST test results.

No.   Statistical test term        PRNG from x         PRNG from y         PRNG from z
                                   Prop.   p-value     Prop.   p-value     Prop.   p-value
01    Frequency                    0.99    0.906069    1       0.448424    0.98    0.056069
02    Block frequency              0.96    0.654467    0.97    0.221029    0.96    0.152754
03    Cumulative sums              0.99    0.731886    1       0.599693    0.99    0.715679
04    Runs                         0.99    0.363593    0.99    0.632955    1       0.089843
05    Longest run                  1       0.433590    0.99    0.911413    0.99    0.542228
06    Rank                         0.98    0.522955    0.98    0.108791    0.98    0.221154
07    FFT                          0.98    0.049346    0.99    0.122325    0.98    0.858002
08    Non-overlapping template     1       0.983453    1       0.122325    1       0.948298
09    Overlapping template         0.98    0.641284    0.99    0.058984    1       0.331408
10    Universal                    1       0.287712    1       0.201568    1       0.215248
11    Approximate entropy          0.96    0.232771    0.99    0.151622    0.97    0.125698
12    Random excursions            1       0.804337    1       0.834308    1       0.671779
13    Random excursions variant    1       0.804337    1       0.534146    1       0.931952
14    Serial                       0.98    0.437754    0.98    0.051942    0.97    0.112698
15    Linear complexity            0.99    0.307077    0.99    0.241741    0.99    0.779188
Table 7. TESTU01 test results.

Test suite      Number of tests   Test results
SmallCrush      15                15/15
Crush           144               140/144
BigCrush        160               152/160
Alphabit        17                17/17
Rabbit          40                39/40
PseudoDIEHARD   126               124/126
FIPS-140-2      16                16/16
Table 8. The comparison of key space in image encryption algorithms.

Algorithm   [39]     [41]     [42]     This work
Key space   2^376    10^75    2^476    10^144
Table 9. The information entropy of the R, G, B channels of the original and encrypted images.

Channel     The original image   The encrypted image
Channel R   7.2532               7.9995
Channel G   7.2442               7.9996
Channel B   7.2649               7.9995
Table 10. The comparison of information entropy of the R, G, B channels in image encryption algorithms.

Encryption algorithm   Channel R   Channel G   Channel B
[39]                   7.9993      7.9993      7.9993
[41]                   7.9992      7.9993      7.9992
[42]                   7.9994      7.9994      7.9994
This work              7.9995      7.9996      7.9995
Table 11. Calculation results of the correlation between adjacent pixels.

Image             Channel   Horizontal correlation   Vertical correlation   Diagonal correlation
Original image    R         0.88269                  0.88049                0.82386
                  G         0.87064                  0.86952                0.80557
                  B         0.88637                  0.8869                 0.83126
Encrypted image   R         0.0075872                −0.0033409             −0.012801
                  G         0.0064057                0.0042039              −0.022798
                  B         0.0045693                −0.00096852            −0.0071015