Article

Synchronization of a Class of Fractional-Order Chaotic Neural Networks

Liping Chen, Jianfeng Qu, Yi Chai, Ranchao Wu and Guoyuan Qi
1 School of Automation, Chongqing University, Chongqing 400044, China
2 School of Mathematics, Anhui University, Hefei 230039, China
3 Department of Electrical Engineering, Tshwane University of Technology, Pretoria 0001, South Africa
* Author to whom correspondence should be addressed.
Entropy 2013, 15(8), 3265-3276; https://doi.org/10.3390/e15083265
Submission received: 5 June 2013 / Revised: 3 August 2013 / Accepted: 5 August 2013 / Published: 14 August 2013
(This article belongs to the Special Issue Dynamical Systems)

Abstract
The synchronization problem is studied in this paper for a class of fractional-order chaotic neural networks. By using the Mittag-Leffler function, the M-matrix and linear feedback control, a sufficient condition is developed ensuring the synchronization of such neural models with Caputo fractional derivatives. The synchronization condition is easy to verify and implement, and relies only on the system structure. Furthermore, the theoretical results are applied to a typical fractional-order chaotic Hopfield neural network, and numerical simulations demonstrate the effectiveness and feasibility of the proposed method.

1. Introduction

Fractional calculus is a topic with a history of more than 300 years. Although it has a long mathematical history, applications of fractional calculus to physics and engineering have become a focus of interest only recently. Recent monographs and symposia proceedings have highlighted applications of fractional calculus in physics, continuum mechanics, signal processing, bioengineering, diffusion waves and electromagnetics [1,2,3,4]. The major advantage of fractional-order derivatives is that they provide an excellent instrument for describing the memory and hereditary properties of various materials and processes. As such, some researchers have introduced fractional calculus into neural networks to form fractional-order neural networks, which can better describe the dynamical behavior of neurons, such as “memory”. It has been pointed out that fractional derivatives provide neurons with a fundamental and general computational ability that can contribute to efficient information processing, stimulus anticipation and frequency-independent phase shifts of oscillatory neuronal firing [5]. It has been suggested that the oculomotor integrator, which converts eye velocity into eye position commands, may be of fractional order [6]. It has also been demonstrated that neural network approximation taken at the fractional level results in higher rates of approximation [7]. Furthermore, fractional-order recurrent neural networks might be expected to play an important role in parameter estimation. Therefore, the incorporation of memory terms (a fractional derivative or integral operator) into neural network models is an important improvement [8], and it is of great significance to study fractional-order neural networks.
Chaos has been a focus of intensive discussion in numerous fields during the last four decades. Moreover, it has been verified that some neural networks can exhibit chaotic dynamics. For example, experimental and theoretical studies have revealed that a mammalian brain not only can display strange attractors and other transient characteristics in its dynamical behavior for its associative memories, but also can modulate oscillatory neuronal synchronization by selective visual attention [9,10]. In recent years, the study of synchronization of chaotic neural networks has attracted considerable attention, due to its potential applications in many fields, including secure communication, parallel image processing, biological systems and information science. As is well known, there are many synchronization results for integer-order neural networks; see [11,12,13] and the references therein. On the other hand, since bifurcations and chaos in fractional-order neural networks were first investigated in [14,15], some important and interesting results on fractional-order neural networks have been obtained. For instance, in [16], a fractional-order Hopfield neural model was proposed, and its stability was investigated by an energy-like function. Chaos and hyperchaos in fractional-order cellular neural networks were discussed in [17]. Yu et al. [18] investigated α-stability and α-synchronization for fractional-order neural networks. Several recent results concerning chaotic synchronization in fractional-order neural networks have been reported in [19,20,21,22].
Due to the complexity of fractional-order systems, to the best of our knowledge, there are few theoretical results on the synchronization of fractional-order neural networks; most existing results rely only on numerical simulations [19,20,21,22]. Although many synchronization results for integer-order neural networks have been obtained in the past few decades, these results and methods cannot be easily extended and applied to the fractional-order case. Therefore, establishing theoretical sufficient criteria for the synchronization of fractional-order neural networks is both necessary and challenging. Motivated by the above discussion, by using the Mittag-Leffler function, some properties of fractional calculus and linear feedback control, a simple and efficient criterion in terms of the M-matrix for the synchronization of such neural networks is derived. Numerical simulations also demonstrate the effectiveness and feasibility of the proposed technique.
The rest of the paper is organized as follows. Some necessary definitions and lemmas are given, and the fractional-order network model is introduced in Section 2. A sufficient criterion ensuring the synchronization of such neural networks is presented in Section 3. An illustrative example and simulation results are provided in Section 4. Finally, the paper is concluded in Section 5.

2. Preliminaries and System Description

In this section, some definitions from fractional calculus are recalled and some useful lemmas are introduced.
Definition 1
[1]. The fractional integral (Riemann-Liouville integral) D_{t_0,t}^{-\alpha} with fractional order α ∈ R^+ of function x(t) is defined as:
D_{t_0,t}^{-\alpha} x(t) = \frac{1}{\Gamma(\alpha)} \int_{t_0}^{t} (t-\tau)^{\alpha-1} x(\tau)\, d\tau    (1)
where Γ(·) is the gamma function, \Gamma(\tau) = \int_{0}^{\infty} t^{\tau-1} e^{-t}\, dt.
Definition 2
[1]. The Riemann-Liouville derivative of fractional order α of function x(t) is given as:
{}_{RL}D_{t_0,t}^{\alpha} x(t) = \frac{d^n}{dt^n} D_{t_0,t}^{-(n-\alpha)} x(t) = \frac{d^n}{dt^n} \frac{1}{\Gamma(n-\alpha)} \int_{t_0}^{t} (t-\tau)^{n-\alpha-1} x(\tau)\, d\tau    (2)
where n - 1 < α < n ∈ Z^+.
Definition 3
[1]. The Caputo derivative of fractional order α of function x(t) is defined as follows:
{}_{C}D_{t_0,t}^{\alpha} x(t) = D_{t_0,t}^{-(n-\alpha)} \frac{d^n}{dt^n} x(t) = \frac{1}{\Gamma(n-\alpha)} \int_{t_0}^{t} (t-\tau)^{n-\alpha-1} x^{(n)}(\tau)\, d\tau    (3)
where n - 1 < α < n ∈ Z^+.
Note from Equations (2) and (3) that the fractional derivative is related to the whole history of a function, while the integer-order derivative is related only to its nearby points. That is, the next state of a system depends not only upon its current state, but also upon all of its historical states starting from the initial time. As a result, a model described by fractional-order derivatives possesses memory and heredity and describes the states of neurons more precisely. In the following, the notation D^α denotes the Caputo derivative {}_{C}D_{0,t}^{\alpha}. For x ∈ R^n, the norm is defined by \|x\| = \sum_{i=1}^{n} |x_i|.
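To make this memory property concrete, the following minimal numerical sketch (an illustration added here, not part of the original analysis) approximates the Caputo derivative on a uniform grid with the classical L1 discretization; note how the value at each grid point weights the entire history of x:

```python
import numpy as np
from math import gamma

def caputo_l1(x, t, alpha):
    # L1 approximation of the Caputo derivative D^alpha x(t_k), 0 < alpha < 1,
    # on a uniform grid: the value at t_k uses *all* earlier increments of x,
    # which is exactly the "memory" discussed above.
    h = t[1] - t[0]
    d = np.zeros(len(t))
    c = h**(-alpha) / gamma(2.0 - alpha)
    for k in range(1, len(t)):
        j = np.arange(k)
        w = (k - j)**(1.0 - alpha) - (k - j - 1)**(1.0 - alpha)
        d[k] = c * np.sum(w * (x[j + 1] - x[j]))
    return d

# Sanity check against the exact value D^alpha t = t^(1-alpha) / Gamma(2-alpha):
t = np.linspace(0.0, 2.0, 401)
err = caputo_l1(t, t, 0.5)[1:] - t[1:]**0.5 / gamma(1.5)
print(np.max(np.abs(err)))  # small discretization error
```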
Definition 4
[1]. The Mittag-Leffler function with two parameters is defined as:
E_{\alpha,\beta}(z) = \sum_{k=0}^{\infty} \frac{z^k}{\Gamma(k\alpha+\beta)}    (4)
where α > 0, β > 0 and z ∈ C. When β = 1, one has E_α(z) = E_{α,1}(z); furthermore, E_{1,1}(z) = e^z.
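A direct truncation of the series in Equation (4) is easy to sketch; the helper below is a minimal illustration, adequate only for moderate |z|, and the identity E_{1,1}(z) = e^z gives a convenient sanity check:

```python
from math import gamma

def mittag_leffler(z, alpha, beta=1.0, terms=100):
    # Truncated power series from Equation (4); adequate for moderate |z|.
    # Dedicated algorithms are needed for large arguments.
    return sum(z**k / gamma(alpha * k + beta) for k in range(terms))

print(mittag_leffler(1.0, 1.0))    # E_{1,1}(1) = e = 2.71828...
print(mittag_leffler(-1.0, 0.95))  # the decaying values used later in Lemma 1
```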
Lemma 1.
Let V(t) be a continuous function on [0, +∞) satisfying:
D^{\alpha} V(t) \le -\lambda V(t)    (5)
Then:
V(t) \le V(t_0) E_{\alpha}(-\lambda (t-t_0)^{\alpha})    (6)
where α ∈ (0, 1) and λ is a positive constant.
Proof. 
It follows from Equation (5) that there exists a nonnegative function M(t) such that:
D^{\alpha} V(t) + \lambda V(t) + M(t) = 0    (7)
Taking the Laplace transform of Equation (7), one has:
s^{\alpha} V(s) - s^{\alpha-1} V(t_0) + \lambda V(s) + M(s) = 0    (8)
where V(s) = \mathcal{L}\{V(t)\} and M(s) = \mathcal{L}\{M(t)\}. It then follows that:
V(s) = \frac{s^{\alpha-1} V(t_0) - M(s)}{s^{\alpha} + \lambda}    (9)
Taking the inverse Laplace transform of Equation (9), one obtains:
V(t) = V(t_0) E_{\alpha}(-\lambda (t-t_0)^{\alpha}) - M(t) * \left[ (t-t_0)^{\alpha-1} E_{\alpha,\alpha}(-\lambda (t-t_0)^{\alpha}) \right]    (10)
where * denotes the convolution. Note that both (t-t_0)^{\alpha-1} and E_{\alpha,\alpha}(-\lambda (t-t_0)^{\alpha}) are nonnegative functions; since M(t) is also nonnegative, the convolution term is nonnegative, and it follows that:
V(t) \le V(t_0) E_{\alpha}(-\lambda (t-t_0)^{\alpha})    (11)
□
Lemma 2
[1]. If 0 < α < 2, β is an arbitrary real number, μ is such that πα/2 < μ < min{π, πα} and C is a real constant, then:
|E_{\alpha,\beta}(z)| \le \frac{C}{1+|z|}, \quad \mu \le |\arg(z)| \le \pi, \; |z| \ge 0    (12)
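As a quick numerical illustration of Lemma 2 (again an addition, not from the paper): on the negative real axis one has |arg(z)| = π, so E_α(-x)(1 + x) should stay bounded by some constant C. This can be checked by reusing mittag_leffler() from the sketch above:

```python
# Reuses mittag_leffler() from the sketch above. On the negative real axis
# |arg(z)| = pi, so Lemma 2 predicts E_alpha(-x) * (1 + x) <= C for some C.
for x in (0.0, 1.0, 2.0, 5.0, 10.0):
    e = mittag_leffler(-x, 0.95)
    print(f"x = {x:4.1f}   E(-x)*(1+x) = {e * (1.0 + x):8.4f}")
```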
Definition 5
[23]. A real n × n matrix A = (a_{ij}) is said to be an M-matrix if a_{ij} ≤ 0, i, j = 1, 2, …, n, i ≠ j, and all successive principal minors of A are positive.
Lemma 3
[23]. Let A = (a_{ij}) be an n × n matrix with non-positive off-diagonal elements. Then, the following statements are equivalent:
(1) A is a nonsingular M-matrix;
(2) there exists a vector ξ > 0 such that Aξ > 0;
(3) there exists a vector ξ > 0 such that ξ^T A > 0.
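Definition 5 and Lemma 3 are directly checkable in code. The helpers below are an illustrative sketch: the first tests the M-matrix property via leading principal minors, and the second constructs a positive vector ξ with ξ^T A > 0, which is exactly what the proof of Theorem 1 uses:

```python
import numpy as np

def is_nonsingular_m_matrix(A, tol=1e-12):
    # Definition 5: non-positive off-diagonal entries and positive
    # leading principal minors.
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    if np.any(A - np.diag(np.diag(A)) > tol):
        return False
    return all(np.linalg.det(A[:k, :k]) > tol for k in range(1, n + 1))

def positive_xi(A):
    # Lemma 3(3): for a nonsingular M-matrix, A^{-1} is entrywise
    # nonnegative, so xi = A^{-T} 1 is positive and satisfies xi^T A = 1 > 0.
    A = np.asarray(A, dtype=float)
    xi = np.linalg.solve(A.T, np.ones(A.shape[0]))
    assert np.all(xi > 0) and np.all(xi @ A > 0)
    return xi

M = [[2.0, -1.0], [-1.0, 2.0]]
print(is_nonsingular_m_matrix(M), positive_xi(M))  # True [1. 1.]
```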
The dynamic behavior of a continuous fractional-order cellular neural network can be described by the following system:
D^{\alpha} x_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + I_i    (13)
which can also be written in the following compact form:
D^{\alpha} x(t) = -C x(t) + A f(x(t)) + I    (14)
where i ∈ N = {1, 2, …, n}, t ≥ 0, 0 < α < 1, n is the number of units in the neural network, x(t) = (x_1(t), …, x_n(t))^T ∈ R^n is the state vector at time t, f(x(t)) = (f_1(x_1(t)), …, f_n(x_n(t)))^T denotes the activation functions of the neurons, and C = diag(c_1, …, c_n) represents the rate with which the ith unit resets its potential to the resting state in isolation when disconnected from the network and external inputs. The weight matrix A = (a_{ij})_{n×n} represents the connection strength of the jth neuron to the ith neuron, and I = (I_1, I_2, …, I_n)^T is the external bias vector.
Here, in order to obtain the main results, the following assumption is presented first.
A1. The neuron activation functions f_j are Lipschitz continuous; that is, there exist positive constants L_j (j = 1, 2, …, n) such that:
|f_j(u_j) - f_j(v_j)| \le L_j |u_j - v_j|, \quad \forall u_j, v_j \in \mathbb{R}    (15)

3. Main Results

In this section, a sufficient condition for synchronization of fractional-order neural networks is derived.
Based on the drive-response concept, we refer to system Equation (13) as the drive cellular neural network and consider a response network characterized as follows:
D^{\alpha} y_i(t) = -c_i y_i(t) + \sum_{j=1}^{n} a_{ij} f_j(y_j(t)) + I_i + u_i(t)    (16)
or, equivalently:
D^{\alpha} y(t) = -C y(t) + A f(y(t)) + I + u(t)    (17)
where y(t) = (y_1(t), …, y_n(t))^T ∈ R^n is the state vector of the slave system, C, A and f(·) are the same as in Equation (13), and u(t) = (u_1(t), …, u_n(t))^T is the external control input to be designed later.
Defining the synchronization error signal as e_i(t) = y_i(t) - x_i(t), the error dynamics between the master system Equation (14) and the slave system Equation (17) can be expressed as:
D^{\alpha} e(t) = -C e(t) + A [f(y(t)) - f(x(t))] + u(t)    (18)
where e(t) = (e_1(t), …, e_n(t))^T. Therefore, synchronization between the master system Equation (13) and the slave system Equation (16) is equivalent to the asymptotic stability of the error system Equation (18) with a suitable control law u(t). To this end, the external control input u(t) is defined as u(t) = -K e(t), where K = diag(k_1, …, k_n) is the controller gain matrix. Then, the error system Equation (18) can be rewritten as:
D^{\alpha} e_i(t) = -(c_i + k_i) e_i(t) + \sum_{j=1}^{n} a_{ij} \big( f_j(y_j(t)) - f_j(x_j(t)) \big)    (19)
or can be described in the following compact form:
D^{\alpha} e(t) = -(C + K) e(t) + A \big( f(y(t)) - f(x(t)) \big)    (20)
Theorem 1.
For the master-slave fractional-order chaotic neural networks Equations (14) and (17) satisfying Assumption A1, if the controller gain matrix K is such that (C + K) - |A|L is an M-matrix, where L = diag(L_1, …, L_n), then synchronization between systems Equations (14) and (17) is achieved.
Proof. 
If e_i(t) = 0, then D^α |e_i(t)| = 0. If e_i(t) > 0, then:
D^{\alpha} |e_i(t)| = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{|e_i(s)|'}{(t-s)^{\alpha}}\, ds = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{e_i'(s)}{(t-s)^{\alpha}}\, ds = D^{\alpha} e_i(t)    (21)
Similarly, if e_i(t) < 0, then:
D^{\alpha} |e_i(t)| = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{|e_i(s)|'}{(t-s)^{\alpha}}\, ds = -\frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{e_i'(s)}{(t-s)^{\alpha}}\, ds = -D^{\alpha} e_i(t)    (22)
Therefore, it follows that:
D^{\alpha} |e_i(t)| = \mathrm{sgn}(e_i(t))\, D^{\alpha} e_i(t)    (23)
Since (C + K) - |A|L is an M-matrix, it follows from Lemma 3 that there exists a set of positive constants ξ_i such that:
-(c_i + k_i)\xi_i + \sum_{j=1}^{n} \xi_j |a_{ji}| L_i < 0, \quad i \in N    (24)
Define the functions:
F_i(\theta) = -(c_i + k_i - \theta)\xi_i + \sum_{j=1}^{n} \xi_j |a_{ji}| L_i, \quad i \in N    (25)
Obviously:
F_i(0) = -(c_i + k_i)\xi_i + \sum_{j=1}^{n} \xi_j |a_{ji}| L_i < 0, \quad i \in N    (26)
Since each F_i(θ) is continuous and increasing in θ, there exists a constant λ > 0 such that:
-(c_i + k_i - \lambda)\xi_i + \sum_{j=1}^{n} \xi_j |a_{ji}| L_i \le 0, \quad i \in N    (27)
Consider an auxiliary function defined by V(t) = \sum_{i=1}^{n} \xi_i |e_i(t)|, where the ξ_i (i ∈ N) are chosen as in Equation (27). The Caputo derivative of V(t) along the solutions of system Equation (19) satisfies:
D^{\alpha} V(t) = \sum_{i=1}^{n} \xi_i D^{\alpha} |e_i(t)| = \sum_{i=1}^{n} \xi_i\, \mathrm{sgn}(e_i(t)) \Big\{ -(c_i + k_i) e_i(t) + \sum_{j=1}^{n} a_{ij} \big( f_j(y_j(t)) - f_j(x_j(t)) \big) \Big\}
\le \sum_{i=1}^{n} \xi_i \Big\{ -(c_i + k_i) |e_i(t)| + \sum_{j=1}^{n} |a_{ij}| L_j |e_j(t)| \Big\} = \sum_{i=1}^{n} \Big\{ -\xi_i (c_i + k_i) + \sum_{j=1}^{n} \xi_j |a_{ji}| L_i \Big\} |e_i(t)| \le -\lambda V(t)    (28)
One can see that:
V(t_0) = \sum_{i=1}^{n} \xi_i |e_i(t_0)| \le \max_{1 \le i \le n}\{\xi_i\}\, \|e(t_0)\|    (29)
V(t) = \sum_{i=1}^{n} \xi_i |e_i(t)| \ge \min_{1 \le i \le n}\{\xi_i\}\, \|e(t)\|    (30)
Based on Lemma 1, one obtains:
\min_{1 \le i \le n}\{\xi_i\}\, \|e(t)\| \le \max_{1 \le i \le n}\{\xi_i\}\, \|e(t_0)\|\, E_{\alpha}(-\lambda (t-t_0)^{\alpha})    (31)
That is:
\|e(t)\| \le \frac{\max_{1 \le i \le n}\{\xi_i\}}{\min_{1 \le i \le n}\{\xi_i\}}\, \|e(t_0)\|\, E_{\alpha}(-\lambda (t-t_0)^{\alpha})    (32)
Let z = -\lambda (t-t_0)^{\alpha} in Lemma 2; then |\arg(z)| = \pi, and it follows from Lemma 2 that there exists a real constant C such that:
\|e(t)\| \le \frac{\max_{1 \le i \le n}\{\xi_i\}}{\min_{1 \le i \le n}\{\xi_i\}}\, \|e(t_0)\|\, \frac{C}{1 + |\lambda (t-t_0)^{\alpha}|}    (33)
which implies that ||e(t)|| converges asymptotically to zero as t tends to infinity; namely, the fractional-order chaotic neural network Equation (14) is globally synchronized with Equation (17). □
Remark 1.
Up to now, with the help of the traditional Lyapunov direct method, many results on the synchronization of integer-order chaotic neural networks have been obtained, but the method and these results are not suitable for fractional-order chaotic neural networks.
Remark 2.
Chaos and synchronization of fractional-order neural networks were discussed in [19,20,21,22], but only by numerical simulations. Here, a theoretical proof is given.
Remark 3.
α-synchronization for fractional-order neural networks was considered in [18]; unfortunately, the obtained results are not correct [24].

4. Numerical Example

An illustrative example is given to demonstrate the validity of the proposed controller.
Consider a fractional-order chaotic Hopfield neural network with three neurons as follows [25]:
D^{\alpha} x(t) = -C x(t) + A f(x(t))    (34)
where x(t) = (x_1(t), x_2(t), x_3(t))^T, C = diag(1, 1, 1), f(x(t)) = (tanh(x_1(t)), tanh(x_2(t)), tanh(x_3(t)))^T and
A = \begin{pmatrix} 2 & -1.2 & 0 \\ 2 & 1.71 & 1.15 \\ -4.75 & 0 & 1.1 \end{pmatrix}
The system satisfies Assumption A1 with L_1 = L_2 = L_3 = 1. As shown in Figure 1, the fractional-order Hopfield neural network exhibits chaotic behavior when α = 0.95.
Figure 1. Chaotic behavior of the fractional-order Hopfield neural network Equation (34) with fractional order α = 0.95.
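For readers who wish to reproduce Figure 1, the following sketch simulates Equation (34) with an explicit Grünwald-Letnikov scheme. The paper does not state which solver was used, so the discretization, step size and horizon here are assumptions:

```python
import numpy as np

def gl_coeffs(alpha, n):
    # Binomial coefficients c_j = (-1)^j C(alpha, j) via the usual recursion.
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    return c

def simulate(g, x0, alpha, h, steps):
    # Explicit Grunwald-Letnikov scheme for D^alpha x = g(x), x(0) = x0,
    # 0 < alpha < 1. We integrate y = x - x0 (so y(0) = 0, where the Caputo
    # and Riemann-Liouville derivatives coincide) and add x0 back.
    c = gl_coeffs(alpha, steps)
    y = np.zeros((steps + 1, len(x0)))
    for k in range(1, steps + 1):
        mem = (c[1:k + 1, None] * y[k - 1::-1]).sum(axis=0)  # sum c_j y_{k-j}
        y[k] = h**alpha * g(y[k - 1] + x0) - mem
    return y + x0

# Drive network, Equation (34), with alpha = 0.95:
C = np.diag([1.0, 1.0, 1.0])
A = np.array([[2.0, -1.2, 0.0], [2.0, 1.71, 1.15], [-4.75, 0.0, 1.1]])
g = lambda x: -C @ x + A @ np.tanh(x)
x = simulate(g, np.array([0.1, 0.4, 0.2]), alpha=0.95, h=0.01, steps=5000)
# Plotting x[:, 0] against x[:, 1] should reproduce the attractor in Figure 1.
```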
The controlled response fractional-order Hopfield neural network is designed as follows:
D^{\alpha} y(t) = -C y(t) + A f(y(t)) + u(t)    (35)
The controller gain matrix K is chosen as K = diag(6, 5, 2), and it can be easily verified that
(C + K) - |A|L = \begin{pmatrix} 5 & -1.2 & 0 \\ -2 & 4.29 & -1.15 \\ -4.75 & 0 & 1.9 \end{pmatrix}
is an M-matrix. According to Theorem 1, synchronization between Equations (34) and (35) can be achieved. In the numerical simulations, the initial states of the drive and response systems are taken as x(0) = (0.1, 0.4, 0.2)^T and y(0) = (0.8, 0.1, 0.7)^T, respectively. Figure 2 shows the state synchronization trajectories of the drive and response systems; the synchronization error response is depicted in Figure 3.
Figure 2. State synchronization trajectories of the drive system Equation (34) and the response system Equation (35).
Figure 3. Synchronization error time response of the drive system Equation (34) and the response system Equation (35).
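The synchronization experiment behind Figures 2 and 3 can be approximated by stacking the drive system Equation (34) and the controlled response system Equation (35) into one fractional-order system and reusing simulate() and is_nonsingular_m_matrix() from the sketches above (again, solver, step size and horizon are assumptions):

```python
# Reuses C, A, simulate() and is_nonsingular_m_matrix() from the sketches above.
K = np.diag([6.0, 5.0, 2.0])
L = np.eye(3)  # tanh is 1-Lipschitz, so L1 = L2 = L3 = 1
print(is_nonsingular_m_matrix(C + K - np.abs(A) @ L))  # True: Theorem 1 applies

def coupled(z):
    # Drive system (34) and response system (35) with u(t) = -K e(t).
    x, y = z[:3], z[3:]
    dx = -C @ x + A @ np.tanh(x)
    dy = -C @ y + A @ np.tanh(y) - K @ (y - x)
    return np.concatenate([dx, dy])

z0 = np.array([0.1, 0.4, 0.2, 0.8, 0.1, 0.7])  # [x(0); y(0)]
z = simulate(coupled, z0, alpha=0.95, h=0.01, steps=5000)
err = np.abs(z[:, 3:] - z[:, :3]).sum(axis=1)  # l1 error, as in the proof
print(err[0], err[-1])  # the error decays toward zero, matching Figure 3
```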

5. Conclusions

In this paper, the synchronization problem has been studied theoretically for a class of fractional-order chaotic neural networks, which is more difficult and challenging than for integer-order chaotic neural networks. Based on the Mittag-Leffler function and linear feedback control, a sufficient condition in the form of an M-matrix has been derived. Finally, a simulation example has been given to illustrate the effectiveness of the developed approach.

Acknowledgements

The authors thank the referees and the editor for their valuable comments and suggestions. This work was supported by the National Natural Science Foundation of China (No. 60974090), the Fundamental Research Funds for the Central Universities (No. CDJXS12170001), the Natural Science Foundation of Anhui Province (No. 11040606M12), the Ph.D. Candidate Academic Foundation of the Ministry of Education of China, the Natural Science Foundation of Anhui Education Bureau (No. KJ2013B015) and the 211 Project of Anhui University (No. KJJQ1102).

Conflict of Interest

The authors declare no conflict of interest.

References

  1. Podlubny, I. Fractional Differential Equations; Academic Press: San Diego, CA, USA, 1999.
  2. Hilfer, R. Applications of Fractional Calculus in Physics; World Scientific: Singapore, 2000.
  3. Kilbas, A.A.; Srivastava, H.M.; Trujillo, J.J. Theory and Application of Fractional Differential Equations; Elsevier: New York, NY, USA, 2006.
  4. Srivastava, H.M.; Owa, S. Univalent Functions, Fractional Calculus and Their Applications; Prentice Hall: New Jersey, USA, 1989.
  5. Lundstrom, B.; Higgs, M.; Spain, W.; Fairhall, A. Fractional differentiation by neocortical pyramidal neurons. Nat. Neurosci. 2008, 11, 1335–1342.
  6. Anastasio, T. The fractional-order dynamics of brainstem vestibulo-oculomotor neurons. Biol. Cybern. 1994, 72, 69–79.
  7. Anastassiou, G. Fractional neural network approximation. Comput. Math. Appl. 2012, 64, 1655–1676.
  8. Kaslik, E.; Sivasundaram, S. Nonlinear dynamics and chaos in fractional-order neural networks. Neural Netw. 2012, 32, 245–256.
  9. Steinmetz, P.N.; Roy, A.; Fitzgerald, P.J.; Hsiao, S.S.; Johnson, K.O.; Niebur, E. Attention modulates synchronized neuronal firing in primate somatosensory cortex. Nature 2000, 404, 187–190.
  10. Fries, P.; Reynolds, J.H.; Rorie, A.E.; Desimone, R. Modulation of oscillatory neuronal synchronization by selective visual attention. Science 2001, 291, 1560–1564.
  11. Li, T.; Song, A.G.; Fei, S.M.; Guo, Y.Q. Synchronization control of chaotic neural networks with time-varying and distributed delays. Nonlinear Anal. Theory Methods Appl. 2009, 71, 2372–2384.
  12. Zhou, J.; Chen, T.P.; Xiang, L. Chaotic lag synchronization of coupled delayed neural networks and its applications in secure communication. Circuits Syst. Signal Process. 2005, 24, 599–613.
  13. Gan, Q.T.; Hu, R.X.; Liang, Y.H. Adaptive synchronization for stochastic competitive neural networks with mixed time-varying delays. Commun. Nonlinear Sci. Numer. Simul. 2012, 17, 3708–3718.
  14. Arena, P.; Fortuna, L.; Porto, D. Chaotic behavior in noninteger-order cellular neural networks. Phys. Rev. E 2000, 61, 776–781.
  15. Arena, P.; Caponetto, R.; Fortuna, L.; Porto, D. Bifurcation and chaos in noninteger order cellular neural networks. Int. J. Bifurc. Chaos 1998, 8, 1527–1539.
  16. Boroomand, A.; Menhaj, M. Fractional-order Hopfield neural networks. In Advances in Neuro-Information Processing; Springer: Berlin/Heidelberg, Germany, 2009; pp. 883–890.
  17. Huang, X.; Zhao, Z.; Wang, Z.; Li, Y.X. Chaos and hyperchaos in fractional-order cellular neural networks. Neurocomputing 2012, 94, 13–21.
  18. Yu, J.; Hu, C.; Jiang, H. α-stability and α-synchronization for fractional-order neural networks. Neural Netw. 2012, 35, 82–87.
  19. Zhou, S.; Li, H.; Zhu, Z. Chaos control and synchronization in a fractional neuron network system. Chaos Solitons Fractals 2008, 36, 973–984.
  20. Moaddy, K.; Radwan, A.G.; Salama, K.N.; Momani, S.; Hashim, I. The fractional-order modeling and synchronization of electrically coupled neuron systems. Comput. Math. Appl. 2012, 64, 3329–3339.
  21. Zhou, S.; Hu, P.; Li, H. Chaotic synchronization of a fractional neuron network system with time-varying delays. In Proceedings of the International Conference on Communications, Circuits and Systems (ICCCAS 2009), Taipei, Taiwan, 24–27 May 2009; pp. 863–867.
  22. Zhu, H.; Zhou, S.; Zhang, W. Chaos and synchronization of time-delayed fractional neuron network system. In Proceedings of the 9th International Conference for Young Computer Scientists (ICYCS 2008), Zhangjiajie, Hunan, China, 18–21 November 2008; pp. 2937–2941.
  23. Berman, A.; Plemmons, R.J. Nonnegative Matrices in the Mathematical Sciences; Academic Press: New York, NY, USA, 1979.
  24. Li, K.; Peng, J.; Gao, J. A comment on “α-stability and α-synchronization for fractional-order neural networks”. Neural Netw. 2013. Available online: http://dx.doi.org/10.1016/j.neunet.2013.04.013.
  25. Zhang, R.; Qi, D.; Wang, Y. Dynamics analysis of fractional order three-dimensional Hopfield neural network. In Proceedings of the 6th International Conference on Natural Computation (ICNC 2010), Yantai, Shandong, China, 10–12 August 2010; pp. 3037–3039.
