Article

Scalable Implementation of Hippocampal Network on Digital Neuromorphic System towards Brain-Inspired Intelligence

1 School of Computer Science and Technology, Shandong Jianzhu University, Jinan 250101, China
2 School of Electrical and Information Engineering, Tianjin University, Tianjin 300072, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(8), 2857; https://doi.org/10.3390/app10082857
Submission received: 30 March 2020 / Revised: 13 April 2020 / Accepted: 14 April 2020 / Published: 20 April 2020
(This article belongs to the Special Issue Artificial Intelligence (AI) and Virtual Reality (VR) in Biomechanics)

Abstract

In this paper, an expanded digital hippocampal spiking neural network (HSNN) is proposed to simulate the mammalian cognitive system and to reproduce the neuromodulatory dynamics that play a critical role in the cognitive processes of the brain, such as memory and learning. Real-time computation of a large-scale spiking neural network is realized by a scalable network-on-chip (NoC) and a parallel topology. A comparison with recent research on neuromorphic neuron design shows that implementing the hippocampal neuron model with the coordinate rotation digital computer (CORDIC) algorithm can significantly reduce the hardware resource cost. In addition, the rational use of NoC technology can further improve the performance of the system and significantly improve the network scalability on a single field programmable gate array (FPGA) chip. Neuromodulation dynamics are considered in the proposed system, which can therefore replicate more relevant biological dynamics. Based on the analysis of the biological theory and of the hardware implementation, it is shown that the proposed system can reproduce the biological characteristics of the hippocampal network and may be applied to brain-inspired intelligence. This study will inform future research on the digital neuromorphic design of spiking neural networks and on the dynamics of the hippocampal network.

1. Introduction

Over the last few decades, numerous studies have accumulated a great deal of knowledge about brain function, but our understanding of brain mechanisms and functional dynamics remains limited [1,2]. Spiking neural networks (SNNs) have become increasingly popular in recent years due to their relationship to the dynamics of the human brain and their enhanced biological relevance [3]. SNNs with cognitive and motor functions, such as the hippocampal network, have been investigated in abundant in vitro and model-based experiments [4,5]. One of the most essential components of the human brain is the hippocampus, which processes short-term memory and spatial navigation information [6,7,8,9]. It has been demonstrated biologically that behavioral learning and memory in mammals are closely related to hippocampal rhythms [10,11]. In addition, there is ample evidence that abnormal hippocampal rhythms in animals induce dysfunction of the nervous system. One of the most representative examples is that abnormal electrical activity in the hippocampal neural network can cause cognitive decline and behavioral inhibition in Alzheimer’s patients [12,13,14,15]. Previous experiments have shown that hippocampal oscillations play an irreplaceable role in both the encoding and the transmission of information [16]. The dynamics of neuromodulation can affect sensory processing and information processing in cognitive function [17,18]. Most current spiking neural network models for machine learning applications use a feedforward network architecture for object classification. However, the human brain does not execute classification tasks most of the time, and many brain areas that are critical for learning do not use a feedforward architecture. On the basis of these observations, we were inspired to present a novel model with a recurrent network architecture for the neuromodulatory dynamics, which are critical in the cognitive and learning processes of the human brain.
Although the hippocampal neural network has been investigated for a long time and applied widely in various fields, its computation is constrained on conventional platforms: the serial architecture and limited memory bandwidth of traditional CPU-based platforms make real-time simulation of large networks difficult. A high-performance computing platform for neuromorphic engineering is therefore urgently needed. Various hardware designs have endeavored to realize real-time computation of human brain systems in recent years. Analog neuromorphic systems based on very large scale integration (VLSI) can be more realistic and power efficient [19,20,21], but they have not been widely adopted due to their excessive resource consumption and long development time. The field programmable gate array (FPGA) offers parallel computation, a reconfigurable architecture, and a distributed structure for digital neuromorphic systems [22,23]. Previous studies have focused on using FPGAs to improve the performance of neuromorphic platforms, presenting different realization methods for various applications [24,25,26,27]. However, few previous studies have considered the dynamic oscillations of the hippocampal neural network, although these are important for information encoding and signal processing in the brain’s cognitive system. In addition, effectively implementing large-scale hippocampal networks with randomly connected coupled synapses is a difficult challenge. Moreover, owing to the shortcomings of its network-on-chip (NoC) design [28], the digital implementation of the hippocampal network that we proposed earlier is limited in scalability and system performance [29].
Since the introduction of microchip technology, great progress has been made in its development, and large-scale system-on-chip technology has been applied in more and more fields. As the architectures of system-on-chip platforms become more complex, NoC techniques have been investigated and used to solve the problems of data transmission, large-scale computation, and complicated network topology [30]. The NoC structure can improve the utilization of hardware resources and the transmission rate of the data flow, which makes the topology flexible and improves the system scalability. In this study, we propose a novel NoC technique to realize a scalable digital neuromorphic implementation of the hippocampal neural network, with a focus on the neuromodulatory oscillation dynamics of the hippocampus. In particular, the proposed FPGA-based hippocampal neural network can track the biodynamics in real time under normal and even neuromodulatory conditions.
The general structure of this article is as follows. Section 2 describes the construction principle of the hippocampal neural network model, and the digital implementation of the presented hippocampal network model is described in detail in Section 3. The experimental results and the biological feasibility of the network are demonstrated in Section 4, and Section 5 concludes the paper.

2. The Hippocampal Neural Network Model

As shown in Figure 1, hippocampal neural coding is closely related to pyramidal excitatory neurons and inhibitory interneurons, and the hippocampal neural network contains both of these important types of neurons. Pyramidal neurons are morphologically asymmetrical in the region between the dendrites and the cell body, whereas the dendritic structures on both sides of the interneuron cell body are symmetrical. As a result, the two types of neurons contribute differently to the extracellular electric field. In addition, the network is coupled through various synaptic connections owing to its relatively dense local connectivity.
In this paper, Ne = 48,000 excitatory neurons and Ni = 16,000 inhibitory neurons were used to form the hippocampal network model, and the synapses between these neurons were connected with a sparsity of 20%. When neuron i spikes, the presynaptic spike train s(i, t) contributes to the synaptic current of neuron j, weighted by the corresponding synaptic strength w(j, i). This current-generation process can be regarded as the integration of all synaptic inputs to the jth neuron, which can be defined by the following equation:
\frac{dI_S(j)}{dt} = \frac{-I_S(j) + \sum_i g_{syn}\, w(j,i)\, s(i,t)}{\tau_S(j)}
In the above formula, s(i, t) indicates the presynaptic spike train, and w(j, i) describes the synaptic strength. The parameter range of the synaptic strength differs for the different connections between excitatory and inhibitory neurons: the synaptic strength w is uniformly distributed with wee ∈ [0, 0.65] for excitatory-to-excitatory synapses, wei ∈ [0, 2] for excitatory-to-inhibitory synapses, wie ∈ [−1.7, −0.8] for inhibitory-to-excitatory synapses, and wii ∈ [−1.1, −0.3] for inhibitory-to-inhibitory synapses. τS is the synapse time constant: for excitatory neurons, τS = 0.5 ms, and for inhibitory neurons, τS = 6 ms. The effective gain gsyn, which stands for the amplitude response of the synaptic current, is 8.7 ± 4.8 Hz for excitatory neurons and 21.1 ± 1.4 Hz for inhibitory neurons [31]. The applied electric field E affects the polarization of the pyramidal excitatory neurons, and the field-induced current IE is proportional to the applied electric field, as described by the following equation:
\frac{dI_E}{dt} = \frac{-I_E + S_E E(t)}{\tau_E}
SE in the above formula describes the sensitivity of the membrane to the field, which is determined by the cell geometry and the field orientation. It should be noted that the parameter τE in the formula is a constant value and does not change during the iteration process. The total input current I of the jth neuron is then defined as follows:
I(j) = \begin{cases} I_S(j) + I_E + I_{car}, & \text{for the } j\text{th excitatory neuron} \\ I_S(j), & \text{for the } j\text{th inhibitory neuron} \end{cases}
In this paper, in order to describe the impact of carbachol, Icar is modeled as a Gaussian white noise current with zero mean. The advantage of this choice is that the operation of the endogenous hippocampal neural network can be demonstrated by applying a weak electric field.
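To make the update order of the above equations concrete, the following Python sketch performs one forward-Euler step of the synaptic current IS(j), the field-induced current IE, and the total input current I(j). It is a minimal software illustration only; the function name update_currents, the noise standard deviation car_std, and the use of NumPy are assumptions of this sketch rather than part of the hardware design.

    import numpy as np

    def update_currents(I_S, I_E, spikes, W, g_syn, tau_S, S_E, E_t, tau_E,
                        is_excitatory, rng, car_std=1.0, dt=0.77):
        # Synaptic drive: sum_i g_syn * w(j, i) * s(i, t) for every postsynaptic neuron j
        syn_input = g_syn * (W @ spikes)
        # Leaky integration of the synaptic current with an Euler step of dt (ms)
        I_S = I_S + dt * (-I_S + syn_input) / tau_S
        # Field-induced current driven by the applied electric field E(t)
        I_E = I_E + dt * (-I_E + S_E * E_t) / tau_E
        # Carbachol effect: zero-mean Gaussian white noise (standard deviation assumed)
        I_car = rng.normal(0.0, car_std, size=I_S.shape)
        # Excitatory neurons receive synaptic, field, and carbachol currents;
        # inhibitory neurons receive only the synaptic current
        I_total = np.where(is_excitatory, I_S + I_E + I_car, I_S)
        return I_S, I_E, I_total

Here g_syn and tau_S can be arrays so that excitatory and inhibitory neurons use their respective parameter values.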
Past research on neural network models has shown that a hippocampal neural network built from excitatory and inhibitory neurons can achieve very good simulation results. Using the phenomenological Izhikevich neuron model [32], the update equation of the neuronal membrane potential V(t) is as follows:
V(t+1) = \begin{cases} aV^2(t) + (b+1)V(t) + c - U(t) + I(t), & \text{for } V(t) < \theta, \\ V_0, & \text{otherwise}, \end{cases}
During the calculation iteration, the unit time step is selected to be 0.77 ms. The driving current I(t) is the input stimulus, and θ is the threshold value. The recovery variable U(t) can be expressed by the following equation:
U(t+1) = \begin{cases} U(t) + [k_U V(t) - U(t)]/\tau_U, & \text{for } V(t) < \theta, \\ U(t) + \Delta U, & \text{otherwise}, \end{cases}
where kU is the slope of the variation in V(t), and τU describes the relaxation time. ΔU represents the reset of the recovery variable U(t) after a spike. The recovery of the neuronal membrane potential and the firing rates of the excitatory and inhibitory neurons in the hippocampus are governed by V0, ΔU, τU, and kU when carbachol-induced gamma oscillations emerge. Heterogeneity within the neural network is accounted for by drawing parameters from normal distributions. Table 1 shows the corresponding parameter values. Based on these values, the network dynamics are consistent with electrophysiological experimental results.
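As a minimal software illustration of the neuron update above, the sketch below applies the membrane-potential and recovery-variable equations for one time step. The default parameter values are the excitatory-neuron means from Table 1, while the spike threshold value theta and the vectorized NumPy form are assumptions made only for this sketch.

    import numpy as np

    def izhikevich_step(V, U, I, a=0.04, b=5.0, c=140.0, V0=-65.0,
                        dU=10.0, tau_U=43.0, k_U=0.24, theta=30.0):
        # Neurons at or above threshold are reset; the others follow the quadratic map
        spiked = V >= theta
        V_next = np.where(spiked, V0, a * V**2 + (b + 1.0) * V + c - U + I)
        # Recovery variable: slow relaxation toward k_U * V, or a jump of dU after a spike
        U_next = np.where(spiked, U + dU, U + (k_U * V - U) / tau_U)
        return V_next, U_next, spiked

Passing arrays for V, U, and I updates a whole population of neurons in one call.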

3. Digital Implementation

The proposed system uses a high-end Intel Stratix III FPGA to implement a digital neuromorphic system that can simulate large-scale hippocampal neural networks. On the FPGA chip, the hippocampal neural network is realized using the torus architecture, and the system throughput is enhanced through pipelining.

3.1. Network-On-Chip (NoC) Architecture

The digital structure of the proposed hippocampal neural network is based on the NoC architecture, which enables a scalable and cost-efficient digital neuromorphic system. The NoC architecture is critical because it determines the hardware performance of the proposed system. In the proposed study, the torus structure is used because its wrap-around links avoid dead-end nodes at the edge of the topology; thus, the system shows better performance. Figure 2 shows the detailed implementation of the proposed NoC-based SNN. The top-level NoC structure is shown in Figure 2a, which contains 64 nucleus processors (NPs). Address-event representation (AER) is essential to the data communication between NPs, which is determined by the routers. The specific structure is shown in Figure 2b: each NP contains a neuron unit, a router, a silicon synapse unit, and a configuration unit. The routers can transmit the data flow via the east, south, west, or north ports. The configuration unit is used to configure the routers, which determines the data transmission within an NP. The neuron unit is used to compute the hippocampal neurons, and the synaptic current is calculated in the silicon synapse unit. In the router module, the data flow is determined by hippocampal information processing (HIP) scheduling, as shown in Figure 2c.
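As a software analogue of this organization, the sketch below models an NP and the AER events exchanged between routers; the field names, the payload type, the assumed 8 x 8 torus layout, and the Python representation are illustrative assumptions, since the exact packet format is not specified here.

    from dataclasses import dataclass, field

    @dataclass
    class AEREvent:
        x_dest: int       # destination NP column in the assumed 8 x 8 torus
        y_dest: int       # destination NP row
        syn_value: float  # synaptic payload carried by the packet, not a bare spike

    @dataclass
    class NucleusProcessor:
        x: int
        y: int
        neuron_state: dict = field(default_factory=dict)   # membrane potentials, recovery variables
        synapse_state: dict = field(default_factory=dict)  # silicon synapse unit state
        routing_table: dict = field(default_factory=dict)  # written by the configuration unit
        inbox: list = field(default_factory=list)          # events delivered by the local router

        def accept(self, event: AEREvent) -> None:
            # Only events addressed to this NP are consumed; others are forwarded by the router
            if (event.x_dest, event.y_dest) == (self.x, self.y):
                self.inbox.append(event)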
The NoC structure proposed in this paper uses a torus topology and requires a router with a dedicated routing algorithm to carry out correct data transmission, which is quite different from a conventional network structure. In the case of inter-chip data communication, the router receives external events from the four adjacent NPs and sends data streams based on the routing rules programmed by the configuration unit. Algorithm 1 shows the routing algorithm of the proposed system. The AER spike information is first routed along the X direction. After it reaches the destination location along the X direction, which is checked by the embedded router, the information is then routed along the Y direction to the destination node according to the routing table. A judgement is made within the routing algorithm to choose the routing direction that yields the shorter path. The AER data in the packets transmitted through the router are synaptic information rather than spike information, which differs from traditional AER-based implementations of SNNs.
Algorithm 1: The HIP router for routing the packets in torus-based NoC
  loop
   if posedge clk then
    Δx1 = XC - 1; Δx2 = WN - XC;
    // WN: width of the NoC, XC: x_current
    Δy1 = YC - 1; Δy2 = HN - YC;
    // HN: height of the NoC, YC: y_current
    Δxsign <- ((xdest - XC) > 0) ? 0 : 1
    Δysign <- ((ydest - YC) > 0) ? 0 : 1
      if Δxsign = 0 then Δx <- (xdest - XC); Δxreverse <- (WN - xdest) + XC;
        if Δx ≤ Δxreverse then Route the packet EAST;
        else Route the packet WEST; end if
      else {Δxsign != 0} Δx <- (XC - xdest); Δxreverse <- (WN - XC) + xdest;
        if Δx ≤ Δxreverse then Route the packet WEST;
        else Route the packet EAST; end if
      end if
      if Δysign = 0 then Δy <- (ydest - YC); Δyreverse <- (HN - ydest) + YC;
        if Δy ≤ Δyreverse then Route the packet SOUTH;
        else Route the packet NORTH; end if
      else {Δysign != 0} Δy <- (YC - ydest); Δyreverse <- (HN - YC) + ydest;
        if Δy ≤ Δyreverse then Route the packet NORTH;
        else Route the packet SOUTH; end if
      end if
      if XC = 1 or XC = WN or YC = 1 or YC = HN
      then Route the packet UP;
      // The current node is an edge node
      elseif Δx1 >= Δx2 then Route the packet WEST;
      else Route the packet EAST; end if
    end if
    end loop
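A behavioral Python sketch of the shorter-path decision in Algorithm 1 is given below; it models only the X-then-Y direction choice with torus wrap-around, while the UP port used for edge nodes in Algorithm 1 and the clocked hardware details are not modeled.

    def route_direction(xc, yc, xdest, ydest, width, height):
        # Next hop for a packet at node (xc, yc) heading to (xdest, ydest) on a
        # width x height torus; coordinates are 1-based as in Algorithm 1.
        if xc != xdest:
            if xdest > xc:                        # corresponds to Δxsign = 0
                dx, dx_rev = xdest - xc, (width - xdest) + xc
                return "EAST" if dx <= dx_rev else "WEST"
            dx, dx_rev = xc - xdest, (width - xc) + xdest
            return "WEST" if dx <= dx_rev else "EAST"
        if yc != ydest:
            if ydest > yc:                        # corresponds to Δysign = 0
                dy, dy_rev = ydest - yc, (height - ydest) + yc
                return "SOUTH" if dy <= dy_rev else "NORTH"
            dy, dy_rev = yc - ydest, (height - yc) + ydest
            return "NORTH" if dy <= dy_rev else "SOUTH"
        return "LOCAL"                            # the packet has reached the destination NP

For example, route_direction(1, 1, 7, 3, 8, 8) returns "WEST", because wrapping westwards (two hops) is shorter than travelling east (six hops).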

3.2. CORDIC-Based Neuron Design

In order to compute the nonlinear functions in the above model equations, the coordinate rotation digital computer (CORDIC) algorithm is used in this paper to perform the conversion between rectangular and polar coordinates. The CORDIC algorithm uses only addition, shift, and look-up table operations to calculate basic transcendental functions that are otherwise costly in hardware, including sin θ, cos θ, sinh θ, cosh θ, e^θ, and ln θ. The core idea of the CORDIC algorithm is to approximate a target angle by rotating through a series of predefined angles over multiple iterations; the result becomes more accurate as the number of iterations increases. All the computations are based on addition and shift operations applied to the intermediate results.
The unified CORDIC algorithm employs a plane-coordinate-system parameter m to integrate the linear, circular, and hyperbolic rotations into one CORDIC function set; the related equation set can be written as follows:
\begin{cases} X_{i+1} = X_i - m\,\xi_i Y_i 2^{-i} \\ Y_{i+1} = Y_i + \xi_i X_i 2^{-i} \\ Z_{i+1} = Z_i - \xi_i \theta_i \end{cases}
In the function set, Xi and Yi are the result values of the ith iteration, and Zi is the angle value of the ith iteration. The number of iterations is expressed by the parameter i, and ξi is the judgment operator that selects the rotation direction: if ξi > 0, the rotation is counterclockwise, and if ξi < 0, the rotation is clockwise. θi is the specific angle of the ith rotation, and m refers to the parameter of the coordinate system. The parameter m can be equal to −1, 0, or 1, and different values of m correspond to different nonlinear functions. If m = −1, the function set can represent sinh θ and cosh θ. If m = 0, the function set can calculate multiplication and division. If m = 1, the function set can compute sin θ and cos θ. In addition, different initial values yield different computing results. If X0 = A + 1/4, Y0 = A − 1/4, and m = −1, the function set can be used to compute √A. If X0 = A + 1, Y0 = A − 1, and m = −1, the function set can be used to compute 0.5 ln A.
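A compact Python sketch of the unified rotation-mode iteration is shown below; it follows the equation set above in floating point, whereas the hardware works with shifts and adds in fixed point, and the function name and the iteration-repetition choice are assumptions of this sketch.

    import math

    def cordic_rotate(x, y, z, m, n_iter=21):
        # Unified CORDIC in rotation mode for circular (m = 1), linear (m = 0) and
        # hyperbolic (m = -1) coordinates: drive z toward 0 by choosing xi = sign(z).
        if m == -1:
            indices = []
            for i in range(1, n_iter):
                indices.append(i)
                if i in (4, 13):   # standard repeated iterations needed for hyperbolic convergence
                    indices.append(i)
        else:
            indices = list(range(n_iter))
        for i in indices:
            if m == 1:
                theta_i = math.atan(2.0 ** -i)
            elif m == 0:
                theta_i = 2.0 ** -i
            else:
                theta_i = math.atanh(2.0 ** -i)
            xi = 1.0 if z >= 0 else -1.0
            # All three variables are updated from the values of the previous iteration
            x, y, z = (x - m * xi * y * 2.0 ** -i,
                       y + xi * x * 2.0 ** -i,
                       z - xi * theta_i)
        # For m = 1 and m = -1 the outputs carry a constant CORDIC gain, which the
        # hardware compensates by pre-scaling the initial values.
        return x, y, z

For instance, with m = 1, x = 1/1.64676 (the circular CORDIC gain), y = 0, and z = θ, the returned x and y approximate cos θ and sin θ.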
In the model of this paper, the Izhikevich model is used for the neurons, and the electrical synapse model is used to build the synapses. The CORDIC algorithm is used to realize the multiplication, division, and exponential operations. For the multiplication operation, the parameters are set as m = 0, X0 = A, Y0 = B, and Z0 = 0, and the function Z = A × B can be calculated as:
\begin{cases} X_{i+1} = X_i \\ Y_{i+1} = Y_i - \xi_i X_i 2^{10-i} \\ Z_{i+1} = Z_i + \xi_i \theta_i \end{cases}
where θi = 2^(10−i) × Xi, and ξi = 1 if Yi > 0, otherwise ξi = 0. For the division operation, the parameters are set as m = 0, X0 = B, Y0 = A, and Z0 = 0, and the function Z = A/B can be computed as follows:
\begin{cases} X_{i+1} = X_i \\ Y_{i+1} = Y_i - \xi_i X_i 2^{10-i} \\ Z_{i+1} = Z_i + \xi_i \theta_i \end{cases}
where θi = 2^(10−i), and the other parameters are the same as in the case of multiplication. The exponential e^θ can be calculated by the following formula set:
\begin{cases} X_{i+1} = X_i + \xi_i Y_i 2^{-i} \\ Y_{i+1} = Y_i + \xi_i X_i 2^{-i} \\ Z_{i+1} = Z_i - \xi_i \theta_i \end{cases}
where θi = tanh^(−1)(2^(−i)). This iteration converges only in the range from −1.11817 to 1.11817, which is smaller than the required range. Therefore, the data are converted through the following equation:
\theta = Q \ln 2 + \gamma
where Q ∈ ℤ, |γ| ≤ ln 2 ≈ 0.6931, and e^θ = e^(Q ln 2 + γ) = 2^Q e^γ. In this way, e^γ lies within the convergence range of the CORDIC iteration, and the precision can be improved by increasing the iteration number to meet the requirements.
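Under these assumptions, the range-extension step can be sketched in Python as follows; the helper name exp_cordic, the use of round() for Q, and the explicit repetition of the hyperbolic iterations are choices made for this illustration only.

    import math

    def exp_cordic(theta, n_iter=21):
        # Split theta = Q*ln2 + gamma so that gamma falls inside the hyperbolic CORDIC
        # convergence range; the factor 2^Q is a cheap shift in fixed-point hardware.
        q = round(theta / math.log(2.0))
        gamma = theta - q * math.log(2.0)
        # Iteration indices with the standard repeats required for hyperbolic convergence
        indices = []
        for i in range(1, n_iter):
            indices.append(i)
            if i in (4, 13):
                indices.append(i)
        # Hyperbolic CORDIC gain for exactly these iterations
        k_h = 1.0
        for i in indices:
            k_h *= math.sqrt(1.0 - 2.0 ** (-2 * i))
        # Rotation mode with x0 = y0 = 1/k_h gives x -> cosh(gamma) + sinh(gamma) = e^gamma
        x = y = 1.0 / k_h
        z = gamma
        for i in indices:
            xi = 1.0 if z >= 0 else -1.0
            x, y, z = (x + xi * y * 2.0 ** -i,
                       y + xi * x * 2.0 ** -i,
                       z - xi * math.atanh(2.0 ** -i))
        return (2.0 ** q) * x

For example, exp_cordic(1.0) is approximately 2.71828, matching e to the precision allowed by 21 iterations.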

3.3. Neuron Implementation

In this study, the Izhikevich neuron model is used to realize the neural network, and its digital implementation is shown in Figure 3. In Figure 3a, the blocks “Reg” and “ADD” represent the register and the adder, and “SUB” stands for the subtractor. The block “MUL” represents the multiplier, which is realized based on the CORDIC algorithm in this study. The concrete implementation of one iteration of the CORDIC algorithm is shown in Figure 3b, where “MUX” is the multiplexer. The inputs Xi, Yi, and Zi are the input values of the iteration, and Xi+1, Yi+1, and Zi+1 are the updated results of the (i + 1)th iteration. The iteration parameter takes specific values depending on the function that the CORDIC algorithm implements. The CORDIC algorithm can meet different precision requirements with different iteration counts, achieving higher precision as the iteration number increases; in this study, the CORDIC algorithm uses 21 iterations.

4. Experimental Results

The proposed scalable neural network based on the torus NoC architecture described above is implemented on a Stratix III EP3SE260 FPGA. In total, 64 NPs are implemented, and the time-division multiplexing technique is applied within each NP so that a single NP is responsible for 1000 virtual neurons. The hardware resource cost is shown in Table 2. Traditionally, multiplications are computed with lookup tables, but this method is expensive in terms of memory and digital signal processing (DSP) resources. It can be clearly seen from Table 2 that, with the conventional method, the required number of DSP block 18-bit elements alone exceeds all of the available hardware resources. In contrast, the proposed implementation method can effectively reduce the hardware resource cost in terms of the DSP and memory resources, which are scarce on FPGA hardware. The proposed digital neuromorphic system operates at a working frequency of 146.03 MHz.
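As a software analogue of the time-division multiplexing scheme, the sketch below reuses one update routine for 1000 virtual neurons stored in local arrays; the class name, the state layout, and the use of a Python loop are assumptions for illustration, standing in for the time slots of the hardware pipeline.

    import numpy as np

    class TimeMultiplexedNP:
        N_VIRTUAL = 1000  # virtual neurons served by one physical neuron pipeline

        def __init__(self):
            self.V = np.full(self.N_VIRTUAL, -65.0)  # membrane potentials
            self.U = np.zeros(self.N_VIRTUAL)        # recovery variables

        def tick(self, I, step_fn):
            # One network time step: the shared pipeline (step_fn) processes every
            # virtual neuron in turn, as the hardware does in successive time slots.
            spikes = np.zeros(self.N_VIRTUAL, dtype=bool)
            for n in range(self.N_VIRTUAL):
                self.V[n], self.U[n], spikes[n] = step_fn(self.V[n], self.U[n], I[n])
            return spikes

Here step_fn can be, for instance, the izhikevich_step sketch given earlier.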
The design and programming of the proposed digital neuromorphic network are realized using the VHDL hardware description language. The HDL code is synthesized with ALTERA Quartus II and implemented on a 65 nm Stratix III FPGA development board. The oscilloscope photographs of the membrane potentials of the neurons in the hippocampal neural network are shown in Figure 4, which indicates that the proposed neuromorphic hippocampal neural network can track the biodynamics accurately and in real time.
The relationship among the computational precision, the iteration number, and the number of CORDIC modules is shown in Figure 5a. The computational precision is defined by the relative error between the CORDIC-based results and the desired results. The precision increases with the iteration number, whereas the calculation accuracy decreases as the number of CORDIC modules increases. Figure 5b shows the relationship between the number of CORDIC modules and different kinds of hardware resource cost. The CORDIC module can significantly reduce the hardware resource cost in terms of DSP block 18-bit elements and total block memory bits. Taken together, Figure 5a,b show that the CORDIC module in the proposed hardware implementation can save hardware resources while maintaining a high level of computational precision.
In our study, frequency-domain analysis is used to reveal how much signal power lies within a given frequency band. Previous research has shown that the wavelet transform is an effective tool for this kind of signal analysis. As shown in Figure 6a,b, modulation of the extracellular oscillations with negative and positive DC stimuli results in suppression and enhancement of the mean firing rate, respectively. Under symmetrical modulation by low-frequency AC stimulation, the average firing rate of the hippocampal neural network increases, where the frequency of the low-frequency AC stimulation is less than half (about 12 Hz) of the endogenous field frequency. In practice, symmetric modulation means that the suppressive effect is roughly the same as the enhancing effect under stimulation with a low-frequency AC field. Neurons suppressed during the negative cycles can fire during the positive cycles of the AC stimulation, which keeps the network firing rate stable. Synchrony occurs with the emergence of subharmonics, as shown in Figure 6c,d.
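As a simple stand-in for this analysis, the sketch below estimates the power of the network’s population firing rate within a chosen frequency band using a plain FFT rather than the wavelet transform used in the paper; the function name and the input format are assumptions of this sketch.

    import numpy as np

    def band_power(spike_counts, dt_ms, f_lo, f_hi):
        # spike_counts: spikes per time step across the whole network; dt_ms: step in ms
        rate = spike_counts - spike_counts.mean()           # remove the DC component
        power = np.abs(np.fft.rfft(rate)) ** 2              # power spectrum
        freqs = np.fft.rfftfreq(len(rate), d=dt_ms / 1000)  # frequency axis in Hz
        mask = (freqs >= f_lo) & (freqs <= f_hi)
        return power[mask].sum()

Comparing this quantity under different DC and AC stimuli provides a simple numerical counterpart to the trends shown in Figure 6.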
We have previously presented several works on brain-inspired neuromorphic computing. In order to clarify the unique contribution of this study, a comparative analysis is presented in Table 3. These works were inspired by different areas of the brain, including the visual pathway, hippocampus, central pattern generator (CPG), Purkinje cells, and retina. Although [33] addresses the same brain area as the presented study, it pursues a different network structure with different aims: [33] uses a feedforward network structure for the implementation of memory-related behaviors, whereas this study makes a unique contribution toward the realization of the neuromodulatory dynamics. In addition, this study presents a torus-based NoC design for the hardware architecture, which is another significant contribution in comparison with the previous studies; a previous study has revealed the advantage and necessity of the NoC design [28]. The presented study is an improved version of our previous studies [34,35], which used a bus-based NoC architecture. This work proposes a torus-based solution that further improves the NoC performance of the hippocampal network on FPGA, which is the major innovation and difference in comparison with our previous work. The weakness of the presented digital neuromorphic model is that it reproduces only part of the critical dynamics of the biological hippocampus; it cannot reproduce all of the dynamics of the hippocampal region, such as navigation, which should be further explored in a future study.

5. Conclusions

In this paper, an FPGA-based scalable hardware design for the hippocampal neural network is presented, and the simulation is carried out under the constraints of biological characteristics. The proposed digital neuromorphic system reproduces the biological dynamics of neuromodulatory oscillations, which is of great significance for the study of neural information processing and neurological diseases. Because NoC techniques are used, the proposed system is more scalable than previous digital realizations of the hippocampal network. In addition, it has been shown that the application of the CORDIC algorithm in large-scale neural networks can improve the system performance. The proposed work could be used in a variety of applications, such as simulation platforms for neural network dynamics, brain-inspired intelligence, neural prostheses in brain-machine interfaces, and control circuits for neural robots. Owing to the neuromodulatory dynamics of the presented neuromorphic hippocampus, the two most significant future directions are the realization of an adaptive bi-directional brain-machine interface and of compact neuroprosthetics to replace impaired counterparts in the human brain. The brain-machine interface can induce a novel kind of mixed intelligence, which is an enhanced version of artificial intelligence. In addition, because of the rhythmic dynamics of the proposed digital neuromorphic hippocampus, it can be applied to the interaction between the human brain and the external environment to form a human-machine-environment integration system [39]. Besides, it can also be used as a neuro-controller for intelligent robots because of the rhythmic outputs of the neuromorphic network induced by periodic input signals.

Author Contributions

Conceptualization, W.S.; Methodology, N.Z.; Validation, J.W.; Writing-Review & Editing, S.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the National Natural Science Foundation of China (Grant Nos. 61671320, 61771330, 61871287), Science and technology project of universities in Shandong Province (J18KA353) and Natural Science Foundation of Tianjin (Grant No. 18JCZDJC32000).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, Y.; Huang, L.; Zhang, W.; Zhang, Z.; Cacioppo, S. Spatio-temporal dynamics of kind versus hostile intentions in the human brain: An electrical neuroimaging study. Soc. Neurosci. 2015, 10, 253–267. [Google Scholar] [CrossRef]
  2. Tajima, S.; Yanagawa, T.; Fujii, N.; Taro, T. Untangling Brain-Wide Dynamics in Consciousness by Cross-Embedding. PLoS Comput. Biol. 2015, 11, e1004537. [Google Scholar] [CrossRef] [Green Version]
  3. Sengupta, A.; Ye, Y.; Wang, R.; Liu, C.; Roy, K. Going Deeper in Spiking Neural Networks: VGG and Residual Architectures. Front. Neurosci. 2018, 13, 95. [Google Scholar] [CrossRef]
  4. Li, W.; Papilloud, A.; Laura, L.M.; Zhao, N.; Ye, X.; Zhang, X.; Carmen, S.; Gregor, R. Stress Impacts the Regulation Neuropeptides in the Rat Hippocampus and Prefrontal Cortex. Proteomics 2018, 18, 700408. [Google Scholar] [CrossRef]
  5. Guo, L.; Zhang, W.; Zhang, J. Neural information coding on small-world spiking neuronal networks modulated by spike-timing-dependent plasticity under external noise stimulation. Clust. Comput. 2019, 22, 5217–5231. [Google Scholar] [CrossRef]
  6. Caitlin, R.B.; Dagmar, Z. Abstract Memory Representations in the Ventromedial Prefrontal Cortex and Hippocampus Support Concept Generalization. J. Neurosci. Off. J. Soc. Neurosci. 2018, 38, 2605–2614. [Google Scholar]
  7. Pandey, A.; Sikdar, S.K. Depression biased non-Hebbian spike-timing-dependent synaptic plasticity in the rat subiculum. J. Physiol. 2014, 592, 3537–3557. [Google Scholar] [CrossRef] [Green Version]
  8. Lu, C.W.; Huang, S.K.; Lin, T.Y.; Wang, S.J. Echinacoside, an active constituent of Herba Cistanche, suppresses epileptiform activity in hippocampal CA3 pyramidal neurons. Korean J. Physiol. Pharm. 2018, 22, 49. [Google Scholar] [CrossRef]
  9. Klyne, D.M.; Moseley, G.L.; Sterling, M.; Barbe, M.F.; Hodges, P.W. Individual Variation in Pain Sensitivity and Conditioned Pain Modulation in Acute Low Back Pain: Impact of Stimulus Type, Sleep, Psychological and Lifestyle Factors. J. Pain Off. J. Am. Pain Soc. 2018, 19, 942.e1–942.e18. [Google Scholar] [CrossRef] [Green Version]
  10. Venkatesh, G.K.; Nadarajan, R.A. HTTP botnet detection using adaptive learning rate multilayer feed-forward neural network. In Proceedings of the Ifip Wg 112 International Conference on Information Security Theory & Practice: Security; Springer: Berlin/Heidelberg, Germany, 2017. [Google Scholar]
  11. George, P.; Simon, H.; Howard, B. The Sync/deSync Model: How a Synchronized Hippocampus and a Desynchronized Neocortex Code Memories. J. Neurosci. Off. J. Soc. Neurosci. 2018, 38, 3428–3440. [Google Scholar]
  12. Adler, D.H.; Wisse, L.E.M.; Ittyerah, R.; Pluta, J.B.; Ding, S.L.; Xie, L.; Wang, J.; Kadivar, S.; Robinson, J.L.; Schuck, T.; et al. Characterizing the human hippocampus in aging and Alzheimer’s disease using a computational atlas derived from ex vivo MRI and histology. Proc. Natl. Acad. Sci. USA 2018, 115, 201801093. [Google Scholar] [CrossRef] [Green Version]
  13. Karita, S.T.; Tuomas, P.M.; Selja, M.I.V.; Risto, J.I. EEG Artifact Removal in TMS Studies of Cortical Speech Areas. Brain Topogr. 2020, 33, 1–9. [Google Scholar]
  14. Nickl, R.C.; Reich, M.M.; Pozzi, N.G.; Patrick, F.; Florian, L.; Jonas, R.; Volkmann, J.; Matthies, C. Rescuing Suboptimal Outcomes of Subthalamic Deep Brain Stimulation in Parkinson Disease by Surgical Lead Revision. Neurosurgery 2019, 85, 2. [Google Scholar] [CrossRef]
  15. Bauer, P.R.; Kalitzin, S.; Zijlmans, M.; Sander, J.W.; Visser, G. Cortical Excitability as a Clinical Marker in Epilepsy: A review of the clinical application of Transcranial Magnetic Stimulation. Int. J. Neural Syst. 2014, 24, 1430001. [Google Scholar] [CrossRef]
  16. Roberto, F.L.; Patrice, A.; Herwig, W.; Stéphane, J.; Hugo, T. A Generalized Multifractal Formalism for the Estimation of Nonconcave Multifractal Spectra. IEEE Trans. Signal Process. 2019, 67, 110–119. [Google Scholar]
  17. Mccafferty, C.; François, D.; Venzi, M.; Lorincz, M.; Francis, D.; Zoe, A.; Recchia, G.; Orban, G.; Lambert, R.C.; di Giovanni, G. Cortical drive and thalamic feed-forward inhibition control thalamic output synchrony during absence seizures. Nat. Neurosci. 2018, 21 (Suppl. 9), 6. [Google Scholar] [CrossRef]
  18. Satarupa, D.; Sumalee, S.; Stella, M.; Surjith, K.; Carlo, M.; Sinoj, A. Light-induced ATP driven self-assembly of actin heavy-meromyosin in a proteo-tubularsomes as a step toward an artificial cell. Chem. Commun. 2018, 54. [Google Scholar]
  19. Mccormick, C.; Ciaramelli, E.; Luca, F.D.; Maguire, E.A. Comparing and Contrasting the Cognitive Effects of Hippocampal and Ventromedial Prefrontal Cortex Damage: A Review of Human Lesion Studies. Neuroscience 2018, 374, 295–318. [Google Scholar] [CrossRef]
  20. Shashikant, K.; Timothy, K.H. Waypoint Path Planning with Synaptic-Dependent Spike Latency. IEEE Trans. Circuits Syst. I: Regul. Pap. 2019, 6, 1544–1557. [Google Scholar]
  21. Caleb, J.; Kannan, M. VLSI Implementation of Constructive Neural Network for Skin Cancer Detection. J. Comput. Nanosci. 2018, 15, 485–492. [Google Scholar] [CrossRef]
  22. Hargreaves, S.; Bath, P.; Duffin, S.; Ellis, J. Sharing and empathy in digital spaces: Qualitative study of online health forums for breast cancer and motor neuron disease. (Amyotrophic Lateral Sclerosis). J. Med. Internet Res. 2018, 20, 222. [Google Scholar] [CrossRef] [PubMed]
  23. Yang, S.; Wang, J.; Li, S.; Li, H.; Wei, X.; Yu, H.; Deng, B. Digital implementations of thalamocortical neuron models and its application in thalamocortical control using FPGA for Parkinson’s disease. Neurocomputing 2016, 177, 274–289. [Google Scholar] [CrossRef]
  24. Luo, Y.; Wan, L.; Liu, J.; Harkin, J.; Cao, Y. An efficient, low-cost routing architecture for spiking neural network hardware implementations. Neural Process. Lett. 2018, 48, 1–12. [Google Scholar] [CrossRef]
  25. Yang, S.; Wang, J.; Li, S.; Deng, B.; Wei, X.; Yu, H.; Li, H. Cost-efficient FPGA implementation of basal ganglia and their Parkinsonian analysis. Neural Netw. 2015, 71, 62–75. [Google Scholar] [CrossRef]
  26. Yang, S.; Wei, X.; Wang, J.; Deng, B.; Liu, C.; Yu, H.; Li, H. Efficient hardware implementation of the subthalamic nucleus–external globus pallidus oscillation system and its dynamics investigation. Neural Netw. 2017, 94, 220–238. [Google Scholar] [CrossRef]
  27. Kuang, Z.; Wang, J.; Yang, S.; Yi, G.; Deng, B.; Wei, X. Digital Implementation of the Spiking Neural Network and Its Digit Recognition. In Proceedings of the 2019 Chinese Control and Decision Conference (CCDC), Nanchang, China, 3–5 June 2019. [Google Scholar]
  28. Sebastian, W.; Navaridas, J.; Mikel, L. A survey on optical network-on-chip architectures. ACM Comput. Surv. 2017, 50, 1–37. [Google Scholar]
  29. Yang, S.; Wang, J.; Deng, B.; Liu, C.; Li, H.; Fietkiewicz, C.; Loparo, K.A. Real-time neuromorphic system for large-scale conductance-based spiking neural networks. IEEE Trans Cybern. 2018, 49, 2490–2503. [Google Scholar] [CrossRef]
  30. Ayyldz, N.; Schmidt, E.G.; Güran, H.C. S-DIRECT: Scalable and Dynamically Reconfigurable TCAM Architecture for High-Speed IP Lookup. Comput. J. 2018, 58, 1443–1455. [Google Scholar]
  31. Jand, A.; Taheri-Nejad, M.R.; Mosleh, M.; Palizvan, M.R. Low, but Not High, Doses of Copper Sulfate Impair Synaptic Plasticity in the Hippocampal CA1 Region In Vivo. Biol. Trace Elem. Res. 2018, 185, 1–5. [Google Scholar] [CrossRef]
  32. Maícas, R.J.; Leng, G.; MacGregor, D.J. A Predictive, Quantitative Model of Spiking Activity and Stimulus-Secretion Coupling in Oxytocin Neurons. Endocrinology 2018, 159, 3. [Google Scholar] [CrossRef]
  33. Hao, J.; Hao, X.; Wang, J.; Yang, S.; Deng, B.; Yu, H. Behavior of a Hippocampal Spiking Network and FPGA Implementation. In Proceedings of the 2019 Chinese Control Conference (CCC), Guangzhou, China, 27–30 July 2019; pp. 8433–8438. [Google Scholar]
  34. Yang, S.; Deng, B.; Li, H.; Liu, C.; Wang, J.; Yu, H.; Qin, Y. FPGA implementation of hippocampal spiking network and its real-time simulation on dynamical neuromodulation of oscillations. Neurocomputing 2018, 282, 262–276. [Google Scholar] [CrossRef]
  35. Yang, S.; Wang, J.; Deng, B.; Wei, X.; Li, H.; Wang, T. FPGA-based spiking neural network with hippocampal oscillation dynamics towards biologically meaningful prostheses. In Proceedings of the 2018 13th World Congress on Intelligent Control and Automation (WCICA), Changsha, China, 4–8 July 2018. [Google Scholar]
  36. Hu, J.; Deng, B.; Yang, S.; Wei, X.; Wang, J. A real-time virtual manipulator simulation platform based on FPGA. In Proceedings of the 2019 Chinese Control Conference (CCC), Guangzhou, China, 27–30 July 2019; pp. 3114–3119. [Google Scholar]
  37. Li, H.; Yang, S.; Hao, X.; Wang, J.; Deng, B.; Yi, G. Real-time implementation of the Purkinje network on digital neuromorphic system. In Proceedings of the 2019 Chinese Control and Decision Conference (CCDC), Nanchang, China, 3–5 June 2019; pp. 3632–3637. [Google Scholar]
  38. Yang, S.; Wang, J.; Deng, B.; Li, H.; Che, Y. Digital Implementation of the Retinal Spiking Neural Network under Light Stimulation. In Proceedings of the 2019 9th International IEEE/EMBS Conference on Neural Engineering (NER), San Francisco, CA, USA, 20–23 March 2019; pp. 542–545. [Google Scholar]
  39. Dang, L.M. A survey on internet of things and cloud computing for healthcare. Electronics 2019, 8, 768. [Google Scholar] [CrossRef] [Green Version]
Figure 1. The overall structure of hippocampus and the hippocampal network in the CA3 region. The hippocampal network consists of pyramidal excitatory neurons and inhibitory neurons. These two types of neurons are coupled to each other through excitatory and inhibitory synapses with a variety of synaptic strengths.
Figure 2. Digital implementation of the proposed hippocampal neural network. (a) Digital implementation of the network-on-chip (NoC) architecture; (b) Digital implementation of the nucleus processor (NP); (c) Digital implementation of the router.
Figure 3. Digital implementation of the single neuron. (a) Digital implementation of the Izhikevich model; (b) Digital implementation of the coordinate rotation digital computer (CORDIC) algorithm.
Figure 4. Biological behaviors of the hippocampal neural network implemented on Stratix III field programmable gate array (FPGA), which are observed on the oscilloscope device. The voltage scale is 50 mV. (a) The output of the 1st excitatory neuron in the digital neuromorphic network (time scale is 20 μs); (b) The output of the 48,000th excitatory neuron in the digital neuromorphic network (time scale is 20 μs); (c) The output of the 1st inhibitory neuron in the digital neuromorphic network (time scale is 20 μs); (d) The output of the 16,000th inhibitory neuron in the digital neuromorphic network (time scale is 20 μs).
Figure 5. Performance analysis of the proposed CORDIC module. (a) The relationship of the computational precision, the iteration number, and the number of the CORDIC modules; (b) The relationship between the number of CORDIC modules and the hardware resource cost.
Figure 6. The wavelet transform of the frequency-domain analysis. (a) The oscillation dynamics under the modulation of negative direct current; (b) The oscillation dynamics under the modulation of positive direct current; (c) The oscillation dynamics under the modulation of low-frequency AC stimulation; (d) The oscillation dynamics under the modulation of high-frequency AC stimulation.
Table 1. Parameter values of the neuron models.
Parameter    Excitatory Neuron    Inhibitory Neuron
a            0.04                 0.04
b            5                    5
c            140                  140
τU           43 ± 4.3             100 ± 0
kU           0.24 ± 0.02          0.25 ± 0
V0           −65 ± 6.5            −65 ± 0
ΔU           10 ± 1               1 ± 0
Table 2. The hardware resource consumption when conventional and CORDIC-based implementation methods are used respectively to implement the hippocampal neural network.
Resources                    Conventional                  CORDIC-Based
Combinational ALUTs          19,652/203,520 (10%)          10,642/203,520 (5%)
Memory ALUTs                 490/101,760 (<1%)             712/101,760 (1%)
Dedicated logic registers    8924/203,520 (4%)             11,546/203,520 (6%)
Total block memory bits      7,771,246/15,040,512 (52%)    2106/15,040,512 (<1%)
DSP block 18-bit elements    2384/768 (310%)               0/768 (0%)
Total PLLs                   1/8 (13%)                     1/8 (13%)
Table 3. Comparison of the presented study with previous works.
Work          Motivation               Network Structure    Hardware Architecture
[27]          Visual pathway-inspired  Feedforward          No NoC design
[34]          Hippocampus-inspired     Feedforward          No NoC design
[36]          CPG-inspired             CPG                  No NoC design
[37]          Purkinje-inspired        Recurrent            No NoC design
[38]          Retina-inspired          Feedforward          No NoC design
This study    Hippocampus-inspired     Recurrent            Torus-based NoC design

Sun, W.; Wang, J.; Zhang, N.; Yang, S. Scalable Implementation of Hippocampal Network on Digital Neuromorphic System towards Brain-Inspired Intelligence. Appl. Sci. 2020, 10, 2857. https://doi.org/10.3390/app10082857
