Neural Network Optimization Based on Complex Network Theory: A Survey
Abstract
1. Introduction
2. Basic Concepts
2.1. Complex Network Model
2.1.1. Graph Representations
2.1.2. Random Network
2.1.3. Small-World Network
2.1.4. Scale-Free Network
2.2. Neural Network Model
2.2.1. Multilayer Perceptron
2.2.2. Restricted Boltzmann Machine
2.2.3. Convolutional Neural Network
2.2.4. Graph Neural Network
2.2.5. Spiking Neural Network
3. Studies of Improved Performance Achieved by Complex Network-Based ANNs
4. Discussion
4.1. Input Data Optimization
4.2. Complex Network Topology
4.3. Complex Network Neural Network Applications
5. Conclusions
- Complex network optimization tools can be applied both to the big data used for neural network training and directly to the neural network structure, yielding improved accuracy and robustness (a minimal sketch of a topology-derived connectivity mask follows this list).
- However, a joint optimization approach that simultaneously targets both the data and the neural network using complex network theory does not yet exist. In our opinion, further performance improvement over conventional methods is possible by considering both the input/output data structure and the target neural network structure.
- To stimulate future interest in the field of complex network theory-based neural network research, more work should be carried out in other application areas, such as AI-based self-driving.
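To make the first point concrete, below is a minimal sketch that derives a binary connectivity mask for one dense MLP layer from a Watts-Strogatz small-world graph, so that training uses only the small-world connections. It assumes one simple way of partitioning graph nodes between inputs and outputs; the function `small_world_mask`, the layer sizes, and the WS parameters are illustrative choices, not taken from any of the surveyed works.

```python
# Hypothetical sketch: a sparse MLP layer mask derived from a
# Watts-Strogatz small-world graph (names and sizes are illustrative).
import networkx as nx
import numpy as np

def small_world_mask(n_in: int, n_out: int, k: int = 4, p: float = 0.1,
                     seed: int = 0) -> np.ndarray:
    """Build a binary connectivity mask for a dense layer by mapping
    the first n_in nodes of a WS graph to inputs and the rest to outputs."""
    g = nx.watts_strogatz_graph(n_in + n_out, k, p, seed=seed)
    mask = np.zeros((n_in, n_out), dtype=np.float32)
    for u, v in g.edges():
        # Keep only edges that cross the input/output partition;
        # edges within one side of the partition are discarded.
        if u < n_in <= v:
            mask[u, v - n_in] = 1.0
        elif v < n_in <= u:
            mask[v, u - n_in] = 1.0
    return mask

mask = small_world_mask(784, 128)
# Multiply the weight matrix element-wise by `mask` at each forward pass
# so gradient updates only flow through the small-world connections.
print(f"kept {mask.mean():.1%} of the 784x128 dense connections")
```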
Author Contributions
Funding
Conflicts of Interest
References
- Rosenblatt, F. Principles of Neurodynamics. Perceptrons and the Theory of Brain Mechanisms; Cornell Aeronautical Lab Inc.: Buffalo, NY, USA, 1961. [Google Scholar]
- Montavon, G.; Samek, W.; Müller, K.R. Methods for interpreting and understanding deep neural networks. Digit. Signal Process. 2018, 73, 1–15. [Google Scholar]
- Gu, J.; Wang, Z.; Kuen, J.; Ma, L.; Shahroudy, A.; Shuai, B.; Liu, T.; Wang, X.; Wang, G.; Cai, J.; et al. Recent advances in convolutional neural networks. Pattern Recognit. 2018, 77, 354–377. [Google Scholar]
- Available online: https://learnopencv.com/number-of-parameters-and-tensor-sizes-in-convolutional-neural-network/ (accessed on 1 November 2022).
- Stauffer, D.; Aharony, A.; da Fontoura Costa, L.; Adler, J. Efficient Hopfield pattern recognition on a scale-free neural network. Eur. Phys. J. B-Condens. Matter Complex Syst. 2003, 32, 395–399. [Google Scholar] [CrossRef] [Green Version]
- Simard, D.; Nadeau, L.; Kröger, H. Fastest learning in small-world neural networks. Phys. Lett. A 2005, 336, 8–15. [Google Scholar] [CrossRef] [Green Version]
- Bohland, J.W.; Minai, A.A. Efficient associative memory using small-world architecture. Neurocomputing 2001, 38, 489–496. [Google Scholar] [CrossRef]
- Perotti, J.I.; Tamarit, F.A.; Cannas, S.A. A scale-free neural network for modelling neurogenesis. Phys. A Stat. Mech. Its Appl. 2006, 371, 71–75. [Google Scholar] [CrossRef] [Green Version]
- Kaviani, S.; Sohn, I. Influence of random topology in artificial neural networks: A survey. ICT Express 2020, 6, 145–150. [Google Scholar] [CrossRef]
- Kaviani, S.; Sohn, I. Application of complex systems topologies in artificial neural networks optimization: An overview. Expert Syst. Appl. 2021, 180, 115073. [Google Scholar] [CrossRef]
- Sohn, I. Small-world and scale-free network models for IoT systems. Mob. Inf. Syst. 2017. [Google Scholar] [CrossRef] [Green Version]
- Newman, M. Networks: An Introduction; Oxford University Press: Oxford, UK, 2010. [Google Scholar]
- Watts, D.; Strogatz, S. Collective dynamics of ‘small-world’ networks. Nature 1998, 393, 440–442. [Google Scholar] [CrossRef]
- Kleinberg, J.M. Navigation in a small world. Nature 2000, 406, 845. [Google Scholar] [CrossRef] [PubMed]
- Newman, M. Models of the small world. J. Stat. Phys. 2000, 101, 819–841. [Google Scholar] [CrossRef]
- Barabási, A.-L.; Albert, R. Emergence of scaling in random networks. Science 1999, 286, 509–512. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Albert, R.; Barabási, A.-L. Statistical mechanics of complex networks. Rev. Mod. Phys. 2002, 74, 47–97. [Google Scholar]
- Barabási, A.-L. Scale-free networks: A decade and beyond. Science 2009, 325, 412–413. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Hinton, G.E. Training products of experts by minimizing contrastive divergence. Neural Comput. 2002, 14, 1771–1800. [Google Scholar] [CrossRef]
- Sohn, I. Deep belief network based intrusion detection techniques: A survey. Expert Syst. Appl. 2021, 167, 114170. [Google Scholar]
- Zhu, J.; Goyal, S.B.; Verma, C.; Raboaca, M.S.; Mihaltan, T.C. Machine learning human behavior detection mechanism based on python architecture. Mathematics 2022, 10, 3159. [Google Scholar] [CrossRef]
- Zhou, J.; Cui, G.; Hu, S.; Zhang, Z.; Yang, C.; Liu, Z.; Wang, L.; Li, C.; Sun, M. Graph neural networks: A review of methods and applications. AI Open 2020, 1, 57–81. [Google Scholar]
- Wu, J.; Chua, Y.; Zhang, M.; Yang, Q.; Li, G.; Li, H. Deep spiking neural network with spike count based learning rule. In Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 14 July 2019; pp. 1–6. [Google Scholar]
- Zheng, P.; Tang, W.; Zhang, J. A simple method for designing efficient small-world neural networks. Neural Netw. 2010, 23, 155–159. [Google Scholar] [CrossRef]
- Li, X.; Xu, F.; Zhang, J.; Wang, S. A multilayer feed forward small-world neural network controller and its application on electrohydraulic actuation system. J. Appl. Math. 2013, 2013, 872790. [Google Scholar] [CrossRef] [Green Version]
- Mocanu, D.C.; Mocanu, E.; Stone, P.; Nguyen, P.H.; Gibescu, M.; Liotta, A. Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science. Nat. Commun. 2018, 9, 1–12. [Google Scholar]
- Ribas, L.C.; Sá Junior, J.J.M.; Scabini, L.F.; Bruno, O.M. Fusion of complex networks and randomized neural networks for texture analysis. Pattern Recognit. 2020, 103, 107189. [Google Scholar] [CrossRef] [Green Version]
- Wang, S.; Zhao, X.; Wang, H.; Li, M. Small-world neural network and its performance for wind power forecasting. CSEE J. Power Energy Syst. 2019, 6, 362–373. [Google Scholar]
- Huang, Z.; Du, X.; Chen, L.; Li, Y.; Liu, M.; Chou, Y.; Jin, L. Convolutional neural network based on complex networks for brain tumor image classification with a modified activation function. IEEE Access 2020, 8, 89281–89290. [Google Scholar]
- Gao, H.; Yu, X.; Sui, Y.; Shao, F.; Sun, R. Topological Graph Convolutional Network Based on Complex Network Characteristics. IEEE Access 2022, 10, 64465–64472. [Google Scholar] [CrossRef]
- Lee, Y.H.; Sohn, I. Reconstructing damaged complex networks based on neural networks. Symmetry 2017, 9, 310. [Google Scholar] [CrossRef] [Green Version]
- Wei, X.; Li, Y.; Qin, X.; Xu, X.; Li, X.; Liu, M. From decoupling to reconstruction: A robust graph neural network against topology attacks. In Proceedings of the 2020 International Conference on Wireless Communications and Signal Processing (WCSP), Nanjing, China, 21 October 2020; pp. 263–268. [Google Scholar]
- Guo, L.; Man, R.; Wu, Y.; Lu, H.; Yu, H. Anti-injury function of complex spiking neural networks under random attack and its mechanism analysis. IEEE Access 2020, 8, 153057–153066. [Google Scholar] [CrossRef]
- Guo, L.; Man, R.; Wu, Y.; Lu, H.; Xu, G. Anti-injury function of complex spiking neural networks under targeted attack. Neurocomputing 2021, 462, 260–271. [Google Scholar] [CrossRef]
- Kaviani, S.; Shamshiri, S.; Sohn, I. A defense method against backdoor attacks on neural networks. Expert Syst. Appl. 2023, 213, 118990. [Google Scholar] [CrossRef]
- Nawi, N.M.; Atomi, W.H.; Rehman, M.Z. The effect of data pre-processing on optimized training of artificial neural networks. Procedia Technol. 2013, 11, 32–39. [Google Scholar] [CrossRef] [Green Version]
- Schliebs, S.; Defoin-Platel, M.; Worner, S.; Kasabov, N. Integrated feature and parameter optimization for an evolving spiking neural network: Exploring heterogeneous probabilistic models. Neural Netw. 2009, 22, 623–632. [Google Scholar] [CrossRef] [PubMed]
- Lee, J.; Shin, S.; Yoon, S.; Kim, T. Survey on artificial intelligence & machine learning models and datasets for network intelligence. J. Korean Inst. Commun. Inf. Sci. 2022, 47, 625–643. [Google Scholar]
- Zhang, J.; Small, M. Complex network from pseudoperiodic time series: Topology versus dynamics. Phys. Rev. Lett. 2006, 96, 238701. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Shandilya, S.G.; Timme, M. Inferring network topology from complex dynamics. New J. Phys. 2011, 13, 013004. [Google Scholar] [CrossRef]
- Anghinoni, L.; Zhao, L.; Ji, D.; Pan, H. Time series trend detection and forecasting using complex network topology analysis. Neural Netw. 2019, 117, 295–306. [Google Scholar] [CrossRef] [PubMed]
| Reference | Complex Network | Construction Method | Degree Distribution | Robustness Against Random Attack |
|---|---|---|---|---|
| [11,12] | Random network | Erdös and Rényi | Poisson | Strong |
| [13,14,15] | Small-world network | Watts and Strogatz | Poisson | Weak |
| [16,17,18] | Scale-free network | Barabási and Albert | Power law | Strong |
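As a quick illustration of the three construction methods listed above, the following sketch generates one instance of each model with the standard networkx generators (the size and parameters are illustrative values only) and prints degree statistics. ER and WS degrees cluster near the mean (Poisson-like), while the BA power-law tail shows up as hub nodes with a much larger maximum degree.

```python
# Minimal sketch of the three construction methods in the table, using
# networkx generators; n, p, k, and m are illustrative values only.
import networkx as nx

n = 1000
models = {
    "random (ER)":      nx.erdos_renyi_graph(n, p=0.01, seed=1),
    "small-world (WS)": nx.watts_strogatz_graph(n, k=10, p=0.1, seed=1),
    "scale-free (BA)":  nx.barabasi_albert_graph(n, m=5, seed=1),
}
for name, g in models.items():
    degrees = [d for _, d in g.degree()]
    # ER and WS stay near the mean degree; BA's power-law tail
    # appears as a much larger maximum degree (hub nodes).
    print(f"{name}: mean degree {sum(degrees) / n:.1f}, "
          f"max degree {max(degrees)}, "
          f"clustering {nx.average_clustering(g):.3f}")
```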
| Ref. | Complex Network | Neural Network | Input Data | Key Technology | Applications |
|---|---|---|---|---|---|
| [24] | SWN | MLP | Chinese characters | Synaptic connection elimination | Chinese character recognition |
| [25] | SWN | MLP | Position control signal | ANN link rewiring | Electrohydraulic control system |
| [26] | SFN | MLP, RBM, CNN | MNIST, FMNIST, CIFAR10, HIGGS | Sparse evolutionary training | Image recognition |
| [27] | RN | RNN | Brodatz, Outex, USPTex, Vistex | Image to CN topology mapping | Texture image recognition |
| [28] | SWN | MLP | Wind-related data: speed, direction, power | WS- and NW-based CNNN construction | Wind power forecasting |
| [29] | SWN, SFN | CNN | Medical dataset: meningioma, glioma, pituitary tumor | ER-, WS-, BA-based CNNN construction | Brain tumor image classification |
| [30] | RN | GCN | Citeseer, UAI2010, ACM, BlogCatalog, Flickr | Graph feature-based CNN | Topological feature extraction |
| [31] | SWN, SFN | MLP | Complex network topological features | Network topology extraction | Damaged network reconstruction |
| [32] | RN | GCN | Cora-ml, Citeseer, PubMed | Attribute subgraph generation | Damaged network reconstruction |
| [33] | SWN, SFN | SNN | Neural network attack | Graph feature-based SNN | Anti-injury performance |
| [34] | SWN, SFN | SNN | Neural network attack | Graph feature-based SNN | Anti-injury performance |
| [35] | SFN | MLP | MNIST, FMNIST, HODA, Oracle-MNIST | Link pruning-based SFN construction | ANN defense against attack |
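One of the key technologies named above, sparse evolutionary training ([26]), alternates ordinary weight updates with a prune-and-regrow step on a sparse connectivity mask. The following is a hedged sketch of that step under our own simplifying assumptions, not the authors' implementation; `set_step`, the `zeta` fraction, and the regrowth initialization are illustrative.

```python
# Hedged sketch of one prune-and-regrow step in the spirit of sparse
# evolutionary training [26]; names and hyperparameters are illustrative.
import numpy as np

def set_step(w, mask, zeta=0.3, rng=None):
    """Drop the zeta fraction of active weights closest to zero, then
    regrow the same number of connections at random empty positions."""
    rng = rng or np.random.default_rng(0)
    active = np.flatnonzero(mask)            # flat indices of live weights
    n_prune = int(zeta * active.size)
    # Prune: remove the active weights with the smallest magnitude.
    drop = active[np.argsort(np.abs(w.flat[active]))[:n_prune]]
    mask.flat[drop] = 0
    w.flat[drop] = 0.0
    # Regrow: activate an equal number of currently empty positions,
    # giving the new connections small random initial weights.
    empty = np.flatnonzero(mask == 0)
    grow = rng.choice(empty, size=n_prune, replace=False)
    mask.flat[grow] = 1
    w.flat[grow] = rng.normal(0.0, 0.01, size=n_prune)
    return w, mask

# Example: one evolution step on a 784x128 layer at ~10% initial density.
rng = np.random.default_rng(0)
mask = (rng.random((784, 128)) < 0.1).astype(np.int8)
w = rng.normal(0.0, 0.01, (784, 128)) * mask
w, mask = set_step(w, mask, rng=rng)
```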
Share and Cite
Chung, D.; Sohn, I. Neural Network Optimization Based on Complex Network Theory: A Survey. Mathematics 2023, 11, 321. https://doi.org/10.3390/math11020321