2. Literature Review
2.1. Applications for Neural Networks
Neural networks are used for a wide range of applications in computing, science, engineering, technology, environmental science, agriculture, mining, climate, business, the arts, and more. Many modern applications for neural networks have also been developed, including image recognition, speech recognition, character recognition, machine translation, stock market prediction, medical diagnosis [1], counterfeit product classification [13], sound-absorbing board optimization [14], the optimization of channel equalization in wireless communication [15], the optimization of IoT intrusion detection system performance [16], diabetic retinopathy classification [17], the mitigation of cybersecurity alert fatigue [18], the optimization of hand gesture recognition [19], the optimization of indoor human activity recognition [20], the optimization of multi-type object tracking [21], the mitigation of global warming impacts on marine ecosystems [22], the optimization of oil layer prediction [23], the identification of optimal parameters for recurrent neural network training [24], and the optimal classification of noises in QR codes [25]. Among all these NN applications, the most successful and well-known is image recognition, which is based on the convolutional neural network (CNN) architecture. We discuss this in more detail later in the paper.
There are diverse applications for neural networks, but some have become especially common in everyday life; a few examples are listed below.
1. Facial recognition
Facial recognition systems are becoming more popular as intelligent surveillance systems. Facial recognition solutions map out human faces and compare them to previously stored images. These systems are often used at building entrances to authenticate human faces and match them with the data stored in their facial image databases.
As stated above, CNNs are often used for image recognition. For facial recognition, a large number of human face images are imported into a database to train the CNN, although these stored images require further preprocessing before training. CNN models are based on linear algebra principles, and matrix multiplication is key for the representation of data and weights. The basic structure of a CNN comprises three layers: a convolutional layer, a pooling layer, and a fully connected layer. The convolutional layer applies a set of learned filters to the input and generates feature maps using an activation function. The pooling layer downsamples the feature maps. Finally, the fully connected layer, also known as the classifier, applies an activation function to classify the images [26].
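To make the three-layer structure concrete, below is a minimal sketch of a single forward pass through a convolutional layer, a max-pooling layer, and a fully connected classifier. This is an illustration only, not the implementation used in any of the cited works; the image size, filter size, and random weights are all hypothetical.

```python
import numpy as np

def relu(x):
    """Activation function applied after the convolutional layer."""
    return np.maximum(0, x)

def conv2d(image, kernel):
    """Valid 2-D convolution of a single-channel image with one filter."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Downsample a feature map over non-overlapping windows."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

def softmax(z):
    """Turn classifier scores into class probabilities."""
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
image = rng.random((8, 8))            # toy grayscale "face" image
kernel = rng.random((3, 3))           # one filter (random here, learned in practice)
fmap = relu(conv2d(image, kernel))    # convolutional layer + activation
pooled = max_pool(fmap)               # pooling layer downsamples the feature map
flat = pooled.ravel()                 # flatten for the classifier
weights = rng.random((2, flat.size))  # fully connected layer, two classes
probs = softmax(weights @ flat)       # class probabilities
```

In practice, the filter and classifier weights are learned from the face image database via back-propagation rather than drawn at random.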
Below are some examples of advanced applications for neural networks within facial recognition:
The aggregation of the spatial and temporal convolutional features of CNNs for video-based facial expression recognition [27].
The use of hybrid data augmentation and lightweight convolutional neural network solutions to estimate the age of faces in images [28].
Improved accuracy in facial emotion recognition using a new hybrid HOG-ESRs (histogram of oriented gradients–ensembles with shared representations) algorithm [29].
2. Speech recognition
Speech recognition is another example of a technology that has become so common in our daily lives that it is used without a second thought. Through popular voice-controlled systems, such as Amazon’s Alexa and Apple’s Siri, automatic speech recognition (ASR) is found throughout our computers, tablet devices, home speakers, and cell phones. ASR technology is now also used in business, for instance, in customer service and many other areas. Enterprises have quickly embraced ASR technology as a means to improve operational efficiency and simplify processes.
Neural networks have been applied in speech recognition technology for some time. Recurrent neural networks (RNNs) are a unique type of NN, originally designed for sequential data processing, that incorporate a concept of “memory”. RNNs have been employed to develop speech recognition systems with dramatic success. Moreover, a special and powerful RNN architecture with improved memory, known as long short-term memory (LSTM), has been developed. LSTM has proven especially effective in speech recognition [30,31].
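The “memory” of an RNN is its hidden state, which is carried from one time step to the next. Below is a generic sketch of a vanilla RNN cell; this is our illustration, not a model from the cited works, and all dimensions and weight values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
input_size, hidden_size, steps = 4, 3, 5

# Randomly initialized weights; in a real system these are learned.
W_x = rng.standard_normal((hidden_size, input_size)) * 0.1
W_h = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b = np.zeros(hidden_size)

def rnn_step(x, h):
    """One RNN time step: the new hidden state mixes the current
    input with the previous hidden state -- the network's 'memory'."""
    return np.tanh(W_x @ x + W_h @ h + b)

h = np.zeros(hidden_size)                            # empty memory at the start
sequence = rng.standard_normal((steps, input_size))  # e.g. a few audio frames
for x in sequence:
    h = rnn_step(x, h)                               # memory updated every step
```

LSTM extends this cell with gates that control what the hidden state keeps and forgets, which is what makes it effective on long audio sequences.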
Below are some examples of advanced applications for neural networks within speech recognition:
A convolutional neural network (CNN) structure that can divide continuous speech into syllables by converting sound patterns into the form of RGB (red, green, blue) images [32];
A hybrid CNN and bi-directional long short-term memory (BiLSTM) model for a cross-language end-to-end speech recognition system based on transfer learning [33];
A hybrid residual time delay neural network (ResTDNN) and BiLSTM model for slot filling in speech recognition [34].
2.2. Applications for Genetic Algorithms
Numerous researchers and scientists have applied genetic algorithms in mathematics, computer science, engineering, business, finance, economics, and the social sciences. Many scientists have applied GA concepts to computer science problems, such as the optimization of computer architectures [35,36]. Another important topic of study is the optimization of distributed computer networks, where the main objective is to minimize the cost of designing distributed network topologies [37]. Other applications have also been developed, such as file allocation methods for distributed systems, signal processing and filtering, hardware debugging, and more. In the finance sector, GAs are used for the automation of sophisticated trading systems, the valuation of real options, and portfolio optimization [38,39]. GAs are also used to solve problems in industry, management, and engineering, for instance, airline revenue management, container loading optimization, automated planning of structural inspection, control engineering, mechanical engineering, marketing mix analysis, optimization of mobile communications infrastructure, plant floor layout design, quality control management, bearing placement optimization, and the identification of optimal routes for multiple vehicles [40,41,42,43,44]. Clearly, GAs can be applied to solve a broad range of general problems [45,46,47].
Below are some examples of advanced applications for genetic algorithms:
A novel aggregated multi-objective genetic algorithm (MOGA) and a variant particle swarm optimization (PSO) solution for 5G software-defined networking (SDN) architecture [48];
Combined support vector machine (SVM) and GA solutions for the detection of cyber-attacks and malicious intrusions [49];
An integrated auto-generated multi-attribute design structure matrix (DSM) and GA solution for modular design [50];
A GA mechanism that can optimize proof-of-work (PoW) blockchain parameters [51].
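All of the applications above share the same underlying GA workflow: a population of candidate solutions is repeatedly improved through selection, crossover, and mutation. A minimal sketch on a toy objective (the OneMax problem, which maximizes the number of 1-bits; all parameter values are illustrative):

```python
import random

random.seed(42)

def fitness(bits):
    """Toy objective: maximize the number of 1-bits (OneMax)."""
    return sum(bits)

def crossover(a, b):
    """Single-point crossover of two parent bit strings."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(bits, rate=0.05):
    """Flip each bit independently with a small probability."""
    return [1 - b if random.random() < rate else b for b in bits]

length, pop_size, generations = 20, 30, 40
population = [[random.randint(0, 1) for _ in range(length)]
              for _ in range(pop_size)]

for _ in range(generations):
    # Selection: keep the fitter half of the population as parents.
    population.sort(key=fitness, reverse=True)
    parents = population[: pop_size // 2]
    # Reproduction: refill the population with mutated offspring.
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(pop_size - len(parents))]
    population = parents + children

best = max(population, key=fitness)
```

Real applications differ only in the encoding and the fitness function: a network topology, a trading rule, or a vehicle route takes the place of the bit string.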
2.3. Applications for Combination of Genetic Algorithms and Neural Networks
The first attempts at combining genetic algorithms and neural networks were made in the late 1980s. Since then, many researchers and scholars have published articles about applications for genetic algorithms and neural networks (GANNs). Various combined genetic algorithm and neural network approaches have been explored, including solar system performance prediction [11], face recognition [52], animation [53], color recipe prediction [54], crude fractional distillation processing [55], and more. In fact, several combinations of genetic algorithms and neural networks have been developed by researchers. Schaffer et al. found that these GANN combinations can be categorized into two groups: supportive combinations and collaborative combinations. NNs and GAs are applied sequentially in supportive combinations and simultaneously in collaborative combinations [12,56,57]. GAs can also be used in machine learning for feature selection in recurrent neural networks and for training artificial neural networks when pre-classified examples are not available [58].
Below are some examples of advanced applications for NN and GA combinations:
A hybrid cascade neural network and metaheuristic optimization genetic algorithm for space–time forecasting [59];
A combined GA and back-propagation (BP) neural network for early warning systems in coal mines [60];
An integrated NN and GA framework for sequential semi-supervised classification [61].
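As a toy illustration of using a GA to train a neural network (our own sketch, not taken from the cited works), the GA can search directly over the network’s weights instead of using back-propagation. Here the task, the 2-2-1 network size, and every parameter value are hypothetical:

```python
import math
import random

random.seed(7)

# XOR: a tiny task that a linear model cannot solve.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def predict(w, x):
    """Forward pass of a 2-2-1 network; w is a flat list of 9 weights."""
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return 1 / (1 + math.exp(-(w[6] * h1 + w[7] * h2 + w[8])))

def loss(w):
    """Mean squared error over the XOR dataset (lower is fitter)."""
    return sum((predict(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def mutate(w, scale=0.3):
    """Perturb every weight with Gaussian noise."""
    return [wi + random.gauss(0, scale) for wi in w]

# Evolve a population of weight vectors: keep the 10 fittest,
# refill the rest with mutated copies of the survivors.
pop = [[random.uniform(-1, 1) for _ in range(9)] for _ in range(40)]
for _ in range(200):
    pop.sort(key=loss)
    survivors = pop[:10]
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(30)]

best = min(pop, key=loss)
```

This is the situation mentioned above in which GAs can train a network when gradient information or pre-classified examples are unavailable; in the supportive and collaborative combinations of the cited works, the GA and NN roles are divided differently.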
2.4. Laws of Bibliometrics
Statistical and mathematical methods are usually applied to analyze information about publications, involving three important laws of bibliometrics: Lotka’s law, Bradford’s law, and Zipf’s law. These three classical laws are the pillars of bibliometrics and are often applied in bibliometric research to test the applicability of these distributions to sets of publications [62,63,64].
2.4.1. Lotka’s Law
Lotka’s law describes the productivity of authors within a specified research area. The law states that about 60% of authors in a specified research area publish only one article, 15% publish two articles (1/2² × 0.60), 7% publish three articles (1/3² × 0.60), and so on. When Lotka’s law is applied to sets of articles over a specified time period, the results are similar but not exact. Lotka’s law is used to assess the frequency with which authors publish articles [65,66,67]. It is also deemed quite useful for studying the productivity patterns of authors in bibliographies. In this research, we selected articles published between 2002 and 2021 and performed an author productivity check to observe NN and GA research trends and forecasts. For verification purposes, we also applied the Kolmogorov–Smirnov (K–S) test to check whether our results complied with Lotka’s inverse square law [68,69,70].
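The expected author distribution under Lotka’s inverse square law follows directly from the percentages given above; a short sketch:

```python
def lotka_share(n, c=0.60):
    """Expected proportion of authors publishing n articles under
    Lotka's inverse square law: c / n^2, with c = 60% for n = 1."""
    return c / n ** 2

# 60% publish one article, 15% publish two, about 6.7% publish three, ...
shares = [lotka_share(n) for n in range(1, 5)]
```

A K–S test then compares this theoretical cumulative distribution against the observed author productivity counts.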
2.4.2. Zipf’s Law
Zipf’s law is often described as the word frequency law and is used to characterize the frequency distribution of words within articles. It states that the most frequent word in a long article is used roughly twice as often as the second most frequent word, three times as often as the third most frequent word, and so on. Zipf’s law is not statistically exact, but it is still quite valuable for bibliometric researchers [71,72].
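Zipf’s prediction is that frequency is inversely proportional to rank; a short sketch with an illustrative top count:

```python
def zipf_frequency(rank, top_count):
    """Predicted count of the word at a given rank: frequency ∝ 1/rank."""
    return top_count / rank

# If the most frequent word appears 600 times, Zipf's law predicts
# counts of 600, 300, 200, and 150 for ranks 1 through 4.
predicted = [round(zipf_frequency(r, 600)) for r in (1, 2, 3, 4)]
```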
2.4.3. Bradford’s Law
Bradford’s law is a universal bibliometric principle that determines the number of core journals within a specified research area. It separates the journals within a research area into three categories, each containing an equal number of articles. The first category comprises the core journals on the subject, which tend to be few in number but produce approximately one-third of the articles. The second category comprises the same number of articles but a larger number of journals. The third category comprises the same number of articles but a much larger number of journals. Bradford found that the numbers of journals in the three categories were proportional (1 : n : n²). Bradford studied 326 geophysics journals and their bibliographies and found that 429 articles appeared in 9 journals, 499 articles in 59 journals, and 404 articles in 258 journals; the proportion was 9 : 59 : 258, which is close to 9 : 45 : 225. Based on this, Bradford’s law is not very precise, but it is still regarded as a general bibliometric principle by many researchers [71,73].
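The 1 : n : n² relationship can be checked directly; a short sketch using Bradford’s own geophysics figures, where the 9 : 45 : 225 comparison corresponds to n = 5:

```python
def bradford_zones(core_journals, n):
    """Journal counts for the three Bradford zones, proportional to 1 : n : n^2."""
    return [core_journals, core_journals * n, core_journals * n ** 2]

# Bradford's geophysics data had zones of 9, 59, and 258 journals;
# the model with 9 core journals and n = 5 predicts 9, 45, and 225.
predicted = bradford_zones(9, 5)
```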
4. Conclusions
We used a bibliometric analytical technique to examine NN and GA publications in SSCI journals from 2002 to 2021, finding 951 NN publications and 878 GA publications in total. We then analyzed these publications and obtained our final research results by evaluating the following eight criteria: (1) publication year; (2) citation count; (3) country/territory; (4) institution name; (5) document type; (6) language; (7) subject area; and (8) source title. Furthermore, we applied the Kolmogorov–Smirnov test to verify whether the distributions of author productivity in NN and GA research complied with Lotka’s law, and we performed an advanced h-index analysis. In sum, this paper provides the following findings and implications:
First, according to our analysis by publication year, we found that the number of NN publications has grown faster than the number of GA publications and that NN research has become much more popular. Based on the ascending trends of NN and GA publications, we predict that research in both areas will continue to grow. We also predict that NNs and GAs will remain important in artificial intelligence research for quite some time due to the third boom in AI development. Second, according to our analysis of NN and GA citation trends, we observed upward trends for NN and GA publications and predict that these trends will continue, as long as the AI industry keeps booming.
Third, on the basis of our country/territory analysis of NN and GA publications, we found that China and the USA contribute the most to NN and GA research, and we observed the significance of AI within the context of the USA–China confrontation. Iran was in third place for NN publications and Taiwan was in third place for GA publications. We also conclude that contributions from Turkey, South Korea, and Spain are growing fast, making them potential competitors in NN and GA research. Fourth, according to our research institution analysis, the top NN research institution was the University of Tehran and the top GA research institution was the Islamic Azad University; both are Iranian institutions and both published over 20 articles from 2002 to 2021. The second-place institutions in NN and GA research were the Universiti Malaya and the Hong Kong Polytechnic University, respectively, and the third-place institutions were the Islamic Azad University and the Iran University of Science and Technology, respectively. Furthermore, we observed that Iran was the most productive country for NN and GA research.
Fifth, from our analysis of document types in NN and GA research, we found that articles are still the major and most popular document type for NN and GA research publications. Sixth, from our analysis of the languages used in NN and GA research, and given that English is the predominant scientific language most widely used among scientists worldwide, we concluded that English was still the number one language in NN and GA research publications. Seventh, from our analysis, we found that computer science and engineering were the major domains of NN and GA research and that environmental sciences and ecology research is catching up due to the rapid development of ESG. Furthermore, we identified many other important and promising research domains for NN and GA publications, including business economics, mathematics, psychology, transportation, information science, library science, energy, etc. We predict that these research areas will become more popular in the future.
Eighth, we discovered that the top three NN research source titles were Sustainability, Expert Systems with Applications, and the International Journal of Environmental Research and Public Health, while the top three GA research source titles were Expert Systems with Applications, Sustainability, and the Journal of the Operational Research Society. Additionally, there were many other popular research source titles for NN and GA publications, including Neural Computing and Applications, Energy Policy, PLoS ONE, Applied Soft Computing, Computational Economics, Computers & Industrial Engineering, etc. We predict that these source titles will become more attractive in the near future.
Finally, we performed an advanced h-index analysis of NN and GA publications using different criteria, such as author, institution, and country, to measure researcher contributions. We obtained the following results:
The average h-index value of GA publications was higher than that of NN publications, as shown in Table 13.
Iranian institutions contributed the most NN and GA publications, as shown in Table 14.
The USA contributed the most to NN and GA research, with the highest h-index values and the second highest quantity of NN and GA publications. In contrast, China had the second highest h-index values in NN and GA research but produced the highest quantity of NN and GA publications.
We hope that our bibliometric analysis results will serve as a roadmap for other NN and GA researchers and act as guidelines for further studies. In the future, we suggest conducting similar research based on more innovative hybrid classical algorithm models and new bibliometric methods, such as y-index analysis, frequent keyword analysis, the “life” of references cited in journals, and other advanced data-analysis techniques [75,76].