A Synergistic MOEA Algorithm with GANs for Complex Data Analysis
Abstract
1. Introduction
- This paper introduces GANs into the MOEA/D algorithm. The resulting algorithm combines the advantages of both MOEAs and GANs: it retains the simple framework of multi-objective evolutionary algorithms while exploiting the strengths of GANs in sample generation and high-dimensional data analysis.
- This paper introduces an adaptive entropy control strategy built around a population-entropy-based control parameter. This strategy balances the trade-off between exploration and exploitation by controlling the diversity of the GAN-generated population, effectively avoiding premature convergence and loss of population diversity in the APG-SMOEA algorithm.
- This paper introduces a quality hybrid memory pool to store and retrieve optimal solutions. This strategy not only improves the algorithm's ability to maintain diversity and avoid premature convergence but also alleviates the GAN's demand for high-quality training data. By storing the best solutions from previous generations, we prevent the algorithm from becoming stuck in a local optimum while still allowing exploration of new regions of the search space.
- Our algorithm effectively mitigates the common GAN problems of training complexity and mode collapse. Experimental results on benchmarks demonstrate that the proposed algorithm outperforms existing methods in solution quality and diversity on complex data of both low and high dimensionality.
2. Related Work
2.1. MOEAs
- Domination-based MOEAs [26], such as NSGA-II [6,27] and NSGA-III [7], usually achieve convergence by means of the Pareto-dominance principle while maintaining the diversity of solutions via explicit diversity preservation. However, such MOEAs suffer from slow convergence, since the Pareto-dominance principle provides low selection pressure. Domination-based MOEAs can be described as follows: given a population of size N and the objective vector of the i-th individual, the domination set of that individual is defined by Equation (1).
- Indicator-based MOEAs [28,29,30,31,32] adopt quality indicators, defining the selection mechanism as a function that evaluates approximation sets. For example, in the S-Metric Selection Evolutionary Multi-objective Algorithm (SMS-EMOA) [8], the dominated volume of an approximation set is calculated with the hypervolume indicator, on the basis of which a total order among the related solutions is imposed. An example of a weighted-sum indicator calculation is shown in Equation (3).
- Decomposition-based MOEAs, exemplified by MOEA/D [33,34,35], decompose an MOP into several scalar sub-problems by employing a decomposition strategy; the sub-problems are then solved simultaneously by optimizing the solution of each one. To cope with the limitations of MOEA/D, some works [14] have proposed new methods for weight-vector generation. For example, MOEA/D-AWA [9] employs an adaptive weight-vector adjustment strategy to achieve better uniformity of solutions on the target MOPs. The general form of MOEA/D is given in Equation (4). In the MOEA/D algorithm, the multi-objective optimization problem is decomposed into a series of single-objective sub-problems, each corresponding to a reference point. For each reference point, a weight vector is defined and the weighted (scalarized) function value is calculated; each sub-problem then consists of minimizing this scalarized objective. An evolutionary algorithm or other optimization method solves each sub-problem, yielding a set of non-dominated solutions; all non-dominated solutions are merged into a solution set S, from which a set of non-dominated solutions is selected as the final result. These steps are repeated until the termination condition is met.
- Hybrid MOEAs [36,37,38] often combine different algorithms in a unified framework to solve complex MOPs. For example, Hybrid-MOEA/D-I [11] combines a differential evolution algorithm with a genetic algorithm to improve performance, effectively optimizing MOPs in wireless sensor networks. Moreover, the multi-objective particle swarm optimization (MOPSO) approach [39] solves a class of mean-variance portfolio selection problems, and an adaptive ranking procedure further improves its performance.
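The scalarization at the heart of MOEA/D can be illustrated with a minimal Tchebycheff sketch (one common decomposition choice; the weight vectors, reference point, and toy objective values below are illustrative assumptions, not values from the paper):

```python
import numpy as np

def tchebycheff(f, weights, z_star):
    """Tchebycheff scalarization: max_i w_i * |f_i(x) - z*_i|.
    Each weight vector turns the MOP into one scalar sub-problem."""
    return np.max(weights * np.abs(f - z_star))

# Toy objective values of one candidate solution (two objectives).
f = np.array([0.6, 0.3])
# Reference (ideal) point: best value seen so far for each objective.
z_star = np.array([0.0, 0.0])

# A set of evenly spread weight vectors, one per sub-problem.
W = np.array([[0.25, 0.75], [0.5, 0.5], [0.75, 0.25]])
g = [tchebycheff(f, w, z_star) for w in W]
# The candidate fits best the sub-problem with the lowest scalarized value.
best_subproblem = int(np.argmin(g))
```

Minimizing each scalarized sub-problem while sharing solutions among neighboring weight vectors is what lets MOEA/D optimize all sub-problems simultaneously.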
2.2. GAN
3. The Proposed Algorithm
3.1. Problem Definition
- Challenge 1: How to synergize GAN networks?
- Challenge 2: How to solve the mode collapse problem arising from the introduction of the GAN?
- Challenge 3: How to introduce an adaptive control strategy?
- Fusing traditional evolutionary operators with GANs.
- Increasing the generation probability of the GAN when population diversity decreases.
- Letting the population entropy determine the proportion of the mixed operators.
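The mixed-operator decision can be sketched as follows (a hypothetical illustration: the parameter name `alpha` stands for the entropy-controlled mixing ratio, and the two operator stubs are placeholders for the GAN generator and the traditional polynomial-mutation operator):

```python
import random

def generate_offspring(parent, alpha, gan_generate, polynomial_mutation,
                       rng=random):
    """Choose the reproduction operator stochastically: with probability
    alpha use the GAN generator, otherwise the traditional operator."""
    if rng.random() < alpha:
        return gan_generate(parent)
    return polynomial_mutation(parent)

# Stub operators, for illustration only.
gan = lambda p: ("gan", p)
poly = lambda p: ("poly", p)

random.seed(0)
labels = [generate_offspring(1.0, 0.15, gan, poly)[0] for _ in range(1000)]
gan_fraction = labels.count("gan") / len(labels)
# With alpha = 0.15, roughly 15% of offspring come from the GAN;
# raising alpha when diversity drops shifts generation toward the GAN.
```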
3.2. APG-SMOEA Framework
- Read the information of the dataset and set the parameters of the problem. This includes statistical information on the number of assets, returns and risks, and correlation information (covariance matrix). It is also necessary to set the initial parameters of the algorithm, such as population size, number of neighbors, etc. (Algorithm A1 line 1)
- Define the neighbors, generate the initial population, and calculate the individual and Pareto front extreme value points. (Algorithm A1 line 2–3)
- Construct the GAN network. (Algorithm A1 line 4 and Algorithm A2)
- Loop iterations; each iteration is a complete search process, including selection, variation, crossover, replacement, and other operations. (Algorithm A1 line 5–27)
- Each iteration traverses each individual in the population. This step generates new children by performing mutation and crossover operations on each individual. (Algorithm A1 line 6–26)
- Determine the mating pool for the offspring generated by the variation operator: a random number is generated and compared with a preset probability. If the random number is greater than this probability, the whole population is used as the mating pool; otherwise, the neighbors of the individual are used. (Algorithm A1 line 7)
- Determine how the offspring are generated: compare a generated random number with the adaptive ratio alpha. If the random number is greater than alpha, generate the offspring by the polynomial variation determined by Equation (2); otherwise, generate the offspring by GAN network training (see Algorithm A2 for the related pseudo-code, in Appendix A.2). Then compute the objective values of the offspring. (Algorithm A1 line 8–12)
- Update the extreme value points of the Pareto front, compute the reference points and weight vectors to decide whether to replace the parent, and then update the population. (Algorithm A1 line 13–15)
- Set the counter, traverse the neighbors, update the population if the children are better than the parents, and exit the loop when the traversal ends or the counter reaches the upper limit. (Algorithm A1 line 16–24)
- Calculate the population entropy according to Equation (19), update the alpha, and record the new population entropy and alpha (see Algorithm A3 for the relevant pseudo-code, in Appendix A.3). (Algorithm A1 line 25 and Algorithm A3)
- Iterate through the weights of all assets in the child and set any negative weights to 0.
- Sum the weights of all the child's assets.
- If the sum is greater than 0, scale all the weight vectors so that their sum is 1. Otherwise, randomly generate a child satisfying the constraint.
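The three repair steps above can be sketched as follows (a minimal version under the stated constraints; the random fallback shown simply draws a fresh normalized vector, which is one way to "generate a child satisfying the constraint"):

```python
import numpy as np

def repair_weights(w, rng=np.random.default_rng(0)):
    """Repair a child's asset-weight vector so it is a valid portfolio:
    clip negative weights to 0, then rescale so the weights sum to 1;
    if every weight was clipped away, fall back to a random feasible
    vector."""
    w = np.asarray(w, dtype=float)
    w = np.where(w < 0, 0.0, w)   # step 1: negatives -> 0
    s = w.sum()                   # step 2: sum the weights
    if s > 0:
        return w / s              # step 3a: rescale to sum to 1
    r = rng.random(w.size)        # step 3b: random feasible child
    return r / r.sum()

child = repair_weights([0.5, -0.2, 0.7])
# The negative weight is removed and the remainder rescaled to sum to 1.
```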
3.3. Introduction of GANs
- Calculate the covariance matrix from the mean vector of the real samples according to Equation (17), where N is the population size and the sum runs over the N members of the positive sample set.
- Sample the D-dimensional vector x: generate a D-dimensional vector that follows the multivariate normal distribution according to Equation (18).
- The generator produces children based on the lower boundary l and the upper boundary u of the decision space, as shown in Equation (19).
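The sampling steps above can be sketched as follows (illustrative only: the population size, dimensionality, unit-box bounds, and the use of clipping to respect the decision-space boundaries are assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

# Positive samples: N high-quality parents, each a D-dimensional vector.
P = rng.random((20, 5))            # N = 20, D = 5

mu = P.mean(axis=0)                # mean vector of the real samples
cov = np.cov(P, rowvar=False)      # covariance estimated from them

# Generate offspring from the fitted multivariate normal distribution.
children = rng.multivariate_normal(mu, cov, size=8)

# Keep the offspring inside the decision space [l, u].
l, u = np.zeros(5), np.ones(5)
children = np.clip(children, l, u)
```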
3.4. Solving the Mode Collapse Problem
3.5. Adaptive Entropy Control Strategy
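A minimal sketch of the entropy-driven alpha update performed by Algorithm A3 (the exact update rule shown here is an assumption, built from the stated initialization alpha = 0.15 and scaling factor 1.1, and from the design goal of increasing the GAN's generation probability when population diversity drops):

```python
def update_alpha(alpha, entropy, prev_entropy, beta=1.1,
                 alpha_min=0.01, alpha_max=0.9):
    """Adapt the GAN-generation ratio alpha from population entropy:
    if diversity (entropy) dropped, boost alpha to inject more diverse
    GAN offspring; if diversity recovered, shrink alpha again.
    beta is the scaling factor; the bounds keep alpha in a sane range."""
    if entropy < prev_entropy:
        alpha *= beta          # diversity fell: rely more on the GAN
    else:
        alpha /= beta          # diversity is fine: favor classic operators
    return min(max(alpha, alpha_min), alpha_max)

a = 0.15
a = update_alpha(a, entropy=0.4, prev_entropy=0.6)   # entropy dropped
# alpha grows from 0.15 to about 0.165
```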
4. Experiments and Analysis
4.1. Performance Metric
4.1.1. Performance Indicators
- GD: This metric measures the distance between the solution set P found by the algorithm and a reference set of optimal solutions for the problem. It reflects how far the solution set deviates from the true optimal front: the larger the value, the further the deviation and the worse the convergence. The mathematical formula is as follows.
- IGD: The average distance from each reference point to its nearest solution; solution sets with smaller IGD values are better. Besides the convergence of the solution set, IGD also reflects the uniformity and extent of its distribution: the smaller the IGD value, the better the diversity and convergence. The mathematical formula is as follows.
- Spacing: This measures the standard deviation of the minimum distance of each solution to the other solutions. The smaller the Spacing value is, the more uniform the solution set is. The mathematical formula is as follows.
- Delta: A measure of the breadth, i.e., the diversity, of the obtained solution set. The smaller the Delta value, the more diverse the solutions in the Pareto front set. Its mathematical formula is as follows, where two of its parameters are the Euclidean distances between the extreme solutions and the boundary solutions of the obtained non-dominated set.
- HV: The volume of the region in the target space enclosed by the set of non-dominated solutions and reference points obtained by the algorithm. HV is a comprehensive metric to assess the convergence and diversity of the approximate solution set. The larger the HV value, the better the comprehensive performance of the algorithm. The mathematical formula is as follows.
- MS: This measures the maximum spread of the solution set: it calculates the distances between the solutions in the solution set and returns the maximum distance as the index value. This metric helps assess the search capability and diversity of the algorithm, since an algorithm that searches more of the space finds solutions that lie farther apart. Its mathematical formula is as follows.
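For concreteness, the distance-based metrics above can be sketched as follows (a minimal version using Euclidean distances; the toy 2-D solution and reference fronts are illustrative):

```python
import numpy as np

def generational_distance(P, R):
    """GD: average distance from each solution in P to its nearest
    reference point in R (lower = better convergence)."""
    d = np.linalg.norm(P[:, None, :] - R[None, :, :], axis=2)
    return d.min(axis=1).mean()

def inverted_generational_distance(P, R):
    """IGD: average distance from each reference point to its nearest
    solution (lower = better convergence and coverage)."""
    d = np.linalg.norm(R[:, None, :] - P[None, :, :], axis=2)
    return d.min(axis=1).mean()

def spacing(P):
    """Spacing: standard deviation of each solution's nearest-neighbour
    distance (lower = more uniform spread)."""
    d = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1).std()

# Toy 2-D fronts for illustration.
P = np.array([[0.1, 0.9], [0.5, 0.5], [0.9, 0.1]])
R = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])
gd = generational_distance(P, R)
igd = inverted_generational_distance(P, R)
sp = spacing(P)   # 0 here: the toy points are evenly spaced
```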
4.1.2. Dataset
4.1.3. Comparison Algorithm
4.2. Parameter Configurations
4.3. Experimental Results and Analysis
4.3.1. Experiment: Comparison Analysis of the APG-SMOEA Algorithm and Other Algorithms
4.3.2. Experiment II: Ablation Experiments
4.3.3. Experiment III: Comparative Analysis of Different Parameters of APG-SMOEA Algorithms
4.3.4. Experiment IV: APG-SMOEA Algorithm Improvement Exploration
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A. The Algorithms
Appendix A.1. APG-SMOEA Algorithm Framework
Algorithm A1 APG-SMOEA algorithm framework
Input: T is the maximum number of iterations; is the maximum number of updates of the offspring. Initialize: T = 1500, = 2.
Appendix A.2. GAN Networks Generate Offspring
Algorithm A2 GAN networks generate offspring
Input: is the upper limit of child selection; period is the training period of the GAN network; t is the number of population iterations. Initialize: = 20, = 100.
Appendix A.3. Update Alpha
Algorithm A3 Update alpha
Input: E is the entropy of the population, alpha is the ratio of the mixed operators, and a scaling factor for alpha. Initialize: alpha = 0.15, scaling factor = 1.1.
Appendix B. The Tables
Appendix B.1. Tables in Experiment 1
Metric | MOEA/D-AEE | MOEA/D-DEM | MOEA/D-DE | MOEA/D-GA | NSGA-II | APG-SMOEA | |
---|---|---|---|---|---|---|---|
Best | 5.27 × 10 | 6.32 × 10 | 7.01 × 10 | 2.84 × 10 | 7.38 × 10 | 4.58 × 10 | |
GD | Median | 7.05 × 10 | 9.58 × 10 | 1.83 × 10 | 5.05 × 10 | 9.25 × 10 | 7.98 × 10 |
Std. | 7.28 × 10 | 2.41 × 10 | 1.55 × 10 | 1.60 × 10 | 9.50 × 10 | 2.61 × 10 | |
Best | 1.71 × 10 | 1.38 × 10 | 9.87 × 10 | 1.14 × 10 | 2.39 × 10 | 1.53 × 10 | |
Spacing | Median | 2.07 × 10 | 2.01 × 10 | 1.72 × 10 | 1.97 × 10 | 2.99 × 10 | 1.12 × 10 |
Std. | 3.76 × 10 | 4.54 × 10 | 3.34 × 10 | 5.10 × 10 | 2.24 × 10 | 1.20 × 10 | |
Best | 5.96 × 10 | 5.79 × 10 | 5.74 × 10 | 5.47 × 10 | 5.67 × 10 | 2.27 × 10 | |
Max Spread | Median | 5.56 × 10 | 5.40 × 10 | 5.15 × 10 | 4.91 × 10 | 5.45 × 10 | 1.96 × 10 |
Std. | 1.59 × 10 | 1.93 × 10 | 5.08 × 10 | 4.19 × 10 | 1.69 × 10 | 6.28 × 10 | |
Best | 4.01 × 10 | 4.25 × 10 | 4.21 × 10 | 4.27 × 10 | 5.47 × 10 | 6.60 × 10 | |
Delta | Median | 4.33 × 10 | 4.72 × 10 | 4.51 × 10 | 5.05 × 10 | 6.06 × 10 | 8.78 × 10 |
Std. | 2.02 × 10 | 3.06 × 10 | 6.30 × 10 | 6.88 × 10 | 3.34 × 10 | 1.16 × 10 | |
Best | 2.30 × 10 | 2.90 × 10 | 4.26 × 10 | 4.07 × 10 | 3.22 × 10 | 7.59 × 10 | |
IGD | Median | 3.71 × 10 | 5.31 × 10 | 9.09 × 10 | 8.76 × 10 | 4.74 × 10 | 1.08 × 10 |
Std. | 1.14 × 10 | 2.16 × 10 | 1.47 × 10 | 4.33 × 10 | 1.38 × 10 | 3.65 × 10 | |
Best | 1.96 × 10 | 1.96 × 10 | 1.96 × 10 | 1.96 × 10 | 1.96 × 10 | 3.93 × 10 | |
HV | Median | 1.96 × 10 | 1.96 × 10 | 1.96 × 10 | 1.92 × 10 | 1.96 × 10 | 3.67 × 10 |
Std. | 1.08 × 10 | 2.50 × 10 | 1.43 × 10 | 8.49 × 10 | 1.45 × 10 | 8.88 × 10 |
Metric | MOEA/D-AEE | MOEA/D-DEM | MOEA/D-DE | MOEA/D-GA | NSGA-II | APG-SMOEA | |
---|---|---|---|---|---|---|---|
Best | 9.36 × 10 | 1.13 × 10 | 1.27 × 10 | 3.20 × 10 | 1.01 × 10 | 7.92 × 10 | |
GD | Median | 1.21 × 10 | 1.75 × 10 | 3.92 × 10 | 5.02 × 10 | 1.29 × 10 | 6.76 × 10 |
Std. | 1.55 × 10 | 3.45 × 10 | 7.64 × 10 | 1.86 × 10 | 1.42 × 10 | 1.98 × 10 | |
Best | 2.25 × 10 | 2.01 × 10 | 1.72 × 10 | 1.77 × 10 | 2.51 × 10 | 2.75 × 10 | |
Spacing | Median | 2.66 × 10 | 2.76 × 10 | 2.31 × 10 | 2.35 × 10 | 3.56 × 10 | 9.55 × 10 |
Std. | 5.45 × 10 | 5.26 × 10 | 4.78 × 10 | 5.06 × 10 | 3.71 × 10 | 8.21 × 10 | |
Best | 7.70 × 10 | 7.86 × 10 | 7.60 × 10 | 6.87 × 10 | 7.29 × 10 | 1.99 × 10 | |
Max Spread | Median | 7.41 × 10 | 7.55 × 10 | 7.23 × 10 | 5.82 × 10 | 6.57 × 10 | 1.62 × 10 |
Std. | 1.30 × 10 | 1.61 × 10 | 2.53 × 10 | 4.35 × 10 | 3.52 × 10 | 4.82 × 10 | |
Best | 3.29 × 10 | 3.30 × 10 | 3.29 × 10 | 4.22 × 10 | 5.37 × 10 | 6.61 × 10 | |
Delta | Median | 3.53 × 10 | 3.74 × 10 | 3.66 × 10 | 5.57 × 10 | 6.48 × 10 | 8.59 × 10 |
Std. | 1.77 × 10 | 3.37 × 10 | 2.41 × 10 | 3.88 × 10 | 3.69 × 10 | 1.34 × 10 | |
Best | 3.21 × 10 | 3.30 × 10 | 3.64 × 10 | 4.88 × 10 | 4.08 × 10 | 1.73 × 10 | |
IGD | Median | 4.15 × 10 | 4.35 × 10 | 7.12 × 10 | 1.36 × 10 | 7.08 × 10 | 1.28 × 10 |
Std. | 5.77 × 10 | 7.65 × 10 | 4.16 × 10 | 4.13 × 10 | 2.21 × 10 | 3.59 × 10 | |
Best | 1.83 × 10 | 1.83 × 10 | 1.83 × 10 | 1.82 × 10 | 1.82 × 10 | 3.90 × 10 | |
HV | Median | 1.83 × 10 | 1.83 × 10 | 1.83 × 10 | 1.69 × 10 | 1.74 × 10 | 3.65 × 10 |
Std. | 1.75 × 10 | 3.47 × 10 | 4.10 × 10 | 6.01 × 10 | 4.32 × 10 | 7.31 × 10 |
Metric | MOEA/D-AEE | MOEA/D-DEM | MOEA/D-DE | MOEA/D-GA | NSGA-II | APG-SMOEA | |
---|---|---|---|---|---|---|---|
Best | 5.32 × 10 | 5.10 × 10 | 5.93 × 10 | 9.95 × 10 | 2.54 × 10 | 6.13 × 10 | |
GD | Median | 7.36 × 10 | 8.06 × 10 | 1.45 × 10 | 3.32 × 10 | 4.24 × 10 | 8.54 × 10 |
Std. | 9.19 × 10 | 1.66 × 10 | 7.09 × 10 | 2.08 × 10 | 2.01 × 10 | 3.76 × 10 | |
Best | 1.42 × 10 | 1.28 × 10 | 5.99 × 10 | 0.00 × 10 | 1.05 × 10 | 1.38 × 10 | |
Spacing | Median | 1.79 × 10 | 2.08 × 10 | 1.15 × 10 | 1.92 × 10 | 1.45 × 10 | 2.03 × 10 |
Std. | 4.87 × 10 | 5.77 × 10 | 4.53 × 10 | 9.72 × 10 | 1.87 × 10 | 3.84 × 10 | |
Best | 4.17 × 10 | 4.23 × 10 | 4.29 × 10 | 2.63 × 10 | 3.36 × 10 | 1.16 × 10 | |
Max Spread | Median | 3.93 × 10 | 3.94 × 10 | 2.96 × 10 | 2.20 × 10 | 2.88 × 10 | 4.01 × 10 |
Std. | 1.21 × 10 | 5.39 × 10 | 5.48 × 10 | 4.65 × 10 | 2.54 × 10 | 1.15 × 10 | |
Best | 3.87 × 10 | 3.99 × 10 | 3.17 × 10 | 8.40 × 10 | 6.09 × 10 | 3.99 × 10 | |
Delta | Median | 4.42 × 10 | 4.81 × 10 | 5.58 × 10 | 9.34 × 10 | 6.81 × 10 | 4.55 × 10 |
Std. | 6.09 × 10 | 9.05 × 10 | 1.00 × 10 | 3.55 × 10 | 2.94 × 10 | 1.42 × 10 | |
Best | 1.72 × 10 | 1.90 × 10 | 7.90 × 10 | 1.77 × 10 | 4.64 × 10 | 1.84 × 10 | |
IGD | Median | 2.47 × 10 | 2.73 × 10 | 2.23 × 10 | 2.41 × 10 | 9.69 × 10 | 2.75 × 10 |
Std. | 7.05 × 10 | 4.09 × 10 | 6.95 × 10 | 5.29 × 10 | 3.67 × 10 | 2.84 × 10 | |
Best | 2.06 × 10 | 2.06 × 10 | 2.05 × 10 | 1.98 × 10 | 1.99 × 10 | 3.85 × 10 | |
HV | Median | 2.06 × 10 | 2.06 × 10 | 1.98 × 10 | 1.86 × 10 | 1.91 × 10 | 2.06 × 10 |
Std. | 1.06 × 10 | 2.12 × 10 | 7.70 × 10 | 2.64 × 10 | 3.08 × 10 | 2.69 × 10 |
Appendix B.2. Tables in Experiment 3
Metric | APG-SMOEA-1 | APG-SMOEA-2 | APG-SMOEA-3 | APG-SMOEA-4 | |
---|---|---|---|---|---|
Best | 1.36 × 10 | 7.05 × 10 | 2.49 × 10 | 1.53 × 10 | |
GD | Median | 4.11 × 10 | 4.20 × 10 | 3.96 × 10 | 4.11 × 10 |
Std. | 8.97 × 10 | 1.06 × 10 | 1.70 × 10 | 1.18 × 10 | |
Best | 1.12 × 10 | 5.63 × 10 | 4.45 × 10 | 5.17 × 10 | |
Spacing | Median | 6.81 × 10 | 6.86 × 10 | 5.95 × 10 | 5.66 × 10 |
Std. | 5.86 × 10 | 4.78 × 10 | 5.70 × 10 | 6.72 × 10 | |
Best | 1.09 × 10 | 1.09 × 10 | 1.09 × 10 | 1.09 × 10 | |
Max Spread | Median | 1.04 × 10 | 1.05 × 10 | 1.06 × 10 | 1.05 × 10 |
Std. | 1.68 × 10 | 2.39 × 10 | 3.91 × 10 | 2.85 × 10 | |
Best | 7.12 × 10 | 6.11 × 10 | 4.48 × 10 | 6.10 × 10 | |
Delta | Median | 9.48 × 10 | 9.48 × 10 | 9.49 × 10 | 9.12 × 10 |
Std. | 1.51 × 10 | 1.67 × 10 | 1.98 × 10 | 2.05 × 10 | |
Best | 1.72 × 10 | 3.94 × 10 | 3.81 × 10 | 3.82 × 10 | |
IGD | Median | 6.38 × 10 | 6.21 × 10 | 5.97 × 10 | 6.43 × 10 |
Std. | 7.90 × 10 | 1.31 × 10 | 1.80 × 10 | 1.40 × 10 | |
Best | 8.35 × 10 | 8.43 × 10 | 8.91 × 10 | 8.32 × 10 | |
HV | Median | 7.70 × 10 | 7.66 × 10 | 8.24 × 10 | 7.65 × 10 |
Std. | 7.76 × 10 | 1.36 × 10 | 2.38 × 10 | 1.66 × 10 |
Metric | APG-SMOEA-1 | APG-SMOEA-2 | APG-SMOEA-3 | APG-SMOEA-4 | |
---|---|---|---|---|---|
Best | 9.40 × 10 | 7.15 × 10 | 2.31 × 10 | 8.79 × 10 | |
GD | Median | 3.23 × 10 | 3.14 × 10 | 7.81 × 10 | 2.92 × 10 |
Std. | 1.37 × 10 | 2.34 × 10 | 1.90 × 10 | 1.02 × 10 | |
Best | 1.33 × 10 | 1.41 × 10 | 1.90 × 10 | 1.49 × 10 | |
Spacing | Median | 2.31 × 10 | 2.30 × 10 | 2.70 × 10 | 2.33 × 10 |
Std. | 6.86 × 10 | 6.64 × 10 | 5.25 × 10 | 5.28 × 10 | |
Best | 3.37 × 10 | 9.25 × 10 | 2.43 × 10 | 1.88 × 10 | |
Max Spread | Median | 4.24 × 10 | 4.27 × 10 | 4.27 × 10 | 4.26 × 10 |
Std. | 4.21 × 10 | 7.25 × 10 | 4.44 × 10 | 2.58 × 10 | |
Best | 3.87 × 10 | 3.93 × 10 | 4.35 × 10 | 4.13 × 10 | |
Delta | Median | 5.11 × 10 | 5.07 × 10 | 6.10 × 10 | 5.26 × 10 |
Std. | 1.35 × 10 | 1.15 × 10 | 2.11 × 10 | 1.50 × 10 | |
Best | 2.08 × 10 | 2.02 × 10 | 4.69 × 10 | 2.11 × 10 | |
IGD | Median | 3.18 × 10 | 2.58 × 10 | 1.18 × 10 | 3.00 × 10 |
Std. | 4.79 × 10 | 2.80 × 10 | 3.70 × 10 | 2.90 × 10 | |
Best | 4.22 × 10 | 2.44 × 10 | 4.25 × 10 | 4.04 × 10 | |
HV | Median | 2.07 × 10 | 2.07 × 10 | 2.07 × 10 | 2.06 × 10 |
Std. | 3.51 × 10 | 5.27 × 10 | 7.05 × 10 | 4.58 × 10 |
Metric | APG-SMOEA-2 | APG-SMOEA-5 | APG-SMOEA-6 | APG-SMOEA-7 | |
---|---|---|---|---|---|
Best | 7.05 × 10 | 1.59 × 10 | 1.95 × 10 | 1.18 × 10 | |
GD | Median | 4.20 × 10 | 4.22 × 10 | 3.87 × 10 | 3.91 × 10 |
Std. | 1.06 × 10 | 1.24 × 10 | 1.35 × 10 | 1.39 × 10 | |
Best | 5.63 × 10 | 5.81 × 10 | 1.21 × 10 | 6.66 × 10 | |
Spacing | Median | 6.86 × 10 | 6.26 × 10 | 5.87 × 10 | 7.12 × 10 |
Std. | 4.78 × 10 | 5.52 × 10 | 5.09 × 10 | 6.87 × 10 | |
Best | 1.09 × 10 | 1.09 × 10 | 1.09 × 10 | 1.09 × 10 | |
Max Spread | Median | 1.05 × 10 | 1.05 × 10 | 1.02 × 10 | 9.93 × 10 |
Std. | 2.39 × 10 | 2.76 × 10 | 2.98 × 10 | 3.09 × 10 | |
Best | 6.11 × 10 | 6.41 × 10 | 7.58 × 10 | 5.78 × 10 | |
Delta | Median | 9.48 × 10 | 9.06 × 10 | 9.24 × 10 | 9.88 × 10 |
Std. | 1.67 × 10 | 1.72 × 10 | 1.40 × 10 | 2.02 × 10 | |
Best | 3.94 × 10 | 4.11 × 10 | 9.90 × 10 | 3.58 × 10 | |
IGD | Median | 6.21 × 10 | 6.43 × 10 | 6.01 × 10 | 5.98 × 10 |
Std. | 1.31 × 10 | 1.36 × 10 | 1.32 × 10 | 1.63 × 10 | |
Best | 8.43 × 10 | 8.26 × 10 | 8.61 × 10 | 8.26 × 10 | |
HV | Median | 7.66 × 10 | 7.65 × 10 | 7.83 × 10 | 7.61 × 10 |
Std. | 1.36 × 10 | 1.47 × 10 | 1.59 × 10 | 1.82 × 10 |
Metric | APG-SMOEA-2 | APG-SMOEA-5 | APG-SMOEA-6 | APG-SMOEA-7 | |
---|---|---|---|---|---|
Best | 7.15 × 10 | 7.15 × 10 | 1.80 × 10 | 9.92 × 10 | |
GD | Median | 3.14 × 10 | 2.58 × 10 | 4.04 × 10 | 3.30 × 10 |
Std. | 2.34 × 10 | 8.61 × 10 | 5.25 × 10 | 1.98 × 10 | |
Best | 1.41 × 10 | 1.57 × 10 | 1.73 × 10 | 1.63 × 10 | |
Spacing | Median | 2.30 × 10 | 2.22 × 10 | 2.34 × 10 | 2.27 × 10 |
Std. | 6.64 × 10 | 2.87 × 10 | 2.33 × 10 | 3.43 × 10 | |
Best | 9.25 × 10 | 2.23 × 10 | 1.55 × 10 | 3.89 × 10 | |
Max Spread | Median | 4.27 × 10 | 4.28 × 10 | 4.22 × 10 | 4.26 × 10 |
Std. | 7.25 × 10 | 2.56 × 10 | 2.11 × 10 | 6.21 × 10 | |
Best | 3.93 × 10 | 3.87 × 10 | 4.24 × 10 | 4.02 × 10 | |
Delta | Median | 5.07 × 10 | 4.99 × 10 | 5.14 × 10 | 5.05 × 10 |
Std. | 1.15 × 10 | 9.68 × 10 | 1.17 × 10 | 1.40 × 10 | |
Best | 2.02 × 10 | 1.96 × 10 | 2.24 × 10 | 1.81 × 10 | |
IGD | Median | 2.58 × 10 | 2.44 × 10 | 3.69 × 10 | 2.61 × 10 |
Std. | 2.80 × 10 | 1.62 × 10 | 3.99 × 10 | 2.57 × 10 | |
Best | 1.24 × 10 | 2.05 × 10 | 3.85 × 10 | 2.96 × 10 | |
HV | Median | 1.00 × 10 | 1.00 × 10 | 1.00 × 10 | 1.00 × 10 |
Std. | 3.29 × 10 | 1.56 × 10 | 3.98 × 10 | 3.52 × 10 |
Appendix B.3. Tables in Experiment 4
Metric | APG-SMOEA-4 | APG-SMOEA-8 | APG-SMOEA-9 | APG-SMOEA-10 | APG-SMOEA-11 | |
---|---|---|---|---|---|---|
Best | 8.79 × 10 | 1.30 × 10 | 1.67 × 10 | 1.07 × 10 | 1.01 × 10 | |
GD | Median | 2.92 × 10 | 3.97 × 10 | 3.96 × 10 | 2.36 × 10 | 2.86 × 10 |
Std. | 1.02 × 10 | 1.98 × 10 | 1.12 × 10 | 1.28 × 10 | 2.55 × 10 | |
Best | 1.49 × 10 | 1.99 × 10 | 1.82 × 10 | 1.25 × 10 | 7.15 × 10 | |
Spacing | Median | 2.33 × 10 | 2.28 × 10 | 2.30 × 10 | 2.31 × 10 | 2.27 × 10 |
Std. | 5.28 × 10 | 1.41 × 10 | 9.78 × 10 | 3.50 × 10 | 9.03 × 10 | |
Best | 1.88 × 10 | 4.71 × 10 | 3.01 × 10 | 2.30 × 10 | 4.93 × 10 | |
MS | Median | 4.26 × 10 | 4.23 × 10 | 4.24 × 10 | 4.27 × 10 | 4.27 × 10 |
Std. | 2.58 × 10 | 6.23 × 10 | 3.80 × 10 | 3.06 × 10 | 2.99 × 10 | |
Best | 4.13 × 10 | 4.19 × 10 | 4.11 × 10 | 4.24 × 10 | 3.93 × 10 | |
Delta | Median | 5.26 × 10 | 5.16 × 10 | 5.35 × 10 | 5.20 × 10 | 5.04 × 10 |
Std. | 1.50 × 10 | 1.31 × 10 | 1.21 × 10 | 9.23 × 10 | 1.09 × 10 | |
Best | 2.11 × 10 | 1.90 × 10 | 2.21 × 10 | 2.16 × 10 | 1.90 × 10 | |
IGD | Median | 3.00 × 10 | 3.89 × 10 | 3.79 × 10 | 2.72 × 10 | 2.62 × 10 |
Std. | 2.90 × 10 | 3.74 × 10 | 6.33 × 10 | 2.67 × 10 | 5.84 × 10 | |
Best | 5.82 × 10 | 7.28 × 10 | 6.30 × 10 | 7.48 × 10 | 3.19 × 10 | |
HV | Median | 2.88 × 10 | 2.88 × 10 | 2.88 × 10 | 2.88 × 10 | 2.88 × 10 |
Std. | 6.68 × 10 | 7.25 × 10 | 6.27 × 10 | 7.46 × 10 | 6.00 × 10 |
Metric | APG-SMOEA-5 | APG-SMOEA-12 | APG-SMOEA-13 | APG-SMOEA-14 | |
---|---|---|---|---|---|
Best | 5.17 × 10 | 8.76 × 10 | 2.95 × 10 | 1.13 × 10 | |
GD | Median | 8.12 × 10 | 9.48 × 10 | 9.55 × 10 | 8.26 × 10 |
Std. | 1.72 × 10 | 1.52 × 10 | 1.78 × 10 | 2.43 × 10 | |
Best | 9.19 × 10 | 8.27 × 10 | 2.41 × 10 | 2.73 × 10 | |
Spacing | Median | 1.06 × 10 | 1.22 × 10 | 1.47 × 10 | 1.14 × 10 |
Std. | 8.64 × 10 | 8.80 × 10 | 2.55 × 10 | 9.52 × 10 | |
Best | 2.28 × 10 | 2.41 × 10 | 2.41 × 10 | 2.40 × 10 | |
MS | Median | 2.02 × 10 | 2.39 × 10 | 2.40 × 10 | 2.07 × 10 |
Std. | 3.96 × 10 | 3.48 × 10 | 2.97 × 10 | 5.94 × 10 | |
Best | 6.67 × 10 | 6.98 × 10 | 7.03 × 10 | 6.63 × 10 | |
Delta | Median | 8.31 × 10 | 8.35 × 10 | 9.13 × 10 | 8.51 × 10 |
Std. | 1.00 × 10 | 1.45 × 10 | 1.94 × 10 | 1.29 × 10 | |
Best | 3.11 × 10 | 5.53 × 10 | 7.23 × 10 | 1.23 × 10 | |
IGD | Median | 1.15 × 10 | 1.36 × 10 | 1.30 × 10 | 1.09 × 10 |
Std. | 2.65 × 10 | 2.08 × 10 | 2.13 × 10 | 3.07 × 10 | |
Best | 3.84 × 10 | 3.95 × 10 | 3.97 × 10 | 3.99 × 10 | |
HV | Median | 3.67 × 10 | 3.74 × 10 | 3.67 × 10 | 3.69 × 10 |
Std. | 3.94 × 10 | 2.61 × 10 | 3.30 × 10 | 6.40 × 10 |
Metric | APG-SMOEA-5 | APG-SMOEA-12 | APG-SMOEA-13 | APG-SMOEA-14 | |
---|---|---|---|---|---|
Best | 7.32 × 10 | 2.11 × 10 | 2.51 × 10 | 8.33 × 10 | |
GD | Median | 6.11 × 10 | 6.83 × 10 | 7.12 × 10 | 5.06 × 10 |
Std. | 1.63 × 10 | 1.43 × 10 | 2.16 × 10 | 2.25 × 10 | |
Best | 1.64 × 10 | 1.15 × 10 | 4.84 × 10 | 2.81 × 10 | |
Spacing | Median | 1.03 × 10 | 1.11 × 10 | 1.01 × 10 | 9.66 × 10 |
Std. | 6.55 × 10 | 9.92 × 10 | 8.99 × 10 | 1.26 × 10 | |
Best | 1.92 × 10 | 1.92 × 10 | 1.92 × 10 | 1.91 × 10 | |
MS | Median | 1.61 × 10 | 1.91 × 10 | 1.90 × 10 | 1.50 × 10 |
Std. | 3.71 × 10 | 3.05 × 10 | 2.71 × 10 | 5.57 × 10 | |
Best | 8.16 × 10 | 7.76 × 10 | 7.43 × 10 | 8.18 × 10 | |
Delta | Median | 9.61 × 10 | 9.25 × 10 | 9.78 × 10 | 9.56 × 10 |
Std. | 9.06 × 10 | 1.60 × 10 | 2.01 × 10 | 1.18 × 10 | |
Best | 2.71 × 10 | 5.05 × 10 | 4.80 × 10 | 2.24 × 10 | |
IGD | Median | 7.08 × 10 | 7.18 × 10 | 7.30 × 10 | 7.14 × 10 |
Std. | 1.09 × 10 | 1.02 × 10 | 6.81 × 10 | 1.55 × 10 | |
Best | 1.81 × 10 | 2.03 × 10 | 1.86 × 10 | 1.97 × 10 | |
HV | Median | 1.59 × 10 | 1.70 × 10 | 1.66 × 10 | 1.67 × 10 |
Std. | 1.72 × 10 | 2.64 × 10 | 3.04 × 10 | 3.62 × 10 |
References
- Liu, C.; Wu, S.; Li, R.; Jiang, D.; Wong, H.S. Self-supervised graph completion for incomplete multi-view clustering. IEEE Trans. Knowl. Data Eng. 2023, 35, 9394–9406. [Google Scholar] [CrossRef]
- Ivanytska, A.; Zubyk, L.; Ivanov, D.; Domracheva, K. Study of Methods of Complex Data Analysis that Based on Machine Learning Technologies. In Proceedings of the 2019 IEEE International Conference on Advanced Trends in Information Theory (ATIT), Kyiv, Ukraine, 18–20 December 2019; pp. 332–335. [Google Scholar]
- Li, S. A supply chain finance game model with order-to-factoring under blockchain. Syst. Eng. Theory Pract. 2023, 43, 3570–3586. [Google Scholar]
- Lakhina, U.; Badruddin, N.; Elamvazuthi, I.; Jangra, A.; Huy, T.H.B.; Guerrero, J.M. An Enhanced Multi-Objective Optimizer for Stochastic Generation Optimization in Islanded Renewable Energy Microgrids. Mathematics 2023, 11, 2079. [Google Scholar] [CrossRef]
- Guerrero, M.; Gil, C.; Montoya, F.G.; Alcayde, A.; Banos, R. Multi-objective evolutionary algorithms to find community structures in large networks. Mathematics 2020, 8, 2048. [Google Scholar] [CrossRef]
- Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 2002, 6, 182–197. [Google Scholar] [CrossRef]
- Deb, K.; Jain, H. An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: Solving problems with box constraints. IEEE Trans. Evol. Comput. 2013, 18, 577–601. [Google Scholar] [CrossRef]
- Beume, N.; Naujoks, B.; Emmerich, M. SMS-EMOA: Multiobjective selection based on dominated hypervolume. Eur. J. Oper. Res. 2007, 181, 1653–1669. [Google Scholar] [CrossRef]
- Qi, Y.; Ma, X.; Liu, F.; Jiao, L.; Sun, J.; Wu, J. MOEA/D with adaptive weight adjustment. Evol. Comput. 2014, 22, 231–264. [Google Scholar] [CrossRef]
- Xu, H.; Zeng, W.; Zhang, D.; Zeng, X. MOEA/HD: A multiobjective evolutionary algorithm based on hierarchical decomposition. IEEE Trans. Cybern. 2017, 49, 517–526. [Google Scholar] [CrossRef]
- Xu, Y.; Ding, O.; Qu, R.; Li, K. Hybrid multi-objective evolutionary algorithms based on decomposition for wireless sensor network coverage optimization. Appl. Soft Comput. 2018, 68, 268–282. [Google Scholar] [CrossRef]
- He, C.; Huang, S.; Cheng, R.; Tan, K.C.; Jin, Y. Evolutionary multiobjective optimization driven by generative adversarial networks (GANs). IEEE Trans. Cybern. 2020, 51, 3129–3142. [Google Scholar] [CrossRef] [PubMed]
- Qian, W.; Liu, J.; Lin, Y.; Yang, L.; Zhang, J.; Xu, H.; Liao, M.; Chen, Y.; Chen, Y.; Liu, B. An improved MOEA/D algorithm for complex data analysis. Wirel. Commun. Mob. Comput. 2021, 2021, 6393638. [Google Scholar] [CrossRef]
- Xu, M.; Zhang, M.; Cai, X.; Zhang, G. Adaptive neighbourhood size adjustment in MOEA/D-DRA. Int. J. Bio-Inspired Comput. 2021, 17, 14–23. [Google Scholar] [CrossRef]
- Xu, H.; Xue, B.; Zhang, M. A duplication analysis-based evolutionary algorithm for biobjective feature selection. IEEE Trans. Evol. Comput. 2020, 25, 205–218. [Google Scholar] [CrossRef]
- Xu, H.; Zeng, W.; Zeng, X.; Yen, G.G. An evolutionary algorithm based on Minkowski distance for many-objective optimization. IEEE Trans. Cybern. 2018, 49, 3968–3979. [Google Scholar] [CrossRef]
- Xu, H.; Xue, B.; Zhang, M. Segmented initialization and offspring modification in evolutionary algorithms for bi-objective feature selection. In Proceedings of the 2020 Genetic and Evolutionary Computation Conference, Cancún, Mexico, 8–12 July 2020; pp. 444–452. [Google Scholar]
- Xue, Y.; Cai, X.; Neri, F. A multi-objective evolutionary algorithm with interval based initialization and self-adaptive crossover operator for large-scale feature selection in classification. Appl. Soft Comput. 2022, 127, 109420. [Google Scholar] [CrossRef]
- He, Y.; Aranha, C. Solving portfolio optimization problems using MOEA/D and levy flight. Adv. Data Sci. Adapt. Anal. 2020, 12, 2050005. [Google Scholar] [CrossRef]
- Zhang, C.; Peng, Y. Stacking VAE and GAN for context-aware text-to-image generation. In Proceedings of the 2018 IEEE Fourth International Conference on Multimedia Big Data (BigMM), Xi’an, China, 13–16 September 2018; pp. 1–5. [Google Scholar]
- Shiotani, M.; Iguchi, S.; Yamaguchi, K. Research on data augmentation for vital data using conditional GAN. In Proceedings of the 2022 IEEE 11th Global Conference on Consumer Electronics (GCCE), Osaka, Japan, 18–21 October 2022; pp. 344–345. [Google Scholar]
- Yang, Y.; Wang, C.; Lin, L. Regional Style Transfer Based on Partial Convolution Generative Adversarial Network. In Proceedings of the 2020 Chinese Automation Congress (CAC), Shanghai, China, 6–8 November 2020; pp. 5234–5239. [Google Scholar]
- Leung, M.F.; Wang, J. A collaborative neurodynamic approach to multiobjective optimization. IEEE Trans. Neural Netw. Learn. Syst. 2018, 29, 5738–5748. [Google Scholar] [CrossRef]
- Wang, Z.; Yao, S.; Li, G.; Zhang, Q. Multiobjective Combinatorial Optimization Using a Single Deep Reinforcement Learning Model. IEEE Trans. Cybern. 2023; in press. [Google Scholar]
- Huang, W.; Zhang, Y.; Li, L. Survey on multi-objective evolutionary algorithms. J. Phys. Conf. Ser. 2019, 1288, 012057.
- Farina, M.; Amato, P. On the optimal solution definition for many-criteria optimization problems. In Proceedings of the 2002 Annual Meeting of the North American Fuzzy Information Processing Society Proceedings, NAFIPS-FLINT 2002 (Cat. No. 02TH8622), New Orleans, LA, USA, 27–29 June 2002; pp. 233–238.
- El-Nemr, M.; Afifi, M.; Rezk, H.; Ibrahim, M. Finite element based overall optimization of switched reluctance motor using multi-objective genetic algorithm (NSGA-II). Mathematics 2021, 9, 576.
- Yu, G.; Chai, T.; Luo, X. Two-level production plan decomposition based on a hybrid MOEA for mineral processing. IEEE Trans. Autom. Sci. Eng. 2012, 10, 1050–1071.
- Tian, Y.; Cheng, R.; Zhang, X.; Cheng, F.; Jin, Y. An indicator-based multiobjective evolutionary algorithm with reference point adaptation for better versatility. IEEE Trans. Evol. Comput. 2017, 22, 609–622.
- Sun, Y.; Yen, G.G.; Yi, Z. IGD indicator-based evolutionary algorithm for many-objective optimization problems. IEEE Trans. Evol. Comput. 2018, 23, 173–187.
- Xu, H.; Zeng, W.; Zeng, X.; Yen, G.G. A polar-metric-based evolutionary algorithm. IEEE Trans. Cybern. 2020, 51, 3429–3440.
- Ravber, M.; Mernik, M.; Črepinšek, M. The impact of quality indicators on the rating of multi-objective evolutionary algorithms. Appl. Soft Comput. 2017, 55, 265–275.
- Zhang, Q.; Li, H. MOEA/D: A multiobjective evolutionary algorithm based on decomposition. IEEE Trans. Evol. Comput. 2007, 11, 712–731.
- Wang, Z.; Zhang, Q.; Zhou, A.; Gong, M.; Jiao, L. Adaptive replacement strategies for MOEA/D. IEEE Trans. Cybern. 2015, 46, 474–486.
- Wang, W.X.; Li, K.S.; Tao, X.Z.; Gu, F.H. An improved MOEA/D algorithm with an adaptive evolutionary strategy. Inf. Sci. 2020, 539, 1–15.
- Falcón-Cardona, J.G.; Coello, C.A.C. Indicator-based multi-objective evolutionary algorithms: A comprehensive survey. ACM Comput. Surv. (CSUR) 2020, 53, 1–35.
- Lotfi, S.; Karimi, F. A Hybrid MOEA/D-TS for solving multi-objective problems. J. AI Data Min. 2017, 5, 183–195.
- Abdi, Y.; Feizi-Derakhshi, M.R. Hybrid multi-objective evolutionary algorithm based on search manager framework for big data optimization problems. Appl. Soft Comput. 2020, 87, 105991.
- Silva, Y.L.T.; Herthel, A.B.; Subramanian, A. A multi-objective evolutionary algorithm for a class of mean-variance portfolio selection problems. Expert Syst. Appl. 2019, 133, 225–241.
- Jabbar, A.; Li, X.; Omar, B. A survey on generative adversarial networks: Variants, applications, and training. ACM Comput. Surv. (CSUR) 2021, 54, 1–49.
- Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial networks. Commun. ACM 2020, 63, 139–144.
- Gui, J.; Sun, Z.; Wen, Y.; Tao, D.; Ye, J. A review on generative adversarial networks: Algorithms, theory, and applications. IEEE Trans. Knowl. Data Eng. 2021, 35, 3313–3332.
- Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial nets. Adv. Neural Inf. Process. Syst. 2014, 27.
- Goodfellow, I.J. On distinguishability criteria for estimating generative models. arXiv 2014, arXiv:1412.6515.
- Brophy, E.; Wang, Z.; She, Q.; Ward, T. Generative adversarial networks in time series: A systematic literature review. ACM Comput. Surv. 2023, 55, 1–31.
- Aggarwal, A.; Mittal, M.; Battineni, G. Generative adversarial network: An overview of theory and applications. Int. J. Inf. Manag. Data Insights 2021, 1, 100004.
- Dong, H.; Yu, S.; Wu, C.; Guo, Y. Semantic image synthesis via adversarial learning. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 5706–5714.
- Ma, L.; Jia, X.; Sun, Q.; Schiele, B.; Tuytelaars, T.; Van Gool, L. Pose guided person image generation. In Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017; Volume 30.
- Fahim Sikder, M. Bangla handwritten digit recognition and generation. In Proceedings of the International Joint Conference on Computational Intelligence: IJCCI 2018, Birulia, Bangladesh, 14–15 December 2018; Springer: Berlin/Heidelberg, Germany, 2020; pp. 547–556.
- Kelkar, V.A.; Gotsis, D.S.; Brooks, F.J.; Prabhat, K.; Myers, K.J.; Zeng, R.; Anastasio, M.A. Assessing the ability of generative adversarial networks to learn canonical medical image statistics. IEEE Trans. Med. Imaging 2023, 42, 1799–1808.
- Yin, X.; Yu, X.; Sohn, K.; Liu, X.; Chandraker, M. Towards large-pose face frontalization in the wild. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 3990–3999.
- Odena, A.; Olah, C.; Shlens, J. Conditional image synthesis with auxiliary classifier GANs. In Proceedings of the International Conference on Machine Learning, PMLR, Sydney, Australia, 6–11 August 2017; pp. 2642–2651.
- Hamada, K.; Tachibana, K.; Li, T.; Honda, H.; Uchida, Y. Full-body high-resolution anime generation with progressive structure-conditional generative adversarial networks. In Proceedings of the European Conference on Computer Vision (ECCV) Workshops, Munich, Germany, 8–14 September 2018.
- Wang, Z.; She, Q.; Ward, T.E. Generative adversarial networks in computer vision: A survey and taxonomy. ACM Comput. Surv. (CSUR) 2021, 54, 1–38.
- He, Z.; Zuo, W.; Kan, M.; Shan, S.; Chen, X. AttGAN: Facial attribute editing by only changing what you want. IEEE Trans. Image Process. 2019, 28, 5464–5478.
- Ehsani, K.; Mottaghi, R.; Farhadi, A. SeGAN: Segmenting and generating the invisible. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 6144–6153.
- Denton, E.L.; Birodkar, V. Unsupervised learning of disentangled representations from video. In Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017; Volume 30.
- Li, C.; Wand, M. Precomputed real-time texture synthesis with Markovian generative adversarial networks. In Proceedings of the Computer Vision–ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, 11–14 October 2016; Proceedings, Part III 14. Springer: Berlin/Heidelberg, Germany, 2016; pp. 702–716.
- Gao, N.; Xue, H.; Shao, W.; Zhao, S.; Qin, K.K.; Prabowo, A.; Rahaman, M.S.; Salim, F.D. Generative adversarial networks for spatio-temporal data: A survey. ACM Trans. Intell. Syst. Technol. (TIST) 2022, 13, 1–25.
- Yu, L.; Zhang, W.; Wang, J.; Yu, Y. SeqGAN: Sequence generative adversarial nets with policy gradient. In Proceedings of the AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; Volume 31.
- Liu, S.; Jiang, H.; Wu, Z.; Li, X. Data synthesis using deep feature enhanced generative adversarial networks for rolling bearing imbalanced fault diagnosis. Mech. Syst. Signal Process. 2022, 163, 108139.
- Lu, S.; Dou, Z.; Jun, X.; Nie, J.Y.; Wen, J.R. PSGAN: A minimax game for personalized search with limited and noisy click data. In Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval, Paris, France, 21–25 July 2019; pp. 555–564.
- Siddique, N.; Adeli, H. Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing; John Wiley & Sons: Hoboken, NJ, USA, 2013.
- Zhang, J.; Liang, C.; Lu, Q. A novel small-population genetic algorithm based on adaptive mutation and population entropy sampling. In Proceedings of the 2008 7th World Congress on Intelligent Control and Automation, Changsha, China, 4–8 July 2008; pp. 8738–8742.
- Li, H.; Zhang, Q. Multiobjective optimization problems with complicated Pareto sets, MOEA/D and NSGA-II. IEEE Trans. Evol. Comput. 2008, 13, 284–302.
- Zhang, Q.; Liu, W.; Tsang, E.; Virginas, B. Expensive multiobjective optimization by MOEA/D with Gaussian process model. IEEE Trans. Evol. Comput. 2009, 14, 456–474.
- Yang, H.; Li, Y.; Yang, L.; Wu, Q. An improved particle swarm optimization algorithm based on entropy and fitness of particles. In Proceedings of the 2020 12th International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), Phuket, Thailand, 28–29 February 2020; pp. 492–496.
- Lin, C.; Xu, C.; Luo, D.; Wang, Y.; Tai, Y.; Wang, C.; Li, J.; Huang, F.; Fu, Y. Learning salient boundary feature for anchor-free temporal action localization. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 3320–3329.
- Ladosz, P.; Weng, L.; Kim, M.; Oh, H. Exploration in deep reinforcement learning: A survey. Inf. Fusion 2022, 85, 1–22.
| Dataset | Region | Dimensions |
|---|---|---|
| Hang Seng | Hong Kong | 31 |
| DAX100 | Germany | 85 |
| FTSE100 | U.K. | 89 |
| S&P100 | U.S. | 98 |
| Nikkei | Japan | 225 |
| Algorithm | GAN Generation Probability | GAN Training Period (Generations) | Number of Individuals Used to Replace Parents |
|---|---|---|---|
| APG-SMOEA-1 | 0.3 | 100 | First 50 individuals |
| APG-SMOEA-2 | 0.1 | 100 | First 50 individuals |
| APG-SMOEA-3 | 0.1 | 20 | First 10 individuals |
| APG-SMOEA-4 | 0.1–0.9, uniform change | 100 | First 50 individuals |
| APG-SMOEA-5 | 0.1 | 100 | First 20 individuals |
| APG-SMOEA-6 | 0.1 | 50 | First 50 individuals |
| APG-SMOEA-7 | 0.05 | 100 | First 50 individuals |
| Algorithm | GAN Generation Probability | GAN Training Period (Generations) | Number of Individuals Used to Replace Parents |
|---|---|---|---|
| APG-SMOEA-8 | 0.1–0.9, uniform change | 100 | First 20 individuals |
| APG-SMOEA-9 | 0.1–0.9, uniform change | 50 | First 50 individuals |
| APG-SMOEA-10 | 0.1–0.9, interval change | 100 | First 50 individuals |
| APG-SMOEA-11 | 0.1–0.3, interval change | 100 | First 50 individuals |
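The variant settings in the two tables above can be collected into a small lookup structure when scripting a parameter-sensitivity study. The sketch below is illustrative only: the class and field names, and the `fixed`/`uniform`/`interval` schedule labels, are our own shorthand for the table entries, not identifiers from the original implementation; only the numeric values come from the tables.

```python
# Illustrative encoding of the APG-SMOEA variant settings listed above.
# Names are hypothetical; numeric values are taken from the tables.
from dataclasses import dataclass

@dataclass(frozen=True)
class VariantConfig:
    gan_probability: tuple  # (low, high); equal endpoints mean a fixed value
    schedule: str           # "fixed", "uniform", or "interval" change
    training_period: int    # GAN training period, in generations
    replace_top_k: int      # first-k individuals used to replace parents

VARIANTS = {
    "APG-SMOEA-1":  VariantConfig((0.3, 0.3),   "fixed",    100, 50),
    "APG-SMOEA-2":  VariantConfig((0.1, 0.1),   "fixed",    100, 50),
    "APG-SMOEA-3":  VariantConfig((0.1, 0.1),   "fixed",    20,  10),
    "APG-SMOEA-4":  VariantConfig((0.1, 0.9),   "uniform",  100, 50),
    "APG-SMOEA-5":  VariantConfig((0.1, 0.1),   "fixed",    100, 20),
    "APG-SMOEA-6":  VariantConfig((0.1, 0.1),   "fixed",    50,  50),
    "APG-SMOEA-7":  VariantConfig((0.05, 0.05), "fixed",    100, 50),
    "APG-SMOEA-8":  VariantConfig((0.1, 0.9),   "uniform",  100, 20),
    "APG-SMOEA-9":  VariantConfig((0.1, 0.9),   "uniform",  50,  50),
    "APG-SMOEA-10": VariantConfig((0.1, 0.9),   "interval", 100, 50),
    "APG-SMOEA-11": VariantConfig((0.1, 0.3),   "interval", 100, 50),
}
```

Encoding the variants this way makes it easy to iterate over all eleven configurations in one experiment loop and to see at a glance which single parameter each variant changes relative to the baseline.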
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Qian, W.; Xu, H.; Chen, H.; Yang, L.; Lin, Y.; Xu, R.; Yang, M.; Liao, M. A Synergistic MOEA Algorithm with GANs for Complex Data Analysis. Mathematics 2024, 12, 175. https://doi.org/10.3390/math12020175