Robust Support Vector Data Description with Truncated Loss Function for Outliers Depression
Abstract
1. Introduction
- Normal data: data points that conform to the characteristics and behavior patterns of the majority of the dataset; they represent the normal operating state of a system or process.
- Anomalies: data points that deviate significantly from normal patterns and usually reflect actual problems or critical events in the system.
- Outliers: data points that differ markedly from the rest of the dataset, whether due to natural fluctuations, special circumstances, or noise.
- Noise: irregular, random errors or fluctuations, usually caused by measurement error or data-entry mistakes, that do not reflect the actual state of the system.
- Robustness to noise: Truncated loss functions can enhance the model’s robustness and stability by limiting the impact of outliers without removing data points.
- Reduction of error propagation: In anomaly detection tasks, outliers may significantly contribute to the loss function, leading to error propagation and model instability. Truncated loss functions can effectively reduce error propagation caused by outliers, thereby improving overall model performance.
- Generalization ability: Using truncated loss functions prevents the model from overfitting to outliers, enhancing its generalization ability. Truncated loss functions suit various types of datasets and noise conditions, particularly when the noise is not obvious or easily detectable. (A minimal numerical illustration follows this list.)
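A minimal numerical sketch of the bounding effect described above; the hinge-style base loss and the truncation level `mu` are illustrative assumptions, not the paper's exact definitions:

```python
import numpy as np

# An unbounded hinge-style loss lets one extreme point dominate the total,
# while its truncated version caps each point's contribution at mu.
def hinge(u):
    return np.maximum(u, 0.0)

def truncated_hinge(u, mu=1.0):
    return np.minimum(hinge(u), mu)

residuals = np.array([0.1, 0.3, 8.0])    # the last value mimics an outlier
print(hinge(residuals).sum())            # 8.4 -- dominated by the outlier
print(truncated_hinge(residuals).sum())  # 1.4 -- outlier contributes at most mu
```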
- We define a universal truncated loss framework that smoothly and adaptively bounds loss functions while preserving their symmetry and sparsity.
- To solve different truncated loss functions, we propose the use of a unified proximal operator algorithm.
- We introduce a fast ADMM algorithm to handle any truncated loss function within a unified scheme.
- We implement the proposed robust SVDD model on various datasets with different noise intensities. The experimental results on real datasets show that the proposed model resists outliers and noise better than traditional methods.
2. Related Works
2.1. SVDD
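For reference, the SVDD primal problem of Tax and Duin encloses the target data in a minimum-volume hypersphere with center $\mathbf{a}$ and radius $R$ in feature space, using slack variables $\xi_i$ and a trade-off parameter $C$:

$$
\min_{R,\,\mathbf{a},\,\boldsymbol{\xi}} \; R^2 + C\sum_{i=1}^{n}\xi_i
\quad \text{s.t.} \quad
\lVert \phi(\mathbf{x}_i) - \mathbf{a} \rVert^2 \le R^2 + \xi_i,\;\; \xi_i \ge 0,\;\; i = 1,\dots,n.
$$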
2.2. Robust SVDD Variants
2.2.1. Weighted SVDDs
2.2.2. Pinball Loss SVDD
2.2.3. SVDD with Mixed Exponential Loss Function
3. Truncated Loss Function
- Truncated generalized ramp loss function: the generalized ramp loss capped at a fixed truncation level.
- Truncated binary cross entropy loss function: the binary cross entropy loss capped at a fixed truncation level.
- Truncated linear exponential loss function: the linear exponential loss capped at a fixed truncation level.
- For samples whose residual lies below the lower breakpoint, the loss value is 0; for samples whose residual exceeds the truncation threshold, the loss value is a fixed constant. Thus, the general truncated loss function exhibits sparsity and robustness to outliers.
- The truncated generalized ramp and truncated binary cross entropy losses are truncated concave loss functions and are non-differentiable at their breakpoints. The truncated linear exponential loss is a truncated convex loss function, non-differentiable at the truncation point and differentiable elsewhere.
- The truncated generalized ramp and truncated binary cross entropy losses admit explicit expressions for their proximal operators, while the truncated linear exponential loss does not. (A generic truncation wrapper is sketched below.)
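A minimal sketch of the truncation idea: cap an arbitrary base loss at a level $\mu$. The hard (non-smooth) cap and the three base losses below are illustrative stand-ins, not the paper's smoothed, adaptive definitions:

```python
import numpy as np

def truncated(base_loss, mu):
    """Hard truncation: cap base_loss at level mu, i.e. u -> min(base_loss(u), mu)."""
    return lambda u: np.minimum(base_loss(u), mu)

# Assumed base-loss forms, for demonstration only:
ramp_base  = lambda u: np.maximum(u, 0.0)             # generalized-ramp style
bce_base   = lambda u: np.log1p(np.exp(u))            # binary cross entropy (softplus) style
linex_base = lambda u: np.expm1(np.maximum(u, 0.0))   # linear-exponential style

tgr = truncated(ramp_base, mu=1.0)
print(tgr(np.array([-1.0, 0.5, 10.0])))  # [0.  0.5 1. ] -- zero, linear, capped regions
```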
3.1. Proximal Operators of Truncated Loss Functions
The explicit proximal operators are derived by a piecewise case analysis: the proximal objective is minimized separately on each interval determined by the breakpoints of the truncated loss (the zero point, the non-differentiable point, and the truncation threshold), and the candidate minimizers from the intervals are compared to select the global solution. The resulting operators either leave the input unchanged, shrink it toward a breakpoint, or map it into the truncated (constant-loss) region. For the truncated loss without a closed-form proximal operator, the minimizer is instead characterized by a corresponding set of piecewise optimality conditions. (A worked special case is sketched below.)
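The paper's explicit expressions are not reproduced here; as a self-contained illustration of the candidate-comparison construction, the following sketch computes the proximal operator of the plain truncated hinge loss $\ell(u)=\min(\max(u,0),\mu)$, an assumed special case rather than the paper's Formula (20):

```python
import numpy as np

def prox_truncated_hinge(v, lam, mu):
    """prox_{lam*l}(v) for l(u) = min(max(u, 0), mu).

    The objective lam*l(u) + 0.5*(u - v)^2 is piecewise; its global minimizer
    lies among the interior stationary points of the pieces (v on the two flat
    pieces, v - lam on the linear piece) and the breakpoints 0 and mu, so it
    suffices to evaluate the objective at these candidates and keep the best.
    """
    candidates = np.array([v, v - lam, 0.0, mu])
    loss = np.minimum(np.maximum(candidates, 0.0), mu)
    objective = lam * loss + 0.5 * (candidates - v) ** 2
    return candidates[np.argmin(objective)]
```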
3.2. The Use of the Proximal Operator Algorithm to Solve Truncated Loss Functions
Algorithm 1: Algorithm for solving the proximal operator of the truncated loss function
1: Initialize the iterate and the loss parameters.
2: While the stopping criterion is not met, do
3: Compute the candidate solution according to Formula (21).
4: The corresponding proximal objective value is obtained.
5: Compute the alternative candidate according to Formula (22).
6: Evaluate the proximal objective at this candidate.
7: Retain the candidate with the smaller objective value.
8: Update the iterate.
9: End
10: Select the best candidate found.
11: Output the proximal point according to Formula (23).
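When the proximal operator has no explicit expression, a one-dimensional numeric search can stand in for Algorithm 1. The sketch below is such a stand-in under assumed settings (a scalar loss and a fixed search bracket), not the paper's update rules (21)-(23):

```python
from scipy.optimize import minimize_scalar

def prox_numeric(loss, lam, v, lo=-10.0, hi=10.0):
    """Numerically approximate prox_{lam*loss}(v) = argmin_u lam*loss(u) + 0.5*(u - v)^2."""
    objective = lambda u: lam * loss(u) + 0.5 * (u - v) ** 2
    result = minimize_scalar(objective, bounds=(lo, hi), method="bounded")
    # Truncated losses are nonconvex, so the bounded search may stop at a local
    # minimum; also test u = v (the minimizer on the flat, truncated region)
    # and keep whichever candidate attains the smaller objective.
    return min([result.x, v], key=objective)
```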
4. Robust SVDD Model
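A schematic of the resulting model, assuming (as the construction above suggests) that the slack penalty of standard SVDD is replaced by a truncated loss $L_\mu$ applied to the signed squared-distance residual; the exact formulation in the paper may differ:

$$
\min_{R,\,\mathbf{a}} \; R^2 + C\sum_{i=1}^{n} L_\mu\!\left(\lVert \phi(\mathbf{x}_i) - \mathbf{a} \rVert^2 - R^2\right).
$$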
5. Fast ADMM Algorithm
5.1. Fast ADMM Algorithm Framework
- 1. Computing the update of the model variables;
- 2. Computing the update of the auxiliary variables via the proximal operator of the truncated loss;
- 3. Computing the update of the dual (multiplier) variables;
- 4. Computing the acceleration (extrapolation) step.
Algorithm 2: Fast ADMM Algorithm
1: Initialize the primal, auxiliary, and dual variables.
2: While the termination conditions are not met, perform the following:
3: Update the model variables according to Formula (43).
4: Update the auxiliary variables according to Formula (47).
5: If the proximal operator of the truncated loss has an explicit expression, use Formula (20) for the proximal step;
6: if it does not have an explicit expression, use Algorithm 1.
7: Update the dual variables according to Formula (38).
8: Apply the acceleration step.
9: End
10: Output the solution.
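A structural sketch of an accelerated ("fast") ADMM loop in scaled form for $\min f(x) + g(z)$ subject to $x = z$, with Nesterov-style extrapolation of the auxiliary and dual variables. The callables `update_x` and `prox_loss` are placeholders standing in for Formulas (43)/(47) and the proximal step (Formula (20) or Algorithm 1); restart safeguards are omitted:

```python
import numpy as np

def fast_admm(update_x, prox_loss, z0, n_iter=200):
    """Accelerated ADMM skeleton (after Goldstein et al.'s fast ADMM pattern)."""
    z = z0.copy()
    u = np.zeros_like(z0)                 # scaled dual variable
    z_hat, u_hat, t = z.copy(), u.copy(), 1.0
    for _ in range(n_iter):
        x = update_x(z_hat, u_hat)        # model-variable update
        z_new = prox_loss(x + u_hat)      # truncated-loss proximal step
        u_new = u_hat + x - z_new         # dual (multiplier) update
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        beta = (t - 1.0) / t_new          # extrapolation weight
        z_hat = z_new + beta * (z_new - z)
        u_hat = u_new + beta * (u_new - u)
        z, u, t = z_new, u_new, t_new
    return x, z, u
```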
5.2. Global Convergence Analysis of the Fast ADMM Algorithm
5.3. Fast ADMM Algorithm Termination Conditions
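The paper's exact conditions are given in this subsection; a common residual-based instantiation (the standard test of Boyd et al. for scaled-form ADMM, assumed here for illustration) stops when both the primal residual $r = x - z$ and the dual residual $s = \rho(z_{k-1} - z_k)$ fall below mixed absolute/relative tolerances:

```python
import numpy as np

def admm_converged(x, z, z_prev, u, rho, eps_abs=1e-4, eps_rel=1e-3):
    """Standard primal/dual residual stopping test for scaled-form ADMM."""
    r = x - z                              # primal residual
    s = rho * (z_prev - z)                 # dual residual
    n = x.size
    eps_pri = np.sqrt(n) * eps_abs + eps_rel * max(np.linalg.norm(x), np.linalg.norm(z))
    eps_dual = np.sqrt(n) * eps_abs + eps_rel * np.linalg.norm(rho * u)
    return np.linalg.norm(r) <= eps_pri and np.linalg.norm(s) <= eps_dual
```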
6. Experiment
- TGR-SVDD: the SVDD algorithm with the truncated generalized ramp loss function;
- TBCE-SVDD: the SVDD algorithm with the truncated binary cross entropy loss function;
- TLE-SVDD: the SVDD algorithm with the truncated linear exponential loss function.
6.1. Experimental Setup
6.1.1. Evaluation Metrics
6.1.2. Kernels
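The kernel determines the feature map $\phi$ implicitly. A standard choice for SVDD, assumed here for concreteness, is the Gaussian (RBF) kernel, whose bandwidth $\sigma$ controls how tightly the description fits the data (cf. the trace kernel bandwidth criterion of Chaudhuri et al.):

$$
k(\mathbf{x}, \mathbf{y}) = \exp\!\left(-\frac{\lVert \mathbf{x} - \mathbf{y} \rVert^2}{2\sigma^2}\right).
$$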
6.1.3. Parameter Configuration
6.2. Synthetic Datasets with Noise
- Neighboring noise: Noise points are randomly distributed near the normal samples but do not completely overlap with the normal samples, forming a more discrete distribution characteristic. This noise simulates the common boundary ambiguity in practical applications.
- Regional noise: Noise points are randomly distributed within a specified area, forming sparse clusters. This noise simulates possible local anomalies in practical applications.
- Circular Dataset
- Normal samples: This consists of 300 two-dimensional sample points distributed over concentric circles, forming the normal-data pattern.
- Noise samples: This consists of 40 noise points of two types: 20 noise points randomly scattered near the normal samples, showing a discrete distribution, and another 20 noise points randomly distributed within a specified region, forming sparse clusters (a generation sketch follows this list).
- Illustration: Figure 1a shows the circular dataset, where blue points represent normal samples, and red points represent noise.
- Banana-shaped Dataset
- Normal samples: This consists of 300 two-dimensional sample points distributed along curved lines, resembling a banana shape.
- Noise samples: This consists of 40 noise points, divided into two types: 10 noise points randomly distributed near the banana-shaped normal sample points, and another 30 noise points randomly distributed within a specified area, forming sparse clusters.
- Illustration: Figure 1b shows the banana-shaped dataset, where blue points represent normal samples, and red points represent noise samples.
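A hypothetical generator for the circular dataset of Figure 1a, illustrating the two noise types; the ring radius, perturbation scale, and noise region below are assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(0)

# 300 normal samples on a noisy ring (assumed radius 1.0, spread 0.1).
theta = rng.uniform(0.0, 2.0 * np.pi, 300)
radius = rng.normal(1.0, 0.1, 300)
normal = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])

# 20 "neighboring" noise points: normal samples plus a larger perturbation,
# so they scatter near, but do not overlap, the normal cloud.
near_noise = normal[rng.choice(300, 20, replace=False)] + rng.normal(0.0, 0.3, (20, 2))

# 20 "regional" noise points: uniform in an assumed square region away from
# the ring, forming a sparse cluster.
region_noise = rng.uniform(low=[1.5, 1.5], high=[2.5, 2.5], size=(20, 2))
```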
6.3. UCI Datasets in the Presence of Noise
6.3.1. Training Datasets without Non-Target Data
6.3.2. Training Datasets with Non-Target Data
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Chandola, V.; Banerjee, A.; Kumar, V. Anomaly detection: A survey. ACM Comput. Surv. 2009, 41, 15.
- Pimentel, M.A.; Clifton, D.A.; Clifton, L.; Tarassenko, L. A review of novelty detection. Signal Process. 2014, 99, 215–249.
- Lei, Y.; Yang, B.; Jiang, X.; Jia, F.; Li, N.; Nandi, A.K. Applications of machine learning to machine fault diagnosis: A review and roadmap. Mech. Syst. Signal Process. 2020, 138, 106587.
- Hasani, R.; Wang, G.; Grosu, R. A machine learning suite for machine components’ health-monitoring. In Proceedings of the 33rd AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; pp. 9472–9477.
- Khan, S.S.; Madden, M.G. One-class classification: Taxonomy of study and review of techniques. Knowl. Eng. Rev. 2014, 29, 345–374.
- Khan, S.S.; Madden, M.G. A survey of recent trends in one class classification. In Proceedings of the 20th Annual Irish Conference on Artificial Intelligence and Cognitive Science, Dublin, Ireland, 19–21 August 2009; pp. 188–197.
- Alam, S.; Sonbhadra, S.K.; Agarwal, S.; Nagabhushan, P. One-class support vector classifiers: A survey. Knowl.-Based Syst. 2020, 196, 105754.
- Tax, D.M.J.; Duin, R.P.W. Support vector data description. Mach. Learn. 2004, 54, 45–66.
- Zheng, S. A fast iterative algorithm for support vector data description. Int. J. Mach. Learn. Cybern. 2019, 10, 1173–1187.
- Turkoz, M.; Kim, S.; Son, Y.; Jeong, M.K.; Elsayed, E.A. Generalized support vector data description for anomaly detection. Pattern Recognit. 2020, 100, 107119.
- Fong, S.; Narasimhan, S. An Unsupervised Bayesian OC-SVM Approach for Early Degradation Detection, Thresholding, and Fault Prediction in Machinery Monitoring. IEEE Trans. Instrum. Meas. 2022, 71, 3500811.
- Breunig, M.M.; Kriegel, H.P.; Ng, R.T.; Sander, J. LOF: Identifying density-based local outliers. In Proceedings of the ACM SIGMOD International Conference on Management of Data, Dallas, TX, USA, 15–18 May 2000; pp. 93–104.
- Zheng, L.; Hu, W.; Min, Y. Raw Wind Data Preprocessing: A Data-Mining Approach. IEEE Trans. Sustain. Energy 2015, 6, 11–19.
- Khan, S.S.; Karg, M.E.; Kulic, D.; Hoey, J. X-factor HMMs for detecting falls in the absence of fall-specific training data. In Proceedings of the Ambient Assisted Living and Daily Activities: 6th International Work-Conference, IWAAL 2014, Belfast, UK, 2–5 December 2014; pp. 1–9.
- Andreou, C.; Karathanassi, V. Estimation of the Number of Endmembers Using Robust Outlier Detection Method. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 247–256.
- Lu, J.; Gao, Y.; Zhang, L. A novel dynamic radius support vector data description based fault diagnosis method for proton exchange membrane fuel cell systems. Int. J. Hydrogen Energy 2022, 47, 35825–35837.
- Zhao, Y.-P.; Xie, Y.-L.; Ye, Z.-F. A new dynamic radius SVDD for fault detection of aircraft engine. Eng. Appl. Artif. Intell. 2021, 100, 104177.
- Zhu, F.; Yang, J.; Gao, C.; Xu, S.; Ye, N.; Yin, T. A weighted one-class support vector machine. Neurocomputing 2016, 189, 1–10.
- Chen, G.; Zhang, X.; Wang, Z.J. Robust support vector data description for outlier detection with noise or uncertain data. Knowl.-Based Syst. 2015, 90, 129–137.
- Cha, M.; Kim, J.S.; Baek, J.G. Density weighted support vector data description. Expert Syst. Appl. 2014, 41, 3343–3350.
- Sadeghi, R.; Hamidzadeh, J. Automatic support vector data description. Soft Comput. 2018, 22, 147–158.
- Hu, W.; Hu, T.; Wei, Y.; Lou, J.; Wang, S. Global Plus Local Jointly Regularized Support Vector Data Description for Novelty Detection. IEEE Trans. Neural Netw. Learn. Syst. 2023, 34, 6602–6614.
- Zhao, Y.-P.; Huang, G.; Hu, Q.-K.; Li, B. An improved weighted one class support vector machine for turboshaft engine fault detection. Eng. Appl. Artif. Intell. 2020, 94, 103796.
- Wang, K.; Lan, H. Robust support vector data description for novelty detection with contaminated data. Eng. Appl. Artif. Intell. 2020, 91, 103554.
- Xing, H.-J.; Li, L.-F. Robust least squares one-class support vector machine. Pattern Recognit. Lett. 2020, 138, 571–578.
- Xiao, Y.; Wang, H.; Xu, W. Ramp Loss based robust one-class SVM. Pattern Recognit. Lett. 2017, 85, 15–20.
- Tian, Y.; Mirzabagheri, M.; Bamakan, S.M.H. Ramp loss one-class support vector machine; A robust and effective approach to anomaly detection problems. Neurocomputing 2018, 310, 223–235.
- Xing, H.; Ji, M. Robust one-class support vector machine with rescaled hinge loss function. Pattern Recognit. 2018, 84, 152–164.
- Zhong, G.; Xiao, Y.; Liu, B.; Zhao, L.; Kong, X. Pinball loss support vector data description for outlier detection. Appl. Intell. 2022, 52, 16940–16961.
- Zheng, Y.; Wang, S.; Chen, B. Robust one-class classification with support vector data description and mixed exponential loss function. Eng. Appl. Artif. Intell. 2023, 122, 106153.
- Le Thi, H.A.; Pham Dinh, T. DC programming and DCA: Thirty years of developments. Math. Program. 2018, 169, 5–68.
- Liu, J.; Pang, J.S. Risk-based robust statistical learning by stochastic difference-of-convex value-function optimization. Oper. Res. 2023, 71, 397–414.
- Yuille, A.L.; Rangarajan, A. The concave-convex procedure. Neural Comput. 2003, 15, 915–936.
- Tao, Q.; Wu, G.; Chu, D. Improving sparsity and scalability in regularized nonconvex truncated-loss learning problems. IEEE Trans. Neural Netw. Learn. Syst. 2017, 29, 2782–2793.
- Wang, H.J.; Shao, Y.H.; Xiu, N.H. Proximal operator and optimality conditions for ramp loss SVM. Optim. Lett. 2022, 16, 999–1014.
- Gong, P.; Zhang, C.; Lu, Z. A general iterative shrinkage and thresholding algorithm for non-convex regularized optimization problems. In Proceedings of the 30th International Conference on Machine Learning, ICML 2013, Atlanta, GA, USA, 16–21 June 2013; pp. 696–704.
- Scholkopf, B.; Herbrich, R.; Smola, A.J. A generalized representer theorem. In Proceedings of the 14th Annual Conference on Computational Learning Theory, Amsterdam, The Netherlands, 16–19 July 2001; pp. 416–426.
- Guan, L.; Qiao, L.; Li, D.; Sun, T.; Ge, K.; Lu, X. An efficient ADMM-based algorithm to nonconvex penalized support vector machines. In Proceedings of the IEEE International Conference on Data Mining Workshops (ICDMW), Singapore, 17–20 November 2018; pp. 1209–1216.
- Wu, M.; Ye, J. A small sphere and large margin approach for novelty detection using training data with outliers. IEEE Trans. Pattern Anal. Mach. Intell. 2009, 31, 2088–2092.
- Xing, H.; Liu, Y.; He, Z. Robust sparse coding for one-class classification based on correntropy and logarithmic penalty function. Pattern Recognit. 2021, 111, 107685.
- Zheng, Y.; Wang, S.; Chen, B. Multikernel correntropy based robust least squares one-class support vector machine. Neurocomputing 2023, 545, 126324.
- Chaudhuri, A.; Sadek, C.; Kakde, D.; Wang, H.; Hu, W.; Jiang, H.; Kong, S.; Liao, Y.; Peredriy, S. The trace kernel bandwidth criterion for support vector data description. Pattern Recognit. 2021, 111, 107662.
- Dua, D.; Graff, C. UCI Machine Learning Repository. 2008. University of California, Irvine, School of Information and Computer Sciences. Available online: https://archive.ics.uci.edu/ml (accessed on 20 July 2024).
| Dataset | Noise ratio | SVDD | DW-SVDD | R-SVDD | GL-SVDD | TGR-SVDD | TBCE-SVDD | TLE-SVDD |
|---|---|---|---|---|---|---|---|---|
| Blood | 10% | 49.95 | 49.28 | 51.25 | 49.19 | 52.79 | 52.79 | 54.82 |
| | 20% | 46.87 | 46.43 | 48.65 | 46.84 | 50.21 | 50.21 | 51.50 |
| | 30% | 42.34 | 41.87 | 42.34 | 42.34 | 45.96 | 45.96 | 52.41 |
| | 40% | 41.39 | 41.52 | 41.39 | 41.58 | 45.06 | 45.06 | 51.50 |
| | 50% | 40.53 | 40.56 | 40.69 | 40.90 | 42.60 | 42.60 | 50.72 |
| Balance scale | 10% | 73.70 | 73.80 | 73.91 | 73.64 | 74.38 | 74.38 | 74.32 |
| | 20% | 68.58 | 68.67 | 68.60 | 68.60 | 66.62 | 66.62 | 67.19 |
| | 30% | 62.66 | 64.26 | 62.66 | 62.71 | 62.70 | 62.70 | 63.75 |
| | 40% | 57.38 | 58.44 | 57.38 | 57.44 | 58.60 | 58.60 | 64.32 |
| | 50% | 53.47 | 54.72 | 53.43 | 53.47 | 56.51 | 56.51 | 61.13 |
| Ecoli | 10% | 77.50 | 77.48 | 80.63 | 76.28 | 86.08 | 86.08 | 81.95 |
| | 20% | 70.85 | 67.02 | 70.26 | 70.85 | 73.08 | 73.08 | 73.53 |
| | 30% | 67.00 | 62.77 | 67.22 | 67.00 | 70.61 | 70.61 | 70.29 |
| | 40% | 64.75 | 60.37 | 64.75 | 64.75 | 67.57 | 67.57 | 68.33 |
| | 50% | 59.94 | 55.18 | 59.87 | 59.94 | 64.26 | 64.26 | 64.42 |
| Haberman | 10% | 54.37 | 54.57 | 53.36 | 55.49 | 54.06 | 54.06 | 52.09 |
| | 20% | 54.85 | 55.01 | 54.87 | 54.93 | 55.34 | 55.34 | 54.30 |
| | 30% | 53.38 | 53.58 | 54.86 | 53.54 | 57.13 | 57.13 | 56.02 |
| | 40% | 52.54 | 52.31 | 52.73 | 52.45 | 53.89 | 53.89 | 53.93 |
| | 50% | 50.12 | 49.74 | 49.95 | 50.09 | 51.92 | 51.92 | 52.03 |
| Iris | 10% | 76.80 | 79.50 | 76.92 | 78.18 | 80.74 | 80.74 | 79.52 |
| | 20% | 69.29 | 68.94 | 69.39 | 69.52 | 71.13 | 71.13 | 70.97 |
| | 30% | 65.77 | 65.52 | 65.67 | 65.85 | 70.46 | 70.46 | 69.46 |
| | 40% | 61.67 | 61.69 | 61.41 | 61.96 | 65.46 | 65.46 | 65.84 |
| | 50% | 60.21 | 59.95 | 60.32 | 60.48 | 67.72 | 67.72 | 68.02 |
| Wine | 10% | 79.76 | 79.76 | 79.02 | 79.76 | 80.18 | 80.18 | 80.74 |
| | 20% | 71.66 | 73.59 | 73.02 | 72.32 | 75.03 | 75.03 | 76.23 |
| | 30% | 68.73 | 70.11 | 71.49 | 69.04 | 73.12 | 73.12 | 72.44 |
| | 40% | 64.30 | 64.25 | 64.21 | 64.01 | 68.35 | 68.35 | 68.96 |
| | 50% | 64.03 | 63.93 | 64.06 | 63.09 | 68.00 | 68.00 | 68.59 |
| Ionosphere | 10% | 84.22 | 84.43 | 85.08 | 84.22 | 86.89 | 86.89 | 86.81 |
| | 20% | 82.98 | 83.30 | 82.89 | 83.22 | 84.12 | 84.12 | 83.38 |
| | 30% | 81.94 | 82.20 | 81.35 | 81.89 | 82.42 | 82.42 | 82.18 |
| | 40% | 83.55 | 83.37 | 80.58 | 83.33 | 83.14 | 83.14 | 82.86 |
| | 50% | 82.28 | 82.44 | 77.87 | 82.28 | 81.96 | 81.96 | 82.06 |
| Sonar | 10% | 56.46 | 56.89 | 56.16 | 57.22 | 58.62 | 58.62 | 59.58 |
| | 20% | 53.46 | 51.48 | 53.46 | 53.46 | 53.50 | 53.50 | 53.73 |
| | 30% | 49.15 | 47.17 | 49.15 | 49.15 | 55.29 | 55.29 | 55.28 |
| | 40% | 52.48 | 50.04 | 52.48 | 52.48 | 57.57 | 57.57 | 57.68 |
| | 50% | 45.22 | 43.31 | 45.22 | 45.22 | 50.10 | 50.10 | 50.38 |
| Dataset | Noise ratio | SVDD | DW-SVDD | R-SVDD | GL-SVDD | TGR-SVDD | TBCE-SVDD | TLE-SVDD |
|---|---|---|---|---|---|---|---|---|
| Blood | 10% | 72.17 | 73.13 | 73.18 | 73.21 | 73.05 | 73.05 | 72.93 |
| | 20% | 68.98 | 68.09 | 69.12 | 68.85 | 68.72 | 68.72 | 68.19 |
| | 30% | 65.93 | 66.54 | 67.33 | 65.93 | 68.42 | 68.42 | 68.44 |
| | 40% | 64.42 | 64.95 | 64.26 | 64.28 | 66.25 | 66.25 | 65.95 |
| | 50% | 63.38 | 62.37 | 63.27 | 62.64 | 65.25 | 65.25 | 64.06 |
| Balance scale | 10% | 58.14 | 58.28 | 58.13 | 58.07 | 56.67 | 56.48 | 57.01 |
| | 20% | 50.57 | 51.47 | 50.19 | 51.49 | 50.74 | 50.74 | 51.14 |
| | 30% | 48.61 | 48.92 | 48.63 | 48.65 | 49.75 | 49.75 | 49.58 |
| | 40% | 48.18 | 48.82 | 48.18 | 48.21 | 53.02 | 53.02 | 53.65 |
| | 50% | 49.07 | 49.21 | 49.16 | 49.21 | 49.81 | 49.81 | 53.48 |
| Ecoli | 10% | 60.55 | 60.48 | 60.51 | 55.27 | 72.52 | 63.25 | 64.12 |
| | 20% | 47.43 | 50.81 | 46.49 | 50.81 | 54.71 | 54.60 | 56.57 |
| | 30% | 49.30 | 47.04 | 43.51 | 49.30 | 53.46 | 53.26 | 53.27 |
| | 40% | 50.95 | 49.43 | 46.08 | 50.95 | 53.02 | 53.02 | 53.65 |
| | 50% | 50.49 | 50.58 | 49.20 | 50.58 | 51.93 | 51.93 | 52.11 |
| Haberman | 10% | 57.98 | 58.28 | 56.01 | 62.42 | 57.63 | 58.15 | 57.79 |
| | 20% | 60.97 | 60.17 | 59.67 | 60.29 | 60.65 | 60.27 | 59.76 |
| | 30% | 62.18 | 61.95 | 62.36 | 62.23 | 64.10 | 63.72 | 64.22 |
| | 40% | 63.87 | 64.41 | 62.04 | 64.02 | 64.62 | 64.45 | 63.87 |
| | 50% | 56.66 | 58.02 | 57.51 | 58.25 | 60.52 | 60.61 | 59.11 |
| Iris | 10% | 66.07 | 57.62 | 68.75 | 56.52 | 60.02 | 60.41 | 62.75 |
| | 20% | 51.17 | 51.71 | 53.99 | 51.79 | 55.66 | 55.66 | 55.79 |
| | 30% | 41.10 | 41.02 | 42.15 | 41.17 | 49.74 | 49.74 | 51.22 |
| | 40% | 40.04 | 40.13 | 41.97 | 40.41 | 44.94 | 44.94 | 45.79 |
| | 50% | 43.39 | 43.70 | 44.28 | 43.99 | 49.97 | 49.97 | 50.34 |
| Wine | 10% | 60.87 | 63.87 | 62.77 | 53.66 | 66.26 | 66.76 | 67.28 |
| | 20% | 44.97 | 43.48 | 41.97 | 43.94 | 55.32 | 50.34 | 51.50 |
| | 30% | 42.46 | 43.91 | 43.33 | 42.81 | 48.67 | 47.52 | 48.27 |
| | 40% | 42.06 | 42.45 | 42.56 | 41.94 | 45.60 | 45.29 | 49.26 |
| | 50% | 46.64 | 46.61 | 46.24 | 45.98 | 48.96 | 48.94 | 49.59 |
| Ionosphere | 10% | 80.87 | 81.09 | 81.29 | 80.87 | 83.08 | 82.89 | 83.08 |
| | 20% | 80.40 | 80.03 | 80.74 | 80.31 | 81.00 | 80.36 | 80.28 |
| | 30% | 79.46 | 79.76 | 79.98 | 79.40 | 81.21 | 80.60 | 80.52 |
| | 40% | 81.52 | 81.93 | 81.26 | 81.88 | 82.00 | 82.12 | 82.02 |
| | 50% | 81.85 | 81.67 | 81.48 | 81.67 | 82.08 | 81.99 | 81.97 |
| Sonar | 10% | 55.23 | 54.30 | 55.20 | 54.60 | 54.17 | 54.17 | 55.14 |
| | 20% | 51.43 | 52.25 | 52.25 | 52.35 | 53.03 | 52.73 | 52.13 |
| | 30% | 48.73 | 47.78 | 48.73 | 48.73 | 51.52 | 51.52 | 51.47 |
| | 40% | 51.16 | 50.34 | 51.16 | 51.16 | 52.29 | 52.29 | 52.30 |
| | 50% | 48.26 | 49.08 | 49.08 | 49.08 | 51.78 | 51.78 | 51.67 |
| Dataset | Noise ratio | SVDD | DW-SVDD | R-SVDD | GL-SVDD | TGR-SVDD | TBCE-SVDD | TLE-SVDD |
|---|---|---|---|---|---|---|---|---|
| Blood | 10% | 53.18 | 51.66 | 52.05 | 53.66 | 52.29 | 52.29 | 54.45 |
| | 20% | 50.82 | 50.91 | 51.65 | 51.57 | 52.02 | 52.02 | 53.39 |
| | 30% | 49.72 | 49.19 | 51.13 | 48.78 | 53.43 | 53.43 | 52.13 |
| | 40% | 51.00 | 49.23 | 50.16 | 48.43 | 51.21 | 51.21 | 53.87 |
| | 50% | 48.04 | 48.72 | 47.89 | 46.88 | 52.87 | 52.87 | 52.42 |
| Balance scale | 10% | 78.86 | 80.91 | 79.70 | 80.24 | 81.25 | 81.25 | 77.64 |
| | 20% | 73.53 | 75.25 | 74.18 | 73.41 | 76.07 | 76.07 | 73.65 |
| | 30% | 65.78 | 67.38 | 66.20 | 65.51 | 68.09 | 68.09 | 66.48 |
| | 40% | 60.74 | 62.30 | 62.10 | 60.56 | 67.11 | 67.11 | 65.78 |
| | 50% | 56.79 | 61.48 | 58.46 | 60.08 | 68.24 | 68.24 | 64.86 |
| Ecoli | 10% | 82.10 | 82.36 | 83.95 | 83.08 | 85.34 | 85.34 | 82.10 |
| | 20% | 77.46 | 77.48 | 80.02 | 77.08 | 82.53 | 82.53 | 77.82 |
| | 30% | 69.55 | 67.74 | 68.52 | 70.88 | 71.96 | 71.96 | 71.32 |
| | 40% | 74.09 | 70.07 | 71.19 | 72.08 | 74.45 | 74.45 | 74.35 |
| | 50% | 66.93 | 63.38 | 64.87 | 63.58 | 68.18 | 68.18 | 67.36 |
| Haberman | 10% | 55.59 | 55.41 | 54.81 | 55.51 | 56.72 | 56.72 | 57.09 |
| | 20% | 58.66 | 58.88 | 54.12 | 58.80 | 61.99 | 61.99 | 59.75 |
| | 30% | 55.13 | 55.22 | 54.20 | 55.47 | 56.14 | 56.14 | 54.65 |
| | 40% | 51.83 | 52.00 | 57.63 | 50.38 | 56.01 | 56.01 | 54.81 |
| | 50% | 55.64 | 55.70 | 55.26 | 55.52 | 58.27 | 58.27 | 57.60 |
| Iris | 10% | 80.70 | 81.45 | 81.22 | 80.38 | 79.13 | 79.13 | 80.18 |
| | 20% | 79.01 | 78.31 | 78.75 | 77.94 | 77.37 | 77.37 | 78.63 |
| | 30% | 66.77 | 64.48 | 63.58 | 66.63 | 71.12 | 71.12 | 70.78 |
| | 40% | 67.08 | 60.61 | 63.16 | 64.29 | 70.91 | 70.91 | 70.47 |
| | 50% | 62.19 | 53.74 | 61.09 | 59.39 | 67.59 | 67.59 | 67.63 |
| Wine | 10% | 81.54 | 84.11 | 83.77 | 81.54 | 84.35 | 84.35 | 82.45 |
| | 20% | 79.12 | 81.44 | 80.24 | 79.12 | 82.87 | 82.87 | 80.80 |
| | 30% | 71.03 | 69.15 | 70.02 | 71.03 | 75.48 | 75.48 | 71.95 |
| | 40% | 69.28 | 67.97 | 67.75 | 69.28 | 72.29 | 72.29 | 70.47 |
| | 50% | 64.77 | 66.80 | 66.46 | 64.77 | 69.04 | 69.04 | 66.50 |
| Ionosphere | 10% | 86.45 | 86.79 | 88.66 | 87.26 | 89.28 | 89.28 | 89.02 |
| | 20% | 84.84 | 86.01 | 86.38 | 86.38 | 88.49 | 88.49 | 88.28 |
| | 30% | 84.25 | 85.00 | 85.09 | 85.18 | 87.93 | 87.93 | 87.87 |
| | 40% | 82.58 | 83.13 | 83.37 | 83.37 | 86.60 | 86.60 | 86.20 |
| | 50% | 77.26 | 78.56 | 78.54 | 78.62 | 84.29 | 84.29 | 83.95 |
| Sonar | 10% | 57.24 | 58.34 | 58.97 | 58.97 | 58.66 | 58.66 | 58.70 |
| | 20% | 59.23 | 59.25 | 60.35 | 60.35 | 60.21 | 60.21 | 59.79 |
| | 30% | 54.51 | 54.76 | 56.03 | 56.03 | 56.50 | 56.50 | 56.33 |
| | 40% | 55.86 | 55.86 | 58.93 | 58.93 | 58.56 | 58.56 | 60.09 |
| | 50% | 52.21 | 53.14 | 53.92 | 53.92 | 55.45 | 55.45 | 54.71 |
| Dataset | Noise ratio | SVDD | DW-SVDD | R-SVDD | GL-SVDD | TGR-SVDD | TBCE-SVDD | TLE-SVDD |
|---|---|---|---|---|---|---|---|---|
| Blood | 10% | 76.30 | 76.05 | 76.38 | 77.20 | 77.14 | 75.81 | 69.71 |
| | 20% | 65.90 | 65.46 | 66.12 | 66.22 | 73.71 | 70.79 | 65.62 |
| | 30% | 65.56 | 67.43 | 65.40 | 69.15 | 69.32 | 66.81 | 68.44 |
| | 40% | 63.94 | 65.87 | 60.48 | 65.82 | 70.09 | 66.07 | 66.46 |
| | 50% | 65.98 | 65.10 | 63.16 | 68.44 | 80.57 | 68.88 | 69.03 |
| Balance scale | 10% | 65.77 | 68.66 | 66.89 | 67.52 | 69.18 | 66.75 | 66.74 |
| | 20% | 63.48 | 62.23 | 62.34 | 61.64 | 64.09 | 61.90 | 63.58 |
| | 30% | 58.62 | 58.54 | 56.99 | 61.23 | 61.62 | 61.62 | 59.06 |
| | 40% | 64.74 | 63.67 | 63.67 | 63.16 | 66.57 | 66.14 | 66.05 |
| | 50% | 68.25 | 68.78 | 66.92 | 68.02 | 71.23 | 71.23 | 69.84 |
| Ecoli | 10% | 67.58 | 68.46 | 68.29 | 68.51 | 76.37 | 68.84 | 68.58 |
| | 20% | 62.76 | 63.36 | 63.03 | 62.17 | 70.01 | 69.01 | 68.16 |
| | 30% | 58.62 | 58.54 | 56.99 | 60.25 | 61.39 | 61.75 | 61.11 |
| | 40% | 65.01 | 65.49 | 63.81 | 66.17 | 67.89 | 67.86 | 67.67 |
| | 50% | 65.11 | 65.83 | 65.92 | 66.02 | 66.86 | 66.17 | 67.92 |
| Haberman | 10% | 61.22 | 61.07 | 59.24 | 61.12 | 58.04 | 60.43 | 58.08 |
| | 20% | 66.36 | 66.28 | 58.27 | 66.19 | 63.82 | 63.82 | 59.85 |
| | 30% | 61.84 | 59.14 | 59.53 | 61.53 | 64.97 | 65.48 | 64.48 |
| | 40% | 57.60 | 58.29 | 57.46 | 57.45 | 63.08 | 64.17 | 62.35 |
| | 50% | 62.90 | 62.45 | 55.28 | 63.08 | 66.34 | 70.63 | 63.96 |
| Iris | 10% | 61.66 | 62.84 | 63.26 | 61.13 | 61.91 | 63.78 | 60.98 |
| | 20% | 55.14 | 55.39 | 56.23 | 53.85 | 58.24 | 58.24 | 59.73 |
| | 30% | 52.68 | 50.74 | 53.21 | 52.68 | 58.91 | 53.91 | 53.75 |
| | 40% | 54.74 | 51.56 | 53.88 | 55.98 | 57.64 | 57.64 | 57.42 |
| | 50% | 54.19 | 57.43 | 58.42 | 58.47 | 60.52 | 58.79 | 58.64 |
| Wine | 10% | 65.03 | 70.62 | 70.33 | 65.03 | 70.90 | 68.82 | 67.48 |
| | 20% | 64.02 | 59.12 | 64.69 | 59.12 | 62.37 | 59.29 | 59.35 |
| | 30% | 52.68 | 50.74 | 52.21 | 52.68 | 58.91 | 53.91 | 53.75 |
| | 40% | 55.56 | 53.86 | 53.82 | 55.56 | 58.78 | 56.90 | 56.17 |
| | 50% | 58.67 | 57.47 | 57.24 | 57.47 | 60.52 | 59.79 | 58.94 |
| Ionosphere | 10% | 85.21 | 85.55 | 87.22 | 86.02 | 87.95 | 87.95 | 87.64 |
| | 20% | 85.36 | 84.20 | 85.40 | 85.58 | 87.48 | 87.48 | 87.22 |
| | 30% | 86.43 | 86.80 | 86.79 | 86.91 | 88.45 | 88.45 | 88.38 |
| | 40% | 87.39 | 87.46 | 87.19 | 87.58 | 88.95 | 88.80 | 88.67 |
| | 50% | 87.94 | 87.63 | 87.90 | 87.93 | 89.24 | 89.07 | 88.92 |
| Sonar | 10% | 58.54 | 58.75 | 58.71 | 58.71 | 59.05 | 58.88 | 59.12 |
| | 20% | 57.46 | 58.21 | 58.05 | 58.05 | 58.74 | 58.74 | 58.18 |
| | 30% | 55.12 | 52.25 | 52.66 | 52.66 | 60.26 | 59.52 | 56.25 |
| | 40% | 62.15 | 58.30 | 60.23 | 60.23 | 63.52 | 63.52 | 62.38 |
| | 50% | 61.35 | 64.91 | 60.10 | 60.10 | 62.80 | 62.80 | 61.94 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite

Chen, H.; Lyu, Y.; Shi, J.; Zhang, W. Robust Support Vector Data Description with Truncated Loss Function for Outliers Depression. Entropy 2024, 26, 628. https://doi.org/10.3390/e26080628