Millimeter-Wave Radar Clutter Suppression Based on Cycle-Consistency Generative Adversarial Network
Abstract
1. Introduction
1.1. Traditional Clutter Suppression Methods
- Spatial-domain methods represent the earliest radar clutter suppression technology. They mostly use statistical clutter models to describe the clutter component and then eliminate it [16]. Such methods are characterized by their relative simplicity and direct data processing, but they exhibit weak interference resistance. The introduction of the Fourier transform significantly advanced the development of frequency-domain methods in the field of clutter suppression.
- Frequency-domain methods extract the Doppler information of the target and clutter via the Fourier transform. Representative methods include the Moving Target Indicator (MTI) [5], which exploits the difference between the moving-target echo and strong ground clutter in the Doppler domain to construct a zero-frequency notch filter that suppresses the clutter, and Moving Target Detection (MTD) [6], which raises the improvement factor by introducing a bank of filters in the frequency domain. However, the suppression effect of both is not ideal for clutter with a large spectral spread.
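To make the frequency-domain idea concrete, the following sketch (not from the paper; the PRF, pulse count, and signal parameters are illustrative) implements a first-order two-pulse MTI canceller and an MTD-style Doppler filter bank as a DFT across pulses.

```python
# Minimal sketch: two-pulse MTI canceller plus an MTD-style Doppler filter bank
# applied to a slow-time pulse train. All numeric parameters are illustrative.
import numpy as np

def mti_two_pulse(slow_time: np.ndarray) -> np.ndarray:
    """First-order canceller y[n] = x[n] - x[n-1]: a notch at zero Doppler."""
    return slow_time[1:] - slow_time[:-1]

def mtd_filter_bank(slow_time: np.ndarray) -> np.ndarray:
    """MTD as an N-point DFT across pulses: one narrowband Doppler filter per bin."""
    return np.fft.fft(slow_time * np.hanning(len(slow_time)))

# Toy echo: strong stationary clutter (zero Doppler) plus a weak target at 500 Hz.
prf, n_pulses = 4000.0, 64
t = np.arange(n_pulses) / prf
echo = 5.0 * np.ones(n_pulses) + 0.5 * np.exp(2j * np.pi * 500.0 * t)

residue = mti_two_pulse(echo)                         # clutter (DC component) is cancelled
doppler = mtd_filter_bank(residue)                    # Doppler spectrum of the residue
print(np.abs(echo).mean(), np.abs(residue).mean())    # clutter power drops sharply
print(np.argmax(np.abs(doppler)))                     # peaks near bin round(500 / prf * 63) = 8
```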
- Transform-domain methods utilize various mathematical transformations to decompose the signal. The representative methods include:
- (1) Singular value decomposition (SVD) [7] decomposes the input signal into the product of three specific matrices: two orthogonal matrices and a diagonal matrix. The elements along the diagonal of the diagonal matrix are referred to as singular values and are typically organized in descending order; larger singular values correspond to more significant signal features. However, during singular value selection there is a risk of inadvertently suppressing important target signals by misclassifying them as clutter (a minimal sketch of this approach is given after this list).
- (2) Principal component analysis (PCA) [8] captures the linear correlations among the signal features by computing the covariance matrix, performing eigenvalue decomposition on this matrix, and retaining only the principal components associated with the top k eigenvalues. This truncation may also discard critical target information, thereby diminishing the effectiveness of the clutter suppression.
- (3) Robust principal component analysis (RPCA) [9] decomposes the input data into a low-rank matrix and a sparse matrix by solving an optimization problem. However, when handling high-dimensional data its computational complexity is high, resulting in slower processing. Although the above methods generally suppress clutter well, they all require prior modeling and estimation of clutter parameters, making it difficult to adapt to real-time, variable clutter.
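As referenced above, here is a minimal SVD-based clutter suppression sketch (an assumed, generic implementation rather than the paper's): the dominant singular components of a data matrix are treated as clutter and removed. The matrix shape and the number of removed components k are illustrative.

```python
# Minimal sketch of SVD-based clutter suppression: zero out the k largest
# singular values (assumed to belong to the clutter subspace) and reconstruct.
import numpy as np

def svd_clutter_suppression(data: np.ndarray, k: int = 1) -> np.ndarray:
    """Remove the k dominant singular components (assumed clutter) from `data`."""
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    s_sup = s.copy()
    s_sup[:k] = 0.0                 # suppress the dominant (clutter) components
    return (u * s_sup) @ vt         # equivalent to U @ diag(s_sup) @ Vt

# Toy example: a rank-1 "clutter" sheet plus a weak, sparse "target".
rng = np.random.default_rng(0)
clutter = 10.0 * np.outer(rng.normal(size=128), rng.normal(size=64))
target = np.zeros((128, 64)); target[40, 20] = 3.0
suppressed = svd_clutter_suppression(clutter + target, k=1)
print(abs(suppressed[40, 20]))                                        # target largely preserved
print(np.linalg.norm(suppressed) / np.linalg.norm(clutter + target))  # most energy removed
```

The risk mentioned above is visible here: if k is chosen too large, or if the target happens to lie along a dominant singular direction, the reconstruction also removes target energy.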
1.2. Deep Learning Clutter Suppression Methods
- Among convolutional neural network approaches, Li et al. [10] introduced a clutter suppression algorithm utilizing an encoder–decoder to address object detection at various signal-to-noise ratios in real sea conditions. Meanwhile, Guo et al. [11] and Zhang et al. [12] developed sea clutter suppression networks employing deep convolutional autoencoders. Additionally, Geng et al. [13] proposed a ground-penetrating radar clutter suppression algorithm based on an LSTM network, and Wen et al. [10] presented a sea clutter suppression method based on deep convolutional neural networks. While these approaches have yielded promising results, they often require a substantial number of clutter and clutter-free sample pairs during the training phase. In practice, obtaining clutter-free labeled images is challenging, which limits the performance of the above supervised learning methods in clutter suppression tasks.
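The paired-data requirement can be seen in a minimal supervised training step for a convolutional autoencoder of the kind used in [11,12]; the architecture below is an assumed, simplified stand-in (channel counts, kernel sizes, and the L1 criterion are illustrative), not the cited networks.

```python
# Sketch: convolutional autoencoder for clutter suppression, trained with
# paired cluttered / clutter-free images (the pairing is exactly what is hard
# to obtain in practice, motivating the unpaired CycleGAN approach).
import torch
import torch.nn as nn

class ClutterAutoencoder(nn.Module):
    def __init__(self, in_ch: int = 1):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, in_ch, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

model = ClutterAutoencoder()
cluttered = torch.randn(8, 1, 128, 128)   # batch of cluttered radar images (dummy data)
clean = torch.randn(8, 1, 128, 128)       # paired clutter-free labels (dummy data)
loss = nn.functional.l1_loss(model(cluttered), clean)
loss.backward()
```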
- With the advent of GANs [20], the feature-learning capability of neural networks has advanced significantly. GANs also considerably lessen the reliance on labeled training datasets, facilitating the integration of deep learning methods into radar clutter suppression. Ni et al. [14] designed an unsupervised conditional generative adversarial network that maps ground-penetrating radar data from cluttered to clutter-free, and it shows good applicability on real data. Furthermore, Mou et al. [15] introduced a novel sea clutter suppression method, SCS-GAN, which further expands the radar training set by means of a GAN; this network incorporates an attention mechanism to further boost its ability to extract clutter features. Although existing deep learning-based clutter suppression methods have shown good performance, there is a scarcity of techniques specifically aimed at suppressing radar image clutter in traffic scenarios. Moreover, the coexistence of object signals and clutter within the echo signal poses a challenge, as the distribution of the object echoes is relatively sparse compared to that of the clutter. Effectively suppressing clutter while retaining the object information is therefore crucial for enhancing the overall performance of object detection.
- Multi-scale feature fusion network: The generator incorporates a multi-scale feature fusion network enhanced with an attention mechanism. This design effectively captures contextual information across multiple scales, thereby emphasizing both the categorization and localization of targets. The attention mechanism ensures a focus on the region of interest throughout the clutter-suppressing process.
- Target consistency loss: An additional term, termed the target consistency loss, is incorporated into the original loss function. This regularization term preserves the integrity of the target information by limiting the gap between the generator's input and output (a hedged sketch of the combined objective follows this list). Furthermore, it helps mitigate the risk of overfitting, which is particularly relevant when dealing with limited training data.
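The sketch below illustrates how such a term could be folded into one direction of a CycleGAN-style generator objective. The λ (cycle consistency) and μ (target consistency) weights mirror the hyperparameters ablated in Section 4.2.3; the use of L1 norms, the LSGAN criterion, and the names `G`, `F`, and `D_B` are assumptions for illustration, not the authors' exact formulation.

```python
# Hedged sketch: CycleGAN-style generator objective with an extra target
# consistency term that penalizes the gap between the generator's input and
# output, so sparse target echoes survive clutter suppression.
import torch
import torch.nn as nn

mse = nn.MSELoss()  # LSGAN-style adversarial criterion (an assumption)
l1 = nn.L1Loss()

def generator_objective(G, F, D_B, real_A, lam: float = 10.0, mu: float = 1.0):
    """One direction only: G maps cluttered A to clean B, F maps back, and D_B
    judges clean-domain images; the symmetric B->A terms are omitted for brevity."""
    fake_B = G(real_A)                        # clutter-suppressed output
    pred = D_B(fake_B)
    adv = mse(pred, torch.ones_like(pred))    # adversarial term: try to fool D_B
    cycle = l1(F(fake_B), real_A)             # cycle consistency: A -> B -> A
    target_consistency = l1(fake_B, real_A)   # keep the target structure of the input
    return adv + lam * cycle + mu * target_consistency
```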
2. Operational Principles of CycleGAN
3. Proposed Method
3.1. Architecture of the Clutter Suppression Network
3.1.1. Architecture of Generator
3.1.2. Architecture of Discriminator
3.2. Loss Function
4. Experiments and Results
4.1. Experiment Implementation
4.1.1. Data Preparation
4.1.2. Network Training
4.1.3. Evaluation Index
4.2. Ablation Experiment
4.2.1. Model Choice
4.2.2. Ablation of Loss Function
4.2.3. Ablation of Hyperparameters
4.3. Suppression Performance Comparison
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Wei, Z.; Zhang, F.; Chang, S.; Liu, Y.; Wu, H.; Feng, Z. MmWave Radar and Vision Fusion for Object Detection in Autonomous Driving: A Review. Sensors 2022, 22, 2542.
2. Zhou, T.; Yang, M.; Jiang, K.; Wong, H.; Yang, D. MMW Radar-Based Technologies in Autonomous Driving: A Review. Sensors 2020, 20, 7283.
3. Abdu, F.J.; Zhang, Y.; Fu, M.; Li, Y.; Deng, Z. Application of Deep Learning on Millimeter-Wave Radar Signals: A Review. Sensors 2021, 21, 1951.
4. Huang, P.; Yang, H.; Zou, Z.; Xia, X.-G.; Liao, G. Multichannel Clutter Modeling, Analysis, and Suppression for Missile-Borne Radar Systems. IEEE Trans. Aerosp. Electron. Syst. 2022, 58, 3236–3260.
5. Weng, Z.Y. Optimal design of clutter rejection filters for MTI system. In Proceedings of the 2001 CIE International Conference on Radar Proceedings (Cat. No.01TH8559), Beijing, China, 15–18 October 2001; pp. 475–478.
6. Wang, H.; Cai, L. A localized adaptive MTD processor. IEEE Trans. Aerosp. Electron. Syst. 1991, 27, 532–539.
7. Yang, Y.; Xiao, S.-P.; Wang, X.-S. Radar Detection of Small Target in Sea Clutter Using Orthogonal Projection. IEEE Geosci. Remote Sens. Lett. 2019, 16, 382–386.
8. Karlsen, B.; Larsen, J.; Sorensen, H.; Jakobsen, K.B. Comparison of PCA and ICA based clutter reduction in GPR systems for anti-personnel landmine detection. In Proceedings of the 11th IEEE Signal Processing Workshop on Statistical Signal Processing (Cat. No.01TH8563), Singapore, 8 August 2001.
9. Song, X.; Xiang, D.; Zhou, K.; Su, Y. Improving RPCA-Based Clutter Suppression in GPR Detection of Antipersonnel Mines. IEEE Geosci. Remote Sens. Lett. 2017, 14, 1338–1342.
10. Wen, L.; Zhong, C.; Huang, X.; Ding, J. Sea Clutter Suppression Based on Selective Reconstruction of Features. In Proceedings of the 2019 6th Asia-Pacific Conference on Synthetic Aperture Radar (APSAR), Xiamen, China, 26–29 November 2019; pp. 1–6.
11. Guo, S.; Zhang, Q.; Shao, Y.; Chen, W. Sea Clutter and Target Detection with Deep Neural Networks. In DEStech Transactions on Computer Science and Engineering; DEStech Publishing Inc.: Lancaster, PA, USA, 2017.
12. Zhang, Q.; Shao, Y.; Guo, S.; Sun, L.; Chen, W. A Novel Method for Sea Clutter Suppression and Target Detection via Deep Convolutional Autoencoder. Int. J. Signal Process. 2017, 2, 35–40.
13. Geng, J.; He, J.; Ye, H.; Zhan, B. A Clutter Suppression Method Based on LSTM Network for Ground Penetrating Radar. Appl. Sci. 2022, 12, 6457.
14. Ni, Z.-K.; Shi, C.; Pan, J.; Zheng, Z.; Ye, S.; Fang, G. Declutter-GAN: GPR B-Scan Data Clutter Removal Using Conditional Generative Adversarial Nets. IEEE Geosci. Remote Sens. Lett. 2022, 19, 4023105.
15. Mou, X.; Chen, X.; Guan, J.; Dong, Y.; Liu, N. Sea Clutter Suppression for Radar PPI Images Based on SCS-GAN. IEEE Geosci. Remote Sens. Lett. 2021, 18, 1886–1890.
16. Weinberg, G.V. Constant false alarm rate detectors for Pareto clutter models. IET Radar Sonar Navig. 2013, 7, 153–163.
17. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1097–1105.
18. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444.
19. LeCun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324.
20. Krichen, M. Generative Adversarial Networks. In Proceedings of the 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), Delhi, India, 6–8 July 2023; pp. 1–7.
21. Zhu, J.-Y.; Park, T.; Isola, P.; Efros, A.A. Unpaired Image-to-Image Translation Using Cycle-Consistent Adversarial Networks. In Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 2242–2251.
22. Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional Networks for Biomedical Image Segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, 5–9 October 2015; Springer International Publishing: Cham, Switzerland, 2015.
23. Panda, S.L.; Sahoo, U.K.; Maiti, S.; Sasmal, P. An Attention U-Net-Based Improved Clutter Suppression in GPR Images. IEEE Trans. Instrum. Meas. 2024, 73, 8502511.
24. Pei, J.; Yang, Y.; Wu, Z.; Ma, Y.; Huo, W.; Zhang, Y.; Huang, Y.; Yang, J. A Sea Clutter Suppression Method Based on Machine Learning Approach for Marine Surveillance Radar. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 3120–3130.
25. Wang, Y.; Jiang, Z.; Li, Y.; Hwang, J.-N.; Xing, G.; Liu, H. RODNet: A Real-Time Radar Object Detection Network Cross-Supervised by Camera-Radar Fused Object 3D Localization. IEEE J. Sel. Top. Signal Process. 2021, 15, 954–967.
26. Flir Systems. Available online: http://www.flir.com (accessed on 8 August 2024).
27. Texas Instruments. Available online: http://www.ti.com (accessed on 8 August 2024).
28. Loshchilov, I.; Hutter, F. Decoupled Weight Decay Regularization. In Proceedings of the International Conference on Learning Representations (ICLR), New Orleans, LA, USA, 6–9 May 2019.
29. Ide, H.; Kurita, T. Improvement of learning for CNN with ReLU activation by sparse regularization. In Proceedings of the 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, AK, USA, 14–19 May 2017; pp. 2684–2691.
30. Xie, S.; Girshick, R.; Dollár, P.; Tu, Z.; He, K. Aggregated Residual Transformations for Deep Neural Networks. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 5987–5995.
31. Gao, S.; Cheng, M.M.; Zhao, K.; Zhang, X.Y.; Yang, M.H.; Torr, P. Res2Net: A New Multi-Scale Backbone Architecture. IEEE Trans. Pattern Anal. Mach. Intell. 2019, 43, 652–662.
32. Oktay, O.; Schlemper, J.; Folgoc, L.L.; Lee, M.; Heinrich, M.; Misawa, K.; Mori, K.; McDonagh, S.; Hammerla, N.Y.; Kainz, B.; et al. Attention U-Net: Learning Where to Look for the Pancreas. arXiv 2018, arXiv:1804.03999.
33. Poon, M.W.; Khan, R.H.; Le-Ngoc, S. A singular value decomposition (SVD) based method for suppressing ocean clutter in high frequency radar. IEEE Trans. Signal Process. 1993, 41, 1421–1425.
34. Isola, P.; Zhu, J.-Y.; Zhou, T.; Efros, A.A. Image-to-Image Translation with Conditional Adversarial Networks. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 5967–5976.
| Category | Method | Key Contributions | Limitations |
|---|---|---|---|
| Traditional methods | MTI [5], MTD [6] | | Inability to adapt to real-time dynamic scenarios. |
| | SVD [7], PCA [8], RPCA [9] | | |
| Deep learning methods | Li et al. [10], Guo et al. [11], Zhang et al. [12], Geng et al. [13], Wen et al. [10] | | Requirement for corresponding labeled data. |
| | Declutter-GAN [14], SCS-GAN [15] | | Dependence on additional specified conditions. |
| Camera | Values | Radar | Values |
|---|---|---|---|
| Frame rate | 30 FPS | Frame rate | 30 FPS |
| Pixels | 1440 × 1080 | Frequency | 77 GHz |
| Resolution | 1.6 MegaPixels | # of transmitters | 2 |
| Field of view | 93.6° | # of receivers | 4 |
| Stereo baseline | 0.35 m | # of chirps per frame | 255 |
| | | Range resolution | 0.23 m |
| | | Azimuth resolution | 15° |
| Data | Number |
|---|---|
| Training set A | 23,352 |
| Training set B | 8400 |
| Test | 4180 |
| Evaluation Metrics | Resnet_6blocks | Resnet_9blocks | Unet_128 | Unet_256 |
|---|---|---|---|---|
| PSNR (↑) | 27.289 | 28.299 | 34.294 | 38.802 |
| SSIM (↑) | 0.957 | 0.970 | 0.978 | 0.987 |
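For reference, the PSNR and SSIM values reported in this and the following tables can be computed as sketched below with their standard definitions; the 8-bit data range and the SSIM window parameters follow common library defaults and are assumptions here.

```python
# Sketch of the evaluation metrics: peak signal-to-noise ratio and structural
# similarity between a reference image and a clutter-suppressed output.
import numpy as np
from skimage.metrics import structural_similarity

def psnr(reference: np.ndarray, test: np.ndarray, data_range: float = 255.0) -> float:
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float(10.0 * np.log10(data_range ** 2 / mse))

def ssim(reference: np.ndarray, test: np.ndarray, data_range: float = 255.0) -> float:
    return float(structural_similarity(reference, test, data_range=data_range))
```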
| No. | CycleGAN Loss | Target Consistency Loss | PSNR (↑), Unet_256 |
|---|---|---|---|
| 1 | √ | × | 26.013 |
| 2 | √ | √ | 38.802 |
| No. | CycleGAN Loss | Target Consistency Loss | SSIM (↑), Unet_256 |
|---|---|---|---|
| 1 | √ | × | 0.944 |
| 2 | √ | √ | 0.987 |
| PSNR (↑): μ \ λ | 10 | 7 | 5 | 2 |
|---|---|---|---|---|
| 0.5 | 38.802 | 27.344 | 25.9945 | 25.763 |
| 1 | 39.846 | 38.742 | 38.220 | 34.085 |
| 1.5 | 39.409 | 39.208 | 39.156 | 38.943 |
| 2.0 | 39.227 | 39.026 | 38.951 | 38.734 |
| SSIM (↑): μ \ λ | 10 | 7 | 5 | 2 |
|---|---|---|---|---|
| 0.5 | 0.987 | 0.962 | 0.934 | 0.934 |
| 1 | 0.990 | 0.989 | 0.980 | 0.977 |
| 1.5 | 0.989 | 0.984 | 0.981 | 0.978 |
| 2.0 | 0.988 | 0.986 | 0.978 | 0.972 |
| Method | SVD | RPCA | Pix2Pix | CycleGAN | Ours |
|---|---|---|---|---|---|
| PSNR (↑) | 25.58 | 28.83 | 33.758 | 34.294 | 39.846 |
| SSIM (↑) | 0.874 | 0.905 | 0.969 | 0.978 | 0.990 |
| Time (s) | 0.0102 | 2.0839 | 0.493 | 0.008 | 0.017 |