A Deep-Learning Method for Radar Micro-Doppler Spectrogram Restoration
Abstract
1. Introduction
2. Radar Spectrogram Restoration Using Deep Learning
2.1. Algorithm Overview
2.2. Fully Convolutional Network for Interference Localization
2.3. Generative Adversarial Network for Spectrogram Restoration
3. Experiment Implementation
3.1. Simulated Radar Dataset
3.2. Measured Radar Dataset
3.3. Evaluation Metrics
3.4. Training Details
4. Experimental Results
- Zeroing [11], a simple and well-known approach, is treated as a baseline in the experiments. It mitigates interference by setting the interfered time-domain samples to zero, and it assumes that the positions of the interference in the time domain are known a priori.
- FCNs [14] uses fully convolutional networks to remove interference from FMCW signals and outputs the corresponding clean range profiles. This approach focuses solely on interference mitigation and does not consider how to restore the interfered signals with good quality.
- ResNet [13] adopts a deep residual network for interference mitigation in synthetic aperture radar (SAR). Specifically, an interference detection network and an interference mitigation network are proposed to remove the interference and to restore clean SAR images, respectively.
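The zeroing baseline above is simple enough to sketch directly. The following is a minimal illustration, not the authors' implementation: the interference positions are assumed known (as the baseline requires), the corrupted samples are set to zero, and a micro-Doppler spectrogram is then formed with a short-time Fourier transform. The window length, hop size, and signal model are illustrative choices, not values from the paper.

```python
import numpy as np

def zeroing_baseline(signal, interference_mask):
    """Zeroing [11]: set the time-domain samples known to contain
    interference to zero. The mask of interfered samples is assumed
    to be known a priori, as the baseline requires."""
    cleaned = signal.copy()
    cleaned[interference_mask] = 0.0
    return cleaned

def spectrogram(signal, win_len=64, hop=16):
    """Magnitude micro-Doppler spectrogram via a Hann-windowed STFT
    (parameters are illustrative, not from the paper)."""
    window = np.hanning(win_len)
    frames = [signal[i:i + win_len] * window
              for i in range(0, len(signal) - win_len + 1, hop)]
    # rows: Doppler bins, columns: time frames
    return np.abs(np.fft.fft(np.asarray(frames), axis=1)).T

# Toy example: a single Doppler tone corrupted by a strong burst.
rng = np.random.default_rng(0)
n = 1024
t = np.arange(n) / n
clean = np.exp(2j * np.pi * 100 * t)
mask = np.zeros(n, dtype=bool)
mask[400:450] = True                      # known interference positions
corrupted = clean + 20 * rng.standard_normal(n) * mask
restored = zeroing_baseline(corrupted, mask)
```

Note that zeroing removes the interference energy but also discards the target signal inside the zeroed interval, which is why it leaves gaps in the spectrogram that the learning-based methods attempt to fill.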
4.1. Results of the Simulated Data
4.1.1. Qualitative Evaluation
4.1.2. Quantitative Evaluation
4.2. Results of the Measured Data
4.2.1. Qualitative Evaluation
4.2.2. Quantitative Evaluation
4.3. Performance Comparison on the Human Activity Recognition Task
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
Abbreviations
| Abbreviation | Full Form |
|---|---|
| HAR | Human Activity Recognition |
| MOCAP | Motion Capture |
| MD | Micro-Doppler |
| IMAT | Iterative Method with Adaptive Thresholding |
| RFmin | Ramp Filtering |
| DL | Deep-Learning |
| GRU | Gated Recurrent Unit |
| CS | Chirp Sequence |
| GAN | Generative Adversarial Network |
| AWGN | Additive White Gaussian Noise |
| PSNR | Peak Signal-to-Noise Ratio |
| SSIM | Structural Similarity |
| MSE | Mean Square Error |
| STFT | Short-Time Fourier Transform |
| MTI | Moving Target Indicator |
| FCNs | Fully Convolutional Neural Networks |
| ResNet | Residual Network |
| SAR | Synthetic Aperture Radar |
References
1. Amin, M.G.; Zhang, Y.D.; Ahmad, F.; Ho, K.C.D. Radar Signal Processing for Elderly Fall Detection: The future for in-home monitoring. IEEE Signal Process. Mag. 2016, 33, 71–80.
2. Fioranelli, F.; Kernec, J.L.; Shah, S.A. Radar for Health Care: Recognizing Human Activities and Monitoring Vital Signs. IEEE Potentials 2019, 38, 16–23.
3. Li, X.; He, Y.; Jing, X. A Survey of Deep Learning-Based Human Activity Recognition in Radar. Remote Sens. 2019, 11, 1068.
4. Du, H.; Jin, T.; He, Y.; Song, Y.; Dai, Y. Segmented convolutional gated recurrent neural networks for human activity recognition in ultra-wideband radar. Neurocomputing 2019, 396, 451–464.
5. He, Y.; Li, X.; Jing, X. A Multiscale Residual Attention Network for Multitask Learning of Human Activity Using Radar Micro-Doppler Signatures. Remote Sens. 2019, 11, 2584.
6. Erol, B.; Gurbuz, S.Z.; Amin, M.G. Motion Classification using Kinematically Sifted ACGAN-Synthesized Radar Micro-Doppler Signatures. IEEE Trans. Aerosp. Electron. Syst. 2020, 56, 3197–3213.
7. Yang, Y.; Hou, C.; Lang, Y.; Guan, D.; Huang, D.; Xu, J. Open-set human activity recognition based on micro-Doppler signatures. Pattern Recognit. 2019, 85, 60–69.
8. Yang, Y.; Hou, C.; Lang, Y.; Sakamoto, T.; He, Y.; Xiang, W. Omnidirectional Motion Classification with Monostatic Radar System Using Micro-Doppler Signatures. IEEE Trans. Geosci. Remote Sens. 2019, 58, 3574–3587.
9. Rock, J.; Toth, M.; Messner, E.; Meissner, P.; Pernkopf, F. Complex Signal Denoising and Interference Mitigation for Automotive Radar Using Convolutional Neural Networks. In Proceedings of the 2019 22nd International Conference on Information Fusion (FUSION), Ottawa, ON, Canada, 2–5 July 2019; pp. 1–8.
10. Toth, M.; Meissner, P.; Melzer, A.; Witrisal, K. Performance comparison of mutual automotive radar interference mitigation algorithms. In Proceedings of the 2019 IEEE Radar Conference (RadarConf), Boston, MA, USA, 22–26 April 2019; pp. 1–6.
11. Tao, M.; Zhou, F.; Zhang, Z. Wideband interference mitigation in high-resolution airborne synthetic aperture radar data. IEEE Trans. Geosci. Remote Sens. 2015, 54, 74–87.
12. Huang, D.; Hou, C.; Yang, Y.; Lang, Y.; Wang, Q. Micro-Doppler spectrogram denoising based on generative adversarial network. In Proceedings of the 2018 48th European Microwave Conference (EuMC), Madrid, Spain, 23–27 September 2018; pp. 909–912.
13. Fan, W.; Zhou, F.; Tao, M.; Bai, X.; Rong, P.; Yang, S.; Tian, T. Interference Mitigation for Synthetic Aperture Radar Based on Deep Residual Network. Remote Sens. 2019, 11, 1654.
14. Ristea, N.C.; Anghel, A.; Ionescu, R.T. Fully Convolutional Neural Networks for Automotive Radar Interference Mitigation. arXiv 2020, arXiv:2007.11102.
15. Dai, J.; Li, Y.; He, K.; Sun, J. R-FCN: Object Detection via Region-based Fully Convolutional Networks. In Proceedings of the Advances in Neural Information Processing Systems, Barcelona, Spain, 5–10 December 2016; pp. 379–387.
16. He, D.; Yang, X.; Liang, C.; Zhou, Z.; Ororbia, A.G.; Kifer, D.; Lee Giles, C. Multi-scale FCN with cascaded instance aware segmentation for arbitrary oriented word spotting in the wild. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 3519–3528.
17. Zhou, X.; Takayama, R.; Wang, S.; Hara, T.; Fujita, H. Deep learning of the sectional appearances of 3D CT images for anatomical structure segmentation based on an FCN voting method. Med. Phys. 2017, 44, 5221–5233.
18. Roth, H.R.; Oda, H.; Zhou, X.; Shimizu, N.; Yang, Y.; Hayashi, Y.; Oda, M.; Fujiwara, M.; Misawa, K.; Mori, K. An application of cascaded 3D fully convolutional networks for medical image segmentation. Comput. Med. Imaging Graph. 2018, 66, 90–99.
19. Teimouri, N.; Dyrmann, M.; Jorgensen, R.N. A Novel Spatio-Temporal FCN-LSTM Network for Recognizing Various Crop Types Using Multi-Temporal Radar Images. Remote Sens. 2019, 11, 990.
20. Zhang, Z.; Guo, W.; Yu, W.; Yu, W. Multi-task fully convolutional networks for building segmentation on SAR image. J. Eng. 2019, 2019, 7074–7077.
21. Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
22. Yu, J.; Lin, Z.; Yang, J.; Shen, X.; Lu, X.; Huang, T.S. Generative image inpainting with contextual attention. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 5505–5514.
23. Gulrajani, I.; Ahmed, F.; Arjovsky, M.; Dumoulin, V.; Courville, A.C. Improved training of Wasserstein GANs. In Proceedings of the Advances in Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; pp. 5767–5777.
24. Creswell, A.; White, T.; Dumoulin, V.; Arulkumaran, K.; Sengupta, B.; Bharath, A.A. Generative adversarial networks: An overview. IEEE Signal Process. Mag. 2018, 35, 53–65.
25. Andoni, A.; Indyk, P.; Krauthgamer, R. Earth mover distance over high-dimensional spaces. In Proceedings of the Symposium on Discrete Algorithms, San Francisco, CA, USA, 20–22 January 2008; pp. 343–352.
26. Ntouskos, V.; Papadakis, P.; Pirri, F. A Comprehensive Analysis of Human Motion Capture Data for Action Recognition. In Proceedings of the International Conference on Computer Vision Theory and Applications, Rome, Italy, 24–26 February 2012; pp. 647–652.
27. Salimans, T.; Goodfellow, I.; Zaremba, W.; Cheung, V.; Radford, A.; Chen, X. Improved techniques for training GANs. In Proceedings of the Advances in Neural Information Processing Systems, Barcelona, Spain, 5–10 December 2016; pp. 2234–2242.
28. Deng, J.; Dong, W.; Socher, R.; Li, L.; Li, K.; Feifei, L. ImageNet: A large-scale hierarchical image database. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; pp. 248–255.
29. Abadi, M.; Barham, P.; Chen, J.; Chen, Z.; Davis, A.; Dean, J.; Devin, M.; Ghemawat, S.; Irving, G.; Isard, M.; et al. TensorFlow: A system for large-scale machine learning. In Proceedings of the 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16), Savannah, GA, USA, 2–4 November 2016; pp. 265–283.
30. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. In Proceedings of the Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–6 December 2012; pp. 1097–1105.
|  | Zeroing | FCNs | ResNet | Ours |
|---|---|---|---|---|
| PSNR (dB) | 35.210 | 58.935 | 63.364 | 65.714 |
| SSIM | 0.720 | 0.887 | 0.926 | 0.930 |
|  | Zeroing | FCNs | ResNet | Ours |
|---|---|---|---|---|
| PSNR (dB) | 39.053 | 50.273 | 51.249 | 51.714 |
| SSIM | 0.767 | 0.812 | 0.822 | 0.864 |
|  |  | Zeroing | FCNs | ResNet | Ours |
|---|---|---|---|---|---|
| Walking | PSNR (dB) | 48.557 | 57.397 | 56.465 | 56.805 |
|  | SSIM | 0.883 | 0.879 | 0.850 | 0.906 |
| Running | PSNR (dB) | 33.087 | 49.025 | 48.676 | 47.279 |
|  | SSIM | 0.687 | 0.769 | 0.758 | 0.808 |
| Jumping | PSNR (dB) | 34.904 | 47.947 | 46.741 | 49.039 |
|  | SSIM | 0.695 | 0.772 | 0.757 | 0.836 |
| Boxing | PSNR (dB) | 45.541 | 55.961 | 54.483 | 56.179 |
|  | SSIM | 0.886 | 0.908 | 0.903 | 0.912 |
| Circle Running | PSNR (dB) | 33.177 | 45.000 | 45.917 | 49.267 |
|  | SSIM | 0.685 | 0.784 | 0.794 | 0.856 |
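For reference, the two restoration metrics reported above can be sketched as follows. PSNR is the standard peak signal-to-noise ratio in dB; the SSIM shown here is a simplified single-window (global-statistics) variant for illustration, not the sliding-window implementation a paper would typically report, so its values will differ slightly from library implementations such as scikit-image.

```python
import numpy as np

def psnr(reference, restored, peak=1.0):
    """Peak signal-to-noise ratio in dB between two spectrograms
    normalized to the range [0, peak]."""
    mse = np.mean((reference - restored) ** 2)  # mean squared error
    return 10.0 * np.log10(peak ** 2 / mse)

def ssim_global(x, y, peak=1.0):
    """Simplified SSIM computed from global image statistics
    (single window; illustrative, not the windowed SSIM)."""
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2  # stabilizers
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

# Tiny demo on a synthetic "spectrogram" with a uniform offset error.
ref = np.linspace(0.0, 1.0, 100).reshape(10, 10)
noisy = ref + 0.01
psnr_db = psnr(ref, noisy)
ssim_val = ssim_global(ref, noisy)
```

Higher is better for both metrics: PSNR is unbounded above (infinite for identical images), while SSIM is at most 1.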
|  | Zeroing | FCNs | ResNet | Ours |
|---|---|---|---|---|
| Simulated Data | 0.855 | 0.864 | 0.866 | 0.947 |
| Measured Data | 0.805 | 0.821 | 0.819 | 0.915 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
He, Y.; Li, X.; Li, R.; Wang, J.; Jing, X. A Deep-Learning Method for Radar Micro-Doppler Spectrogram Restoration. Sensors 2020, 20, 5007. https://doi.org/10.3390/s20175007