Generation of the NIR Spectral Band for Satellite Images with Convolutional Neural Networks
Abstract
1. Introduction
- We propose a feature-engineering approach based on NIR channel generation via conditional GANs (cGANs).
- We investigate the impact of artificially generated and real NIR data on model performance in a satellite image segmentation task. We also examine the NIR channel's contribution to reducing the labeled dataset size with minimal quality loss, and consider the NIR channel's role in cross-satellite (cross-domain) stability.
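As background for the contributions above: much of the practical value of a NIR band, whether measured by the sensor or produced by a generator, comes from vegetation indices such as NDVI = (NIR − Red) / (NIR + Red). A minimal sketch (NumPy; the array names and toy values are illustrative, not from the paper):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Works the same whether `nir` comes from the sensor or from a cGAN;
    `eps` guards against division by zero over dark (water/shadow) pixels.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 patch: vegetation reflects strongly in NIR, weakly in red.
nir = np.array([[0.6, 0.6], [0.1, 0.1]])
red = np.array([[0.2, 0.2], [0.1, 0.1]])
print(ndvi(nir, red))  # ~0.5 on the vegetated (top) pixels, 0 elsewhere
```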
2. Materials and Methods
2.1. Dataset
2.2. Artificial NIR Channel Generation
2.3. Forest Segmentation Task
2.4. NIR Channel Usage
2.5. Training Setup
3. Results and Discussion
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Satellite | MAE | RMSE | Mean Bias |
|---|---|---|---|
| WorldView | 0.09 | 0.31 | 0.058 |
| SPOT | 0.037 | 0.194 | −0.0029 |
| Planet | 0.16 | 0.41 | 0.088 |
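The per-satellite generation errors above use three standard metrics over paired real/generated NIR rasters. A hedged sketch of how they can be computed (NumPy; the variable names and toy values are illustrative, not the authors' code):

```python
import numpy as np

def generation_errors(real_nir: np.ndarray, fake_nir: np.ndarray) -> dict:
    """MAE, RMSE, and mean bias between real and generated NIR bands."""
    diff = fake_nir.astype(np.float64) - real_nir.astype(np.float64)
    return {
        "mae": float(np.mean(np.abs(diff))),
        "rmse": float(np.sqrt(np.mean(diff ** 2))),
        # Sign of the bias shows systematic over-/under-estimation.
        "mean_bias": float(np.mean(diff)),
    }

real = np.array([0.3, 0.5, 0.7])
fake = np.array([0.4, 0.5, 0.6])
print(generation_errors(real, fake))
```

Note that RMSE ≥ MAE always, and |mean bias| ≤ MAE, consistent with each row of the table.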
| Test images | U-Net: RGB | U-Net: RGB and NIR | U-Net: RGB and artificial NIR | RF: RGB | RF: RGB and NIR | RF: RGB and artificial NIR |
|---|---|---|---|---|---|---|
| SPOT | 0.954 | 0.961 | 0.96 | 0.874 | 0.892 | 0.889 |
| Planet | 0.857 | 0.939 | 0.936 | 0.815 | 0.863 | 0.861 |
| SPOT + Planet | 0.932 | 0.96 | 0.945 | 0.836 | 0.876 | 0.872 |
| Average | 0.914 | 0.953 (+0.039) | 0.947 (+0.033) | 0.841 | 0.877 (+0.036) | 0.874 (+0.033) |
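For the segmentation comparison above, the extra band (real or generated NIR) is typically stacked with RGB into a single 4-channel input before being fed to a U-Net or a per-pixel random forest. A minimal sketch of that stacking step (NumPy; shapes and names are assumptions, not the authors' pipeline):

```python
import numpy as np

def stack_rgb_nir(rgb: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Concatenate an (H, W, 3) RGB image with an (H, W) NIR band
    into one (H, W, 4) array for a 4-channel segmentation model."""
    assert rgb.ndim == 3 and rgb.shape[-1] == 3
    assert nir.shape == rgb.shape[:2]
    return np.concatenate([rgb, nir[..., None]], axis=-1)

rgb = np.zeros((256, 256, 3), dtype=np.float32)
nir = np.ones((256, 256), dtype=np.float32)  # e.g. a cGAN output tile
x = stack_rgb_nir(rgb, nir)
print(x.shape)  # (256, 256, 4)
```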
| Satellite | Bands | All Data | 1/2 | 1/3 |
|---|---|---|---|---|
| SPOT | RGB | 0.97 | 0.956 | 0.942 |
| SPOT | RGB and NIR | 0.97 | 0.963 | 0.961 |
| Planet | RGB | 0.939 | 0.933 | 0.874 |
| Planet | RGB and NIR | 0.95 | 0.942 | 0.927 |
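The 1/2 and 1/3 columns above correspond to training on a reduced fraction of the labeled data. A hedged sketch of one way to draw such fractions reproducibly (names and tile-ID scheme are illustrative, not from the paper):

```python
import numpy as np

def subsample_tiles(tile_ids: list, fraction: float, seed: int = 0) -> list:
    """Keep a random `fraction` of the labeled training tiles."""
    rng = np.random.default_rng(seed)
    n_keep = max(1, int(round(len(tile_ids) * fraction)))
    idx = rng.choice(len(tile_ids), size=n_keep, replace=False)
    return [tile_ids[i] for i in sorted(idx)]

tiles = [f"tile_{i:03d}" for i in range(90)]
half = subsample_tiles(tiles, 1 / 2)
third = subsample_tiles(tiles, 1 / 3)
print(len(half), len(third))  # 45 30
```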
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Illarionova, S.; Shadrin, D.; Trekin, A.; Ignatiev, V.; Oseledets, I. Generation of the NIR Spectral Band for Satellite Images with Convolutional Neural Networks. Sensors 2021, 21, 5646. https://doi.org/10.3390/s21165646