A Fast Registration Method for Optical and SAR Images Based on SRAWG Feature Description
Abstract
1. Introduction
- A 3D dense feature description based on SRAWG is proposed and combined with a non-maximum suppression template search to further improve registration accuracy. Single-scale Sobel and ROEWA operators are used to compute consistent gradients for the optical and SAR images, respectively; these suppress speckle noise better than differential gradient operators. The gradient of each pixel is projected onto its adjacent quantized gradient directions by angle weighting, and the weighted gradient magnitudes in the 3 × 3 neighborhood of each pixel are fused to form its feature description, so SRAWG describes the structural features of optical and SAR images more accurately. A non-maximum suppression step is introduced to address the multi-peak phenomenon of the search surfaces produced by frequency-domain template search methods (illustrative sketches of the descriptor construction and the template search follow this list).
- To maintain high registration efficiency, we simplify the original multi-scale Sobel and ROEWA operators to a single scale for gradient computation. To avoid repeatedly constructing features for overlapping templates during template matching, we merge overlapping templates and analyze the efficiency gain this brings.
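The sketch below illustrates the angle-weighted, per-pixel oriented-gradient idea described above, written in Python with NumPy/SciPy. It is a minimal sketch under stated assumptions: Sobel gradients stand in for both modalities (the SAR side would use ROEWA, which is not reproduced here), orientations are folded to [0, π) to tolerate intensity inversion, each magnitude is split between the two adjacent orientation bins, and the fusion over the 3 × 3 neighborhood uses a simple box filter. The function name and bin count are illustrative; the paper's exact weighting and normalization may differ.

```python
import numpy as np
from scipy import ndimage

def angle_weighted_descriptor(gx, gy, n_orient=8, fuse_size=3):
    """Per-pixel oriented-gradient descriptor with angle-weighted binning.

    Each pixel's gradient magnitude is split between its two adjacent
    quantized orientation bins in proportion to angular distance, and the
    weighted magnitudes are fused over a small neighborhood.
    """
    mag = np.hypot(gx, gy)
    # Fold orientations to [0, pi) so gradient reversals caused by
    # optical/SAR intensity inversion fall into the same bin (an assumption).
    theta = np.mod(np.arctan2(gy, gx), np.pi)
    bin_width = np.pi / n_orient
    pos = theta / bin_width                      # continuous bin position
    lo = np.floor(pos).astype(int) % n_orient    # lower adjacent bin
    hi = (lo + 1) % n_orient                     # upper adjacent bin
    w_hi = pos - np.floor(pos)                   # angular weight to upper bin
    w_lo = 1.0 - w_hi

    h, w = mag.shape
    desc = np.zeros((n_orient, h, w), dtype=np.float32)
    rows, cols = np.indices((h, w))
    desc[lo, rows, cols] += w_lo * mag
    desc[hi, rows, cols] += w_hi * mag

    # Fuse the weighted magnitudes over the 3x3 neighborhood of every pixel.
    desc = ndimage.uniform_filter(desc, size=(1, fuse_size, fuse_size))

    # Per-pixel L2 normalization of the orientation vector.
    norm = np.linalg.norm(desc, axis=0, keepdims=True)
    return desc / np.maximum(norm, 1e-8)

# Example with Sobel gradients on an optical image.
optical = np.random.rand(256, 256).astype(np.float32)
gx = ndimage.sobel(optical, axis=1)
gy = ndimage.sobel(optical, axis=0)
features = angle_weighted_descriptor(gx, gy)   # shape (8, 256, 256)
```

Stacking the k orientation channels in this way yields a 3D dense feature map that can then be compared by template matching.

A second sketch shows a frequency-domain template search over such channel-stacked descriptors, with a simple non-maximum suppression step on the SSD search surface. This is again an assumption-laden illustration: the similarity measure is SSD computed via FFT-based correlation, and the peak-selection rule is a plain "highest surviving peak" placeholder rather than the paper's exact criterion (e.g., the ratio test controlled by Rt and T). The function names are hypothetical.

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import maximum_filter

def ssd_search_surface(template, search):
    """SSD between a multi-channel template (C, th, tw) and every offset of a
    larger search region (C, sh, sw), computed in the frequency domain.

    Returns an array of shape (sh - th + 1, sw - tw + 1).
    """
    c, th, tw = template.shape
    ones = np.ones((th, tw))
    out_shape = (search.shape[1] - th + 1, search.shape[2] - tw + 1)
    cross = np.zeros(out_shape)
    win_sq = np.zeros(out_shape)
    for ch in range(c):
        # Cross-correlation via FFT: flip the kernel so convolution
        # becomes correlation.
        cross += fftconvolve(search[ch], template[ch, ::-1, ::-1], mode="valid")
        # Windowed sum of squares of the search region.
        win_sq += fftconvolve(search[ch] ** 2, ones, mode="valid")
    return win_sq - 2.0 * cross + np.sum(template ** 2)

def best_offset_with_nms(ssd, nms_size=5):
    """Select the strongest peak of the negated SSD surface after
    non-maximum suppression, so secondary peaks do not confuse the search."""
    score = -ssd
    peaks = (score == maximum_filter(score, size=nms_size))
    idx = np.argmax(np.where(peaks, score, -np.inf))
    return np.unravel_index(idx, score.shape)
```

In use, `ssd_search_surface` would be evaluated once per template (or once per merged group of overlapping templates, in the spirit of the efficiency measure above), and `best_offset_with_nms` returns the matched offset within the search window.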
2. Method
2.1. Proposed Fast Registration Method
2.2. Implementation Details of Each Step
2.2.1. Feature Point Extraction
- The detector relies on the differential gradient of the image, so applying it directly to SAR images often produces many false corner points.
- Without spatial location constraints, the detected corner points are unevenly distributed across the image, which degrades the final registration accuracy (a block-wise extraction sketch addressing this follows this list).
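As a concrete illustration of a grid-constrained alternative, the sketch below computes a Harris corner response and keeps the strongest response in each block so that points cover the image evenly. The block size, the choice of Harris, and the one-point-per-block rule are illustrative assumptions rather than the paper's exact settings; in an optical-to-SAR template-matching pipeline the points would typically be taken from the optical image to sidestep the false corners caused by speckle.

```python
import numpy as np
from scipy import ndimage

def blockwise_harris_points(img, block=100, k=0.04, sigma=1.0):
    """Harris corner response with a grid constraint: keep the strongest
    response in each block so points are spread evenly over the image."""
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    ixx = ndimage.gaussian_filter(gx * gx, sigma)
    iyy = ndimage.gaussian_filter(gy * gy, sigma)
    ixy = ndimage.gaussian_filter(gx * gy, sigma)
    # Harris response: det(M) - k * trace(M)^2.
    response = ixx * iyy - ixy ** 2 - k * (ixx + iyy) ** 2

    points = []
    h, w = img.shape
    for r0 in range(0, h, block):
        for c0 in range(0, w, block):
            patch = response[r0:r0 + block, c0:c0 + block]
            r, c = np.unravel_index(np.argmax(patch), patch.shape)
            points.append((r0 + r, c0 + c))
    return np.asarray(points)
```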
2.2.2. 3D Dense Feature Descriptions Based on SRAWG
2.2.3. Feature Matching
3. Results
3.1. Effectiveness of Consistent Gradient Computation for Optical and SAR Images
3.2. Comparison of SSD Search Surfaces with Different Feature Descriptions
3.3. Overlapping Template Merging Performance Analysis
3.4. Image Registration Experiments
3.4.1. Experimental Data and Parameter Settings
3.4.2. Evaluation Criteria
3.4.3. Results and Analysis
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Kulkarni, S.C.; Rege, P.P. Pixel level fusion techniques for SAR and optical images: A review. Inf. Fusion 2020, 59, 13–29. [Google Scholar] [CrossRef]
- Yu, Q.; Ni, D.; Jiang, Y.; Yan, Y.; An, J.; Sun, T. Universal SAR and optical image registration via a novel SIFT framework based on nonlinear diffusion and a polar spatial-frequency descriptor. ISPRS J. Photogramm. Remote Sens. 2021, 171, 1–17. [Google Scholar] [CrossRef]
- Wurm, M.; Stark, T.; Zhu, X.X.; Weigand, M.; Taubenböck, H. Semantic segmentation of slums in satellite images using transfer learning on fully convolutional neural networks. ISPRS J. Photogramm. Remote Sens. 2019, 150, 59–69. [Google Scholar] [CrossRef]
- Jia, L.; Li, M.; Wu, Y.; Zhang, P.; Liu, G.; Chen, H.; An, L. SAR Image Change Detection Based on Iterative Label-Information Composite Kernel Supervised by Anisotropic Texture. IEEE Trans. Geosci. Remote Sens. 2015, 53, 3960–3973. [Google Scholar] [CrossRef]
- Wan, L.; Zhang, T.; You, H. Multi-sensor remote sensing image change detection based on sorted histograms. Int. J. Remote Sens. 2018, 39, 3753–3775. [Google Scholar] [CrossRef]
- Hartmann, W.; Havlena, M.; Schindler, K. Recent developments in large-scale tie-point matching. ISPRS J. Photogramm. Remote Sens. 2016, 115, 47–62. [Google Scholar] [CrossRef]
- Xiang, Y.; Wang, F.; You, H. OS-SIFT: A robust SIFT-like algorithm for high-resolution optical-to-SAR image registration in suburban areas. IEEE Trans. Geosci. Remote Sens. 2018, 56, 3078–3090. [Google Scholar] [CrossRef]
- Zhang, X.; Leng, C.; Hong, Y.; Pei, Z.; Cheng, I.; Anup, B. Multimodal Remote Sensing Image Registration Methods and Advancements: A Survey. Remote Sens. 2021, 13, 5128. [Google Scholar] [CrossRef]
- Zhao, F.; Huang, Q.; Gao, W. Image Matching by Normalized Cross-Correlation. In Proceedings of the 2006 IEEE International Conference on Acoustics Speech and Signal Processing Proceedings, Toulouse, France, 14–19 May 2006. [Google Scholar] [CrossRef]
- Shi, W.; Su, F.; Wang, R.; Fan, J. A visual circle based image registration algorithm for optical and SAR imagery. In Proceedings of the 2012 IEEE International Geoscience and Remote Sensing Symposium, Munich, Germany, 22–27 July 2012. [Google Scholar] [CrossRef]
- Suri, S.; Reinartz, P. Mutual-Information-Based Registration of TerraSAR-X and Ikonos Imagery in Urban Areas. IEEE Trans. Geosci. Remote Sens. 2010, 48, 939–949. [Google Scholar] [CrossRef]
- Siddique, M.A.; Sarfraz, S.M.; Bornemann, D.; Hellwich, O. Automatic registration of SAR and optical images based on mutual information assisted Monte Carlo. In Proceedings of the 2012 IEEE International Geoscience and Remote Sensing Symposium, Munich, Germany, 22–27 July 2012. [Google Scholar] [CrossRef]
- Zhuang, Y.; Gao, K.; Miu, X.; Han, L.; Gong, X. Infrared and visual image registration based on mutual information with a combined particle swarm optimization-Powell search algorithm. Optik Int. J. Light Electron Opt. 2016, 127, 188–191. [Google Scholar] [CrossRef]
- Yu, L.; Zhang, D.; Holden, E.J. A fast and fully automatic registration approach based on point features for multi-source remote-sensing images. Comput. Geosci. 2008, 34, 838–848. [Google Scholar] [CrossRef]
- Shi, X.; Jiang, J. Automatic registration method for optical remote sensing images with large background variations using line segments. Remote Sens. 2016, 8, 426. [Google Scholar] [CrossRef] [Green Version]
- Li, H.; Manjunath, B.; Mitra, S.K. A contour-based approach to multisensor image registration. IEEE Trans. Image Process. 1995, 4, 320–334. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Dai, X.; Khorram, S. A feature-based image registration algorithm using improved chain-code representation combined with invariant moments. IEEE Trans. Geosci. Remote Sens. 1999, 37, 2351–2362. [Google Scholar] [CrossRef] [Green Version]
- Wu, Y.; Ma, W.; Gong, M.; Su, L.; Jiao, L. A Novel Point-Matching Algorithm Based on Fast Sample Consensus for Image Registration. IEEE Geosci. Remote Sens. Lett. 2015, 12, 43–47. [Google Scholar] [CrossRef]
- Wu, B.; Zhou, S.; Ji, K. A novel method of corner detector for SAR images based on Bilateral Filter. In Proceedings of the 2016 IEEE International Geoscience and Remote Sensing Symposium, Beijing, China, 10–15 July 2016. [Google Scholar] [CrossRef]
- Sedaghat, A.; Ebadi, H. Remote Sensing Image Matching Based on Adaptive Binning SIFT Descriptor. IEEE Trans. Geosci. Remote Sens. 2015, 53, 5283–5293. [Google Scholar] [CrossRef]
- Liao, S.; Chung, A. Nonrigid Brain MR Image Registration Using Uniform Spherical Region Descriptor. IEEE Trans. Image Process. 2012, 21, 157–169. [Google Scholar] [CrossRef] [Green Version]
- Lindeberg, T. Feature detection with automatic scale selection. Int. J. Comput. Vis. 1998, 30, 79–116. [Google Scholar] [CrossRef]
- Fan, B.; Huo, C.; Pan, C.; Kong, Q. Registration of Optical and SAR Satellite Images by Exploring the Spatial Relationship of the Improved SIFT. IEEE Geosci. Remote Sens. Lett. 2013, 10, 657–661. [Google Scholar] [CrossRef] [Green Version]
- Dellinger, F.; Delon, J.; Gousseau, Y.; Michel, J.; Tupin, F. SAR-SIFT: A SIFT-like algorithm for SAR images. IEEE Trans. Geosci. Remote Sens. 2014, 53, 453–466. [Google Scholar] [CrossRef] [Green Version]
- Gong, M.; Zhao, S.; Jiao, L.; Tian, D.; Wang, S. A Novel Coarse-to-Fine Scheme for Automatic Image Registration Based on SIFT and Mutual Information. IEEE Trans. Geosci. Remote Sens. 2014, 52, 4328–4338. [Google Scholar] [CrossRef]
- Xu, C.; Sui, H.; Li, H.; Liu, J. An automatic optical and SAR image registration method with iterative level set segmentation and SIFT. Int. J. Remote Sens. 2015, 36, 3997–4017. [Google Scholar] [CrossRef]
- Ma, W.; Wen, Z.; Wu, Y.; Jiao, L.; Gong, M.; Zheng, Y.; Liu, L. Remote Sensing Image Registration With Modified SIFT and Enhanced Feature Matching. IEEE Geosci. Remote Sens. Lett. 2017, 14, 3–7. [Google Scholar] [CrossRef]
- Ma, W.; Wu, Y.; Liu, S.; Su, Q.; Zhong, Y. Remote Sensing Image Registration Based on Phase Congruency Feature Detection and Spatial Constraint Matching. IEEE Access 2018, 6, 77554–77567. [Google Scholar] [CrossRef]
- Fan, J.; Wu, Y.; Li, M.; Liang, W.; Cao, Y. SAR and optical image registration using nonlinear diffusion and phase congruency structural descriptor. IEEE Trans. Geosci. Remote Sens. 2018, 56, 5368–5379. [Google Scholar] [CrossRef]
- Liu, X.; Ai, Y.; Zhang, J.; Wang, Z. A Novel Affine and Contrast Invariant Descriptor for Infrared and Visible Image Registration. Remote Sens. 2018, 10, 658. [Google Scholar] [CrossRef]
- Li, J.; Hu, Q.; Ai, M. RIFT: Multi-Modal Image Matching Based on Radiation-Variation Insensitive Feature Transform. IEEE Trans. Image Process. 2020, 29, 3296–3310. [Google Scholar] [CrossRef]
- Ye, Y.; Shan, J.; Bruzzone, L.; Shen, L. Robust Registration of Multimodal Remote Sensing Images Based on Structural Similarity. IEEE Trans. Geosci. Remote Sens. 2017, 55, 2941–2958. [Google Scholar] [CrossRef]
- Ye, Y.; Bruzzone, L.; Shan, J.; Bovolo, F.; Zhu, Q. Fast and robust matching for multimodal remote sensing image registration. IEEE Trans. Geosci. Remote Sens. 2019, 57, 9059–9070. [Google Scholar] [CrossRef] [Green Version]
- Li, S.; Lv, X.; Ren, J.; Li, J. A Robust 3D Density Descriptor Based on Histogram of Oriented Primary Edge Structure for SAR and Optical Image Co-Registration. Remote Sens. 2022, 14, 630. [Google Scholar] [CrossRef]
- Ye, F.; Su, Y.; Xiao, H.; Zhao, X.; Min, W. Remote Sensing Image Registration Using Convolutional Neural Network Features. IEEE Geosci. Remote Sens. Lett. 2018, 15, 232–236. [Google Scholar] [CrossRef]
- Ma, W.; Zhang, J.; Wu, Y.; Jiao, L.; Zhu, H.; Zhao, W. A Novel Two-Step Registration Method for Remote Sensing Images Based on Deep and Local Features. IEEE Trans. Geosci. Remote Sens. 2019, 57, 4834–4843. [Google Scholar] [CrossRef]
- Li, Z.; Zhang, H.; Huang, Y. A Rotation-Invariant Optical and SAR Image Registration Algorithm Based on Deep and Gaussian Features. Remote Sens. 2021, 13, 2628. [Google Scholar] [CrossRef]
- Merkle, N.; Auer, S.; Muller, R.; Reinartz, P. Exploring the Potential of Conditional Adversarial Networks for Optical and SAR Image Matching. IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens. 2018, 11, 1811–1820. [Google Scholar] [CrossRef]
- Zhang, H.; Ni, W.; Yan, W.; Xiang, D.; Wu, J.; Yang, X.; Bian, H. Registration of multimodal remote sensing image based on deep fully convolutional neural network. IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens. 2019, 12, 3028–3042. [Google Scholar] [CrossRef]
- Zhang, H.; Lei, L.; Ni, W.; Tang, T.; Wu, J.; Xiang, D.; Kuang, G. Optical and SAR Image Matching Using Pixelwise Deep Dense Features. IEEE Geosci. Remote Sens. Lett. 2020, 19, 6000705. [Google Scholar] [CrossRef]
- Quan, D.; Wang, S.; Liang, X.; Wang, R.; Fang, S.; Hou, B.; Jiao, L. Deep generative matching network for optical and SAR image registration. In Proceedings of the 2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018. [Google Scholar] [CrossRef]
- Bürgmann, T.; Koppe, W.; Schmitt, M. Matching of TerraSAR-X derived ground control points to optical image patches using deep learning. ISPRS J. Photogramm. Remote Sens. 2019, 158, 241–248. [Google Scholar] [CrossRef]
- Cui, S.; Ma, A.; Zhang, L.; Xu, M.; Zhong, Y. MAP-Net: SAR and Optical Image Matching via Image-Based Convolutional Network with Attention Mechanism and Spatial Pyramid Aggregated Pooling. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–13. [Google Scholar] [CrossRef]
- Hughes, L.H.; Schmitt, M. A semi-supervised approach to SAR-optical image matching. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 4, 1–8. [Google Scholar] [CrossRef] [Green Version]
- Jia, H. Research on Automatic Registration of Optical and SAR Images. Master’s Thesis, Chang’an University, Xi’an, China, 2020. [Google Scholar]
- Fjørtoft, R.; Lopes, A.; Marthon, P.; Cubero-Castan, E. An optimal multiedge detector for SAR image segmentation. IEEE Trans. Geosci. Remote Sens. 1998, 36, 793–802. [Google Scholar] [CrossRef] [Green Version]
- Wei, Q.; Feng, D. An efficient SAR edge detector with a lower false positive rate. Int. J. Remote Sens. 2015, 36, 3773–3797. [Google Scholar] [CrossRef]
- Fan, Z.; Zhang, L.; Wang, Q.; Liu, S.; Ye, Y. A fast matching method of SAR and optical images using angular weighted orientated gradients. Acta Geod. Cartogr. Sin. 2021, 50, 1390–1403. [Google Scholar] [CrossRef]
- Ye, Y.; Bruzzone, L.; Shan, J.; Shen, L. Fast and Robust Structure-based Multimodal Geospatial Image Matching. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium, Fort Worth, TX, USA, 23–28 July 2017. [Google Scholar] [CrossRef]
Case | Reference Image | Sensed Image | Image Characteristic
---|---|---|---
1 | Sensor: Google Earth; Original Resolution: 5 m; Central Incidence Angle: 0°; Size: 1000 × 1000; GSD: 5 m | Sensor: GF-3; Original Resolution: 10 m; Central Incidence Angle: 24.71°; Size: 1000 × 1000; GSD: 5 m | The image covers a river region on the outskirts of a city, and the terrain is relatively flat. The edge of the river is clear in the SAR image, and the speckle noise is relatively weak. However, there are significant nonlinear intensity differences compared with the optical image.
2 | Sensor: Google Earth; Original Resolution: 1 m; Central Incidence Angle: 0°; Size: 1000 × 1000; GSD: 3 m | Sensor: GF-3; Original Resolution: 3 m; Central Incidence Angle: 43.28°; Size: 1000 × 1000; GSD: 3 m | The image covers an urban region with tall buildings and has local geometric distortions. There is significant speckle noise in the SAR image.
3 | Sensor: Google Earth; Original Resolution: 5 m; Central Incidence Angle: 0°; Size: 1500 × 1500; GSD: 10 m | Sensor: GF-3; Original Resolution: 10 m; Central Incidence Angle: 20.23°; Size: 1500 × 1500; GSD: 10 m | The image covers an airport region, which has clear geometry in both images. There is a clear intensity inversion between the SAR image and the optical image, and significant speckle noise in the SAR image.
4 | Sensor: Google Earth; Original Resolution: 1 m; Central Incidence Angle: 0°; Size: 1500 × 1500; GSD: 1 m | Sensor: Airborne SAR; Original Resolution: 0.5 m; Central Incidence Angle: 81.24°; Size: 1500 × 1500; GSD: 1 m | The image covers suburban regions with complex structures such as buildings, reservoirs, and farmland. In the SAR image, some shadow areas are generated by the large incidence angle, and the image shows some defocusing.
5 | Sensor: Google Earth; Original Resolution: 1 m; Central Incidence Angle: 0°; Size: 1500 × 1500; GSD: 1 m | Sensor: Airborne SAR; Original Resolution: 1 m; Central Incidence Angle: 80.89°; Size: 1500 × 1500; GSD: 1 m | The image covers a large region of rural farmland with relatively flat terrain. Compared with the optical image, there is a significant intensity inversion in the SAR image.
6 | Sensor: Google Earth; Original Resolution: 5 m; Central Incidence Angle: 0°; Size: 2570 × 1600; GSD: 10 m | Sensor: Sentinel-1; Original Resolution: 10 m; Central Incidence Angle: 38.94°; Size: 2570 × 1600; GSD: 10 m | The image covers the outskirts of a city, which are flat and contain complex features such as buildings, farmland, lakes, and road networks.
7 | Sensor: Google Earth; Original Resolution: 5 m; Central Incidence Angle: 0°; Size: 1274 × 1073; GSD: 10 m | Sensor: Sentinel-1; Original Resolution: 10 m; Central Incidence Angle: 39.13°; Size: 1274 × 1073; GSD: 10 m | The image covers a hilly area around a river. The terrain in this area undulates greatly, and the geometric difference between the optical image and the SAR image is large.
Methods | Parameter Space |
---|---|
SRAWG | n = 5, k = 8, N = 9, σ = 0.8, α_n = 2, β_m = 2, R_t = 0.9, T = 1/0.9, N_s = 1% of the number of template pixels, Template size: 100 × 100, Radius of search region: 20 pixels
CFOG | n = 5, k = 8, N = 9, σ = 0.8, Template size: 100 × 100, Radius of search region: 20 pixels
HOPC | n = 5, k = 8, N = 9, Template size: 100 × 100, Radius of search region: 20 pixels
HOPES | n = 5, k = 8, N = 9, σ = 0.8, N_scale = 3, σ_MSG = 1.4, k_MSG = 1.4, γ_MSG = 6, R_t = 0.9, T = 1/0.9, N_s = 1% of the number of template pixels, Template size: 100 × 100, Radius of search region: 20 pixels
Case | Criteria | SRAWG | HOPES | CFOG | HOPC | RIFT
---|---|---|---|---|---|---
1 | NCM | 177 | 178 | 156 | 167 | 147
1 | CMR | 90.31 | 91.28 | 79.59 | 84.34 | 51.22
1 | RMSE | 0.9257 | 1.0170 | 1.2618 | 1.4919 | 1.7086
1 | Run Time | 5.5 | 16.4 | 6.1 | 43.68 | 12.99
2 | NCM | 171 | 139 | 159 | 145 | 92
2 | CMR | 86.36 | 71.65 | 80.3 | 73.23 | 49.73
2 | RMSE | 1.0413 | 1.4455 | 1.4143 | 1.4632 | 1.8324
2 | Run Time | 5.6 | 16.5 | 6.2 | 43.01 | 12.74
3 | NCM | 166 | 152 | 118 | 106 | 14
3 | CMR | 90.71 | 85.88 | 67.05 | 66.67 | 28
3 | RMSE | 0.9109 | 1.0575 | 1.5540 | 1.5148 | 2.5998
3 | Run Time | 8.3 | 16.8 | 6.2 | 48.59 | 16.39
4 | NCM | 112 | 96 | 114 | 103 | 13
4 | CMR | 68.71 | 68.57 | 65.9 | 60.95 | 15.85
4 | RMSE | 1.9126 | 1.7773 | 2.0346 | 2.1014 | 4.5432
4 | Run Time | 8.2 | 16.6 | 6.3 | 48.86 | 17.21
5 | NCM | 92 | 89 | 62 | 99 | 39
5 | CMR | 92.93 | 85.58 | 59.62 | 68.28 | 41.05
5 | RMSE | 0.9936 | 1.1819 | 1.6501 | 1.5363 | 1.8114
5 | Run Time | 7.8 | 16.6 | 6.2 | 48.64 | 16.93
6 | NCM | 183 | 175 | 163 | 171 | 34
6 | CMR | 95.31 | 91.15 | 89.07 | 88.14 | 38.2
6 | RMSE | 0.7376 | 0.9189 | 1.0536 | 1.3901 | 2.0999
6 | Run Time | 9.70 | 16.92 | 6.34 | 53.41 | 22.23
7 | NCM | 72 | 66 | 51 | 42 | 15
7 | CMR | 60 | 45.52 | 58.62 | 34.71 | 25
7 | RMSE | 1.8506 | 2.2699 | 2.0861 | 2.2457 | 2.5719
7 | Run Time | 6.38 | 15.48 | 5.66 | 48.09 | 14.75