Edge Consistency Feature Extraction Method for Multi-Source Image Registration
Abstract
1. Introduction
- To capture the similarity of geometric structures and morphological features, phase congruency is used to extract image edges, while noise removal and texture weakening suppress the differences between multi-source remote sensing images.
- To reduce computational cost, the phase congruency model is constructed with Log-Gabor filters in only two orthogonal directions.
- To make the descriptors more reliable and richer in descriptive information, sector descriptors are built on the edge consistency features.
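As a rough illustration of the second point, the sketch below builds Log-Gabor filters at only two orthogonal orientations (0° and 90°) and combines their responses into a simplified phase-congruency edge map. This is a minimal sketch of the general technique, not the paper's EC-RIFT pipeline: Kovesi's noise compensation is omitted, and all function names and parameter values (`f0`, `sigma_r`, `sigma_theta`, three scales) are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def log_gabor(h, w, f0, theta0, sigma_r=0.55, sigma_theta=0.4):
    """One frequency-domain Log-Gabor filter: a Gaussian on log radial
    frequency times an angular Gaussian centered on orientation theta0."""
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    rho = np.hypot(fx, fy)
    rho[0, 0] = 1.0                      # dodge log(0); DC is zeroed below
    radial = np.exp(-np.log(rho / f0) ** 2 / (2 * np.log(sigma_r) ** 2))
    radial[0, 0] = 0.0
    theta = np.arctan2(fy, fx)
    d = np.arctan2(np.sin(theta - theta0), np.cos(theta - theta0))
    angular = np.exp(-d ** 2 / (2 * sigma_theta ** 2))
    return radial * angular

def orthogonal_pc(img, n_scales=3, f0=0.1, mult=2.0, eps=1e-4):
    """Simplified phase congruency over two orthogonal orientations:
    local energy divided by total amplitude, summed over orientations."""
    F = np.fft.fft2(img.astype(float))
    num = np.zeros(img.shape)
    den = np.full(img.shape, eps)
    for theta0 in (0.0, np.pi / 2):      # the two orthogonal directions
        even = np.zeros(img.shape)
        odd = np.zeros(img.shape)
        amp = np.zeros(img.shape)
        for s in range(n_scales):
            g = log_gabor(*img.shape, f0 * mult ** s, theta0)
            resp = np.fft.ifft2(F * g)   # real part: even filter; imag: odd
            even += resp.real
            odd += resp.imag
            amp += np.abs(resp)
        num += np.hypot(even, odd)       # local energy at this orientation
        den += amp
    return num / den                     # in [0, 1]; high values mark edges
```

On a step-edge test image, the resulting map peaks along the edge while staying low in flat regions; using two orientations instead of the usual six roughly halves the filtering work, which is the motivation for the OLG variant.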
2. Materials and Methods
2.1. Multi-Source Image Preprocessing
2.1.1. Non-Local Mean Filtering
2.1.2. Co-Occurrence Filtering
2.2. Edge Feature Detection
2.3. Sector Descriptor Construction
2.4. Feature Matching and Outlier Removal
3. Experiments and Results
3.1. Datasets
3.2. Evaluation Criterion
3.3. Registration Results and Analysis
3.3.1. Comparison Experiment of Preprocessing Algorithms
3.3.2. Comparison Experiment of OLG and LG
3.3.3. Comparison Experiment of Square Descriptor and Sector Descriptor
3.3.4. Comparative Results on SAR-Optical Data
3.4. Comparative Results on Multi-Source Data
4. Discussion
4.1. The Effect of Noises on MIM
4.2. Fine-Registration and Considerable Difference
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| SAR | Synthetic Aperture Radar |
| EC-RIFT | Edge Consistency Radiation-variation Insensitive Feature Transform |
| NLM | Non-local Mean |
| CoF | Co-occurrence Filter |
| OLG | Orthogonal Log-Gabor |
| RMSE | Root Mean Square Error |
| Symbol | Explanation | Symbol | Explanation |
|---|---|---|---|
| FI | input of filter | A | amplitude |
| FO | output of filter | φ | phase |
| w | weight | s | scale |
| N | number | o | orientation |
| ρ | polar radius | σ | bandwidth |
| θ | polar angle | | |
| Data | Method | RMSE/px | Running Time/s |
|---|---|---|---|
| Pair 6 | HAPCG | 0.88 | 2.8 |
| | OS-SIFT | × ¹ | × |
| | LNIFT | 1.39 | 15.45 |
| | RIFT | 0.99 | 2.26 |
| | EC-RIFT | 0.95 | 1.99 |
| Pair 7 | HAPCG | 1.59 | 2.56 |
| | OS-SIFT | × | × |
| | LNIFT | 1.22 | 12.61 |
| | RIFT | 1.11 | 1.79 |
| | EC-RIFT | 0.92 | 1.76 |
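The RMSE/px reported in these tables is the usual root-mean-square residual of check points after applying the estimated transform. A minimal sketch, assuming a 3×3 homogeneous transform `H` and ground-truth point correspondences (the function name and inputs are illustrative, not taken from the paper):

```python
import numpy as np

def registration_rmse(H, pts_src, pts_ref):
    """Root mean square error, in pixels, of a 3x3 homogeneous transform H
    evaluated on ground-truth point pairs pts_src -> pts_ref, each (N, 2)."""
    pts_src = np.asarray(pts_src, float)
    pts_ref = np.asarray(pts_ref, float)
    proj = np.hstack([pts_src, np.ones((len(pts_src), 1))]) @ H.T
    proj = proj[:, :2] / proj[:, 2:3]          # back from homogeneous coords
    return float(np.sqrt(np.mean(np.sum((proj - pts_ref) ** 2, axis=1))))

# Example: an exactly estimated translation gives RMSE 0, while the
# identity transform on the same pairs leaves the full shift as residual.
src = np.array([[10.0, 20.0], [40.0, 5.0], [7.0, 33.0]])
ref = src + np.array([3.0, 4.0])               # shifted by (3, 4): norm 5 px
H_good = np.array([[1, 0, 3], [0, 1, 4], [0, 0, 1]], float)
print(registration_rmse(H_good, src, ref))     # 0.0
print(registration_rmse(np.eye(3), src, ref))  # 5.0
```

Values below about 1–1.5 px, as in the tables, indicate sub-pixel to near-pixel alignment; the × entries mark cases where a method produced no usable transform at all.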
| Data | Filter | Detection Time/s | RMSE/px |
|---|---|---|---|
| Pair 1 | LG | 0.99 | 1.41 |
| | OLG | 0.58 | 1.35 |
| Pair 2 | LG | 3.20 | 1.36 |
| | OLG | 1.81 | 1.32 |
| Pair 3 | LG | 1.50 | 1.32 |
| | OLG | 1.04 | 1.27 |
| Pair 4 | LG | 0.99 | 1.34 |
| | OLG | 0.65 | 1.33 |
| Pair 5 | LG | 0.23 | 1.40 |
| | OLG | 0.15 | 1.29 |
| Data | Method | RMSE/px | Running Time/s |
|---|---|---|---|
| Pair 1 | HAPCG | 1.95 | 9.98 |
| | OS-SIFT | 1.38 | 9.15 |
| | LNIFT | 1.35 | 54.64 |
| | RIFT | 1.37 | 10.45 |
| | EC-RIFT | 1.35 | 8.81 |
| Pair 2 | HAPCG | 1.95 | 26.13 |
| | OS-SIFT | × ¹ | × |
| | LNIFT | × | × |
| | RIFT | 1.34 | 14.26 |
| | EC-RIFT | 1.19 | 11.01 |
| Pair 3 | HAPCG | 1.93 | 13.64 |
| | OS-SIFT | NAN ² | NAN |
| | LNIFT | 1.29 | 62.64 |
| | RIFT | 1.32 | 6.41 |
| | EC-RIFT | 1.28 | 5.06 |
| Pair 4 | HAPCG | 1.97 | 7.98 |
| | OS-SIFT | × | × |
| | LNIFT | × | × |
| | RIFT | 1.35 | 9.36 |
| | EC-RIFT | 1.24 | 9.27 |
| Pair 5 | HAPCG | 1.76 | 2.12 |
| | OS-SIFT | NAN | NAN |
| | LNIFT | 1.33 | 10.01 |
| | RIFT | 1.36 | 3.95 |
| | EC-RIFT | 1.31 | 2.39 |
| Data | Method | RMSE/px | Running Time/s |
|---|---|---|---|
| Infrared–optical | HAPCG | 1.65 | 18.66 |
| | OS-SIFT | 1.16 | 16.71 |
| | LNIFT | 1.29 | 64.73 |
| | RIFT | 1.16 | 14.70 |
| | EC-RIFT | 1.16 | 13.13 |
| Day–night | HAPCG | 1.83 | 9.79 |
| | OS-SIFT | × ¹ | × |
| | LNIFT | × | × |
| | RIFT | 1.27 | 12.37 |
| | EC-RIFT | 1.26 | 10.68 |
| Depth–optical | HAPCG | 1.95 | 8.90 |
| | OS-SIFT | 1.41 | 8.80 |
| | LNIFT | 1.36 | 55.06 |
| | RIFT | 1.36 | 11.98 |
| | EC-RIFT | 1.33 | 11.50 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhou, Y.; Han, Z.; Dou, Z.; Huang, C.; Cong, L.; Lv, N.; Chen, C. Edge Consistency Feature Extraction Method for Multi-Source Image Registration. Remote Sens. 2023, 15, 5051. https://doi.org/10.3390/rs15205051