Article

OSNet: An Edge Enhancement Network for a Joint Application of SAR and Optical Images

1 School of Automation, Nanjing University of Information Science and Technology, Nanjing 210044, China
2 Jiangsu Collaborative Innovation Center of Atmospheric Environment and Equipment Technology (CICAEET), Nanjing University of Information Science and Technology, Nanjing 210044, China
3 Faculty of Geosciences, Utrecht University, 3584 CS Utrecht, The Netherlands
4 Department of Computer Science, University of Reading, Whiteknights, Reading RG6 6DH, UK
* Author to whom correspondence should be addressed.
Remote Sens. 2025, 17(3), 505; https://doi.org/10.3390/rs17030505
Submission received: 31 October 2024 / Revised: 29 January 2025 / Accepted: 30 January 2025 / Published: 31 January 2025

Abstract

The combined use of synthetic aperture radar (SAR) and optical images for surface observation is gaining increasing attention. Optical images, with their distinct edge features, can accurately classify different objects, while SAR images reveal deeper internal variations. To address the challenge of differing feature distributions across multi-source images, we propose an edge enhancement network, OSNet (network for optical and SAR images), designed to jointly extract features from optical and SAR images and to strengthen edge feature representation. OSNet consists of three core modules: a dual-branch backbone, a synergistic attention integration module, and a global-guided local fusion module, which handle, respectively, modality-independent feature extraction, feature sharing, and global-local feature fusion. In the backbone, we introduce a differentiable Lee filter and a Laplacian edge detection operator in the SAR branch to suppress speckle noise and enhance edge features. Additionally, we design a multi-source attention fusion module to facilitate cross-modal information exchange between the two branches. We validated OSNet on a segmentation task (WHU-OPT-SAR) and a regression task (SNOW-OPT-SAR). The results show that OSNet improved pixel accuracy (PA) and mean intersection over union (MIoU) by 2.31% and 2.58%, respectively, in the segmentation task, and reduced mean absolute error (MAE) and root mean square error (RMSE) by 3.14% and 4.22%, respectively, in the regression task.
Keywords: multimodal neural networks; multi-source fusion; attention mechanism
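The SAR-branch preprocessing described in the abstract, speckle suppression with a Lee filter followed by Laplacian edge extraction, can be illustrated with a minimal NumPy sketch. This is an independent illustration of the two classical operators, not the paper's differentiable in-network implementation; the window size, noise-variance value, and edge weighting below are arbitrary choices for the toy example.

```python
import numpy as np

def lee_filter(img, size=3, noise_var=0.25):
    """Classic Lee speckle filter: blend each pixel toward the local
    mean, with a gain that grows where local variance dominates the
    assumed noise variance (i.e., near real structure)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = padded[i:i + size, j:j + size]
            mean, var = win.mean(), win.var()
            k = var / (var + noise_var)          # adaptive gain in [0, 1)
            out[i, j] = mean + k * (img[i, j] - mean)
    return out

def laplacian_edges(img):
    """4-neighbour Laplacian; the response is large at intensity edges."""
    kern = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=float)
    padded = np.pad(img, 1, mode="reflect")
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (padded[i:i + 3, j:j + 3] * kern).sum()
    return out

# Toy SAR-like patch: a flat region with a bright stripe on the right.
sar = np.zeros((8, 8))
sar[:, 4:] = 1.0

smooth = lee_filter(sar)                  # speckle-suppressed image
edges = laplacian_edges(smooth)           # edge response map
enhanced = smooth + 0.5 * np.abs(edges)   # edge-enhanced SAR-branch input
```

In OSNet itself these operations are stated to be differentiable layers inside the SAR branch, so they can participate in end-to-end training; the sketch above only conveys the signal-processing intent of the two operators.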

Share and Cite

MDPI and ACS Style

Ma, K.; Hu, K.; Chen, J.; Jiang, M.; Xu, Y.; Xia, M.; Weng, L. OSNet: An Edge Enhancement Network for a Joint Application of SAR and Optical Images. Remote Sens. 2025, 17, 505. https://doi.org/10.3390/rs17030505

AMA Style

Ma K, Hu K, Chen J, Jiang M, Xu Y, Xia M, Weng L. OSNet: An Edge Enhancement Network for a Joint Application of SAR and Optical Images. Remote Sensing. 2025; 17(3):505. https://doi.org/10.3390/rs17030505

Chicago/Turabian Style

Ma, Keyu, Kai Hu, Junyu Chen, Ming Jiang, Yao Xu, Min Xia, and Liguo Weng. 2025. "OSNet: An Edge Enhancement Network for a Joint Application of SAR and Optical Images" Remote Sensing 17, no. 3: 505. https://doi.org/10.3390/rs17030505

APA Style

Ma, K., Hu, K., Chen, J., Jiang, M., Xu, Y., Xia, M., & Weng, L. (2025). OSNet: An Edge Enhancement Network for a Joint Application of SAR and Optical Images. Remote Sensing, 17(3), 505. https://doi.org/10.3390/rs17030505

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
