Refraction-Aware Structure from Motion for Airborne Bathymetry
Abstract
1. Introduction
1.1. State of the Art Review
1.1.1. Structure from Motion
1.1.2. Radiometry
1.1.3. Deep Learning
1.2. Contributions
- We implement a refraction-aware SfM (R-SfM) pipeline within the OpenSfM framework. Refraction at the air-water interface is taken into account directly in the bundle adjustment problem, leading to highly accurate reconstructions of the submerged terrain. We experimentally validate the pipeline using both real and simulated data (a minimal sketch of such a refraction-aware reprojection residual is given after this list).
- We demonstrate that a CNN pipeline combining the geometric information provided by R-SfM with radiometric information achieves accurate depth estimates, and we experimentally validate it on real-world data. The pipeline is specifically designed to balance R-SfM-based and radiometry-based estimations (a schematic example of such a fusion network is also given after this list).
- We release the R-SfM and CNN source code as open source (https://github.com/amakris/R-SfM (accessed on 5 June 2024)). In addition, we make the data available upon request for the benefit of the research community.
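To make the first contribution more concrete, the sketch below illustrates, in Python, how a reprojection residual can account for refraction at a flat air-water interface via Snell's law: the ray from the camera to a submerged point bends at the water surface, so the residual is computed against the projection of the surface crossing point rather than of the point itself. This is a minimal illustration under our own assumptions (flat water surface at a known elevation, water refractive index of about 1.34, simple pinhole camera, camera above and point below the surface); the function names are hypothetical and the snippet does not reproduce the exact formulation implemented in the OpenSfM-based R-SfM pipeline.

```python
"""Minimal sketch of a refraction-aware reprojection residual.

Assumptions (ours, for illustration): a flat water surface at known
elevation z_w, refractive indices n_air ~ 1.0 and n_water ~ 1.34, a
simple pinhole camera, the camera strictly above and the 3D point
strictly below the water surface.
"""
import numpy as np
from scipy.optimize import brentq


def surface_crossing(cam_center, point, z_w, n_air=1.0, n_water=1.34):
    """Point on the water surface where the camera->point ray refracts.

    Solves Snell's law in the vertical plane containing the camera and
    the submerged point via 1D root finding on the horizontal offset.
    """
    h_air = cam_center[2] - z_w          # camera height above the surface
    h_wat = z_w - point[2]               # depth of the point below the surface
    horiz = point[:2] - cam_center[:2]
    dist = np.linalg.norm(horiz)
    if dist < 1e-9:                      # nadir ray: no bending
        return np.array([point[0], point[1], z_w])

    def snell_mismatch(r):
        sin_a = r / np.hypot(r, h_air)                   # incidence angle (air)
        sin_w = (dist - r) / np.hypot(dist - r, h_wat)   # refraction angle (water)
        return n_air * sin_a - n_water * sin_w

    r = brentq(snell_mismatch, 0.0, dist)   # horizontal offset of the crossing
    xy = cam_center[:2] + (r / dist) * horiz
    return np.array([xy[0], xy[1], z_w])


def refraction_aware_residual(observed_px, point, R, t, focal, z_w):
    """Reprojection residual: observed pixel minus the pinhole projection
    of the refraction point on the water surface (principal point omitted)."""
    cam_center = -R.T @ t                 # camera center from x_cam = R x + t
    surf = surface_crossing(cam_center, point, z_w)
    p_cam = R @ surf + t                  # surface point in the camera frame
    proj = focal * p_cam[:2] / p_cam[2]   # simple pinhole projection
    return observed_px - proj
```

In a bundle adjustment setting, residuals of this form would be stacked over all observations and minimized jointly over camera poses and 3D points; a standard (refraction-free) residual is recovered by projecting the point itself instead of the surface crossing.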
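Likewise, the following PyTorch sketch illustrates the idea behind the second contribution: a CNN receives the RGB radiometry together with a rasterized R-SfM depth prior and regresses a per-pixel depth map, so the network can balance geometric and radiometric cues. The small encoder-decoder shown here and the name FusionDepthNet are illustrative assumptions only and do not reproduce the architecture used in the paper.

```python
"""Minimal sketch of fusing RGB radiometry with an R-SfM depth prior.

The architecture below is a hypothetical illustration of the fusion
idea, not the network described in the paper.
"""
import torch
import torch.nn as nn


class FusionDepthNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Input: 3 RGB channels + 1 channel holding the rasterized
        # R-SfM depth prior, stacked along the channel dimension.
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),   # per-pixel depth map
        )

    def forward(self, rgb, rsfm_depth):
        # rgb: (B, 3, H, W), rsfm_depth: (B, 1, H, W)
        x = torch.cat([rgb, rsfm_depth], dim=1)
        return self.decoder(self.encoder(x))


if __name__ == "__main__":
    net = FusionDepthNet()
    rgb = torch.rand(1, 3, 128, 128)
    prior = torch.rand(1, 1, 128, 128)        # hypothetical R-SfM depth prior
    print(net(rgb, prior).shape)              # -> torch.Size([1, 1, 128, 128])
```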
2. Materials and Methods
2.1. Refraction-Aware Structure from Motion
2.2. Radiometric Data Processing
2.3. Deep Learning Model
3. Results
3.1. Data Collection Platforms
3.1.1. Onshore Survey
3.1.2. Drone Platform
3.1.3. USV Platform
3.2. Dataset
3.2.1. Simulated Dataset
3.2.2. Field Dataset
3.2.3. Error Metrics
3.3. Deep Learning Model Training
3.4. Experimental Results
3.4.1. Simulated Data
3.4.2. Field Data
4. Discussion
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| | GS | RSA | RSB |
|---|---|---|---|
| Flight Altitude | 50 m | 50 m | 120 m |
| Area | 100 m × 100 m | 200 m × 150 m | 400 m × 200 m |
| Max Depth | 0–20 m | 3.5 m | 5 m |
| | GS05 | RSA | RSB |
|---|---|---|---|
| SfM | 0.22 m | 0.35 m | 0.18 m |
| R-SfM | 7 × 10⁻⁹ m | 5 × 10⁻⁵ m | 7 × 10⁻⁵ m |
| | GS00 | GS05 | GS10 | GS15 | GS20 |
|---|---|---|---|---|---|
| Min/Max Depth | −5 m/0 m | 0 m/5 m | 5 m/10 m | 10 m/15 m | 15 m/20 m |
| SfM | 7 × 10⁻⁹ m | 0.22 m | 0.36 m | 1.37 m | 1.54 m |
| R-SfM | 7 × 10⁻⁹ m | 5 × 10⁻⁵ m | 1 × 10⁻⁵ m | 3 × 10⁻⁵ m | 5 × 10⁻⁵ m |
| Method | RMSE | R² |
|---|---|---|
| CNN R-SfM and RGB | 0.36 m | 0.84 |
| CNN SfM and RGB | 0.70 m | 0.23 |
| CNN RGB only | 0.59 m | 0.56 |
| SfM (intermediate sparse reconstruction) | 2.71 m | −6.44 |
| R-SfM (intermediate sparse reconstruction) | 0.75 m | 0.48 |
| Method | RMSE | R² |
|---|---|---|
| CNN RGB + R-SfM | 0.36 m | 0.84 |
| SVM RGB + R-SfM | 0.65 m | 0.47 |
| RF RGB + R-SfM | 1.69 m | −6.18 |
| CNN RGB + SfM | 0.70 m | 0.23 |
| SVM RGB + SfM | 2.60 m | −5.51 |
| RF RGB + SfM | 2.86 m | −6.80 |
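For completeness, the RMSE and R² values in the tables above follow their standard definitions; R² becomes negative when a method performs worse than simply predicting the mean reference depth, which explains the strongly negative values for the uncorrected SfM-based variants. The snippet below is an illustrative sketch of how such metrics can be computed from co-registered predicted and reference depths; it is not the evaluation code used in the paper, and the sample values are hypothetical.

```python
import numpy as np


def rmse(pred, ref):
    """Root-mean-square error between predicted and reference depths."""
    return float(np.sqrt(np.mean((pred - ref) ** 2)))


def r_squared(pred, ref):
    """Coefficient of determination; negative when predictions are worse
    than simply using the mean reference depth."""
    ss_res = np.sum((ref - pred) ** 2)
    ss_tot = np.sum((ref - np.mean(ref)) ** 2)
    return float(1.0 - ss_res / ss_tot)


# Hypothetical depth values (in meters), for illustration only:
pred = np.array([1.2, 2.9, 4.1, 5.3])
ref = np.array([1.0, 3.0, 4.0, 5.5])
print(rmse(pred, ref), r_squared(pred, ref))   # ~0.16 m, ~0.99
```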