Improved Piecewise Linear Transformation for Precise Warping of Very-High-Resolution Remote Sensing Images
Abstract
1. Introduction
2. Methodology
2.1. Piecewise Linear (PL) Transformation
- Determine the triangulation of the CPs in one image using the Delaunay triangulation method. The triangulation of the CPs in the other image is then obtained directly from the corresponding CPs.
- For each pair of corresponding triangles, determine the affine transformation T that registers the two triangles. In this way, the triangular regions inside the convex hulls of the CPs in the two images are registered.
- Determine the transformation for points outside the convex hull by extending the planes of the boundary triangles and using these extended planes to extrapolate the points between the convex hull and the image border. The planes of two neighboring boundary triangles intersect in a line; the projection of this line onto the image plane partitions the region outside the convex hull and determines which triangle plane each outside point is assigned to during extrapolation (a minimal sketch of the triangle-wise mapping follows this list).
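The following is a minimal Python sketch of the triangle-wise affine mapping described in the steps above, assuming the corresponding CPs of the sensed and reference images are given as N×2 arrays; the function name and parameters are illustrative and not taken from the paper. The sketch covers only points inside the convex hull of the CPs; the boundary-plane extrapolation of the last step (or the pseudo-CPs of Section 2.2) is needed for the remaining points.

```python
import numpy as np
from scipy.spatial import Delaunay

def piecewise_linear_map(cps_sensed, cps_ref, points):
    """Triangle-wise affine (piecewise linear) mapping of `points` from the
    sensed-image frame to the reference-image frame, valid inside the convex
    hull of the CPs. Illustrative sketch, not the authors' implementation."""
    cps_sensed = np.asarray(cps_sensed, dtype=float)
    cps_ref = np.asarray(cps_ref, dtype=float)
    points = np.asarray(points, dtype=float)

    tri = Delaunay(cps_sensed)              # Delaunay triangulation of CPs in the sensed image
    simplex = tri.find_simplex(points)      # triangle index per point (-1 outside the convex hull)
    inside = simplex >= 0

    # Barycentric coordinates of each inside point with respect to its triangle
    T = tri.transform[simplex[inside]]                  # (n, 3, 2) affine pieces from SciPy
    bary = np.einsum('nij,nj->ni', T[:, :2], points[inside] - T[:, 2])
    weights = np.c_[bary, 1.0 - bary.sum(axis=1)]       # (n, 3) barycentric weights

    # Applying the same weights to the corresponding reference-image vertices
    # realizes the affine transformation of each triangle pair.
    verts_ref = cps_ref[tri.simplices[simplex[inside]]]  # (n, 3, 2)
    mapped = np.full_like(points, np.nan)
    mapped[inside] = np.einsum('ni,nij->nj', weights, verts_ref)
    return mapped, inside
```

Because the mapping is continuous across shared triangle edges, neighboring triangles warp without seams, which is what makes the piecewise linear model attractive for locally distorted very-high-resolution imagery.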
2.2. Improved Piecewise Linear (IPL) Transformation
Algorithm 1 Improved Piecewise Linear Transformation
Input: CPs; N (number of pseudo-CPs); K (K ≥ 3)
for each n = 1, …, N do
  Define the n-th pseudo-CP along the boundary of the sensed image at a regular interval
  Find the K CPs closest to the position of the pseudo-CP
  Estimate an affine transformation with these K CPs by the least-squares method
  Define the corresponding position of the pseudo-CP in the reference image using the estimated transformation
end for
Output: the CPs augmented with the N pseudo-CPs
Construct a piecewise linear transformation through the augmented CPs
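Below is a minimal Python sketch of the pseudo-CP augmentation in Algorithm 1, under stated assumptions: the boundary spacing (here `n_per_side`) and `k` are illustrative parameters, and the local affine fit uses ordinary least squares as the algorithm specifies. The augmented CP set can then be passed to the piecewise linear construction of Section 2.1 so that the triangulation covers the full image extent.

```python
import numpy as np

def add_pseudo_cps(cps_sensed, cps_ref, image_shape, n_per_side=10, k=6):
    """Augment the CPs with pseudo-CPs placed at a regular interval along the
    sensed-image border; each pseudo-CP is mapped to the reference image by an
    affine transformation fitted (least squares) to its K nearest CPs."""
    cps_sensed = np.asarray(cps_sensed, dtype=float)
    cps_ref = np.asarray(cps_ref, dtype=float)
    h, w = image_shape

    # Evenly spaced pseudo-CP positions along the four borders of the sensed image
    xs = np.linspace(0.0, w - 1.0, n_per_side)
    ys = np.linspace(0.0, h - 1.0, n_per_side)
    border = np.unique(np.vstack([
        np.c_[xs, np.zeros_like(xs)], np.c_[xs, np.full_like(xs, h - 1.0)],
        np.c_[np.zeros_like(ys), ys], np.c_[np.full_like(ys, w - 1.0), ys],
    ]), axis=0)

    pseudo_ref = []
    for p in border:
        # K CPs closest to the pseudo-CP position
        idx = np.argsort(np.linalg.norm(cps_sensed - p, axis=1))[:k]
        src, dst = cps_sensed[idx], cps_ref[idx]
        # Least-squares affine fit: [x, y, 1] @ A ≈ [X, Y]
        A, *_ = np.linalg.lstsq(np.c_[src, np.ones(len(idx))], dst, rcond=None)
        pseudo_ref.append(np.append(p, 1.0) @ A)

    return (np.vstack([cps_sensed, border]),
            np.vstack([cps_ref, np.asarray(pseudo_ref)]))
```

With the pseudo-CPs added, the Delaunay triangulation extends to the image borders, so points outside the original convex hull are handled by interpolation that is locally constrained by the nearest CPs rather than by global extrapolation.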
3. Experimental Results
3.1. Dataset Construction
3.2. Results with Simulated Dataset
3.3. Results with Real Dataset
4. Discussion
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Dataset | Site 1 (Seoul) Reference | Site 1 (Seoul) Sensed | Site 2 (Gwangju) Reference | Site 2 (Gwangju) Sensed | Site 3 (Daejeon) Reference | Site 3 (Daejeon) Sensed
---|---|---|---|---|---|---
Sensor | WorldView-3 Multi. | WorldView-3 Multi. | WorldView-3 Multi. | WorldView-3 Multi. | Kompsat-3A Pan. | Kompsat-3A Pan.
Acquisition date | 2015-02-12 | 2018-02-05 | 2017-05-26 | 2018-05-04 | 2015-10-28 | 2019-01-02
Off-nadir angle | 14.9° | 37.9° | 21.6° | 28.0° | 24.1° | 20.9°
Azimuth angle | 234.3° | 140.9° | 180.4° | 133.3° | 166.1° | 187.8°
Spatial resolution | 1.2 m | 1.6 m | 1.2 m | 1.2 m | 0.55 m | 0.55 m
Radiometric resolution | 11 bit | 11 bit | 11 bit | 11 bit | 14 bit | 14 bit
Processing level | Level 2A | Level 2A | Level 2A | Level 2A | Level 1G | Level 1G
Transformation Method | Number of CPs | Correlation Coefficient |
---|---|---
Without co-registration | – | 0.319 |
Affine | 1654 | 0.957 |
Third polynomial | 1654 | 0.962 |
Fourth polynomial | 1654 | 0.961 |
Local weighted mean | 1654 | 0.632 |
Piecewise linear | 1654 | 0.953 |
Improved piecewise linear | 1670 | 0.975 |
Transformation Model | Site 1 (Seoul): Number of CPs | Site 1 (Seoul): Correlation Coefficient | Site 2 (Gwangju): Number of CPs | Site 2 (Gwangju): Correlation Coefficient | Site 3 (Daejeon): Number of CPs | Site 3 (Daejeon): Correlation Coefficient
---|---|---|---|---|---|---
Affine | 50 | 0.266 | 1102 | 0.632 | 84 | 0.502
Third polynomial | 50 | 0.272 | 1102 | 0.641 | 84 | 0.495
Fourth polynomial | 50 | 0.251 | 1102 | 0.641 | 84 | 0.474
Local weighted mean | 50 | 0.258 | 1102 | 0.659 | 84 | 0.483
Piecewise linear | 50 | 0.254 | 1102 | 0.671 | 84 | 0.371
Improved piecewise linear | 66 | 0.270 | 1118 | 0.675 | 100 | 0.523
Transformation Model | Correlation Coefficient: Site 1 (Seoul) | Correlation Coefficient: Site 2 (Gwangju) | Correlation Coefficient: Site 3 (Daejeon)
---|---|---|---
Piecewise linear | 0.259 | 0.603 | 0.180
Improved piecewise linear | 0.304 | 0.657 | 0.338
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).