A Building Point Cloud Extraction Algorithm in Complex Scenes
Abstract
1. Introduction
- This paper combines the Alpha Shape algorithm with the neighborhood expansion method to compensate for the shortcomings of the region growing algorithm in the coarse extraction stage, thereby obtaining more complete building points.
- To address the misidentification of facade points near the ground, we perform mask extraction on the original point cloud rather than on the non-ground points. This yields more complete facade points within the mask polygons than extraction from the non-ground points produced by the cloth simulation filtering algorithm (a brief masking sketch follows this list).
- Even in cases where buildings are closely adjacent to trees, the proposed method can successfully separate and extract building points from tree points, thereby improving accuracy and reliability.
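As an illustration of the masking step mentioned above, the sketch below keeps every point of the original cloud whose planar projection lies inside a 2D mask polygon. It is a minimal sketch and not the authors' implementation; the function name, array shapes, and the use of matplotlib.path.Path for the point-in-polygon test are assumptions.

```python
import numpy as np
from matplotlib.path import Path  # 2D point-in-polygon test

def extract_points_in_mask(points: np.ndarray, polygon_xy: np.ndarray) -> np.ndarray:
    """Keep the points of an (N, 3) cloud whose XY projection falls inside
    a closed mask polygon given as an (M, 2) array of vertices."""
    inside = Path(polygon_xy).contains_points(points[:, :2])
    return points[inside]

# Hypothetical usage: masking the original (unfiltered) cloud retains facade
# points near the ground that a ground filter such as CSF may remove.
# original_cloud = np.loadtxt("scene.xyz")   # (N, 3); illustrative file name
# building_points = extract_points_in_mask(original_cloud, mask_polygon)
```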
2. Methods
2.1. Coarse Extraction of the Building Point Cloud
2.2. Fine Extraction of the Building Point Cloud
- (1) All possible pairs of projected points are processed in the same way. For any pair of points p_1 and p_2 in the projected point cloud S′, obtained by projecting point cloud S onto the XOY plane, the center p_0 of a circle of radius α passing through both points (i.e., |p_0p_1| = |p_0p_2| = α) is calculated based on the distance intersection method (Figure 7) [21].
- (2) The distance d between each point in S′ and p_0 is calculated. If d is less than α, the point is considered to be inside the circle; otherwise, it is deemed to be outside the circle. If, for a pair p_1 and p_2, no other points lie inside the circle, then p_1 and p_2 are defined as edge points and the segment p_1p_2 is defined as a boundary line. Edge points are collected in this way until all point pairs in S′ have been processed.
- (3) The centroid O of all edge points is calculated, together with the distance d_i from each edge point p_i to O and the unit direction vector v_i from O to p_i. λ denotes the corresponding expansion multiplier. The expanded edge point is p_i′ = O + λ·d_i·v_i.
- (4) The expanded edge points are sorted by polar angle about O and connected in order to form a closed polygon, which is used to extract the points lying within it (a code sketch of steps (1)–(4) follows this list).
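A minimal sketch of steps (1)–(4) is given below, assuming NumPy arrays for the projected points and a brute-force O(n²) pair loop; the function names and the strictness of the inside-the-circle test are illustrative choices rather than the authors' implementation.

```python
import numpy as np

def alpha_shape_edge_points(xy: np.ndarray, alpha: float):
    """Steps (1)-(2): rolling-circle test on the projected 2D points.
    For each pair closer than 2*alpha, the two circle centres of radius alpha
    through both points are built by the distance-intersection construction;
    if either circle contains no other point, the pair marks a boundary edge."""
    edge_idx = set()
    n = len(xy)
    for i in range(n):
        for j in range(i + 1, n):
            chord = xy[j] - xy[i]
            d = np.linalg.norm(chord)
            if d == 0 or d > 2 * alpha:
                continue
            mid = (xy[i] + xy[j]) / 2.0
            h = np.sqrt(alpha ** 2 - (d / 2.0) ** 2)     # offset along the perpendicular
            perp = np.array([-chord[1], chord[0]]) / d
            for center in (mid + h * perp, mid - h * perp):
                dist = np.linalg.norm(xy - center, axis=1)
                inside = dist < alpha
                inside[[i, j]] = False                   # ignore the pair itself
                if not inside.any():                     # empty circle -> edge points
                    edge_idx.update((i, j))
                    break
    return sorted(edge_idx)

def expand_and_order_edges(edge_xy: np.ndarray, lam: float) -> np.ndarray:
    """Steps (3)-(4): push edge points away from their centroid O by the
    multiplier lam and order them by polar angle to close the mask polygon."""
    centroid = edge_xy.mean(axis=0)
    expanded = centroid + lam * (edge_xy - centroid)     # O + lam * d_i * v_i
    angles = np.arctan2(expanded[:, 1] - centroid[1], expanded[:, 0] - centroid[0])
    return expanded[np.argsort(angles)]

# Hypothetical usage:
# edge_xy = xy[alpha_shape_edge_points(xy, alpha=2.0)]
# mask_polygon = expand_and_order_edges(edge_xy, lam=1.05)
```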
- (1) The K nearest neighbors of any point P in space are found using the KD-Tree nearest-neighbor search algorithm.
- (2) The Euclidean distance between each of the K nearest neighbors and P is calculated.
- (3) Neighbors whose distance to P is smaller than the set threshold are clustered into a set Q.
- (4) The above process is repeated, taking the newly added points as new query points, until the number of elements in set Q no longer increases (a brief sketch follows this list).
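A simplified, single-seed sketch of this neighborhood-expansion (Euclidean) clustering is shown below; the use of scipy.spatial.cKDTree, the function name, and the frontier-based loop are assumptions made for illustration only.

```python
import numpy as np
from scipy.spatial import cKDTree

def grow_cluster(points: np.ndarray, seed: int, k: int, dist_threshold: float) -> set:
    """Grow a cluster Q from one seed point P: repeatedly add the K nearest
    neighbours that lie closer than dist_threshold until Q stops growing."""
    tree = cKDTree(points)          # KD-Tree for nearest-neighbour search
    cluster = {seed}
    frontier = [seed]
    while frontier:                 # terminates once no new points are added
        p = frontier.pop()
        dists, neighbours = tree.query(points[p], k=k)
        for d, n in zip(np.atleast_1d(dists), np.atleast_1d(neighbours)):
            if d < dist_threshold and int(n) not in cluster:
                cluster.add(int(n))
                frontier.append(int(n))
    return cluster
```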
3. Experiment Settings
3.1. Study Areas
3.2. Parameter Settings
3.3. Evaluation Indicators
3.4. Benchmark Algorithm
4. Results
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Wang, X.; Li, P. Extraction of urban building damage using spectral, height and corner information from VHR satellite images and airborne LiDAR data. ISPRS J. Photogramm. Remote Sens. 2020, 159, 322–336. [Google Scholar] [CrossRef]
- Adamopoulos, E.; Rinaudo, F.; Ardissono, L. A critical comparison of 3D digitization techniques for heritage objects. ISPRS Int. J. Geo-Inf. 2020, 10, 10. [Google Scholar] [CrossRef]
- Xu, Y.; Stilla, U. Toward building and civil infrastructure reconstruction from point clouds: A review on data and key techniques. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 2857–2885. [Google Scholar] [CrossRef]
- Schrotter, G.; Hürzeler, C. The digital twin of the city of Zurich for urban planning. PFG–J. Photogramm. Remote Sens. Geoinf. Sci. 2020, 88, 99–112. [Google Scholar] [CrossRef]
- Tarsha Kurdi, F.; Gharineiat, Z.; Campbell, G.; Awrangjeb, M.; Dey, E.K. Automatic filtering of lidar building point cloud in case of trees associated to building roof. Remote Sens. 2022, 14, 430. [Google Scholar] [CrossRef]
- Martín-Jiménez, J.; Del Pozo, S.; Sánchez-Aparicio, M.; Lagüela, S. Multi-scale roof characterization from LiDAR data and aerial orthoimagery: Automatic computation of building photovoltaic capacity. Autom. Constr. 2020, 109, 102965. [Google Scholar] [CrossRef]
- Zou, X.; Feng, Y.; Li, H.; Zhu, J. An Adaptive Strips Method for Extraction Buildings From Light Detection and Ranging Data. IEEE Geosci. Remote Sens. Lett. 2017, 14, 1651–1655. [Google Scholar] [CrossRef]
- Huang, R.; Yang, B.; Liang, F.; Dai, W. A top-down strategy for buildings extraction from complex urban scenes using airborne LiDAR point clouds. Infrared Phys. Technol. 2018, 92, 203–218. [Google Scholar] [CrossRef]
- Hui, Z.; Li, Z.; Cheng, P.; Ziggah, Y.Y.; Fan, J.L. Building extraction from airborne lidar data based on multi-constraints graph segmentation. Remote Sens. 2021, 13, 3766. [Google Scholar] [CrossRef]
- Qin, R.; Fang, W. A hierarchical building detection method for very high resolution remotely sensed images combined with DSM using graph cut optimization. Photogramm. Eng. Remote Sens. 2014, 80, 37–47. [Google Scholar] [CrossRef]
- Acar, H.; Karsli, F.; Ozturk, M.; Dihkan, M. Automatic detection of building roofs from point clouds produced by the dense image matching technique. Int. J. Remote Sens. 2018, 40, 138–155. [Google Scholar] [CrossRef]
- Hron, V.; Halounová, L. Automatic reconstruction of roof models from building outlines and aerial image data. Acta Polytech. 2019, 59, 448–457. [Google Scholar] [CrossRef]
- Ghamisi, P.; Höfle, B.; Zhu, X.X. Hyperspectral and lidar data fusion using extinction profiles and deep convolutional neural network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 3011–3024. [Google Scholar] [CrossRef]
- Nguyen, T.H.; Daniel, S.; Gueriot, D.; Sintes, C.; Caillec, J.M.L. Unsupervised Automatic Building Extraction Using Active Contour Model on Unregistered Optical Imagery and Airborne LiDAR Data. In Proceedings of the PIA19+MRSS19—Photogrammetric Image Analysis & Munich Remote Sensing Symposium, Munich, Germany, 18–20 September 2019; Volume XLII-2/W16. pp. 181–188. [Google Scholar] [CrossRef]
- Yuan, Q.; Shafri, H.Z.H.; Alias, A.H.; Hashim, S.J. Multiscale semantic feature optimization and fusion network for building extraction using high-resolution aerial images and LiDAR data. Remote Sens. 2021, 13, 2473. [Google Scholar] [CrossRef]
- Li, F.; Zhu, H.; Luo, Z.; Shen, H.; Li, L. An adaptive surface interpolation filter using cloth simulation and relief amplitude for airborne laser scanning data. Remote Sens. 2021, 13, 2938. [Google Scholar] [CrossRef]
- Provot, X. Deformation constraints in a mass-spring model to describe rigid cloth behaviour. In Graphics Interface; Canadian Information Processing Society: Mississauga, ON, Canada, 1995; p. 147. Available online: http://www-rocq.inria.fr/syntim/research/provot/ (accessed on 3 May 2022).
- Zhang, W.; Qi, J.; Wan, P.; Wang, H.; Xie, D.; Wang, X.; Yan, G. An easy-to-use airborne LiDAR data filtering method based on cloth simulation. Remote Sens. 2016, 8, 501. [Google Scholar] [CrossRef]
- Su, Z.; Gao, Z.; Zhou, G.; Li, S.; Song, L.; Lu, X.; Kang, N. Building Plane Segmentation Based on Point Clouds. Remote Sens. 2022, 12, 95. [Google Scholar] [CrossRef]
- Dos Santos, R.C.; Galo, M.; Carrilho, A.C. Building boundary extraction from LiDAR data using a local estimated parameter for alpha shape algorithm. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 127–132. [Google Scholar] [CrossRef]
- Shen, W.; Li, J.; Chen, Y.; Deng, L.; Peng, G. Algorithms study of building boundary extraction and normalization based on LiDAR data. J. Remote Sens. 2008, 05, 692–698. [Google Scholar] [CrossRef]
- Sun, Z.; Li, Z.; Liu, Y. An improved lidar data segmentation algorithm based on euclidean clustering. In Proceedings of the 11th International Conference on Modelling, Identification and Control, Tianjin, China, 13–15 July 2019; Springer: Singapore, 2020; pp. 1119–1130. [Google Scholar] [CrossRef]
- Xu, Z.; Yan, W. The Filter Algorithm Based on Lidar Point Cloud. Inf. Commun. 2018, 3, 80–82. [Google Scholar] [CrossRef]
- Li, W.; Wang, F.; Xia, G. A geometry-attentional network for ALS point cloud classification. ISPRS J. Photogramm. Remote Sens. 2020, 164, 26–40. [Google Scholar] [CrossRef]
- Charles, R.Q.; Su, H.; Kaichun, M.; Guibas, L.J. PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 77–85. [Google Scholar] [CrossRef]
- Charles, R.Q.; Yi, L.; Su, H.; Guibas, L.J. PointNet++: Deep hierarchical feature learning on point sets in a metric space. In Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017; Volume 30. [Google Scholar] [CrossRef]
- Huang, R.; Xu, Y.; Hong, D.; Yao, W.; Ghamisi, P.; Stilla, U. Deep point embedding for urban classification using ALS point clouds: A new perspective from local to global. ISPRS J. Photogramm. Remote Sens. 2020, 163, 62–81. [Google Scholar] [CrossRef]
Algorithm | Parameter | Urban-LiDAR | Vaih-1 | Vaih-2
---|---|---|---|---
CSF algorithm | cloth_resolution | 1.0 | 0.3 | 1.0
 | max_iterations | 500 | 500 | 500
 | classification_threshold | 2.0 | 1.5 | 2.2
Region growing algorithm | theta_threshold | 5 | 30 | 10
 | curvature_threshold | 0.05 | 0.05 | 0.03
 | neighbor_number | 20 | 15 | 30
 | min_pts_per_cluster | 100 | 40 | 50
 | max_pts_per_cluster | 10,000 | 10,000 | 10,000
Euclidean clustering algorithm | tolerance | 0.58 | 1.5 | 1.25
 | min_cluster_size | 80 | 180 | 180
 | max_cluster_size | 100,000 | 10,000 | 15,000
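For reference, the settings in the table above can be collected into a plain configuration structure for a processing pipeline; the sketch below is only a re-encoding of the table, and the dictionary layout and key names are illustrative rather than part of the authors' code.

```python
# Parameter settings per study area, mirroring the table above.
PARAMS = {
    "csf": {
        "cloth_resolution":         {"Urban-LiDAR": 1.0, "Vaih-1": 0.3, "Vaih-2": 1.0},
        "max_iterations":           {"Urban-LiDAR": 500, "Vaih-1": 500, "Vaih-2": 500},
        "classification_threshold": {"Urban-LiDAR": 2.0, "Vaih-1": 1.5, "Vaih-2": 2.2},
    },
    "region_growing": {
        "theta_threshold":     {"Urban-LiDAR": 5,      "Vaih-1": 30,     "Vaih-2": 10},
        "curvature_threshold": {"Urban-LiDAR": 0.05,   "Vaih-1": 0.05,   "Vaih-2": 0.03},
        "neighbor_number":     {"Urban-LiDAR": 20,     "Vaih-1": 15,     "Vaih-2": 30},
        "min_pts_per_cluster": {"Urban-LiDAR": 100,    "Vaih-1": 40,     "Vaih-2": 50},
        "max_pts_per_cluster": {"Urban-LiDAR": 10_000, "Vaih-1": 10_000, "Vaih-2": 10_000},
    },
    "euclidean_clustering": {
        "tolerance":        {"Urban-LiDAR": 0.58,    "Vaih-1": 1.5,    "Vaih-2": 1.25},
        "min_cluster_size": {"Urban-LiDAR": 80,      "Vaih-1": 180,    "Vaih-2": 180},
        "max_cluster_size": {"Urban-LiDAR": 100_000, "Vaih-1": 10_000, "Vaih-2": 15_000},
    },
}
```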
Class | Precision | Recall | F1 Score
---|---|---|---
Roof | 98.74 | 98.47 | 98.60
Facade | 97.98 | 70.94 | 82.30
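The values in this and the following tables are percentages. Assuming the standard point-wise definitions of the indicators, a minimal sketch of their computation is shown below; the boolean-array inputs and the function name are illustrative.

```python
import numpy as np

def precision_recall_f1(pred: np.ndarray, truth: np.ndarray):
    """Point-wise precision, recall and F1 score (in %) for one class, given
    boolean arrays marking extracted and reference points of that class."""
    tp = np.sum(pred & truth)     # true positives
    fp = np.sum(pred & ~truth)    # false positives
    fn = np.sum(~pred & truth)    # false negatives
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return 100 * precision, 100 * recall, 100 * f1
```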
ID | Precision | Recall | F1 Score |
---|---|---|---|
0 | 99.54 | 99.77 | 99.66 |
1 | 98.25 | 98.92 | 98.58 |
2 | 99.80 | 98.42 | 99.11 |
3 | 96.05 | 98.00 | 97.02 |
4 | 97.19 | 98.56 | 97.87 |
5 | 95.22 | 95.62 | 95.42 |
6 | 99.85 | 99.80 | 99.82 |
7 | 100 | 98.14 | 99.06 |
8 | 84.08 | 91.31 | 87.55 |
9 | 98.72 | 98.88 | 98.80 |
10 | 98.68 | 97.35 | 98.01 |
11 | 98.82 | 98.38 | 98.60 |
12 | 98.00 | 98.52 | 98.26 |
13 | 99.50 | 97.70 | 98.59 |
14 | 100 | 100 | 100 |
15 | 99.12 | 96.64 | 97.86 |
16 | 98.79 | 97.65 | 98.22 |
17 | 99.94 | 99.32 | 99.63 |
18 | 96.67 | 98.28 | 97.47 |
19 | 88.47 | 93.46 | 90.90 |
20 | 93.29 | 96.37 | 94.80 |
21 | 99.87 | 97.78 | 98.81 |
22 | 99.62 | 99.17 | 99.39 |
23 | 97.69 | 97.96 | 97.82 |
24 | 99.41 | 92.02 | 95.57 |
25 | 97.46 | 92.42 | 94.87 |
26 | 96.18 | 98.06 | 97.11 |
27 | 98.91 | 98.68 | 98.79 |
28 | 79.57 | 89.13 | 84.08 |
29 | 100 | 100 | 100 |
30 | 92.76 | 95.66 | 94.19 |
Algorithm | Indicator | Roof | Facade
---|---|---|---
PointNet | Precision | 73.0 (↑20.73) | 10.7 (↑49.63)
 | Recall | 82.2 | 0.1
 | F1 Score | 77.6 | 5.4
PointNet++ | Precision | 92.8 | 43.8 (↑16.53)
 | Recall | 81.0 | 38.3
 | F1 Score | 86.9 | 41.0
HDL-JME-GGO | Precision | 92.8 | 64.2 (↓3.87)
 | Recall | 89.3 | 24.2
 | F1 Score | 91.1 (↓0.28) | 44.2
The Proposed Method | Precision | 93.73 | 60.33
 | Recall | 88.08 | 27.33
 | F1 Score | 90.82 | 37.62
Class | Precision (Vaih-1) | Precision (Vaih-2) | Recall (Vaih-1) | Recall (Vaih-2) | F1 Score (Vaih-1) | F1 Score (Vaih-2)
---|---|---|---|---|---|---
Roof | 91.49 | 96.27 | 92.32 | 83.93 | 91.90 | 89.68 |
Facade | 58.33 | 61.45 | 17.77 | 38.36 | 27.24 | 47.23 |
Roof ID | Precision (Vaih-1) | Precision (Vaih-2) | Recall (Vaih-1) | Recall (Vaih-2) | F1 Score (Vaih-1) | F1 Score (Vaih-2)
---|---|---|---|---|---|---
0 | 100 | 86.80 | 100 | 91.83 | 100 | 89.24 |
1 | 88.94 | 98.04 | 93.87 | 90.77 | 91.34 | 94.27 |
2 | 100 | 92.65 | 99.80 | 92.45 | 99.90 | 92.55 |
3 | 97.83 | 97.91 | 99.77 | 95.02 | 98.79 | 96.44 |
4 | 99.45 | 99.90 | 99.73 | 94.43 | 99.59 | 97.09 |
5 | 97.75 | 99.88 | 97.61 | 78.13 | 97.68 | 87.68 |
6 | 99.39 | 94.78 | 95.46 | 84.80 | 97.39 | 89.51 |
7 | 100 | 99.88 | 95.58 | 55.14 | 97.74 | 71.05 |
8 | 99.02 | 99.41 | 99.18 | 68.67 | 99.10 | 81.23 |
9 | 71.91 | 99.72 | 81.51 | 94.67 | 76.41 | 97.13 |
10 | 98.52 | 99.29 | 98.89 | 94.40 | 98.70 | 96.78 |
11 | 100 | 97.21 | 100 | 93.43 | 100 | 95.28 |
12 | 98.14 | 99.70 | 86.17 | 98.38 | 91.77 | 99.04 |
13 | 98.18 | 96.57 | 96.83 | 97.70 | 97.50 | 97.13 |
14 | 98.96 | 99.55 | 95.65 | 98.31 | 97.28 | 98.93 |
15 | 99.43 | 99.84 | 99.15 | 87.41 | 99.29 | 93.21 |
16 | 100 | 99.07 | 99.76 | 98.05 | 99.88 | 98.56 |
17 | 99.29 | 99.23 | 99.29 | 90.44 | 99.29 | 94.63 |
18 | 100 | 99.16 | 89.25 | 98.39 | 94.32 | 98.77 |
19 | 96.92 | 97.68 | 98.43 | 91.75 | 97.67 | 94.62 |
20 | 97.35 | 96.92 | 96.89 | 97.55 | 97.12 | 97.23 |