Optimizing Moving Object Trajectories from Roadside Lidar Data by Joint Detection and Tracking
Abstract
1. Introduction
- (1) Occlusions and data sparsity are the main challenges of roadside lidar data, causing interruptions in object trajectories and shortening their range. The first objective of this paper is therefore to increase object tracking ranges and improve trajectory continuity for enhanced information extraction.
- (2) As there are different kinds of on-road objects, it is useful to understand how object category affects the maximum tracking range, which is also influenced by the number of laser beams on the lidar sensor. The second objective of this paper is therefore to investigate how object types/sizes and the number of lidar beams affect the trackable ranges of roadside lidar systems in practice.
2. Related Work
2.1. Tracking-by-Detection
2.2. Tracking-before-Detection
3. Methodology
3.1. Segmentation
- Create a Kd-tree representation of the point cloud dataset, P.
- Set up an empty list of clusters, C, and a queue of points requiring processing, Q.
- For every point pi in P, the following operations are undertaken:
- (1) Add pi to Q.
- (2) For every point pk in Q, search for neighbouring points within a sphere of radius r < d. Each neighbouring point that has not yet been processed is added to Q.
- (3) Once all points in Q have been processed, add Q to C and reset Q to an empty list.
- Terminate when all the points in P have been processed and included in C. A minimal code sketch of this clustering procedure is given below.
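These steps correspond to standard Euclidean cluster extraction over a Kd-tree. The following is a minimal Python sketch of the procedure, written under stated assumptions rather than as the authors' implementation: the point cloud P is taken to be an N×3 NumPy array, SciPy's cKDTree stands in for the Kd-tree, and the function name `euclidean_cluster` and the `min_size` filter are illustrative additions.

```python
import numpy as np
from scipy.spatial import cKDTree


def euclidean_cluster(points: np.ndarray, d: float, min_size: int = 1):
    """Group points so that any two points closer than d end up in the same cluster.

    points   -- (N, 3) array of x, y, z coordinates (the point cloud P)
    d        -- neighbourhood radius used for the sphere search
    min_size -- optional filter on the minimum number of points per cluster
    """
    tree = cKDTree(points)            # Kd-tree representation of P
    processed = np.zeros(len(points), dtype=bool)
    clusters = []                     # list of clusters, C

    for i in range(len(points)):
        if processed[i]:
            continue
        queue = [i]                   # queue of points requiring processing, Q
        processed[i] = True
        k = 0
        while k < len(queue):         # step (2): grow the cluster from each queued point
            neighbours = tree.query_ball_point(points[queue[k]], r=d)
            for n in neighbours:
                if not processed[n]:
                    processed[n] = True
                    queue.append(n)
            k += 1
        if len(queue) >= min_size:    # step (3): the exhausted queue becomes one cluster
            clusters.append(queue)

    return clusters                   # lists of indices into `points`, one per cluster
```

In practice, the radius is chosen somewhat smaller than the expected gap between neighbouring objects so that nearby but distinct objects are not merged into a single cluster.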
3.2. Object Tracking
3.3. Object Detection Based on PV-RCNN
3.4. Trajectory Classification
4. Experimental Results and Analysis
4.1. Datasets
4.2. Experiment Settings
4.2.1. Segmentation
4.2.2. Object Detection and Tracking
4.2.3. Trajectory Classification
4.3. Results and Analysis
4.3.1. Detection Results
4.3.2. Trajectory Classification Results
4.3.3. Comparison with Tracking-by-Detection Method
- Ranges of the trajectories
- Continuity of trajectories
4.3.4. Maximum Tracking Range
- From a general perspective, the maximum tracking range of pedestrians is shorter than that of vehicles, including buses, cars, and vans. At Study Site 1, the maximum tracking range of pedestrians is the shortest among all categories because pedestrians have the smallest object size. However, at Study Site 2, the maximum tracking range of pedestrians is longer than that of cars because two pedestrians walking together were tracked as one object.
- Among vehicles, the maximum tracking range of cars is shorter than that of vans and buses due to their smaller size.
- For cars, vans, and pedestrians, the maximum tracking range at Site 1 is longer than that at Site 2 because a sensor with more laser beams is adopted at Site 1. For buses, the sensor can 'see' along a straight open road branch at Site 2, whereas at Site 1 bushes and trees occlude some of the beams as they spread further (see Figure 2). Buses can therefore be tracked over a longer range at Site 2.
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Fay, D.; Thakur, G.S.; Hui, P.; Helmy, A. Knowledge discovery and causality in urban city traffic: A study using planet scale vehicular imagery data. In Proceedings of the 6th ACM SIGSPATIAL International Workshop on Computational Transportation Science, Orlando, FL, USA, 5–8 November 2013.
- Zhao, J.; Xu, H.; Liu, H.; Wu, J.; Zheng, Y.; Wu, D. Detection and tracking of pedestrians and vehicles using roadside LiDAR sensors. Transp. Res. Part C Emerg. Technol. 2019, 100, 68–87.
- Wu, J. Data processing algorithms and applications of LiDAR-enhanced connected infrastructure sensing. Ph.D. Thesis, University of Nevada, Reno, NV, USA, 2018.
- Nagpure, A.K.; Gurjar, B.R.; Sahni, N.; Kumar, P. Pollutant emissions from road vehicle in mega city Kolkata, India: Past and present trends. Indian J. Air Pollut. Control 2010, 10, 18–30.
- Xu, H.; Tian, Z.; Wu, J.; Liu, H.; Zhao, J. High-Resolution Micro Traffic Data from Roadside LiDAR Sensors for Connected-Vehicles and New Traffic Applications; University of Nevada, Solaris University Transportation Center: Reno, NV, USA, 2018.
- Zhang, Z.; Zheng, J.; Xu, H.; Wang, X. Vehicle detection and tracking in complex traffic circumstances with roadside LiDAR. Transp. Res. Rec. 2019, 2673, 62–71.
- Zhao, J. Exploring the fundamentals of using infrastructure-based LiDAR sensors to develop connected intersections. Ph.D. Thesis, Texas Tech University, Lubbock, TX, USA, 2019.
- Zhang, J.; Xiao, W.; Coifman, B.; Mills, J.P. Vehicle Tracking and Speed Estimation from Roadside Lidar. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 5597–5608.
- Xiao, W.; Vallet, B.; Schindler, K.; Paparoditis, N. Simultaneous Detection and Tracking of Pedestrian from Velodyne Laser Scanning Data. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 3, 295–302.
- Wu, J. An automatic procedure for vehicle tracking with a roadside LiDAR sensor. ITE J. Inst. Transp. Eng. 2018, 88, 32–37.
- Chen, J.; Tian, S.; Xu, H.; Yue, R.; Sun, Y.; Cui, Y. Architecture of vehicle trajectories extraction with roadside LiDAR serving connected vehicles. IEEE Access 2019, 7, 100406–100415.
- Yan, Z.; Duckett, T.; Bellotto, N. Online learning for human classification in 3D lidar-based tracking. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017.
- Wu, J.; Xu, H.; Zheng, Y.; Zhang, Y.; Lv, B.; Tian, Z. Automatic vehicle classification using roadside LiDAR data. Transp. Res. Rec. 2019, 2673, 153–164.
- Zhang, J.; Pi, R.; Ma, X.; Wu, J.; Li, H.; Yang, Z. Object Classification with Roadside LiDAR Data Using a Probabilistic Neural Network. Electronics 2021, 10, 803.
- Wan, E.A.; Van Der Merwe, R. The Unscented Kalman Filter for Nonlinear Estimation. In Proceedings of the IEEE 2000 Adaptive Systems for Signal Processing, Communications, and Control Symposium, Lake Louise, AB, Canada, 4 October 2000.
- Bar-Shalom, Y.; Daum, F.; Huang, J. The probabilistic data association filter. IEEE Control Syst. Mag. 2009, 29, 82–100.
- Wu, J.; Zhang, Y.; Tian, Y.; Yue, R.; Zhang, H. Automatic Vehicle Tracking with LiDAR-Enhanced Roadside Infrastructure. J. Test. Eval. 2020, 49, 121–133.
- Cui, Y.; Xu, H.; Wu, J.; Sun, Y.; Zhao, J. Automatic vehicle tracking with roadside LiDAR data for the connected-vehicles system. IEEE Intell. Syst. 2019, 34, 44–51.
- Weng, X.; Kitani, K. A baseline for 3D multi-object tracking. arXiv 2019, arXiv:1907.03961.
- Weng, X.; Wang, J.; Held, D.; Kitani, K. 3D multi-object tracking: A baseline and new evaluation metrics. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021.
- Shi, S.; Wang, X.; Li, H. PointRCNN: 3D object proposal generation and detection from point cloud. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019.
- Weng, X.; Kitani, K. Monocular 3D object detection with pseudo-lidar point cloud. In Proceedings of the IEEE/CVF International Conference on Computer Vision Workshops, Seoul, Korea, 27–28 October 2019.
- Shi, S.; Guo, C.; Yang, J.; Li, H. PV-RCNN: The Top-Performing LiDAR-only Solutions for 3D Detection/3D Tracking/Domain Adaptation of Waymo Open Dataset Challenges. arXiv 2020, arXiv:2008.12599.
- Geiger, A.; Lenz, P.; Stiller, C.; Urtasun, R. Vision meets robotics: The KITTI dataset. Int. J. Robot. Res. 2013, 32, 1231–1237.
- Kuhn, H.W. The Hungarian method for the assignment problem. Nav. Res. Logist. Q. 1955, 2, 83–97.
- Wang, D.; Huang, C.; Wang, Y.; Deng, Y.; Li, H. A 3D Multiobject Tracking Algorithm of Point Cloud Based on Deep Learning. Math. Probl. Eng. 2020, 2020, 8895696.
- Weng, X.; Wang, Y.; Man, Y.; Kitani, K.M. GNN3DMOT: Graph neural network for 3D multi-object tracking with 2D-3D multi-feature learning. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020.
- Tong, H.; Zhang, H.; Meng, H.; Wang, X. Multitarget Tracking Before Detection via Probability Hypothesis Density Filter. In Proceedings of the International Conference on Electrical and Control Engineering, Wuhan, China, 25–27 June 2010.
- Ošep, A.; Mehner, W.; Voigtlaender, P.; Leibe, B. Track, then decide: Category-agnostic vision-based multi-object tracking. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018.
- Mitzel, D.; Leibe, B. Taking Mobile Multi-Object Tracking to the Next Level: People, Unknown Objects, and Carried Items. In Proceedings of the European Conference on Computer Vision, Florence, Italy, 7–13 October 2012.
- Gonzalez, H.; Rodriguez, S.; Elouardi, A. Track-Before-Detect Framework-Based Vehicle Monocular Vision Sensors. Sensors 2019, 19, 560.
- Chen, Q.A.; Tsukada, A. Detection-by-Tracking Boosted Online 3D Multi-Object Tracking. In Proceedings of the 2019 IEEE Intelligent Vehicles Symposium (IV), Paris, France, 9–12 June 2019.
- Zhang, Z.; Zheng, J.; Xu, H.; Wang, X.; Fan, X.; Chen, R. Automatic background construction and object detection based on roadside LiDAR. IEEE Trans. Intell. Transp. Syst. 2019, 21, 4086–4097.
- Shi, S.; Guo, C.; Jiang, L.; Wang, Z.; Shi, J.; Wang, X.; Li, H. PV-RCNN: Point-voxel feature set abstraction for 3D object detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020.
- Syarif, I.; Prugel-Bennett, A.; Wills, G. SVM parameter optimization using grid search and genetic algorithm to improve classification performance. Telkomnika 2016, 14, 1502.
- SUPERVISELY. Available online: https://supervise.ly/ (accessed on 20 December 2021).
- Wu, J.; Xu, H.; Sun, Y.; Zheng, J.; Yue, R. Automatic Background Filtering Method for Roadside LiDAR Data. Transp. Res. Rec. 2018, 2672, 106–114.
- Huang, K.; Hao, Q. Joint Multi-Object Detection and Tracking with Camera-LiDAR Fusion for Autonomous Driving. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021.
Procedure | Parameter | Description | Setting | Basis of Setting
---|---|---|---|---
Initialization | Initialization threshold | Threshold to initialize a track | 0.1 | Default
Initialization | Confirmation threshold | Threshold for track confirmation | [1, 3] | Experiment
Data association | Assignment threshold | Detection assignment threshold | [4 m, Inf] | Practice and empirical knowledge
Track management | Deletion threshold | Threshold for track deletion | [5, 5] | Default
Track management | Length threshold | Threshold to delete a non-vehicle trajectory | 3 m | Experiment and practice
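As a reading aid, the thresholds in the table can be collected into a single configuration object. The sketch below is purely illustrative: the class and field names are ours, the values simply restate the settings above, and the M-out-of-N reading of the confirmation threshold [1, 3] and deletion threshold [5, 5] is an assumption about how such pairs are commonly interpreted, not a statement of the authors' implementation.

```python
from dataclasses import dataclass
from typing import Tuple


@dataclass
class TrackerConfig:
    """Illustrative container for the tracking thresholds listed in the table above."""
    init_threshold: float = 0.1                       # score required to initialize a tentative track
    confirmation_threshold: Tuple[int, int] = (1, 3)  # assumed: confirm after M hits within N updates
    assignment_threshold_m: float = 4.0               # max detection-to-track distance (metres) for association
    deletion_threshold: Tuple[int, int] = (5, 5)      # assumed: delete after M misses within N updates
    min_trajectory_length_m: float = 3.0              # non-vehicle trajectories shorter than this are discarded


config = TrackerConfig()
```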
Class | Total Number | Recall (%) | Precision (%) | F1 (%) |
---|---|---|---|---|
Vehicle | 182 | 74.7 | 83.4 | 78.8 |
Pedestrian | 146 | 72.6 | 65.4 | 68.8 |
Class | Ground Truth | Recall (%) | Precision (%) | F1 (%) |
---|---|---|---|---|
Vehicle | 20 | 70.0 | 82.4 | 75.7 |
Pedestrian | 45 | 93.3 | 87.5 | 90.3 |
Study Site | Vehicle Example | Start Frame (TbD) | End Frame (TbD) | N1 | Start Frame (JDAT) | End Frame (JDAT) | N2 | Start Frame (GT) | End Frame (GT) | N | N1/N (%) | N2/N (%) | N1/N2 (%)
---|---|---|---|---|---|---|---|---|---|---|---|---|---
1 | 1 | 777 | 849 | 73 | 743 | 849 | 107 | 741 | 849 | 109 | 67.0 | 98.2 | 68.2
1 | 2 | 878 | 967 | 90 | 875 | 976 | 102 | 865 | 991 | 127 | 70.9 | 80.3 | 88.2
1 | 3 | 4880 | 4971 | 92 | 4871 | 4979 | 109 | 4863 | 4996 | 134 | 68.7 | 81.3 | 84.4
2 | 4 | 4221 | 4274 | 54 | 4219 | 4284 | 66 | 4219 | 4302 | 84 | 64.3 | 78.6 | 81.8
2 | 5 | 9110 | 9163 | 54 | 9100 | 9173 | 74 | 9100 | 9181 | 82 | 65.9 | 90.2 | 73.0
2 | 6 | 9105 | 9162 | 58 | 9104 | 9174 | 71 | 9104 | 9185 | 82 | 70.7 | 86.6 | 81.7
3 | 7 | 0 | 20 | 21 | 0 | 34 | 35 | 0 | 34 | 35 | 60.0 | 100 | 60.0
3 | 8 | 766 | 836 | 71 | 766 | 858 | 93 | 766 | 858 | 93 | 76.3 | 100 | 76.3
3 | 9 | 926 | 977 | 52 | 926 | 987 | 62 | 926 | 987 | 62 | 83.9 | 100 | 83.9
Mean | | | | | | | | | | | 69.7 | 90.6 | 77.5

TbD = tracking-by-detection; JDAT = joint detection and tracking; GT = ground truth; N1, N2, and N are the numbers of tracked frames for tracking-by-detection, JDAT, and the ground truth, respectively.
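The three comparison columns are simple frame-count ratios: N1/N and N2/N indicate how much of the true trajectory each method recovers, while N1/N2 compares the two methods directly. A minimal sketch (the function name is ours) together with a check against vehicle example 1:

```python
def trajectory_coverage(n_tbd: int, n_jdat: int, n_gt: int):
    """Return (N1/N, N2/N, N1/N2) as percentages, rounded to one decimal place."""
    return (round(100 * n_tbd / n_gt, 1),
            round(100 * n_jdat / n_gt, 1),
            round(100 * n_tbd / n_jdat, 1))


# Vehicle example 1 from the table: 73, 107, and 109 tracked frames
print(trajectory_coverage(73, 107, 109))  # -> (67.0, 98.2, 68.2)
```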
Study Site | Road Condition | Sensor Type | Bus (m) | Van (m) | Car (m) | Pedestrian (m)
---|---|---|---|---|---|---
1 | Straight section | RS-LiDAR-32 | 109.5 | 111.3 | 98.2 | 91.8
2 | Intersection | VLP-16 | 112.39 | 49.26 | 38.16 | 48.50