Non-Contact Body Measurement for Qinchuan Cattle with LiDAR Sensor
Abstract
1. Introduction
1.1. Agricultural Applications of LiDAR
1.2. Imaging Systems for Body Measuring
1.3. Main Purposes
- New filter fusion and clustering segmentation methods are presented: the filter fusion effectively removes the uneven point density, noise, and outliers from the PCD, while the clustering segmentation accurately extracts the spatial position, geometric shape, and proximity of the cattle.
- Feature extraction, matching, reconstruction and validation are presented: global and local feature descriptors are employed to detect features in the cattle point data, and the partitioned feature data are iteratively matched and reconstructed into a whole cattle model. In-field experimental results are presented to validate the measurement calibration.
2. Materials and Methods
2.1. Data Acquisition and Preprocessing
2.1.1. Data Acquisition and Body Dimensions
2.1.2. Preprocessing with Filters Fusion
Algorithm 1. Filtering with three-filter fusion

Input: ocloud % Original point cloud input data
Output: fcloud % Filtered point cloud output data
1. InputCloud ← ocloud % Put the original data into the filter container
2. Condition ← −1.25 < x < 1.0 && 1.5 < z < 3.0 % Set the CRF filtering condition
3. KeepOrganized ← true % Keep the point cloud structure
4. ccloud ← CrFilter(ocloud) % Filter with CRF
5. MeanK ← 60 % Set the mean-distance neighbour count of SORF to 60
6. StddevMulThresh ← 1 % Set the outlier deviation threshold of SORF to 1
7. scloud ← SorFilter(ccloud) % Filter with SORF
8. LeafSize ← (0.03 f, 0.03 f, 0.03 f) % Set the voxel grid leaf size of VGF to 0.03 m
9. fcloud ← VgFilter(scloud) % Filter with VGF
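The three-filter fusion of Algorithm 1 can be sketched in plain Python. This is a minimal stand-in for the PCL filters named above, not the authors' implementation; the function names are ours, and the small neighbour count used below is only for illustration (the paper uses MeanK = 60 on full-size clouds).

```python
import math
import statistics

def conditional_removal(points, xmin=-1.25, xmax=1.0, zmin=1.5, zmax=3.0):
    """CRF: keep only points inside the region of interest (x and z bounds)."""
    return [p for p in points if xmin < p[0] < xmax and zmin < p[2] < zmax]

def statistical_outlier_removal(points, mean_k=60, stddev_mul=1.0):
    """SORF: drop points whose mean distance to their mean_k nearest
    neighbours exceeds the global mean by more than stddev_mul sigma."""
    mean_dists = []
    for p in points:
        ds = sorted(math.dist(p, q) for q in points if q is not p)[:mean_k]
        mean_dists.append(sum(ds) / len(ds))
    thresh = statistics.mean(mean_dists) + stddev_mul * statistics.pstdev(mean_dists)
    return [p for p, d in zip(points, mean_dists) if d <= thresh]

def voxel_grid(points, leaf=0.03):
    """VGF: replace all points falling in one leaf-sized voxel by their centroid."""
    voxels = {}
    for p in points:
        key = tuple(math.floor(c / leaf) for c in p)
        voxels.setdefault(key, []).append(p)
    return [tuple(sum(c) / len(pts) for c in zip(*pts)) for pts in voxels.values()]

def filter_fusion(ocloud, mean_k=60):
    """Chain the three filters in the order of Algorithm 1."""
    return voxel_grid(statistical_outlier_removal(conditional_removal(ocloud), mean_k))
```

The brute-force neighbour search stands in for the kd-tree search a real pipeline would use; the chaining order (CRF, then SORF, then VGF) matches the algorithm.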
2.2. Clustering Segmentation
2.2.1. K-Means Clustering with KD-Trees Searching
2.2.2. Plane Segmentation with RANSAC
Algorithm 2. Segmentation processing with K-means clustering and RANSAC

Input: fcloud % Preprocessed point cloud data
Output: segcloud % Segmented point cloud data
1. InputCloud ← fcloud % Put the input data into the segmentation container
2. ClusterTolerance ← 0.05 % Set the cluster search radius to 0.05 m
3. MinClusterSize ← 50 % Set the minimum cluster size to 50
4. ecloud ← EuExtract(fcloud) % Segment the input data with K-means clustering
5. ModelType ← SACMODEL_PLANE % Set the segmentation model type to a planar model
6. MethodType ← SAC_RANSAC % Estimate model parameters with RANSAC
7. DistanceThreshold ← 0.02 % Set the distance threshold of the model to 0.02 m
8. segcloud ← RANExtract(ecloud) % Segment the point cloud data with RANSAC
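The Euclidean cluster-extraction step of Algorithm 2 can be illustrated with a short region-growing sketch. This is a brute-force stand-in for PCL's kd-tree-backed extraction, under the same two parameters as the algorithm (cluster tolerance and minimum cluster size); it does not include the RANSAC plane stage.

```python
import math

def euclidean_clusters(points, tolerance=0.05, min_size=50):
    """Group points whose neighbour distance is within `tolerance` (the
    cluster search radius) and discard clusters smaller than `min_size`.
    Returns a list of clusters, each a sorted list of point indices."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = [seed], [seed]
        while queue:
            i = queue.pop()
            # Grow the cluster by all still-unvisited points within tolerance.
            neighbours = [j for j in unvisited
                          if math.dist(points[i], points[j]) <= tolerance]
            for j in neighbours:
                unvisited.discard(j)
            queue.extend(neighbours)
            cluster.extend(neighbours)
        if len(cluster) >= min_size:
            clusters.append(sorted(cluster))
    return clusters
```

With two well-separated groups of points, the function returns one cluster per group; raising `min_size` filters the small ground or wall fragments that the RANSAC plane stage would otherwise have to remove.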
2.3. Feature Detection of FPFH
2.3.1. FPFH Descriptor
2.3.2. Feature Models Library and Feature Matching
- (1) Construct the feature model library. With on-the-spot 3D PCD collection from live cattle, the specific features of the cattle are selected. Several typical groups of point clouds are filtered, clustered and segmented, and several groups of cattle point clouds are manually designated as the known target feature cluster models. Finally, the FPFH feature descriptor of each cluster is computed to construct the training library of the feature model.
- (2) Feature matching. The FPFH descriptors of all point cloud files are extracted with a clustering classifier, and each input cluster to be detected is compared against the feature model library one by one.
- (3) Select point clouds. The Euclidean distance is calculated as a similarity index to decide whether the FPFH of a point cloud matches the feature model library. If the distance exceeds the given threshold, the case is called a mismatch and the feature cluster is removed by the classifier. The corresponding pseudocode is shown in Algorithm 3. The matching result is shown in Figure 11, where the red portion indicates the whole contour of the cattle.
Algorithm 3. Feature detection with FPFH descriptor

Input: segcloud % Segmented point cloud data
Output: tcloud % Target point cloud data
1. initialize n, VFH % n: number of point cloud files after segmentation; VFH: the VFH of the model feature library
2. for i := 1, …, n do
3.   NormalEstimation() % Estimate the normals
4.   VFHi ← calcVFH() % Compute the VFH of the point cloud
5.   if (VFHi − VFH) > thresh then % Point cloud matching
6.     delete(pld) % Delete the unmatched point cloud
7.   end if
8. end for
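The matching loop of Algorithm 3 reduces to distance-thresholding of feature histograms. A minimal sketch follows, with plain lists standing in for the FPFH/VFH bins; the function names and the threshold value are illustrative, not from the paper.

```python
import math

def histogram_distance(h1, h2):
    """Euclidean distance between two feature histograms of equal length."""
    return math.dist(h1, h2)

def match_clusters(cluster_histograms, model_histogram, thresh=0.1):
    """Keep the indices of clusters whose descriptor lies within `thresh`
    of the model-library histogram; mismatches beyond the threshold are
    discarded, mirroring the delete step of Algorithm 3."""
    return [i for i, h in enumerate(cluster_histograms)
            if histogram_distance(h, model_histogram) <= thresh]
```

In the real pipeline the retained indices would select the segmented clusters that together form the cattle contour (the red portion in Figure 11).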
2.4. 3D Surface Reconstruction
2.4.1. ICP Registration with BRKD-Trees Searching
2.4.2. Reconstruction with GPT
2.5. Fitting Function
3. Experiments and Discussion
3.1. Experiments
3.2. Discussion
4. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Abbreviations
LiDAR | Light Detection And Ranging |
RANSAC | RANdom SAmple Consensus algorithm |
VFH | Viewpoint Feature Histogram |
FPFH | Fast Point Feature Histogram |
ICP | Iterative Closest Point matching algorithm |
PCD | Point Cloud Data |
3D | Three Dimensional |
MLS | Moving Least Squares resampling algorithm |
ToF | Time of Flight |
PC | Personal Computer |
MPFH | Model Point Feature Histogram |
SAC-IA | SAmple Consensus based Initial Alignment algorithm |
PCL | Point Cloud Library |
CRF | Conditional Removal Filter |
FLANN | Fast Library for Approximate Nearest Neighbors |
SORF | Statistical Outlier Removal Filter |
VGF | Voxel Grid Filter |
BRKD-trees | Bi-direction Random k-d trees searching method |
GPT | Greedy Projection Triangulation algorithm |
Ear Mask Q0521 Cattle | Withers Height | Chest Depth | Back Height | Body Length | Waist Height |
---|---|---|---|---|---|
Manual Measuring Value (m) | 1.56900 | 0.65300 | 1.55600 | 1.59100 | 1.59800 |
Initial Measuring Value (m) | 1.41508 | 0.57622 | 1.38628 | 1.39373 | 1.43797 |
Corrected/Final Measuring Value (m) | 1.58461 | 0.64525 | 1.55236 | 1.56071 | 1.61025 |
Initial Deviation | 9.81% | 11.76% | 10.91% | 12.40% | 10.01% |
Correction/Final Deviation | 1.00% | 1.19% | 0.23% | 1.90% | 0.77% |
Ear Mask Q0145 Cattle | Withers Height | Chest Depth | Back Height | Body Length | Waist Height |
---|---|---|---|---|---|
Manual Measuring Value (m) | 1.53400 | 0.75800 | 1.51600 | 1.58400 | 1.55800 |
Initial Measuring Value (m) | 1.27672 | 0.64814 | 1.26818 | 1.35469 | 1.31889 |
Corrected/Final Measuring Value (m) | 1.52120 | 0.77225 | 1.51103 | 1.61410 | 1.57145 |
Initial Deviation | 16.77% | 14.49% | 16.35% | 14.48% | 15.35% |
Correction/Final Deviation | 0.83% | 1.88% | 0.33% | 1.90% | 0.86% |
Ear Mask Q0159 Cattle | Withers Height | Chest Depth | Back Height | Body Length | Waist Height |
---|---|---|---|---|---|
Manual Measuring Value (m) | 1.12200 | 0.54100 | 1.10100 | 1.19600 | 1.14300 |
Initial Measuring Value (m) | 0.98878 | 0.48211 | 0.97186 | 1.07749 | 1.00453 |
Corrected/Final Measuring Value (m) | 1.10256 | 0.53759 | 1.08370 | 1.20149 | 1.12013 |
Initial Deviation | 11.87% | 10.89% | 11.73% | 9.91% | 12.11% |
Correction/Final Deviation | 1.73% | 0.63% | 1.57% | 0.46% | 2.00% |
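The deviation rows in the tables above are relative errors of the LiDAR estimates against the manual tape measurements. A quick sketch (the function name is ours, not from the paper), checked against the back-height entries for cattle Q0521:

```python
def relative_deviation(manual, measured):
    """Percentage deviation of a measured value from the manual reading."""
    return abs(manual - measured) / manual * 100

# Back height of cattle Q0521 (values from the table above, in metres)
initial = relative_deviation(1.55600, 1.38628)    # initial LiDAR estimate
corrected = relative_deviation(1.55600, 1.55236)  # after correction
```

Rounded to two decimals these reproduce the tabulated 10.91% initial and 0.23% final deviations, confirming how the deviation rows were computed.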
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Huang, L.; Li, S.; Zhu, A.; Fan, X.; Zhang, C.; Wang, H. Non-Contact Body Measurement for Qinchuan Cattle with LiDAR Sensor. Sensors 2018, 18, 3014. https://doi.org/10.3390/s18093014