Weed Detection from Unmanned Aerial Vehicle Imagery Using Deep Learning—A Comparison between High-End and Low-Cost Multispectral Sensors
Abstract
1. Introduction
1.1. Motivation
1.2. Related Research
1.3. Goals and Structure
- Present the hardware and software development of a multispectral LCS for weed detection in agriculture.
- Compare the performance of the proposed multispectral LCS against the trusted MicaSense Altum in a real-world scenario. The classification accuracy of both sensors will be evaluated in the context of a practical application using different evaluation metrics.
- Identify and discuss possible weaknesses of the LCS to determine suitable use cases and outline a path for further development of the system.
2. Materials and Methods
2.1. Data Acquisition
2.1.1. Sensor Description
2.1.2. System Hardware LCS
2.1.3. System Software LCS
2.1.4. UAV Platform and Study Site
2.2. Data Analysis
2.2.1. Preprocessing
2.2.2. Evaluation of Training and Testing Data
2.2.3. Semantic Segmentation
2.2.4. Evaluation Metrics
- The Jaccard index (also known as the Jaccard coefficient or Intersection over Union) is an established metric for evaluating the performance of a segmentation algorithm [58]. It is defined as the size of the intersection of the predicted and ground truth segmentations divided by the size of their union. The Jaccard score used here averages the Jaccard index over the label sets, weighting each class by its proportion in the dataset to account for class imbalance.
- The calculations of recall, precision, and F1-score were performed for each class separately. Recall is a measure of the ability of the model to correctly identify positive instances in a dataset, whereas precision measures the proportion of instances identified as positive by the model which are indeed positive. The F1-score combines these two metrics as a harmonic mean in a single score [70].
- To better comprehend the misclassifications, a confusion matrix (CM) was computed. The matrix contains the normalized number of true positive, true negative, false positive, and false negative predictions made by the U-Net for each class.
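The three evaluation metrics above can be sketched in a few lines of NumPy. This is a minimal illustrative re-implementation, not the authors' evaluation code; the toy label arrays in the usage example are invented for demonstration.

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Row-normalized confusion matrix: cm[i, j] is the fraction of
    ground-truth class i that the model predicted as class j."""
    cm = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    row_sums = cm.sum(axis=1, keepdims=True)
    return np.divide(cm, row_sums, out=np.zeros_like(cm), where=row_sums > 0)

def per_class_scores(y_true, y_pred, n_classes):
    """Precision, recall, and F1 computed for each class separately."""
    scores = {}
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        scores[c] = (precision, recall, f1)
    return scores

def weighted_jaccard(y_true, y_pred, n_classes):
    """Per-class Jaccard index (intersection over union), averaged with
    weights proportional to each class's share of the ground truth,
    so that rare classes do not dominate the score."""
    total, n = 0.0, len(y_true)
    for c in range(n_classes):
        inter = np.sum((y_true == c) & (y_pred == c))
        union = np.sum((y_true == c) | (y_pred == c))
        weight = np.sum(y_true == c) / n
        total += weight * (inter / union if union else 0.0)
    return total

# Toy example: 6 pixels, 3 classes (e.g. soil / crop / weed).
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0])
print(weighted_jaccard(y_true, y_pred, 3))   # → 0.5
```

In practice the same numbers can be obtained from scikit-learn's `jaccard_score(..., average="weighted")`, `precision_recall_fscore_support`, and `confusion_matrix(..., normalize="true")`.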
3. Results
3.1. Qualitative Results of the Training and Testing Data
3.2. Quantitative Evaluation of the U-Net Predictions
3.3. Qualitative Evaluation of the U-Net Predictions
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning |
---|---|
BBCH | Biologische Bundesanstalt für Land- und Forstwirtschaft, Bundessortenamt und CHemische Industrie |
CIR | Color–Infrared |
CM | Confusion Matrix |
CMOS | Complementary Metal Oxide Semiconductor |
CNAT | CNAT Image Uses NAT (Network Address Translation) |
CNN | Convolutional Neural Network |
DL | Deep Learning |
FCN | Fully Convolutional Neural Network |
GSD | Ground Sample Distance |
IR | Infrared |
LCS | Low-Cost Camera System |
ML | Machine Learning |
NDVI | Normalized Difference Vegetation Index |
NIR | Near Infrared |
NN | Neural Network |
NoIR | No Infrared Filter |
RGB | Red–Green–Blue |
SSH | Secure Shell |
UAV | Unmanned Aerial Vehicle |
Appendix A
References
- Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172, 107148. [Google Scholar] [CrossRef]
- Oerke, E.C. Crop losses to pests. J. Agric. Sci. 2006, 144, 31–43. [Google Scholar] [CrossRef]
- Hashemi-Beni, L.; Gebrehiwot, A.; Karimoddini, A.; Shahbazi, A.; Dorbu, F. Deep Convolutional Neural Networks for Weeds and Crops Discrimination from UAS Imagery. Front. Remote Sens. 2022, 3, 755939. [Google Scholar] [CrossRef]
- Chechliński, Ł.; Siemiątkowska, B.; Majewski, M. A System for Weeds and Crops Identification-Reaching over 10 FPS on Raspberry Pi with the Usage of MobileNets, DenseNet and Custom Modifications. Sensors 2019, 19, 3787. [Google Scholar] [CrossRef]
- Castaldi, F.; Pelosi, F.; Pascucci, S.; Casa, R. Assessing the potential of images from unmanned aerial vehicles (UAV) to support herbicide patch spraying in maize. Precis. Agric. 2017, 18, 76–94. [Google Scholar] [CrossRef]
- Peña, J.M.; Torres-Sánchez, J.; de Castro, A.I.; Kelly, M.; López-Granados, F. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS ONE 2013, 8, e77151. [Google Scholar] [CrossRef]
- Pelosi, F.; Castaldi, F.; Casa, R. Operational Unmanned Aerial Vehicle Assisted Post-Emergence Herbicide Patch Spraying in Maize: A Field Study. In Proceedings of the European Conference of Precision Agriculture, Tel-Aviv, Israel, 12–16 July 2015; pp. 159–166. [Google Scholar] [CrossRef]
- Liu, B.; Bruch, R. Weed Detection for Selective Spraying: A Review. Curr. Robot. Rep. 2020, 1, 19–26. [Google Scholar] [CrossRef]
- Haug, S.; Michaels, A.; Biber, P.; Ostermann, J. Plant classification system for crop/weed discrimination without segmentation. In Proceedings of the IEEE Winter Conference on Applications of Computer Vision, Steamboat Springs, CO, USA, 24–26 March 2014; pp. 1142–1149. [Google Scholar] [CrossRef]
- Hasan, A.S.M.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G. A survey of deep learning techniques for weed detection from images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
- Coleman, G.R.; Bender, A.; Hu, K.; Sharpe, S.M.; Schumann, A.W.; Wang, Z.; Bagavathiannan, M.V.; Boyd, N.S.; Walsh, M.J. Weed detection to weed recognition: Reviewing 50 years of research to identify constraints and opportunities for large-scale cropping systems. Weed Technol. 2022, 36, 741–757. [Google Scholar] [CrossRef]
- Farooq, A.; Hu, J.; Jia, X. Analysis of Spectral Bands and Spatial Resolutions for Weed Classification via Deep Convolutional Neural Network. IEEE Geosci. Remote Sens. Lett. 2019, 16, 183–187. [Google Scholar] [CrossRef]
- Zou, K.; Ge, L.; Zhang, C.; Yuan, T.; Li, W. Broccoli Seedling Segmentation Based on Support Vector Machine Combined with Color Texture Features. IEEE Access 2019, 7, 168565–168574. [Google Scholar] [CrossRef]
- Shahi, T.B.; Xu, C.Y.; Neupane, A.; Guo, W. Machine learning methods for precision agriculture with UAV imagery: A review. Electron. Res. Arch. 2022, 30, 4277–4317. [Google Scholar] [CrossRef]
- Le, V.N.T.; Apopei, B.; Alameh, K. Effective plant discrimination based on the combination of local binary pattern operators and multiclass support vector machine methods. Inf. Process. Agric. 2019, 6, 116–131. [Google Scholar]
- Wang, C.; Li, Z. Weed recognition using SVM model with fusion height and monocular image features. Trans. Chin. Soc. Agric. Eng. 2016, 32, 165–174. [Google Scholar]
- Xu, B.; Meng, R.; Chen, G.; Liang, L.; Lv, Z.; Zhou, L.; Sun, R.; Zhao, F.; Yang, W. Improved weed mapping in corn fields by combining UAV-based spectral, textural, structural, and thermal measurements. Pest Manag. Sci. 2023, 79, 2591–2602. [Google Scholar] [CrossRef]
- Brilhador, A.; Gutoski, M.; Hattori, L.T.; de Souza Inácio, A.; Lazzaretti, A.E.; Lopes, H.S. Classification of Weeds and Crops at the Pixel-Level Using Convolutional Neural Networks and Data Augmentation. In Proceedings of the 2019 IEEE Latin American Conference on Computational Intelligence (LA-CCI), Guayaquil, Ecuador, 11–15 November 2019; pp. 1–6. [Google Scholar] [CrossRef]
- Wang, A.; Xu, Y.; Wei, X.; Cui, B. Semantic segmentation of crop and weed using an encoder-decoder network and image enhancement method under uncontrolled outdoor illumination. IEEE Access 2020, 8, 81724–81734. [Google Scholar] [CrossRef]
- Potena, C.; Nardi, D.; Pretto, A. Fast and accurate crop and weed identification with summarized train sets for precision agriculture. In Proceedings of the 14th International Conference on Intelligent Autonomous Systems (IAS-14), Shanghai, China, 3–7 July 2017; Springer: Berlin/Heidelberg, Germany, 2017; pp. 105–121. [Google Scholar]
- Wu, Z.; Chen, Y.; Zhao, B.; Kang, X.; Ding, Y. Review of Weed Detection Methods Based on Computer Vision. Sensors 2021, 21, 3647. [Google Scholar] [CrossRef] [PubMed]
- Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef]
- Ong, P.; Teo, K.S.; Sia, C.K. UAV-based weed detection in Chinese cabbage using deep learning. Smart Agric. Technol. 2023, 4, 100181. [Google Scholar] [CrossRef]
- Osorio, K.; Puerto, A.; Pedraza, C.; Jamaica, D.; Rodríguez, L. A deep learning approach for weed detection in lettuce crops using multispectral images. AgriEngineering 2020, 2, 471–488. [Google Scholar] [CrossRef]
- Khan, A.; Ilyas, T.; Umraiz, M.; Mannan, Z.I.; Kim, H. CED-Net: Crops and Weeds Segmentation for Smart Farming Using a Small Cascaded Encoder-Decoder Architecture. Electronics 2020, 9, 1602. [Google Scholar] [CrossRef]
- You, J.; Liu, W.; Lee, J. A DNN-based semantic segmentation for detecting weed and crop. Comput. Electron. Agric. 2020, 178, 105750. [Google Scholar] [CrossRef]
- Ramirez, W.; Achanccaray, P.; Mendoza, L.F.; Pacheco, M.A.C. Deep Convolutional Neural Networks for Weed Detection in Agricultural Crops Using Optical Aerial Images. In Proceedings of the 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), Santiago, Chile, 22–26 March 2020; pp. 133–137. [Google Scholar] [CrossRef]
- Genze, N.; Ajekwe, R.; Güreli, Z.; Haselbeck, F.; Grieb, M.; Grimm, D.G. Deep learning-based early weed segmentation using motion blurred UAV images of sorghum fields. Comput. Electron. Agric. 2022, 202, 107388. [Google Scholar] [CrossRef]
- Fawakherji, M.; Potena, C.; Bloisi, D.D.; Imperoli, M.; Pretto, A.; Nardi, D. UAV Image Based Crop and Weed Distribution Estimation on Embedded GPU Boards. In Proceedings of the CAIP 2019 International Workshops, ViMaBi and DL-UAV, Salerno, Italy, 6 September 2019; pp. 100–108. [Google Scholar]
- Andrea, C.C.; Daniel, B.B.M.; Misael, J.B.J. Precise weed and maize classification through convolutional neuronal networks. In Proceedings of the 2017 IEEE Second Ecuador Technical Chapters Meeting (ETCM), Salinas, Ecuador, 16–20 October 2017; pp. 1–6. [Google Scholar] [CrossRef]
- Sa, I.; Chen, Z.; Popovic, M.; Khanna, R.; Liebisch, F.; Nieto, J.; Siegwart, R. weedNet: Dense Semantic Weed Classification Using Multispectral Images and MAV for Smart Farming. IEEE Robot. Autom. Lett. 2018, 3, 588–595. [Google Scholar] [CrossRef]
- Lopez-Ruiz, N.; Granados-Ortega, F.; Carvajal, M.A.; Martinez-Olmos, A. Portable multispectral imaging system based on Raspberry Pi. Sens. Rev. 2017, 37, 322–329. [Google Scholar] [CrossRef]
- Belcore, E.; Piras, M.; Pezzoli, A.; Massazza, G.; Rosso, M. Raspberry pi 3 Multispectral Low-Cost Sensor for Uav Based Remote Sensing. Case Study In South-West Niger. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-2/W13, 207–214. [Google Scholar] [CrossRef]
- Arduino vs. Raspberry Pi: Key Differences Comparison Table. Available online: https://webbylab.com/blog/arduino-vs-raspberry-pi-key-differences-comparison-table/ (accessed on 6 December 2023).
- Pagnutti, M.; Ryan, R.E.; Cazenavette, G.; Gold, M.; Harlan, R.; Leggett, E.; Pagnutti, J. Laying the foundation to use Raspberry Pi 3 V2 camera module imagery for scientific and engineering purposes. J. Electron. Imaging 2017, 26, 013014. [Google Scholar] [CrossRef]
- Dworak, V.; Selbeck, J.; Dammer, K.H.; Hoffmann, M.; Zarezadeh, A.A.; Bobda, C. Strategy for the development of a smart NDVI camera system for outdoor plant detection and agricultural embedded systems. Sensors 2013, 13, 1523–1538. [Google Scholar] [CrossRef]
- Doering, D.; Vizzotto, M.R.; Bredemeier, C.; Da Costa, C.M.; Henriques, R.; Pignaton, E.; Pereira, C.E. MDE-Based Development of a Multispectral Camera for Precision Agriculture. IFAC-PapersOnLine 2016, 49, 24–29. [Google Scholar] [CrossRef]
- Sangjan, W.; Carter, A.H.; Pumphrey, M.O.; Jitkov, V.; Sankaran, S. Development of a Raspberry Pi-Based Sensor System for Automated In-Field Monitoring to Support Crop Breeding Programs. Inventions 2021, 6, 42. [Google Scholar] [CrossRef]
- Kamath, R.; Balachandra, M.; Prabhu, S. Raspberry Pi as Visual Sensor Nodes in Precision Agriculture: A Study. IEEE Access 2019, 7, 45110–45122. [Google Scholar] [CrossRef]
- Lottes, P.; Behley, J.; Milioto, A.; Stachniss, C. Fully Convolutional Networks with Sequential Information for Robust Crop and Weed Detection in Precision Farming. IEEE Robot. Autom. Lett. 2018, 3, 2870–2877. [Google Scholar] [CrossRef]
- Venkataraju, A.; Arumugam, D.; Stepan, C.; Kiran, R.; Peters, T. A review of machine learning techniques for identifying weeds in corn. Smart Agric. Technol. 2023, 3, 100102. [Google Scholar] [CrossRef]
- Miller, I.J.; Schieber, B.; de Bey, Z.; Benner, E.; Ortiz, J.D.; Girdner, J.; Patel, P.; Coradazzi, D.G.; Henriques, J.; Forsyth, J. Analyzing crop health in vineyards through a multispectral imaging and drone system. In Proceedings of the 2020 Systems and Information Engineering Design Symposium (SIEDS), Charlottesville, VA, USA, 24 April 2020; pp. 1–5. [Google Scholar] [CrossRef]
- Hutton, J.J.; Lipa, G.; Baustian, D.; Sulik, J.; Bruce, R.W. High Accuracy Direct Georeferencing of the Altum Multi-Spectral Uav Camera and Its Application to High Throughput Plant Phenotyping. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, XLIII-B1-2020, 451–456. [Google Scholar] [CrossRef]
- Zarzar, C.M.; Dash, P.; Dyer, J.L.; Moorhead, R.; Hathcock, L. Development of a Simplified Radiometric Calibration Framework for Water-Based and Rapid Deployment Unmanned Aerial System (UAS) Operations. Drones 2020, 4, 17. [Google Scholar] [CrossRef]
- Altum-PT Camera. Available online: https://ageagle.com/drone-sensors/altum-pt-camera/ (accessed on 6 December 2023).
- Behling, H. Raspberry 4B+. Make Mag. 2019, 4, 30. [Google Scholar]
- Cluster HAT. Available online: https://clusterhat.com/ (accessed on 12 January 2024).
- Dyrmann, M.; Karstoft, H.; Midtiby, H.S. Plant species classification using deep convolutional neural network. Biosyst. Eng. 2016, 151, 72–80. [Google Scholar] [CrossRef]
- Evangelidis, G.D.; Psarakis, E.Z. Parametric Image Alignment Using Enhanced Correlation Coefficient Maximization. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 1858–1865. [Google Scholar] [CrossRef]
- Sa, I.; Popović, M.; Khanna, R.; Chen, Z.; Lottes, P.; Liebisch, F.; Nieto, J.; Stachniss, C.; Walter, A.; Siegwart, R. WeedMap: A Large-Scale Semantic Weed Mapping Framework Using Aerial Multispectral Imaging and Deep Neural Network for Precision Farming. Remote Sens. 2018, 10, 1423. [Google Scholar] [CrossRef]
- Training Module on Monitoring Vegetation from Space: Chapter III: Satellite Sensors and Vegetation. 2010. Available online: https://resources.eumetrain.org/data/3/36/print_3.htm (accessed on 18 February 2024).
- GIS Geography. Spectral Signature Cheatsheet in Remote Sensing. 2023. Available online: https://gisgeography.com/spectral-signature/ (accessed on 30 November 2023).
- Elstone, L.; How, K.Y.; Brodie, S.; Ghazali, M.Z.; Heath, W.P.; Grieve, B. High Speed Crop and Weed Identification in Lettuce Fields for Precision Weeding. Sensors 2020, 20, 455. [Google Scholar] [CrossRef]
- Louargant, M.; Jones, G.; Faroux, R.; Paoli, J.N.; Maillot, T.; Gée, C.; Villette, S. Unsupervised Classification Algorithm for Early Weed Detection in Row-Crops by Combining Spatial and Spectral Information. Remote Sens. 2018, 10, 761. [Google Scholar] [CrossRef]
- Alexandridis, T.K.; Tamouridou, A.A.; Pantazi, X.E.; Lagopodi, A.L.; Kashefi, J.; Ovakoglou, G.; Polychronos, V.; Moshou, D. Novelty Detection Classifiers in Weed Mapping: Silybum marianum Detection on UAV Multispectral Images. Sensors 2017, 17, 2007. [Google Scholar] [CrossRef]
- Zeng, Y.; Hao, D.; Huete, A.; Dechant, B.; Berry, J.; Chen, J.M.; Joiner, J.; Frankenberg, C.; Bond-Lamberty, B.; Ryu, Y.; et al. Optical vegetation indices for monitoring terrestrial ecosystems globally. Nat. Rev. Earth Environ. 2022, 3, 477–493. [Google Scholar] [CrossRef]
- Milics, G. Application of UAVs in Precision Agriculture; Springer: Berlin/Heidelberg, Germany, 2019; pp. 93–97. [Google Scholar] [CrossRef]
- Haug, S.; Ostermann, J.M. A Crop/Weed Field Image Dataset for the Evaluation of Computer Vision Based Precision Agriculture Tasks; Springer International Publishing: Cham, Switzerland, 2015; Volume 8928. [Google Scholar] [CrossRef]
- Bosilj, P.; Aptoula, E.; Duckett, T.; Cielniak, G. Transfer learning between crop types for semantic segmentation of crops versus weeds in precision agriculture. J. Field Robot. 2020, 37, 7–19. [Google Scholar] [CrossRef]
- Kriegler, F.J.; Malila, W.A.; Nalepka, R.F.; Richardson, W. Preprocessing Transformations and Their Effects on Multispectral Recognition. In Proceedings of the Remote Sensing of Environment, VI, Ann Arbor, MI, USA, 13–16 October 1969; p. 97. [Google Scholar]
- Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef]
- Jiang, Z.; Chen, Y.; Li, J.; Dou, W. The impact of spatial resolution on NDVI over heterogeneous surface. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2005), Seoul, Republic of Korea, 25–29 July 2005; Volume 2, pp. 1310–1313. [Google Scholar] [CrossRef]
- Teillet, P.; Staenz, K.; William, D. Effects of spectral, spatial, and radiometric characteristics on remote sensing vegetation indices of forested regions. Remote Sens. Environ. 1997, 61, 139–149. [Google Scholar] [CrossRef]
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional Networks for Biomedical Image Segmentation. In Proceedings of the 18th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2015), Munich, Germany, 5–9 October 2015. [Google Scholar]
- Asad, M.H.; Bais, A. Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network. Inf. Process. Agric. 2020, 7, 535–545. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 26 June–1 July 2016. [Google Scholar]
- Deng, J.; Dong, W.; Socher, R.; Li, L.J.; Li, K.; Li, F.-F. ImageNet: A large-scale hierarchical image database. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; pp. 248–255. [Google Scholar] [CrossRef]
- Bhimar. SegSalad: A Hands On Review of Semantic Segmentation Techniques for Weed/Crop Datasets. 2021. Available online: https://github.com/bhimar/SegSalad (accessed on 18 February 2024).
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2014, arXiv:1412.6980. [Google Scholar]
- Sokolova, M.; Lapalme, G. A systematic analysis of performance measures for classification tasks. Inf. Process. Manag. 2009, 45, 427–437. [Google Scholar] [CrossRef]
- Zou, K.; Chen, X.; Wang, Y.; Zhang, C.; Zhang, F. A modified U-Net with a specific data argumentation method for semantic segmentation of weed images in the field. Comput. Electron. Agric. 2021, 187, 106242. [Google Scholar] [CrossRef]
 | MicaSense Altum | Raspberry Pi 4 V2 Sony IMX219 RGB + NoIR |
---|---|---|
Resolution | 2064 × 1544 px | 2592 × 1944 px |
Field of view | 48° × 36.8° | 53.50° × 62.2° |
GSD (at 12 m altitude) | ∼0.53 cm/px | ∼0.55 cm/px |
Center wavelength | Blue: 465 nm, Green: 560 nm, Red: 668 nm, NIR: 842 nm | Blue: 450 nm, Green: 520 nm, Red: 600 nm, NIR: 750 nm |
Price | ∼15,995.00 € | ∼500.00 € |
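The ground sample distance values in the table follow from flight altitude, pixel pitch, and focal length. The sketch below shows the standard GSD formula; the focal length (3.04 mm) and pixel pitch (1.12 µm) are the commonly published Sony IMX219 figures, assumed here for illustration rather than taken from the paper.

```python
def gsd_cm_per_px(altitude_m: float, pixel_size_um: float, focal_length_mm: float) -> float:
    """Ground sample distance in cm/px:
    GSD = flight altitude x pixel pitch / focal length."""
    return altitude_m * (pixel_size_um * 1e-6) / (focal_length_mm * 1e-3) * 100.0

# At the study's 12 m flight altitude, the assumed IMX219 optics give a
# value in the same ~0.5 cm/px range as the table reports.
print(round(gsd_cm_per_px(12.0, 1.12, 3.04), 2))
```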
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Seiche, A.T.; Wittstruck, L.; Jarmer, T. Weed Detection from Unmanned Aerial Vehicle Imagery Using Deep Learning—A Comparison between High-End and Low-Cost Multispectral Sensors. Sensors 2024, 24, 1544. https://doi.org/10.3390/s24051544