Measuring Comfort Behaviours in Laying Hens Using Deep-Learning Tools
Simple Summary
Abstract
1. Introduction
2. Materials and Methods
2.1. Animals and Housing
2.2. Video Recordings and Test Sets
2.3. Set up of the Object-Detection Algorithm
2.4. Training and Validation
3. Results
Validation of the External Dataset
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Halachmi, I.; Guarino, M.; Bewley, J.; Pastell, M. Smart Animal Agriculture: Application of Real-Time Sensors to Improve Animal Well-Being and Production. Annu. Rev. Anim. Biosci. 2019, 7, 403–425.
- Li, N.; Ren, Z.; Li, D.; Zeng, L. Review: Automated techniques for monitoring the behaviour and welfare of broilers and laying hens: Towards the goal of precision livestock farming. Animal 2020, 14, 617–625.
- Berckmans, D. General introduction to precision livestock farming. Anim. Front. 2017, 7, 6–11.
- Neethirajan, S. The role of sensors, big data and machine learning in modern animal farming. Sens. Bio-Sens. Res. 2020, 29, 100367.
- Neethirajan, S. Transforming the adaptation physiology of farm animals through sensors. Animals 2020, 10, 1512.
- Zhuang, X.; Bi, M.; Guo, J.; Wu, S.; Zhang, T. Development of an early warning algorithm to detect sick broilers. Comput. Electron. Agric. 2018, 144, 102–113.
- Wang, J.; Shen, M.; Liu, L.; Xu, Y.; Okinda, C. Recognition and Classification of Broiler Droppings Based on Deep Convolutional Neural Network. J. Sens. 2019, 2019, 823515.
- Fernández, A.P.; Norton, T.; Tullo, E.; van Hertem, T.; Youssef, A.; Exadaktylos, V.; Vranken, E.; Guarino, M.; Berckmans, D. Real-time monitoring of broiler flock's welfare status using camera-based technology. Biosyst. Eng. 2018, 173, 103–114.
- Kashiha, M.; Pluk, A.; Bahr, C.; Vranken, E.; Berckmans, D. Development of an early warning system for a broiler house using computer vision. Biosyst. Eng. 2013, 116, 36–45.
- Rowe, E.; Dawkins, M.S.; Gebhardt-Henrich, S.G. A Systematic Review of Precision Livestock Farming in the Poultry Sector: Is Technology Focussed on Improving Bird Welfare? Animals 2019, 9, 614.
- Wurtz, K.; Camerlink, I.; D'Eath, R.B.; Fernández, A.P.; Norton, T.; Steibel, J.; Siegford, J. Recording behaviour of indoor-housed farm animals automatically using machine vision technology: A systematic review. PLoS ONE 2019, 14, 12.
- Zaninelli, M.; Redaelli, V.; Luzi, F.; Mitchell, M.; Bontempo, V.; Cattaneo, D.; Dell'Orto, V.; Savoini, G. Development of a machine vision method for the monitoring of laying hens and detection of multiple nest occupations. Sensors 2018, 18, 132.
- European Commission 2022. Eggs Market Situation Dashboard. Available online: https://ec.europa.eu/info/sites/default/files/food-farming-fisheries/farming/documents/eggs-dashboard_en.pdf (accessed on 1 August 2022).
- Hartcher, K.M.; Jones, B. The welfare of layer hens in cage and cage-free housing systems. Worlds Poult. Sci. J. 2017, 73, 767–782.
- Gautron, J.; Dombre, C.; Nau, F.; Feidt, C.; Guillier, L. Review: Production factors affecting the quality of chicken table eggs and egg products in Europe. Animal 2022, 16, 100425.
- Campbell, D.L.M.; Makagon, M.M.; Swanson, J.C.; Siegford, J.M. Litter use by laying hens in a commercial aviary: Dust bathing and piling. Poult. Sci. 2016, 95, 164–175.
- Winter, J.; Toscano, M.J.; Stratmann, A. Piling behaviour in Swiss layer flocks: Description and related factors. Appl. Anim. Behav. Sci. 2021, 236, 105272.
- Ferrante, V.; Lolli, S. Specie avicole. In Etologia Applicata e Benessere Animale; Vol. 2—Parte Speciale; Carenzi, C., Panzera, M., Eds.; Point Veterinaire Italie Srl: Milan, Italy, 2009; pp. 89–106.
- Grebey, T.G.; Ali, A.B.A.; Swanson, J.C.; Widowski, T.M.; Siegford, J.M. Dust bathing in laying hens: Strain, proximity to, and number of conspecifics matter. Poult. Sci. 2020, 99, 4103–4112.
- Riddle, E.R.; Ali, A.B.A.; Campbell, D.L.M.; Siegford, J.M. Space use by 4 strains of laying hens to perch, wing flap, dust bathe, stand and lie down. PLoS ONE 2018, 13, e0190532.
- Kamilaris, A.; Prenafeta-Boldú, F.X. Deep Learning in Agriculture: A Survey. Comput. Electron. Agric. 2018, 147, 70–90.
- Liakos, K.G.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674.
- Zhao, Z.Q.; Zheng, P.; Xu, S.T.; Wu, X. Object Detection with Deep Learning: A Review. IEEE Trans. Neural. Netw. Learn. Syst. 2019, 30, 3212–3232.
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
- Redmon, J.; Farhadi, A. YOLO9000: Better, Faster, Stronger. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 7263–7271.
- Jiang, P.; Ergu, D.; Liu, F.; Cai, Y.; Ma, B. A Review of Yolo algorithm developments. Procedia Comput. Sci. 2022, 199, 1066–1073.
- Sozzi, M.; Cantalamessa, S.; Cogato, A.; Kayad, A.; Marinello, F. Automatic bunch detection in white grape varieties using YOLOv3, YOLOv4, and YOLOv5 deep learning algorithms. Agronomy 2022, 12, 319.
- Pillan, G.; Trocino, A.; Bordignon, F.; Pascual, A.; Birolo, M.; Concollato, A.; Pinedo Gil, J.; Xiccato, G. Early training of hens: Effects on the animal distribution in an aviary system. Acta Fytotech. Zootech. 2020, 23, 269–275.
- Bresilla, K.; Perulli, G.D.; Boini, A.; Morandi, B.; Corelli Grappadelli, L.; Manfrini, L. Single-Shot Convolution Neural Networks for Real-Time Fruit Detection Within the Tree. Front. Plant Sci. 2019, 10, 611.
- Yi, Z.; Yongliang, S.; Jun, Z. An improved tiny-yolov3 pedestrian detection algorithm. Optik 2019, 183, 17–23.
- Yang, G.; Feng, W.; Jin, J.; Lei, Q.; Li, X.; Gui, G.; Wang, W. Face mask recognition system with YOLOV5 based on image recognition. In Proceedings of the IEEE 6th International Conference on Computer and Communications (ICCC), Chengdu, China, 11–14 December 2020; pp. 1398–1404.
- Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv 2020, arXiv:2004.10934.
- Redmon, J. Darknet: Open Source Neural Networks in C. 2013–2016. Available online: https://pjreddie.com/darknet (accessed on 1 September 2022).
- Wang, J.; Wang, N.; Li, L.; Ren, Z. Real-time behavior detection and judgment of egg breeders based on YOLO v3. Neural. Comput. Appl. 2020, 32, 5471–5481.
- Kwon, Y.; Choi, W.; Marrable, D.; Abdulatipov, R.; Loïck, J. Yolo_label 2020. Available online: https://github.com/developer0hye/Yolo_Label (accessed on 1 September 2020).
- Bochkovskiy, A. YOLOv4. Available online: https://github.com/AlexeyAB/darknet/releases (accessed on 1 September 2022).
- Van Dyk, D.A.; Meng, X.L. The art of data augmentation. J. Comput. Graph. Stat. 2001, 10, 1–50.
- Loshchilov, I.; Hutter, F. SGDR: Stochastic gradient descent with warm restarts. In Proceedings of the International Conference on Learning Representations (ICLR), Toulon, France, 24–26 April 2017. arXiv 2017, arXiv:1608.03983.
- Dice, L.R. Measures of the Amount of Ecologic Association between Species. Ecology 1945, 26, 297–302.
- Sørensen, T.J. A Method of Establishing Groups of Equal Amplitudes in Plant Sociology Based on Similarity of Species Content and Its Application to Analyses of the Vegetation on Danish Commons. K. Dan. Vidensk. Selsk. 1948, 5, 1–34. Available online: https://www.royalacademy.dk/Publications/High/295_S%C3%B8rensen,%20Thorvald.pdf (accessed on 1 September 2022).
- Jaccard, P. The distribution of the flora in the alpine zone. New Phytol. 1912, 11, 37–50.
- Pereira, D.F.; Miyamoto, B.C.; Maia, G.D.; Sales, G.T.; Magalhães, M.M.; Gates, R.S. Machine vision to identify broiler breeder behavior. Comput. Electron. Agric. 2013, 99, 194–199.
- Pu, H.; Lian, J.; Fan, M. Automatic Recognition of Flock Behavior of Chickens with Convolutional Neural Network and Kinect Sensor. Int. J. Pattern. Recognit. Artif. Intell. 2018, 32, 7.
- Wang, C.Y.; Liao, H.Y.M.; Wu, Y.H.; Chen, P.Y.; Hsieh, J.W.; Yeh, I.H. CSPNet: A new backbone that can enhance learning capability of CNN. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 16–18 June 2020; IEEE Computer Society: Washington, DC, USA, 2020; pp. 1571–1580.
- Siriani, A.L.R.; Kodaira, V.; Mehdizadeh, S.A.; de Alencar Nääs, I.; de Moura, D.J.; Pereira, D.F. Detection and tracking of chickens in low-light images using YOLO network and Kalman filter. Neural. Comput. Appl. 2022, 34, 21987–21997.
- Mbelwa, H.; Machuve, D.; Mbelwa, J. Deep Convolutional Neural Network for Chicken Diseases Detection. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 759–765.
- Chang, K.R.; Shih, F.P.; Hsieh, M.K.; Hsieh, K.W.; Kuo, Y.F. Analyzing chicken activity level under heat stress condition using deep convolutional neural networks. In Proceedings of the ASABE Annual International Meeting, Houston, TX, USA, 17–20 July 2022.
- Kayad, A.; Sozzi, M.; Paraforos, D.S.; Rodrigues, F.A., Jr.; Cohen, Y.; Fountas, S.; Francisco, M.J.; Pezzuolo, A.; Grigolato, S.; Marinello, F. How many gigabytes per hectare are available in the digital agriculture era? A digitization footprint estimation. Comput. Electron. Agric. 2022, 198, 107080.
- Yang, J.; Zhang, Z.; Li, Y. Agricultural Few-Shot Selection by Model Confidences for Multimedia Internet of Things Acquisition Dataset. In Proceedings of the 2022 IEEE International Conferences on Internet of Things (iThings), IEEE Green Computing & Communications (GreenCom), IEEE Cyber, Physical & Social Computing (CPSCom), IEEE Smart Data (SmartData), and IEEE Congress on Cybermatics (Cybermatics), Espoo, Finland, 22–25 August 2022; IEEE Computer Society: Washington, DC, USA, 2022; pp. 488–494.
| Model | Training Time | Final mAP@50 | Best mAP@50 |
|---|---|---|---|
| YOLOv4-tiny | 4.26 h | 90.3% | 91.7% |
| YOLOv4 | 23.2 h | 87.8% | 90.0% |
| Model | IoU | Precision | Recall | F1-score | mAP (AP per class) | FPS |
|---|---|---|---|---|---|---|
| YOLOv4-tiny | 78.3% | 0.94 | 0.93 | 0.92 | 61.4% | 208.5 |
| Hens on floor | | | | | 94.5% | |
| Dust-bathing hens | | | | | 28.2% | |
| YOLOv4 | 77.1% | 0.90 | 0.93 | 0.92 | 62.9% | 31.35 |
| Hens on floor | | | | | 94.1% | |
| Dust-bathing hens | | | | | 31.6% | |
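As a hedged sanity check (an illustrative sketch, not the authors' evaluation code), the per-model mAP values in the validation table are consistent with the arithmetic mean of the two per-class average precisions (AP), and the F1-scores with the harmonic mean of the reported precision and recall, up to rounding:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

def mean_ap(ap_per_class: list[float]) -> float:
    """Mean average precision (mAP): arithmetic mean of per-class AP."""
    return sum(ap_per_class) / len(ap_per_class)

# YOLOv4-tiny: per-class AP 94.5% (hens on floor) and 28.2% (dust bathing)
map_tiny = mean_ap([0.945, 0.282])   # ~0.614, i.e. the reported 61.4% mAP

# YOLOv4: per-class AP 94.1% and 31.6%
map_v4 = mean_ap([0.941, 0.316])     # ~0.629, i.e. the reported 62.9% mAP

# F1 from YOLOv4's reported precision/recall; small rounding differences
# against the table's 0.92 are expected
f1_v4 = f1_score(0.90, 0.93)

print(map_tiny, map_v4, f1_v4)
```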
Disclaimer/Publisher's Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Sozzi, M.; Pillan, G.; Ciarelli, C.; Marinello, F.; Pirrone, F.; Bordignon, F.; Bordignon, A.; Xiccato, G.; Trocino, A. Measuring Comfort Behaviours in Laying Hens Using Deep-Learning Tools. Animals 2023, 13, 33. https://doi.org/10.3390/ani13010033