Fish Detection and Classification for Automatic Sorting System with an Optimized YOLO Algorithm
Abstract
Featured Application
1. Introduction
- We compiled our own dataset of eight cultivated fish species. The dataset contains not only static images but also videos of fish moving randomly on a conveyor belt at two different speeds (low and high).
- This work employed YOLOv4, a widely used recognition algorithm, optimized here with a unique labeling technique.
- A trial study was also conducted to compare the effectiveness of several schemes, covering the choice of training data, the YOLOv4 version, and the labeling technique.
2. Materials and Methods
2.1. Image Dataset and the Experimental Set-Up
- Yeesok (Labeo rohita),
- Nuanchan (Cirrhinus microlepis),
- Tapian (Barbonymus gonionotus),
- Nai (Cyprinus carpio),
- Jeen Ban (Hypophthalmichthys molitrix),
- Jeen To (Hypophthalmichthys nobilis),
- Nin (Oreochromis niloticus), and
- Sawai (Pangasianodon hypophthalmus).
2.2. Training Images and Augmentation
2.3. Labeling Techniques
2.4. YOLOv4, YOLOv4-Tiny, and the Training Process
2.5. Validation Metrics
3. Experimental Results and Discussion
3.1. Using Static Pictures for Training Data
3.2. With the Lite Version (YOLOv4-Tiny)
3.3. With YOLOv4 Using Conventional and Landmarking Labeling Techniques
3.4. With the Proposed Approach
- Using pictures extracted from video as training data produced far more effective results than static pictures.
- YOLOv4 provided better accuracy than its lite version (YOLOv4-Tiny).
- Using the conventional labeling technique with YOLOv4 gave fairly accurate detection results, even for similar fish classes, but many fish of the Sawai class went undetected (the conventional label format is sketched after this list).
- Combining YOLOv4 with the landmarking labeling technique also provided fairly accurate detection, but it generated many double detections for similar fish classes, mostly Nuanchan and Tapian, as well as Jeen Ban and Jeen To.
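For reference, conventional labeling for YOLOv4 (Darknet format) stores one text line per bounding box: a class index followed by the box center and size, normalized to the image dimensions. The snippet below only illustrates that format; the class ordering, coordinates, and file name are assumptions, not values from the paper's dataset.

```python
# Illustrative YOLO/Darknet label file for conventional bounding-box labeling.
# Format per object: "<class_id> <x_center> <y_center> <width> <height>",
# with all coordinates normalized to [0, 1]. The values below are made up.
label_line = "3 0.512 0.488 0.430 0.210\n"  # class index 3 (class ordering is an assumption)
with open("frame_0001.txt", "w") as f:      # label file shares its base name with the image
    f.write(label_line)
```

The landmarking technique is an alternative way of placing these labels; its details are given in Section 2.3.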
3.5. Comparison with the Recent State of the Art
3.6. Limitations and Future Developments
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Li, D.; Wang, Q.; Li, X.; Niu, M.; Wang, H.; Liu, C. Recent advances of machine vision technology in fish classification. ICES J. Mar. Sci. 2022, 79, 263–284.
- Alsmadi, M.K.; Almarashdeh, I. A survey on fish classification techniques. J. King Saud Univ.-Comput. Inf. Sci. 2020, 34, 1625–1638.
- Saleh, A.; Sheaves, M.; Azghadi, M.R. Computer vision and deep learning for fish classification in underwater habitats: A survey. Fish Fish. 2022, 23, 977–999.
- Zhao, S.; Zhang, S.; Liu, J.; Wang, H.; Zhu, J.; Li, D.; Zhao, R. Application of machine learning in intelligent fish aquaculture: A review. Aquaculture 2021, 540, 736724.
- Li, D.; Du, L. Recent advances of deep learning algorithms for aquacultural machine vision systems with emphasis on fish. Artif. Intell. Rev. 2022, 55, 4077–4116.
- Yang, X.; Zhang, S.; Liu, J.; Gao, Q.; Dong, S.; Zhou, C. Deep learning for smart fish farming: Applications, opportunities and challenges. Rev. Aquac. 2021, 13, 66–90.
- Villon, S.; Mouillot, D.; Chaumont, M.; Darling, E.S.; Subsol, G.; Claverie, T.; Villéger, S. A deep learning method for accurate and fast identification of coral reef fishes in underwater images. Ecol. Inform. 2018, 48, 238–244.
- Labao, A.B.; Naval, P.C. Cascaded deep network systems with linked ensemble components for underwater fish detection in the wild. Ecol. Inform. 2019, 52, 103–121.
- Jalal, A.; Salman, A.; Mian, A.; Shortis, M.; Shafait, F. Fish detection and species classification in underwater environments using deep learning with temporal information. Ecol. Inform. 2020, 57, 101088.
- Villon, S.; Iovan, C.; Mangeas, M.; Claverie, T.; Mouillot, D.; Villéger, S.; Vigliola, L. Automatic underwater fish species classification with limited data using few-shot learning. Ecol. Inform. 2021, 63, 101320.
- Salman, A.; Maqbool, S.; Khan, A.H.; Jalal, A.; Shafait, F. Real-time fish detection in complex backgrounds using probabilistic background modelling. Ecol. Inform. 2019, 51, 44–51.
- Victor, N.; Alazab, M.; Bhattacharya, S.; Magnusson, S.; Maddikunta, P.K.R.; Ramana, K.; Gadekallu, T.R. Federated learning for IoUT: Concepts, applications, challenges and opportunities. arXiv 2022, arXiv:2207.13976.
- Bhattacharya, S.; Victor, N.; Chengoden, R.; Ramalingam, M.; Selvi, G.C.; Maddikunta, P.K.R.; Donta, P.K.; Dustdar, S.; Jhaveri, R.H.; Gadekallu, T.R. Blockchain for internet of underwater things: State-of-the-art, applications, challenges, and future directions. Sustainability 2022, 14, 15659.
- Abinaya, N.S.; Susan, D.; Kumar, R. Naive Bayesian fusion based deep learning networks for multisegmented classification of fishes in aquaculture industries. Ecol. Inform. 2021, 61, 101248.
- Ahmed, M.S.; Aurpa, T.T.; Azad, M.A.K. Fish disease detection using image based machine learning technique in aquaculture. J. King Saud Univ.-Comput. Inf. Sci. 2021, 34, 5170–5182.
- Mohamed, H.E.-D.; Fadl, A.; Anas, O.; Wageeh, Y.; ElMasry, N.; Nabil, A.; Atia, A. MSR-YOLO: Method to enhance fish detection and tracking in fish farms. Procedia Comput. Sci. 2020, 170, 539–546.
- Xu, W.; Zhu, Z.; Ge, F.; Han, Z.; Li, J. Analysis of behavior trajectory based on deep learning in ammonia environment for fish. Sensors 2020, 20, 4425.
- Waleed, A.; Medhat, H.; Esmail, M.; Osama, K.; Samy, R.; Ghanim, T.M. Automatic recognition of fish diseases in fish farms. In Proceedings of the 2019 14th International Conference on Computer Engineering and Systems (ICCES), Cairo, Egypt, 17 December 2019; IEEE: Manhattan, NY, USA, 2020.
- Ubina, N.; Cheng, S.-C.; Chang, C.-C.; Chen, H.-Y. Evaluating fish feeding intensity in aquaculture with convolutional neural networks. Aquac. Eng. 2021, 94, 102178.
- Bader, F.; Rahimifard, S. Challenges for industrial robot applications in food manufacturing. In Proceedings of the 2nd International Symposium on Computer Science and Intelligent Control, Stockholm, Sweden, 21–23 September 2018; Association for Computing Machinery (ACM): New York, NY, USA, 2018.
- Goncharuk, A. Food business and food security challenges in research. J. Appl. Manag. Invest. 2015, 4, 223–230.
- Vo, T.T.E.; Ko, H.; Huh, J.-H.; Kim, Y. Overview of smart aquaculture system: Focusing on applications of machine learning and computer vision. Electronics 2021, 10, 2882.
- Gladju, J.; Kamalam, B.S.; Kanagaraj, A. Applications of data mining and machine learning framework in aquaculture and fisheries: A review. Smart Agric. Technol. 2022, 2, 100061.
- Wu, Y.; Zhuang, R.; Cui, Z. Automatic sorting system of large yellow croaker based on machine vision. In Proceedings of the 2019 International Conference on High Performance Big Data and Intelligent Systems (HPBD&IS), Shenzhen, China, 9–11 May 2019; IEEE: Manhattan, NY, USA, 2019.
- Tappi, S.; Rocculi, P.; Ciampa, A.; Romani, S.; Balestra, F.; Capozzi, F.; Dalla Rosa, M. Computer vision system (CVS): A powerful non-destructive technique for the assessment of red mullet (Mullus barbatus) freshness. Eur. Food Res. Technol. 2017, 243, 2225–2233.
- Li, C.; Zhen, T.; Li, Z. Image classification of pests with residual neural network based on transfer learning. Appl. Sci. 2022, 12, 4356.
- Li, W.; Zhang, L.; Wu, C.; Cui, Z.; Niu, C. A new lightweight deep neural network for surface scratch detection. Int. J. Adv. Manuf. Technol. 2022, 123, 1999–2015.
- Fisher, R.B.; Chen-Burger, Y.-H.; Giordano, D.; Hardman, L.; Lin, F.-P. Fish4Knowledge: Collecting and Analyzing Massive Coral Reef Fish Video Data; Springer: Berlin/Heidelberg, Germany, 2016.
- Siddiqui, S.A.; Salman, A.; Malik, M.I.; Shafait, F.; Mian, A.; Shortis, M.R.; Harvey, E.S. Automatic fish species classification in underwater videos: Exploiting pre-trained deep neural network models to compensate for limited labelled data. ICES J. Mar. Sci. 2018, 75, 374–389.
- Salman, A.; Siddiqui, S.A.; Shafait, F.; Mian, A.; Shortis, M.R.; Khurshid, K.; Ulges, A.; Schwanecke, U. Automatic fish detection in underwater videos by a deep neural network-based hybrid motion learning system. ICES J. Mar. Sci. 2020, 77, 1295–1307.
- Shah, S.Z.H.; Rauf, H.T.; IkramUllah, M.; Khalid, M.S.; Farooq, M.; Fatima, M.; Bukhari, S.A.C. Fish-pak: Fish species dataset from Pakistan for visual features based classification. Data Brief 2019, 27, 104565.
- Lillywhite, K.D.; Lee, D.J. Robotic Vision Lab, Brigham Young University, Fish Dataset. 2013. Available online: http://roboticvision.groups.et.byu.net/Machine_Vision/BYUFish/BYU_Fish.html (accessed on 19 September 2021).
- Shorten, C.; Khoshgoftaar, T.M. A survey on image data augmentation for deep learning. J. Big Data 2019, 6, 60.
- Perez, L.; Wang, J. The effectiveness of data augmentation in image classification using deep learning. arXiv 2017, arXiv:1712.04621.
- Liu, Z.; Jia, X.; Xu, X. Study of shrimp recognition methods using smart networks. Comput. Electron. Agric. 2019, 165, 104926.
- Bouwmans, T.; Javed, S.; Sultana, M.; Jung, S.K. Deep neural network concepts for background subtraction: A systematic review and comparative evaluation. Neural Netw. 2019, 117, 8–66.
- Kuswantori, A.; Suesut, T.; Tangsrirat, W.; Nunak, N. Development of object detection and classification with YOLOv4 for similar and structural deformed fish. EUREKA Phys. Eng. 2022, 2, 154–165.
- Kuswantori, A.; Suesut, T.; Tangsrirat, W.; Satthamsakul, S. Fish recognition optimization in various backgrounds using landmarking technique and YOLOv4. In Proceedings of the 37th International Technical Conference on Circuits/Systems, Computers, and Communications (ITC-CSCC), Phuket, Thailand, 5–8 July 2022; IEEE: Manhattan, NY, USA, 2022.
- Jiang, P.; Ergu, D.; Liu, F.; Cai, Y.; Ma, B. A review of YOLO algorithm developments. Procedia Comput. Sci. 2022, 199, 1066–1073.
- Diwan, T.; Anirudh, G.; Tembhurne, J.V. Object detection using YOLO: Challenges, architectural successors, datasets and applications. Multimed. Tools Appl. 2022, 82, 9243–9275.
- Chandana, R.; Ramachandra, A. Real time object detection system with YOLO and CNN models: A review. arXiv 2022, arXiv:2208.00773.
- Zhao, Z.-Q.; Zheng, P.; Xu, S.-T.; Wu, X. Object detection with deep learning: A review. IEEE Trans. Neural Netw. Learn. Syst. 2019, 30, 3212–3232.
- Shetty, A.K.; Saha, I.; Sanghvi, R.M.; Save, S.A.; Patel, Y.J. A review: Object detection models. In Proceedings of the 2021 6th International Conference for Convergence in Technology (I2CT), Pune, India, 2–4 April 2021; IEEE: Manhattan, NY, USA, 2021.
- Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. YOLOv4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934.
- Jiang, Z.; Zhao, L.; Li, S.; Jia, Y. Real-time object detection method based on improved YOLOv4-tiny. arXiv 2020, arXiv:2011.04244.
- Kittichai, V.; Pengsakul, T.; Chumchuen, K.; Samung, Y.; Sriwichai, P.; Phatthamolrat, N.; Tongloy, T.; Jaksukam, K.; Chuwongin, S.; Boonsang, S. Deep learning approaches for challenging species and gender identification of mosquito vectors. Sci. Rep. 2021, 11, 4838.
- Kittichai, V.; Kaewthamasorn, M.; Thanee, S.; Jomtarak, R.; Klanboot, K.; Naing, K.M.; Tongloy, T.; Chuwongin, S.; Boonsang, S. Classification for avian malaria parasite Plasmodium gallinaceum blood stages by using deep convolutional neural networks. Sci. Rep. 2021, 11, 16919.
Fish Class | Yeesok | Nuanchan | Tapian | Nai | Jeen Ban | Jeen To | Nin | Sawai |
---|---|---|---|---|---|---|---|---|
No. of images | 20 | 20 | 20 | 20 | 20 | 20 | 20 | 20 |
Total | 160 | | | | | | | |
Average per class | 20 | | | | | | | |
No. | Name | Conveyor Speed (m/h) | Duration | Note |
---|---|---|---|---|
1 | low-speed video | 116.65 | 17 min 13 s | later extracted for training data (scheme 2) |
2 | high-speed video 1 | 505.08 | 8 min 24 s | for testing data |
3 | high-speed video 2 | 505.08 | 17 min 13 s | for testing data |
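The low-speed video above was later sampled into still pictures used as training data (Scheme-2 in the next table). A minimal sketch of such a frame-extraction step is shown below, assuming OpenCV; the file name and the fixed sampling interval are illustrative assumptions, not values reported in the paper.

```python
import os
import cv2

# Minimal sketch: sample frames from the low-speed conveyor video at a fixed interval.
# "low_speed_conveyor.mp4" and the 2-second interval are assumptions for illustration.
VIDEO_PATH = "low_speed_conveyor.mp4"
OUT_DIR = "extracted_frames"
INTERVAL_S = 2.0

os.makedirs(OUT_DIR, exist_ok=True)
cap = cv2.VideoCapture(VIDEO_PATH)
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0      # fall back if FPS metadata is missing
step = max(1, int(round(fps * INTERVAL_S)))  # number of frames between saved pictures

frame_idx = saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % step == 0:
        cv2.imwrite(os.path.join(OUT_DIR, f"frame_{saved:04d}.jpg"), frame)
        saved += 1
    frame_idx += 1
cap.release()
print(f"Saved {saved} frames to {OUT_DIR}")
```

Sampling at an interval (rather than keeping every frame) avoids filling the training set with near-duplicate consecutive pictures; per the tables, the extraction used in the paper produced 188 original pictures for Scheme-2.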
Scheme | Original Images (8 Classes) | Average per Class | Augmented Images: Flip Vertically | Augmented Images: Flip Horizontally | Augmented Images: Total | Total Images for Training | Average per Class (for Training) |
---|---|---|---|---|---|---|---|
Scheme-1 (from static pictures) | 160 | 20 | 160 | 160 | 320 | 480 | 60 |
Scheme-2 (from extracted pictures) | 188 | 24 | 188 | 188 | 376 | 564 | 71 |
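The augmentation summarized above adds one vertically flipped and one horizontally flipped copy of each original image, which triples the training set (e.g., 188 originals + 376 flipped copies = 564 images in Scheme-2). A minimal OpenCV sketch of this flip augmentation follows; the directory layout and file naming are assumptions for illustration.

```python
import os
import glob
import cv2

# Minimal sketch of the flip augmentation: every original image contributes itself,
# a vertically flipped copy, and a horizontally flipped copy to the training set.
# "original_images" and "training_images" are assumed directory names.
SRC_DIR = "original_images"
DST_DIR = "training_images"
os.makedirs(DST_DIR, exist_ok=True)

for path in sorted(glob.glob(os.path.join(SRC_DIR, "*.jpg"))):
    name = os.path.splitext(os.path.basename(path))[0]
    img = cv2.imread(path)
    if img is None:
        continue
    cv2.imwrite(os.path.join(DST_DIR, f"{name}.jpg"), img)                     # original
    cv2.imwrite(os.path.join(DST_DIR, f"{name}_flipv.jpg"), cv2.flip(img, 0))  # vertical flip
    cv2.imwrite(os.path.join(DST_DIR, f"{name}_fliph.jpg"), cv2.flip(img, 1))  # horizontal flip
```

Note that if bounding-box labels have already been drawn on the originals, the corresponding label coordinates would also need to be flipped; this sketch covers the images only.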
References | Fish Object | Important Findings Related to This Work |
---|---|---|
[37] | (image) | Applying YOLOv4 with conventional labeling resulted in 14.29% higher accuracy than using landmarking for 6 classes. |
[38] | (image) | Combining YOLOv4 with the landmarking labeling technique resulted in 4.94% higher accuracy than using conventional labeling. |
Approaches | Video Test-1: Correct Detections | Video Test-1: False (Double/Wrong) Detections | Video Test-1: Not Detected | Video Test-1: Total Detections | Video Test-1: Accuracy (%) | Video Test-2: Correct Detections | Video Test-2: False (Double/Wrong) Detections | Video Test-2: Not Detected | Video Test-2: Total Detections | Video Test-2: Accuracy (%) | Average (Final Accuracy) (%) |
---|---|---|---|---|---|---|---|---|---|---|---|
YOLOv4 with static pictures | 2 | 0 | 69 | 71 | 2.82 | 11 | 0 | 160 | 171 | 6.43 | 4.62 |
YOLOv4-Tiny | 67 | 4 | 0 | 71 | 94.37 | 154 | 17 | 0 | 171 | 90.06 | 92.21 |
YOLOv4 | 69 | 1 | 1 | 71 | 97.18 | 156 | 7 | 8 | 171 | 91.23 | 94.21 |
YOLOv4 + LM | 68 | 6 | 0 | 74 | 91.89 | 159 | 11 | 1 | 171 | 92.98 | 92.44 |
Proposed method | 72 | 0 | 1 | 73 | 98.63 | 167 | 4 | 0 | 171 | 97.66 | 98.15 |
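The accuracy values in the table above are simply the correct detections divided by the total detections in each test video, with the final accuracy being the mean of the two videos. A short check of this arithmetic against the YOLOv4 row:

```python
# Reproduces the accuracy arithmetic of the table above, using the YOLOv4 row:
# accuracy = correct detections / total detections, averaged over the two test videos.
def accuracy_pct(correct, total):
    return 100.0 * correct / total

acc_video1 = accuracy_pct(69, 71)    # 97.18% on video test 1
acc_video2 = accuracy_pct(156, 171)  # 91.23% on video test 2
final_accuracy = (acc_video1 + acc_video2) / 2
print(f"{acc_video1:.2f} {acc_video2:.2f} {final_accuracy:.2f}")  # 97.18 91.23 94.21
```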
Approaches (Video Tests 1 and 2 Combined) | Correct Detections (TP) | Wrong/Double Detections (FP) | Not Detected (FN) | Total Detections | Precision (%) | Sensitivity (%) | F-Score (%) |
---|---|---|---|---|---|---|---|
YOLOv4 with static pictures | 13 | 0 | 229 | 242 | 100.00 | 5.37 | 10.20 |
YOLOv4-Tiny | 221 | 21 | 0 | 242 | 91.32 | 100.00 | 95.46 |
YOLOv4 | 225 | 8 | 9 | 242 | 96.57 | 96.15 | 96.36 |
YOLOv4 + LM | 227 | 17 | 1 | 245 | 93.03 | 99.56 | 96.19 |
Proposed method | 239 | 4 | 1 | 244 | 98.35 | 99.58 | 98.96 |
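Precision, sensitivity (recall), and F-score in the table above follow the standard definitions applied to the TP/FP/FN counts pooled over both test videos. The short sketch below reproduces the YOLOv4 row:

```python
# Standard detection metrics used in the table above, checked against the YOLOv4 row:
# precision = TP / (TP + FP), sensitivity = TP / (TP + FN), F-score = harmonic mean.
def precision_pct(tp, fp):
    return 100.0 * tp / (tp + fp)

def sensitivity_pct(tp, fn):
    return 100.0 * tp / (tp + fn)

def f_score_pct(p, s):
    return 2 * p * s / (p + s)

tp, fp, fn = 225, 8, 9     # YOLOv4: correct, wrong/double, and missed detections
p = precision_pct(tp, fp)  # 96.57
s = sensitivity_pct(tp, fn)  # 96.15
f = f_score_pct(p, s)      # 96.36
print(f"{p:.2f} {s:.2f} {f:.2f}")
```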
Related Work | Fish Dataset | Fish Type | No. of Fish Classes | Method/Algorithm | Accuracy (%) | Precision (%) | Sensitivity (%) | F-Score (%) | Advantage | Disadvantage |
---|---|---|---|---|---|---|---|---|---|---|
[7] | own dataset | deep ocean fish | 20 | CNN | 94.90 | - | - | - | (1) The best accuracy is achieved when recognizing nine species of fish. (2) Recognition results remain accurate even when backgrounds vary or the fish only partially appear. | (1) Accuracy can still be improved. (2) High accuracy has not yet been achieved for all classes. |
[8] | own dataset | deep ocean fish | 1 | Multi-cascade object detection network, 7 CNNs, 2 RPNs, trained LSTMs | - | 67.28 | 68.25 | 67.76 | Promising for detecting and counting fish under various benthic backgrounds and illumination conditions. | Only detects fish; does not classify them. |
[9] | Fish4-Knowledge & UWA | deep ocean fish | 17 | Optical flow, GMM, ResNet-50, YOLOv3 | 91.64 | - | - | 95.47 | Quite effective, even when applied to many classes with diverse backgrounds and illumination challenges. | Accuracy can still be improved. |
[16] | own dataset | aquacultured fish | 1 | Image enhancement, YOLOv3 | 100 | - | - | - | (1) Effectively detects all fish in the test images. (2) Image enhancement significantly improves the performance of the algorithm. | Only detects fish and their trajectories; no classification. |
[17] | own dataset | aquacultured fish | 1 | Faster R-CNN, YOLOv3 | 98.13 | - | - | - | (1) High accuracy is obtained with Faster R-CNN. (2) Simple, with good results. | Only detects fish and their trajectories; no classification. |
[30] | Fish4-Knowledge | deep ocean fish | 1 | GMM, pixel-wise posteriors, CNN | - | - | - | 87.44 | Fairly improved the results of previous work. | (1) Only detects fish; does not classify them. (2) The results can still be improved. |
Proposed method | own dataset | aquacultured fish | 8 | Optimized YOLOv4 | 98.15 | 98.35 | 99.58 | 98.96 | (1) Simple method that delivers high accuracy. (2) Ready to implement in an aquaculture fish sorting system. | (1) Uses deep learning software that is not open access. (2) The YOLOv4 architecture cannot be modified. |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).