Fertility Detection of Hatching Eggs Based on a Convolutional Neural Network
Abstract
1. Introduction
- We propose a deep learning method for the fertility detection of hatching eggs. By combining a CNN with the embryo heartbeat signal, the method identifies the survival of hatching eggs more accurately.
- We treat the heartbeat signal of hatching eggs as an effective feature for distinguishing fertile eggs from dead eggs, and acquire it by photoplethysmography (PPG) to avoid introducing excessive noise.
- We design a sequence convolutional neural network, E-CNN, to classify the heartbeat sequences of hatching eggs.
- We design a 10-layer convolutional neural network, SR-CNN, that detects the fertility of hatching eggs by recognizing the waveform of the embryo heartbeat signal; it combines the "Squeeze-and-Excitation" (SE) channel-weighting unit with the residual structure to improve the performance of the network (a sketch of such an SE-residual block follows this list).
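As a concrete illustration of the last point, the sketch below shows one way an SE-residual building block of the kind used in SR-CNN could be written in PyTorch, following the general constructions of He et al. [17] and Hu et al. [18]. This is a minimal sketch rather than the authors' implementation: the kernel sizes, the batch-normalization placement, and the default reduction ratio r = 16 (the value the results tables later show performing best) are assumptions.

```python
import torch
import torch.nn as nn

class SEResidualBlock(nn.Module):
    """Residual block with a Squeeze-and-Excitation channel-weighting unit.

    Minimal sketch: kernel sizes, BN placement, and reduction ratio r are assumptions.
    """
    def __init__(self, in_channels, out_channels, stride=1, r=16):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.conv2 = nn.Conv2d(out_channels, out_channels, 3, stride=1, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU(inplace=True)
        # Squeeze: global average pooling; Excitation: two FC layers with reduction ratio r.
        self.squeeze = nn.AdaptiveAvgPool2d(1)
        self.excite = nn.Sequential(
            nn.Linear(out_channels, out_channels // r),
            nn.ReLU(inplace=True),
            nn.Linear(out_channels // r, out_channels),
            nn.Sigmoid(),
        )
        # 1 x 1 projection shortcut when the feature-map shape changes.
        self.shortcut = nn.Sequential()
        if stride != 1 or in_channels != out_channels:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, out_channels, 1, stride=stride, bias=False),
                nn.BatchNorm2d(out_channels),
            )

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Channel re-weighting (SE): scale each feature map by its learned weight.
        b, c, _, _ = out.shape
        w = self.excite(self.squeeze(out).view(b, c)).view(b, c, 1, 1)
        out = out * w
        return self.relu(out + self.shortcut(x))
```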
2. Methods
2.1. Data Acquisition and Pre-Processing
2.2. Dataset Construction
2.3. CNN Performance Improvement Strategy
2.4. Sequence CNN Design
2.5. Two-Dimensional CNN Design
3. Experiment and Results Analysis
3.1. E-CNN Experiment
3.2. SR-CNN Experiment
3.3. Results Analysis
3.4. Comparison with Other Methods
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Xu, Q.L.; Cui, F.Y. Non-destructive detection on the fertility of injected SPF eggs in vaccine manufacture. In Proceedings of the 2014 IEEE Conference on Chinese Control and Decision Conference (CCDC), Changsha, China, 31 May–2 June 2014; pp. 1574–1579. [Google Scholar]
- Xu, Y.; Xu, A.; Xie, T. Automatic Sorting System of Egg Embryo in Biological Vaccines Production Based on Multi-information Fusion. Trans. Chin. Soc. Agric. 2015, 2, 20–26. [Google Scholar]
- Liu, L.; Ngadi, M.O. Detecting Fertility and Early Embryo Development of Chicken Eggs Using Near-Infrared Hyperspectral Imaging. Food Bioprocess Technol. 2013, 9, 2503–2513. [Google Scholar] [CrossRef]
- McQuinn, T.C.; Bratoeva, M.; DeAlmeida, A.; Remond, M.; Thompson, R.P.; Sedmera, D. High-frequency ultrasonographic imaging of avian cardiovascular development. Dev. Dyn. 2007, 12, 3503–3513. [Google Scholar] [CrossRef] [PubMed]
- Schellpfeffer, M.A.; Kolesari, G.L. Microbubble contrast imaging of the cardiovascular system of the chick embryo. Ultrasound Med. Biol. 2012, 3, 504–510. [Google Scholar] [CrossRef] [PubMed]
- Zhang, W.; Tu, K.; Liu, P.; Pan, L.; Zhan, G. Early Fertility Detection of Hatching Duck Egg Based on Fusion between Computer Vision and Impact Excitation. Trans. Chin. Soc. Agric. 2012, 2, 140–145. [Google Scholar]
- Shan, B. Fertility Detection of Middle-stage Hatching Egg in Vaccine Production Using Machine Vision. In Proceedings of the International Workshop on Education Technology and Computer Science (ETCS), Wuhan, China, 6–7 March 2010; pp. 95–98. [Google Scholar]
- Geng, L.; Yan, T.; Xiao, Z.; Xi, J.; Li, Y. Hatching eggs classification based on deep learning. Multimedia Tools Appl. 2017, 1, 1–12. [Google Scholar] [CrossRef]
- Islam, M.H.; Kondo, N.; Ogawa, Y.; Fujiura, T.; Suzuki, T.; Fujitani, S. Detection of infertile eggs using visible transmission spectroscopy combined with multivariate analysis. EAEF 2017, 10, 115–120. [Google Scholar] [CrossRef]
- Zhang, Z.; Pi, Z.; Liu, B. TROIKA: A general framework for heart rate monitoring using wrist-type photoplethysmographic signals during intensive physical exercise. IEEE Trans. Biomed. Eng. 2015, 2, 522–531. [Google Scholar] [CrossRef] [PubMed]
- Hochstadt, A.; Chorin, E.; Viskin, S.; Schwartz, A.L. Continuous heart rate monitoring for automatic detection of atrial fibrillation with novel bio-sensing technology. J. Electrocardiol. 2019, 52, 23–27. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Z. Photoplethysmography-Based Heart Rate Monitoring in Physical Activities via Joint Sparse Spectrum Reconstruction. IEEE Trans. Biomed. Eng. 2015, 8, 1902–1910. [Google Scholar] [CrossRef] [PubMed]
- Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv, 2014; arXiv:1409.1556. [Google Scholar]
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the 2015 IEEE conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 8–10 June 2015; pp. 1–9. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 13–16 December 2015; pp. 1026–1034. [Google Scholar]
- Ioffe, S.; Szegedy, C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. arXiv, 2015; arXiv:1502.03167. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the 2016 IEEE conference on computer vision and pattern recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- Hu, J.; Shen, L.; Sun, G. Squeeze-and-excitation networks. arXiv, 2017; arXiv:1709.01507. [Google Scholar]
- Henderson, T.A.; Morries, L.D. Near-infrared photonic energy penetration: Can infrared phototherapy effectively reach the human brain? Neuropsychiatr. Dis. Treat. 2015, 11, 2191. [Google Scholar] [CrossRef] [PubMed]
- Cain, J.R.; Abbott, U.K.; Rogallo, V.L. Heart rate of the developing chick embryo. Proc. Soc. Exp. Biol. Med. 1967, 2, 507–510. [Google Scholar] [CrossRef]
- Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 2014, 1, 1929–1958. [Google Scholar]
- Glorot, X.; Bengio, Y. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics (AISTATS), Sardinia, Italy, 13–15 May 2010; pp. 249–256. [Google Scholar]
| Hatching Eggs Activity | 24 h | 48 h | 64 h | 88 h | Total |
|---|---|---|---|---|---|
| Fertile eggs | 3750 | 3750 | 3750 | 3750 | 15,000 |
| Dead eggs | 3750 | 3750 | 3750 | 3750 | 15,000 |
| Total | 7500 | 7500 | 7500 | 7500 | 30,000 |
| Dataset Type | Raw Data (N = 500) | Filtered Data (N = 350) | Total | Positive:Negative | Dataset Partition |
|---|---|---|---|---|---|
| Sequence | DSI | DSH | 30,000 | 1:1 | 8:1:1 |
| Waveform | DWI | DWH | 30,000 | 1:1 | 8:1:1 |
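The 8:1:1 partition above presumably denotes a training/validation/test split. Below is a minimal sketch of such a stratified split over the 30,000 balanced samples; the use of scikit-learn and the placeholder arrays are our own assumptions, not part of the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# X: 30,000 heartbeat samples, y: labels (1 = fertile, 0 = dead), balanced 1:1.
X = np.random.randn(30000, 350)        # placeholder for the filtered sequences (350 points each)
y = np.repeat([1, 0], 15000)

# Split off 80% for training, then halve the remainder into validation and test sets.
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, stratify=y_rest, random_state=0)

print(len(X_train), len(X_val), len(X_test))   # 24000 3000 3000
```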
| Model | Conv1 | MaxPool1 | Conv2 | Conv3 | MaxPool2 | Conv4 | Conv5 | Dense |
|---|---|---|---|---|---|---|---|---|
| E-CNN | 1 × 20, 64 | 1 × 5 | 1 × 20, 64 | 1 × 20, 128 | 1 × 5 | 1 × 20, 128 | 1 × 20, 256 | 2-d |
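Read as a stack of one-dimensional convolutions over the filtered heartbeat sequence (350 points), the E-CNN row above could be transcribed roughly as the sketch below. Only the kernel sizes and channel counts come from the table; the strides, padding, activations, input channel count, and the global pooling before the 2-d dense layer are assumptions of this sketch.

```python
import torch.nn as nn

# Rough transcription of the E-CNN row; padding/strides and the pooling before
# the dense layer are assumed, with a single-channel 350-point sequence as input.
e_cnn = nn.Sequential(
    nn.Conv1d(1, 64, kernel_size=20, padding='same'), nn.ReLU(),     # Conv1: 1 x 20, 64
    nn.MaxPool1d(5),                                                  # MaxPool1: 1 x 5
    nn.Conv1d(64, 64, kernel_size=20, padding='same'), nn.ReLU(),    # Conv2: 1 x 20, 64
    nn.Conv1d(64, 128, kernel_size=20, padding='same'), nn.ReLU(),   # Conv3: 1 x 20, 128
    nn.MaxPool1d(5),                                                  # MaxPool2: 1 x 5
    nn.Conv1d(128, 128, kernel_size=20, padding='same'), nn.ReLU(),  # Conv4: 1 x 20, 128
    nn.Conv1d(128, 256, kernel_size=20, padding='same'), nn.ReLU(),  # Conv5: 1 x 20, 256
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(256, 2),                                                # Dense: 2-d (fertile / dead)
)
```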
| Layer Name | Layer Type | Related Parameters |
|---|---|---|
| Conv | Convolution | 7 × 7, 64, stride 2 |
| Pool1 | Pooling | 3 × 3, max pool, stride 2 |
| Conv1_1, Conv1_2 | Convolution | 3 × 3, 64, stride 1 |
| Conv2_1, Conv2_2 | Convolution | 3 × 3, 128, stride 2, 1 |
| Conv2_3 | Convolution | 1 × 1, 128, stride 2 |
| Conv3_1, Conv3_2 | Convolution | 3 × 3, 256, stride 2, 1 |
| Conv3_3 | Convolution | 1 × 1, 256, stride 2 |
| Conv4_1, Conv4_2 | Convolution | 3 × 3, 512, stride 2, 1 |
| Conv4_3 | Convolution | 1 × 1, 512, stride 2 |
| Dropout | Dropout | dropout ratio 0.25 |
| Pool2 | Pooling | 8 × 8, average pool, stride 1 |
| FC | Fully connected | 2-d |
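Assembling this layer table with the SE-residual block sketched in the Introduction gives roughly the network below. Counting the stem convolution, the eight 3 × 3 convolutions, and the final fully connected layer yields the ten weighted layers, with the 1 × 1 convolutions Conv2_3/3_3/4_3 presumably acting as projection shortcuts. The grayscale 256 × 256 input (inferred from the final 8 × 8 average pool after a total downsampling factor of 32), the SE placement, and the reduction ratio are assumptions of this sketch, not the authors' code.

```python
import torch.nn as nn

# SEResidualBlock is the unit sketched in the Introduction section above.
class SRCNN(nn.Module):
    def __init__(self, in_channels=1, num_classes=2, r=16):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(in_channels, 64, 7, stride=2, padding=3, bias=False),  # Conv: 7 x 7, 64, stride 2
            nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2, padding=1),                            # Pool1: 3 x 3 max, stride 2
        )
        self.stage1 = SEResidualBlock(64, 64, stride=1, r=r)                 # Conv1_1, Conv1_2
        self.stage2 = SEResidualBlock(64, 128, stride=2, r=r)                # Conv2_1 .. Conv2_3
        self.stage3 = SEResidualBlock(128, 256, stride=2, r=r)               # Conv3_1 .. Conv3_3
        self.stage4 = SEResidualBlock(256, 512, stride=2, r=r)               # Conv4_1 .. Conv4_3
        self.head = nn.Sequential(
            nn.Dropout(0.25),                                                # Dropout: ratio 0.25
            nn.AvgPool2d(8, stride=1),                                       # Pool2: 8 x 8 average pool
            nn.Flatten(),
            nn.Linear(512, num_classes),                                     # FC: 2-d
        )

    def forward(self, x):
        x = self.stem(x)
        x = self.stage4(self.stage3(self.stage2(self.stage1(x))))
        return self.head(x)
```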
| Sample Type | Detection Period | Number of Samples | Total |
|---|---|---|---|
| Fertile eggs | 24 h | 395 | 1580 |
|  | 48 h | 410 |  |
|  | 64 h | 575 |  |
|  | 88 h | 200 |  |
| Dead eggs | 24 h | 355 | 1420 |
|  | 48 h | 421 |  |
|  | 64 h | 288 |  |
|  | 88 h | 356 |  |
| Model Name | Sample Distribution | Recognition Result (P) | Recognition Result (N) | False Detections | Accuracy (%) | Precision (%) | Recall (%) | F1 Score (%) |
|---|---|---|---|---|---|---|---|---|
| Basic network | P (1580) | 1569 | 11 | 22 | 99.28 | 99.30 | 99.30 | 99.30 |
|  | N (1420) | 11 | 1409 |  |  |  |  |  |
| Basic network + residual unit | P (1580) | 1570 | 10 | 19 | 99.37 | 99.43 | 99.36 | 99.40 |
|  | N (1420) | 9 | 1411 |  |  |  |  |  |
| Basic network + SE block | P (1580) | 1573 | 7 | 17 | 99.44 | 99.37 | 99.57 | 99.47 |
|  | N (1420) | 10 | 1410 |  |  |  |  |  |
| SR-CNN | P (1580) | 1575 | 5 | 12 | 99.62 | 99.56 | 99.68 | 99.62 |
|  | N (1420) | 7 | 1413 |  |  |  |  |  |
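For reference, the reported metrics follow the standard confusion-matrix definitions. Taking the SR-CNN row as an example (TP = 1575, FN = 5, FP = 7, TN = 1413), precision = 1575/1582 ≈ 99.56% and recall = 1575/1580 ≈ 99.68%, consistent with the table.

```latex
\mathrm{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}, \qquad
\mathrm{Precision} = \frac{TP}{TP + FP}, \qquad
\mathrm{Recall} = \frac{TP}{TP + FN}, \qquad
F_1 = \frac{2\,\mathrm{Precision}\cdot\mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}
```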
| R | Accuracy (%) |
|---|---|
| 4 | 94.52 |
| 8 | 96.58 |
| 16 | 99.62 |
| 32 | 97.73 |
| Method | Feature Extraction Method | Classification Method | Detection Period | Accuracy | Classes |
|---|---|---|---|---|---|
| Proposed method | SR-CNN, E-CNN | SR-CNN, E-CNN | 9 days and later | 99.62%, 99.5% | Fertile and Dead |
| Geng, L. et al. (2017) [8] | TB-CNN | TB-CNN | 5 days | 99.5% | Fertile, Dead and Infertile |
| Liu, L. et al. (2013) [3] | Gabor filter | K-means clustering | 4 days | 84.1% | Fertile and Non-fertile |
| Shan, B. (2010) [7] | Thresholding by histogram-based WFCM | Criteria of blood vessels | Middle stage | 99.33% | Fertile and Non-fertile |
| Islam, M.H. et al. (2017) [9] | k-means, LDA, SVM | k-means clustering, LDA, SVM | 4 days | 96%, 100%, 100% | Fertile and Infertile |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Geng, L.; Hu, Y.; Xiao, Z.; Xi, J. Fertility Detection of Hatching Eggs Based on a Convolutional Neural Network. Appl. Sci. 2019, 9, 1408. https://doi.org/10.3390/app9071408