1. Introduction
The broiler breeding industry is an essential part of animal husbandry. As the global population grows, the demand for poultry products continues to increase. China is a country with a large population, and poultry products there have always been in high demand. Unlike in Europe and the United States, intensive breeding is still the dominant poultry breeding method in China. In recent years, agriculture in China has been developing towards mechanization, automation, and intelligence. Smart farming is an emerging concept that refers to managing farms using technologies such as the Internet of Things (IoT), robotics, drones, and artificial intelligence (AI) to increase the quantity and quality of products while optimizing the human labor required by production [
1]. In the poultry house, most of the breeding process has been automated, such as feeding, drinking, and manure cleaning. However, some processes still need to be mechanized and automated, such as environmental monitoring and management, the weighing of poultry, and the detection of sick and dead individuals [
2].
Death and disease are crucial issues in broiler breeding. In large-scale, intensive broiler farms, the frequent occurrence of poultry disease and death leads to substantial economic losses and welfare problems. In our experience, dead broilers rot and produce odor after a certain amount of time, which may give rise to cross-infection within the broiler flock and reduce broiler welfare. Infected broilers are simply slaughtered to prevent the spread of disease and further economic damage. Therefore, reducing the negative impact of dead broilers and disease outbreaks has drawn significant attention [
3]. Removing dead broilers from the flock in time can minimize the spread of disease and limit the economic cost. In a large-scale broiler farm, breeders must observe each cage every day and manually remove dead broilers on the basis of their breeding experience. As there are thousands of cages within a poultry house, this inspection work is time-consuming and laborious [
4]. In a stacked-cage broiler house in particular, it is dangerous for breeders to inspect the cages in the third and fourth layers. Moreover, the fine particles and irritant gases produced by the broilers and their manure are harmful to the breeders. Therefore, replacing manual inspection with automation is vital for reducing economic losses and improving the welfare of farm workers.
In recent years, machine learning and digital image processing have been widely used in the study of animal behavior [
5,
6,
7,
8,
9]. Researchers have utilized machine learning and image analysis schemes to assess the health of chickens. Some have used supervised classifiers on 2D posture shape descriptors and mobility features of walking broilers to analyze broiler lameness or Newcastle disease [
10,
11,
12,
13]. Accordingly, other researchers have analyzed the posture of broilers based on the skeleton feature for the early detection of sick broilers, and developed a sick-broiler detection method based on deep learning [
14,
15,
16]. Furthermore, flock activity levels and optical flow changes have been found to correlate significantly with leg disease in broiler flocks [
17,
18,
19,
20]; the activity level and optical flow of flocks can therefore also be used for disease prediction.
Some existing studies have also utilized machine learning methods to identify dead broiler chickens. Zhu et al. [
21] used a support vector machine (SVM) classifier on features extracted from the comb and legs to identify dead chickens. Li et al. [
22] designed and constructed a chicken inspection robot to monitor chickens on the farm. The robot could walk in the corridor of the chicken house and monitor the chickens' health and behavior in real time using similar methods. Bao et al. [
23] proposed a sensor detection method to detect sick and dead chickens: by fastening a foot ring on each chicken, the three-dimensional displacement of chickens was analyzed with a machine-learning classification method to detect the dead and sick chickens in the flocks. Liu et al. [
24] developed a small dead-chicken removal system for a Taiwanese poultry house, which could walk in a flat chicken house and collect dead chickens automatically. It had dimensions of 1.038 × 0.36 × 0.5 m and could remove two dead chickens in a single operation. The system identified dead chickens through deep learning based on the YOLOv4 network and achieved good precision. These existing studies mainly focused on the detection of dead chickens in flat breeding and single-layer-cage breeding; the extension to death detection in stacked-cage broiler houses remained unclear. As stacked-cage housing is the main broiler breeding method in China and the setting where automation is most urgently needed, it is necessary to realize the automatic detection of dead broilers in stacked-cage broiler houses. This study aims to develop a method to automatically detect dead broilers in an intensive stacked-cage broiler house. An autonomous inspection platform was developed that could travel through the broiler house and capture images of each cage. A broiler detection model based on the improved-YOLOv3 was deployed on the server to process the images and identify dead broilers in the flocks. The detection results were stored in Excel format and could be checked through a human–machine interface. With this method, the breeding staff can determine the positions of dead broilers from the results without observing the cages one by one, further reducing the labor burden on breeders.
3. Results and Discussion
3.1. Evaluation of the Autonomous Inspection Platform Performance
The walking speed of the platform is of great importance for the inspection work. Too high an inspection speed will cause inaccurate positioning of the platform, blurred images, and even disturbance of the broiler flocks, which is not desirable. However, too low a speed is also undesirable: since the inspection time is limited, a low speed reduces the efficiency of the platform. Thus, a suitable inspection speed is very important for stable driving and image acquisition. After several experiments in the broiler house, the walking speed of the platform was set to 0.2 m/s, equivalent to about a quarter of a typical human walking speed. During the inspection, the platform stopped at a quarter of each cage, and each camera captured three images at an interval of 5 s. Therefore, it took 38 s to capture images of four layers of cages simultaneously, and a total of 24 images were captured by the four cameras.
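As an illustration of the per-waypoint capture schedule described above, the following is a minimal sketch; the trigger function and control flow are hypothetical and do not reflect the platform's actual control software.

```python
import time

CAMERAS = 4          # one CCD camera per cage layer
SHOTS_PER_STOP = 3   # images per camera at each waypoint
SHOT_INTERVAL = 5    # seconds between consecutive shots

def capture_at_waypoint(trigger_camera):
    """Capture SHOTS_PER_STOP images on all four cameras at one waypoint."""
    for shot in range(SHOTS_PER_STOP):
        for cam_id in range(CAMERAS):
            trigger_camera(cam_id, shot)   # hypothetical camera trigger function
        if shot < SHOTS_PER_STOP - 1:
            time.sleep(SHOT_INTERVAL)      # wait 5 s before the next set of shots
```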
The accuracy of the magnetic navigation has a great impact on the quality of the image and the position of the platform. Unqualified images and incorrect position information may lead to a poor and invalid detection result. Hence, a parking experiment was conducted to test the navigation performance. The platform started from a starting point and drove along the magnetic strip, parking at a distance of 60, 120, 180, or 240 cm away. Offsets of the platform were recorded manually and are shown in
Table 3. The front and rear offsets of the platform are shown in the left part, ranging from a minimum of 0.6 cm to a maximum of 3.7 cm, with an average offset of 2.1 cm. Statistical inference did not show any difference in offset between position distances (
p-value > 0.7). All offsets in this experiment were positive; this was because the inertia of the platform caused it to move forward a short distance when parking. The left and right offsets are shown in the right part of
Table 3, where the minimum offset was 0.3 cm, the maximum offset was 2.3 cm, and the average offset was 1.26 cm, with no significant difference in the left and right offset between different position distances (
p > 0.7). The left and right offsets are due to the magnetic navigation of the platform: when the platform drives along the magnetic strip, the magnetic navigation sensor constantly corrects the platform's heading, so small lateral offsets remain when it parks.
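The comparison of offsets across parking distances can be reproduced in outline as follows; the paper does not state which test was used, so this minimal sketch assumes a one-way ANOVA with illustrative (not measured) offset values.

```python
from scipy import stats

# Hypothetical front/rear offsets (cm) grouped by parking distance; not the measured data.
offsets_by_distance = {
    60:  [2.0, 1.8, 2.5],
    120: [2.2, 1.9, 2.4],
    180: [2.1, 2.3, 1.7],
    240: [2.0, 2.6, 1.9],
}

f_stat, p_value = stats.f_oneway(*offsets_by_distance.values())
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # p > 0.05 -> no significant difference between distances
```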
3.2. Evaluation of the Dead Broiler Detection Model
3.2.1. Evaluation of Improved-YOLOv3 Models
Table 4 presents the precision, recall, and mAP of the YOLOv3 and improved-YOLOv3 models. The precision, recall, and average precision of the improved-YOLOv3 model on dead broilers were 97.0%, 97.0%, and 98.2%, higher than those of YOLOv3 by 22.4%, 3.5%, and 5.5%, respectively. The precision, recall, and average precision of the improved-YOLOv3 model on live broilers were 93.1%, 96.7%, and 99.0%, which were 16.5%, 3.4%, and 8.7% higher than those of the YOLOv3 model. Processing an image with the improved-YOLOv3 model only took 0.007 s, which was faster than the YOLOv3 model.
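For reference, the precision, recall, and mAP reported in Table 4 follow the standard object-detection definitions (not restated in the paper), where TP, FP, and FN are the numbers of true positives, false positives, and false negatives for a class and p(r) is the precision–recall curve:

\[
\text{Precision}=\frac{TP}{TP+FP},\qquad \text{Recall}=\frac{TP}{TP+FN},\qquad
AP=\int_0^1 p(r)\,dr,\qquad mAP=\frac{1}{N}\sum_{c=1}^{N} AP_c .
\]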
Figure 7 shows the mAP curves of the improved-YOLOv3 and YOLOv3 models; the improved-YOLOv3 model converged at about 25 epochs and had a faster convergence rate than the YOLOv3 model. In addition, the improved-YOLOv3 model achieved a mAP of 98.6%, higher than that of the YOLOv3 model. The faster convergence and higher mAP can be attributed to the improved activation function and loss function, which led to an overall improvement of the model.
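For context, the two network-level changes referred to here can be sketched as follows. This is a minimal PyTorch-style illustration of the Swish activation and the CIoU loss in their commonly published forms; it is not the authors' implementation, and details such as the box encoding may differ in the actual model.

```python
import math
import torch

def swish(x):
    # Swish activation: x * sigmoid(x)
    return x * torch.sigmoid(x)

def ciou_loss(pred, target, eps=1e-7):
    """CIoU loss for boxes given as (x1, y1, x2, y2) tensors of shape (N, 4)."""
    # Intersection and union
    x1 = torch.max(pred[:, 0], target[:, 0]); y1 = torch.max(pred[:, 1], target[:, 1])
    x2 = torch.min(pred[:, 2], target[:, 2]); y2 = torch.min(pred[:, 3], target[:, 3])
    inter = (x2 - x1).clamp(0) * (y2 - y1).clamp(0)
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    iou = inter / (area_p + area_t - inter + eps)
    # Squared center distance over squared diagonal of the smallest enclosing box
    cpx = (pred[:, 0] + pred[:, 2]) / 2; cpy = (pred[:, 1] + pred[:, 3]) / 2
    ctx = (target[:, 0] + target[:, 2]) / 2; cty = (target[:, 1] + target[:, 3]) / 2
    ex1 = torch.min(pred[:, 0], target[:, 0]); ey1 = torch.min(pred[:, 1], target[:, 1])
    ex2 = torch.max(pred[:, 2], target[:, 2]); ey2 = torch.max(pred[:, 3], target[:, 3])
    rho2 = (cpx - ctx) ** 2 + (cpy - cty) ** 2
    c2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2 + eps
    # Aspect-ratio consistency term
    wp = pred[:, 2] - pred[:, 0]; hp = pred[:, 3] - pred[:, 1]
    wt = target[:, 2] - target[:, 0]; ht = target[:, 3] - target[:, 1]
    v = (4 / math.pi ** 2) * (torch.atan(wt / (ht + eps)) - torch.atan(wp / (hp + eps))) ** 2
    alpha = v / (1 - iou + v + eps)
    return 1 - iou + rho2 / c2 + alpha * v
```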
3.2.2. Evaluation of the Detection Effect of Broilers at Different Growth Stages
Figure 8 shows the broilers at 6 days old and 36 days old. In the breeding process, the appearance of the broilers changes greatly with the increase in age. In order to test the detection performance of the model on broilers of different ages, a total of 222 images from 6 to 40 days were selected and tested by the developed model. As shown in
Table 5, the improved-YOLOv3 model achieved better detection performance than YOLOv3 on broilers at different growth stages and could correctly identify dead broilers of different ages. With increasing age, the AP of the improved-YOLOv3 model for dead broilers decreased gradually. This might be explained by two factors. Firstly, in the early inspection work of the autonomous inspection platform, the young broilers stayed away from the platform out of fear of unfamiliar objects, which left the dead broilers at the edge of the cage exposed. Since there were few broilers in each image and occlusion between birds was rare, the model could easily identify the dead broilers at this early stage (
Figure 8a). As the experiment progressed, the broilers acclimated to the environment and no longer exhibited avoidance behavior; thus, the background around a dead broiler in an image became more complicated. Secondly, as the space in each broiler cage was limited, the broilers crowded each other, causing some dead broilers to be occluded by others (
Figure 8b). It was difficult for the model to identify the full features of such broilers, which led to missed dead broilers. As for the detection of live broilers, the AP of the improved-YOLOv3 model for live broilers did not change significantly with broiler age. This may be attributed to the number of live broiler samples (40,000) in the dataset, which was sufficient to make the model robust to live broilers.
3.2.3. Evaluation of Model Performance under Different Light Conditions
In this study, the stacked-cage broiler house used incandescent lamps to provide light for the broilers. The lamps were installed in the center of each broiler cage, making the light intensity high in the center of the cage and low at its edges. In addition, due to the properties of incandescent lamps and the pecking behavior of broilers, lamp brightness decreases over time, which results in variations in light intensity between cages. The brightness of the collected images therefore also varied, requiring the model to adapt well to images with different light intensities. Test images with different brightness levels were randomly selected for this evaluation, and the detection performance of the improved-YOLOv3 and YOLOv3 models was tested on these images.
As shown in
Figure 9, the dead broilers were located in the center of the image and labeled with red anchor boxes. Color and shape were the main features that distinguished broilers from the background. Excessively high or low brightness can alter the original features of the image to a certain extent and affect the detection result. As shown in
Figure 9, the improved-YOLOv3 model successfully identified the dead broilers in both high- and low-brightness images, while the YOLOv3 model missed dead broilers in the low-brightness images, which indicates the effectiveness of the improvement. For the high-brightness images, the improved-YOLOv3 model identified 19 live broilers, while the YOLOv3 model identified only 15, missing the broilers in the upper part of the image. This was likely because the broilers in the upper part were relatively small; the SPP module in the improved-YOLOv3 model reduces distortion in the up-sampling operation and fuses features of different scales, which improves the model's ability to detect objects of different sizes. Thus, it achieved better performance on the high-brightness images. Additionally, the broilers detected by the improved-YOLOv3 model received higher confidence scores than those detected by the YOLOv3 model. This may be because the mosaic enhancement enriched the background of the detected objects, meaning that more high-brightness samples were generated from the dataset. Thus, the improved-YOLOv3 model was robust to different brightness levels and could identify dead broilers in low-brightness images.
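For reference, the SPP module discussed above can be sketched as follows; this is a minimal PyTorch-style illustration, and the pooling kernel sizes 5, 9, and 13 are the values commonly used in YOLO variants, assumed here rather than taken from the paper.

```python
import torch
import torch.nn as nn

class SPP(nn.Module):
    """Spatial pyramid pooling: max-pool the same feature map at several
    kernel sizes and concatenate the results along the channel axis."""
    def __init__(self, kernel_sizes=(5, 9, 13)):
        super().__init__()
        self.pools = nn.ModuleList(
            nn.MaxPool2d(kernel_size=k, stride=1, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, x):
        # Output keeps the spatial size and multiplies the channel count by len(kernel_sizes) + 1
        return torch.cat([x] + [pool(x) for pool in self.pools], dim=1)
```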
3.3. Evaluation of the Dead Broiler Detection System
3.3.1. Analysis of the Detection Result of the Dead Broiler Detection System
The dead broiler detection model was deployed on the server in the broiler house, and a human–machine interface was developed in Python using PyQt5. Through the human–machine interface, the operator can easily process the images with the developed broiler detection model and determine the location of dead broilers.
Figure 10 shows part of the processing results. Each detected dead broiler is marked with a red rectangle. The first column shows the position of the dead broiler in the broiler house, the second and third columns show the numbers of dead and live broilers detected by the model, and the fourth column shows whether any dead broilers were detected. Breeders only need to check the fourth column and then find the corresponding locations of dead broilers in the first column.
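As an illustration of how the stored results could be queried outside the interface, the following is a minimal sketch; the file name and column names are hypothetical placeholders based on the four columns described above, not the actual export format.

```python
import pandas as pd

# Hypothetical file name and column headers; the actual export may differ.
results = pd.read_excel("detection_results.xlsx")
results.columns = ["cage_position", "dead_count", "live_count", "dead_detected"]

# List the cage positions where at least one dead broiler was detected.
flagged = results[results["dead_count"] > 0]
print(flagged[["cage_position", "dead_count"]])
```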
The processing results of dead broilers in days 31–36 are shown in
Figure 11. An average of 1664 images were collected and processed each day. Approximately 7–30 dead broilers were identified by the developed model from days 31 to 36. Among them, 3–7 live broilers were mistakenly identified as dead. Most of the correctly detected dead broilers showed prominent features: they lay on the grid floor with a pink or brownish-black breast exposed and stiff claws. Such dead broilers made up the bulk of the training set. However, there were some dead broiler samples in the training dataset that resembled live broilers, as shown in
Figure 12a; in our experience, they were difficult to distinguish even during manual inspection. The developed model learned the features of such dead broilers and misidentified live broilers with only a partially exposed feathered body as dead. As shown in
Figure 12b, these falsely detected dead broilers received low confidence scores, which indicates that this cohort of false detections could be filtered out by raising the classification confidence threshold, although more true dead broilers would then be missed. Therefore, to further improve the detection performance of the model, the best approach is to collect more dead broiler samples similar to those in
Figure 12a to improve the robustness of the model.
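The confidence-threshold trade-off described above can be illustrated with a minimal sketch; the detection tuples and threshold values below are illustrative only and do not represent the model's output format.

```python
# Each detection: (label, confidence); values are illustrative only.
detections = [("dead", 0.92), ("dead", 0.31), ("live", 0.88), ("dead", 0.27)]

def keep_dead(dets, threshold):
    # Retain only dead-broiler detections at or above the confidence threshold.
    return [d for d in dets if d[0] == "dead" and d[1] >= threshold]

print(keep_dead(detections, 0.25))  # low threshold keeps low-confidence (often false) detections
print(keep_dead(detections, 0.50))  # higher threshold filters them out, but may also miss true positives
```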
3.3.2. Analysis of the Defects of the Dead Broiler Detection System
In this study, the proposed dead broiler inspection system could inspect the broiler flocks and relay the positions of dead broilers to breeders. However, some problems were found during the experiment. In the late breeding stage, the amount of broiler manure increases greatly. Although the manure was cleaned with the manure belt once a day, some manure still dropped onto the aisle due to the design of the manure cleaning belt and the locomotion of the broilers. Manure mixed with cooling water, feathers, and dust particles covered the magnetic strip or adhered to the tires, sometimes causing the platform to slip and fail to turn. In such cases, we had to remotely control the platform to return to the origin and restart it. Furthermore, in the last few days, the magnetic strip partially fell off during the cleaning process, which led to navigation failures and position errors. Securing the magnetic strip in the aisle of the broiler house more reliably remains a problem. Regarding the dead broiler detection model, higher model accuracy is expected, as any false or missed dead broiler in the house may lead to the spread of disease in the broiler flock. It should also be noted that the image dataset was mainly collected from Ross 308 broilers; thus, the developed model may not be robust to other broiler breeds.
The ChickenBoy robot developed by the Big Dutchman company is an analysis robot that can support the daily tasks of broiler producers. It moves along fixed rails above the ground and gathers data on climate conditions in the chicken house, distinguishing dead chickens by applying thermographic images [
30]. Nevertheless, the ChickenBoy is mainly designed for flat-breeding chicken houses and is not suitable for the stacked-cage culture model for multiple reasons, such as the shielding of cages and the multi-layer structure. However, a fixed-rail design similar to theirs may be considered to address the problem mentioned above. Liu et al. [
24] designed a dead chicken removal system for a flat-breeding chicken house. The tracked vehicle of the dead chicken removal system reported in their research appears to be maneuverable while walking on the litter, which is worth considering. Their system achieved a walking speed of 3.3 cm/s and could detect dead chickens in front of the vehicle. In our study, the autonomous inspection platform walks in the aisle at a speed of 20 cm/s and inspects four layers of broiler cages synchronously. Our system is therefore more efficient, even accounting for the image capture stops during inspection.
Occlusions in the broiler house were the main causes of the false and missed detection of dead broilers in this study. This was mainly due to two reasons, the camera’s position and the breeding density. In most studies, the cameras were set directly above the chickens [
7,
15,
19,
31], and the breeding density was relatively low [
15,
24]. However, as the broilers in this study were reared in stacked cages, the cameras had to be positioned outside the cages, angled to avoid occlusion by the feed and water lines as much as possible. This oblique shooting angle resulted in some broilers being obscured by others. In addition, the high density of stacked-cage broilers (70 broilers/cage) and the clustering behavior of broilers made the occlusion more serious [
31]. Bao et al. [
23] measured the three-dimensional displacement of chickens by fastening a foot ring on each bird to identify dead and sick chickens in the flock. As the maximum displacement and the total three-dimensional variance of dead chickens were almost zero, this method could identify dead chickens in the flock with 100 percent accuracy. Although this approach avoids the occlusion problem in the flock, it has obvious drawbacks when applied to large-scale chicken farms: binding foot rings to more than 100,000 broilers is even more laborious than the daily inspection work, and the foot rings may fall off during the breeding process. Therefore, it is more feasible to use cameras and image processing technology to identify dead broilers on large farms. To further improve dead broiler detection performance and robustness, it is worth considering a more optimized model and collecting more dead broiler samples from different breeds.
3.4. Future Work
Currently, most of the functionality of the autonomous inspection platform has been implemented, but it still needs further improvement to facilitate use. Future work will focus on solving the problems mentioned above. In addition, images collected by the autonomous inspection platform were transferred to the server manually. Uploading images via wireless internet access or deploying the dead broiler detection model with the OpenVINO inference engine on a computer (Dell OptiPlex 7090MFF) and uploading the detection result to the CloudDB will be further explored to make the platform more automatic and intelligent.
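As a rough outline of the OpenVINO deployment path mentioned here, the following is a minimal sketch; the converted model file name and input resolution are assumptions, and the post-processing of the raw detection tensors and the upload to the CloudDB are omitted.

```python
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("improved_yolov3.xml")        # hypothetical converted model file
compiled = core.compile_model(model, device_name="CPU")

image = np.zeros((1, 3, 416, 416), dtype=np.float32)  # placeholder input; 416x416 is assumed
outputs = compiled([image])                           # raw detection tensors, to be post-processed
```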
4. Conclusions
In this study, an autonomous inspection platform was designed to capture images for the stacked-cage broiler house. It had a total length of 53 cm, width of 47 cm, and height of 310 cm, and was designed as a wheeled self-propelled vehicle. A magnetic navigation sensor was fixed on the platform to provide navigation and positioning. Four CCD cameras were mounted on the platform to capture images of four layers of cage broilers simultaneously. The autonomous inspection platform could walk in the broiler house aisle at a speed of 0.2 m/s and stop at each image acquisition waypoint to capture images of the broiler flocks.
A dead broiler detection model was developed with the improved-YOLOv3 network. The improvements to the YOLOv3 network included mosaic enhancement, the Swish activation function, an SPP module, and the CIoU loss. The mAP of the dead broiler detection model was 98.6%, the recall was 96.8%, and the precision was 95.0%. The developed model could process images at a speed of 0.09 s/frame and identify broilers of different ages. A human–machine interface based on the dead broiler detection model was developed and deployed on the server. Breeders only need to click the buttons successively according to the prompts on the interface, and the positions of dead broilers are then displayed on the interface in the form of a table.
By using the autonomous inspection platform to collect images of the broilers and processing the images with the dead broiler detection model, breeders can easily obtain the locations of dead broilers in the broiler house. In future work, the detection of dead broilers using thermal infrared images and instance segmentation will be studied. Moreover, we will continue to develop the platform to enable more functionality, such as the automatic monitoring of broiler welfare and of harmful gases.