Article

Research on Broiler Mortality Identification Methods Based on Video and Broiler Historical Movement

1
Department of Engineering, China Agricultural University, Beijing 100083, China
2
Institute of Agricultural Facilities and Equipment, Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China
*
Author to whom correspondence should be addressed.
Agriculture 2025, 15(3), 225; https://doi.org/10.3390/agriculture15030225
Submission received: 7 January 2025 / Revised: 16 January 2025 / Accepted: 17 January 2025 / Published: 21 January 2025
(This article belongs to the Special Issue Modeling of Livestock Breeding Environment and Animal Behavior)

Abstract

Dead broilers left within a flock can be significant vectors for disease transmission and can negatively impact the welfare of the remaining broilers. This study introduced a dead broiler detection method that leverages the fact that dead broilers remain stationary within the flock in videos. Dead broilers were identified through the analysis of the historical movement information of each broiler in the video. First, the frame difference method was utilized to capture key frames in the video. An enhanced segmentation network, YOLOv8-SP, was then developed to obtain the mask coordinates of each broiler, and an optical flow estimation method was employed to generate optical flow maps and evaluate their movement. An average optical flow intensity (AOFI) index was defined and calculated to evaluate the motion level of each broiler in each key frame. With the AOFI threshold, broilers in the key frames were classified into candidate dead broilers and active live broilers. Ultimately, dead broilers were identified by analyzing the frequency with which each broiler was judged to be a candidate dead broiler across all key frames in the video. We incorporated the parallelized patch-aware attention (PPA) module into the backbone network and improved the overlaps function with a custom power transform (PT) function. The box and mask segmentation mAP of the YOLOv8-SP model increased by 1.9% and 1.8%, respectively, and the model's recognition of small and partially occluded targets was effectively improved. False and missed detections of dead broilers occurred in 4 of the 30 broiler testing videos, giving the proposed dead broiler identification algorithm an accuracy of 86.7%.

1. Introduction

The significance of the poultry industry in the global food chain has grown substantially, as it provides a vital source of nutrition and protein for a growing human population. This is largely due to poultry's accelerated growth cycle and more efficient feed conversion compared with large livestock. Concurrently, consumer perceptions of poultry have been shaped by a multitude of factors, such as health awareness, product quality and safety, and the ethical dimensions of animal welfare [1,2]. "On-farm mortality", the uncontrolled death of animals within the farming environment, is a pivotal indicator of broiler welfare. High rates of on-farm mortality not only signal poor environmental management and disease prevention strategies but also reflect a compromised state of welfare for the broilers. Accordingly, mitigating on-farm mortality through enhanced management practices and health monitoring can significantly improve the overall welfare of broilers.
Mortality among broilers within poultry farms is predominantly driven by two key factors: stress and disease. Stressful conditions may arise from genetic predispositions and environmental stressors, including temperature fluctuations, overcrowding, a lack of environmental enrichment, and compromised air quality, as well as the disruptive effects of human presence [3]. Concurrently, a spectrum of diseases, including lameness, cardiovascular issues, ascites, respiratory infections, and the pervasive risk of avian influenza, also contribute to broiler mortality [4]. The decomposition of dead broilers within the farm releases harmful gases, which not only diminish the welfare of the live broilers but also serve as a catalyst for disease transmission, thereby escalating the cycle of mortality. It is thus of great importance to implement effective identification and removal protocols for dead broilers to mitigate these risks and enhance the overall welfare and management of the broilers.
The identification of dead broilers has traditionally been carried out by manual observation, which is notably labor-intensive and time-consuming. This method also exposes breeders to health and safety hazards, especially on large-scale poultry farms. The emergence of precision livestock farming (PLF), together with the growing scarcity of labor in poultry operations, has highlighted the urgent need for automated systems that can monitor and manage the health of poultry. In response, researchers have introduced a variety of methods, including machine vision, acoustic perception, and accelerometer technologies, to identify dead or sick poultry within a flock.
Machine vision technology, characterized by its low invasiveness, high efficiency, and low cost, is widely used in poultry information perception tasks, including the detection of poultry diseases and mortality. In the early stages, image processing was the primary means of identifying sick poultry based on variations in their physical appearance and behavioral patterns. Researchers extracted and analyzed geometric shape features and motion features of poultry from images and classified them with classification algorithms to identify diseased poultry [5,6]. With the introduction of deep learning into the agricultural field, researchers have combined deep learning and image processing to solve the problem of poultry health identification in complex scenarios, as deep learning has demonstrated great capability in feature extraction and target recognition [7]. Aydin [8] extracted and analyzed the relationship between gait features, such as speed, stride length, and step frequency, of broilers during walking and their degree of lameness using image processing methods. The results showed an important relationship between the extracted features and lameness. Sadeghi et al. [9] extracted statistical features from thermal images of broilers, achieving 97.2% and 100% accuracy with an SVM algorithm when classifying broilers with avian influenza and Newcastle disease. Yogi and Yadav [10] constructed an abnormal chicken feces classification model based on the EfficientNetB7 network, thereby identifying diseases such as salmonellosis, Newcastle disease, and coccidiosis in chickens. In the aforementioned vision-based studies on diseased poultry identification, researchers often identified poultry diseases by extracting dynamic or static features of the poultry. In the identification of dead poultry, however, only static features have been used [11,12].
Acoustic perception is also considered promising as a low-invasiveness, low-cost technology. Researchers have employed sound sensors and CNNs to identify abnormal sounds emitted by poultry, thereby monitoring diseases with classification methods [13,14]. Cuan et al. [15] employed a bidirectional long short-term memory (BiLSTM) model to analyze audio features extracted from bird vocalizations to identify Newcastle disease in poultry. Bhandekar et al. [16] processed captured chicken vocalizations with the mel-frequency cepstral coefficients method and classified them with an SVM model to detect abnormal chickens. Both reports achieved good performance in identifying sick chickens. However, acoustic perception technology is ineffective for detecting dead poultry, as dead poultry produce no sounds at all.
Accelerometer technology has primarily been used in studies on the behavior and health of large livestock (pigs, cattle, sheep, etc.). Due to economic and practical considerations, it has been less commonly applied to poultry behavior and disease detection. Mei et al. [17] leveraged three-dimensional acceleration data, collected through wearable accelerometers, to identify aflatoxin-poisoned broilers. Bao et al. [18] attached sensor-laden foot rings to each chicken to automatically collect three-dimensional acceleration and displacement data, achieving an accuracy of 100% in the detection of dead chickens. However, because of the economic costs, wearable sensor technology has remained largely confined to the laboratory stage and appears unpromising for practical application on commercial farms.
Machine-vision-based techniques for identifying dead broilers have the advantages of being low cost and minimally intrusive for the flock. However, in the field of dead poultry identification, vision-based methods have mainly focused on recognizing the morphological features of the poultry—in other words, static features. There have been no reports of a study that combines machine vision with the movement information of broilers for the identification of dead broilers. Optical flow estimation captures the motion information of objects in a scene by analyzing the displacement of pixels in an image over time [19]. It is very promising for identifying the body movements of broilers in flocks [20]. Consequently, the aim of this study is to propose a method for identifying dead broilers based on video and historical movement information, mimicking the manual observation of broiler movement to determine dead broilers in a flock. We analyze the movement status of each broiler in a video to achieve the identification of dead broilers.

2. Materials and Methods

2.1. Video Collection

Broiler video collection was conducted from 17 November to 14 December 2023 at a commercial broiler farm in Yantai, Shandong Province, China. The farm's houses held an average of 14,000 white-feather broilers, with each cage containing 60 to 70 individuals, an incandescent light, and 2 feed trays. Male and female broilers were co-raised in the cages. For the video recording, an industrial camera (MV-CA023-10UC, HIKROBOT China Inc., Hangzhou, China) was mounted on a tripod to capture videos from outside the cage. The camera was connected to a laptop, enabling the capture of RGB videos at 32 frames per second. The exposure time of the camera was manually adjusted to suit the varying light intensities in different cages. Video collection focused on the period from the 15th to the 41st day of the broilers' growth cycle. A total of 98 videos containing dead broilers, each 1 to 2.5 min long, and more than 200 videos without dead broilers were recorded.

2.2. Overall Technical Route

The purpose of this study was to employ a video method for the identification of dead broilers within poultry flocks. The pipeline of the proposed method is depicted in Figure 1. Key frames were first extracted from the broiler videos using the inter-frame maximum method. Then, we used the average optical flow intensity (AOFI) value threshold to evaluate the movement of broilers in each key frame and classified the broilers into active broilers and candidate dead broilers. The AOFI value was defined based on optical flow estimation and broiler mask segmentation. Finally, we counted the frequency at which each broiler was determined to be a candidate dead broiler in all key frames and made the final determination of dead broilers based on the threshold method.

2.3. Key Frame Extraction

The video data collected from the farm usually comprise numerous frames with low variance, which are not effective for optical flow estimation. This is largely attributed to the broilers’ natural behavior of resting for 70–80% of their time [21] and the limited activity observed in stacked cage housing. Key frames in this study are video frames that contain more information about the movement of broilers. Utilizing key frames rather than the entire video sequence for the identification of dead broilers can significantly decrease the computational load and enhance the efficiency of the identification process. In this research, the frames with average inter-frame differences that reached a local maximum were defined as key frames. The inter-frame maximum method was employed to extract these pivotal frames. The inter-frame difference, which is calculated as the disparity in pixel intensities between two consecutive frames, is described by Equation (1). To reduce noise and prevent the redundant extraction of similar frames, the algorithm incorporated the Hanning window method to smooth the average difference values before identifying local maxima. As a result, the videos were effectively condensed into a curated set of 50 to 100 key frames, which were then advanced to the subsequent stage of analysis.
D_n = ( Σ_{(x,y)=(0,0)}^{(X,Y)} | f_{n+1}(x, y) − f_n(x, y) | ) / (X × Y)    (1)
where (X, Y) refers to the width and height of the frame, f(x, y) represents the color value of the pixel at coordinates (x, y), n is the sequence number of the video frame, and D_n is the average inter-frame difference.
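For concreteness, the sketch below shows one way this key frame selection could be implemented with OpenCV and NumPy. It computes Equation (1) on grayscale frames (a simplification of the per-pixel color difference), and the nine-sample Hanning window length is an assumed value, as the paper does not specify it.

import cv2
import numpy as np

def extract_key_frames(video_path, win_len=9):
    """Return indices of frames whose smoothed mean inter-frame
    difference D_n (Equation (1)) is a local maximum."""
    cap = cv2.VideoCapture(video_path)
    diffs, prev = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.int16)
        if prev is not None:
            # Mean absolute pixel difference between consecutive frames.
            diffs.append(np.abs(gray - prev).mean())
        prev = gray
    cap.release()

    # Smooth with a normalized Hanning window to suppress noise and
    # avoid redundant extraction of similar frames.
    win = np.hanning(win_len)
    smooth = np.convolve(diffs, win / win.sum(), mode="same")

    # A frame is a key frame when its smoothed difference is a local maximum.
    return [n for n in range(1, len(smooth) - 1)
            if smooth[n - 1] < smooth[n] > smooth[n + 1]]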

2.4. YOLOv8-SP

YOLOv8-seg is a one-stage segmentation algorithm designed for industrial deployment, featuring fast detection, high recognition accuracy, and easy deployment. It builds upon the YOLOv8 backbone and incorporates a segmentation head inspired by the YOLACT structure, adding a specialized head branch for mask segmentation. In this study, the YOLOv8-seg algorithm was used to segment broilers. Since the videos were captured by an industrial camera positioned outside the cage, broilers appeared smaller in the images the further they were from the camera; the images therefore contained both large targets close to the camera and small targets further away. To enhance the model's ability to recognize small objects, this study introduced the parallelized patch-aware attention module to improve the model's capacity for capturing multi-scale features. Additionally, within the label assignment strategy of the YOLOv8 network, a custom power transform (PT) function was developed to optimize the overlap matrix. This function further suppresses the weight of low-quality prediction boxes while amplifying the impact of high-quality prediction boxes on the model's learning. The resulting improved segmentation algorithm is referred to as YOLOv8-SP.

2.4.1. Parallelized Patch-Aware Attention Module

The parallelized patch-aware attention (PPA) module was first proposed for infrared small object detection tasks [22] and employs a parallel multi-branch strategy to improve the accuracy of small object detection. As shown in Figure 2, the PPA module includes three parallel branches: the local, global, and serial convolution branches. The input feature tensor F ∈ ℝ^{H×W×C} is first adjusted through a point-wise convolution to obtain F′ ∈ ℝ^{H′×W′×C′}, which is then processed by each of the three parallel branches to obtain three feature tensors, F_local, F_global, F_conv ∈ ℝ^{H′×W′×C′}. The outputs of the three branches are summed to obtain the input of the attention stage, F̃ ∈ ℝ^{H′×W′×C′}. The local and global branches are controlled by the patch size parameter p: the feature map is partitioned into non-overlapping patches in the spatial dimensions, and an attention matrix is computed between the patches to enable local and global feature extraction and interaction. In the attention stage, the feature tensor is successively processed through a one-dimensional (channel) attention map and a two-dimensional (spatial) attention map to obtain the final output. The process can be summarized as Equations (2) and (3):
F_c = M_c(F̃) ⊗ F̃,  F_s = M_s(F_c) ⊗ F_c    (2)
F″ = δ(β(dropout(F_s)))    (3)
where ⊗ denotes element-wise multiplication; M_c and M_s are the one-dimensional (channel) and two-dimensional (spatial) attention maps; F_c and F_s are the feature tensors after channel and spatial selection, respectively; δ and β denote the rectified linear unit and batch normalization operations, respectively; and F″ is the final output.
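A minimal PyTorch sketch of this attention stage, implementing Equations (2) and (3), is shown below. The internal structures chosen here for M_c (a pooled channel gate) and M_s (a 7 × 7 spatial gate) are common attention designs and are assumptions; the original PPA layers may differ.

import torch.nn as nn

class AttentionStage(nn.Module):
    """Attention tail of the PPA module: channel attention M_c,
    spatial attention M_s, then dropout, batch norm, and ReLU."""
    def __init__(self, channels, drop=0.1):
        super().__init__()
        self.mc = nn.Sequential(              # M_c: 1-D (channel) attention map
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid())
        self.ms = nn.Sequential(              # M_s: 2-D (spatial) attention map
            nn.Conv2d(channels, 1, 7, padding=3),
            nn.Sigmoid())
        self.dropout = nn.Dropout2d(drop)
        self.bn = nn.BatchNorm2d(channels)    # beta
        self.relu = nn.ReLU(inplace=True)     # delta

    def forward(self, f_tilde):
        f_c = self.mc(f_tilde) * f_tilde      # Equation (2): channel selection
        f_s = self.ms(f_c) * f_c              # Equation (2): spatial selection
        return self.relu(self.bn(self.dropout(f_s)))  # Equation (3)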
Figure 3 illustrates the architecture of the C2f_PPA module. To improve the model’s capacity for extracting image information, the PPA module was utilized to replace the bottleneck within the C2f module. Consequently, all C2f layers (2, 4, 6, and 8) in the backbone network were substituted with the C2f_PPA module.

2.4.2. Power Transform

The task alignment learning (TAL) technique, proposed in TOOD [23], is a method for assigning positive and negative samples. It achieves explicit alignment between the classification and regression tasks by designing a sample allocation strategy and a task-alignment loss function, which bring the optimal anchors of both tasks closer together. The sample allocation strategy suppresses the classification scores of misaligned anchors while guiding the network to improve the classification of aligned anchors. Building on this idea, the authors proposed an anchor alignment metric to evaluate task alignment at the anchor level, which is used to dynamically optimize the predictions for each anchor. In the YOLOv8 network, the TAL concept is applied to align the classification and regression tasks. In RTMDet [24], in response to the difficulty of distinguishing high-quality from low-quality matches when using generalized IoU as the matching cost, the authors introduced the logarithm of IoU as the regression cost, with the following expression:
C_reg = −log(IoU)    (4)
Inspired by this, to make the matching quality and loss weights of different GT–prediction pairs more discriminative, we optimized the overlap calculation in the alignment metric computation using a custom power transform (PT) function. This function further enhances the weight of high-quality predicted boxes while suppressing the influence of low-quality predicted boxes. The function is expressed as follows:
Overlaps = IoU^0.5, if IoU ≥ T;  IoU^2, if IoU < T    (5)
where T is the customized IoU threshold.
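A minimal sketch of the PT overlap function, as reconstructed above, follows. The exponents 0.5 and 2 match the stated intent of enhancing high-IoU matches and suppressing low-IoU matches, and should be treated as a reconstruction rather than the verified original.

import numpy as np

def power_transform(iou, t=0.45):
    """PT overlap (Equation (5)): amplify IoU at or above the
    threshold t and suppress IoU below it. t = 0.45 was the
    optimum found in the ablation study (Section 3.2)."""
    iou = np.asarray(iou, dtype=float)
    return np.where(iou >= t, np.sqrt(iou), iou ** 2)

At T = 0.45, for example, an IoU of 0.9 maps to about 0.95 (boosted) while an IoU of 0.3 maps to 0.09 (suppressed), widening the gap between high- and low-quality matches.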

2.5. Optical Flow Estimation

Optical flow estimation depicts the motion of the pixel points in an image sequence and can provide information, such as motion trajectories and the velocities of objects. In the broiler video, the activity of broilers over a period of time can be estimated by the optical flow estimation of a series of key frames. Traditional optical flow estimation methods perform well in simple scenes and small-motion processing but are not good at handling complex scenes and large-scale motion. The diverse behaviors of broilers and uneven light distribution in the cage create a complex scene that necessitates a more advanced analytical approach.
The recurrent adaptable pyramids with iterative decoding (RAPIDFlow) optical flow estimation network is an efficient and flexible optical flow estimation network that progressively generates highly detailed optical flow estimates through iterative decoding and refinement processes [25]. As shown in Figure 4, the architecture of the RAPIDFlow optical flow network mainly consists of two parts: the encoder and decoder. The encoder part employed stem convolution blocks to project the input images into the feature space and repeatedly applied a single recurrent block to generate multi-scale features. The pyramid decoder used iterative refinement applied to feature pyramids to produce highly detailed flow estimations gradually. At the last pyramid level, a convex up-sample module was added to produce sharper flow fields at the original resolution. The RAPIDFlow network is robust to the input size and efficient in the inference process, which has good performance in detecting the small-range motion in the key frames.
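As a usage sketch, RAPIDFlow is distributed through the authors' ptlflow package; the snippet below shows how a flow field for two adjacent key frames might be obtained with Sintel-pretrained weights (matching Section 2.6). The exact function and key names follow that package's documented interface but are assumptions here.

import ptlflow
from ptlflow.utils.io_adapter import IOAdapter

# Load RAPIDFlow with Sintel-pretrained weights (no custom training).
model = ptlflow.get_model("rapidflow", pretrained_ckpt="sintel")
model.eval()

# frame1, frame2: adjacent key frames as HxWx3 uint8 arrays.
io_adapter = IOAdapter(model, frame1.shape[:2])
inputs = io_adapter.prepare_inputs([frame1, frame2])
flow = model(inputs)["flows"][0, 0]  # 2xHxW displacement field (dx, dy)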

2.6. Dataset and Experimental Setup

A total of 770 images were used to construct the segmentation dataset; the majority were obtained from video frames that did not contain dead broilers, and a small portion came from images with dead broilers collected by Hao et al. [26]. Using the Labelme software (5.6.0), each broiler present in the images was annotated, with both dead and live broilers uniformly labeled under the category "broiler". The dataset was then structured in accordance with the COCO format and divided into training and validation subsets at an 8:2 ratio. Evaluation metrics, including precision, recall, and mean average precision (mAP), were chosen to verify the performance of the proposed broiler segmentation model.
All of the training work was implemented on a desktop computer with a GeForce RTX 3090 GPU (24 GB) and an Intel i9-12900K CPU. In terms of the runtime environment, the model was trained on an Ubuntu 20.04 system with PyTorch 1.13. The input size of the segmentation model was set to 640 × 640 pixels, and the total number of epochs, batch size, and optimizer were 300, 4, and SGD, respectively. The optical flow network was not trained on a custom dataset; instead, we used pre-trained weights from the Sintel optical flow dataset.
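Under these settings, training the modified model with the Ultralytics API might look like the sketch below; "yolov8-sp.yaml" and "broiler-seg.yaml" are hypothetical file names standing in for the modified model definition and the COCO-format dataset description.

from ultralytics import YOLO

# Hypothetical model definition with the C2f_PPA backbone (Section 2.4.1).
model = YOLO("yolov8-sp.yaml")
model.train(
    data="broiler-seg.yaml",  # 770 images split 8:2 into train/val (hypothetical name)
    imgsz=640,                # 640 x 640 input size
    epochs=300,
    batch=4,
    optimizer="SGD",
    pretrained=False,         # no pre-trained weights (Section 3.1)
)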

2.7. Identification of Dead Broilers

In a sequence of video frames, broilers that display movement are certainly alive and active, while those that do not exhibit movement could either be dead or their movement might have been missed by the video frames. Key frames in a video contain a wealth of motion information about the broilers, allowing their movement over a period to be described by two adjacent key frames. Therefore, the first step was to eliminate the active broilers from the key frames by assessing their movement, retaining those that did not show movement. For ease of description, the latter were defined as candidate dead broilers in this research.
To distinguish active broilers in the key frames from candidate dead broilers, this study defined the AOFI index of broilers to quantify their movement in adjacent key frames. Figure 5 illustrates the computational flowchart of the AOFI index. The first branch acquired the optical flow maps of adjacent key frames through the RAPIDFlow network, and the second branch used an instance segmentation network to segment and extract the pixel coordinate information of each broiler in the key frame. In particular, to ensure that the pixel values in the optical flow map were proportional to the motion level (i.e., pixels with no motion change had a value of 0, and the more intense the motion, the higher the value), an inverse operation was performed on the optical flow map, followed by the extraction of the gray values. Finally, the AOFI value of a broiler was calculated according to Equation (6). A statistical analysis of AOFI values was performed for both dead and active live broilers to confirm an appropriate threshold for detecting candidate dead broilers in the key frames. After that, the threshold was employed to filter out the active live broilers in each key frame, while retaining the pixel coordinates (mask) of the candidate dead broilers for further analysis.
AOFI = ( Σ_{i=1}^{N} I_G(x_i, y_i) ) / N    (6)
where N is the number of pixels in a broiler mask, (x_i, y_i) are the coordinates of the i-th pixel in the mask, and I_G is the gray value of that pixel in the inverted optical flow map.
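A minimal sketch of Equation (6) follows, assuming the optical flow field has already been rendered as a color image (e.g., the standard flow color wheel, in which static pixels are white) and the broiler mask is a boolean array from YOLOv8-SP.

import cv2

def compute_aofi(flow_rgb, mask):
    """AOFI of one broiler: mean gray value of the inverted
    optical-flow visualization over the broiler's mask pixels."""
    # Invert so that static (white) pixels become 0 and faster
    # motion yields higher values.
    inverted = 255 - flow_rgb                         # HxWx3 uint8
    gray = cv2.cvtColor(inverted, cv2.COLOR_BGR2GRAY)
    return float(gray[mask].mean())                   # average over the N mask pixels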
Typically, even inactive live broilers show some degree of movement over time, resulting in a higher frequency of movement across the video key frames than dead broilers. The historical movement of each broiler across all key frames could therefore be assessed to identify the dead broilers. Accordingly, the second step in identifying dead broilers was to count the frequency at which each broiler was determined to be a candidate dead broiler in all key frames of the video. To facilitate this counting, as shown in Figure 6a, the center point of the broiler mask was used to represent the broiler, and the center points from all key frames were projected onto the last key frame of the video to count the frequency with which each broiler was identified as a candidate dead broiler (Figure 6b). Because the number of key frames extracted varies between videos, this study defined the frequency at which a broiler was identified as a candidate dead broiler as the number of times it was judged to be a candidate dead broiler divided by the total number of key frames. Algorithm 1 presents the pseudocode for identifying dead broilers in a video.
Algorithm 1. Pseudocode for identifying dead broilers in a video
1: procedure DetermineDeadBroiler(video)
2:   keyframes ← ExtractKeyFrames(video)            ▷ inter-frame maximum method, Section 2.3
3:   centerPoints ← ∅
4:   for each adjacent pair (k, k′) of keyframes do
5:     flowMap ← RAPIDFlow(k, k′)
6:     masks ← YOLOv8-SP(k)
7:     for each mask in masks do
8:       AOFI ← getAOFI(mask, flowMap)
9:       if AOFI ≤ 1.1 then                          ▷ candidate dead broiler
10:        centerPoints ← centerPoints ∪ {centerPoint(mask)}
11:      end if
12:    end for
13:  end for
14:  for each mask in lastKeyframe do
15:    nmask ← countPointsInMask(centerPoints, mask)
16:    if nmask / numberOfKeyframes > 1/3 then
17:      return dead broiler detected
18:    end if
19:  end for
20: end procedure
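The final decision step of Algorithm 1 can be sketched in Python as follows; candidate_points pools the mask center points of candidate dead broilers over all key frames, and the function names are illustrative.

def find_dead_broilers(candidate_points, last_frame_masks, num_keyframes):
    """Report masks in the last key frame whose candidate frequency
    exceeds one-third of the key frames (Section 2.7)."""
    dead = []
    for i, mask in enumerate(last_frame_masks):       # HxW boolean masks
        h, w = mask.shape
        # Count candidate center points falling inside this broiler's mask.
        hits = sum(1 for x, y in candidate_points
                   if 0 <= int(y) < h and 0 <= int(x) < w and mask[int(y), int(x)])
        if hits / num_keyframes > 1 / 3:
            dead.append(i)                            # broiler i judged dead
    return dead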

3. Results

3.1. Comparison of Different Models

The performances of different models on the broiler segmentation datasets constructed in this study are shown in Table 1 and Figure 7. To ensure a fair comparison, none of the models used pre-trained weights during training. Among all the models, the YOLOv8-SP achieved the highest detection accuracy, with a mAP of 96.1% for boxes and 96.3% for masks. Compared to YOLOv8s-seg, YOLOv8-SP improved the mAP (box) and mAP (mask) by 1.9% and 1.8%, respectively, while the parameter count only increased by 13.8%, the computational cost rose by 4.5%, and the inference speed decreased by 36.8 FPS. These increases in parameters and computation are considered acceptable. In comparison to YOLOv8m-seg, our model improved mAP (box) and mAP (mask) by 1.2% and 1.0%, respectively, while the parameter count and computational cost were reduced by approximately 51% and 58%, respectively, with an increase in inference speed of 12.7 FPS. Compared to YOLOv5s-seg, although YOLOv8-SP was not superior in terms of parameter count and computation, its inference speed was similar, while the mAP (box) and mAP (mask) showed significant improvements of 1.8% and 2.6%, respectively. Compared to the two-stage models, SOLOv2 [27] and Mask R-CNN [28], YOLOv8-SP demonstrated substantial improvements in both detection accuracy and inference speed.

3.2. Ablation Test

In this study, ablation experiments were conducted to verify the effectiveness of the improved model for different components. YOLOv8s-seg was used as the baseline to test the impact of optimizing different numbers of C2f modules in the YOLOv8 backbone network with the PPA module on model detection performance. Additionally, the effect of varying IoU thresholds in the PT function on model accuracy was tested. As shown in Table 2, the highest detection accuracy was achieved when all four C2f modules in the backbone network were improved with the PPA module and the IoU threshold of the PT function was set to 0.45. The model’s mAP (box) and mAP (mask) increased by 1.9% and 1.8%, respectively, with the parameter count and computational cost rising by 13.8% and 4.5%, respectively. As the number of C2f_PPA modules increased, the model’s mAP (box) and mAP (mask) increased by 1.0% and 0.7%, respectively, indicating that using the PPA module to enhance multi-scale features in the backbone network was effective. Optimizing the label assignment strategy based on the PT function not only avoided any additional increase in parameters and computation but, also, under the optimal threshold, further improved the model’s mAP (box) and mAP (mask) by 0.9% and 1.1%, respectively, compared to the previous configuration.

3.3. Visualizations

As shown in Figure 8a,b, the red rectangular boxes highlight two broilers located at a distance from the camera. Compared to the segmentation results of the YOLOv8-seg model, the broiler segmentation model based on YOLOv8-SP was able to better identify small-target broilers at the far end of the cage. This suggests that the backbone network, improved with the PPA module, can effectively extract multi-scale features from the image, thus enhancing the accuracy of small-target detection. In Figure 8c, the YOLOv8-seg model failed to segment the partially occluded broilers located in the middle of the image. In contrast, as shown in Figure 8d, the YOLOv8-SP model not only accurately identified and segmented the occluded broilers but also showed an increase in detection confidence for all broilers compared to the YOLOv8-seg model. This indicates that improving the overlap function using the PT function helped guide the model to learn higher-quality predictions during training, thereby enhancing both detection and segmentation performance. In terms of the quality of the broiler mask, the improved YOLOv8-SP model was able to segment each broiler in the image more accurately, which was highly beneficial for extracting the coordinates of the broilers for subsequent processing.

3.4. Correlation Verification of Broiler Movement and AOFI Value

To validate the positive proportional relationship between the AOFI value of the broiler and its movement, the pixels corresponding to the broiler mask in the optical flow map were extracted and visualized. As depicted in Figure 9, the black areas in the images represent the background, while the colored pixels indicate the broiler’s movement. In the pictures, the color of the pixels signifies the direction of movement, the area of the pixels indicates the extent of the movement, and the brightness of the pixels corresponds to the velocity of the movement. In Figure 9a, the broiler mask was predominantly white, indicating minimal movement during the observed period. As the AOFI value increased, colored pixels began to appear within the white broiler mask, and both the area and saturation of the colored pixels gradually increased. For instance, in Figure 9b, the yellow pixels on the broiler’s head indicated slight up and down movements. Figure 9c–e show broiler masks, where red and blue pixels represent movement in both the broiler’s head and neck. In Figure 9f, the broiler mask had an AOFI value of 8.71, with red pixels exhibiting higher saturation than in Figure 9d, indicating more rapid head movements. Figure 9g,h show broiler masks with AOFI values of 55.61 and 111.65, respectively. The mask regions were almost entirely filled with highly saturated colored pixels, suggesting that the broilers were moving rapidly and vigorously.

3.5. Candidate Dead Broiler Identification with Threshold Method

The optimal threshold for AOFI should minimize the probability of live broilers in the key frames being identified as candidate dead broilers, while ensuring that actual dead broilers are identified as such. Therefore, four videos per week containing dead broilers were randomly selected from weeks three to six, resulting in a total of sixteen videos. Key frames containing dead and live broilers were extracted for analysis from the video clips. A total of 904 dead broilers and 904 live broilers were counted, and their AOFI values were recorded. Figure 10 summarizes the distribution of AOFI values for the 904 live broilers and 904 dead broilers. As shown in Figure 10a, the AOFI values of the dead broilers were concentrated around 1.0, ranging from a minimum value of 0.01 to a maximum value of 26.70. Conversely, the AOFI values of the live broilers shown in Figure 10b were more dispersed, primarily ranging from 0 to 20, with a minimum value of 0 and a maximum value of 138.89.
To further determine the AOFI threshold, the classification performance of candidate dead broilers in the video frames was compared across five different AOFI thresholds. Table 3 details the proportions of dead and live broilers classified as candidate dead broilers at each threshold. As the threshold increased, both the proportion of dead broilers and the proportion of live broilers identified as candidate dead broilers gradually increased. At a threshold of 1.1, the proportion of dead broilers identified as candidate dead broilers was 84.18%, an increase of 10.18% compared with the threshold of 1.0, while the proportion of live broilers identified as candidate dead broilers was 11.83%, an increase of 2.65%. At a threshold of 1.0, the proportion of dead broilers identified as candidate dead broilers was 74.00%, an increase of 20.58% from the threshold of 0.9, while the proportion of live broilers identified as candidate dead broilers was 9.18%, an increase of 2.28%. The results show that 53.42% of the dead broilers had an AOFI value between 0 and 0.9, approximately 20.58% had an AOFI value between 0.9 and 1.0, and 10.18% had an AOFI value between 1.0 and 1.1. The proportions of dead and live broilers with AOFI values between 1.1 and 1.2, and between 1.2 and 1.3, did not exceed 3% in either case. Therefore, setting the AOFI threshold at 1.1 ensured that the maximum number of dead broilers was classified as candidate dead broilers while reducing the likelihood of live broilers being classified as such. Figure 11a illustrates the projection of the center points of candidate dead broilers extracted from the key frames using the AOFI threshold onto the last key frame. Most of the center points of broilers in the key frames were filtered out by the AOFI threshold (Figure 11b). The dead broilers in the key frames were characterized by the highest concentration of center points, which means dead broilers in most key frames were correctly classified as candidate dead broilers. The positions of live broilers had either no center points or a lower number, suggesting that the AOFI threshold was effective in filtering out most active broilers in the key frames.
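For reference, the proportions in Table 3 can be reproduced with a short sweep over candidate thresholds, assuming two arrays of per-instance AOFI values (one for the 904 dead and one for the 904 live broiler instances).

import numpy as np

def candidate_rates(aofi_dead, aofi_live, thresholds=(0.9, 1.0, 1.1, 1.2, 1.3)):
    """For each candidate AOFI threshold, return the fraction of dead
    and live broiler instances whose AOFI falls below the threshold,
    i.e., that would be classified as candidate dead broilers."""
    aofi_dead, aofi_live = np.asarray(aofi_dead), np.asarray(aofi_live)
    return {t: ((aofi_dead < t).mean(), (aofi_live < t).mean())
            for t in thresholds}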

3.6. Results of Identifying Dead Broilers

In a video, the frequency with which dead broilers were identified as candidate dead broilers was often the highest. Consequently, the frequency of each broiler being identified as a candidate dead broiler in the video was tallied, and the maximum value of the frequency was selected for manual verification to determine whether the broiler with the highest frequency was indeed a dead broiler. Eighty videos in the training video sets containing dead broilers were utilized to validate the method. Upon statistical analysis, it was found that in 71 out of the 80 videos, the dead broilers were identified as candidate dead broilers with the highest frequency. Among these, in 67 videos, the ratio of dead broilers successfully identified as candidate dead broilers across all the key frames exceeded one-third. According to the statistical result, this study set the frequency threshold for identifying dead broilers at one-third and validated it on a test set, which included 18 videos with dead broilers and 12 videos with only live broilers.
In the validation results of the 18 videos containing dead broilers, the dead broilers in 16 videos were most frequently identified as candidate dead broilers, with the frequency exceeding one-third of the total frames. In the other two videos, the reason why the dead broilers were not correctly identified is that they were occluded by live broilers for most of the time. As shown in Figure 12a,b, the dead broilers were partially obscured by surrounding broilers, with only a small part of their bodies exposed in the field of view. Among the 12 videos containing only live broilers, 10 were correctly identified. One video was misidentified due to the waterline obstructing the broiler’s head (Figure 12c), which prevented the optical flow algorithm from detecting the broiler’s movements (when broilers are at rest, most of their movement comes from the activity of their heads). Another video was misidentified because a broiler was in a state of sleep, exhibiting minimal or no movement (Figure 12d). In conclusion, the accuracy of the method in identifying dead broilers on the test set was 86.7%.

4. Discussion

4.1. Misclassification of Candidate Dead Broilers

When the AOFI threshold was used to identify candidate dead broilers in key frames, the candidates usually included both live and dead broilers. The reasons for live broilers being mistakenly identified as candidate dead broilers fall into two categories. First, if a broiler did not move at all during the interval between two adjacent key frames, its AOFI value would be nearly zero, and it would thus be identified as a candidate dead broiler. Second, if a broiler's movement was so minimal or slow that it could not be detected by the optical flow algorithm, its AOFI value would fall below the threshold and it would be identified as a candidate dead broiler. The first scenario was unavoidable because, in the broiler farm, the movement of broilers in cages was random and uncontrollable; key frames could not always capture the movement of all broilers simultaneously. Capturing video during periods when the broilers were active could reduce such occurrences. The second scenario frequently occurred with broilers farther from the camera, whose movements, due to perspective effects and shooting angles, were too small to be detected by the optical flow algorithm. These distant broilers were directly excluded by the algorithm in the identification of the candidate dead broilers. In addition, broilers close to the camera with minimal movement also went undetected by the optical flow algorithm. This issue gradually improved as the broilers aged: as they grew larger, the body movements caused by their breathing could be detected by the optical flow algorithm, reducing the misidentification of live broilers as candidate dead broilers. To mitigate the second scenario, one could either select an appropriate video capture time or increase the sensitivity of the optical flow algorithm.
Dead broilers in key frames were also sometimes mistakenly classified as live broilers. The primary cause was the obstruction of dead broilers by live broilers. When the obstruction was minimal, the movement of active broilers around the dead broiler could transfer optical flow to the dead broiler, resulting in an abnormally high AOFI value. When the obstruction was more severe (with the blocked area greater than half the size of the dead broiler), the segmentation model might fail to accurately segment parts of the dead broiler's pixels, and the movement of surrounding active broilers would have an even greater impact on its AOFI value. This misclassification of dead broilers occurred frequently in this study: in every video we collected, the ratio of key frames in which the dead broiler was identified as a candidate dead broiler never reached 1.0. That is, in all videos, there were always key frames in which dead broilers were not correctly classified as candidate dead broilers.

4.2. Cause of False Identification of Dead Broilers

The main reason for the reduced accuracy in identifying dead broilers was occlusion by other broilers. When a dead broiler is largely occluded, it becomes difficult to segment accurately; its AOFI value is more susceptible to the influence of other chickens, and it might even be missed during segmentation. In the 19 videos where dead broiler identification failed, the dead broilers were obstructed for extended periods, with more than half of their bodies covered. In 10 of these videos, the occlusion of the dead broilers was mainly caused by drinking broilers. Occlusion has consistently been a key factor limiting the accuracy of poultry health and behavior identification based on video or image analysis. Liu et al. [29] attempted to identify dead chickens using the YOLOv4 algorithm but failed to recognize dead chickens when their heads and feet were not visible. Nasiri et al. [30] also reported that their drinking detection algorithm failed to recognize the drinking behavior of broilers standing behind the water line. As the degree of obstruction of a dead broiler increases, it is believed that no vision-based method can accurately identify it.

4.3. Comparison with Related Studies

Table 4 shows the identification methods and outstanding issues in previous studies on dead poultry recognition. Researchers primarily utilized the physical appearance or temperature characteristics of poultry to identify dead poultry [12,31]. These identification methods, built on color camera sensors and deep learning techniques, are usually efficient, minimally invasive, and easy to deploy. However, most studies share a common difficulty in identifying occluded dead poultry. In addition, our previous research found that some dead broilers in stacked-cage housing have appearance characteristics similar to those of normal broilers, leading to missed detections when appearance-based methods are used to identify dead broilers [26]. As shown in Figure 13a, in the commercial broiler farm, most dead broilers were found with their chests flipped upwards, displaying a red or brownish-red color and with their legs exposed, which is distinctly different from the live broilers shown in Figure 13d. However, as depicted in Figure 13b,c, some dead broilers were found lying on their chests, making them resemble resting live broilers and causing appearance-based algorithms to struggle with accurate identification. The dead broiler identification method proposed in this study is not affected by physical appearance and could effectively identify dead broilers that resemble live ones.

4.4. Limits and Further Improvement

Although the research method proposed in this paper could effectively identify dead broilers similar in appearance to normal broilers, it, like other vision-based methods, struggled with severely occluded dead broilers. In practical applications, however, video-based methods may be more robust to occlusion than image-based methods: as the degree of occlusion of a dead broiler changes over time, a video-based method can exploit moments of lighter occlusion that an image-based method may never have the opportunity to capture. Compared with studies that rely on public datasets or images collected under laboratory conditions for poultry health identification, the approach proposed in this research is more practical for production environments. The videos used in this study were captured at commercial poultry farms, with the collection method strictly following the positions and angles specified by the inspection platform [26]. From an application perspective, most existing poultry mortality inspection platforms primarily use image-based detection methods to identify dead poultry. The hardware cost of the method proposed in this study is identical to that of image-based detection, although the algorithm deployment is slightly more complex. Moreover, the inspection strategy for mobile inspection platforms differs: this method requires stopping at each cage for at least one minute to capture video, which increases inspection time.
To enhance the robustness of the proposed dead broiler identification algorithm, future validation should include more videos. Improvements could involve reducing occlusion and increasing the sensitivity of optical flow estimation. For example, capturing video just before feeding times can prevent drinking broilers from obstructing the view of potential dead ones. Extending the video duration allows live broilers to make more movements. A more precise optical flow estimation algorithm to detect subtle movements, such as those caused by breathing in young broilers, can improve the accuracy of identifying candidate dead broilers in key frames. In the future, we will further consider using video-understanding algorithms to more accurately identify the movements of each broiler, thereby identifying dead broilers in shorter videos.

5. Conclusions

This study presented a novel method for identifying dead broilers on commercial broiler farms. Unlike previous studies that focused on identifying dead poultry based on appearance and temperature anomalies, this research emulated the human approach to identifying dead broilers by analyzing their movements in video footage. A YOLOv8-SP network was developed, and by introducing the PPA module and PT function into the YOLOv8-seg network, the model’s ability to recognize small targets and improve mask quality was significantly enhanced. To describe the movement of broilers between key frames, an AOFI threshold was defined to assess their motion, enabling the preliminary identification of dead broilers. The final identification of dead broilers in the video was determined by calculating the frequency with which each broiler was classified as dead across all key frames.
The identification accuracy of the proposed video-based method was 86.7%. The remaining errors were primarily due to the significant number of occluded dead broilers in our video dataset; as with image-based methods, accurately identifying severely occluded broilers remains a challenge. To further enhance performance, it is recommended to capture broiler videos before feeding times, which would reduce occlusion caused by active broilers. Additionally, employing a more sensitive optical flow algorithm could improve detection accuracy. Finally, this method may be particularly well suited to floor-raised broilers, as these broilers are housed at a lower density, making them less prone to occlusion.

Author Contributions

Conceptualization, H.H.; methodology, H.H. and E.D.; software, H.H.; validation, H.H.; formal analysis, H.H. and F.Z.; investigation, H.H.; resources, H.W.; data curation, X.L. and H.H.; writing—original draft preparation, H.H. and X.L.; writing—review and editing, E.D., L.W., F.Z. and H.W.; visualization, H.H.; supervision, L.W.; project administration, H.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The video datasets were collected on a broiler farm in China, under the supervision of the Animal Welfare and Animal Ethics Review Committee of China Agricultural University. Grant Number: AW72604202-6-1. The broilers in the experiment died without any human intervention.

Data Availability Statement

All data generated or analyzed during this study are included in this published article.

Acknowledgments

The authors would like to thank Zhoudong and Fengqibo for providing the experimental materials and the experimental site.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations were used in this manuscript:
YOLOv8-seg: YOLOv8 segmentation
AOFI: Average optical flow intensity

References

  1. Castro, F.L.S.; Chai, L.; Arango, J.; Owens, C.M.; Smith, P.A.; Reichelt, S.; DuBois, C.; Menconi, A. Poultry Industry Paradigms: Connecting the Dots. J. Appl. Poult. Res. 2023, 32, 100310. [Google Scholar] [CrossRef]
  2. Astill, J.; Dara, R.A.; Fraser, E.D.G.; Roberts, B.; Sharif, S. Smart Poultry Management: Smart Sensors, Big Data, and the Internet of Things. Comput. Electron. Agric. 2020, 170, 105291. [Google Scholar] [CrossRef]
  3. Tainika, B.; Şekeroğlu, A.; Akyol, A.; Waithaka Ng’ang’a, Z. Welfare Issues in Broiler Chickens: Overview. World’s Poult. Sci. J. 2023, 79, 285–329. [Google Scholar] [CrossRef]
  4. Kittelsen, K.E.; Granquist, E.G.; Kolbjørnsen, Ø.; Nafstad, O.; Moe, R.O. A Comparison of Post-Mortem Findings in Broilers Dead-on-Farm and Broilers Dead-on-Arrival at the Abattoir. Poult. Sci. 2015, 94, 2622–2629. [Google Scholar] [CrossRef] [PubMed]
  5. Zhuang, X.; Zhang, T. Detection of Sick Broilers by Digital Image Processing and Deep Learning. Biosyst. Eng. 2019, 179, 106–116. [Google Scholar] [CrossRef]
  6. Fodor, I.; Van Der Sluis, M.; Jacobs, M.; De Klerk, B.; Bouwman, A.C.; Ellen, E.D. Automated Pose Estimation Reveals Walking Characteristics Associated with Lameness in Broilers. Poult. Sci. 2023, 102, 102787. [Google Scholar] [CrossRef]
  7. Cui, Y.; Kong, X.; Chen, C.; Li, Y. Research on Broiler Health Status Recognition Method Based on Improved YOLOv5. Smart Agric. Technol. 2023, 6, 100324. [Google Scholar] [CrossRef]
  8. Aydin, A. Development of an Early Detection System for Lameness of Broilers Using Computer Vision. Comput. Electron. Agric. 2017, 136, 140–146. [Google Scholar] [CrossRef]
  9. Sadeghi, M.; Banakar, A.; Minaei, S.; Orooji, M.; Shoushtari, A.; Li, G. Early Detection of Avian Diseases Based on Thermography and Artificial Intelligence. Animals 2023, 13, 2348. [Google Scholar] [CrossRef] [PubMed]
  10. Yogi, K.K.; Yadav, S.P. Chicken Diseases Detection and Classification Based on Fecal Images Using EfficientNetB7 Model. Evergreen 2024, 11, 314–330. [Google Scholar] [CrossRef]
  11. Xin, C.; Li, H.; Li, Y.; Wang, M.; Lin, W.; Wang, S.; Zhang, W.; Xiao, M.; Zou, X. Research on an Identification and Grasping Device for Dead Yellow-Feather Broilers in Flat Houses Based on Deep Learning. Agriculture 2024, 14, 1614. [Google Scholar] [CrossRef]
  12. Zhao, Y.; Shen, M.; Liu, L.; Chen, J.; Zhu, W. Study on the method of detecting dead chickens in caged chicken based on improved YOLOv5s and image fusion. J. Nanjing Agric. Univ. 2024, 47, 369–382. (In Chinese) [Google Scholar]
  13. Carpentier, L.; Vranken, E.; Berckmans, D.; Paeshuyse, J.; Norton, T. Development of Sound-Based Poultry Health Monitoring Tool for Automated Sneeze Detection. Comput. Electron. Agric. 2019, 162, 573–581. [Google Scholar] [CrossRef]
  14. Adebayo, S.; Aworinde, H.O.; Akinwunmi, A.O.; Alabi, O.M.; Ayandiji, A.; Sakpere, A.B.; Adeyemo, A.; Oyebamiji, A.K.; Olaide, O.; Kizito, E. Enhancing Poultry Health Management through Machine Learning-Based Analysis of Vocalization Signals Dataset. Data Brief 2023, 50, 109528. [Google Scholar] [CrossRef] [PubMed]
  15. Cuan, K.; Zhang, T.; Li, Z.; Huang, J.; Ding, Y.; Fang, C. Automatic Newcastle Disease Detection Using Sound Technology and Deep Learning Method. Comput. Electron. Agric. 2022, 194, 106740. [Google Scholar] [CrossRef]
  16. Bhandekar, A.; Udutalapally, V.; Das, D. Acoustic Based Chicken Health Monitoring in Smart Poultry Farms. In Proceedings of the 2023 IEEE International Symposium on Smart Electronic Systems (iSES), Ahmedabad, India, 18 December 2023; pp. 224–229. [Google Scholar]
  17. Mei, W.; Yang, X.; Zhao, Y.; Wang, X.; Dai, X.; Wang, K. Identification of Aflatoxin-Poisoned Broilers Based on Accelerometer and Machine Learning. Biosyst. Eng. 2023, 227, 107–116. [Google Scholar] [CrossRef]
  18. Bao, Y.; Lu, H.; Zhao, Q.; Yang, Z.; Xu, W. Detection system of dead and sick chickens in large scale farms based on artificial intelligence. Math. Biosci. Eng. 2021, 18, 6117–6135. [Google Scholar] [CrossRef] [PubMed]
  19. Wang, Y.; Wang, W.; Li, Y.; Guo, J.; Xu, Y.; Ma, J.; Ling, Y.; Fu, Y.; Jia, Y. Research on Traditional and Deep Learning Strategies Based on Optical Flow Estimation—A Review. J. King Saud Univ. Comput. Inf. Sci. 2024, 36, 102029. [Google Scholar] [CrossRef]
  20. Dawkins, M.S.; Wang, L.; Ellwood, S.A.; Roberts, S.J.; Gebhardt-Henrich, S.G. Optical Flow, Behaviour and Broiler Chicken Welfare in the UK and Switzerland. Appl. Anim. Behav. Sci. 2021, 234, 105180. [Google Scholar] [CrossRef]
  21. Neves, D.P.; Mehdizadeh, S.A.; Tscharke, M.; Nääs, I.D.A.; Banhazi, T.M. Detection of Flock Movement and Behaviour of Broiler Chickens at Different Feeders Using Image Analysis. Inf. Process. Agric. 2015, 2, 177–182. [Google Scholar] [CrossRef]
  22. Xu, S.; Zheng, S.; Xu, W.; Xu, R.; Wang, C.; Zhang, J.; Teng, X.; Li, A.; Guo, L. HCF-Net: Hierarchical context fusion network for infrared small object detection. arXiv 2024, arXiv:2403.10778. [Google Scholar]
  23. Feng, C.; Zhong, Y.; Gao, Y.; Scott, M.R.; Huang, W. TOOD: Task-aligned One-stage Object Detection. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Montreal, QC, Canada, 10–17 October 2021; pp. 3490–3499. [Google Scholar]
  24. Lyu, C.; Zhang, W.; Huang, H.; Zhou, Y.; Wang, Y.; Liu, Y.; Zhang, S.; Chen, K. RTMDet: An empirical study of designing real-time object detectors. arXiv 2022, arXiv:2212.07784. [Google Scholar]
  25. Morimitsu, H.; Zhu, X.; Cesar, R.M.; Ji, X.; Yin, X.-C. RAPIDFlow: Recurrent adaptable pyramids with iterative decoding for efficient optical flow estimation. In Proceedings of the 2024 IEEE International Conference on Robotics and Automation (ICRA), Yokohama, Japan, 13–17 May 2024; pp. 2946–2952. [Google Scholar]
  26. Hao, H.; Fang, P.; Duan, E.; Yang, Z.; Wang, L.; Wang, H. A Dead Broiler Inspection System for Large-Scale Breeding Farms Based on Deep Learning. Agriculture 2022, 12, 1176. [Google Scholar] [CrossRef]
  27. Wang, X.; Zhang, R.; Kong, T.; Li, L.; Shen, C. SOLOv2: Dynamic and fast instance segmentation. Adv. Neural Inf. Process. Syst. 2020, 33, 17721–17732. [Google Scholar]
  28. He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 2980–2988. [Google Scholar]
  29. Liu, H.-W.; Chen, C.-H.; Tsai, Y.-C.; Hsieh, K.-W.; Lin, H.-T. Identifying Images of Dead Chickens with a Chicken Removal System Integrated with a Deep Learning Algorithm. Sensors 2021, 21, 3579. [Google Scholar] [CrossRef]
  30. Nasiri, A.; Amirivojdan, A.; Zhao, Y.; Gan, H. An Automated Video Action Recognition-Based System for Drinking Time Estimation of Individual Broilers. Smart Agric. Technol. 2024, 7, 100409. [Google Scholar] [CrossRef]
  31. Luo, S.; Ma, Y.; Jiang, F.; Wang, H.; Tong, Q.; Wang, L. Dead Laying Hens Detection Using TIR-NIR-Depth Images and Deep Learning on a Commercial Farm. Animals 2023, 13, 1861. [Google Scholar] [CrossRef] [PubMed]
  32. Muvva, V.V.; Zhao, Y.; Parajuli, P.; Zhang, S.; Tabler, T.; Purswell, J. Automatic Identification of Broiler Mortality Using Image Processing Technology. In Proceedings of the 10th International Livestock Environment Symposium (ILES X), Omaha, NE, USA, 25–27 September 2018; p. 1. [Google Scholar]
Figure 1. Schematic diagram of the dead broiler identification algorithm.
Figure 2. The network structure of the parallelized patch-aware attention module.
Figure 3. Schematic diagram of the C2f_PPA module.
Figure 4. Schematic diagram of the RAPIDFlow network.
Figure 5. Schematic diagram of broiler average optical flow intensity (AOFI) calculation.
Figure 6. Center points of broilers. (a) The green rectangular boxes represent the boundary boxes of the broilers obtained from the broiler segmentation model, with P1(x1, y1) and P2(x2, y2) denoting the top-left and bottom-right coordinates of the rectangle, respectively. O(x, y) is the center point of the broiler mask. (b) The result of projecting the center points of the candidate dead broilers from all key frames onto the last key frame.
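Figure 6 defines O(x, y) as the center of the broiler mask, with P1 and P2 as the bounding box corners. The sketch below shows two plausible realizations of that center point; the pixel centroid of the mask is an assumption on our part, not a definition taken from the paper.

```python
import numpy as np

def mask_center(mask):
    """Center point O(x, y) of a broiler mask as its pixel centroid."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

def box_center(x1, y1, x2, y2):
    """Center of the box spanned by P1(x1, y1) and P2(x2, y2)."""
    return (x1 + x2) / 2.0, (y1 + y2) / 2.0
```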
Figure 7. Comparisons with other models.
Figure 8. Visualization of the segmentation results. (a,b) The segmentation results of YOLOv8-seg and YOLOv8-SP at three weeks of age. (c,d) The segmentation results of YOLOv8-seg and YOLOv8-SP at six weeks of age. In (a,c), the red rectangular boxes mark examples of broilers missed by YOLOv8-seg. In (b,d), the red rectangular boxes mark the corresponding broilers successfully detected by YOLOv8-SP.
Figure 9. Examples of broiler masks with different AOFI values. (a) AOFI value: 0.74, (b) AOFI value: 1.23, (c) AOFI value: 2.31, (d) AOFI value: 4.53, (e) AOFI value: 6.86, (f) AOFI value: 8.71, (g) AOFI value: 55.61, and (h) AOFI value: 111.65.
Figure 10. Statistics of the AOFI values of (a) dead and (b) live broilers in the key frames.
Figure 11. Schematic diagram of broiler distribution. Each blue point represents a broiler in the key frame. (a) Before removing the active broilers and (b) after removing the active broilers.
Figure 12. Examples of misidentified broilers. (a,b) Dead broilers are marked by red rectangular boxes. (c,d) A live broiler obstructed by the waterline and a broiler in a sleeping state are marked by red rectangular boxes, respectively.
Figure 13. Examples of dead and live broilers. (a) Broilers that die from acute causes are usually found chest-up with their limbs dangling down; over time, the abdomen gradually changes from pinkish to brownish-black. (b,c) Broilers that die from chronic diseases have an appearance similar to that of resting live broilers. (d) A resting live broiler.
Table 1. Comparisons with other models.

| Model | mAP (Box) | mAP (Mask) | Parameters | GFLOPs | FPS |
|---|---|---|---|---|---|
| YOLOv8s-seg | 94.2% | 94.5% | 11,779,987 | 42.4 | 108.7 |
| YOLOv8m-seg | 95.1% | 95.1% | 27,222,963 | 110.0 | 59.2 |
| YOLOv5s-seg | 94.5% | 93.5% | 7,398,422 | 25.7 | 90.1 |
| Mask R-CNN | 88.0% | 85.1% | 45,822,771 | 284.5 | 28.8 |
| SOLOv2 | – | 86.7% | 48,549,068 | 279.5 | 42.9 |
| YOLOv8-SP | 96.3% | 96.1% | 13,408,861 | 46.3 | 71.9 |
Table 2. Ablation test.

| Model | mAP (Box) | mAP (Mask) | Parameters | GFLOPs |
|---|---|---|---|---|
| YOLOv8-seg | 94.2% | 94.5% | 11,779,987 | 42.4 |
| +PPA(1,2) | 95.0% | 95.1% | 11,922,213 | 44.4 |
| +PPA(3,4) | 95.0% | 95.0% | 13,266,635 | 44.4 |
| +PPA(1,2,3,4) | 95.2% | 95.2% | 13,408,861 | 46.3 |
| +PPA(1,2,3,4) + PT(0.40) | 94.9% | 95.1% | 13,408,861 | 46.3 |
| +PPA(1,2,3,4) + PT(0.45) | 96.1% | 96.3% | 13,408,861 | 46.3 |
| +PPA(1,2,3,4) + PT(0.50) | 96.0% | 95.7% | 13,408,861 | 46.3 |
| +PPA(1,2,3,4) + PT(0.55) | 95.5% | 95.3% | 13,408,861 | 46.3 |
| +PPA(1,2,3,4) + PT(0.60) | 96.1% | 96.3% | 13,408,861 | 46.3 |
Table 3. Proportions of dead and live broilers classified as candidate dead broilers at different AOFI thresholds (T).

| Mask Type | T = 0.9 | T = 1.0 | T = 1.1 | T = 1.2 | T = 1.3 |
|---|---|---|---|---|---|
| Dead broilers | 53.42% | 74.00% | 84.18% | 86.17% | 87.61% |
| Live broilers | 6.96% | 9.18% | 11.83% | 14.15% | 15.81% |
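Reading Table 3 operationally: a broiler whose AOFI falls below T in a key frame becomes a candidate dead broiler, and the final decision depends on how often that happens across the video's key frames. Below is a minimal sketch of that decision rule; both T and the frequency cutoff `freq_ratio` are assumed values rather than the paper's settings (Table 3 suggests T around 1.1–1.2 trades off the two error rates).

```python
def is_dead(aofi_values, t=1.2, freq_ratio=0.8):
    """Flag a broiler as dead if its AOFI stays below the
    threshold T in a large enough fraction of key frames.

    aofi_values: one AOFI value per key frame for one broiler.
    t and freq_ratio are hypothetical, not the paper's settings.
    """
    candidate_hits = sum(v < t for v in aofi_values)
    return candidate_hits / len(aofi_values) >= freq_ratio
```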
Table 4. Comparison of the dead poultry detection method with related studies.

| Authors | Detection Target | Sensor | Method | Extracted Features | Advantage | Existing Issues |
|---|---|---|---|---|---|---|
| Hao et al. (2022) [26] | Caged dead broiler | RGB camera | YOLOv3 | Morphological characteristics | Efficient, minimally invasive, and easy to deploy | Failed to identify occluded dead broilers with an appearance similar to normal ones |
| Liu et al. (2021) [29] | Floor-raised dead chicken | RGB camera | YOLOv4 | Morphological characteristics | – | Failed to identify occluded dead chickens |
| Xin et al. (2024) [11] | Floor-raised dead broiler | Binocular camera | YOLOv6 | Morphological characteristics | – | Failed to identify distant and occluded broilers |
| Muvva et al. (2018) [32] | Floor-raised dead broiler | Thermal cameras | Background subtraction method | Thermal characteristics | Minimally invasive | High cost; failed to identify occluded dead broilers |
| Luo et al. (2023) [31] | Caged dead chicken | Depth and thermal cameras | Deformable DETR | Morphological and thermal characteristics | Minimally invasive, high accuracy | High cost and low image quality |
| Bao et al. (2021) [18] | Caged dead chicken | Three-dimensional acceleration sensor | Back-propagation network | Motion characteristics | High accuracy | Intrusive; additional labor burden |