Pig Movement Estimation

Pig husbandry constitutes a significant segment of livestock farming, and porcine well-being is a paramount concern because of its direct implications for pig breeding and production. An easily observable proxy for pig health is the daily movement pattern: more active pigs are usually healthier than inactive ones, so movement gives farmers a way to identify a pig's health state before it becomes sick or its condition becomes life-threatening. However, conventional means of estimating pig mobility rely largely on manual observation by farmers, which is impractical in contemporary centralized, large-scale pig farming operations. In response to these challenges, multi-object tracking and pig behavior recognition methods have been adopted to monitor pig health and welfare closely.

  • pig movement estimation
  • multi-object tracking
  • optical flow
  • livestock farming

1. Introduction

The increasing integration of Artificial Intelligence (AI) into the agricultural sector has garnered significant attention recently, primarily propelled by the rapid advancements in AI technologies. Within the domain of computer vision, recognized for its intricate tasks spanning object detection, action recognition, multi-object tracking, and more, AI has demonstrated successful applications across diverse agricultural domains. These applications encompass crucial areas such as plant disease detection [1], pig behavior recognition [2], cattle behavior recognition [3], and livestock tracking [4], among others. It is paramount to underscore the pivotal role of livestock farming in the broader agricultural landscape, serving as a primary source of meat production for a significant portion of the global population. In response to this evolving agricultural paradigm, a novel concept known as Precision Livestock Farming (PLF) has emerged. PLF harnesses the synergistic capabilities of AI and Internet of Things (IoT) technologies, equipping livestock farmers with scientifically informed decision-making tools and adaptive management strategies [5,6]. This innovative approach ushers in a new era in livestock farming, where data-driven insights and intelligent systems empower farmers to optimize their operations and enhance the overall efficiency and sustainability of livestock production.
Pig health stands as a recurrent focal point within the realm of livestock farming given its intricate interplay with pig breeding and production. A fundamental yardstick for assessing pig well-being lies in the daily ambulatory patterns exhibited by pigs. In conventional pig farming, the onus of monitoring pig movement typically falls upon farmers, an endeavor demanding substantial time and labor resources [7,8]. However, modern commercial pig breeding enterprises have embraced a centralized and large-scale operational paradigm, rendering traditional labor-intensive monitoring approaches impractical. Moreover, the demanding working conditions pervasive in the pig farming industry have restricted the pool of individuals willing to pursue careers in this sector, resulting in a dearth of labor resources for monitoring pig movement. These factors underscore the pressing necessity for automated approaches to pig farming.
In response, numerous researchers have dedicated their efforts to this burgeoning field of study. Broadly, two predominant methodologies for estimating pig movement have emerged. The first approach leverages behavior recognition algorithms [9,10,11,12] to classify various pig activities, encompassing lying, walking, sitting, standing, drinking, etc. The second approach harnesses tracking algorithms [8,13,14] within the domain of computer vision to monitor the positions of individual pigs and subsequently assess pig movement using the center points of bounding boxes. Notably, two pivotal considerations underlie the calculation of pig movement. The first is the effectiveness of the tracking method: the tracking model must follow each pig consistently across all frames. The second is how pig movement is derived from the tracking output. Although significant strides have been made in pig movement assessment, existing methodologies are limited in providing both individual and cumulative distance measurements. Behavior-recognition-based methods primarily concentrate on static behaviors and fail to provide quantified assessments of the current and cumulative movements of individual pigs, i.e., how far each pig moves and how long it keeps moving in an hour or a day. Tracking-based approaches typically first track the pigs to obtain bounding boxes and then use the distance traveled by the center points of those boxes as the measure of pig movement, as sketched below. However, the accuracy of such motion distance measurements hinges on the size of the bounding boxes, rendering calculations inaccurate when box dimensions fluctuate due to tracking limitations.
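To make this second point concrete, the following is a minimal Python sketch of the conventional center-point approach, with hypothetical track data and pixel units: per-pig movement is accumulated as the frame-to-frame displacement of bounding-box centers, which is exactly where jitter in box size leaks into the distance estimate.

```python
import numpy as np

def center(box):
    """Center (x, y) of a bounding box given as (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    return np.array([(x1 + x2) / 2.0, (y1 + y2) / 2.0])

def cumulative_distance(track):
    """Sum frame-to-frame displacements of the box center for one pig.

    `track` is a per-frame list of boxes from a multi-object tracker.
    Any jitter in box *size* also shifts the center, so it is counted
    as movement even when the pig has not moved.
    """
    centers = [center(b) for b in track]
    return float(sum(np.linalg.norm(b - a) for a, b in zip(centers, centers[1:])))

# Hypothetical track: the pig is stationary, but the detector's box width jitters.
track = [(100, 50, 160, 110), (100, 50, 170, 110), (100, 50, 160, 110)]
print(f"estimated movement: {cumulative_distance(track):.1f} px")  # non-zero despite no motion
```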

2. Applications of Computer Vision Technologies in Pig Farming

The rapid advancement of computer vision technology has ushered the pig farming industry into an era of non-contact and automated breeding. Many classic computer vision tasks have been adapted and applied to the pig farming sector, providing a foundation for pig health monitoring and breeding management decisions. In terms of pig detection applications, Bo et al. [15] proposed a real-time pig detection system based on infrared cameras that effectively mitigates the issue of infrared reflection in pig farms. This system comprises a data collector for gathering infrared images, a preprocessor for converting noisy images into clean ones, and a detector for pig detection. The preprocessor employs U-Net and Generative Adversarial Networks (GAN) for feature extraction and is trained on paired clean datasets and datasets with simulated noise. Lei et al. [16] introduced a non-contact machine vision method in which Mask R-CNN and UNet-Attention were implemented for sow target perception in complex scenarios. Ding et al. [17] proposed a method named FD-CNN to detect the regions of active piglets based on the YOLOv5s model. They employed a detection model to predict the area occupied by active piglets and then estimated the overall average activity level of piglets during the lactation period by calculating the ratio of this area to the total area occupied by all piglets; this analysis was used to study variations in piglet activity.
Regarding pig tracking, Guo et al. [18] proposed a weighted association algorithm combined with two multi-object tracking models to improve tracking performance. Liu et al. presented a new method to track individual pig trajectories, in which the tracking model was based on DeepLabCut and optimal trajectory clustering was achieved via kernel principal component analysis. To measure the number and types of social encounters among pigs, Wutke et al. [19] developed a framework for the automated identification of social contacts, in which Convolutional Neural Networks (CNN) and a Kalman filter were employed to recognize social contacts in the form of head–head and head–tail contacts. In terms of automated pig monitoring, Wang et al. [20] proposed a one-shot tracker to solve the re-identification problem in pig tracking. This method jointly trained detection and re-identification models and combined re-identification features with IoU for matching.
To diagnose the productivity, health, and welfare of pigs, many behavior recognition methods have been developed on top of pig detection and tracking. Tu et al. [21] achieved pig behavior recognition based on tracking algorithms, identifying the behavior of each pig from its tracking results. Hao et al. [11] proposed a deep mutual learning enhanced two-stream method consisting of two mutual learning networks for identifying pig behaviors; the two networks extract rich appearance and motion features, improving performance. To recognize the aggressive behavior of group-housed pigs, Gao et al. [12] presented a hybrid model combining a CNN and a Gated Recurrent Unit (GRU) to extract behavior features, with a spatiotemporal attention mechanism added to better classify the behaviors. Ji et al. [22] inserted a temporal shift module into four different CNN networks to automatically recognize pig behaviors; the resulting model is efficient, adding no extra parameters or complexity.
All these applications of computer vision techniques are beneficial to pig farming, improving the efficiency of management and reducing labor costs.
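One recurring building block in the tracking works above is associating detections with existing tracks by combining appearance (re-identification) features with spatial overlap (IoU), as in [20]. The following Python sketch computes such a combined association cost; the cost form and the weight `alpha` are illustrative assumptions, not the published formulation.

```python
import numpy as np

def iou(a, b):
    """Intersection-over-Union of two boxes in (x1, y1, x2, y2) form."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def association_cost(track_box, det_box, track_feat, det_feat, alpha=0.5):
    """Lower cost = better track-detection match.

    Appearance term: cosine distance between re-identification embeddings.
    Spatial term: 1 - IoU. `alpha` balances the two (assumed value).
    """
    cos_sim = np.dot(track_feat, det_feat) / (
        np.linalg.norm(track_feat) * np.linalg.norm(det_feat))
    return alpha * (1.0 - cos_sim) + (1.0 - alpha) * (1.0 - iou(track_box, det_box))
```

In a full tracker, these pairwise costs would typically feed a linear-assignment (Hungarian) step at each frame to resolve the track-to-detection matching.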

3. Pig Movement Estimation

The daily activity intensity of pigs is one of the crucial indicators for measuring their health. Two kinds of mainstream methods exist to automatically assess pig movement. One is to first identify pig behavior and then count the occurrences of each behavior [9,10,12]. These methods primarily concentrate on static behaviors and fail to provide quantified assessments of the current and cumulative movements of individual pigs, i.e., how far each pig moves. The other harnesses tracking algorithms within the domain of computer vision to monitor the positions of individual pigs and subsequently assess pig movement using the center points of bounding boxes [8,13,14]. As explained before, the size of bounding boxes easily changes during the tracking process, so it is unreliable for estimating the actual movement of each pig. In contrast, the proposed method integrates optical flow with a tracking algorithm: the optical flow captures the movement of all pigs, while the tracking results provide the bounding box of each pig, allowing per-pig movement to be computed. In this way, movement estimation and bounding boxes are decoupled, and the movement calculation does not rely on the size of bounding boxes. Furthermore, the optical flow reflects the motion of different body parts, providing more fine-grained information for assessing the health status of pigs.
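A minimal sketch of this decoupling follows, assuming Farneback dense optical flow (via OpenCV) as a stand-in for the flow estimator and integer-pixel boxes from the tracker: the flow field is computed once per frame pair, and each pig's bounding box is used only to select which flow vectors belong to that pig, so the motion estimate no longer depends on box size.

```python
import cv2
import numpy as np

def per_pig_motion(prev_gray, curr_gray, boxes):
    """Average optical-flow magnitude inside each tracked pig's box.

    prev_gray, curr_gray: consecutive grayscale frames (uint8 arrays).
    boxes: dict mapping pig_id -> (x1, y1, x2, y2) in integer pixels,
    taken from the tracker for the current frame. The boxes only select
    the region to aggregate; the motion itself comes from the flow field.
    """
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    magnitude = np.linalg.norm(flow, axis=2)  # per-pixel displacement, px/frame
    return {pid: float(magnitude[y1:y2, x1:x2].mean())
            for pid, (x1, y1, x2, y2) in boxes.items()}
```

Summing these per-frame values over time yields a cumulative movement measure for each pig, and inspecting the flow within sub-regions of a box gives the finer-grained, body-part-level motion information mentioned above.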

References

  1. Dong, J.; Lee, J.; Fuentes, A.; Xu, M.; Yoon, S.; Lee, M.H.; Park, D.S. Data-centric annotation analysis for plant disease detection: Strategy, consistency, and performance. Front. Plant Sci. 2022, 13, 1037655.
  2. Chen, C.; Zhu, W.; Norton, T. Behaviour recognition of pigs and cattle: Journey from computer vision to deep learning. Comput. Electron. Agric. 2021, 187, 106255.
  3. Fuentes, A.; Han, S.; Nasir, M.F.; Park, J.; Yoon, S.; Park, D.S. Multiview Monitoring of Individual Cattle Behavior Based on Action Recognition in Closed Barns Using Deep Learning. Animals 2023, 13, 2020.
  4. Han, S.; Fuentes, A.; Yoon, S.; Jeong, Y.; Kim, H.; Park, D.S. Deep learning-based multi-cattle tracking in crowded livestock farming using video. Comput. Electron. Agric. 2023, 212, 108044.
  5. Wang, S.; Jiang, H.; Qiao, Y.; Jiang, S.; Lin, H.; Sun, Q. The Research Progress of Vision-Based Artificial Intelligence in Smart Pig Farming. Sensors 2022, 22, 6541.
  6. Collins, L.; Smith, L. Smart agri-systems for the pig industry. Animal 2022, 16, 100518.
  7. Ho, K.Y.; Tsai, Y.J.; Kuo, Y.F. Automatic monitoring of lactation frequency of sows and movement quantification of newborn piglets in farrowing houses using convolutional neural networks. Comput. Electron. Agric. 2021, 189, 106376.
  8. Xu, J.; Ye, J.; Zhou, S.; Xu, A. Automatic quantification and assessment of grouped pig movement using the XGBoost and YOLOv5s models. Biosyst. Eng. 2023, 230, 145–158.
  9. Chen, C.; Zhu, W.; Steibel, J.; Siegford, J.; Wurtz, K.; Han, J.; Norton, T. Recognition of aggressive episodes of pigs based on convolutional neural network and long short-term memory. Comput. Electron. Agric. 2020, 169, 105166.
  10. Zhang, K.; Li, D.; Huang, J.; Chen, Y. Automated video behavior recognition of pigs using two-stream convolutional networks. Sensors 2020, 20, 1085.
  11. Hao, W.; Zhang, K.; Zhang, L.; Han, M.; Hao, W.; Li, F.; Yang, G. TSML: A New Pig Behavior Recognition Method Based on Two-Stream Mutual Learning Network. Sensors 2023, 23, 5092.
  12. Gao, Y.; Yan, K.; Dai, B.; Sun, H.; Yin, Y.; Liu, R.; Shen, W. Recognition of aggressive behavior of group-housed pigs based on CNN-GRU hybrid model with spatio-temporal attention mechanism. Comput. Electron. Agric. 2023, 205, 107606.
  13. Cowton, J.; Kyriazakis, I.; Bacardit, J. Automated individual pig localisation, tracking and behaviour metric extraction using deep learning. IEEE Access 2019, 7, 108049–108060.
  14. Chen, C.P.J.; Morota, G.; Lee, K.; Zhang, Z.; Cheng, H. VTag: A semi-supervised pipeline for tracking pig activity with a single top-view camera. J. Anim. Sci. 2022, 100, skac147.
  15. Bo, Z.; Atif, O.; Lee, J.; Park, D.; Chung, Y. GAN-based video denoising with attention mechanism for field-applicable pig detection system. Sensors 2022, 22, 3917.
  16. Lei, K.; Zong, C.; Yang, T.; Peng, S.; Zhu, P.; Wang, H.; Teng, G.; Du, X. Detection and analysis of sow targets based on image vision. Agriculture 2022, 12, 73.
  17. Ding, Q.A.; Chen, J.; Shen, M.X.; Liu, L.S. Activity detection of suckling piglets based on motion area analysis using frame differences in combination with convolution neural network. Comput. Electron. Agric. 2022, 194, 106741.
  18. Guo, Q.; Sun, Y.; Min, L.; van Putten, A.; Knol, E.F.; Visser, B.; Rodenburg, T.B.; Bolhuis, J.E.; Bijma, P.; de With, P.H.N. Video-based Detection and Tracking with Improved Re-Identification Association for Pigs and Laying Hens in Farms. In Proceedings of the VISIGRAPP (4: VISAPP), Online, 6–8 February 2022; pp. 69–78.
  19. Wutke, M.; Heinrich, F.; Das, P.P.; Lange, A.; Gentz, M.; Traulsen, I.; Warns, F.K.; Schmitt, A.O.; Gültas, M. Detecting animal contacts—A deep learning-based pig detection and tracking approach for the quantification of social contacts. Sensors 2021, 21, 7512.
  20. Wang, M.; Larsen, M.L.; Liu, D.; Winters, J.F.; Rault, J.L.; Norton, T. Towards re-identification for long-term tracking of group housed pigs. Biosyst. Eng. 2022, 222, 71–81.
  21. Tu, S.; Zeng, Q.; Liang, Y.; Liu, X.; Huang, L.; Weng, S.; Huang, Q. Automated Behavior Recognition and Tracking of Group-Housed Pigs with an Improved DeepSORT Method. Agriculture 2022, 12, 1907.
  22. Ji, H.; Teng, G.; Yu, J.; Wen, Y.; Deng, H.; Zhuang, Y. Efficient Aggressive Behavior Recognition of Pigs Based on Temporal Shift Module. Animals 2023, 13, 2078.