Injury Identification during Turkey Husbandry

Injurious pecking against conspecifics is a serious problem in turkey husbandry. Bloody injuries act as a trigger that induces further pecking, so timely detection and intervention can prevent severe animal welfare impairments and costly losses. The overarching aim is therefore to develop a camera-based system that monitors the flock and detects injuries using neural networks. In a preliminary study, images of turkeys were annotated by labelling potential injuries, and these annotations were used to train a network for injury detection.

Keywords: turkeys; keypoint detection; pecking injuries

1. Introduction

Research on farm animal welfare and behavior now utilizes computer vision and deep learning technologies. In the best case, such approaches can support, simplify, and, above all, accelerate continuous animal observation. Furthermore, real-time monitoring of large animal flocks, such as those in conventional poultry farming, based on computer vision and machine-learning algorithms can help prevent large-scale outbreaks of diseases or behavioral disorders [1]. For example, previous studies in poultry farming evaluated behavior [2], lameness [3], feeding [4][5], lighting preferences [6], or movement [7][8] using new precision livestock farming (PLF) technologies.
The analysis of animal behavior and health should be conducted with minimal human interference and involvement so as not to unnecessarily affect the animals or disturb their natural behavior. Computer vision is a proven, non-invasive technology for video and image data collection [9]. Computer vision tasks can build on pose estimation, which provides important behavioral information. Pose estimation can be described as follows: individual objects are abstracted into keypoints, i.e., spatial locations of interest such as body parts or joints; these keypoints are connected into skeletons, from which poses are finally estimated. To enhance recognition precision, additional markers can be placed on the studied animal, although this method may distract the animal and can be very expensive depending on the number of individuals [10]. Alternatively, modern approaches for animal pose estimation rely on non-invasive, vision-based solutions such as keypoint detection (KPD). Here, keypoints are marked manually on sample images or video frames to form a skeleton model that is used to record an individual animal as well as to estimate its pose [11][12].
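To make this abstraction concrete, the following minimal Python sketch encodes a set of keypoints, connects them into a skeleton, and derives drawable skeleton segments from one detected pose. The keypoint names, skeleton edges, and coordinate values are illustrative assumptions and are not taken from the cited studies.

```python
# Minimal sketch of the keypoint-and-skeleton abstraction described above.
# Keypoint names, edges, and coordinates are illustrative, not from the cited studies.
from dataclasses import dataclass

# Keypoints: spatial locations of interest such as body parts or joints.
TURKEY_KEYPOINTS = ["head", "neck", "back", "tail", "left_wing", "right_wing"]

# Skeleton: pairs of keypoints connected into a structural model.
SKELETON_EDGES = [("head", "neck"), ("neck", "back"), ("back", "tail"),
                  ("back", "left_wing"), ("back", "right_wing")]

@dataclass
class Pose:
    """Estimated pose of one animal: pixel coordinates and confidence per detected keypoint."""
    keypoints: dict  # keypoint name -> (x, y, confidence)

    def skeleton_segments(self):
        """Return line segments between keypoints that were actually detected."""
        return [(self.keypoints[a][:2], self.keypoints[b][:2])
                for a, b in SKELETON_EDGES
                if a in self.keypoints and b in self.keypoints]

# Example: one (made-up) detection for a single turkey in an image.
pose = Pose(keypoints={"head": (412, 108, 0.96), "neck": (398, 150, 0.91),
                       "back": (355, 220, 0.88), "tail": (300, 310, 0.84)})
print(pose.skeleton_segments())  # three segments; the wing edges are skipped (not detected)
```

In a real KPD pipeline, the coordinates and confidences would of course come from a trained network rather than being entered by hand.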
In turkey husbandry, injurious pecking against conspecifics is a widespread and serious animal welfare problem [13]. Early detection of injurious pecking in a turkey flock can prevent serious wounding: bloody injuries trigger further pecking behavior [14], and early intervention can therefore prevent an outbreak of this behavioral disorder [15]. Computer vision systems are one option to support turkey farmers in monitoring the flock for animal welfare-related issues such as the occurrence of injurious pecking. In a preliminary study, the foundations were laid for the development of an image-based automated system using a neural network to detect pecking injuries in a turkey flock [16]. A neural network was trained on manual annotations of (color) alterations to skin and plumage on images of turkey hens, and various additional work steps were performed to improve the detection assessment. However, the essential issue in the preliminary study was uncertainty regarding the correct evaluation, primarily for plumage injuries, where detection was difficult due to shadows, the turkeys' posture, and/or overlapping of individual animals. The system developed to date also showed an increased rate of false positives caused by erroneously detected 'injuries' on the litter or on stable equipment. To tackle these problems and reduce false-positive detections, the present study aimed to provide the network with more information: first, the network is trained to identify the animal and its body regions (e.g., head, neck, back, tail, and wings); in a second step, potential injuries are detected on the previously identified animal body.
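A minimal sketch of this two-stage idea is given below, assuming one stage that returns per-region body masks and one stage that proposes injury candidates; candidates that do not fall on an identified body region are discarded, which is what suppresses false positives on litter or equipment. All function names and return formats are placeholders for illustration, not the authors' implementation.

```python
# Hedged sketch of the two-stage detection idea: (1) identify the animal and its body
# regions, (2) keep only injury candidates that lie on an identified body region.
# All functions are placeholders; the real stages would be trained neural networks.
import numpy as np

def detect_body_regions(image: np.ndarray) -> dict:
    """Stage 1 (placeholder): one boolean mask per body region (head, neck, back, ...)."""
    h, w = image.shape[:2]
    return {"head": np.zeros((h, w), bool), "back": np.zeros((h, w), bool)}

def detect_injury_candidates(image: np.ndarray) -> list:
    """Placeholder injury detector: (x, y, score) candidates over the whole image."""
    return [(120, 80, 0.9), (400, 300, 0.7)]

def filter_by_body(candidates, region_masks):
    """Stage 2: keep only candidates that fall on an identified body region."""
    kept = []
    for x, y, score in candidates:
        region = next((name for name, mask in region_masks.items() if mask[y, x]), None)
        if region is not None:  # candidate lies on the animal, not on litter or equipment
            kept.append((x, y, score, region))
    return kept

image = np.zeros((480, 640, 3), np.uint8)  # dummy frame standing in for a barn camera image
masks = detect_body_regions(image)
print(filter_by_body(detect_injury_candidates(image), masks))  # [] here: the dummy masks are empty
```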

2. Injury Identification during Turkey Husbandry Using Neural Networks

Animal behavior has been tracked and monitored with different tools, including radio-frequency identification (RFID) transponders [17], accelerometers [18], and cameras coupled with image analysis [19]. Videos or images have been analyzed and used in studies on broilers covering body weight [20][21], health status [22][23], behavior [24], flock movement [25], and locomotion/activity [3][26]. The technology has also been used in the poultry sector for research into butchering [27], carcass and meat monitoring [28][29], and egg quality analysis [30][31].
A recent review on tracking systems for the assessment of farmed poultry stated that computer vision systems can be used for a variety of applications, such as checking images for the presence of poultry, classifying the identified animal as sick or healthy, determining feeding and watering structures, or locating the exact position of poultry in an image [32]. So-called keypoints can offer more detailed information about the body and body parts of a recorded animal: KPD algorithms locate these areas in isolation, whereas pose estimation detects the keypoints and connects their structural information [33]. To date, such pose estimation models have mainly been used for humans [34][35], where detecting the different body keypoints and their locations enables applications such as activity recognition and video surveillance. They have also been tested on laboratory animals recorded in controlled environments, e.g., mice [36], locusts [37], fruit flies [38], and even worms (C. elegans) [39]. However, there are relatively few architectures for recognizing the poses of farm animals such as cows [33][40][41], pigs [42], and broiler chickens [10].
Fang et al. [10] combined a pose estimation network with a classification network to analyze broiler chickens' behavior. They used the front view of a broiler's head and the side view of the body to construct a pose skeleton from the feature points of a chicken and then tracked specific body parts during various behaviors such as eating, resting, or running. They concluded that their research provided an appropriate non-invasive method to analyze chicken behavior. More recently, Doornweerd et al. [43] reported on the performance of an animal pose estimation network that was trained and tested on multi-species data from both broilers and turkeys. They collected data for pose and gait assessments and evaluated a multi-species model to reduce the required dataset size and, ultimately, the annotation effort. Doornweerd et al. [43] recorded the turkeys walking along a corridor, using a view from behind the animals and paying particular attention to the locomotion system. They defined eight keypoints (head, neck, right and left knee, right and left hock, and right and left foot).
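For illustration, the eight-keypoint layout reported by Doornweerd et al. [43] could be encoded as follows; the grouping of the keypoints into per-leg chains for drawing a skeleton is an assumption made here for visualisation and is not a detail taken from the paper.

```python
# The eight keypoints reported by Doornweerd et al. [43] for pose and gait assessment.
GAIT_KEYPOINTS = ["head", "neck",
                  "left_knee", "right_knee",
                  "left_hock", "right_hock",
                  "left_foot", "right_foot"]

# One possible skeleton for visualisation: neck-head plus one knee-hock-foot chain per leg.
# This connectivity is an assumption for drawing, not taken from the paper.
GAIT_SKELETON = [("neck", "head"),
                 ("neck", "left_knee"), ("left_knee", "left_hock"), ("left_hock", "left_foot"),
                 ("neck", "right_knee"), ("right_knee", "right_hock"), ("right_hock", "right_foot")]
```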

References

  1. Zhuang, X.; Bi, M.; Guo, J.; Wu, S.; Zhang, T. Development of an early warning algorithm to detect sick broilers. Comput. Electron. Agric. 2018, 144, 102–113.
  2. Youssef, A.; Exadaktylos, V.; Berckmans, D.A. Towards real-time control of chicken activity in a ventilated chamber. Biosyst. Eng. 2015, 135, 31–43.
  3. Aydin, A. Development of an early detection system for lameness of broilers using computer vision. Comput. Electron. Agric. 2017, 136, 140–146.
  4. Aydin, A.; Berckmans, D. Using sound technology to automatically detect the short-term feeding behaviours of broiler chickens. Comput. Electron. Agric. 2016, 121, 25–31.
  5. Li, G.; Zhao, Y.; Purswell, J.L.; Du, Q.; Chesser, G.D.; Lowe, J.W. Analysis of feeding and drinking behaviors of group-reared broilers via image processing. Comput. Electron. Agric. 2020, 175, 105596.
  6. Li, G.; Li, B.; Shi, Z.; Zhao, Y.; Ma, H. Design and evaluation of a lighting preference test system for laying hens. Comput. Electron. Agric. 2018, 147, 118–125.
  7. Stadig, L.M.; Rodenburg, T.B.; Ampe, B.; Reubens, B.; Tuyttens, F.A.M. An automated positioning system for monitoring chickens’ location: Effects of wearing a backpack on behaviour, leg health and production. Appl. Anim. Behav. Sci. 2018, 198, 83–88.
  8. Li, G.; Hui, X.; Chen, Z.; Chesser, G.; Zhao, Y. Development and evaluation of a method to detect broilers continuously walking around feeder as an indication of restricted feeding behaviors. Comput. Electron. Agric. 2021, 181, 105982.
  9. Leroy, T.; Vranken, E.; Van Brecht, A.; Struelens, E.; Sonck, B.; Berckmans, D. A computer vision method for on-line behavioral quantification of individually caged poultry. Trans. ASABE 2006, 49, 795–802.
  10. Fang, C.; Zhang, T.; Zheng, H.; Huang, J.; Cuan, K. Pose estimation and behavior classification of broiler chickens based on deep neural networks. Comput. Electron. Agric. 2021, 180, 105863.
  11. Psota, E.T.; Schmidt, T.; Mote, B.; Pérez, L.C. Long-term tracking of group-housed livestock using keypoint detection and map estimation for individual animal identification. Sensors 2020, 20, 3670.
  12. Brünger, J.; Gentz, M.; Traulsen, I.; Koch, R. Panoptic segmentation of individual pigs for posture recognition. Sensors 2020, 20, 3710.
  13. Dalton, H.A.; Wood, B.J.; Torrey, S. Injurious pecking in domestic turkeys: Development, causes, and potential solutions. World’s Poult. Sci. J. 2013, 69, 865–876.
  14. Huber-Eicher, B.; Wechsler, B. Feather pecking in domestic chicks: Its relation to dustbathing and foraging. Anim. Behav. 1997, 54, 757–768.
  15. Krautwald-Junghanns, M.-E.; Ellerich, R.; Mitterer-Istyagin, H.; Ludewig, M.; Fehlhaber, K.; Schuster, E.; Berk, J.; Dressel, A.; Petermann, S.; Kruse, W.; et al. Examination of the prevalence of skin injuries in debeaked fattened turkeys. Berl. Munch. Tierarztl. Wochenschr. 2011, 124, 8–16.
  16. Volkmann, N.; Brünger, J.; Stracke, J.; Zelenka, C.; Koch, R.; Kemper, N.; Spindler, B. Learn to train: Improving training data for a neural network to detect pecking injuries in turkeys. Animals 2021, 11, 2655.
  17. Sibanda, T.Z.; Welch, M.; Schneider, D.; Kolakshyapati, M.; Ruhnke, I. Characterising free-range layer flocks using unsupervised cluster analysis. Animals 2020, 10, 855.
  18. Yang, X.; Zhao, Y.; Street, G.M.; Huang, Y.; Filip To, S.D.; Purswell, J.L. Classification of broiler behaviours using triaxial accelerometer and machine learning. Animal 2021, 15, 100269.
  19. Gebhardt-Henrich, S.G.; Stratmann, A.; Dawkins, M.S. Groups and individuals: Optical flow patterns of broiler chicken flocks are correlated with the behavior of individual birds. Animals 2021, 11, 568.
  20. Mollah, M.B.R.; Hasan, M.A.; Salam, M.A.; Ali, M.A. Digital image analysis to estimate the live weight of broiler. Comput. Electron. Agric. 2010, 72, 48–52.
  21. Mortensen, A.K.; Lisouski, P.; Ahrendt, P. Weight prediction of broiler chickens using 3D computer vision. Comput. Electron. Agric. 2016, 123, 319–326.
  22. Okinda, C.; Lu, M.; Liu, L.; Nyalala, I.; Muneri, C.; Wang, J.; Zhang, H.; Shen, M. A machine vision system for early detection and prediction of sick birds: A broiler chicken model. Biosyst. Eng. 2019, 188, 229–242.
  23. Zhuang, X.; Zhang, T. Detection of sick broilers by digital image processing and deep learning. Biosyst. Eng. 2019, 179, 106–116.
  24. Pereira, D.F.; Miyamoto, B.C.B.; Maia, G.D.N.; Tatiana Sales, G.; Magalhães, M.M.; Gates, R.S. Machine vision to identify broiler breeder behavior. Comput. Electron. Agric. 2013, 99, 194–199.
  25. Neves, D.P.; Mehdizadeh, S.A.; Tscharke, M.; Nääs, I.d.A.; Banhazi, T.M. Detection of flock movement and behaviour of broiler chickens at different feeders using image analysis. Inf. Process. Agric. 2015, 2, 177–182.
  26. Van Hertem, T.; Norton, T.; Berckmans, D.; Vranken, E. Predicting broiler gait scores from activity monitoring and flock data. Biosyst. Eng. 2018, 173, 93–102.
  27. Ye, C.W.; Yousaf, K.; Qi, C.; Liu, C.; Chen, K.J. Broiler stunned state detection based on an improved fast region-based convolutional neural network algorithm. Poult. Sci. 2020, 99, 637–646.
  28. Chmiel, M.; Słowiński, M.; Dasiewicz, K. Application of computer vision systems for estimation of fat content in poultry meat. Food Control 2011, 22, 1424–1427.
  29. Geronimo, B.C.; Mastelini, S.M.; Carvalho, R.H.; Barbon Júnior, S.; Barbin, D.F.; Shimokomaki, M.; Ida, E.I. Computer vision system and near-infrared spectroscopy for identification and classification of chicken with wooden breast, and physicochemical and technological characterization. Infrared Phys. Technol. 2019, 96, 303–310.
  30. Alon, A.S. An image processing approach of multiple eggs’ quality inspection. Int. J. Adv. Trends Comput. Sci. Eng. 2019, 8, 2794–2799.
  31. Narin, B.; Buntan, S.; Chumuang, N.; Ketcham, M. Crack on Eggshell Detection System Based on Image Processing Technique. In Proceedings of the 18th International Symposium on Communications and Information Technologies, Bangkok, Thailand, 26–29 September 2018; pp. 1–6.
  32. Neethirajan, S. Automated tracking systems for the assessment of farmed poultry. Animals 2022, 12, 232.
  33. Liu, H.; Reibman, A.R.; Boerman, J.P. Video analytic system for detecting cow structure. Comput. Electron. Agric. 2020, 178, 105761.
  34. Zhang, J.; Chen, Z.; Tao, D. Towards high performance human keypoint detection. Int. J. Comput. Vis. 2021, 129, 2639–2662.
  35. Hong, F.; Lu, C.; Liu, C.; Liu, R.; Jiang, W.; Ju, W.; Wang, T. PGNet: Pipeline guidance for human key-point detection. Entropy 2020, 22, 369.
  36. Pereira, T.D.; Aldarondo, D.E.; Willmore, L.; Kislin, M.; Wang, S.S.H.; Murthy, M.; Shaevitz, J.W. Fast animal pose estimation using deep neural networks. Nat. Methods 2019, 16, 117–125.
  37. Graving, J.M.; Chae, D.; Naik, H.; Li, L.; Koger, B.; Costelloe, B.R.; Couzin, I.D. DeepPoseKit, a software toolkit for fast and robust animal pose estimation using deep learning. eLife 2019, 8, e47994.
  38. Günel, S.; Rhodin, H.; Morales, D.; Campagnolo, J.; Ramdya, P.; Fua, P. DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila. eLife 2019, 8, e48571.
  39. Hebert, L.; Ahamed, T.; Costa, A.C.; O’Shaughnessy, L.; Stephens, G.J. WormPose: Image synthesis and convolutional networks for pose estimation in C. elegans. PLoS Comput. Biol. 2021, 17, e1008914.
  40. Li, X.; Cai, C.; Zhang, R.; Ju, L.; He, J. Deep cascaded convolutional models for cattle pose estimation. Comput. Electron. Agric. 2019, 164, 104885.
  41. Russello, H.; van der Tol, R.; Kootstra, G. T-LEAP: Occlusion-robust pose estimation of walking cows using temporal information. Comput. Electron. Agric. 2022, 192, 106559.
  42. Quddus Khan, A.; Khan, S.; Ullah, M.; Cheikh, F.A. A Bottom-up approach for pig skeleton extraction using RGB data. In Proceedings of the International Conference on Image and Signal Processing, Marrakesh, Morocco, 4–6 June 2020; Lecture Notes in Computer Science. Springer: Cham, Switzerland, 2020; pp. 54–61.
  43. Doornweerd, J.E.; Kootstra, G.; Veerkamp, R.F.; Ellen, E.D.; van der Eijk, J.A.J.; van de Straat, T.; Bouwman, A.C. Across-species pose estimation in poultry based on images using deep learning. Front. Anim. Sci. 2021, 2, 791290.