Livestock Management by Unmanned Aerial Vehicles

Unmanned Aerial Vehicles (UAVs) can revolutionize livestock herding and management, and scientific interest in using UAVs for this purpose is growing. UAVs can be used to monitor livestock grazing areas and to remotely sense the animals within them.

  • livestock management
  • FANET operations
  • grazing field
  • area surveillance

1. Introduction

In recent years, most governments have defined livestock grazing areas, fields, or paths, and the effectiveness of most herding operations relies on extracting information from aerial pictures [1]. In a farm context, UAVs are typically operated at human height or at low to medium altitude, giving them the same or wider angles of view than humans. Because this viewpoint is similar to that of a person at eye level, grazing field mapping, as established by the regulatory body, can be used for UAV operations. However, some computational and scientific issues are unique to UAV operations. One of these is the minimum-time coverage of ground regions by a group of UAVs, dubbed a Flying Ad hoc Network (FANET), outfitted with image sensors [2]. As a result, this research proposes a method for using UAVs to cover and sense ground areas, concentrating on practical issues that arise only during vehicle deployment. The number of UAVs used in a task, for example, is determined by the area's size and layout, the UAV's maximum flight time and, more crucially, the time required to prepare and launch each UAV. Figure 1 depicts the conceptual diagram. Operating multiple UAVs to achieve adequate coverage depends on the mode of communication among the UAVs, so communication among the sensor-equipped devices must remain highly efficient for UAV-based livestock management to run smoothly [3]. Communication among the UAVs is typically affected by their speed within the grazing field; at the same time, higher speeds allow a larger area to be covered in a given time.
Figure 1. Conceptual Framework.
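To make the deployment trade-off concrete, the back-of-envelope sketch below estimates how many UAVs a single sweep of a grazing area would need from the area size, camera swath width, cruise speed, endurance, and preparation/launch overhead mentioned above. It assumes a simple lawnmower coverage pattern, and every numerical value in the example is a hypothetical illustration rather than a figure from the cited studies.
```python
import math

def uavs_required(area_m2: float, swath_m: float, speed_ms: float,
                  flight_time_s: float, launch_overhead_s: float) -> int:
    """Rough estimate of the UAVs needed to sweep an area once per battery cycle."""
    # Total path length of a lawnmower sweep: area divided by the camera swath width.
    sweep_length_m = area_m2 / swath_m
    # Time a UAV can actually spend covering ground after preparation and launch.
    usable_time_s = flight_time_s - launch_overhead_s
    if usable_time_s <= 0:
        raise ValueError("flight time must exceed the launch overhead")
    # Ground distance one UAV covers in a single sortie.
    coverage_per_uav_m = speed_ms * usable_time_s
    return math.ceil(sweep_length_m / coverage_per_uav_m)

# Hypothetical example: a 2 km x 1 km pasture, 50 m swath, 10 m/s cruise,
# 25 min endurance and 5 min of preparation/launch per vehicle -> 4 UAVs.
print(uavs_required(2_000_000, 50, 10, 25 * 60, 5 * 60))
```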
Some studies have used aerial pictures (satellite imagery) and UAV-based communication systems to examine farm animal tracking and monitoring for applications such as grazing distribution monitoring, pasture usage, domestic livestock management, and livestock behavior. Grazing distribution monitoring has attracted increased attention, with the aim of managing individual animals through continuous, real-time monitoring of their perimeter coverage during grazing as well as their health, welfare, output, and environmental effects. The data collected during livestock monitoring contribute to the long-term viability of the agroecosystem by providing herders with timely and reliable information about how their animals behave on the farm [4]. Satellites are applicable to animal tracking and monitoring through space-retrieved imagery that can be used to analyze cattle productivity on different pastures; however, the use of drones is justified when the viewing angle matters. Uses of drones in livestock management include spotting trespassing hunters, detecting illegal farming activities, and overseeing herder operations.

2. Livestock Management on Grazing Field

Livestock management in extensive production systems can be challenging, particularly over large areas. The use of UAVs to gather images of the region of concern is rapidly becoming a feasible alternative; nonetheless, proper processes for extracting pertinent data from the images are still rare. Conventionally, recognizing livestock with a UAV has relied on a simple process in which a video of the pasture is recorded and the livestock are spotted and counted manually by a human viewer. This practice was found to be valuable for detecting and counting cattle in a quarantine environment [5]. However, this method is manual and always requires the presence of a human observer. The authors in [3][6][7] present the parameters and key limitations, current regulations, potential requirements, and challenges of operating UAVs in smart agriculture. To computerize the detection and counting method, the number of livestock was tallied in each video frame by separating the animals from the background and applying thresholding to each image frame of the video sequence [8][9][10]. The suitability of Unmanned Aircraft System (UAS) overflights for cattle surveillance was assessed in [11]: the information obtained from the UAS imagery was used to model cattle distribution, and the outcomes were compared with bio-logged cattle data. The prospect of using UAV video surveillance to predict the food intake of non-nursing beef cows was examined by determining cow feeding patterns from processed video files; the results suggest that UAV surveillance could be vital in monitoring cow feeding behavior [12]. The work of [14] suggested a general smart video surveillance system and studied several problems in cow behavior analysis using an intelligent image-based monitoring framework with a hidden Markov process, while the authors of [15] suggested a new robotic animal herding system centered on a network of autonomous barking drones. Such a system aims to replace traditional herding approaches so that a large number of farm animals can be quickly gathered from a dispersed state and then driven to a selected place. Vayssade et al. propose a method for processing images taken by a commercial drone to automate the tracking of animal activities using a combination of thresholding and supervised classification methods [10]. Jung et al. use a Proportional Integral Derivative (PID) controller on four quadrotor UAVs to guide four animals into their pen in minimum time, generating predator noises modeled with an exponential function, as a solution to the cattle roundup problem [16].
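A minimal sketch of the background-separation and thresholding step described above is given below; it is not the exact pipeline of [8][9][10]. It counts bright blobs (for example, light-coated animals against darker pasture) in one grayscale video frame, and the intensity threshold and minimum blob area are illustrative assumptions.
```python
import numpy as np
from scipy import ndimage

def count_animals(frame_gray: np.ndarray, intensity_thresh: int = 200,
                  min_area_px: int = 150) -> int:
    """Count bright blobs in one grayscale frame via fixed thresholding."""
    # Separate candidate animals from the background with a fixed intensity threshold.
    binary = frame_gray > intensity_thresh
    # Group foreground pixels into connected blobs.
    labels, n_blobs = ndimage.label(binary)
    # Discard blobs too small to be an animal (noise, glare, stones).
    sizes = ndimage.sum(binary, labels, index=np.arange(1, n_blobs + 1))
    return int(np.count_nonzero(sizes >= min_area_px))

# Usage idea: apply to every frame of the UAV video and take a median count
# over a sliding window to suppress per-frame detection noise.
```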
Geographical proximity, which is used to examine grazing behavior and social structure, is a critical behavioral indicator in cattle. Mufford et al. therefore developed an efficient way to compute the spatial proximity of beef cattle using UAV-based image acquisition and photogrammetric analysis. Still frames extracted from the UAV videos were used to produce orthomosaics, revealing that related groups of cattle stayed closer together than unrelated ones [13]. Sun et al. offered a practical UAV-based method, verified at a typical household pasture, for examining the hourly spatial distribution of each yak [17]. Favier et al. explored the use of UAVs to detect and round up sheep by developing a prototype controlled from a laptop base station running LabVIEW [18]. Brown et al. investigated the relationship between object detector performance and spatial resolution degradation for livestock; ground-truth data were established from high-resolution drone images that were then downsampled to various ground sample distances (GSDs) [19].
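The proximity analysis itself reduces to pairwise ground distances between geolocated animals. The short sketch below, which assumes the animal positions have already been extracted from an orthomosaic and projected to a local metric grid such as UTM, illustrates that computation; it is not Mufford et al.'s implementation, and the coordinates are invented for the example.
```python
import numpy as np

def pairwise_distances(positions_m: np.ndarray) -> np.ndarray:
    """Pairwise ground distances (metres) between animal positions on a metric grid."""
    diff = positions_m[:, None, :] - positions_m[None, :, :]
    return np.linalg.norm(diff, axis=-1)

# Hypothetical example: three animals on a UTM grid (easting, northing in metres).
herd = np.array([[0.0, 0.0], [4.0, 3.0], [10.0, 0.0]])
dists = pairwise_distances(herd)
# Mean nearest-neighbour distance (column 0 of each sorted row is the zero self-distance).
mean_nearest = np.mean(np.sort(dists, axis=1)[:, 1])
print(dists, mean_nearest)
```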
Beyond these simpler livestock detection methods, advances in Artificial Intelligence (AI) and Machine Learning (ML) have allowed researchers to detect livestock using pre-trained Convolutional Neural Network (CNN)-based architectures. For instance, an adjusted version of R-CNN was employed to identify and count livestock on a grazing field [19]. In this method, a selective search algorithm generates region proposals, a CNN extracts features from each region, and the features are then classified using a Support Vector Machine (SVM), with confidence values assigned to the resulting bounding boxes. Yaxley et al. studied the habits and natural responses of twelve sheep to a drone in order to fit mathematical models of shepherding to this aerial setting; the model is intended to make it realistic for AI to increase farmers' independence in shepherding from above the ground [20]. Barbedo et al. proposed a cattle counting scheme incorporating a deep learning model for rough animal localization and color-space manipulation to increase the contrast between animals and the background [21]. Barbedo et al. also investigated the prospect of using an oblique camera angle to increase the area captured in each image for cattle monitoring, building a model for animal detection using deep convolutional neural networks; their findings show that oblique images can be used successfully in some circumstances, but certain practical restrictions must be resolved for the method to be attractive [22]. Andrew et al. recommend a deep neural network method for livestock recognition employing UAVs with onboard deep learning inference [23]. Soares et al. suggested a technique for identifying and counting cattle in aerial images acquired by UAVs, based on CNNs and a graph-based optimization technique to remove duplicate detections [24]. Shao et al. propose a cattle detection and counting system based on CNNs using aerial images taken by a UAV [25].
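Counting animals from overlapping UAV images also requires removing duplicate detections of the same animal. The sketch below is a simplified greedy stand-in for the graph-based optimization used in [24]: it keeps the most confident detection and drops any other detection whose geolocated position falls within a merge radius. The coordinates, confidence scores, and the 1.5 m radius are illustrative assumptions.
```python
import numpy as np

def deduplicate_detections(det_xy_m: np.ndarray, confidences: np.ndarray,
                           merge_radius_m: float = 1.5) -> np.ndarray:
    """Greedily keep the highest-confidence detection within each merge radius."""
    order = np.argsort(-confidences)  # most confident detections first
    kept: list[int] = []
    for i in order:
        # Keep detection i only if no already-kept detection lies within the radius.
        if all(np.linalg.norm(det_xy_m[i] - det_xy_m[j]) > merge_radius_m for j in kept):
            kept.append(i)
    return np.array(kept)

# Hypothetical example: four geolocated detections, two of which are the same cow
# seen in two overlapping images; three detections survive deduplication.
xy = np.array([[0.0, 0.0], [0.6, 0.2], [8.0, 3.0], [20.0, 5.0]])
conf = np.array([0.92, 0.85, 0.77, 0.81])
print(deduplicate_detections(xy, conf))
```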
Given that future routing protocols depend on the nature of the communication link, the works in [26][27] present the design of an interface protocol for an indoor Flying Ad hoc Network-specific routing protocol using Light Fidelity (Li-Fi) as the communication link. The focus was on achieving high throughput; however, practical routing problems encountered during UAV operations were not considered. To address such problems, ref. [28] proposed an optimized solution to the problem of minimum-time coverage of ground areas using multiple UAVs with image sensors. This is achieved by determining the geographic coordinates a single UAV would cover in minimum time and then formulating a mixed-integer linear program to route the UAVs over the geographic area; the UAVs required to cover a particular area can then be selected. However, this routing strategy does not consider possible collisions among the UAVs. Hence, to avoid link breakage during information transfer, the communication path between multiple UAVs is optimized to improve communication links in flying ad hoc networks using the smell agent optimization and Particle Swarm Optimization (PSO) algorithms [29].
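To give a feel for how a swarm-based optimizer can be applied to such a link-optimization problem, the sketch below runs a standard PSO on a toy objective: placing a single relay UAV so that the total ground-station-to-UAV hop distance is minimized. The objective function and all parameter values are assumptions made for illustration and do not reproduce the formulation in [29].
```python
import numpy as np

rng = np.random.default_rng(0)

def link_cost(relay_xy: np.ndarray, ground_xy: np.ndarray, uav_xy: np.ndarray) -> float:
    """Toy objective: total length of the hops ground station -> relay -> each UAV."""
    return (np.linalg.norm(relay_xy - ground_xy)
            + np.sum(np.linalg.norm(uav_xy - relay_xy, axis=1)))

def pso_relay_position(ground_xy, uav_xy, n_particles=30, iters=200,
                       w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 1000.0)):
    """Minimal particle swarm optimization placing one relay UAV in a 2-D field."""
    dim = 2
    x = rng.uniform(*bounds, size=(n_particles, dim))   # particle positions
    v = np.zeros_like(x)                                 # particle velocities
    pbest = x.copy()                                     # personal best positions
    pbest_cost = np.array([link_cost(p, ground_xy, uav_xy) for p in x])
    gbest = pbest[np.argmin(pbest_cost)].copy()          # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Standard PSO velocity update: inertia + cognitive pull + social pull.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, *bounds)
        cost = np.array([link_cost(p, ground_xy, uav_xy) for p in x])
        improved = cost < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], cost[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    return gbest, float(pbest_cost.min())

# Hypothetical scenario: a ground station at the origin and three UAVs over the field.
ground = np.array([0.0, 0.0])
uavs = np.array([[800.0, 600.0], [750.0, 900.0], [900.0, 700.0]])
print(pso_relay_position(ground, uavs))
```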

References

  1. Alanezi, M.A.; Shahriar, M.S.; Hasan, M.B.; Ahmed, S.; Sha’aban, Y.A.; Bouchekara, H.R. Livestock Management with Unmanned Aerial Vehicles: A Review. IEEE Access 2022, 10, 45001–45028.
  2. Nintanavongsa, P.; Pitimon, I. Impact of sensor mobility on UAV-based smart farm communications. In Proceedings of the IEEE International Electrical Engineering Congress (iEECON), Pattaya, Thailand, 8–10 March 2017.
  3. Maddikunta, P.K.R.; Hakak, S.; Alazab, M.; Bhattacharya, S.; Gadekallu, T.R.; Khan, W.Z.; Pham, Q.V. Unmanned aerial vehicles in smart agriculture: Applications, requirements, and challenges. IEEE Sens. J. 2021, 21, 17608–17619.
  4. Li, X.; Xing, L. Use of unmanned aerial vehicles for livestock monitoring based on streaming K-means clustering. IFAC-PapersOnLine 2019, 52, 324–329.
  5. Goolsby, J.; Jung, J.; Landivar, J.; McCutcheon, W.; Lacewell, R.; Duhaime, R.; Schwartz, A. Evaluation of unmanned aerial vehicles (UAVs) for detection of cattle in the cattle fever tick permanent quarantine zone. Subtrop. Agric. Environ. 2016, 67, 24–27.
  6. Hogan, S.D.; Kelly, M.; Stark, B.; Chen, Y. Unmanned aerial systems for agriculture and natural resources. Calif. Agric. 2017, 71, 5–14.
  7. Herlin, A.; Brunberg, E.; Hultgren, J.; Högberg, N.; Rydberg, A.; Skarin, A. Animal welfare implications of digital tools for monitoring and management of cattle and sheep on pasture. Animals 2021, 11, 829.
  8. Al-Thani, N.; Albuainain, A.; Alnaimi, F.; Zorba, N. Drones for sheep livestock monitoring. In Proceedings of the 2020 IEEE 20th Mediterranean Electrotechnical Conference (MELECON), Palermo, Italy, 6 June 2020; pp. 672–676.
  9. Sarwar, F.; Griffin, A.; Periasamy, P.; Portas, K.; Law, J. Detecting and counting sheep with a convolutional neural network. In Proceedings of the 2018 15th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), Auckland, New Zealand, 27–30 November 2018; pp. 1–6.
  10. Vayssade, J.A.; Arquet, R.; Bonneau, M. Automatic activity tracking of goats using drone camera. Comput. Electron. Agric. 2019, 162, 767–772.
  11. Mulero-Pazmany, M.; Barasona, J.A.; Acevedo, P.; Vicente, J.; Negro, J.J. Unmanned Aircraft Systems complement biologging in spatial ecology studies. Ecol. Evol. 2015, 5, 4808–4818.
  12. Nyamuryekung’e, S.; Cibils, A.F.; Estell, R.E.; Gonzalez, A.L. Use of an unmanned aerial vehicle-mounted video camera to assess feeding behavior of Raramuri Criollo cows. Rangel. Ecol. Manag. 2016, 69, 386–389.
  13. Mufford, J.T.; Hill, D.J.; Flood, N.J.; Church, J.S. Use of unmanned aerial vehicles (UAVs) and photogrammetric image analysis to quantify spatial proximity in beef cattle. J. Unmanned Veh. Syst. 2019, 7, 194–206.
  14. Zin, T.T.; Kobayashi, I.; Tin, P.; Hama, H. A general video surveillance framework for animal behavior analysis. In Proceedings of the Third International Conference on Computing Measurement Control and Sensor Network (CMCSN), Matsue, Japan, 20–22 May 2016; pp. 130–133.
  15. Li, X.; Huang, H.; Savkin, A.V.; Zhang, J. Robotic Herding of Farm Animals Using a Network of Barking Aerial Drones. Drones 2022, 6, 29.
  16. Jung, S.; Ariyur, K.B. Strategic cattle roundup using multiple quadrotor UAVs. Int. J. Aeronaut. Space Sci. 2017, 18, 315–326.
  17. Sun, Y.; Yi, S.; Hou, F.; Luo, D.; Hu, J.; Zhou, Z. Quantifying the dynamics of livestock distribution by unmanned aerial vehicles (UAVs): A case study of yak grazing at the household scale. Rangel. Ecol. Manag. 2020, 73, 642–648.
  18. Favier, M.A.; Green, R.; Linz, A. The potential for UAV technology to assist in sheep management in the Scottish Highlands. Bornimer Agrartech. Ber. 2013, 81, 209–222.
  19. Brown, J.; Qiao, Y.; Clark, C.; Lomax, S.; Rafique, K.; Sukkarieh, S. Automated aerial animal detection when spatial resolution conditions are varied. Comput. Electron. Agric. 2022, 193, 106689.
  20. Yaxley, K.J.; Joiner, K.F.; Abbass, H. Drone approach parameters leading to lower stress sheep flocking and movement: Sky shepherding. Sci. Rep. 2021, 11, 7803.
  21. Barbedo, J.G.A.; Koenigkan, L.V.; Santos, P.M.; Ribeiro, A.R.B. Counting cattle in UAV images—dealing with clustered animals and animal/background contrast changes. Sensors 2020, 20, 2126.
  22. Barbedo, J.G.A.; Koenigkan, L.V.; Santos, P.M. Cattle detection using oblique UAV images. Drones 2020, 4, 75.
  23. Andrew, W.; Greatwood, C.; Burghardt, T. Aerial animal biometrics: Individual Friesian cattle recovery and visual identification via an autonomous UAV with onboard deep inference. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 237–243.
  24. Soares, V.H.A.; Ponti, M.A.; Gonçalves, R.A.; Campello, R.J. Cattle counting in the wild with geolocated aerial images in large pasture areas. Comput. Electron. Agric. 2021, 189, 106354.
  25. Shao, W.; Kawakami, R.; Yoshihashi, R.; You, S.; Kawase, H.; Naemura, T. Cattle detection and counting in UAV images based on convolutional neural networks. Int. J. Remote Sens. 2020, 41, 31–52.
  26. Sadiq, B.O.; Adedokun, A.E.; Mu’azu, M.B.; Sha’aban, Y.A. A Specific Routing Protocol for Flying Adhoc Network. Telkomnika 2018, 16, 606–617.
  27. Sadiq, B.O.; Salawudeen, A.T.; Sha’aban, Y.A.; Adedokun, E.A.; Mu’Azu, M.B. Interface protocol design: A communication guide for indoor FANET. Telecommun. Comput. Electron. Control. 2019, 17, 3175–3182.
  28. Avellar, G.S.; Pereira, G.A.; Pimenta, L.C.; Iscold, P. Multi-UAV routing for area coverage and remote sensing with minimum time. Sensors 2015, 15, 27783–27803.
  29. Sadiq, B.O.; Salawudeen, A.T. FANET optimization: A destination path flow model. Int. J. Electr. Comput. Eng. 2020, 10, 4381–4389.