Real-Time Automatic Drone Surveillance and Wildlife Monitoring

Wildlife monitoring can be time-consuming and expensive, but the fast-developing technologies of uncrewed aerial vehicles, sensors, and machine learning pave the way for automated monitoring.

Keywords: wildlife monitoring; uncrewed aerial systems; UAV

1. Introduction

The use of aerial drones for wildlife monitoring has increased exponentially in the past decade [1][2][3][4][5][6][7]. These drones, also known as uncrewed aerial vehicles (UAVs), unmanned aerial systems (UASs), and remotely piloted aircraft systems (RPASs), can carry a variety of sensors, including high-resolution visible-light (RGB) cameras and thermal infrared (TI) cameras. As the technology advances and the price of drones and sensors drops, they become more accessible to conservation biologists, wildlife managers, and other professionals working in wildlife monitoring [2][3][4][5]. Drones have already been shown to save time, to produce better imagery and spatial data, especially for cryptic and nocturnal animals [8][9], and to reduce risks and hazards for the observer [10][11]. However, these methods are still in their early stages and need further development to become truly superior to, and more cost-effective than, traditional monitoring methods. Automatic detection is pivotal for this development, and computer vision is likely to be the solution [1].

2. Automatic Detection and Computer Vision

Over the past decade, artificial intelligence has driven significant progress in computer vision, automating image and video analysis tasks. Among computer vision methods, convolutional neural networks (CNNs) are particularly promising for future advances in automating wildlife monitoring [6][12][13][14][15][16][17][18]. Corcoran et al. [3] concluded that, when implementing automatic detection, fixed-wing drones with RGB sensors are ideal for detecting larger animals in open terrain, whereas for small, elusive animals in more complex habitats, multi-rotor systems with infrared (IR) or thermal infrared sensors are the better choice, especially when monitoring cryptic and nocturnal animals. They also noted a knowledge gap in understanding how the chosen drone platform, sensors, and survey design affect the false positive detections made by the trained models, which can lead to overestimated counts [3].

3. You-Only-Look-Once-Based UAV Technology

A popular group of open-source CNNs is the YOLO (You Only Look Once) family of object detection and image segmentation models, which has seen several iterations and active development [14][19][20][21][22]; a technology cross-fusion with drones has already been proposed as YOLO-Based UAV Technology (YBUT) [6]. The advantages of YOLO models are that they are fast [8], making real-time object detection on live footage possible, and that they are relatively user-friendly and intuitive, making them approachable to non-computer scientists. Because they are implemented in Python, they are also accessible for custom development, which makes it possible to deploy them on external hardware so that, for example, object detection can be carried out in real time onboard a drone. Object detection and tracking of cars and persons are already integrated into several unmanned aerial systems, such as the DJI Matrice 300 RTK [23], but customization of these systems is limited. The YOLO framework and YBUT show the potential of active community development [6][24]: examples include architectures based on YOLOv5 that improve the detection of very small objects in drone imagery [12][25], YOLO-FIRI, an improved network for infrared image object detection [26], and an improved YOLOv5 framework for detecting wildlife in dense spatial distributions [17].
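
As a minimal sketch of this workflow, the following Python snippet loads a pretrained YOLOv5 model through the Ultralytics hub interface [19] and runs it on a single frame. The default COCO weights, the confidence threshold, and the file name "thermal_frame.jpg" are illustrative assumptions, not the setup used in the study; a real survey pipeline would fine-tune the model on annotated thermal wildlife imagery.

```python
import torch

# A pretrained YOLOv5 model fetched via the Ultralytics hub interface [19].
# The default COCO weights are used here only for illustration; in practice
# the model would be fine-tuned on annotated thermal wildlife imagery.
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
model.conf = 0.25  # confidence threshold below which detections are discarded

# Run inference on a single frame; "thermal_frame.jpg" is a hypothetical file.
results = model("thermal_frame.jpg")

# Each detection row holds: x1, y1, x2, y2, confidence, class index.
for *box, conf, cls in results.xyxy[0].tolist():
    print(f"{model.names[int(cls)]}: conf={conf:.2f}, box={box}")
```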

4. Mean Average Precision

When training neural networks, here called models, one of the main parameters for describing the performance of a model is the mean average precision (mAP) [27]. This metric evaluates how well a model predicts bounding boxes at different confidence levels, and thereby measures the precision of the trained model in comparison with other models applied to the same test dataset. A training dataset is typically a collection of manually annotated images divided into a set for the training itself, a validation set, and a test set, a procedure known as dataset splitting [27]. The validation set is used to detect overfitting of a trained model, and the test set is used to evaluate its performance on unseen data. The mAP builds on several underlying quantities: precision, recall, and intersection over union (IOU) [18][27]. The precision of a model, calculated as the number of true positives divided by the sum of true and false positives, describes the proportion of positive predictions that are correct. Precision does not, however, take false negatives into account. The recall of a model, calculated as the number of true positives divided by the sum of true positives and false negatives, describes how many of the actual positives the model correctly detects. There is a trade-off between precision and recall: making more predictions at a lower confidence threshold makes detection less precise but, in turn, yields higher recall. Precision–recall curves visualize how the precision of the model behaves as the confidence threshold changes. The IOU measures the overlap between a manually annotated bounding box on a test image and the bounding box predicted by the trained model on the same image; it thereby quantifies how much of the object of the specified class, and how much of its surroundings, is included in the detection. mAP is the mean of the precision–recall curve over all classes and over all IOU thresholds for each class, so it accounts both for the numbers of false negatives and false positives and for how precisely the bounding boxes are drawn around the objects [18][27].
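
The following sketch makes these definitions concrete. It is a plain Python illustration of the formulas above, not the evaluation code used in the study; boxes are assumed to be given as (x1, y1, x2, y2) corner coordinates, and the counts in the usage example are invented.

```python
def precision(tp: int, fp: int) -> float:
    """Proportion of positive predictions that are correct: TP / (TP + FP)."""
    return tp / (tp + fp) if (tp + fp) else 0.0


def recall(tp: int, fn: int) -> float:
    """Proportion of actual positives the model detects: TP / (TP + FN)."""
    return tp / (tp + fn) if (tp + fn) else 0.0


def iou(box_a, box_b) -> float:
    """Intersection over union of two (x1, y1, x2, y2) bounding boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0


# Hypothetical counts, and a predicted box overlapping a ground-truth box.
print(precision(tp=80, fp=20))                  # 0.8
print(recall(tp=80, fn=40))                     # ~0.67
print(iou((10, 10, 50, 50), (30, 30, 70, 70)))  # ~0.14 (400 / 2800)
```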
Povlsen et al. [28] flew predetermined flight paths at 60 m altitude with a DJI Mavic 2 Enterprise Advanced, with the thermal camera pointing straight down (90°), covering transects that were simultaneously surveyed from the ground, and monitored hare, deer, and fox. Using transect counting, it was possible to spot roughly the same number of animals as with the traditional ground-based spotlight count [28]. However, this method covered a relatively small area per flight and required post-processing of the captured imagery, so it remained time-consuming. In the present research, the researchers tried a slightly different approach, manually piloting the UAV continuously using the scouring method, which has also been shown to match and potentially surpass the traditional spotlight method [9]. By scouring the area with the camera angled at about 45°, the researchers attained better situational awareness and covered a larger area per flight. This approach does require some experience from the drone pilot [24], both in piloting the drone and camera and in spotting animals in thermal imagery, but, as the researchers will show, there is potential for automating this approach using machine learning (ML) to improve post-processing efficiency and possibly even to collect data automatically in real time while the drone is airborne.
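
A minimal sketch of what such real-time, frame-by-frame detection could look like is given below, again using the YOLOv5 hub interface [19] together with OpenCV. The stream address is a hypothetical placeholder for a drone video downlink, and the per-frame logging stands in for whatever onboard or ground-station handling a real system would use.

```python
import cv2
import torch

# Hypothetical video-downlink address; any source accepted by
# cv2.VideoCapture (RTMP/RTSP stream, webcam index, or file) would work.
STREAM_URL = "rtmp://192.168.1.1/live"

model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # YOLOv5 expects RGB input, while OpenCV delivers BGR frames.
    results = model(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    # Detections are logged per frame instead of post-processing recordings.
    for *box, conf, cls in results.xyxy[0].tolist():
        print(model.names[int(cls)], round(conf, 2))
cap.release()
```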

References

  1. Linchant, J.; Lisein, J.; Semeki, J.; Lejeune, P.; Vermeulen, C. Are unmanned aircraft systems (UASs) the future of wildlife monitoring? A review of accomplishments and challenges. Mammal. Rev. 2015, 45, 239–252.
  2. Lyu, X.; Li, X.; Dang, D.; Dou, H.; Wang, K.; Lou, A. Unmanned Aerial Vehicle (UAV) Remote Sensing in Grassland Ecosystem Monitoring: A Systematic Review. Remote Sens. 2022, 14, 1096.
  3. Corcoran, E.; Winsen, M.; Sudholz, A.; Hamilton, G. Automated detection of wildlife using drones: Synthesis, opportunities and constraints. Methods Ecol. Evol. 2021, 12, 1103–1114.
  4. Petso, T.; Jamisola, R.S.; Mpoeleng, D. Review on methods used for wildlife species and individual identification. Eur. J. Wildl. Res. 2022, 68, 3.
  5. Robinson, J.M.; Harrison, P.A.; Mavoa, S.; Breed, M.F. Existing and emerging uses of drones in restoration ecology. Methods Ecol. Evol. 2022, 13, 1899–1911.
  6. Chen, C.; Zheng, Z.; Xu, T.; Guo, S.; Feng, S.; Yao, W.; Lan, Y. YOLO-Based UAV Technology: A Review of the Research and Its Applications. Drones 2023, 7, 190.
  7. Tomljanovic, K.; Kolar, A.; Duka, A.; Franjevic, M.; Jurjevic, L.; Matak, I.; Ugarkovic, D.; Balenovic, I. Application of UAS for Monitoring of Forest Ecosystems—A Review of Experience and Knowledge. Croat. J. For. Eng. 2022, 43, 487–504.
  8. Psiroukis, V.; Malounas, I.; Mylonas, N.; Grivakis, K.; Fountas, S.; Hadjigeorgiou, I. Monitoring of free-range rabbits using aerial thermal imaging. Smart Agric. Technol. 2021, 1, 100002.
  9. Povlsen, P.; Bruhn, D.; Pertoldi, C.; Pagh, S. A Novel Scouring Method to Monitor Nocturnal Mammals Using Uncrewed Aerial Vehicles and Thermal Cameras—A Comparison to Line Transect Spotlight Counts. Drones 2023, 7, 661.
  10. Chrétien, L.; Théau, J.; Ménard, P. Wildlife multispecies remote sensing using visible and thermal infrared imagery acquired from an unmanned aerial vehicle (UAV). Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, XL-1/W4, 241–248.
  11. Beaver, J.T.; Baldwin, R.W.; Messinger, M.; Newbolt, C.H.; Ditchkoff, S.S.; Silman, M.R. Evaluating the Use of Drones Equipped with Thermal Sensors as an Effective Method for Estimating Wildlife. Wildl. Soc. Bull. 2020, 44, 434–443.
  12. Baidya, R.; Jeong, H. YOLOv5 with ConvMixer Prediction Heads for Precise Object Detection in Drone Imagery. Sensors 2022, 22, 8424.
  13. Zhang, M.; Gao, F.; Yang, W.; Zhang, H. Wildlife Object Detection Method Applying Segmentation Gradient Flow and Feature Dimensionality Reduction. Electronics 2023, 12, 377.
  14. Winsen, M.; Denman, S.; Corcoran, E.; Hamilton, G. Automated Detection of Koalas with Deep Learning Ensembles. Remote Sens. 2022, 14, 2432.
  15. Rominger, K.R.; Meyer, S.E. Drones, Deep Learning, and Endangered Plants: A Method for Population-Level Census Using Image Analysis. Drones 2021, 5, 126.
  16. Tan, M.; Chao, W.; Cheng, J.; Zhou, M.; Ma, Y.; Jiang, X.; Ge, J.; Yu, L.; Feng, L. Animal Detection and Classification from Camera Trap Images Using Different Mainstream Object Detection Architectures. Animals 2022, 12, 1976.
  17. Pei, Y.; Xu, L.; Zheng, B. Improved YOLOv5 for Dense Wildlife Object Detection; Deng, W., Feng, J., Huang, D., Kan, M., Sun, Z., Zheng, F., Wang, W., He, Z., Eds.; Springer Nature Switzerland: Cham, Switzerland, 2022; pp. 569–578.
  18. Eikelboom, J.A.J.; Wind, J.; van de Ven, E.; Kenana, L.M.; Schroder, B.; de Knegt, H.J.; van Langevelde, F.; Prins, H.H.T. Improving the precision and accuracy of animal population estimates with aerial image object detection. Methods Ecol. Evol. 2019, 10, 1875–1887.
  19. Ultralytics/YOLOv5. Available online: https://github.com/ultralytics/yolov5 (accessed on 27 April 2023).
  20. Redmon, J.; Farhadi, A. YOLOv3: An Incremental Improvement. arXiv 2018, arXiv:1804.02767.
  21. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
  22. Ultralytics.com. Available online: https://docs.ultralytics.com/ (accessed on 28 April 2023).
  23. DJI Matrice 300RTK. Available online: https://www.dji.com/dk/matrice-300 (accessed on 27 April 2023).
  24. Whitworth, A.; Pinto, C.; Ortiz, J.; Flatt, E.; Silman, M. Flight speed and time of day heavily influence rainforest canopy wildlife counts from drone-mounted thermal camera surveys. Biodivers. Conserv. 2022, 31, 3179–3195.
  25. Dai, W.; Wang, H.; Song, Y.; Xin, Y. Wildlife small object detection based on enhanced network in ecological surveillance. In Proceedings of the 33rd Chinese Control and Decision Conference (CCDC), Kunming, China, 22–24 May 2021.
  26. Li, S.; Li, Y.; Li, Y.; Li, M.; Xu, X. YOLO-FIRI: Improved YOLOv5 for Infrared Image Object Detection. IEEE Access 2021, 9, 141861–141875.
  27. Roboflow.com. Available online: https://help.roboflow.com/ (accessed on 27 April 2023).
  28. Povlsen, P.; Linder, A.C.; Larsen, H.L.; Durdevic, P.; Arroyo, D.O.; Bruhn, D.; Pertoldi, C.; Pagh, S. Using Drones with Thermal Imaging to Estimate Population Counts of European Hare (Lepus europaeus) in Denmark. Drones 2023, 7, 5.