Vision-Based Flying Obstacle Detection for Avoiding Midair Collisions

See and avoid is a basic procedure that pilots must learn and apply during flight. Various technologies have been introduced to avoid midair collisions, but accidents still occur because they are neither mandatory in all airspaces nor suitable for all aircraft.

  • midair collision
  • obstacle detection
  • computer vision

1. Introduction

Although the sky may seem too big for two flying vehicles to collide, the facts show that midair collisions still occur from time to time and are a major concern for aviation safety authorities. As a preventive measure, pilots are instructed to keep one eye out of the cockpit, scan the sky for potential threats, and be prepared to maneuver to avoid a potential accident [1][2]. However, this “see-and-avoid” rule has several important limitations. First, pilots cannot look outside all the time, as they also have to check the instruments inside the cockpit, which measure, for example, airspeed or engine temperature. The time spent looking inward and outward must therefore be carefully balanced. Pilots who spend most of their time looking at the instruments, and this is especially true of novice pilots, endanger the aircraft by ignoring the traffic around them.
The “80–20” rule suggests that pilots should spend no less than 80% of their time looking out and no more than 20% checking instruments. The 80 does not refer only to looking for other traffic, as the pilot also looks for visual cues used for navigation. In any case, even if a pilot or crew member could spend 100% of their time scanning the sky, this would not mean that no threat could escape the human eye. In fact, the fraction of our visual field that allows us to detect anything in the sky is extremely small. Therefore, for practical scanning, pilots are also instructed to follow a pattern, dividing the horizon into regions and pausing briefly (1–2 s) to focus before moving on to the next region. Typically, the horizon is divided into nine regions, and the pilot’s eye scans one ninth at a time; in other words, roughly 89% of the horizon remains unattended at any given moment. This gives a clear idea of the chances of a threat escaping the human eye, especially considering that a light aircraft, such as a 9-meter-wingspan Piper PA-28 Cherokee, approaching head-on at 90 knots on a collision course, looks no bigger than a sparrow 5 seconds before impact [3]. To make matters worse, the performance of the human eye can be reduced by cloud cover, glare from the sun, fatigue, and many other factors.
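These numbers are easy to check with a back-of-the-envelope calculation. The following minimal Python sketch uses the 90-knot speed and 9 m wingspan from the example above, assuming for simplicity that the closing speed equals the intruder's speed alone:

```python
import math

KNOT_MS = 0.514444                      # 1 knot in m/s

closing_speed = 90 * KNOT_MS            # assumed head-on closing speed (m/s)
time_to_impact = 5.0                    # seconds before impact
wingspan = 9.0                          # Piper PA-28 Cherokee wingspan (m)

distance = closing_speed * time_to_impact                          # ~231 m
apparent_deg = math.degrees(2 * math.atan(wingspan / (2 * distance)))

print(f"distance: {distance:.0f} m, apparent width: {apparent_deg:.1f} deg")
# -> about 2.2 degrees of visual angle, while only one of the nine
#    horizon regions (8/9 = 88.9% unattended) is being scanned at a time.
```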
With today’s technologies, such as SSR (secondary surveillance radar), transponders, TCAS (traffic collision avoidance system) and, more recently, ADS-B (automatic dependent surveillance-broadcast), one might think that midair collisions should no longer occur. However, they do, because these technologies are not mandatory in all countries, airspaces, or aircraft. They are also fallible, because human factors still cause accidents. The new century has also brought a new player onto the scene. Since 2005, the use of unmanned aerial vehicles (UAVs) or drones in commercial applications has grown exponentially [4][5], increasing the need for safe path planning [6] and obstacle avoidance [7]. New commercial applications are not without risk, potentially causing damage and disrupting other aerial activities [8]. In 2017, a DJI Phantom 4 collided with a US Army UH-60M helicopter near Coney Island, New York. It was the first documented collision between a UAV and a manned aircraft, and the number of UAV sightings reported by pilots has increased dramatically in recent years. Aircraft collision avoidance is therefore a challenging problem due to the stochastic environment and uncertainty about the intentions of other aircraft [9].
For these reasons, and particularly in the case of manned and unmanned light aircraft in uncontrolled airspace and at aerodromes, various safety agencies and pilot associations are encouraging pilots and UAV users to install some form of electronic conspicuity (EC) device on their vehicles to be more aware of nearby aircraft [10][11][12]. An example of such EC technology is FLARM (FLight alARM, https://flarm.com/ accessed on 11 September 2023), which predicts potential conflicts and alerts the pilot with visual and audible warnings [13]. The device was originally designed for gliders, which are slower than light aircraft. The main limitation of these devices is compatibility, as a FLARM device can only display air traffic from other compatible FLARM devices. Incompatibility occurs, for example, when the communication solution differs due to the use of different frequencies (the US version of FLARM devices uses a different band than the European one) or different protocols (a FLARM device that has not been updated for a year is not compatible with the latest version of the protocol and will automatically shut down). In addition, some devices are active, i.e., they transmit and share their position with others, while others are passive, i.e., they listen to the transmissions of others but remain invisible to them (e.g., many portable air traffic detectors only listen to the transponders of other aircraft). In this “Tower of Babel” scenario, when communications fail or are absent, pilots should be able to rely not solely on their own eyes to detect threats, but also on an artificial eye capable of scanning the sky faster, farther, wider, sharper, and more consistently.
The most widely used solution for preventing accidents is radar. There are many types of radar used in aviation, but the most important is the primary surveillance radar (PSR). PSR detects the position and altitude of an aircraft by measuring the time it takes for radar waves to bounce off the aircraft and return to the radar antenna, and it can detect both transponder-equipped and non-transponder-equipped aircraft. PSR is not perfect, however: it cannot identify the detected obstacle, and the required equipment is expensive [14]. Computer vision solutions have an advantage here because the equipment is relatively cheap and, depending on the implementation, the solution can identify the incoming obstacle. Nevertheless, the effectiveness of computer vision can be degraded by lighting conditions, which is why some researchers combine both approaches [15] to overcome the limitations of each.
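The range measurement performed by a PSR, as described above, follows directly from the round-trip time of the pulse. A minimal sketch of the relation R = c·t/2, with a made-up echo delay for illustration:

```python
C = 299_792_458.0  # speed of light (m/s)

def psr_slant_range(echo_delay_s: float) -> float:
    """Slant range from the round-trip time of a radar pulse: R = c * t / 2."""
    return C * echo_delay_s / 2.0

# Hypothetical echo received 500 microseconds after transmission:
print(f"{psr_slant_range(500e-6) / 1000:.1f} km")  # -> 74.9 km
```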

2. Vision-Based Flying Obstacle Detection for Avoiding Midair Collisions

The issue has attracted attention in recent years due to several factors. First, the availability of nonmilitary drones to the general public since 2006 [16][17] poses several challenges. The possibility of drones colliding not only with other unmanned aerial vehicles but also with manned aircraft, including passenger airliners, is a major concern today. In addition, the market offers a wide catalog of drones with distinctive features, such as integrated cameras, global positioning system devices, wireless connectivity, accelerometers, and altimeters. This easy access to drones also allows researchers to test different solutions without major risks [18].

2.1. Computer Vision

The second factor is computer vision. Computer vision began in the early 1970s as a way to mimic human vision, and it has continued to grow and improve ever since [19]. Thanks to many advances, computer vision libraries require less computing power to run their algorithms, making it more feasible for researchers to move their solutions to lower-end hardware [20]. Finally, single-board computers give researchers the ability to test on the fly without the need for heavy equipment [21]. A single-board computer is a single PC board with a processor, memory, and some form of I/O that allows it to function as a computer [22]. The size and weight of single-board computers make them perfect for mounting on a drone or light aircraft without affecting its performance in flight. In the mid-1970s, the “dyna-micro” was one of the first true single-board computers. The next major development in single-board computers came in 2008 from BeagleBoard.org, which created a low-cost, open-source development board called the BeagleBoard [22].
According to the reviewed papers, the detection of moving obstacles with computer vision starts with the images provided by a camera or a group of cameras. Computer vision cannot be accurate without good images at the best possible resolution. In the included studies, the majority of publications used a single monocular camera (71.76%), while papers using stereo cameras represent 17.64% of the publications. This is probably because applications using stereo cameras are computationally expensive compared with those using monocular cameras [23].
The captured images must then be processed with computer algorithms to detect possible obstacles. The most commonly used vision recognition process identified in the papers was object detection, which involves identifying a specific object in successive images. An ideal object detection algorithm must be both fast and accurate. Object detection can be complemented by object tracking, which uses information from the previous frame to follow objects and is faster than object detection [24].
Grouping the selected papers by the method used to test the collision avoidance algorithm shows that almost half of the publications (47.05%) use only computer simulations to verify and validate their solutions. Simulations are cheaper and safer than using a manned or unmanned aircraft: they avoid the risks of testing a collision avoidance algorithm in real flight, where accidents can have costly or even fatal consequences, especially in the early stages of development when the solution is less mature and more prone to error.
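To make the monocular detection step concrete, the sketch below uses OpenCV background subtraction to flag small moving regions against the sky. It is only an illustration, not the method of any reviewed paper; the video path and area threshold are assumptions for the example:

```python
import cv2

# Hypothetical input video; in a real system this would be the camera stream.
cap = cv2.VideoCapture("sky_camera.mp4")
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

MIN_AREA = 20  # assumed minimum blob size in pixels; tune per camera and resolution

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                      # moving pixels
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,       # suppress single-pixel noise
                            cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) >= MIN_AREA:
            x, y, w, h = cv2.boundingRect(c)            # candidate obstacle
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) & 0xFF == 27:                     # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

In a full pipeline, as noted above, these per-frame detections would feed an object tracker that follows each candidate across frames at a lower computational cost.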

2.2. Testing Tools

Researchers prefer a controlled environment to carry out tests and improve their solutions before real-world trials take place. Matlab, a programming and numerical computing platform released to the public in 1984, can be used for data analysis, data visualization, algorithm development, and more [25]. Matlab’s ease of use and large library of built-in algorithms make it one of the preferred tools for testing algorithms. However, since 2018, authors have preferred flight simulators (5 papers from 2018 to 2021), Gazebo (4 papers from 2019 (Acropolis) to 2021 (Fortress)), and Blender v2.91 (1 paper in 2020). New improvements and increased realism in off-the-shelf flight simulators, in particular FlightGear v2020.3.18 and X-Plane v12.0, may be the reason authors switched to this software. FlightGear is a free, open-source, multiplatform flight simulator whose first version was released in 1997; it was used in two papers described in RQ6, immediately after the launch of a major release, FlightGear 2.0, in 2010. X-Plane is a flight simulator available since 1995; in 2017, its engine received a major update (v11.0), providing greater graphical realism [26][27]. The recent release of the popular Microsoft Flight Simulator (v1.31.22.0), after many years since the last update, may make it another option to consider in future work.
The use of real drones as a testing method for validating collision avoidance algorithms began in 2012. Drones help researchers create a more realistic yet controlled testing environment, reducing interference when assessing the effectiveness of an algorithm. Although the Federal Aviation Administration issued a commercial drone permit in 2006, drones were not widely available at the time [28]. It was not until 2010 that the company Parrot released the Parrot AR.Drone 1.0 [29]. This drone was a huge commercial success, selling over half a million units. The Parrot AR.Drone 2.0 was released in 2012 [30]. In 2013, the Chinese company DJI launched its first camera-equipped drone, the Phantom 1 [31]. In 2018, the same company launched the DJI Mavic Air, designed to be portable, powerful, and accessible to enthusiasts of all levels. More interestingly, the DJI Mavic Air added an obstacle avoidance system for safety in the air [32]. Unfortunately, the authors could not find any details on the obstacle avoidance system used by DJI.
It is noteworthy that only four publications reported the use of a manned aircraft in the tests. As discussed above, the use of simulators avoids the cost of using real vehicles and reduces the hazards, and small quadrotor UAVs are affordable compared with manned aircraft. However, it should be noted that the solutions tested on quadrotor UAVs may not be directly applicable to light manned aircraft or even larger drones due to factors such as vehicle speed, flight altitude, and weather conditions, to name a few.
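To illustrate why simulation is the cheapest first validation step, here is a toy closed-loop sketch; all speeds, ranges, and the avoidance rule are invented for the example and not taken from any reviewed paper. Two aircraft converge head-on, and the ownship sidesteps once the intruder comes within an assumed detection range:

```python
import math

DT = 0.1            # simulation step (s)
DETECT_RANGE = 500  # assumed sensor detection range (m)
SPEED = 45.0        # both aircraft fly ~45 m/s (about 87 knots)

own = [0.0, 0.0]          # ownship position (x, y) in metres
intruder = [2000.0, 0.0]  # intruder starts 2 km ahead, flying toward us

min_sep = float("inf")
for _ in range(600):                       # simulate 60 s
    dist = math.hypot(intruder[0] - own[0], intruder[1] - own[1])
    min_sep = min(min_sep, dist)
    # Naive avoidance rule: once the intruder is "detected", step sideways.
    if dist < DETECT_RANGE:
        own[1] += SPEED * DT               # lateral dodge
    else:
        own[0] += SPEED * DT               # otherwise fly straight ahead
    intruder[0] -= SPEED * DT              # intruder keeps coming head-on

# With the dodge, minimum separation stays near ~350 m;
# without it, the two trajectories would meet head-on.
print(f"Minimum separation: {min_sep:.0f} m")
```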

2.3. Obstacles and Future Work

The algorithms, solutions, and proposals described in the articles included in this systematic review are not yet free from shortcomings, which represents an opportunity for future work and developments in this field. Some problems are related to errors, inaccuracies, and false positives ([P7], [P24], [P41], [P42]). For example, [P42] reports 90% hits in near scenarios but 40% false alarms in far ones, which also shows the importance of testing in different scenarios to reveal the limitations of a proposal. Indeed, many authors are willing to test their solutions in additional scenarios, especially in more complex, crowded, and noisy environments, as the next step of their research ([P4], [P7], [P15], [P25], [P31], [P39], [P59], [P61]). Simulation software plays an important role here. For example, [P48] uses the minimalist proprietary DJI flight simulator, but the authors claim that another, more appropriate simulation software would be needed to further improve their algorithms. A ground-based video dataset, such as the one presented in [P32], may be a solution for evaluating video-based detection and avoidance systems. However, beyond more realistic simulations and video datasets, many authors would like to extend their tests to the real world, i.e., to perform real flight tests with their solution embedded in a real manned or unmanned aircraft ([P4], [P8], [P18], [P27], [P48], [P59], [P60]).
Real tests reveal other problems related to vehicle speed ([P11], [P30], [P61]) or energy consumption ([P48]). For example, in [P11], a single 2D camera is used to generate a 3D model of the scene, but the lower the speed, the less information is obtained. Conversely, in [P30], the authors point out that the faster the drone moves, the more blurred the video becomes, which reduces the accuracy of the algorithm. The limited field of view of the cameras is another problem ([P25]). Some authors propose to address this in future work using additional cameras ([P61]) or other sensors, such as a depth sensor ([P60]), a laser rangefinder ([P11]), or a global positioning system (GPS) ([P18]). For example, in [P18], the authors plan to use GPS and stereo vision to determine the positions of both the vehicle and the obstacles in real tests.
The tracking of obstacles and their direction and speed is another problem to be solved ([P8], [P10], [P18], [P25], [P30], [P41]). In particular, distance estimation is a real challenge ([P41]). The correct spatial positioning of an obstacle is key for avoidance algorithms ([P8], [P27], [P33], [P41], [P54]), where further research is needed to improve maneuvering, minimize effort, and reduce deviation after avoidance. Finally, one paper proposed to investigate collision avoidance approaches for multiple intruders ([P39]).
For manned aircraft, one problem with detecting and avoiding obstacles in the air is how to warn pilots of a potential threat. Again, preparing test scenarios with manned aircraft is expensive, and technologies such as virtual reality and augmented reality, which have grown considerably in recent years [33][34], could help. For example, immersive virtual reality could be combined with flight simulator software to reproduce test scenarios for prototyping and testing warning solutions. Augmented reality could be used to simulate approaching threats in flight and could also lead to new ways of alerting and guiding pilots to avoid a collision.
It is worth noting that these technologies were included as keywords in the search, but no matching results were found; the authors believe they are promising technologies that should be explored in future approaches.
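The distance estimation difficulty mentioned above can be made concrete with the pinhole camera model: a monocular camera only recovers range if the obstacle's real size is known, and small pixel-measurement errors translate into large range errors. A minimal sketch, where the focal length, wingspan, and pixel widths are assumed values:

```python
def range_from_width(focal_px: float, real_width_m: float, pixel_width: float) -> float:
    """Pinhole model: Z = f * W / w (range from apparent width in the image)."""
    return focal_px * real_width_m / pixel_width

F = 1200.0   # assumed focal length in pixels
W = 9.0      # assumed obstacle wingspan in metres

for w_px in (12, 11, 10):   # a 1-pixel measurement error...
    print(f"{w_px:>2} px -> {range_from_width(F, W, w_px):6.0f} m")
# 12 px ->    900 m
# 11 px ->    982 m
# 10 px ->   1080 m   ...shifts the estimate by roughly 100 m per pixel.
```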

3. Conclusions

See and avoid is a basic procedure that pilots must learn and apply during flight. Various technologies have been introduced to avoid midair collisions, but accidents still occur because they are neither mandatory in all airspaces nor suitable for all aircraft. Technology can also fail and human error can occur, as was sadly demonstrated in 1986 when a Piper Cherokee light aircraft collided with the horizontal stabilizer of a DC-9 airliner over Cerritos, California, on approach to Los Angeles International Airport, because the Piper did not have a Mode C transponder, which would have alerted others of its presence. In addition, neither pilot appeared to have seen the other aircraft. Computer vision can assist pilots in this task, leading to solutions that match or exceed the performance of the human eye, complementing the pilot’s functions, or being used in autonomous systems to avoid obstacles and other flying vehicles.

References

  1. Federal Aviation Administration. How to Avoid a Mid Air Collision—P-8740-51. 2021. Available online: https://www.faasafety.gov/gslac/ALC/libview_normal.aspx?id=6851 (accessed on 11 September 2023).
  2. Federal Aviation Administration. Airplane Flying Handbook, FAA-H-8083-3B; Federal Aviation Administration, United States Department of Transportation: Oklahoma City, OK, USA, 2016.
  3. UK Airprox Board. When every second counts. Airprox Saf. Mag. 2017, 2017, 2–3.
  4. Akbari, Y.; Almaadeed, N.; Al-maadeed, S.; Elharrouss, O. Applications, databases and open computer vision research from drone videos and images: A survey. Artif. Intell. Rev. 2021, 54, 3887–3938.
  5. Yang, X.; Wei, P. Autonomous Free Flight Operations in Urban Air Mobility with Computational Guidance and Collision Avoidance. IEEE Trans. Intell. Transp. Syst. 2021, 22, 5962–5975.
  6. Jiang, Y.; Wu, Q.; Zhang, G.; Zhu, S.; Xing, W. A diversified group teaching optimization algorithm with segment-based fitness strategy for unmanned aerial vehicle route planning. Expert Syst. Appl. 2021, 185, 115690.
  7. Shin, S.Y.; Kang, Y.W.; Kim, Y.G. Reward-driven U-Net training for obstacle avoidance drone. Expert Syst. Appl. 2020, 143, 113064.
  8. Ghasri, M.; Maghrebi, M. Factors affecting unmanned aerial vehicles’ safety: A post-occurrence exploratory data analysis of drones’ accidents and incidents in Australia. Saf. Sci. 2021, 139, 105273.
  9. Bertram, J.; Wei, P.; Zambreno, J. A Fast Markov Decision Process-Based Algorithm for Collision Avoidance in Urban Air Mobility. IEEE Trans. Intell. Transp. Syst. 2022, 23, 15420–15433.
  10. Srivastava, A.; Prakash, J. Internet of Low-Altitude UAVs (IoLoUA): A methodical modeling on integration of Internet of “Things” with “UAV” possibilities and tests. Artif. Intell. Rev. 2023, 56, 2279–2324.
  11. Jenie, Y.I.; van Kampen, E.J.; Ellerbroek, J.; Hoekstra, J.M. Safety Assessment of a UAV CD&R System in High Density Airspace Using Monte Carlo Simulations. IEEE Trans. Intell. Transp. Syst. 2018, 19, 2686–2695.
  12. Uzochukwu, S. I can see clearly now. Microlight Fly. Mag. 2019, 2019, 22–24.
  13. Šimák, V.; Škultéty, F. Real time light-sport aircraft tracking using SRD860 band. Transp. Res. Procedia 2020, 51, 271–282.
  14. Vabre, P. Air Traffic Services Surveillance Systems, Including an Explanation of Primary and Secondary Radar; The Airways Museum & Civil Aviation Historical Society: Victoria, Australia, 2009. Available online: http://www.airwaysmuseum.com/Surveillance.htm (accessed on 12 July 2009).
  15. Vitiello, F.; Causa, F.; Opromolla, R.; Fasano, G. Detection and tracking of non-cooperative flying obstacles using low SWaP radar and optical sensors: An experimental analysis. In Proceedings of the 2022 International Conference on Unmanned Aircraft Systems (ICUAS), Dubrovnik, Croatia, 21–24 June 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 157–166.
  16. Kellermann, R.; Biehle, T.; Fischer, L. Drones for parcel and passenger transportation: A literature review. Transp. Res. Interdiscip. Perspect. 2020, 4, 100088.
  17. Kindervater, K.H. The emergence of lethal surveillance: Watching and killing in the history of drone technology. Secur. Dialogue 2016, 47, 223–238.
  18. Hussein, M.; Nouacer, R.; Corradi, F.; Ouhammou, Y.; Villar, E.; Tieri, C.; Castiñeira, R. Key technologies for safe and autonomous drones. Microprocess. Microsyst. 2021, 87, 104348.
  19. Szeliski, R. Computer Vision: Algorithms and Applications; Springer: London, UK, 2011.
  20. Feng, X.; Jiang, Y.; Yang, X.; Du, M.; Li, X. Computer vision algorithms and hardware implementations: A survey. Integration 2019, 69, 309–320.
  21. Chamola, V.; Kotesh, P.; Agarwal, A.; Gupta, N.; Guizani, M. A Comprehensive Review of Unmanned Aerial Vehicle Attacks and Neutralization Techniques. Ad Hoc Netw. 2021, 111, 102324.
  22. Ortmeyer, C. Then and now: A brief history of single board computers. Electron. Des. Uncovered 2014, 6, 1–11.
  23. Fernández-Caballero, A.; López, M.T.; Saiz-Valverde, S. Dynamic stereoscopic selective visual attention (DSSVA): Integrating motion and shape with depth in video segmentation. Expert Syst. Appl. 2008, 34, 1394–1402.
  24. Joshi, P.; Escrivá, D.; Godoy, V. OpenCV by Example: Enhance Your Understanding of Computer Vision and Image Processing by Developing Real-World Projects in OpenCV 3; Packt Publishing: Birmingham, UK, 2016.
  25. Moler, C.; Little, J. A History of MATLAB. Proc. ACM Program. Lang. 2020, 4, 81.
  26. Yu, L.; He, G.; Zhao, S.; Wang, X.; Shen, L. Design and implementation of a hardware-in-the-loop simulation system for a tilt trirotor UAV. J. Adv. Transp. 2020, 2020, 4305742.
  27. Kumar, A.; Yoon, S.; Kumar, V.R.S. Mixed reality simulation of high-endurance unmanned aerial vehicle with dual-head electromagnetic propulsion devices for earth and other planetary explorations. Appl. Sci. 2020, 10, 3736.
  28. Dronethusiast. The History of Drones (Drone History Timeline from 1849 to 2019). 2019. Available online: https://www.dronethusiast.com/history-of-drones/ (accessed on 11 September 2023).
  29. Dormehl, L. The History of Drones in 10 Milestones. 2018. Available online: https://www.digitaltrends.com/cool-tech/history-of-drones/ (accessed on 11 September 2023).
  30. Pollicino, J. Parrot Unveils AR.Drone 2.0 with 720p HD Camera, Autonomous Video-Recording, We Go Hands-On. 2012. Available online: https://www.engadget.com/2012-01-08-parrot-unveils-ar-drone-2-0-with-720p-hd-camera-autonomous-vide.html (accessed on 11 September 2023).
  31. DJI. Phantom. 2021. Available online: https://www.dji.com/es/phantom (accessed on 11 September 2023).
  32. DrDrone.ca. Timeline of DJI Drones: From the Phantom 1 to the Mavic Air. 2018. Available online: https://www.drdrone.ca/blogs/drone-news-drone-help-blog/timeline-of-dji-drones (accessed on 11 September 2023).
  33. Grand View Research. Augmented Reality Market Size, Share & Trends Analysis Report By Component, By Display (HMD & Smart Glass, HUD, Handheld Devices), By Application, By Region, And Segment Forecasts, 2021–2028. 2021. Available online: https://www.grandviewresearch.com/industry-analysis/augmented-reality-market (accessed on 11 September 2023).
  34. Grand View Research. Virtual Reality Market Size, Share & Trends Analysis Report by Technology (Semi & Fully Immersive, Non-immersive), By Device (HMD, GTD), by Component (Hardware, Software), by Application, and Segment Forecasts, 2021–2028. 2021. Available online: https://www.grandviewresearch.com/industry-analysis/virtual-reality-vr-market (accessed on 11 September 2023).