Unmanned Aerial Vehicle Remote Sensing: Comparison

Unmanned aerial vehicle (UAV) remote sensing has been widely used in agriculture, forestry, mining, and other industries. UAVs can be flexibly equipped with various sensors, such as optical, infrared, and LIDAR sensors, and have become an essential remote sensing observation platform. Based on UAV remote sensing, researchers can obtain large volumes of high-resolution images, with each pixel covering only a few centimeters or even millimeters of ground.

  • UAV
  • remote sensing
  • land applications
  • UAV imagery

1. Introduction

Since the 1960s, Earth observation satellites have garnered significant attention from both military [1,2] and civilian [3,4,5] sectors, due to their unique high-altitude observation ability, which enables simultaneous monitoring of a wide range of ground targets. Since the 1970s, several countries have launched numerous Earth observation satellites, such as NASA’s Landsat [6] series, CNES’s SPOT [7] series, and commercial satellites such as IKONOS [8], QuickBird, and the WorldView series, generating an enormous volume of remote sensing data. These satellites have facilitated the development of several generations of remote sensing image analysis methods, including remote sensing index methods [9,10,11,12,13,14,15], object-based image analysis (OBIA) methods [16,17,18,19,20,21,22], and, in recent years, deep neural network methods [23,24,25,26,27], all of which rely on the multi-spectral and high-resolution images generated by these remote sensing satellites.
From the 1980s onward, remote sensing research was mainly based on satellite data. Due to the cost of satellite launches, only a few remote sensing satellites were available for a long time, and most satellite imagery could only be obtained at high cost, with the exception of a few programs, such as the Landsat series, that were partially free. This also shaped the direction of remote sensing research: during this period, many remote sensing index methods based on the spectral characteristics of ground targets relied mainly on free Landsat data, while other satellite data were used less, owing to their high purchase cost.
Besides the high cost and limited supply, remote sensing satellite data acquisition is also constrained by several factors that affect both observation ability and research directions:
  • The observation ability of a remote sensing satellite is determined by its cameras. A satellite can only carry one or two cameras as sensors, and these cameras cannot be replaced once the satellite has been launched. Therefore, the observation performance of a satellite cannot be improved during its lifetime;
  • Remote sensing satellites can only observe targets when flying over the area near the target along their orbit, which limits the ability to observe targets from a specific angle.
These constraints not only limit the scope of remote sensing research but also affect research directions. For instance, land cover/land use is an important aspect of remote sensing research, yet its study objects are limited by the spatial resolution of the image data. The best panchromatic cameras currently carried by remote sensing satellites offer a resolution of about 31 cm/pixel, which can only identify the type, location, and outline of ground targets roughly 3 m [28] in size or larger, such as buildings, roads, trees, ships, and cars. Ground objects with smaller aerial projections, such as people, animals, and bicycles, cannot be distinguished in such images, because the pixel size is too coarse relative to the object. Similarly, change detection, which compares information in images of the same target acquired at two or more times, is another example. Since the data used in many studies are images taken by the same remote sensing satellite at different times along its orbit over the same location, the observation angles and spatial resolutions of these images are similar, making them suitable for pixel-by-pixel comparison methods. Hence, change detection has been a key direction in remote sensing research since the 1980s.
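The link between pixel size and the smallest recognizable ground object discussed above can be illustrated with a short calculation. The sketch below is illustrative only; the assumption that a target must span roughly 10 pixels to be reliably identified is a rule of thumb chosen here so that the result matches the 3 m figure cited from [28], not a value taken from the source.

```python
def min_identifiable_object_size(gsd_m_per_px: float, pixels_needed: int = 10) -> float:
    """Estimate the smallest ground object (in meters) that can be identified,
    given the ground sampling distance (GSD) and an assumed number of pixels
    the object must span to be recognizable (rule of thumb)."""
    return gsd_m_per_px * pixels_needed

# Satellite panchromatic imagery at 31 cm/pixel vs. UAV imagery at 2 cm/pixel
print(min_identifiable_object_size(0.31))  # ~3.1 m: buildings, cars, trees
print(min_identifiable_object_size(0.02))  # ~0.2 m: people, bicycles, small objects
```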
In the past decade, the emergence of multi-rotor unmanned aerial vehicles (UAVs) has gradually eased the above-mentioned limitations of remote sensing research. These aircraft carry no pilot, consume no fuel, and require no maintenance of turboshaft engines; they are driven by cheap but reliable brushless motors that consume only a small amount of electricity per flight. Users can schedule the entire flight of a multi-copter, from takeoff to landing, and edit flight parameters such as waypoints, flight speed, acceleration, and climb rate, as sketched below. Compared to human-crewed aircraft such as helicopters and small fixed-wing aircraft, multi-rotor drones are more stable and reliable and have several advantages for remote sensing applications.
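As a hedged illustration of the kind of pre-programmed flight described above, the sketch below represents a simple waypoint mission as plain data. The field names and numeric values are hypothetical and do not correspond to any particular flight-planning software or drone API.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float          # latitude in degrees
    lon: float          # longitude in degrees
    alt_m: float        # flight altitude above ground, meters
    speed_m_s: float    # cruise speed toward this waypoint, m/s
    action: str = ""    # e.g. "trigger_camera" (hypothetical action label)

# A hypothetical survey leg: two parallel flight lines at 60 m altitude.
mission = [
    Waypoint(40.0000, 116.0000, 60.0, 8.0, "trigger_camera"),
    Waypoint(40.0000, 116.0050, 60.0, 8.0, "trigger_camera"),
    Waypoint(40.0005, 116.0050, 60.0, 8.0, "trigger_camera"),
    Waypoint(40.0005, 116.0000, 60.0, 8.0, "trigger_camera"),
]

for i, wp in enumerate(mission):
    print(f"WP{i}: ({wp.lat:.4f}, {wp.lon:.4f}) alt={wp.alt_m} m speed={wp.speed_m_s} m/s")
```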

2. UAV Platforms and Sensors

The hardware of a UAV remote sensing platform consists of two parts: the drone’s flight platform and the sensors it carries. Compared to remote sensing satellites, one of the most significant advantages of UAV remote sensing is the flexible replacement of sensors, which allows researchers to use the same drone to study the properties and characteristics of different objects with different types of sensors. Figure 1 shows the structure of this section, covering the drone flight platforms and the different types of sensors carried.
Figure 1. UAV platforms and sensors.

2.1. UAV Platform

UAVs have been increasingly employed as a remote sensing observation platform for near-ground applications. Multi-rotor, fixed-wing, hybrid UAVs, and unmanned helicopters are the commonly used categories of UAVs. Among these, multi-rotor UAVs have gained the most popularity, owing to their numerous advantages. These UAVs, which come in various configurations, such as four-rotor, six-rotor, and eight-rotor, offer high safety during takeoff and landing and do not require a large airport or runway. They are highly controllable during flight and can easily adjust their flight altitude and speed. Additionally, some multi-rotor UAVs are equipped with obstacle detection abilities, allowing them to stop or bypass obstacles during flight. Figure 2 shows four typical UAV platforms.
Figure 2. UAV platforms: (a) Multi-rotor UAV, (b) Fixed-wing UAV, (c) Unmanned Helicopter, (d) VTOL UAV.
Multi-rotor UAVs use multiple propellers driven by brushless motors to generate lift. This mechanism allows each rotor to adjust its rotation speed independently and frequently, so flight altitude and attitude can be recovered quickly after disturbances. However, the power efficiency of multi-rotor UAVs is not prominent, and their flight duration is relatively short. Common consumer-grade drones, after careful optimization of weight and power, achieve flight times of about 30 min; for example, DJI’s Mavic Pro has a flight time of 27 min, the Mavic 2 of 31 min, and the Mavic Air 2 of 34 min. Despite these limitations, multi-rotor UAVs were the most widely used remote sensing data acquisition platform in the reviewed literature.
Fixed-wing UAVs, which are structurally similar to conventional aircraft, generate lift from the pressure difference between the upper and lower surfaces of their fixed wings during forward movement. These UAVs require a runway for takeoff and landing, and their landing process is more difficult to control than that of multi-rotor UAVs. Stable flight requires the wings to provide more lift than the weight of the aircraft, so the UAV must maintain a certain minimum speed throughout its flight. Consequently, fixed-wing UAVs cannot hover, and their ability to respond to rising or falling airflow is limited. Their flight speed is higher than that of multi-rotor UAVs, and their flight duration is also longer.
Unmanned helicopters, which have a structure similar to crewed helicopters, use a large main rotor to provide lift and a tail rotor to control heading. They offer excellent power efficiency and flight duration, but their mechanical blade structure is complex, leading to high vibration levels and costs. Only limited research using unmanned helicopters as a remote sensing platform was reported in the reviewed literature.
Hybrid UAVs, also known as vertical take-off and landing (VTOL) UAVs, combine the features of multi-rotor and fixed-wing UAVs. They take off and land in multi-rotor mode and cruise in fixed-wing mode, providing easy control during takeoff and landing and energy-efficient flight.
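To connect the endurance figures above to survey practice, the following sketch estimates how much area a multi-rotor UAV could photograph on a single battery charge. All parameters (flight time, speed, image swath, overlap, time reserve) are hypothetical values chosen for illustration, not figures taken from the cited drones’ specifications.

```python
def survey_area_per_flight(flight_time_min: float, speed_m_s: float,
                           swath_width_m: float, side_overlap: float) -> float:
    """Rough area (in hectares) covered by a lawn-mower survey pattern.
    The effective line spacing is the image swath reduced by the side overlap."""
    usable_time_s = flight_time_min * 60 * 0.8        # reserve ~20% for takeoff/landing/turns
    line_length_m = usable_time_s * speed_m_s         # total distance flown along survey lines
    line_spacing_m = swath_width_m * (1.0 - side_overlap)
    return line_length_m * line_spacing_m / 10_000.0  # m^2 -> hectares

# Hypothetical mission: 30 min endurance, 8 m/s cruise, 80 m swath, 70% side overlap
print(f"{survey_area_per_flight(30, 8.0, 80.0, 0.7):.0f} ha per flight")  # ~28 ha
```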

2.2. Sensors Carried by UAVs

UAVs have been widely utilized as a platform for remote sensing, and the sensors carried by these aircraft play a critical role in data acquisition. The sensors commonly carried by multi-rotor UAVs fall into two main categories: imagery sensors and three-dimensional information sensors. Beyond these two categories, drones may also carry gas sensors, airborne particle sensors, small radars, and other instruments. Figure 3 shows four typical UAV-carried sensors.
Figure 3. Sensors carried by UAVs: (a) RGB Camera, (b) Multi-spectral Camera, (c) Hyper-spectral Camera, (d) LIDAR.
Imagery sensors capture images of the observation targets and can be further classified into several types. RGB cameras capture images in the visible spectrum and are commonly used for vegetation mapping, land use classification, and environmental monitoring. Multi-spectral and hyper-spectral cameras capture images in multiple spectral bands, enabling the identification of specific features such as vegetation species, water quality, and mineral distribution. Thermal imagers capture the infrared radiation emitted by targets, making it possible to identify temperature differences and detect heat anomalies. These sensors can provide high-quality imagery data for various remote sensing applications.
In addition to imagery sensors, multi-rotor UAVs can also carry three-dimensional information sensors. These sensors are relatively new and have developed in recent years alongside advances in simultaneous localization and mapping (SLAM) technology. LIDAR sensors use laser beams to measure the distance between the UAV and the target, enabling the creation of high-precision three-dimensional maps (a simplified range-to-point calculation is sketched below). Millimeter-wave radar sensors use electromagnetic waves to measure the distance and velocity of targets, making them suitable for applications that require long-range, all-weather sensing. Multi-camera arrays capture images from different angles, allowing the creation of 3D models of the observation targets. Together, these sensors provide rich spatial information, enabling the analysis of terrain elevation, structure, and volume.
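As a minimal illustration of how a LIDAR measurement becomes a 3D point, the sketch below converts a pulse’s round-trip time into a slant range and then projects it to ground coordinates. It assumes level flight, a nadir-centered across-track scan, and perfect attitude knowledge; real UAV LIDAR processing additionally fuses GNSS/IMU pose data, which is omitted here.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lidar_range_from_tof(round_trip_time_s: float) -> float:
    """Time-of-flight ranging: the pulse travels to the target and back."""
    return C * round_trip_time_s / 2.0

def ground_point(uav_x: float, uav_y: float, uav_alt: float,
                 scan_angle_deg: float, slant_range_m: float):
    """Project a single LIDAR return to a 3D point, assuming level flight and
    a nadir-centered scan in the across-track (y) direction (no attitude error)."""
    a = math.radians(scan_angle_deg)
    dy = slant_range_m * math.sin(a)   # across-track offset
    dz = slant_range_m * math.cos(a)   # vertical drop from the UAV
    return (uav_x, uav_y + dy, uav_alt - dz)

# Example: a 400 ns round trip corresponds to a ~60 m slant range
r = lidar_range_from_tof(400e-9)
print(f"slant range = {r:.1f} m")
print("ground point:", ground_point(0.0, 0.0, 80.0, 15.0, r))
```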

2.2.1. RGB Cameras

RGB cameras are a prevalent type of remote sensing sensor on UAVs, and two types are commonly used on UAV platforms. The first type is the UAV-integrated camera, mounted on the UAV’s gimbal. Such a camera typically has a resolution of 20 megapixels or higher, such as the 20-megapixel 4/3-inch image sensor integrated into the DJI Mavic 3 and the 20-megapixel 1-inch image sensor integrated into AUTEL’s EVO II Pro V3. These cameras can capture high-resolution images at high frame rates and offer the advantages of light weight, compactness, and long endurance. However, their original lens cannot be replaced with telephoto or wide-angle lenses, which are needed for long-range or wide-angle observation. The second type commonly carried by UAVs is the single-lens reflex (SLR) camera, which allows lenses with different focal lengths to be exchanged. UAVs equipped with SLR cameras offer lens flexibility and can be used for long-range or wide-angle observation, making them a valuable tool for such applications. Nonetheless, SLR cameras are heavier and require gimbals for installation, necessitating a UAV of sufficient size and load capacity to accommodate them. For example, Liu et al. [29] utilized the SONY A7R camera, which provides multiple lens options, including zoom and fixed-focus lenses, to produce a high-precision digital elevation model (DEM) in their research.
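The achievable image resolution of these cameras depends on flight altitude, focal length, and the physical pixel size of the sensor. The sketch below computes the ground sampling distance (GSD) from these quantities; the numeric values are hypothetical examples, not specifications of the cameras mentioned above.

```python
def ground_sampling_distance(altitude_m: float, focal_length_mm: float,
                             pixel_pitch_um: float) -> float:
    """GSD in centimeters per pixel for a nadir-looking camera:
    GSD = (pixel pitch * altitude) / focal length."""
    pixel_pitch_m = pixel_pitch_um * 1e-6
    focal_length_m = focal_length_mm * 1e-3
    return pixel_pitch_m * altitude_m / focal_length_m * 100.0  # m -> cm

# Hypothetical example: 100 m altitude, 35 mm lens, 4.4 µm pixel pitch
print(f"{ground_sampling_distance(100, 35, 4.4):.2f} cm/pixel")  # ~1.26 cm/pixel
```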

2.2.2. Multi-Spectral and Hyper-Spectral Cameras

Multi-spectral and hyper-spectral cameras are remote sensing instruments that collect the spectral radiation intensity of reflected sunlight in specific wavelength bands. A multi-spectral camera is designed to provide data similar to those of multi-spectral remote sensing satellites, allowing quantitative observation of the radiation intensity of light reflected by ground targets in specific sunlight bands. When processing multi-spectral satellite remote sensing images, the reflected light intensities of the same ground target in different spectral bands are combined into remote sensing indices, such as the widely used dimensionless normalized difference vegetation index (NDVI) [9], defined in Equation (1):

NDVI = (NIR − Red) / (NIR + Red)    (1)

In Equation (1), NIR refers to the measured intensity of reflected light in the near-infrared spectral range (700∼800 nm), while Red refers to the measured intensity in the red spectral range (600∼700 nm). The NDVI is used to measure vegetation density, as living green plants, algae, cyanobacteria, and other photosynthetic autotrophs absorb red and blue light but reflect near-infrared light; vegetation-rich areas therefore have higher NDVI values. After the launch of the Landsat-1 satellite in 1972, multi-spectral scanner system (MSS) sensors, which observe the ground-reflected light separately in different frequency ranges, became a popular data source for research. When studying spring vegetation greening and subsequent degradation in the Great Plains of the central United States, where the latitude range of the study region is large, NDVI [9] was proposed as a spectral index that is insensitive to changes in latitude and solar zenith angle. The NDVI ranges from about 0.3 to 0.8 in densely vegetated areas; it is negative for cloud- and snow-covered areas, close to 0 for water bodies, and a small positive value for bare soil. In addition to vegetation indices, other common remote sensing indices include the normalized difference water index (NDWI) [12], enhanced vegetation index (EVI) [11], leaf area index (LAI) [30], modified soil-adjusted vegetation index (MSAVI) [13], and soil-adjusted vegetation index (SAVI) [14]. These methods measure the spectral radiation intensity of reflected light in the blue, green, red, red-edge, near-infrared, and other bands.
Table 1 presents a comparison between the multi-spectral cameras of UAVs and the multi-spectral sensors of satellites. One notable difference is that a UAV’s multi-spectral camera has a specific narrow band known as the “red edge” [31], which is absent from many satellites’ multi-spectral sensors. This band spans wavelengths from 680 nm to 730 nm, the transition from the visible frequencies easily absorbed by plants to the infrared band largely reflected by plant cells; spectrally, it is the region where the reflectance of sunlight by plants changes most sharply. A few satellites, such as the European Space Agency (ESA)’s Sentinel-2, provide data in this band, and research on satellite data has revealed a correlation between the leaf area index (LAI) [30] and this band [32,33,34]. LAI [30] is a crucial variable for predicting photosynthetic productivity and evapotranspiration. Another significant difference is the advantage of UAV multi-spectral cameras in spatial resolution: they can reach centimeter-per-pixel resolution, which is currently unattainable by satellite sensors, and such centimeter-resolution multi-spectral images have many applications in precision agriculture.
Table 1. Parameters of UAV multi-spectral cameras and several satellite multi-spectral sensors.
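To make the index computations above concrete, the sketch below computes NDVI pixel by pixel from co-registered near-infrared and red reflectance arrays, the way one would process bands from a UAV multi-spectral camera. It is a minimal illustration assuming the bands are already radiometrically calibrated and aligned; the band values are hypothetical.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel.
    eps guards against division by zero over dark pixels."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Hypothetical 2x2 reflectance patches: dense vegetation vs. bare soil / water-like pixels
nir_band = np.array([[0.45, 0.50], [0.20, 0.10]])
red_band = np.array([[0.08, 0.06], [0.15, 0.10]])
print(ndvi(nir_band, red_band))
# Top row ~0.7-0.8 (vegetation); bottom row ~0.14 and ~0 (bare soil, water-like)
```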

References

  1. Simonett, D.S. Future and Present Needs of Remote Sensing in Geography; Technical Report; 1966. Available online: https://ntrs.nasa.gov/citations/19670031579 (accessed on 23 May 2023).
  2. Hudson, R.; Hudson, J.W. The military applications of remote sensing by infrared. Proc. IEEE 1975, 63, 104–128.
  3. Badgley, P.C. Current Status of NASA’s Natural Resources Program. Exploring Unknown. 1960; p. 226. Available online: https://ntrs.nasa.gov/citations/19670031597 (accessed on 23 May 2023).
  4. Roads, B.O.P. Remote Sensing Applications to Highway Engineering. Public Roads 1968, 35, 28.
  5. Taylor, J.I.; Stingelin, R.W. Infrared imaging for water resources studies. J. Hydraul. Div. 1969, 95, 175–190.
  6. Roy, D.P.; Wulder, M.A.; Loveland, T.R.; Woodcock, C.E.; Allen, R.G.; Anderson, M.C.; Helder, D.; Irons, J.R.; Johnson, D.M.; Kennedy, R.; et al. Landsat-8: Science and product vision for terrestrial global change research. Remote Sens. Environ. 2014, 145, 154–172.
  7. Chevrel, M.; Courtois, M.; Weill, G. The SPOT satellite remote sensing mission. Photogramm. Eng. Remote Sens. 1981, 47, 1163–1171.
  8. Dial, G.; Bowen, H.; Gerlach, F.; Grodecki, J.; Oleszczuk, R. IKONOS satellite, imagery, and products. Remote Sens. Environ. 2003, 88, 23–36.
  9. Rouse, J.W., Jr.; Haas, R.H.; Schell, J.; Deering, D. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation; Technical Report; 1973. Available online: https://ntrs.nasa.gov/citations/19740022555 (accessed on 23 May 2023).
  10. Jordan, C.F. Derivation of leaf-area index from quality of light on the forest floor. Ecology 1969, 50, 663–666.
  11. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213.
  12. Gao, B.C. NDWI—A normalized difference water index for remote sensing of vegetation liquid water from space. Remote Sens. Environ. 1996, 58, 257–266.
  13. Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A modified soil adjusted vegetation index. Remote Sens. Environ. 1994, 48, 119–126.
  14. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
  15. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107.
  16. Blaschke, T.; Lang, S.; Lorup, E.; Strobl, J.; Zeil, P. Object-oriented image processing in an integrated GIS/remote sensing environment and perspectives for environmental applications. Environ. Inf. Plan. Politics Public 2000, 2, 555–570.
  17. Blaschke, T.; Strobl, J. What’s wrong with pixels? Some recent developments interfacing remote sensing and GIS. Z. Geoinformationssysteme 2001, 12–17. Available online: https://www.researchgate.net/publication/216266284_What’s_wrong_with_pixels_Some_recent_developments_interfacing_remote_sensing_and_GIS (accessed on 23 May 2023).
  18. Schiewe, J. Segmentation of high-resolution remotely sensed data-concepts, applications and problems. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2002, 34, 380–385.
  19. Hay, G.J.; Blaschke, T.; Marceau, D.J.; Bouchard, A. A comparison of three image-object methods for the multiscale analysis of landscape structure. ISPRS J. Photogramm. Remote Sens. 2003, 57, 327–345.
  20. Benz, U.C.; Hofmann, P.; Willhauck, G.; Lingenfelder, I.; Heynen, M. Multi-resolution, object-oriented fuzzy analysis of remote sensing data for GIS-ready information. ISPRS J. Photogramm. Remote Sens. 2004, 58, 239–258.
  21. Blaschke, T.; Burnett, C.; Pekkarinen, A. New contextual approaches using image segmentation for objectbased classification. In Remote Sensing Image Analysis: Including the Spatial Domain; De Meer, F., de Jong, S., Eds.; 2004; Available online: https://courses.washington.edu/cfr530/GIS200106012.pdf (accessed on 23 May 2023).
  22. Zhan, Q.; Molenaar, M.; Tempfli, K.; Shi, W. Quality assessment for geo-spatial objects derived from remotely sensed data. Int. J. Remote Sens. 2005, 26, 2953–2974.
  23. Ronneberger, O.; Fischer, P.; Brox, T. U-net: Convolutional networks for biomedical image segmentation. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention–MICCAI 2015: 18th International Conference, Munich, Germany, 5–9 October 2015; Proceedings, Part III 18. Springer: Cham, Switzerland, 2015; pp. 234–241.
  24. He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask r-cnn. In Proceedings of the IEEE international Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 2961–2969.
  25. Hu, J.; Shen, L.; Sun, G. Squeeze-and-excitation networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 7132–7141.
  26. Chen, L.C.; Papandreou, G.; Kokkinos, I.; Murphy, K.; Yuille, A.L. Deeplab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected crfs. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 40, 834–848.
  27. Chu, X.; Zheng, A.; Zhang, X.; Sun, J. Detection in crowded scenes: One proposal, multiple predictions. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 12214–12223.
  28. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16.
  29. Liu, Y.; Zheng, X.; Ai, G.; Zhang, Y.; Zuo, Y. Generating a high-precision true digital orthophoto map based on UAV images. ISPRS Int. J. Geo-Inf. 2018, 7, 333.
  30. Watson, D.J. Comparative physiological studies on the growth of field crops: I. Variation in net assimilation rate and leaf area between species and varieties, and within and between years. Ann. Bot. 1947, 11, 41–76.
  31. Seager, S.; Turner, E.L.; Schafer, J.; Ford, E.B. Vegetation’s red edge: A possible spectroscopic biosignature of extraterrestrial plants. Astrobiology 2005, 5, 372–390.
  32. Delegido, J.; Verrelst, J.; Meza, C.; Rivera, J.; Alonso, L.; Moreno, J. A red-edge spectral index for remote sensing estimation of green LAI over agroecosystems. Eur. J. Agron. 2013, 46, 42–52.
  33. Lin, S.; Li, J.; Liu, Q.; Li, L.; Zhao, J.; Yu, W. Evaluating the effectiveness of using vegetation indices based on red-edge reflectance from Sentinel-2 to estimate gross primary productivity. Remote Sens. 2019, 11, 1303.
  34. Imran, H.A.; Gianelle, D.; Rocchini, D.; Dalponte, M.; Martín, M.P.; Sakowska, K.; Wohlfahrt, G.; Vescovo, L. VIS-NIR, red-edge and NIR-shoulder based normalized vegetation indices response to co-varying leaf and Canopy structural traits in heterogeneous grasslands. Remote Sens. 2020, 12, 2254.
  35. Datta, D.; Paul, M.; Murshed, M.; Teng, S.W.; Schmidtke, L. Soil Moisture, Organic Carbon, and Nitrogen Content Prediction with Hyperspectral Data Using Regression Models. Sensors 2022, 22, 7998.
  36. Jackisch, R.; Madriz, Y.; Zimmermann, R.; Pirttijärvi, M.; Saartenoja, A.; Heincke, B.H.; Salmirinne, H.; Kujasalo, J.P.; Andreani, L.; Gloaguen, R. Drone-borne hyperspectral and magnetic data integration: Otanmäki Fe-Ti-V deposit in Finland. Remote Sens. 2019, 11, 2084.
  37. Thiele, S.T.; Bnoulkacem, Z.; Lorenz, S.; Bordenave, A.; Menegoni, N.; Madriz, Y.; Dujoncquoy, E.; Gloaguen, R.; Kenter, J. Mineralogical mapping with accurately corrected shortwave infrared hyperspectral data acquired obliquely from UAVs. Remote Sens. 2021, 14, 5.
  38. Krause, S.; Sanders, T.G.; Mund, J.P.; Greve, K. UAV-based photogrammetric tree height measurement for intensive forest monitoring. Remote Sens. 2019, 11, 758.
  39. Yu, J.W.; Yoon, Y.W.; Baek, W.K.; Jung, H.S. Forest Vertical Structure Mapping Using Two-Seasonal Optic Images and LiDAR DSM Acquired from UAV Platform through Random Forest, XGBoost, and Support Vector Machine Approaches. Remote Sens. 2021, 13, 4282.
  40. Zhang, H.; Bauters, M.; Boeckx, P.; Van Oost, K. Mapping canopy heights in dense tropical forests using low-cost UAV-derived photogrammetric point clouds and machine learning approaches. Remote Sens. 2021, 13, 3777.
  41. Chen, C.; Yang, B.; Song, S.; Peng, X.; Huang, R. Automatic clearance anomaly detection for transmission line corridors utilizing UAV-Borne LIDAR data. Remote Sens. 2018, 10, 613.
  42. Zhang, R.; Yang, B.; Xiao, W.; Liang, F.; Liu, Y.; Wang, Z. Automatic extraction of high-voltage power transmission objects from UAV lidar point clouds. Remote Sens. 2019, 11, 2600.
  43. Alshawabkeh, Y.; Baik, A.; Fallatah, A. As-Textured As-Built BIM Using Sensor Fusion, Zee Ain Historical Village as a Case Study. Remote Sens. 2021, 13, 5135.
  44. Short, N.M. The Landsat Tutorial Workbook: Basics of Satellite Remote Sensing; National Aeronautics and Space Administration, Scientific and Technical Information Branch: Washington, DC, USA, 1982; Volume 1078.
  45. Schowengerdt, R.A. Soft classification and spatial-spectral mixing. In Proceedings of the International Workshop on Soft Computing in Remote Sensing Data Analysis, Milan, Italy, 4–5 December 1995; pp. 4–5.
  46. Long, J.; Shelhamer, E.; Darrell, T. Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 3431–3440.
  47. Badrinarayanan, V.; Kendall, A.; Cipolla, R. Segnet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 2481–2495.
  48. Chen, L.C.; Zhu, Y.; Papandreou, G.; Schroff, F.; Adam, H. Encoder-decoder with atrous separable convolution for semantic image segmentation. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 801–818.
  49. Wang, X.; Kong, T.; Shen, C.; Jiang, Y.; Li, L. Solo: Segmenting objects by locations. In Proceedings of the Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, 23–28 August 2020; Proceedings, Part XVIII 16. Springer: Cham, Switzerland, 2020; pp. 649–665.
  50. Bolya, D.; Zhou, C.; Xiao, F.; Lee, Y.J. Yolact: Real-time instance segmentation. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Seoul, Republic of Korea, 27 October–2 November 2019; pp. 9157–9166.
  51. Zhao, G.; Zhang, W.; Peng, Y.; Wu, H.; Wang, Z.; Cheng, L. PEMCNet: An Efficient Multi-Scale Point Feature Fusion Network for 3D LiDAR Point Cloud Classification. Remote Sens. 2021, 13, 4312.
  52. Harvey, W.; Rainwater, C.; Cothren, J. Direct Aerial Visual Geolocalization Using Deep Neural Networks. Remote Sens. 2021, 13, 4017.
  53. Chollet, F. Xception: Deep learning with depthwise separable convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 1251–1258.
  54. Zhuang, J.; Dai, M.; Chen, X.; Zheng, E. A Faster and More Effective Cross-View Matching Method of UAV and Satellite Images for UAV Geolocalization. Remote Sens. 2021, 13, 3979.
  55. Chen, B.; Chen, Z.; Deng, L.; Duan, Y.; Zhou, J. Building change detection with RGB-D map generated from UAV images. Neurocomputing 2016, 208, 350–364.
  56. Cook, K.L. An evaluation of the effectiveness of low-cost UAVs and structure from motion for geomorphic change detection. Geomorphology 2017, 278, 195–208.
  57. Mesquita, D.B.; dos Santos, R.F.; Macharet, D.G.; Campos, M.F.; Nascimento, E.R. Fully convolutional siamese autoencoder for change detection in UAV aerial images. IEEE Geosci. Remote Sens. Lett. 2019, 17, 1455–1459.
  58. Hastaoğlu, K.Ö.; Gül, Y.; Poyraz, F.; Kara, B.C. Monitoring 3D areal displacements by a new methodology and software using UAV photogrammetry. Int. J. Appl. Earth Obs. Geoinf. 2019, 83, 101916.
  59. Carrivick, J.L.; Smith, M.W.; Quincey, D.J. Structure from Motion in the Geosciences; John Wiley & Sons: Hoboken, NJ, USA, 2016.
  60. Lucieer, A.; Jong, S.M.d.; Turner, D. Mapping landslide displacements using Structure from Motion (SfM) and image correlation of multi-temporal UAV photography. Prog. Phys. Geogr. 2014, 38, 97–116.
  61. Li, M.; Cheng, D.; Yang, X.; Luo, G.; Liu, N.; Meng, C.; Peng, Q. High precision slope deformation monitoring by uav with industrial photogrammetry. IOP Conf. Ser. Earth Environ. Sci. 2021, 636, 012015.
  62. Han, D.; Lee, S.B.; Song, M.; Cho, J.S. Change detection in unmanned aerial vehicle images for progress monitoring of road construction. Buildings 2021, 11, 150.
  63. Huang, R.; Xu, Y.; Hoegner, L.; Stilla, U. Semantics-aided 3D change detection on construction sites using UAV-based photogrammetric point clouds. Autom. Constr. 2022, 134, 104057.
  64. Sanz-Ablanedo, E.; Chandler, J.H.; Rodríguez-Pérez, J.R.; Ordóñez, C. Accuracy of unmanned aerial vehicle (UAV) and SfM photogrammetry survey as a function of the number and location of ground control points used. Remote Sens. 2018, 10, 1606.
  65. Rebelo, C.; Nascimento, J. Measurement of Soil Tillage Using UAV High-Resolution 3D Data. Remote Sens. 2021, 13, 4336.
  66. Almeida, A.; Gonçalves, F.; Silva, G.; Mendonça, A.; Gonzaga, M.; Silva, J.; Souza, R.; Leite, I.; Neves, K.; Boeno, M.; et al. Individual Tree Detection and Qualitative Inventory of a Eucalyptus sp. Stand Using UAV Photogrammetry Data. Remote Sens. 2021, 13, 3655.
  67. Hartwig, M.E.; Ribeiro, L.P. Gully evolution assessment from structure-from-motion, southeastern Brazil. Environ. Earth Sci. 2021, 80, 548.