UASs Application in Viticultural Scenarios

New technologies for the management, monitoring, and control of spatio-temporal crop variability in precision viticulture scenarios are numerous. Remote sensing relies on sensors able to provide useful data for improving management efficiency and optimizing inputs. Unmanned aerial systems (UASs) are the newest and most versatile tools, characterized by high precision and accuracy, flexibility, and low operating costs.

Keywords: UAS; vegetation index; 3D vineyard characterization; canopy height model; precision farming; precision viticulture; remote sensing; sustainability of resources; vineyard detection and segmentation

1. Introduction

Precision agriculture concerns the use of multiple technologies to manage the spatial and temporal variability associated with agricultural production, improving crop performance, economic benefits, and environmental quality by limiting the use of pollutants [1][2][3]. In viticulture, precision agriculture techniques are used to improve the efficient use of inputs (e.g., fertilizers and chemicals), yield forecasting, and selective harvesting for grape quality, and to match the real needs (e.g., nutrients and water) of each plot within the vineyard [4]. New technologies have been developed for vineyard management, monitoring, and control of vine growth. Remote and proximal sensors have become reliable instruments for assessing overall vineyard status, essential to describe vineyards’ spatial variability at high resolution and to give recommendations that improve management efficiency [5].

In recent decades, the development of aircraft and satellite platform technologies for remote sensing has increased the spatial resolution, temporal availability, and capability to describe plants’ biophysical features [6][7]. Aircraft remote sensing campaigns can be planned with greater flexibility, but they are difficult and expensive [8]. Satellite image acquisition of large areas saves considerable time, but its resolution is low and inadequate for precision viticulture (PV) [9]. Possible cloud cover, combined with fixed acquisition times (the time needed for the satellite to complete its orbit and return to the field area), can limit the monitoring process and prevent early detection during specific phenological phases of the crop. Di Gennaro et al. [10] demonstrated the effectiveness of the spatial resolution provided by Sentinel-2 satellite imagery on an overhead trellis vineyard, as already demonstrated for other permanent crops. However, due to the discontinuous nature of vine rows, their moderate coverage, and the influence of inter-row soil, background, and shade, vineyards pose a challenge for remote sensing analysis: remote sensing images must be processed to separate the canopy pixels from the background [11].

Among all the remote sensing technologies for detecting spatial and temporal heterogeneity, unmanned aerial systems (UASs) are the newest tools and likely the most useful in terms of high accuracy, flexibility, and low operational costs [12]. UASs can cover large rural areas much faster than people scouting on the ground, making it easier and more efficient to detect problems. UASs are often combined with imaging sensors, which allow the acquisition of images at higher spatial resolutions than those offered by satellites. Post-processing techniques combined with machine learning tools have evolved to the point that the visual indications contained in an image can be extracted and transformed into useful information for farm management [13]. Poor weather conditions reduce the radiometric quality of the images, resulting in less accurate and precise surface reconstruction. Reduced light conditions affect the stability of image features and increase errors in photo alignment and point cloud creation [14]. Calibration targets and post-processing techniques help standardize photo light conditions, especially under cloudy skies and low light [15]. UAS remote sensing is a useful option for crop mapping even under cloudy conditions, when satellite or airborne remote sensing is inoperable. Remote sensing currently accounts for the majority of the operations performed with agricultural UASs [16]. In addition to applications involving the use of sensors and the extraction of useful information, UASs are applied, and are under study, for various other types of operations, such as crop spraying [17][18][19][20][21][22], or combined with wireless sensor network (WSN) ground monitoring systems [23].

2. Examples of Unmanned Aerial Systems (UASs) Applications in Viticultural Scenarios

2.1. Row Segmentation and Crop Feature Detection Techniques

The detection of intra-vineyard variability for site-specific interventions has always been a priority for PV, allowing grape growers to manage vineyards more efficiently and pursue better grape production and quality. Satellite technology, due to its coarse ground resolution, is not always able to guarantee sufficient detail to detect and differentiate the vegetation contours of vine rows. UASs, in contrast, show high potential thanks to the high resolution of their sensors, with a ground sampling distance (GSD) often close to 1 cm. Vegetation indices, usually obtained by arithmetic spectral band combinations, are useful tools for vegetation characterization [24]. Spectral information usually derives from visible red–green–blue (RGB), multispectral, hyperspectral, and thermal sensors mounted on board UASs [25][26][27]. Many vegetation indices have been used and compared for estimating canopy biophysical properties, including leaf area index (LAI), productivity, and biomass [28][29]. Matese et al. [30] proved the effectiveness of an open-source, low-cost UAS in real field conditions for mapping vigor areas within vineyards. RGB images are useful tools for monitoring vineyard spatial variability, but extracting relevant information from them requires an accurate segmentation. Manual segmentation (e.g., in a geographic information system, GIS) of RGB images is laborious and time-consuming, and must account for canopy accuracy, shadow effects, and different soil conditions in the inter-rows. Starting from ultra-high-resolution RGB imagery obtained from a UAS, Poblete-Echeverría et al. [31] presented a vine canopy detection and segmentation approach using four different classification methods (K-means, artificial neural networks, random forest, and spectral indices). The results showed that the 2G_RBi spectral index (which emphasizes the divergence of the red and blue bands from the green band after normalizing by the absolute brightness of the channels), complemented with Otsu’s method for thresholding [32], was the best option in terms of performance for vine canopy detection. This method is automatic and easy to apply, since it does not need specific software to compute the indices.
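
As an illustration of this kind of index-plus-threshold segmentation, the sketch below computes an excess-green style index on brightness-normalized channels and thresholds it with Otsu’s method; the exact 2G_RBi formulation and the array layout are assumptions, not the published implementation of Poblete-Echeverría et al. [31].

```python
import numpy as np
from skimage.filters import threshold_otsu

def vine_canopy_mask(rgb):
    """Segment vine canopy in an RGB orthomosaic tile.

    rgb: float array (H, W, 3) with values in [0, 1].
    An excess-green style index (2G - R - B on brightness-normalized
    channels, assumed here as a stand-in for 2G_RBi) is thresholded
    with Otsu's method to separate canopy from soil and shadow.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    brightness = r + g + b + 1e-6                  # avoid division by zero
    index = (2 * g - r - b) / brightness           # normalize by channel brightness
    t = threshold_otsu(index)                      # automatic, unsupervised threshold
    return index > t                               # True = vine canopy pixel
```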

High-resolution UAS images represent a challenge for classification due to their higher intra-class spectral variability. To handle this spectral variability, object-based image analysis (OBIA) has emerged in remote sensing segmentation applications [33][34]. The research carried out by Jiménez-Brenes et al. [35] aimed to develop a rapid mapping technique and obtain management maps to fight Cynodon dactylon (bermudagrass, a typical vineyard weed). Starting from RGB and red–green–near-infrared (RGNIR) images, the team worked on the optimal spectral vegetation index for classifying bermudagrass, grapevine, and bare soil areas through an automatic algorithm, and on the design of site-specific management maps for weed control. The geometric characteristics of the canopy are used in agriculture as a proxy for pruning, pest effects on crops, or fruit detection [36], but collecting these data at the field scale is time-consuming and offers uncertain results. Despite the great variety of technologies used to characterize the 3D structure of plants (radar, digital photogrammetric techniques, stereo images, ultrasonic sensors, and light detection and ranging sensors), many of them have aspects that limit their use: most are expensive, and they are challenging to apply over large spatial extents. The novelty of the work by Mesas-Carrascosa et al. [37] lies in the possibility of applying vegetation indices to RGB point clouds for the automatic detection and classification of vegetation, and of determining grapevine height using the soil points as a reference. This automatic process, which requires no training parameters, avoids errors introduced by manual intervention in separating the point classes.
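
The sketch below illustrates the general idea of classifying a colored point cloud with a green-based color index and deriving vine height relative to local ground points; the grid-cell approach, the index threshold, and the point-cloud layout are illustrative assumptions rather than the pipeline of Mesas-Carrascosa et al. [37].

```python
import numpy as np

def vine_heights(points, rgb, index_threshold=0.1, cell=0.5):
    """Rough per-cell vine height from a colored point cloud.

    points : (N, 3) array of x, y, z coordinates (metres).
    rgb    : (N, 3) array of point colors in [0, 1].
    A point is labelled vegetation when its excess-green index exceeds
    `index_threshold`; the remaining points are treated as ground.
    Height is the difference between the highest vegetation point and
    the lowest ground point inside each `cell` x `cell` metre grid cell.
    """
    exg = 2 * rgb[:, 1] - rgb[:, 0] - rgb[:, 2]
    veg = exg > index_threshold

    ij = np.floor(points[:, :2] / cell).astype(int)   # grid cell index per point
    heights = {}
    for k in range(points.shape[0]):
        key = (ij[k, 0], ij[k, 1])
        zmin, zmax = heights.get(key, (np.inf, -np.inf))
        if veg[k]:
            zmax = max(zmax, points[k, 2])
        else:
            zmin = min(zmin, points[k, 2])
        heights[key] = (zmin, zmax)

    # Canopy height = top of vegetation minus local ground level
    return {key: zmax - zmin
            for key, (zmin, zmax) in heights.items()
            if np.isfinite(zmin) and np.isfinite(zmax)}
```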

As mentioned before, the extraction of pure vine pixels (i.e., the pixels that compose the leaf wall of the vines) is indispensable to achieve effective, good-quality vineyard maps for site-specific management [38][39]. Comba et al. [40] designed a new methodology, consisting of three main steps based on dynamic segmentation, to identify vine rows from UAS aerial images even in the presence of low illumination, inter-row grassing, tree shadows, or other disturbance elements. The process works without any user intervention and with a limited number of calibration parameters. The information obtained from this approach can be used in PV scenarios to obtain vigor and prescription maps for crop management, or inter-row route tracking for unmanned ground vehicles (UGVs). Nolan et al. [41] described an automated algorithm, applied to a high-resolution aerial orthomosaic, for unsupervised detection and delineation of vine rows. The algorithm takes advantage of “skeletonization” techniques, based on the extraction of a simplified shape (skeleton) of an object, to reduce the complexity of agricultural scenes into a collection of skeletal descriptors. Thanks to a series of geometric and spatial constraints applied to each skeleton, the algorithm accurately identifies and segments each vine row.
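
A minimal sketch of the skeletonization step is shown below, reducing a binary canopy mask to one-pixel-wide row descriptors with scikit-image; the geometric and spatial constraints that Nolan et al. [41] apply afterwards are not reproduced, and the minimum-size threshold is an assumption.

```python
import numpy as np
from skimage.morphology import skeletonize, remove_small_objects
from skimage.measure import label, regionprops

def row_skeletons(canopy_mask, min_row_pixels=500):
    """Reduce a binary canopy mask to labelled row skeletons.

    canopy_mask : boolean (H, W) array, True where vine canopy is detected.
    Returns a labelled skeleton image plus the major-axis orientation of
    each skeletal component, which can later be used to discard blobs
    that do not follow the dominant row direction.
    """
    clean = remove_small_objects(canopy_mask, min_size=min_row_pixels)
    skeleton = skeletonize(clean)                    # 1-pixel-wide row descriptors
    labelled = label(skeleton)
    orientations = {p.label: p.orientation for p in regionprops(labelled)}
    return labelled, orientations
```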

Pádua et al. [42] showed a method to automatically estimate and extract the canopies of Portuguese vineyards, combining vegetation indices and digital elevation models (DEM) derived from UAS high-resolution images to differentiate between vine canopies and inter-row vegetation cover. The method proved effective when applied with consumer-grade sensors carried by UASs, and it also proved a fast and efficient way to extract vineyard information, enabling the mapping of vineyard plots for PV management tasks. In the paper by Cinat et al. [43], three algorithms based on HSV (hue, saturation, value), DEM, and K-means were applied to RGB and RGNIR UAS imagery to perform unsupervised canopy segmentation, without human support, over three scenarios derived from two vineyards. The P18 scenario corresponds to survey operations conducted in 2018 on 1 ha of a commercial Barbera cv. vineyard, while the M17 and M18 scenarios refer to flights performed in 2017 and 2018 on a 1.4 ha Sangiovese cv. vineyard. The two vineyards differ in row and slope orientation and in intra-row and inter-row spacing. The research team tested the ability of the algorithms to identify grapevines without human supervision by introducing estimation indices, which quantify each algorithm’s tendency to over- or under-estimate vine canopies. The three algorithms showed different abilities to estimate vines but, in general, the HSV-based and DEM algorithms were comparable in terms of computation time, whereas the computational demand of the K-means algorithm increased with DEM quality.
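
As a rough illustration of the K-means option, the sketch below clusters RGB pixel values without supervision and picks the greenest cluster as canopy; the cluster count and the rule for selecting the canopy cluster are assumptions and do not reproduce the implementation of Cinat et al. [43].

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_canopy_mask(rgb, n_clusters=3, random_state=0):
    """Unsupervised canopy mask via K-means on RGB pixel values.

    rgb: float array (H, W, 3) in [0, 1].
    Pixels are clustered in color space; the cluster whose centroid has
    the largest green dominance (G - (R + B) / 2) is taken as canopy.
    """
    h, w, _ = rgb.shape
    pixels = rgb.reshape(-1, 3)
    km = KMeans(n_clusters=n_clusters, n_init=10,
                random_state=random_state).fit(pixels)
    centroids = km.cluster_centers_
    green_dominance = centroids[:, 1] - 0.5 * (centroids[:, 0] + centroids[:, 2])
    canopy_cluster = int(np.argmax(green_dominance))
    return (km.labels_ == canopy_cluster).reshape(h, w)
```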

While row identification from UAS images has seen relevant development in recent years, a missing-plant detection method was not available until the study by Primicerio et al. [44], who introduced a new methodology for segmenting vines into virtual shapes, each representing a real plant. They extracted an extensive set of features, coupled them to a statistical classifier, and evaluated its performance in detecting missing plants within the parcels. Baofeng et al. [45] instead showed the possibility of obtaining accurate information about affected or missing grapevines from a digital surface model (DSM). Their analysis started with a three-dimensional (3D) reconstruction from RGB images collected by a UAS, using the structure from motion (SfM) technique to obtain the DSM. A different approach, followed by Pichon et al. [46], did not involve computer image analysis techniques and aimed at identifying the relevant information that growers and advisers can extract from UAS images of the vineyard. The proposed methodology demonstrated that experts can extract most of the information on grapevine status from UAS-based visible images, this information being of great interest throughout the growing cycle of the vine, particularly for advisers, as support to drive management strategies.
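
To make the DSM-based idea concrete, the sketch below flags candidate missing plants by sampling a canopy height model at expected plant positions along a row; the row geometry, plant spacing, and height threshold are hypothetical parameters, and the logic is a generic illustration rather than the method of Primicerio et al. [44] or Baofeng et al. [45].

```python
import numpy as np

def missing_plants(chm, row_start, row_end, plant_spacing, gsd, height_threshold=0.3):
    """Flag likely missing plants along one vine row of a canopy height model.

    chm            : 2D array, canopy height above ground (metres).
    row_start/end  : (row, col) pixel coordinates of the row endpoints.
    plant_spacing  : expected distance between vines (metres).
    gsd            : ground sampling distance of the CHM (metres per pixel).
    Positions where the local canopy height stays below `height_threshold`
    are reported as candidate missing plants.
    """
    start, end = np.asarray(row_start, float), np.asarray(row_end, float)
    row_length = np.linalg.norm(end - start) * gsd
    n_plants = int(row_length // plant_spacing)
    missing = []
    for i in range(n_plants):
        pos = start + (end - start) * (i + 0.5) / n_plants
        r, c = int(round(pos[0])), int(round(pos[1]))
        window = chm[max(r - 2, 0):r + 3, max(c - 2, 0):c + 3]  # small neighbourhood
        if window.size and window.max() < height_threshold:
            missing.append((r, c))
    return missing
```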

2.2. Vineyard Remote Analysis for Variability Monitoring

PV can be defined as the monitoring and management of spatial variability in the physical, chemical, and biological variables related to vineyard productivity [47]. Early work on UAS platforms and on-board sensors for data collection was carried out by Turner et al. [48], showing the potential of UAS technology to provide “on-demand” data. They analyzed the algorithms used in data processing and in the orthorectification process, as well as the vegetation indices used to evaluate differences within the vineyard images. The results highlighted the potential of UAS multi-sensor systems in PV and their versatility, enhanced by the possibility of collecting data sets “on-demand” with a temporal resolution that spans the critical times of the crop growing season. UASs permit collecting imagery at much higher spatial resolution and investigating finer spatial variability inside the vineyard compared with satellites and aircraft [49]. Unlike satellite technology, limited by unfavorable revisit times and orbit coverage patterns [50], UAS close-range photogrammetry represents an efficient method for continuously collecting information [51].

Matese et al. [52] introduced a new technique to evaluate the spatial distribution of vine vigor and phenolic maturity. A normalized difference vegetation index (NDVI) map was obtained by a high-resolution multispectral camera mounted on a UAS, while the spatial variability of grape anthocyanin content was detected in situ by evaluating the ANTH_R and ANTH_RG indices with a fluorescence-based sensor (Multiplex™). The two techniques appeared suitable for comparing vine-related information on a relatively large scale. The research by Zarco-Tejada et al. [53] showed the feasibility of mapping leaf carotenoid concentration from high-resolution hyperspectral imagery; the R515/R570 index was explored for vineyards in this study. The PROSPECT-5 leaf radiative transfer model was linked to the SAILH and FLIGHT canopy-level radiative transfer models to simulate pure vine reflectance without soil and shadow effects, since the UAS hyperspectral imagery enabled targeting pure vines. Primicerio et al. [54] used a UAS as a tool to combine high spatial resolution images, quick turnaround times, and low operational costs for vegetation monitoring, providing low-cost approaches to meet the critical requirements of spatial, spectral, and temporal resolution. A low-cost, open-source agro-meteorological monitoring system was designed and developed, and its placement and topology were optimized using a set of UAS-acquired multispectral images. Mathews [55] captured aerial images of a Texas vineyard at post-flowering, veraison, and harvest stages using digital cameras mounted on board a UAS. The images were processed to generate reflectance orthophotos and then segmented to extract canopy area and NDVI-based canopy density. Derived canopy area and density values were compared to the number of clusters, cluster size, and yield to explore correlations. Unlike the derived canopy area, the NDVI-based canopy density exhibited no significant relationships because of the radiometric inaccuracy of the sensors. A vine performance index (VPI) was calculated to map spatial variation in canopy vigor over the entire growing season. Rey-Caramés et al. [56] used multispectral and spectral indices to assess the spatial variability of vegetative, productive, and berry composition variables (the latter obtained by the SFR_RAD and NBI_GAD Multiplex™ indices) within a vineyard. The correlations between the spectral indices and the field variables were significant but moderate, and the pattern of the spectral indices agreed with that of the vegetative variables and mean cluster weight. The results proved the utility of multispectral imagery acquired from a UAS to delineate homogeneous zones within the vineyard, allowing the grape grower to carry out specific management of each subarea. The aim of the work by Matese et al. [57] was to evaluate different sources of images and processing methodologies to describe the spatial variability of spectral-based and canopy-based vegetation indices within a vineyard, and their relationship with productive and qualitative vine parameters. Comparisons between image-derived indices from Sentinel-2 NDVI, unfiltered and filtered UAS NDVI, and agronomic features were performed. UAS images allow the calculation of new non-spectral indices based on canopy architecture, which provide additional useful information to growers with regard to within-vineyard management zone delineation. Caruso et al. [58] identified three sites of different vine vigor in a mature vineyard to test the potential of visible–near infrared (VIS-NIR) spectral information acquired from a UAS in estimating the LAI, leaf chlorophyll, pruning weight, canopy height, and canopy volume of grapevines. They showed that the combined use of VIS-NIR cameras and UASs is a rapid and reliable technique to determine the canopy structure and LAI of grapevine. Romboli et al. [59] focused on the impact of vine vigor on Sangiovese grapes and wines, applying a high-resolution remote sensing technique by a UAS platform to identify vigor at the single-vine level. The test confirmed the ability of UAS technology to evaluate vigor variability inside the vineyard and the influence of vigor on flavonoid compounds as a function of bunch position in the canopy. Matese and Di Gennaro [60] described the implementation of a multisensor UAS system capable of flying with three sensors simultaneously to perform different monitoring options. The vineyard variability was assessed in terms of characterization of vine vigor status using a multispectral camera, leaf temperature with a thermal camera, and an innovative approach to missing plant analysis with a high-spatial-resolution RGB camera.
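
Since most of the studies above rely on NDVI maps, a minimal sketch of the index computation from co-registered red and near-infrared reflectance bands is given below; the band variable names and the no-data handling are assumptions.

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red) from co-registered reflectance bands."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    denom[denom == 0] = 1e-6          # avoid division by zero on no-data pixels
    return np.clip((nir - red) / denom, -1.0, 1.0)
```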

Pádua et al. [61] developed an analysis methodology useful to assist decision-making processes in viticulture. They employed UASs to acquire RGB, multispectral, and thermal aerial imagery in a vineyard, enabling the multi-temporal characterization of vineyard development throughout a season through the computation of the NDVI, crop surface models (CSM), and the crop water stress index (CWSI). Vigor maps were computed in three ways: considering the whole vineyard, considering only automatically detected grapevine vegetation, and considering grapevine vegetation after applying a normalization process. Results showed that vigor maps considering only grapevine vegetation provided a more accurate representation of the vineyard variability, revealing significant spatial associations through a multi-temporal analysis of the vigor maps and through their comparison with both height and water stress estimations. The objective of the work by Matese et al. [62] was to evaluate the performance of statistical methods for comparing different maps of a vineyard, some derived from UAS-acquired imagery and some from in situ ground characterization. The team proved that these methods, which consider the spatial structure of the data to compare ground autocorrelated data with spectral and geometric information derived from UAS-acquired imagery, are highly appropriate and would lead winegrowers to implement PV as a management tool. Pádua et al. [63] developed a multi-temporal vineyard plot analysis method at the grapevine scale using RGB, multispectral, and thermal infrared (TIR) sensors, enabling the estimation of biophysical and geometrical parameters and the detection of missing grapevine plants. A high overall agreement was obtained concerning the number of grapevines present in each row and the identification of individual grapevines. Moreover, the extracted individual grapevine parameters enabled the assessment of vineyard variability in each epoch and the monitoring of its multi-temporal evolution.
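
The crop water stress index referenced above is commonly expressed as canopy temperature normalized between a non-water-stressed (wet) and a non-transpiring (dry) reference; the sketch below assumes such reference temperatures are available, which may differ from how Pádua et al. [61] derive them.

```python
import numpy as np

def cwsi(canopy_temp, t_wet, t_dry):
    """Crop water stress index from canopy temperature (degrees C).

    canopy_temp : array of canopy temperatures from the thermal orthomosaic.
    t_wet       : temperature of a fully transpiring (non-stressed) reference.
    t_dry       : temperature of a non-transpiring (fully stressed) reference.
    CWSI = (Tc - Twet) / (Tdry - Twet), bounded to [0, 1].
    """
    index = (canopy_temp - t_wet) / (t_dry - t_wet)
    return np.clip(index, 0.0, 1.0)
```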

References

  1. Pierce, F.J.; Nowak, P. Aspects of Precision Agriculture. In Advances in Agronomy; Elsevier: Amsterdam, The Netherlands, 1999; Volume 67, pp. 1–85. ISBN 978-0-12-000767-7.
  2. Blackmore, S. The Role of Yield Maps in Precision Farming. Ph.D. Thesis, Cranfield University, Cranfield, UK, 2003; p. 171.
  3. Sudduth, K.A. Engineering Technologies for Precision Farming. In International Seminar on Agricultural Mechanization Technology for Precision Farming; Rural Development Administration: Suwon, Korea, 1999; p. 16.
  4. Arnó, J.; Martínez Casasnovas, J.A.; Ribes Dasi, M.; Rosell, J.R. Review. Precision Viticulture. Research Topics, Challenges and Opportunities in Site-Specific Vineyard Management. Span. J. Agric. Res. 2009, 7, 779.
  5. Matese, A.; Di Gennaro, S.F. Technology in Precision Viticulture: A State of the Art Review. Int. J. Wine Res. 2015, 69.
  6. Karakizi, C.; Oikonomou, M.; Karantzalos, K. Spectral Discrimination and Reflectance Properties of Various Vine Varieties from Satellite, UAV and Proximate Sensors. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, XL-7/W3, 31–37.
  7. Borgogno-Mondino, E.; Lessio, A.; Tarricone, L.; Novello, V.; de Palma, L. A Comparison between Multispectral Aerial and Satellite Imagery in Precision Viticulture. Precis. Agric. 2018, 19, 195–217.
  8. Matese, A.; Toscano, P.; Di Gennaro, S.; Genesio, L.; Vaccari, F.; Primicerio, J.; Belli, C.; Zaldei, A.; Bianconi, R.; Gioli, B. Intercomparison of UAV, Aircraft and Satellite Remote Sensing Platforms for Precision Viticulture. Remote Sens. 2015, 7, 2971–2990.
  9. Anastasiou, E.; Balafoutis, A.; Darra, N.; Psiroukis, V.; Biniari, A.; Xanthopoulos, G.; Fountas, S. Satellite and Proximal Sensing to Estimate the Yield and Quality of Table Grapes. Agriculture 2018, 8, 94.
  10. Di Gennaro, S.; Dainelli, R.; Palliotti, A.; Toscano, P.; Matese, A. Sentinel-2 Validation for Spatial Variability Assessment in Overhead Trellis System Viticulture Versus UAV and Agronomic Data. Remote Sens. 2019, 11, 2573.
  11. Dobrowski, S.Z.; Ustin, S.L.; Wolpert, J.A. Remote Estimation of Vine Canopy Density in Vertically Shoot-Positioned Vineyards: Determining Optimal Vegetation Indices. Aust. J. Grape Wine Res. 2002, 8, 117–125.
  12. Zhang, C.; Kovacs, J.M. The Application of Small Unmanned Aerial Systems for Precision Agriculture: A Review. Precis. Agric. 2012, 13, 693–712.
  13. Barbedo, J.G.A. A Review on the Use of Unmanned Aerial Vehicles and Imaging Sensors for Monitoring and Assessing Plant Stresses. Drones 2019, 3, 40.
  14. Wierzbicki, D.; Kedzierski, M.; Fryskowska, A. Assessment of the Influence of UAV Image Quality on the Orthophoto Production. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 8.
  15. Dandois, J.P.; Olano, M.; Ellis, E.C. Optimal Altitude, Overlap, and Weather Conditions for Computer Vision UAV Estimates of Forest Structure. Remote Sens. 2015, 7, 13895–13920.
  16. Ju, C.; Son, H. Multiple UAV Systems for Agricultural Applications: Control, Implementation, and Evaluation. Electronics 2018, 7, 162.
  17. Xue, X.; Lan, Y.; Sun, Z.; Chang, C.; Hoffmann, W.C. Develop an Unmanned Aerial Vehicle Based Automatic Aerial Spraying System. Comput. Electron. Agric. 2016, 128, 58–66.
  18. Wang, C.; He, X.; Wang, X.; Wang, Z.; Wang, S.; Li, L.; Bonds, J.; Herbst, A.; Wang, Z. Method and Distribution Characteristics of Spatial Pesticide Spraying Deposition Quality Balance for Unmanned Aerial Vehicle. Int. J. Agric. Biol. Eng. 2018, 11, 18–26.
  19. Hunter, J.E.; Gannon, T.W.; Richardson, R.J.; Yelverton, F.H.; Leon, R.G. Integration of Remote-weed Mapping and an Autonomous Spraying Unmanned Aerial Vehicle for Site-specific Weed Management. Pest. Manag. Sci. 2020, 76, 1386–1392.
  20. Giles, D.; Billing, R. Deployment and Performance of a Uav for Crop Spraying. Chem. Eng. Trans. 2015, 44, 307–312.
  21. Sarri, D.; Martelloni, L.; Rimediotti, M.; Lisci, R.; Lombardo, S.; Vieri, M. Testing a Multi-Rotor Unmanned Aerial Vehicle for Spray Application in High Slope Terraced Vineyard. J. Agric. Eng. 2019, 50, 38–47.
  22. Sassu, A.; Ghiani, L.; Pazzona, A.; Gambella, F. Development and Implementation of an Ultra-Low Volume (ULV) Spraying Equipment Installed on a Commercial UAV. In Innovative Biosystems Engineering for Sustainable Agriculture, Forestry and Food Production: International Mid-Term Conference 2019 of the Italian Association of Agricultural Engineering (AIIA); Coppola, A., Di Renzo, G.C., Altieri, G., D’Antonio, P., Eds.; Lecture Notes in Civil Engineering; Springer International Publishing: Cham, Switzerland, 2020; Volume 67, pp. 563–571. ISBN 978-3-030-39298-7.
  23. Valente, J.; Sanz, D.; Barrientos, A.; del Cerro, J.; Ribeiro, Á.; Rossi, C. An Air-Ground Wireless Sensor Network for Crop Monitoring. Sensors 2011, 11, 6088–6108.
  24. Huete, A.R. A Soil-Adjusted Vegetation Index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
  25. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412.
  26. Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating Multispectral Images and Vegetation Indices for Precision Farming Applications from UAV Images. Remote Sens. 2015, 7, 4026–4047.
  27. Khanal, S.; Fulton, J.; Shearer, S. An Overview of Current and Potential Applications of Thermal Remote Sensing in Precision Agriculture. Comput. Electron. Agric. 2017, 139, 22–32.
  28. Carlson, T.N.; Ripley, D.A. On the Relation between NDVI, Fractional Vegetation Cover, and Leaf Area Index. Remote Sens. Environ. 1997, 62, 241–252.
  29. Jiang, Z.; Huete, A.R.; Chen, J.; Chen, Y.; Li, J.; Yan, G.; Zhang, X. Analysis of NDVI and Scaled Difference Vegetation Index Retrievals of Vegetation Fraction. Remote Sens. Environ. 2006, 101, 366–378.
  30. Matese, A.; Primicerio, J.; Di Gennaro, F.; Fiorillo, E.; Vaccari, F.P.; Genesio, L. Development And Application Of An Autonomous And Flexible Unmanned Aerial Vehicle For Precision Viticulture. Acta Hortic. 2013, 63–69.
  31. Poblete-Echeverría, C.; Olmedo, G.; Ingram, B.; Bardeen, M. Detection and Segmentation of Vine Canopy in Ultra-High Spatial Resolution RGB Imagery Obtained from Unmanned Aerial Vehicle (UAV): A Case Study in a Commercial Vineyard. Remote Sens. 2017, 9, 268.
  32. Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66.
  33. Torres-Sánchez, J.; López-Granados, F.; Peña, J.M. An Automatic Object-Based Method for Optimal Thresholding in UAV Images: Application for Vegetation Detection in Herbaceous Crops. Comput. Electron. Agric. 2015, 114, 43–52.
  34. Blaschke, T. Object Based Image Analysis for Remote Sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16.
  35. Jiménez-Brenes, F.M.; López-Granados, F.; Torres-Sánchez, J.; Peña, J.M.; Ramírez, P.; Castillejo-González, I.L.; de Castro, A.I. Automatic UAV-Based Detection of Cynodon Dactylon for Site-Specific Vineyard Management. PLoS ONE 2019, 14, e0218132.
  36. Johansen, K.; Raharjo, T.; McCabe, M. Using Multi-Spectral UAV Imagery to Extract Tree Crop Structural Properties and Assess Pruning Effects. Remote Sens. 2018, 10, 854.
  37. Mesas-Carrascosa, F.-J.; de Castro, A.I.; Torres-Sánchez, J.; Triviño-Tarradas, P.; Jiménez-Brenes, F.M.; García-Ferrer, A.; López-Granados, F. Classification of 3D Point Clouds Using Color Vegetation Indices for Precision Viticulture and Digitizing Applications. Remote Sens. 2020, 12, 317.
  38. Smit, J.L.; Sithole, G.; Strever, A.E. Vine Signal Extraction—an Application of Remote Sensing in Precision Viticulture. S. Afr. J. Enol. Vitic. 2016, 31.
  39. Puletti, N.; Perria, R.; Storchi, P. Unsupervised Classification of Very High Remotely Sensed Images for Grapevine Rows Detection. Eur. J. Remote Sens. 2014, 47, 45–54.
  40. Comba, L.; Gay, P.; Primicerio, J.; Ricauda Aimonino, D. Vineyard Detection from Unmanned Aerial Systems Images. Comput. Electron. Agric. 2015, 114, 78–87.
  41. Nolan, A.P.; Park, S.; O’Connell, M.; Fuentes, S.; Ryu, D.; Chung, H. Automated Detection and Segmentation of Vine Rows Using High Resolution UAS Imagery in a Commercial Vineyard. In Proceedings of the MODSIM2015, 21st International Congress on Modelling and Simulation; Modelling and Simulation Society of Australia and New Zealand: Gold Coast, Australia, 25–29 November 2015.
  42. Pádua, L.; Marques, P.; Hruška, J.; Adão, T.; Bessa, J.; Sousa, A.; Peres, E.; Morais, R.; Sousa, J.J. Vineyard Properties Extraction Combining UAS-Based RGB Imagery with Elevation Data. Int. J. Remote Sens. 2018, 39, 5377–5401.
  43. Cinat, P.; Di Gennaro, S.F.; Berton, A.; Matese, A. Comparison of Unsupervised Algorithms for Vineyard Canopy Segmentation from UAV Multispectral Images. Remote Sens. 2019, 11, 1023.
  44. Primicerio, J.; Caruso, G.; Comba, L.; Crisci, A.; Gay, P.; Guidoni, S.; Genesio, L.; Ricauda Aimonino, D.; Vaccari, F.P. Individual Plant Definition and Missing Plant Characterization in Vineyards from High-Resolution UAV Imagery. Eur. J. Remote Sens. 2017, 50, 179–186.
  45. Baofeng, S.; Jinru, X.; Chunyu, X.; Yulin, F.; Yuyang, S.; Fuentes, S. Digital Surface Model Applied to Unmanned Aerial Vehicle Based Photogrammetry to Assess Potential Biotic or Abiotic Effects on Grapevine Canopies. Int. J. Agric. Biol. Eng. 2016, 9, 12.
  46. Pichon, L.; Leroux, C.; Macombe, C.; Taylor, J.; Tisseyre, B. What Relevant Information Can Be Identified by Experts on Unmanned Aerial Vehicles’ Visible Images for Precision Viticulture? Precis. Agric. 2019, 20, 278–294.
  47. Hall, A.; Lamb, D.W.; Holzapfel, B.; Louis, J. Optical Remote Sensing Applications in Viticulture—a Review. Aust. J. Grape Wine Res. 2002, 8, 36–47.
  48. Turner, D.; Lucieer, A.; Watson, C. Development of an Unmanned Aerial Vehicle (UAV) for Hyper Resolution Vineyard Mapping Based on Visible, Multispectral, and Thermal Imagery. In Proceedings of the 34th International symposium on remote sensing of environment, Sydney, Australia, 10–15 April 2011; p. 4.
  49. Schut, A.G.T.; Traore, P.C.S.; Blaes, X.; de By, R.A. Assessing Yield and Fertilizer Response in Heterogeneous Smallholder Fields with UAVs and Satellites. Field Crops Res. 2018, 221, 98–107.
  50. Berni, J.; Zarco-Tejada, P.J.; Suarez, L.; Fereres, E. Thermal and Narrowband Multispectral Remote Sensing for Vegetation Monitoring From an Unmanned Aerial Vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738.
  51. Torres-Sánchez, J.; López-Granados, F.; Borra-Serrano, I.; Peña, J.M. Assessing UAV-Collected Image Overlap Influence on Computation Time and Digital Surface Model Accuracy in Olive Orchards. Precis. Agric. 2018, 19, 115–133.
  52. Matese, A.; Capraro, F.; Primicerio, J.; Gualato, G.; Gennaro, S.F.D.; Agati, G. Mapping of Vine Vigor by UAV and Anthocyanin Content by a Non- Destructive Fluorescence Technique. In Precision Agriculture’13; Wageningen Academic Publishers: Wageningen, The Netherlands, 2013.
  53. Zarco-Tejada, P.J.; Guillén-Climent, M.L.; Hernández-Clemente, R.; Catalina, A.; González, M.R.; Martín, P. Estimating Leaf Carotenoid Content in Vineyards Using High Resolution Hyperspectral Imagery Acquired from an Unmanned Aerial Vehicle (UAV). Agric. For. Meteorol. 2013, 171–172, 281–294.
  54. Primicerio, J.; Matese, A.; Gennaro, S.F.D.; Albanese, L.; Guidoni, S.; Gay, P. Development of an Integrated, Low-Cost and Open-Source System for Precision Viticulture: From UAV to WSN. In Proceedings of the EFITA-WCCA-CIGR Conference “Sustainable Agriculture through ICT Innovation”, Turin, Italy, 24–27 June 2013.
  55. Mathews, A.J. Object-Based Spatiotemporal Analysis of Vine Canopy Vigor Using an Inexpensive Unmanned Aerial Vehicle Remote Sensing System. J. Appl. Remote Sens. 2014, 8, 085199.
  56. Rey-Caramés, C.; Diago, M.; Martín, M.; Lobo, A.; Tardaguila, J. Using RPAS Multi-Spectral Imagery to Characterise Vigour, Leaf Development, Yield Components and Berry Composition Variability within a Vineyard. Remote Sens. 2015, 7, 14458–14481.
  57. Matese, A.; Di Gennaro, S.F.; Miranda, C.; Berton, A.; Santesteban, L.G. Evaluation of Spectral-Based and Canopy-Based Vegetation Indices from UAV and Sentinel 2 Images to Assess Spatial Variability and Ground Vine Parameters. Adv. Anim. Biosci. 2017, 8, 817–822.
  58. Caruso, G.; Tozzini, L.; Rallo, G.; Primicerio, J.; Moriondo, M.; Palai, G.; Gucci, R. Estimating Biophysical and Geometrical Parameters of Grapevine Canopies (‘Sangiovese’) by an Unmanned Aerial Vehicle (UAV) and VIS-NIR Cameras. VITIS J. Grapevine Res. 2017, 63–70.
  59. Romboli, Y.; Di Gennaro, S.F.; Mangani, S.; Buscioni, G.; Matese, A.; Genesio, L.; Vincenzini, M. Vine Vigour Modulates Bunch Microclimate and Affects the Composition of Grape and Wine Flavonoids: An Unmanned Aerial Vehicle Approach in a Sangiovese Vineyard in Tuscany: Vine Vigour Affects Grape and Wine Flavonoids. Aust. J. Grape Wine Res. 2017, 23, 368–377.
  60. Matese, A.; Di Gennaro, S. Practical Applications of a Multisensor UAV Platform Based on Multispectral, Thermal and RGB High Resolution Images in Precision Viticulture. Agriculture 2018, 8, 116.
  61. Pádua, L.; Marques, P.; Adão, T.; Guimarães, N.; Sousa, A.; Peres, E.; Sousa, J.J. Vineyard Variability Analysis through UAV-Based Vigour Maps to Assess Climate Change Impacts. Agronomy 2019, 9, 581.
  62. Matese, A.; Di Gennaro, S.F.; Santesteban, L.G. Methods to Compare the Spatial Variability of UAV-Based Spectral and Geometric Information with Ground Autocorrelated Data. A Case of Study for Precision Viticulture. Comput. Electron. Agric. 2019, 162, 931–940.
  63. Pádua, L.; Adão, T.; Sousa, A.; Peres, E.; Sousa, J.J. Individual Grapevine Analysis in a Multi-Temporal Context Using UAV-Based Multi-Sensor Imagery. Remote Sens. 2020, 12, 139.