New technologies for the management, monitoring, and control of spatio-temporal crop variability in precision viticulture are numerous. Remote sensing relies on sensors able to provide data useful for improving management efficiency and optimizing inputs. Unmanned aerial systems (UASs) are the newest and most versatile tools, characterized by high precision and accuracy, flexibility, and low operating costs.
Precision agriculture concerns the use of multiple technologies to manage the spatial and temporal variability associated with agricultural production, improving crop performance, economic benefits, and environmental quality by limiting the use of pollutants [1][2][3]. In viticulture, precision agriculture techniques are used to improve the efficiency of inputs (e.g., fertilizers and chemicals), yield forecasting, and selective harvesting based on grape quality, and to match interventions to the real needs (e.g., nutrients and water) of each plot within the vineyard [4]. New technologies have been developed for vineyard management, monitoring, and control of vine growth. Remote and proximal sensors have become reliable instruments for assessing overall vineyard status, essential to describe vineyards' spatial variability at high resolution and to support recommendations that improve management efficiency [5].
In recent decades, the development of aircraft and satellite platform technologies for remote sensing has increased the spatial resolution, temporal availability, and capability to describe plants' biophysical features [6][7]. Aircraft remote sensing campaigns can be planned with greater flexibility, but they are complex and expensive [8]. Satellite image acquisition of large areas saves considerable time, but its resolution is often too coarse for precision viticulture (PV) [9]. Possible cloud cover, combined with fixed acquisition times (determined by the time the satellite needs to complete its orbit and return over the field area), can limit the monitoring process and prevent early detection during specific phenological phases of the crop. Di Gennaro et al. [10] demonstrated the effectiveness of the spatial resolution provided by Sentinel-2 satellite imagery on a trellis-shaped viticulture system, as previously demonstrated for other permanent crops. However, due to the discontinuous nature of vine rows, their moderate coverage, and the influence of inter-row soil, background, and shade, vineyards pose a challenge for remote sensing analysis: remote sensing images must be processed to separate canopy pixels from the background [11].
Among all the remote sensing technologies for detecting spatial and temporal heterogeneity, unmanned aerial systems (UASs) are the newest tools and likely the most useful in terms of high accuracy, flexibility, and low operational costs [12]. UASs can cover large rural areas much faster than people scouting on the ground, making it easier and more efficient to detect problems. UASs are often combined with imaging sensors, which allow the acquisition of images at higher spatial resolutions than those offered by satellites. Post-processing techniques combined with machine learning tools have evolved to the point that the visual indications contained in an image can be extracted and transformed into useful information for farm management [13]. Poor weather conditions reduce the radiometric quality of the images, resulting in less accurate and precise surface reconstruction. Reduced light conditions affect the stability of image features and increase errors in photo alignment and point cloud creation [14]. Calibration targets and post-processing techniques help standardize photo light conditions, especially under cloudy, low-light skies [15]. UAS remote sensing is a useful option for crop mapping even under cloudy conditions, when satellite or airborne remote sensing is inoperable. Remote sensing currently accounts for the majority of the operations performed with agricultural UASs [16]. Beyond applications involving sensors and the extraction of useful information, UASs are applied and under study for various other operations, such as crop spraying [17][18][19][20][21][22], or combined with wireless sensor network (WSN) ground monitoring systems [23].
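As an illustration of the resolution argument, the ground footprint of a single pixel in a UAS survey follows from camera geometry via the standard ground sampling distance (GSD) relation. The sketch below is a generic calculation; the sensor parameters in the example are hypothetical and not taken from any of the cited studies.

```python
def gsd_cm_per_px(sensor_width_mm: float, focal_length_mm: float,
                  flight_height_m: float, image_width_px: int) -> float:
    """Ground footprint of one pixel (cm), from camera geometry.

    The ground strip imaged across-track is flight_height * sensor_width
    / focal_length; dividing by the pixel count gives the GSD.
    """
    ground_width_m = flight_height_m * sensor_width_mm / focal_length_mm
    return 100.0 * ground_width_m / image_width_px

# Hypothetical 1-inch sensor (13.2 mm wide, 8.8 mm lens, 5472 px) at 30 m
# altitude yields a sub-centimeter GSD, consistent with the ~1 cm figures
# commonly reported for UAS surveys.
print(round(gsd_cm_per_px(13.2, 8.8, 30.0, 5472), 2))  # prints 0.82
```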
The detection of intra-vineyard variability for site-specific interventions has always been a priority for PV, allowing grape growers to manage vineyards more efficiently and pursue better grape production and quality. Satellite technology cannot always guarantee a resolution sufficient to detect and differentiate the contours of vine-row vegetation. UASs, by contrast, show high potential thanks to their sensors' high resolution, with a ground sampling distance (GSD) often close to 1 cm. Vegetation indices, usually obtained by arithmetic combinations of spectral bands, are useful tools for vegetation characterization [24]. Spectral information usually derives from visible red–green–blue (RGB), multispectral, hyperspectral, and thermal sensors mounted on board UASs [25][26][27]. Many vegetation indices have been used and compared for estimating canopy biophysical parameters, including leaf area index (LAI), productivity, and biomass [28][29]. Matese et al. [30] proved the effectiveness of an open-source, low-cost UAS for mapping vigor areas within vineyards under real field conditions. RGB images are useful for monitoring vineyard spatial variability, but require accurate segmentation to extract relevant information. Manual segmentation (e.g., in a geographic information system—GIS) of RGB images is laborious and time-consuming, and must account for canopy accuracy, shadow effects, and varying inter-row soil conditions. Starting from ultra-high-resolution RGB imagery obtained from a UAS, Poblete-Echeverría et al. [31] presented a vine canopy detection and segmentation approach using four different classification methods (K-means, artificial neural networks, random forest, and spectral indices).
The results showed that the 2G_RBi spectral index (which subtracts the red and blue bands from twice the green band, on brightness-normalized channels), complemented with the Otsu thresholding method [32], was the best-performing option for vine canopy detection. This method was automatic and easy to apply, since it does not need specific software to compute the indices.
High-resolution UAS images represent a challenge for classification due to their higher intra-class spectral variability. In this context, object-based image analysis (OBIA) has emerged in remote sensing segmentation applications [33][34]. The research carried out by Jimenez-Brenes et al. [35] aimed to develop a rapid mapping technique and obtain management maps to fight Cynodon dactylon (bermudagrass), a typical vineyard weed. Starting from RGB and red–green–near-infrared (RGNIR) images, the team identified the optimal spectral vegetation index for classifying bermudagrass, grapevine, and bare soil areas through an automatic algorithm, and designed site-specific management maps for weed control. The geometric characteristics of the canopy are used in agriculture as a proxy for pruning, pest effects on crops, or fruit detection [36], but collecting these data at the field scale is time-consuming and offers uncertain results. Despite the great variety of technologies used to characterize the 3D structure of plants (radar, digital photogrammetric techniques, stereo images, ultrasonic sensors, and light detection and ranging sensors), many of them have aspects that limit their use: most are expensive, and they are challenging to deploy over large spatial extents. The novelty of the work by Mesas-Carrascosa et al. [37] lies in applying vegetation indices to RGB point clouds for the automatic detection and classification of vegetation, and in determining grapevine height using the soil points as a reference. This automatic process, with no training parameters to select, avoids the errors that manual intervention introduces when separating point classes.
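The height-from-point-cloud idea can be sketched in a few lines, under the simplifying assumptions that the cloud has already been split into vegetation and soil points (e.g., by a vegetation-index classification, as in [37]) and that a single percentile of the soil elevations is an adequate ground reference; the actual method operates with finer spatial detail than this global sketch.

```python
import numpy as np

def canopy_height(z: np.ndarray, is_vegetation: np.ndarray,
                  ground_percentile: float = 50.0) -> np.ndarray:
    """Heights of vegetation points above a soil reference level.

    z: (N,) point elevations; is_vegetation: (N,) boolean mask, assumed
    to come from a vegetation-index classification of the RGB point cloud.
    The ground reference is a percentile of the soil-point elevations.
    """
    ground = np.percentile(z[~is_vegetation], ground_percentile)
    return z[is_vegetation] - ground
```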
As mentioned before, the extraction of pure vine pixels (i.e., the pixels that compose the leaf wall of the vines) is indispensable to achieve effective, good-quality vineyard maps for site-specific management [38][39]. Comba et al. [40] designed a new methodology, consisting of three main steps based on dynamic segmentation, to identify vine rows from UAS aerial images even in the presence of low illumination, inter-row grassing, tree shadows, or other disturbances. The process works without any user intervention and with a limited number of calibration parameters. The information obtained from this approach can be used in PV scenarios to obtain vigor and prescription maps for crop management, or inter-row route tracking for unmanned ground vehicles (UGVs). Nolan et al. [41] described an automated algorithm, applied to a high-resolution aerial orthomosaic, for unsupervised detection and delineation of vine rows. The algorithm takes advantage of "skeletonization" techniques, based on extracting a simplified shape (skeleton) of an object, to reduce the complexity of agricultural scenes into a collection of skeletal descriptors. Thanks to a series of geometric and spatial constraints applied to each skeleton, the algorithm accurately identifies and segments each vine row.
Pádua et al. [42] showed a method to automatically estimate and extract the canopies of Portuguese vineyards, combining vegetation indices and digital elevation models (DEMs) derived from UAS high-resolution images to differentiate between vine canopies and inter-row vegetation cover. It proved to be an effective method when applied with consumer-grade sensors carried by UASs, and also a fast and efficient way to extract vineyard information, enabling vineyard plot mapping for PV management tasks. In the paper by Cinat et al. [43], three algorithms based on HSV (hue, saturation, value), DEM, and K-means were applied to RGB and RGNIR UAS imagery to perform canopy segmentation without human support over three scenarios derived from two vineyards. The P18 scenario corresponds to survey operations conducted in 2018 on 1 ha of a commercial Barbera cv. vineyard; the M17 and M18 scenarios refer to flights performed in 2017 and 2018 on a 1.4 ha Sangiovese cv. vineyard. The two vineyards differ in row and slope orientation and in intra-row and inter-row spacing. The research team tested the ability of the algorithms to identify grapevines without human supervision, introducing estimation indices to quantify each algorithm's tendency to over- or under-estimate vine canopies. The three algorithms showed different estimation abilities but, in general, the HSV-based and DEM algorithms were comparable in computation time, whereas the K-means algorithm's computational demand increased with DEM quality.
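As an illustration of the K-means option, a plain Lloyd's-iteration clustering of per-pixel features can be written in a few lines of numpy. This is a generic sketch, not the implementation evaluated in [43], and the deterministic norm-based initialization is an assumption made here for reproducibility.

```python
import numpy as np

def kmeans_segment(pixels: np.ndarray, k: int = 2, iters: int = 20) -> np.ndarray:
    """Label each pixel feature vector (e.g., RGB, or RGB plus height)
    with one of k clusters using plain Lloyd's iterations."""
    # Deterministic init: spread the initial centers across the norm range.
    order = np.argsort(np.linalg.norm(pixels, axis=1))
    init_idx = order[np.linspace(0, len(pixels) - 1, k).astype(int)]
    centers = pixels[init_idx].astype(float)
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(iters):
        # Assign each pixel to the nearest center.
        dist = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=-1)
        labels = dist.argmin(axis=1)
        # Recompute each center as the mean of its members.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels
```

With k = 2 on canopy-vs-soil feature vectors, the returned labels form an unsupervised vine/background mask.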
While row identification from UAS images has seen relevant development in recent years, no method for detecting missing plants was available until the study by Primicerio et al. [44], which introduced a methodology for segmenting vines into virtual shapes, each representing a real plant. They extracted an extensive set of features and coupled it to a statistical classifier to evaluate its performance in detecting missing plants within the parcels. Baofeng et al. [45] instead showed the possibility of obtaining accurate information about affected or missing grapevines from a digital surface model (DSM). Their analysis started with a three-dimensional (3D) reconstruction from RGB images collected by UAS, using the structure from motion (SfM) technique to obtain the DSM. A different approach, followed by Pichon et al. [46], did not involve computer image analysis techniques; it aimed at identifying the relevant information that growers and advisers can extract from UAS images of the vineyard. The proposed methodology demonstrated that most of the information on grapevine status can be extracted from UAS-based visible images by experts, and that this information is of great interest throughout the growing cycle of the vine, particularly for advisers, as support to drive management strategies.
PV can be defined as the set of practices for monitoring and managing the spatial variability of physical, chemical, and biological variables related to vineyard productivity [47]. An early work on the UAS platform and on-board sensors for data collection was carried out by Turner et al. [48], showing the potential of UAS technology to provide "on-demand" data. They analyzed the algorithms used in data processing and orthorectification, and the vegetation indices used to evaluate differences within the vineyard images. The results highlighted the potential of UAS multi-sensor systems in PV, and their versatility, enhanced by the possibility of collecting data sets "on-demand" with a temporal resolution that spans the critical times in the crop growing season. UASs permit collecting imagery at much higher spatial resolution than satellites and aircraft, revealing finer spatial variability inside the vineyard [49]. Unlike satellite technology, limited by unfavorable revisit times and orbit coverage patterns [50], UAS close-range photogrammetry represents an efficient method for continuously collecting information [51].
Matese et al. [52] introduced a new technique to evaluate the spatial distribution of vine vigor and phenolic maturity. A normalized difference vegetation index (NDVI) map was obtained by a high-resolution multispectral camera mounted on a UAS. The spatial variability of grape anthocyanin content was detected in situ by evaluating the ANTH_R and ANTH_RG indices with a fluorescence-based sensor (MultiplexTM). The two techniques appeared suitable for comparing vine-related information on a relatively large scale. The research by Zarco-Tejada et al. [53] showed the feasibility of mapping leaf carotenoid concentration from high-resolution hyperspectral imagery; the R515/R570 index was explored for vineyards in this study. The PROSPECT-5 leaf radiative transfer model was linked to the SAILH and FLIGHT canopy-level radiative transfer models to simulate pure vine reflectance without soil and shadow effects, since the UAS hyperspectral imagery enabled targeting pure vines. Primicerio et al. [54] used a UAS as a tool combining high spatial resolution images, quick turnaround times, and low operational costs for vegetation monitoring, providing low-cost approaches to meet the critical requirements of spatial, spectral, and temporal resolution. A low-cost, open-source agro-meteorological monitoring system was designed and developed, and its placement and topology were optimized using a set of UAS-acquired multispectral images. Mathews [55] captured aerial images of a Texas vineyard at post-flowering, veraison, and harvest stages using digital cameras mounted on board a UAS. The images were processed to generate reflectance orthophotos and then segmented to extract canopy area and NDVI-based canopy density. Derived canopy area and density values were compared to the number of clusters, cluster size, and yield to explore correlations.
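The NDVI at the core of several of these studies is the standard normalized difference of the near-infrared and red bands. A minimal numpy sketch, assuming co-registered reflectance bands as float arrays:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - R) / (NIR + R); dense vegetation scores close to 1,
    bare soil stays low, and water or deep shadow can go negative."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    out = np.zeros_like(denom)
    # Guard against zero-reflectance pixels (e.g., nodata borders).
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out
```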
Differently from the derived canopy area, the NDVI-based canopy density exhibited no significant relationships because of the radiometric inaccuracy of the sensors. A vine performance index (VPI) was calculated to map spatial variation in canopy vigor for the entire growing season. Rey-Caramés et al. [56] used multispectral and spectral indices to assess the spatial variability of vegetative, productive, and berry composition variables (obtained by the SFR_RAD and NBI_GAD MultiplexTM indices) within a vineyard. The correlations between the spectral indices and the field variables were significant but moderate; the pattern of the spectral indices agreed with that of the vegetative variables and mean cluster weight. The results proved the utility of multispectral imagery acquired from a UAS to delineate homogeneous zones within the vineyard, allowing the grape grower to carry out specific management of each subarea. The aim of the work by Matese et al. [57] was to evaluate different image sources and processing methodologies to describe the spatial variability of spectral-based and canopy-based vegetation indices within a vineyard, and their relationship with productive and qualitative vine parameters. Comparisons were performed between image-derived indices (Sentinel-2 NDVI, unfiltered and filtered UAS NDVI) and agronomic features. UAS images allow calculating new non-spectral indices based on canopy architecture that provide additional, useful information to growers with regard to within-vineyard management zone delineation. Caruso et al. [58] identified three sites of different vine vigor in a mature vineyard to test the potential of visible–near-infrared (VIS-NIR) spectral information acquired from a UAS in estimating the LAI, leaf chlorophyll, pruning weight, canopy height, and canopy volume of grapevines. They showed that the combined use of VIS-NIR cameras and UASs is a rapid and reliable technique to determine the canopy structure and LAI of grapevine.
Romboli et al. [59] focused on the impact of vine vigor on Sangiovese grapes and wines, applying a high-resolution remote sensing technique on a UAS platform to identify vigor at the single-vine level. The test confirmed the ability of UAS technology to assess vigor variability inside the vineyard and confirmed the influence of vigor on flavonoid compounds as a function of bunch position in the canopy. Matese and Di Gennaro [60] described the implementation of a multi-sensor UAS system capable of flying with three sensors simultaneously to perform different monitoring options. Vineyard variability was assessed by characterizing vine vigor with a multispectral camera, leaf temperature with a thermal camera, and, through an innovative approach, missing plants with a high-spatial-resolution RGB camera.
Pádua et al. [61] developed an analysis methodology to assist decision-making processes in viticulture. They employed UASs to acquire RGB, multispectral, and thermal aerial imagery in a vineyard, enabling the multi-temporal characterization of vineyard development throughout a season through the computation of the NDVI, crop surface models (CSMs), and the crop water stress index (CWSI). Vigor maps were computed first considering the whole vineyard, second considering only automatically detected grapevine vegetation, and third considering grapevine vegetation after applying a normalization process. Results showed that vigor maps considering only grapevine vegetation provided a more accurate representation of vineyard variability, gathering significant spatial associations through a multi-temporal analysis of vigor maps and by comparing vigor maps with both height and water stress estimations. The objective of the work by Matese et al. [62] was to evaluate the performance of statistical methods for comparing different maps of a vineyard, some derived from UAS-acquired imagery and some from in situ ground characterization. The team proved that these methods, which consider the spatial structure of the data to compare ground autocorrelated data with spectral and geometric information derived from UAS-acquired imagery, are highly appropriate and would lead winegrowers to adopt PV as a management tool. Pádua et al. [63] developed a multi-temporal vineyard plot analysis method at the grapevine scale using RGB, multispectral, and thermal infrared (TIR) sensors, enabling the estimation of biophysical and geometrical parameters and the detection of missing grapevine plants. A high overall agreement was obtained concerning the number of grapevines present in each row and the individual grapevine identification.
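The CWSI is commonly defined from canopy temperature relative to fully transpiring (wet) and non-transpiring (dry) reference temperatures. The sketch below implements that common formulation; the exact baselines used in [61] may differ.

```python
import numpy as np

def cwsi(t_canopy, t_wet: float, t_dry: float) -> np.ndarray:
    """CWSI = (Tc - Twet) / (Tdry - Twet): 0 = unstressed, 1 = fully
    stressed; values are clipped to [0, 1] to absorb sensor noise."""
    t_canopy = np.asarray(t_canopy, dtype=float)
    return np.clip((t_canopy - t_wet) / (t_dry - t_wet), 0.0, 1.0)
```

Applied pixel-wise to a thermal orthomosaic of the canopy, this yields the per-vine water stress map used alongside the NDVI and CSM layers.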
Moreover, the extracted individual grapevine parameters enabled assessing vineyard variability in each epoch and monitoring its multi-temporal evolution.
This entry is adapted from the peer-reviewed paper 10.3390/s21030956