Mahmud, M.S.; Zahid, A.; Das, A.K. Sensing and Automation Technologies for Ornamental Crops. Encyclopedia. Available online: (accessed on 24 April 2024).
Sensing and Automation Technologies for Ornamental Crops

The ornamental crop industry is an important contributor to the U.S. economy. The industry has been facing challenges due to continuously increasing labor and agricultural input costs, and sensing and automation technologies have been introduced to reduce labor requirements and ensure efficient management operations. Applications of sensors, computer vision, artificial intelligence (AI), machine learning (ML), Internet-of-Things (IoT), and robotic technologies are reported. Some advanced technologies, including 3D cameras, enhanced deep learning models, edge computing, radio-frequency identification (RFID), and integrated robotics used in other cropping systems, are also discussed as potential prospects. Advanced sensing, AI, and robotic technologies are critically needed in the nursery crop industry, and adopting these current and future innovations will benefit growers working toward sustainable ornamental nursery crop production.

agricultural mechanization; artificial intelligence; computer vision; remote sensing; sensor fusion

1. Introduction

The nursery and greenhouse industry contributes nearly $14 billion in annual sales to the U.S. economy [1]. The industry produces more than 2000 ornamental plant species, covering most of the ornamental plants grown in the United States [2]. Nurseries are generally open-air operations where plants grow in the ground or in containers [3], whereas greenhouses are typically enclosed environments in which growth conditions (e.g., lighting, temperature, humidity, and irrigation) can be controlled [4]. Rapidly increasing production costs driven by rising labor expenses, difficulty in obtaining skilled labor, and the inappropriate application of agricultural resources are growing concerns for the ornamental industry [5][6]. Operations such as planting, growing, and harvesting nursery crops are heavily dependent on labor and account for 43% of total production expenses [7]. It is becoming increasingly difficult for the industry to obtain such labor, especially the skilled workforce required to grow ornamental crops [8]. Conventional practices apply agricultural resources (such as water, nutrients, fertilizers, and pesticides) excessively and inefficiently; these approaches not only increase production costs but also contaminate the environment and the ecosystem. The industry must therefore look to alternative solutions, such as automated crop management technologies, to reduce labor needs and ensure the efficient use of crop production resources.
In the current decade, sensing and automation technologies have had a continually increasing impact on different crop management operations [9][10][11][12][13]. These technologies fall into two groups: ground-based and aerial-based. Ground-based crop harvesting technologies have been tested on various crops, including sweet pepper [14], lettuce [15], tomato [16], strawberries [11], apples [9], and cherries [17]. Ground-based technologies have also been explored widely for automatic disease detection in different crops, such as: powdery mildew on strawberry leaves [18]; leaf blotch, stripe rust, powdery mildew, leaf rust, black chaff, and smut on wheat leaves [19]; Alternaria leaf spot, brown spot, mosaic, grey spot, and rust on apple leaves [20]; and anthracnose, brown spot, mites, black rot, downy mildew, and leaf blight on grape leaves [10]. Recent advances in unmanned aerial vehicles (UAVs) show their potential for different agricultural operations, often requiring less time than ground-based systems [12]. To date, agricultural UAV use has been largely limited to remote sensing applications because of limited payload capacity and battery life. UAVs have been applied to various crop management tasks, including automatic canker disease monitoring in citrus [21], weed detection in wheat and oat fields [22], detecting and mapping tree seedlings and individual plants [23][24], and yield estimation in cotton [25]. However, the success of sensing and automation technologies largely depends on the sensors used to acquire crop data and the processing algorithms used to extract valuable information.

2. Smart Irrigation

Smart or precision irrigation technology determines the water requirement of crops using set-point control (based on soil moisture data) or model-based control (based on crop and environmental data) to maximize irrigation efficiency [4][26]. It helps reduce excessive water application while maintaining crop growth and development. Sensor-based irrigation technologies have been tested in different nursery settings, including greenhouse, container, pot-in-pot, and field nurseries [27][28][29][30][31].
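The set-point control logic described above can be sketched in a few lines: the valve opens when volumetric water content falls below a lower set point and closes once it recovers past an upper one, with the gap between the two thresholds preventing rapid valve cycling. The threshold values here are illustrative assumptions, not figures from any cited study.

```python
def setpoint_control(vwc_readings, low=0.25, high=0.35):
    """Return valve states (True = open) for a series of volumetric
    water content (VWC) readings, using hysteresis between two set points.
    The low/high thresholds are illustrative, not from any cited study."""
    states = []
    valve_open = False
    for vwc in vwc_readings:
        if vwc < low:        # substrate too dry: start irrigating
            valve_open = True
        elif vwc > high:     # substrate rewetted: stop irrigating
            valve_open = False
        states.append(valve_open)
    return states

# Substrate dries out, irrigation starts, then stops after rewetting.
states = setpoint_control([0.30, 0.24, 0.28, 0.36, 0.33])
```

A model-based controller would replace the fixed thresholds with a crop/environment model, but the actuation logic stays similar.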
Table 1 presents different sensor applications for automatic irrigation management in different nurseries. Wireless sensor networks (WSNs) were used to control irrigation water flow in three container-based nurseries [29]. Experiments were conducted in two phases: first, EM50R nodes with EC-5 sensors were used to monitor soil moisture; second, nR5 nodes were used to monitor and control irrigation. The WSNs-based technology reduced water use by about 20% to 25%. Kim et al. [32] tested soil moisture and EC sensors to monitor and automatically implement irrigation protocols; substrate moisture-driven irrigation reduced the water usage of hydrangea by as much as 83%. Coates et al. [33] used a VH400 sensor (Vegetronix, Sandy, UT, USA) to monitor soil water content in container-grown hydrangea. Although the VH400 costs half as much as the standard EC-5 sensor and offers higher sensitivity (~34 mV per % volumetric water content, versus ~5 mV for the EC-5), the authors concluded it was unsuitable for nursery crop monitoring because its output varied by up to 29%. Lea-Cox et al. [28] used a hybrid system consisting of a 12-node CMU network (developed by Carnegie Mellon University, United States) and Decagon Ech20 moisture sensors (Decagon Devices Inc., Pullman, WA, USA) to control water applications in real time in a container nursery; the system was also tested in a greenhouse using a six-node CMU network. Both networks performed well but encountered some networking challenges at remote sites. The authors noted that the CMU network node is less costly than the commercial Decagon Ech20 sensor yet showed similar performance. Wheeler et al. [31] also tested a smart irrigation system in a container nursery and greenhouse, using Decagon soil moisture sensors along with an nR5 wireless node to control irrigation.
The study reported a water use reduction of approximately 50% when compared to grower-controlled irrigation. The same sensor system was trialed previously by Wheeler et al. [5] in a floriculture greenhouse.
WSNs have also been used in pot-in-pot nurseries. Belayneh et al. [34] used this technology to control irrigation in dogwood (planted in 15-gal containers) and red maple (planted in 30-gal containers) nurseries. EM50R nodes were used to monitor data from soil moisture and environmental sensors, and nR5 nodes were used for irrigation control. Volumetric water content sensors monitored soil moisture, inserted at a 6-inch depth for dogwood and at 6- and 12-inch depths for red maple. The results showed that the WSNs-based irrigation method reduced water usage by ~34% and ~63% for red maple and dogwood, respectively. Lea-Cox and Belayneh [35] developed a smart battery-operated nR5 wireless sensor node with a series of soil moisture and environmental sensors to irrigate dogwood and red maple nursery blocks, reducing daily water application by about 62.9%. The authors concluded that this sensor-based irrigation technology nearly tripled water-use efficiency without reducing tree quality or growth.
Internet-of-Things (IoT)-based smart irrigation systems have also been used for ornamental crop production. Banda-Chávez et al. [36] developed an IoT-based sensor network that activates irrigation of ornamental plants using an IoT platform and YL-69 soil moisture sensors. In addition, Beeson and Brooks [37] used an evapotranspiration (ETo) model-based smart irrigation system for wax-leaf privet and reported that it could reduce annual water application by about 22.22% compared to the traditional overhead irrigation method. Although only a limited number of studies have reported IoT-based automatic irrigation systems for the ornamental industry, the trends and current successes of this technology in other crop industries show promising potential for ornamental crop production.
Although studies have demonstrated the potential of sensor-based irrigation management, several factors impede its efficacy. Sensor-to-sensor variability in a particular environment is one: the greatest variability among sensor readings occurs at volumetric water content levels just below the water-holding capacity of the substrate, so characterizing sensor-to-sensor variability under particular nursery conditions can greatly increase confidence in the data. Sensor positioning is another important factor that directly affects efficacy; accurate positioning is needed, particularly when measuring soil moisture in container production, where sensors must be placed in the part of the root zone where active water uptake occurs. Determining the optimal number of sensors is a further consideration, and depends primarily on the accuracy and repeatability of the sensors, variation among sensors, spatial variability of the nursery environment, and cost.
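One simple way to quantify the sensor-to-sensor variability discussed above is the coefficient of variation (CV) across co-located sensors reading the same substrate at the same time; a CV that spikes just below water-holding capacity would flag the region where readings diverge most. The readings below are hypothetical, for illustration only.

```python
import statistics

def cv_percent(readings):
    """Coefficient of variation (%) of simultaneous readings from
    co-located soil moisture sensors. High CV = low confidence."""
    mean = statistics.mean(readings)
    return 100.0 * statistics.stdev(readings) / mean

# Hypothetical VWC readings from four co-located sensors at one depth.
readings = [0.31, 0.29, 0.33, 0.30]
cv = cv_percent(readings)
```

Tracking this statistic over a dry-down cycle would show where in the moisture range a given sensor model is least trustworthy.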
Table 1. Summary of studies reported for smart nursery irrigation.
| Crop | Nursery Type | Soil Sensor Type | Water Saving | Reference |
|---|---|---|---|---|
| Ornamentals | Container | Capacitance-based (WSNs) | 20% to 25% | Chappell et al. [29] |
| Hydrangea | Container | Capacitance-based (WSNs) | Not reported | Coates et al. [33] |
| Red maple and Cherokee Princess | Container and greenhouse | Matric potential and capacitance sensors (WSNs) | Not reported | Lea-Cox et al. [28] |
| Hydrangea | Container | Electrical conductivity (WSNs) | As much as 83% | Kim et al. [32] |
| Woody ornamentals: oakleaf hydrangea, Japanese andromeda, Catawba rosebay, and mountain laurel | Container and greenhouse | Capacitance-based (WSNs) | 50% | Wheeler et al. [31] |
| Dogwood and red maple | Pot-in-pot | Capacitance-based (WSNs) | 34% to 63% | Belayneh et al. [34] |
| Dogwood and red maple | Pot-in-pot | Capacitance-based (WSNs) | 62.9% | Lea-Cox and Belayneh [35] |
| Ornamental plants | Indoor pots | Capacitance-based (IoT) | Not reported | Banda-Chávez et al. [36] |

3. Plant Stress Detection

Plant stress detection uses sensors and advanced technologies to recognize unfavorable conditions or substances that affect the growth, development, or production of plants or crops [38]. Detecting stresses such as drought, disease infection, and pest pressure helps growers identify problems and take preventive action before the stresses significantly damage plants or crops. Two types of stress occur in ornamental crop production: abiotic and biotic. Abiotic plant stress includes drought, nutrient deficiency, salinity problems, floods, etc., while biotic stress refers to damage caused by fungi, bacteria, insects, or weeds. Sensors, including RGB, thermal, and spectral, have been utilized to monitor stresses in ornamental crop production [39][40][41][42].
Table 2 summarizes ornamental plant disease detection studies using advanced sensing technologies. Red-green-blue (RGB) imaging sensors, covering the 400–700 nm (visible) range, are used to monitor ornamental plant stresses because of their affordability and proven application in other cropping systems. Velázquez-López et al. [39] developed an image-processing-based powdery mildew detection system for rose plants using the OpenCV library. The system detected powdery mildew by converting RGB images to hue, saturation, and value (HSV) color space and achieved the highest disease region matching of 93.2% by segmenting on the V channel of close-up images (captured 10 cm from the rose canopies). The need for such close-range capture is a major limitation, especially for real-time detection, where multiple diseases may be present. Nuanmeesri [43] advanced the image processing technique from traditional segmentation to deep learning-based detection to identify up to 15 different diseases, using a hybrid model that fuses convolutional neural networks (CNNs) with a support vector machine (SVM). Researchers have also tested image registration across two imaging media for ornamental crop disease detection: Minaei et al. [42] registered RGB and thermal images to detect powdery mildew and gray mold on roses for a site-specific spraying system. A few studies have compared RGB imaging with spectral imaging for tulip disease detection [40][44] and reported that spectral imaging achieved better detection accuracies than RGB imaging for tulip breaking virus (TBV).
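The V-channel segmentation idea can be sketched directly: in HSV space the value channel is the per-pixel maximum of R, G, and B, so bright mildew-like patches can be isolated with a single threshold. The threshold below is an assumption for illustration, not the value used by Velázquez-López et al.

```python
import numpy as np

def segment_bright_regions(rgb, v_threshold=200):
    """Boolean mask of pixels whose HSV value channel (V = max of R, G, B)
    exceeds the threshold; bright patches are candidate mildew regions.
    The threshold is an illustrative assumption."""
    v = rgb.max(axis=-1)  # V channel of HSV is the per-pixel channel max
    return v > v_threshold

# Toy 2x2 image: one bright (mildew-like) pixel, three darker leaf pixels.
img = np.array([[[240, 235, 230], [40, 120, 35]],
                [[35, 110, 30], [50, 130, 45]]], dtype=np.uint8)
mask = segment_bright_regions(img)
```

In practice a real pipeline would add morphological cleanup and region matching against ground truth, but the core thresholding step is this simple.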
Hyperspectral imaging is a powerful tool that combines imaging and spectroscopy to detect stresses at an early stage, gathering and processing feature information across a wide spectrum of light. Researchers have used hyperspectral sensors for ornamental crops, but mainly in laboratory settings, because the systems remain difficult to deploy reliably in real-time field applications [40]. Polder et al. [45] identified Botrytis-infected Cyclamen plants with selected features (bands) of 497, 635, 744, 839, 604, 728, 542, and 467 nm in a controlled greenhouse environment. Poona and Ismail [41] selected wavebands across the VIS, red edge, NIR, and SWIR regions to detect Fusarium circinatum infection in Pinus radiata seedlings at the asymptomatic stage, and concluded that random forest (RF) is a good machine learning (ML) classifier for discriminating disease infection from spectral bands. Heim et al. [46] also used RF to differentiate myrtle rust-infected lemon myrtle plants, selecting wavebands at 545, 555, 1505, and 2195 nm and achieving an overall accuracy of 90%. Considering the slow data processing and expense of hyperspectral systems, some studies have sought alternatives; a few have used multispectral imaging instead because of its faster data processing. Polder et al. [40] used an RGB-NIR-based multispectral system (range 500–750 nm) to detect TBV disease in tulips and achieved a classification accuracy of 92%. They employed a linear discriminant classifier with R, G, B, and NIR features to segment plant from soil, and classified diseased tulips using features such as the fraction of red pixels, mean normalized red value, mean normalized green value, and ratio of contour pixels of spots. Pethybridge et al. [47] assessed ray blight disease (caused by Phoma ligulicola) intensity using a hand-held multispectral radiometer with 485, 560, 660, 830, and 1650 nm spectral band sensors. The study used vegetation indices, including the normalized difference vegetation index (NDVI), green normalized difference vegetation index (GNDVI), difference vegetation index, and renormalized difference vegetation index, to assess ray blight severity.
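The vegetation indices named above reduce to simple band ratios. A minimal sketch using NIR (830 nm), red (660 nm), and green (560 nm) reflectances; the input reflectance values are hypothetical:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    """Green NDVI: (NIR - green) / (NIR + green)."""
    return (nir - green) / (nir + green)

# Healthy canopies reflect strongly in NIR and weakly in red, so NDVI
# drops as stress reduces NIR reflectance and raises red reflectance.
healthy = ndvi(nir=0.45, red=0.05)   # 0.8
stressed = ndvi(nir=0.30, red=0.10)  # 0.5
```

Both indices range from -1 to 1, and a falling value over repeated radiometer passes is the signal the cited study regressed against disease intensity.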
Thermal imaging, which depicts the spatial distribution of temperature differences in a captured scene by converting infrared (IR) radiation into visible images, has also been tested for stress detection in ornamental plants. Jafari et al. [48] classified asymptomatic powdery mildew and gray mold on roses by fusing thermal images with visible-range images. Thermal features were extracted, including the maximum, minimum, median, mode, standard deviation, maximum difference in temperature, skewness, kurtosis, and sum of squared errors, and artificial neural networks (ANN) and SVM were used to classify healthy and disease-infected rose plants. Although studies have applied thermal imaging to disease stress, this type of sensing is more practical for water stress detection. Before the above experiment, Jafari et al. [49] attempted to classify Botrytis cinerea infection on rose using thermal spectra and radial-basis neural networks. Buitrago et al. [50] analyzed the infrared spectra of plants for water stress detection and concluded that spectral changes in plant regions are directly connected to the microstructure and biochemistry of leaves.
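A few of the statistical features listed above can be computed from segmented canopy temperatures with nothing more than the standard library; this is a sketch of the feature-extraction step only, not a reproduction of the cited feature set or classifiers.

```python
import statistics

def thermal_features(temps):
    """Summary statistics of canopy pixel temperatures (deg C), a subset
    of the feature types used in thermal stress-detection studies."""
    return {
        "max": max(temps),
        "min": min(temps),
        "median": statistics.median(temps),
        "stdev": statistics.pstdev(temps),
        "range": max(temps) - min(temps),  # maximum temperature difference
    }

# Hypothetical canopy temperatures; infected tissue often shifts the
# distribution (e.g., a hotter tail), which these features capture.
features = thermal_features([21.5, 22.0, 22.3, 24.1, 21.8])
```

Feature vectors like this one would then be fed to a classifier (ANN, SVM, or radial-basis network in the studies above).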
Stress detection technologies are widely used in other crop industries, especially for agronomic crops (such as corn and soybean) and tree fruits (such as apple and citrus), but few experiments have targeted ornamental crops, and those mostly in the floriculture industry; almost no studies address the woody ornamental industry. Several studies detect stress with RGB sensors because RGB cameras require little technical knowledge to operate, but spectral sensors are necessary to detect stress at an asymptomatic or early stage. Spectral sensors therefore hold great potential for the ornamental industry, although little progress has been reported. UAVs are currently popular for crop stress detection and monitoring, but their applications in the ornamental crop industry are also very limited.
Table 2. Summary of studies reported for plant stress detection.
| Crop | Stress Type | Imaging Type | Processing Method | Accuracies | Reference |
|---|---|---|---|---|---|
| Rose | Powdery mildew | RGB (video camera: Everio) | Images converted to HSV, then segmentation to extract the disease region | Highest 93.2% disease region matching | Velázquez-López et al. [39] |
| Rose | Fifteen different rose diseases | Color images downloaded via the Google search engine and ChromeDriver | Hybrid deep learning model (CNNs with SVM) | 90.26% accuracy, 90.59% precision, 92.44% recall, 91.50% F1-score | Nuanmeesri [43] |
| Rose | Powdery mildew and gray mold | RGB (Canon 550D Kiss X4); thermal camera (ITI-P400) | Registration of visible and thermal images, then segmentation of the diseased area | Not reported | Minaei et al. [42] |
| Tulip | Tulip breaking virus | RGB (Nikon D70 with NIKON 18–70 mm zoom lens); spectral camera (Specim, 430–900 nm, 4.5 nm resolution) | Spatial information extracted after segmentation, then Fisher's linear discriminant analysis (LDA) | Best detection errors of 9%, 18%, and 29% for the Barcelona, Monte Carlo, and Yokohama varieties, respectively, using the spectral camera | Polder et al. [44] |
| Tulip | Tulip breaking virus | RGB (Prosilica GC2450 and GC2450); RGB-NIR multispectral (JAI AD120GE); multispectral (six-band filter wheel, 500–750 nm) | Plant segmented by thresholding the excess-green image ((2G − R − B) > 0), then LDA for TBV classification | 92% of TBV-diseased plants correctly classified using the RGB-NIR multispectral system | Polder et al. [40] |
| Cyclamen | Botrytis | Hyperspectral imaging (400–1000 nm) | Most discriminating wavelengths selected, then LDA | 90% of pixels classified correctly | Polder et al. [45] |
| Pinus radiata seedlings | Pitch canker disease (F. circinatum infection) | Hyperspectral imaging (600–2500 nm) | Wavebands selected with the Boruta algorithm, then random forests for discriminating infected seedlings | KHAT values of 0.82 and 0.84 for healthy–infected and infected–damaged discrimination, respectively | Poona and Ismail [41] |
| Lemon myrtle | Myrtle rust | Hyperspectral imaging (350–2500 nm) | Four wavebands chosen, then RF applied for discrimination | 90% overall accuracy | Heim et al. [46] |
| Pyrethrum | Ray blight | Multispectral radiometer | Reflectance measured; data analyzed using regression analysis | Not reported | Pethybridge et al. [47] |
| Rose | Powdery mildew and gray mold | Infrared thermal camera (ITI-P400) | Image registration, then segmentation to extract features, then neuro-fuzzy classifiers | 92.3% and 92.59% estimation rates for powdery mildew and gray mold, respectively | Jafari et al. [48] |
| Rose | Botrytis cinerea infection | Infrared thermal camera (ITI-P400) | Extracted thermal features analyzed with radial-basis neural networks | 96.4% correct estimation rate | Jafari et al. [49] |
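The excess-green plant/soil segmentation used by Polder et al. [40] thresholds (2G − R − B) > 0: a pixel counts as vegetation when it is greener than it is red or blue combined. A minimal numpy sketch on a toy image:

```python
import numpy as np

def excess_green_mask(rgb):
    """Boolean mask of plant pixels via the excess-green index (2G - R - B) > 0."""
    # Cast to int so the subtraction cannot wrap around in uint8 arithmetic.
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (2 * g - r - b) > 0

# Toy 1x2 image: one leaf-green pixel, one soil-gray pixel.
img = np.array([[[60, 140, 50], [120, 100, 110]]], dtype=np.uint8)
mask = excess_green_mask(img)
```

This index is cheap enough to run per frame, which is why it is a common first stage before a heavier classifier such as LDA.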

4. Smart Spraying

Management of different pests and diseases is essential to ensure high-quality ornamental nursery crop production that meets market requirements [51]. Traditional management techniques include pruning infected branches, removing dead or infected plants, monitoring diseases, trapping insects, growing pest-resistant cultivars, and applying pesticides [52]. Foliar pesticide application is the most effective method for preventing pest infestations and ensuring healthy, unblemished nursery plants [53]. In the United States, the greenhouse and nursery industries use about 1.3 million kg of pesticides every year, saving billions of dollars' worth of crops [54]. Conventionally, radial air-assisted sprayers are the most common spray equipment for pesticide application in ornamental nurseries [55]. These sprayers apply pesticides to the entire field regardless of plant structure, growth stage, or gaps in rows, resulting in under- or over-spraying [56], as well as contaminating the environment, wasting pesticides, and increasing production costs [57]. This problem is more critical for the nursery industry because of the great diversity of canopy structures and densities found in nursery crops. In field nursery production, it is common practice to plant trees of different ages and cultivars in the same row, and traditional sprayers cannot adjust their settings to match target tree requirements, reducing application efficiency. One way to improve spraying efficiency is to use sensing technologies to identify target trees for precise spraying, also referred to as smart or variable-rate intelligent spraying.
Smart spraying is defined as the precise application of pesticides, performed by controlling the spray output of each nozzle based on the presence, structure, and canopy density of plants as obtained from sensors such as ultrasound, laser, and cameras [18]. In recent years, significant research has been conducted to develop smart spraying systems for the nursery industry. Different sensors, such as ultrasonic and laser, have been utilized to measure the canopy parameters for intelligent spraying in nursery crops. The initial efforts for smart nursery spraying were reported back in 2010 by a team of scientists from the United States [58]. The authors developed two precision sprayer prototypes: a hydraulic boom sprayer with an ultrasonic sensor for small narrow trees such as liners and an air-assisted sprayer with a laser scanner for other ornamental nursery species. The authors compared the spray consumption between a sensor-based sprayer and a conventional air blast sprayer at three growing stages and four travel speeds (3.2, 4.8, 6.4, and 8.0 km/h). The sensor-based air-assisted sprayer applied 70%, 66%, and 52% fewer chemicals at different growth stages than conventional spraying. The results also reported a uniform spray deposit and coverage regardless of changes in the canopy size and travel speed. Jeon and Zhu [59] developed an ultrasonic-sensed real-time variable-rate vertical boom sprayer for nursery liners. The sprayer consisted of two booms with five pairs of equally spaced nozzles, with the ultrasonic sensor mounted 0.35 m ahead of the nozzles. Field tests were conducted for six different liner species at travel speeds from 3.2 to 8.0 km/h. The spray nozzles were triggered successfully from 4.5 to 12.5 cm ahead of the target, and the effects of travel speed on mean spray coverage and deposit were insignificant.
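The timing behind a speed-compensated nozzle trigger is simple arithmetic: with the ultrasonic sensor mounted 0.35 m ahead of the nozzles, as in Jeon and Zhu [59], the controller waits for the sensor-to-nozzle distance to be covered at the current travel speed, shortened so the spray opens slightly ahead of the target (4.5–12.5 cm in the cited field tests). The 0.10 m lead below is an illustrative value within that range.

```python
def trigger_delay(sensor_offset_m=0.35, speed_kmh=6.4, lead_m=0.10):
    """Seconds to wait after the sensor detects a canopy before opening
    the nozzle. sensor_offset_m is the sensor-to-nozzle distance;
    lead_m is how far ahead of the target the spray should start."""
    speed_ms = speed_kmh / 3.6  # km/h -> m/s
    return (sensor_offset_m - lead_m) / speed_ms

# At 6.4 km/h (~1.78 m/s) with a 0.10 m lead, the delay is ~0.14 s.
delay = trigger_delay()
```

The faster the sprayer travels, the shorter the delay, which is why the controller must recompute it continuously from measured ground speed rather than use a fixed timer.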
Laser sensing is another technology used for precision spraying in many tree crops, and a few studies have utilized laser scanning for smart spraying applications in nurseries. Chen et al. [53] developed a variable-rate air-assisted sprayer using a laser scanner and reported that spray coverage differences inside the canopies were not statistically significant at travel speeds of 3.2 and 6.4 km/h. Liu et al. [60] used a laser scanner to develop an intelligent variable-rate air-assisted sprayer and tested the system in a commercial nursery and a grapevine orchard; the new sprayer reduced chemical usage by more than 50% compared to the conventional sprayer at travel speeds of 3.2 to 8.0 km/h. Shen et al. [61] developed an air-assisted laser-guided sprayer for Japanese maple nursery trees, consisting of a 270° radial-range laser scanner, an embedded controller, and pulse-width-modulated (PWM) nozzles. The authors reported accurate measurement of different trees and independent nozzle control matched to each tree, with spray usage reduced by 12% to 43% compared to conventional spraying. In addition, a few studies have field-validated precision sprayers for controlling different diseases. Zhu et al. [55] validated the laser-guided air-assisted sprayer and reported chemical savings of about 36% and 30% in Prairifire crabapple and honey locust nurseries, respectively. Chen et al. [62] also compared laser-guided air-assisted sprayers with conventional sprayers in commercial nurseries with different test plants, reporting 56% and 52% chemical savings for the two nurseries. Similarly, a few other studies have compared smart laser-guided sprayers with conventional sprayers and reported promising results for effective disease control in different nursery crops [57][63].
Smart spraying for nursery crops using different sensing technologies, mainly ultrasonic and laser, has been reported over the last decade. Ultrasonic and laser sensors were integrated with conventional sprayers to detect targets (e.g., canopies). Although ultrasonic sensor-based sprayers deliver significant chemical savings, their accuracy varies with temperature, humidity, and detection distance [53]; laser sensors, by contrast, are less influenced by weather conditions when detecting and measuring target characteristics [64]. Moreover, the nursery industry encounters several unique challenges, such as lack of crop uniformity and varying shapes, sizes, growth patterns, and harvest schedules. Most existing sprayers were developed for the orchard environment [55], and modifications may be required to make them usable for ornamental nursery crop production. Another challenge for the ornamental industry is its high aesthetic threshold, which allows no visible infection. Thus, efforts are required to develop smart spraying systems tailored to the requirements of the nursery industry.

5. Plant Biometrics and Identification

Information on plant physiology and responses to biotic/abiotic stresses is critical for determining the management practices required to improve productivity and sustainability in the nursery industry. Plant biometry (e.g., structural information) can assist in understanding a plant's growth differences across diverse environments [65]. Cultivar identification of nursery plants is also important for breeding, reproduction, and cultivation [66]. Plant biometry serves as a classification system that distinguishes a plant by its physiological characteristics; the biometric defined for an individual plant should be universal, distinctive, permanent, and collectible [67].
Different sensors, including cameras and LiDAR, have been utilized for nursery plant biometrics. Research on nursery plant identification using camera imaging systems began in the 1990s. Shearer and Holmes [68] used a camera vision system to identify tree species in the nursery, employing color co-occurrence matrices derived from intensity, saturation, and hue to identify seven common containerized nursery plants; with 33 texture features, the reported classification accuracy was 91%. She et al. [69] developed a high-resolution imaging system to classify containerized perennial peanut and 'Fire Chief' arborvitae plants for counting. The authors found that the classification accuracy for plants with flowers (97%) was higher than for those without (96%). Leiva et al. [70] developed an unmanned aircraft system (UAS)-based imaging system for counting container-grown 'Fire Chief' arborvitae, testing a custom counting algorithm on different backgrounds; counting errors were 8% and 2% for gravel and black fabric backgrounds, respectively.
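The co-occurrence texture idea behind Shearer and Holmes [68] can be sketched compactly: build a gray-level co-occurrence matrix (GLCM) for one pixel offset and derive texture features from it. The cited work used 33 features over hue, saturation, and intensity channels; only two standard features (contrast and energy) on a single-channel toy image are shown here.

```python
import numpy as np

def glcm(img, levels=4):
    """Normalized GLCM for the horizontal (0, 1) pixel offset.
    Entry (a, b) is the probability that gray level a is immediately
    left of gray level b."""
    m = np.zeros((levels, levels))
    for row in img:
        for a, b in zip(row[:-1], row[1:]):
            m[a, b] += 1
    return m / m.sum()

def contrast(p):
    """Weighted local intensity variation: sum of (i - j)^2 * p(i, j)."""
    i, j = np.indices(p.shape)
    return float(((i - j) ** 2 * p).sum())

def energy(p):
    """Uniformity of the co-occurrence distribution: sum of p(i, j)^2."""
    return float((p ** 2).sum())

# Toy 4x4 image quantized to 4 gray levels.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
p = glcm(img)
```

Per-species feature vectors built this way (over several offsets and color channels) are what a classifier would then separate.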
In another study, the authors used a depth camera for height measurements of nursery plants [71], implementing a Ghostnet–YoloV4 network to measure height and count different nursery plants, including spruce, Mongolian scotch pine, and Manchurian ash, and achieving an accuracy of more than 92% for measurement and counting. Gini et al. [72] used a UAS-based multispectral imaging system to classify eleven nursery plant species, implementing multiple grey level co-occurrence matrix algorithms for textural analysis of the acquired images; principal component analysis after feature extraction yielded a classification accuracy of 87% for the selected plants. Likewise, a few studies have reported the application of LiDAR sensors to identify nursery plants. Weiss et al. [73] developed a method for identifying nursery plant species using a LiDAR sensor and supervised machine learning, applying multiple classifiers and 83 features to identify six containerized nursery plant species with an accuracy of more than 98%.
Similarly, LiDAR and light curtain sensors were used to develop a stem detection and classification system for almond nursery plants [74]. The authors developed a custom segmentation and thresholding algorithm, and the reported detection accuracies with the LiDAR and light curtain sensors were 95.7% and 99.48%, respectively. The success rates for dead/alive plant detection for the LiDAR and light curtain sensors were 93.75% and 94.16%, respectively. Additionally, a few other studies have reported the application of machine vision approaches using different machine learning and deep learning methodologies for detecting and classifying different flower nurseries [66][75][76][77][78][79].
Nursery crop management is time-consuming and labor-intensive, creating a great need for automation, especially for large nursery production areas. Sensing-based plant biometrics, identification, and recognition are promising but challenging tasks. Rapid advancements in sensing, computation, artificial intelligence (AI), and data analytics have allowed more detailed investigations in this domain. Research has been reported on identifying tree species for management operations and counting plants for inventory control using different types of sensors, including RGB, multispectral, and LiDAR. A few recent studies have utilized state-of-the-art deep learning techniques for nursery plant classification; however, more effort is needed to facilitate growers' use of such techniques for the profitability and sustainability of the nursery industry.

6. Other Significant Works

The economics of production practices associated with fertilizer inputs, pest control needs, and labor requirements affect the nursery industry. Most nursery production operations are labor-intensive; according to Gunjal et al. [80], labor accounts for 70% of nursery production costs. Though a few operations in nursery production have been mechanized, many others have not been automated. Advanced sensing and mechanization/automation could reduce resource consumption and labor dependence [74]. In this context, the ornamental nursery industry has witnessed some progress in sensing, automation, and robotic applications. Li et al. [81] developed a trimming robot for ornamental plants. The design includes a knife system and a rotary base, allowing the knife to rotate 360 degrees to cut the plants into the desired shape. The robot was tested on five nursery plant species (Aglaia odorata, Murraya exotica, Camellia oleifera, Osmanthus fragrans, and Radermachera sinica), and results indicated an overall performance above 93% with an operation time of 8.89 s. Zhang et al. [82] developed a path-planning scheme for a watering robot for containerized ornamental nursery plants. The authors optimized the robot's path planning using a genetic algorithm with neighbor exchanging to test different watering strategies and achieved promising results in terms of water savings. Sharma and Borse [83] developed an autonomous mobile robot to carry out different production operations in the nursery. The robot featured multiple sensor modules, including camera and climate monitoring, to perform real-time growth monitoring, disease detection, and the spraying of fertilizer, pesticide, and water. The platform was also equipped with a Zigbee communication framework to transmit the sensed data to the central control system. The system achieved the desired results for disease detection and growth monitoring; however, no technical details were provided.
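The genetic-algorithm path optimization used for the watering robot can be sketched as a small traveling-salesman-style search over container positions. The positions, population size, and truncation-selection scheme below are illustrative assumptions rather than the authors' implementation; the "neighbor exchanging" idea appears here as an adjacent-swap mutation:

```python
import math
import random

random.seed(42)

# Hypothetical container positions the watering robot must visit (x, y), in m
plants = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 5), (6, 2), (1, 6)]

def tour_length(order):
    """Total length of the closed visiting route."""
    return sum(math.dist(plants[order[i]], plants[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def neighbor_exchange(order):
    """Swap two adjacent visits -- the 'neighbor exchanging' mutation."""
    child = order[:]
    i = random.randrange(len(child) - 1)
    child[i], child[i + 1] = child[i + 1], child[i]
    return child

def ga(generations=300, pop_size=30):
    pop = [random.sample(range(len(plants)), len(plants)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=tour_length)
        elite = pop[: pop_size // 2]                  # truncation selection
        pop = elite + [neighbor_exchange(random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=tour_length)

best = ga()
print(best, round(tour_length(best), 2))
```

A production planner would extend the fitness function beyond route length, e.g., to account for tank capacity, refill stations, and per-plant water demand, which is where the water-saving strategies enter.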
Similarly, a conceptual design of a cable-driven parallel robot (CDPR) to perform different operations, including seeding, weeding, and nutrition monitoring for plant nurseries, has been presented [84]. The authors performed operational and path-planning simulations to execute seeding and weeding operations. Additionally, a pretrained VGG16 model was used for weed identification, achieving a promising accuracy of 96.29% during testing. Despite this progress, research-based findings for robotic applications in the nursery industry lag far behind those of contemporary industries.
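The transfer-learning pattern behind the pretrained-VGG16 weed classifier (freeze a feature-extracting backbone, train only a new classification head) can be sketched in NumPy. Everything below is a synthetic illustration: a fixed random projection stands in for the pretrained convolutional layers, and the data are simulated, not real weed images.

```python
import numpy as np

rng = np.random.default_rng(7)

# Frozen "backbone": a fixed random projection standing in for pretrained
# VGG16 convolutional features (synthetic illustration, not the real model)
D_IN, D_FEAT = 64, 16
W_backbone = rng.normal(size=(D_IN, D_FEAT))

def features(x):
    return np.maximum(0.0, x @ W_backbone)   # frozen ReLU features

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

# Synthetic "weed" vs "crop" inputs separated along one direction
direction = rng.normal(size=D_IN)
X = rng.normal(size=(200, D_IN)) + 3.0 * np.outer(np.repeat([1.0, -1.0], 100), direction)
y = np.repeat([1.0, 0.0], 100)

# Train only the new classification head (logistic regression); the
# backbone weights are never updated, mirroring frozen transfer learning
F = features(X)
w, b = np.zeros(D_FEAT), 0.0
for _ in range(500):
    p = sigmoid(F @ w + b)
    grad = p - y
    w -= 0.1 * F.T @ grad / len(y)
    b -= 0.1 * grad.mean()

acc = ((sigmoid(F @ w + b) > 0.5) == (y == 1.0)).mean()
print("training accuracy:", acc)
```

With an actual deep learning framework, the same pattern amounts to loading VGG16 with pretrained weights, disabling gradients on the convolutional layers, and fitting only a replacement classifier head on the nursery image dataset.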


  1. USDA. U.S. Horticulture in 2014 (Publication ACH12-33); United States Department of Agriculture: Beltsville, MD, USA. Available online: (accessed on 21 November 2022).
  2. Lea-Cox, J.D.; Zhao, C.; Ross, D.S.; Bilderback, T.E.; Harris, J.R.; Day, S.D.; Hong, C.; Yeager, T.H.; Beeson, R.C.; Bauerle, W.L.; et al. A Nursery and Greenhouse Online Knowledge Center: Learning Opportunities for Sustainable Practice. HortTechnology 2010, 20, 509–517.
  3. Majsztrik, J.C.; Fernandez, R.T.; Fisher, P.R.; Hitchcock, D.R.; Lea-Cox, J.; Owen, J.S.; Oki, L.R.; White, S.A. Water Use and Treatment in Container-Grown Specialty Crop Production: A Review. Water. Air. Soil Pollut. 2017, 228, 151.
  4. Majsztrik, J.; Lichtenberg, E.; Saavoss, M. Ornamental Grower Perceptions of Wireless Irrigation Sensor Networks: Results from a National Survey. HortTechnology 2013, 23, 775–782.
  5. Wheeler, W.D.; Thomas, P.; van Iersel, M.; Chappell, M. Implementation of Sensor-Based Automated Irrigation in Commercial Floriculture Production: A Case Study. HortTechnology 2018, 28, 719–727.
  6. Rihn, A.L.; Velandia, M.; Warner, L.A.; Fulcher, A.; Schexnayder, S.; LeBude, A. Factors Correlated with the Propensity to Use Automation and Mechanization by the US Nursery Industry. Agribusiness 2022, 39, 110–130.
  7. USDA ERS. Farm Labor. Available online: (accessed on 20 November 2022).
  8. McClellan, M. Don’t Wait, Automate. Available online: (accessed on 20 November 2022).
  9. Silwal, A.; Davidson, J.R.; Karkee, M.; Mo, C.; Zhang, Q.; Lewis, K. Design, Integration, and Field Evaluation of a Robotic Apple Harvester. J. Field Robot. 2017, 34, 1140–1159.
  10. Liu, B.; Ding, Z.; Tian, L.; He, D.; Li, S.; Wang, H. Grape Leaf Disease Identification Using Improved Deep Convolutional Neural Networks. Front. Plant Sci. 2020, 11, 1082.
  11. Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and Field Evaluation of a Strawberry Harvesting Robot with a Cable-Driven Gripper. Comput. Electron. Agric. 2019, 157, 392–402.
  12. Ye, H.; Huang, W.; Huang, S.; Cui, B.; Dong, Y.; Guo, A.; Ren, Y.; Jin, Y. Recognition of Banana Fusarium Wilt Based on UAV Remote Sensing. Remote Sens. 2020, 12, 938.
  13. Gajjar, R.; Gajjar, N.; Thakor, V.J.; Patel, N.P.; Ruparelia, S. Real-Time Detection and Identification of Plant Leaf Diseases Using Convolutional Neural Networks on an Embedded Platform. Vis. Comput. 2022, 38, 2923–2938.
  14. Lehnert, C.; English, A.; Mccool, C.; Tow, A.W.; Perez, T. Autonomous Sweet Pepper Harvesting for Protected Cropping Systems. IEEE Robot. Autom. Lett. 2017, 2, 872–879.
  15. Birrell, S.; Hughes, J.; Cai, J.Y.; Iida, F. A Field-Tested Robotic Harvesting System for Iceberg Lettuce. J. Field Robot. 2020, 37, 225–245.
  16. Yasukawa, S.; Li, B.; Sonoda, T.; Ishii, K. Development of a Tomato Harvesting Robot. Proc. Int. Conf. Artif. Life Robot. 2017, 22, 408–411.
  17. Amatya, S.; Karkee, M.; Gongal, A.; Zhang, Q.; Whiting, M.D. Detection of Cherry Tree Branches with Full Foliage in Planar Architecture for Automated Sweet-Cherry Harvesting. Biosyst. Eng. 2016, 146, 3–15.
  18. Mahmud, M.S.; Zahid, A.; He, L.; Martin, P. Opportunities and Possibilities of Developing an Advanced Precision Spraying System for Tree Fruits. Sensors 2021, 21, 3262.
  19. Lu, J.; Hu, J.; Zhao, G.; Mei, F.; Zhang, C. An In-Field Automatic Wheat Disease Diagnosis System. Comput. Electron. Agric. 2017, 142, 369–379.
  20. Jiang, P.; Chen, Y.; Liu, B.; He, D.; Liang, C. Real-Time Detection of Apple Leaf Diseases Using Deep Learning Approach Based on Improved Convolutional Neural Networks. IEEE Access 2019, 7, 59069–59080.
  21. Abdulridha, J.; Batuman, O.; Ampatzidis, Y. UAV-Based Remote Sensing Technique to Detect Citrus Canker Disease Utilizing Hyperspectral Imaging and Machine Learning. Remote Sens. 2019, 11, 1373.
  22. Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-Temporal Mapping of the Vegetation Fraction in Early-Season Wheat Fields Using Images from UAV. Comput. Electron. Agric. 2014, 103, 104–113.
  23. Pearse, G.D.; Tan, A.Y.S.; Watt, M.S.; Franz, M.O.; Dash, J.P. Detecting and Mapping Tree Seedlings in UAV Imagery Using Convolutional Neural Networks and Field-Verified Data. ISPRS J. Photogramm. Remote Sens. 2020, 168, 156–169.
  24. Zhang, C.; Atkinson, P.M.; George, C.; Wen, Z.; Diazgranados, M.; Gerard, F. Identifying and Mapping Individual Plants in a Highly Diverse High-Elevation Ecosystem Using UAV Imagery and Deep Learning. ISPRS J. Photogramm. Remote Sens. 2020, 169, 280–291.
  25. Feng, A.; Zhou, J.; Vories, E.D.; Sudduth, K.A.; Zhang, M. Yield Estimation in Cotton Using UAV-Based Multi-Sensor Imagery. Biosyst. Eng. 2020, 193, 101–114.
  26. Lea-Cox, J.D.; Bauerle, W.L.; van Iersel, M.W.; Kantor, G.F.; Bauerle, T.L.; Lichtenberg, E.; King, D.M.; Crawford, L. Advancing Wireless Sensor Networks for Irrigation Management of Ornamental Crops: An Overview. HortTechnology 2013, 23, 717–724.
  27. Cornejo, C.; Haman, D.Z.; Yeager, T.H. Evaluation of Soil Moisture Sensors, and Their Use to Control Irrigation Systems for Containers in the Nursery Industry; ASAE Paper No. 054056; ASAE: St. Joseph, MI, USA, 2005.
  28. Lea-Cox, J.D.; Ristvey, A.G.; Kantor, G.F. Using Wireless Sensor Technology to Schedule Irrigations and Minimize Water Use in Nursery and Greenhouse Production Systems ©. Comb. Proc. Int. Plant Propagators Soc. 2008, 58, 512–518.
  29. Chappell, M.; Dove, S.K.; van Iersel, M.W.; Thomas, P.A.; Ruter, J. Implementation of Wireless Sensor Networks for Irrigation Control in Three Container Nurseries. HortTechnology 2013, 23, 747–753.
  30. van Iersel, M.W.; Chappell, M.; Lea-Cox, J.D. Sensors for Improved Efficiency of Irrigation in Greenhouse and Nursery Production. HortTechnology 2013, 23, 735–746.
  31. Wheeler, W.D.; Chappell, M.; van Iersel, M.; Thomas, P. Implementation of Soil Moisture Sensor Based Automated Irrigation in Woody Ornamental Production. J. Environ. Hortic. 2020, 38, 1–7.
  32. Kim, J.; Chappell, M.; Van Iersel, M.W.; Lea-Cox, J.D. Wireless Sensors Networks for Optimization of Irrigation, Production, and Profit in Ornamental Production. Acta Hortic. 2014, 1037, 643–649.
  33. Coates, R.W.; Delwiche, M.J.; Broad, A.; Holler, M.; Evans, R.; Oki, L.; Dodge, L. Wireless Sensor Network for Precision Irrigation Control in Horticultural Crops; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2012; Volume 3.
  34. Belayneh, B.E.; Lea-Cox, J.D.; Lichtenberg, E. Costs and Benefits of Implementing Sensor-Controlled Irrigation in a Commercial Pot-in-Pot Container Nursery. HortTechnology 2013, 23, 760–769.
  35. Lea-Cox, J.D.; Belayneh, B.E. Implementation of Sensor-Controlled Decision Irrigation Scheduling in Pot-in-Pot Nursery Production. Acta Hortic. 2013, 1034, 93–100.
  36. Manuel Banda-Chávez, J.; Pablo Serrano-Rubio, J.; Osvaldo Manjarrez-Carrillo, A.; Maria Rodriguez-Vidal, L.; Herrera-Guzman, R. Intelligent Wireless Sensor Network for Ornamental Plant Care. In Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, Washington, DC, USA, 21–23 October 2018; Volume 1.
  37. Beeson, R., Jr.; Brooks, J. Evaluation of a Model Based on Reference Crop Evapotranspiration (ETo) for Precision Irrigation Using Overhead Sprinklers during Nursery Production of Ligustrum Japonica. Proc. V Int. Symp. Irrig. Hortic. Crops 2006, 792, 85–90.
  38. Zubler, A.V.; Yoon, J.Y. Proximal Methods for Plant Stress Detection Using Optical Sensors and Machine Learning. Biosensors 2020, 10, 193.
  39. Velázquez-López, N.; Sasaki, Y.; Nakano, K.; Mejía-Muñoz, J.M.; Kriuchkova, E.R. Detection of Powdery Mildew Disease on Rose Using Image Processing with Open CV. Rev. Chapingo Ser. Hortic. 2011, 17, 151–160.
  40. Polder, G.; van der Heijden, G.W.A.M.; van Doorn, J.; Baltissen, T.A.H.M.C. Automatic Detection of Tulip Breaking Virus (TBV) in Tulip Fields Using Machine Vision. Biosyst. Eng. 2014, 117, 35–42.
  41. Poona, N.K.; Ismail, R. Using Boruta-Selected Spectroscopic Wavebands for the Asymptomatic Detection of Fusarium Circinatum Stress. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 3764–3772.
  42. Minaei, S.; Jafari, M.; Safaie, N. Design and Development of a Rose Plant Disease-Detection and Site-Specific Spraying System Based on a Combination of Infrared and Visible Images. J. Agric. Sci. Technol. 2018, 20, 23–36.
  43. Nuanmeesri, S. A Hybrid Deep Learning and Optimized Machine Learning Approach for Rose Leaf Disease Classification. Eng. Technol. Appl. Sci. Res. 2021, 11, 7678–7683.
  44. Polder, G.; van der Heijden, G.W.A.M.; van Doorn, J.; Clevers, J.G.P.W.; van der Schoor, R.; Baltissen, A.H.M.C. Detection of the Tulip Breaking Virus (TBV) in Tulips Using Optical Sensors. Precis. Agric. 2010, 11, 397–412.
  45. Polder, G.; Pekkeriet, E.; Snikkers, M. A Spectral Imaging System for Detection of Botrytis in Greenhouses. In Proceedings of the EFITA-WCCA-CIGR Conference “Sustainable Agriculture through ICT Innovation”, Turin, Italy, 24–27 June 2013.
  46. Heim, R.H.J.; Wright, I.J.; Allen, A.P.; Geedicke, I.; Oldeland, J. Developing a Spectral Disease Index for Myrtle Rust (Austropuccinia psidii). Plant Pathol. 2019, 68, 738–745.
  47. Pethybridge, S.J.; Hay, F.; Esker, P.; Groom, T.; Wilson, C.; Nutter, F.W. Visual and Radiometric Assessments for Yield Losses Caused by Ray Blight in Pyrethrum. Crop Sci. 2008, 48, 343–352.
  48. Jafari, M.; Minaei, S.; Safaie, N. Detection of Pre-Symptomatic Rose Powdery-Mildew and Gray-Mold Diseases Based on Thermal Vision. Infrared Phys. Technol. 2017, 85, 170–183.
  49. Jafari, M.; Minaei, S.; Safaie, N.; Torkamani-Azar, F.; Sadeghi, M. Classification Using Radial-Basis Neural Networks Based on Thermographic Assessment of Botrytis Cinerea Infected Cut Rose Flowers Treated with Methyl Jasmonate. J. Crop Prot. 2016, 5, 591–602.
  50. Buitrago, M.F.; Groen, T.A.; Hecker, C.A.; Skidmore, A.K. Changes in Thermal Infrared Spectra of Plants Caused by Temperature and Water Stress. ISPRS J. Photogramm. Remote. Sens. 2016, 111, 22–31.
  51. Braman, S.; Chappell, M.; Chong, J.; Fulcher, A.; Gauthier, N.; Klingeman, W.; Knox, G.; LeBude, A.; Neal, J.; White, S.; et al. Pest Management Strategic Plan for Container and Field-Produced Nursery Crops: Revision 2015. In Proceedings of the Southern Nursery Integrated Pest Management Working Group (SNIPM), Mills River, NC, USA, 30–31 July 2009; Volume 236.
  52. Mizell, R.F.; Short, D.E. Integrated Pest Management in the Commercial Ornamental Nursery. 2015; Volume 8. Available online: (accessed on 20 November 2022).
  53. Chen, Y.; Zhu, H.; Ozkan, H.E. Development of a Variable-Rate Sprayer with Laser Scanning Sensor to Synchronize Spray Outputs to Tree Structures. Trans. ASABE 2012, 55, 773–781.
  54. Hudson, W.G.; Garber, M.P.; Oetting, R.D.; Mizell, R.F.; Chase, A.R.; Bondari, K. Pest Management in the United States Greenhouse and Nursery Industry: V. Insect and Mite Control. HortTechnology 1996, 6, 216–221.
  55. Zhu, H.; Rosetta, R.; Reding, M.E.; Zondag, R.H.; Ranger, C.M.; Canas, L.; Fulcher, A.; Derksen, R.C.; Ozkan, H.E.; Krause, C.R. Validation of a Laser-Guided Variable-Rate Sprayer for Managing Insects in Ornamental Nurseries. Trans. ASABE 2017, 60, 337–345.
  56. Fox, R.D.; Derksen, R.C.; Zhu, H.; Brazee, R.D.; Svensson, S.A. A History of Air-Blast Sprayer Development and Future Prospects. Trans. ASABE 2008, 51, 405–410.
  57. Chen, L.; Zhu, H.; Horst, L.; Wallhead, M.; Reding, M.; Fulcher, A. Management of Pest Insects and Plant Diseases in Fruit and Nursery Production with Laser-Guided Variable-Rate Sprayers. HortScience 2021, 56, 94–100.
  58. Zhu, H.; Jeon, H.Y.; Gu, J.; Derksen, R.C.; Krause, C.R.; Ozkan, H.E.; Chen, Y.; Reding, M.E.; Ranger, C.M.; Cañas, L.; et al. Development of Two Intelligent Spray Systems for Ornamental Nurseries©. In Proceedings of the International Plant Propagators’ Society, Miami, FL, USA, 1 August 2010; Volume 60, p. 322.
  59. Jeon, H.; Zhu, H. Development of a Variable-Rate Sprayer for Nursery Liner Applications. Trans. ASABE 2012, 55, 303–312.
  60. Liu, H.; Zhu, H.; Shen, Y.; Chen, Y. Embedded Computer-Controlled Laser Sensor-Guided Air-Assisted Precision Sprayer Development. In Proceedings of the ASABE Annual International Meeting, New Orleans, LA, USA, 26–29 July 2015.
  61. Shen, Y.; Zhu, H.; Liu, H.; Chen, Y.; Ozkan, E. Development of a Laser-Guided, Embedded-Computercontrolled, Air-Assisted Precision Sprayer. Trans. ASABE 2017, 60, 1827–1838.
  62. Chen, L.; Wallhead, M.; Zhu, H.; Fulcher, A. Control of Insects and Diseases with Intelligent Variable-Rate Sprayers in Ornamental Nurseries. J. Environ. Hortic. 2019, 37, 90–100.
  63. Fessler, L.; Fulcher, A.; Schneider, L.; Wright, W.C.; Zhu, H. Reducing the Nursery Pesticide Footprint with Laser-Guided, Variable-Rate Spray Application Technology. HortScience 2021, 141, 1572–1584.
  64. Wei, J.; Salyani, M. Development of a Laser Scanner for Measuring Tree Canopy Characteristics: Phase 1. Prototype Development. Trans. Am. Soc. Agric. Eng. 2004, 47, 2101–2107.
  65. Campbell, J.; Sarkhosh, A.; Habibi, F.; Ismail, A.; Gajjar, P.; Zhongbo, R.; Tsolova, V.; El-sharkawy, I. Biometrics Assessment of Cluster- and Berry-related Traits of Muscadine Grape Population. Plants 2021, 10, 1067.
  66. Zhang, R.; Tian, Y.; Zhang, J.; Dai, S.; Hou, X.; Wang, J.; Guo, Q. Metric Learning for Image-Based Flower Cultivars Identification. Plant Methods 2021, 17, 1–14.
  67. Maltoni, D.; Maio, D.; Jain, A.K.; Prabhakar, S. Handbook of Fingerprint Recognition; Springer Science and Business Media: New York, NY, USA, 2009.
  68. Shearer, S.A.; Holmes, R.G. Plant Identification Using Color Co-Occurrence Matrices. Trans. ASAE 1990, 33, 1237–1244.
  69. She, Y.; Ehsani, R.; Robbins, J.; Leiva, J.N.; Owen, J. Applications of High-Resolution Imaging for Open Field Container Nursery Counting. Remote Sens. 2018, 10, 2018.
  70. Leiva, J.N.; Robbins, J.; Saraswat, D.; She, Y.; Ehsani, R. Evaluating Remotely Sensed Plant Count Accuracy with Differing Unmanned Aircraft System Altitudes, Physical Canopy Separations, and Ground Covers. J. Appl. Remote Sens. 2017, 11, 036003.
  71. Yuan, X.; Li, D.; Sun, P.; Wang, G.; Ma, Y. Real-Time Counting and Height Measurement of Nursery Seedlings Based on Ghostnet–YoloV4 Network and Binocular Vision Technology. Forests 2022, 13, 1459.
  72. Gini, R.; Sona, G.; Ronchetti, G.; Passoni, D.; Pinto, L. Improving Tree Species Classification Using UAS Multispectral Images and Texture Measures. ISPRS Int. J. Geo-Inf. 2018, 7, 315.
  73. Weiss, U.; Biber, P.; Laible, S.; Bohlmann, K.; Zell, A. Plant Species Classification Using a 3D LIDAR Sensor and Machine Learning. In Proceedings of the 2010 Ninth International Conference on Machine Learning and Applications, Washington, DC, USA, 12–14 December 2010; pp. 339–345.
  74. Garrido, M.; Perez-Ruiz, M.; Valero, C.; Gliever, C.J.; Hanson, B.D.; Slaughter, D.C. Active Optical Sensors for Tree Stem Detection and Classification in Nurseries. Sens. Switz. 2014, 14, 10783–10803.
  75. Alipour, N.; Tarkhaneh, O.; Awrangjeb, M.; Tian, H. Flower Image Classification Using Deep Convolutional Neural Network. In Proceedings of the 2021 7th International Conference on Web Research (ICWR), Tehran, Iran, 19–20 May 2021; pp. 1–4.
  76. Dharwadkar, S.; Bhat, G.; Subba Reddy, N.V.; Aithal, P.K. Floriculture Classification Using Simple Neural Network and Deep Learning. In Proceedings of the 2017 2nd IEEE International Conference on Recent Trends in Electronics, Information & Communication Technology (RTEICT), Bangalore, India, 19–20 May 2017; pp. 619–622.
  77. Malik, M.; Aslam, W.; Nasr, E.A.; Aslam, Z.; Kadry, S. A Performance Comparison of Classification Algorithms for Rose Plants. Comput. Intell. Neurosci. 2022, 2022, 1842547.
  78. Narvekar, C.; Rao, M. Flower Classification Using CNN and Transfer Learning in CNN-Agriculture Perspective. In Proceedings of the 2020 3rd International Conference on Intelligent Sustainable Systems (ICISS), Thoothukudi, India, 3–5 December 2020; pp. 660–664.
  79. Soleimanipour, A.; Chegini, G.R. A Vision-Based Hybrid Approach for Identification of Anthurium Flower Cultivars. Comput. Electron. Agric. 2020, 174, 105460.
  80. Gunjal, S.; Waskar, D.; Dod, V.; Bhujbal, B.; Ambad, S.N.; Rajput, H.; Hendre, P.; Thoke, N.; Bhaskar, M. Horticulture Nursery Management. 2012. Available online: (accessed on 20 November 2022).
  81. Li, M.; Ma, L.; Zong, W.; Luo, C.; Huang, M.; Song, Y. Design and Experimental Evaluation of a Form Trimming Machine for Horticultural Plants. Appl. Sci. Switz. 2021, 11, 2230.
  82. Zhang, M.; Guo, W.; Wang, L.; Li, D.; Hu, B.; Wu, Q. Modeling and Optimization of Watering Robot Optimal Path for Ornamental Plant Care. Comput. Ind. Eng. 2021, 157, 107263.
  83. Sharma, S.; Borse, R. Automatic Agriculture Spraying Robot with Smart Decision Making. Adv. Intell. Syst. Comput. 2016, 530, 743–758.
  84. Prabha, P.; Vishnu, R.S.; Mohan, H.T.; Rajendran, A.; Bhavani, R.R. A Cable Driven Parallel Robot for Nursery Farming Assistance. In Proceedings of the 2021 IEEE 9th Region 10 Humanitarian Technology Conference (R10-HTC), Bangalore, India, 30 September–2 October 2021; pp. 1–6.
Entry Collection: Remote Sensing Data Fusion
Update Date: 15 Feb 2023