Deep Learning in Controlled Environment Agriculture

Controlled environment agriculture (CEA) is an unconventional production system that is resource efficient, uses less space, and produces higher yields. Deep learning (DL) has been introduced in CEA for different applications including crop monitoring, detecting biotic and abiotic stresses, irrigation, microclimate prediction, energy efficient controls, and crop growth prediction.

smart farming; greenhouse; deep neural networks; indoor agriculture; plant factory

1. Introduction

Sustainable access to high-quality food is a problem in both developed and developing countries. Rapid urbanization, climate change, and depleting natural resources have raised concerns about global food security. Additionally, rapid population growth further aggravates the food insecurity challenge. According to the World Health Organization, food production needs to increase by 70% to meet the food demand of about 10 billion people by 2050 [1], of whom about 6.5 billion will be living in urban areas [2]. A significant amount of food is produced in open fields using traditional agricultural practices, which results in low yields per square foot of land used. Simply increasing the agricultural land is not a long-term option because of the associated risks of land degradation, deforestation, and increased emissions from transportation to urban areas [3]. Thus, alternative production systems are essential to offset these challenges and establish a sustainable food supply chain.
Controlled environment agriculture (CEA), including greenhouses, high tunnels, vertical farms (vertical or horizontal plane), and plant factories, is increasingly considered an important strategy to address global food challenges [4]. CEA is further categorized based on the growing medium and production technology (hydroponics, aquaponics, aeroponics, and soil-based). CEA integrates knowledge across multiple disciplines to optimize crop quality and production efficiency even without sufficient arable land. Globally, the CEA market grew by about 19% in 2020 and is projected to expand at a compound annual growth rate of 25% during the 2021–28 period [5]. The US CEA market is predicted to reach $3 billion by 2024, with annual growth of about 24% [6]. Advocates of CEA claim that the system is more than 90% efficient in water use, produces 10–250 times higher yields per unit area, and generates 80% less waste than traditional field production, while also reducing food transportation miles in urban areas [3][7][8].
Despite all these benefits, the CEA industry struggles to achieve economic sustainability due to inefficient microclimate and rootzone-environment controls and high costs. Microclimate control, including light, temperature, airflow, carbon dioxide, and humidity, is a major challenge in CEA and is essential for producing uniform, high-quantity, high-quality crops [9]. In the last decade, substantial research has been carried out on implementing intelligent systems in CEA facilities, such as nutrient solution management for hydroponic farms [10] and cloud-based micro-environment monitoring and control systems for vertical farms [11]. Further, artificial intelligence (AI) algorithms have created new opportunities for intelligent predictions and self-learning [12]. DL has gained significant attention in the last few years due to its massive footprint in many modern-day technologies. DL algorithms applied across CEA operations have provided insights that support growers' decisions and actions. Computer vision and DL algorithms have been implemented to automate irrigation in vertically stacked farms [13] and microclimate control [14], enabling growers to carry out quantitative assessments for high-level decision-making.

2. What Are the Most Commonly Used DL Models in CEA, and What Are Their Benefits and Drawbacks?

In CEA, DL models have been applied to a variety of tasks, such as crop phenotyping, disease and small insect detection, growth monitoring, nutrient status and stress level monitoring, microclimatic condition prediction, and robotic harvesting, all of which require large amounts of data for the machine to learn from. The architectures have been implemented in various forms, including deep belief networks (DBN), convolutional neural networks (CNN), recurrent neural networks (RNN), stacked auto-encoders, long short-term memory (LSTM), and hybrid approaches. The CNN, which offers three primary benefits, namely parameter sharing, sparse interactions, and equivariant representations, is a popular and commonly used deep learning approach. A CNN's feature mapping applies k filters across the spatial dimensions and channels of the input [15], and the width and height of the resulting feature maps are reduced by pooling. CNNs use convolution operations with these filters to capture semantic correlations in multi-dimensional data, pooling layers for spatial scaling, and shared weights for memory reduction, which together allow hidden patterns to be uncovered. As a result, the CNN architecture has a significant advantage in interpreting spatial data, and its accuracy generally improves as the number of convolutional layers increases.
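To make this concrete, the following is a minimal sketch (in PyTorch; layer sizes, input resolution, and class count are illustrative assumptions, not any architecture from the reviewed studies) of a CNN that combines shared convolutional filters, pooling for spatial reduction, and a small classifier head.

```python
# Minimal CNN sketch: shared filters, pooling, and a classifier head.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 4):          # class count is a placeholder
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # shared filters over the whole image
            nn.ReLU(),
            nn.MaxPool2d(2),                             # pooling halves width and height
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                     # collapse remaining spatial dims
            nn.Flatten(),
            nn.Linear(32, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: a batch of four 3-channel 128x128 crop images -> class scores.
scores = SmallCNN()(torch.randn(4, 3, 128, 128))
print(scores.shape)  # torch.Size([4, 4])
```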
RNNs and LSTMs are very useful for processing time-series data, which are common in CEA. The most well-known RNN variants include Neural Turing Machines (NTM), Gated Recurrent Units (GRU), and Long Short-Term Memory (LSTM), with LSTM being the most popular for CEA applications. Autoencoders (AE), typically used for data dimensionality reduction, compression, and fusion, automatically learn representations of unlabeled input data through two operations: encoding and decoding. Encoding the input yields a code, which is subsequently decoded to produce an output, and the network is trained by back-propagation so that the output reproduces the input. A DBN is created by stacking a number of distinct unsupervised networks, such as restricted Boltzmann machines (RBMs), so that each layer is connected to both the previous and subsequent layers; as a result, DBNs are typically built from two or more stacked RBMs, and they have been applied in CEA [16]. Each DL approach has features that make it better suited than the others to certain applications in CEA. Hybrid models, which integrate several deep learning techniques, are reported to address the shortcomings of single DL methods.
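As an illustration of the encode/decode scheme described above, the following minimal sketch (assuming generic unlabeled sensor vectors; all dimensions are illustrative) trains an autoencoder by back-propagation so that its output reproduces its input, leaving a low-dimensional code that could be reused for compression or fusion.

```python
# Minimal autoencoder sketch: encode to a small code, decode back to the input.
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, n_inputs: int = 32, code_size: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_inputs, 16), nn.ReLU(),
                                     nn.Linear(16, code_size))
        self.decoder = nn.Sequential(nn.Linear(code_size, 16), nn.ReLU(),
                                     nn.Linear(16, n_inputs))

    def forward(self, x):
        code = self.encoder(x)        # compressed representation
        return self.decoder(code)     # reconstruction of the input

model = Autoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(64, 32)               # dummy unlabeled sensor readings
for _ in range(100):                  # reconstruction training loop
    optimizer.zero_grad()
    loss = loss_fn(model(x), x)       # output should match the input
    loss.backward()
    optimizer.step()
```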

3. Deep Learning in Greenhouses

This section and the next address the main application domains of DL in CEA (RQ.2).

3.1. Microclimate Condition Prediction

Maintaining the greenhouse at its ideal operating conditions throughout all phases of plant growth requires an understanding of the microclimate and its characteristics. A greenhouse can increase crop yield by operating at the optimal temperature, humidity, carbon dioxide (CO2) concentration, and other microclimate parameters at each stage of plant growth. For instance, in cold climates, higher indoor air temperatures, achieved by preserving the greenhouse effect or using appropriate heating technology, are necessary for maximum plant growth. In very hot regions, on the other hand, the greenhouse effect is needed only for a brief period of around 2–3 months, while other suitable cooling systems are needed [17]. Accurate prediction of a greenhouse's internal environmental factors using DL approaches is one of the recent trends in CEA.
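As an illustration of this trend, the following is a minimal sketch (not a reproduction of any cited model; the window length, feature set, and hidden size are assumptions) of an LSTM that maps a window of past greenhouse readings such as temperature, humidity, and CO2 to the next indoor temperature value.

```python
# Minimal LSTM sketch for one-step-ahead microclimate (temperature) prediction.
import torch
import torch.nn as nn

class MicroclimateLSTM(nn.Module):
    def __init__(self, n_features: int = 3, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)      # predict the next indoor temperature

    def forward(self, x):                     # x: (batch, time_steps, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])       # use the last hidden state

model = MicroclimateLSTM()
window = torch.randn(8, 24, 3)                # 8 samples of 24 hourly readings (dummy data)
print(model(window).shape)                    # torch.Size([8, 1])
```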

3.2. Yield Estimation

Crop detection, one of the most important topics in smart agriculture and especially in greenhouse production, is critical for matching crop supply with demand and for crop management to boost productivity. Many of the surveyed studies demonstrate the application of DL models to crop yield estimation. The Single Shot MultiBox Detector (SSD) method was used in [18][19][20][21] to detect tomato crops in the greenhouse environment, followed by robotic harvesting. Other applications of SSD include detecting oyster mushrooms [22] and sweet pepper [23]. Another DL model, You Only Look Once (YOLO), has been utilized with different modifications for crop yield estimation, as demonstrated in [20][21][24][25][26][27][28]. As described in [29][30][31][32][33][34], R-CNN models such as Mask R-CNN and Faster R-CNN, two of the most widely used DL models, are applied in crop yield prediction, especially for tomato and strawberry. Other custom DL models for detecting crops have been proposed in [35][36][37][38].
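For illustration, the following minimal sketch shows the detection-based counting idea behind these studies using a generic pretrained Faster R-CNN from torchvision (COCO weights, torchvision >= 0.13) as a stand-in; in practice the detector would be fine-tuned on labelled greenhouse images, and the confidence threshold shown is an assumption.

```python
# Minimal detection-based counting sketch with a generic pretrained detector.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()   # COCO weights as a stand-in

image = torch.rand(3, 480, 640)              # stand-in for a greenhouse RGB image in [0, 1]
with torch.no_grad():
    prediction = model([image])[0]           # dict with "boxes", "labels", "scores"

keep = prediction["scores"] > 0.5            # assumed confidence threshold
estimated_count = int(keep.sum())            # crude per-image fruit count
print(estimated_count, prediction["boxes"][keep].shape)
```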

3.3. Disease Detection and Classification

Disease control in greenhouse environments is one of the most pressing issues in agriculture. Spraying pesticides/insecticides uniformly over the agricultural area is the most common disease control method. Although effective, this approach comes at a tremendous financial cost. Image recognition techniques using DL can dramatically increase efficiency and speed while reducing recognition cost. Based on the assessments of the evaluated publications, the diseases of cucumber discussed are powdery mildew (PM) [39][40][41], downy mildew (DM) [34][39][40][41], and virus disease [41]. Wheat disease, reported in [42], is another disease covered in the examined research.
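As an illustration of this image-recognition approach, the following minimal sketch fine-tunes a generic pretrained backbone (ResNet-18 from torchvision, requiring torchvision >= 0.13) for leaf disease classification; the class names and training data are placeholders, not those of the cited studies.

```python
# Minimal transfer-learning sketch for leaf disease classification.
import torch
import torch.nn as nn
from torchvision.models import resnet18

classes = ["healthy", "powdery_mildew", "downy_mildew", "virus"]   # assumed labels

model = resnet18(weights="DEFAULT")
model.fc = nn.Linear(model.fc.in_features, len(classes))           # new classification head

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on dummy tensors standing in for leaf images.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, len(classes), (8,))
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
```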

3.4. Growth Monitoring

Plant growth monitoring is one of the applications where DL techniques have been applied in greenhouse production. It encompasses various areas such as length estimation at all crop growth stages, as demonstrated in [43][44], and detection of anomalies in plant growth [45][46]. Other areas where plant growth monitoring is applied are the prediction of phyto-morphological descriptors [47], seedling vigor rating [48], leaf-shape estimation [49], and spike detection and segmentation [50].

3.5. Nutrient Detection and Estimation

It is crucial for crop management in greenhouses to accurately diagnose the nutritional state of crops, because both an excess and a lack of nutrients can result in severe damage and decreased output. The goal of automatically identifying nutritional deficiencies is comparable to that of automatically recognizing diseases, in that both involve finding the visual signs that characterize the disorder of concern. Based on the survey, few works are dedicated to DL for nutrient estimation compared with those utilizing DL for nutrient detection. The goal of nutrient detection is to identify one of these pertinent deficiencies; therefore, symptoms that do not appear to be connected to the targeted disorders are disregarded. The studies [51][52] employed the autoencoder approach to detect lead content and nutrient deficiencies, respectively. CNN models were also frequently used for nutrient detection, as demonstrated for soybean leaf defoliation [53], nutrient concentration [54], nutrient deficiencies [52], net photosynthesis modeling [55], and calcium and magnesium deficiencies [56]. In [16], the cadmium concentration of lettuce leaves was estimated using a different DL model, a DBN optimized with particle swarm optimization.
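To illustrate the estimation side of this distinction, the following minimal sketch replaces a classification head with a single regression output that predicts a continuous quantity, such as a nutrient concentration, from a leaf image; the backbone, loss, and data are illustrative assumptions rather than any cited model.

```python
# Minimal CNN regression sketch: one continuous value per leaf image.
import torch
import torch.nn as nn
from torchvision.models import resnet18

model = resnet18(weights=None)                  # could also start from pretrained weights
model.fc = nn.Linear(model.fc.in_features, 1)   # regression head: one value per image

loss_fn = nn.MSELoss()
images = torch.randn(4, 3, 224, 224)            # dummy leaf images
targets = torch.rand(4, 1)                      # dummy normalized concentrations
loss = loss_fn(model(images), targets)          # regression loss instead of cross-entropy
loss.backward()
```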

3.6. Small Insect Detection

The intricate nature of pest control in greenhouses calls for a methodical approach to early and accurate pest detection. An automatic detection approach (i.e., DL) for small insects in a greenhouse is all the more valuable because it allows trap counts to be obtained quickly and efficiently. The most prevalent greenhouse insects in the studies are whiteflies and thrips [57][58][59][60]. The survey identified four studies applying DL models (mostly CNN architectures) to tiny pest detection.
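As an illustration of the counting step that follows detection on sticky-trap images, the following minimal sketch (with hypothetical class labels and threshold) tallies per-class detections from a torchvision-style detector output.

```python
# Minimal per-class trap-count sketch from a detector's output dictionary.
import torch

def trap_counts(prediction: dict, class_names: dict, threshold: float = 0.5) -> dict:
    """Count detections per insect class from a torchvision-style prediction."""
    keep = prediction["scores"] > threshold
    labels = prediction["labels"][keep].tolist()
    return {name: labels.count(idx) for idx, name in class_names.items()}

# Dummy detector output: two whiteflies and one thrips above the threshold.
pred = {"labels": torch.tensor([1, 1, 2, 2]),
        "scores": torch.tensor([0.9, 0.8, 0.7, 0.3])}
print(trap_counts(pred, {1: "whitefly", 2: "thrips"}))  # {'whitefly': 2, 'thrips': 1}
```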

3.7. Robotic Harvesting

Robotics has evolved into a new "agricultural tool" in an era of highly advanced smart agriculture technology. The development of agricultural robots has been hastened by the integration of digital tools, sensors, and control technologies, exhibiting tremendous potential and advantages in modern farming. These developments span from rapidly digitizing plants with precise, detailed temporal and spatial information to completing challenging nonlinear control tasks for robot navigation. High-value crops planted in CEA (i.e., tomato, sweet pepper, cucumber, and strawberry) ripen heterogeneously and require selective harvesting of only the ripe fruits. According to the research, few works have utilized DL for robotic harvesting applications; examples include picking-point positioning for grapes [61], obstacle separation in robotic tomato harvesting [62], 3D pose detection for tomato bunches [63], and target tomato position estimation [64].

3.8. Others

Other DL applications in CEA include predicting the lifetime and mechanical properties of low-density polyethylene (LDPE) greenhouse films using a hybrid model that integrates SVM and CNN [65].

4. Deep Learning in Indoor Farms

This section presents the main applications of the works that utilized DL in indoor farms (vertical farms, shipping containers, plant factories, etc.).

4.1. Stress-Level Monitoring

To reduce both acute and chronic productivity loss, early detection of plant stress is crucial in CEA production. Rapid detection and decision-making are necessary when stress manifests in plants in order to manage the stress and prevent economic loss. Only a few DL-based stress-level monitoring studies have been reported for plant factories. Stress-level monitoring encompasses various areas such as water stress classification [66], tip-burn stress detection [67], lettuce light stress grading [68], and abnormal leaf sorting [69].

4.2. Growth Monitoring

In an indoor farm, it is critical to maintain a climate that promotes crop development through ongoing monitoring of farm conditions. Crop status is critical for determining the optimal cultivation environment, and by continuously monitoring it, a crop-optimized farm environment can feasibly be maintained. In contrast to traditional methods, which are time-consuming, DL models are required to automate the monitoring system and increase measurement accuracy. Several studies have used DL models for growth monitoring in indoor farms, including plant biomass monitoring [70], a growth prediction model for arabidopsis [71], a growth prediction model for lettuce [72], vision-based plant phenotyping [73], plant growth prediction algorithms [74][75], and the development of an automatic plant factory control system [76].

4.3. Yield Estimation

Due to their advantages over traditional methods in terms of accuracy, speed, robustness, and even resolving complicated agricultural scenarios, DL methods have been applied to yield estimation and counting research in indoor farming systems. The domains covered by yield estimation and counting in the examined publications include the identification of rapeseed [77] and cherry tomatoes [78].

References

  1. World Health Organization. The State of Food Security and Nutrition in the World 2018: Building Climate Resilience for Food Security and Nutrition; Food and Agriculture Organization: Rome, Italy, 2018.
  2. Avtar, R.; Tripathi, S.; Aggarwal, A.K.; Kumar, P. Population–Urbanization–Energy Nexus: A Review. Resources 2019, 8, 136.
  3. Benke, K.; Tomkins, B. Future Food-Production Systems: Vertical Farming and Controlled-Environment Agriculture. Sustain. Sci. Pract. Policy 2017, 13, 13–26.
  4. Saad, M.H.M.; Hamdan, N.M.; Sarker, M.R. State of the Art of Urban Smart Vertical Farming Automation System: Advanced Topologies, Issues and Recommendations. Electronics 2021, 10, 1422.
  5. Fortune Business Insights. Vertical Farming Market to Rise at 25.2% CAGR by 2028; Increasing Number of Product Launches Will Aid Growth, Says Fortune Business Insights™. Available online: https://www.globenewswire.com/news-release/2021/06/08/2243245/0/en/vertical-farming-market-to-rise-at-25-2-cagr-by-2028-increasing-number-of-product-launches-will-aid-growth-says-fortune-business-insights.html (accessed on 18 July 2022).
  6. Cision. United States $3 Billion Vertical Farming Market to 2024: Growing Popularity of Plug & Play Farms Scope for Automation Using Big Data and AI. Based on Report, Vertical Farming Market in the U.S.—Industry Outlook and Forecast 2019–2024”, by Research and Markets. Available online: https://www.prnewswire.com/news-releases/united-states-3-billion-vertical-farming-market-to-2024-growing-popularity-of-plug--play-farms--scope-for-automation-using-big-data-and-ai-300783042.html (accessed on 18 July 2022).
  7. Asseng, S.; Guarin, J.R.; Raman, M.; Monje, O.; Kiss, G.; Despommier, D.D.; Meggers, F.M.; Gauthier, P.P. Wheat Yield Potential in Controlled-Environment Vertical Farms. Proc. Natl. Acad. Sci. USA 2020, 117, 19131–19135.
  8. Naus, T. Is Vertical Farming Really Sustainable. EIT Food. Available online: https://www.eitfood.eu/blog/post/is-vertical-farming-really-sustainable (accessed on 18 July 2022).
  9. Chia, T.-C.; Lu, C.-L. Design and Implementation of the Microcontroller Control System for Vertical-Garden Applications. In Proceedings of the 2011 Fifth International Conference on Genetic and Evolutionary Computing, Xiamen, China, 29 August–1 September 2011; pp. 139–141.
  10. Michael, G.; Tay, F.; Then, Y. Development of Automated Monitoring System for Hydroponics Vertical Farming. J. Phys. Conf. 2021, 1844, 012024.
  11. Bhowmick, S.; Biswas, B.; Biswas, M.; Dey, A.; Roy, S.; Sarkar, S.K. Application of IoT-Enabled Smart Agriculture in Vertical Farming. In Advances in Communication, Devices and Networking, Lecture Notes in Electrical Engineering; Springer: Singapore, 2019; Volume 537, pp. 521–528.
  12. Monteiro, J.; Barata, J.; Veloso, M.; Veloso, L.; Nunes, J. Towards Sustainable Digital Twins for Vertical Farming. In Proceedings of the 2018 Thirteenth International Conference on Digital Information Management (ICDIM), Berlin, Germany, 24–26 September 2018; pp. 234–239.
  13. Siregar, R.R.A.; Palupiningsih, P.; Lailah, I.S.; Sangadji, I.B.; Sukmajati, S.; Pahiyanti, A.N.G. Automatic Watering Systems in Vertical Farming Using the Adaline Algorithm. In Proceedings of the International Seminar of Science and Applied Technology (ISSAT 2020), Virtual, 24 November 2020; pp. 429–435.
  14. Ruscio, F.; Paoletti, P.; Thomas, J.; Myers, P.; Fichera, S. Low-cost Monitoring System for Hydroponic Urban Vertical Farms. Int. J. Agric. Biosyst. Eng. 2019, 13, 267–271.
  15. Tao, X.; Zhang, D.; Wang, Z.; Liu, X.; Zhang, H.; Xu, D. Detection of Power Line Insulator Defects using Aerial Images Analyzed with Convolutional Neural Networks. IEEE Trans. Syst. Man Cybern. Syst. 2018, 50, 1486–1498.
  16. Sun, J.; Wu, M.; Hang, Y.; Lu, B.; Wu, X.; Chen, Q. Estimating Cadmium Content in Lettuce Leaves Based on Deep Brief Network and Hyperspectral Imaging Technology. J. Food Process Eng. 2019, 42, e13293.
  17. Aljubury, I.M.A.; Ridha, H.D. Enhancement of Evaporative Cooling System in a Greenhouse using Geothermal Energy. Renew. Energy 2017, 111, 321–331.
  18. Tenorio, G.L.; Caarls, W. Automatic Visual Estimation of Tomato Cluster Maturity in Plant Rows. Mach. Vis. Appl. 2021, 32, 1–18.
  19. Yuan, T.; Lv, L.; Zhang, F.; Fu, J.; Gao, J.; Zhang, J.; Li, W.; Zhang, C.; Zhang, W. Robust Cherry Tomatoes Detection Algorithm in Greenhouse Scene Based on SSD. Agriculture 2020, 10, 160.
  20. Moreira, G.; Magalhaes, S.A.; Pinho, T.; Santos, F.N.d.; Cunha, M. Benchmark of Deep Learning and a Proposed HSV Colour Space Models for the Detection and Classification of Greenhouse Tomato. Agronomy 2022, 12, 356.
  21. Magalhaes, S.A.; Castro, L.; Moreira, G.; Santos, F.N.D.; Cunha, M.; Dias, J.; Moreira, A.P. Evaluating the Single-Shot Multibox Detector and YOLO Deep Learning Models for the Detection of Tomatoes in a Greenhouse. Sensors 2021, 21, 3569.
  22. Rong, J.; Wang, P.; Yang, Q.; Huang, F. A Field-Tested Harvesting Robot for Oyster Mushroom in Greenhouse. Agronomy 2021, 11, 1210.
  23. Arad, B.; Kurtser, P.; Barnea, E.; Harel, B.; Edan, Y.; Ben-Shahar, O. Controlled Lighting and Illumination-Independent Target Detection for Real-Time Cost-Efficient Applications. The Case Study of Sweet Pepper Robotic Harvesting. Sensors 2019, 19, 1390.
  24. Li, X.; Pan, J.; Xie, F.; Zeng, J.; Li, Q.; Huang, X.; Liu, D.; Wang, X. Fast and Accurate Green Pepper Detection in Complex Backgrounds Via an Improved YOLOv4-tiny Model. Comput. Electron. Agric. 2021, 191, 106503.
  25. Lu, C.-P.; Liaw, J.-J.; Wu, T.-C.; Hung, T.-F. Development of a Mushroom Growth Measurement System Applying Deep Learning for Image Recognition. Agronomy 2019, 9, 32.
  26. Zhang, P.; Li, D. YOLO-VOLO-LS: A Novel Method for Variety Identification of Early Lettuce Seedlings. Front. Plant Sci. 2022, 13, 806878.
  27. Lawal, O.M.; Zhao, H. YOLOFig Detection Model Development Using Deep Learning. IET Image Process. 2021, 15, 3071–3079.
  28. Lawal, O.M. YOLOMuskmelon: Quest for Fruit Detection Speed and Accuracy using Deep Learning. IEEE Access 2021, 9, 15221–15227.
  29. Fonteijn, H.; Afonso, M.; Lensink, D.; Mooij, M.; Faber, N.; Vroegop, A.; Polder, G.; Wehrens, R. Automatic Phenotyping of Tomatoes in Production Greenhouses using Robotics and Computer Vision: From Theory to Practice. Agronomy 2021, 11, 1599.
  30. Seo, D.; Cho, B.-H.; Kim, K. Development of Monitoring Robot System for Tomato Fruits in Hydroponic Greenhouses. Agronomy 2021, 11, 2211.
  31. Afonso, M.; Fonteijn, H.; Fiorentin, F.S.; Lensink, D.; Mooij, M.; Faber, N.; Polder, G.; Wehrens, R. Tomato Fruit Detection and Counting in Greenhouses Using Deep Learning. Front. Plant Sci. 2020, 11, 571299.
  32. Zhou, C.; Hu, J.; Xu, Z.; Yue, J.; Ye, H.; Yang, G. A Novel Greenhouse-Based System for the Detection and Plumpness Assessment of Strawberry using an Improved Deep Learning Technique. Front. Plant Sci. 2020, 11, 559.
  33. Mu, Y.; Chen, T.-S.; Ninomiya, S.; Guo, W. Intact Detection of Highly Occluded Immature Tomatoes on Plants using Deep Learning Techniques. Sensors 2020, 20, 2984.
  34. Liu, K.; Zhang, C.; Yang, X.; Diao, M.; Liu, H.; Li, M. Development of an Occurrence Prediction Model for Cucumber Downy Mildew in Solar Greenhouses Based on Long Short-Term Memory Neural Network. Agronomy 2022, 12, 442.
  35. Picon, A.; San-Emeterio, M.G.; Bereciartua-Perez, A.; Klukas, C.; Eggers, T.; Navarra-Mestre, R. Deep Learning-based Segmentation of Multiple Species of Weeds and Corn Crop Using Synthetic and Real Image Datasets. Comput. Electron. Agric. 2022, 194, 106719.
  36. Sun, J.; He, X.; Wu, M.; Wu, X.; Shen, J.; Lu, B. Detection of Tomato Organs based on Convolutional Neural Network under the Overlap and Occlusion Backgrounds. Mach. Vis. Appl. 2020, 31, 1–13.
  37. Islam, M.P.; Nakano, Y.; Lee, U.; Tokuda, K.; Kochi, N. TheLNet270v1–A Novel Deep-Network Architecture for the Automatic Classification of Thermal Images for Greenhouse Plants. Front. Plant Sci. 2021, 12, 630425.
  38. Lyu, B.; Smith, S.D.; Cherkauer, K.A. Fine-Grained Recognition in High-Throughput Phenotyping. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 14–19 June 2020; pp. 72–73.
  39. Zhou, J.; Li, J.; Wang, C.; Wu, H.; Zhao, C.; Wang, Q. A Vegetable Disease Recognition Model for Complex Background based on Region Proposal and Progressive Learning. Comput. Electron. Agric. 2021, 184, 106101.
  40. Zhang, P.; Yang, L.; Li, D. Efficientnet-B4-Ranger: A Novel Method for Greenhouse Cucumber Disease Recognition under Natural Complex Environment. Comput. Electron. Agric. 2020, 176, 105652.
  41. Wang, C.; Zhou, J.; Zhao, C.; Li, J.; Teng, G.; Wu, H. Few-shot Vegetable Disease Recognition Model Based on Image Text Collaborative Representation Learning. Comput. Electron. Agric. 2021, 184, 106098.
  42. Zhang, Z.; Flores, P.; Friskop, A.; Liu, Z.; Igathinathane, C.; Jahan, N.; Mathew, J.; Shreya, S. Enhancing Wheat Disease Diagnosis in a Greenhouse Using Image Deep Features and Parallel Feature Fusion. Front. Plant Sci. 2022, 13, 834447.
  43. Vit, A.; Shani, G.; Bar-Hillel, A. Length Phenotyping with Interest Point Detection. Comput. Electron. Agric. 2020, 176, 105629. Available online: https://www.sciencedirect.com/science/article/pii/S0168169919318939 (accessed on 15 September 2022).
  44. Boogaard, F.P.; Rongen, K.S.; Kootstra, G.W. Robust node detection and tracking in fruit-vegetable crops using deep learning and multi-view imaging. Biosyst. Eng. 2020, 192, 117–132.
  45. Xhimitiku, I.; Bianchi, F.; Proietti, M.; Tocci, T.; Marini, A.; Menculini, L.; Termite, L.F.; Pucci, E.; Garinei, A.; Marconi, M.; et al. Anomaly Detection in Plant Growth in a Controlled Environment using 3D Scanning Techniques and Deep Learning. In Proceedings of the 2021 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Trento, Italy, 3–5 November 2021; pp. 86–91.
  46. Choi, K.; Park, K.; Jeong, S. Classification of Growth Conditions in Paprika Leaf Using Deep Neural Network and Hyperspectral Images. In Proceedings of the 2021 Twelfth International Conference on Ubiquitous and Future Networks (ICUFN), Jeju Island, Korea, 17–20 August 2021; pp. 93–95.
  47. Lauguico, S.; Concepcion, R.; Tobias, R.R.; Alejandrino, J.; Guia, J.D.; Guillermo, M.; Sybingco, E.; Dadios, E. Machine Vision-Based Prediction of Lettuce Phytomorphological Descriptors using Deep Learning Networks. In Proceedings of the IEEE 12th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), Manila, Philippines, 3–7 December 2020; pp. 1–6.
  48. Zhu, F.; He, M.; Zheng, Z. Data Augmentation using Improved cDCGAN for Plant Vigor Rating. Comput. Electron. Agric. 2020, 175, 105603.
  49. Baar, S.; Kobayashi, Y.; Horie, T.; Sato, K.; Suto, H.; Watanabe, S. Non-destructive Leaf Area Index Estimation Via Guided Optical Imaging for Large Scale Greenhouse Environments. Comput. Electron. Agric. 2022, 197, 106911.
  50. Ullah, S.; Henke, M.; Narisetti, N.; Panzarová, K.; Trtílek, M.; Hejatko, J.; Gladilin, E. Towards Automated Analysis of Grain Spikes in Greenhouse Images Using Neural Network Approaches: A Comparative Investigation of Six Methods. Sensors 2021, 21, 7441.
  51. Zhou, X.; Sun, J.; Tian, Y.; Yao, K.; Xu, M. Detection of Heavy Metal Lead in Lettuce Leaves Based on Fluorescence Hyperspectral Technology Combined with Deep Learning Algorithm. Spectrochim. Acta Part Mol. Biomol. Spectrosc. 2022, 266, 120460.
  52. Tran, T.-T.; Choi, J.-W.; Le, T.-T.H.; Kim, J.-W. A Comparative Study of Deep CNN in Forecasting and Classifying the Macronutrient Deficiencies on Development of Tomato Plant. Appl. Sci. 2019, 9, 1601.
  53. da Silva, L.A.; Bressan, P.O.; Gonçalves, D.N.; Freitas, D.M.; Machado, B.B.; Gonçalves, W.N. Estimating Soybean Leaf Defoliation using Convolutional Neural Networks and Synthetic Images. Comput. Electron. Agric. 2019, 156, 360–368.
  54. Ahsan, M.; Eshkabilov, S.; Cemek, B.; Küçüktopcu, E.; Lee, C.W.; Simsek, H. Deep Learning Models to Determine Nutrient Concentration in Hydroponically Grown Lettuce Cultivars (Lactuca sativa L.). Sustainability 2021, 14, 416.
  55. Qu, Y.; Clausen, A.; Jørgensen, B.N. Application of Deep Neural Network on Net Photosynthesis Modeling. In Proceedings of the IEEE 19th International Conference on Industrial Informatics (INDIN), Palma de Mallorca, Spain, 21–23 July 2021; pp. 1–7.
  56. Kusanur, V.; Chakravarthi, V.S. Using Transfer Learning for Nutrient Deficiency Prediction and Classification in Tomato Plan. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 784–790.
  57. Li, W.; Wang, D.; Li, M.; Gao, Y.; Wu, J.; Yang, X. Field detection of tiny pests from sticky trap images using deep learning in agricultural greenhouse. Comput. Electron. Agric. 2021, 183, 106048.
  58. Tureček, T.; Vařacha, P.; Turečková, A.; Psota, V.; Janků, P.; Štěpánek, V.; Viktorin, A.; Šenkeřík, R.; Jašek, R.; Chramcov, B.; et al. Scouting of Whiteflies in Tomato Greenhouse Environment Using Deep Learning. In Agriculture Digitalization and Organic Production; Springer: Singapore, 2022; pp. 323–335.
  59. Wang, D.; Wang, Y.; Li, M.; Yang, X.; Wu, J.; Li, W. Using an Improved YOLOv4 Deep Learning Network for Accurate Detection of Whitefly and Thrips on Sticky Trap Images. Trans. ASABE 2021, 64, 919–927.
  60. Rustia, D.J.A.; Chao, J.-J.; Chiu, L.-Y.; Wu, Y.-F.; Chung, J.-Y.; Hsu, J.-C.; Lin, T.-T. Automatic Greenhouse Insect Pest Detection and Recognition Based on a Cascaded Deep Learning Classification Method. J. Appl. Entomol. 2021, 145, 206–222.
  61. Jin, Y.; Liu, J.; Wang, J.; Xu, Z.; Yuan, Y. Far-near Combined Positioning of Picking-point based on Depth Data Features for Horizontal-Trellis Cultivated Grape. Comput. Electron. Agric. 2022, 194, 106791.
  62. Xiong, Y.; Ge, Y.; From, P.J. An Obstacle Separation Method for Robotic Picking of Fruits in Clusters. Comput. Electron. Agric. 2020, 175, 105397.
  63. Zhang, F.; Gao, J.; Zhou, H.; Zhang, J.; Zou, K.; Yuan, T. Three-Dimensional Pose Detection method Based on Keypoints Detection Network for Tomato Bunch. Comput. Electron. Agric. 2022, 195, 106824.
  64. Gong, L.; Wang, W.; Wang, T.; Liu, C. Robotic Harvesting of the Occluded Fruits with a Precise Shape and Position Reconstruction Approach. J. Field Robot. 2022, 39, 69–84.
  65. Lahcene, A.; Amine, D.M.; Abdelkader, D. A Hybrid Deep Learning Model for Predicting Lifetime and Mechanical Performance Degradation of Multilayer Greenhouse Polyethylene Films. Polym. Sci. Ser. B 2021, 63, 964–977.
  66. Hendrawan, Y.; Damayanti, R.; Riza, D.F.A.; Hermanto, M.B. Classification of Water Stress in Cultured Sunagoke Moss Using Deep Learning. Telkomnika 2021, 19, 1594–1604.
  67. Gozzovelli, R.; Franchetti, B.; Bekmurat, M.; Pirri, F. Tip-Burn Stress Detection of Lettuce Canopy Grown in Plant Factories. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, QC, Canada, 11–17 October 2021; pp. 1259–1268.
  68. Hao, X.; Jia, J.; Gao, W.; Guo, X.; Zhang, W.; Zheng, L.; Wang, M. MFC-CNN: An Automatic Grading Scheme for Light Stress Levels of Lettuce (Lactuca sativa L.) leaves. Comput. Electron. Agric. 2020, 179, 105847.
  69. Wu, Z.; Yang, R.; Gao, F.; Wang, W.; Fu, L.; Li, R. Segmentation of Abnormal Leaves of Hydroponic Lettuce Based on DeepLabV3+ for Robotic Sorting. Comput. Electron. Agric. 2021, 190, 106443.
  70. Buxbaum, N.; Lieth, J.; Earles, M. Non-Destructive Plant Biomass Monitoring With High Spatio-Temporal Resolution via Proximal RGB-D Imagery and End-to-End Deep Learning. Front. Plant Sci. 2022, 13, 758818.
  71. Chang, S.; Lee, U.; Hong, M.J.; Jo, Y.D.; Kim, J.-B. Time-Series Growth Prediction Model Based on U-Net and Machine Learning in Arabidopsis. Front. Plant Sci. 2021, 12, 512–721.
  72. Rizkiana, A.; Nugroho, A.; Salma, N.; Afif, S.; Masithoh, R.; Sutiarso, L.; Okayasu, T. Plant Growth Prediction Model for Lettuce (Lactuca sativa) in Plant Factories Using Artificial Neural Network. In Proceedings of the IOP Conference Series: Earth and Environmental Science, Miass, Russia, 20–23 September 2021; Volume 733, p. 012027.
  73. Franchetti, B.; Ntouskos, V.; Giuliani, P.; Herman, T.; Barnes, L.; Pirri, F. Vision Based Modeling of Plants Phenotyping in Vertical Farming under Artificial Lighting. Sensors 2019, 19, 4378.
  74. Kim, T.; Lee, S.-H.; Kim, J.-O. A Novel Shape Based Plant Growth Prediction Algorithm Using Deep Learning and Spatial Transformation. IEEE Access 2022, 10, 731–737.
  75. Hwang, Y.; Lee, S.; Kim, T.; Baik, K.; Choi, Y. Crop Growth Monitoring System in Vertical Farms Based on Region-of-Interest Prediction. Agriculture 2022, 12, 656.
  76. Vorapatratorn, S. Development of Automatic Plant Factory Control Systems with AI-Based Artificial Lighting. In Proceedings of the 13th International Conference on Information Technology and Electrical Engineering (ICITEE), Chiang Mai, Thailand, 14–15 October 2021; pp. 69–73.
  77. Zhang, P.; Li, D. EPSA-YOLO-V5s: A Novel Method for Detecting the Survival Rate of Rapeseed in a Plant Factory Based on Multiple Guarantee Mechanisms. Comput. Electron. Agric. 2022, 193, 106714.
  78. Xu, P.; Fang, N.; Liu, N.; Lin, F.; Yang, S.; Ning, J. Visual Recognition of Cherry Tomatoes in Plant Factory Based on Improved Deep Instance Segmentation. Comput. Electron. Agric. 2022, 197, 106991.