Holzinger, A.;  Saranti, A.;  Angerschmid, A.;  Retzlaff, C.O.;  Gronauer, A.;  Pejakovic, V.;  Medel-Jimenez, F.;  Krexner, T.;  Gollob, C.;  Stampfer, K. Smart Farm and Forest Operations Needs Human-Centered AI. Encyclopedia. Available online: https://encyclopedia.pub/entry/25008 (accessed on 06 July 2024).
Smart Farm and Forest Operations Needs Human-Centered AI

The main impetus for the current global digital transformation in almost all areas of our daily lives comes from the great successes of artificial intelligence (AI) and, in particular, the workhorse of AI, statistical machine learning (ML). The intelligent analysis, modeling, and management of agricultural and forest ecosystems, and of the use and protection of soils, already play important roles in securing our planet for future generations and will become irreplaceable in the future. The use of AI in areas that matter to human life (agriculture, forestry, climate, health, etc.) has led to an increased need for trustworthy AI with two main components: explainability and robustness.

Keywords: machine learning; artificial intelligence; human-centered AI

1. Artificial Intelligence

Artificial intelligence is one of the oldest fields of computer science and was extremely popular in its early days in the 1950s. However, its requirements quickly exceeded the computing power of the digital computer systems of the time. This made AI interesting in theory but impractical and, above all, uneconomical, which inevitably led to a decline in interest in AI in the 1980s. AI became very popular again only a decade ago, driven by the tremendous successes of data-driven statistical machine learning.
Artificial neural networks have their origins in the artificial neurons [1] developed by McCulloch and Pitts (1943). Today’s neural networks consist of very many layers and have an enormous number of connections, and use a special form of compositionality in which features in one layer are combined in many different ways to produce more abstract features in the next layer [2]. The success of such AI, referred to as “deep learning”, has only been made possible by the computing power available today. The increasing complexity of such deep learning models has naturally led to drawbacks and new problems in the comprehensibility of results. This lack of comprehensibility can be very important, especially when using such AI systems in areas that affect human life [3].
Many fundamental AI concepts date back to the middle of the last century. Their current success is actually based on a combination of three factors: (1) powerful, low-cost, and available digital computing technology, (2) scalable statistical machine learning methods (e.g., deep learning), and (3) the explosive growth of available datasets.
Today, AI has reached a level of popularity and maturity that has let it permeate nearly all industries and application areas, and it is the main driver of the current digital transformation due to its undeniable potential to benefit humanity and the environment. AI can help find new solutions to our society's most pressing challenges in virtually all areas of life: from agriculture and forest ecosystems, which affect our entire planet, to the health of every individual.
For all its benefits, the large-scale adoption of AI technologies also holds enormous potential for new, unforeseen threats. Therefore, all stakeholders (governments, policymakers, and industry), along with academia, must ensure that AI is developed with knowledge and consideration of these potential threats. The security, traceability, transparency, explainability, validity, and verifiability of AI applications must be ensured at all times [3]. However, how is AI actually defined, what is trustworthy AI, and what is human-centered AI?
For trustworthy AI, it is imperative to include ethical and legal aspects, which is a cross-disciplinary goal, because all trusted AI solutions must be not only ethically responsible but also legally compliant [4]. Dimensions of trustworthiness for AI include: security, safety, fairness, accountability (traceability, replicability), auditability (verifiability, checkability), and most importantly, robustness and explainability; see [5].
We define human-centered AI (HCAI) as a synergistic approach that aligns AI solutions with human values, ethical principles, and legal requirements to ensure safety and security, enabling trustworthy AI. The HCAI concept is now widely supported by renowned institutions (Stanford Human-Centered AI Institute, Berkeley Center for Human-Compatible AI, Cambridge Leverhulme Center for the Future of Intelligence, Chicago Human AI Lab, Utrecht Human-centered AI, Sydney Human-Centered AI Lab) and world-leading experts such as Ben Shneiderman, Fei-Fei Li, Joseph A. Konstan, Stephen Yang, and Christopher Manning, to name a few [6].
The inclusion of a human-in-the-loop in interactive machine learning [7] is thereby not only helpful for increasing the performance of AI algorithms, but also highly desirable to counter earlier fears and anxieties that artificial intelligence automates everything, replaces and displaces humans, and pushes them into passive roles [8].
In addition, integrating a human-in-the-loop (expert-in-the-loop) has many other advantages: Human experts excel at certain tasks by thinking multimodally and embedding new information in a conceptual knowledge space shaped by individual experience and prior knowledge. Farmers and foresters can build on an enormous amount of prior knowledge.

2. State-of-the-Art AI Technologies

2.1. Classification of AI Technologies

Today, AI can be successfully applied in virtually all application areas [9]. Driven by resource conservation and the demand for sustainability, precision concepts similar to precision medicine are gaining attention. These encompass a very wide range of information technologies that are already used in many agricultural and forestry operations worldwide [10][11][12][13]. In this context, satellite technology, geographic information systems (GIS), and remote sensing are also very important for improving all functions and services of the agricultural and forestry sectors [14]. Available tools include mobile applications [15], a variety of smart sensors [16], drones (unmanned aerial vehicles, UAVs) [17], cloud computing [18], the Internet of Things (IoT) [19], and blockchain technologies [20]. An increasingly important and often underappreciated area is the provision of energy, making alternative low-energy approaches imperative [21].
All of these technologies make it possible to process information about the state of the soil, plants, weather, or animals in a shared network in quasi-real time and make it available for further processes regardless of location. This means that today's agricultural and forestry systems are being expanded to include other relevant processes and services, and additional datasets are being created for quantitative and qualitative information along the entire value chain for plant production, animal husbandry products, and food safety ("from farm to fork"). Let us now show a concrete application example for each of our four AI classes.

2.2. Autonomous AI Systems

“Full automation is one of the hottest topics in AI and could lead to fully driverless cars in less than a decade”, stated a 2015 Nature article [22]. Fully autonomous vehicles are indeed the popular example of AI and are also easy to illustrate, as the Society of Automotive Engineers (SAE) has provided very descriptive definitions of the levels of automation in its standards. Levels of automation emerged as a way to represent gradations of autonomy and to distinguish between tasks for machines and tasks for humans. In a recent paper, however, Hopkins and Schwanen (2021) [23] argued that the current discourse on automated vehicles is underpinned by a technology-centered logic dominated by AI proponents, and they point to the benefits of a stronger human-centered perspective.
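The SAE levels mentioned above can be summarized in a small lookup table; the one-line descriptions below are paraphrased for illustration, not the normative SAE J3016 wording:

```python
# Sketch of the SAE J3016 levels of driving automation (descriptions
# paraphrased, not the normative standard text); it illustrates how
# "levels" split responsibility between human and machine.
SAE_LEVELS = {
    0: ("No Automation", "human performs all driving tasks"),
    1: ("Driver Assistance", "machine assists with steering OR speed"),
    2: ("Partial Automation", "machine controls steering AND speed; human monitors"),
    3: ("Conditional Automation", "machine drives; human must take over on request"),
    4: ("High Automation", "machine drives within a defined domain, no takeover needed"),
    5: ("Full Automation", "machine drives everywhere, under all conditions"),
}

def human_must_monitor(level: int) -> bool:
    """At levels 0-2, the human continuously monitors the environment."""
    return level <= 2
```

The same level-based framing is what the text below applies, with caveats, to agricultural and forestry machines.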
However, compared to car driving, the complexity of processes in agriculture and forestry is disproportionately higher. Agricultural and forestry systems include virtually all processes for the production of f5 (food, feed, fiber, fire, fuel). In this context, the production processes take place both indoors (buildings and facilities for people, livestock, post-harvest, and machinery) and outdoors, and face much diversity in terms of soil, plants, animals, and people. The temporal resolution of process phenomena varies over an extremely wide range (from milliseconds, e.g., moving machinery, to many years, e.g., growth of trees and changes in soil).

2.2.1. In Agriculture

A major problem in agriculture has always been weed control. Throughout their life cycle, crops and weeds compete with each other for soil nutrients, water, and sunlight. If weeds are left untouched, increased weed growth can reduce both crop yields and crop quality. Several studies have shown that these effects can be significant, with losses ranging from 48 to 71%, depending on the crop [24].
Moreover, in certain cases, crop damage can be so high that the entire yield is unsuitable for the market [25]. To prevent this, weed control has emerged as a necessity. Furthermore, with the ever-increasing trends in crop yield production, the demand for process optimization, i.e., reduction of energy losses, herbicide use, and manual labor, is becoming more and more urgent. To meet these requirements, traditional methods of weed control need to change. One possible way to achieve this is to introduce systems that significantly reduce the need for human labor, herbicides, and mechanical soil treatment by focusing only on specific areas where and when intervention is needed. The novel approach based on these principles is called smart farming or Agriculture 4.0. Moreover, such a system should perform agricultural tasks autonomously, i.e., without human intervention, relying entirely on its own subsystems to collect data, navigate through the field, detect plants and weeds, and perform the required operation based on the collected data [26].
These types of systems are known as autonomous agricultural robotic systems. Each such system, e.g., an autonomous robot for weed control, consists of four main subsystems: steering/machine vision, weed detection, mapping, and precision weed control [26]. Most agricultural robots are developed for outdoor use, though some can operate indoors [27]. Precise navigation of these devices is provided through global navigation satellite systems (GNSS) and real-time kinematics (RTK) [28][29].
However, under certain conditions, localization accuracy may fall below the required thresholds, and then autonomous robotic systems must rely on machine vision and indoor positioning systems, such as adaptive Monte Carlo localization and laser scanners [30]. These two technologies are widely used and commercially available. Weed control in the row is mainly done by the four conventional weed control methods, i.e., electric, chemical, thermal, and mechanical. Currently, weed detection and identification is the most challenging issue. Several studies have addressed it, with detection accuracy varying from 60 to 90% under ideal test conditions [26]. Thanks to extensive remote sensing technologies and data processing software, the development of weed maps has become a reality, and, together with machine vision, they form a powerful tool for weed detection.
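The Monte Carlo localization idea mentioned above can be illustrated with a minimal one-dimensional particle filter; real adaptive MCL operates in 2-D or 3-D with laser-scan likelihood models and an adaptive particle count, so this is only a sketch of the predict-update-resample cycle:

```python
import random

def monte_carlo_localization(particles, move, measured, sense, noise=0.1):
    """One predict-update-resample cycle of a minimal 1-D particle filter.

    particles: list of candidate robot positions
    move:      odometry estimate of how far the robot moved
    measured:  the actual sensor reading (e.g., range to a landmark)
    sense:     function mapping a position to its expected reading
    """
    # Predict: shift every particle by the reported motion, plus noise.
    predicted = [p + move + random.gauss(0.0, noise) for p in particles]
    # Update: weight each particle by its agreement with the measurement.
    weights = [1.0 / (1e-6 + abs(sense(p) - measured)) for p in predicted]
    # Resample: draw a new particle set proportionally to the weights.
    return random.choices(predicted, weights=weights, k=len(particles))
```

After a few cycles, the particle cloud concentrates around positions consistent with both odometry and sensor readings; the mean of the particles then serves as the position estimate.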
Some of the most representative autonomous agricultural robotic systems are the following. Astrand et al. (2002) [31] developed a robotic system for weed control in sugar beets. The robot consisted of two vision systems used for crop guidance and detection, and a hoe for weed removal. A front camera with two-row detection at a 5-meter range and a near-infrared filter was used for row detection and navigation, while a second color camera mounted inside the robot was used for weed detection. Initial trials showed that color-based plant detection was feasible and that the robot's subsystems could functionally work together. The BoniRob autonomous multipurpose robotic platform (see Figure 1), with a wavelength-matched illumination system for capturing high-resolution image data, was used for weed detection and precision spraying [32], and for ground intervention measurements [33]. Autonomous navigation along crop rows was achieved using 3D laser scans or global navigation satellite systems (GNSS). Lamm et al. (2002) developed a robotic system for weed control in cotton fields that is able to distinguish weeds from cotton plants and precisely apply herbicides. A machine vision algorithm was used to determine the diameter of the inscribed leaf circle to identify the plant species. Field tests showed a spray efficiency of 88.8 percent [34].
Figure 1. Field robot BoniRob [32].
Similarly to Astrand et al. (2002) [31], Blasco et al. (2002) [35] developed a weed-control robot with two machine vision systems, used separately: one for in-row navigation and one for weed detection. Precise targeted weeding was done with an end-effector that emitted an electrical charge [35]. Bawden et al. (2017) [36] developed an autonomous robot platform with a heterogeneous weeding array. Its weeding mechanism is based on machine vision for weed detection and classification, combined with a weeding array that unites precise spraying and hoeing methods for weed destruction [36].
As can be seen, robotic technologies are changing current practices in agricultural technology, particularly in autonomous weed control. The steady increase in research and development in this area will inevitably have a significant impact on traditional agricultural practices.

2.2.2. In Forestry

Timber harvesting is physically demanding and risky, as forest workers often work manually and are exposed to heavy and fast-moving objects such as trees, logs, and dangerous machinery. Over time, timber harvesting has become more mechanized to increase worker safety, productivity, and environmental sustainability.
In the context of increasing productivity through machine use, Ringdahl (2011) [37] found that human operators can become a bottleneck because they cannot work as fast as the potential capacity of the machines. In trafficable terrain, harvesters and forwarders represent the highest level of mechanization, and they are basically manually controlled by a human operator using joysticks. One way to overcome this human limitation of machine capacity is to change forest working methods in such a way that human activities are reduced to a minimum or are no longer required, as in autonomous vehicles [38]. While autonomous robotic systems are already being used in controlled workspaces such as factories or in simple agricultural environments, the use of autonomous machines in more complex environments, such as forests, is still in the research and development stage. One of the biggest challenges is on-the-fly navigation in the forest.
The most common approach for autonomous navigation in open terrain such as agriculture is based on global navigation satellite systems (GNSS). However, GNSS signal absorption by the forest canopy leads to position errors of up to 50 m and more, which requires other solutions independent of the GNSS signal [39]. In addition to localizing the forest machine's own position, the complex terrain and obstacles such as understory, and above all trees, must also be taken into account when navigating autonomously in forests. In recent years, remote sensing methods have increasingly been developed to generate digital twins of forests based on terrestrial, airborne, or spaceborne sensor technologies. Gollob et al. (2020) [40] showed that personal laser scanning (PLS) is able to capture and automatically map terrain information, individual tree parameters, and entire stands in a highly efficient way. These data can serve as a navigation basis for autonomous forest machines and for optimized operational harvest planning [39].
Rossmann (2010) [39] showed that an initial guess of the forest machine position can be made using an "imprecise" GNSS sensor. Two-dimensional laser scanners or stereo cameras on the forest machine (e.g., [41][42][43]) can detect tree positions in the near neighborhood of the machine (the local tree pattern). The position of the forest machine can then be determined efficiently and precisely by means of tree pattern matching between the stand map (the global tree pattern from, e.g., PLS) and the local tree pattern [39]. The initial GNSS guess of the machine position makes the pattern matching more time efficient.
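A heavily simplified sketch of this tree-pattern matching idea follows (translation only; a real system would also search over the machine's heading and use a smarter matcher than brute-force grid search):

```python
import math

def match_machine_position(stand_map, local_trees, gnss_guess, search=5.0, step=0.5):
    """Estimate the machine position by matching locally observed trees to a
    global stand map. Toy sketch: translation-only grid search.

    stand_map:   global tree positions (x, y), e.g., from personal laser scanning
    local_trees: tree positions relative to the machine, from an onboard scanner
    gnss_guess:  imprecise (x, y) initial position guess from GNSS
    """
    def score(pos):
        # Sum of distances from each predicted tree to its nearest mapped tree;
        # a perfect match scores zero.
        total = 0.0
        for lx, ly in local_trees:
            gx, gy = pos[0] + lx, pos[1] + ly
            total += min(math.hypot(gx - tx, gy - ty) for tx, ty in stand_map)
        return total

    # Searching only around the GNSS guess is what keeps matching fast.
    n = int(search / step)
    candidates = [
        (gnss_guess[0] + dx * step, gnss_guess[1] + dy * step)
        for dx in range(-n, n + 1)
        for dy in range(-n, n + 1)
    ]
    return min(candidates, key=score)
```

The GNSS guess bounds the search region, which mirrors the point made above: an imprecise initial position makes the precise matching step tractable in real time.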
Regardless of the challenging navigation, research is also being done on the locomotion of autonomous forest machines. Machines that seemed futuristic just a few years ago are already in use or available as prototypes. For example, the concept of animals moving slowly from branch to branch was used by New Zealand scientists and engineers to build a tree-to-tree locomotion machine (swinging machine) [38] (see Figure 2). To date, the swinging harvester has been radio-controlled, but given the progress shown in navigation and sensor fusion, the path to an autonomous, soil-conserving forestry machine is mapped out.
Figure 2. Tree to tree robot (image taken by SCION NZ Forest Research Institute, used with permission from Richard Parker).

2.3. Automated AI Systems

2.3.1. In Agriculture

As previously mentioned, autonomous AI systems are relatively advanced and are constantly being developed. These developments and upgrades lead to higher machine efficiency. In addition, more and more systems in modern tractors and harvesters are becoming fully automated to minimize the operator's workload. The two most important domains of automation are situation awareness and process monitoring. For example, machine vision guidance systems are already widely used in modern tractors and harvesters, allowing the machine to align itself automatically with the harvest line so that the operator can focus on other processes [44]. Infrared 3D camera systems on harvesters continuously monitor and control bin placement while allowing the operator to focus on the harvesting process [45].
Process monitoring is particularly pronounced in harvesting operations, where speed must be constantly controlled and adjusted according to the operation being performed [46]. The precise application of fertilizers and herbicides is also usually monitored and controlled automatically throughout the process. For this purpose, data from a global navigation satellite system (GNSS) as a guidance system, combined with real-time sensor technology (e.g., the Claas Crop Sensor [47]), are communicated among the tractor, the application device, and the task controller via the terminal, a practice that has been established for some time [48].

2.3.2. In Forestry

Cable-yarding technologies are the basis for efficient and safe timber harvesting on steep slopes. To guarantee low harvesting costs and low environmental impacts on remaining trees and soil, the machine position and cable road must be carefully planned. For this planning, usually only imprecise information about the terrain and the forest stands is available. If stand and terrain information were collected with traditional measuring devices such as calipers, hypsometers, and theodolites, these measurements would be labor intensive, time consuming, prone to various errors, and thus limited in their spatial and temporal extent [49].
Thus, the cable road layout is still determined by experts based on rules of thumb and empirical knowledge [50]. Rules for this are formulated, for example, in Heinimann (2003) [51]. Automatic methods (optimization methods) to solve this problem have already been formulated in countries such as the USA and Chile [52][53][54]. However, these optimization or planning methods are largely based on the assumption of clear-cutting and do not use modern sensor technology to capture individual tree and terrain data. High-resolution 3D data in the form of a digital twin of the forest, combined with well-known optimization functions and expert knowledge, is a key factor in optimizing the planning of timber harvesting. In this way, automatically optimized cable road planning can help to minimize the environmental impact and the costs of cable yarding (e.g., [50][55]).
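As a toy stand-in for such optimization methods (the cited approaches use far richer terrain, load, and cost models), choosing a cable-road direction from a fixed tower position can be framed as minimizing a simple cost function over candidate corridors:

```python
import math

def plan_cable_road(tower, trees, candidate_angles, setup_cost=100.0):
    """Pick the cable-road direction from a tower position that minimizes a
    simple cost: a fixed setup cost plus the lateral yarding distance of every
    tree to the corridor line. Purely illustrative; real planners also model
    terrain, ground clearance, payloads, and intermediate supports.
    """
    def corridor_cost(angle):
        ux, uy = math.cos(angle), math.sin(angle)  # corridor direction
        cost = setup_cost
        for tx, ty in trees:
            dx, dy = tx - tower[0], ty - tower[1]
            # Perpendicular distance from the tree to the corridor line.
            cost += abs(dx * uy - dy * ux)
        return cost

    return min(candidate_angles, key=corridor_cost)
```

With tree positions from a digital twin (e.g., PLS data, as above), such a cost function could be evaluated over many candidate layouts instead of relying on rules of thumb alone.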
In terms of cable yarding, there are also other examples of automation that are already being used in practice. Most cable yarding systems follow a typical scheme of work phases: unloaded out, accumulate load, loaded in, and drop load at the landing. Two of these phases, unloaded and loaded travel, have been automated; thus, the operator can work in the meantime with an integrated processor [42]. Pierzchała et al. (2018) [56] developed a method for automatic recognition of cable yarding work phases using multiple sensors on the carriage and tower yarder. Further automation steps in cable yarding would be conceivable in the future; for example, the carriage could be equipped with additional sensors, such as laser scanners or stereo cameras, for orientation.

2.4. Assisted AI Systems

2.4.1. In Agriculture

Assisted AI systems in agriculture overlap tightly with automated AI systems. In agricultural applications, machines can independently perform certain repetitive tasks without human intervention; however, in the decision-making loop, humans are the ones who make the final decisions [57]. For example, a wide variety of non-invasive sensors in fruit and vegetable processing, e.g., in drying processes, merged with AI technologies, can be used to control drying processes and shape changes of vegetables and fruits, and to predict optimal drying process parameters [58]. Several systems, e.g., situation awareness systems such as machine vision guidance systems, though performing their work automatically, can still be manually overridden by an operator [59]. An example is the precision application of fertilizer and pesticides: sprayers can also work in fully manual mode [44]. In modern tractors, advanced steering control systems can adjust steering performance to suit current conditions [60]. Furthermore, fuel consumption efficiency can be improved with the above-mentioned technologies [61].

2.4.2. In Forestry

Operating forestry cranes requires a lot of knowledge and experience to be productive with a low impact on the environment. Furthermore, trained forestry machine operators are essential for efficient timber production, in particular to reduce harvesting damage to the remaining trees and to reduce machine downtime. Ovaskainen et al. (2004) [62] showed that the productivity of trained cut-to-length (CTL) harvester operators varies by about 40% under similar forest stand conditions. The study hypothesizes that efficiency differences are related to the operators' crane experience, based on deliberate practice of motor skills, situational awareness, and visual perception. The state of the art in crane control is the use of two analog joysticks, operated by the two hands and/or the fingers. The joysticks provide electrical or hydraulic signals that control the flow rate of the hydraulic system and thus enable precise movement of the individual hydraulic cylinders. Motor learning is the key to smooth crane movements and harvester head control. Forestry machine operators make approximately 4000 control inputs per hour, many of which are repeated again and again but always have to be applied in a targeted manner [63]. Purfürst (2010) [64] found that learning to operate a harvester took nine months on average. Furthermore, the operator must also master decision making and planning in order to achieve an appropriate level of performance. In summary, harvester/forwarder operation, especially crane operation, is ergonomically, motorically, and cognitively very demanding. To improve this, existing forestry machines are constantly being refined. Modern sensors (e.g., for diameter and length measurement) combined with intelligent data processing can help to assist certain operations, such as processing stems into logs by automatically moving the harvester head to predetermined positions depending on the stem shape and log quality.
On the one hand, this reduces the workload of the harvester operator; on the other hand, it optimizes the profit on the timber. Other good examples of how intelligent assistance can make work easier and more efficient for the machine operator are the intelligent crane control systems from John Deere Forestry Oy (IBC: Intelligent Boom Control [65], see Figure 3), Komatsu (Smart Crane [66]), and Palfinger (Smart Control [67]). In such systems, the crane is controlled via the crane tip (harvester head or grapple), and the crane's movement is computed automatically by algorithms. The operator therefore does not control individual rams, but only needs to concentrate on the crane tip and steer it with the joysticks. The system also dampens the movements of the cylinders and stops jerky load thrusts in the end positions, which enables jerk-free operation. The smart control results in less fatigue for the operator, making them more productive overall. Results from Manner et al. (2019) [68] showed that, compared with conventional crane control, IBC shortens the forwarder's machine working time by 5.2% during loading and 7.9% during unloading. It has already been shown that the use of such a smart crane control system makes it much easier to learn how to operate harvesters or forwarders [42][69].
Figure 3. Principles of the IBC system [70].
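The proprietary IBC/Smart Crane algorithms are not public; the textbook two-link inverse-kinematics sketch below only illustrates the core idea of letting the operator command the crane tip while software resolves the individual joint angles:

```python
import math

def crane_joint_angles(x, y, l1, l2):
    """Two-link planar inverse kinematics: given a desired crane-tip position
    (x, y) and boom segment lengths l1 and l2, return the (shoulder, elbow)
    joint angles. Illustrative only; a real crane controller also handles
    telescoping booms, hydraulics dynamics, and damping.
    """
    d2 = x * x + y * y
    if d2 > (l1 + l2) ** 2:
        raise ValueError("target outside crane reach")
    # Elbow angle from the law of cosines (clamped against rounding error).
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to target minus the inner-triangle correction.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

The operator's joystick input moves the target (x, y); the controller then converts it into coordinated joint motions, which is why the individual rams no longer need to be steered by hand.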

2.5. Augmenting AI Systems

2.5.1. In Agriculture

A good systematization of AR applications is given by Hurst et al. (2021) [71] (see Figure 4 and Figure 5), who subdivide AR by type into (1) marker-based, (2) markerless (location-based), (3) dynamic augmentation, and (4) complex augmentation. Based on these four classes, there are many potential applications of AR in agriculture.
Figure 4. Types of AR deployment within crop and livestock management. Please refer to the excellent overview by Hurst et al. (2021) [71].
Figure 5. Analysis of technologies coupled with AR in farming. Dark blue refers to crop-based articles, whereas light green refers to livestock (adapted). For details, please refer to the original paper by Hurst et al. (2021) [71].
Augmented reality (AR) enables the combination of a real environment with an interactive experience, e.g., synthetically generated information. It is necessary to clearly distinguish augmented reality from virtual reality (VR): virtual reality replaces the real environment with a synthetically generated one, whereas augmented reality enhances the real environment with synthetically generated data [72]. Both technologies have found many applications in a wide variety of domains [73][74][75]. However, agricultural applications require the technology to be even more user-friendly, for example, by replacing smartphones and tablets with smart glasses [76]. This would provide operators with hands-free capabilities and a less constrained user interface. Although there are still some technical and technological drawbacks, smart glasses have emerged as a promising candidate for the main AR platform.
Meanwhile, head-mounted AR displays are being used to help farmers detect plant diseases [77]. Through the head-mounted display's camera, images of the plant leaves are captured in real time and sent to a cloud server for analysis, which also leaves plenty of room for detecting defects and anomalies. After the data are processed in the cloud, the results are transmitted back to the head-mounted display. In this way, an augmented reality head-mounted display allows less experienced farmers to inspect the field more efficiently and quickly in search of infected plants. In addition, such technology can also help train and educate farmers directly in the field. The farmer's knowledge remains invaluable, as the expert can contribute their domain knowledge to future machine-teaching AI solutions.
AR smart glasses are already used for scanning QR codes during characteristic livestock activities such as feeding and milking [78]. Initial results show that such glasses can provide significant help to farmers by enabling real-time consultation, data collection, data sharing, etc. They have proven to be a useful tool for herd management and feeding, and many more applications, especially AI-driven ones, will bring benefits to farmers in the future.
AR smart glasses (see Figure 6) [79] have already been used to create a system that assists the operator during field operations such as plowing and fertilizing. This can minimize the strain on the operator caused by constantly tracking maps, light bars, etc., which can be especially pronounced in large, irregularly shaped fields. With the help of AR glasses, the operator obtains data on their trajectory, speed, etc., superimposed on the data of the treated surfaces, which greatly simplifies the operator's work. It should be mentioned that this system can be used both inside and outside the machine. It has proven very useful in fertilizing, spraying, and plowing operations.
Figure 6. AR-based positioning assist system: (a) tractor mounted, (b) manual mounted [79].
Besides the above-mentioned examples, augmented reality-related research is on the rise, mainly focusing on greenhouse management [80], weed management [81], and farmer management support [82].

2.5.2. In Forestry

Automated and assistance systems, combined with a variety of modern sensors, increase the productivity and quality of forest work. Even though the full potential of smart technology in forest machines is far from being exploited, the operator already receives a large amount of information compared to traditional machines. In addition, forestry machines are increasingly networked with each other (e.g., harvesters and forwarders) and can retrieve or exchange information via the Internet. On the one hand, this abundance of data helps to train autonomous, automatic, or assistance functions; on the other hand, it makes it challenging to present the information to the operator. Alongside the many benefits of “big data”, negative effects can occur in the form of divided attention, information overload, and operator stress. A further increase in the operator’s information burden can raise the likelihood of human errors that could harm not only the operator, but also people, machines, and objects in the vicinity of the machine [83][84][85].
Augmented reality, which registers data generated by sensors and algorithms with the user’s actual perception of the environment, can convey this information more effectively and improve the operator’s awareness of the machine, the environment, and the current work progress. Sitompul and Wallmyr (2019) [83] distinguished two types of forest-machine operation for which augmented information can be provided: in-cabin operation and remote operation. Traditionally, heavy machines such as forestry machines are operated from the cab. In recent years, there have been repeated efforts and studies to move the operator from the cab to a remote control station, which is called teleoperation.
With regard to in-cabin work, Kymäläinen et al. (2017) [86] and Aromaa et al. (2020) [87] have proposed rendering parts of the forestry machine that block the view (e.g., the crane) transparently on a display: the areas hidden behind the crane become visible with the help of cameras, which increases safety for the surrounding area. In addition to classic screens, there is also the option of projecting information directly into the operator’s field of view via a head-up display. For example, the operator can be shown the harvester’s bucking optimization directly, without having to look at a separate monitor [88].
Teleoperation primarily minimizes hazards for the operator. Its biggest challenge is providing sufficient visibility, as the fields of view of the cameras mounted on the machine are limited [89]. Furthermore, depth perception is lost on the commonly used 2D displays, which makes it difficult to position the crane precisely and to grip logs.
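The registration step described above — matching sensor data with the operator’s camera view — can be illustrated with a minimal pinhole-camera projection. The sketch below is purely illustrative and not taken from any of the cited systems; the camera intrinsics and the obstacle position are assumed values. It projects a 3D point (e.g., an obstacle detected by a machine-mounted sensor, expressed in the camera’s coordinate frame) into 2D pixel coordinates, which is the minimum needed to draw an overlay marker at the right place in the camera feed.

```python
def project_to_image(point_cam, fx, fy, cx, cy):
    """Project a 3D point in camera coordinates (x right, y down, z forward)
    to pixel coordinates using the pinhole-camera model."""
    x, y, z = point_cam
    if z <= 0:
        return None  # point lies behind the camera; nothing to overlay
    u = fx * x / z + cx  # horizontal pixel coordinate
    v = fy * y / z + cy  # vertical pixel coordinate
    return (u, v)

# Hypothetical intrinsics for a 1280x720 operator camera (assumed values):
fx = fy = 800.0        # focal length in pixels
cx, cy = 640.0, 360.0  # principal point (image center)

# Hypothetical obstacle: 2 m to the right of the optical axis, 10 m ahead.
pixel = project_to_image((2.0, 0.0, 10.0), fx, fy, cx, cy)
print(pixel)  # (800.0, 360.0) -> the marker is drawn right of image center
```

A real system would additionally need the extrinsic transform from the sensor frame to the camera frame and lens-distortion correction; the point here is only that overlay placement reduces to a well-defined geometric mapping between sensor data and the camera image.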
For loading timber onto trucks, Hiab (HiVision [90], see Figure 7) and Palfinger (Virtual Drive [91]) each offer a control system for forestry cranes that lets the operator control the crane from the truck cab while monitoring the environment through cameras and VR goggles. The advantages of these systems are that the truck driver no longer has to leave the cab, thereby avoiding hazards, and that the operator works in an ergonomic position.
Figure 7. Hiab HiVision VR system [90].
Due to the high cost of the machinery and the dangerous nature of the work, great importance must be placed on the training of forestry machine operators. In addition to supporting operational work, augmented reality therefore also offers advantages in education and training: with realistic simulations of forestry machines [92] and forests [93], students can practice all work processes that occur in real operations in a safe environment.

References

  1. McCulloch, W.S.; Pitts, W. A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biol. 1943, 5, 115–133.
  2. Bengio, Y.; Lecun, Y.; Hinton, G. Deep learning for AI. Commun. ACM 2021, 64, 58–65.
  3. Holzinger, A.; Weippl, E.; Tjoa, A.M.; Kieseberg, P. Digital Transformation for Sustainable Development Goals (SDGs)—A Security, Safety and Privacy Perspective on AI. In Springer Lecture Notes in Computer Science, LNCS 12844; Springer: Cham, Switzerland, 2021; pp. 1–20.
  4. Holzinger, A.; Dehmer, M.; Emmert-Streib, F.; Cucchiara, R.; Augenstein, I.; Del Ser, J.; Samek, W.; Jurisica, I.; Díaz-Rodríguez, N. Information fusion as an integrative cross-cutting enabler to achieve robust, explainable, and trustworthy medical artificial intelligence. Inf. Fusion 2022, 79, 263–278.
  5. Holzinger, A. The Next Frontier: AI We Can Really Trust. In Proceedings of the ECML PKDD 2021, CCIS 1524; Michael Kamp, E.A., Ed.; Springer: Cham, Switzerland, 2021; pp. 1–14.
  6. Shneiderman, B. Human-Centered AI; Oxford University Press: Oxford, UK, 2022.
  7. Holzinger, A.; Plass, M.; Kickmeier-Rust, M.; Holzinger, K.; Crişan, G.C.; Pintea, C.M.; Palade, V. Interactive machine learning: Experimental evidence for the human in the algorithmic loop. Appl. Intell. 2019, 49, 2401–2414.
  8. Dietterich, T.G.; Horvitz, E.J. Rise of concerns about AI: Reflections and directions. Commun. ACM 2015, 58, 38–40.
  9. Wang, D.; Cao, W.; Zhang, F.; Li, Z.; Xu, S.; Wu, X. A Review of Deep Learning in Multiscale Agricultural Sensing. Remote Sens. 2022, 14, 559.
  10. Zhang, J.; Huang, Y.; Pu, R.; Gonzalez-Moreno, P.; Yuan, L.; Wu, K.; Huang, W. Monitoring plant diseases and pests through remote sensing technology: A review. Comput. Electron. Agric. 2019, 165, 104943.
  11. Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of remote sensing in precision agriculture: A review. Remote Sens. 2020, 12, 3136.
  12. Yao, H.; Qin, R.; Chen, X. Unmanned aerial vehicle for remote sensing applications—A review. Remote Sens. 2019, 11, 1443.
  13. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402.
  14. Saiz-Rubio, V.; Rovira-Más, F. From smart farming towards agriculture 5.0: A review on crop data management. Agronomy 2020, 10, 207.
  15. Mendes, J.; Pinho, T.M.; Neves dos Santos, F.; Sousa, J.J.; Peres, E.; Boaventura-Cunha, J.; Cunha, M.; Morais, R. Smartphone applications targeting precision agriculture practices—A systematic review. Agronomy 2020, 10, 855.
  16. Sartori, D.; Brunelli, D. A smart sensor for precision agriculture powered by microbial fuel cells. In Proceedings of the 2016 IEEE Sensors Applications Symposium (SAS), Catania, Italy, 20–22 April 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1–6.
  17. Elmeseiry, N.; Alshaer, N.; Ismail, T. A Detailed Survey and Future Directions of Unmanned Aerial Vehicles (UAVs) with Potential Applications. Aerospace 2021, 8, 363.
  18. Kalyani, Y.; Collier, R. A Systematic Survey on the Role of Cloud, Fog, and Edge Computing Combination in Smart Agriculture. Sensors 2021, 21, 5922.
  19. Jarial, S. Internet of Things application in Indian agriculture, challenges and effect on the extension advisory services—A review. J. Agribus. Dev. Emerg. Econ. 2022, ahead-of-print.
  20. Cockburn, M. Application and prospective discussion of machine learning for the management of dairy farms. Animals 2020, 10, 1690.
  21. Haxhibeqiri, J.; De Poorter, E.; Moerman, I.; Hoebeke, J. A survey of LoRaWAN for IoT: From technology to application. Sensors 2018, 18, 3995.
  22. Waldrop, M.M. Autonomous vehicles: No drivers required. Nat. News 2015, 518, 20–23.
  23. Hopkins, D.; Schwanen, T. Talking about automated vehicles: What do levels of automation do? Technol. Soc. 2021, 64, 101488.
  24. Monaco, T.; Grayson, A.; Sanders, D. Influence of four weed species on the growth, yield, and quality of direct-seeded tomatoes (Lycopersicon esculentum). Weed Sci. 1981, 29, 394–397.
  25. Roberts, H.; Hewson, R.; Ricketts, M.E. Weed competition in drilled summer lettuce. Hortic. Res. 1977, 17, 39–45.
  26. Slaughter, D.C.; Giles, D.; Downey, D. Autonomous robotic weed control systems: A review. Comput. Electron. Agric. 2008, 61, 63–78.
  27. Bechar, A.; Vigneault, C. Agricultural robots for field operations. Part 2: Operations and systems. Biosyst. Eng. 2017, 153, 110–128.
  28. Sabatini, R.; Moore, T.; Ramasamy, S. Global navigation satellite systems performance analysis and augmentation strategies in aviation. Prog. Aerosp. Sci. 2017, 95, 45–98.
  29. Lim, Y.; Pongsakornsathien, N.; Gardi, A.; Sabatini, R.; Kistan, T.; Ezer, N.; Bursch, D.J. Adaptive human–robot interactions for multiple unmanned aerial vehicles. Robotics 2021, 10, 12.
  30. Ehsani, M.R.; Sullivan, M.D.; Zimmerman, T.L.; Stombaugh, T. Evaluating the dynamic accuracy of low-cost GPS receivers. In Proceedings of the 2003 ASAE Annual Meeting. American Society of Agricultural and Biological Engineers, Las Vegas, NV, USA, 27–30 July 2003; p. 1.
  31. Åstrand, B.; Baerveldt, A.J. An agricultural mobile robot with vision-based perception for mechanical weed control. Auton. Robot. 2002, 13, 21–35.
  32. Chebrolu, N.; Lottes, P.; Schaefer, A.; Winterhalter, W.; Burgard, W.; Stachniss, C. Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields. Int. J. Robot. Res. 2017, 36, 1045–1052.
  33. Scholz, C.; Moeller, K.; Ruckelshausen, A.; Hinck, S.; Göttinger, M. Automatic soil penetrometer measurements and GIS based documentation with the autonomous field robot platform bonirob. In Proceedings of the 12th International Conference of Precision Agriculture, Sacramento, CA, USA, 20–23 July 2014.
  34. Lamm, R.D.; Slaughter, D.C.; Giles, D.K. Precision weed control system for cotton. Trans. ASAE 2002, 45, 231.
  35. Blasco, J.; Aleixos, N.; Roger, J.; Rabatel, G.; Moltó, E. AE—Automation and emerging technologies: Robotic weed control using machine vision. Biosyst. Eng. 2002, 83, 149–157.
  36. Bawden, O.; Kulk, J.; Russell, R.; McCool, C.; English, A.; Dayoub, F.; Lehnert, C.; Perez, T. Robot for weed species plant-specific management. J. Field Robot. 2017, 34, 1179–1199.
  37. Ringdahl, O. Automation in Forestry: Development of Unmanned Forwarders. Ph.D. Thesis, Institutionen för Datavetenskap, Umeå Universitet, Umeå, Sweden, 2011.
  38. Parker, R.; Bayne, K.; Clinton, P.W. Robotics in forestry. N. Z. J. For. 2016, 60, 8–14.
  39. Rossmann, J.; Krahwinkler, P.; Schlette, C. Navigation of mobile robots in natural environments: Using sensor fusion in forestry. J. Syst. Cybern. Inform. 2010, 8, 67–71.
  40. Gollob, C.; Ritter, T.; Nothdurft, A. Forest inventory with long range and high-speed personal laser scanning (PLS) and simultaneous localization and mapping (SLAM) technology. Remote Sens. 2020, 12, 1509.
  41. Visser, R. Next Generation Timber Harvesting Systems: Opportunities for Remote Controlled and Autonomous Machinery; Project No: PRC437-1718; Forest & Wood Products Australia Limited: Melbourne, Australia, 2018.
  42. Visser, R.; Obi, O.F. Automation and robotics in forest harvesting operations: Identifying near-term opportunities. Croat. J. For. Eng. J. Theory Appl. For. Eng. 2021, 42, 13–24.
  43. Wells, L.A.; Chung, W. Evaluation of ground plane detection for estimating breast height in stereo images. For. Sci. 2020, 66, 612–622.
  44. Thomasson, J.A.; Baillie, C.P.; Antille, D.L.; McCarthy, C.L.; Lobsey, C.R. A review of the state of the art in agricultural automation. Part II: On-farm agricultural communications and connectivity. In Proceedings of the 2018 ASABE Annual International Meeting. American Society of Agricultural and Biological Engineers, Detroit, Michigan, 29 July–1 August 2018; p. 1.
  45. Vázquez-Arellano, M.; Griepentrog, H.W.; Reiser, D.; Paraforos, D.S. 3-D imaging systems for agricultural applications—A review. Sensors 2016, 16, 618.
  46. Schueller, J.K. Engineering advancements. In Automation: The Future of Weed Control in Cropping Systems; Springer: Berlin/Heidelberg, Germany, 2014; pp. 35–49.
  47. Claas Crop Sensor. 2022. Available online: https://www.claas.co.uk/products/easy-2018/precision-farming/crop-sensor-isaria (accessed on 6 March 2022).
  48. Goense, D.; Hofstee, J.; Van Bergeijk, J. An information model to describe systems for spatially variable field operations. Comput. Electron. Agric. 1996, 14, 197–214.
  49. Gollob, C.; Ritter, T.; Wassermann, C.; Nothdurft, A. Influence of scanner position and plot size on the accuracy of tree detection and diameter estimation using terrestrial laser scanning on forest inventory plots. Remote Sens. 2019, 11, 1602.
  50. Bont, L.G.; Maurer, S.; Breschan, J.R. Automated cable road layout and harvesting planning for multiple objectives in steep terrain. Forests 2019, 10, 687.
  51. Heinimann, H.R. Holzerntetechnik zur Sicherstellung einer minimalen Schutzwaldpflege: Bericht im Auftrag des Bundesamtes für Umwelt, Wald und Landschaft (BUWAL). Interner Ber./ETH For. Eng. 2003, 12.
  52. Dykstra, D.P.; Riggs, J.L. An application of facilities location theory to the design of forest harvesting areas. AIIE Trans. 1977, 9, 270–277.
  53. Chung, W. Optimization of Cable Logging Layout Using a Heuristic Algorithm for Network Programming; Oregon State University: Corvallis, OR, USA, 2003.
  54. Epstein, R.; Weintraub, A.; Sapunar, P.; Nieto, E.; Sessions, J.B.; Sessions, J.; Bustamante, F.; Musante, H. A combinatorial heuristic approach for solving real-size machinery location and road design problems in forestry planning. Oper. Res. 2006, 54, 1017–1027.
  55. Bont, L.; Heinimann, H.R.; Church, R.L. Optimizing cable harvesting layout when using variable-length cable roads in central Europe. Can. J. For. Res. 2014, 44, 949–960.
  56. Pierzchała, M.; Kvaal, K.; Stampfer, K.; Talbot, B. Automatic recognition of work phases in cable yarding supported by sensor fusion. Int. J. For. Eng. 2018, 29, 12–20.
  57. Abdullahi, H.S.; Mahieddine, F.; Sheriff, R.E. Technology impact on agricultural productivity: A review of precision agriculture using unmanned aerial vehicles. In International Conference on Wireless and Satellite Systems; Springer: Berlin/Heidelberg, Germany, 2015; pp. 388–400.
  58. Chen, J.; Zhang, M.; Xu, B.; Sun, J.; Mujumdar, A.S. Artificial intelligence assisted technologies for controlling the drying of fruits and vegetables using physical fields: A review. Trends Food Sci. Technol. 2020, 105, 251–260.
  59. Antille, D.L.; Lobsey, C.R.; McCarthy, C.L.; Thomasson, J.A.; Baillie, C.P. A review of the state of the art in agricultural automation. Part IV: Sensor-based nitrogen management technologies. In Proceedings of the 2018 ASABE Annual International Meeting. American Society of Agricultural and Biological Engineers, Detroit, Michigan, 29 July–1 August 2018; p. 1.
  60. Hague, T.; Tillett, N. Navigation and control of an autonomous horticultural robot. Mechatronics 1996, 6, 165–180.
  61. Howard, C.N.; Kocher, M.F.; Hoy, R.M.; Blankenship, E.E. Testing the fuel efficiency of tractors with continuously variable and standard geared transmissions. Trans. ASABE 2013, 56, 869–879.
  62. Ovaskainen, H.; Uusitalo, J.; Väätäinen, K. Characteristics and significance of a harvester operators’ working technique in thinnings. Int. J. For. Eng. 2004, 15, 67–77.
  63. Dreger, F.A.; Rinkenauer, G. Cut to Length Harvester Operator Skill: How Human Planning and Motor Control Co-Evolve to Allow Expert Performance. Fruehjahrskongress 2020, Berlin Digitaler Wandel, Digitale Arbeit, Digitaler Mensch? 2020. Available online: https://gfa2020.gesellschaft-fuer-arbeitswissenschaft.de/inhalt/D.1.3.pdf (accessed on 4 March 2022).
  64. Purfürst, F.T. Learning curves of harvester operators. Croat. J. For. Eng. J. Theory Appl. For. Eng. 2010, 31, 89–97.
  65. Intelligent Boom Control. 2022. Available online: https://www.deere.co.uk/en/forestry/ibc/ (accessed on 6 March 2022).
  66. Smart Crane. 2022. Available online: https://www.komatsuforest.com/explore/smart-crane-for-forwarders (accessed on 6 March 2022).
  67. Smart Control. 2022. Available online: https://www.palfingerepsilon.com/en/Epsolutions/Smart-Control (accessed on 4 March 2022).
  68. Manner, J.; Mörk, A.; Englund, M. Comparing forwarder boom-control systems based on an automatically recorded follow-up dataset. Silva. Fenn. 2019, 53, 10161.
  69. Englund, M.; Mörk, A.; Andersson, H.; Manner, J. Delautomation av Skotarkran–Utveckling och Utvärdering i Simulator. 2017. Available online: https://www.skogforsk.se/cd_20190114162732/contentassets/e7e1a93a4ebd41c386b85dc3f566e5e8/delautomatiserad-skotarkran-utveckling-och-utvardering-i-simulator-arbetsrapport-932-2017.pdf (accessed on 3 March 2022).
  70. IBC: Operator’s Instructions 1WJ1110G004202-, 1WJ1210G002102-, 1WJ1510G003604-. 2022. Available online: https://www.johndeeretechinfo.com/search?p0=doc_type&p0_v=operators%20manuals&pattr=p0 (accessed on 7 March 2022).
  71. Hurst, W.; Mendoza, F.R.; Tekinerdogan, B. Augmented Reality in Precision Farming: Concepts and Applications. Smart Cities 2021, 4, 1454–1468.
  72. Burdea, G.C.; Coiffet, P. Virtual Reality Technology; John Wiley & Sons: Hoboken, NJ, USA, 2003.
  73. Seth, A.; Vance, J.M.; Oliver, J.H. Virtual reality for assembly methods prototyping: A review. Virtual Real. 2011, 15, 5–20.
  74. Schultheis, M.T.; Rizzo, A.A. The application of virtual reality technology in rehabilitation. Rehabil. Psychol. 2001, 46, 296.
  75. Höllerer, T.; Feiner, S. Mobile augmented reality. In Telegeoinformatics: Location-Based Computing and Services; Routledge: London, UK, 2004; Volume 21.
  76. Lee, L.H.; Hui, P. Interaction methods for smart glasses: A survey. IEEE Access 2018, 6, 28712–28732.
  77. Ponnusamy, V.; Natarajan, S.; Ramasamy, N.; Clement, C.; Rajalingam, P.; Mitsunori, M. An iot- enabled augmented reality framework for plant disease detection. Revue D’Intell. Artif. 2021, 35, 185–192.
  78. Caria, M.; Sara, G.; Todde, G.; Polese, M.; Pazzona, A. Exploring Smart Glasses for Augmented Reality: A Valuable and Integrative Tool in Precision Livestock Farming. Animals 2019, 9, 903.
  79. Santana-Fernández, J.; Gómez-Gil, J.; del Pozo-San-Cirilo, L. Design and implementation of a GPS guidance system for agricultural tractors using augmented reality technology. Sensors 2010, 10, 10435–10447.
  80. De Castro Neto, M.; Cardoso, P. Augmented reality greenhouse. In Proceedings of the EFITA-WCCA-CIGR Conference “Sustainable Agriculture through ICT Innovation”, Turin, Italy, 24–27 June 2013; pp. 24–27.
  81. Vidal, N.R.; Vidal, R.A. Augmented reality systems for weed economic thresholds applications. Planta Daninha 2010, 28, 449–454.
  82. Okayama, T.; Miyawaki, K. The “Smart Garden” system using augmented reality. IFAC Proc. Vol. 2013, 46, 307–310.
  83. Sitompul, T.A.; Wallmyr, M. Using augmented reality to improve productivity and safety for heavy machinery operators: State of the art. In Proceedings of the 17th International Conference on Virtual-Reality Continuum and Its Applications in Industry, Brisbane, QLD, Australia, 14–16 November 2019; pp. 1–9.
  84. Akyeampong, J.; Udoka, S.; Caruso, G.; Bordegoni, M. Evaluation of hydraulic excavator Human–Machine Interface concepts using NASA TLX. Int. J. Ind. Ergon. 2014, 44, 374–382.
  85. Chen, Y.C.; Chi, H.L.; Kang, S.C.; Hsieh, S.H. A smart crane operations assistance system using augmented reality technology. In Proceedings of the 28th International Symposium on Automation and Robotics in Construction, ISARC 2011, Seoul, Korea, 29 June–2 July 2011; pp. 643–649.
  86. Kymäläinen, T.; Suominen, O.; Aromaa, S.; Goriachev, V. Science fiction prototypes illustrating future see-through digital structures in mobile work machines. In EAI International Conference on Technology, Innovation, Entrepreneurship and Education; Springer: Berlin/Heidelberg, Germany, 2017; pp. 179–193.
  87. Aromaa, S.; Goriachev, V.; Kymäläinen, T. Virtual prototyping in the design of see-through features in mobile machinery. Virtual Real. 2020, 24, 23–37.
  88. Englund, M.; Lundström, H.; Brunberg, T.; Löfgren, B. Utvärdering av Head-Up Display för Visning av Apteringsinformation i Slutavverkning; Technical Report; Skogforsk: Uppsala, Sweden, 2015.
  89. Fang, Y.; Cho, Y.K. Effectiveness analysis from a cognitive perspective for a real-time safety assistance system for mobile crane lifting operations. J. Constr. Eng. Manag. 2017, 143, 05016025.
  90. HIAB HIVISION. 2022. Available online: https://www.hiab.com/en-us/digital-solutions/hivision (accessed on 6 March 2022).
  91. Virtual Drive. 2022. Available online: https://www.palfingerepsilon.com/en/Epsolutions/Virtual-Drive (accessed on 6 March 2022).
  92. Virtual Training for Ponsse. 2022. Available online: http://www.upknowledge.com/ponsse (accessed on 3 March 2022).
  93. Freund, E.; Krämer, M.; Rossmann, J. Towards realistic forest machine simulators. In Proceedings of the Modeling and Simulation Technologies Conference, Denver, CO, USA, 14–17 August 2000; p. 4095.
Entry Collection: Environmental Sciences
Update Date: 12 Jul 2022