1. Evolution of Vision-Based Self-Steering Tractors
The rapid development of computers, electronic sensors and computing technologies in the 1980s motivated interest in autonomous vehicle guidance systems, and a number of guidance technologies have been proposed [1][2], including ultrasonic, optical and mechanical approaches. Since the early 1990s, GPS systems have been widely used as relatively new and accurate guidance sensors in numerous agricultural applications moving towards fully autonomous navigation [3]. However, the high cost of reliable GPS sensors has made them prohibitively expensive for many agricultural navigation applications. Machine vision technologies based on local optical sensors can alternatively be used to guide agricultural vehicles wherever crop row structures can be observed: the camera system determines the relative position of the machinery with respect to the crop rows and guides the vehicle between them to perform field operations, while local features help to fine-tune the trajectory of the vehicle on-site. The latter is the main reason why most existing studies on vision-guided tractors focus on structured fields characterized by crop rows. A number of image processing methodologies have been suggested for deriving the guidance path from crop row images; yet only a limited number of vision-based guidance systems have been developed for real in-field applications [4].
Machine vision was first introduced for the automatic navigation of tractors and combines in the 1980s. In 1987, Reid and Searcy [5] developed a dynamic thresholding technique to extract path information from field images. Later in the same year, the same authors proposed a variation of their previous work [6]: the guidance signal was computed by the same algorithm, but the crop-background intensity distribution was modeled by a bimodal Gaussian distribution function, and run-length encoding was employed to locate the center points of the row-crop canopy shapes in the thresholded images (the general idea is sketched below). Billingsley and Schoenfisch [7] designed a vision guidance system to steer a tractor relative to crop rows; the system could detect the end of a row and warn the driver to turn the tractor, which could then automatically acquire its track in the next row. The system was further optimized as technology evolved, but the fundamental principles of their earlier research remained the same [8]. Pinto and Reid [9] proposed heading angle and offset determination using principal component analysis in order to visually guide a tractor; the task was addressed as a pose recognition problem, where a pose was defined by the combination of heading angle and offset. Benson et al. [10] developed a machine vision algorithm for crop edge detection, which was integrated into a tractor for automated harvest to locate the field boundaries for guidance. The same authors [11] later automated a maize harvest with a vision-based combine steering system based on fuzzy logic.
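To make the pipeline shared by these early systems concrete, the following is a minimal sketch, not a reproduction of the original implementations: a thresholded crop/background mask is scanned row by row in run-length fashion, the center of the widest crop run in each image row is recorded, and a straight guidance line is fitted to those centers. The fixed threshold, the choice of the widest run and the least-squares line model are illustrative assumptions.

```python
# Minimal illustrative sketch (not the original 1987 algorithms): scan a
# thresholded crop/background mask row by row, locate the center of the widest
# crop run in each image row (run-length style), and fit a straight guidance
# line to the collected centers. Threshold value and line model are assumptions.
import numpy as np

def row_centers(gray: np.ndarray, threshold: float) -> np.ndarray:
    """Return (image_row, center_column) points, one per image row containing crop."""
    binary = (gray > threshold).astype(np.int8)   # 1 = crop, 0 = background
    points = []
    for r, line in enumerate(binary):
        # Run-length encode the row: indices where the value changes, padded with
        # zeros so runs touching the image border are still closed.
        edges = np.flatnonzero(np.diff(np.concatenate(([0], line, [0]))))
        if edges.size < 2:
            continue
        starts, ends = edges[::2], edges[1::2]    # start (inclusive) / end (exclusive)
        widest = int(np.argmax(ends - starts))
        points.append((r, (starts[widest] + ends[widest] - 1) / 2.0))
    return np.asarray(points, dtype=float)

def guidance_line(points: np.ndarray) -> tuple[float, float]:
    """Least-squares fit column = slope * row + intercept through the run centers;
    the slope relates to heading error and the intercept to lateral offset."""
    slope, intercept = np.polyfit(points[:, 0], points[:, 1], deg=1)
    return float(slope), float(intercept)
```

In a complete guidance system, the fitted slope and intercept would then be mapped, through the camera geometry, to the heading error and lateral offset consumed by the steering controller.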
Three machine vision guidance algorithms were developed in [12] to mimic the perceptive process of a human operator for automated harvesting, both during the day and at night, reporting accuracies equivalent to GPS. In [13], a machine vision system was developed for an agricultural small-grain combine harvester; the proposed algorithm used a monochromatic camera to separate the uncut crop rows from the background and to calculate a guidance signal. Keicher and Seufert [14] developed an automatic guidance system for mechanical weeding in crop rows based on a digital image processing system combined with a dedicated controller and a proportional hydraulic valve. Åstrand and Baerveldt performed extensive research on the vision-based guidance of tractors and developed robust image processing algorithms, integrated with agricultural tractors, to detect the position of crop rows [15]. Søgaard and Olsen [16] developed a method, based on color images of the field surface, to guide a tractor with respect to the crop rows (a color-index segmentation of this kind is sketched below). Lang [17] proposed an automatic steering control system for a plantation tractor based on the direction and distance from the camera to the plant stems. Kise [18] presented a row-detection algorithm for a stereovision-based agricultural machinery guidance system; the algorithm applied stereo-image processing functions, extracted elevation maps and determined navigation points. Tillett and Hague [19] proposed a computer vision guidance system for cereals that was mounted on a hoe tractor. In subsequent work [20], they presented a method for locating crop rows in images and tested it for the guidance of a mechanical hoe in winter wheat; later, they extended the operating range of their tracking system to sugar beets [21]. Subramanian et al. [22] tested machine vision for the guidance of a tractor in a citrus grove alleyway and compared it to laser radar; both path-tracking approaches performed similarly. Misao [23] presented a rice transplanter with automatic steering based on image-processing self-guidance, whose steering system used a video camera zoom system. Han et al. [24] developed a guidance directrix planner to control an agricultural vehicle, with the planned directrix converted to the desired steering wheel angle during navigation. Okamoto et al. [25] presented an automatic guidance system based on a crop row sensor, consisting of a charge-coupled device (CCD) camera and an image processing algorithm, implemented for the autonomous guidance of a weeding cultivator.
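As a hedged illustration of the color-based crop/background separation that several of the color-camera systems above rely on, the sketch below computes an excess-green style vegetation index and thresholds it into a binary mask. The specific index and threshold are illustrative assumptions rather than the cited authors' exact formulations, and the resulting mask can feed the row-center extraction sketched earlier.

```python
# Illustrative color-based vegetation segmentation (assumed excess-green index,
# ExG = 2G - R - B, with a fixed threshold), not the exact formulation of the
# cited systems. The boolean mask it returns can be passed, row by row, to the
# run-length center extraction sketched earlier in this section.
import numpy as np

def excess_green_mask(rgb: np.ndarray, threshold: float = 20.0) -> np.ndarray:
    """Return a boolean vegetation mask from an H x W x 3 RGB image."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b            # high for green (crop) pixels
    return exg > threshold
```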
Autonomous tractor steering is the most established among agricultural navigation technologies; self-steering tractors have been commercially available for about two decades [1][2]. However, commercial tractor navigation systems rely on a fusion of sensors rather than on machine vision alone; therefore, they are outside the scope of this research.
Although vision-based tractor navigation systems have been developed, their commercial application is still in its early stages due to problems affecting their reliability, as reported in the following sections. Nevertheless, the relevant research reveals the potential of vision-based automatic guidance for agricultural machinery; thus, the next decade is expected to be crucial for vision-based self-steering tractors to revolutionize the agricultural sector. A revolution is also expected from the newest trend in agriculture, agricultural robots (Agrobots), which aspire to replace tractors. Agrobots can navigate fields autonomously based on the same principles and sensors and can work at crop scale with precision and dexterity [26]. However, compared to a tractor, an Agrobot is a sensitive, high-cost tool that can perform only specific tasks. A tractor, in contrast, is very durable and sturdy, can operate under adverse weather conditions and is versatile, since it can be fitted with a multitude of implements (topping tools, lawnmowers, sprayers, etc.) for a variety of tasks. Therefore, tractors are key pieces of equipment for all farms, from small to commercial scale, and at present there is no intention to replace them but rather to upgrade them in terms of navigational autonomy.