Lane detection and tracking are key features of advanced driver assistance systems. Lane detection is the process of detecting lane markings, typically white lines, on the road. Lane tracking is the process of assisting the vehicle to remain in the desired path; it controls the motion model using previously detected lane markers.
1. Introduction
Autonomous passenger vehicles, also known as self-driving or driverless vehicles, are a direct implementation of transportation-related autonomous robotics research. Shakey the robot (1966–1972) was the first documented autonomous mobile robot [1]. It was developed by Stanford Research Institute’s Artificial Intelligence Centre and was capable of sensing its environment, reasoning, planning, and navigating. Vision-based lane tracking and obstacle avoidance in basic settings sparked early interest in autonomous vehicles [2]. In the early 1990s, the Royal Armament Research and Development Establishment in the United Kingdom created two vehicles for obstacle-free navigation on and off the road [3]. In the United States, the first operations of autonomous driving in realistic settings date back to Carnegie Mellon University’s NavLab in the early 1990s [4]. The vehicle developed by NavLab operated at very low speeds because of the limited computational power available at the time. Early US research projects also included the California PATH project, which developed automated highway systems [5]. Vehicle steering was automated with manual longitudinal control in the “No Hands Across America” project [6]. In the early 2000s, CyberCars, one of several European projects, began developing technologies for automated transport [7]. The announcement of the Defense Advanced Research Projects Agency (DARPA) Grand Challenge in 2003 generated research interest in autonomous cars. Following that, in 2006, the DARPA Urban Challenge was performed in a controlled situation with a variety of autonomous and human-operated vehicles. Since then, many manufacturers, including Audi, BMW, Bosch, Ford, GM, Lexus, Mercedes, Nissan, Tesla, Volkswagen, Volvo and Google, have launched self-driving vehicle projects in collaboration with universities [8]. Google’s self-driving car has been tested over 500 thousand kilometres, and Google has begun building prototypes of its own cars [9]. A completely autonomous vehicle would be expected to drive to a chosen location without any expectation of shared control with the driver, even for safety-critical tasks.
The performance of lane detection and tracking depends on well-developed roads and their lane markings, so smart cities are also a prominent factor in autonomous vehicle research. The idea of a smart city is often linked with that of an eco-city or a sustainable city, both of which seek to enhance the quality of municipal services while lowering their costs. Smart cities’ primary goal is to balance technological innovation with the economic, social, and environmental problems that tomorrow’s cities face. Greater closeness between government and people is required in smart cities that embrace the circular economy’s concepts [10]. The way materials and goods flow around people and their demands will alter, as will the structure of cities. Several car manufacturers, such as Tesla and Audi, have already begun marketing autonomous vehicles for private use. Soon, society will be influenced by the spread of autonomous vehicles to urban transport systems [11]. The development of smart cities with the introduction of connected and autonomous vehicles could potentially transform cities and guide long-term urban planning [10].
Autonomous vehicles and Advanced Driver Assistance Systems (ADAS) are predicted to provide a higher degree of safety and to reduce fuel and energy consumption and road traffic emissions. ADAS is implemented for safe and efficient driving and offers many driver assistance features, such as forward collision warning and safe lane change assistance [12]. Research shows that most accidents occur because of driver error, and ADAS can reduce both accidents and the driver’s workload. If there is a likelihood of an accident, ADAS can take the necessary action to avoid it [13]. Lane departure warning (LDW), which utilizes lane detection and tracking algorithms, is an essential feature of ADAS. The LDW warns the driver when the vehicle crosses white lane lines unintentionally and controls the vehicle by bringing it back onto the desired safe path. Three types of approaches for lane detection are usually discussed in the existing literature: the learning-based approach, the feature-based approach, and the model-based approach [13][14][15][16][17][18]. Many challenges and issues have been highlighted in the literature regarding LDW systems, such as changing visibility conditions, variation in images, and diversity in lane appearance [17]. Since different countries use various lane markers, lane detection and tracking algorithms face the challenge of handling this diversity.
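The LDW behaviour described above is only sketched at a high level in the literature summarized here; as a hedged illustration of the kind of decision rule involved, the fragment below flags an unintended lane departure when the estimated lateral offset approaches the lane boundary while no turn signal is active. The margin value and the signal names are assumptions made for this sketch, not parameters of any cited system.

```python
# A minimal lane-departure-warning decision rule (illustrative only).
# `lateral_offset_m` is the estimated distance of the vehicle centre from
# the lane centre (positive to the right); `lane_width_m` comes from the
# lane detector. The 0.2 m margin is an assumed tuning value.
def should_warn(lateral_offset_m: float, lane_width_m: float,
                turn_signal_on: bool, margin_m: float = 0.2) -> bool:
    distance_to_boundary = lane_width_m / 2.0 - abs(lateral_offset_m)
    unintended = not turn_signal_on
    return unintended and distance_to_boundary < margin_m
```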
2. Lane Detection and Tracking Algorithms
The feature-based approach uses edges and local visual characteristics of interest, such as gradient, colour, brightness, texture, orientation, and their variations, which are relatively insensitive to road shape but sensitive to illumination effects. Model-based approaches fit global road models to low-level features; they are more robust against illumination effects but sensitive to road shape [13][14]. Geometric parameters are used in the model-based approach for lane detection [16][17][18]. The learning-based approach consists of two stages: training and classification. The training stage uses previously known errors and system properties, e.g., program variables, to construct a model. The classification stage then applies the trained model to the user’s set of properties and outputs the candidates most likely to be correlated with the error, ordered by their probability of fault disclosure [19].
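As a concrete illustration of the feature-based pipeline outlined above, the sketch below chains image smoothing, Canny edge detection, a region-of-interest mask, and a probabilistic Hough transform to extract candidate lane-line segments. It assumes OpenCV and NumPy are available; the thresholds and the trapezoidal region of interest are illustrative values rather than settings taken from the reviewed work.

```python
# A minimal feature-based lane detection sketch (illustrative only).
import cv2
import numpy as np

def detect_lane_lines(frame_bgr):
    """Return candidate lane-line segments as (x1, y1, x2, y2) tuples."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Image smoothing: suppress noise before edge detection.
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)

    # Gradient-based feature extraction (Canny edge detector).
    edges = cv2.Canny(blurred, 50, 150)

    # Keep only a trapezoidal region of interest in front of the vehicle.
    h, w = edges.shape
    roi = np.zeros_like(edges)
    polygon = np.array([[(0, h), (w, h), (int(0.55 * w), int(0.6 * h)),
                         (int(0.45 * w), int(0.6 * h))]], dtype=np.int32)
    cv2.fillPoly(roi, polygon, 255)
    masked = cv2.bitwise_and(edges, roi)

    # Probabilistic Hough transform to extract straight lane-line segments.
    lines = cv2.HoughLinesP(masked, rho=2, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=100)
    if lines is None:
        return []
    return [tuple(segment[0]) for segment in lines]
```

A model-based or learning-based system would replace the Hough step with a road-model fit or a trained classifier, respectively, but the smoothing and feature-extraction stages are broadly shared across the approaches discussed above.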
This discussion is followed up by summary tables (Table 1, Table 2, Table 3 and Table 4) that present the key features of these algorithms along with their strengths, weaknesses, and future prospects.
Table 1. A summary of methods used for lane detection and tracking with general remarks.
Table 2. A comprehensive summary of lane detection and tracking algorithms.
Table 3. A comprehensive summary of learning-based model predictive control for lane detection and tracking.
Table 4. A comprehensive summary of robust lane detection and tracking.
Some of the key observations from Table 2, Table 3 and Table 4 are summarized below:
- Frequent calibration is required for accurate decision making in a complex environment.
- Reinforcement learning combined with model predictive control could be a better choice to avoid false lane detection.
- Model-based approaches (robust lane detection and tracking) provide better results under different environmental conditions. Camera quality plays an important role in determining lane markings.
- The algorithm’s performance depends on the type of filter used, and the Kalman filter is the most commonly used filter for lane tracking (a minimal tracking sketch is given after this list).
- In a vision-based system, image smoothing is the initial lane detection and tracking stage and plays a vital role in increasing system performance.
- External disturbances, such as weather conditions, vision quality, shadows, and glare, and internal disturbances, such as lane markings that are too narrow, too wide, or unclear, degrade algorithm performance.
- The majority of researchers (>90%) have used custom datasets for their research.
- Monocular, stereo, and infrared cameras have been used to capture images and videos. The algorithm’s accuracy depends on the type of camera used, and a stereo camera gives better performance than a monocular camera.
- Lane markers can be occluded by a nearby vehicle during overtaking.
- There is an abrupt change in illumination as the vehicle exits a tunnel. Sudden changes in illumination affect image quality and degrade system performance.
- The results show that the lane detection and tracking efficiency rate under dry and light-rain conditions is near 99% in most scenarios. However, the efficiency of lane marking detection is significantly affected by heavy rain.
- System performance has been observed to drop due to unclear and degraded lane markings.
- An inertial measurement unit (IMU) and GPS are examples of sensors that help to improve the distance-measurement performance of RADAR and LIDAR.
- One of the biggest problems with today’s ADAS is that changes in environmental and weather conditions have a major effect on system performance.
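To complement the observation above that the Kalman filter is the most commonly used filter for lane tracking, the sketch below tracks the lateral offset and slope of a single lane line with a constant-velocity Kalman filter. The state layout, time step, and noise covariances are assumptions made for this sketch, not values reported in the surveyed papers.

```python
# A minimal Kalman-filter lane-tracking sketch (illustrative only).
# The state holds the lateral offset and slope of one lane line plus their
# rates of change; the measurement is the (offset, slope) pair produced by
# the per-frame lane detector. All noise values below are assumed.
import numpy as np

class LaneKalmanTracker:
    def __init__(self, dt=1.0 / 30.0):
        # State: [offset, slope, offset_rate, slope_rate]
        self.x = np.zeros(4)
        self.P = np.eye(4)
        # Constant-velocity transition model.
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        # Only offset and slope are measured.
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = 1e-3 * np.eye(4)   # process noise covariance (assumed)
        self.R = 1e-2 * np.eye(2)   # measurement noise covariance (assumed)

    def predict(self):
        # Propagate the state and covariance to the next frame.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, measured_offset, measured_slope):
        # Fuse the new detection with the prediction.
        z = np.array([measured_offset, measured_slope])
        y = z - self.H @ self.x                      # innovation
        S = self.H @ self.P @ self.H.T + self.R      # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

In practice the per-frame measurements would come from a detector such as the Hough-based sketch shown earlier, and the predict step can bridge frames in which the detector fails, for example under occlusion by an overtaking vehicle or a sudden illumination change at a tunnel exit.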