The time evolution of intelligent vehicle technology is explained below, highlighting the development of intelligent vehicles and their safety applications, with a focus on the various uses of perception sensors in production vehicles.
During this phase, the dynamic stability of vehicles was a focal point. Inertial sensors integrated into inertial measurement units (IMUs) and combined with an odometer were often used to improve vehicle stability, particularly on winding roads; this soon led to driver-assistance systems such as anti-lock braking systems (ABSs), followed by traction control (TC) and electronic stability control (ESC) [10]. Mercedes demonstrated the efficacy and life-saving importance of combined ABS and ESC systems, and the “Moose Test” attracted public and official attention [11]. Nevertheless, safety concerns were at first limited to drivers and passengers; growing concern for the safety of people in the vehicle's surroundings paved the way for the development of external sensors. In 1986, the European project PROMETHEUS [12], involving university research centers as well as transport and automotive companies, carried out basic studies on autonomous features ranging from collision prevention to cooperative driving and the environmental sustainability of vehicles. Within this framework, several different approaches to an intelligent transport system were designed, implemented, and demonstrated. In 1995, this vision work laid the foundation for a research team led by Ernst Dickmanns, which equipped a Mercedes-Benz S-Class and embarked on a journey of 1590 km from Munich (Germany) to Copenhagen (Denmark) and back, using real-time computer vision and integrated-memory microprocessors optimized for parallel processing. The experiment marked a milestone for computer vision technology: the vehicle was driven autonomously 95% of the time, at speeds of more than 175 km/h, with minimal human intervention.
In the same year, in July 1995, Carnegie Mellon University’s NavLab5 traveled across the country on the “No Hands Across America” tour; the vehicle was instrumented with a vision camera, a GPS receiver, a gyroscope, and steering and wheel encoders. Neural networks controlled the steering wheel, while the throttle and brakes remained human-controlled [13]. Later, in 1996, the University of Parma launched its ARGO project, which completed more than 2000 km of autonomous driving on public roads using a two-camera system for road following, platooning, and obstacle avoidance [14]. Meanwhile, other technologies around the world made their way into the market through various semi-autonomous vehicle applications. For example, parking-assistance systems used ultrasonic sensors to detect obstacles in the vehicle's surroundings. Initially, these systems had merely a warning function to help prevent collisions when moving in and out of parking spaces. Toyota introduced ultrasonic back sonar as a parking aid in the Toyota Corona in 1982 and continued offering it until 1988 [15]. Later, in 1998, Mercedes-Benz introduced its radar-based adaptive cruise control, which was initially usable only at speeds greater than 30 km/h [16]. Gradually, autonomous and semi-autonomous highway concepts emerged, and major projects were announced to explore dynamic stability and obstacle-detection sensors such as vision, radar, ultrasonic, differential GPS, and gyroscopes for road navigation. The navigation tasks included lane keeping, lane-departure warning, and automatic curve warning [17,18]. Most of these projects were carried out in normal operating environments.
This phase concluded with the National Automated Highway System Consortium [19] demonstration of automated driving functions and its discussion of seven specific topics related to automated vehicles: (i) driver assistance for safety, (ii) vehicle-to-vehicle communication, (iii) vehicle-to-environment communication, (iv) artificial intelligence and soft-computing tools, (v) embedded high-performance hardware for sensor data processing, (vi) standards and best practices for efficient communication, and (vii) traffic analysis systems.
This entry is adapted from the peer-reviewed paper 10.3390/s20226532