Mohammed, A.S.; Amamou, A.; Ayevide, F.K.; Kelouwani, S.; Agbossou, K.; Zioui, N. Evolution of Intelligent Vehicle Technology. Encyclopedia. Available online: https://encyclopedia.pub/entry/3210 (accessed on 29 March 2024).
Evolution of Intelligent Vehicle Technology

This entry traces the time evolution of intelligent vehicle technology, highlighting the development of intelligent vehicles and their safety applications and focusing on the various uses of perception sensors in production vehicles.

intelligent vehicle evolution; advanced driver assistance systems; perception sensors

1. Phase I (1980 to 2003)

During this phase, the dynamic stability of vehicles was one of the focal points. Inertial sensors incorporated into inertial measurement units (IMUs), combined with an odometer, were often used to improve vehicle stability, particularly on roads with several curves; this soon led to driver assistance features such as anti-lock braking systems (ABSs), followed by traction control (TC) and electronic stability control (ESC)[1]. Mercedes demonstrated the efficacy and importance of the combined ABS and ESC systems for human safety, and the "Moose Test" attracted public and official attention[2]. Nevertheless, safety concerns were at first limited to drivers and passengers; growing concern for the mobility and safety of people in the surrounding area led the way to the development of external sensors. In 1986, the European project PROMETHEUS[3], involving university research centers as well as transport and automotive companies, carried out basic studies on autonomous features ranging from collision prevention to cooperative driving to the environmental sustainability of vehicles. Within this framework, several different approaches to an intelligent transport system were designed, implemented, and demonstrated. In 1995, a research team led by Ernst Dickmanns used a Mercedes-Benz S-Class to embark on a journey of 1590 km from Munich (Germany) to Copenhagen (Denmark) and back, using saccadic computer vision and integrated memory microprocessors optimized for parallel processing to react in real time. The vehicle drove autonomously 95% of the time, reaching speeds of more than 175 km/h with minimal human intervention, and the experiment marked the way for computer vision technology.
In the same year, in July 1995, Carnegie Mellon University's NavLab5 traveled across the country on a "No Hands Across America" tour; the vehicle was instrumented with a vision camera, a GPS receiver, a gyroscope, and steering and wheel encoders. Neural networks were used to control the steering wheel, while the throttle and brakes were human controlled[4]. Later, in 1996, the University of Parma launched its ARGO project, which completed more than 2000 km of autonomous driving on public roads using a two-camera system for road following, platooning, and obstacle avoidance[5]. Meanwhile, other technologies around the world made their way into the market through various semi-autonomous vehicle applications. For example, to develop car parking assistance systems, ultrasonic sensors were used to detect obstacles in the surroundings. Initially, these systems had merely a warning function to help prevent collisions when moving in and out of parking spaces. Toyota introduced ultrasonic back sonar as a parking aid in the Toyota Corona in 1982 and continued its success until 1988[6]. Later, in 1998, the Mercedes-Benz adaptive cruise control radar was introduced; this feature was initially only usable at speeds greater than 30 km/h[7]. Gradually, autonomous and semi-autonomous highway concepts emerged, and major projects were announced to explore dynamic stability and obstacle detection sensors such as vision, radar, ultrasonic, differential GPS, and gyroscopes for road navigation. The navigation tasks included lane keeping, departure warning, and automatic curve warning[8][9]. Most of these projects were carried out in normal operating environments.
The phase concluded with the National Automated Highway System Consortium demonstration of automated driving functions[10] and discussion of seven specific topics related to automated vehicles: (i) driver assistance for safety, (ii) vehicle-to-vehicle communication, (iii) vehicle-to-environment communication, (iv) artificial intelligence and soft computing tools, (v) embedded high-performance hardware for sensor data processing, (vi) standards and best practices for efficient communication, and (vii) traffic analysis systems.

2. Phase II (2003 to 2008)

Several interesting projects marked the second phase, such as the first Defense Advanced Research Projects Agency (DARPA) Grand Challenge, the second DARPA Grand Challenge, and the DARPA Urban Challenge[11][12]. These three competitions were designed to accelerate the development of intelligent navigation and control by highlighting issues such as off-road navigation, high-speed detection, and collision avoidance with surroundings (such as pedestrians, cyclists, traffic lights, and signs). In addition, complex urban driving scenarios such as dense traffic and intersections were also addressed. The challenges demonstrated the potential of lidar sensors to perceive the environment and create 3D projections for managing the challenging urban navigation environment. The Velodyne HDL-64E[13], a 64-layer lidar, played a vital role for both the winning and the runner-up teams. During the competitions, vehicles had to navigate the real environment independently for a long time (several hours). The winner of the second Grand Challenge (the Stanford Racing Team, Stanford University) equipped its vehicle, Stanley, with five lidar units, a front camera, a GPS sensor, an IMU, wheel odometry, and two automotive radars. The winner of the Urban Challenge (2007), the Carnegie Mellon University team, equipped its vehicle, Boss, with a perception system made up of two video cameras, five radars, and 13 lidars (including a roof-mounted unit of the novel Velodyne HDL-64E). The success of the Grand Challenges highlighted important trends: the size and number of sensors increased significantly, leading to an increase in data acquisition density, which prompted several researchers to study different types of fusion algorithms.
Further studies of data acquisition density paved the way for the development of advanced driving maneuvers such as lane keeping and collision prevention with warning systems to help the driver avoid potential hazards. We also note that, although different challenges were addressed in the context of urban navigation competitions, all of them took place in clear weather conditions, and no specific report was provided on tests under varying climatic conditions.
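The fusion trend noted above can be illustrated with a minimal sketch, not any competing team's actual algorithm: inverse-variance fusion of two independent range readings of the same obstacle, one of the simplest ways to combine measurements from complementary sensors such as a lidar and a radar. The function name and all numeric values are illustrative assumptions.

```python
def fuse_ranges(z1: float, var1: float, z2: float, var2: float) -> tuple[float, float]:
    """Inverse-variance (maximum-likelihood) fusion of two independent
    range measurements of the same obstacle.

    Returns the fused estimate and its variance, which is always smaller
    than either input variance -- the statistical payoff of fusion.
    """
    w1 = 1.0 / var1  # weight = inverse variance: the more precise sensor counts more
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Hypothetical readings: lidar 20.0 m (variance 0.01 m^2), radar 20.4 m (variance 0.25 m^2)
est, est_var = fuse_ranges(20.0, 0.01, 20.4, 0.25)
```

The fused estimate lands between the two readings but much closer to the precise lidar value, which is why dense multi-sensor suites motivated the fusion research mentioned above.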

3. Phase III (from 2008)

The third phase combines driver assistance technology advancement and commercial development. The DARPA Challenges strengthened partnerships between car manufacturers and the education sector and mobilized several efforts to advance autonomous vehicles (AVs) in the automotive industry. This involved a collaboration between General Motors and Carnegie Mellon University (the Autonomous Driving Joint Research Lab) and a partnership between Volkswagen and Stanford University. Google's Driverless Car initiative moved autonomous-car research from the university lab into commercial development. In 2013, a Mercedes-Benz S-Class vehicle[14] produced by the Karlsruhe Institute of Technology/FZI (Forschungszentrum Informatik) and Daimler R&D drove 100 km from Mannheim to Pforzheim (Germany) completely autonomously in a project designed to enhance safety. The vehicle, equipped with a stereo vision system and several new-generation long-range and short-range radar sensors, followed the historic Bertha Benz memorial route. Phase III has focused on issues like traffic automation, cooperative driving, and intelligent road infrastructure. Among the major European Union initiatives, Highly Automated Vehicles for Intelligent Transport (HAVEit, 2008–2011)[15][16][17] tackled numerous driver assistance applications, such as adaptive cruise control, safe lane changing, and side monitoring. The sensor sets used in this project included a radar network, laser scanners, and ultrasonic sensors, together with advanced machine learning techniques and vehicle-to-vehicle (V2V) communication systems. The project produced safety architecture software for the management of smart actuators and temporary autopilot tasks for urban traffic with data redundancy, and it led to successful green driving systems.
Other platooning initiatives were Safe Road Trains for the Environment (SARTRE, 2009–2012)[18], the VisLab Intercontinental Autonomous Challenge (VIAC, 2010–2014)[19], the Grand Cooperative Driving Challenge, 2011[20], and the European Truck Platooning Challenge, 2016[21][22], major projects aimed at creating and testing cooperative driving strategies, including at intersections. Global innovation, testing, and deployment of AV technology called for the adoption of standard guidelines and regulations to ensure stable integration, which led to the introduction of SAE J3016. This standard defines six levels of driving automation, from 0 to 5, where at level 0 the driver manages all navigation tasks and at level 5 the computer manages them in all weather situations. In response, the Google driverless car project, begun in 2009, set out to create the most advanced driverless car (SAE autonomy level 5), featuring a rotating roof-mounted 64-beam lidar that creates 3D images of objects, helping the car gauge distance and image objects within an impressive 200 m range. The camera mounted on the windshield helps the car see objects right in front of it and record information about road signs and traffic lights. Four radars mounted on the front and rear bumpers make the car aware of the vehicles in front of and behind it and keep passengers and other motorists safe by avoiding bumps and crashes. To minimize the degree of uncertainty, GPS data are compared with sensor map data previously collected; the antenna fixed at the rear of the car receives information on the exact location of the car and updates the internal map. An ultrasonic sensor mounted on one of the rear wheels helps keep track of movements and warns the car about obstacles in the rear; usually, ultrasonic sensors are used for parking assistance.
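The six J3016 levels mentioned above can be spelled out in a small illustrative sketch. The level summaries are paraphrased from common descriptions of the standard rather than quoted from it, and the enum and helper names are hypothetical:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (summaries paraphrased)."""
    NO_AUTOMATION = 0           # driver performs all driving tasks
    DRIVER_ASSISTANCE = 1       # steering OR speed assistance (e.g., adaptive cruise control)
    PARTIAL_AUTOMATION = 2      # steering AND speed assistance; driver monitors constantly
    CONDITIONAL_AUTOMATION = 3  # system drives; driver must take over when requested
    HIGH_AUTOMATION = 4         # no takeover needed, but only within a limited operating domain
    FULL_AUTOMATION = 5         # no driver needed, under all conditions

def driver_must_supervise(level: SAELevel) -> bool:
    """At levels 0-2 the human driver remains responsible for monitoring the road."""
    return level <= SAELevel.PARTIAL_AUTOMATION
```

The key boundary sits between levels 2 and 3: below it the human supervises at all times, above it the system itself monitors the driving environment.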
Google researchers developed an infrastructure that was successfully tested over 2 million km on real roads; this technology now belongs to the company Waymo[23]. Nissan's Infiniti Q50 debuted in 2013 and became one of the company's most capable semi-autonomous cars and the first to use a virtual steering column. The model offers features such as lane changing, collision prevention, and cruise control, and it is equipped with cameras, radar, and other next-generation technology; the driver does not need to handle the accelerator, brake, or steering wheel[24]. Tesla entered the course of automated driving in 2014[25], with all its vehicles equipped with a monocular camera and an automotive radar that enabled autopilot level 2–3 functionality. In 2018, Mobileye, focusing on a vision-only approach to automated driving, presented an automated Ford demo with only 12 small, fully automated mono-cameras[26]. Besides these projects, there are many pilot projects in almost all G7 countries to improve the introduction rate of the ultimate driverless vehicle. Moreover, ADASs have achieved high technology readiness, and many car manufacturers are now deploying this technology in their mass-market vehicles. Although several key elements for automatic maneuvers have been successfully tested, these features do not yet perform fully under all weather conditions. In the next section, we discuss the various ADAS applications currently available in the market and their limitations in various weather conditions.

References

  1. Bengler, K.; Dietmayer, K.; Farber, B.; Maurer, M.; Stiller, C.; Winner, H. Three Decades of Driver Assistance Systems: Review and Future Perspectives. IEEE Intell. Transp. Syst. Mag. 2014, 6, 6–22.
  2. Diermeier, D. Mercedes and the Moose Test (B). Kellogg Sch. Manag. Cases 2017.
  3. Billington, J. The Prometheus Project: The Story Behind One of AV’s Greatest Developments. Available online: https://www.autonomousvehicleinternational.com/features/the-prometheus-project.html (accessed on 1 October 2019).
  4. Carnegie Mellon University. No Hands Across America Journal. Available online: https://www.cs.cmu.edu/~tjochem/nhaa/Journal.html (accessed on 1 October 2019).
  5. Bertozzi, M.; Broggi, A.; Conte, G.; Fascioli, R. The Experience of the ARGO Autonomous Vehicle. In Proceedings of the Enhanced and Synthetic Vision, Orlando, FL, USA, 13–17 April 1998.
  6. Automobile, W. Parking Sensor. Available online: https://en.wikipedia.org/wiki/Parking_sensor (accessed on 1 October 2019).
  7. Meinel, H.H. Evolving automotive radar—From the very beginnings into the future. In Proceedings of the 8th European Conference on Antennas and Propagation (EuCAP 2014), The Hague, The Netherlands, 6–11 April 2014; pp. 3107–3114.
  8. Bertozzi, M.; Broggi, A.; Fascioli, A. Vision-based Intelligent Vehicles: State of the Art and Perspectives. Robot. Auton. Syst. 2000, 32, 1–16.
  9. Broggi, A.; Bertozzi, M.; Fascioli, A. Architectural Issues on Vision-Based Automatic Vehicle Guidance: The Experience of the ARGO Project. Real-Time Imaging 2000, 6, 313–324.
  10. Thorpe, C.; Jochem, T.; Pomerleau, D. The 1997 automated highway free agent demonstration. In Proceedings of the Conference on Intelligent Transportation Systems, Boston, MA, USA, 12 October 1997; pp. 496–501.
  11. Urmson, C.; Duggins, D.; Jochem, T.; Pomerleau, D.; Thorpe, C. From Automated Highways to Urban Challenges. In Proceedings of the 2008 IEEE International Conference on Vehicular Electronics and Safety, Columbus, OH, USA, 22–24 September 2008; pp. 6–10.
  12. Bebel, J.C.; Howard, N.; Patel, T. An autonomous system used in the DARPA Grand Challenge. In Proceedings of the 7th International IEEE Conference on Intelligent Transportation Systems (IEEE Cat. No.04TH8749), Washington, DC, USA, 3–6 October 2004; pp. 487–490.
  13. Velodyne. HDL—64E Lidar. Available online: https://velodynelidar.com/products/hdl-64e/ (accessed on 1 October 2020).
  14. Dickmann, J.; Appenrodt, N.; Klappstein, J.; Bloecher, H.L.; Muntzinger, M.; Sailer, A.; Hahn, M.; Brenk, C. Making Bertha See Even More: Radar Contribution. IEEE Access 2015, 3, 1233–1247.
  15. Hoeger, R.; Amditis, A.; Kunert, M.; Hoess, A.; Flemisch, F.; Krueger, H.P.; Bartels, A.; Beutner, A.; Pagle, K. Highly Automated Vehicles For Intelligent Transport: Haveit Approach. 2008. Available online: https://www.researchgate.net/publication/225000799_HIGHLY_AUTOMATED_VEHICLES_FOR_INTELLIGENT_TRANSPORT_HAVEit_APPROACH (accessed on 1 October 2019).
  16. Vanholme, B.; Gruyer, D.; Lusetti, B.; Glaser, S.; Mammar, S. Highly Automated Driving on Highways Based on Legal Safety. IEEE Trans. Intell. Transp. Syst. 2013, 14, 333–347.
  17. Thomaidis, G.; Kotsiourou, C.; Grubb, G.; Lytrivis, P.; Karaseitanidis, G.; Amditis, A. Multi-sensor tracking and lane estimation in highly automated vehicles. IET Intell. Transp. Syst. 2013, 7, 160–169.
  18. Dávila, A.; Nombela, M. Sartre–Safe Road Trains for the Environment Reducing Fuel Consumption through lower Aerodynamic Drag Coefficient. SAE Tech. Pap. 2011.
  19. Bertozzi, M.; Bombini, L.; Broggi, A.; Buzzoni, M.; Cardarelli, E.; Cattani, S.; Cerri, P.; Debattisti, S.; Fedriga, R.; Felisa, M.; et al. The VisLab Intercontinental Autonomous Challenge: 13,000 km, 3 months, no driver. In Proceedings of the 17th World Congress on ITS, Busan, Korea, 1–5 September 2010.
  20. Englund, C.; Chen, L.; Ploeg, J.; Semsar-Kazerooni, E.; Voronov, A.; Bengtsson, H.H.; Didoff, J. The Grand Cooperative Driving Challenge 2016: Boosting the Introduction of Cooperative Automated Vehicles. 2016. Available online: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7553038&isnumber=7553013 (accessed on 1 October 2019).
  21. R. Tom Alkim. European Truck Platooning Challenge. Available online: http://wiki.fot-net.eu/index.php/European_Truck_Platooning_Challenge (accessed on 1 October 2019).
  22. Tsugawa, S.; Jeschke, S.; Shladover, S.E. A Review of Truck Platooning Projects for Energy Savings. IEEE Trans. Intell. Veh. 2016, 1, 68–77.
  23. Poczte, S.L.; Jankovic, L.M. The Google Car: Driving Toward A Better Future? J. Bus. Case Stud. First Quart. 2014, 10, 1–8.
  24. Nissan. Nissan Q50 2014. Available online: http://www.nissantechnicianinfo.mobi/htmlversions/2013_Q50_Special/Safety.html (accessed on 1 October 2019).
  25. Tesla. Available online: https://en.wikipedia.org/wiki/Tesla_Autopilot (accessed on 1 October 2019).
  26. Ford. Ford and Mobileye to Offer Better Camera-Based Collision Avoidance Tech. Available online: https://www.kbford.com/blog/2020/july/28/ford-and-mobileye-to-offer-better-camera-based-collision-avoidance-tech.htm (accessed on 1 October 2019).
Update Date: 25 Nov 2020