Advanced Driver-Assistance Systems (ADASs) are used to increase safety in the automotive domain, yet current ADASs notably operate without taking drivers’ states into account, e.g., whether the driver is emotionally apt to drive.
In recent years, the automotive field has been pervaded by an increasing level of automation. This automation has introduced new possibilities with respect to manual driving. Among all the technologies for vehicle driving assistance, on-board Advanced Driver-Assistance Systems (ADASs), employed in cars, trucks, etc.[1], offer remarkable opportunities to improve the quality of driving, safety, and security for both drivers and passengers. Examples of ADAS technologies are Adaptive Cruise Control (ACC)[2], Anti-lock Braking System (ABS)[3], alcohol ignition interlock devices[4], automotive night vision[5], collision avoidance systems[6], driver drowsiness detection[7], Electronic Stability Control (ESC)[8], Forward Collision Warnings (FCW)[9], Lane Departure Warning System (LDWS)[10], and Traffic Sign Recognition (TSR)[11]. Most ADASs consist of electronic systems developed to adapt and enhance vehicle safety and driving quality, and they have been proven to reduce road fatalities by compensating for human errors. To this end, the safety features provided by ADASs target accident and collision avoidance, usually realizing safeguards, alarms, notifications, and blinking lights, and, if necessary, taking control of the vehicle itself.
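To make the collision-avoidance features above more concrete, the following is a minimal sketch of the time-to-collision (TTC) logic a Forward Collision Warning system might use. The function names, the 2.5 s threshold, and the constant-closing-speed assumption are illustrative choices of ours, not taken from any of the cited systems.

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """TTC assuming constant closing speed; infinite if the gap is opening."""
    if closing_speed_mps <= 0.0:  # lead vehicle is pulling away or matching speed
        return float("inf")
    return distance_m / closing_speed_mps

def fcw_alert(distance_m: float, ego_speed_mps: float, lead_speed_mps: float,
              warn_threshold_s: float = 2.5) -> bool:
    """Raise a forward-collision warning when TTC drops below the threshold."""
    ttc = time_to_collision(distance_m, ego_speed_mps - lead_speed_mps)
    return ttc < warn_threshold_s

# Example: 30 m gap, ego at 25 m/s, lead at 15 m/s -> TTC = 3.0 s, no alert yet.
print(fcw_alert(30.0, 25.0, 15.0))  # False
print(fcw_alert(20.0, 25.0, 15.0))  # True (TTC = 2.0 s)
```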
ADASs rely on the following assumptions: (i) the driver is attentive and emotionally ready to perform the right operation at the right time, and (ii) the system is capable of building a proper model of the surrounding world and of making decisions or raising alerts accordingly. Unfortunately, even though modern vehicles are equipped with complex ADASs (such as the aforementioned ones), their presence only partially reduces the number of crashes. In fact, the human driver is still the most critical factor in about 94% of crashes[12].
However, most current ADASs implement only simple mechanisms to take drivers’ states into account, or do not take them into account at all. An ADAS informed about the driver’s state could take contextualized decisions compatible with his/her possible reactions. Knowing the driver’s state means continuously recognizing whether the driver is physically, emotionally, and physiologically apt to drive the vehicle, as well as effectively communicating the ADAS’s decisions to the driver. Such an in-vehicle system to monitor drivers’ alertness and performance is very challenging to obtain and, indeed, would come with many issues:
- The incorrect estimation of a driver’s state, of the status of the ego-vehicle (also denoted as subject vehicle or Vehicle Under Test (VUT) and referring to the vehicle containing the sensors perceiving the environment around the vehicle itself)[13][14], and of the external environment may cause the ADAS to activate incorrectly or to make wrong decisions. Besides the immediate danger, wrong decisions reduce drivers’ confidence in the system.
- Many sensors are needed to achieve such an ADAS. Sensors are prone to errors and require several processing layers to produce usable outputs, where each layer introduces delays and may hide or corrupt data (see the sketch after this list).
- Dependable systems that recognize emotions and human states are still a research challenge. They are usually built around algorithms that take heterogeneous data, provided by different sensing technologies, as input parameters, which may introduce unexpected errors into the system.
- Effective communication between the ADAS and the driver is hard to achieve. Indeed, human distraction plays a critical role in car accidents[15] and can be caused by both external and internal causes.
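As a concrete illustration of the sensor-pipeline issue in the list above, the sketch below aligns two heterogeneous, differently sampled signals (a wearable heart-rate stream and camera-derived eye-closure features) onto a common timeline and tracks the latency each processing layer adds. All names, sampling rates, window sizes, and delays are illustrative assumptions, not part of any architecture described in this paper.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    t: float               # acquisition timestamp (s)
    value: float
    latency: float = 0.0   # processing delay accumulated so far (s)

def moving_average(samples: list[Sample], window: int, per_step_delay: float) -> list[Sample]:
    """A smoothing layer: reduces noise but adds group delay and hides fast transients."""
    dt = samples[1].t - samples[0].t  # assume a uniform sampling period
    out = []
    for i in range(window - 1, len(samples)):
        block = samples[i - window + 1 : i + 1]
        out.append(Sample(
            t=block[-1].t,
            value=sum(s.value for s in block) / window,
            latency=block[-1].latency + per_step_delay + (window - 1) / 2 * dt,
        ))
    return out

def latest_before(samples: list[Sample], t: float) -> Sample:
    """Zero-order-hold fusion: take each stream's most recent sample at time t."""
    return max((s for s in samples if s.t <= t), key=lambda s: s.t)

# Two heterogeneous streams: heart rate at 1 Hz, camera eye closure at 30 Hz (simulated).
hr  = [Sample(t=i * 1.0,  value=70 + (i % 3))          for i in range(10)]
eye = [Sample(t=i / 30.0, value=0.10 + 0.01 * (i % 5)) for i in range(300)]

hr_f  = moving_average(hr,  window=3,  per_step_delay=0.02)
eye_f = moving_average(eye, window=15, per_step_delay=0.005)

t_fuse = 8.0
h, e = latest_before(hr_f, t_fuse), latest_before(eye_f, t_fuse)
print(f"HR={h.value:.1f} bpm (effective age {t_fuse - h.t + h.latency:.2f}s), "
      f"eye closure={e.value:.3f} (effective age {t_fuse - e.t + e.latency:.3f}s)")
```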
In this paper, we first present a literature review on the application of human state recognition to ADASs, covering psychological models, the sensors employed for capturing physiological signals, algorithms used for human emotion classification, and algorithms for human–car interaction. In doing so, we highlight some of the main aspects that researchers are trying to address.
Moreover, the complex processing tasks of modern ADASs are increasingly tackled by AI-oriented techniques. AI can solve complex classification tasks that were previously thought to be very hard (or even impossible), and human state estimation is a typical task that can be approached by AI classifiers. At the same time, the use of AI classifiers brings new challenges. As an example, an ADAS could potentially be improved by reliably identifying the driver’s emotional state, e.g., in order to activate haptic alarms in case of an imminent forward collision. Even if such an ADAS could tolerate a few misclassifications, the AI component for human state classification needs very high accuracy to reach automotive-grade reliability. Hence, it should be possible to prove that a classifier is sufficiently robust against unexpected data[16].
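As a minimal illustration of what "robust against unexpected data" can mean in practice, the sketch below compares a trained classifier’s accuracy on clean test inputs against the same inputs perturbed with Gaussian noise. The classifier, synthetic dataset, and noise levels are placeholder assumptions of ours, not the verification procedure referenced in [16].

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Placeholder stand-in for physiological features -> driver-state labels.
X, y = make_classification(n_samples=2000, n_features=12, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
clean_acc = clf.score(X_te, y_te)

# "Unexpected data": the same test set corrupted by sensor-like Gaussian noise.
rng = np.random.default_rng(0)
for sigma in (0.1, 0.5, 1.0):
    noisy_acc = clf.score(X_te + rng.normal(0.0, sigma, X_te.shape), y_te)
    print(f"sigma={sigma}: clean={clean_acc:.3f}, perturbed={noisy_acc:.3f}, "
          f"drop={clean_acc - noisy_acc:.3f}")
```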
We then introduce a novel perception architecture for ADASs based on the idea of the Driver Complex State (DCS). The DCS component monitors the driver’s behavior via multiple non-obtrusive sensors and AI algorithms, providing cognitive and emotional state classifications to the ADAS. We argue that this approach is a smart way to improve safety for all occupants of a vehicle. We believe that, to be successful, the system must adopt unobtrusive sensing technologies for the detection of human parameters, safe and transparent AI algorithms that satisfy stringent automotive requirements, and innovative Human–Machine Interface (HMI) functionalities. Our ultimate goal is to provide solutions that improve in-vehicle ADASs, increasing safety, comfort, and performance in driving. The concept will be implemented and validated in the recently EU-funded NextPerception project[17], which will be briefly introduced.
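To make the DCS idea more tangible, here is a minimal sketch of how such a perception component might expose fused driver-state estimates to an ADAS. Every type, field, and threshold below is an illustrative assumption of ours, not the architecture defined in the NextPerception project.

```python
from dataclasses import dataclass
from enum import Enum

class EmotionalState(Enum):
    CALM = "calm"
    STRESSED = "stressed"
    ANGRY = "angry"

@dataclass
class DriverComplexState:
    """Fused driver-state estimate exposed to the ADAS (illustrative fields)."""
    drowsiness: float          # 0.0 (alert) .. 1.0 (asleep), e.g., from eye closure
    emotion: EmotionalState    # e.g., from facial expression + physiological signals
    emotion_confidence: float  # classifier confidence in [0, 1]
    heart_rate_bpm: float      # e.g., from a wearable or seat-embedded sensor

def adapt_warning_strategy(dcs: DriverComplexState) -> str:
    """Example policy: choose how a forward-collision warning alerts the driver."""
    if dcs.drowsiness > 0.7:
        return "haptic+audio"          # drowsy drivers may miss visual cues
    if dcs.emotion is EmotionalState.STRESSED and dcs.emotion_confidence > 0.8:
        return "early-gentle-visual"   # warn earlier, in a less startling way
    return "standard-visual"

print(adapt_warning_strategy(DriverComplexState(0.8, EmotionalState.CALM, 0.9, 72.0)))
```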
In the automotive sector, the ADAS industry is a growing segment that aims to increase the adoption of industry-wide functional safety in accordance with several quality standards, e.g., the automotive-oriented ISO 26262 standard[21]. ADASs increasingly rely on standardized computer systems, such as the Vehicle Information Access API[22], the Volkswagen Infotainment Web Interface (VIWI) protocol[23], and On-Board Diagnostics (OBD) codes[24], to name a few.
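As an example of one such standardized interface, the snippet below reads live vehicle data over OBD-II using the third-party python-OBD library. The library choice and the queried commands are our assumptions; the text above only mentions OBD codes in general.

```python
import obd  # third-party python-OBD library (pip install obd)

# Auto-detects an OBD-II adapter on a serial/USB/Bluetooth port.
connection = obd.OBD()

for command in (obd.commands.SPEED, obd.commands.RPM, obd.commands.COOLANT_TEMP):
    response = connection.query(command)
    if not response.is_null():
        print(f"{command.name}: {response.value}")  # values carry physical units
```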
In order to achieve advanced ADASs beyond semiautonomous driving, there is a clear need for appropriate knowledge of the driver’s status. These cooperative systems are captured in the Society of Automotive Engineers (SAE) hierarchy of driving automation levels, summarized in Figure 1. The levels range from level 0 (manual driving) to level 5 (fully autonomous vehicle), with the intermediate levels representing semiautonomous driving situations with a mixed degree of driver–vehicle cooperation.
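For reference, the SAE levels just mentioned can be captured in code as a simple enumeration. The short descriptions are paraphrased from common SAE J3016 summaries, and the type itself is only an illustrative sketch.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels (descriptions paraphrased)."""
    NO_AUTOMATION      = 0  # manual driving; the driver performs all tasks
    DRIVER_ASSISTANCE  = 1  # single assist feature, e.g., ACC or lane keeping
    PARTIAL_AUTOMATION = 2  # combined assists; the driver must supervise constantly
    CONDITIONAL        = 3  # system drives in some conditions; driver takes over on request
    HIGH_AUTOMATION    = 4  # no driver attention needed within a defined domain
    FULL_AUTOMATION    = 5  # fully autonomous vehicle in all conditions

def driver_supervision_required(level: SAELevel) -> bool:
    """At levels 0-2 the human driver is responsible for monitoring the environment."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_supervision_required(SAELevel(2)))  # True
```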
According to SAE levels, in these mixed systems, automation is partial and does not cover every possible anomalous condition that can happen during driving. Therefore, the driver’s active presence and his/her reaction capability remain critical. In addition, the complex data processing needed for higher automation levels will almost inevitably require various forms of AI (e.g., Machine Learning (ML) components), in turn bringing security and reliability issues.
Driver Monitoring Systems (DMSs) are a novel type of ADAS that has emerged to help predict driving maneuvers, driver intent, and vehicle and driver states, with the aim of improving transportation safety and driving experience as a whole[25]. For instance, by coupling sensing information with accurate lane changing prediction models, a DMS can prevent accidents by warning the driver ahead of time of potential danger[26]. As a measure of the effectiveness of this approach, progressive advancements of DMSs can be found in a number of review papers. Lane changing models have been reviewed in [27], while in [28][29], developments in driver’s intent prediction with emphasis on real-time vehicle trajectory forecasting are surveyed. The work in [30] reviews driver skills and driving behavior recognition models. A review of the cognitive components of driver behavior can also be found in [7], where situational factors that influence driving are addressed. Finally, a recent survey on human behavior prediction can be found in [31].
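As a hedged sketch of the lane-change-prediction idea behind such DMSs, the following trains a toy classifier that maps simple steering and turn-indicator features computed over a short time window to a lane-change intent label. The synthetic features, data, and model choice are illustrative assumptions of ours, not any of the models surveyed in [26] or [27].

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

def synth_window(lane_change: bool) -> list[float]:
    """Toy features over a short window: [mean steering angle (rad),
    steering variance, turn indicator on, lateral drift rate (m/s)]."""
    if lane_change:
        return [rng.normal(0.05, 0.02), rng.normal(0.02, 0.005),
                float(rng.random() < 0.8), rng.normal(0.4, 0.1)]
    return [rng.normal(0.0, 0.01), rng.normal(0.005, 0.002),
            float(rng.random() < 0.05), rng.normal(0.0, 0.05)]

X = np.array([synth_window(i % 2 == 0) for i in range(1000)])
y = np.array([i % 2 == 0 for i in range(1000)])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[:800], y[:800])
print(f"held-out accuracy: {clf.score(X[800:], y[800:]):.2f}")

# A DMS could warn the driver when the predicted intent conflicts with, e.g.,
# an occupied adjacent lane reported by the surround sensors.
```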