Recent Advanced Driver Assistance Systems (ADAS) improve safety and comfort while driving. However, ADAS typically rely only on telemetry and environment data; active driver monitoring remains an open challenge due to high data variability among individual users and the complexity of modelling driver behaviour.
Despite these challenges, active driver monitoring could lead to safer human-car interaction and more reliable shared control. Understanding the driver's focus of attention and anticipating potential mistakes allows the system either to alert the driver or to take over control before potentially dangerous maneuvers take place.
This talk will show how the driver's focus of attention, cognitive states and decision intentions can be effectively inferred from non-invasive physiological and behavioural signals, such as gaze and head movements, combined with easy-to-access telemetry driving inputs.