Journal Article

DERNet: Driver Emotion Recognition Using Onboard Camera
Document Type
Periodical
Source
IEEE Intelligent Transportation Systems Magazine, 16(2):117-132, Apr. 2024
Subject
Transportation
Aerospace
Computing and Processing
Components, Circuits, Devices and Systems
Fields, Waves and Electromagnetics
Vehicles
Emotion recognition
Feature extraction
Deep learning
Training
Benchmark testing
Magnetic resonance imaging
Language
ISSN
1939-1390
1941-1197
Abstract
Driver emotion is considered an essential factor associated with driving behavior and thus influences traffic safety. Dynamically and accurately recognizing drivers' emotions plays an important role in road safety, especially for professional drivers, e.g., the drivers of passenger service vehicles. However, there is no benchmark for quantitatively evaluating driver emotion recognition performance, especially across different application situations. In this article, we propose an emotion recognition benchmark based on the driver emotion facial expression (DEFE) dataset, which consists of two splits: training and testing on the same set of drivers (split 1) and on different sets of drivers (split 2). These two splits correspond to different application scenarios and pose distinct challenges. For the former, a driver emotion recognition network is proposed to provide a competitive baseline for the benchmark. For the latter, a novel driver representation difference minimization loss is proposed to enhance the learning of representations for emotion recognition that are common across drivers. Moreover, the minimum information required to achieve satisfactory performance is also explored on split 2. Comprehensive experiments on the DEFE dataset clearly demonstrate the superiority of the proposed methods over other state-of-the-art methods. An example application of the proposed methods, combined with a voting mechanism, to real-world data collected in a naturalistic environment demonstrates their strong practicality and readiness. The code and dataset splits are publicly available at https://github.com/wdy806/CDERNet/.
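The two evaluation protocols described in the abstract can be sketched in code. Below is a minimal, hypothetical illustration of the distinction between split 1 (subject-dependent: the same drivers appear in training and testing) and split 2 (subject-independent: disjoint driver sets), plus a simple majority-vote aggregation of the kind the abstract mentions for the real-world application. Function names, the `(driver_id, clip_id)` sample representation, and the vote scheme are assumptions for illustration, not the DEFE dataset's or DERNet's actual API.

```python
# Hedged sketch of the two benchmark splits and a voting mechanism.
# Sample format (driver_id, clip_id) and all names are illustrative assumptions.
from collections import Counter
import random

def split_same_drivers(samples, test_ratio=0.2, seed=0):
    """Split 1 (subject-dependent): clips are shuffled and partitioned,
    so every driver may appear in both training and test sets."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_ratio)
    return shuffled[n_test:], shuffled[:n_test]

def split_different_drivers(samples, test_ratio=0.2, seed=0):
    """Split 2 (subject-independent): whole drivers are held out,
    so the training and test driver sets are disjoint."""
    drivers = sorted({driver for driver, _ in samples})
    rng = random.Random(seed)
    rng.shuffle(drivers)
    n_test = max(1, int(len(drivers) * test_ratio))
    test_drivers = set(drivers[:n_test])
    train = [s for s in samples if s[0] not in test_drivers]
    test = [s for s in samples if s[0] in test_drivers]
    return train, test

def majority_vote(frame_predictions):
    """Aggregate per-frame emotion labels into one clip-level label."""
    return Counter(frame_predictions).most_common(1)[0][0]
```

Split 2 is the harder setting because the model must generalize to unseen drivers, which is why the abstract introduces a loss that minimizes representation differences across drivers.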