
contributor author: Jie Ni
contributor author: Wanying Xie
contributor author: Yiping Liu
contributor author: Jike Zhang
contributor author: Yugu Wan
contributor author: Huimin Ge
date accessioned: 2024-04-27T22:32:17Z
date available: 2024-04-27T22:32:17Z
date issued: 2024/01/01
identifier other: 10.1061-JTEPBS.TEENG-7802.pdf
identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4296886
description abstract: Accurate driver emotion recognition is one of the key challenges in building intelligent vehicle safety assistance systems. In this paper, we conduct a driving simulator study on driver emotion recognition. Taking the car-following scene as an example, multimodal parameters of drivers in five emotional states (neutral, joy, fear, sadness, and anger) are obtained from an emotion induction experiment and a simulated driving experiment. Wavelet denoising and baseline-removal (debase) processing are used to reduce the influence of signal noise and of individual differences between drivers. Statistical-domain and time-frequency-domain features of the electrophysiological response signals, nasal-tip temperature signals, and vehicle behavior signals are analyzed. Factor analysis is used to extract and reduce the feature parameters, and driver emotion recognition models are established with machine learning methods such as random forest (RF), K-nearest-neighbor (KNN), and extreme gradient boosting (XGBoost). Verification and comparison of individual modalities and modality combinations across the different machine learning methods show that the RF model based on the combined features of all three modalities achieves the best recognition performance. The research results can provide a theoretical basis for driver emotion recognition in intelligent vehicles and have positive significance for promoting the development of human-computer interaction (HCI) systems in intelligent vehicles and improving road traffic safety.
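
The abstract outlines a signal-processing and classification pipeline: wavelet denoising, baseline removal, statistical features per modality, factor-analysis feature reduction, and RF/KNN/XGBoost classifiers over five emotion classes. The sketch below is a minimal illustration of such a pipeline, not the authors' implementation: it uses synthetic placeholder data instead of the paper's datasets, assumes common Python libraries (NumPy, PyWavelets, scikit-learn), shows only the random-forest branch, and all window lengths, thresholds, and hyperparameters are invented for demonstration.

# Illustrative sketch of a wavelet-denoise / debase / factor-analysis / RF pipeline.
# Synthetic data and hyperparameters are placeholders, not values from the paper.
import numpy as np
import pywt
from sklearn.decomposition import FactorAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

EMOTIONS = ["neutral", "joy", "fear", "sadness", "anger"]

def wavelet_denoise(signal, wavelet="db4", level=3):
    """Soft-threshold the detail coefficients to suppress measurement noise."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # MAD noise estimate
    thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

def debase(signal, baseline):
    """Subtract a per-driver baseline to reduce individual differences."""
    return signal - np.mean(baseline)

def window_features(signal):
    """Simple statistical-domain features for one signal window."""
    return [signal.mean(), signal.std(), signal.min(), signal.max(),
            np.percentile(signal, 75) - np.percentile(signal, 25)]

# Synthetic stand-ins for the three modality channels (electrophysiological
# response, nasal-tip temperature, vehicle behavior), one window per sample.
rng = np.random.default_rng(0)
n_samples, win = 500, 256
X, y = [], []
for _ in range(n_samples):
    label = rng.integers(len(EMOTIONS))
    feats = []
    for _channel in range(3):
        raw = rng.normal(label * 0.1, 1.0, win)               # toy channel signal
        clean = debase(wavelet_denoise(raw), baseline=raw[:32])
        feats.extend(window_features(clean))
    X.append(feats)
    y.append(label)
X, y = np.asarray(X), np.asarray(y)

# Factor analysis for feature reduction, then a random-forest classifier.
X_fa = FactorAnalysis(n_components=8, random_state=0).fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X_fa, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("toy accuracy:", accuracy_score(y_te, clf.predict(X_te)))
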
publisher: ASCE
title: Driver Emotion Recognition Involving Multimodal Signals: Electrophysiological Response, Nasal-Tip Temperature, and Vehicle Behavior
type: Journal Article
journal volume: 150
journal issue: 1
journal title: Journal of Transportation Engineering, Part A: Systems
identifier doi: 10.1061/JTEPBS.TEENG-7802
journal firstpage: 04023125-1
journal lastpage: 04023125-11
page: 11
tree: Journal of Transportation Engineering, Part A: Systems; 2024; Volume 150; Issue 1
contenttype: Fulltext

