Simple item record
Emotion classification on eye-tracking and electroencephalograph fused signals employing deep gradient neural networks
dc.contributor.author | Wu, Qun | |
dc.contributor.author | Dey, Nilanjan | |
dc.contributor.author | Shi, Fuqian | |
dc.contributor.author | González-Crespo, Rubén | |
dc.contributor.author | Sherratt, Simon | |
dc.date | 2021 | |
dc.date.accessioned | 2022-05-26T09:10:01Z | |
dc.date.available | 2022-05-26T09:10:01Z | |
dc.identifier.issn | 1568-4946 | |
dc.identifier.uri | https://reunir.unir.net/handle/123456789/13175 | |
dc.description.abstract | Emotion produces complex neural processes and physiological changes under appropriate event stimulation. Physiological signals have the advantage of reflecting a person's actual emotional state better than facial expressions or voice signals. An electroencephalogram (EEG) is a signal obtained by collecting, amplifying, and recording the weak bioelectric signals of the human brain on the scalp. The eye-tracking (E.T.) signal records the potential difference between the retina and the cornea as well as the potential generated by the eye-movement muscles. Furthermore, different modalities of physiological signals contain different representations of human emotion, and exploiting this cross-modal information helps achieve higher recognition accuracy. In this research, the E.T. and EEG signals are synchronized and fused, and an effective deep learning (DL) method is used to combine the different modalities. This article proposes a technique based on a fusion model of the Gaussian mixed model (GMM) with Butterworth and Chebyshev signal filters. First, features are extracted from the EEG and E.T. signals: the self-similarity (SSIM), energy (E), complexity (C), high-order crossing (HOC), and power spectral density (PSD) for EEG, and the electrooculography power density estimation (EOG-PDE), center gravity frequency (CGF), frequency variance (F.V.), and root mean square frequency (RMSF) for E.T. The max–min method is then applied for vector normalization. Finally, a deep gradient neural network (DGNN) for EEG and E.T. multimodal signal classification is proposed. The proposed network predicted emotions in an eight-emotion event-stimuli experiment with 88.10% accuracy. On the evaluation indices of accuracy (Ac), precision (Pr), recall (Re), F-measure (Fm), the precision–recall (P.R.) curve, the true-positive rate (TPR) of the receiver operating characteristic (ROC) curve, the area under the curve (AUC), the true-accept rate (TAR), and intersection over union (IoU), the proposed method also performs with high efficiency compared with several typical neural networks, including the artificial neural network (ANN), SqueezeNet, GoogLeNet, ResNet-50, DarkNet-53, ResNet-18, Inception-ResNet, Inception-v3, and ResNet-101. | es_ES |
dc.language.iso | eng | es_ES |
dc.publisher | Elsevier Ltd | es_ES |
dc.relation.ispartofseries | ;vol. 110 | |
dc.relation.uri | https://www.sciencedirect.com/science/article/abs/pii/S1568494621006736?via%3Dihub | es_ES |
dc.rights | restrictedAccess | es_ES |
dc.subject | electroencephalogram | es_ES |
dc.subject | emotion stimuli | es_ES |
dc.subject | eye-tracking | es_ES |
dc.subject | fused deep neural network | es_ES |
dc.subject | Gaussian mixed model | es_ES |
dc.subject | signal process | es_ES |
dc.subject | Scopus | es_ES |
dc.subject | JCR | es_ES |
dc.title | Emotion classification on eye-tracking and electroencephalograph fused signals employing deep gradient neural networks | es_ES |
dc.type | article | es_ES |
reunir.tag | ~ARI | es_ES |
dc.identifier.doi | https://doi.org/10.1016/j.asoc.2021.107752 |
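The preprocessing pipeline summarized in the abstract (band-pass filtering of the raw signals followed by max–min vector normalization) can be sketched as below. This is a minimal illustration using a standard SciPy Butterworth filter; the paper's actual filter orders, pass bands, Chebyshev stage, and GMM fusion details are not given in this record, so every parameter here is a hypothetical placeholder.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_filter(signal, low_hz, high_hz, fs, order=4):
    """Zero-phase Butterworth band-pass filter for a 1-D signal.

    low_hz/high_hz are the pass-band edges in Hz; fs is the sampling rate.
    filtfilt applies the filter forward and backward, so no phase shift
    is introduced into the physiological signal.
    """
    nyquist = fs / 2.0
    b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="band")
    return filtfilt(b, a, signal)

def max_min_normalize(features):
    """Scale a feature vector to [0, 1] via the max-min method:
    (x - min) / (max - min)."""
    features = np.asarray(features, dtype=float)
    f_min, f_max = features.min(), features.max()
    return (features - f_min) / (f_max - f_min)

if __name__ == "__main__":
    # Simulated 4-second EEG-like trace at 250 Hz: a 10 Hz component
    # (inside a hypothetical 8-30 Hz band) plus 50 Hz interference.
    fs = 250.0
    t = np.arange(1000) / fs
    raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
    filtered = bandpass_filter(raw, 8.0, 30.0, fs)
    normalized = max_min_normalize(filtered)
    print(normalized.min(), normalized.max())
```

The band edges (8–30 Hz) and the 250 Hz sampling rate are illustrative only; real EEG and E.T. channels would each be filtered with their own parameters before feature extraction and fusion.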
Files in this item
Files | Size | Format | View |
---|---|---|---|
There are no files associated with this item. |