Show simple item record

dc.contributor.author: Wu, Qun
dc.contributor.author: Dey, Nilanjan
dc.contributor.author: Shi, Fuqian
dc.contributor.author: González-Crespo, Rubén
dc.contributor.author: Sherratt, Simon
dc.date: 2021
dc.date.accessioned: 2022-05-26T09:10:01Z
dc.date.available: 2022-05-26T09:10:01Z
dc.identifier.issn: 1568-4946
dc.identifier.uri: https://reunir.unir.net/handle/123456789/13175
dc.description.abstract: Emotion produces complex neural processes and physiological changes under appropriate event stimulation. Physiological signals have the advantage of reflecting a person's actual emotional state better than facial expressions or voice signals. An electroencephalogram (EEG) is a signal obtained by collecting, amplifying, and recording the human brain's weak bioelectric signals on the scalp. The eye-tracking (E.T.) signal records the potential difference between the retina and the cornea and the potential generated by the eye movement muscles. Furthermore, different modalities of physiological signals contain different representations of human emotions, and exploiting this cross-modal information helps achieve higher recognition accuracy. In this research, the E.T. and EEG signals are synchronized and fused, and an effective deep learning (DL) method is used to combine the modalities. This article proposes a technique based on a fusion model of a Gaussian mixture model (GMM) with Butterworth and Chebyshev signal filters. Features are then extracted from the EEG and E.T. signals: self-similarity (SSIM), energy (E), complexity (C), high order crossing (HOC), and power spectral density (PSD) for EEG, and electrooculography power density estimation (EOG-PDE), center of gravity frequency (CGF), frequency variance (F.V.), and root mean square frequency (RMSF) for E.T.; the max–min method is applied for vector normalization. Finally, a deep gradient neural network (DGNN) for EEG and E.T. multimodal signal classification is proposed. The proposed neural network predicted emotions in the eight-emotion event-stimuli experiment with 88.10% accuracy. On the evaluation indices of accuracy (Ac), precision (Pr), recall (Re), F-measure (Fm), the precision–recall (P.R.) curve, the true-positive rate (TPR) of the receiver operating characteristic (ROC) curve, the area under the curve (AUC), the true-accept rate (TAR), and intersection over union (IoU), the proposed method also performs efficiently compared with several typical neural networks, including the artificial neural network (ANN), SqueezeNet, GoogLeNet, ResNet-50, DarkNet-53, ResNet-18, Inception-ResNet, Inception-v3, and ResNet-101. (A minimal illustrative sketch of the filtering and normalization steps follows this record.) [es_ES]
dc.language.iso: eng [es_ES]
dc.publisher: Elsevier Ltd [es_ES]
dc.relation.ispartofseries: vol. 110
dc.relation.uri: https://www.sciencedirect.com/science/article/abs/pii/S1568494621006736?via%3Dihub [es_ES]
dc.rights: restrictedAccess [es_ES]
dc.subject: electroencephalogram [es_ES]
dc.subject: emotion stimuli [es_ES]
dc.subject: eye-tracking [es_ES]
dc.subject: fused deep neural network [es_ES]
dc.subject: gaussian mixed model [es_ES]
dc.subject: signal process [es_ES]
dc.subject: Scopus [es_ES]
dc.subject: JCR [es_ES]
dc.title: Emotion classification on eye-tracking and electroencephalograph fused signals employing deep gradient neural networks [es_ES]
dc.type: article [es_ES]
reunir.tag: ~ARI [es_ES]
dc.identifier.doi: https://doi.org/10.1016/j.asoc.2021.107752
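The abstract above describes a generic preprocessing pipeline: band-pass filtering with Butterworth or Chebyshev filters, feature extraction such as power spectral density (PSD), and max–min vector normalization. The paper's own code is not part of this record, so the following is only a minimal Python sketch of those standard steps; the sampling rate, band edges, filter order, ripple, and the synthetic input signal are all assumptions, and the GMM fusion and DGNN classifier from the paper are deliberately not reproduced.

import numpy as np
from scipy.signal import butter, cheby1, filtfilt, welch

def bandpass_butterworth(x, low_hz, high_hz, fs, order=4):
    # Zero-phase Butterworth band-pass filter (one of the two filter types the abstract names).
    b, a = butter(order, [low_hz, high_hz], btype="bandpass", fs=fs)
    return filtfilt(b, a, x)

def bandpass_chebyshev(x, low_hz, high_hz, fs, order=4, ripple_db=0.5):
    # Zero-phase Chebyshev type-I band-pass filter (the other filter type named).
    b, a = cheby1(order, ripple_db, [low_hz, high_hz], btype="bandpass", fs=fs)
    return filtfilt(b, a, x)

def mean_band_power(x, fs):
    # A PSD-style feature: mean power of Welch's periodogram estimate.
    _, pxx = welch(x, fs=fs, nperseg=min(256, len(x)))
    return float(np.mean(pxx))

def max_min_normalize(v):
    # Max-min normalization of a feature vector to [0, 1], as the abstract describes.
    v = np.asarray(v, dtype=float)
    span = v.max() - v.min()
    return (v - v.min()) / span if span > 0 else np.zeros_like(v)

# Demo on a synthetic 1 s "EEG" trace (a 10 Hz tone plus noise); fs = 256 Hz is an assumed value.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(fs)
filtered = bandpass_butterworth(eeg, 1.0, 45.0, fs)
features = max_min_normalize([mean_band_power(filtered, fs),
                              float(np.std(filtered)),
                              float(np.max(np.abs(filtered)))])
print(features)

The three features fed to max_min_normalize here are placeholders; the paper's actual feature set (SSIM, E, C, HOC, PSD for EEG; EOG-PDE, CGF, F.V., RMSF for E.T.) would be computed per signal window and normalized the same way before classification.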


Files in this item

There are no files associated with this item.

This item appears in the following collection(s)
