Show simple item record

dc.contributor.author        Li, Yuanfeng
dc.contributor.author        Deng, Jiangang
dc.contributor.author        Wu, Qun
dc.contributor.author        Wang, Ying
dc.date                      2021-12
dc.date.accessioned          2022-05-09T10:43:19Z
dc.date.available            2022-05-09T10:43:19Z
dc.identifier.issn           1989-1660
dc.identifier.uri            https://reunir.unir.net/handle/123456789/13046
dc.description.abstract      Utilizing biomedical signals as a basis to calculate human affective states is an essential issue in affective computing (AC). With in-depth research on affective signals, the combination of multi-modal cognition and physiological indicators, the establishment of dynamic and complete databases, and the addition of high-tech innovative products have become recent trends in AC. This research aims to develop a deep gradient convolutional neural network (DGCNN) for classifying affect using eye-tracking signals. General signal-processing tools and pre-processing methods were applied first, such as Kalman filtering, Hamming windowing, the short-time Fourier transform (STFT), and the fast Fourier transform (FFT). Secondly, the eye-movement and tracking signals were converted into images. A convolutional neural network-based training structure was subsequently applied; the experimental dataset was acquired with an eye-tracking device by presenting four affective stimuli (nervous, calm, happy, and sad) to 16 participants. Finally, the performance of the DGCNN was compared with a decision tree (DT), a Bayesian Gaussian model (BGM), and k-nearest neighbors (KNN) using the indices of true positive rate (TPR) and false positive rate (FPR). Mini-batch size, loss, learning rate, and gradient definitions were also customized for the training structure of the deep neural network. The predictive classification matrix showed the effectiveness of the proposed method for eye-movement and tracking signals, achieving more than 87.2% accuracy. This research provides a feasible way to achieve more natural human-computer interaction through eye-movement and tracking signals and has potential application in the affective product design process.    es_ES
dc.language.iso              eng    es_ES
dc.publisher                 International Journal of Interactive Multimedia and Artificial Intelligence (IJIMAI)    es_ES
dc.relation.ispartofseries   vol. 7, nº 2
dc.relation.uri              https://www.ijimai.org/journal/bibcite/reference/2961    es_ES
dc.rights                    openAccess    es_ES
dc.subject                   affective computing    es_ES
dc.subject                   convolutional neural network (CNN)    es_ES
dc.subject                   eye detection    es_ES
dc.subject                   deep gradient    es_ES
dc.subject                   convolutional neural network    es_ES
dc.subject                   short-time Fourier transform    es_ES
dc.subject                   IJIMAI    es_ES
dc.title                     Eye-Tracking Signals Based Affective Classification Employing Deep Gradient Convolutional Neural Networks    es_ES
dc.type                      article    es_ES
reunir.tag                   ~IJIMAI    es_ES
dc.identifier.doi            https://doi.org/10.9781/ijimai.2021.06.002
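The abstract describes a pre-processing pipeline in which eye-tracking signals are Hamming-windowed, transformed with the STFT, and converted into images before CNN training. A minimal sketch of that windowing/STFT step, assuming a 1-D gaze trace; the sampling rate, frame parameters, and synthetic signal below are illustrative, not taken from the paper:

```python
import numpy as np

def stft_spectrogram(signal, frame_len=64, hop=16):
    """Hamming-windowed short-time Fourier transform of a 1-D signal.
    Returns a magnitude spectrogram (freq_bins x time_frames) that can
    be rendered or saved as an image for CNN input."""
    window = np.hamming(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    # slice the signal into overlapping frames and apply the window
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    # FFT of each windowed frame; rfft keeps only non-negative frequencies
    spec = np.abs(np.fft.rfft(frames, axis=1))
    return spec.T

# hypothetical eye-tracking trace: gaze x-coordinate sampled at 120 Hz
fs = 120
t = np.arange(0, 4, 1 / fs)
gaze_x = np.sin(2 * np.pi * 3 * t) + 0.1 * np.random.randn(len(t))
spec = stft_spectrogram(gaze_x)  # shape: (frame_len // 2 + 1, n_frames)
```

Each column of `spec` is the magnitude spectrum of one windowed frame; stacking the columns yields the kind of time-frequency image that the abstract says is fed to the network.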


