
    Human Observers and Automated Assessment of Dynamic Emotional Facial Expressions: KDEF-dyn Database Validation

    Author:
    Calvo, Manuel G; Fernández-Martín, Andrés; Recio, Guillermo; Lundqvist, Daniel
    Date:
    26/10/2018
    Keywords:
    facial expression; dynamic; action units; KDEF; FACET; JCR; Scopus
    Journal / publisher:
    Frontiers in Psychology
    Item type:
    Indexed journal article
    URI:
    https://reunir.unir.net/handle/123456789/7937
    Web address:
    https://www.frontiersin.org/articles/10.3389/fpsyg.2018.02052/full
    Open Access
    Abstract:
    Most experimental studies of facial expression processing have used static stimuli (photographs), yet facial expressions in daily life are generally dynamic. In its original photographic format, the Karolinska Directed Emotional Faces (KDEF) has been frequently utilized. In the current study, we validate a dynamic version of this database, the KDEF-dyn. To this end, we applied animation between neutral and emotional expressions (happy, sad, angry, fearful, disgusted, and surprised; 1,033-ms unfolding) to 40 KDEF models, with morphing software. Ninety-six human observers categorized the expressions of the resulting 240 video-clip stimuli, and automated face analysis assessed the evidence for 6 expressions and 20 facial action units (AUs) at 31 intensities. Low-level image properties (luminance, signal-to-noise ratio, etc.) and other purely perceptual factors (e.g., size, unfolding speed) were controlled. Human recognition performance (accuracy, efficiency, and confusions) patterns were consistent with prior research using static and other dynamic expressions. Automated assessment of expressions and AUs was sensitive to intensity manipulations. Significant correlations emerged between human observers' categorization and automated classification. The KDEF-dyn database aims to provide a balance between experimental control and ecological validity for research on emotional facial expression processing. The stimuli and the validation data are available to the scientific community.
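
    As a rough illustration of the kind of human-automated comparison reported in the abstract (not the authors' actual analysis pipeline), the sketch below correlates per-stimulus human recognition accuracy with automated classifier evidence scores. The file name and column names are hypothetical placeholders, not part of the KDEF-dyn distribution.

        # Minimal sketch, assuming a hypothetical per-stimulus CSV with one row per
        # video clip: the proportion of observers who chose the intended emotion
        # ("human_accuracy"), the automated evidence score for that same emotion
        # ("automated_evidence"), and the emotion label ("emotion").
        import pandas as pd
        from scipy.stats import pearsonr

        stimuli = pd.read_csv("kdef_dyn_stimuli.csv")  # hypothetical file name

        # Overall correlation between human categorization and automated classification.
        r, p = pearsonr(stimuli["human_accuracy"], stimuli["automated_evidence"])
        print(f"Overall: r = {r:.2f}, p = {p:.4f}")

        # The same correlation broken down by emotion category.
        for emotion, group in stimuli.groupby("emotion"):
            r_e, p_e = pearsonr(group["human_accuracy"], group["automated_evidence"])
            print(f"{emotion}: r = {r_e:.2f} (n = {len(group)})")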
    This item appears in the following collection(s)
    • Artículos Científicos WOS y SCOPUS

    Usage statistics

    Year          2012  2013  2014  2015  2016  2017  2018  2019  2020  2021  2022  2023  2024  2025
    Views            0     0     0     0     0     0     0    97    47    27    34    35    86    43
    Downloads        0     0     0     0     0     0     0     0     0     0     0     0     0     0

    Related items

    Showing items related by title, author or subject.

    • Discrimination between smiling faces: Human observers vs. automated face analysis 

      Del Líbano, Mario; Calvo, Manuel G; Fernández-Martín, Andrés; Recio, Guillermo (Acta Psychologica, 06/2018)
      This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; ...
    • Recognition Thresholds for Static and Dynamic Emotional Faces 

      Calvo, Manuel G; Avero, Pedro; Fernández-Martín, Andrés; Recio, Guillermo (Emotion, 12/2016)
      We investigated the minimum expressive intensity that is required to recognize (above chance) static and dynamic facial expressions of happiness, sadness, anger, disgust, fear, and surprise. To this end, we varied the ...
    • Selective eye fixations on diagnostic face regions of dynamic emotional expressions: KDEF-dyn database 

      Calvo, Manuel G.; Fernández-Martín, Andrés; Gutiérrez-García, Aida; Lundqvist, Daniel (Scientific Reports, 19/11/2018)
      Prior research using static facial stimuli (photographs) has identified diagnostic face regions (i.e., functional for recognition) of emotional expressions. In the current study, we aimed to determine attentional orienting, ...
