Show simple item record

dc.contributor.author: Alotaibi, Basmah K.
dc.contributor.author: Khan, Fakhri Alam
dc.contributor.author: Qawqzeh, Yousef
dc.contributor.author: Jeon, Gwanggil
dc.contributor.author: Camacho, David
dc.date: 2024
dc.date.accessioned: 2025-01-27T16:52:16Z
dc.date.available: 2025-01-27T16:52:16Z
dc.identifier.citation: Alotaibi, B. K., Khan, F. A., Qawqzeh, Y., Jeon, G., & Camacho, D. Performance and Communication Cost of Deep Neural Networks in Federated Learning Environments: An Empirical Study.
dc.identifier.issn: 1989-1660
dc.identifier.uri: https://reunir.unir.net/handle/123456789/17627
dc.description.abstract: Federated learning, a distributed cooperative learning approach, allows clients to train a model locally on their own data and share the trained model with a central server. When developing a federated learning environment, a deep/machine learning model must be chosen. This choice can affect both model performance and communication cost, since federated learning requires exchanging the model between clients and a central server over several rounds. In this work, we provide an empirical study of the impact of three neural network models (CNN, VGG, and ResNet) on image classification tasks over two datasets (CIFAR-10 and CIFAR-100) in a federated learning environment. We investigate the impact of these models on global model performance and communication cost under two data distributions: IID and non-IID. The results indicate that the CNN and ResNet models converge faster than the VGG model and incur lower communication costs. In contrast, the VGG model must share many bits over several rounds to achieve higher accuracy under the IID setting, and its accuracy under non-IID distributions is lower than that of the other models. Furthermore, a light model such as CNN provides results comparable to the deeper neural network models at lower communication cost, even though it may require more communication rounds to reach the target accuracy on both datasets; the CNN model shares fewer bits per communication than the other models.
dc.language.iso: eng
dc.publisher: International Journal of Interactive Multimedia and Artificial Intelligence (IJIMAI)
dc.relation.uri: https://www.ijimai.org/journal/bibcite/reference/3520
dc.rights: openAccess
dc.subject: communication cost
dc.subject: Convolutional Neural Network (CNN)
dc.subject: Deep Neural Networks
dc.subject: distributed learning
dc.subject: federated learning
dc.subject: neural networks
dc.subject: performance
dc.subject: Residual Neural Network (ResNet)
dc.subject: Visual Geometry Group (VGG)
dc.title: Performance and Communication Cost of Deep Neural Networks in Federated Learning Environments: An Empirical Study
dc.type: Articulo Revista Indexada (indexed journal article)
reunir.tag: ~ARI
dc.identifier.doi: https://doi.org/10.9781/ijimai.2024.12.001
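The abstract describes the round-based model exchange that drives communication cost: each client updates the global model locally, the server averages the returned models, and every round costs roughly the model size in bits. The following is a minimal sketch of that loop under simplifying assumptions (plain NumPy weight vectors stand in for the paper's CNN/VGG/ResNet models, and `local_update`, `fedavg_round`, and `communication_bits` are illustrative helpers, not the authors' code):

```python
import numpy as np

def local_update(global_weights, local_gradient, lr=0.1):
    """Client step: start from the global model and apply one
    (hypothetical) gradient step computed on local data."""
    return global_weights - lr * local_gradient

def fedavg_round(global_weights, client_gradients, lr=0.1):
    """Server step: collect each client's locally updated model
    and average them into the new global model (FedAvg-style)."""
    client_models = [local_update(global_weights, g, lr)
                     for g in client_gradients]
    return np.mean(client_models, axis=0)

def communication_bits(weights, bits_per_param=32):
    """Cost of one model upload/download: parameter count times
    precision. Larger models (e.g. VGG) thus cost more per round."""
    return weights.size * bits_per_param

# One communication round with two clients on a toy 4-parameter model.
w = np.zeros(4)
grads = [np.ones(4), 2 * np.ones(4)]   # each client's local gradient
w = fedavg_round(w, grads, lr=0.1)     # new global model: all -0.15
cost = communication_bits(w)           # 4 params * 32 bits = 128
```

This makes the paper's trade-off concrete: accuracy depends on how many such rounds are run, while total communication cost is roughly rounds times `communication_bits`, so a lighter model can win on cost even if it needs more rounds.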


Files in this item


This item appears in the following collection(s)
