Show simple item record

dc.contributor.author: Banos, Oresti
dc.contributor.author: Calatroni, Alberto
dc.contributor.author: Damas, Miguel
dc.contributor.author: Pomares, Héctor
dc.contributor.author: Roggen, Daniel
dc.contributor.author: Rojas, Ignacio
dc.contributor.author: Villalonga, Claudia
dc.date: 2021
dc.date.accessioned: 2022-05-25T11:38:28Z
dc.date.available: 2022-05-25T11:38:28Z
dc.identifier.issn: 1370-4621
dc.identifier.uri: https://reunir.unir.net/handle/123456789/13168
dc.description.abstract: Recognizing human activities seamlessly and ubiquitously is now closer than ever given the myriad of sensors readily deployed on and around users. However, the training of recognition systems continues to be both time- and resource-consuming, as datasets must be collected ad-hoc for each specific sensor setup a person may encounter in their daily life. This work presents an alternate approach based on transfer learning to opportunistically train new unseen or target sensor systems from existing or source sensor systems. The approach uses system identification techniques to learn a mapping function that automatically translates the signals from the source sensor domain to the target sensor domain, and vice versa. This can be done for sensor signals of the same or cross modality. Two transfer models are proposed to translate recognition systems based on either activity templates or activity models, depending on the characteristics of both source and target sensor systems. The proposed transfer methods are evaluated in a human–computer interaction scenario, where the transfer is performed in between wearable sensors placed at different body locations, and in between wearable sensors and an ambient depth camera sensor. Results show that a good transfer is possible with just a few seconds of data, irrespective of the direction of the transfer and for similar and cross sensor modalities.
dc.language.iso: eng
dc.publisher: Springer
dc.relation.ispartofseries: vol. 53, nº 5
dc.relation.uri: https://link.springer.com/article/10.1007/s11063-021-10468-z
dc.rights: restrictedAccess
dc.subject: activity recognition
dc.subject: ambient sensors
dc.subject: human–computer interaction
dc.subject: multimodal sensors
dc.subject: transfer learning
dc.subject: wearable sensors
dc.subject: Scopus
dc.subject: JCR
dc.title: Opportunistic Activity Recognition in IoT Sensor Ecosystems via Multimodal Transfer Learning
dc.type: article
reunir.tag: ~ARI
dc.identifier.doi: https://doi.org/10.1007/s11063-021-10468-z
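
The abstract describes learning a mapping function, via system identification, that translates signals from a source sensor domain to a target sensor domain so an existing recognizer can be reused. As a rough illustration only, the following minimal Python sketch fits a linear least-squares mapping from a few seconds of hypothetical paired sensor data; the linear form of the mapping, the array names, and the simulated data are assumptions for illustration, not the authors' actual estimator.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical paired calibration data: a few seconds of 3-axis signals
    # recorded simultaneously by the source and target sensors (assumed here
    # as 250 samples). A_true exists only to generate synthetic data.
    n_samples = 250
    source = rng.standard_normal((n_samples, 3))
    A_true = rng.standard_normal((3, 3))
    target = source @ A_true + 0.05 * rng.standard_normal((n_samples, 3))

    # System-identification step (sketch): estimate a linear mapping W that
    # translates source-domain signals into the target domain, via ordinary
    # least squares.
    W, _, _, _ = np.linalg.lstsq(source, target, rcond=None)

    # Translation step: new source-domain signals can now be expressed in
    # the target domain, where the existing recognizer operates.
    new_source = rng.standard_normal((100, 3))
    translated = new_source @ W

    print("mapping estimation error:", np.linalg.norm(W - A_true))

A linear map is the simplest possible stand-in here; the same translate-then-recognize idea extends to richer system-identification models when the relationship between sensor domains is nonlinear or cross-modal.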


Files in this item

No files associated with this item.

This item appears in the following collection(s)
