Use this identifier to reference this record:
https://hdl.handle.net/1822/89584
Full record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Gonçalves, Carolina | por |
dc.contributor.author | Lopes, João Pedro Mendes | por |
dc.contributor.author | Moccia, Sara | por |
dc.contributor.author | Berardini, Daniele | por |
dc.contributor.author | Migliorelli, Lucia | por |
dc.contributor.author | Santos, Cristina | por |
dc.date.accessioned | 2024-03-15T11:33:44Z | - |
dc.date.issued | 2023-05-05 | - |
dc.identifier.issn | 0957-4174 | - |
dc.identifier.uri | https://hdl.handle.net/1822/89584 | - |
dc.description.abstract | Gait disabilities are among the most frequent impairments worldwide. Their treatment increasingly relies on rehabilitation therapies, in which smart walkers are being introduced to promote the user's recovery and autonomy while reducing the clinicians' effort. To do so, these devices should decode human motion and needs as early as possible. Current walkers decode motion intention using information gathered from wearable or embedded sensors, namely inertial units, force sensors, Hall sensors, and lasers, whose main limitations are either an expensive solution or a hindered perception of human movement. Smart walkers commonly lack an advanced, seamless human–robot interaction that intuitively and promptly understands human motions. A contactless approach is proposed in this work, addressing human motion decoding as an early action recognition/detection problem using RGB-D cameras. We studied different deep learning-based algorithms, organised into three approaches, to process lower-body RGB-D video sequences recorded from a camera embedded in a smart walker and to classify them into four classes (stop, walk, turn right/left). A custom dataset involving 15 healthy participants walking with the walker device was acquired and prepared, resulting in 28,800 balanced RGB-D frames used to train and evaluate the deep learning networks. The best results were attained by a convolutional neural network with a channel-wise attention mechanism, reaching accuracy values of 99.61% and above 93% for offline early detection/recognition and trial simulations, respectively. Following the hypothesis that human lower-body features encode prominent information, fostering a more robust prediction towards real-time applications, the algorithms' focus was also evaluated quantitatively using the Dice metric, leading to values slightly higher than 30%. Promising results were attained for early action detection as a human motion decoding strategy, with enhancements in the focus of the proposed architectures. | por |
dc.description.sponsorship | This work has been supported by the Fundação para a Ciência e Tecnologia (FCT), Portugal, with the Reference Scholarship under Grant 2020.05708.BD and under the national support to R&D units grant, through the reference projects UIDB/04436/2020 and UIDP/04436/2020. | por |
dc.language.iso | eng | por |
dc.publisher | Elsevier | por |
dc.relation | info:eu-repo/grantAgreement/FCT/POR_NORTE/2020.05708.BD/PT | por |
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F04436%2F2020/PT | por |
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDP%2F04436%2F2020/PT | por |
dc.rights | restrictedAccess | por |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ | por |
dc.subject | Deep learning | por |
dc.subject | Early action detection | por |
dc.subject | Early action recognition | por |
dc.subject | Human motion decoding | por |
dc.subject | Human–robot interaction | por |
dc.subject | RGB-D video | por |
dc.subject | Smart walker | por |
dc.title | Deep learning-based approaches for human motion decoding in smart walkers for rehabilitation | por |
dc.type | article | por |
dc.peerreviewed | yes | por |
dc.relation.publisherversion | https://www.sciencedirect.com/science/article/pii/S095741742300790X | por |
oaire.citationStartPage | 1 | por |
oaire.citationEndPage | 23 | por |
oaire.citationVolume | 228 | por |
dc.identifier.doi | 10.1016/j.eswa.2023.120288 | por |
dc.date.embargo | 10000-01-01 | - |
dc.subject.fos | Engineering and Technology::Electrical Engineering, Electronics and Informatics | por |
dc.subject.fos | Engineering and Technology::Medical Engineering | por |
sdum.journal | Expert Systems with Applications | por |
oaire.version | VoR | por |
dc.subject.ods | Good health and well-being | por |
Appears in collections: | CMEMS - Artigos em revistas internacionais/Papers in international journals |
Files in this record:
File | Description | Size | Format | |
---|---|---|---|---|
1-s2.0-S095741742300790X-main.pdf Restricted access! | | 6.23 MB | Adobe PDF | View/Open |
This work is licensed under a Creative Commons License
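The abstract reports that the best-performing model was a convolutional network with a channel-wise attention mechanism. As an illustration only (the paper's actual architecture, weights, and layer sizes are not given in this record), a channel-wise attention block in the squeeze-and-excitation style can be sketched in plain NumPy; all dimensions and the reduction ratio below are assumptions for the example:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat, w1, w2):
    """Squeeze-and-excitation style channel attention (illustrative sketch).

    feat: (C, H, W) feature map; w1: (C//r, C) and w2: (C, C//r) are the
    bottleneck weights with reduction ratio r. Returns the same-shaped
    feature map with each channel rescaled by a learned gate in (0, 1).
    """
    squeeze = feat.mean(axis=(1, 2))         # global average pool -> (C,)
    hidden = np.maximum(w1 @ squeeze, 0.0)   # bottleneck projection + ReLU
    gates = sigmoid(w2 @ hidden)             # one attention weight per channel
    return feat * gates[:, None, None]       # recalibrate the channels

# Toy usage with random weights (not trained): 8 channels, ratio r = 2.
rng = np.random.default_rng(0)
C, r = 8, 2
feat = rng.standard_normal((C, 16, 16))
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
out = channel_attention(feat, w1, w2)
print(out.shape)  # (8, 16, 16)
```

Because each gate lies in (0, 1), the block can only attenuate channels, which is what lets such a network emphasise the lower-body features the abstract identifies as most informative.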