Deep Learning in Visual and Wearable Sensing for Motion Analysis and Healthcare
A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Biosensors".
Deadline for manuscript submissions: 31 May 2025
Special Issue Editors
Dr. Sebastian Fudickar
Interests: medical data science; machine learning; pattern recognition; activity recognition; motion capture; sensor technologies; medical informatics
Prof. Dr. Björn Krüger
Interests: motion capture; sensor technologies; digital health; machine learning; computer animation
Prof. Dr. Marcin Grzegorzek
Interests: biomedical engineering; artificial intelligence; pattern recognition; machine vision; machine learning; medical sensors
Special Issue Information
Dear Colleagues,
We are pleased to announce this Special Issue, which aims to gather articles investigating deep learning approaches in visual and wearable sensing, for example for motion analysis and healthcare applications. The issue is intended to contribute to the field of machine learning and to cover a broad spectrum of applications in the medical domain.
Applications may include (but are not limited to) diagnostics, activity recognition, motion tracking, motion analysis of body parts, and rehabilitation support. As sensor technologies are diverse, we welcome all papers exploring the use of wearable sensors or ambient sensors (such as RGB(D) image/video, millimeter-wave radar, etc.).
Dr. Sebastian Fudickar
Prof. Dr. Björn Krüger
Prof. Dr. Marcin Grzegorzek
Guest Editors
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
Keywords
- deep learning techniques
- computer vision
- wearable sensors
- accelerometers
- gyroscopes
- magnetometers
- multimodal sensing
- EMG and force sensors
- human activity recognition
- human movement/gait analysis
Benefits of Publishing in a Special Issue
- Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
- Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
- Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
- External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
- e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.
Further information on MDPI's Special Issue policies can be found here.
Planned Papers
The list below represents only planned manuscripts. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.
- Type of Paper: Article
- Tentative Title: 3D Semantic Label Transfer and Matching in Human-Robot Collaboration
- Authors: László Kopácsi (1,2), Benjámin Baffy (2), Gábor Baranyi (2), Joul Skaf (2), Gábor Sörös (3), Szilvia Szeier (2), András Lőrincz (2), Daniel Sonntag (1,4)
- Abstract: Allocentric semantic 3D maps are highly useful for a variety of human-machine interactions, since egocentric instructions can be derived by the machine for the human partner. Class labels, however, may differ or be missing for the participants due to their different perspectives. To overcome this issue, we extend an existing real-time 3D semantic reconstruction pipeline with semantic matching across human and robot viewpoints. We use deep recognition networks, which usually perform well from higher (i.e., human) viewpoints but are inferior from lower ones, such as the viewpoint of a small robot. We propose several approaches for acquiring semantic labels for such unusual perspectives. We start with a partial semantic reconstruction from the human perspective and extend it to the new, unusual perspective using superpixel segmentation and the geometry of the surroundings. The quality of the reconstruction is evaluated in the Habitat simulator and in a real environment using Intel's small OpenBot robot, which we equipped with an RGBD camera. We show that the proposed approach provides high-quality semantic segmentation from the robot's perspective, with accuracy comparable to the original. In addition, we exploit the gained information to improve the recognition performance of the deep network for the lower viewpoints and show that the small robot alone is capable of generating high-quality semantic maps for the human partner. Furthermore, as computations run close to real time, the approach enables interactive applications.
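The core label-transfer idea in this abstract (propagating semantic labels from an already reconstructed viewpoint into a new, lower viewpoint using superpixels and scene geometry) could be sketched roughly as follows. This is a minimal illustration only, assuming a pinhole camera model and per-superpixel majority voting; all function names and parameters are hypothetical and do not reflect the authors' actual implementation.

```python
# Hypothetical sketch of superpixel-based semantic label transfer, loosely
# following the approach outlined in the abstract. Names and parameters are
# illustrative assumptions, not the authors' code.
import numpy as np
from skimage.segmentation import slic

def transfer_labels(rgb, points_3d, labels, K, T_world_to_cam, n_segments=400):
    """Propagate 3D semantic labels into a new (e.g., robot) viewpoint.

    rgb            : (H, W, 3) image from the new viewpoint
    points_3d      : (N, 3) labeled points from the partial reconstruction
    labels         : (N,) non-negative integer class labels for those points
    K              : (3, 3) camera intrinsics of the new viewpoint
    T_world_to_cam : (4, 4) extrinsics mapping world to camera coordinates
    """
    h, w, _ = rgb.shape

    # Project the labeled reconstruction into the new camera.
    pts_h = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    cam = (T_world_to_cam @ pts_h.T).T[:, :3]
    in_front = cam[:, 2] > 0                      # keep points in front of camera
    uvz = (K @ cam[in_front].T).T
    uv = (uvz[:, :2] / uvz[:, 2:3]).astype(int)   # perspective division
    lbl = labels[in_front]

    # Discard projections that fall outside the image.
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    uv, lbl = uv[inside], lbl[inside]

    # Oversegment the new view; each superpixel votes on its label.
    segments = slic(rgb, n_segments=n_segments, compactness=10)
    out = np.full((h, w), -1, dtype=int)          # -1 marks "unknown"
    for seg_id in np.unique(segments):
        hits = segments[uv[:, 1], uv[:, 0]] == seg_id
        if hits.any():
            # Majority vote among projected labels inside this superpixel.
            votes = np.bincount(lbl[hits])
            out[segments == seg_id] = votes.argmax()
    return out
```

In this sketch, the superpixels supply the geometric smoothing mentioned in the abstract: sparse projected labels are spread over coherent image regions, so the new viewpoint receives dense labels even where the partial reconstruction is thin.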