Proceeding Paper

Gait Segmentation and Grouping in Daily Data Collected from Wearable IMU Sensors †

1 Graduate School of Science and Technology, Keio University, Yokohama 223-8522, Japan
2 Advanced Manufacturing Technology Innovation Center, Guangzhou Institute of Technology, Xidian University, Guangzhou 510555, China
3 Department of Information and Computer Science, Faculty of Science and Technology, Keio University, Yokohama 223-8522, Japan
* Author to whom correspondence should be addressed.
Presented at the 10th International Electronic Conference on Sensors and Applications (ECSA-10), 15–30 November 2023; Available online: https://ecsa-10.sciforum.net/.
Eng. Proc. 2023, 58(1), 46; https://doi.org/10.3390/ecsa-10-16192
Published: 15 November 2023

Abstract

Gait analysis plays a vital role in medicine as it can help diagnose illnesses, monitor recovery, and measure physical performance. Related work in gait analysis has primarily utilized laboratory data due to their inherently low noise and ease of preprocessing. Daily data, gathered through wearable sensors, can also significantly impact medical care. Nonetheless, working with such data poses numerous challenges. This paper proposes an algorithm to address the problems associated with gait segmentation of daily data obtained by inertial measurement units (IMUs) in wearable devices. The proposed algorithm handles time-series data collected by wearable IMU sensors that contain noise and multiple gait types. It identifies the start and end points of each gait segment within the time series and groups segments of the same gait type together.

1. Introduction

Gait detection technology plays a vital role in the field of medical health [1]. Gait serves as a valuable indicator of an individual’s physical well-being, encompassing substantial information regarding their overall health [2]. Monitoring people’s gait in daily life facilitates the timely identification of alterations in their underlying health status. The collection of human gait data can be achieved through the utilization of inertial measurement unit (IMU) sensors. With the increasing prevalence of smart wearable devices, IMU sensors are progressively gaining traction within everyday life.
Gait segmentation using daily gait data collected by IMUs poses several difficulties and challenges. Firstly, the daily gait data often contain motion information that is not solely related to gait. Secondly, individuals employ various walking patterns throughout daily life. Additionally, each person exhibits unique movement characteristics. The complexity and variability of human gait across individuals and activities further increase the difficulty of accurately segmenting the gait cycle.
An IMU is commonly employed for gait data collection. The segmentation of gait data is conducted by considering specific characteristics and regular patterns observed in human gait data obtained by the IMU [3,4,5]. Several studies have implemented a methodology for segmenting continuous gait data from a dataset according to the features of a set of stride patterns [6]. This technological approach has been utilized in evaluating individuals with Parkinson’s disease, enabling them to conduct tests in a home environment without the need for direct medical supervision [7].
Existing solutions for gait segmentation mostly focus on processing data that include only a single gait pattern, mainly walking. Additionally, these methods rely on data preprocessing techniques, and most of them are based on laboratory data rather than real-world daily life data.
The processing of daily gait data presents additional challenges because the data contain multiple gait types as well as noise. To overcome these challenges, a method for processing individuals’ daily gait data is proposed in this paper. The proposed method effectively addresses the noise within the data and achieves segmentation of different gait patterns, all without the need for pre-set template gaits or intensive model training. The output is a set of segmented gaits, categorized according to their respective gait types. It is worth noting that the proposed method finds behavioral patterns that appear frequently throughout the time series. In this study, data that contain multiple gait types simultaneously, including walking, running, and going up and down stairs, are processed. Results are also presented for the separate processing of data containing each of the four gaits mentioned above. Moreover, the influence of varying sensor numbers and sensor placements on the segmentation outcome is analyzed.

2. Methods

Figure 1 below illustrates the main processing steps of the proposed method. The first step is data collection and dimensionality reduction using the PCA method [8]. Then, a sliding window algorithm is used to find the optimal sample gait window, which is used to match segments of the same gait type in the dataset.

2.1. Data Collection

Four young people (two males and two females; average age of 25) were recruited to collect data. All participants signed the consent agreement, and the study complied with the ethical review.
Participants were fitted with IMUs (MTw; Xsens Technologies Inc., Shanghai, China) to gather the necessary acceleration and angular velocity data. A total of seventeen sensors were placed at specific locations on the body, as shown in Figure 2a.
Participants were instructed to engage in walking, running, and going up and down stairs, each at a self-determined comfortable pace. These four distinct gaits were selected to encompass a wide range of typical human locomotion patterns in daily life. At each time frame, sampled at a frequency of 60 Hz, measurements of velocity, acceleration, angular velocity, and angular acceleration were recorded. Each data point encompasses measurements for each IMU in the x, y, and z directions of the global coordinate frame: the x-axis points towards local magnetic north, the y-axis completes the right-handed frame (pointing west), and the z-axis points upwards [9].
Figure 2. (a) displays the positions of the 17 IMUs [10]. The two photographs in (b) illustrate the participants wearing the devices.

2.2. Data Processing

First, the moduli of velocity, acceleration, angular velocity, and angular acceleration in the horizontal plane are computed from the obtained data using the following formula: $|V| = \sqrt{V_x^2 + V_y^2}$. These four moduli, along with their respective vertical components, form the input data considered in this study.
This is because vertical speed, acceleration, and angular velocity are important features of human gait [11]. In gait analysis, the specific direction of human locomotion holds no significance: when individuals move with the same gait in different directions, these movements should not be considered as different gaits. Subsequently, the dataset was standardized to ensure that the features have comparable scales [8]. This was performed so that each feature carries equal importance during the subsequent principal component analysis (PCA). The PCA method [8] was then used to reduce the dataset containing all IMU data to one dimension.
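As an illustration, this preprocessing could be implemented as in the following Python sketch using NumPy and scikit-learn [8]; the function names and the assumed layout of the feature matrix are illustrative choices rather than the authors' actual code.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

def horizontal_modulus(vx, vy):
    """Modulus in the horizontal plane: |V| = sqrt(Vx^2 + Vy^2)."""
    return np.sqrt(vx ** 2 + vy ** 2)

def reduce_to_one_dimension(features):
    """Standardize the feature matrix and project it onto one principal component.

    features: array of shape (n_frames, n_features), where each column is a
    horizontal-plane modulus or a vertical component from one IMU.
    """
    scaled = StandardScaler().fit_transform(features)        # give features comparable scales
    return PCA(n_components=1).fit_transform(scaled).ravel() # one-dimensional signal
```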
The program removes data in a static state based on a given threshold value. If a continuous run of static data points reaches a certain length (set to 15 in this experiment), that segment of data is excluded.
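The static-data removal could be sketched as follows; the amplitude bound used to decide whether a frame counts as static (static_threshold) is a hypothetical parameter, since the text only specifies the run-length threshold of 15 frames.

```python
import numpy as np

def remove_static_segments(signal, static_threshold=0.05, min_run=15):
    """Drop runs of at least `min_run` consecutive near-static frames.

    signal: 1-D array (e.g. the post-PCA sequence).
    Returns the retained samples and a boolean mask of the kept positions.
    """
    is_static = np.abs(signal) < static_threshold
    keep = np.ones(len(signal), dtype=bool)
    run_start = None
    for i, flag in enumerate(np.append(is_static, False)):  # sentinel closes the last run
        if flag and run_start is None:
            run_start = i
        elif not flag and run_start is not None:
            if i - run_start >= min_run:
                keep[run_start:i] = False                   # exclude the static segment
            run_start = None
    return signal[keep], keep
```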

2.3. Initialize Sample Gait Window and Use It to Match the Following Data

The initial sample gait window is selected from the beginning of the currently valid data based on a given parameter (set to 18 time frames in this experiment). The sample gait window gradually slides backwards and is compared with the valid data using the cosine similarity algorithm. When the calculated similarity exceeds the given threshold (set to 0.8 in this experiment), the corresponding data are marked as matching and the similarity value is recorded; these values are used to score the current sample gait window. Previously compared data are not re-evaluated, to avoid duplication.
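A minimal sketch of this matching step is given below; the window length (18 frames) and similarity threshold (0.8) are the values stated above, while the skip-on-match policy encodes the rule that previously compared data are not re-evaluated.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length 1-D windows."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom > 0 else 0.0

def match_sample_window(signal, sample, threshold=0.8):
    """Slide `sample` over `signal` and record non-overlapping matches.

    Returns (start, end, similarity) triples for every span whose cosine
    similarity to the sample gait window exceeds `threshold`.
    """
    matches, step, i = [], len(sample), 0
    while i + step <= len(signal):
        sim = cosine_similarity(signal[i:i + step], sample)
        if sim >= threshold:
            matches.append((i, i + step, sim))
            i += step                 # matched data are not re-evaluated
        else:
            i += 1
    return matches
```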

2.4. Find the Optimal Sample Gait Window

The proposed method employs the sliding window algorithm to identify the optimal sample gait window. The continuous occurrence of matched segments with high similarity values indicates that the sample gait window has a higher degree of accuracy or quality, resulting in a higher score. The right pointer traverses a given range, and a score is computed for each position according to the aforementioned method; the right pointer is then fixed at the position that receives the highest score. The same process is applied to the left pointer. Once the positions of the left and right pointers are determined, the current optimal sample gait window is obtained. A threshold on the range is set to prevent windows with excessively large sizes; this threshold is determined as twice the average step length, based on experiments. Furthermore, if the initial window contains content that is not related to any gait, such as noise, it will be unable to match similar data, resulting in a significantly low score. The program filters out sample windows with low scores.
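A sketch of this window-optimization step is shown below, reusing match_sample_window from the previous sketch. The scoring function (sum of similarities over matched spans) and the minimum window length of two frames are assumptions; the paper only states that frequent, high-similarity matches yield a higher score and that the window size is capped at roughly twice the average step length.

```python
def window_score(signal, sample, threshold=0.8):
    """Hypothetical score: the sum of similarities over all matched spans."""
    return sum(sim for _, _, sim in match_sample_window(signal, sample, threshold))

def find_optimal_window(signal, start, init_len=18, max_len=36):
    """Move the right pointer, then the left pointer, keeping the positions
    that give the resulting sample gait window the highest score."""
    left = start
    right = min(start + init_len, len(signal))

    # Right pointer: try every end position within the allowed range.
    best_right, best = right, -1.0
    for r in range(left + 2, min(start + max_len, len(signal)) + 1):
        s = window_score(signal, signal[left:r])
        if s > best:
            best, best_right = s, r
    right = best_right

    # Left pointer: try every start position that keeps the window non-trivial.
    best_left, best = left, -1.0
    for l in range(max(0, right - max_len), right - 1):
        s = window_score(signal, signal[l:right])
        if s > best:
            best, best_left = s, l
    return best_left, right, best
```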

2.5. One Group of Gaits Is Found

The data that match the optimal sample gait window are considered one group of gaits. They are labeled as invalid for subsequent iterations in order to prevent redundant marking.
The previous two steps are iterated to search for additional gaits. The iteration ends when all data have been traversed, and the output is obtained in the form of gait groups.
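Putting the pieces together, the outer loop could look like the following sketch, which reuses the helpers from the two previous sketches; the min_score noise filter is a hypothetical stand-in for the rule that low-scoring sample windows are filtered out.

```python
import numpy as np

def segment_and_group(signal, init_len=18, threshold=0.8, min_score=2.0):
    """Repeatedly pick a sample window at the first still-valid frame, refine it,
    collect its matches as one gait group, and mark the matched data as invalid."""
    valid = np.ones(len(signal), dtype=bool)
    groups = []
    while valid.any():
        start = int(np.argmax(valid))                       # first valid frame
        if start + init_len > len(signal):
            break
        left, right, score = find_optimal_window(signal, start, init_len)
        if score < min_score:                               # likely noise: discard this window
            valid[start:start + init_len] = False
            continue
        matches = match_sample_window(signal, signal[left:right], threshold)
        matches = [(a, b, s) for a, b, s in matches if valid[a:b].all()]
        for a, b, _ in matches:
            valid[a:b] = False                              # prevent redundant marking
        valid[start:right] = False                          # consume the template region
        groups.append({"template": (left, right), "segments": matches})
    return groups
```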

3. Results and Discussion

The ground truth was annotated based on the 3D model movements generated by analyzing the recorded human movement data with the supporting software of the IMU system used in this study. The visualization of the ground truth is shown in Figure 3.
A visualization of the results obtained by the proposed method is shown in Figure 4. Each detected gait segment is marked with its starting and ending points, and the corresponding gait type is labeled in Figure 4 according to the ground truth.
The subsequent figures show the validation results obtained from different combinations of sensors across datasets containing either multiple distinct gaits or a single gait. The validation results include accuracy and the Rand index. The accuracy of gait segmentation was computed using the following equation:
$$\mathrm{Accuracy} = \frac{\text{Accurate segmentation number}}{\text{Ground truth segmentation number}}$$
Resulting segmentations that agreed with the ground truth gait segmentation were marked as accurate. The Rand index is a method used to compare the similarity of two clusterings [8]; a value close to 1 denotes a higher level of similarity between the compared clusterings. In this study, the Rand index was employed to assess the resemblance between the grouping obtained by the algorithm and the ground truth.
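As an illustration of how these two measures could be computed, assuming a scikit-learn [8] version that provides rand_score (0.24 or later) and using hypothetical example values:

```python
from sklearn.metrics import rand_score

def segmentation_accuracy(accurate_segments, ground_truth_segments):
    """Accuracy = accurate segmentation number / ground-truth segmentation number."""
    return accurate_segments / ground_truth_segments

# Hypothetical example: 9 of 10 ground-truth segments were detected accurately,
# and per-segment group labels are compared against the ground-truth grouping.
predicted_groups    = [0, 0, 0, 1, 1, 2, 2, 2]
ground_truth_groups = [0, 0, 0, 1, 1, 1, 2, 2]
print(segmentation_accuracy(9, 10))                       # -> 0.9
print(rand_score(ground_truth_groups, predicted_groups))  # Rand index of the grouping
```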
The outcomes obtained from the dataset containing four distinct gaits are presented in Figure 5. Across various sensor combinations, the accuracy values range from 0.82 to 0.90, while the Rand index is between 0.76 and 0.83. It is notable that among all sensor combinations, the combination of three IMUs on the feet and head leads to the best results, with an accuracy and Rand index of 0.90 and 0.83, respectively.
The “corresponding equipment” in Figure 5 represents the IMU placed at a specific position during the experiment, which may correspond to devices used in daily life.
The validation results for data segmentation on the dataset containing a single gait are illustrated in Figure 6. Notably, across the different sensor combinations, these single-gait outcomes improve on those of the mixed-gait dataset.
Table 1 below compares the proposed method with existing alternatives. It can be observed that the proposed method performs comparably to the existing methods in terms of accuracy when using the dataset containing only walking gaits.

4. Conclusions

This study introduces a practical methodology capable of segmenting and grouping daily gait data obtained by IMUs. Data containing the acceleration, angular velocity, and other signals of various parts of the human body were collected, and the method proposed in this paper was validated on them. The results indicated that the method performed according to expectations and demonstrated the ability to perform both gait segmentation and grouping. The findings showed that a better performance was achieved when the sensors were placed on both feet and the head. It was also observed that datasets containing a single gait tended to yield better performance than those containing mixed gaits.
In conclusion, this study proposes a method that establishes a foundation for analyzing daily gait patterns. It has potential applications in anomaly detection, the monitoring of sports-related activities, and other areas, including the development of a more accurate pedometer for different gaits, saving time on manual data annotation for large datasets, and aiding the rehabilitation of Parkinson’s patients through gait segmentation and recording.

Author Contributions

Conceptualization, Z.W.; formal analysis, Z.W.; methodology, Z.W.; software, Z.W.; writing—original draft preparation, Z.W.; writing—review and editing, C.X.; supervision, Y.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded in part by JSPS KAKENHI (grant number JP21H03485) and JST SPRING (grant number JPMJSP2123).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board of the Faculty of Science and Technology, Keio University (protocol code: 2023-108, date of approval: 10 October 2023).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Anwary, A.R.; Yu, H.; Vassallo, M. Gait quantification and visualization for digital healthcare. Health Policy Technol. 2020, 9, 204–212. [Google Scholar] [CrossRef]
  2. Maki, B.E. Gait Changes in Older Adults: Predictors of Falls or Indicators of Fear? J. Am. Geriatr. Soc. 1997, 45, 313–320. [Google Scholar] [CrossRef] [PubMed]
  3. Jain, R.; Semwal, V.B.; Kaushik, P. Stride segmentation of inertial sensor data using statistical methods for different walking activities. Robotica 2022, 40, 2567–2580. [Google Scholar] [CrossRef]
  4. Gujarathi, T.; Bhole, K. Gait analysis using imu sensor. In Proceedings of the 2019 10th International Conference on Computing, Communication and Networking Technologies (ICCCNT), Kanpur, India, 6–8 July 2019; pp. 1–5. [Google Scholar]
  5. Ullrich, M.; Küderle, A.; Hannink, J.; Del Din, S.; Gassner, H.; Marxreiter, F.; Klucken, J.; Eskofier, B.M.; Kluge, F. Detection of gait from continuous inertial sensor data using harmonic frequencies. IEEE J. Biomed. Health Inf. 2020, 24, 1869–1878. [Google Scholar] [CrossRef] [PubMed]
  6. Han, J.; Jeon, H.S.; Jeon, B.S.; Park, K.S. Gait detection from three dimensional acceleration signals of ankles for the patients with Parkinson’s disease. In Proceedings of the IEEE The International Special Topic Conference on Information Technology in Biomedicine, Ioannina, Greece, 26–28 October 2006; Volume 2628. [Google Scholar]
  7. Ullrich, M.; Mücke, A.; Küderle, A.; Roth, N.; Gladow, T.; Gaßner, H.; Marxreiter, F.; Klucken, J.; Eskofier, B.M.; Kluge, F. Detection of unsupervised standardized gait tests from real-world inertial sensor data in Parkinson’s disease. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 2103–2111. [Google Scholar] [CrossRef]
  8. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
  9. Roetenberg, D.; Luinge, H.; Slycke, P. Xsens MVN: Full 6DOF human motion tracking using miniature inertial sensors. Xsens Motion Technol. BV Tech. Rep. 2009, 1, 1–7. [Google Scholar]
  10. Xia, C.; Maruyama, T.; Toda, H.; Tada, M.; Fujita, K.; Sugiura, Y. Knee Osteoarthritis Classification System Examination on Wearable Daily-Use IMU Layout. In Proceedings of the 2022 ACM International Symposium on Wearable Computers, Cambridge, UK, 11–15 September 2022; pp. 74–78. [Google Scholar]
  11. Van Nguyen, L.; La, H.M. Real-Time Human Foot Motion Localization Algorithm with Dynamic Speed. IEEE Trans. Hum.-Mach. Syst. 2016, 46, 822–833. [Google Scholar] [CrossRef]
  12. Barth, J.; Oberndorfer, C.; Kugler, P.; Schuldhaus, D.; Winkler, J.; Klucken, J.; Eskofier, B. Subsequence dynamic time warping as a method for robust step segmentation using gyroscope signals of daily life activities. In Proceedings of the 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 6744–6747. [Google Scholar]
  13. Anwary, A.R.; Yu, H.; Vassallo, M. Optimal foot location for placing wearable IMU sensors and automatic feature extraction for gait analysis. IEEE Sens. J. 2018, 18, 2555–2567. [Google Scholar] [CrossRef]
  14. Jagos, H.; Reich, S.; Rattay, F.; Mehnen, L.; Pils, K.; Wassermann, C.; Chhatwal, C.; Reichel, M. Determination of gait parameters from the wearable motion analysis system eSHOE. Biomed. Eng. 2013, 58, 000010151520134241. [Google Scholar] [CrossRef]
Figure 1. In this experiment, the data include walking, running, and stair climbing. Please note that the graph displayed represents a small portion of the experimental data, and thus, the plotted line does not include all types of gaits mentioned above.
Figure 3. Ground truth. The time series gait data post-PCA are denoted by the blue line on the graph. The start of a gait is indicated by a solid line, while the termination of a gait is indicated by a dashed line. Different colored lines represent different groups of gaits.
Figure 4. Results of the proposed method. The legend is the same as in Figure 3.
Figure 5. Testing results for the dataset containing all types of gait, along with the accuracy and Rand index.
Figure 6. Testing results for the dataset containing only the walking gait, along with the accuracy and Rand index.
Table 1. Comparative analysis of our proposed methodology with existing approaches.
Reference | Methodology | Result
Jens B. et al. [12] | Subsequence dynamic time warping | 97.7% accuracy in walking gait segmentation
A. R. Anwary et al. [13] | Peak detection | 95.47% accuracy in walking gait segmentation
Jagos H. et al. [14] | Autocorrelation | 96% accuracy in walking gait segmentation
Proposed method | Matching gait by cosine similarity in a sliding window | 97% accuracy in walking gait segmentation; 90% accuracy in mixed gait segmentation