1. Introduction
Gait detection technology plays a vital role in the field of medical health [1]. Gait serves as a valuable indicator of an individual’s physical well-being, encompassing substantial information regarding their overall health [2]. Monitoring people’s gait in daily life facilitates the timely identification of alterations in their underlying health status. Human gait data can be collected using inertial measurement unit (IMU) sensors, and with the increasing prevalence of smart wearable devices, IMU sensors are progressively gaining traction in everyday life.
Gait segmentation using daily gait data collected by IMUs poses several difficulties and challenges. Firstly, the daily gait data often contain motion information that is not solely related to gait. Secondly, individuals employ various walking patterns throughout daily life. Additionally, each person exhibits unique movement characteristics. The complexity and variability of human gait across individuals and activities further increase the difficulty of accurately segmenting the gait cycle.
An IMU is commonly employed for gait data collection. The segmentation of gait data is conducted by considering specific characteristics and regular patterns observed in human gait data obtained by the IMU [3,4,5]. Several studies have implemented a methodology for segmenting continuous gait data from a dataset according to the features of a set of stride patterns [6]. This technological approach has been utilized in evaluating individuals with Parkinson’s disease, enabling them to conduct tests in a home environment without the need for direct medical supervision [7].
Existing solutions for gait segmentation mostly focus on data that include only a single gait pattern, mainly walking. Additionally, these methods rely on data preprocessing techniques, and most are based on laboratory data rather than real-world daily-life data.
Processing daily gait data is challenging because the data contain multiple gait types as well as noise. To overcome these challenges, this paper proposes a method for processing individuals’ daily gait data. The method handles noise within the data and segments different gait patterns without requiring pre-set gait templates or intensive model training. The output is a set of segmented gaits, categorized according to their respective gait types. Notably, the method finds behavioral patterns that appear frequently throughout the time series. The proposed method processes data that contain multiple gait types simultaneously, including walking, running, and going up and down stairs. Results are also presented for separate processing of data containing each of the four gaits individually. Moreover, the influence of varying sensor numbers and sensor placements on the segmentation outcome is analyzed.
2. Methods
Figure 1 below illustrates the main processing steps of the proposed method. The first step is data collection, followed by dimensionality reduction using the PCA method [8]. Then, a sliding window algorithm is used to find the optimal sample gait window, which is used to match segments of the same gait type in the dataset.
2.1. Data Collection
Four young adults (two males and two females; average age of 25) were recruited for data collection. All participants signed the consent agreement, and the study complied with the ethical review.
Participants in the study were fitted with IMUs (MTw; Xsens Technologies Inc., Shanghai, China) to gather the necessary acceleration and angular velocity data. A total of seventeen sensors were placed at specific locations on the body, as shown in Figure 2a.
Participants were instructed to walk, run, and go up and down stairs, each at a self-determined comfortable pace. These four distinct gaits were selected to encompass a wide range of typical human locomotion patterns in daily life. Data were recorded at a frequency of 60 Hz; each time frame included measurements of velocity, acceleration, angular velocity, and angular acceleration for each IMU in the x, y, and z directions of the global coordinate frame. Specifically, the x-axis points towards local magnetic north, the y-axis completes the right-handed frame (pointing west), and the z-axis points upwards [9].
Figure 2. (a) displays the positions of the 17 IMUs [10]. The two photographs in (b) illustrate the participants wearing the devices.
2.2. Data Processing
First, the moduli of velocity, acceleration, angular velocity, and angular acceleration in the horizontal plane are computed from the obtained data as m = √(x² + y²), where x and y are the two horizontal components. These four moduli, along with their respective vertical components, constitute the input data for this study.
This is because vertical speed, acceleration, and angular velocity are important features of human gait [11]. In gait analysis, the specific direction of locomotion holds no significance: when individuals move with the same gait in different directions, these movements should not be considered different gaits. Subsequently, the dataset was standardized to ensure that the features have comparable scales [8]. This was performed so that each feature carries equal importance during the subsequent step of principal component analysis (PCA). Then, the PCA method [8] was used to reduce the dataset containing all IMU data to one dimension.
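The preprocessing pipeline described above (horizontal-plane moduli, standardization, one-dimensional PCA) can be sketched with NumPy and scikit-learn as follows; the input array layout is an assumption made for illustration, not the paper's exact data format.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def preprocess(xyz):
    """Reduce multi-channel IMU data to a one-dimensional series.

    xyz: array of shape (n_frames, n_channels, 3); the last axis holds the
    x (north), y (west), and z (up) components of each channel (velocity,
    acceleration, angular velocity, angular acceleration per IMU).
    This layout is an illustrative assumption.
    """
    # Horizontal-plane modulus sqrt(x^2 + y^2): removes heading direction,
    # so the same gait walked in different directions looks alike.
    horizontal = np.sqrt(xyz[..., 0] ** 2 + xyz[..., 1] ** 2)
    vertical = xyz[..., 2]
    features = np.concatenate([horizontal, vertical], axis=1)

    # Standardize so each feature carries equal weight in PCA.
    features = StandardScaler().fit_transform(features)

    # Keep only the first principal component -> one-dimensional series.
    return PCA(n_components=1).fit_transform(features).ravel()
```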
The program then removes data in a static state. If a continuous sequence of static data points reaches a given threshold (set to 15 in this experiment), that segment of data is excluded.
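A possible implementation of this static-data filter is sketched below. The per-frame static criterion (frame-to-frame change below a small epsilon) is an assumption; the 15-frame run threshold follows the experiment.

```python
import numpy as np

def remove_static(series, static_eps=0.05, min_run=15):
    """Drop runs of near-static frames of length >= min_run.

    A frame counts as static when the absolute change from the previous
    frame is below static_eps (an assumed criterion); runs of at least
    min_run static frames are excluded, matching the threshold of 15.
    """
    diffs = np.abs(np.diff(series, prepend=series[0]))
    static = diffs < static_eps
    keep = np.ones(len(series), dtype=bool)
    run_start = None
    for i, is_static in enumerate(static):
        if is_static and run_start is None:
            run_start = i                      # a static run begins
        elif not is_static and run_start is not None:
            if i - run_start >= min_run:
                keep[run_start:i] = False      # run long enough: drop it
            run_start = None
    if run_start is not None and len(series) - run_start >= min_run:
        keep[run_start:] = False               # trailing static run
    return series[keep]
```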
2.3. Initialize Sample Gait Window and Use It to Match the Following Data
The initial sample gait window is selected from the beginning of the current valid data based on a given length parameter (set to 18 time frames in this experiment). The sample gait window gradually slides backwards through the valid data, and each candidate segment is compared with it using cosine similarity. When the calculated similarity exceeds the given threshold (set to 0.8 in this experiment), the corresponding data are marked as matching and the similarity value is recorded; these values are used to score the current sample gait window. Previously compared data are not re-evaluated, to avoid duplication.
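A minimal sketch of this matching step, operating on the one-dimensional series produced by PCA, might look as follows; the function names are illustrative, the 0.8 threshold is taken from the experiment, and the jump past matched segments reflects the rule that previously compared data are not re-evaluated.

```python
import numpy as np

def cosine_similarity(a, b):
    # Small epsilon guards against all-zero segments.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def match_window(series, window, threshold=0.8):
    """Slide the sample gait window over the series and record a
    (start index, similarity) pair for every match. After a match the
    scan jumps past the matched segment, so already-compared data are
    not re-evaluated."""
    w = len(window)
    matches, i = [], 0
    while i + w <= len(series):
        sim = cosine_similarity(series[i:i + w], window)
        if sim > threshold:
            matches.append((i, sim))
            i += w  # skip the matched segment
        else:
            i += 1
    return matches
```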
2.4. Find the Optimal Sample Gait Window
The proposed method employs the sliding window algorithm to identify the optimal sample gait window. The continuous occurrence of matched segments with high similarity values indicates that the sample gait window is of higher quality, resulting in a higher score. The right pointer traverses a given range, and a score is computed as described above; the right pointer is then fixed at the position that receives the highest score. The same process is applied to the left pointer. Once the positions of the left and right pointers are determined, the current optimal sample gait window is obtained. A threshold on the range prevents windows of excessive size; it is set to twice the average step length, based on experiments. Furthermore, if the initial window contains content unrelated to any gait, such as noise, it will fail to match similar data and receive a significantly low score; such low-scoring sample windows are filtered out by the program.
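The pointer search above can be sketched as follows. This is a simplified, self-contained illustration: the scoring formula (sum of similarities of matched, non-overlapping segments), the search ranges, and the `max_len` cap are assumptions, not the paper's exact implementation.

```python
import numpy as np

def _cos(a, b):
    # Cosine similarity; epsilon avoids division by zero.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def score_window(series, start, end, threshold=0.8):
    """Score a candidate window: sum the similarities of the
    non-overlapping segments it matches across the series."""
    window = series[start:end]
    w, i, score = end - start, 0, 0.0
    while i + w <= len(series):
        sim = _cos(series[i:i + w], window)
        if sim > threshold:
            score += sim
            i += w  # matched data are not re-evaluated
        else:
            i += 1
    return score

def find_optimal_window(series, start, max_len=36):
    """Fix the left edge and pick the right edge with the highest score,
    then refine the left edge the same way; max_len caps the window at
    roughly twice the average step length (an assumed value here)."""
    best_right = max(range(start + 4, start + max_len + 1),
                     key=lambda r: score_window(series, start, r))
    best_left = max(range(start, best_right - 3),
                    key=lambda l: score_window(series, l, best_right))
    return best_left, best_right
```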
2.5. Identifying One Group of Gaits
The data that match the optimal sample gait window are considered one group of gaits. They are labeled as invalid during subsequent iterations to prevent redundant marking.
The previous two steps are iterated to search for additional gaits; the iteration ends when all data have been traversed. The output is then obtained in the form of gait groups.
3. Results and Discussion
The ground truth was annotated from the 3D model movements generated by the IMU’s supporting software from the recorded human movement data. The visualization of the ground truth is shown in Figure 3.
A visualization of the results obtained by the proposed method is shown in Figure 4. Each detected gait segment is marked with its starting and ending points, and the corresponding gait type is labeled according to the ground truth.
The subsequent figures show the validation results obtained from different combinations of sensors across various types of datasets, containing either multiple distinct gaits or a single gait exclusively. The validation results include accuracy and the Rand index. The accuracy of gait segmentation was computed as: Accuracy = (number of correctly segmented gaits) / (total number of gaits in the ground truth).
The resulting segmentations were compared with the ground-truth gait segmentation and marked as accurate when they matched. The Rand index is a method for comparing the similarity of two clusterings [8]; a value close to 1 denotes a higher level of similarity between the compared clusterings. In this study, the Rand index was employed to assess the resemblance between the groupings produced by the algorithm and the ground truth.
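As a concrete illustration of the Rand index, the snippet below compares two toy frame-level gait-type labelings with scikit-learn's `rand_score`; the labels are invented for illustration and do not come from the study's data.

```python
from sklearn.metrics import rand_score

# Toy frame-level gait-type labels: ground truth vs. algorithm output.
truth     = [0, 0, 0, 1, 1, 1, 2, 2, 2]
predicted = [0, 0, 1, 1, 1, 1, 2, 2, 2]

# Fraction of frame pairs on which the two groupings agree.
print(round(rand_score(truth, predicted), 3))  # → 0.861
```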
The outcomes obtained from the dataset containing four distinct gaits are presented as follows. Across various sensor combinations, the accuracy values range from 0.82 to 0.90, while the Rand index is between 0.76 and 0.83. Notably, among all sensor combinations, the combination of three IMUs on the feet and head yields the best results, with an accuracy of 0.90 and a Rand index of 0.83.
The “corresponding equipment” in Figure 5 represents the IMU placed at a specific position during the experiment, which may correspond to devices used in daily life.
The validation results for data segmentation on datasets containing a single gait are illustrated in Figure 6. Notably, the utilization of different sensor combinations leads to improved final outcomes.
Table 1 below compares the proposed method with existing alternatives. The proposed method performs comparably to existing methods in terms of accuracy on the dataset containing only walking gaits.
4. Conclusions
This study introduces a practical methodology capable of segmenting and grouping daily gait data obtained by IMUs. Data containing the acceleration, angular velocity, etc., of various parts of the human body were collected. Then, the method proposed in this paper was validated. The results indicated that the method performed according to expectations and demonstrated the ability to perform both gait segmentation and grouping. The findings showed that a better performance was achieved when the sensors were placed on both feet and the head. It was observed that separate gaits tended to offer better performance compared to mixed gaits in the same dataset.
In conclusion, this study proposes a method that establishes a foundation for analyzing daily gait patterns. It has potential applications in the detection of anomalies, monitoring of sports-related activities, and other areas. This includes the development of a more accurate pedometer for different gaits, saving time on manual data annotation for large datasets and aiding in the rehabilitation of Parkinson’s patients through gait segmentation and recording.
Author Contributions
Conceptualization, Z.W.; formal analysis, Z.W.; methodology, Z.W.; software, Z.W.; writing—original draft preparation, Z.W.; writing—review and editing, C.X.; supervision, Y.S. All authors have read and agreed to the published version of the manuscript.
Funding
This work was funded in part by JSPS KAKENHI (grant number JP21H03485) and JST SPRING (grant number JPMJSP2123).
Institutional Review Board Statement
The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of Faculty of Science and Technology, Keio University (protocol code: 2023-108, date of approval: 10 October 2023).
Informed Consent Statement
Informed consent was obtained from all subjects involved in the study.
Data Availability Statement
Data are contained within the article.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Anwary, A.R.; Yu, H.; Vassallo, M. Gait quantification and visualization for digital healthcare. Health Policy Technol. 2020, 9, 204–212.
- Maki, B.E. Gait Changes in Older Adults: Predictors of Falls or Indicators of Fear? J. Am. Geriatr. Soc. 1997, 45, 313–320.
- Jain, R.; Semwal, V.B.; Kaushik, P. Stride segmentation of inertial sensor data using statistical methods for different walking activities. Robotica 2022, 40, 2567–2580.
- Gujarathi, T.; Bhole, K. Gait analysis using IMU sensor. In Proceedings of the 2019 10th International Conference on Computing, Communication and Networking Technologies (ICCCNT), Kanpur, India, 6–8 July 2019; pp. 1–5.
- Ullrich, M.; Küderle, A.; Hannink, J.; Del Din, S.; Gassner, H.; Marxreiter, F.; Klucken, J.; Eskofier, B.M.; Kluge, F. Detection of gait from continuous inertial sensor data using harmonic frequencies. IEEE J. Biomed. Health Inf. 2020, 24, 1869–1878.
- Han, J.; Jeon, H.S.; Jeon, B.S.; Park, K.S. Gait detection from three dimensional acceleration signals of ankles for the patients with Parkinson’s disease. In Proceedings of the IEEE International Special Topic Conference on Information Technology in Biomedicine, Ioannina, Greece, 26–28 October 2006; Volume 2628.
- Ullrich, M.; Mücke, A.; Küderle, A.; Roth, N.; Gladow, T.; Gaßner, H.; Marxreiter, F.; Klucken, J.; Eskofier, B.M.; Kluge, F. Detection of unsupervised standardized gait tests from real-world inertial sensor data in Parkinson’s disease. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 2103–2111.
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830.
- Roetenberg, D.; Luinge, H.; Slycke, P. Xsens MVN: Full 6DOF human motion tracking using miniature inertial sensors. Xsens Motion Technol. BV Tech. Rep. 2009, 1, 1–7.
- Xia, C.; Maruyama, T.; Toda, H.; Tada, M.; Fujita, K.; Sugiura, Y. Knee Osteoarthritis Classification System Examination on Wearable Daily-Use IMU Layout. In Proceedings of the 2022 ACM International Symposium on Wearable Computers, Cambridge, UK, 11–15 September 2022; pp. 74–78.
- Van Nguyen, L.; La, H.M. Real-Time Human Foot Motion Localization Algorithm with Dynamic Speed. IEEE Trans. Hum.-Mach. Syst. 2016, 46, 822–833.
- Barth, J.; Oberndorfer, C.; Kugler, P.; Schuldhaus, D.; Winkler, J.; Klucken, J.; Eskofier, B. Subsequence dynamic time warping as a method for robust step segmentation using gyroscope signals of daily life activities. In Proceedings of the 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 6744–6747.
- Anwary, A.R.; Yu, H.; Vassallo, M. Optimal foot location for placing wearable IMU sensors and automatic feature extraction for gait analysis. IEEE Sens. J. 2018, 18, 2555–2567.
- Jagos, H.; Reich, S.; Rattay, F.; Mehnen, L.; Pils, K.; Wassermann, C.; Chhatwal, C.; Reichel, M. Determination of gait parameters from the wearable motion analysis system eSHOE. Biomed. Eng. 2013, 58, 000010151520134241.