Article

Terrain Feature Estimation Method for a Lower Limb Exoskeleton Using Kinematic Analysis and Center of Pressure

1 Motion Control Laboratory, Department of Mechanical Engineering, Yonsei University, Seoul 03722, Korea
2 Construction Robot and Automation Laboratory, Department of Civil & Environmental Engineering, Yonsei University, Seoul 03722, Korea
* Author to whom correspondence should be addressed.
Sensors 2019, 19(20), 4418; https://doi.org/10.3390/s19204418
Submission received: 19 July 2019 / Revised: 2 October 2019 / Accepted: 10 October 2019 / Published: 12 October 2019
(This article belongs to the Special Issue Sensors and Wearable Assistive Devices)

Abstract:
While controlling a lower limb exoskeleton providing walking assistance to wearers, the walking terrain is an important factor that should be considered for meeting performance and safety requirements. Therefore, we developed a method to estimate the slope and elevation using the contact points between the limb exoskeleton and ground. We used the center of pressure as a contact point on the ground and calculated the location of the contact points on the walking terrain based on kinematic analysis of the exoskeleton. Then, a set of contact points collected from each step during walking was modeled as the plane that represents the surface of the walking terrain through the least-square method. Finally, by comparing the normal vectors of the modeled planes for each step, features of the walking terrain were estimated. We analyzed the estimation accuracy of the proposed method through experiments on level ground, stairs, and a ramp. Classification using the estimated features showed recognition accuracy higher than 95% for all experimental motions. The proposed method approximately analyzed the movement of the exoskeleton on various terrains even though no prior information on the walking terrain was provided. The method can enable exoskeleton systems to actively assist walking in various environments.

1. Introduction

Wearable robots or exoskeletons, which are attached to a wearer’s body, are systems that extend, complement, substitute, or enhance the functioning and capability of the wearer by using actuators that can provide mechanical power [1]. The exoskeleton, which combines human intelligence capable of coping with various situations or circumstances and the ability of the robot to handle time-consuming simple tasks or high loads, has been developed for use mainly in military, industrial, and medical applications [2,3,4]. Among them, exoskeletons in the medical and rehabilitation fields are expected to perform as devices for enhancing the physical functions of patients weakened because of damage or aging of the nervous and musculoskeletal system [5,6]. In particular, as the lower limb exoskeleton for gait rehabilitation can restore walking ability, which is an important activity in human life, it can be a solution that will improve the quality of life of patients by enabling patients or elderly people to continue their personal and social activities.
The main function of the lower limb exoskeleton for gait rehabilitation or assistance is to recover or improve the wearer’s ability to walk, by replacing or supplementing the wearer’s leg functions. The exoskeleton is generally classified into treadmill-based and orthosis-based systems [7]. The major difference between the two systems is that the former repeats only certain actions in a limited space, while the latter can move freely without space constraints depending on the wearer’s intentions. Owing to the mobility offered by the latter system, it has attracted considerable attention for applications in activities of daily living (ADLs), including level walking. However, the development of control algorithms for effective assistance and wearer safety in a variety of unspecified conditions and terrains remains a challenging task.
The generalized control framework proposed by Tucker et al. [8] demonstrates the considerations in controlling lower limb prosthesis and orthosis (P/O) devices. The framework comprises the following elements: the user, the P/O device, the controller, and the environment, and the authors described the role and meaning of each element in a comprehensive and concise manner. In particular, as the environment is a significant factor influencing the stability and balance control of P/O devices, it is important to consider the environment in which lower limb prostheses or exoskeletons operate outside the laboratory in actual applications. Du et al. [9] demonstrated that prior knowledge about the operating environment is effective for improving the accuracy in classifying the locomotion mode of a powered knee. Reflecting these requirements, Cybathlon 2020 [10], a unique world championship for people with physical disabilities, challenges exoskeleton systems with various terrains, such as rough terrain, stairs, tilted paths, and ramps. Therefore, to provide efficient and stable assistance to wearers of exoskeletons in various environments in daily life, it is necessary to consider various types of operating environments.
Terrain, which is an element of the environment and corresponds to the geometry of the ground, is a key factor influencing the safety of a wearer receiving walking assistance from the lower limb exoskeleton. To control the exoskeleton properly on various terrains, information about the terrain on which the wearer stands must be identified. There are two methods for identifying terrains: explicit methods, which measure the geometry of terrains directly using additional sensors, and implicit methods, which estimate the terrain geometry using sensors embedded in the exoskeleton [8]. The former uses a laser distance meter [11], a camera [12], or an infrared range sensor [13,14] to measure the terrain features in front of the wearer. The latter uses an inertial measurement unit (IMU) [15,16,17] or an electromyography (EMG) sensor [18,19] to classify the type of terrain on which the wearer will walk in the current or next step. The explicit method may be more suitable for controlling the lower limb exoskeleton than the implicit method in that it can estimate the terrain the wearer will walk on in the next step; however, the additional sensors increase the computational burden on the controller and the cost of the system. On the other hand, the implicit method can estimate information about the current terrain only; this limitation can be compensated by exploiting the repetitive and cyclic characteristics of walking [20] and by algorithms that infer the wearer's walking intention. Moreover, because this method uses the sensors already built into the majority of exoskeleton systems for control rather than additional sensors, it can be applied to most exoskeletons.
Therefore, in this study, we developed a terrain feature estimation method that can be applied for controlling the lower limb exoskeleton for effective walking assistance and safety of the wearer on various walking terrains. The proposed method utilizes the center of pressure (CoP), which is measured by the foot pressure sensor, as the contact point between the exoskeleton and the ground and calculates the position of the contact point in space through kinematic analysis. Because all these contact points created during walking are points on the ground, the geometry of the ground can be determined through the trajectories of these points. The contact point set calculated for each step is modeled as a plane that reflects the geometry of the ground by using the least-square method. Finally, the proposed method estimates the slope and elevation of the terrain for each step by using the normal vector of the modeled plane. Unlike previous studies that focused on the type of terrain for selecting the locomotion mode, our study focused on understanding the detailed features of terrains, such as the slope and elevation, to provide further information for control algorithms. Furthermore, as our method does not utilize any additional sensors other than the posture, angle, and foot sensors that general exoskeletons already use to control the system and to identify the wearer's intention, it can be easily configured in a variety of exoskeleton systems and will not affect the cost of the system.
In the lower limb exoskeleton used in this study, the hip and knee joints were assisted by an electrical actuator that was configured as a module. Modular actuators can be selectively attached to or detached from joints that require assistance depending on the wearer’s condition or purpose. This allows the exoskeleton to be operated in different modes depending on the situation and can thus be applied to various subjects. We describe the exoskeleton used in developing the terrain feature estimation method in the next section.
The remainder of the paper is organized as follows: Section 2 describes the materials and methods used in this study. The first part of Section 2 describes the mechanical structure, sensor system, and modularized actuation of the developed lower limb exoskeleton. The second part describes the calculation of the spatial position of the CoP through kinematic analysis of the exoskeleton and the terrain feature estimation based on this position. Section 3 describes the experimental results for level ground, stairs, and a ramp using the developed method. Section 4 discusses the results and concludes the paper.

2. Materials and Methods

2.1. Overall Structure of the Exoskeleton

The exoskeleton system illustrated in Figure 1 was designed for normal people and for patients or elderly individuals with muscles partially weakened by nervous system diseases or aging. The system comprises orientation and angular sensors for calculating its spatial posture; force sensors inserted in the fastening parts between the wearer and the robot for detecting the wearer's intention; and electric motors that support the motion of the wearer's hip and knee joints during walking, sit-to-stand transitions, and squatting in daily life or rehabilitation training by decreasing the load on the joints. In this study, we focused on estimating the normal vector of the ground surface on which the wearer stands through kinematic analysis of the exoskeleton; therefore, actuator control was not considered. However, we briefly describe all the components of the exoskeleton, including the actuators, in the following sections.

2.1.1. Mechanical Joints and Components

Each leg of the exoskeleton was designed to accommodate the motion of human joints in the sagittal, transversal, and frontal planes, which are responsible for stretching the legs forward to advance, changing walking direction by rotating legs, and maintaining balance by shifting the weight center, respectively. The leg has three joints, namely the hip, knee, and ankle joints; they were positioned to reduce discomfort caused by misalignments between the exoskeleton and wearer by aligning their axes to pass through the anatomical joint axis of the wearer [21].
The hip joint of the exoskeleton has five degrees of freedom (DOFs). The joint is composed of two revolute joints for hip flexion/extension and adduction/abduction and one revolute joint with two prismatic joints for hip medial/lateral rotation (Figure 2). The axes of the first two joints can be easily aligned to pass through the center of the hip joint, which is usually modeled as a ball and socket joint [22], by adjusting the location of the axes manually. However, as the axis of hip medial/lateral rotation lies inside the wearer's hip joint and thigh segment, a remote-center rotation mechanism comprising linkages [23,24] or curved sliders [25,26] is required to align the axis of the exoskeleton with the axis of the wearer's hip motion. This is because the motion of a rigid body rotating about a remote center consists of rotation and translation; these mechanisms cause the axis of the revolute joint to slide on the transversal plane during hip medial/lateral rotation. For this reason, we added two prismatic joints (P1, P2) to the hip medial/lateral rotation joint to align the axes of the exoskeleton and wearer (Figure 3). P1 (blue rectangle) is fixed to the link attached to the hip abduction/adduction joint. The remaining part of each leg is connected to P2. The revolute joint for the hip medial/lateral rotation is located between P1 and P2. The prime symbol (′) denotes the position of the thigh segment and remaining part of the leg after medial/lateral rotation (θrotation). By allowing movement (dslide) of the revolute joint with P1, the rest of the leg can rotate at a constant distance (Lhip,2) from the center. Inside the prismatic joints, springs were added to maintain a neutral position. Moreover, springs were installed in the hip adduction/abduction joint to compensate for the torque caused by the weight of the exoskeleton leg.
The knee joint of the exoskeleton has a one DOF revolute joint for knee flexion/extension (Figure 4a). Technically, a human knee joint cannot be modeled as a simple hinge joint because it shows polycentric motion, i.e., its instant center of rotation is not fixed [27]. However, as the variation of the axis in walking is relatively small (approximately 10–15 mm [28,29]), it could be compensated by the motion of the other joints. Thus, we designed the knee joint as a revolute joint.
The ankle joint of the exoskeleton is composed of two revolute joints for ankle dorsi/plantar flexion and inversion/eversion (Figure 4b), which occur concurrently during walking [30]. We did not consider the rotation of the ankle joint on the transversal plane because it mainly resulted from the hip medial/lateral rotation. A spring-loaded support that transfers the weight of the exoskeleton to the ground was added under the ankle dorsi/plantar flexion joint to reduce the burden of the wearer.
The foot segment of the exoskeleton secures the shod foot of the wearer by using a buckle and covers only the heel side of the wearer's foot (Figure 5). This configuration permits the forefoot to be bent during the terminal stance phase. It is effective for walking naturally [31,32] because the flexibility of the foot required for smooth contact is maintained. Rubber pads are attached under the sole of the foot segment and the wearer's shoe to absorb shock, prevent slip, and compensate for the height difference between them.
The available ranges of motion (ROMs) of each joint were set to be between the total ROMs of anatomical human joints and the ROMs of normal walking motions (Table 1). For wearer safety, mechanical stoppers were installed to restrict the joints so that they did not exceed their allowable ROMs.
The frames were made of an aluminum alloy (7075 used for high-stress parts, 6061 used for the other parts) for obtaining a light-weight structure. The length of each link is manually adjustable for adapting to different body sizes of wearers by using a slider and a bolt with a spring washer. The adjustable ranges of each link length were set based on the dataset of Korean males aged 20 to 60 years [33]. The physical interface on the back, thigh, and shank segment was composed of a rigid cuff with a soft pad and Velcro straps. It covers a large area of the wearer’s body to prevent pressure concentration.

2.1.2. Sensors and Electronics

The schematic of the entire control system is illustrated in Figure 6. We designed a distributed controller architecture to reduce the calculation load caused by a large amount of sensor data on the master controller (180 MHz, 32F429I-DISC1, STMicroelectronics). The slave controllers (72 MHz, NUCLEO-F303K8, STMicroelectronics) were installed on the thigh and shank segments of each leg, respectively; they collected the data measured by the sensors on each segment. The collected data of each slave controller were transferred to the master controller through the controller area network (CAN) protocol. An attitude and heading reference system (AHRS; 3DM-GX4-25, LORD-MicroStrain®) was installed on the back panel and was directly connected to the master controller. The master controller on the back panel merged the data from the slave controllers and the acquisition time into a unified dataset. An LCD mounted board (180 MHz, 32F469IDISCOVERY, STMicroelectronics) was stacked above the master controller for monitoring the acquired dataset, and a PC was used for recording the dataset. The acquisition rate of the dataset was set to 100 Hz. Two Li–Po batteries (11.1 V, 2600 mAh) with a 5 V regulator were utilized to supply power to the controllers and sensors. An emergency switch was utilized for the safety of the wearer during the test.
To estimate the spatial position and orientation of the exoskeleton, a total of five AHRSs and two absolute encoders were utilized to measure the angle of each joint. In many studies, for controlling an object in space, it is common to use an AHRS or an IMU. However, these require a magnetometer and a carefully designed filter to correct accumulated errors such as drift. Moreover, they have a relatively lower resolution than encoders; thus, measuring a joint angle using an AHRS or IMU is often considered inappropriate in robotics. Nevertheless, several studies [34,35] estimated the joint angles of a manipulator with AHRSs and IMUs to utilize their advantages of contactless sensing and simple installation. Additionally, as an AHRS returns its attitude as an Euler angle consisting of three independent variables (roll, pitch, and yaw), it can estimate up to three joint angles without installing encoders on all the joint axes to be measured [36,37]. Thus, we applied two additional AHRSs (MW-AHRSv1, NTREX Corp.) on each leg to estimate the angles of the hip and ankle joints, which have multiple-DOF structures. The angle of the knee joint, which has one DOF, was measured by an absolute encoder (12-bit resolution, AMT203, CUI). Consequently, the number of sensors required for estimating the state of each leg of the exoskeleton, which has six measured DOFs, was reduced from six (as would be required when using only encoders) to three.
Unfortunately, in this study, it was impossible to utilize a magnetometer to compensate for the yaw drift because of the magnetic disturbance caused by the electrical actuators and the indoor environment. Thus, we used only the gyroscope measurements to calculate the yaw angles, and tests using the system were conducted for short durations to minimize the drift error. The drift error was measured to be within ±0.1 °/min in a stationary condition.
Thin and flexible force sensing resistors (FSRs) were used to measure the foot plantar pressure distribution and the interaction forces exerted through the physical interface on the wearer’s thigh and shank. Our previous study [38] has shown that measuring these forces is useful for the exoskeleton to recognize the wearer’s intention and to operate in accordance with it. To insert the FSRs into the shoes of the wearer, an insole-type sensor was fabricated by bonding four FSRs (A401, Tekscan Inc.) to a polypropylene sheet (Figure 7). The FSRs were placed on protruding areas of the human foot where pressure concentrates during walking [39,40]. The interaction forces were also measured using FSRs (A301, Tekscan Inc.) that were smaller than the FSRs on the insole sensor.
Custom-made sensors (Figure 8) for measuring the interaction forces between the wearer and exoskeleton were installed on the base of each fastener of the thigh and shank segments. As an FSR can only measure compressive forces, a pair of FSRs was inserted with springs to measure bidirectional forces. Parts A and B shown in Figure 8 were fixed to the fastener (wearer side) and the thigh or shank frame (exoskeleton side), respectively. When the wearer moves his/her limb, part A is translated along the linear guides (yellow arrows) and compresses the spring. Then, the measured force (red arrows) on one of the FSR will be increased while that on the other side will be decreased. All of the FSR output signals were filtered using a second order Butterworth filter with a 10 Hz cutoff frequency. The output of each FSR was calibrated with a second order polynomial curve for the range of 0–20 kgf (Figure 9).
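As an illustration of the calibration step described above, a second-order polynomial curve can be fitted with `numpy.polyfit`, and the bidirectional force of a spring-loaded FSR pair obtained as the difference of the two calibrated readings. This is a minimal sketch: the raw counts and reference loads below are made-up placeholders, not the authors' calibration data, and the helper names are hypothetical.

```python
import numpy as np

# Hypothetical calibration data: raw FSR output vs. applied reference load (0-20 kgf).
raw_counts = np.array([0, 200, 400, 600, 800, 1000], dtype=float)
ref_kgf = np.array([0.0, 2.1, 5.0, 9.2, 14.1, 20.0])

# Second-order polynomial calibration curve, as described in the text.
coeffs = np.polyfit(raw_counts, ref_kgf, 2)

def fsr_force(count):
    """Convert a raw FSR reading to force (kgf) via the fitted polynomial."""
    return np.polyval(coeffs, count)

def interaction_force(count_a, count_b):
    """Bidirectional force from a spring-loaded FSR pair:
    one side loads while the other unloads, so the net force is the difference."""
    return fsr_force(count_a) - fsr_force(count_b)
```

The sign of `interaction_force` then indicates the direction in which the wearer pushes against the cuff.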

2.1.3. Actuation Modules

The flexion/extension of the hip and knee joints was assisted by the actuation module shown in Figure 10. Unlike built-in actuators of a general exoskeleton, the actuation modules designed in this study can be detached from the exoskeleton frame and selectively provide assistive torque to the joints. Modularization can be one of the means to extend the capability of an exoskeleton limited to a specific user or task. Through modularization, a system can be reconfigured depending on the user requirements [41], applying a specific stiffness or damping force on the joint [42] and converting from a passive orthosis to a motorized orthosis [43]. Consequently, it will offer different testing options for developers and provide various functions for users and reduce the cost for customers. The actuation module is composed of an electric motor, a gear reducer (100:1 for hip module and 80:1 for knee module, SHD-20-2SH, Harmonic Drive LLC), and a motor driver (24V-16A, CUBE-2416-SIH, Robocube Tech co.) with the CAN protocol. Two types of electric motors (TBMS-6025-B for hip module and TBMS-6013-B for knee module, KOLLMORGEN) were selected to handle the torque–speed characteristics of human hip and knee joint during walking [44], as shown in Figure 11. Jaw couplings were applied to the output shaft of the module and joint of the exoskeleton to transfer the output power of the actuator to the wearer’s joint.
The exoskeleton developed in this study can operate in five different modes by assembling the actuation modules depending on the assistance requirements of wearers, as shown in Figure 12. If a wearer cannot move his/her lower limbs at all, mode 1, with four actuation modules on the hip and knee joints, is appropriate for assistance. Likewise, the other modes can be used according to the assistance requirements of wearers. In mode 2, actuation modules are attached to both hip joints or both knee joints; in mode 3, the actuation modules are attached to the hip and knee joints of one leg; in mode 4, an actuation module is attached to one of the lower limb joints. Although the exoskeleton in mode 0, where no actuation module is incorporated, cannot support the movement of the wearer, the joint angle and plantar pressure data of the wearer can still be obtained from the sensor measurements; thus, it can be utilized as a motion analysis device. In this study, we utilized the exoskeleton in mode 0 for developing the terrain feature estimation method because this mode can follow the wearer's movements without using additional control schemes such as the transparent mode [45] to eliminate the actuator inertia. The total weight of the exoskeleton without and with the actuation modules is 9.5 kg and 15.5 kg, respectively.

2.2. Strategy for Terrain Feature Estimation

This study calculated the position of the contact points on the ground at each step while the wearer walked; the slope and elevation of the terrain were estimated by modeling the contact points as a spatial plane. The proposed method calculated the position and orientation of each segment and foot through kinematic data derived from the embedded sensors. Simultaneously, we used the plantar pressure, which was measured by the insole sensor, to calculate the CoP and utilized it as a contact point with the ground. This was then introduced into the previously performed kinematic analysis, and the spatial locations of the contact points were collected while walking. The collected contact points on the ground on which the exoskeleton stepped were modeled as a plane through the least-square method. Finally, we used the normal vectors of the modeled planes during each step to estimate the terrain features, namely the slope and elevation.
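The plane-fitting step above can be sketched as follows. This is a minimal illustration assuming a total least-squares fit via SVD (the text specifies only "the least-square method"); `fit_plane` and `slope_deg` are hypothetical helper names, and the slope is taken as the angle between the plane normal and the vertical axis of frame {W}.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a set of 3-D contact points.
    Returns a unit normal vector and the centroid (a point on the plane)."""
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)
    # The right-singular vector for the smallest singular value of the
    # centered point cloud is the normal of the best-fit plane.
    _, _, vt = np.linalg.svd(P - centroid)
    normal = vt[-1]
    if normal[2] < 0:  # orient the normal upward (+Z of the world frame)
        normal = -normal
    return normal, centroid

def slope_deg(normal):
    """Terrain slope: angle between the plane normal and the vertical axis."""
    return np.degrees(np.arccos(np.clip(normal[2], -1.0, 1.0)))
```

Comparing the normals (and centroid heights) of the planes fitted for consecutive steps then yields the slope and elevation changes of the terrain.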

2.2.1. Kinematic Analysis of Lower Limb Exoskeleton

Kinematic analysis of the lower limb exoskeleton was performed based on the Denavit–Hartenberg (D–H) convention [46]. Starting from the back panel, frame {0}, frames consisting of the X (dotted red arrow) and Z (solid red arrow) axes were sequentially attached to the joints of each leg, which has eight DOFs, as shown in Figure 13. XN and ZN denote the X and Z axes of the frame attached to the Nth segment, respectively. Furthermore, to distinguish the left and right leg frames, "l" and "r", indicating the left and right legs, respectively, are appended after the frame number N. Although the figure shows only the kinematic model for the left leg of the exoskeleton, the frames are attached to the right leg using the same rule. In the figure, the frames ({B}, {T}l,r, and {F}l,r) that comprise the X–Y–Z axes indicate the positions and postures of the AHRSs installed on the back panel and on the thighs and feet of the left and right legs. Frame {W} indicates the world coordinate system, and its ZW axis is parallel to the gravity vector. Table 2 presents the D–H parameters calculated using the defined frames. The displacement of the prismatic joints P1 and P2 was not considered because it is normally small and the joints maintain their neutral position with the help of the springs except in abnormal situations. Frames {2} and {4} are virtual frames added to arrange the joints to fit the hip structure of the exoskeleton.
The homogeneous transformation matrix ${}^{i-1}_{i}T$, which represents the orientation and position of the $i$th frame with respect to the $(i-1)$th frame according to the D–H convention, can be expressed as follows:
$${}^{i-1}_{i}T_{l,r} = \begin{bmatrix} {}^{i-1}_{i}R_{3\times 3} & {}^{i-1}_{i}d_{3\times 1} \\ 0_{1\times 3} & 1 \end{bmatrix}_{l,r} = \begin{bmatrix} c\theta_i & -s\theta_i & 0 & a_{i-1} \\ s\theta_i c\alpha_{i-1} & c\theta_i c\alpha_{i-1} & -s\alpha_{i-1} & -s\alpha_{i-1}d_i \\ s\theta_i s\alpha_{i-1} & c\theta_i s\alpha_{i-1} & c\alpha_{i-1} & c\alpha_{i-1}d_i \\ 0 & 0 & 0 & 1 \end{bmatrix}_{l,r}, \qquad (1)$$
where ${}^{i-1}_{i}R_{3\times 3}$ and ${}^{i-1}_{i}d_{3\times 1}$ denote the rotation and translation parts of the transformation matrix, respectively; 'c' and 's' denote the cosine and sine functions, respectively.
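The transform above can be implemented directly. The sketch below is a hypothetical helper assuming the modified D–H convention used in the matrix above, building the 4 × 4 transform from the four parameters of one table row:

```python
import numpy as np

def dh_transform(alpha_prev, a_prev, d, theta):
    """Homogeneous transform of frame {i} w.r.t. frame {i-1} in the
    modified D-H convention, from alpha_{i-1}, a_{i-1}, d_i, theta_i."""
    ca, sa = np.cos(alpha_prev), np.sin(alpha_prev)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct,      -st,      0.0,  a_prev],
        [st * ca,  ct * ca, -sa,  -sa * d],
        [st * sa,  ct * sa,  ca,   ca * d],
        [0.0,      0.0,      0.0,  1.0],
    ])
```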
The position and orientation of the foot segment and frame {8} with respect to frame {0}, which is the frame on the back panel, were calculated as follows according to the segment connection order of the exoskeleton:
$${}^{0}_{8}T_{l,r} = \prod_{i=1}^{8} {}^{i-1}_{i}T_{l,r}. \qquad (2)$$
Consequently, using the rotation matrix of frame {B}, which was calculated by the AHRS installed on the back panel, the position on frame {W} of each segment was calculated as follows:
$${}^{W}_{8}T_{l,r} = {}^{W}_{B}T \, {}^{B}_{0}T \, {}^{0}_{8}T_{l,r}, \qquad (3)$$
where ${}^{W}_{B}T$ is calculated by the AHRS on the back panel, and ${}^{B}_{0}T$ is a constant matrix determined by the installation of the AHRS on the back panel.
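The transform chaining above is a plain product of homogeneous matrices; a minimal sketch (the `chain` helper is hypothetical):

```python
import numpy as np
from functools import reduce

def chain(transforms):
    """Compose a sequence of 4x4 homogeneous transforms into one,
    left-multiplying in order, starting from the identity."""
    return reduce(np.matmul, transforms, np.eye(4))
```

Given per-joint transforms `T_list` for one leg, `chain(T_list)` yields the foot pose with respect to the back panel, and pre-multiplying by the back-panel and AHRS mounting transforms yields the pose in the world frame.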
In this study, as explained in Section 2, the angles of the hip joint (θ1, θ3, and θ5) and ankle joint (θ7 and θ8) of both the legs, excluding the knee joint angle (θ6), were indirectly measured by the five installed AHRSs. Each AHRS returns its posture as a rotation matrix with respect to frame {W}. The following relationship was derived using frame {B} of the AHRS, which was installed on the back panel, and frame {T}l, r of the AHRS, which was installed on the thigh segments:
$${}^{W}_{T}R_{l,r} = {}^{W}_{B}R \, {}^{B}_{0}R \, {}^{0}_{5}R_{l,r} \, {}^{5}_{T}R_{l,r}, \qquad (4)$$
where $R$ denotes the rotation part of the corresponding transformation matrix $T$, and ${}^{5}_{T}R_{l,r}$ is a constant matrix determined by the installation of the AHRS on the thigh segment.
As matrix ${}^{W}_{T}R_{l,r}$ is directly measured by the AHRSs installed on the thigh segments, matrix ${}^{0}_{5}R_{l,r}$ was calculated from the known quantities as follows:
$${}^{0}_{5}R_{l,r} = ({}^{B}_{0}R)^{-1}({}^{W}_{B}R)^{-1} \, {}^{W}_{T}R_{l,r} \, ({}^{5}_{T}R_{l,r})^{-1} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}_{l,r}, \qquad (5)$$
where $r_{ij}$ is an element of matrix ${}^{0}_{5}R_{l,r}$.
Rotation matrix ${}^{0}_{5}R_{l,r}$ was also derived using the D–H parameters in Table 2 as follows:
$${}^{0}_{5}R_{l,r} = \prod_{i=1}^{5} {}^{i-1}_{i}R_{l,r} = \begin{bmatrix} -s\theta_1 c\theta_5 - s\theta_3 s\theta_5 c\theta_1 & s\theta_1 s\theta_5 - s\theta_3 c\theta_1 c\theta_5 & c\theta_1 c\theta_3 \\ c\theta_1 c\theta_5 - s\theta_1 s\theta_3 s\theta_5 & -s\theta_5 c\theta_1 - s\theta_1 s\theta_3 c\theta_5 & s\theta_1 c\theta_3 \\ s\theta_5 c\theta_3 & c\theta_3 c\theta_5 & s\theta_3 \end{bmatrix}_{l,r}. \qquad (6)$$
Therefore, the hip joint angles (θ1, θ3, and θ5) were calculated using Equations (5) and (6) as follows:
$$\theta_3 = \operatorname{atan2}\!\big(r_{33}, \sqrt{r_{31}^2 + r_{32}^2}\big), \quad \theta_5 = \operatorname{atan2}\!\big(r_{31}/c\theta_3,\; r_{32}/c\theta_3\big), \quad \theta_1 = \operatorname{atan2}\!\big(r_{23}/c\theta_3,\; r_{13}/c\theta_3\big) \quad \text{for } l, r. \qquad (7)$$
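A round-trip check of Equations (6) and (7) can be sketched as below. The signs in the forward model `R05` were reconstructed to be self-consistent with the atan2 extraction of Equation (7) and may differ from the authors' exact derivation; both helper names are hypothetical.

```python
import numpy as np

def R05(t1, t3, t5):
    """Forward model: rotation from frame {0} to frame {5} as a function of
    the hip joint angles (theta1, theta3, theta5), per Eq. (6)."""
    s1, c1 = np.sin(t1), np.cos(t1)
    s3, c3 = np.sin(t3), np.cos(t3)
    s5, c5 = np.sin(t5), np.cos(t5)
    return np.array([
        [-s1 * c5 - s3 * s5 * c1,  s1 * s5 - s3 * c1 * c5, c1 * c3],
        [ c1 * c5 - s1 * s3 * s5, -s5 * c1 - s1 * s3 * c5, s1 * c3],
        [ s5 * c3,                 c3 * c5,                s3     ],
    ])

def hip_angles(R):
    """Inverse solution, per Eq. (7): recover (theta1, theta3, theta5)."""
    theta3 = np.arctan2(R[2, 2], np.hypot(R[2, 0], R[2, 1]))
    c3 = np.cos(theta3)
    theta5 = np.arctan2(R[2, 0] / c3, R[2, 1] / c3)
    theta1 = np.arctan2(R[1, 2] / c3, R[0, 2] / c3)
    return theta1, theta3, theta5
```

Note that the extraction degenerates when cθ3 approaches zero; walking motions keep θ3 well away from ±90°, so this is not an issue in practice.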
Similarly, the ankle joint angles (θ7 and θ8) were obtained as in Equations (4)–(6). Using frame {T}l,r of the AHRS on the thigh segment and frame {F}l,r of the AHRS on the foot segment, matrix ${}^{6}_{8}R_{l,r}$ was calculated as follows:
$${}^{6}_{8}R_{l,r} = ({}^{5}_{6}R_{l,r})^{-1} \, {}^{5}_{T}R_{l,r} \, ({}^{W}_{T}R_{l,r})^{-1} \, {}^{W}_{F}R_{l,r} \, ({}^{8}_{F}R_{l,r})^{-1} = \begin{bmatrix} q_{11} & q_{12} & q_{13} \\ q_{21} & q_{22} & q_{23} \\ q_{31} & q_{32} & q_{33} \end{bmatrix}_{l,r}, \qquad (8)$$
where ${}^{8}_{F}R_{l,r}$ is a constant matrix determined by the installation of the AHRS on the foot segment; matrix ${}^{W}_{F}R_{l,r}$ was measured by the AHRS installed on the foot segment; matrix ${}^{5}_{6}R_{l,r}$ was calculated using the knee joint angle (θ6); and $q_{ij}$ is an element of matrix ${}^{6}_{8}R_{l,r}$.
Matrix ${}^{6}_{8}R_{l,r}$ was also derived using the D–H parameters as follows:
$${}^{6}_{8}R_{l,r} = \prod_{i=7}^{8} {}^{i-1}_{i}R_{l,r} = \begin{bmatrix} c\theta_7 c\theta_8 & -c\theta_7 s\theta_8 & s\theta_7 \\ s\theta_7 c\theta_8 & -s\theta_7 s\theta_8 & -c\theta_7 \\ s\theta_8 & c\theta_8 & 0 \end{bmatrix}_{l,r}. \qquad (9)$$
Finally, the ankle joint angles, θ7 and θ8, of both legs were derived using Equations (8) and (9) as follows:
$$\theta_7 = \operatorname{atan2}(q_{13}, -q_{23}), \quad \theta_8 = \operatorname{atan2}(q_{31}, q_{32}) \quad \text{for } l, r. \qquad (10)$$
Figure 14 compares the estimated result using Equation (7) for the hip joint angle (θ5) and the measured result using an absolute encoder.

2.2.2. Calculation of CoP and Foot Phase

In this study, we used the CoP, calculated from the plantar pressures measured by an insole sensor (Figure 7) comprising four FSRs, and the angular velocity ($\omega_F$) of the foot segment, measured by the AHRS, to determine the foot phase of the exoskeleton. We assumed that slip does not occur while the exoskeleton is in contact with the ground, owing to the rubber pads under the sole of the foot segment. The CoP values of both feet were calculated using the position ($\boldsymbol{r}_{\mathrm{FSR},i}$) and measured force ($F_{\mathrm{FSR},i}$) of each FSR as follows:
$$\mathrm{CoP}(x, y) = \frac{\sum_{i=1}^{4} \boldsymbol{r}_{\mathrm{FSR},i} F_{\mathrm{FSR},i}}{\sum_{i=1}^{4} F_{\mathrm{FSR},i}}; \qquad (11)$$
the notations for the FSR and coordinate axes for the CoP are depicted in Figure 7, and the coordinate of the FSR is calculated on frame {8}l,r.
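The CoP computation above is a force-weighted average of the FSR positions; a minimal sketch with a hypothetical `center_of_pressure` helper, returning `None` when no pressure is measured (i.e., the foot is off the ground):

```python
import numpy as np

def center_of_pressure(positions, forces):
    """Force-weighted average of FSR positions on frame {8}.
    positions: one (x, y) pair per FSR; forces: matching force values.
    Returns None when the total force is (near) zero."""
    F = np.asarray(forces, dtype=float)
    total = F.sum()
    if total < 1e-6:
        return None
    P = np.asarray(positions, dtype=float)
    return (P * F[:, None]).sum(axis=0) / total
```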
The foot phases were classified into four phases according to the calculated position of the CoP and angular velocity ( ω F ): heel contact (HC), foot flat (FF), heel off (HO), and foot off (FO). First, if the calculated CoP was zero, it indicated that the exoskeleton did not touch the ground. In this case, the foot phase became the FO phase. If the CoP was not zero, the foot phase was determined by considering the position of the CoP and angular velocity. If the CoP was located close to the heel side and the angular velocity was higher than a certain threshold, the foot phase became the HC phase. If the CoP was located on the front side and the angular velocity was higher than a certain threshold, the foot phase became the HO phase. In other cases, if the CoP was located in the middle of the foot or the angular velocity was low enough, the foot phase became the FF phase. Figure 15 shows the foot phase calculation results for one gait cycle while walking.
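The phase rules above can be sketched as a small classifier. The numeric thresholds below are placeholders (the text does not give values), and the CoP is assumed to be `None` when no pressure is measured:

```python
# Hypothetical thresholds; the source gives no numeric values.
OMEGA_TH = 0.5              # rad/s, angular-velocity threshold
HEEL_Y, TOE_Y = 0.05, 0.15  # CoP limits (m) separating heel / mid-foot / forefoot

def foot_phase(cop, omega):
    """Classify the foot phase (HC / FF / HO / FO) from the CoP and the
    foot angular velocity, following the rules described in the text."""
    if cop is None:                         # no plantar pressure: foot off
        return "FO"
    y = cop[1]                              # CoP position along the foot axis
    if y < HEEL_Y and abs(omega) > OMEGA_TH:
        return "HC"                         # CoP at heel, foot still rotating
    if y > TOE_Y and abs(omega) > OMEGA_TH:
        return "HO"                         # CoP at forefoot, foot rotating
    return "FF"                             # mid-foot CoP or low angular velocity
```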

2.2.3. Position Calculation in Frame {W}

The spatial movement of the exoskeleton was analyzed by calculating the relative motion of each segment with respect to the contact point on the ground, which serves as a pivot. For this analysis, it was important to choose the pivot appropriately based on the foot contact conditions. In a single stance, as the CoP was measured only for the one foot touching the ground, the CoP of the supporting leg was selected as the pivot. In a double stance, where both legs touched the ground, the CoP of the single foot in the FF phase, considered to be in full contact with the ground, was selected as the pivot. If both feet were in the FF phase, or neither foot was in the FF phase, the CoP on the side bearing more weight was selected as the pivot.
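The pivot selection rules above can be expressed as a small decision function (a sketch with assumed names; the weight comparison uses the total FSR force on each foot):

```python
def select_pivot(phase_l, phase_r, weight_l, weight_r):
    """Choose the supporting side whose CoP serves as the pivot.

    phase_l, phase_r: foot phases ("HC", "FF", "HO", "FO").
    weight_l, weight_r: total FSR force on each foot [N].
    Returns "l" or "r", or None when neither foot touches the ground.
    """
    contact_l, contact_r = phase_l != "FO", phase_r != "FO"
    if not (contact_l or contact_r):
        return None                              # airborne: no pivot
    if contact_l and not contact_r:
        return "l"                               # single stance on the left
    if contact_r and not contact_l:
        return "r"                               # single stance on the right
    # Double stance: prefer the single foot in full contact (FF phase).
    if (phase_l == "FF") != (phase_r == "FF"):
        return "l" if phase_l == "FF" else "r"
    # Both or neither in FF: take the side bearing more weight.
    return "l" if weight_l >= weight_r else "r"
```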
Figure 16 shows the calculation of the position vector of the pivot on frame {W} according to the changes in the CoP during walking. Based on the state shown in Figure 16, at t = n − 1 (Figure 16a), the CoP of the left leg was used as the pivot (Equation (12)). Subsequently, as the position of the CoP was calculated on frame {8}, the position vector of the CoP on frame {W} was calculated through a homogeneous transformation from kinematic analysis (Equation (13)):
${}^{8}P_{pivot}^{t} = {}^{8}P_{\mathrm{CoP},l}^{t},$
${}^{W}P_{pivot}^{t} = {}^{W}_{B}T^{t}\; {}^{B}_{8}T_{l}^{t}\; {}^{8}P_{pivot}^{t},$
where P is a 4 × 1 position vector with its fourth element being 1.
At t = n (Figure 16b), as the CoP moved because of weight shifting by the wearer during the single stance, the position vector of the pivot on frame {W} was updated with the same variation as the CoP. The variation in the CoP was calculated by comparing the position vectors of the current CoP (${}^{8}P_{\mathrm{CoP},l}^{t}$) and the previous CoP (${}^{8}P_{\mathrm{CoP},l}^{t-1}$) on frame {B}. The calculated variation was transformed to a vector on frame {W} ($\Delta {}^{W}P_{pivot}^{t}$) through a homogeneous transformation and used to update the current pivot position vector by adding it to the previous pivot position vector, as follows:
$\Delta {}^{W}P_{pivot}^{t} = {}^{W}_{B}T^{t}\left( {}^{B}P_{pivot}^{t} - {}^{B}P_{pivot}^{t-1|t} \right),$
${}^{W}P_{pivot}^{t} = {}^{W}P_{pivot}^{t-1} + \Delta {}^{W}P_{pivot}^{t},$
where ${}^{B}P_{pivot}^{t}$ and ${}^{B}P_{pivot}^{t-1|t}$ are ${}^{B}_{8}T_{l}^{t}\, {}^{8}P_{\mathrm{CoP},l}^{t}$ and ${}^{B}_{8}T_{l}^{t}\, {}^{8}P_{\mathrm{CoP},l}^{t-1}$, respectively, and indicate the position vectors of the pivots calculated with respect to frame {B} at time t.
At t = n + 1 (Figure 16c), i.e., during the double stance, the supporting leg changed from the left leg to the right leg; the CoP of the left foot was used as the position vector of the previous pivot, and that of the right foot was used as the position vector of the current pivot, as follows:
${}^{B}P_{pivot}^{t} = {}^{B}_{8}T_{r}^{t}\, {}^{8}P_{\mathrm{CoP},r}^{t} \quad \text{and} \quad {}^{B}P_{pivot}^{t-1|t} = {}^{B}_{8}T_{l}^{t}\, {}^{8}P_{\mathrm{CoP},l}^{t}.$
The variation in the CoP on frame {W} was calculated using Equation (14), and the position vector of the current pivot was updated using Equation (15). While the exoskeleton walked on the ground, the position vector of the pivot was continually updated through this process to calculate the exoskeleton motion with respect to frame {W}.
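Under the stated assumptions (4 × 4 homogeneous transforms and homogeneous position vectors; all names are ours), the pivot update of Equations (14) and (15) can be sketched as follows. By linearity, transforming the difference of the two CoP vectors is equivalent to differencing the two transformed vectors:

```python
import numpy as np

def update_pivot_W(W_P_prev, T_WB, T_B8, p_cop_now, p_cop_prev):
    """Sketch of the pivot update in frame {W} (Equations (14) and (15)).

    W_P_prev             : previous pivot position in {W}, homogeneous 4-vector.
    T_WB                 : 4x4 homogeneous transform from frame {B} to frame {W}.
    T_B8                 : 4x4 homogeneous transform from foot frame {8} to {B}.
    p_cop_now, p_cop_prev: current and previous CoP in frame {8}, homogeneous.
    """
    # The difference of two homogeneous points is a displacement (4th entry 0),
    # so transforming the difference equals differencing the transformed points.
    delta_B = T_B8 @ (p_cop_now - p_cop_prev)
    delta_W = T_WB @ delta_B          # Equation (14): CoP variation in frame {W}
    return W_P_prev + delta_W         # Equation (15): accumulate onto the pivot
```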

2.2.4. Modeling the Contact Surface as a Plane

The calculated position vectors of the pivot are the contact points on the walking ground. Therefore, we used the pivots as markers representing the geometry of the ground. Only the pivot collected in the FF phase was utilized as a marker for terrain estimation because it was fully in contact with the ground. The markers were separately grouped for each step. The set SN,m consisting of m markers collected for the Nth step is expressed as follows:
$S_{N,m} = \left\{ {}^{W}P_{pivot}^{t-m+1},\; {}^{W}P_{pivot}^{t-m+2},\; \ldots,\; {}^{W}P_{pivot}^{t} \right\},$
If a sufficient number of markers was accumulated (m ≥ 10), the walking ground for the Nth step was modeled as a spatial plane ($a_N x + b_N y + c_N z + d_N = 0$) through the least-square method using set SN,m (Figure 17). However, because the collected points lay nearly along the walking direction and were therefore sufficient to determine a spatial line but insufficient to determine a spatial plane, we added virtual markers (purple squares in Figure 17), located in the medial direction of the foot and parallel to the calculated CoP, to model the plane. Finally, plane ON, modeled for the Nth step, is defined as follows:
$O_N = \left\{ \mathbf{u}_N,\; {}^{W}P_{center,N} \right\},$
where $\mathbf{u}_N$ indicates the normal vector of the modeled plane, and ${}^{W}P_{center,N}$ is the center point of the plane, i.e., the average value of the marker set SN,m.
Because the markers were continuously collected while the foot of the exoskeleton remained in contact with the ground, the size of the marker set increased during the contact, and the modeled plane of the ground for each step changed as the set was updated. Figure 18 shows how the plane equation components (aN, bN) are continuously updated over one gait cycle during walking. Therefore, the longer the contact with the ground, the closer the modeled plane is to the actual terrain.
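A plane fit of this kind is commonly computed via a singular value decomposition of the centered markers, which is equivalent to the least-square formulation; the following sketch (assumed function name) returns the normal vector and center point that define ON:

```python
import numpy as np

def fit_plane(markers):
    """Fit a plane to 3-D markers by SVD of the centered point set.

    Equivalent to the least-square plane a_N x + b_N y + c_N z + d_N = 0.
    markers: (m, 3) array of pivot positions in frame {W}.
    Returns (u, center): unit normal u_N and plane center ^W P_center,N.
    """
    center = markers.mean(axis=0)               # average of the marker set S_N,m
    _, _, vt = np.linalg.svd(markers - center)  # right singular vectors
    u = vt[-1]                                  # least-variance direction = normal
    if u[2] < 0:                                # orient the normal toward +Z_W
        u = -u
    return u, center
```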

2.2.5. Terrain Feature Estimation

As the wearer walked on the ground, the contact point sets of both feet were modeled as planes that varied based on the geometry of the ground. Figure 19 shows planes ON and ON − 1 modeled for steps N and N − 1, respectively. We calculated the slope and elevation of the walking terrain by comparing the normal vectors ($\mathbf{u}_N$) and center points (${}^{W}P_{center,N}$) of the planes, as follows:
$\theta_N = \mathrm{atan2}\left( \left\| \mathbf{u}_N \times Z_W \right\|,\; \mathbf{u}_N \cdot Z_W \right),$
$h_N = {}^{W}P_{center,N}(z) - {}^{W}P_{center,N-1}(z),$
where ${}^{W}P_{center,N}(z)$ indicates the $Z_W$ component of ${}^{W}P_{center,N}$.
The slope θ N represents the angle of the normal vector uN with respect to the ZW axis. This value indicates the degree to which the ground is tilted with respect to the direction of gravity. The elevation hN is the height difference between the floor in the previous step and the floor in the current step. These two features reflect the geometry of the ground; for example, both these features will be low for the level ground case; however, for stairs, the elevation will be high while the slope will be low. In the next section, we show the results of the estimated features of different terrains through walking experiments involving level ground, stairs, and a ramp.
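The two features can be computed from consecutive planes as in the following sketch (assumed function name; the slope uses the standard atan2 form of the angle between uN and the ZW axis):

```python
import numpy as np

def terrain_features(u_now, center_now, center_prev):
    """Slope (degrees) and elevation of the current step from consecutive planes.

    u_now: unit normal of the current plane; center_now, center_prev: plane
    center points in frame {W}.
    """
    z_w = np.array([0.0, 0.0, 1.0])              # Z_W axis (gravity direction)
    # theta_N = atan2(||u_N x Z_W||, u_N . Z_W): angle between u_N and Z_W.
    slope = np.degrees(np.arctan2(np.linalg.norm(np.cross(u_now, z_w)),
                                  np.dot(u_now, z_w)))
    elevation = center_now[2] - center_prev[2]   # h_N: height difference
    return slope, elevation
```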

3. Experimental Results

For evaluating the performance of the proposed method, five types of ambulation tests on different terrains (Figure 20) were performed by healthy subjects, whose demographic characteristics are listed in Table 3. Each subject provided informed consent before participating in the test. The segments and joint axes of the exoskeleton were adjusted to fit the subject's body size so that the subject could move comfortably. The experiment began after the subject had practiced sufficiently to move naturally on the experimental terrain.
The three types of terrains used in the experiment are shown in Figure 21; the horizontal distance (Dcase), vertical distance (Hcase), and inclination angle (θcase) for each terrain are summarized in Table 4. Three ramps with different slopes were used to validate the performance of the slope estimation. It was assumed that the experimental terrains did not change in the lateral direction. The subject began ambulation with the standing posture and then repeated the level walk (LW), stair ascent (SA), stair descent (SD), ramp ascent (RA), and ramp descent (RD) processes on each terrain 10 times. For obtaining consistent experimental results, the start and end positions were marked on the ground as the reference positions for the subject. Sensor data from the exoskeleton were collected at a sampling rate of 100 Hz using a PC. The collected data were analyzed by using MATLAB (MATLAB R2017a, The MathWorks, Inc.).
Walking data corresponding to 6510 steps in total were collected during the experiments for all subjects. Figure 22 shows the kinematic analysis results of the exoskeleton on frame {W} for each terrain, and Figure 23 shows the average terrain feature estimation results for all subjects. Intermediate analysis results are overlaid in the figure. The estimated planes of each step on the ground on which the exoskeleton walked are depicted with their normal vectors (red arrows). As the results show, the method analyzed the exoskeleton movement for each terrain even though no prior knowledge of the experimental terrain was provided. Videos showing the analysis results of the exoskeleton on the different terrains can be found in Supplementary Materials Video S1.
Table 5 summarizes the estimation errors in the total displacement of the subject and in the slope and elevation calculated using the proposed method for the experimental terrains. The total accuracy (Stotal), calculated using the entire dataset, is given in the last row for each test. The position error per step was calculated by dividing the error between the total calculated displacement of the subject and the actual distance of the experimental section (Table 4) by the number of steps performed for each section. The total root mean square (RMS) error of the horizontal displacement over all experiments (DRMSE) was approximately 13 mm per step, which is approximately 2% of the step length of a normal person (approximately 660 mm [47]). Although the total RMS error of the vertical displacement (HRMSE) was within 10 mm per step for the other tests, the SD test showed a larger error of approximately 20 mm per step. This is because, in the SD motion, the flexible front of the foot touches the ground first at each step, instead of the rigid heel cuff. Therefore, if an accurate human foot model is used to calculate the position of the contact point, this error can be reduced. The lateral direction error (YRMSE) reflects the degree of drift caused by the AHRS. This error was lower than 10 mm per step over the total walking distance in the experiments on level ground and on ramps 2 and 3, which is a reasonable result; however, errors in the range of 10–20 mm per step were observed in the experiments on the stairs and ramp 1.
Among the estimation results for each terrain, the RMS error of the slope ($\theta_{RMSE}$) was less than 2° for all tests. The RMS error of the elevation ($h_{RMSE}$) was calculated for the LW, SA, and SD tests only, as the stride of the subject was not constrained in the RA and RD tests. The estimation error of the elevation in the LW test was within approximately 10 mm, and the errors in the SA and SD tests were approximately 7% and 17%, respectively, of the 165 mm height of the experimental staircase. The SD test showed a larger error than the other experiments because the front part of the foot, which is allowed to bend, touched the ground first.
On average, the errors of the proposed method over all terrains in the three directions were 10.29, 9.45, and 9.22 mm per step, respectively; the terrain feature estimation results showed RMS errors of the slope and elevation of approximately 1.2° and 10% of the stair height, respectively. The best results were obtained in the level-ground experiment, whereas the errors in the stair and ramp experiments need to be reduced in the future. In the next section, we discuss directions for reducing these errors.
The terrain features of each step (slope and elevation) estimated using the proposed method can also be used for classifying the locomotion mode. We normalized the slope and elevation results of all 6510 steps by dividing them by their maximum values and used the normalized values as features to classify the walking terrain. Figure 24 shows the distribution of the samples in the feature space. In the figure, the SA and SD samples are clearly distinguished from the other samples, but the RA and RD samples on low slopes are partially mixed with the LW samples. Table 6 presents the classification performance of a support vector machine (SVM) on the total dataset of steps, and Table 7 shows the classification performance for each subject. Classification using the estimated features showed recognition accuracy higher than 95% for all experimental motions and all subjects. These results confirm that the terrain features estimated using the proposed method can be used for classifying walking movements.
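As a rough illustration of this classification step, the sketch below normalizes the two features by their maximum values, as described above, and uses a nearest-centroid rule as a dependency-free stand-in for the SVM actually used in the paper (all names and sample values are hypothetical):

```python
import numpy as np

def normalize(features):
    """Divide each feature column by its maximum absolute value."""
    return features / np.abs(features).max(axis=0)

def nearest_centroid_predict(train_x, train_y, test_x):
    """Assign each test sample to the class with the closest training centroid.

    A simple stand-in for the SVM; train_x/test_x are (n, 2) arrays of
    normalized (slope, elevation) features, train_y a list of mode labels.
    """
    labels = sorted(set(train_y))
    y = np.array(train_y)
    centroids = np.array([train_x[y == c].mean(axis=0) for c in labels])
    dists = np.linalg.norm(test_x[:, None, :] - centroids[None, :, :], axis=2)
    return [labels[i] for i in np.argmin(dists, axis=1)]
```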

4. Discussion

In this paper, we proposed a method to estimate the terrain features, namely the slope and elevation, of the walking ground by finding the contact surface based on the kinematic analysis of the lower limb exoskeleton. The proposed method was evaluated via experiments involving three types of terrains: level walking for 10 m, stair ascent and descent, and ramp ascent and descent with three different slopes. In the experimental results, except for SD walking, the total displacement calculation error in all directions was approximately 10 mm per step, which corresponds to 1.5% of the 660 mm step length of a normal person; the slope and elevation estimation errors were approximately 0.8–1.8° and 8–28 mm, respectively. These error values indicate how closely the motion of the exoskeleton is calculated. In addition, the classification results obtained using the estimated features for the five walking movements showed recognition accuracy higher than 95%; this indicates that the terrain features estimated using the proposed method can be used as features for classification.
The main contribution of this study is that the proposed method can estimate information about unknown terrains using only the sensors embedded in the exoskeleton, without direct measurements of the surrounding terrain. This advantage allows the system to determine the surrounding terrain and adapt its control appropriately in unpredicted and unknown environments, rather than being confined to specific environments. In addition, by identifying the slope and elevation of the walking terrain, the method can be used to support the wearer's walking motion safely, for example through stability evaluation or trajectory generation for the next step. Furthermore, as this study estimated the terrain features using only the sensors embedded in a general exoskeleton, the method is expected to be applicable to other exoskeleton systems without difficulty.
The updating process of the modeled plane is an important advantage of our method. The comprehensive work by Huo et al. [48] used terrain features for gait mode detection of a lower limb exoskeleton. They reported elevation estimation errors of mean = 6 mm, std = 34 mm for LW and mean = 1 mm, std = 18 mm for SA and SD, and slope estimation errors of mean = 0.37°, std = 2° for LW, SA, and SD. In our study, the elevation was estimated with errors of mean = 0.93 mm, std = 8.55 mm for LW; mean = 5.16 mm, std = 9.73 mm for SA; and mean = 25.9 mm, std = 10.9 mm for SD; the slope was estimated with errors of mean = 0.03°, std = 1.15° for LW, SA, and SD. Although the mean value of their stair height estimation is better than ours, our method shows lower standard deviations in all cases. Therefore, the updating process, which continuously improves the surface vectors while the foot is in contact with the ground, is expected to further refine the estimation performance.
However, the proposed method still has challenges to overcome. First, as terrain information is estimated after the wearer steps on the ground, the terrain for the next step cannot be estimated before contact. This limitation can be mitigated by combining the cyclical nature of walking [20], in which the terrain usually repeats for the next few steps after the first step, with a user intention estimation algorithm. The drift error from the AHRS also needs to be reduced. In the experiments, we limited the test time to minimize the effect of drift. Generally, drift can be corrected using a magnetometer; however, this is unsuitable for robotic applications because of the magnetic distortions caused by electrical actuators and by indoor structures, such as steel window frames or handrails on stairs. Therefore, in future studies, a more elaborate filter, such as a ZUPT-based algorithm [49] or a model-based extended Kalman filter [35,50], or an additional sensor should be considered to estimate the gyroscope bias and correct drift errors. Finally, the exact position of the contact point between the exoskeleton and the ground should be calculated. This issue was most apparent in the SD test in our study. We configured the front part of the foot to bend for the smooth walking motion of the wearer, but the largest error was found in the SD test owing to the absence of a model for this part. This position error can be reduced by utilizing a roll-over model of the human foot to calculate the position of the contact point.
In this study, because the exoskeleton was used without any actuation modules, the subjects had to move their limbs by themselves. However, the weight of the exoskeleton did not significantly affect the healthy subjects; only one female subject asked for a 15 min rest during the test, which lasted more than three hours on average. Nevertheless, control of the exoskeleton joints using the actuation modules should be developed in a future study to safely assist the motion of people with muscular weakness.
Additionally, utilizing the exoskeleton system, which consists of the modular actuators used in this study, in various modes according to the user's purpose in various applications is a future challenge. As part of this work, we confirmed the possibility of using the developed exoskeleton as a motion capture system for the wearer. Figure 25 shows the estimation results for the wearer's position while walking inside a building. These results verify that our exoskeleton could be used to estimate the wearer's position in a hospital or outdoor environment, overcoming the drawbacks of vision-based motion capture systems, which can only be used in a limited space, and of global positioning systems, which cannot estimate positions inside a building. Another objective for future work is to apply the proposed method to modify the gait trajectory on various terrains in real time.
In conclusion, we developed a method to estimate the slope and elevation of walking terrains based on the kinematic analysis of the lower limb exoskeleton by finding the contact surface with the terrain. We verified the position and terrain feature estimation accuracies of the proposed method through experiments on different terrains involving level walking for 10 m, stair ascent and descent, and ramp ascent and descent. The proposed method approximately analyzed the movement of the exoskeleton on various terrains even though no prior information on the walking terrain was provided. The method is expected to enable exoskeleton systems to actively assist walking in various environments including unrestricted daily living environments.

Supplementary Materials

The following are available online at https://www.mdpi.com/1424-8220/19/20/4418/s1, Video S1: Kinematic analysis results of the exoskeleton for level walk, stair ascent, stair descent, ramp ascent, and ramp descent, Video S2: Kinematic analysis result for the motion of the exoskeleton in a building environment.

Author Contributions

Conceptualization, M.S., J.I.H., H.S.C., and Y.S.B.; formal analysis, M.S., J.I.H.; funding acquisition, M.S., H.S.C., and Y.S.B.; investigation, J.I.H., H.S.C., and S.M.H.; methodology, M.S., J.-H.K.; project administration, Y.S.B.; resources, supervision, J.-H.K., Y.S.B.; validation, M.S., S.M.H.; data curation, software, visualization, writing—original draft, M.S.; writing—review and editing, M.S., J.-H.K., and Y.S.B.

Funding

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea Government, Ministry of Science, ICT and Future Planning (MSIP) (No. NRF-2013R1A2A2A01069067 and No. NRF-2017M1A3A3A02016507).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Pons, J.L. Wearable Robots: Biomechatronic Exoskeletons, 1st ed.; John Wiley & Sons Ltd.: Chichester, UK, 2008. [Google Scholar]
  2. Young, A.J.; Ferris, D.P. State-of-the-Art and Future Directions for Lower Limb Robotic Exoskeletons. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 171–182. [Google Scholar] [CrossRef] [PubMed]
  3. Aliman, N.; Ramli, R.; Haris, S.M. Design and Development of Lower Limb Exoskeletons: A Survey. Rob. Auton. Syst. 2017, 95, 102–116. [Google Scholar] [CrossRef]
  4. De Looze, M.P.; Bosch, T.; Krause, F.; Stadler, K.S.; O’Sullivan, L.W. Exoskeletons for Industrial Application and Their Potential Effects on Physical Work Load. Ergonomics 2016, 59, 671–681. [Google Scholar] [CrossRef] [PubMed]
  5. Herr, H. Exoskeletons and Orthoses: Classification, Design Challenges and Future Directions. J. Neuroeng. Rehabil. 2009, 6, 1–9. [Google Scholar] [CrossRef] [PubMed]
  6. Gorgey, A.S. Robotic Exoskeletons: The Current Pros and Cons. World J. Orthop. 2018, 9, 112–119. [Google Scholar] [CrossRef] [PubMed]
  7. Meng, W.; Liu, Q.; Zhou, Z.; Ai, Q.; Sheng, B.; Xie, S.S. Recent Development of Mechanisms and Control Strategies for Robot-Assisted Lower Limb Rehabilitation. Mechatronics 2015, 31, 132–145. [Google Scholar] [CrossRef]
  8. Tucker, M.R.; Olivier, J.; Pagel, A.; Bleuler, H.; Bouri, M.; Lambercy, O.; Millán J Del, R.; Riener, R.; Vallery, H.; Gassert, R. Control Strategies for Active Lower Extremity Prosthetics and Orthotics: A Review. J. Neuroeng. Rehabil. 2015, 12, 1–29. [Google Scholar] [CrossRef]
  9. Du, L.; Zhang, F.; Liu, M.; Huang, H. Toward Design of an Environment-Aware Adaptive Locomotion-Mode-Recognition System. IEEE Trans. Biomed. Eng. 2012, 59, 2716–2725. [Google Scholar]
  10. CYBATHLON, Races and Disciplines, Powered Exoskeleton Race. Available online: https://cybathlon.ethz.ch/races-and-disciplines/powered-exoskeleton-race.html (accessed on 12 July 2019).
  11. Liu, M.; Wang, D.; Huang, H.H. Development of an Environment-Aware Locomotion Mode Recognition System for Powered Lower Limb Prostheses. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 24, 434–443. [Google Scholar] [CrossRef]
  12. Laschowski, B.; McNally, W.; Wong, A.; McPhee, J. Preliminary Design of an Environment Recognition System for Controlling Robotic Lower-Limb Prostheses and Exoskeletons. In Proceedings of the 2019 IEEE International Conference on Rehabilitation Robotics, Toronto, ON, Canada, 24–28 June 2019. [Google Scholar]
  13. Xu, F.; Lin, X.; Cheng, H.; Huang, R.; Chen, Q. Adaptive Stair-Ascending and Stair-Descending Strategies for Powered Lower Limb Exoskeleton. In Proceedings of the 2017 IEEE International Conference on Mechatronics and Automation, Takamatsu, Japan, 6–9 August 2017. [Google Scholar]
  14. Scandaroli, G.G.; Borges, G.A.; Ishihara, J.Y.; Terra, M.H.; da Rocha, A.F.; de Oliveira Nascimento, F.A. Estimation of Foot Orientation with Respect to Ground for an above Knee Robotic Prosthesis. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 11–15 October 2009. [Google Scholar]
  15. Li, Q.; Young, M.; Naing, V.; Donelan, J.M. Walking Speed and Slope Estimation Using Shank-Mounted Inertial Measurement Units. In Proceedings of the IEEE 11th International Conference on Rehabilitation Robotics, Kyoto, Japan, 23–26 June 2009. [Google Scholar]
  16. Lawson, B.E.; Varol, H.A.; Goldfarb, M. Standing Stability Enhancement with an Intelligent Powered Transfemoral Prosthesis. IEEE Trans. Biomed. Eng. 2011, 58, 2617–2624. [Google Scholar] [CrossRef]
  17. Zhu, A.; Li, Y.; Wu, Y.; Wu, M.; Zhang, X. Locomotion Mode Recognition Based on Foot Posture and Ground Reaction Force. In Proceedings of the 15th International Conference on Ubiquitous Robots, Hawaii, HI, USA, 27–30 June 2018. [Google Scholar]
  18. Kyeong, S.; Shin, W.; Yang, M.; Heo, U.; Feng, J.; Kim, J. Recognition of Walking Environments and Gait Period by Surface Electromyography. Front. Inf. Technol. Electron. Eng. 2019, 20, 342–352. [Google Scholar] [CrossRef]
  19. Au, S.; Berniker, M.; Herr, H. Powered Ankle-Foot Prosthesis to Assist Level-Ground and Stair-Descent Gaits. Neural Netw. 2008, 21, 654–666. [Google Scholar] [CrossRef] [PubMed]
  20. Ronsse, R.; Lenzi, T.; Vitiello, N.; Koopman, B.; van Asseldonk, E.; De Rossi, S.M.M.; van den Kieboom, J.; van der Kooij, H.; Carrozza, M.C.; Ijspeert, A.J. Oscillator-Based Assistance of Cyclical Movements: Model-Based and Model-Free Approaches. Med. Biol. Eng. Comput. 2011, 49, 1173–1185. [Google Scholar] [CrossRef] [PubMed]
  21. Schiele, A.; van der Helm, F.C.T. Kinematic Design to Improve Ergonomics in Human Machine Interaction. IEEE Trans. Neural Syst. Rehabil. Eng. 2006, 14, 456–469. [Google Scholar] [CrossRef]
  22. Wu, G.; Siegler, S.; Allard, P.; Kirtley, C.; Leardini, A.; Rosenbaum, D.; Whittle, M.; D’Lima, D.D.; Cristofolini, L.; Witte, H.; et al. ISB Recommendation on Definitions of Joint Coordinate System of Various Joints for the Reporting of Human Joint Motion—Part I: Ankle, Hip, and Spine. J. Biomech. 2002, 35, 543–548. [Google Scholar] [CrossRef]
  23. Bartenbach, V.; Wyss, D.; Seuret, D.; Riener, R. A Lower Limb Exoskeleton Research Platform to Investigate Human-Robot Interaction. In Proceedings of the 2015 IEEE International Conference on Rehabilitation Robotics, Singapore, 11–14 August 2015. [Google Scholar]
  24. Beil, J.; Marquardt, C.; Asfour, T. Self-Aligning Exoskeleton Hip Joint: Kinematic Design with Five Revolute, Three Prismatic and One Ball Joint. In Proceedings of the 2017 International Conference on Rehabilitation Robotics, London, UK, 17–20 July 2017. [Google Scholar]
  25. Rosen, J.; Perry, J.C.; Manning, N.; Burns, S.; Hannaford, B. The Human Arm Kinematics and Dynamics during Daily Activities-Toward a 7 DOF Upper Limb Powered Exoskeleton. In Proceedings of the 12th International Conference on Advanced Robotics, Seattle, WA, USA, 18–20 July 2005. [Google Scholar]
  26. Yang, W.; Yang, C.-J.; Wei, Q.X. Design of an Anthropomorphic Lower Extremity Exoskeleton with Compatible Joints. In Proceedings of the 2014 IEEE International Conference on Robotics and Biomimetics, Bali, Indonesia, 5–10 December 2014. [Google Scholar]
  27. Hill, P.F.; Vedi, V.; Williams, A.; Iwaki, H.; Pinskerova, V.; Freeman, M.A. Tibiofemoral Movement 2: The Loaded and Unloaded Living Knee Studied by MRI. J. Bone Jt. Surg. Br. 2000, 82, 1196–1198. [Google Scholar] [CrossRef]
  28. Komistek, R.D.; Dennis, D.A.; Mahfouz, M. In Vivo Fluoroscopic Analysis of the Normal Human Knee. Clin. Orthop. Relat. Res. 2003, 410, 69–81. [Google Scholar] [CrossRef]
  29. Blankevoort, L.; Huiskes, R.; de Lange, A. The Envelope of Passive Knee Joint Motion. J. Biomech. 1988, 21, 705–720. [Google Scholar] [CrossRef]
  30. Brockett, C.L.; Chapman, G.J. Biomechanics of the Ankle. Orthop. Trauma 2016, 30, 232–238. [Google Scholar] [CrossRef]
  31. Ren, L.; Jones, R.K.; Howard, D. Predictive Modelling of Human Walking over a Complete Gait Cycle. J. Biomech. 2007, 40, 1567–1574. [Google Scholar] [CrossRef]
  32. Srinivasan, S.; Raptis, I.A.; Westervelt, E.R. Low-Dimensional Sagittal Plane Model of Normal Human Walking. J. Biomech. Eng. 2008, 130, 051017. [Google Scholar] [CrossRef] [PubMed]
  33. The 7th Survey of the Korean Body Size. Available online: https://sizekorea.kr/page/report/1 (accessed on 13 July 2019).
  34. Cheng, P.; Oelmann, B. Joint-Angle Measurement Using Accelerometers and Gyroscopes-A Survey. IEEE Trans. Instrum. Meas. 2010, 59, 404–414. [Google Scholar] [CrossRef]
  35. Cantelli, L.; Muscato, G.; Nunnari, M.; Spina, D. A Joint-Angle Estimation Method for Industrial Manipulators Using Inertial Sensors. IEEE/ASME Trans. Mechatron. 2015, 20, 2486–2495. [Google Scholar] [CrossRef]
  36. Brennan, A.; Zhang, J.; Deluzio, K.; Li, Q. Quantification of Inertial Sensor-Based 3D Joint Angle Measurement Accuracy Using an Instrumented Gimbal. Gait Posture 2011, 34, 320–323. [Google Scholar] [CrossRef]
  37. Wang, Y.; Chen, W.; Tomizuka, M. Extended Kalman Filtering for Robot Joint Angle Estimation Using MEMS Inertial Sensors. IFAC Proc. Vol. 2013, 46, 406–413. [Google Scholar] [CrossRef]
  38. Kim, J. H.; Shim, M.; Ahn, D. H.; Son, B. J.; Kim, S. Y.; Kim, D. Y.; Baek, Y. S.; Cho, B. K. Design of a Knee Exoskeleton Using Foot Pressure and Knee Torque Sensors. Int. J. Adv. Robot. Syst. 2015, 12, 112. [Google Scholar] [CrossRef]
  39. Pataky, T.C.; Mu, T.; Bosch, K.; Rosenbaum, D.; Goulermas, J.Y. Gait Recognition: Highly Unique Dynamic Plantar Pressure Patterns among 104 Individuals. J. R. Soc. Interface 2011, 9, 790–800. [Google Scholar] [CrossRef]
  40. Hessert, M.J.; Vyas, M.; Leach, J.; Hu, K.; Lipsitz, L.A.; Novak, V. Foot Pressure Distribution during Walking in Young and Old Adults. BMC Geriatr. 2005, 5, 8. [Google Scholar] [CrossRef]
  41. Bartenbach, V.; Gort, M.; Riener, R. Concept and Design of a Modular Lower Limb Exoskeleton. In Proceedings of the 6th IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics, Singapore, 26–29 June 2016. [Google Scholar]
  42. dos Santos, W.M.; Nogueira, S.L.; de Oliveira, G.C.; Peña, G.G.; Siqueira, A.A.G. Design and Evaluation of a Modular Lower Limb Exoskeleton for Rehabilitation. In Proceedings of the 2017 International Conference on Rehabilitation Robotics, London, UK, 17–20 July 2017. [Google Scholar]
  43. Grosu, V.; Rodriguez-Guerrero, C.; Grosu, S.; Vanderborght, B.; Lefeber, D. Design of Smart Modular Variable Stiffness Actuators for Robotic-Assistive Devices. IEEE/ASME Trans. Mechatron. 2017, 22, 1777–1785. [Google Scholar] [CrossRef]
  44. Cho, S. H.; Park, J. M.; Kwon, O. Y. Gender Differences in Three Dimensional Gait Analysis Data from 98 Healthy Korean Adults. Clin. Biomech. 2004, 19, 145–152. [Google Scholar] [CrossRef]
  45. Giovacchini, F.; Vannetti, F.; Fantozzi, M.; Cempini, M.; Cortese, M.; Parri, A.; Yan, T.; Lefeber, D.; Vitiello, N. A Light-Weight Active Orthosis for Hip Movement Assistance. Rob. Auton. Syst. 2015, 73, 123–134. [Google Scholar] [CrossRef]
  46. Craig, J.J. Introduction to Robotics, Mechanics and Control, 3rd ed.; Pearson Education Inc.: Upper Saddle River, NJ, USA, 2005. [Google Scholar]
  47. Bovi, G.; Rabuffetti, M.; Mazzoleni, P.; Ferrarin, M. A Multiple-Task Gait Analysis Approach: Kinematic, Kinetic and EMG Reference Data for Healthy Young and Adult Subjects. Gait Posture 2011, 33, 6–13. [Google Scholar] [CrossRef] [PubMed]
  48. Huo, W.; Mohammed, S.; Amirat, Y.; Kong, K. Fast Gait Mode Detection and Assistive Torque Control of an Exoskeletal Robotic Orthosis for Walking Assistance. IEEE Trans. Robot. 2018, 34, 1035–1052. [Google Scholar] [CrossRef]
  49. Wu, X.; Wang, Y.; Pottie, G. A Non-ZUPT Gait Reconstruction Method for Ankle Sensors. In Proceedings of the 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Chicago, IL, USA, 26–30 August 2014. [Google Scholar]
  50. Lee, M.S.; Ju, H.; Song, J.W.; Park, C.G. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking. Sensors 2015, 15, 28129–28153. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. Subject wearing the lower limb exoskeleton and main configuration of the system.
Figure 2. Hip joint structure for hip flexion/extension, abduction/adduction, and medial/lateral rotation.
Figure 3. Schematic of the remote-center rotation of the hip medial/lateral joint in the transversal view.
Figure 4. Configuration of (a) the joint for knee flexion/extension and (b) the joint for ankle dorsi/plantar flexion and inversion/eversion.
Figure 5. Foot configuration of the exoskeleton. The forefoot can bend freely to ensure smooth ground contact.
Figure 6. Schematic of the control system.
Figure 7. Insole sensor for measuring the plantar pressure distribution.
Figure 8. Custom-made sensor for measuring the interaction forces on the thigh and shank segments.
Figure 9. Calibration curve of the FSR in the range of 0 to 20 kgf. The ADC values divided by 1000 were fitted with a second-order polynomial. The coefficients of this curve, B1 and B2, are 6.97 and −0.47, respectively.
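The calibration in Figure 9 can be sketched as a small conversion routine. This is a hypothetical reconstruction: the paper reports only the fitted coefficients B1 and B2 and the 1/1000 ADC scaling, so the assignment of B1 to the quadratic term and B2 to the linear term (with zero intercept) is an assumption, as is the function name `adc_to_force`.

```python
# Hypothetical sketch of the FSR calibration in Figure 9. The ADC reading is
# divided by 1000 and mapped to force (kgf) with a second-order polynomial;
# which coefficient multiplies which term is an assumption, not stated in the
# figure caption.
B1, B2 = 6.97, -0.47  # fitted coefficients reported in Figure 9

def adc_to_force(adc_value):
    """Convert a raw ADC reading to an estimated force in kgf."""
    x = adc_value / 1000.0      # normalized ADC value used for the fit
    return B1 * x**2 + B2 * x   # second-order polynomial, zero intercept
```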
Figure 10. (a) Composition of the actuation module and (b) its assembly with the hip joint of the exoskeleton.
Figure 11. Comparison of (a) hip and (b) knee torque–speed characteristics of the actuation module and the human joint during walking. The angular speed of each joint was calculated based on a stride time of 1.1 s.
Figure 12. Configuration of the five modes of the exoskeleton according to the assembly of the actuation modules. A blue circle indicates that an actuation module is assembled to the joint, and a red cross indicates that it is not. The modes can be used to assist (a) the hip and knee joints of both legs; (b) the hip or knee joints of both legs; (c) the hip and knee joints of one leg; and (d) a single lower-limb joint. In (e) mode 0, the exoskeleton cannot assist the wearer's joints with actuation modules, but it can still measure the wearer's motion with the embedded sensors.
Figure 13. Kinematic model of the exoskeleton and its frame attachment.
Figure 14. Joint angle estimation result of θ5 compared with the angle measured by an encoder.
Figure 15. Foot contact phase determination result using the CoP and the angular velocity of the foot (ωF) with respect to frame {W}.
Figure 16. Calculation of the pivot vector in frame {W} according to the contact phase of the feet. (a) At the first contact of a foot with the ground, the CoP of the supporting leg is set as the pivot. (b) As the CoP moves during the foot-flat phase, the pivot is continuously updated to follow the variation of the CoP. (c) When the supporting leg changes to the other leg, the position vector between the two feet is used to update the pivot.
Figure 17. Modeling of the walking ground as a plane using the pivot points collected at each step.
Figure 18. Updating process of the calculated plane coefficients while the exoskeleton is in contact with the ground. The coefficient cN is a constant equal to −1.
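The plane fit of Figures 17 and 18 can be sketched as an ordinary least-squares problem. With the constant coefficient cN fixed to −1, the plane a·x + b·y + c·z + cN = 0 reduces to a·x + b·y + c·z = 1, so each pivot point supplies one row of a linear system. This is a minimal sketch of that idea, not the paper's implementation; `fit_ground_plane` is a hypothetical helper name.

```python
import numpy as np

def fit_ground_plane(pivots):
    """Least-squares plane through collected pivot points (cf. Figure 17).

    The plane is parameterized as a*x + b*y + c*z + cN = 0 with cN fixed
    to -1 (Figure 18), so every pivot point contributes one row of the
    linear system  [x y z] @ [a b c]^T = 1.
    """
    P = np.asarray(pivots, dtype=float)    # N x 3 pivot positions in {W}
    rhs = np.ones(len(P))                  # right-hand side from cN = -1
    coeffs, *_ = np.linalg.lstsq(P, rhs, rcond=None)
    return coeffs                          # normal direction (a, b, c)

# Points lying on the horizontal plane z = 1 give a normal along the z-axis.
n = fit_ground_plane([(0, 0, 1), (1, 0, 1), (0, 1, 1), (1, 1, 1)])
```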
Figure 19. Result of contact surface estimation for both feet during walking.
Figure 20. Five types of ambulation tests ((a) level walk (LW), (b) stair ascent (SA), (c) stair descent (SD), (d) ramp ascent (RA), and (e) ramp descent (RD)) on three different terrains for performance validation of the proposed method. Ramp ascent and descent were conducted on three ramps with different slopes to validate the performance of the slope estimation.
Figure 21. Terrains used for the tests: (a) level ground, (b) stairs, (c) ramp. The shoes in the figure indicate the start and end positions of the subject. The height of each step (hstep) of the stairs is 165 mm.
Figure 22. Visualization of position data of the exoskeleton in frame {W} for (a) level walk, (b) stair ascent, (c) stair descent, (d) ramp ascent, and (e) ramp descent.
Figure 23. Average (a) slope and (b) elevation estimation results of all tests for each terrain. Absolute values were used for the stair and ramp descent cases.
Figure 24. Normalized terrain slope and elevation results in the feature space. The maximum values of θN and hN were used for normalization. The RA and RD samples for different slopes are colored differently (light gray: ramp 1, medium gray: ramp 2, dark gray: ramp 3).
Figure 25. Test result for the motion of the exoskeleton in a building environment. (A video showing the analysis results of this test can be found in Supplementary Materials Video S2.)
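A terrain slope such as the one reported in Figure 23 can be obtained from a fitted plane as the angle between the plane normal and the vertical axis. The sketch below assumes the z-axis of frame {W} is vertical; the paper's exact formulation may differ, and `terrain_slope_deg` is a hypothetical helper name.

```python
import math

def terrain_slope_deg(normal):
    """Slope of a fitted ground plane, taken as the angle between its
    normal vector and the vertical axis of frame {W}. Assumes the z-axis
    of {W} points up; a sketch of the idea behind Figure 23 only."""
    a, b, c = normal
    norm = math.sqrt(a * a + b * b + c * c)
    return math.degrees(math.acos(abs(c) / norm))

# A plane whose height rises one-for-one with x (normal (-1, 0, 1))
# has a 45 degree slope.
slope = terrain_slope_deg((-1.0, 0.0, 1.0))
```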
Table 1. Joint ranges of motion of a human and the exoskeleton.

| Joint Motion | ROM (°) | In Walking (°) | Exoskeleton (°) |
|---|---|---|---|
| Hip flexion/extension | 120/30 | 36/6 | 110/25 |
| Hip adduction/abduction | 35/40 | 7/6 | 20/30 |
| Hip medial/lateral rotation | 30/60 | 10/13 | 20/20 |
| Knee extension/flexion | 10/140 | 0/64 | 10/110 |
| Ankle dorsi/plantar flexion | 20/50 | 11/19 | 20/20 |
| Ankle inversion/eversion | 35/20 | 5/7 | 10/10 |
Table 2. D-H parameters for the kinematic analysis. The variables are depicted in Figure 2, Figure 3 and Figure 4. The link parameters α_{i−1}, a_{i−1}, and d_i are identical for the left and right legs; only the joint variables θ_i differ.

| Frame i | α_{i−1} (rad) | a_{i−1} (mm) | d_i (mm) | θ_i, Left (rad) | θ_i, Right (rad) |
|---|---|---|---|---|---|
| 1 | 0 | L_hip,1 | L_hip,6 | θ_{1,l} + π/2 | θ_{1,r} + π/2 |
| 2 | π/2 | L_hip,3 | L_hip,2 | π/2 | π/2 |
| 3 | π/2 | 0 | L_hip,4 | θ_{3,l} + π/2 | θ_{3,r} + π/2 |
| 4 | π/2 | 0 | 0 | −π/2 | −π/2 |
| 5 | −π/2 | L_hip,5 | 0 | θ_{5,l} | θ_{5,r} |
| 6 | 0 | L_thigh | 0 | θ_{6,l} | θ_{6,r} |
| 7 | 0 | L_shank | L_ankle,2 | θ_{7,l} | θ_{7,r} |
| 8 | π/2 | 0 | L_ankle,1 | θ_{8,l} | θ_{8,r} |
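Because Table 2 indexes the link parameters as α_{i−1} and a_{i−1}, it follows the modified Denavit-Hartenberg convention of Craig. The per-link transform can be sketched as below; the function name is illustrative, and the exoskeleton's forward kinematics would chain these transforms for frames 1 through 8 with the table's values.

```python
import math

def dh_transform(alpha_prev, a_prev, d, theta):
    """Homogeneous transform from frame {i-1} to frame {i} using the
    modified Denavit-Hartenberg convention (parameters as in Table 2:
    alpha_{i-1}, a_{i-1}, d_i, theta_i)."""
    ca, sa = math.cos(alpha_prev), math.sin(alpha_prev)
    ct, st = math.cos(theta), math.sin(theta)
    return [
        [ct,      -st,      0.0,  a_prev],
        [st * ca,  ct * ca, -sa,  -sa * d],
        [st * sa,  ct * sa,  ca,   ca * d],
        [0.0,      0.0,      0.0,  1.0],
    ]
```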
Table 3. Demographic characteristics of the subjects.

| Subject | Gender | Age (y) | Height (m) | Weight (kg) | BMI (kg/m²) | Shoe Size (mm) |
|---|---|---|---|---|---|---|
| S1 | Male | 33 | 1.70 | 68 | 23.5 | 260 |
| S2 | Male | 35 | 1.68 | 74 | 26.2 | 260 |
| S3 | Male | 25 | 1.76 | 67 | 21.6 | 265 |
| S4 | Male | 29 | 1.80 | 74 | 22.8 | 270 |
| S5 | Male | 39 | 1.76 | 88 | 28.4 | 275 |
| S6 | Female | 36 | 1.61 | 60 | 23.1 | 240 |
| S7 | Female | 43 | 1.65 | 70 | 25.7 | 250 |
Table 4. Dimensions of the terrains for evaluating the performance of the developed method.

| Case | D_case (mm) | H_case (mm) | θ_case (°) |
|---|---|---|---|
| Level | 10,000 | 0 | 0 |
| Stair | 1900 | 990 | 0 |
| Ramp 1 | 12,770 | 1020 | 5 |
| Ramp 2 | 2460 | 230 | 6.37 |
| Ramp 3 | 3600 | 620 | 14 |
Table 5. Position error per step and terrain feature estimation error for each test. D_RMSE, H_RMSE, and Y_RMSE are position errors per step; θ_RMSE and h_RMSE are terrain feature estimation errors. The total values (S_total), calculated using the data of all subjects in each case, are shown in bold.

| Case | Subject | D_RMSE (mm) | H_RMSE (mm) | Y_RMSE (mm) | θ_RMSE (°) | h_RMSE (mm) |
|---|---|---|---|---|---|---|
| LW | S1 | 6.15 | 4.18 | 5.93 | 1.30 | 8.90 |
| | S2 | 4.83 | 4.02 | 5.35 | 0.71 | 6.36 |
| | S3 | 2.45 | 2.78 | 3.41 | 0.66 | 11.43 |
| | S4 | 2.11 | 6.67 | 6.77 | 1.10 | 8.01 |
| | S5 | 1.94 | 1.93 | 4.51 | 0.84 | 8.33 |
| | S6 | 10.89 | 5.57 | 3.82 | 0.76 | 8.81 |
| | S7 | 4.00 | 6.14 | 5.46 | 0.69 | 7.38 |
| | **S_total** | **5.38** | **4.70** | **5.11** | **0.86** | **8.60** |
| SA | S1 | 4.94 | 11.23 | 12.22 | 1.20 | 13.06 |
| | S2 | 8.48 | 1.90 | 12.75 | 1.75 | 9.62 |
| | S3 | 6.04 | 8.98 | 18.24 | 1.23 | 12.38 |
| | S4 | 17.70 | 8.10 | 16.83 | 1.51 | 10.65 |
| | S5 | 7.98 | 8.81 | 18.59 | 1.79 | 12.54 |
| | S6 | 19.04 | 4.77 | 11.45 | 1.52 | 7.93 |
| | S7 | 10.89 | 3.44 | 10.41 | 1.32 | 10.08 |
| | **S_total** | **12.05** | **7.42** | **14.63** | **1.48** | **11.00** |
| SD | S1 | 7.46 | 24.12 | 17.92 | 1.66 | 26.28 |
| | S2 | 5.18 | 14.02 | 15.75 | 1.95 | 21.72 |
| | S3 | 8.79 | 20.59 | 10.62 | 1.41 | 27.09 |
| | S4 | 12.53 | 26.02 | 9.26 | 1.32 | 33.04 |
| | S5 | 4.62 | 19.90 | 8.70 | 2.64 | 26.51 |
| | S6 | 7.50 | 24.01 | 13.36 | 1.59 | 32.49 |
| | S7 | 12.14 | 19.48 | 15.98 | 1.26 | 27.57 |
| | **S_total** | **8.89** | **21.55** | **13.51** | **1.74** | **28.12** |
| RA1 | S1 | 11.78 | 8.81 | 14.80 | 1.09 | - |
| | S2 | 15.84 | 13.89 | 7.82 | 0.87 | - |
| | S3 | 16.13 | 6.06 | 11.25 | 0.83 | - |
| | S4 | 12.47 | 2.73 | 9.19 | 1.03 | - |
| | S5 | 8.40 | 9.78 | 6.66 | 1.04 | - |
| | S6 | 11.01 | 14.26 | 9.82 | 0.91 | - |
| | S7 | 4.39 | 2.88 | 9.33 | 0.91 | - |
| | **S_total** | **11.96** | **9.49** | **9.94** | **0.94** | - |
| RD1 | S1 | 7.97 | 9.04 | 21.74 | 1.13 | - |
| | S2 | 13.98 | 7.73 | 9.15 | 1.14 | - |
| | S3 | 10.25 | 2.35 | 8.69 | 0.63 | - |
| | S4 | 9.59 | 2.74 | 10.45 | 1.30 | - |
| | S5 | 9.40 | 3.55 | 5.91 | 1.13 | - |
| | S6 | 11.47 | 5.40 | 8.23 | 1.06 | - |
| | S7 | 8.66 | 5.02 | 7.91 | 1.31 | - |
| | **S_total** | **10.37** | **5.50** | **10.99** | **1.11** | - |
| RA2 | S1 | 4.47 | 14.84 | 5.21 | 1.05 | - |
| | S2 | 2.66 | 10.46 | 5.18 | 1.33 | - |
| | S3 | 2.45 | 5.90 | 3.07 | 1.15 | - |
| | S4 | 13.81 | 6.87 | 4.62 | 1.55 | - |
| | S5 | 13.06 | 3.84 | 6.40 | 1.32 | - |
| | S6 | 10.21 | 17.86 | 4.35 | 1.14 | - |
| | S7 | 11.78 | 3.31 | 7.83 | 1.19 | - |
| | **S_total** | **9.73** | **10.49** | **5.41** | **1.25** | - |
| RD2 | S1 | 5.44 | 15.05 | 3.82 | 1.60 | - |
| | S2 | 7.28 | 10.71 | 2.99 | 1.54 | - |
| | S3 | 11.52 | 6.14 | 4.38 | 1.40 | - |
| | S4 | 11.09 | 1.84 | 9.16 | 1.60 | - |
| | S5 | 14.14 | 5.70 | 6.31 | 1.49 | - |
| | S6 | 7.98 | 10.36 | 8.74 | 1.37 | - |
| | S7 | 14.61 | 4.47 | 3.80 | 1.42 | - |
| | **S_total** | **11.03** | **8.37** | **6.19** | **1.50** | - |
| RA3 | S1 | 8.29 | 6.57 | 7.48 | 1.70 | - |
| | S2 | 17.91 | 12.46 | 9.76 | 0.85 | - |
| | S3 | 14.18 | 11.28 | 6.13 | 1.23 | - |
| | S4 | 15.32 | 4.33 | 14.95 | 0.84 | - |
| | S5 | 14.35 | 5.85 | 6.25 | 0.99 | - |
| | S6 | 6.53 | 13.76 | 5.94 | 0.62 | - |
| | S7 | 8.38 | 9.97 | 8.37 | 0.62 | - |
| | **S_total** | **12.74** | **9.77** | **8.99** | **0.99** | - |
| RD3 | S1 | 10.38 | 4.94 | 8.37 | 0.78 | - |
| | S2 | 8.86 | 3.82 | 7.01 | 1.39 | - |
| | S3 | 9.07 | 5.06 | 6.02 | 0.93 | - |
| | S4 | 9.93 | 6.81 | 11.92 | 0.59 | - |
| | S5 | 11.42 | 5.53 | 5.73 | 0.60 | - |
| | S6 | 8.44 | 7.40 | 6.89 | 0.95 | - |
| | S7 | 13.59 | 14.25 | 9.46 | 0.75 | - |
| | **S_total** | **10.43** | **7.77** | **8.17** | **0.89** | - |
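The errors in Table 5 are root-mean-square errors. As a reminder of how such a per-step figure is computed, here is a minimal sketch; the function name and the example values are illustrative only.

```python
import math

def rmse(estimates, references):
    """Root-mean-square error between estimated and reference values,
    the metric used for the per-step position errors and terrain feature
    errors reported in Table 5."""
    assert len(estimates) == len(references)
    squared = [(e - r) ** 2 for e, r in zip(estimates, references)]
    return math.sqrt(sum(squared) / len(squared))
```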
Table 6. Confusion matrix and classification performance of the results using SVM. Rows are actual classes and columns are predicted classes.

| Actual \ Predicted | LW | SA | SD | RA | RD | Precision | Recall | F1-score |
|---|---|---|---|---|---|---|---|---|
| LW | 1770 | 0 | 0 | 1 | 24 | 0.990 | 0.986 | 0.988 |
| SA | 0 | 426 | 0 | 0 | 0 | 1.000 | 1.000 | 1.000 |
| SD | 0 | 0 | 419 | 0 | 1 | 1.000 | 0.998 | 0.999 |
| RA | 1 | 0 | 0 | 1879 | 0 | 0.999 | 0.999 | 0.999 |
| RD | 17 | 0 | 0 | 0 | 1972 | 0.987 | 0.991 | 0.989 |
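The precision, recall, and F1-score columns of Table 6 follow directly from the confusion matrix counts. The sketch below recomputes them for the LW class from the table's own numbers; the function name is illustrative.

```python
def prf_from_confusion(matrix, class_index):
    """Precision, recall, and F1-score for one class of a confusion
    matrix laid out as rows = actual, columns = predicted (as in Table 6)."""
    tp = matrix[class_index][class_index]
    fp = sum(matrix[r][class_index] for r in range(len(matrix))) - tp
    fn = sum(matrix[class_index]) - tp
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Confusion matrix from Table 6 (order: LW, SA, SD, RA, RD).
table6 = [
    [1770,   0,   0,    1,   24],
    [   0, 426,   0,    0,    0],
    [   0,   0, 419,    0,    1],
    [   1,   0,   0, 1879,    0],
    [  17,   0,   0,    0, 1972],
]
p, r, f1 = prf_from_confusion(table6, 0)  # metrics for the LW class
```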
Table 7. Classification performance (F1-score) for each subject.

| Case | S1 | S2 | S3 | S4 | S5 | S6 | S7 |
|---|---|---|---|---|---|---|---|
| LW | 0.953 | 0.984 | 0.985 | 0.995 | 0.996 | 1.000 | 0.998 |
| SA | 1.000 | 0.981 | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 |
| SD | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 |
| RA | 1.000 | 0.995 | 1.000 | 0.998 | 1.000 | 1.000 | 1.000 |
| RD | 0.962 | 0.983 | 0.985 | 0.998 | 0.997 | 1.000 | 0.998 |

Shim, M.; Han, J.I.; Choi, H.S.; Ha, S.M.; Kim, J.-H.; Baek, Y.S. Terrain Feature Estimation Method for a Lower Limb Exoskeleton Using Kinematic Analysis and Center of Pressure. Sensors 2019, 19, 4418. https://doi.org/10.3390/s19204418