Article

A User Interface System with See-Through Display for WalkON Suit: A Powered Exoskeleton for Complete Paraplegics

1 Angel Robotics Co. Ltd., 3 Seogangdae-gil, Mapo-gu, Seoul 04111, Korea
2 Department of Mechanical Engineering, Sogang University, 35 Baekbeom-ro, Mapo-gu, Seoul 04107, Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2018, 8(11), 2287; https://doi.org/10.3390/app8112287
Submission received: 18 October 2018 / Revised: 13 November 2018 / Accepted: 14 November 2018 / Published: 19 November 2018
(This article belongs to the Section Applied Biosciences and Bioengineering)

Abstract

In the development of powered exoskeletons for people paralyzed by complete spinal cord injury, a convenient and reliable user interface (UI) is a mandatory requirement. With most such robots, the user (i.e., a complete paraplegic wearing a powered exoskeleton) cannot avoid using crutches for safety reasons. Because both the sensory and motor functions of the paralyzed legs are impaired, the user must frequently check the foot positions to ensure proper ground contact. The UI of a powered exoskeleton should therefore be easy to operate while holding crutches and should let the user monitor the operation state without obstructing the line of sight. In this paper, the UI system of the WalkON Suit, a powered exoskeleton for complete paraplegics, is introduced. The proposed UI system consists of see-through display (STD) glasses and a display and tact switches installed on a crutch, with which the user controls the motion modes and the walking speed. Moreover, the user can monitor the operation state through the STD glasses with the head positioned up. The proposed UI system is verified by experimental results in this paper. It was applied to the WalkON Suit for the torch relay of the 2018 Pyeongchang Paralympics.

1. Introduction

Walking is not only one of the most fundamental activities that enables a person to be an active member of society, but also key to maintaining one's health [1]. For people who have completely lost their walking ability due to spinal cord injury, it is therefore essential to stand up and perform walking motions frequently with the help of assistive devices (e.g., a harness system or a robotic rehabilitation treatment system [2]). In this respect, powered exoskeletons offer complete paraplegics the possibility of walking on real ground, which is meaningful both for their rehabilitation and for their mobility.
Powered exoskeletons have been developed by many researchers in recent years, and several robots have been commercialized [3]. Among the many applications of wearable robots, those assisting people with complete paraplegia may be the most advanced. Wheelchairs can improve the mobility of patients with complete paralysis, but they cannot provide the standing posture and bipedal walking that are important for health. Many studies have shown that bipedal walking training can increase life satisfaction [1,2,4,5]. ReWalk, a commercial robot by ReWalk Robotics [6], is being distributed to rehabilitation hospitals around the world. In addition, Ekso [7] from Ekso Bionics, Indego [8] from Parker Hannifin Corporation and Rex [9] from Rex Bionics are commercially available robots for disabled and paralyzed people. Many other robotic exoskeletons are being developed and commercialized [10,11].
The mechanism and control of powered exoskeletons have been studied mainly in terms of efficient force transmission for generating a natural gait pattern, and intensive study has brought these aspects close to maturity. To further improve the usability of powered exoskeletons for people with complete paralysis, it is important to accomplish not only the transmission of the actuation force, but also the immediate recognition of the user's intention. At the same time, the user must be informed immediately if any event (e.g., a mode change, fault detection or battery discharge) occurs in the robot system. Therefore, it is mandatory to design a user-friendly, comfortable, convenient, yet reliable user interface (UI).
Various intention detection methods have been studied. One promising method is to measure a physical contact force, such as the ground reaction force (GRF) between the ground and the user's foot [12,13]. The GRF can be a real-time indicator of whether or not the feet are on the ground, but it cannot detect the intention in advance. In general, when the measured GRF falls below a certain threshold value, the foot is considered to have left the ground and a swing phase is detected. For complete paraplegics, however, it is impossible to initiate the swing phase in this way due to the complete loss of motor function in the lower limbs. Human motion intent can also be detected using biomedical signals such as electromyography (EMG) of the leg muscles [14,15,16,17,18,19]. The EMG method is not suitable for patients who have completely lost their motor and sensory functions, because they can neither produce nor sense the actual motion or contraction of the muscles.
The intention of movement is transferred by electrical signals originating in the brain's neurons, and control methods using a brain-machine interface (BMI) have recently been widely studied. Invasive BMI methods have been used in monkeys in many studies to measure and analyze the brain signals of body movements [20,21]. However, invasive BMI methods are too risky to be applied in daily life [10,22]. As a non-invasive alternative, scalp electroencephalography (EEG) has also been utilized [10]. For example, many applications have used EEG-based BMIs to detect intention and control assistive devices such as orthotic devices and wheelchairs [11,23,24,25]. Donati et al. applied an EEG-based BMI to their exoskeleton robot and gait training platform to confirm the effectiveness of long-term training [26]. Although BMI intention recognition is being studied intensively, it is not yet reliable enough for powered exoskeletons, which require 100% detection reliability; malfunctions of the exoskeleton due to incorrect detection may add a significant risk to user safety [10]. Therefore, these methods are not yet suitable for the UI of a mobile robot for complete paraplegics; a new UI system that lets the user interact with the robot reliably and with minimal distraction is necessary.
In this paper, a new UI for the WalkON Suit, a powered exoskeleton for complete paraplegics developed by Angel Robotics Co. Ltd. (formerly SG Robotics Co. Ltd.), is introduced. The new UI consists of see-through display (STD) glasses, as well as an OLED display and two tact switches installed on the handle of a crutch. STD glasses are usually employed for augmented-reality applications [27,28], but here they serve as a head-up display that transfers necessary information from the robot to the user. The wearer of the WalkON Suit can check the overall operation state, such as the foot displacement, the trunk inclination, the current gait mode, the elapsed time and the total step count, through the STD glasses as well as through the OLED display on the crutch. The proposed UI system improves the usability of the WalkON Suit: the wearer's field of view is widened and the training period with the robot is shortened, resulting in improved overall safety and walking speed.
The WalkON Suit with the proposed UI system was used by a complete paraplegic to carry the torch of the Pyeongchang Paralympics 2018, as shown in Figure 1. Beyond the new UI, the WalkON Suit unveiled at the Pyeongchang Paralympics differed in many respects from the previous version introduced at the Cybathlon 2016 [29]. In particular, the design of the WalkON Suit was improved aesthetically to suit its role in the torch relay. In this paper, the proposed UI system of the WalkON Suit and its verification with a human subject are introduced.
This paper is organized as follows. Section 2 briefly introduces the mechanical and electrical configuration of the WalkON Suit. Section 3 explains the motivation and the realization of the proposed UI system. The experimental results are provided and discussed in Section 4. Section 5 concludes this paper and proposes future work.

2. Configuration of the WalkON Suit

The WalkON Suit has various unique features. For example, it utilizes a bi-articular actuation mechanism, which actuates the hip and knee joints simultaneously and improves the overall energy efficiency significantly [30]. The brace of the WalkON Suit is manually fabricated by medical orthotists and customized for each user to guarantee the best fit between the human body and the robot.
The WalkON Suit was first introduced to the public at the Powered Exoskeleton Race of the Cybathlon 2016 [29], where the pilot of the WalkON Suit, Mr. Byeongwook Kim, won the bronze medal. The control of the WalkON Suit, however, was not simple enough for a paralyzed user, because, like other powered exoskeletons, it required the user to check the operation state continuously on a small LCD display.
An improved version of the WalkON Suit was introduced in the torch relay of the Pyeongchang Paralympics 2018. The Cybathlon and Pyeongchang Paralympics robots share the same actuation mechanism, basic UI system, crutches, and so on, as shown in Figure 2. In both systems, four synchronized 100-W BLDC motors actuate the four joints. The hip joints are driven directly by the motors, and the knee joints are bi-articularly actuated through a linkage mechanism [29]. The bi-articular mechanism is effective for controlling the position of the end-effector (i.e., the foot of the robotic leg), because it resembles the human musculoskeletal system. In addition, all the hip and knee joint motors are mounted in the backpack at the trunk; therefore, unlike in a conventional mono-articular mechanism, the knee motors impose no unnecessary burden on the legs [31].
The Cybathlon version of the WalkON Suit introduced in 2016 used a National Instruments CompactRIO as the primary control processor, while the second prototype introduced in this paper uses a customized printed circuit board and a system on module (SoM) manufactured by the same company. With this replacement, the weight of the processor was reduced from 2250 g to 77 g.
In addition, the UI system of the WalkON Suit has been improved compared with that of 2016. The crutches of the WalkON Suit are specially designed as the primary user interface; a 1.3-inch OLED display and two tact switches are installed on the handle of the crutch. Moreover, the information the paralyzed user needs to control the WalkON Suit is provided through see-through display (STD) glasses specially developed for the WalkON Suit. The details of the proposed UI are described in the following section.

3. Design of UI System for the WalkON Suit

A person paralyzed by complete spinal cord injury typically suffers impairment of both the sensory and motor functions of the lower limbs. Therefore, the sensors and the motion analysis system of a powered exoskeleton should be able to monitor the overall status of the human-robot system, and this status should be reported to the user immediately and continuously.
People without gait disorders can walk without paying attention to their foot displacement or balance. As normal walking is a habitual task, an able-bodied person can do other things, such as talking or looking around, while walking. In the same sense, the UI of a powered exoskeleton should not impose much distraction on the user while operating the robot. The UI system of the WalkON Suit was developed with this in mind.

3.1. Posture Analysis with the Conventional UI

Since the sensory function, as well as the motor function, of complete paraplegics is impaired, they are not able to perceive their own posture. Thus, they need to observe their leg movements frequently to maintain balance. For example, they need to know which foot will move (i.e., swing) forward in order to make the appropriate trunk movement for balancing. This remains necessary even after they are fully adapted to the powered exoskeleton.
Observing their own lower limbs, however, is not easy for complete paraplegics wearing a powered exoskeleton. Since the flexion range of the neck is limited, they must lean slightly forward to observe the leg movements, as shown in Figure 3a. Moreover, they must bend the neck even more when staring at the screen to operate the robot, as shown in Figure 3b, because the display is usually installed on the crutches or at the chest. It should be noted that leaning forward not only makes the overall posture unstable, but also imposes a large burden on the shoulder joints.
Figure 4 shows the schematic analysis of the posture of a person wearing the WalkON Suit. As the posture is quasi-static when staring at the screen, the forces acting on the crutches can be calculated by the static moment balance equation with respect to the foot (i.e., the point B in Figure 4):
$$2\,l\sin(\theta_t)\,f_{crutch} - \alpha\,l\sin(\theta_t)\,m_t g - \left[\,l\sin(\theta_t) + l_h\sin(\theta_t+\theta_n)\,\right] m_h g = 0, \qquad (1)$$
or:
$$f_{crutch} = \frac{\alpha\,l\sin(\theta_t)\,m_t g + \left[\,l\sin(\theta_t) + l_h\sin(\theta_t+\theta_n)\,\right] m_h g}{2\,l\sin(\theta_t)}, \qquad (2)$$
where $l$ is the distance between the foot and the shoulder joint, $f_{crutch}$ and $f_{GRF}$ are the forces acting on the crutches and on the feet, respectively, $\theta_t$ and $\theta_n$ are the angles of the trunk inclination and the neck flexion, respectively, $m_h$ and $m_t$ are the masses of the head and of the overall body excluding the head, respectively, and $l_h$ is the distance from the shoulder joint to the center of mass of the head, as shown in Figure 4. $\alpha$ is a ratio defining the position of the center of mass of the whole body, including the human and the robot. For the sake of simplicity, it was assumed that all the forces act in the vertical direction and that the frontal-plane posture is well balanced. In addition, it was assumed that $l$ equals the distance from the shoulder joint to the tips of the crutches, which is reasonable in practice because the length of the crutches is determined in this manner.
From a simple free-body-diagram analysis, it is easily found that the force acting on the crutches, $f_{crutch}$, has the same magnitude as the force acting on the shoulder joints. It should be noted in (2) that $f_{crutch}$ increases as $\theta_t$ and $\theta_n$ increase. As the shoulder joints are vulnerable to repetitive large forces, the magnitude of $f_{crutch}$ must be minimized, which requires minimizing $\theta_t$ and $\theta_n$. This is, however, not simple in practice, because of the limitations of the UI shown in Figure 3.
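To make the effect of the neck flexion concrete, the following minimal Python sketch evaluates Equation (2). All numeric parameters (masses, segment lengths, $\alpha$) are illustrative assumptions, not measurements from this study.

```python
import math

def crutch_force(theta_t, theta_n, m_t=95.0, m_h=5.0,
                 l=1.4, l_h=0.25, alpha=0.5, g=9.81):
    """Vertical force on the crutches (N) from Equation (2).

    theta_t, theta_n: trunk inclination and neck flexion (rad)
    m_t, m_h: masses of body plus robot (without head) and head (kg)
    l, l_h: foot-to-shoulder and shoulder-to-head distances (m)
    alpha: ratio locating the center of mass of human plus robot
    All defaults are illustrative assumptions.
    """
    numerator = (alpha * l * math.sin(theta_t) * m_t * g
                 + (l * math.sin(theta_t)
                    + l_h * math.sin(theta_t + theta_n)) * m_h * g)
    return numerator / (2.0 * l * math.sin(theta_t))

# The crutch (and shoulder) load grows as the neck bends further forward:
for neck_deg in (0, 10, 20, 30):
    f = crutch_force(math.radians(10.0), math.radians(neck_deg))
    print(f"theta_n = {neck_deg:2d} deg -> f_crutch = {f:6.1f} N")
```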

3.2. Configuration of the Proposed UI System

Figure 5 shows the overall configuration of the proposed UI system for the WalkON Suit. The user transfers a command (i.e., the intention to initiate or terminate a leg motion) through two tact switches installed on the crutch (see Figure 5a). The user then receives the information needed to operate the robot and check its overall status through an OLED display installed on the crutch (see Figure 5b) and see-through display (STD) glasses (see Figure 5c). Both displays show basically the same information, but the STD glasses show more detail.

3.3. Interface for the Persons in the Vicinity

For safe operation, it is important to immediately notify persons in the vicinity of a powered exoskeleton when any hazardous event is detected or expected. Like all other powered exoskeletons, the WalkON Suit is not able to stand up from the ground by itself, and the user (i.e., a complete paraplegic) cannot restore a standing posture after falling. Therefore, any hazard of falling must be prevented. To monitor the risk of falling, the WalkON Suit has a risk monitoring algorithm [29] that continuously checks the status of the battery and actuators, the trunk inclination, and so on. To notify persons in the vicinity of the WalkON Suit of any potential risk detected by the algorithm, a multi-color LED and a buzzer are utilized, as shown in Figure 5d.

3.4. Input Interface for the User of the WalkON Suit

Since advanced methods such as voice recognition and EEG analysis are not yet fully reliable, a physical interface such as tact switches may still be the most intuitive and reliable means of input. Complete paraplegics wearing powered exoskeletons must hold the crutches to stabilize and maintain their body balance. Therefore, the crutches of the WalkON Suit are designed not only to meet ergonomic requirements, but also to let the user reach and operate the tact switches easily while holding the crutches.
The number of tact switches installed on the crutch handle should be determined considering both the complexity of operation and the possibility of maloperation. (1) If the number of tact switches is smaller than the number of motion modes (the different leg motions that a powered exoskeleton can realize according to the user's command), the user must select the desired mode by pressing the button(s) an appropriate number of times, which may cause operation errors due to user mistakes. (2) If the number of tact switches matches the number of motion modes, the operation strategy becomes simple; the powered exoskeleton introduced in [32] used four tact switches to control four motion modes. In this case, however, the larger number of buttons itself increases the risk of maloperation.
Since the paraplegic user must grip the crutch handle tightly while operating the powered exoskeleton, the workspace of the thumb is the only area where tact switches can be placed. Therefore, the number of tact switches on the crutch handle of the WalkON Suit was set to two: Button 1 and Button 2, as shown in Figure 5. The main function of Button 1 is action or select, and that of Button 2 is termination of the current mode. Action and select are distinguished by the duration for which Button 1 is pressed: the user initiates an action by pressing Button 1 briefly, and selects a desired mode by pressing Button 1 for a longer duration.
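As a concrete illustration of the action/select distinction, the sketch below classifies a press of Button 1 by its duration. The 0.8 s threshold and the polling interface are assumptions made for this sketch, not the firmware's actual values.

```python
import time

LONG_PRESS_SEC = 0.8  # assumed threshold separating "select" from "action"

def classify_press(read_button):
    """Block until Button 1 is pressed and released, then classify it.

    read_button: a callable returning True while the button is held;
    this polling interface is an assumption for the sketch.
    """
    while not read_button():        # wait for the press to begin
        time.sleep(0.01)
    t0 = time.monotonic()
    while read_button():            # wait for the release
        time.sleep(0.01)
    held = time.monotonic() - t0
    return "select" if held >= LONG_PRESS_SEC else "action"
```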
The two tact switches are used to manage the following four motion modes.

3.4.1. Sit Mode

This is the motion mode for sitting down. Sit mode can be selected after completing stand mode or align mode, as listed in Table 1. When the user issues an action command (i.e., presses Button 1 briefly) in sit mode, the WalkON Suit starts bending both the hip and knee joints until the user is seated on the chair. If the user is already sitting, the robot takes no action even if an action command is issued.
Since the user typically transfers into the powered exoskeleton from a wheelchair, the algorithm of the WalkON Suit starts from sit mode.

3.4.2. Stand Mode

This is the motion mode for standing up. Stand mode can be selected after completing sit mode, as listed in Table 1. When the user issues an action command in stand mode, the WalkON Suit starts extending both the hip and knee joints until the user stands straight. If the user is already standing, the robot takes no action. After standing up, the user can start walking, sit down or re-align his/her legs.

3.4.3. Walk Mode

This is the motion mode for moving forward. Walk mode can be selected after completing stand mode or align mode, as listed in Table 1. Walk mode consists of two motion phases, swing and stance, according to the ground contact condition. If the user issues an action command while the left foot is placed forward, the left leg is controlled in the stance phase and the right leg in the swing phase, placing the right foot forward. Likewise, when the user issues an action command while the right foot is placed forward, the right leg is in the stance phase and the left leg swings forward.
For the first action command in walk mode after completing stand mode or align mode, the right leg starts the swing phase first.
It should be noted that the user must make supplementary efforts to stabilize the overall posture while walking. As walk mode consists of two different actions (i.e., the stance and swing phases), the user must always check which foot is placed forward before issuing an action command.

3.4.4. Align Mode

This is the motion mode for re-aligning the legs. Align mode can be selected after completing stand mode or walk mode, as listed in Table 1. When the user issues an action command in align mode, the WalkON Suit positions the feet one by one until both feet are placed at the same position. After completing align mode, the user is standing straight.
Figure 6 shows the overall operation strategy of the WalkON Suit using the two tact switches. The selectable modes are expressed with different-colored lines in Figure 7.
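To make the mode transition rules concrete, the following minimal Python sketch encodes the selectable-mode table of Table 1 as a small state machine. The transition table follows the paper; the class structure and method names are ours.

```python
# Selectable modes per current mode, transcribed from Table 1.
SELECTABLE = {
    "sit":   {"stand"},
    "stand": {"walk", "align", "sit"},
    "walk":  {"align"},
    "align": {"walk", "sit"},
}

class ModeStateMachine:
    def __init__(self):
        self.mode = "sit"  # the control algorithm starts from sit mode

    def select(self, requested):
        """Enter a new mode only if Table 1 permits the transition."""
        if requested not in SELECTABLE[self.mode]:
            raise ValueError(f"cannot select {requested!r} from {self.mode!r}")
        self.mode = requested

    def action(self):
        """Short press of Button 1: trigger the current mode's motion."""
        return f"executing {self.mode} motion"

fsm = ModeStateMachine()
fsm.select("stand"); fsm.action()   # stand up from the wheelchair
fsm.select("walk");  fsm.action()   # take a step
fsm.select("align"); fsm.action()   # re-align the feet
fsm.select("sit");   fsm.action()   # sit down
```

Rejecting transitions that Table 1 forbids is what keeps a mis-timed button press from triggering an unsafe motion, which is the rationale behind the small, fixed mode set.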

3.5. Output Interface 1 of the WalkON Suit: OLED Display

As the number of tact switches is smaller than the number of motion modes, the user must select the desired mode by pressing the buttons an appropriate number of times. For this purpose, a display is necessary; thus, an OLED display is installed on the handle of a crutch to show the list of modes the user can select, as shown in Figure 5.
When the robot is turned on and the program is in the initialization mode, the display shows the logo of the WalkON Suit, as in Figure 8a.
Figure 8d shows an example of the mode list. In the example of Figure 8, if the user issues a select command (i.e., presses Button 1 for a long duration), sit mode is selected, and the display changes as in Figure 8e. The robot is then in sit mode and starts sitting down when the user issues an action command (i.e., presses Button 1 briefly). During this procedure, the display shows Figure 8f.
Figure 8b,c show examples of stand mode and walk mode. In walk mode, the foot positions are also displayed, as shown in the figure, to help the user be aware of which foot will move forward.

3.6. Output Interface 2 of the WalkON Suit: Head-Up Display with See-Through Display Glasses

A head-up display (HUD) is any transparent display that provides necessary information without requiring users to look away from their usual viewpoint [33,34]. As the name describes, it is a UI that allows a user to obtain the information needed to operate a machine with the head positioned up. Another great advantage of an HUD is that the user does not have to repeatedly change the eye focus between distant objects and a near display. HUDs have been proposed for many applications; for example, HUDs are utilized for military purposes to overlay tactical information.
In this paper, an HUD is proposed to let the user of the WalkON Suit walk with the head positioned up, i.e., to minimize $\theta_t$ and $\theta_n$ in Figure 4. This is very important because, as shown in (2), the larger $\theta_t$ and $\theta_n$ are, the greater the burden on the user's arms. Excessive $\theta_t$ and $\theta_n$ may even result in falling forward, which can be fatal for a complete paraplegic wearing a robot. An HUD can provide the information needed to operate the WalkON Suit without the user looking at the toes or the crutch display. However, unlike in other HUD applications such as vehicles and airplanes, a powered exoskeleton has no windshield in front of the user. Therefore, to realize the HUD in the WalkON Suit, smart glasses with a see-through display (STD) are utilized, as shown in Figure 9. The transparent display of the smart glasses allows the user to view information while looking forward.
The STD glasses receive the operation status of the WalkON Suit (e.g., the motion mode) via Bluetooth serial communication. In addition to the motion mode, the remaining battery level, the total number of steps and the system error status are displayed on the STD glasses. As the WalkON Suit must operate safely even if the Bluetooth connection is lost, data are transmitted only from the robot to the glasses.
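As an illustration of such one-way telemetry, the sketch below packs a status frame on the robot side and unpacks it on the glasses side. The field layout, byte order and value ranges are assumptions made for this sketch, not the WalkON Suit's actual protocol.

```python
import struct

# Hypothetical status frame (robot -> glasses): mode id, battery %,
# step count, error flags. This layout is an illustrative assumption.
FRAME = struct.Struct("<BBHB")

MODES = {0: "sit", 1: "stand", 2: "walk", 3: "align"}

def encode_status(mode_id, battery_pct, steps, error_flags=0):
    """Robot side: serialize the current status into a 5-byte frame."""
    return FRAME.pack(mode_id, battery_pct, steps, error_flags)

def decode_status(frame):
    """Glasses side: recover the status fields for rendering."""
    mode_id, battery, steps, errors = FRAME.unpack(frame)
    return {"mode": MODES[mode_id], "battery": battery,
            "steps": steps, "errors": errors}

# The glasses only ever decode; they never send commands back, so a
# lost Bluetooth link cannot disturb the robot's operation.
frame = encode_status(mode_id=2, battery_pct=87, steps=142)
print(decode_status(frame))
```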
The STD glasses include a transparent screen that shows a progress indicator, the foot positions, a warning indicator and an analysis result after the completion of walking. The transparent screen shows an image overlaid on the environment. In the display that the user sees, a donut chart shows how far the user has walked with respect to a preset maximum step count (see Figure 9a). At the center of the donut chart, the number of steps and the current motion mode are displayed; e.g., walk mode is shown in Figure 9c. The foot positions are also shown, as in Figure 9d, which allows the user to look forward while walking. In addition to the information related to the motion modes, the STD glasses show warning messages if any potential hazard is detected, as in Figure 9e. When the operation is terminated, the total operation time and the total number of steps are displayed, as in Figure 9f.
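The fill fraction of such a donut chart is a simple ratio; a minimal sketch, with a hypothetical preset maximum step count:

```python
def progress_fraction(steps_taken, max_steps=100):
    """Fraction of the preset walk completed, clamped to [0, 1].
    max_steps is a hypothetical preset, as in Figure 9a."""
    return min(steps_taken / max_steps, 1.0)

print(progress_fraction(42))  # -> 0.42, i.e., 42% of the donut filled
```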

4. Experimental Results

The human subject who participated in this research, shown in Figure 10, is a 54-year-old male with paraplegia due to a complete spinal cord injury that occurred 27 years ago. The neurological level is L1 motor and T12 sensory. Although he was diagnosed as ASIA (American Spinal Injury Association) Impairment Scale C, the actual strength of all muscles below the neurological level was graded feeble in a manual muscle test. He therefore cannot stand and walk independently, even with a conventional KAFO (knee-ankle-foot orthosis) and crutches.
Since he started wearing the WalkON Suit in January 2018, he has practiced walking regularly, about once or twice a week. As he had experience walking with powered exoskeletons, including ReWalk, he was a suitable subject, able to adapt immediately to a new robot and assess the effectiveness of the proposed UI system.

4.1. Walking Experiments with the Proposed UI

The training procedure of a pilot for the WalkON Suit with the proposed UI system is shown in Figure 10. Each training session lasted about one hour and included standing up, walking on even ground, sitting on a chair and taking breaks. The human subject was asked to walk on a 20-m straight track at a self-selected walking speed. After walking 20 m, he took a break of about 5 min while sitting on a chair. After the break, he stood up from the chair and walked with crutches. For safety reasons, at least two assistants stayed in the vicinity of the human subject at all times; they neither held nor touched the robot or the subject. At the early stage of training, the pilot frequently looked down to check the leg movements and foot positions, as he was not yet accustomed to the STD glasses. Consequently, his whole body leaned slightly forward and his neck was bent, as shown in Figure 10a; the flexion angle of the neck was larger than 10 degrees. In the middle stage of training, he began to become accustomed to the STD glasses, keeping the head positioned up, as shown in Figure 10b; the flexion angle of the neck was reduced to 5-7 degrees. Figure 10c shows the final stage of training, when the pilot was fully accustomed to the proposed UI. The pilot no longer continuously checked the crutch screen or the leg movements, but looked forward while walking. Notice that the neck angle was close to zero, as shown in the figure, since the pilot could receive enough information without looking down. Even with the proposed UI system including the HUD, the user occasionally looked down to check the positions of the feet and to see whether the ground was flat and even. Nevertheless, he did not have to look down continuously, and his head was positioned up most of the time.
As the neck flexion angle was reduced, the walking speed also improved. Figure 11 and Table 2 show the changes in the walking speed before and after the pilot became accustomed to the proposed UI system. As shown in the figure, the walking speed improved from 0.12 m/s to 0.16 m/s, an improvement of about 33%. Similarly, the average step time was reduced from 2.48 s to 1.78 s. One of the main reasons for this performance improvement is that the pilot no longer spent much time checking the crutch display and his leg movements.
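These numbers are consistent with the fixed 30 cm step length used in the experiments (see Figure 11); a quick check, assuming speed is simply step length divided by average step time:

```python
# Quick consistency check: speed = step length / step time.
STEP_LENGTH_M = 0.30   # fixed step length (Figure 11)

for label, step_time_s in [("crutch display only", 2.48),
                           ("proposed UI", 1.78)]:
    print(f"{label}: {STEP_LENGTH_M / step_time_s:.3f} m/s")
# -> 0.121 m/s and 0.169 m/s, close to the reported 0.12 and 0.16 m/s;
#    any residual difference presumably comes from measuring speed over
#    the full 20-m track rather than from the step cadence alone.
```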

4.2. Robot Control Power Consumption Analysis

The gait trajectory of the WalkON Suit was determined considering the pilot's anthropometric data and the physical conditions of the joints, including the contracture angle. The actuators were controlled to follow the position trajectories introduced in previous work [29]. In walk mode, the pilot can move one step forward by issuing an action command. Recall that walk mode includes two different actions, swing and stance. The swing period was pre-determined in the control algorithm, and the stance period was controlled by the user. In this experiment, the swing period was set to one second.
Figure 12 shows the motor torque data of each joint with respect to the gait cycle. The thick continuous lines show the values averaged over ten cycles. The terms LH, LK, RH and RK refer to the left hip, left knee, right hip and right knee, respectively. Figure 12a–d shows the experimental results when the left leg moved forward (i.e., the left leg was in the swing phase and the right leg was in the stance phase). Up to one second, marked by the red vertical lines, the left leg swung in the air (see Figure 12a,b), and the right leg supported the whole body weight while generating a propulsion force to move forward (see Figure 12c,d). After one second, the robot waited for the next action command from the pilot. The same analysis applies to the panels for the right leg moving forward.
Figure 12 also shows the difference between the left and right legs when performing the same actions. Figure 12c,e shows the hip joint torques for supporting the body weight and moving the whole body forward, and Figure 12d,f shows the knee joint torques under the same condition. For both the hip and knee joints, the left leg required larger torques than the right leg; the root-mean-squared (RMS) values of the actuation torque for the left and right hip joints in the stance phase were 103.39 Nm and 95.27 Nm, respectively. This may be due to a difference in the leg lengths of the pilot.
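The RMS figures quoted above can be reproduced from sampled torque traces as below; the sample values here are hypothetical placeholders, since only the aggregated values are reported in the paper.

```python
import math

def rms(samples):
    """Root-mean-squared value of a torque trace (Nm)."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

# Hypothetical stance-phase hip torque samples (Nm); real traces would
# come from the motor drivers at the controller's sampling rate.
left_hip  = [98.0, 105.0, 110.0, 101.0, 103.0]
right_hip = [90.0, 97.0, 99.0, 95.0, 94.0]
print(rms(left_hip), rms(right_hip))
```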

4.3. Load Carrying Experiment

As the user of the WalkON Suit must hold the crutches while walking, he/she cannot carry a load by hand. To let the user carry a load (e.g., the torch of the Paralympics), an additional carrying device was installed on the back of the WalkON Suit, as shown in Figure 10c. As the additional weight on the back moved the center of mass backward, the pilot had to practice carrying the load.
Table 3 shows the magnitudes of the actuation torques before and after the load was added. The results of Trial 1 are the root-mean-squared (RMS) values of the actuation torques without the load, and those of Trial 2 are the RMS values with the load added. Notice that the required actuation torques were increased by the additional load. Nevertheless, the pilot adapted easily to the additional load and successfully delivered the torch of the 2018 Pyeongchang Paralympics as the first torch bearer.
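The effect of the load can be summarized from the per-trial totals in Table 3; a quick arithmetic check of the relative increase:

```python
# Total RMS torques (Nm) summed over both swing phases, from Table 3.
trial1 = 222.2 + 207.4   # without the load
trial2 = 238.3 + 237.5   # with the load
increase = (trial2 - trial1) / trial1
print(f"{increase:.1%}")  # -> about 10.8% more total torque with the load
```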

5. Discussion and Conclusions

In this paper, a new user interface for the WalkON Suit, a powered exoskeleton for complete paraplegics, was introduced. The WalkON Suit is a robot developed to let complete paraplegics walk freely outdoors, and it has shown its potential through events such as the Cybathlon 2016 and the Pyeongchang Paralympics 2018. The WalkON Suit is thus not a rehabilitation device for short-term or long-term treatment in medical use, but rather a device that helps people carry out their daily activities. With the proposed powered exoskeleton, a person with complete paralysis may improve his/her quality of life by regaining the feeling of walking, gaining confidence and participating in social activities. As a person with complete spinal cord injury can neither move nor sense the lower limbs, users of the previous version of the WalkON Suit [29] had to observe their leg movements continuously. Moreover, the conventional UI used a small display installed on the crutch or on the chest, which required looking down while operating the exoskeleton. For these reasons, the user of a powered exoskeleton had to look down frequently, which caused gait instability, reduced walking speed, and shoulder and neck pain. Therefore, in this paper, a new user interface (UI) for a powered exoskeleton for complete paraplegics was introduced. The proposed UI system consists of two tact switches and a display installed on the crutch, and a head-up display based on glasses with a see-through display (STD).
Other technologies are also available for detecting people's gait intentions, such as EMG and foot pressure patterns, but these are more suitable for people with partial impairment of walking ability. The foot contact status may be conveyed by a visual, tactile or auditory interface, as important information indicating that the wearer is ready to begin the next step. Donati et al. introduced a long-term gait training protocol combining virtual-reality training, visual-tactile feedback training, powered exoskeleton training and robotic body weight support (BWS) gait training on a treadmill [26]. However, the individual effects of tactile feedback, virtual-reality training or exoskeleton training alone have not been clearly verified yet. Many studies have shown that a brain-machine interface (BMI) can be used to control robotic exoskeletons [35,36,37,38].
Although the methods listed above give powered exoskeletons the possibility of interacting with humans in more sophisticated ways, they are not yet reliable enough for complete paraplegics in practice. In the authors' experience, the best user interface for a paraplegic user depends strongly on individual preference. For example, complete paraplegics who have completely lost their proprioception, like the human subject in this research, are very sensitive to the reliability of the robot system; when the robot occasionally fails to recognize a command (i.e., the motion intention), they become very nervous. This may be general for people with complete paraplegia, as their bodies are vulnerable and recover poorly from any further injury. Therefore, the most important requirement for powered exoskeletons for complete paraplegics is reliability. This is why the proposed system was developed based on buttons and an HUD, reducing the physical burden of frequently checking the leg positions visually while maintaining high operational reliability. Complete paraplegics of ASIA level A or B, in particular those with complete loss of sensory functions, neither generate any EMG signal nor feel the damaged body parts; it is therefore clear that EMG is not a proper solution for the user interface for such users. A haptic device may be a good solution for delivering the interaction forces with the environment (e.g., the ground reaction force) to an undamaged body part (e.g., the trunk), bypassing the severed spinal nerves. However, the human subject who participated in this research has a spinal injury at the T-10 level, which means he has no sensation below his navel; in this case, it is impossible to transfer all the necessary tactile information through a haptic system. Nevertheless, the authors fully acknowledge that a haptic device is a good solution for people with incomplete paraplegia or a low level of paraplegia. In particular, for those who have lost sensory function only below the knee joints, a haptic device could be an excellent means of regaining gait stability. This will definitely be part of the future work of this paper.
In this paper, a combination of buttons and an HUD was applied as the user interface for a complete paraplegic wearing a powered exoskeleton. The proposed method was developed for the sake of 100% reliability while providing the minimal information necessary for a stable gait. The finite state machine for operating the robot with the buttons was optimized to minimize the number of clicks and thereby the chance of selecting an undesired motion mode. The proposed UI system was applied to the WalkON Suit, a powered exoskeleton for complete paraplegics. As the STD glasses provided the information necessary to operate the exoskeleton, such as a progress indicator, the foot positions and a warning indicator, the user of the WalkON Suit did not have to look down continuously to check the leg movements or to watch the display on the crutch. Consequently, the proposed UI system brought a critical improvement in gait stability and gait speed, as well as in the user experience. The WalkON Suit with the proposed UI was used in the torch relay of the 2018 Pyeongchang Paralympics, where a complete paraplegic successfully walked about 20 m carrying a torch and delivered it to the next torch bearer.
The proposed UI method was verified with only one subject in this paper. To prove the generality of its effectiveness, a statistical study with a sufficient number of human subjects is necessary. A practical problem was that the powered exoskeleton used in the experiments, the WalkON Suit, was custom-fitted, meaning the robot could be worn only by the human subject who participated in this work. Nevertheless, we believe that the improvements in the gait speed and the neck posture are scientifically meaningful even though they were verified with a single subject. We hypothesized that the limited gait speed and poor gait stability stemmed from complete paraplegics' need to check their leg positions visually and frequently, and we proposed an engineering solution by adopting see-through display technology. The experimental results showed that this hypothesis was correct from an engineering viewpoint. A new version of the WalkON Suit, which can be worn by multiple users, is being developed by Angel Robotics. Once the new version is ready, a statistical study of the user interface will be carried out as future work.
The application of a whole-body control method to a powered exoskeleton to enable walking without crutches may also be a topic of future work. Currently, no exoskeleton available on the market supports self-balancing [2,39]; therefore, powered exoskeletons for over-ground walking by people with complete paraplegia need supplementary equipment such as a cane, forearm crutches or a wheeled walker [2]. Future research on how to fall down safely and get up again while wearing the robot will also be conducted; this may be more realistic and practical than developing robust exoskeletons that never collapse.

Author Contributions

Conceptualization, H.C.; formal analysis, J.L.; methodology, B.N.; project administration, K.K.; writing, original draft, H.C.

Funding

This research was supported by the Commercializations Promotion Agency for R&D Outcomes (COMPA), funded by the Ministry of Science & ICT (No. 2018K000031, Commercialization research of a wearable robot for assisting the mobility of people with complete paraplegia).

Acknowledgments

The authors would like to express special thanks to Young-ro Lee for his dedication for this research and for his willingness to walk again with the WalkON Suit. Part of this work was previously presented at the 44th Annual Conference of the IEEE Industrial Electronics Society.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Holanda, L.J.; Silva, P.M.; Amorim, T.C.; Lacerda, M.O.; Simão, C.R.; Morya, E. Robotic assisted gait as a tool for rehabilitation of individuals with spinal cord injury: A systematic review. J. Neuroeng. Rehabil. 2017, 14, 126. [Google Scholar] [CrossRef] [PubMed]
  2. Onose, G.; Cârdei, V.; Crăciunoiu, Ş.T.; Avramescu, V.; Opriş, I.; Lebedev, M.A.; Constantinescu, M.V. Mechatronic wearable exoskeletons for bionic bipedal standing and walking: A new synthetic approach. Front. Neurosci. 2016, 10, 343. [Google Scholar] [CrossRef] [PubMed]
  3. Lee, H.; Kim, W.; Han, J.; Han, C. The technical trend of the exoskeleton robot system for human power assistance. Int. J. Precis. Eng. Manuf. 2012, 13, 1491–1497. [Google Scholar] [CrossRef]
  4. Contreras-Vidal, J.L.; Bhagat, N.A.; Brantley, J.; Cruz-Garza, J.G.; He, Y.; Manley, Q.; Nakagome, S.; Nathan, K.; Tan, S.H.; Zhu, F.; et al. Powered exoskeletons for bipedal locomotion after spinal cord injury. J. Neural Eng. 2016, 13, 031001. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  5. Louie, D.R.; Eng, J.J.; Lam, T. Gait speed using powered robotic exoskeletons after spinal cord injury: A systematic review and correlational study. J. Neuroeng. Rehabil. 2015, 12, 82. [Google Scholar] [CrossRef] [PubMed]
  6. Esquenazi, A.; Talaty, M.; Packel, A.; Saulino, M. The ReWalk powered exoskeleton to restore ambulatory function to individuals with thoracic-level motor-complete spinal cord injury. Am. J. Phys. Med. Rehabil. 2012, 91, 911–921. [Google Scholar] [CrossRef] [PubMed]
  7. Kolakowsky-Hayner, S.A.; Crew, J.; Moran, S.; Shah, A. Safety and feasibility of using the EksoTM bionic exoskeleton to aid ambulation after spinal cord injury. J. Spine 2013, 4, 003. [Google Scholar] [CrossRef]
  8. Quintero, H.; Farris, R.; Hartigan, C.; Clesson, I.; Goldfarb, M. A powered lower limb orthosis for providing legged mobility in paraplegic individuals. Top. Spinal Cord Injury Rehabil. 2011, 17, 25–33. [Google Scholar] [CrossRef] [PubMed]
  9. Kilicarslan, A.; Prasad, S.; Grossman, R.G.; Contreras-Vidal, J.L. High accuracy decoding of user intentions using EEG to control a lower-body exoskeleton. In Proceedings of the 2013 35th Annual International Conference on Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 5606–5609. [Google Scholar]
  10. He, Y.; Eguren, D.; Azorín, J.M.; Grossman, R.G.; Luu, T.P.; Contreras-Vidal, J.L. Brain–machine interfaces for controlling lower-limb powered robotic systems. J. Neural Eng. 2018, 15, 021004. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  11. Simic, M.; Tariq, M.; Trivailo, P. EEG-Based BCI Control Schemes for Lower-Limb Assistive-Robots. Front. Hum. Neurosci. 2018, 12, 312. [Google Scholar]
  12. Hassan, M.; Kadone, H.; Suzuki, K.; Sankai, Y. Wearable gait measurement system with an instrumented cane for exoskeleton control. Sensors 2014, 14, 1705–1722. [Google Scholar] [CrossRef] [PubMed]
  13. González, I.; Fontecha, J.; Hervás, R.; Bravo, J. An ambulatory system for gait monitoring based on wireless sensorized insoles. Sensors 2015, 15, 16589–16613. [Google Scholar] [CrossRef] [PubMed]
  14. Lenzi, T.; De Rossi, S.M.M.; Vitiello, N.; Carrozza, M.C. Intention-based EMG control for powered exoskeletons. IEEE Trans. Biomed. Eng. 2012, 59, 2180–2190. [Google Scholar] [CrossRef] [PubMed]
  15. Yin, Y.H.; Fan, Y.J.; Xu, L.D. EMG and EPP-integrated human–machine interface between the paralyzed and rehabilitation exoskeleton. IEEE Trans. Inf. Technol. Biomed. 2012, 16, 542–549. [Google Scholar] [CrossRef] [PubMed]
  16. Aguilar-Sierra, H.; Yu, W.; Salazar, S.; Lopez, R. Design and control of hybrid actuation lower limb exoskeleton. Adv. Mech. Eng. 2015, 7. [Google Scholar] [CrossRef] [Green Version]
  17. Sczesny-Kaiser, M.; Höffken, O.; Aach, M.; Cruciger, O.; Grasmücke, D.; Meindl, R.; Schildhauer, T.A.; Schwenkreis, P.; Tegenthoff, M. HAL® exoskeleton training improves walking parameters and normalizes cortical excitability in primary somatosensory cortex in spinal cord injury patients. J. Neuroeng. Rehabil. 2015, 12, 68. [Google Scholar] [CrossRef] [PubMed]
  18. Kubota, S.; Abe, T.; Fujii, K.; Marushima, A.; Ueno, T.; Haginoya, A. Improvement of walking ability using hybrid assistive limb training in a patient with severe thoracic myelopathy caused by ossification of the posterior longitudinal ligament. A case report. J. Spine 2016, 7, 3–10. [Google Scholar] [CrossRef]
  19. Shimizu, Y.; Nakai, K.; Kadone, H.; Yamauchi, S.; Kubota, S.; Ueno, T.; Marushima, A.; Hiruta, K.; Endo, A.; Kawamoto, H.; et al. The Hybrid Assistive Limb® intervention for a postoperative patient with spinal dural arteriovenous fistula and chronic spinal cord injury: A case study. J. Spinal Cord Med. 2018, 41, 710–717. [Google Scholar] [CrossRef] [PubMed]
  20. Vouga, T.; Zhuang, K.Z.; Olivier, J.; Lebedev, M.A.; Nicolelis, M.; Bouri, M.; Bleuler, H. EXiO—A brain-controlled lower limb exoskeleton for rhesus macaques. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 131–141. [Google Scholar] [CrossRef] [PubMed]
  21. Fitzsimmons, N.; Lebedev, M.; Peikon, I.; Nicolelis, M.A. Extracting kinematic parameters for monkey bipedal walking from cortical neuronal ensemble activity. Front. Integr. Neurosci. 2009, 3, 3. [Google Scholar] [CrossRef] [PubMed]
  22. Perge, J.A.; Homer, M.L.; Malik, W.Q.; Cash, S.; Eskandar, E.; Friehs, G.; Donoghue, J.P.; Hochberg, L.R. Intra-day signal instabilities affect decoding performance in an intracortical neural interface system. J. Neural Eng. 2013, 10, 036004. [Google Scholar] [CrossRef] [PubMed]
  23. Xu, R.; Jiang, N.; Mrachacz-Kersting, N.; Lin, C.; Prieto, G.A.; Moreno, J.C.; Pons, J.L.; Dremstrup, K.; Farina, D. A closed-loop brain-computer interface triggering an active ankle-foot orthosis for inducing cortical neural plasticity. IEEE Trans. Biomed. Eng. 2014, 61, 2092–2101. [Google Scholar] [PubMed]
  24. Do, A.H.; Wang, P.T.; King, C.E.; Chun, S.N.; Nenadic, Z. Brain-computer interface controlled robotic gait orthosis. J. Neuroeng. Rehabil. 2013, 10, 111. [Google Scholar] [CrossRef] [PubMed]
  25. Carlson, T.; Millan, J.D.R. Brain-controlled wheelchairs: A robotic architecture. IEEE Robot. Autom. Mag. 2013, 20, 65–73. [Google Scholar] [CrossRef]
  26. Donati, A.R.; Shokur, S.; Morya, E.; Campos, D.S.; Moioli, R.C.; Gitti, C.M.; Augusto, P.B.; Tripodi, S.; Pires, C.G.; Pereira, G.A.; et al. Long-term training with a brain-machine interface-based gait protocol induces partial neurological recovery in paraplegic patients. Sci. Rep. 2016, 6, 30383. [Google Scholar] [CrossRef] [PubMed]
  27. Chen, X.; Xu, L.; Wang, Y.; Wang, H.; Wang, F.; Zeng, X.; Wang, Q.; Egger, J. Development of a surgical navigation system based on augmented reality using an optical see-through head-mounted display. J. Biomed. Inf. 2015, 55, 124–131. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  28. Muensterer, O.J.; Lacher, M.; Zoeller, C.; Bronstein, M.; Kübler, J. Google Glass in pediatric surgery: An exploratory study. Int. J. Surg. 2014, 12, 281–289. [Google Scholar] [CrossRef] [PubMed]
  29. Choi, J.; Na, B.; Jung, P.G.; Rha, D.W.; Kong, K. WalkON Suit: A Medalist in the Powered Exoskeleton Race of Cybathlon 2016. IEEE Robot. Autom. Mag. 2017, 24, 75–86. [Google Scholar] [CrossRef]
  30. Salvucci, V.; Kimura, Y.; Oh, S.; Hori, Y. Force maximization of biarticularly actuated manipulators using infinity norm. IEEE/ASME Trans. Mech. 2013, 18, 1080–1089. [Google Scholar] [CrossRef]
  31. Choi, H.; Oh, S.; Kong, K. Control of a robotic manipulator in the polar coordinate system using a biarticular actuation mechanism. Int. J. Control Autom. Syst. 2016, 14, 1095–1105. [Google Scholar] [CrossRef]
  32. Wu, C.H.; Mao, H.F.; Hu, J.S.; Wang, T.Y.; Tsai, Y.J.; Hsu, W.L. The effects of gait training using powered lower limb exoskeleton robot on individuals with complete spinal cord injury. J. Neuroeng. Rehabil. 2018, 15, 1–10. [Google Scholar] [CrossRef] [PubMed]
  33. Liu, Y.C.; Wen, M.H. Comparison of head-up display (HUD) vs. head-down display (HDD): Driving performance of commercial vehicle operators in Taiwan. Int. J. Hum.-Comput. Stud. 2004, 61, 679–697. [Google Scholar] [CrossRef]
  34. Tonnis, M.; Lange, C.; Klinker, G. Visual longitudinal and lateral driving assistance in the head-up display of cars. In Proceedings of the 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, 13–16 November 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 91–94. [Google Scholar]
  35. Gancet, J.; Ilzkovitz, M.; Motard, E.; Nevatia, Y.; Letier, P.; de Weerdt, D.; Cheron, G.; Hoellinger, T.; Seetharaman, K.; Petieau, M.; et al. MINDWALKER: Going one step further with assistive lower limbs exoskeleton for SCI condition subjects. In Proceedings of the 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 24–27 June 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 1794–1800. [Google Scholar]
  36. Noda, T.; Sugimoto, N.; Furukawa, J.; Sato, M.A.; Hyon, S.H.; Morimoto, J. Brain-controlled exoskeleton robot for BMI rehabilitation. In Proceedings of the 2012 12th IEEE-RAS International Conference on Humanoid Robots (Humanoids), Osaka, Japan, 29 November–1 December 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 21–27. [Google Scholar]
  37. Kwak, N.S.; Müller, K.R.; Lee, S.W. A lower limb exoskeleton control system based on steady state visual evoked potentials. J. Neural Eng. 2015, 12, 056009. [Google Scholar] [CrossRef] [PubMed]
  38. Lee, K.; Liu, D.; Perroud, L.; Chavarriaga, R.; Millán, J.D.R. Endogenous control of powered lower-limb exoskeleton. In Wearable Robotics: Challenges and Trends; Springer: Berlin, Germany, 2017; pp. 115–119. [Google Scholar]
  39. Veneman, J.F. Exoskeletons supporting postural balance—The BALANCE project. In Replace, Repair, Restore, Relieve–Bridging Clinical and Engineering Solutions in Neurorehabilitation; Springer: Berlin, Germany, 2014; pp. 203–208. [Google Scholar]
Figure 1. A complete paraplegic wearing the WalkON Suit for the torch relay of the Pyeongchang Paralympics 2018.
Figure 2. Configuration and comparison of two WalkON Suit systems; the right one was introduced in the Powered Exoskeleton Race of Cybathlon 2016, and the left one was used for the torch relay of the Pyeongchang Paralympics 2018.
Figure 3. Postures and the associated eyesight of a complete paraplegic wearing the WalkON Suit; eyesight during (a) walking and (b) changing the mode.
Figure 4. Posture analysis of a person with the WalkON Suit while changing the mode.
Figure 5. Configuration of the UI of the WalkON Suit. The UI includes (a) two tact switches and (b) an OLED display installed on a crutch, (c) see-through display glasses, (d) a multi-color LED on a backpack and a buzzer.
Figure 6. Operation strategy of the WalkON Suit using two tact switches.
Figure 7. State transition rules in the physical layer (in Figure 6) given the selected mode.
Figure 8. UI images for the OLED display in the mode selection process: (a) initialization process, (b) stand mode action, (c) right foot forward action in walk mode, (d) mode selection procedure in a sitting state, (e) sit mode and (f) sit mode action.
Figure 9. Images shown on the see-through display of smart glasses: (a) a progress chart, (b) the number of steps, (c) the current motion mode, (d) the foot placement, (e) a warning message and (f) the result analysis.
Figure 10. A pilot training for wearing the WalkON Suit with the proposed UI: (a) the early stage of training (the first week of training), (b) the middle stage of training and (c) the final stage of training (the fourth week of training).
Figure 11. Walking speed result for each trial. The step length of each step was set as 30 cm, and the step time was controlled by the pilot’s intention.
Figure 12. Actuation torque of each joint for one gait cycle. Positive values represent joint flexion torques, and negative values represent joint extension torques. LH, LK, RH and RK stand for the left hip, left knee, right hip and right knee joints, respectively.
Table 1. List of selectable modes in each physical state.

                    Current Mode
                    Stand               Walk     Align        Sit
Selectable Modes    Walk, Align, Sit    Align    Walk, Sit    Stand
Table 2. Step time of different interfaces.

Output Interface Method     Average Step Time (s)
Crutch display only         2.48
Proposed UI                 1.78
Table 3. RMS joint torque of walking motion (Nm).

Trial    Phase    LH        LK       RH       RK       Total
1        LSW      54.25     22.77    95.27    49.92    222.2
1        RSW      103.4     63.78    24.76    15.47    207.4
2        LSW      63.60     36.80    86.33    51.53    238.3
2        RSW      100.8     63.35    40.78    32.49    237.5

RMS: root mean square, LSW: left leg swing, RSW: right leg swing, LH: left hip, LK: left knee, RH: right hip, RK: right knee.
