1. Introduction
Motion controller devices for user interfacing are used to recognize a user’s motions or to control remote robots. Motion recognition is an approach that enables users to interact with a computer naturally by using wearable devices to capture their movements, positions, and gestures. These devices are becoming increasingly smaller, more intelligent, and more varied in type and configuration. For example, motions can be detected by measuring arm angles or muscle activity from a band worn on an arm [1], a ring can allow finger gestures to be used as input signals [2], and gloves can be used to recognize users’ hand gestures [3]. The Myo device, developed by Thalmic Labs, is a band-shaped gesture control device that is worn on one arm [4].
Researchers have proposed motion estimation methods based on Bayesian probability to reduce the number of Myo devices required [5], but these methods have several problems. First, accuracy is lowered because the Bayesian probability is calculated by dividing the Myo device data into sections without considering its distribution. Second, motion estimation accuracy is reduced when the Bayesian probability is calculated without considering the associations among the x, y, and z orientations. Therefore, the Bayesian probability approach to motion estimation must be improved to increase estimation accuracy. In particular, accuracy can be increased by using additional data, such as the electromyogram (EMG) signals of the Myo device.
This paper proposes a method for calculating Bayesian probabilities and estimating motions using EMG and orientation data from a Myo device. The final motion is determined by comparing the Bayesian probabilities of motions after applying weights that the proposed method calculates from the orientations and EMG signals. The measured raw data of subjects who are not calibrated for the predefined motions are used, and only the orientation and EMG data are utilized, without any velocity information. The orientation data and EMG signals of the Myo device are measured at a rate of 30 frames per second.
The remainder of the paper is organized as follows:
Section 2 introduces research on wearable devices.
Section 3 proposes a method for Bayesian probability calculation and motion estimation based on multiple types of Myo device data.
Section 4 describes the experimental method, results, and the analysis of those results. Finally,
Section 5 outlines the conclusions and explores directions for future research.
2. Related Works
There are diverse kinds of wearable contact devices, such as the Myo device and the GestTrack3D Hand Tracker of GestureTek in the USA [6]. The Myo device is an armband with a built-in processor that recognizes motions by sensing the movements of the hands and arms delivered by the muscles, using orientation data, an Inertial Measurement Unit (IMU), and eight EMG sensors. Unlike the Kinect and Leap Motion, which are widely known motion recognition devices, the Myo device and the GestTrack3D Hand Tracker connect via Bluetooth, so they can be used remotely at distances of up to 15 m.
When a Myo device is worn, its sensors measure muscle activity, expressed as electrical signals, to determine the kind of gesture performed. For example, an EMG sensor extracts and amplifies the activity accompanying muscle contraction. The muscle sensor is composed of an amplifier, a high-pass filter (HPF), and a low-pass filter (LPF). EMG signals are amplified because the intensity of the signals from muscle contractions is minute. The range of signals measurable in the muscles is 10–2000 Hz, whereas the informative signals from typical human muscles usually appear at 20–450 Hz; signals at other frequencies can therefore be regarded as noise. The Myo device is equipped with EMG sensors, Bluetooth 4.0 communication, a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer. It can measure 25 finger and arm movements related to muscle activity, including movements more complex than simple hand gestures. It also measures the degrees of freedom (DOFs) of the hand and can provide natural language interactions. Harrison et al. [1] studied the differences in recognition accuracy across user groups in a system like the Myo device, in which 10 sensors were attached to the arm and the waveforms were learned using machine learning.
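The 20–450 Hz informative band described above can be isolated with a simple band-pass step. The sketch below is an illustration of the idea, not the Myo device’s actual filter chain; the 1000 Hz sampling rate and the FFT-masking approach are assumptions made for the example.

```python
import numpy as np

def bandpass_emg(signal, fs, low=20.0, high=450.0):
    """Zero out spectral content outside the informative EMG band."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return np.fft.irfft(spectrum * mask, n=len(signal))

# Hypothetical 1 s recording at 1000 Hz: a 50 Hz muscle component
# plus a 5 Hz motion-artifact drift, which the filter removes as noise.
fs = 1000
t = np.arange(0, 1.0, 1.0 / fs)
raw = np.sin(2 * np.pi * 50 * t) + 2.0 * np.sin(2 * np.pi * 5 * t)
clean = bandpass_emg(raw, fs)
```

In a real pipeline an IIR filter (e.g., a Butterworth band-pass) would usually replace the FFT mask, but the masking form keeps the 20–450 Hz boundary explicit.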
The Myo device also has a 9-axis Inertial Measurement Unit (IMU) that includes a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer. The direction and movement of the wearer’s arm are measured by these units and determined by analyzing the spatial data they provide. The orientation represents the attitude of the band in terms of roll, pitch, and yaw; the angular velocity of the Myo device is provided in vector form; and the accelerometer reports the corresponding acceleration of the Myo device.
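The Myo SDK delivers orientation as a quaternion, from which the roll, pitch, and yaw mentioned above can be recovered with the standard conversion. A minimal sketch follows; the function name and the (w, x, y, z) argument ordering are illustrative assumptions.

```python
import math

def quaternion_to_rpy(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to roll, pitch, yaw in radians."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp to guard against numerical drift outside [-1, 1].
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw
```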
Strelow and Singh [7] proposed a motion estimation method using a camera and various sensors; performance improved when the camera was combined with a gyro sensor and an acceleration sensor, as compared to when the camera was used alone. Gesture recognition through the integration of acceleration and EMG sensors [8] and the fast Fourier transform (FFT) of EMG signals within a specific recognition interval [9] have also been investigated. Additional learning methods, such as a decision tree over acceleration sensor data and a k-nearest neighbor algorithm, have been studied for behavior recognition [10]. Further, behavior recognition has been performed using dynamic Bayesian networks of bio-signals and acceleration sensors [11].
A study of cube games using gesture recognition [6] recognized hand gestures through several EMG sensors and three-axis acceleration sensors. In addition, studies on gesture recognition have examined the EMG signals of static hand movements [12] and wheelchair motor control through wearable devices that recognize the EMG signals of neck muscles [13]. However, most of these studies are limited to static motions rather than dynamic patterns of motion that change over time [8,13], or to motions that are dynamic but clearly distinguishable [9]. On the other hand, there are also other research methods using genetic algorithm optimizations [14,15,16,17].
Since biological signals vary greatly across individual body characteristics, very complex algorithms are needed, involving precise sensors, clear motion classification, and learning and processing over large amounts of data; consequently, the scope of such research is very limited.
To improve motion estimation accuracy, the equipment used must also be improved, which can be costly. Kim et al. [5] proposed a motion estimation method using one Myo device that estimates a subordinate motion from the Bayesian probability of data obtained from two Myo devices placed in a dependency relation. To estimate motion, the sensor value range is divided into sections spanning all Myo device sensor value ranges, and the Bayesian probability is calculated from the number of sensor values measured in each section. However, most sensor values fall into only a few sections, while the others remain nearly empty.
Lee [18] proposed an algorithm that redefines the data sections using only the sections that contain measured sensor values, and determines the motion from this meaningful data. However, there are limitations in solving this problem by considering only orientation data. We therefore propose a motion estimation method based on weights obtained by a genetic algorithm and Bayesian probability, which computes Bayesian probabilities for the orientation data and EMG data separately, and then selects the final motion by comparing the two. In this study, we define five basic motions and compute the motion probabilities estimated from the various data characteristics to derive the final probability, enabling more accurate motion estimation under a variety of conditions.
A recent study of Myo EMG motion estimation is that of Kim [19], who suggested an algorithm to estimate arm motion using a Myo armband attached to the upper arm. The motion of the lower arm is estimated through the EMG signal of the upper arm, while the motion of the upper arm is estimated through the IMU sensor. Their algorithm uses the sum, average, and rate of the data measured by the EMG of the Myo device. According to the results of that study, accurate motion estimation can be confirmed at 0° and 90°.
Nourhan T. [20] proposed a method to measure muscle momentum according to the degree of muscle contraction during exercise using EMG data. The characteristics of the EMG obtained from the muscles are analyzed to extract the RMS (root mean square) and the MDF (median frequency), and the resulting data set is learned using an artificial neural network (ANN). Using the ANN, they proposed two algorithms to predict muscle fatigue by classifying samples as non-fatigue or fatigue.
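The RMS amplitude feature mentioned here has a standard definition; a short sketch over a window of EMG samples (the function name is an illustrative choice):

```python
import math

def emg_rms(window):
    """Root mean square of an EMG window, a standard amplitude feature."""
    return math.sqrt(sum(v * v for v in window) / len(window))
```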
Mendez, I. [21] proposed a decision tree method that determines the user’s pose from data measured by the Myo device’s EMG. After learning 10 measured samples each of the rock, paper, and scissors gestures, they played the game randomly and achieved 78% accuracy. This work shows how the Myo device’s EMG signal can be analyzed with a classification method for motion estimation and applied to a game after learning.
However, the Myo device is better suited to determining the relative position of the arm than its absolute position. Myo device-related approaches can be applied to diverse fields, such as robot control and remote medical support. Since the Myo device’s sensor data are always provided in a constant absolute coordinate system, the orientation measurement starts from the current position, and subsequent positions are expressed as relative position coordinates with respect to it.
In this sense, “relative” in this study means that the Bayesian probability is computed by wearing Myo devices on the upper arm and forearm of the right arm in the learning stage and taking their correlation into account. In the recognition phase, a Myo device is worn on the forearm only; the Bayesian probability is then computed, and the motion is determined by selecting the motion whose Euclidean distance is closest to a motion defined during learning.
Therefore, to control remote objects such as robots, more accurate and elaborate missions can be performed by calculating the displacement of the relative coordinates from the current starting position, providing richer motion recognition information. The approach of wearing Myo devices on the upper arm and forearm is generalizable to other parts of the body, such as the legs.
In this paper, the methodology for motion estimation is as follows. As input data, we used the raw orientation and EMG data measured at 30 frames per second while Myo devices were worn on the upper arm and forearm of the right arm for the five defined static motions. Of the data measured by the two Myo devices, the forearm device defines the independent motion and the upper-arm device defines the dependent motion, and their correlation is defined in the learning stage. In the recognition phase, the final motion is determined by comparing the raw orientation and EMG data measured by the single Myo device worn on the forearm against the motions defined in the learning stage, thereby estimating the motion of the upper arm. Detailed methodologies are described in the following chapters.
3. The Myo Device Motion Estimation Framework
A user’s motions can be estimated more accurately by utilizing multiple types of concurrently measured data. The following details the procedure of the proposed motion estimation method.
3.1. Measurement Methodology
The proposed method utilizes a wearable Myo device. To estimate a user’s motions, Myo devices are placed on the upper arm and forearm of one arm, and a subordinate relation between the two is set; for example, the upper arm can be defined as dependent on the forearm. Once learning is complete, this subordinate relationship enables estimation of the upper arm motion from the independent motion of the forearm in the recognition phase.
Figure 1 shows Myo devices attached to the forearm. In this paper, signal patterns are analyzed by measuring a user’s wrist movements with eight EMG sensors.
Figure 2 shows the EMG signals acquired after wearing Myo devices on the upper arm and forearm. The eight EMG channels produce different signals depending on the muscle activity being measured. As shown in
Figure 3b, one Myo device has eight EMG sensors built into the band. Therefore, the EMG signal corresponding to the muscle under each sensor in the wearable area is input to each of the eight EMG channels. The human arm is composed of several muscles with different strengths, depending on their degree of development and the magnitude of the force exerted; thus, the eight EMG signals show different measured values even when measured at the same time, according to the position of the measuring EMG channel. However, when the device is worn at the same position, a certain pattern is maintained depending on the motion.
The final motion is determined by selecting the motion corresponding to the signal with the highest Bayesian probability calculated by each signal.
The overall structure of the motion estimation approach is shown in
Figure 3. First, Bayesian probabilities are obtained using the x, y, and z orientations and the EMG data acquired from the Myo devices. Weights are then calculated and used to estimate the final motion.
The proposed method consists of data processing, training, and recognition stages. Orientation and EMG data are recorded with Myo devices. The orientation data are transferred to the Bayesian probability calculation step of the training stage. The calculated Bayesian probability is used in a genetic algorithm to obtain weights, which are then transferred to the motion estimation step in the recognition stage.
3.2. Data Measurement
In the data processing phase, the Myo device measures data. The data that can thus be obtained are shown in
Table 1.
In the proposed method, only the orientation and EMG data are used, but motion estimation accuracy could be further improved by applying the proposed method to additional types of data.
3.3. Training Stage
Training consists of two steps. The orientation and EMG data input from the data processing phase are processed in the Bayesian probability calculation step, as shown in
Figure 4.
The motions of the arm segment not wearing a Myo device are estimated from the orientation and EMG of the segment wearing one. Set the independent motion M_I and the dependent motion M_D, and set x_i, y_i, and z_i, where x_i, y_i, and z_i are the i-th possible values of the orientation values collected in advance, 1 ≤ i ≤ n, and n is the number of sections.
The weight calculation step is shown in
Figure 5.
W(w_x, w_y, w_z) is the weight function. Each weight ranges from −1 to 1 in increments of 0.2. For the orientation, three weights, w_x, w_y, and w_z, are utilized for the x, y, and z orientations, and together they are expressed as 15 bits (5 bits per weight) for the genetic algorithm. The fitness function for weight calculation is defined as follows: for the x, y, and z orientations, the fitness is the Euclidean distance between the weighted values and those of the previously defined motion, with the smallest distance being the fittest. The orientation fitness function is defined as shown in Equation (1), where w_x + w_y + w_z = 1. Therefore, each weight is normalized during the genetic algorithm process; for example, w_x, w_y, and w_z are each normalized by dividing by their sum, w_x + w_y + w_z.
The dependent motions are also calculated from the EMG channels e_1, …, e_8, obtaining the weights w_e1, …, w_e8 for the EMGs by the genetic algorithm.
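The 15-bit chromosome layout (5 bits per weight, with each weight in {−1.0, −0.8, …, 1.0}) and the sum-to-one normalization can be sketched as below. The wrap-around repair for 5-bit genes above 10 is an assumption made for the example; how invalid genes are handled is not specified here.

```python
def decode_weight(gene5):
    """Map a 5-bit gene (0-31) to a weight in {-1.0, -0.8, ..., 1.0}.
    Genes above 10 wrap around so that every gene decodes to a valid
    weight (an assumed repair strategy)."""
    return -1.0 + 0.2 * (gene5 % 11)

def decode_chromosome(chrom15):
    """Split a 15-bit chromosome into three 5-bit genes for (w_x, w_y, w_z)."""
    genes = [(chrom15 >> shift) & 0b11111 for shift in (10, 5, 0)]
    return [decode_weight(g) for g in genes]

def normalize(weights):
    """Normalize weights so they sum to 1, as Equation (1) requires."""
    total = sum(weights)
    if total == 0:
        return [1.0 / len(weights)] * len(weights)
    return [w / total for w in weights]
```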
3.4. Recognition Stage
The Bayesian probability calculation from the orientation and EMG data in the motion estimation step reflects the weights determined by the genetic algorithm, as shown in
Figure 6.
Then, the Bayesian probability updated by the weights is calculated, and the final motion is estimated. Of the Bayesian probabilities calculated from the x, y, and z orientations and the EMG data, the motion with the highest probability is selected as the final motion, as in Equation (3). Handling multiple data types makes it possible to estimate motions more accurately than traditional methods.
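This selection step can be sketched as follows. The combination rule is an illustrative assumption, since Equation (3) is not reproduced here, and the motion names, probabilities, and weights are made up for the example.

```python
def estimate_motion(orientation_probs, emg_probs, w_orient, w_emg):
    """Select the final motion as the argmax of weighted Bayesian
    probabilities, comparing the orientation-based and EMG-based scores.

    orientation_probs: {motion: (P_x, P_y, P_z)}
    emg_probs:         {motion: [P_e1, ..., P_e8]}
    w_orient:          (w_x, w_y, w_z); w_emg: [w_e1, ..., w_e8]
    """
    scores = {}
    for motion in orientation_probs:
        s_orient = sum(w * p for w, p in zip(w_orient, orientation_probs[motion]))
        s_emg = sum(w * p for w, p in zip(w_emg, emg_probs[motion]))
        scores[motion] = max(s_orient, s_emg)  # keep the stronger evidence
    return max(scores, key=scores.get)

# Hypothetical example with two candidate motions:
orient = {"wave": (0.9, 0.8, 0.7), "fist": (0.2, 0.1, 0.1)}
emg = {"wave": [0.1] * 8, "fist": [0.05] * 8}
final = estimate_motion(orient, emg, (0.5, 0.3, 0.2), [0.125] * 8)
```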