Review

Hand Gestures Recognition Using Radar Sensors for Human-Computer-Interaction: A Review

1
Department of Electronic Engineering, Hanyang University, 222 Wangsimni-ro, Seongdong-gu, Seoul 133-791, Korea
2
Robotics and Intelligent Machine Engineering (RIME), School of Mechanical and Manufacturing Engineering (SMME), National University of Sciences and Technology (NUST), H-12 Islamabad 44000, Pakistan
*
Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(3), 527; https://doi.org/10.3390/rs13030527
Submission received: 7 January 2021 / Revised: 30 January 2021 / Accepted: 1 February 2021 / Published: 2 February 2021

Abstract

Human–Computer Interfaces (HCI) deal with the study of interfaces between humans and computers. The use of radar and other RF sensors to develop HCI based on Hand Gesture Recognition (HGR) has gained increasing attention over the past decade. Today, devices have built-in radars for recognizing and categorizing hand movements. In this article, we present the first-ever review of HGR using radar sensors. We review the available techniques for multi-domain representation of hand-gesture data for different signal-processing- and deep-learning-based HGR algorithms. We classify the radars used for HGR as pulsed and continuous-wave radars, and both the hardware and the algorithmic details of each category are presented in detail. Quantitative and qualitative analyses of ongoing trends related to radar-based HCI, and of the available radar hardware and algorithms, are also presented. Finally, devices and applications based on gesture recognition through radar are discussed, along with the limitations, future aspects and research directions related to this field.

Graphical Abstract

1. Introduction

In recent years, computing technology has become embedded in every aspect of our daily lives, and man–machine interaction is becoming inevitable. It is widely believed that computer and display technology will keep progressing. A gateway which allows humans to communicate with machines and computers is known as the human–computer interface (HCI) [1]. Keyboards, mice and touch-screen sensors are the traditional HCI approaches. However, these approaches are becoming a bottleneck in the development of user-friendly interfaces [2]. In contrast, human gestures can offer a more natural way of providing an interface between humans and computers. Short-range radars have the ability to detect micro-movements with high precision and accuracy [3]. Radar sensors have shown potential in several research areas, such as presence detection [4], vital-sign monitoring [5], and Radio Frequency (RF) imaging [6]. Choi et al. [7] used an Ultra-Wideband (UWB) impulse radar for indoor people counting. Similar research presented in [8] used a millimeter-wave radar for occupancy detection. In addition, radar sensors have left their footprints in hand-motion sensing and dynamic HGR [9,10,11,12,13,14,15,16,17,18,19]. The interest in radar-based gesture recognition has surged in recent years.
Recently, radar sensors have been deployed in a network fashion for the detection and classification of complex hand gestures, enabling applications such as a wireless keyboard [9]. In the aforementioned study [9], in-air movement of the hand was recorded with three UWB radars, and a tracking algorithm was used to type the digits 0–9 in the air. Another study published by the same authors presented continuous alphabet writing based on gestures drawn in front of two radars [20]. A work presented by Ahmed and Cho [12] (2020) demonstrated a performance comparison of different Deep Convolutional Neural Network (DCNN)-based deep-learning algorithms using multiple radars. In addition, hand gesture recognition through radar technology has also found application in the operating room, assisting medical staff in processing and manipulating medical images [21].
Based on the transmitted signal, short-range radar sensors used for HGR can broadly be categorized as pulsed radar and continuous-wave (CW) radar. This categorization has been adopted previously in several radar-related review articles for applications other than HGR [22]. Pulsed radar, such as Ultra-Wideband Impulse Radar (UWB-IR), transmits short-duration pulses, whereas continuous-wave radar, such as Frequency-Modulated Continuous-Wave (FMCW) radar, transmits and receives a continuous wave. Both of these radars are widely used for HGR purposes.

1.1. Understanding Human Gestures

Prior to designing an HCI, an understanding of the term “gesture” is important. Researchers in [23] defined a gesture as a movement of any body part, such as the arms, hands or face, made in order to convey information. This nonverbal communication constitutes up to two thirds of all communication among humans [24]. Amongst the different body parts, hand gestures are the most widely used for constructing interactive HCIs (Figure 1) [25]. As seen in Figure 1, 44% of studies focused on developing HCIs using hand, multiple-hand and finger movements. Hand gestures are an important part of non-verbal communication in our daily life, and we extensively use them for communication purposes such as pointing towards an object and conveying information about shape and space. Utilizing hand movements as an input source, instead of a keyboard and mouse, can help people communicate with computers in an easier and more intuitive way. HGR systems have also found many applications in environments which demand contactless interaction with machinery, such as hospital surgery rooms [26,27], to prevent the spread of viruses and germs. As a result, contactless HCI can be a safe means of man–machine interaction in epidemiological situations such as the MERS and the recent and ongoing COVID-19 outbreaks.

1.2. Hand-Gesture Based HCI Design

The system overview of hand-gesture-based HCI development is shown in Figure 2. First, a neural spike is produced in the brain, which generates a signal that results in a voluntary motion of the hand. Various studies have tried to decode the brain signals corresponding to hand movement, and these signals can be observed through electrocorticography [28,29]. To detect the hand movements, several sensors exist, such as cameras, depth cameras, and radio sensors. The signal at the output of these sensors is analyzed using suitable algorithmic techniques to detect a predefined hand gesture. Researchers have used either signal-processing-based techniques [30], or machine-learning- and deep-learning-based techniques [14]. After successfully recognizing the desired hand movements, these gesture-based systems can be used to build different applications, such as gaming and robot controllers.
As shown in Figure 2, a wide range of sensors is available for acquiring signals from a performed hand gesture, and the radar sensor is one of the candidate solutions. Traditionally, optical sensors (cameras) and wearable sensors (gloves) have been widely used. These sensors can be classified as wearable and non-wearable. Table 1 summarizes the comparison of existing wearable and non-wearable (wireless) sensors used for recognizing hand gestures. It can be seen that both types of technology possess their own strengths and weaknesses and can be selected according to the requirements of the application under consideration. Both radars and cameras provide a wireless interface for gesture recognition. Radar sensors have several benefits over camera-based recognition systems [11]. Radar is not affected by lighting conditions, and there are no related privacy issues: users often do not feel comfortable being watched by a camera.
At present, some devices have a built-in radar for HGR; for example, Google’s Pixel 4 smartphone contains the Soli radar chip [6], which was designed and developed by Google Inc.
After data acquisition, the next step is processing the data and recognizing hand gestures. This includes data representation, useful feature extraction, and classification. The classification can be performed using signal-processing approaches, traditional machine-learning approaches [12,15,16,19] or deep-learning approaches [14].
One of the earliest uses of radar for gesture recognition was introduced in 2009 [31]. That research primarily focused on activity classification along with gesture classification. Zheng et al. [30], in 2013, presented hand gesture classification using multiple Doppler radars. In the beginning, researchers relied heavily on detection and recognition techniques based on the analysis and manipulation of the received radar signal; later, the focus shifted towards machine-learning- and deep-learning-based classification techniques.
Figure 3 summarizes the overall workflow of HGR through radar. As shown in Figure 3, based on the literature survey, it was observed that the task of radar-based HGR can be further classified into three different sub-tasks:
  • Hand-gesture movement acquisition, where one of the available radar technologies is chosen;
  • Pre-processing the received signal, which involves pre-filtering followed by data formatting that depends on step 3. For example, 1D, 2D and 3D deep Convolutional Neural Networks (DCNNs) require data in a 1D, 2D or 3D shape, respectively (see the sketch after this list);
  • The final step of hand-gesture classification is similar to any other classification problem, where the input data are classified using a suitable classifier.
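To make this formatting step concrete, the following NumPy sketch reshapes a block of radar returns into the 1D, 2D and 3D input shapes such classifiers expect. The array sizes are arbitrary, illustrative assumptions, not values from any reviewed study.

```python
import numpy as np

# Illustrative sizes only: 32 slow-time frames of 64 fast-time (range)
# samples from 2 receive channels; "raw" stands in for real radar returns.
frames, samples, channels = 32, 64, 2
raw = np.random.randn(frames, samples, channels)

# 1-D CNN input: a single profile with shape (length, channels)
x_1d = raw[:, :, 0].ravel()[:, np.newaxis]      # (2048, 1)

# 2-D CNN input: a time-range "image" with shape (height, width, channels)
x_2d = raw[:, :, 0][..., np.newaxis]            # (32, 64, 1)

# 3-D CNN input: 2-D maps stacked along a third axis (here, the receive
# channel acts as depth), shape (depth, height, width, channels)
x_3d = raw.transpose(2, 0, 1)[..., np.newaxis]  # (2, 32, 64, 1)

print(x_1d.shape, x_2d.shape, x_3d.shape)
```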

1.3. Main Contribution and Scope of Article

This article provides a comprehensive survey and analysis of the available literature on HGR through radar sensors. Previously, Li et al. [32] discussed a couple of studies related to the use of radar for gesture recognition while reviewing applications of portable radars. However, to the best of the authors’ knowledge, there is no review article on HGR through radar sensors. Researchers have previously reviewed camera- and optical-sensor-based HGR systems only [25]. The main contributions and scope of analysis can be defined as follows:
  • We provide the first-ever comprehensive review of the available radar-based HGR systems;
  • We discuss the different available radar technologies to comprehend their similarities and differences. All aspects related to HGR, including data acquisition, data representation, data preprocessing and classification, are explained in detail;
  • We explain the radar-recorded hand-gesture data representation techniques for 1D, 2D and 3D classifiers. Based on these data representations, details of the available HGR algorithms are discussed;
  • Finally, details related to application-oriented HGR research works are also presented;
  • Several trends and survey analyses are also included.
The remainder of this paper is organized as follows. Section 2 provides an overview of hand-gesture data acquisition with radar hardware, including a discussion of the different types of radar technology currently in use; there, we divide the radar hardware into two categories: pulsed radars and continuous-wave radars. Section 3 describes the methods used to represent radar-recorded hand-gesture data. Section 4 deals with the HGR algorithms for the different types of radar. In Section 5, we present a summary of observed trends, limitations and future directions. Finally, Section 6 summarizes the paper.

2. Hand-Gesture Signal Acquisition through Radar

As stated above, based on the nature of transmitted signals, radar technologies can be classified into two categories [32]:
  • Pulsed Radar;
  • Continuous-Wave (CW) Radar.
CW radars can further be classified as Frequency-Modulated CW (FMCW) radars and Single-Frequency CW (SFCW) radars. Several studies also refer to SFCW radars as Doppler radars [33], since their detection mechanism relies heavily on the Doppler phenomenon. An SFCW radar transmits a single-tone frequency signal, and a shift in the frequency of the received signal occurs when it encounters the hand. In contrast, FMCW radar transmits a signal of varying frequency. FMCW radar is also capable of exploiting the Doppler phenomenon.
Table 2 summarizes the use of the above-mentioned radars for HGR. Several researchers have tried to build customized hardware for gesture recognition. Here, we have classified the available research based on the scope of each article. Since companies such as Texas Instruments (TI), Dallas, TX, USA are developing multi-purpose radars suitable for several short-range applications [34], academic researchers have devoted more effort to developing HGR algorithms for existing commercial sensors than to developing new hardware. In addition to the mentioned radar categories, several researchers have utilized other radio sensors to implement the phenomenon of Radio Detection and Ranging. For example, Islam and Nirjon [35], and Pu et al. [36], used transmitted Wi-Fi signals and their corresponding reflections to sense different gesture movements.

2.1. Pulsed Radars

Pulsed radars transmit an impulse-like signal which has a wide frequency spectrum. Transmission and reception systems based on pulsed signals are usually termed Ultra-Wideband (UWB) communication systems. These systems have a wide frequency spectrum and usually have a Power Spectral Density (PSD) lower than that of the noise. A modern UWB transmitter–receiver pair comprises nearly “all-digital” components and has minimal radio frequency (RF) or microwave components. Consequently, radars based on UWB technology have a smaller size and can provide compact, portable radar hardware.
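As an illustration of these properties, the sketch below generates a Gaussian monocycle, a common textbook model of a UWB pulse, and estimates its occupied bandwidth. The pulse width and sampling rate are illustrative assumptions, not parameters of any radar reviewed here.

```python
import numpy as np

# Gaussian monocycle: first derivative of a Gaussian, a common UWB pulse model.
fs = 20e9                      # assumed sampling rate: 20 GS/s
tau = 200e-12                  # assumed pulse-width parameter: 200 ps
t = np.arange(-2e-9, 2e-9, 1/fs)
pulse = -t / tau**2 * np.exp(-t**2 / (2 * tau**2))
pulse /= np.abs(pulse).max()

# The spectrum is wide (hence "Ultra-Wideband"), spreading energy thinly in PSD.
spectrum = np.abs(np.fft.rfft(pulse))
freqs = np.fft.rfftfreq(len(pulse), 1/fs)
band = freqs[spectrum >= spectrum.max() / np.sqrt(2)]   # approx. -3 dB band
print(f"approx. -3 dB band: {band.min()/1e9:.2f} to {band.max()/1e9:.2f} GHz")
```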
As seen in Table 2, most of the research focused on HGR algorithm development for pulsed radars. However, several researchers have also tried to design customized pulsed radar hardware dedicated solely to hand gesture recognition. For example, Arbabian and co-workers from Stanford University, California 94305, USA [37] designed a pulsed radar transceiver architecture intended specifically for gesture recognition and radar imaging applications. A single-chip radar was developed using a 0.13 µm SiGe BiCMOS process. The proposed hardware, shown in Figure 4, consists mainly of a pair of transmit and receive antennas, quadrature mixers, a Phase-Locked Loop (PLL), a Power Amplifier (PA), and a Low-Noise Amplifier (LNA).
In references [9,11,13,37], the signal transmitted by the UWB radar, s(t), is considered to be reflected back along several different paths. The received echoes from N different paths are digitized as x[k] using an analog-to-digital converter (ADC), and can be represented as [7]
$$x[k] = \sum_{m=1}^{N_{path}} A_m \, s[k - t_m] + noise \qquad (1)$$
In Equation (1), k represents the index of the digitized sample, and N_path and noise represent the number of multipath reflections and the noise signal, respectively. When a signal reflected from the hand is received, it also contains unwanted reflections from other objects in the radar’s operational range, commonly termed clutter. It was observed that all the articles related to HGR using pulsed radar first performed a clutter-removal operation before further processing, and a loop-back filter is a common choice [11,12,50,77], as it exhibits a simple structure and its computational cost is minimal in comparison to other clutter-reduction filters. The operation of this filter can be defined as
$$c_n[k] = \alpha \, c_{n-1}[k] + (1 - \alpha) \, x_n[k] \qquad (2)$$
where c_n[k] represents the clutter estimate for the n-th received frame, x_n[k] represents the received signal as defined by Equation (1), and α is a weighting factor. This clutter estimate is then subtracted from (1) to obtain clutter-free reflections of the human hand, which are further exploited by pattern-recognition techniques to recognize the desired hand gesture. Other clutter-removal filters used for UWB radar include Singular Value Decomposition (SVD)-based filters and Moving Target Indicator (MTI) filters.
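A minimal sketch of this clutter-removal step is given below, assuming the frame-recursive reading of Equation (2); the weighting factor, initialization and array layout are illustrative choices, not values from the cited works.

```python
import numpy as np

def loopback_clutter_removal(frames: np.ndarray, alpha: float = 0.97) -> np.ndarray:
    """Subtract a running clutter estimate from each radar frame.

    frames: (n_frames, n_samples) array of received UWB frames x_n[k].
    alpha:  weighting factor of Equation (2); closer to 1 gives a
            slower-adapting clutter estimate.
    Returns the clutter-suppressed frames x_n[k] - c_n[k].
    """
    clutter = frames[0].copy()          # initialize the estimate with frame 0
    out = np.zeros_like(frames)
    for n in range(1, frames.shape[0]):
        clutter = alpha * clutter + (1 - alpha) * frames[n]   # Equation (2)
        out[n] = frames[n] - clutter    # keep only moving (hand) reflections
    return out
```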

2.2. CW Radars

In this section, the two main types of CW radar used to sense hand movements are presented together.

2.2.1. SFCW Radar

Single-frequency radars sense the target based on the frequency shift between the transmitted and received signals. In the case of hand gestures, the Doppler shift caused by the motion of the hand and fingers is limited to a range of several hertz [40]. Consequently, it is easier to design radar hardware to detect hand gestures, as the required baseband devices and ADCs operate over a limited frequency range.
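As a worked example of this limited bandwidth (with illustrative, not measured, values), the Doppler shift produced by a hand moving at speed v towards a radar with carrier frequency f_c is f_D = 2 v f_c / c. Assuming a 2.4 GHz carrier and a hand speed of 0.25 m/s,

$$f_D = \frac{2 v f_c}{c} = \frac{2 \times 0.25 \times 2.4\times10^{9}}{3\times10^{8}} \approx 4\ \text{Hz},$$

so the baseband chain only needs to resolve frequencies of a few hertz.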
Figure 5 shows Doppler radar hardware specifically designed for hand-gesture acquisition, designed by [40]. The transmitter proposed in this work mainly consists of a low-frequency crystal oscillator (denoted as fsub, with a frequency of 6 MHz) mixed with a local oscillator (LO), a band-pass filter, a power amplifier and a transmitting antenna. On the receiving side, upon reflection from the hand, the received signal is demodulated and passed through a band-pass filter to obtain the In-phase component I(t) and the Quadrature-phase component Q(t), which can be expressed as
$$I(t) = A_I(t) \cos\left[\theta_0 + \frac{4\pi X(t)}{\lambda} + \phi_0\right] + DC_I(t) \qquad (3)$$
$$Q(t) = A_Q(t) \sin\left[\theta_0 + \frac{4\pi X(t)}{\lambda} + \phi_0\right] + DC_Q(t) \qquad (4)$$
Here, “DC” represents the DC offset, A_I represents the in-phase amplitude, and A_Q represents the quadrature-phase amplitude of the received signal; X(t) is the time-varying distance of the hand. The wavelength, phase noise and phase delay are represented by λ, φ₀ and θ₀, respectively.
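Given Equations (3) and (4), the hand displacement X(t) can be recovered by arctangent demodulation of the I/Q pair. The sketch below simulates this, assuming unit amplitudes and DC offsets that have already been calibrated out; all values are illustrative.

```python
import numpy as np

fs, wavelength = 1000, 0.125            # 1 kHz sampling; lambda for ~2.4 GHz
t = np.arange(0, 2, 1/fs)
x_true = 0.01 * np.sin(2*np.pi*1.0*t)   # hand sways +/-1 cm at 1 Hz

phase = 4*np.pi*x_true/wavelength       # motion term of Equations (3)-(4)
i_sig = np.cos(0.3 + phase)             # I(t); theta_0 + phi_0 folded into 0.3
q_sig = np.sin(0.3 + phase)             # Q(t); unit amplitude, no DC offset

# Arctangent demodulation: unwrap the I/Q phase and scale back to distance.
x_est = np.unwrap(np.arctan2(q_sig, i_sig)) * wavelength / (4*np.pi)
x_est -= x_est.mean()                   # remove the constant phase offset
print(f"max reconstruction error: {np.abs(x_est - x_true).max():.2e} m")
```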

2.2.2. FMCW Radar

FMCW radar transmits a signal of varying frequency. A low-frequency signal is transmitted first, and the frequency is then increased continuously over a certain bandwidth span. The time duration during which the signal sweeps from the low to the high frequency is known as the chirp time.
Figure 6a shows typical FMCW radar hardware for hand-gesture sensing, designed by Zhang et al. [15]. The front end of the radar is based on a BGT24MTR12 transceiver chip developed by Infineon Technologies, and a Cyclone III FPGA chip developed by Altera. The proposed radar operates in the K-band with a center frequency of 24 GHz. As shown in Figure 6b, the transmitted signal frequency constantly increases between fmin and fmax, spanning the bandwidth B; this is commonly known as the chirp signal. The transmitted signal is expressed as
$$S(t) = e^{\,j2\pi \left( f_c t + \frac{1}{2}\frac{B}{T} t^2 \right)} \qquad (5)$$
where f_c is the carrier frequency, B is the bandwidth of the chirp, and T is the chirp time. The corresponding received signal is a delayed copy of the transmission with delay factor (t − τ). A series of such chirps is usually transmitted to sense the target movement.
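To make the chirp model of Equation (5) concrete, the following sketch simulates one chirp, mixes it with the echo delayed by τ, and recovers the target range from the beat frequency f_b = Bτ/T. All radar parameters are illustrative assumptions, not the specifications of [15].

```python
import numpy as np

c = 3e8
fc, B, T = 24e9, 250e6, 1e-3        # assumed carrier, chirp bandwidth, chirp time
fs = 2e6                             # assumed ADC rate for the low-frequency beat
t = np.arange(0, T, 1/fs)
R = 0.6                              # hand placed at 0.6 m
tau = 2*R/c                          # round-trip delay

tx_phase = 2*np.pi*(fc*t + 0.5*(B/T)*t**2)                  # Equation (5)
rx_phase = 2*np.pi*(fc*(t - tau) + 0.5*(B/T)*(t - tau)**2)  # delayed echo
beat = np.exp(1j*(tx_phase - rx_phase))                     # after mixing

# The beat frequency maps linearly to range: R = c*T*f_b / (2*B)
spec = np.abs(np.fft.fft(beat))
f_b = np.fft.fftfreq(len(beat), 1/fs)[np.argmax(spec)]
print(f"estimated range: {c*T*f_b/(2*B):.2f} m")            # ~0.60 m
```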
In Section 2, we described the basic radar hardware and the corresponding data-acquisition framework. After data acquisition, the next key step is data representation.

3. Hand-Gesture Radar Signal Representation

Capturing the finest details of human hand movement is a challenging task, since it requires all the backscattered radar signals to be saved properly. As stated in the above section, the data can be expressed in several formats:
  • Time-Amplitude: The time-varying amplitude of the received signal is exploited to extract a hand-gesture motion profile. The signal, in this case, is 1-Dimensional (1-D), as represented in Figure 7a. For gesture recognition, this type of signal has been used as input to deep-learning classifiers such as a 1-D CNN [51], where several American Sign Language (ASL) gestures were classified, as shown in Figure 7a. This representation of the hand-gesture signal can also be used to build simple signal-processing-based classifiers [62];
  • Range-Amplitude (Figure 7b): The intensity (amplitude) of reflections at different distances is used to extract features, as mentioned in [75];
  • Time-Range: The time-varying distance of the received signal is used to classify hand gestures. The variation in the distance of the hand is recorded over time to obtain a 1-D [30] or 2-D signal [12]. For example, the authors in [38] used the Time-Range 2D gesture data representation shown in Figure 7c. For gesture recognition, authors have used 2-D and 3-D Time-Range signals to drive 2-D [13,20] and 3-D [12,38] CNNs;
  • Time-Doppler (frequency/speed): The time-varying Doppler shift is used to extract features of hand gestures; for example, the authors in [54] used the change in Doppler frequency over time as input to a CNN for gesture classification (a sketch of how such a time-Doppler map can be constructed follows this list);
  • Range-Doppler frequency/Time-Doppler speed (Figure 7d): The rate of change of the Doppler shift with respect to distance is used to extract features of hand movements. Several authors have used range and Doppler velocity to represent hand gestures [55];
  • Time-Frequency (Figure 7f): The change in frequency over time is observed to detect hand gestures [67].
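As referenced in the Time-Doppler item above, the sketch below builds such a time-Doppler map from a simulated baseband Doppler return using the Short-Time Fourier Transform (STFT); the signal model and STFT parameters are illustrative assumptions, not those of any reviewed study.

```python
import numpy as np
from scipy.signal import stft

# Simulated baseband Doppler return of a gesture: the Doppler shift rises
# as the hand accelerates, then falls (illustrative, not measured data).
fs = 2000
t = np.arange(0, 1.0, 1/fs)
doppler = 150 * np.sin(np.pi * t)                 # time-varying Doppler shift (Hz)
sig = np.exp(1j * 2*np.pi * np.cumsum(doppler) / fs)

# STFT -> Time-Doppler map, a 2-D "image" suitable for a 2-D CNN.
f, frames, Z = stft(sig, fs=fs, nperseg=128, noverlap=96, return_onesided=False)
td_map = 20*np.log10(np.abs(np.fft.fftshift(Z, axes=0)) + 1e-9)  # dB scale
print(td_map.shape)   # (Doppler bins, time frames)
```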

4. HGR Algorithms

In this section, the available HGR algorithms for pulsed and continuous-wave radars are discussed in detail. The scope of detail for each reviewed research work includes: (1) the radar-recorded hand-gesture data format; (2) the dimensions of the prepared final data, which are intended to present the hand-movement profiles; (3) the details of the HGR algorithm used; (4) the operating frequency of the radar. All details and the scope of these studies are summarized in a separate table for each type of radar. Since each study used its own set of gestures and hardware, the overall accuracy is not mentioned in the tables, to avoid comparative confusion amongst the presented studies.

4.1. HGR Algorithms for Pulsed Radar

Table 3 presents the overall summary of all the pulsed-radar studies reviewed in this paper. Early works on recognizing hand gestures mainly relied on signal-processing techniques [37], in which the received radar signals are manipulated to recognize and classify hand gestures. Most of the pulsed-radar-based research shown in Table 3 used machine-learning-based techniques for HGR. As stated in the previous section, one of the early pieces of research into pulsed-radar-based HGR [37] mainly focused on customized hardware development. In [49], six different gestures were captured, and an overall 86% success rate was reported using simple conditional statements based on the distance and direction of the performed hand gestures. A simple 1-D distance-based thresholding technique was employed to differentiate the gestures. The technique worked well for the defined gestures; however, since only one parameter was used to differentiate the gestures, it may not work well for gestures other than the ones defined by the authors. Park and Cho (2016) [50] captured five gestures and used a Support Vector Machine (SVM) to classify hand gestures. The authors exploited Principal Component Analysis (PCA) as a feature set for the SVM classifier, and an accuracy of 99% was reported. Khan and Cho (2017) [48] performed a classification of five gestures using a neural network driven by three features, namely, distance variation, magnitude variance and surface-area variance. Another similar study in [47] implemented HGR inside a car using a UWB radar. The five gestures used in this study consisted of three small gestures, where only the fingers are moved, and two big gestures, where the full hand is moved. An additional empty gesture was also added. The authors used only the three-feature-driven unsupervised clustering algorithm and reported an accuracy of 97%. All the gestures were performed inside a car. Kim et al. [51] used a pulsed radar to classify six dynamic gestures from American Sign Language (S, E, V, W, B, and C), and reported a 90% accuracy rate. The authors utilized a CNN-based feature extraction and classification algorithm on five different hand gestures. Ahmed and coworkers [11] used a six-layered CNN to classify finger-counting gestures. This study reported a high accuracy of 96%. This was the first study to present a complete methodology for implementing deep learning with UWB pulsed radars. The authors also provided a technique to convert the UWB radar signal into an image for the CNN architecture. In [38], a team of researchers from Sweden used ResNet-50 to classify 12 hand gestures and demonstrated a classification accuracy of 99.5%.
As stated in the introduction, researchers in [9] used multiple radars to design a wireless digit-writing application. The authors used both a radar-tracking algorithm and deep learning to classify the hand movements used to write digits. First, the hand movement was tracked in a 2D plane, and then the tracked trajectory was fed as input to a CNN algorithm. The work presented in [13] described a feature-based SVM classifier for classifying simple pointing gestures. The feature-extraction technique used in this study, the Histogram of Oriented Gradients (HOG), was adopted directly from the image-processing field. Heunisch and coworkers (2019) [39] used a transmitter and receiver pair to build a pulsed radar. They transmitted a wavelet signal as a series of pulses instead of square impulses: a normal square pulse was generated using a commercial chip (Agilent N4906B) and fed as input to an in-house wavelet-function generator. The authors successfully classified three postures, rather than dynamic gestures. Previously, Ahmed and Cho [12] presented a classifier for hand gesture recognition and classification based on the GoogLeNet architecture shown in Figure 8. Rather than only increasing the number of layers linearly, structural variations were also introduced to form a very deep CNN algorithm using inception modules. The deep network, comprising seven inception modules, is shown in Figure 8, where the data from two radars were first converted into 3D Red, Green and Blue (RGB) images and fed as input to the feature-extraction block of the deep network. The research presented in [53] used a drawn pattern of hand movement as a gesture and proposed a remote-authentication application. The paper provided two-stage authentication: first recognizing a pattern drawn in the air, followed by a mid-air signature drawn using a tracking algorithm (i.e., a Kalman filter). A recent study published by Khan and coworkers [20] implemented continuous writing using multiple radars and a CNN. In this study, continuous letters were recognized using a CNN, based on the pattern drawn in the air.
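For readers unfamiliar with the inception module mentioned above, the following is a minimal PyTorch sketch of one such block. The branch structure follows the generic GoogLeNet design; the channel counts are illustrative and are not those used in [12].

```python
import torch
import torch.nn as nn

class InceptionModule(nn.Module):
    """GoogLeNet-style inception block: parallel 1x1, 3x3 and 5x5
    convolutions plus a pooled branch, concatenated along channels."""
    def __init__(self, in_ch: int):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, 16, kernel_size=1)
        self.b3 = nn.Sequential(nn.Conv2d(in_ch, 8, 1),          # 1x1 bottleneck
                                nn.Conv2d(8, 16, 3, padding=1))
        self.b5 = nn.Sequential(nn.Conv2d(in_ch, 8, 1),
                                nn.Conv2d(8, 16, 5, padding=2))
        self.bp = nn.Sequential(nn.MaxPool2d(3, stride=1, padding=1),
                                nn.Conv2d(in_ch, 16, 1))

    def forward(self, x):
        return torch.cat([self.b1(x), self.b3(x), self.b5(x), self.bp(x)], dim=1)

# Example: an RGB image built from radar data, as in Figure 8.
x = torch.randn(1, 3, 64, 64)
print(InceptionModule(3)(x).shape)   # torch.Size([1, 64, 64, 64])
```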
Table 3 shows that, for classifiers based on deep-learning algorithms, the 2D and 3D data representations shown in Figure 7c–f are commonly used. For example, the work presented in [54] constructed an RGB image by observing the change in Doppler frequency over a specific time duration, and used this image as input to a three-layered DCNN algorithm, resulting in 87% accuracy on 10 hand gestures.

4.2. HGR through CW Radar

In this section, HGR through CW radar is discussed in detail for both the SFCW and the FMCW radar.

4.2.1. HGR through SFCW (Doppler) Radar

Table 4 summarizes all the SFCW-radar-based studies considered in this review. One of the initial studies that used an SFCW radar for HGR was reported by Kim et al. in 2009 [31], which used a multi-class SVM classifier to recognize seven different activities. As mentioned in Table 4, the radar data were represented as a 3D image by taking the Short-Time Fourier Transform (STFT) of 6 s of raw radar returns. The features extracted in this study were based on the Doppler characteristics of the signal reflected from the hand. Rather than using a machine-learning technique, another study [30] used a differentiate-and-cross-multiply (DACM)-based signal-processing technique to classify seven hand gestures. Similarly, several studies [21,58,63] used k-Nearest Neighbor (kNN) classifiers for gesture recognition. In [63], the authors used Doppler-frequency and time features, along with several additional features based on variations in physical attributes while performing the gesture, such as the direction and speed of the hand movement.
In the early years, many attempts were made to extract rich features from radar-recorded hand-gesture signals. Nowadays, however, rather than extracting hand-crafted features, the de-noised input data are represented in a suitable format and a deep-learning-based classifier is used. For example, the radar assembly shown in Figure 9, presented by Skaria et al. [14], used a DCNN on the raw time-Doppler (frequency) signals. The commercial radar sensor, consisting of two antennas, was used to acquire the data, which were converted into an image using the STFT algorithm. These frequency-domain image data were passed to a DCNN architecture for convolutional feature extraction, and finally the classification operation was performed. It can be observed that, over the past decade, for the same kind of sensor data, the trend has shifted from feature-based classification [31] to deep-learning-based classification [14]. The rest of the studies are summarized in Table 4.
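A minimal sketch of such a spectrogram-to-DCNN pipeline is shown below; the architecture, input size and number of classes are illustrative assumptions, not the network of [14].

```python
import torch
import torch.nn as nn

# Compact DCNN that classifies time-Doppler "images" into gesture classes.
# Assumed input: (batch, 1, 128, 128) spectrograms; 10 classes assumed.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 16 * 16, 128), nn.ReLU(),
    nn.Linear(128, 10),                       # logits over gesture classes
)
logits = model(torch.randn(4, 1, 128, 128))
print(logits.shape)   # torch.Size([4, 10])
```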
Sakamoto and co-workers published two studies [64,65] using I-Q plots and CNN architectures. In the first study, only three gestures were used, whereas the second study demonstrated performance on six gestures with an accuracy above 90%. Sang et al. [55] used a Hidden Markov Model (HMM) and Range-Doppler image features to classify seven gestures with 96.34% accuracy. These gestures were all based on finger movements rather than full-hand movements. According to the authors, their main focus was on developing a real-time algorithm; based on a comparison of computational expense, the HMM was selected rather than a DCNN. Amin and coworkers [58] used kNN with PCA-based features to classify 15 gestures. The researchers in [14] implemented a DCNN based on images generated by the Short-Time Fourier Transform; a total of 14 gestures were exploited in this study. Miller et al. [21] recently proposed a wearable sensor based on Doppler radar, named RadSense, intended for use in operating theaters. The sensor is pseudo-contactless and is supposed to be worn on the chest. Four gestures were used, with an average accuracy of 94.5%. A study published by [33] presented a feature-extraction method, followed by cross-correlation- and peak-detector-based gesture classification. Another very recent study published by Yu and co-workers [57] reported 96.88% accuracy for Doppler-radar-based HGR using spectrograms as input to a DCNN network. A summary of all these studies can be seen in Table 4.

4.2.2. HGR Algorithms for FMCW

In 2015, researchers from the NVIDIA group [45] developed an FMCW radar with one transmitter and four receivers to classify three gestures. The same NVIDIA group published another study, aiming to provide a gesture-based assistive interface for drivers using a DCNN algorithm. The presented framework was based on three different sensors, namely, (1) a color camera, (2) a depth camera and (3) an FMCW radar. Prominent and ground-breaking research was performed by Lien and coworkers from Google, who launched a project named Soli, a multipurpose FMCW radar [17]. This radar successfully recognized micro-movements such as finger-tip movement and moving the thumb over a finger. The process diagram and pipeline of the Soli radar are shown in Figure 10. The received signals are first processed using analog circuitry, followed by a digital signal-processing stage, where features are extracted and gesture recognition is performed. Soli demonstrated several use cases, such as a finger-movement-based slider and a volume controller. This radar was later used in a Google smartphone as well.
Other research mainly focused on designing HGR algorithms. For example, in reference [72], the researchers used an HMM to classify hand gestures with FMCW radar and reported 82% accuracy for five different hand gestures. Researchers in [71] used the Soli radar developed by Google [17], along with deep learning, to classify hand gestures. Similarly, another study, presented in [75], used the Soli radar in a project titled “RadarCat” to classify different objects and to provide HCI for digital painting; a random forest algorithm was used for gesture recognition. Dekker and Geurts [43] used a CNN and a low-powered FMCW radar chip in their work. Another work in [46] used Doppler processing to classify two different continuous gestures. A team of engineers from TI [34], USA, designed and developed a multipurpose FMCW radar and demonstrated its use for car-door opening and closing. Hazra and Santra [44], from Infineon Technologies AG, Neubiberg, Germany, developed FMCW radar hardware along with a deep-learning algorithm; these researchers developed a penny-sized FMCW radar chip, titled “BGT60TR24”, which is also available commercially. Ryu et al. [42] used a quantum-inspired evolutionary algorithm (QEA) driven by features extracted using the Short-Time Fourier Transform (STFT) to classify seven different gestures, and reported 85% classification accuracy. Suh et al. [70] developed customized hardware comprising one transmitter and four receivers, and classified seven different gestures. The study used the Range-Doppler Map (RDM) as input to a Long Short-Term Memory (LSTM)-encoder-based neural network to classify hand gestures. Researchers in [74] used kNN with k = 10 to classify seven hand gestures. They used the RDM and micro-Doppler signatures to extract five features, namely, the number of chirp cycles, the bandwidth of the Doppler signal, the median Doppler frequency, the normalized standard deviation, and the ratio between negative and positive Doppler frequency components. A five-fold cross-validation analysis demonstrated 84% accuracy. In [69], the same authors published a conference paper with similar specifications to those mentioned in the above study. In that study, an additional Connectionist Temporal Classification (CTC) algorithm was also utilized to predict the class labels from unsegmented input streams.
Choi and co-workers [16] developed a real-time algorithm for HGR. The HGR framework presented by Choi et al. [16] was based on an LSTM encoder with a Gaussian Mixture Model (GMM)-based clutter-removal operation. The study first detected the gestures using the well-known Constant False Alarm Rate (CFAR) algorithm, followed by gesture recognition based on the LSTM. Ten different hand gestures were recognized, and the proposed algorithm is claimed to work well even in low-SNR conditions. As with the pulsed and Doppler radars, the details regarding the use of FMCW radar are summarized in Table 5 for convenience. As in Table 3 and Table 4, which summarize the pulsed and Doppler radars, the final classification accuracy is not mentioned in Table 5.
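To illustrate the recurrent approach used in studies such as [16,70], the sketch below encodes a gesture's sequence of flattened range-Doppler maps with an LSTM and classifies the final hidden state; all dimensions are illustrative assumptions rather than the configurations of the cited works.

```python
import torch
import torch.nn as nn

class RDMSequenceClassifier(nn.Module):
    """LSTM encoder over a gesture's sequence of range-Doppler maps (RDMs)."""
    def __init__(self, rdm_bins: int = 32 * 16, hidden: int = 128, classes: int = 7):
        super().__init__()
        self.lstm = nn.LSTM(input_size=rdm_bins, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, classes)

    def forward(self, x):                 # x: (batch, frames, rdm_bins)
        _, (h_n, _) = self.lstm(x)        # h_n: final hidden state of the encoder
        return self.head(h_n[-1])         # logits over gesture classes

# Assumed input: 20 RDM frames per gesture, each 32 range x 16 Doppler bins.
seq = torch.randn(4, 20, 32 * 16)
print(RDMSequenceClassifier()(seq).shape)   # torch.Size([4, 7])
```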
A summary of all algorithms presented in this article related to HGR using radar can be seen in Table 3, Table 4 and Table 5.

5. Summary

5.1. Quantitative Analysis

In this section, we summarize the trends and researchers' preferences regarding the selection of technology and algorithms for HGR. Figure 11a represents the publications related to HGR with portable radars per year. Figure 11a suggests that the trend is increasing steadily year on year, with the maximum number of papers seen in 2019. It can be expected that 2020 will ultimately account for the largest amount of research. Similarly, Figure 11b shows the trend in the radar technology used for HGR. It can be seen that CW radar is the most used for gesture recognition. The trends also suggest that Doppler and FMCW radars are equally widely used.

5.2. Nature of Used Hand Gestures and Application Domain Analysis

Several studies of camera-based gesture recognition have focused on the detection and recognition of standard hand gestures, such as American Sign Language (ASL) recognition. However, in the case of radar sensors, most of the studies presented in Table 2 primarily demonstrated the effectiveness of their HGR algorithms using arbitrary, author-defined gestures. The main reason for this is that radar is better suited to recognizing dynamic gestures than static hand gestures, and there is no standardized gesture vocabulary for dynamic gestures. It was observed that hand-swiping gestures, such as left-to-right, right-to-left, up-to-down and down-to-up swipes, are the most widely used hand gestures. With few exceptions, these four gestures are usually present in the gesture vocabulary used to evaluate the effectiveness of HGR algorithms. Additionally, hand gestures can be categorized as (1) micro-gestures and (2) macro-gestures. Micro-gestures involve minimal movement, such as finger movements [11], whereas macro-gestures are large movements in which the whole hand is displaced from one position to another.
As stated above, most of the research used arbitrary, author-selected gestures to evaluate accuracy. In contrast, several authors developed algorithms for specific applications, such as in-air digit writing [9], digital painting [75], and scroll-function implementation using finger swipes [17]. Table 6 outlines the details of HCI applications built using radar sensors. Leem et al. [9] demonstrated an implementation of a wireless keyboard using UWB impulse radar to write digits in the air; three radars were deployed to track the hand movement. Another study shows the implementation of digital painting using an FMCW radar [75]. Ahmed and co-workers [11] used the count of raised fingers as input to a single UWB impulse radar to control devices inside a car. Google Soli demonstrated several actions based on hand-movement detection, including a scroll function using finger movement, a virtual dial tool using finger rotation, and a slider tool implemented in two different zones of the RCS. In [48], researchers demonstrated the use of a pulsed UWB radar to provide HCI inside vehicles. The remaining studies are mentioned in Table 6.

5.3. Real-Time HGR Examples

This section highlights a few of the many real-time HGR algorithms. Many successful attempts have been made to recognize hand gestures in real time [9,15,16,18,20,57,70]. The authors of [18] developed a real-time solution for HGR using chips from different vendors: the sensing radar was adopted from TI (AWR1642), and the signal-processing block (AURIX TC397 microcontroller) was adopted from Infineon Technologies, Am Campeon 1-15, 85579 Neubiberg, Germany. The processing times for feature extraction and gesture classification were 0.26 and 0.01 milliseconds, respectively. Another study [70] connected a custom radar to a computer using a parallel interface and reported a processing time of less than 200 milliseconds. Leem et al. [9] demonstrated in-air writing with a Novelda radar connected to an Intel Core i5 PC (with 8 GB RAM), and reported a processing time of 50 milliseconds. Without implementing a real-time solution, the authors of [15] noted that their system could be used in real time.

5.4. Security and Privacy Analysis of Radar-Based HGR Systems

As stated in the introduction, unlike cameras, radar sensors do not carry serious privacy concerns: only the user's hand movements are recorded for gesture recognition, instead of filming the user. However, concerns may be raised regarding the security of radar-based gesture-recognition devices. The research works reviewed in this article did not cover issues related to the security of radar systems.

5.5. Commercially Available Radars for HGR

Since many studies used commercially available radars, this section provides additional details of these commercial solutions. Table 7 lists a few examples of commercially available radars that can be used for gesture recognition. For each of these radars, a reference example is also included.

5.6. Ongoing Trends, Limitations and Future Directions

Figure 1 suggests that hand gestures are the most widely used gestures for building HCIs. Additionally, the trends shown in Figure 11a indicate that, over the last decade, the use of radar sensors for gesture recognition has increased rapidly, with particularly strong growth in the last few years. The types of radar used for HGR are (1) pulsed radars (Ultra-Wideband impulse radars) and (2) CW radars. The trends can be summarized as follows:
  • Continuous-wave radars (FMCW and Doppler radar) are the most widely used radars for HGR;
  • All the research presented in this paper used a single hand for gesture recognition; the detection of gestures performed by two hands simultaneously has yet to be explored;
  • With few exceptions [11,17,71], macro-movements that involve motion of the full hand are usually considered when selecting gestures. Here, the term macro refers to movement of the whole hand rather than of part of the hand;
  • The most commonly used machine-learning algorithms for gesture classification are kNN, Support Vector Machines (SVM), CNNs and LSTMs;
  • For pulsed radar, raw-data-driven deep-learning algorithms are mostly used for HGR; in comparison to CW radars, less work has been done on feature extraction;
  • For FMCW and SFCW radars, various feature-extraction techniques exist. In contrast, for pulsed radar, most of the studies used deep-learning approaches, and hand-crafted features for classification are often not considered; there is a need for a strong set of features for pulsed UWB radars. All the machine-learning-based classifiers utilized supervised learning only;
  • Experimentation is normally performed in a controlled lab environment, and scalability to outdoor spaces, large crowds and other indoor spaces needs to be tested. Real-time implementation is another challenge, particularly for deep-learning-based algorithms; several studies performed offline testing only. Usually, the gesture set is limited to 12–15 gestures, and each gesture is classified separately. Classifying a series of gestures and continuous gestures remain open issues;
  • The Soli radar has been used in smartphones and smartwatches. However, most of the research did not suggest any strategy for making gesture-recognition radars interoperable with other appliances;
  • Researchers have focused on training algorithms using supervised machine-learning concepts only; unsupervised machine-learning algorithms hold great potential for future gesture-recognition systems;
  • The security of radar-based HGR devices has yet to be explored.

6. Conclusions

A huge upsurge and rapid advancement of radar-based HGR has been witnessed over the past decade. This paper reviewed the research related to HGR applications using radars. Currently, researchers rely heavily on commercially available radars made by tech companies such as Infineon, Novelda and Texas Instruments. With these systems available on chips, much attention has been paid to developing gesture detection and recognition algorithms. In recent years, interest has been shifting from signal-processing-based HGR algorithms to deep-learning-based algorithms; in particular, variants of the CNN have shown promising applicability. Although radar sensors offer several advantages over other HGR sensors (i.e., wearable sensors and cameras), the adoption of radar-based HGR in our daily lives still lags behind these competing technologies. Attention must be paid to the development of miniaturized hardware and real-time recognition algorithms.

Author Contributions

S.A. (Shahzad Ahmed) conducted the literature review, gathered the related papers, and wrote the paper. K.D.K. and S.A. (Sarfaraz Ahmed) structured the paper. S.H.C. edited the final paper and supervised the overall project. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Bio & Medical Technology Development Program of the National Research Foundation (NRF), funded by the Korean government (MSIT) (2017M3A9E2064626). The authors would also like to thank all the human volunteers for their time and effort in data acquisition.

Acknowledgments

The authors would like to thank the reviewers for their time and effort.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Laurel, B.; Mountford, S.J. The Art of Human-Computer Interface Design; Addison-Wesley Longman Publishing Co. Inc.: Boston, MA, USA, 1990. [Google Scholar]
  2. Yeo, H.-S.; Lee, B.-G.; Lim, H. Hand tracking and gesture recognition system for human-computer interaction using low-cost hardware. Multimed. Tools Appl. 2015, 74, 2687–2715. [Google Scholar] [CrossRef]
  3. Pisa, S.; Chicarella, S.; Pittella, E.; Piuzzi, E.; Testa, O.; Cicchetti, R. A Double-Sideband Continuous-Wave Radar Sensor for Carotid Wall Movement Detection. IEEE Sens. J. 2018, 18, 8162–8171. [Google Scholar] [CrossRef]
  4. Nanzer, J.A. A review of microwave wireless techniques for human presence detection and classification. IEEE Trans. Microw. Theory Tech. 2017, 65, 1780–1794. [Google Scholar] [CrossRef]
  5. Kang, S.; Lee, Y.; Lim, Y.-H.; Park, H.-K.; Cho, S.H. Validation of noncontact cardiorespiratory monitoring using impulse-radio ultra-wideband radar against nocturnal polysomnography. Sleep Breath. 2019, 24, 1–8. [Google Scholar] [CrossRef] [PubMed]
  6. Putzig, N.E.; Smith, I.B.; Perry, M.R.; Foss, F.J., II; Campbell, B.A.; Phillips, R.J.; Seu, R. Three-dimensional radar imaging of structures and craters in the Martian polar caps. Icarus 2018, 308, 138–147. [Google Scholar] [CrossRef]
  7. Choi, J.W.; Quan, X.; Cho, S.H. Bi-directional passing people counting system based on IR-UWB radar sensors. IEEE Internet Things J. 2017, 5, 512–522. [Google Scholar] [CrossRef]
  8. Santra, A.; Ulaganathan, R.V.; Finke, T. Short-range millimetric-wave radar system for occupancy sensing application. IEEE Sens. Lett. 2018, 2, 1–4. [Google Scholar] [CrossRef]
  9. Leem, S.K.; Khan, F.; Cho, S.H. Detecting Mid-air Gestures for Digit Writing with Radio Sensors and a CNN. IEEE Trans. Instrum. Meas. 2019, 69, 1066–1081. [Google Scholar] [CrossRef]
  10. Li, G.; Zhang, S.; Fioranelli, F.; Griffiths, H. Effect of sparsity-aware time–frequency analysis on dynamic hand gesture classification with radar micro-Doppler signatures. IET RadarSonar Navig. 2018, 12, 815–820. [Google Scholar] [CrossRef] [Green Version]
  11. Ahmed, S.; Khan, F.; Ghaffar, A.; Hussain, F.; Cho, S.H. Finger-counting-based gesture recognition within cars using impulse radar with convolutional neural network. Sensors 2019, 19, 1429. [Google Scholar] [CrossRef] [Green Version]
  12. Ahmed, S.; Cho, S.H. Hand Gesture Recognition Using an IR-UWB Radar with an Inception Module-Based Classifier. Sensors 2020, 20, 564. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Ghaffar, A.; Khan, F.; Cho, S.H. Hand pointing gestures based digital menu board implementation using IR-UWB transceivers. IEEE Access 2019, 7, 58148–58157. [Google Scholar] [CrossRef]
  14. Skaria, S.; Al-Hourani, A.; Lech, M.; Evans, R.J. Hand-gesture recognition using two-antenna Doppler radar with deep convolutional neural networks. IEEE Sens. J. 2019, 19, 3041–3048. [Google Scholar] [CrossRef]
  15. Zhang, Z.; Tian, Z.; Zhou, M. Latern: Dynamic continuous hand gesture recognition using FMCW radar sensor. IEEE Sens. J. 2018, 18, 3278–3289. [Google Scholar] [CrossRef]
  16. Choi, J.-W.; Ryu, S.-J.; Kim, J.-H. Short-range radar based real-time hand gesture recognition using LSTM encoder. IEEE Access 2019, 7, 33610–33618. [Google Scholar] [CrossRef]
  17. Lien, J.; Gillian, N.; Karagozler, M.E.; Amihood, P.; Schwesig, C.; Olson, E.; Raja, H.; Poupyrev, I. Soli: Ubiquitous gesture sensing with millimeter wave radar. ACM Trans. Graph. 2016, 35, 1–19. [Google Scholar] [CrossRef] [Green Version]
  18. Liu, C.; Li, Y.; Ao, D.; Tian, H. Spectrum-Based Hand Gesture Recognition Using Millimeter-Wave Radar Parameter Measurements. IEEE Access 2019, 7, 79147–79158. [Google Scholar] [CrossRef]
  19. Wang, Y.; Wang, S.; Zhou, M.; Jiang, Q.; Tian, Z. TS-I3D based hand gesture recognition method with radar sensor. IEEE Access 2019, 7, 22902–22913. [Google Scholar] [CrossRef]
  20. Khan, F.; Leem, S.K.; Cho, S.H. In-Air Continuous Writing Using UWB Impulse Radar Sensors. IEEE Access 2020, 8, 99302–99311. [Google Scholar] [CrossRef]
  21. Miller, E.; Li, Z.; Mentis, H.; Park, A.; Zhu, T.; Banerjee, N. RadSense: Enabling one hand and no hands interaction for sterile manipulation of medical images using Doppler radar. Smart Health 2020, 15, 100089. [Google Scholar] [CrossRef]
  22. Thi Phuoc Van, N.; Tang, L.; Demir, V.; Hasan, S.F.; Duc Minh, N.; Mukhopadhyay, S. Microwave Radar Sensing Systems for Search and Rescue Purposes. Sensors 2019, 19, 2879. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  23. Mitra, S.; Acharya, T. Gesture recognition: A survey. IEEE Trans. Syst. ManCybern. Part C 2007, 37, 311–324. [Google Scholar] [CrossRef]
  24. Hogan, K. Can’t Get through: Eight Barriers to Communication; Pelican Publishing: New Orleans, LA, USA, 2003. [Google Scholar]
  25. Rautaray, S.S.; Agrawal, A. Vision based hand gesture recognition for human computer interaction: A survey. Artif. Intell. Rev. 2015, 43, 1–54. [Google Scholar] [CrossRef]
  26. Wachs, J.P.; Stern, H.I.; Edan, Y.; Gillam, M.; Handler, J.; Feied, C.; Smith, M. A gesture-based tool for sterile browsing of radiology images. J. Am. Med Inform. Assoc. 2008, 15, 321–323. [Google Scholar] [CrossRef]
  27. Joseph, J.; Divya, D. Hand Gesture Interface for Smart Operation Theatre Lighting. Int. J. Eng. Technol. 2018, 7, 20–23. [Google Scholar] [CrossRef]
  28. Hotson, G.; McMullen, D.P.; Fifer, M.S.; Johannes, M.S.; Katyal, K.D.; Para, M.P.; Armiger, R.; Anderson, W.S.; Thakor, N.V.; Wester, B.A. Individual finger control of a modular prosthetic limb using high-density electrocorticography in a human subject. J. Neural Eng. 2016, 13, 026017. [Google Scholar] [CrossRef] [Green Version]
  29. Nakanishi, Y.; Yanagisawa, T.; Shin, D.; Chen, C.; Kambara, H.; Yoshimura, N.; Fukuma, R.; Kishima, H.; Hirata, M.; Koike, Y. Decoding fingertip trajectory from electrocorticographic signals in humans. Neurosci. Res. 2014, 85, 20–27. [Google Scholar] [CrossRef]
  30. Zheng, C.; Hu, T.; Qiao, S.; Sun, Y.; Huangfu, J.; Ran, L. Doppler bio-signal detection based time-domain hand gesture recognition. In Proceedings of the 2013 IEEE MTT-S International Microwave Workshop Series on RF and Wireless Technologies for Biomedical and Healthcare Applications (IMWS-BIO), Singapore, 9–13 December 2013. [Google Scholar]
  31. Kim, Y.; Ling, H. Human activity classification based on micro-Doppler signatures using a support vector machine. IEEE Trans. Geosci. Remote Sens. 2009, 47, 1328–1337. [Google Scholar]
  32. Li, C.; Peng, Z.; Huang, T.-Y.; Fan, T.; Wang, F.-K.; Horng, T.-S.; Munoz-Ferreras, J.-M.; Gomez-Garcia, R.; Ran, L.; Lin, J. A review on recent progress of portable short-range noncontact microwave radar systems. IEEE Trans. Microw. Theory Tech. 2017, 65, 1692–1706. [Google Scholar] [CrossRef]
  33. Pramudita, A.A. Time and Frequency Domain Feature Extraction Method of Doppler Radar for Hand Gesture Based Human to Machine Interface. Prog. Electromagn. Res. 2020, 98, 83–96. [Google Scholar] [CrossRef] [Green Version]
  34. Rao, S.; Ahmad, A.; Roh, J.C.; Bharadwaj, S. 77GHz single chip radar sensor enables automotive body and chassis applications. Tex. Instrum. 2017. Available online: Http://Www.Ti.Com/Lit/Wp/Spry315/Spry315.Pdf (accessed on 24 August 2020).
  35. Islam, M.T.; Nirjon, S. Wi-Fringe: Leveraging Text Semantics in WiFi CSI-Based Device-Free Named Gesture Recognition. arXiv 2019, arXiv:1908.06803. [Google Scholar]
  36. Pu, Q.; Gupta, S.; Gollakota, S.; Patel, S. Whole-home gesture recognition using wireless signals. In Proceedings of the 19th Annual International Conference on Mobile Computing & Networking, Miami, FL, USA, 30 September–4 October 2013. [Google Scholar]
  37. Arbabian, A.; Callender, S.; Kang, S.; Rangwala, M.; Niknejad, A.M. A 94 GHz mm-wave-to-baseband pulsed-radar transceiver with applications in imaging and gesture recognition. IEEE J. Solid-State Circuits 2013, 48, 1055–1071. [Google Scholar] [CrossRef]
  38. Fhager, L.O.; Heunisch, S.; Dahlberg, H.; Evertsson, A.; Wernersson, L.-E. Pulsed Millimeter Wave Radar for Hand Gesture Sensing and Classification. IEEE Sens. Lett. 2019, 3, 1–4. [Google Scholar] [CrossRef]
  39. Heunisch, S.; Fhager, L.O.; Wernersson, L.-E. Millimeter-wave pulse radar scattering measurements on the human hand. IEEE Antennas Wirel. Propag. Lett. 2019, 18, 1377–1380. [Google Scholar] [CrossRef]
40. Fan, T.; Ma, C.; Gu, Z.; Lv, Q.; Chen, J.; Ye, D.; Huangfu, J.; Sun, Y.; Li, C.; Ran, L. Wireless hand gesture recognition based on continuous-wave Doppler radar sensors. IEEE Trans. Microw. Theory Tech. 2016, 64, 4012–4020.
41. Huang, S.-T.; Tseng, C.-H. Hand-gesture sensing Doppler radar with metamaterial-based leaky-wave antennas. In Proceedings of the 2017 IEEE MTT-S International Conference on Microwaves for Intelligent Mobility (ICMIM), Nagoya, Japan, 19–21 March 2017.
42. Ryu, S.-J.; Suh, J.-S.; Baek, S.-H.; Hong, S.; Kim, J.-H. Feature-based hand gesture recognition using an FMCW radar and its temporal feature analysis. IEEE Sens. J. 2018, 18, 7593–7602.
43. Dekker, B.; Jacobs, S.; Kossen, A.; Kruithof, M.; Huizing, A.; Geurts, M. Gesture recognition with a low power FMCW radar and a deep convolutional neural network. In Proceedings of the 2017 European Radar Conference (EURAD), Nuremberg, Germany, 10–13 October 2017.
44. Hazra, S.; Santra, A. Robust gesture recognition using millimetric-wave radar system. IEEE Sens. Lett. 2018, 2, 1–4.
45. Molchanov, P.; Gupta, S.; Kim, K.; Pulli, K. Short-range FMCW monopulse radar for hand-gesture sensing. In Proceedings of the 2015 IEEE Radar Conference, Johannesburg, South Africa, 27–30 October 2015.
46. Peng, Z.; Li, C.; Muñoz-Ferreras, J.-M.; Gómez-García, R. An FMCW radar sensor for human gesture recognition in the presence of multiple targets. In Proceedings of the 2017 First IEEE MTT-S International Microwave Bio Conference (IMBIOC), Gothenburg, Sweden, 15–17 May 2017.
47. Khan, F.; Leem, S.K.; Cho, S.H. Hand-based gesture recognition for vehicular applications using IR-UWB radar. Sensors 2017, 17, 833.
48. Khan, F.; Cho, S.H. Hand based Gesture Recognition inside a car through IR-UWB Radar. Korean Soc. Electron. Eng. 2017, 154–157. Available online: https://repository.hanyang.ac.kr/handle/20.500.11754/106113 (accessed on 24 August 2020).
49. Ren, N.; Quan, X.; Cho, S.H. Algorithm for gesture recognition using an IR-UWB radar sensor. J. Comput. Commun. 2016, 4, 95–100.
50. Park, J.; Cho, S.H. IR-UWB radar sensor for human gesture recognition by using machine learning. In Proceedings of the 2016 IEEE 18th International Conference on High Performance Computing and Communications; IEEE 14th International Conference on Smart City; IEEE 2nd International Conference on Data Science and Systems (HPCC/SmartCity/DSS), Sydney, Australia, 12–14 December 2016.
51. Kim, S.Y.; Han, H.G.; Kim, J.W.; Lee, S.; Kim, T.W. A hand gesture recognition sensor using reflected impulses. IEEE Sens. J. 2017, 17, 2975–2976.
52. Khan, F.; Leem, S.K.; Cho, S.H. Algorithm for Fingers Counting Gestures Using IR-UWB Radar Sensor. Available online: https://www.researchgate.net/publication/323726266_Algorithm_for_fingers_counting_gestures_using_IR-UWB_radar_sensor (accessed on 28 August 2020).
53. Leem, S.K.; Khan, F.; Cho, S.H. Remote Authentication Using an Ultra-Wideband Radio Frequency Transceiver. In Proceedings of the 2020 IEEE 17th Annual Consumer Communications & Networking Conference (CCNC), Las Vegas, NV, USA, 10–13 January 2020.
54. Kim, Y.; Toomajian, B. Hand gesture recognition using micro-Doppler signatures with convolutional neural network. IEEE Access 2016, 4, 7125–7130.
55. Sang, Y.; Shi, L.; Liu, Y. Micro hand gesture recognition system using ultrasonic active sensing. IEEE Access 2018, 6, 49339–49347.
56. Kim, Y.; Toomajian, B. Application of Doppler radar for the recognition of hand gestures using optimized deep convolutional neural networks. In Proceedings of the 2017 11th European Conference on Antennas and Propagation (EUCAP), Paris, France, 19–24 March 2017.
57. Yu, M.; Kim, N.; Jung, Y.; Lee, S. A Frame Detection Method for Real-Time Hand Gesture Recognition Systems Using CW-Radar. Sensors 2020, 20, 2321.
58. Amin, M.G.; Zeng, Z.; Shan, T. Hand gesture recognition based on radar micro-Doppler signature envelopes. In Proceedings of the 2019 IEEE Radar Conference (RadarConf), Boston, MA, USA, 22–26 April 2019.
59. Li, G.; Zhang, R.; Ritchie, M.; Griffiths, H. Sparsity-based dynamic hand gesture recognition using micro-Doppler signatures. In Proceedings of the 2017 IEEE Radar Conference (RadarConf), Seattle, WA, USA, 8–12 May 2017.
60. Li, G.; Zhang, R.; Ritchie, M.; Griffiths, H. Sparsity-driven micro-Doppler feature extraction for dynamic hand gesture recognition. IEEE Trans. Aerosp. Electron. Syst. 2017, 54, 655–665.
61. Zhang, S.; Li, G.; Ritchie, M.; Fioranelli, F.; Griffiths, H. Dynamic hand gesture classification based on radar micro-Doppler signatures. In Proceedings of the 2016 CIE International Conference on Radar (RADAR), Guangzhou, China, 10–13 October 2016.
62. Gao, X.; Xu, J.; Rahman, A.; Yavari, E.; Lee, A.; Lubecke, V.; Boric-Lubecke, O. Barcode based hand gesture classification using AC coupled quadrature Doppler radar. In Proceedings of the 2016 IEEE MTT-S International Microwave Symposium (IMS), San Francisco, CA, USA, 22–27 May 2016.
63. Wan, Q.; Li, Y.; Li, C.; Pal, R. Gesture recognition for smart home applications using portable radar sensors. In Proceedings of the 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Chicago, IL, USA, 26–30 August 2014.
64. Sakamoto, T.; Gao, X.; Yavari, E.; Rahman, A.; Boric-Lubecke, O.; Lubecke, V.M. Radar-based hand gesture recognition using IQ echo plot and convolutional neural network. In Proceedings of the 2017 IEEE Conference on Antenna Measurements & Applications (CAMA), Tsukuba, Japan, 4–6 December 2017.
65. Sakamoto, T.; Gao, X.; Yavari, E.; Rahman, A.; Boric-Lubecke, O.; Lubecke, V.M. Hand gesture recognition using a radar echo I–Q plot and a convolutional neural network. IEEE Sens. Lett. 2018, 2, 1–4.
66. Wang, Z.; Li, G.; Yang, L. Dynamic Hand Gesture Recognition Based on Micro-Doppler Radar Signatures Using Hidden Gauss–Markov Models. IEEE Geosci. Remote Sens. Lett. 2020, 18, 291–295.
67. Klinefelter, E.; Nanzer, J.A. Interferometric radar for spatially-persistent gesture recognition in human-computer interaction. In Proceedings of the 2019 IEEE Radar Conference (RadarConf), Boston, MA, USA, 22–26 April 2019.
68. Li, K.; Jin, Y.; Akram, M.W.; Han, R.; Chen, J. Facial expression recognition with convolutional neural networks via a new face cropping and rotation strategy. Vis. Comput. 2020, 36, 391–404.
69. Zhang, Z.; Tian, Z.; Zhou, M.; Liu, Y. Application of FMCW radar for dynamic continuous hand gesture recognition. In Proceedings of the 11th EAI International Conference on Mobile Multimedia Communications, Qingdao, China, 21–22 June 2018.
70. Suh, J.S.; Ryu, S.; Han, B.; Choi, J.; Kim, J.-H.; Hong, S. 24 GHz FMCW radar system for real-time hand gesture recognition using LSTM. In Proceedings of the 2018 Asia-Pacific Microwave Conference (APMC), Kyoto, Japan, 6–9 November 2018.
71. Wang, S.; Song, J.; Lien, J.; Poupyrev, I.; Hilliges, O. Interacting with Soli: Exploring fine-grained dynamic gesture recognition in the radio-frequency spectrum. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016.
72. Malysa, G.; Wang, D.; Netsch, L.; Ali, M. Hidden Markov model-based gesture recognition with FMCW radar. In Proceedings of the 2016 IEEE Global Conference on Signal and Information Processing (GlobalSIP), Washington, DC, USA, 7–9 December 2016.
73. Molchanov, P.; Gupta, S.; Kim, K.; Pulli, K. Multi-sensor system for driver's hand-gesture recognition. In Proceedings of the 2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), Ljubljana, Slovenia, 4–8 May 2015.
74. Sun, Y.; Fei, T.; Schliep, F.; Pohl, N. Gesture classification with handcrafted micro-Doppler features using a FMCW radar. In Proceedings of the 2018 IEEE MTT-S International Conference on Microwaves for Intelligent Mobility (ICMIM), Munich, Germany, 15–17 April 2018.
75. Yeo, H.-S.; Flamich, G.; Schrempf, P.; Harris-Birtill, D.; Quigley, A. RadarCat: Radar categorization for input & interaction. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016.
76. Gupta, S.; Morris, D.; Patel, S.; Tan, D. SoundWave: Using the Doppler effect to sense gestures. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012.
77. Lazaro, A.; Girbau, D.; Villarino, R. Techniques for clutter suppression in the presence of body movements during the detection of respiratory activity through UWB radars. Sensors 2014, 14, 2595–2618.
Figure 1. Usage of different body parts to make a human–computer interface (HCI) [25].
Figure 2. Pipeline of hand gesture recognition (HGR)-based HCI design: (1) the user performs pre-defined hand gestures; (2) the performed gesture is captured with different sensors; (3) the sensor data are digitized and formatted appropriately; and (4) finally, the devices are controlled.
Figure 3. HCI with radar-recorded hand gestures as input: predefined hand gestures are performed by the user within the radar's field of view; the received signal is passed through a clutter-reduction filter and represented in one of the signal-representation schemes discussed; finally, recognition is performed.
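The clutter-reduction step shown in Figure 3 is commonly realized with an exponential running-average (loopback) filter that tracks the static background and subtracts it from each new frame; clutter-suppression techniques for UWB radar are surveyed in [77]. A minimal Python sketch of this idea follows; the forgetting factor alpha and the array shapes are illustrative assumptions rather than values taken from any particular study.

```python
import numpy as np

def remove_clutter(frames: np.ndarray, alpha: float = 0.97) -> np.ndarray:
    """Exponential running-average (loopback) clutter filter.

    frames -- (num_frames, num_range_bins) slow-time radar frames
    alpha  -- forgetting factor; values near 1 adapt slowly and preserve motion
    """
    clutter = np.zeros(frames.shape[1])
    filtered = np.empty_like(frames)
    for k, frame in enumerate(frames):
        clutter = alpha * clutter + (1.0 - alpha) * frame  # update background estimate
        filtered[k] = frame - clutter                      # keep only the moving echoes
    return filtered

# Illustrative usage: 100 frames of 256 range bins each
motion_only = remove_clutter(np.random.randn(100, 256))
```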
Figure 4. Block diagram of 94 GHz pulsed radar designed for hand-gesture sensing [38].
Figure 5. Block diagram of hand gesture sensing SFCW radar [40].
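In a single-frequency CW architecture such as the one in Figure 5, hand motion appears as a rotation of the demodulated I/Q pair: a displacement of Δx shifts the echo phase by 4πΔx/λ. Fan et al. [40] recover motion with an arcsine algorithm; the sketch below uses the closely related arctangent demodulation instead, and the 5.8 GHz carrier and the simulated 5 mm oscillation are illustrative assumptions.

```python
import numpy as np

C = 3e8                  # speed of light (m/s)
FC = 5.8e9               # assumed carrier frequency (Hz)
WAVELENGTH = C / FC

def displacement_from_iq(i_sig: np.ndarray, q_sig: np.ndarray) -> np.ndarray:
    """Arctangent demodulation: unwrapped I/Q phase scaled by lambda/(4*pi)."""
    phase = np.unwrap(np.arctan2(q_sig, i_sig))
    return WAVELENGTH * phase / (4.0 * np.pi)

# Simulate a 1 Hz, 5 mm hand oscillation sampled at 1 kHz
t = np.arange(0, 2, 1e-3)
x_true = 0.005 * np.sin(2 * np.pi * t)
phase_true = 4 * np.pi * x_true / WAVELENGTH
i_sig, q_sig = np.cos(phase_true), np.sin(phase_true)
print(displacement_from_iq(i_sig, q_sig).max())  # ~0.005 m
```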
Figure 6. (a) Block diagram of the FMCW radar hardware; (b) frequency response of the transmitted and received waveforms [15].
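The chirp pair in Figure 6b encodes target range in the beat frequency between the transmitted and received waveforms: R = c·f_b·T/(2B), where B is the swept bandwidth and T the chirp duration. A short worked example, with chirp parameters chosen purely for illustration:

```python
C = 3e8  # speed of light (m/s)

def beat_to_range(f_beat_hz: float, sweep_bw_hz: float, sweep_time_s: float) -> float:
    """Range from an FMCW beat frequency: R = c * f_b * T / (2 * B)."""
    return C * f_beat_hz * sweep_time_s / (2.0 * sweep_bw_hz)

# Assumed 250 MHz bandwidth swept in 1 ms: a 500 Hz beat maps to a hand at 0.3 m
print(beat_to_range(f_beat_hz=500.0, sweep_bw_hz=250e6, sweep_time_s=1e-3))  # -> 0.3
```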
Figure 7. Hand-gesture data representation examples: (a) Time–Amplitude [51]: ASL radar data for the S, V and B gestures; (b) Range–Amplitude [75]: data of different objects touching the radar; (c) Time–Range [38]: range variations of a finger-swiping gesture; (d) Time–Doppler [54]: frequency variation for a left–right swipe; (e) Range–Doppler; (f) Time–Frequency representation [67].
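Time–Doppler representations such as Figure 7d are typically obtained by sliding a short-time Fourier transform (STFT) over the slow-time radar signal, so that gesture kinematics appear as a micro-Doppler signature (cf. Kim et al. [31] in Table 4). A minimal SciPy sketch, using a synthetic complex return in place of real radar data:

```python
import numpy as np
from scipy import signal

fs = 1000                                   # assumed slow-time sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
# Toy stand-in for a swipe: a complex tone whose Doppler frequency drifts
x = np.exp(1j * 2 * np.pi * (50 * t + 40 * t ** 2))

f, frames, Zxx = signal.stft(x, fs=fs, nperseg=128, noverlap=96,
                             return_onesided=False)   # complex input -> two-sided
time_doppler_db = 20 * np.log10(np.abs(Zxx) + 1e-12)  # time-Doppler map in dB
print(time_doppler_db.shape)                          # (freq bins, time frames)
```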
Figure 8. A deep-learning framework for UWB impulse radar: data from two radars are concatenated and fed as input to an inception-module-based modified GoogLeNet architecture, Ahmed et al. [12].
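A minimal PyTorch sketch of the concatenation idea in Figure 8: time–range maps from the two radars are stacked channel-wise, and the GoogLeNet stem is altered to accept the two-channel input. The input size, eight-class output and layer surgery are illustrative assumptions, not the exact configuration of [12].

```python
import torch
import torch.nn as nn
from torchvision import models  # recent torchvision assumed

# One single-channel time-range map per radar (batch of 8, 224 x 224 pixels)
radar_a = torch.randn(8, 1, 224, 224)
radar_b = torch.randn(8, 1, 224, 224)
x = torch.cat([radar_a, radar_b], dim=1)          # -> (8, 2, 224, 224)

net = models.googlenet(weights=None, aux_logits=False,
                       init_weights=True, num_classes=8)
# Replace the stem convolution so the network accepts 2 channels instead of 3
net.conv1.conv = nn.Conv2d(2, 64, kernel_size=7, stride=2, padding=3, bias=False)

logits = net(x)                                   # -> (8, 8) class scores
```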
Figure 9. Two-antenna Doppler radar assembly and DCNN architecture for HGR [14].
Figure 10. Google's Soli radar for gesture recognition: (a) overall block diagram and pipeline of gesture-based HCI development; (b) exemplary demonstration of several gestures and their machine interfaces.
Figure 11. Quantitative analysis of HGR research: (a) number of research articles published per year; (b) usage of different radar technologies for HGR.
Table 1. Comparison of wearable and non-wearable (wireless) sensors for sensing and measuring hand gestures.

| Comparison Criterion | Wearable Systems (Gloves, Wristbands, etc.) | Wireless Systems (Radar and Camera) |
|---|---|---|
| Health-related issues | May cause discomfort, as users are always required to wear gloves or other sensors | Wireless sensors cause no skin-related issues |
| Sensing/operational range | Usually high, if wireless data transfer is supported | Operates in a contained environment; line of sight is usually required between the hand and the sensor |
| Usage convenience | Less convenient (for HCI): users are always required to wear a sensor | Users are not required to wear any sensor |
| Simultaneous recognition of multiple users/hands within a small area | Can sense gestures from different users simultaneously at one location | At one location, recognition capability is often limited to a specific number of users/hands |
| Sensitivity to background conditions (such as noise) | Often less sensitive to ambient conditions | More sensitive than wearable devices |
| Device theft issues | Can be lost or forgotten | No such concerns, since the sensor is usually fabricated inside the device or installed at a fixed location |
Table 2. Use of different radars for HGR and research scope in terms of hardware and software design.

| Research Focus | Pulsed Radar | Single Frequency Continuous Wave (SFCW) Radar | Frequency Modulated Continuous Wave (FMCW) Radar | Radar-like Hardware (SONAR, etc.) |
|---|---|---|---|---|
| Hardware design | [37,38,39] | [40,41] | [17,34,42,43,44,45,46] | N/A |
| Algorithm design | [9,11,12,13,20,47,48,49,50,51,52,53,54,55,56] | [14,21,30,31,33,57,58,59,60,61,62,63,64,65,66,67] | [15,16,18,19,68,69,70,71,72,73,74,75] | [27,35,36,76] |
Table 3. Summary of HGR algorithms for Pulsed Radar.

| Study and Year | Data Representation and Data Dimensions | Algorithmic Details | Frequency | No. of Gestures | Distance Between Hand and Sensor | Participants and Samples per Gesture | Number of Radars |
|---|---|---|---|---|---|---|---|
| Arbabian et al. [37] (2013) | N/A | Hardware only; no algorithm proposed | 94 GHz | N/A | Not mentioned | Tested hand tracking only | 1 |
| Park and Cho [50] (2016) | Time–Range (2D) | SVM | 7.29 GHz | 5 | 0–1 m | 1, 500 | 1 |
| Ren et al. [49] (2016) | Time–Amplitude (1D) | Conditional statements | 6.8 GHz | 6 | 1 m | 1, 50 | 1 |
| Khan and Cho [48] (2017) | Time–Range (2D) | Neural network | 6.8 GHz | 6 | Not specified | 1, 10 s (samples not specified) | 1 |
| Kim and Toomajian [54] (2016) | Time–Doppler (3D RGB) | DCNN | 5.8 GHz | 10 | 0.1 m | 1, 500 | 1 |
| Khan et al. [47] (2017) | Time–Range (2D matrix) | Unsupervised clustering (K-means) | 6.8 GHz | 5 | ~1 m | 3, 50 | 1 |
| Kim et al. [51] (2017) | Time–Amplitude (1D) | 1D CNN | Not mentioned | 6 | 0.15 m | 5, 81 | 1 |
| Kim and Toomajian [56] (2017) | Time–Doppler (3D RGB) | DCNN | 5.8 GHz | 7 | 0.1 m | 1, 25 | 1 |
| Sang et al. [55] (2018) | Range–Doppler image features (2D; greyscale image constructed from data) | HMM | 300 kHz (active sensing) | 7 | Not specified | 9, 50 | 1 |
| Ahmed et al. [11] (2019) | Time–Range (2D; greyscale image constructed from data) | Deep learning | 7.29 GHz | 5 | 0.45 m | 3, 100 | 1 |
| Fhager et al. [38] (2019) | Time–Range envelope (1D) | DCNN | 60 GHz | 3 | 0.10–0.30 m | 2, 180 | 1 |
| Heunisch et al. [39] (2019) | Range–RCS (1D) | Observation of backscattered waves | 60 GHz | 3 | 0.25 m | Not specified, 1000 | 1 |
| Ghaffar et al. [13] (2019) | Time–Range (2D; greyscale image constructed from data) | Multiclass SVM | 7.29 GHz | 9 | Less than 0.5 m | 4, 100 | 4 |
| Leem et al. [9] (2019) | Time–Range (2D; greyscale image constructed from data) | CNN | 7.29 GHz | 10 | 0–1 m | 5, 400 | 3 |
| Ahmed and Cho [12] (2020) | Time–Range (3D RGB data) | GoogLeNet framework | 7.29 GHz | 8 | 3–8 m | 3, 100 | 1 & 2 |
| Leem et al. [53] (2020) | Time–Range (2D; greyscale image constructed from data) | DCNN | 7.29 GHz | Drawing gesture | Not specified | 5, not specified | 4 |
| Khan et al. [20] (2020) | Time–Range (2D; greyscale image constructed from data) | CNN | 7.29 GHz | Digit writing | 0–1 m | 3, 300 | 4 |
Table 4. Summary of HGR algorithms for SFCW (Doppler) radar.

| Study and Year | Data Representation and Data Dimensions | Algorithmic Details | Frequency | No. of Gestures | Distance Between Hand and Sensor | Participants and Samples per Gesture | Number of Radars |
|---|---|---|---|---|---|---|---|
| Kim et al. [31] (2009) | Time–Frequency (3D; STFT of the radar signal) | SVM | 2.4 GHz | 7 (including activities) | 2–8 m | 12, not specified | 1 |
| Zheng et al. [30] (2013) | Time–Range (1D; hand-motion vector) | Differentiate and cross-multiply | N/A | Not applicable | 0–1 m (tracking) | Not applicable (tracked hand) | 2 and 3 |
| Wan et al. [63] (2014) | Time–Amplitude (1D) | kNN (k = 3) | 2.4 GHz | 3 | Up to 2 m | 1, 20 | 1 |
| Fan et al. [40] (2016) | Positioning (2D; motion imaging) | Arcsine algorithm, 2D motion-imaging algorithm | 5.8 GHz | 2 | 0–0.2 m | Did not train an algorithm | 1 (with multiple antennas) |
| Gao et al. [62] (2016) | Time–Amplitude (1D; barcode built from the zero-crossing rate) | Time-domain zero-crossing | 2.4 GHz | 8 | 1.5 m, 0.76 m | Measured for 60 s to generate the barcode | 1 |
| Zhang et al. [61] (2016) | Time–Doppler frequency (2D) | SVM | 9.8 GHz | 4 | 0.3 m | 1, 50 | 1 |
| Huang et al. [41] (2017) | Time–Amplitude (1D) | Range–Doppler map (RDM) | 5.1, 5.8, 6.5 GHz | 2 | 0.2 m | Not applicable (hand tracking) | 1 |
| Li et al. [60] (2018) | Time–Doppler (2D) | NN classifier (with modified Hausdorff distance) | 25 GHz | 4 | 0.3 m | 3, 60 | 1 |
| Sakamoto et al. [64] (2017) | Image of the in-phase and quadrature signal trajectory (2D) | CNN | 2.4 GHz | 6 | 1.2 m | 1, 29 | 1 |
| Sakamoto et al. [65] (2018) | Image of the in-phase and quadrature signal trajectory (2D) | CNN | 2.4 GHz | 6 | 1.2 m | 1, 29 | 1 |
| Amin et al. [58] (2019) | Time–Doppler frequency (3D RGB image) | kNN (k = 1) | 25 GHz | 15 | 0.2 m | 4, 5 | 1 |
| Skaria et al. [14] (2019) | Time–Doppler (2D image) | DCNN | 24 GHz | 14 | 0.1–0.3 m | 1, 250 | 1 |
| Klinefelter and Nanzer [67] (2019) | Time–Frequency (2D; frequency analysis) | Angular velocity of hand motions | 16.9 GHz | 5 | 0.2 m | Not applicable | 1 |
| Miller et al. [21] (2020) | Time–Amplitude (1D) | kNN (k = 10) | 25 GHz | 5 | Less than 0.5 m | 5, continuous data | 1 |
| Yu et al. [57] (2020) | Time–Doppler (3D RGB image) | DCNN | 24 GHz | 6 | 0.3 m | 4, 100 | 1 |
| Wang et al. [66] (2020) | Time–Doppler (2D) | Hidden Gauss–Markov model | 25 GHz | 4 | 0.3 m | 5, 20 | 1 |
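Several of the studies in Table 4 classify handcrafted micro-Doppler features with a k-nearest-neighbour rule (k = 3 in [63], k = 1 in [58], k = 10 in [21]). A minimal scikit-learn sketch on synthetic stand-in features; the feature dimension and class separation are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# 4 gesture classes x 50 samples of 32-dimensional stand-in micro-Doppler features
y = np.repeat(np.arange(4), 50)
X = rng.normal(size=(200, 32)) + 0.5 * y[:, None]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)
clf = KNeighborsClassifier(n_neighbors=1).fit(X_tr, y_tr)  # k = 1, as in [58]
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")
```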
Table 5. Summary of HGR algorithms for FMCW radar.

| Study and Year | Data Representation and Data Dimensions | Algorithmic Details | Frequency and BW | No. of Gestures | Distance Between Hand and Sensor | Participants and Samples per Gesture | Number of Radars |
|---|---|---|---|---|---|---|---|
| Molchanov et al. (NVIDIA) [45] (2015) | Time–Doppler (2D) | Energy estimation | 4 GHz | Not specified | Not specified | Performed hand tracking | 1 (and 1 depth sensor) |
| Molchanov et al. (NVIDIA) [73] (2015) | Time–Doppler (2D) | DCNN | 4 GHz | 10 | 0–0.5 m | 3, 1714 (total samples in dataset) | 1 |
| Lien et al. (Google) [17] (2016) | Range–Doppler, Time–Range and Time–Doppler (1D and 2D representations) | Random forest | 60 GHz | 4 (performed several tracking tasks too) | Limited to 0.3 m [71] | 5, 1000 | 1 |
| Malysa et al. [72] (2016) | Time–Doppler, Time–Velocity (2D) | HMM | 77 GHz | 6 | Not specified | 2, 100 | 1 |
| Wang et al. [71] (2016) | Range–Doppler (3D RGB image) | CNN/RNN | 60 GHz | 11 | 0.3 m | 10, 251 | 1 |
| Yeo et al. [75] (2016) | Range–Amplitude (1D) | Random forest | 60 GHz | Not applicable | 0–0.3 m | Tracked the hand | 1 |
| Dekker et al. [43] (2017) | Time–Doppler velocity (3D RGB image) | CNN | 24 GHz | 3 | Not specified | 3, 1000 samples in total | 1 |
| Peng et al. [46] (2017) | Range–Doppler frequency (3D RGB image) | Did not perform classification | 5.8 GHz | 3 | Not specified | 2, not applicable | 1 |
| Rao et al. (TI) [34] (2017) | Range–Doppler velocity (3D RGB image) | Demonstrated the potential use only | 77 GHz | 1 (car trunk-door opening) | 0.5 m | Not specified | 1 |
| Li and Ritchie [59] (2017) | Time–Doppler frequency (3D RGB image) | Naïve Bayes, kernel estimators, NN, SVM | 25 GHz | 4 | 0.3 m | 3, 20 | 1 |
| Hazra and Santra [44] (2018) | Range–Doppler (3D RGB image) | DCNN | 60 GHz | 5 | Not specified | 10, 150 (5 further individuals later used for testing) | 1 |
| Ryu et al. [42] (2018) | Range–Doppler (2D FFT) | QEA | 25 GHz | 7 | 0–0.5 m | Not specified, 15 | 1 |
| Suh et al. [70] (2018) | Range–Doppler (2D greyscale image) | Machine learning | 24 GHz | 7 | 0.2–0.4 m | 2, 120 (140 additional samples for testing) | 1 |
| Sun et al. [74] (2018) | Time–Doppler frequency (2D; features extracted from an image) | kNN | 77 GHz | 7, performed by car driver | Not specified | 6, 50 | 1 |
| Zhang et al. [15] (2018) | Time–Range (3D RGB image) | Deep learning (DCNN) | 24 GHz | 8 | 1.5, 2, and 3 m | 4, 100 | 1 |
| Zhang et al. [69] (2018) | Time–Range (3D RGB image) | Deep learning (DCNN) | 24 GHz | 8 | Not specified | Authors mentioned 80 s of data | 1 |
| Choi et al. [16] (2019) | Range–Doppler; 1D motion profiles generated from a 3D range–Doppler map | LSTM encoder | 77–81 GHz | 10 | Not specified | 10, 20 | 1 |
| Liu et al. [18] (2019) | Time–Range and Time–Velocity (2D) | Signal-processing-based technique | 77–81 GHz | 6 | Not specified | Mentioned 869 total samples only | 1 |
| Wang et al. [19] (2019) | Range–Doppler velocity | Deep learning | 77–79 GHz | 10 | Not specified | Not specified, 400 | 1 |
Table 6. Summary of developed applications using radar sensors.

| Study | Implemented Application(s) |
|---|---|
| [9] | In-air digit-writing virtual keyboard using multiple UWB impulse radars |
| [75] | Digital painting |
| [17] | Scroll and dial implementation using finger sliding and rotation |
| [11] | Finger-counting-based HCI to control devices inside a car |
| [73] | Multi-sensor HGR system providing an HCI to assist drivers |
| [48] | HGR inside a car using pulsed radar, intended for vehicular applications |
| [36,63] | Smart-home applications |
Table 7. Summary of radar hardware used in the reviewed HGR studies.

| Radar Hardware | Company | Studies |
|---|---|---|
| Bumblebee Doppler radar | Samraksh Co., Dublin, OH 43017, USA | [54,56] |
| XeThru X2 | Novelda, Gjerdrums vei 8, 0484 Oslo, Norway | [47] |
| NVA6100 | Novelda, Gjerdrums vei 8, 0484 Oslo, Norway | [49] |
| NVA6201 | Novelda, Gjerdrums vei 8, 0484 Oslo, Norway | [51] |
| MA300D1-1 (transducer only) | Murata Manufacturing, Nagaokakyo-shi, Kyoto 617-8555, Japan | [55] |
| X4 | Novelda, Gjerdrums vei 8, 0484 Oslo, Norway | [9,12,13,20,53] |
| MAX2829 (transceiver only) | Maxim Integrated, CA 95134, USA | [40] |
| SDR-kit 2500B | Ancortek, Fairfax, VA 22030, USA | [66] |
| BGT23MTR12 | Infineon Technologies, Neubiberg, Germany | [45] |
| BGT24MTR12 | Infineon Technologies, Neubiberg, Germany | [15] |
| 77 GHz FMCW (TI) | Texas Instruments (TI), Dallas, TX 75243, USA | [34,72] |
| Soli | Google and Infineon Technologies, Neubiberg, Germany | [71,75] |
| BGT60TR24 | Infineon Technologies, Neubiberg, Germany | [44] |
| AWR1642 | Texas Instruments (TI), Dallas, TX 75243, USA | [19] |
| Pulsed coherent radar | Acconeer AB, Lund, Sweden | [38] |