Article

Human–Machine Interaction through Advanced Haptic Sensors: A Piezoelectric Sensory Glove with Edge Machine Learning for Gesture and Object Recognition

by Roberto De Fazio 1,2,*, Vincenzo Mariano Mastronardi 1,3, Matteo Petruzzi 1, Massimo De Vittorio 1,3 and Paolo Visconti 1,3

1 Department of Innovation Engineering, University of Salento, 73100 Lecce, Italy
2 Facultad de Ingeniería, Universidad Panamericana, Aguascalientes 20290, Mexico
3 Center for Biomolecular Nanotechnologies, Italian Technology Institute IIT, 73010 Arnesano, Italy
* Author to whom correspondence should be addressed.
Future Internet 2023, 15(1), 14; https://doi.org/10.3390/fi15010014
Submission received: 24 November 2022 / Revised: 19 December 2022 / Accepted: 22 December 2022 / Published: 27 December 2022
(This article belongs to the Section Big Data and Augmented Intelligence)

Abstract:
Human–machine interaction (HMI) refers to systems enabling communication between machines and humans. Systems for human–machine interfaces have advanced significantly in terms of materials, device design, and production methods. Energy supply units, logic circuits, sensors, and data storage units must be flexible, stretchable, undetectable, biocompatible, and self-healing to act as human–machine interfaces. This paper discusses the technologies for providing haptic feedback of different natures. Notably, the physiological mechanisms behind touch perception are reported, along with a classification of the main haptic interfaces. Afterward, a comprehensive overview of wearable haptic interfaces is presented, comparing them in terms of cost, number of integrated actuators and sensors, main haptic feedback typology, and future applications. Additionally, a review of sensing systems that use haptic feedback technologies—specifically, smart gloves—is given by going through their fundamental technological specifications and key design requirements. Furthermore, useful insights related to the design of next-generation HMI devices are reported. Lastly, a novel smart glove based on thin and conformable AlN (aluminum nitride) piezoelectric sensors is demonstrated. Specifically, the device acquires and processes the signals from the piezoelectric sensors to classify performed gestures through an onboard machine learning (ML) algorithm. Then, the design and testing of the electronic conditioning section for the AlN-based sensors integrated into the smart glove are shown. Finally, the architecture of a wearable visual-tactile recognition system is presented, combining visual data acquired by a micro-camera mounted on the user’s glasses with the haptic data provided by the piezoelectric sensors.

1. Introduction

Humans have always sought improvements in their natural sensing and physical characteristics. These improvements can be associated with cognitive evolution driven by the human instinct to adapt to changes in living environments. Due to the exponential growth in the complexity of machine structures and the consequent need for interaction between the functionalities of machines and human sensations, human–machine interfaces (HMIs) were born [1,2,3]. This has led to a new era of virtual reality (VR) and augmented reality (AR), where humans can interact with machines through specially designed smart interfaces [4,5,6].
Haptics or haptic technology reproduces the feeling of touch during interaction with a real-world or virtual environment [7,8,9]. Teleoperation is the term for remote physical interaction, while virtual environments are accessed through computer software [10,11,12]. Thanks to the fast advancement of technology in recent years, the development of haptic devices has progressed tremendously. However, multidisciplinary competencies are required for designing haptic interfaces, including computer science, software design, electronics, communication, electromechanical design, and ergonomics [13,14,15]. The haptic interface comprises a manipulator acting as a mediator between humans and simulations, along with continuous visualization of the virtual or remote environment. The user moves around in the virtual or remote environment by manipulating the robotic equipment. Computer simulations of different activities may convey lifelike, tactile feelings to a user using haptic feedback, consisting of mechanical feedback in a human–machine interfacing system [16,17]. Haptic feedback enables objects that are normally only visualized to assume real physical characteristics such as mass, hardness, and texture. Haptic interfaces can improve product evaluation in computer-aided design and manufacturing (CAD/CAM) and design prototyping, letting people handle virtual products before producing them. Furthermore, haptic interfaces can be useful for training and education purposes, for instance, in surgical procedures or to simulate activities carried out in dangerous environments, enabling the optimization of and improvement in personal skills [18,19]. Haptic interfaces can also deliver force feedback while teleoperation procedures, such as telesurgery or hazardous waste clearance, are being carried out remotely. The advantages of haptic feedback are obvious, given the variety of uses.
These technologies are discussed in detail in this work; specifically, an analysis of haptic feedback interfaces, one of the principal vectors in HMIs, has been conducted. The reliability of these systems is also discussed for application fields such as healthcare (physical rehabilitation, smart virtual elderly healthcare, etc.), communication devices for disabled or deafblind people, entertainment, and VR and AR applications.
After describing the human tactile sensation and analyzing the overall anatomy of the various mechanoreceptors in the human skin, the principal haptic interfaces are classified according to the main effect that defines their functionality in terms of haptic feedback. Then, a comprehensive account of the state of the art of wearable haptic interfaces is presented, comparing them in terms of cost, number of integrated actuators and sensors, main haptic feedback typology, and future applications. Finally, a survey of sensing systems that implement haptic feedback technologies, particularly smart gloves, is introduced by discussing their general key design requirements and technical features.
A fundamental task that an HMI’s transducer should accomplish is to gather and elaborate data coming from its sensing system through an efficient and well-designed signal conditioning section that improves the transducer’s functionality. Indeed, a novel smart glove is presented based on ultra-thin AlN (aluminum nitride) piezoelectric sensors to detect finger movements and gestures [20,21,22]. Notably, the device allows gestures to be recognized through a machine learning (ML) algorithm executed locally on the microcontroller. Furthermore, the architecture of a wearable hybrid visual-tactile recognition system is introduced, combining pictures acquired by a micro-camera mounted on the user’s glasses with haptic signals gathered by the presented smart glove to identify grasped objects. Afterward, the development and testing of the conditioning section interfaced with the AlN-based piezoelectric sensors are introduced.
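To make the processing chain concrete, the following is a minimal sketch of how gesture classification from the five piezoelectric channels might work on a microcontroller-class device. The windowed features, toy centroid values, and nearest-centroid classifier are illustrative assumptions for this sketch, not the glove's actual firmware or ML model.

```python
# Illustrative sketch of on-device gesture classification from five
# piezoelectric channels. Feature set, window length, and the
# nearest-centroid classifier are assumptions for illustration only.
from math import sqrt

NUM_CHANNELS = 5  # one AlN sensor per finger


def extract_features(window):
    """Per-channel peak amplitude and mean signal energy.
    `window` is a list of NUM_CHANNELS lists of voltage samples."""
    feats = []
    for ch in window:
        peak = max(abs(v) for v in ch)
        energy = sum(v * v for v in ch) / len(ch)
        feats.extend([peak, energy])
    return feats


def classify(feats, centroids):
    """Nearest-centroid rule: return the label whose stored feature
    centroid is closest in Euclidean distance."""
    best, best_d = None, float("inf")
    for label, c in centroids.items():
        d = sqrt(sum((f - x) ** 2 for f, x in zip(feats, c)))
        if d < best_d:
            best, best_d = label, d
    return best


# Toy centroids for two gestures (hypothetical training values):
centroids = {
    "fist": [0.8, 0.3] * NUM_CHANNELS,
    "open_hand": [0.1, 0.01] * NUM_CHANNELS,
}

# A strong, high-energy signal on every finger channel:
window = [[0.7, -0.8, 0.75, -0.6] for _ in range(NUM_CHANNELS)]
print(classify(extract_features(window), centroids))
```

A nearest-centroid rule is shown only because it fits comfortably in microcontroller memory; the paper's actual ML algorithm may differ.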
The main novelties and contributions of the presented scientific work are:
  • Analysis of the main applications of the haptic feedback technologies, providing a general discussion of the physiological mechanism of the haptic feedback, as well as a general classification of modern haptic interfaces.
  • A comprehensive overview of wearable interfaces for haptic feedback, providing comparative analysis and insights about the discussed systems and defining features and architectures of the next generation of haptic devices.
  • A survey of recent sensing systems for haptic interfaces in the form of smart gloves for monitoring hand and finger movements and providing biofeedback to the user.
  • A novel smart glove based on ultrathin AlN sensors for detecting hand motions. Additionally, the architecture of a hybrid visual-tactile recognition system based on the developed glove is introduced. Finally, the design and testing of the electronic conditioning section for handling signals generated by piezoelectric sensors are reported.
The remainder of the paper is arranged as follows: Section 2 reports the problem definition and applications of haptic feedback technologies. Next, Section 3 presents a general classification of modern haptic interface technologies. Section 4 presents an overview of wearable interfaces for haptic feedback, along with a comparative analysis and discussion of the presented systems. Additionally, Section 5 reports a survey of sensing systems for haptic interfaces. Section 6 describes the smart glove supported by a machine learning (ML) algorithm to recognize gestures by analyzing signals provided by five piezoelectric AlN-based sensors. Then, the design and testing of the electronic conditioning section for the AlN flexible sensors are described. Lastly, the architecture of a wearable hybrid recognition system is presented, which combines pictures acquired by a micro-camera and haptic data acquired by the smart glove to recognize the grasped objects.

Selection and Exclusion Criteria for Performing the Survey of Haptic Technologies

The fundamental step for arranging the survey of haptic technologies was a preparatory screening to identify the most suitable scientific articles and review papers according to a set of selection and exclusion criteria. Specifically, the definition of the latter was performed considering different aspects of the analyzed documents, such as suitability to the treated topics, relevance, publication year, and redundancy with respect to other selected papers. Notably, the goal was to provide the reader with the broadest and most detailed view possible regarding the technologies used to provide haptic feedback. The selection process was carried out in accordance with the workflow depicted in Figure 1; a three-step analysis was carried out, beginning with the title, moving on to the abstract, and finally reading the full text.
Overall, the survey of haptic technologies led to the analysis of 138 documents, distributed as reported in Figure 2, where they are classified according to their typology (research articles, review papers, books, and websites).

2. Problem Definition and Application of Haptic Feedback Technologies

This section analyzes and classifies the existing typologies of haptic technologies and their use cases in different environments. “Haptic” derives from the Greek “haptesthai”, meaning related to the sense of touch, and is commonly associated with interfaces that convey or visualize items through touch [23]. The tactile sensation can be described as both unintended and dynamic. Hence, simple haptic devices such as remote controls, steering wheels, and gaming controllers are crucial.
Haptic systems, also called kinesthetic communication or 3D touch, refer to any technical solution that can create an experience of touch by applying forces, vibrations, or motions to the user. In the last few years, several innovative haptic interfaces have been reported in the literature [23].
The demand for haptic feedback technologies has increased due to the quick advancement of VR and AR. Especially in portable or wearable formats, these technologies enhance users’ immersive experiences in a wide range of fields, including social media, entertainment, medical equipment, mechatronics, and more. The primary idea behind haptic feedback technology is to activate mechanoreceptors or afferent nerves beneath the skin with sophisticated actuators to produce a sensation or a feeling. Some progress has been made with multi-modal pseudo-haptic feedback for replicating weight sensations in virtual reality. Since VR has grown in popularity, it is crucial to give virtual items an accurate haptic sensation; however, understanding the psychological impact of such feedback is still a work in progress [24].
Ensuring greater rigidity and reduced inertia is a significant difficulty for modern tactile systems. Specifically, impedance-based systems are constrained by their limited achievable rigidity, whereas admittance-based ones struggle to achieve low inertia. As a result, it is challenging to reproduce both forceful touch and minimal inertia in virtual settings [25].
A typical use case of interfaces for haptic feedback is the development of smart prostheses, which introduce sensing and feeling improvements. Generating haptic feedback for smart prostheses is still quite difficult, and without it, users have trouble accurately gripping items. Recent scientific studies have employed several techniques to create haptic feedback. It has been demonstrated that vibrotactile feedback may be achieved by employing a compact and lightweight vibrator; the frequency restriction and the additional 400 ms delay are this method’s drawbacks. Pressure feedback can typically imitate the pressure of objects grabbed by the smart prosthesis; however, pressure feedback devices unavoidably increase the bulk of the equipment. Therefore, L. Chen et al. developed the so-called Multimodal Fusion Transcutaneous Electrical System to overcome these problems [26]. This system uses various combinations of stimulation feedback with variable stimulus level, frequency, waveform, and stimulation site to produce multimodal haptic feedback. The stimulator has numerous independent channels and can create up to eight waveforms per electrode channel. To increase its dependability, it can connect to any device using Bluetooth 4.0 [27]. Nevertheless, under some circumstances, this could hurt and be uncomfortable for the amputee. Therefore, transcutaneous electrical nerve stimulation is the most practical way to produce multimodal fusion nerve stimulation (MFNS) only when no stinging sensation is present.

2.1. Haptic Feedback’s Physiological Mechanisms

Kinesthetic feedback and tactile feedback are the two main categories of haptic feedback. Kinesthetic feedback relates to how human muscles, joints, and tendons feel, which may provide information regarding weight, strain, gestures, etc. Skin sensation, created through tactile feedback, enables perceiving fine tactile stimuli such as vibration, pressure, and texture. As a result, when we engage with the actual world, these sensations help us discern form, texture, and other surface features. Joint motions with several degrees of freedom enable various actions in our daily and professional lives. Actuators that can deliver both force and torque are necessary for kinesthetic feedback; the majority of kinesthetic feedback systems rely on heavy external mechanical equipment, and integrating actuators able to generate the required high forces and torques into the wearable itself remains a major challenge.

2.2. Haptic Feedback

Under the skin, Merkel disks, Pacinian corpuscles, Meissner corpuscles, Ruffini endings, Krause bulbs, and free nerve endings all contribute to the human sense of touch. Every type of mechanoreceptor has distinct purposes and sensing characteristics (Figure 3). The normal threshold for tactile sensation on human fingers is 10–100 mN, with displacements of 10–100 µm; it varies from person to person and is closely correlated with the actuation frequency. Low-frequency vibrations (below 10 Hz) are mainly sensed by Merkel cells and Ruffini endings, whereas Meissner and Pacinian corpuscles react to higher-frequency vibrations between 10 Hz and 800 Hz, with peak sensitivities around 30 Hz and 250 Hz, respectively. These parameters are crucial for building haptic interfaces that appropriately stimulate the receptors while optimizing the device’s dimensions.
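The frequency bands above can be encoded as a simple lookup, useful when choosing actuation frequencies for a haptic device. The hard band edges in this sketch are a simplification: real receptor sensitivities overlap and vary between individuals.

```python
# Toy lookup mapping a vibration frequency (Hz) to the mechanoreceptors
# most responsive to it, using the bands quoted in the text. The sharp
# band edges are a simplification of overlapping, person-dependent
# receptor sensitivities.
def responsive_receptors(freq_hz):
    receptors = []
    if freq_hz <= 10:  # slowly adapting receptors, low frequencies
        receptors += ["Merkel cells", "Ruffini endings"]
    if 10 <= freq_hz <= 800:  # rapidly adapting receptors
        receptors += ["Meissner corpuscles", "Pacinian corpuscles"]
    return receptors


# 250 Hz sits at the Pacinian corpuscles' most sensitive frequency.
print(responsive_receptors(250))
```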
Another aspect of tactile feedback is temperature. Skin’s thermoreceptors gather various thermal data, enabling accurate temperature sensing. The hand’s skin commonly ranges in temperature from 25 to 36 °C when at rest. Thermoreceptors typically respond to temperature changes between 5 and 45 °C, and a sudden shift in temperature can make people experience discomfort. One of the human body’s most thermally sensitive areas is the skin at the base of the thumb, which can detect temperature differences of 0.02 to 0.07 °C for chilling pulses and 0.03 to 0.09 °C for heating pulses.

3. General Classification of Modern Haptic Interfaces

To highlight the real status and prospects of tactile feedback interfaces, it would be beneficial to summarize the present haptic feedback interfaces, classified into three distinct approaches:
  • Force-based tactile devices;
  • Thermal-based tactile devices;
  • Nerve stimulation tactile devices.
Force-based tactile systems are the most common feedback type used to replicate kinesthetic and tactile experiences. They are constituted by mechanical systems that use actuators to exert stress on the muscles and skin, producing deformations that mimic the interaction of the body with actual items. Point-to-point mechanical actuators cannot replicate kinesthetic input because humans require torque to evaluate fine details such as weight, stretch, and joint and muscle gestures. Five categories of force-based haptic interfaces may be defined: elastomer-based bubbles, piezoelectric tactile interfaces, hydraulic tactile interfaces, pneumatic tactile interfaces, and electromagnetic tactile interfaces. The first typology relies on the movement of bubbles to give the impression of an impact or of the item’s shape. By tuning the microchannel system’s transmission intensity for multi-variate feedback to the joints, they can also provide kinesthetic feedback. Piezoelectric and electromagnetic devices, which can react quickly and be combined into miniature systems, are frequently utilized to simulate tactile sensations, particularly vibration.

3.1. Hydraulic Haptic Interfaces

The creation of wearable hydraulic haptic feedback interfaces is limited by traditional hydraulic equipment’s frequent use of massive pipelines with external fluid conveyance. For soft robotics and wearable technology, on the other hand, novel flexible and stretchy pumping technologies have recently been developed. These include electrostatic pumps, stimuli-sensitive gels, and thermally responsive materials. In this case, liquid dielectrics could be a functional implementation rather than an elastomeric polymer [28]. To understand why they are a valuable solution, consider the most common use case of liquid dielectrics: dielectric elastomers [29]. A dielectric elastomer membrane stretches by over 100% when a voltage is applied across its thickness, causing it to thin out and expand. Dielectric elastomers have thus been created as transducers for a wide range of applications, such as soft robotics and adaptive optics. Additionally, liquid dielectrics more effectively solve the problem of catastrophic failure in dielectric elastomer actuators, given their self-healing nature, which is not present in solid dielectrics.
On the other hand, as reported in [30], the polydimethylsiloxane (PDMS) stretchable polymer was chosen to realize pumps and channels due to its chemical stability and easy processing, allowing excellent stretchability and a long lifetime. Figure 4 describes various typical structures of hydraulic activating haptic feedback interfaces (Figure 4b,c).
In conclusion, it should be noted that hydraulic haptic interfaces may efficiently produce straining forces, out-of-plane deformations, pressures, and vibrations, as a function of the volume and the liquid’s particular characteristics. The hydraulic haptic interface often applies more force than other technologies because of the high density of the fluid employed. On the other hand, hydraulic haptic devices have difficulty reaching actuation frequencies of hundreds of Hz due to the low liquid flow speed, and the deformation range decreases with frequency, approaching zero at high frequencies. A voltage of several hundred volts, or even more than 1 kV, is required to activate electrorheological fluids and dielectric liquids. Integrating a high-voltage amplifier on the body raises safety issues, restricting future wearable applications of hydraulic systems.
Finally, realistic kinesthetic feedback can be provided using electrorheological (ER) fluids controlled via an electric field [32]. In particular, by applying a time-varying electric field, different haptic feedback can be provided, enabling issues of magnetorheological (MR) fluids to be overcome; in fact, the electrical circuitry for regulating ER fluid is simpler than that for MR fluid.

3.2. Pneumatic Haptic Interfaces

Using pneumatic actuators, similar to those used in hydraulic feedback-based systems, is another method for generating haptic feedback. Indeed, a pneumatic interface is composed of a pipe or cavity system containing gas pressured by a compressor to cause shape deformation. The analysis of compressed gas at rest and/or during motion and its application to the device’s design are the core principles of pneumatics, as described in [33]. In a pneumatic system, working energy (kinesthetic energy and pressure) is held in the potential status by air pressure, allowing air expansion. The expanding air is the fundamental element of the tactile and kinesthetic interfaces. Pneumatic actuation has medical uses in addition to its traditional uses in industrial automation and robotics. To overcome the drawbacks of commercial haptic devices, pneumatics researchers use the benefits listed below:
  • Cheap and high responsivity;
  • Compact and lightweight systems;
  • No constraints on output size or design;
  • No recirculation lines, unlike a hydraulic system;
  • Simple pressure and speed adjustment;
  • Appropriate for a spotless workplace;
  • High power-to-weight ratio;
  • A safe usage.
These advantages have resulted in the widespread adoption of pneumatic methods in the haptics field. However, because of specific pneumatic actuator restrictions, the development of a pneumatic tactile system involves a methodical design process. The design of a pneumatic haptic glove that uses the tube’s change in air pressure to control hand gestures is shown in the image below. Because of their resemblance to human skin and durable working properties, polymer-based pneumatic actuators are commonly utilized in wearable tactile interfaces. To provide lifelike tactile feedback, a haptic device’s directional indications and diverse mechanical sensations are crucial.

3.3. Piezoelectric Haptic Interfaces

Piezoelectric actuators transform electrical energy into mechanical deformations or stress using the inverse piezoelectric effect, generating forces on the skin as tactile input. Moreover, piezoelectric-based devices’ mechanism and mode of operation enable the production of vibrational tactile feedback with a fast response time (Figure 5a).
Numerous piezoelectric materials, including lead zirconate titanate (PZT), potassium niobate (KNbO3), barium titanate (BTO), polyvinylidene fluoride (PVDF), sodium tungstate, etc., can be used in haptic interfaces. Due to its excellent piezoelectric performance, PZT is the most common piezoelectric material for implementing sensors and actuators (Figure 5b). Furthermore, lead-free piezoelectric materials such as KNbO3 have been created for use in biocompatible devices. In short, piezoelectric haptic actuators exploit their compact size and quick reaction time as a typical vibrotactile haptic technology. However, even at the resonance frequency, it is hard for the displacement of piezoelectric actuators to reach hundreds of micrometers. Multilayer piezoelectric transducers, or those operating at the resonance frequency, can amplify the actuator’s deformation to provide a stronger feeling; however, the former are very difficult to fabricate, restricting their use in permanent haptic feedback systems.
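In the simplest one-dimensional case, the inverse piezoelectric effect underlying these actuators relates the induced strain $S$ to the applied electric field $E$ through the piezoelectric charge coefficient $d$:

```latex
S = d\,E \quad \Rightarrow \quad \Delta t = S\,t = d\,V
```

where $\Delta t$ is the thickness change of a plate of thickness $t$ driven by a voltage $V$. Since $d$ is on the order of hundreds of pm/V even for PZT, a single layer driven at 100 V deflects only tens of nanometers, which is why multilayer stacks or resonant operation are needed to reach perceivable displacements.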

3.4. Electromagnetic Haptic Interfaces

Another alternative for providing pressure and vibration as haptic feedback is electromagnetic actuators. A permanent magnet reacts rapidly and provides significant deformation within an electromagnetic field. A coil, a supporting body, a cavity shape, a permanent magnet, and a handling layer make up a standard electromagnetic actuator. When current flows through the coil, the generated magnetic field exerts a force, known as the Lorentz force, that attracts or repels the magnet suspended by the handling layer. The number of coil turns, coil size, coil–magnet distance, magnet size, and magnetization affect the actuator’s force. Individually addressable miniature electromagnetic actuators must be created and designed to build skin-integrated electromagnetic haptic interfaces. After being packaged, the actuators and electronic sections may be combined within flexible circuits using microfabrication techniques, resulting in a controllable haptic interface. Advanced structural and mechanical design may be used to improve the performance of LR electromagnetic actuators. Additionally, the benefits of electromagnetic actuators include higher force-displacement output and stronger tactile feedback than piezoelectric ones, as well as a wider vibration frequency range with quick response compared to pneumatic or hydraulic ones. Finally, the electromagnetic tactile interface has further benefits over pneumatic and piezoelectric ones, including non-contact sensing and control features that make it a viable option for touchless VR interfaces. However, their working bandwidth is too narrow for ultrasonic applications compared to piezoelectric actuators, and they feature larger dimensions.
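For an idealized coil of $N$ turns, each of length $L$, carrying a current $I$ in a uniform flux density $B$, the actuating force scales as:

```latex
F = N\,B\,I\,L
```

This idealized expression makes explicit why the coil turns, drive current, and magnet strength (through $B$) listed above set the achievable force of an electromagnetic actuator.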

3.5. Thermal-Based Haptic Interfaces

It is crucial to accurately reproduce thermal perception in VR and AR applications, allowing users to simulate touching a virtual item with different thermal characteristics. The performance of virtual reality is significantly improved by thermal stimulation or sensing-based haptic interfaces that make it easier to identify and characterize the associated objects. The best options for achieving these perception goals are thermal-based interfaces that modify the skin’s perceived temperature depending on the thermal characteristics of virtual objects [35]. To create a device integrating thermo-haptic materials that accurately replicates real tactile perception, many prerequisites must be met. First, the thermo-haptic interface should generate a broad range of temperatures around the typical human body temperature. The thermo-haptic interface must also manage the temperature correctly and quickly. A variety of physical haptic interfaces, including thermoelectric (TE), microfluidic, and other temperature-controlling haptic interfaces, as well as Joule heating haptic interfaces (Figure 6a), can be used to regulate the temperature of human skin [36].
Joule heater-based haptic interfaces—also known as thermal heaters—increase the temperature through a resistive path in which an electric current flows, simulating the thermal feeling with a fast reaction time. The Joule heating device’s thermal actuators are essential to provide heat feedback. Thin metal films and functional carbon textiles are two examples of flexible and elastic materials that generate thermal heating feedback for wearable electronics. These materials make it easier to construct Joule-heating actuators for wearable haptic interfaces. Thermal heaters may generate heat and sustain high temperatures, but it might be difficult to build a device with a steady performance that can dissipate heat effectively enough to prevent overheating.
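The heat delivered by such a resistive heater follows Joule's law, so the thermal output is set directly by the drive current (or voltage) and the heater resistance $R$:

```latex
P = I^2 R = \frac{V^2}{R}
```

This quadratic dependence on the drive signal is what gives Joule heaters their fast reaction time, but it also explains the overheating risk noted above if the current is not carefully limited.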
Joule heaters cannot provide a cooling feeling due to a lack of cooling microstructures. Thus, for VR/AR applications, a cooling device must be used in conjunction with a Joule heater.

3.5.1. Thermoelectric Haptic Interfaces

Due to their ability to deliver both heating and cooling, thermoelectric-based haptic interfaces are essential for thermal feedback [37]. Through the thermoelectric effect, sometimes referred to as the Peltier effect, they can provide a cooling feeling to human skin. Thermoelectric devices are made up of thermoelectric pairs (constituted by p- and n-type semiconductors) connected in series and placed between two electrode layers (Figure 6b). When current flows, electric charges in the p-type and n-type materials diffuse toward the junction, transferring heat from the heat sink to the junction and raising the temperature on the device’s top surface; when the electric current flows in the opposite direction, the top surface’s temperature falls. The figure of merit (ZT) quantifies the performance of thermoelectric materials, depending on the Z coefficient, a function of the thermoelectric properties of the materials used, and T, the absolute temperature.
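For reference, the figure of merit mentioned above is commonly written in terms of the Seebeck coefficient $S$, the electrical conductivity $\sigma$, and the thermal conductivity $\kappa$ of the material:

```latex
ZT = \frac{S^2 \sigma}{\kappa}\,T
```

A good thermoelectric material therefore combines a large Seebeck coefficient and electrical conductivity with low thermal conductivity at the operating temperature $T$.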
Thermoelectric haptic interfaces may convey information through various temperature distributions in thermal pixels, in addition to controlling body temperature. For instance, S. Kim et al. described employing a flexible thermoelectric device to create a 2D arrayed thermal haptic interface [35]. To transfer thermal information, the flexible thermoelectric device includes a 2D thermal distribution array; specifically, the device enables the modulation of each thermoelectric unit cell included in the array. The skin’s thermoreceptors transmit the real-time thermal pattern to users, who can then use it as a blind-assistance tool to determine where obstructions are located. In addition, thermoelectric haptic interfaces must be stretchable to be integrated onto human skin and accommodate everyday motions. A high-level combination of pixelized thermal outputs is necessary to provide high-resolution reproduction of thermal feeling in VR/AR, separately regulating groups of thermoelectric cells and thus enabling a quick temperature response. However, these criteria are difficult to meet due to the unpredictable heat flow and inefficient heat dissipation. Several devices utilize metallic finned structures to increase heating and air-cooling efficiency for heat dissipation (Figure 6b) [38]. Typical thermoelectric systems can only be used in combination with rigid heat-conductive support layers, limiting their application in flexible interfaces. Thus, flexible heat-conductive layers are essential for wearable thermal-based tactile devices.

3.5.2. Microfluidic and Other Thermal-Based Haptic Interfaces

As depicted in Figure 6c, fluidic heat transfer technologies are also used to create thermal haptic interfaces by controlling the temperature of water or air inside channels in contact with the human skin. Air is also employed in VR and AR applications to simulate cold and warm feelings as an alternative to aqueous liquids. To simulate thermal feeling, S. Cai et al. developed a pneumatic glove that inflates airbags containing hot and cold air [36]. Since fluidic action relies on pneumatic forces, it ensures a high reaction rate (i.e., 27.5 °C/s). The authors employed the developed interface in VR/AR applications; the experimental data indicated that the device successfully reproduced thermal feelings.
The thermoelectric effect and microfluidic heat transfer technologies are the main methods for generating tactile and thermal sensations. Devices reproducing the cooling effect have also been reported in [39]; these rely on electrocaloric (EC), magnetocaloric (MC), and mechanocaloric (MEC) phenomena. Since an external electric field can directly control its performance, the electrocaloric effect, which refers to the change in adiabatic temperature or isothermal entropy produced by changing the polarization state of polar materials, offers significant promise for wearable cooling interfaces. Using EC materials in a thermodynamic cycle (based on the Carnot cycle), skin surfaces may be warmed and cooled. When an electric field is applied, the material’s temperature increases (an adiabatic temperature change), and the extra heat is dissipated through a heat sink. The temperature of the EC material falls when the electric field is removed; without the field, the material may absorb heat from the load (an isothermal entropy change). Repeating this cycle lowers the load’s temperature. For wearable thermal VR/AR systems, thin and flexible electrocaloric materials are preferred [40,41,42].

3.6. Nerve Stimulation-Based Haptic Interfaces

Electrical nerve stimulation, sometimes known as “electrotactile feedback,” is one of the most promising techniques for delivering efficient haptic feedback, along with mechanical forces and heat transfer. The direct application of conductive electrodes to the skin causes an electric current to flow through that body area, particularly the hands or fingertips, stimulating the local nerves under the skin to create a haptic sensation. Since it requires no energy conversion and is simple, the term “electrotactile” is frequently used in haptic feedback research. Direct electrical stimulation is the most successful and frequently reported method of using electricity for stimulation. This topic will be the main focus of this section, even though there are other ways to use electricity for stimulation (such as electro-vibration and electrostatic tactile displays). Electrotactile feedback typically operates in a frequency range from 1 to 800 Hz, and the frequency of the electrical input largely determines which mechanoreceptors are stimulated. For instance, H. Kajimoto et al. developed a stimulation technique that used a 200 Hz pulsed current to activate Merkel receptors and induce a pressure sensation. The Meissner corpuscles, which have a sensitivity range of 20–70 Hz, were stimulated by a current with a frequency below 100 Hz, leading to a perception of low-frequency vibration [43].
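As a toy illustration of this frequency-to-receptor selectivity, the sketch below maps a stimulation frequency to the mechanoreceptor class discussed above; the cutoff values are coarse assumptions drawn from the cited ranges, not a physiological model.

```python
def targeted_receptor(freq_hz: float) -> str:
    """Mechanoreceptor class most responsive to an electrotactile input.

    Cutoffs are illustrative approximations of the ranges reported in [43].
    """
    if freq_hz < 1 or freq_hz > 800:
        return "outside typical electrotactile range (1-800 Hz)"
    if freq_hz < 100:
        # Meissner corpuscles (sensitivity ~20-70 Hz): low-frequency vibration
        return "Meissner corpuscle (low-frequency vibration)"
    # ~200 Hz pulsed currents were reported to activate Merkel receptors
    return "Merkel receptor (pressure sensation)"

print(targeted_receptor(50))   # Meissner corpuscle (low-frequency vibration)
print(targeted_receptor(200))  # Merkel receptor (pressure sensation)
```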
The stimulation frequency is not the only parameter involved in stimulating specific mechanoreceptors; the direction of the current flow also plays an important role. The result of electrostimulation can be qualitatively described as a tingling, itch, vibration, buzz, touch, pressure, or pinch. At skin–electrode interfaces, frequency also affects impedance. As the current intensity increases, the resistive component rapidly drops, producing erratic feedback strength. Current tuning is an approach that has been extensively used to address this issue [44].
In 2018, A. Akhtar et al. presented a new model that captures the relation between electrical characteristics and the strength of the electrotactile sensation. Additionally, they presented an automated controller that regulates these parameters, thereby maintaining the same feeling intensity even during exercise or when the electrodes partially peel off (Figure 7) [45].
The main parameters are the peak resistance (R_p), the peak pulse energy (E_p), and the phase charge (Q). R_p can be calculated from the measured peak voltage V_p, assuming a known current intensity I:

R_p = V_p / I
Additionally, for a monophasic input waveform with current I and duration T, the phase charge Q is:

Q = ∫₀^T I dt = I·T
Under the same conditions, the peak pulse energy E_p is:

E_p = R_p · ∫₀^T I² dt = I²·R_p·T
In their testing, participants had to alter I and T to achieve the same feeling intensity under various electrode peeling-off circumstances (i.e., impedance changes). The results suggest that E_p vs. R_p exhibits an almost perfectly linear proportional trend, with different slopes for the different feeling intensities of each subject.
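For concreteness, the three quantities can be evaluated for a monophasic rectangular pulse; the numeric values below are illustrative examples, not measurements from [45].

```python
def electrotactile_params(v_peak, i_amp, t_pulse):
    """Return (R_p, Q, E_p) for a monophasic rectangular current pulse."""
    r_p = v_peak / i_amp              # peak resistance: R_p = V_p / I
    q = i_amp * t_pulse               # phase charge: Q = I*T
    e_p = i_amp ** 2 * r_p * t_pulse  # peak pulse energy: E_p = I^2 * R_p * T
    return r_p, q, e_p

# Example: a 5 mA, 200 us pulse producing a 10 V peak across the skin
r_p, q, e_p = electrotactile_params(v_peak=10.0, i_amp=5e-3, t_pulse=200e-6)
print(r_p)  # peak resistance in ohm (2 kOhm)
print(q)    # phase charge in C (1 uC)
print(e_p)  # pulse energy in J (10 uJ)
```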
To accomplish electrotactile stimulation in the human body, a pair of electrodes is required since a current flow demands a closed circuit. Nano-sized metallic contacts of gold, silver, platinum, or stainless steel are the most popular. Due to their chemical stability, biocompatibility, flexibility, and compatibility with microfabrication techniques for creating ultra-flexible electrodes, nanosized Au films have found widespread usage.

3.7. Multi-Mode Integrated Haptic Interfaces

The demands and difficulties of the present and future haptic interfaces can be greatly addressed by multi-mode feedback technology. This section examines the prospects and state of the art of haptic feedback systems, highlighting their benefits and drawbacks through comparison and analysis of each discussed implementation.
In detail, the benefits and drawbacks of the various haptic technologies utilized in wearable and portable feedback interfaces are outlined in Table 1. Each technique successfully replicates some tactile feedback in a limited range. Due to the demands of multi-variate forces, only pneumatic and hydraulic actuators and electrostimulation are practical for kinesthetic feedback. By creating bumps due to changes in liquid or gas volume within the cavity, pneumatic and hydraulic actuators can generate pressure. These devices have relatively slow rates of shape change, which causes modest flow velocities. However, kinesthetic feedback can be generated because the gas and liquid tube system is deployed over the joints and muscles.
Unlike the methods stated earlier, electrostimulation is a mechanism that generates pseudo-haptic feedback: no skin deformation or force is applied to the skin or muscle. These interfaces merely attempt to send electrical pulses from the neurons or mechanoreceptors to the brain. The structure of nerve-stimulation-based devices might be simplified to enable their wider diffusion in practical applications. More work is still needed to better understand the nerve stimulation mechanism and select a group of electrical signals that produces a more realistic haptic impression. A novel system that combines several haptic feedback interfaces can create multi-mode haptic feedback and enhance perception. The development of wearable haptic devices that concurrently provide several skin stimulations is still far from adequate, and few studies discuss how several tactile stimulus systems work in concert. Integrating many haptic feedback interfaces into one device should rely on their underlying mechanisms to achieve a better result. To minimize space and create mixed stimulation modes, C. W. Carpenter et al. created an electro-pneumotactile actuator that integrates pneumatic and electrotactile stimulators inside the same device [46]. Stretchable wires and pneumatic actuators offered both electrical and mechanical stimulation.

4. Overview of Wearable Interface for Providing Haptic Feedback

Reproducing human touch is one of the primary goals of virtual reality research; thus, a significant part of the research aiming to improve touch perception is centered on sensory stimulation, avatar mobility, and avatar representativeness. The user is deeply engaged in the VR experience, regardless of whether changes impact their cognition, emotions, or bodily functions (interaction with objects, etc.) [47,48]. Driven by the need to recreate the interconnection with the surrounding environment and increase the efficiency of biomedical systems, haptic interfaces have also been used to develop modern health instruments.
Force feedback has been introduced and explored for teleoperation purposes, taking into account the many classes of haptic feedback currently accessible in several simulators and gaming interfaces. Furthermore, there is broad agreement on the significant value of haptic force feedback in rehabilitative and therapeutic systems [49]. In grasping and manipulation tasks, where force feedback is an essential requirement, it has been demonstrated to lower the rejection ratio [50,51] and improve the success rate [52]. Additionally, it reduces mental and physical stress and fosters a sense of embodiment [53,54]. Through several techniques that may be generally categorized as invasive and non-invasive, much work is being performed to recreate this unusual, bio-inspired characteristic for impaired individuals [55]. R. Yunus et al. developed an innovative non-invasive wearable vibrotactile haptic feedback (Vi-HaB) system [56]. By allowing them to recognize and differentiate between specific fingers and different amounts of force input from the fingers to the upper limb, this device restores a disabled person’s proprioceptive awareness. A plastic dummy hand integrates five force-sensitive resistors (FSRs), one on each fingertip. Five vibration motors transmit the force input sensed by these sensors to the user.
Three separate units constitute the Vi-HaB system: the slave side, the master side, and the processing unit; the first includes five force-sensitive resistors (FSRs) and connects the master side and the surrounding world [57]. The processing unit converts the sensors’ output voltages into the corresponding force intensities. To obtain the best resolution, the band was placed on the biceps region [58]. The wearable band has five equally spaced vibrational coin motors installed on the upper arm, aligned with the fingers. Changes in the amplitude and frequency of the vibrations are used to distinguish between different force levels. The frequency range of these motors (40 to 400 Hz) falls in the range of the Pacinian corpuscles, FA II-type receptors (Figure 8) [59].
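A minimal sketch of such a force-to-vibration encoding is given below, assuming a hypothetical 0–10 N fingertip force range discretized into four levels mapped across the 40–400 Hz band; none of these values or names are taken from [56,57].

```python
def force_to_vibration(force_n, f_min=40.0, f_max=400.0, levels=4):
    """Map a 0-10 N fingertip force to (level, drive frequency in Hz).

    Illustrative encoding: higher force -> higher discrete level -> higher
    coin-motor frequency inside the Pacinian-sensitive 40-400 Hz band.
    """
    force_n = max(0.0, min(force_n, 10.0))            # clamp to sensor range
    level = min(int(force_n / 10.0 * levels), levels - 1)
    freq = f_min + level * (f_max - f_min) / (levels - 1)
    return level, freq

print(force_to_vibration(0.5))  # light touch -> lowest level, 40 Hz
print(force_to_vibration(9.0))  # firm grasp -> highest level, 400 Hz
```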
A glove with multiple sensors and haptic feedback is needed for VR/AR applications and sophisticated robotics. Glove-based devices have the distinctive benefits of high accuracy and multiple-degrees-of-freedom (DOF) management functionality. These technologies, however, present several drawbacks for VR and AR scenarios. For instance, finger movement is a fine-grained characteristic that vision and voice identification have trouble picking up. A smart glove with improved joint manipulation capabilities was proposed by M. Zhu et al. [34]; it comprises elastomer-based touch sensors and a PZT haptic actuator. This glove can serve as a supplemental control interface for augmented interactions in VR/AR applications, alongside the existing visual and voice control interfaces (Figure 9).
Sensors and actuators for multimodal movement detection and fast haptic feedback are included in a 3D-printed glove shell. The smart glove comprises finger-bending sensors to sense the movements of each phalanx with multiple DOFs, whereas the palm sensor detects the shear and normal force in eight directions (Figure 9). Hemispheric-shaped triboelectric sensors were made with the Eco-flex elastomer. The sensing mechanism is based on the interaction between the finger skin, which serves as the positive triboelectric element, and the elastomer, acting as the negative element. A microcontroller generates a pulse-width modulation (PWM) signal for activating the vibrating actuator. The device’s size may be decreased while more DOFs can be detected.
Machine learning provides an appealing approach for in-depth analysis of the observed triboelectric signals and for extracting useful patterns from various occurrences [60,61], enabling an alternative implementation of the presented smart glove with significantly fewer sensors. As a preliminary test, sixteen sensors integrated into the smart glove were simultaneously monitored [62,63]. To provide adequate characteristics for their study, the authors aggregated the dataset from the sixteen channels to extract the required features [60]. Convolutional neural networks (CNNs) are a great tool to extract features automatically and simultaneously decrease data complexity. The confusion matrix of the models demonstrates that both approaches may help the glove achieve above 96% recognition accuracy using 400 training samples for every item.
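As a loose illustration of how such a CNN front end compresses multi-channel triboelectric data into a compact feature vector, here is a minimal NumPy sketch of one 1D convolution layer with ReLU and global max pooling; the filter count, kernel size, and random (untrained) weights are assumptions, not details from [34] or [60].

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_features(x, n_filters=8, k=9):
    """x: (channels, samples) -> feature vector of length n_filters.

    One untrained 1D convolution + ReLU + global max pooling, just to show
    how the dimensionality is reduced; a real model would learn the weights.
    """
    c, n = x.shape
    w = rng.standard_normal((n_filters, c, k)) * 0.1    # filters span all channels
    out = np.empty((n_filters, n - k + 1))
    for f in range(n_filters):
        for t in range(n - k + 1):
            out[f, t] = np.sum(w[f] * x[:, t:t + k])    # valid cross-correlation
    out = np.maximum(out, 0.0)                           # ReLU
    return out.max(axis=1)                               # global max pooling

signal = rng.standard_normal((16, 256))  # 16 glove channels, 256 samples
features = conv1d_features(signal)
print(features.shape)  # (8,): one pooled activation per filter
```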
Turning to a different design strategy, several recent studies have focused on improving users’ orientation and positioning perceptions when using haptic devices. According to their force reference system, tactile devices employed for this purpose may be divided into three categories: grounded, body-grounded, and ungrounded [64]. These technologies have a wide range of uses, from personal assistants for visually impaired individuals [65] to teleoperation, particularly in the fields of micro-robotics [66], medicine, and training using emulators [67,68]. Therefore, one strategy to enhance these impressions is to employ haptics to deliver feedback. B. Sauvet et al. suggested employing piezoelectric actuators to meet these requirements (Figure 10). In their work, a novel ungrounded haptic device was created, examined, and experimentally validated. The device creates an illusion of force using asymmetric accelerations produced by a piezo-actuator. Experimental evaluations were performed to ascertain the force produced by the device and verify the user’s impression [69].
The presented tactile system is made up of an aluminum frame, a linear guideway, a brass moving component, and a piezo actuator, which drives the mobile component (Figure 10). Three-axis accelerometers monitor the acceleration of both the mobile item and the framework. The research is based on Amemiya’s haptic principle, a method that relies on a pseudo-attraction force produced by unbalanced accelerations [70,71,72,73].
Portable haptic devices are an innovative method to improve orientation for users who are blind or visually impaired, or in other contexts such as simulators. There are several existing methods; however, they all primarily concentrate on reducing volume and force. As a result, systems that offer both a compact size and the generation of strong forces are required. The haptic feedback mechanism can be considered a combination of acoustic radiation force and acoustic streaming. Indeed, recent scientific works revealed much interest in creating mid-air tactile feedback devices based on ultrasound; these enable the induction of tactile sensations at any place and time without hindering human mobility. In the literature, several works were presented that exploit the abovementioned effects employing transducer arrays, which were brittle, heavy, and stiff [74,75]. In their work, H.B. Akkerman et al. introduced a novel tactile interface relying on printed polymer transducers (PPTs). They also focused on developing PPT piezo-membranes deposited using a printing process. The membrane is a sandwich of polyimide (substrate) and P(VDF-TrFE); the element connections were realized using metallic electrodes, and the element edges were formed by a thick organic material. The PPTs are fully flexible, given their reduced thickness (<0.25 mm) and weight, making them suitable for wearable applications [76]. Tests indicated that a 2 mm displacement was obtained for a PPT stimulated at its primary resonance of 29.5 kHz with a 500 V amplitude [77].
One of the most widely used haptic feedback strategies to provide users with tactile information is vibrotactile feedback. High-frequency stimuli are produced by a vibrotactile actuator and trigger FA-type mechanoreceptors [78,79]. Numerous tactile interfaces necessitate a substantially wider contact area [56,57], such as the torso or waist, because such mechanoreceptors have larger receptive fields [80,81,82]. As a result, several tactile interfaces have been created to deliver relatively basic alerts or give the user geographic location data for navigation. Nevertheless, finding a haptic system where a person can use their hand to engage with a virtual world by sensing local touch rendered as vibrotactile feedback is challenging. Despite vibrotactile feedback’s benefits, such as rapid user reaction times, it is usually used for applications requiring a greater contact area.
A first example is the study by S. Baik et al., which presents a vibrotactile-array-type tactile interface applied to the fingertip for VR/AR applications [83] (Figure 11). A previous study by the same authors showed that a participant could recognize a virtual item’s 2.5D shape produced with a fingertip haptic interface that used a two-by-two tactile matrix (Figure 11a) [84]. The authors then upgraded their system by extending haptic contact to three-dimensional objects. A fingertip image is translated to the fingertip location using the hand’s position data [83].
In a different study, the same researchers presented two fingertip interfaces for the index finger and thumb. The type-A interface was originally intended to generate 2.5D surface characteristics [85]. It contains four fingertip-sized piezoelectric actuators, each consisting of a ceramic disk with a 9 mm diameter placed coaxially on a metal plate with a 12 mm diameter. The second type of vibrotactile fingertip haptic interface (type B) is depicted in Figure 11b.
Continuous motion is another prerequisite for a realistic representation of a contact point. By carefully determining the elapsed time between the onsets of successive actuations, Israr and Poupyrev proved that discontinuous vibrotactile arrays could give the impression of continuous motion [86,87]. This interval is known as the stimulus onset asynchrony (SOA). The following criterion must be met to produce the perceived tactile movement:
SOA = 0.32·Ts + 0.0473

where Ts is the signal duration and SOA is the temporal distance between the onsets of consecutive actuations, both expressed in seconds. The SOA has to be greater than the signal duration since the authors employed numerous actuators to produce phantom sensations [88]. Accordingly, SOA ≥ Ts demands that the signal duration Ts be less than 0.0698 s [84].
To obtain this, the authors in [84] used an RGB-D sensor to track the hand position and a computer to calculate the target solicitations. A sinusoidal signal with a 250 Hz frequency represents the source signal. Then, the driving signals are synthesized through two DAQ boards coupled with a custom piezo amplifier according to the actuators’ target intensities. The mean Weber fraction for the developed haptic systems is depicted in Figure 12a. Specifically, it reports the outcomes of prior work in which a virtual item dimension assessment task was carried out using a mechanical feedback interface [89]. Figure 12b illustrates the experiment’s outcomes related to personal realism evaluations, classified by the kind of haptic interface. Participants gave Type-B interfaces’ rendering of the contact a considerably better realism rating than Type-A ones. The experimental results show that the proposed vibrotactile fingertip interface renders virtual objects with a perception comparable to that of a force feedback interface.
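The timing criterion above is easy to evaluate numerically; in the sketch below (function names are ours), `soa` implements the formula and `valid_phantom_timing` checks the SOA ≥ Ts condition for phantom-sensation rendering.

```python
def soa(ts):
    """Stimulus onset asynchrony (s) for a signal of duration ts (s)."""
    return 0.32 * ts + 0.0473

def valid_phantom_timing(ts):
    """True when SOA >= Ts, i.e. Ts <= 0.0473 / 0.68 ~= 0.0696 s."""
    return soa(ts) >= ts

print(soa(0.05))                   # ~0.0633 s between consecutive onsets
print(valid_phantom_timing(0.05))  # True: signal short enough
print(valid_phantom_timing(0.10))  # False: SOA (~0.0793 s) < Ts
```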
This investigation has shown that different technologies, sometimes even in combination, can provide efficient and reliable haptic feedback for multipurpose applications, from biomedical to entertainment implementations. Therefore, the following table (Table 2) summarizes the characteristics and technical specifications of all the devices analyzed in this section.

5. Survey of Sensing Systems for Haptic Interface

Many new perspectives for intelligent wearable systems such as HMIs can be expected thanks to the outstanding advancements in emerging technologies such as 5G/6G communication, the Internet of things (IoT) [90,91,92], the tactile Internet [93], machine learning, image processing [94,95], and neural data processing [96,97].
Hand gestures are commonly involved in every form of human–computer interaction [98,99]. The preponderance of HMI systems relies on touch, audio, and visual modalities. Using two-way tactile communication is crucial since sensory feedback alone is insufficient for productive conversation. Most VR/AR systems mainly offer visual and audio feedback but do not give consumers immersive experiences.
Innovative implantable and wearable devices with flexible and elastic form factors and characteristics such as wearability, non-obtrusiveness, and biocompatibility have also been made possible thanks to advances in material science [100,101]. To monitor tiny movements, pressure changes, or physiological signals, the latter must be conformally applied to the body [102,103]. These advancements in sensing technologies have expanded the possibilities for developing intelligent, user-friendly glove-based HMIs. Technology advancements are making interactive devices quicker; therefore, it is important to investigate modalities that enhance the user experience and accommodate several classes of users, including deafblind persons with compromised visual and auditory modalities.
Realizing a robust tactile communication system that conveys genuine emotions requires understanding the physiological and emotional aspects of human touch [104]. O. Ozioko et al. considered these factors while integrating interdisciplinary inputs from several disciplines, including electronics, mechanics, communications, informatics, material science, and robotic systems. They mainly concentrated on haptic smart gloves that use feedback and tactile sensing [93].
Human contact largely relies on visual, audio, and tactile interactions, as already mentioned; however, deafblind persons cannot use these senses. Nevertheless, impaired people with poor eyesight and hearing can rely on the sense of touch [105]. Therefore, using systems with traditional tactile interfaces, such as keyboards, presents challenges for them. People who are deafblind employ a variety of tactile and nontactile communication techniques depending on the situation or stage of life. Nonverbal (typically employed by users born blind and deaf), block alphabets, symbol-based approaches, lip-reading (such as Tadoma), imitation, observation, and Braille are some of these communication techniques. Deafblind people often employ gestures or touch to communicate; two popular tactile options are Braille and the deafblind manual alphabets (touch-based) [106]. The deafblind gesture language is an alphabet-based representation of spelling out words on the speaker’s hand, where letters are indicated by touch, movement of the finger palm, or a finger symbol [106]. The British deafblind manual alphabet (B-DMA), which is employed in the UK [107], the Lorm alphabet, which is used in Austria, Germany, and Poland [108], and the Malossi alphabet, which is used in Italy [109], are some of the most widely used alphabets. Another touch-based method used by deafblind persons to communicate is Braille. Louis Braille created a tactile reading and writing method that blind people and a few deafblind persons mostly use. Numbers and letters are coded with raised dots arranged in six-dot Braille units.
Table 3 outlines the main prerequisites for developing smart gloves that could be employed for alternative and augmented communication.
Smart gloves are increasingly being used in a variety of industrial, learning, gaming, and biomedical applications. Such devices comprise multiple sensors (such as touch, pressure, and flex) to detect the hand’s pressure, flexion, and direction [79] (Figure 13). Additionally, the glove acts as an input source in these contexts, enabling natural and immediate interaction with machinery. Researchers have created a variety of glove-based HMIs for these applications. Figure 13a illustrates the main applications of the available smart gloves [93,110,111,112]; for instance, they can replace tool-based manual interactions, such as the manual measurement of the angle between fingers using goniometers during hand physiotherapy.
The three basic types of smart gloves are also depicted in Figure 13a: gesture-based smart gloves (G-SGs), touch-based smart gloves (T-SGs), and gesture- and touch-based smart gloves (GT-SGs). Different sensing techniques, such as piezoelectric, piezoresistive, and capacitive, were employed to create the sensors used in smart gloves [103]. Hence, such components affect the smart glove’s performance, potentially hindering a fluid and straightforward user experience. For instance, the quantity and sensitivity of the connected IMUs affect the accuracy and precision of the smart gloves in determining hand position. Since most IMUs employ MEMS technology, an established technology, they often provide excellent repeatability and accuracy [110]. Smart gloves relying on IMUs, unfortunately, are unable to quantify the load applied to the hand or fingers, which limits their applicability. Thanks to rapid improvements in flexible and printed electronics, the issues mentioned above may be effectively addressed by creating adaptable tactile sensors, actuators, and artificial e-skins. For instance, glove-based HMIs might advance with the help of the e-skin-type solutions made possible by flexible electronics, as illustrated in Figure 13b, to offer a deeper, more intuitive user experience [103].
Smart gloves with built-in sensors have been used in gesture-based HMIs to assess finger position and identify important facial expressions [113]. Indeed, in G-SGs, the data provided by the embedded sensing devices are analyzed using various computing approaches to detect hand orientations [93]. The application of this technology to individuals with stroke, cerebral palsy, Parkinson’s disease, brain dysfunctions, and other conditions is now being investigated [113]. Virtual-reality-based rehabilitation has also been researched in orthopedic surgery, primarily for accident patients’ hands or ankles [90,114]. These systems were created primarily to address the shortcomings and errors of the manual methods used by practitioners to measure joint angles and rehabilitation progress, respectively, using goniometers and questionnaires. Gesture-based smart gloves have also been used for AAC (augmentative and alternative communication) in addition to rehabilitation. For instance, due to their visual and hearing impairments, deafblind persons are supported by the fingerspelling communication method, employing various finger motions to transmit letters, words, or sentences [115]. The output of the sensors is analyzed to recognize the intended letter using an adaptive pattern recognition algorithm. Although the device allows deafblind individuals to communicate, it is neither wearable nor portable.
A new data glove relying upon inertial and magnetic measurement units (IMMUs) (namely, gyroscopes, accelerometers, and magnetometers) was introduced by B. Fang et al. [110]. The proposed data glove contains 18 inexpensive, wearable, and small-sized IMMUs. Cables link all sensor units to a centralized control device (Figure 14).
The three-step ELM (Extreme Learning Machine)-based gesture recognition approach was also applied to the data generated by the designed data glove. Creating a gesture database and gathering motion data for various gestures constitute the initial stage. Training the classifier is the second stage, followed by the experiment and analysis stage. To confirm the efficacy of the orientation estimation technique, three IMMU orientations were assessed in various contexts. According to performance assessments, the suggested data glove can precisely record 3D motions, and ELM-based algorithms can precisely identify the gestures.
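To make the ELM idea concrete, the following self-contained NumPy sketch trains a toy ELM classifier: the hidden layer is random and fixed, and only the output weights are obtained with a single pseudo-inverse solve. Layer sizes and the synthetic "gesture" data are illustrative assumptions, not values from [110].

```python
import numpy as np

rng = np.random.default_rng(1)

class ELM:
    """Minimal Extreme Learning Machine: random hidden layer, linear readout."""

    def __init__(self, n_inputs, n_hidden, n_classes):
        self.w = rng.standard_normal((n_inputs, n_hidden))  # fixed, never trained
        self.b = rng.standard_normal(n_hidden)
        self.beta = np.zeros((n_hidden, n_classes))

    def _hidden(self, x):
        return np.tanh(x @ self.w + self.b)

    def fit(self, x, y_onehot):
        h = self._hidden(x)
        self.beta = np.linalg.pinv(h) @ y_onehot  # one least-squares solve

    def predict(self, x):
        return np.argmax(self._hidden(x) @ self.beta, axis=1)

# Toy problem: two separable "gesture" clusters in a 54-D feature space
# (e.g., 18 IMMUs x 3 orientation angles -- an assumed encoding).
x0 = rng.standard_normal((40, 54)) + 2.0
x1 = rng.standard_normal((40, 54)) - 2.0
x = np.vstack([x0, x1])
y = np.zeros((80, 2))
y[:40, 0] = 1
y[40:, 1] = 1

elm = ELM(54, 64, 2)
elm.fit(x, y)
acc = (elm.predict(x) == np.r_[np.zeros(40), np.ones(40)]).mean()
print(acc)  # near-perfect training accuracy on this separable toy set
```

The appeal of ELMs for embedded gesture recognition is that training reduces to a single linear solve, with no iterative backpropagation.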
G-SGs are known for their remarkable performance in healthcare and rehabilitation applications. In recent years, advancements in the tools and techniques used to carry out these operations have made it feasible to treat and rehabilitate the upper limb in stroke patients. Additionally, based on various rehabilitation techniques and the severity of the patient’s disability, several robots can be selected. The employment of robotics has the benefit of enabling a quantitative evaluation of patient improvements. The upper limbs are currently rehabilitated using a variety of traditional methods, including constraint-induced, bilateral, mirror, and repeated task training. In this way, the quality of life for those with difficulties after a stroke might be improved [116]. Careful follow-up and evaluation are necessary to measure the hand motions that the patient can execute when customizing a rehabilitation program to help the patient restore the abilities that have been lost. Therefore, C. Luca et al. designed an intelligent sensory glove customizable to the user’s residual physiological functions [117]. The proposed system relies on the Arduino platform and comprises different bending and pressure sensors applied to three fingers and an LCD screen mounted on common gloves. The sensors track the amount of pressure the patient applies when first starting to grab items. The device was tested on three healthy volunteers, who were asked to perform particular rehabilitation exercises to ascertain the joint angles’ range. Furthermore, the force needed to carry out typical gripping tasks was measured using the integrated pressure sensors.
A person’s lifestyle is unavoidably altered by limb loss following an amputation. Loss of muscular and cognitive capabilities severely restricts their capacity to carry out everyday tasks and lowers their quality of life. An upper limb amputee can partially regain motor capabilities using a prosthesis based on an analysis of electromyography or powered by body movements. Nevertheless, the restoration of normal sensory abilities is still difficult. The B: Ionic glove, a prosthetic sensory system that may provide mechano-tactile stimuli on the user’s arm as a function of the pressures generated at the glove’s fingers, was introduced by M.F. Simons et al. in [118] (Figure 15).
Pressure pads containing a conductive fluid were attached to the glove’s fingers. This fluid moves through a system of silicone tubes when pressure is applied, joining electrode pairs and constituting electrical circuits. As a result, SMA (shape memory alloy) actuators on a wristband worn on the user’s remaining limb begin to softly compress their forearm [119].
The B: Ionic glove contains electro-fluidic sensor pads beneath each fingertip carrying a conductive liquid, a fluidic control system, and a vibratory elastic band applied to the forearm. When a pad is pressed, the conductive liquid is forced through silicone channels, closing a pair of contacts. The authors also created a wearable tactile device that stimulates the user’s upper limb mechanically (Figure 16). The squeeze armband shown in [120] inspired the design of the tactile armband used in this work. M. Garrad et al. developed the Soft Matter Computer (SMC) [121]; it was manufactured in silicone formed in a 3D-printed mold, employing an ionic-conductive medium represented by saltwater, which is cheap and non-toxic. Five channels within the SMC allow silicone tubes to fit into the device. Each channel contains two holes for inserting copper contacts, with an applied AC voltage to prevent the saltwater from electrolyzing.
The armband can alleviate phantom limb discomfort and improve embodiment, promoting prosthetic device acceptance and lowering high rejection rates. The armband has the potential for more sophisticated computing and can rely on a variety of pressure points and strengths applied by prosthetic fingertips to the user’s skin.
Tremors, a common movement disorder, can have a negative impact on a person’s life. There is a requirement for a biomechanical solution due to the many limitations of present therapeutic techniques. Neurodegenerative illnesses such as Parkinson’s disease, neurological disorders such as multiple sclerosis, brain traumas, and drug side effects have been reported to produce tremors [122]. However, the tremor-reduction systems now on the market have several drawbacks. Therefore, it is clear that low-weight, less imposing orthoses are required to control hand tremors. The jamming mechanism suggested by A.T. Wanasinghe et al. in [123] can be applied to develop such products for hand tremor suppression. To determine the nature and features of tremor movements, an initial tremor motion evaluation was carried out [124]. As a result, the layer jamming mechanism was suggested as the stiffening approach for tremor reduction devices [125]. Figure 17 illustrates the damping process of this tremor reduction device integrated into a glove. Negative pressure triggers the jamming layer, consisting of a pile of paper sheets within a closed polyethylene housing. When negative pressure is applied to this housing, the air between the sheets is removed, increasing the frictional forces between them and withstanding flexion.
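A back-of-the-envelope model (our simplification, not from [123]) shows why jamming stiffens the stack: the shear force the layers can transmit before slipping grows with the vacuum pressure, pad area, friction coefficient, and number of sheet interfaces.

```python
def jamming_shear_capacity(p_vac_pa, area_m2, mu, n_sheets):
    """Approximate max interlayer shear force (N) before the sheets slip.

    Coulomb-friction estimate: the pressure difference presses the sheets
    together, and every sheet-sheet interface contributes mu * P * A.
    """
    n_interfaces = n_sheets - 1
    return mu * p_vac_pa * area_m2 * n_interfaces

# Illustrative numbers: 50 kPa vacuum, 8 cm x 4 cm pad,
# paper-on-paper friction coefficient ~0.35, 10 sheets
f = jamming_shear_capacity(50e3, 0.08 * 0.04, 0.35, 10)
print(round(f, 1))  # ~504 N of shear capacity
```

Releasing the vacuum restores the air film between the sheets, so the same stack becomes compliant again, which is what makes the mechanism attractive for an on-demand tremor-suppressing orthosis.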
The standard HMI solutions are self-occluding and unintuitive, as in vision- and voice-based systems. To achieve a seamless and more user-friendly HMI, e-skin technology can be applied. Creating an HMI that can enable extensive bidirectional user interaction while still conforming to the human skin is difficult. As mentioned, an e-skin should include four crucial elements to function as an interactive two-way HMI: (1) touch sensors; (2) vibratory actuators; (3) strain sensors; and (4) other specific and sophisticated capabilities for healthcare applications, etc.
Besides tactile perception and actuating capabilities, detecting other chemical and physical quantities from the body will allow accurate basic health monitoring and early illness diagnosis. Consequently, users can track their health conditions using the same wearable HMI they already use for interaction. This aim is essential for elderly individuals, persons with sensory impairment (such as deafblind people), and those with chronic illnesses such as heart disease or diabetes [93]. After evaluating the reliability and functional purpose of HMIs, a comparison between the devices implementing haptic interfaces is reported in Table 4, considering the quantity and types of the employed sensors, the data collection technology, and future applications and improvements. In our opinion, the sensing approach based on inertial sensors, implemented in [110] and [125], is the most promising in terms of unobtrusiveness, accuracy, and integrability in wearable devices. Indeed, inertial sensors do not require tight contact with the monitored body parts. Furthermore, given the advances in MEMS-based inertial sensors in recent decades, they can be easily integrated into garments while ensuring very high accuracy in detecting body movement [126,127].

6. Designed Architecture of the Smart Sensory Glove Based on AlN-Based Sensors

This section presents a novel smart glove supported by an onboard ML algorithm to classify gestures by processing the signals acquired by the integrated sensors. The proposed solution relies on five ultra-thin and flexible AlN-based piezoelectric sensors placed at proper points on the back of the hand. Then, the design and testing of the electronic conditioning section for AlN-based flexible sensors are presented. Finally, the architecture of a wearable hybrid recognition system is shown, which combines images captured by a micro-camera with haptic data collected by the smart glove to identify the touched object.
The piezoelectric sensors integrated into the smart glove are based on an aluminum nitride (AlN) thin layer, a biocompatible piezoelectric material with high resistivity and electromechanical response, well suited for applications on the human body. The transducer was made by superimposing several layers: an initial protective 25 µm thick Kapton film, a flexible material resistant to high temperatures; a 120 nm thick AlN layer and a 200 nm thick molybdenum (Mo) electrode patterned by photolithography; and two further patterned layers of 1 µm thick AlN and 200 nm thick Mo. A parylene layer was applied both above and below to insulate the sensor's surface, leaving part of the upper surface exposed for the electrical connections. In this non-insulated area, a connection package made with 3D printing was applied, depositing conductive (CI) and dielectric (DI) ink. Figure 18 shows an exploded diagram of the obtained multilayer transducer. Such sensors have previously been integrated into other wearable systems to monitor swallowing in patients suffering from dysphagia [20], to harvest electrical energy from human body movements [21], or, in combination with a triboelectric sensor, to detect human joint movements [22].
The sensor was characterized using a specially calibrated setup to establish the piezoelectric response as the applied pressure varies. First, the sensor was inserted in a chamber where controlled pulsed air flows were generated, resulting in pressures on the sensor surface between 10 kPa and 50 kPa, measured using a pull shear apparatus. Then, the voltage signal was taken directly from the sensor, without any intermediate filtering or amplification stage, finding good linearity between the open-circuit voltage and the applied pressure, according to the following equation:
V_piezo = α · A_contact · P
where the parameter α is a function of the intrinsic material properties and its response to mechanical deformations, A_contact is the sensor's contact surface, and P is the applied pressure. Within the examined pressure interval, a 0.025 V/N sensitivity and a 15 ± 2 ms response time (the rise time between 10% and 90% of the signal) were calculated.
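As a sanity-check sketch of this linear model, the expected open-circuit voltage can be computed from the sensitivity reported above; the contact area used here is an illustrative assumption, not a value from the text:

```python
# Linear piezo model: V_piezo = alpha * A_contact * P, where the force on the
# sensor is F = A_contact * P, so alpha is a sensitivity in V/N.

ALPHA_V_PER_N = 0.025   # sensitivity reported in the text (V/N)
AREA_M2 = 1e-4          # assumed 1 cm^2 contact area (illustrative, not from the text)

def piezo_voltage(pressure_pa, alpha=ALPHA_V_PER_N, area=AREA_M2):
    """Open-circuit voltage predicted by the linear model."""
    return alpha * area * pressure_pa

for p_kpa in (10, 30, 50):  # pressure range used in the characterization
    print(f"{p_kpa} kPa -> {piezo_voltage(p_kpa * 1e3):.3f} V")
```

With these assumed values, the 10–50 kPa characterization range corresponds to forces of 1–5 N and voltages of 25–125 mV, consistent with the stated need for amplification downstream.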
Specifically, the smart glove consists of a standard cotton glove on which five custom piezoelectric sensors are attached at the knuckles of the fingers with silicone glue. These sensors allow the discrete detection of finger movements, given their very low thickness and weight and high flexibility. A similar application of AlN-based piezo-tribo sensors was proposed in [22], in which they were tested for their ability to detect different body movements, including hand gestures (Figure 19). That study introduced a wearable hybrid sensor (WHS) with three inorganic/elastomeric electrodes for multifunctional and multi-site identification of human movements. The piezoelectric component is based on the same stack previously presented in Figure 18, coupled with a conformable triboelectric component consisting of a PDMS and Ecoflex blend, encapsulated in a parylene C layer whose surface was modified with UV/ozone and oxygen plasma. In this paper, we take a step forward in terms of integration and data processing compared to the idea presented in [22].
The tiny signals generated by the five sensors must be amplified and conditioned before being acquired by the following acquisition section. In laboratories, professional and specialized equipment is used to acquire and process the piezoelectric output voltage from sensors; however, realizing an electronic acquisition and amplification board is quite simple and allows almost the same performance to be obtained at lower cost and size. Additionally, to ensure better comfort, the smart glove should integrate a stage to amplify and filter the signals generated by the piezoelectric transducers and then acquire and process them to extract useful information. Specifically, the conditioning section comprises a preamplification stage and a filtering stage: the first converts the input signal supplied by the piezoelectric sensor into a low-impedance signal source while limiting the noise as much as possible (Figure 20), and the second removes the undesired signal components. These operations are performed in parallel by a conditioning board including five identical conditioning channels, one for each piezoelectric sensor. The design and testing of this conditioning section are presented in Section 6.1.
The conditioned signals from the five sensors are acquired by the 12-bit analog-to-digital converter (ADC) integrated into the nRF52840 System on Chip (SoC) (Figure 20). This very low-power SoC integrates a multi-protocol 2.4 GHz transceiver and a powerful CPU with Arm® Cortex®-M4F architecture, equipped with 1 MB of flash program memory. The chip is an ideal SoC for developing IPv6-compatible automation applications or short-range wireless personal area networks. Furthermore, the device comprises a 12-bit ADC, multiplexed on eight configurable channels, operating with a sampling frequency of up to 200 ksps (kilosamples per second).
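As a rough sketch of the acquisition step, the 12-bit quantization can be modeled as follows; the 0–3.3 V input span is an assumption, since the actual SAADC reference and gain depend on firmware settings not detailed in the text:

```python
# Model of a 12-bit ADC quantizing a conditioned sensor voltage.
# VREF below is an assumed full-scale span, not a value from the text.

VREF = 3.3
N_BITS = 12
FULL_SCALE = (1 << N_BITS) - 1   # 4095 codes for a 12-bit converter

def adc_code(v_in):
    """Clamp the input to the span and quantize it to an integer code."""
    v = min(max(v_in, 0.0), VREF)
    return round(v / VREF * FULL_SCALE)

def code_to_volts(code):
    """Reconstruct the voltage represented by an ADC code."""
    return code / FULL_SCALE * VREF

print(adc_code(3.3))   # full-scale input -> 4095
```

The quantization step is VREF/4095 ≈ 0.8 mV, which is why the preamplification and filtering stages described above are needed before acquisition.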
Afterward, the acquired data were employed to infer the gestures carried out by the user. For this aim, the Edge Impulse® platform was used, an integrated development environment for developing machine-learning models. It manages the acquisition and processing of data easily and quickly; in fact, it conveniently imports sensor, audio, and camera data and seamlessly distributes them to embedded systems, GPUs, and custom accelerators. The platform allows using data from any sensor to extract meaningful information for recognizing sounds or keywords, as well as testing the created model with real data to detect bugs early in the life cycle of the newly trained model [128]. The device software, including SDK, client, and generated code, is provided in open-source mode with an Apache 2.0 license. The collaboration with the TensorFlow Lite Micro project enables supporting a wide range of architectures, operators, and targets.
Feature extraction for the designed model is performed with a 2 s time window and a 0.2 s overlap. In particular, a spectral analysis is chosen for the considered application, which extracts the frequency and power features of the signal [128] (Figure 21).
Low-pass and high-pass filters can also be applied to eliminate unwanted frequencies. Spectral analysis is suitable for analyzing repetitive patterns such as those in motion or vibration signals provided by inertial sensors [128]. The resulting spectral features are fed into a neural network comprising an input layer with 55 nodes (55 features), two hidden layers of 20 and 10 neurons, and 6 output nodes, representing six classes corresponding to gestures indicating the numbers from 0 to 5 (Figure 19c). The graphic representation of the designed inferring chain is reported in Figure 21.
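A simplified numpy sketch of this per-channel spectral feature extraction (RMS plus dominant FFT peaks) might look like the following; the sampling rate and the exact feature set are assumptions, since the text specifies only the 2 s window and the 55-feature total:

```python
import numpy as np

# Simplified stand-in for the spectral-analysis block: for each of the five
# sensor channels, extract the RMS plus the strongest FFT magnitudes inside
# a 2 s window. The real Edge Impulse feature set (55 features) is richer.

FS = 100          # assumed sampling rate in Hz (not specified in the text)
WIN = 2 * FS      # 2 s window, as in the text
N_PEAKS = 3       # illustrative number of spectral peaks per channel

def spectral_features(window):
    """window: (n_channels, WIN) array -> flat feature vector."""
    feats = []
    for ch in window:
        feats.append(np.sqrt(np.mean(ch ** 2)))   # RMS power feature
        mag = np.abs(np.fft.rfft(ch))[1:]         # magnitude spectrum, DC dropped
        feats.extend(np.sort(mag)[-N_PEAKS:])     # top spectral peaks
    return np.asarray(feats)

x = np.random.default_rng(0).normal(size=(5, WIN))
print(spectral_features(x).shape)   # 5 channels x 4 features -> (20,)
```

Each 2 s window thus collapses into a fixed-length vector, which is what allows a small dense network (55→20→10→6 in the designed model) to run on the microcontroller.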
Thanks to the communication capability of the selected microcontroller unit, the developed smart glove can wirelessly send indications related to the system status and recognized classes to a remote host device (Figure 20). Finally, the 3D model of the designed smart glove is reported in Figure 22. As shown, the five AlN-based piezoelectric sensors are placed at the junction between the first and second phalanx. Additionally, the electronic conditioning, acquisition, and processing sections are installed on the hand's upper face, along with the 500 mAh LiPo battery that supplies the smart glove.

6.1. Development and Testing of the Conditioning Section for the AlN-Based Flexible Sensors

Figure 23 shows the block diagram of the analog front-end developed for the AlN-based piezoelectric sensors. This architecture comprises the following main sections:
  • Preamplifier, which transforms the input signal provided at the sensor’s high-impedance output into a low-impedance signal source.
  • Amplifying–filtering block, which filters and amplifies the voltage signal provided in output from the preamplification.
  • Analog-to-digital converter, which digitizes the signal provided by the transducer.
  • Power supply.
The preamplification stage is a charge amplifier, a circuit configuration wherein the output voltage is proportional to the total injected electrical charge, i.e., the integral of the input current [130,131,132,133]. Figure 24 depicts the circuit configuration of the charge amplifier implemented in the designed conditioning section, based on the AD8542ARMZ opamp.
The sensor is represented by its Thevenin equivalent circuit, i.e., as a voltage generator Vq in series with the parallel combination of Rp and Cp. Cc represents the cable's capacitance, Ri is used to prevent electrostatic discharges (ESD), while Rf and Cf are the feedback resistance and capacitance, respectively. A first voltage buffer stage was fed by a voltage divider made of two equal resistors R1 and R2, providing Vcc/2 on its non-inverting terminal and enabling the stage to be supplied with a single power supply. The buffer output was connected to the non-inverting terminal of the charge amplifier and to the sensor's ground. In this way, the DC contribution (Vcc/2) on the output is not amplified. Indeed, replacing the capacitors with an open circuit and applying the superposition principle to the two amplifier terminals results in a DC output contribution of
Vo = −(Vcc/2) · Rf/(Ri + Rp) + (Vcc/2) · [1 + Rf/(Ri + Rp)] = Vcc/2
A common-mode voltage equal to Vcc/2 is present on the charge amplifier's output when the sensor is disconnected from the circuit. The transfer function at the output of the charge amplifier is given by
Vo/Vq = [Rf/(Ri + Rp)] · (1 + s·Rp·Cp) / {(1 + s·Rf·Cf) · [1 + s·(Ri·Rp/(Ri + Rp))·(Cp + Cc)]}
The function has a zero at the frequency f_Z = 1/(2π·Rp·Cp), determined by the sensor impedance (Rp and Cp), and two poles: the first, which depends on Rf and Cf in the feedback loop, determines the lower cut-off frequency f_L = 1/(2π·Rf·Cf); the second, depending on Ri||Rp and (Cp + Cc), determines the upper cut-off frequency f_H = 1/(2π·(Ri·Rp/(Ri + Rp))·(Cp + Cc)). For f = 0, the output signal has a static gain G = Rf/(Ri + Rp). The mid-band gain (in the frequency range between f_L and f_H) is G = (Rp·Cp)/[(Ri + Rp)·Cf]. Beyond the high cut-off frequency f_H, the gain amplitude drops by −20 dB/decade.
The components' values are shown in Figure 24, where the sensor resistance and capacitance were assumed equal to Rp = 1 MΩ and Cp = 1 nF, respectively. After performing two simulations with two different values of Ri (100 kΩ and 1 kΩ), the same values of f_Z and f_L were obtained, whereas the f_H value obtained with Ri = 1 kΩ is two decades greater than that found with Ri = 100 kΩ.
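These corner frequencies can be checked numerically; Rp and Cp are the values assumed in the text, while the cable capacitance Cc used here is an illustrative assumption:

```python
from math import pi

# Corner frequencies of the charge-amplifier stage:
#   f_Z = 1/(2*pi*Rp*Cp)                      (zero, set by the sensor)
#   f_H = 1/(2*pi*(Ri||Rp)*(Cp + Cc))         (upper pole)

Rp, Cp = 1e6, 1e-9    # sensor model assumed in the text
Cc = 100e-12          # assumed cable capacitance (not given in the text)

def parallel(a, b):
    """Parallel combination of two resistances."""
    return a * b / (a + b)

f_Z = 1 / (2 * pi * Rp * Cp)
print(f"f_Z = {f_Z:.0f} Hz")   # ~159 Hz

for Ri in (100e3, 1e3):
    f_H = 1 / (2 * pi * parallel(Ri, Rp) * (Cp + Cc))
    print(f"Ri = {Ri/1e3:.0f} kOhm -> f_H = {f_H/1e3:.1f} kHz")
```

Since Ri||Rp shrinks from about 91 kΩ to about 1 kΩ when Ri drops from 100 kΩ to 1 kΩ, f_H rises by roughly two decades, matching the simulation result reported above.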
As previously mentioned, the amplitude of the signal generated by the piezoelectric sensor is typically low. Thus, well-designed filtering and amplification are essential within the conditioning chain. A properly designed filter should pass the input signal's informative frequencies and reject the other components. Moreover, the filter typology and order should be chosen based on the specific application. A Bessel filter should be employed if the shape of the piezoelectric pulse response is essential; a Butterworth filter should be deployed if a flat frequency response in the pass-band is necessary but the pulse response is not crucial. The latter is the filter type used in our work, since it meets our design's requirements [129]. We deployed an active band-pass filter based on Sallen–Key cells with a Butterworth response (Figure 25).
Since the main focus was to obtain a band-pass filtering stage matching the characteristics of the abovementioned charge-amplifier preamplification stage, the filter's bandwidth was defined by f_L = 159 Hz and f_H = 1.59 kHz, the low and high cut-off frequencies, respectively. f_L and f_H are related to the quality factor Q and the center frequency f_0 by the following two relations:
f_L = f_0 · [√(1 + 1/(4Q²)) − 1/(2Q)]  (8)
f_H = f_0 · [√(1 + 1/(4Q²)) + 1/(2Q)]  (9)
Applying the frequency conditions to Equations (8) and (9), the quality factor and center frequency result in Q = √10/9 ≈ 0.35 and f_0 ≈ 503 Hz, respectively.
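Inverting Equations (8) and (9) gives f_0 = √(f_L·f_H) and Q = f_0/(f_H − f_L); a quick numerical check with the band edges chosen above:

```python
from math import sqrt

# Center frequency and quality factor of the band-pass stage from its band
# edges, obtained by inverting Equations (8) and (9).

fL, fH = 159.0, 1590.0          # cut-off frequencies chosen in the text (Hz)

f0 = sqrt(fL * fH)              # geometric mean of the band edges
Q = f0 / (fH - fL)              # center frequency over bandwidth

print(f"f0 = {f0:.0f} Hz, Q = {Q:.4f}")   # ~503 Hz, ~0.3514 (= sqrt(10)/9)
```

The low Q (well below 0.5) reflects the wide, decade-spanning pass-band of this stage.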
For the first stage (high-pass filtering), we assume resistor and capacitor values of R_1 = mR, R_2 = R, and C_1 = C_2 = C = 10 nF. Therefore, considering the calculated value of the quality factor Q and assuming a unitary static gain (K = 1), the following values of m and R are derived:
Q = √10/9 = m/[2 + m(1 − K)]  →  m ≈ 0.701
f_L = 159 Hz = 1/(2π·R·C·m)  →  R = 1/(2π · 159 Hz · C · m) ≈ 140 kΩ
Moving to the next stage (the low-pass filter), a desirable feature was a flat signal attenuation near the cut-off frequency, a requirement satisfied by a Butterworth-type filter. Considering the sizing method proposed in [134], equal values for the resistors and capacitors were assumed: R_3 = R_4 = R and C_3 = C_4 = C = 10 nF. Since the filtering stage must also amplify, a static gain K = 1 + R_6/R_5 = 2 was set, introduced by the resistors R_5 = R_6 = 10 kΩ. Therefore, knowing the previously defined value f_H = 1.59 kHz, the following R and Q values were derived:
f_H = 1.59 kHz = 1/(2π·R·C)  →  R = 1/(2π · 1.59 kHz · C) ≈ 10 kΩ
Q = 1/(3 − K) = 1
For practical reasons, and given the independence of this stage's quality factor from the previous one, an f_H value of 2 kHz was preferred. Considering this variation, a different value for R was also determined, namely 8.2 kΩ (according to the available commercial resistor values). To counteract the increase in the filter stage's frequency response at high input frequencies, a low-pass RC filter was placed at the amplifier's output, constituted by a 100 Ω resistor (R_7) and a 0.047 µF capacitor (C_5) [135]. The latter added a passive pole, 1/(1 + s·R_7·C_5), to the transfer function at about 34 kHz, which improved the high-frequency response. The transfer function of the resulting filtering and amplification stage is depicted in Figure 26.
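The component values derived for the two Sallen–Key stages and the output RC pole can be verified numerically with the formulas above:

```python
from math import pi

# Numeric check of the component sizing for the band-pass conditioning stage.

C = 10e-9          # capacitor value used in both Sallen-Key stages

# High-pass stage: R = 1/(2*pi*fL*C*m) with m ~ 0.701 -> ~140 kOhm
m = 0.701
R_hp = 1 / (2 * pi * 159 * C * m)

# Low-pass stage: R = 1/(2*pi*fH*C) -> ~10 kOhm
R_lp = 1 / (2 * pi * 1590 * C)

# Output RC filter pole: f = 1/(2*pi*R7*C5) -> ~34 kHz
R7, C5 = 100, 0.047e-6
f_pole = 1 / (2 * pi * R7 * C5)

print(f"R_hp = {R_hp/1e3:.0f} kOhm, R_lp = {R_lp/1e3:.1f} kOhm, "
      f"f_pole = {f_pole/1e3:.1f} kHz")
```

The computed values (about 143 kΩ, 10 kΩ, and 33.9 kHz) round to the 140 kΩ, 10 kΩ, and 34 kHz figures quoted in the text, before the substitution of commercial resistor values.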
The printed circuit board (PCB) of the developed conditioning section for a single channel of the smart glove is represented in Figure 27a, along with the corresponding 3D model (Figure 27b). The Autodesk Eagle CAD software was used to develop the project's schematic and the corresponding PCB.
After the conditioning section was realized, different tests were performed to verify its correct operation; the flexible piezoelectric sensors based on AlN were employed to carry out the tests. Specifically, three piezoelectric AlN sensors were used, which differ slightly in shape and size (Figure 28).
In detail, the large rectangular sensor (Figure 28b) was employed because it is considered the most suitable for the selected application areas (finger and wrist), since its larger active surface enables a stronger response. An oscilloscope (model DSO5102P, manufactured by Hantek Inc., Honeoye Falls, NY, USA) was employed to acquire the voltage signals provided by the developed conditioning board (yellow trace for the unfiltered signal and cyan trace for the filtered signal). A first functional test was performed to assess the board's correct operation by applying flexural (Figure 29a) and impulsive (Figure 29b) solicitations. At first, flexural stress was applied by gently bending the sensor's tip and quickly releasing it; afterward, impulsive in-plane solicitations were applied to the sensor's surface while keeping it on a hard surface. Then, the sensor was applied on the index finger through a glove (Figure 30) and on the wrist (Figure 31), detecting the stress induced by bending these body areas.

6.2. Architecture of a Hybrid Recognition System Based on the Developed Smart Glove

Furthermore, the designed smart glove could be combined with a wearable vision system to obtain a hybrid recognition system useful for assisting visually impaired patients [136]. Specifically, the system integrates a micro-camera (model HD 500 W OTG, manufactured by GesiousSpy Inc., Shenzhen, China) mounted on the user's glasses (Figure 32). The camera features a 2592 × 1944 pixel resolution, a 30 FPS frame rate, and small dimensions (15 mm × 15 mm × 13 mm), providing the pictures over a common USB 2.0 (Universal Serial Bus) interface. The system combines the visual data acquired by the micro-camera with those acquired by the smart glove to improve the system's object-recognition performance. Notably, the system core is a Raspberry Pi 4 (8 GB RAM version), which collects pictures from the micro-camera through the USB interface and data from the developed smart glove through the BLE interface.
In particular, the smart glove locally calculates the tactile features by spectral analysis over 2 s windows and transmits them to the Raspberry Pi board through Bluetooth. In parallel, the latter acquires the camera images and synchronizes them with the haptic information transmitted by the smart glove. Furthermore, the Raspberry Pi board processes the pictures to extract the features; specifically, the processing consists of normalizing the image data to float values between 0 and 1 and reducing the color depth. Then, the processing board implements a neural network based on the MobileNetV2 architecture [137].
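A minimal sketch of this image pre-processing step follows; the target color depth chosen here is an illustrative assumption, as the text does not specify it:

```python
import numpy as np

# Normalize 8-bit pixel data to floats in [0, 1] and reduce the color depth
# by re-quantizing to fewer levels (5 bits per channel here, an assumption).

def preprocess(image_u8, bits=5):
    """image_u8: uint8 array -> float32 array in [0, 1] with reduced depth."""
    levels = (1 << bits) - 1
    x = image_u8.astype(np.float32) / 255.0   # normalize to [0, 1]
    return np.round(x * levels) / levels      # quantize the color depth

img = np.array([[0, 128, 255]], dtype=np.uint8)
print(preprocess(img))
```

Reducing the color depth shrinks the effective input entropy, which lightens the load on the embedded inference without changing the [0, 1] range the network expects.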
MobileNetV2 relies on inverted residual bottleneck blocks built on depthwise-separable convolutions. In particular, the architecture starts with a full convolution layer with 32 filters, followed by 19 residual bottleneck layers. Batch normalization and dropout are employed during training, along with a 3 × 3 kernel size, typical of contemporary networks [137]. ReLU6 functions are used as the non-linearity due to their higher robustness in low-precision computing. The neural network has an output layer constituted by n nodes, corresponding to as many object classes to be identified. In this way, the neural network allows objects to be classified based on both tactile and visual data.
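For reference, ReLU6 simply clips activations to the range [0, 6], which keeps values bounded and therefore easier to represent in low-precision arithmetic; a minimal numpy illustration:

```python
import numpy as np

# ReLU6 non-linearity used throughout MobileNetV2: like ReLU, but saturating
# at 6 so the activation range stays bounded for quantized inference.

def relu6(x):
    return np.minimum(np.maximum(x, 0.0), 6.0)

x = np.array([-2.0, 0.0, 3.0, 8.0])
print(relu6(x))   # [0. 0. 3. 6.]
```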
Similarly, the Edge Impulse platform, based on the TensorFlow ecosystem, can be used to develop the described model. As discussed above, it handles all the involved phases: gathering the dataset, including pictures and signals from the piezoelectric sensors; creating the processing and inference chain; and testing the model to evaluate its performance. The model can be exported as C++ code and run continuously on the Raspberry Pi board. Furthermore, the recognized object class can be communicated to the patient by means of a vocal indication synthesized by the Festival speech synthesis package and reproduced through earphones [138].

7. Conclusions

In recent years, thanks to technological advancements in materials science and electronics, researchers have become increasingly interested in studying and developing wearable devices that can track body motions and vital signs. HMIs emerged from the exponential growth in the complexity of machines and the resulting need to bridge their functionality with human perception.
This paper combines a systematic and comprehensive overview of the current technologies and devices for implementing HMIs with the presentation of an innovative assistive solution for supporting impaired people. In particular, the definition, applications, and classification of current technologies for generating haptic feedback are introduced, including force-based, thermal-based, and nerve-stimulation-based methods. Later, a survey of recent wearable systems for generating tactile feedback is reported, which can find applications in different fields, such as rehabilitative or assistive systems, entertainment, and automation. Then, a comprehensive overview of sensing systems integrated into tactile interfaces is provided, focusing on devices in the form of smart gloves. Furthermore, comparative analyses were presented, highlighting the potentialities, limitations, and perspectives of each technology to outline the characteristics, features, and performance of the next generation of HMIs. Additionally, a novel smart glove was introduced, including thin and conformable AlN-based sensors. Five flexible, ultra-thin, AlN-based piezoelectric sensors were integrated into the presented system, strategically positioned on the back of the hand; a microcontroller section acquires and processes the signals from the piezoelectric sensors to classify gestures using an onboard ML algorithm. In addition, the design and testing of the conditioning section for the AlN-based flexible sensors were covered. Finally, the architecture of a wearable hybrid recognition system was presented, which employs a micro-camera to acquire pictures and the smart glove to gather haptic data for identifying touched objects.

Author Contributions

Conceptualization, R.D.F., M.P. and P.V.; investigation, R.D.F., M.P., V.M.M. and P.V.; resources, R.D.F. and M.D.V.; data curation, P.V., M.P., V.M.M. and R.D.F.; writing—original draft preparation, R.D.F., V.M.M. and P.V.; writing—review and editing, R.D.F., V.M.M., P.V. and M.D.V.; visualization, R.D.F., M.P. and P.V.; supervision, M.D.V. and P.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data from our study are available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, H.; Ma, X.; Hao, Y. Electronic Devices for Human-Machine Interfaces. Adv. Mater. Interfaces 2017, 4, 1600709. [Google Scholar] [CrossRef]
  2. Zhu, M.; He, T.; Lee, C. Technologies toward next Generation Human Machine Interfaces: From Machine Learning Enhanced Tactile Sensing to Neuromorphic Sensory Systems. Appl. Phys. Rev. 2020, 7, 031305. [Google Scholar] [CrossRef]
  3. Bolton, C.; Machova, V.; Kovacova, M.; Valaskova, K. The Power of Human-Machine Collaboration: Artificial Intelligence, Business, Automation, and the Smart Economy. Econ. Manag. Financ. Mark. 2018, 13, 51–57. [Google Scholar]
  4. Andrews, C.; Southworth, M.K.; Silva, J.N.A.; Silva, J.R. Extended Reality in Medical Practice. Curr. Treat. Options Cardiovasc. Med. 2019, 21, 1–12. [Google Scholar] [CrossRef] [PubMed]
  5. Chakraborty, B.K.; Sarma, D.; Bhuyan, M.K.; MacDorman, K.F. Review of Constraints on Vision-Based Gesture Recognition for Human—Computer Interaction. IET Comput. Vis. 2018, 12, 3–15. [Google Scholar] [CrossRef]
  6. Juraschek, M.; Büth, L.; Posselt, G.; Herrmann, C. Mixed Reality in Learning Factories. Procedia Manuf. 2018, 23, 153–158. [Google Scholar] [CrossRef]
  7. Bermejo, C.; Hui, P. A Survey on Haptic Technologies for Mobile Augmented Reality. ACM Comput. Surv. 2021, 54, 1–35. [Google Scholar] [CrossRef]
  8. Yang, T.-H.; Kim, J.R.; Jin, H.; Gil, H.; Koo, J.-H.; Kim, H.J. Recent Advances and Opportunities of Active Materials for Haptic Technologies in Virtual and Augmented Reality. Adv. Funct. Mater. 2021, 31, 2008831. [Google Scholar] [CrossRef]
  9. Maisto, M.; Pacchierotti, C.; Chinello, F.; Salvietti, G.; De Luca, A.; Prattichizzo, D. Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications. IEEE Trans. Haptics 2017, 10, 511–522. [Google Scholar] [CrossRef] [Green Version]
  10. Richter, J.; Thomas, B.H.; Sugimoto, M.; Inami, M. Remote Active Tangible Interactions. In Proceedings of the 1st International Conference on Tangible and Embedded Interaction, Louisiana, LA, USA, 15–17 February 2007; ACM: New York, NY, USA, 2007; pp. 39–42. [Google Scholar]
  11. Leithinger, D.; Follmer, S.; Olwal, A.; Ishii, H. Physical Telepresence: Shape Capture and Display for Embodied, Computer-Mediated Remote Collaboration. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, Hawaii, HI, USA, 5–8 October 2014; ACM: New York, NY, USA, 2014; pp. 461–470. [Google Scholar]
  12. Wang, R.; Quek, F. Touch & Talk: Contextualizing Remote Touch for Affective Interaction. In Proceedings of the Fourth International Conference on Tangible, Embedded, and Embodied Interaction, Massachusetts, MA, USA, 24–27 January 2010; ACM: New York, NY, USA, 2014; pp. 13–20. [Google Scholar]
  13. Wee, C.; Yap, K.M.; Lim, W.N. Haptic Interfaces for Virtual Reality: Challenges and Research Directions. IEEE Access 2021, 9, 112145–112162. [Google Scholar] [CrossRef]
  14. Biggs, S.J.; Srinivasan, M.A. Haptic Interfaces. In Handbook of Virtual Environments; CRC Press: Florida, FL, USA, 2002; ISBN 978-0-429-16393-7. [Google Scholar]
  15. Nitzsche, N.; Hanebeck, U.D.; Schmidt, G. Design Issues of Mobile Haptic Interfaces. J. Robot. Syst. 2003, 20, 549–556. [Google Scholar] [CrossRef]
  16. Våpenstad, C.; Hofstad, E.F.; Langø, T.; Mårvik, R.; Chmarra, M.K. Perceiving Haptic Feedback in Virtual Reality Simulators. Surg. Endosc. 2013, 27, 2391–2397. [Google Scholar] [CrossRef]
  17. Sapkaroski, D.; Baird, M.; McInerney, J.; Dimmock, M.R. The Implementation of a Haptic Feedback Virtual Reality Simulation Clinic with Dynamic Patient Interaction and Communication for Medical Imaging Students. J. Med. Radiat. Sci. 2018, 65, 218–225. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Tannous, M.; Miraglia, M.; Inglese, F.; Giorgini, L.; Ricciardi, F.; Pelliccia, R.; Milazzo, M.; Stefanini, C. Haptic-Based Touch Detection for Collaborative Robots in Welding Applications. Robot. Comput.-Integr. Manuf. 2020, 64, 101952. [Google Scholar] [CrossRef]
  19. Hamza-Lup, F.G.; Bergeron, K.; Newton, D. Haptic Systems in User Interfaces: State of the Art Survey. In Proceedings of the 2019 ACM Southeast Conference, Georgia, GA, USA, 18–20 April 2019; ACM: New York, NY, USA, 2019; pp. 141–148. [Google Scholar]
  20. Natta, L.; Guido, F.; Algieri, L.; Mastronardi, V.M.; Rizzi, F.; Scarpa, E.; Qualtieri, A.; Todaro, M.T.; Sallustio, V.; De Vittorio, M. Conformable AlN Piezoelectric Sensors as a Non-Invasive Approach for Swallowing Disorder Assessment. ACS Sens. 2021, 6, 1761–1769. [Google Scholar] [CrossRef] [PubMed]
  21. Guido, F.; Qualtieri, A.; Algieri, L.; Lemma, E.D.; De Vittorio, M.; Todaro, M.T. AlN-Based Flexible Piezoelectric Skin for Energy Harvesting from Human Motion. Microelectron. Eng. 2016, 159, 174–178. [Google Scholar] [CrossRef]
  22. Mariello, M.; Fachechi, L.; Guido, F.; Vittorio, M. Conformal, Ultra-Thin Skin-Contact-Actuated Hybrid Piezo/Triboelectric Wearable Sensor Based on AlN and Parylene-Encapsulated Elastomeric Blend. Adv. Funct. Mater. 2021, 31, 2101047. [Google Scholar] [CrossRef]
  23. Siraj, M.; Sethi, A.; Kumar, A.; Dahiya, P. Haptic Feedback System for Differently Abled Using Their Inputs. In Proceedings of the 2021 9th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO), Noida, India, 3–4 September 2021; IEEE: New Jersey, NJ, USA, 2021; pp. 1–4. [Google Scholar]
  24. Huang, Y.; Yao, K.; Li, J.; Li, D.; Jia, H.; Liu, Y.; Yiu, C.K.; Park, W.; Yu, X. Recent Advances in Multi-Mode Haptic Feedback Technologies towards Wearable Interfaces. Mater. Today Phys. 2022, 22, 100602. [Google Scholar] [CrossRef]
  25. Chu, R.; Zhang, Y.; Zhang, H.; Xu, W.; Ryu, J.-H.; Wang, D. Co-Actuation: A Method for Achieving High Stiffness and Low Inertia for Haptic Devices. IEEE Trans. Haptics 2020, 13, 312–324. [Google Scholar] [CrossRef]
  26. Lv, X.; Chen, L.; Dai, C.; Lang, Y.; Tang, R.; He, J. Multimodal Fusion Transcutaneous Electrical System for Haptic Feedback. In Proceedings of the 2019 IEEE International Conference on Advanced Robotics and its Social Impacts (ARSO), Beijing, China, 31 October–2 November 2019; IEEE: New Jersey, NJ, USA, 2019; pp. 102–105. [Google Scholar]
  27. Crago, P.E.; Nakai, R.J.; Chizeck, H.J. Feedback Regulation of Hand Grasp Opening and Contact Force during Stimulation of Paralyzed Muscle. IEEE Trans. Biomed. Eng. 1991, 38, 17–28. [Google Scholar] [CrossRef]
  28. Acome, E.; Mitchell, S.K.; Morrissey, T.G.; Emmett, M.B.; Benjamin, C.; King, M.; Radakovitz, M.; Keplinger, C. Hydraulically Amplified Self-Healing Electrostatic Actuators with Muscle-like Performance. Science 2018, 359, 61–65. [Google Scholar] [CrossRef] [PubMed]
  29. Suo, Z. Theory of Dielectric Elastomers. Acta Mech. Solida Sin. 2010, 23, 549–578. [Google Scholar] [CrossRef]
  30. Cacucciolo, V.; Shintake, J.; Kuwajima, Y.; Maeda, S.; Floreano, D.; Shea, H. Stretchable Pumps for Soft Machines. Nature 2019, 572, 516–519. [Google Scholar] [CrossRef]
  31. Leroy, E.; Hinchet, R.; Shea, H. Multimode Hydraulically Amplified Electrostatic Actuators for Wearable Haptics. Adv. Mater. 2020, 9, 2002564. [Google Scholar] [CrossRef] [PubMed]
  32. Mazursky, A.; Koo, J.-H.; Yang, T.-H. Design, Modeling, and Evaluation of a Slim Haptic Actuator Based on Electrorheological Fluid. J. Intell. Mater. Syst. Struct. 2019, 30, 2521–2533. [Google Scholar] [CrossRef]
  33. Talhan, A.; Jeon, S. Pneumatic Actuation in Haptic-Enabled Medical Simulators: A Review. IEEE Access 2018, 6, 3184–3200. [Google Scholar] [CrossRef]
  34. Zhu, M.; Sun, Z.; Zhang, Z.; Shi, Q.; He, T.; Liu, H.; Chen, T.; Lee, C. Haptic-Feedback Smart Glove as a Creative Human-Machine Interface (HMI) for Virtual/Augmented Reality Applications. Sci. Adv. 2020, 6, eaaz8693. [Google Scholar] [CrossRef]
  35. Kim, S.; Kim, T.; Kim, C.S.; Choi, H.; Kim, Y.J.; Lee, G.S.; Oh, O.; Cho, B.J. Two-Dimensional Thermal Haptic Module Based on a Flexible Thermoelectric Device. Soft Robot. 2020, 7, 736–742. [Google Scholar] [CrossRef]
  36. Cai, S.; Ke, P.; Narumi, T.; Zhu, K. ThermAirGlove: A Pneumatic Glove for Thermal Perception and Material Identification in Virtual Reality. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 248–257. [Google Scholar]
  37. Pyun, K.R.; Rogers, J.A.; Ko, S.H. Materials and Devices for Immersive Virtual Reality. Nat. Rev. Mater. 2022, 7, 841–843. [Google Scholar] [CrossRef]
  38. Rezania, A.; Rosendahl, L.A. A Comparison of Micro-Structured Flat-Plate and Cross-Cut Heat Sinks for Thermoelectric Generation Application. Energy Convers. Manag. 2015, 101, 730–737. [Google Scholar] [CrossRef]
  39. Moya, X.; Kar-Narayan, S.; Mathur, N.D. Caloric Materials near Ferroic Phase Transitions. Nat. Mater. 2014, 13, 439–450. [Google Scholar] [CrossRef] [PubMed]
  40. Zhang, G.; Zhang, X.; Huang, H.; Wang, J.; Li, Q.; Chen, L.-Q.; Wang, Q. Toward Wearable Cooling Devices: Highly Flexible Electrocaloric Ba0.67Sr0.33TiO3 Nanowire Arrays. Adv. Mater. 2016, 28, 4811–4816. [Google Scholar] [CrossRef] [PubMed]
  41. Wang, D.; Chen, X.; Yuan, G.; Jia, Y.; Wang, Y.; Mumtaz, A.; Wang, Y.; Liu, J.-M. Toward Artificial Intelligent Self-Cooling Electronic Skins: Large Electrocaloric Effect in All-Inorganic Flexible Thin Films at Room Temperature. J. Mater. 2019, 5, 66–72. [Google Scholar] [CrossRef]
  42. Yang, C.; Han, Y.; Feng, C.; Lin, X.; Huang, S.; Cheng, X.; Cheng, Z. Toward Multifunctional Electronics: Flexible NBT-Based Film with a Large Electrocaloric Effect and High Energy Storage Property. ACS Appl. Mater. Interfaces 2020, 12, 6082–6089. [Google Scholar] [CrossRef]
  43. Yang, Y.; Zhou, Y.; Zeng, J.; He, K.; Liu, H. Electrotactile Stimulation Waveform Modulation Based on A Customized Portable Stimulator: A Pilot Study. In Proceedings of the 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1838–1843. [Google Scholar]
  44. Kaczmarek, K.A.; Webster, J.G.; Bach-y-Rita, P.; Tompkins, W.J. Electrotactile and Vibrotactile Displays for Sensory Substitution Systems. IEEE Trans. Biomed. Eng. 1991, 38, 1–16. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  45. Akhtar, A.; Sombeck, J.; Boyce, B.; Bretl, T. Controlling Sensation Intensity for Electrotactile Stimulation in Human-Machine Interfaces. Sci. Robot. 2018, 3, eaap9770. [Google Scholar] [CrossRef] [Green Version]
  46. Carpenter, C.W.; Malinao, M.G.; Rafeedi, T.A.; Rodriquez, D.; Tan, S.T.M.; Root, N.B.; Skelil, K.; Ramírez, J.; Polat, B.; Root, S.E.; et al. Electropneumotactile Stimulation: Multimodal Haptic Actuators Enabled by a Stretchable Conductive Polymer on Inflatable Pockets. Adv. Mater. Technol. 2020, 5, 1901119. [Google Scholar] [CrossRef]
  47. Frohner, J.; Salvietti, G.; Beckerle, P.; Prattichizzo, D. Can Wearable Haptic Devices Foster the Embodiment of Virtual Limbs? IEEE Trans. Haptics 2019, 12, 339–349. [Google Scholar] [CrossRef]
  48. Biocca, F. The Cyborg’s Dilemma: Progressive Embodiment in Virtual Environments. J. Comput.-Mediat. Commun. 1997, 3, JCMC324. [Google Scholar] [CrossRef]
  49. Romano, J.M.; Hsiao, K.; Niemeyer, G.; Chitta, S.; Kuchenbecker, K.J. Human-Inspired Robotic Grasp Control With Tactile Sensing. IEEE Trans. Robot. 2011, 27, 1067–1079. [Google Scholar] [CrossRef]
  50. Silvera-Tawil, D.; Rye, D.; Velonaki, M. Artificial Skin and Tactile Sensing for Socially Interactive Robots: A Review. Robot. Auton. Syst. 2015, 63, 230–243. [Google Scholar] [CrossRef]
  51. Girão, P.S.; Ramos, P.M.P.; Postolache, O.; Pereira, J.M.D. Tactile Sensors for Robotic Applications. Measurement 2013, 46, 1257–1271. [Google Scholar] [CrossRef]
  52. Witteveen, H.J.; Rietman, H.S.; Veltink, P.H. Vibrotactile Grasping Force and Hand Aperture Feedback for Myoelectric Forearm Prosthesis Users. Prosthet. Orthot. Int. 2015, 39, 204–212. [Google Scholar] [CrossRef] [PubMed]
  53. Dietrich, C.; Walter-Walsh, K.; Preißler, S.; Hofmann, G.O.; Witte, O.W.; Miltner, W.H.; Weiss, T. Sensory Feedback Prosthesis Reduces Phantom Limb Pain: Proof of a Principle. Neurosci. Lett. 2012, 507, 97–100. [Google Scholar] [CrossRef] [PubMed]
  54. Wijk, U.; Carlsson, I. Forearm Amputees’ Views of Prosthesis Use and Sensory Feedback. J. Hand Ther. 2015, 28, 269–278. [Google Scholar] [CrossRef]
  55. Shehata, A.W.; Rehani, M.; Jassat, Z.E.; Hebert, J.S. Mechanotactile Sensory Feedback Improves Embodiment of a Prosthetic Hand During Active Use. Front. Neurosci. 2020, 14, 263. [Google Scholar] [CrossRef]
  56. Yunus, R.; Ali, S.; Ayaz, Y.; Khan, M.; Kanwal, S.; Akhlaque, U.; Nawaz, R. Development and Testing of a Wearable Vibrotactile Haptic Feedback System for Proprioceptive Rehabilitation. IEEE Access 2020, 8, 35172–35184. [Google Scholar] [CrossRef]
  57. Parmar, S.; Khodasevych, I.; Troynikov, O. Evaluation of Flexible Force Sensors for Pressure Monitoring in Treatment of Chronic Venous Disorders. Sensors 2017, 17, 1923. [Google Scholar] [CrossRef]
  58. Hotson, G.; McMullen, D.P.; Fifer, M.S.; Johannes, M.S.; Katyal, K.D.; Para, M.P.; Armiger, R.; Anderson, W.S.; Thakor, N.V.; Wester, B.A. Individual Finger Control of a Modular Prosthetic Limb Using High-Density Electrocorticography in a Human Subject. J. Neural Eng. 2016, 13, 026017. [Google Scholar] [CrossRef] [Green Version]
  59. Precision Microdrives Inc. The Limits of Vibration Frequency for Miniature Vibration Motors. Available online: https://www.precisionmicrodrives.com/the-limits-of-vibration-frequency-for-miniature-vibration-motors (accessed on 19 May 2022).
  60. Zhao, G.; Yang, J.; Chen, J.; Zhu, G.; Jiang, Z.; Liu, X.; Niu, G.; Wang, Z.L.; Zhang, B. Keystroke Dynamics Identification Based on Triboelectric Nanogenerator for Intelligent Keyboard Using Deep Learning Method. Adv. Mater. Technol. 2019, 4, 1800167. [Google Scholar] [CrossRef] [Green Version]
  61. Zhang, W.; Wang, P.; Sun, K.; Wang, C.; Diao, D. Intelligently Detecting and Identifying Liquids Leakage Combining Triboelectric Nanogenerator Based Self-Powered Sensor with Machine Learning. Nano Energy 2019, 56, 277–285. [Google Scholar] [CrossRef]
  62. Wu, C.; Ding, W.; Liu, R.; Wang, J.; Wang, A.C.; Wang, J.; Li, S.; Zi, Y.; Wang, Z.L. Keystroke Dynamics Enabled Authentication and Identification Using Triboelectric Nanogenerator Array. Mater. Today 2018, 21, 216–222. [Google Scholar] [CrossRef]
  63. He, Q.; Wu, Y.; Feng, Z.; Sun, C.; Fan, W.; Zhou, Z.; Meng, K.; Fan, E.; Yang, J. Triboelectric Vibration Sensor for a Human-Machine Interface Built on Ubiquitous Surfaces. Nano Energy 2019, 59, 689–696. [Google Scholar] [CrossRef]
  64. Kunzler, U.; Runde, C. Kinesthetic Haptics Integration into Large-Scale Virtual Environments. In Proceedings of the First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. World Haptics Conference, Pisa, Italy, 18–20 March 2005; IEEE: Piscataway, NJ, USA, 2005; pp. 551–556. [Google Scholar]
  65. Loomis, J.M.; Marston, J.R.; Golledge, R.G.; Klatzky, R.L. Personal Guidance System for People with Visual Impairment: A Comparison of Spatial Displays for Route Guidance. J. Vis. Impair. Blind. 2005, 99, 219–232. [Google Scholar] [CrossRef]
  66. Bolopion, A.; Xie, H.; Haliyo, D.S.; Regnier, S. Haptic Teleoperation for 3-D Microassembly of Spherical Objects. IEEE/ASME Trans. Mechatron. 2012, 17, 116–127. [Google Scholar] [CrossRef]
  67. Guinan, A.L.; Montandon, M.N.; Doxon, A.J.; Provancher, W.R. An Ungrounded Tactile Feedback Device to Portray Force and Torque-like Interactions in Virtual Environments. In Proceedings of the 2014 IEEE Virtual Reality (VR), Minneapolis, MN, USA, 29 March–2 April 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 171–172. [Google Scholar]
  68. Tsetserukou, D.; Hosokawa, S.; Terashima, K. LinkTouch: A Wearable Haptic Device with Five-Bar Linkage Mechanism for Presentation of Two-DOF Force Feedback at the Fingerpad. In Proceedings of the 2014 IEEE Haptics Symposium (HAPTICS), Houston, TX, USA, 23–26 February 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 307–312. [Google Scholar]
  69. Sauvet, B.; Laliberte, T.; Gosselin, C. Design, Analysis and Experimental Validation of an Ungrounded Haptic Interface Using a Piezoelectric Actuator. Mechatronics 2017, 45, 100–109. [Google Scholar] [CrossRef]
  70. Amemiya, T.; Gomi, H. Distinct Pseudo-Attraction Force Sensation by a Thumb-Sized Vibrator That Oscillates Asymmetrically. In Proceedings of the International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, Versailles, France, 24–26 June 2014; Springer: Berlin/Heidelberg, Germany, 2014; pp. 88–95. [Google Scholar]
  71. Amemiya, T.; Ando, H.; Maeda, T. Lead-Me Interface for a Pulling Sensation from Hand-Held Devices. ACM Trans. Appl. Percept. (TAP) 2008, 5, 1–17. [Google Scholar] [CrossRef]
  72. Amemiya, T.; Maeda, T. Directional Force Sensation by Asymmetric Oscillation from a Double-Layer Slider-Crank Mechanism. J. Comput. Inf. Sci. Eng. 2009, 9, 1–8. [Google Scholar] [CrossRef]
  73. Amemiya, T.; Sugiyama, H. Haptic Handheld Wayfinder with Pseudo-Attraction Force for Pedestrians with Visual Impairments. In Proceedings of the 11th International ACM SIGACCESS Conference on Computers and Accessibility, Pittsburgh, PA, USA, 25–28 October 2009; Association for Computing Machinery: New York, NY, USA, 2009; pp. 107–114. [Google Scholar]
  74. Ito, M.; Wakuda, D.; Inoue, S.; Makino, Y.; Shinoda, H. High Spatial Resolution Midair Tactile Display Using 70 KHz Ultrasound. In Proceedings of the International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, London, UK, 4–7 July 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 57–67. [Google Scholar]
  75. Carter, T.; Seah, S.A.; Long, B.; Drinkwater, B.; Subramanian, S. UltraHaptics: Multi-Point Mid-Air Haptic Feedback for Touch Surfaces. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, St. Andrews, UK, 8–11 October 2013; ACM: New York, NY, USA, 2013; pp. 505–514. [Google Scholar]
  76. Van Neer, P.; Volker, A.W.F.; Berkhoff, A.P.; Akkerman, H.B.; Schrama, T.; Van Breemen, A.; Gelinck, G.H. Feasibility of Using Printed Polymer Transducers for Mid-Air Haptic Feedback. In Proceedings of the 2018 IEEE International Ultrasonics Symposium (IUS), Kobe, Japan, 22–25 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–4. [Google Scholar]
  77. Shi, X.; Shi, D.; Li, W.L.; Wang, Q. A Unified Method for Free Vibration Analysis of Circular, Annular and Sector Plates with Arbitrary Boundary Conditions. J. Vib. Control. 2016, 22, 442–456. [Google Scholar] [CrossRef]
  78. Vallbo, A.B.; Johansson, R.S. Properties of Cutaneous Mechanoreceptors in the Human Hand Related to Touch Sensation. Hum. Neurobiol. 1984, 3, 3–14. [Google Scholar]
  79. Johnson, K.O. The Roles and Functions of Cutaneous Mechanoreceptors. Curr. Opin. Neurobiol. 2001, 11, 455–461. [Google Scholar] [CrossRef] [PubMed]
  80. Ertan, S.; Lee, C.; Willets, A.; Tan, H.; Pentland, A. A Wearable Haptic Navigation Guidance System. In Digest of Papers, Proceedings of the 2nd International Symposium on Wearable Computers, Pittsburgh, PA, USA, 19–20 October 1998; IEEE: Piscataway, NJ, USA, 1998; pp. 164–165. [Google Scholar]
  81. Flores, G.; Kurniawan, S.; Manduchi, R.; Martinson, E.; Morales, L.M.; Sisbot, E.A. Vibrotactile Guidance for Wayfinding of Blind Walkers. IEEE Trans. Haptics 2015, 8, 306–317. [Google Scholar] [CrossRef] [PubMed]
  82. Aggravi, M.; Salvietti, G.; Prattichizzo, D. Haptic Assistive Bracelets for Blind Skier Guidance. In Proceedings of the 7th Augmented Human International Conference 2016, Geneva, Switzerland, 25–27 February 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 1–4. [Google Scholar]
  83. Baik, S.; Han, I.; Park, J.-M.; Park, J. Multi-Fingertip Vibrotactile Array Interface for 3D Virtual Interaction. In Proceedings of the 2020 IEEE Haptics Symposium (HAPTICS), Crystal City, VA, USA, 26–29 March 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 898–903. [Google Scholar]
  84. Kim, J.; Oh, Y.; Park, J. Adaptive Vibrotactile Flow Rendering of 2.5 D Surface Features on Touch Screen with Multiple Fingertip Interfaces. In Proceedings of the 2017 IEEE World Haptics Conference (WHC), Munich, Germany, 6–9 June 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 316–321. [Google Scholar]
  85. Nunez, O.J.A.; Lubos, P.; Steinicke, F. HapRing: A Wearable Haptic Device for 3D Interaction. In Proceedings of Mensch & Computer 2015, Stuttgart, Germany, 6–9 September 2015; De Gruyter: Berlin, Germany, 2015; pp. 421–424. [Google Scholar]
  86. Israr, A.; Poupyrev, I. Tactile Brush: Drawing on Skin with a Tactile Grid Display. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; Association for Computing Machinery: New York, NY, USA, 2011; pp. 2019–2028. [Google Scholar]
  87. Makous, J.C.; Friedman, R.M.; Vierck, C.J. A Critical Band Filter in Touch. J. Neurosci. 1995, 15, 2808–2818. [Google Scholar] [CrossRef] [PubMed]
  88. Park, J.; Kim, J.; Oh, Y.; Tan, H.Z. Rendering Moving Tactile Stroke on the Palm Using a Sparse 2d Array. In Proceedings of the International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, Hamburg, Germany, 22–25 March 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 47–56. [Google Scholar]
  89. Son, B.; Park, J. Haptic Feedback to the Palm and Fingers for Improved Tactile Perception of Large Objects. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, Berlin, Germany, 14–17 October 2018; ACM: New York, NY, USA, 2018; pp. 757–763. [Google Scholar]
  90. You, X.; Wang, C.-X.; Huang, J.; Gao, X.; Zhang, Z.; Wang, M.; Huang, Y.; Zhang, C.; Jiang, Y.; Wang, J. Towards 6G Wireless Communication Networks: Vision, Enabling Technologies, and New Paradigm Shifts. Sci. China Inf. Sci. 2021, 64, 1–74. [Google Scholar] [CrossRef]
  91. De Fazio, R.; Giannoccaro, N.I.; Carrasco, M.; Velazquez, R.; Visconti, P. Wearable Devices and IoT Applications for Detecting Symptoms, Infected Tracking, and Diffusion Containment of the COVID-19 Pandemic: A Survey. Front. Inf. Technol. Electron. Eng. 2021, 1, 1–29. [Google Scholar] [CrossRef]
  92. De Fazio, R.; De Vittorio, M.; Visconti, P. A BLE-Connected Piezoresistive and Inertial Chest Band for Remote Monitoring of the Respiratory Activity by an Android Application: Hardware Design and Software Optimization. Future Internet 2022, 14, 183. [Google Scholar] [CrossRef]
  93. Ozioko, O.; Dahiya, R. Smart Tactile Gloves for Haptic Interaction, Communication, and Rehabilitation. Adv. Intell. Syst. 2022, 4, 2100091. [Google Scholar] [CrossRef]
  94. Steinbach, E.; Strese, M.; Eid, M.; Liu, X.; Bhardwaj, A.; Liu, Q.; Al-Ja’afreh, M.; Mahmoodi, T.; Hassen, R.; El Saddik, A. Haptic Codecs for the Tactile Internet. Proc. IEEE 2018, 107, 447–470. [Google Scholar] [CrossRef] [Green Version]
  95. Carrasco, M.; Mery, D.; Concha, A.; Velázquez, R.; De Fazio, R.; Visconti, P. An Efficient Point-Matching Method Based on Multiple Geometrical Hypotheses. Electronics 2021, 10, 246. [Google Scholar] [CrossRef]
  96. Janiesch, C.; Zschech, P.; Heinrich, K. Machine Learning and Deep Learning. Electron. Mark. 2021, 31, 685–695. [Google Scholar] [CrossRef]
  97. Dahiya, R.S. Epidermal Electronics—Flexible Electronics for Biomedical Applications. In Handbook of Bioelectronics: Directly Interfacing Electronics and Biological Systems; Iniewski, K., Carrara, S., Eds.; Cambridge University Press: Cambridge, UK, 2015; pp. 245–255. ISBN 978-1-139-62953-9. [Google Scholar]
  98. Luo, S.; Bimbo, J.; Dahiya, R.; Liu, H. Robotic Tactile Perception of Object Properties: A Review. Mechatronics 2017, 48, 54–67. [Google Scholar] [CrossRef] [Green Version]
  99. Bouton, C.E.; Shaikhouni, A.; Annetta, N.V.; Bockbrader, M.A.; Friedenberg, D.A.; Nielson, D.M.; Sharma, G.; Sederberg, P.B.; Glenn, B.C.; Mysiw, W.J. Restoring Cortical Control of Functional Movement in a Human with Quadriplegia. Nature 2016, 533, 247–250. [Google Scholar] [CrossRef] [PubMed]
  100. Dervin, S.; Ganguly, P.; Dahiya, R.S. Disposable Electrochemical Sensor Using Graphene Oxide–Chitosan Modified Carbon-Based Electrodes for the Detection of Tyrosine. IEEE Sens. J. 2021, 21, 26226–26233. [Google Scholar] [CrossRef]
  101. Nikbakhtnasrabadi, F.; El Matbouly, H.; Ntagios, M.; Dahiya, R. Textile-Based Stretchable Microstrip Antenna with Intrinsic Strain Sensing. ACS Appl. Electron. Mater. 2021, 3, 2233–2246. [Google Scholar] [CrossRef]
  102. Takei, K.; Gao, W.; Wang, C.; Javey, A. Physical and Chemical Sensing with Electronic Skin. Proc. IEEE 2019, 107, 2155–2167. [Google Scholar] [CrossRef]
  103. Dahiya, R. E-Skin: From Humanoids to Humans [Point of View]. Proc. IEEE 2019, 107, 247–252. [Google Scholar] [CrossRef] [Green Version]
  104. Dahiya, R.S.; Metta, G.; Valle, M.; Sandini, G. Tactile Sensing—From Humans to Humanoids. IEEE Trans. Robot. 2010, 26, 1–20. [Google Scholar] [CrossRef]
  105. Saunders, G.H.; Echt, K.V. An Overview of Dual Sensory Impairment in Older Adults: Perspectives for Rehabilitation. Trends Amplif. 2007, 11, 243–258. [Google Scholar] [CrossRef] [Green Version]
  106. Hyvärinen, L.; Gimble, L.; Sorri, M. Assessment of Vision and Hearing of Deaf-Blind Persons; Royal Victorian Institute for the Blind: Melbourne, Australia, 1990; ISBN 0-949390-11-9. [Google Scholar]
  107. Grigson, P.; Lofmark, N.; Giblin, R. Hand-Tapper III: A Prototype Communication Device Using Finger-Spelling. Br. J. Vis. Impair. 1991, 9, 13–15. [Google Scholar] [CrossRef]
  108. Gollner, U.; Bieling, T.; Joost, G. Mobile Lorm Glove: Introducing a Communication Device for Deaf-Blind People. In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction, Kingston, ON, Canada, 19–22 February 2012; ACM: New York, NY, USA, 2012; pp. 127–130. [Google Scholar]
  109. Caporusso, N. A Wearable Malossi Alphabet Interface for Deafblind People. In Proceedings of the Working Conference on Advanced Visual Interfaces, Naples, Italy, 28–30 May 2008; ACM: New York, NY, USA, 2008; pp. 445–448. [Google Scholar]
  110. Fang, B.; Sun, F.; Liu, H.; Liu, C. 3D Human Gesture Capturing and Recognition by the IMMU-Based Data Glove. Neurocomputing 2018, 277, 198–207. [Google Scholar] [CrossRef]
  111. Chen, S.; Lou, Z.; Chen, D.; Jiang, K.; Shen, G. Polymer-enhanced Highly Stretchable Conductive Fiber Strain Sensor Used for Electronic Data Gloves. Adv. Mater. Technol. 2016, 1, 1600136. [Google Scholar] [CrossRef]
  112. Bieling, T.; Gollner, U.; Joost, G. What Do You Mean, User Study? Translating Lorm, Norm and User Research. In Proceedings of the DRS International Conference, Bangkok, Thailand, 1–4 July 2012; pp. 102–114. [Google Scholar]
  113. Rose, T.; Nam, C.S.; Chen, K.B. Immersion of Virtual Reality for Rehabilitation-Review. Appl. Ergon. 2018, 69, 153–161. [Google Scholar] [CrossRef]
  114. Manjakkal, L.; Pullanchiyodan, A.; Yogeswaran, N.; Hosseini, E.S.; Dahiya, R. A Wearable Supercapacitor Based on Conductive PEDOT:PSS-Coated Cloth and a Sweat Electrolyte. Adv. Mater. 2020, 32, 1907254. [Google Scholar] [CrossRef] [PubMed]
  115. Sorgini, F.; Caliò, R.; Carrozza, M.C.; Oddo, C.M. Haptic-Assistive Technologies for Audition and Vision Sensory Disabilities. Disabil. Rehabil. Assist. Technol. 2018, 13, 394–421. [Google Scholar] [CrossRef] [PubMed]
  116. Jacobs, I.S. Fine Particles, Thin Films and Exchange Anisotropy. Magnetism 1963, 3, 271–350. [Google Scholar]
  117. Luca, C.; Andritoi, D.; Corciova, C.; Fuior, R. Intelligent Glove for Rehabilitation of Hand Movement in Stroke Survivor. In Proceedings of the 2020 International Conference and Exposition on Electrical and Power Engineering (EPE), Iasi, Romania, 22–23 October 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 546–549. [Google Scholar]
  118. Jang, C.H.; Yang, H.S.; Yang, H.E.; Lee, S.Y.; Kwon, J.W.; Yun, B.D.; Choi, J.Y.; Kim, S.N.; Jeong, H.W. A Survey on Activities of Daily Living and Occupations of Upper Extremity Amputees. Ann. Rehabil. Med. 2011, 35, 907–921. [Google Scholar] [CrossRef]
  119. Simons, M.F.; Digumarti, K.M.; Le, N.H.; Chen, H.-Y.; Carreira, S.C.; Zaghloul, N.S.S.; Diteesawat, R.S.; Garrad, M.; Conn, A.T.; Kent, C.; et al. B:Ionic Glove: A Soft Smart Wearable Sensory Feedback Device for Upper Limb Robotic Prostheses. IEEE Robot. Autom. Lett. 2021, 6, 3311–3316. [Google Scholar] [CrossRef]
  120. Simons, M.F.; Haynes, A.C.; Gao, Y.; Zhu, Y.; Rossiter, J. In Contact: Pinching, Squeezing and Twisting for Mediated Social Touch. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; ACM: New York, NY, USA, 2020; pp. 1–9. [Google Scholar]
  121. Garrad, M.; Soter, G.; Conn, A.T.; Hauser, H.; Rossiter, J. A Soft Matter Computer for Soft Robots. Sci. Robot. 2019, 4, eaaw6060. [Google Scholar] [CrossRef] [Green Version]
  122. Bhatia, K.P.; Bain, P.; Bajaj, N.; Elble, R.J.; Hallett, M.; Louis, E.D.; Raethjen, J.; Stamelou, M.; Testa, C.M.; Deuschl, G. Consensus Statement on the Classification of Tremors. From the Task Force on Tremor of the International Parkinson and Movement Disorder Society. Mov. Disord. 2018, 33, 75–87. [Google Scholar] [CrossRef]
  123. Wanasinghe, A.T.; Awantha, W.V.I.; Kavindya, A.G.P.; Kulasekera, A.L.; Chathuranga, D.S.; Senanayake, B. A Layer Jamming Soft Glove for Hand Tremor Suppression. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 2684–2694. [Google Scholar] [CrossRef]
  124. Kavindya, P.; Awantha, W.V.I.; Wanasinghe, A.T.; Kulasekera, A.L.; Chathuranga, D.S.; Senanayake, B. Evaluation of Hand Tremor Frequency among Patients in Sri Lanka Using a Soft Glove. In Proceedings of the 2020 Moratuwa Engineering Research Conference (MERCon), Moratuwa, Sri Lanka, 28–30 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 301–306. [Google Scholar]
  125. Awantha, W.V.I.; Wanasinghe, A.T.; Kavindya, A.G.P.; Kulasekera, A.L.; Chathuranga, D.S. A Novel Soft Glove for Hand Tremor Suppression: Evaluation of Layer Jamming Actuator Placement. In Proceedings of the 2020 3rd IEEE International Conference on Soft Robotics (RoboSoft), New Haven, CT, USA, 15 May–15 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 440–445. [Google Scholar]
  126. De Fazio, R.; Al-Hinnawi, A.-R.; De Vittorio, M.; Visconti, P. An Energy-Autonomous Smart Shirt Employing Wearable Sensors for Users’ Safety and Protection in Hazardous Workplaces. Appl. Sci. 2022, 12, 2926. [Google Scholar] [CrossRef]
  127. De Fazio, R.; Cafagna, D.; Marcuccio, G.; Minerba, A.; Visconti, P. A Multi-Source Harvesting System Applied to Sensor-Based Smart Garments for Monitoring Workers’ Bio-Physical Parameters in Harsh Environments. Energies 2020, 13, 2161. [Google Scholar] [CrossRef]
  128. Edge Impulse. Available online: https://www.edgeimpulse.com/ (accessed on 9 September 2022).
  129. Starecki, T. Analog Front-End Circuitry in Piezoelectric and Microphone Detection of Photoacoustic Signals. Int. J. Thermophys. 2014, 35, 2124–2139. [Google Scholar] [CrossRef]
  130. Webster, J.G.; Eren, H. Measurement, Instrumentation, and Sensors Handbook: Two-Volume Set (Electrical Engineering Handbook); CRC Press: Boca Raton, FL, USA, 2014. [Google Scholar]
  131. Pallas-Areny, R.; Webster, J.G. Sensors and Signal Conditioning; John Wiley & Sons: Hoboken, NJ, USA, 2012; ISBN 1-118-58593-3. [Google Scholar]
  132. Graeme, J. Photodiode Amplifiers: Op Amp Solutions; McGraw-Hill, Inc.: New York, NY, USA, 1995; ISBN 0-07-024247-X. [Google Scholar]
  133. Oven, R. Modified Charge Amplifier for Stray Immune Capacitance Measurements. IEEE Trans. Instrum. Meas. 2014, 63, 1748–1752. [Google Scholar] [CrossRef]
  134. Karki, J. Analysis of the Sallen-Key Architecture; Application Report; Texas Instruments: Dallas, TX, USA, 1999; pp. 1–14. [Google Scholar]
  135. Karki, J. Active Low-Pass Filter Design; Application Report; Texas Instruments: Dallas, TX, USA, 2000; pp. 1–22. [Google Scholar]
  136. Calabrese, B.; Velázquez, R.; Del-Valle-Soto, C.; de Fazio, R.; Giannoccaro, N.I.; Visconti, P. Solar-Powered Deep Learning-Based Recognition System of Daily Used Objects and Human Faces for Assistance of the Visually Impaired. Energies 2020, 13, 6104. [Google Scholar] [CrossRef]
  137. Sandler, M.; Howard, A.; Zhu, M.; Zhmoginov, A.; Chen, L.-C. MobileNetV2: Inverted Residuals and Linear Bottlenecks. arXiv 2019, arXiv:1801.04381. [Google Scholar] [CrossRef]
  138. The Centre for Speech Technology Research. The Festival Speech Synthesis System. Available online: https://www.cstr.ed.ac.uk/projects/festival/ (accessed on 12 November 2022).
Figure 1. Workflow used for the selection of papers included in the survey of haptic technologies.
Figure 2. Distribution of documents analyzed for realizing the survey of haptic technologies.
Figure 3. Kinesthetic (left) and tactile feedback (right) are shown in a schematic depiction of haptic feedback. The awareness of muscles and joints in response to a gesture, stretch, weight, etc., is known as kinesthetic feedback. The perception of surface hardness, temperature, and other surface properties through the skin's mechanoreceptors is known as tactile feedback.
Figure 4. Examples of hydraulically driven tactile interfaces. Illustration of a hydraulic glove with a joint-stretching mechanism providing kinesthetic feedback (a). A diagram of a hydraulic actuator that deforms its cavity to provide out-of-plane displacement for tactile feedback (b) [31]. An electrostatic mechanism controls the hydraulic actuators, producing normal and lateral motion of the dielectric as a result of cavity deformation (c) [31].
Figure 5. Examples of interfaces with piezoelectric actuators for generating haptic feedback. Illustration of the piezoelectric haptic feedback system (a). A smart glove incorporating sensors and PZT stimulators to provide rapid haptic feedback through contraction and release sensations (b) [34].
Figure 6. Thermal-based haptic interfaces. Joule-heating interface for haptic feedback: schematic representation of Joule heaters [24] (a); haptic feedback interfaces utilizing thermoelectric actuators: a diagram of a thermoelectric mechanism [2] (b); microfluidic and other thermal-based haptic interfaces: the systems for microfluidic heat transfer are depicted schematically (left), and thermoregulatory clothing that circulates a cooling liquid through flexible, thermally conductive silicone–aluminum tubes to cool the body (center and right) (c).
Figure 7. Operating principle of a neural stimulation-based tactile interfacing system. Schematic depicting the mode in which electric current excites the nerve to produce electro-tactile sensations (a) using pulsed signals (b); an example of a prosthesis based on neural tactile stimulation, in which the feedback intensity is continuously controlled while an amputee user hammers a nail (c) [45].
Figure 8. Vi-Hab system with its main components highlighted: the wearable band and five vibrating motors [56]; a force-sensitive resistor (FSR) and the related motor are provided for each of the five fingers of the dummy hand (numbered from 1 to 5). The wearable band (in the inset) wraps around the upper arm so that each motor falls in line with the natural position of the fingers, as shown by the red arrows.
Figure 9. Glove-based HMI for general purposes; it includes (a) triboelectric finger sensors for sensing bending movements (b) with a related operating principle (c), a triboelectric palm sensor (c) for detecting the sliding movements (d) with a related operating principle (e), and piezoelectric mechanical stimulator for generating haptic feedback (f) [34].
Figure 10. Three-dimensional model of the presented ungrounded tactile device with the main components highlighted (a) and the assembled device without the handle (b) [69].
Figure 11. Three-dimensional model and screenshots of type-A (a) and type-B (b) tactile interfaces applied on fingertips. LF, LR, RF, and RR denote the locations of the piezo actuators [83].
Figure 12. Average Weber fraction according to the haptic feedback type (a). The results concerning the haptic condition come from earlier research [89]. Mean subjective assessment of realism according to the haptic interface type (b) [83]. In both plots, error bars indicate standard errors. The single-star symbol (*) indicates that the observed significance level (p-value) of the test is lower than 0.05; specifically, it was equal to 0.013 for the test reported in (b).
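Figure 12a reports the average Weber fraction, i.e., the just-noticeable difference (JND) normalized by the reference stimulus intensity. The sketch below shows how such a value is computed; the JND and reference values are purely illustrative, not the measurements from [83,89]:

```python
# Weber fraction: ratio of the just-noticeable difference (JND) to the
# reference stimulus intensity. All numeric values are illustrative.
def weber_fraction(jnd, reference):
    if reference <= 0:
        raise ValueError("reference intensity must be positive")
    return jnd / reference

# Hypothetical per-subject JNDs (in N) against a 2.0 N reference force
jnds = [0.22, 0.30, 0.26, 0.24]
fractions = [weber_fraction(j, 2.0) for j in jnds]
mean_wf = sum(fractions) / len(fractions)
print(f"mean Weber fraction: {mean_wf:.3f}")  # → mean Weber fraction: 0.128
```

A lower mean Weber fraction indicates finer discrimination ability under the corresponding feedback condition.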
Figure 13. Manual user interaction without the use of any equipment (a1). A doctor assessing the finger angle with goniometers for therapeutic purposes (a2). A smart glove that recognizes gestures using a highly stretchy, polymer-enhanced strain sensor (a3) [111]. An interactive smart glove that uses touch and the Lorm alphabet to facilitate communication for the deafblind (a4) [112]. An IMMU-assisted smart glove for capturing and identifying 3D human gestures (a5) [110]. Gesture-based methods for deafblind remote communication using PARLOMA, a novel human–robot interaction system (a6). A soft intelligent glove for hand rehabilitation that combines touch and gesture methods (a7). A sensory glove employed to characterize delicate hand motions in badminton (a8). Tactile feedback and integrated haptic sensing (b1). Gas-permeable, multipurpose electronic skin realized with laser-induced porous graphene and a spongy elastomer layer made with a sugar template (b2). An energy-autonomous electronic skin (b3). A smart glove designed as an assistive device integrating touch sensors and actuators for deafblind persons (b4). A self-healing and self-powered material for electronic-skin applications (b5) [93].
Figure 14. The designed sensory glove with the main components highlighted (left) and worn by a user (right) [110].
Figure 15. Schematic of the ionic glove, constituted by pressure pads containing conductive fluid applied to the prosthesis' fingertips, a battery-powered fluidic controller, and an armband equipped with SMA actuators, worn on the user's residual limb to gently squeeze the arm [119].
Figure 16. Haptic armband composed of five 3D-printed re-entrant hexagonal components. The device is actuated by twisted SMA wires, with Kapton tape serving as a heat-resistant barrier [119].
Figure 17. Schematic representation of the jamming elements’ positioning on the tremor suppression glove [125].
Figure 18. Illustration of the layers constituting the sensor with, in the box, the 3D-printed connection package (a). Illustration of the sensor's shape and dimensions (b). Photograph of the resulting sensor connected to a coaxial cable (c) [20].
Figure 19. The detection of hand motions using the developed WHSs. (A) Five sensors attached to the back of a human hand: positioning scheme (i) and picture (ii); detected signal corresponding to the hyperextension of the index finger (iii). (B) The signal provided by the WHS applied to the middle finger for different hand apertures. (C,D) The signals generated by five WHSs on the back of the hand, starting from the closed (C) or open (D) hand, for various conditions: extended index finger (i); index and middle fingers (ii); thumb, index, and middle fingers (iii); all fingers except the pinkie (iv); all fingers (v). (E) A series of actions involving the thumb and pinkie, starting from the closed hand. (F) The reliability measurement obtained over 100 repetitions of hand closing and opening, carried out with the WHSs applied to the middle finger and pinkie.
Figure 20. Block diagram reporting the overall architecture of the proposed AlN-based piezoelectric glove.
Figure 21. Graphical representation of the designed inference chain for gesture recognition.
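The inference chain of Figure 21 reduces windowed piezoelectric samples to compact features, which a lightweight classifier evaluates on the edge device. The sketch below illustrates the idea with a nearest-centroid classifier; the feature choice, centroid values, and sample window are illustrative assumptions, not the glove's actual on-board model:

```python
import math

def extract_features(window):
    """Compact per-window features for one piezo channel:
    RMS energy and peak amplitude (illustrative choice)."""
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    peak = max(abs(x) for x in window)
    return (rms, peak)

def classify(features, centroids):
    """Nearest-centroid gesture classifier: a lightweight stand-in
    for the edge ML model running on the glove's MCU."""
    return min(centroids, key=lambda g: math.dist(features, centroids[g]))

# Hypothetical centroids learned offline for two gestures
centroids = {"fist": (0.8, 1.2), "open_hand": (0.2, 0.4)}
window = [0.7, -0.9, 1.1, -0.8, 0.9, -1.0]  # simulated piezo samples (V)
print(classify(extract_features(window), centroids))  # → fist
```

On a resource-constrained MCU, such a feature-plus-distance pipeline keeps both memory footprint and per-window latency small compared with running a full neural network.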
Figure 22. Three-dimensional model of the developed piezoelectric glove with the main components highlighted: AlN flexible sensors, conditioning board, acquisition and processing board, and LiPo battery.
Figure 23. Block diagram of the front-end dedicated to the AlN-based piezoelectric sensors [129].
Figure 24. Schematic of the designed charge amplifier. The piezoelectric transducer is represented with its Thevenin equivalent circuit.
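For the charge amplifier of Figure 24, the ideal output is set by the feedback capacitor, Vout = -Q/Cf, while the feedback resistor fixes the low-frequency -3 dB corner, fc = 1/(2*pi*Rf*Cf). A quick sanity check with assumed component values (not those of the actual board):

```python
import math

def charge_amp_output(q_coulomb, c_f):
    """Ideal charge-amplifier output: Vout = -Q / Cf."""
    return -q_coulomb / c_f

def low_cutoff_hz(r_f, c_f):
    """Low-frequency -3 dB corner set by the feedback network."""
    return 1.0 / (2.0 * math.pi * r_f * c_f)

Q = 500e-12   # 500 pC charge from the AlN sensor (assumed)
Cf = 1e-9     # 1 nF feedback capacitor (assumed)
Rf = 100e6    # 100 MOhm feedback resistor (assumed)
print(f"Vout = {charge_amp_output(Q, Cf):.2f} V")  # → Vout = -0.50 V
print(f"fc = {low_cutoff_hz(Rf, Cf):.2f} Hz")      # → fc = 1.59 Hz
```

Because the gain depends on Cf rather than on the cable and sensor capacitances, the charge amplifier makes the readout largely insensitive to the wiring between the AlN sensor and the conditioning board.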
Figure 25. Overall schematic of the designed band-pass filter based on Sallen–Key cells.
Figure 26. Bode diagram of the overall system’s frequency response: magnitude (a) and phase (b).
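The frequency response of Figure 26 results from cascaded Sallen–Key cells. As a hedged sketch, the snippet below evaluates the magnitude response of a single unity-gain, equal-component Sallen–Key low-pass cell; the component values are illustrative, not those of the designed filter:

```python
import math

def sallen_key_lowpass_mag(f, r, c):
    """|H(j*2*pi*f)| of a unity-gain Sallen-Key low-pass cell with equal
    R and C (Q = 0.5 for this configuration; values are illustrative).
    H(s) = 1 / (s^2 R^2 C^2 + 2 s R C + 1)."""
    s = 1j * 2 * math.pi * f
    h = 1 / (s**2 * r**2 * c**2 + 2 * s * r * c + 1)
    return abs(h)

R, C = 16e3, 100e-9                 # assumed: 16 kOhm, 100 nF
f0 = 1 / (2 * math.pi * R * C)      # natural frequency ≈ 99.5 Hz
print(f"f0 = {f0:.1f} Hz, |H(f0)| = {sallen_key_lowpass_mag(f0, R, C):.2f}")
# → f0 = 99.5 Hz, |H(f0)| = 0.50
```

Cascading a high-pass and a low-pass cell of this kind yields the band-pass behavior whose Bode diagram is shown in Figure 26.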
Figure 27. PCB of the overall conditioning section (a) and 3D model view (b).
Figure 28. Three flexible AlN sensors used in the preliminary tests: triangular tip (a), large rectangular (b), and small rectangular (c).
Figure 29. Waveforms of the output signal Vout (yellow trace) and the filtered output signal Vout,filtered (cyan trace) in response to flexural (a) and impulsive (b) solicitations.
Figure 30. Application of the sensor on the index finger (a) and the corresponding output signal Vout (yellow trace) and filtered output signal Vout,filtered (cyan trace) related to finger bending (b).
Figure 31. Application of the sensor on the wrist (a) and the corresponding output signal Vout (yellow trace) and filtered output signal Vout,filtered (cyan trace) related to wrist bending (b).
Figure 32. Graphical representation of the proposed combined tactile and image recognition system.
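A common way to realize the combined tactile and image recognition of Figure 32 is late fusion: each modality produces per-class probabilities, which are merged with fixed weights before the final decision. The class names, probabilities, and weights below are illustrative assumptions, not the system's actual parameters:

```python
def late_fusion(p_visual, p_tactile, w_visual=0.6, w_tactile=0.4):
    """Weighted late fusion of per-class probabilities from the visual
    and tactile classifiers (weights are illustrative)."""
    fused = {c: w_visual * p_visual[c] + w_tactile * p_tactile[c]
             for c in p_visual}
    total = sum(fused.values())
    return {c: v / total for c, v in fused.items()}  # renormalize

# Hypothetical case: the camera is unsure (occlusion), touch is confident
p_visual = {"bottle": 0.45, "cup": 0.40, "ball": 0.15}
p_tactile = {"bottle": 0.10, "cup": 0.80, "ball": 0.10}
fused = late_fusion(p_visual, p_tactile)
print(max(fused, key=fused.get))  # → cup
```

Late fusion lets the tactile channel resolve cases where the glasses-mounted micro-camera is occluded by the hand itself, which is precisely when the piezoelectric sensors are in contact with the object.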
Table 1. Summary and comparison of the discussed haptic approaches.

| Approach | Actuator | Mechanical Feedback | Tactile Feedback | Advantages | Disadvantages |
|---|---|---|---|---|---|
| Force-based haptic devices | Pneumatic actuators | Yes | Force, shape, and impact | Efficient stretching force, out-of-plane displacements | Low-speed actuator response, safety issues, complex structure |
| Force-based haptic devices | Hydraulic actuators | Yes | Force, shape, and impact | Efficient stretching force, out-of-plane displacements | Low-speed actuator response, safety issues, complex structure |
| Force-based haptic devices | Piezoelectric actuators | No | Pattern, hardness, and roughness | Compact size and fast response time | Weak output due to low piezoelectric film displacement |
| Force-based haptic devices | Electromagnetic actuators | No | Pattern, roughness | Low power consumption, fast responsiveness, high haptic strength | Narrow operating frequency, relatively bulky packaging |
| Thermal-based haptic devices | Joule heater | No | Warming | Simple structure | Lack of cooling and dissipation microstructures |
| Thermal-based haptic devices | Thermoelectric actuators | No | Warming, cooling | Complete cooling and heating control | Complex structure; wearable use requires purpose-developed materials |
| Thermal-based haptic devices | Microfluidic systems | No | Warming, cooling | Complete cooling and heating control | Complex design |
| Nerve-stimulation haptic devices | Electrotactile stimulation | Yes | Impact, pattern, roughness, and hardness | Compact, wide bandwidth, multi-mode, simple design | Tickling sensation; unstable contact resistance (impedance mismatch); unstable sensations and unclear bio-mechanism |
Table 2. Characteristics and technical specifications of all the analyzed devices.

| Device | Application | N. of Actuators/Sensors | Actuator or Sensor Technology | Feedback Typology | Future Applications |
|---|---|---|---|---|---|
| Vi-Hab band [56] | Rehabilitation systems (biomedical) | (5) vibrational coin motors | Vibrotactile | Kinesthetic feedback (independent and simultaneous) | Force control on active prostheses or exoskeletons |
| Smart glove [34] | VR surgical training, VR social networks | (8) triboelectric sensors, (1) PZT stimulator | Elastomer-based triboelectric tactile sensors; PZT tactile actuator | Vibrotactile feedback | Remote home care; self-powered system; machine intelligence improvements based on AI and Big Data |
| Ungrounded inertial haptic interface [69] | Haptic devices | (1) P-602-3SL piezo actuator, (1) integrated position sensor, (1) MMA7361 3-axis accelerometer | Piezoelectric | Force feedback | Portable haptic device; fewer moving parts by implementing ball bearings |
| PPTs [76] | Haptic devices | Printed polymer transducer (PPT) piezo membranes | Piezoelectric | Vibrotactile and acoustic haptic feedback | Free-space haptic feedback based on ultrasound |
| Vibrotactile array fingertip [84] | VR object recognition and interaction | (4) piezoelectric actuators | Piezoelectric | Vibrotactile feedback | 3D VR interaction |
Table 3. Basic characteristics for intelligent gloves [93].

| Characteristic | Requirement |
|---|---|
| Monitoring hand movements | Up to 23 degrees of freedom (DoFs) are needed to effectively monitor hand gestures: up to 4 DoFs per finger (two for the first joint and one for each additional joint) and 3 DoFs for hand rotation. |
| Tactile response | Tactile feedback improves the bidirectional interaction with the manipulated item and, thus, the user experience. |
| Wearability | The device must be comfortable and simple to put on. |
| Dimensions | The device might be produced in various sizes or made adjustable. |
| Weight | Since it is worn on the hand, the glove should be light, typically between 50 and 300 g. |
| Power source | A low-power implementation is essential for this application; energy-autonomous devices should be considered. |
| Wireless communication | A wireless connection (such as Bluetooth or Wi-Fi) is recommended for remote machine control. |
Table 4. Summary of the previously analyzed smart gloves, compared in terms of sensor typologies and number, processing unit, and future applications and improvements.

| Work | N° of Sensors and Type | Processing Unit | Resulting Data/Feedback | Future Applications and Improvements |
|---|---|---|---|---|
| B. Fang et al. [110] | (15) MPU9250 9-axis inertial and magnetic sensors | STM32F4 | ELM-based gesture recognition | Robotic teleoperation based on gesture recognition |
| C. Luca et al. [117] | (3) pressure sensors, (3) bending sensors | ATmega328P | Finger movement detection | Clinical study testing |
| M.F. Simons et al. [119] | (1) silicone-elastomer pressure pads, (1) coiled SMA wires with Kapton tape for the armband | ATmega328P | Mechanotactile stimulation | Testing on upper-limb amputees to assess the device in a real application |
| W.V.I. Awantha et al. [125] | 8-layer jamming mechanism, (1) inertial measurement unit (IMU) | Arduino Mega 2560 | Stiffening of jamming elements | Improved wearability; stiffness control; tremor-suppression optimization via force assessment |