Article

Hardness Classification Using Cost-Effective Off-the-Shelf Tactile Sensors Inspired by Mechanoreceptors

Wolfson School of Mechanical, Electrical, and Manufacturing Engineering, Loughborough University, Loughborough LE11 3TU, UK
*
Author to whom correspondence should be addressed.
Electronics 2024, 13(13), 2450; https://doi.org/10.3390/electronics13132450
Submission received: 30 April 2024 / Revised: 3 June 2024 / Accepted: 18 June 2024 / Published: 22 June 2024
(This article belongs to the Special Issue Advances in Social Bots)

Abstract:
Perception is essential for robotic systems, enabling effective interaction with their surroundings through actions such as grasping and touching. Traditionally, this has relied on integrating various sensor systems, including tactile sensors, cameras, and acoustic sensors. This study leverages commercially available tactile sensors for hardness classification, drawing inspiration from the functionality of human mechanoreceptors in recognizing complex object properties during grasping tasks. Unlike previous research using customized sensors, this study focuses on cost-effective, easy-to-install, and readily deployable sensors. The approach employs a qualitative method, using the Shore hardness taxonomy to select objects and evaluate the performance of commercial off-the-shelf (COTS) sensors. Data from both individual sensors and their combinations were analysed using multiple machine learning approaches, with accuracy considered as the primary evaluation metric. The findings illustrate that increasing the number of classes affects accuracy: 92% in binary classification, 82% in ternary, and 80% in quaternary scenarios. Notably, the 92% accuracy achieved with a limited data set is comparable to results reported in the literature, which range from 50% to 98%. These results highlight the capability of COTS tactile sensors in hardness classification while being cost-effective and easier to deploy than customized tactile sensors.

1. Introduction

In the robotics domain, various sensor systems, including tactile sensors, cameras, and acoustic sensors, have been utilized to provide perceptual capabilities. These sensors are often integrated into gripper systems, equipped as artificial fingers, layered sensors, or embedded modules. Previous research has explored the use of different sensors, individually and in combination, for several classification tasks [1]. Material classification can determine the texture or hardness of an object against a defined scale [1,2], whereas object classification studies have involved grasping different objects to analyse their characteristics from sensor and resistance readings. Techniques such as pressing, sliding sensors across surfaces, and squeezing or grasping have been explored in the literature as state-of-the-art methodologies in tactile perception [1,2]. However, the selection of sensors in these studies has typically been driven by application-specific requirements rather than mirroring the architecture or functionality of human mechanoreceptors [3,4,5]. In humans, tactile information is processed by four distinct mechanoreceptors, each specializing in detecting different tactile signals [6]. Presently, there is a gap in the field concerning the selective integration of sensors that replicate the functionality of human mechanoreceptors to perform hardness classification.
The aim of this study is to explore commercially available tactile sensors that mimic the properties and functionalities of human mechanoreceptors. This research seeks to identify sensors capable of detecting tactile information with a performance comparable to that of human fingers, emphasizing cost-effectiveness. Previous research has primarily focused on customized tactile sensors that are expensive and application-specific [7,8,9,10,11,12,13]; the specialized architecture of these sensors limits their immediate deployment. This work targets the identification of tactile sensors that are readily deployable, require minimal installation effort, and are accessible without extensive customization. It is important to establish a framework for describing tactile sensor selection, drawing on the mechanoreceptor concepts described in the existing literature [6,14]. Although prior studies have proposed advanced tactile sensors as artificial mechanoreceptors [5,15,16,17,18,19,20,21], their development and implementation in robotic systems are still at an early stage. In contrast, the existing array of tactile sensors in robotics encompasses various modalities, including vibration, thermal, piezoelectric, and piezoresistive sensors [1,22], which collectively provide a rich source of tactile data. However, many of these sensors are customized for specific tasks or applications and have not been adequately assessed for classifying materials by hardness in robotic grasping tasks. Therefore, there is a need to investigate the potential of commercial, cost-effective tactile sensors for hardness classification in robotic manipulation scenarios. Sensors in robotic grippers [8,9,10,23,24,25] and receptors in human hands [6,14,26,27,28] play a crucial role in receiving tactile information, forming the foundation of tactile perception systems in both.
In robotics, hardness classification utilizes various tactile sensing methods, in which measuring mechanical resistance while squeezing an object is crucial [1,8,29]. Hardness classification judges an object’s resistance to deformation and its ability to withstand external forces without distortion, enabling classification into categories such as hard, soft, or flexible. While most research has focused on binary classification into hard and soft [8,9,24,30,31,32], there remains a significant gap in exploring classifications that involve three or more classes. Developing multi-class systems could offer distinct advantages in real-time applications, particularly in robotic manipulation.
Tactile information obtained through grasping is crucial for understanding an object’s tactile properties [8,33]. Unlike vision systems, which rely on predefined object features and lack predictive abilities, tactile sensors provide numeric data, enabling more feasible and less time-consuming analysis when coupled with machine learning algorithms [34]. This paper proposes the use of cost-effective commercial off-the-shelf tactile sensors, inspired by human mechanoreceptors, to perform hardness classification through grasping. Human tactile perception relies on mechanoreceptors, specialized sensors within our bodies that detect pressure, vibration, and texture, enabling us to recognize various tactile properties with notable precision. Inspired by this natural mechanism, this paper explores commercial off-the-shelf (COTS) sensors capable of detecting similar tactile information when grasping objects, akin to the capabilities of human hands. The aim is to demonstrate the effectiveness of COTS tactile sensor data, both from individual sensors and from their combinations, in classification tasks. The results will demonstrate the potential of individual sensors as mechanoreceptor-inspired tactile receptors performing hardness classification using various machine learning algorithms. Furthermore, this study investigates the potential enhancement in accuracy from combining data from each tactile sensor, demonstrating how collective features perform in hardness classification. This is expected to deliver a competitive, agile solution for hardness classification that can be incorporated and scaled in robotic systems, particularly where vision-based solutions are unsuitable.
This paper comprises sections that identify the tools, methods, and adaptations required to apply machine learning to the collected data, organized as follows. Section 2 provides the foundational understanding of mechanoreceptors, tactile sensors, and material classification that inspired this paper, and serves as the basis for selecting commercial off-the-shelf (COTS) sensors according to the functionality of mechanoreceptors. Section 3 presents an approach for identifying mechanoreceptor-inspired COTS sensors, integrating them with grippers, and applying machine learning steps to the data structure obtained from them. Section 4 presents the design of the experiment, connecting the approach and the experimental setup for collecting data from the selected sensors and forming different data configurations. It explains in depth the grasping method, performed using pneumatic grippers with pressure control, and describes how objects were selected and prepared using Shore’s qualitative hardness scale (H,S,F,ES) and how data collection was performed. Section 5 presents three sets of results: first, binary (H,S) classification across different ML algorithms and sensor-data configurations; second, ternary (H,S,F) classification; and third, the best-performing algorithm on quaternary (H,S,F,ES) classification.

2. Background

Human hands have exceptional tactile capabilities and serve as a significant inspiration for advancing robotic perception. Mechanoreceptors are special sensors/receptors in human skin that detect various tactile sensations like pressure, force, and vibration. They play a crucial role in our ability to perceive and interact with the world around us. Understanding how mechanoreceptors work and their capability to detect hardness is essential for robotics research. Replicating these sensors in robots can provide the capability to perceive and interpret tactile information while grasping. This would enable robots to better understand objects with more precision and perform tasks that require sensitivity to hardness.

2.1. Mechanoreceptors

How a human detects the properties of an object through touching, grasping, and picking is based on receptors (biological transducers) that convert touch stimuli into tactile information signalled to the brain, which judges the object’s properties. Mechanoreceptors are fundamental sensors that detect different physical/mechanical properties of objects (tactile information), mainly through four receptor types [35,36], illustrated in Figure 1. These four receptors detect different physical properties such as force, vibration, deformation, and indentation. (1) Merkel Cells (Discs): Merkel cells respond to changes in pressure and texture, allowing them to detect variations in surface features such as roughness and indentation, informing of force/pressure. (2) Meissner’s Corpuscles: Meissner’s corpuscles are sensitive to light touch and low-frequency vibrations, detecting changes in surface texture and shape. (3) Pacinian Corpuscles: Their primary function is to detect sudden changes in vibration and pressure. (4) Ruffini Endings: Ruffini endings are sensitive to sustained pressure and skin stretch, enabling them to detect changes in skin deformation caused by object contact. Based on current understanding, mechanoreceptors are tactile receptors that receive multiple pieces of information while grasping or touching an object; this tactile information is processed through neurons to the brain, which identifies the object’s properties and stores them for further reference [37]. Mechanoreceptors are responsible for detecting mechanical stimuli, including pressure, vibration, and deformation, thereby providing humans with valuable tactile information [6,35,36,37]. In the context of tactile perception, mechanoreceptors play a crucial role in assessing the hardness and softness of objects by responding to various tactile signals.
The key insight from the receptors illustrated in Figure 1 is that touch, force, vibration, and sliding are the physical properties that let humans judge whether an object is hard, soft, or flexible while grasping or squeezing it.

2.2. Tactile Sensors

Material classification is an important aspect of robotics development, helping robotic grippers handle different objects or perform manipulation in different environments. This objective is only possible if the sensors embedded within grippers can detect tactile information to recognize object properties. Material classification can be further subclassified into hardness classification and texture classification. To date, texture classification has been explored extensively, but hardness classification has received much less attention with different tactile and off-the-shelf sensors. Sensors used in the literature are highly customized in shape and dimension, which makes them time-consuming and costly to develop. Parallel research is now focused on developing sensors with human-like capabilities for detecting object characteristics such as texture, hardness, roughness, softness, and slipperiness [1]. Engineering tools and methods have been developed that show responses and functionality comparable to human receptors [15,16,17,18,19,20,21], but it remains challenging to develop a system or sensor with the same capability as humans. In the literature, various types of sensors are discussed with respect to their cost, which varies depending on the technology and complexity involved. Optical sensors and bioinspired sensors are generally high-cost options due to their intricate designs and advanced functionalities. Microfluidic sensors and MEMS-based sensors fall into the moderate- to high-cost category, as they involve sophisticated fabrication processes and specialized materials. Similarly, grid capacitive, piezoelectric, and resistive tactile sensors are moderately priced, reflecting the complexity of their construction and sensing mechanisms.
Neuron memristors and ionic tactile sensors are relatively newer technologies in the moderate cost range as they undergo further development and refinement [5,29,38,39]. However, all of these face limitations, including high cost, complexity, sensitivity to environmental factors, integration challenges, and training issues with ML models due to the need for extensive datasets and iterative refinement. In comparison, commercial off-the-shelf (COTS) sensors are easy to use, cost-effective, ready to deploy, flexible enough to fit most grippers, and suitable for exploring tactile sensing techniques with different robotic grippers. Moreover, these sensors, such as vibration sensors, force-sensitive resistors (FSRs), and soft potentiometers, can also act as artificial mechanoreceptors based on their functionality.

2.3. Material Classification

Material classification involves sorting objects based on various features, much like how humans recognize things by feeling their texture and hardness when they touch or hold them. While texture classification, edge detection, and object classification have received considerable attention, hardness classification has been relatively less explored, despite its significant importance. In the field of material science and engineering, machine learning (ML) techniques have proven invaluable in determining material properties using various methods and approaches, such as the Shore scale taxonomy [7,10,12,36,38,39]. ML enables systems to learn and recognize materials based on their hardness using techniques like grasping, indentation, and resistance methods. By using extensive datasets, ML models discern patterns and correlations, preparing them to accurately classify materials into predefined hardness classes. This iterative refinement process ensures their reliability, aiding industries in material selection and enhancing robotic grasping mechanisms through tactile sensing. Inspired by human grasping mechanisms, ML algorithms serve as analytical tools, processing tactile information to identify object properties. Robotics systems lack the analytical capabilities of the human brain to recognize object properties. However, through the integration of sensor data and machine learning (ML), this approach becomes feasible in advanced tactile sensing for robotics.
These algorithms are built upon various mathematical formulations, which serve as the backend processes. Encapsulated as functions within ML libraries and invoked through Python code, they become readily deployable for data analysis, facilitating advances in material classification and robotics. The hardness classification process begins with data collection through experiments, gathering information on material properties alongside corresponding hardness measurements. Thorough preprocessing follows, ensuring the quality and integrity of the dataset, while relevant features indicative of hardness are selected or extracted to facilitate accurate classification. Subsequently, an appropriate ML model is selected and trained on the pre-processed dataset, allowing it to learn patterns and relationships between input features and hardness classes. Finally, the trained model’s performance is evaluated using established metrics, and iterative refinement and optimization are employed to enhance its accuracy and effectiveness in hardness classification tasks. Different types of ML algorithms have been deployed for classification, presented as the state of the art in [8,9,24,30,31,32]. To assess how multiple ML algorithms perform in terms of hardness classification accuracy, grasping data from COTS sensors were collected for objects selected and prepared based on the Shore hardness scale.

2.4. Shore Hardness Scale

The Shore hardness scale is a measurement technique used to assess the hardness or softness of a material on a wide scale. It has been adopted in several classification studies [10] to compare an object’s measured value against the assumption, made during testing, that the object is soft or hard. Many classification approaches consider only hardness and softness because they are easy to test. The scale provides a standardized method for quantifying a material’s resistance to an imposed force, similar to how humans assess an object without knowing its values. There are different variants of the Shore hardness scale, such as Shore A, Shore D, and Shore OO, each tailored to specific material types and testing conditions. The scale can be used in different ways: one can select a material or object whose Shore value precisely describes its property and match it to a qualitative scale of extra soft, soft, medium soft, medium hard, hard, and extra hard. In this work, the qualitative scale was adopted to extend classification beyond the binary case, adding objects beyond hard and soft, such as flexible (deformable, but only with force). This approach also makes object adoption faster and, based on the ML algorithms’ outcomes, shows precisely whether extending the number of classes improves results; it also leaves the option to test data from a wider range of unknown/new objects.
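The mapping from a Shore reading to a qualitative class can be sketched as a simple lookup. The band boundaries below are hypothetical placeholders for illustration only, not calibrated values from this study; the class labels follow the (H,S,F,ES) taxonomy used here.

```python
# Illustrative mapping from a Shore A reading to the qualitative
# classes used in this study (ES, S, F, H). The band boundaries
# are hypothetical placeholders, not values from this study.
QUALITATIVE_BANDS = [
    (20, "ES"),   # extra soft
    (50, "S"),    # soft
    (80, "F"),    # flexible (deformable, but only with force)
    (100, "H"),   # hard
]

def shore_a_to_class(shore_a: float) -> str:
    """Return the qualitative class for a Shore A hardness value."""
    for upper, label in QUALITATIVE_BANDS:
        if shore_a <= upper:
            return label
    return "H"  # anything above the last band is treated as hard

print(shore_a_to_class(15), shore_a_to_class(65), shore_a_to_class(95))
# ES F H
```

In practice, the boundaries would be tuned against the Shore durometer readings of the actual test objects.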

2.5. Summary

Human hands hold exceptional tactile detection capabilities, such as pressure, force, and vibration, due to mechanoreceptors in the skin. These mechanoreceptors are crucial for understanding and interacting with the environment. Inspired by this natural mechanism, which converts tactile stimuli into neural signals, this research aims to bring human tactile capabilities to robotic systems. Unlike existing methods that use highly customized and expensive sensors, this research explores commercial off-the-shelf (COTS) tactile sensors, which are cost-effective and easy to integrate. Hardness classification is particularly important because it enables robots to handle and manipulate objects with precision and supports material-type identification and sorting, preventing damage to delicate items and ensuring a secure hold on harder objects, much as humans assess material properties through touch or grasping. This capability can improve robot-human interaction: for service robots or robotic assistants, understanding material hardness enhances their ability to perform tasks that involve human-like manipulation skills. By using machine learning (ML) techniques, robots can be trained to recognize material hardness, improving their grasping and manipulation abilities. The Shore hardness scale is employed to provide a standardized measurement of material hardness, aiding the development of more effective robotic tactile sensing systems.

3. Approach

3.1. Identification of Mechanoreceptor-Inspired Tactile Sensors

Different types of complex and customized tactile sensors used in various application and classification tasks have been identified in the literature. Newly proposed tactile sensors, such as biomimetic tactile sensors, artificial mechanoreceptor tactile sensors, grid-type piezoelectric sensors, and ionic tactile sensors, are still being commercialized and are not readily available [7,13,15,38,40,41,42]. Based on the current state of the art, artificial mechanoreceptors and tactile sensors share a common developmental base, detecting similar physical properties from stimuli; they vary in their single- or multi-dimensional array configurations. Many studies have focused on emulating spike patterns using artificial mechanoreceptors, with some applied to hardness classification tasks [39,43,44,45]. Currently, many of these advanced sensors are not available in thin-film form for multiple tasks within robotics applications. They are very sensitive, can quickly degrade under repeated grasping, and cannot withstand long data collection periods with multiple objects. They are advanced, but few applications have been proposed within robotic grasping or classification. Despite numerous descriptions of artificial mechanoreceptors and multilayer sensors in the literature, their use cases remain limited, and commercial availability poses another challenge. Proposed sensors are customized to their application, leaving little scope for further development or availability [22,29,38,46]. Thin-film sensors are easy to embed in most robotic grippers, and market research identified sensors with standard dimensions that can be used directly in robotics applications. Therefore, the focus shifts to tactile sensors available on the market, classified by functionality mirroring the natural mechanoreceptors illustrated in Figure 1.
These include FSR force-resistive sensors (also known as piezoresistive sensors), vibration sensors (akin to piezoelectric sensors), and soft potentiometer sensors (such as distributed force-resistive sensors). These thin-film sensors can be easily integrated into hard or soft grippers to detect physical changes in force or distributed pressure. For this purpose, the Schunk gripper was adopted due to its capability to accommodate these sensors within its internal structure.
FSR sensors as mechanoreceptor 1 and 2 functionality: FSR sensors, also known as force-sensitive resistors, are tactile sensors designed to detect changes in resistance when force is applied. This change in resistance leads to an increase in voltage, typically scaled within the range of zero to five volts. The sensitivity of FSR sensors allows them to detect various levels of force, from light touch to continuous pressure and even high-impact forces of up to 50 N. This versatility makes FSR sensors well suited to emulate the functionality of the two mechanoreceptors located in the top layer of the skin, namely the Meissner corpuscles and Merkel discs [47]. These receptors are positioned close to the skin’s surface, enabling them to accurately perceive touch, pressure, or any form of deformation [47]. When an object deforms due to pressure or manipulation, the mechanoreceptors sense the deformation and send signals in proportion to its degree. Based on the deformation values obtained, objects are classified as soft, hard, or flexible. FSR sensors can measure this pressure and resistance, which is why they were chosen.
Soft potentiometer sensors as mechanoreceptor 3 functionality: The capabilities of potentiometer sensors as mechanoreceptors have not yet been fully explored for hardness classification. These sensors are similar to FSR sensors and exhibit a linear change in resistance across their surface. However, they possess a unique capability to detect slip, stretch, or sliding movements based on changes in resistance. This detection capability closely resembles the functionality of the mechanoreceptors known as Ruffini endings [6,47]. In this scenario, this off-the-shelf tactile sensor can detect changes similar to those sensed by mechanoreceptors, allowing object characteristics to be analysed for hardness classification. While squeezing, object deformation will produce stretch across the sensor and yield different output voltages as signatures for different objects, which helps classify objects based on hardness.
Vibration sensors as mechanoreceptor 4 functionality: Vibration sensors, functioning similarly to mechanoreceptors, play an essential role in neural pattern generation, classification, and the development of grid sensor architectures. Piezoelectric vibration sensors are essential tools for detecting impact and vibrations across a wide spectrum of frequencies, enabling the collection of data crucial for pattern generation and classification tasks. While these sensors do not replicate the intricate architecture of mechanoreceptors, they offer functionality similar to Pacinian receptors, producing data vital for classification purposes. Vibrations generated upon object impact are transmitted to mechanoreceptor junctions; as these mechanoreceptors [6,47] deform in response to detected vibrations, they send electrical impulses to the brain for processing. Higher-frequency vibrations typically indicate harder materials, which require more energy to induce, while lower frequencies correspond to softer materials. The same functionality and process can be achieved using thin-film vibration sensors to perform hardness classification.

3.2. Hardness Classification Using Mechanoreceptor-Inspired COTS Sensors

Figure 2 illustrates the proposed approach and its steps, which are divided into four parts. The first part covers the sensors bio-inspired by mechanoreceptors: choosing the right gripper based on dimensions, adopting a grasping method (mechanical resistance) that creates an impact on the sensors while grasping objects, similar to a human pinch grasp, and selecting objects on the Shore hardness qualitative scale, as illustrated in Figure 2. The second part applies a machine learning approach to the sensor data, comprising F (force), V (vibration), and P (potentiometer) data. Initially, single-sensor (F), (V), and (P) data were analysed; then different configurations of sensor data were formed from each sensor. This shows how off-the-shelf tactile sensor data, selected based on bio-inspired mechanoreceptors, perform hardness classification, and indicates how individual sensors and their combinations yield accurate results. Additionally, it offers insight into how mechanoreceptor-inspired tactile sensors, both individually and in combination, can classify hardness effectively. Accuracy from COTS tactile sensors sets a benchmark for further exploration of layered sensor technology in future work. These accuracy scores show how grasping-based tactile information, obtained from the sensors as voltage values, can be used to achieve hardness classification and evaluate sensor performance. Decisions are made based on the accuracy of each configuration in multi-class scenarios, including dual combinations such as (F,V), (F,P), and (P,V), as well as the three-sensor combination (F,V,P).
While binary classification has been explored in the literature, this approach also uses different sensor combinations to perform binary (two object classes), ternary (three object classes), and quaternary (four object classes) classification based on object data obtained from testing and the Shore hardness scale.
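The seven sensor configurations evaluated (three single sensors, three pairs, and the full triple) can be enumerated mechanically; a minimal sketch using only the standard library:

```python
from itertools import combinations

SENSORS = ["F", "V", "P"]  # force, vibration, potentiometer

# All non-empty sensor configurations considered in this study:
# single sensors, pairwise combinations, and the three-sensor set.
configs = [
    combo
    for r in range(1, len(SENSORS) + 1)
    for combo in combinations(SENSORS, r)
]
print(configs)
```

Enumerating the configurations this way keeps the downstream training loop generic: each tuple names the feature columns to select for that run.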

3.3. Machine Learning Approach for Hardness Classification

Figure 3a illustrates the process followed in the machine learning approach, and Figure 3b shows the data structure of the configurations used to train the ML models. Machine learning algorithms are essential for hardness classification tasks. These algorithms act like a brain, analysing different types of data such as numbers, text, images, and more. In the context of hardness classification, they help in understanding the features of datasets obtained from tactile sensors or any other source. The algorithms use various techniques, including supervised learning, where they are trained on labelled data to predict the hardness class. Examples of such techniques are decision tree classification, random forest classification, nearest neighbour classification, and support vector classification, among others. During training, the algorithms learn patterns and relationships within the data, allowing them to accurately classify new inputs. They analyse features extracted from the data to predict whether a material is hard, soft, or another class on the scale. Previously, biosignals, FFTs, and other digital data relating to objects have been used to train ML models [2,38]. In this approach, COTS sensor data were formed into different combinations to train the ML algorithms, to determine which sensor configuration out of F, V, P, (F,V), (F,P), (P,V), and (F,P,V) trains the ML best and, on test data, which ML algorithm achieves the best accuracy. Some steps need to be taken before deploying the machine learning algorithm, as follows:
Python libraries: Machine learning analysis was conducted in Python using Google Colaboratory. To run the analysis, several Python and machine learning libraries needed to be declared at the beginning of the code. In general, libraries such as Sklearn, NumPy, Pandas, and Matplotlib were declared in the first step [48,49,50,51].
Data importing: The initial step in the analysis involves importing the dataset into the Python environment. This dataset is the foundation for the machine learning tasks, containing the information needed for training and evaluating the classifier models, with feature variables and a target variable. In this case, Figure 3b shows the S1, S2, and S3 data in a single CSV file, with the sensor-value columns in volts as the X variables and the target variable, holding the object type encoded as H = 0, S = 1, F = 2, as the Y variable. These data were loaded and processed using the Python library Pandas. Subsequently, the two-sensor combinations (S1,S2), (S2,S3), and (S3,S1) of the dataset were assembled with the help of Pandas to follow the combination approach.
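A minimal sketch of this step, with a small in-memory DataFrame standing in for the CSV described above (the column names and voltage values are hypothetical, not taken from the study's data):

```python
import pandas as pd

# Hypothetical stand-in for the CSV: one voltage column per sensor
# (S1 = force, S2 = vibration, S3 = potentiometer) plus the encoded
# target column (H = 0, S = 1, F = 2).
df = pd.DataFrame({
    "S1": [2.31, 0.84, 1.52],
    "S2": [4.10, 1.05, 2.70],
    "S3": [3.25, 0.62, 1.98],
    "target": [0, 1, 2],
})

# Single-sensor feature matrix:
X_s1 = df[["S1"]]

# Two-sensor combination (S1,S2); pd.concat works the same way
# when the sensor columns come from separate files.
X_s1_s2 = pd.concat([df["S1"], df["S2"]], axis=1)

y = df["target"]
print(X_s1_s2.shape, list(y))
```

With a real file, `df = pd.read_csv(...)` replaces the in-memory construction and the rest of the step is unchanged.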
Preprocessing data: Prior to training machine learning models, it is essential to preprocess the dataset to ensure compatibility with the chosen algorithms and improve model performance. One common preprocessing step is encoding categorical variables into a numerical format, achieved using techniques such as label encoding, as illustrated in Figure 3b. In this case, the target variable is mapped to numerical values with a label encoder, which transforms the string labels H to 0, S to 1, F to 2, and, in the quaternary case, ES (extra soft) to 3. Additionally, preprocessing may involve handling missing values (for example, with an imputer), scaling features, and other transformations required by the dataset.
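A minimal sketch of the encoding step. Note that Scikit-learn's LabelEncoder assigns integer codes alphabetically, so an explicit mapping is used here to reproduce the convention described above (H as 0, S as 1, F as 2, ES as 3):

```python
import pandas as pd

# Explicit mapping matching the encoding described in the text.
mapping = {"H": 0, "S": 1, "F": 2, "ES": 3}
df = pd.DataFrame({"object_type": ["H", "S", "F", "ES", "H"]})
df["target"] = df["object_type"].map(mapping)
print(df["target"].tolist())  # [0, 1, 2, 3, 0]
```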
Splitting data: To evaluate the performance of machine learning models accurately, it is crucial to divide the dataset into separate training and testing subsets. This process, known as data splitting, helps assess the generalization ability of the trained models by evaluating their performance on unseen data. In this case, data from the configurations F, V, P, (F,V), (F,P), (P,V), and (F,P,V) were split accordingly to train the ML algorithms. Typically, a portion of the dataset is reserved for training, while the remainder is used for testing. Here, 80% of the data were used for training and 20% for testing, as illustrated in Figure 3a.
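The 80/20 split can be sketched with Scikit-learn (toy data; the real X would be sensor voltages from one of the configurations above):

```python
import numpy as np
from sklearn.model_selection import train_test_split

X = np.linspace(0, 5, 100).reshape(-1, 1)  # stand-in for sensor voltages
y = np.tile([0, 1], 50)                    # stand-in binary labels (H/S)

# 80% training, 20% testing; stratified so both classes appear in each split.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)
print(len(X_train), len(X_test))  # 80 20
```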
Importing ML algorithms: With the dataset prepared and split into appropriate subsets, the next step is to import the machine learning algorithms for data analysis: SVC (support vector classifier), RFC (random forest classifier), DT (decision tree), LR (logistic regression), and MLP (multilayer perceptron). Depending on the nature of the problem and the characteristics of the dataset, the most suitable algorithms are used to train predictive models. As these algorithms take less computational time, they are easier to deploy and analyse than deep learning or neural network approaches.
Training ML: Once the algorithms are imported, the split data, X (sensor values from the configurations F, V, P, (F,V), (F,P), (P,V), and (F,P,V)) and Y (the target feature, i.e., object type, with H as 0, S as 1, and F as 2), illustrated in Figure 3a,b, are used as training data. The models learn patterns and relationships between the sensor values and the object types. This involves feeding the training data into the algorithms and iteratively adjusting the model parameters to minimize a predefined loss function. Through this iterative optimization, the models learn to make predictions or classify new data based on the patterns observed in the training set.
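A condensed sketch of the import-and-train steps, fitting a few of the listed classifiers on toy data (in practice the features would be the recorded sensor voltages):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Toy training data: one feature (voltage), two classes (H = 0, S = 1),
# drawn from two well-separated clusters for illustration.
rng = np.random.default_rng(0)
X_train = np.concatenate([rng.normal(1.0, 0.2, 50),
                          rng.normal(4.0, 0.2, 50)]).reshape(-1, 1)
y_train = np.array([0] * 50 + [1] * 50)

models = {
    "RFC": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVC": SVC(),
    "DT": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)  # learn the voltage-to-class relationship

print(models["RFC"].predict([[0.9], [4.1]]))  # one sample near each cluster
```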
Trained model: After completing the training phase, trained machine learning models were obtained that capture the discerned patterns and correlations between the tactile sensor values and the target variables (object types). These relationships form a crucial basis for distinguishing between (1) hard and soft materials in the case of binary classification and (2) three object classes, hard, soft, and flexible (H,S,F), in the case of ternary classification, as illustrated in Figure 3. With this information, the models are equipped to forecast outcomes for new, unseen instances in hardness classification tasks.
Prediction: The prediction phase involves feeding unseen data into the trained models and obtaining output predictions based on the learned patterns. In this case, two sets of data were tested: (1) individual sensor data and (2) combinations of individual sensor data, with the outcomes of the different ML algorithms collected in one file, as illustrated in Figure 3.
Validation using accuracy score: To evaluate the performance of the trained machine learning models, validation using the accuracy score is illustrated in Figure 3a,b. The 20% of the data held out during splitting was used to test the models: the models predict whether new samples are H, S, or F objects, and these predictions are compared with the original test labels to give an accuracy score, which is then compared against the literature. The accuracy score provides a measure of the overall correctness of the model predictions; this is highlighted in the results section.
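The validation step amounts to comparing the held-out labels with the model's predictions, for which Scikit-learn's accuracy_score and confusion_matrix can be used (toy labels shown):

```python
from sklearn.metrics import accuracy_score, confusion_matrix

# Toy held-out labels and model predictions (H = 0, S = 1, F = 2).
y_test = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]

acc = accuracy_score(y_test, y_pred)   # fraction of correct predictions
cm = confusion_matrix(y_test, y_pred)  # per-class breakdown of errors
print(round(acc, 3))  # 0.667 (4 of 6 correct)
```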

4. Design of Experiment

The design of the experiment illustrated in Figure 4 for conducting hardness classification involves both hardware and software components. Firstly, it was crucial to employ COTS tactile sensors to measure the impact of squeezing objects and collect data. This required performing grasping and resistance measurement actions to obtain values from three sensors inspired by the tactile detection capability of mechanoreceptors. Additionally, the gripper action was facilitated by air pressure, operating at 0.4 MPa during grasping, controlled through Python code via a Raspberry Pi.
The selection of an appropriate gripper was paramount, requiring compatibility with the objects and sensors. A two-finger gripper was chosen for its ability to mimic human grasping and its economic viability for grasping various object types efficiently. The Schunk gripper, available in mechanical and pneumatic types, was selected for compatibility, with the tactile sensors shaped so that they attach easily. Control of the pneumatics involved a pressure regulator, a solenoid valve, and a single electric valve. An Arduino was integrated to connect the sensors and serve as an ADC for data transmission. Additional hardware components, including a 3D printer for object creation and a screen with an HDMI port for visualization, completed the experimental setup. Data collected from the sensors, denoted F (force), V (vibration), and P (potentiometer), were saved on the Raspberry Pi. These data were further arranged into various combinations, such as (F,V), (V,P), (F,P), and (F,P,V), to perform data analysis on individual sensor data and combinations thereof. To examine the values obtained from the COTS sensors more deeply, a 'Shore taxonomy' process was employed to analyse deformation resistance to force based on the qualitative Shore hardness scale presented in Figure 4. Additionally, each configuration represented specific conditions for binary, ternary, and quaternary classification. For binary classification, the classes were H (Hard) and S (Soft). Ternary classification incorporated H (Hard), S (Soft), and F (Flexible), while quaternary classification added an additional class, ES (Extra Soft). This approach was crucial for performing hardness classification for each object, providing insights into future applications in terms of sensor layer operation and the number of classes involved in classification.
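The seven sensor configurations used throughout (the three single sensors, the three pairs, and the triple) can be enumerated programmatically; a small sketch:

```python
from itertools import combinations

sensors = ["F", "V", "P"]
# All non-empty subsets: 3 singles + 3 pairs + 1 triple = 7 configurations.
configs = [c for r in range(1, len(sensors) + 1)
           for c in combinations(sensors, r)]
print(len(configs), configs[-1])  # 7 ('F', 'V', 'P')
```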

4.1. Object Preparation

As illustrated in Figure 4, objects representing different hardness classes (soft, flexible, hard) were prepared according to the Shore hardness scale for testing. For silicone rubber, Ecoflex 30 was poured into a 3 × 3 cm mould and left for around 3 h to cure for best results. After extraction, the silicone rubber was squeezed/pinch-grasped to confirm that it acts as soft on the Shore scale. TPU (thermoplastic polyurethane) is a filament used in 3D printing; a 3 × 3 cm object was printed and, based on the properties observed, TPU was classed as flexible, with medium-hard and medium-soft properties. For wood, a piece was cut into a 3 × 3 cm block. PLA (polylactic acid) is another filament, used to 3D print the fourth object, which was found to be another hard object.

4.2. Hardware

Figure 5a illustrates the complete setup, showing the tools and hardware used. To conduct hardness classification by squeezing (grasping) objects and collecting data, a gripper was needed with enough space to accommodate both the object and a sensor. For this purpose, a Schunk mechanical/pneumatic gripper was selected. Additionally, to collect data from the tactile sensors, their shape had to be compatible with mounting on the side of the pneumatic gripper. Controlling the pneumatics required regulating the air pressure, which was done using a pressure regulator, a solenoid valve, and a single electric valve. The main air pressure supply was fed into the pressure regulator from the supply within the IAC lab. To develop a system with both automatic and manual grasping capabilities, a device that could synchronize with Python and collect data simultaneously was required. This synchronization was achieved using a Raspberry Pi, which served as the central hub, combining the peripheral devices and controlling them through digital commands (high and low).
In addition to Raspberry Pi, an Arduino board was utilized to connect sensors and serve as an ADC (analogue to digital converter). This allowed the sensors to be connected to Arduino ports and subsequently linked to Raspberry Pi via serial communication. Other essential components included a pressure regulator to limit air pressure to a certain level, relay modules to control electric components, and solenoid valves to regulate air pressure. An RS power supply was necessary to operate the electric valve and solenoid valve effectively. To perform gripping tasks, a Schunk pneumatic gripper was employed. Furthermore, a 3D printer was utilized to create testing objects from different materials, providing versatility in experimentation. Lastly, a screen with an HDMI port was utilized to visualize the outcome of the experiments, ensuring effective monitoring and analysis of the collected data.

4.3. Software

To control grasping and collect sensor data for analysis with machine learning algorithms, various Python libraries and machine learning frameworks were employed. To establish communication between the Arduino and Raspberry Pi, the Nanpy library was used, integrating the two devices in a master–slave configuration to facilitate serial communication [52]. For data collection and processing, the Pandas library was used to manage and manipulate the datasets. For the analysis of the collected data and the implementation of the machine learning algorithms, the Scikit-learn library was applied, offering a wide selection of algorithms such as SVC (support vector classifier), RFC (random forest classifier), and DT (decision tree), as well as performance measures such as accuracy and the confusion matrix. Python 3, along with Python IDLE (version 3.8), Jupyter Notebook (version 5), and Google Colaboratory, was employed for coding. Each platform was selected based on its capabilities and suitability for specific tasks, ensuring efficient analysis and data collection across various scenarios.

4.4. Gripper System

In the exploration of various grippers documented in the literature, diverse structures and use cases have been examined. However, a general gripper was needed that could mimic pinch grasping as performed by the human hand, while also accommodating the dimensions of the COTS sensors. Understanding the issues of embedding sensors across different grippers and sensor variations was crucial. For the experiment depicted in Figure 5b, the Schunk PGF 80 gripper was utilized. One side of the gripper was equipped with a sensor, with each sensor being swapped in during the experiment for each object. The sensors were connected to an Arduino to collect data, capturing tactile information from each stimulus object. A pneumatic system, regulated by the control system, governed the opening and closing of the gripper by releasing air pressure through a pipe attached to the gripper. The gripper was selected for the space available between its fingers, facilitating easy placement of objects; additionally, its adjustable gripping range allowed it to grasp objects larger than its default size. Opting for a two-sided gripper enabled pinch-type grasping similar to human finger dexterity, occupying minimal space and integrating swiftly into any robotic arm. The selected sensors were easier to install on this gripper than custom-made ones, streamlining the experimental setup and enhancing efficiency.

4.5. Control System

To control the gripper, an SMC solenoid valve is driven through a relay module; Python code on the Raspberry Pi sends high and low commands to the GPIO pins. The relay module controls the electric valve in the pressure system, which switches the main air supply with a set delay of 0.4 to 0.6 s. As illustrated in Figure 5b, the sensors mounted on the gripper are also connected to an Arduino, which reads the analogue values from the sensors and converts them into voltages on a 5 V scale. The Arduino is connected to the Raspberry Pi, which acts as a bridge between the sensors and the classification code. With the help of the Python library Nanpy, the Raspberry Pi and Arduino are connected in a master–slave configuration to perform serial communication. Sensor data were saved from the Arduino to the Raspberry Pi as CSV files for use in the material classification algorithm.

4.6. Pressure System

Air pressure was taken from the IAC lab, where it was connected to the central pressure system and a pressure regulator with a safety knob. The air was passed through the regulator, set to approximately 0.4 MPa, and then through the electric valve, which was controlled by the control system via the relay module illustrated in Figure 5b. The main board was developed so that it can control three separate air pressure lines within this module, which could be useful for future pressure-variation tests with the gripper. The opening and closing of the air flow, described in Figure 5b within the gripper system, is controlled through an SMC 5/3 solenoid valve: when one side is opened through the relay module, the gripper opens, and when the other side is opened, the gripper closes and grasps. This was achieved by sending high (1) and low (0) commands from the Raspberry Pi through the GPIO pins to the second and third relay modules.

4.7. Data Collection

The data collection process is described in Figure 5b, which demonstrates the connectivity of each module, highlighting how the sensors are linked to the gripper and subsequently connected to an Arduino for data transmission to the Raspberry Pi. These data are crucial for further analysis, particularly in material classification research. During the experiment, single sensors were sequentially deployed on the gripper to acquire data, as illustrated in Figure 5, with the resulting data structure shown in Figure 4 for subsequent analysis. Each tactile sensor gathered data from three distinct classes of objects grasped by the gripper, with the Raspberry Pi (Figure 4) storing these data for each object individually. The gripper grasped each object repeatedly, collecting around 200 samples per object and sensor, with synchronized grasp-and-release cycles lasting around 0.4 s to 0.6 s. Using Python's Pandas library, data collection was streamlined through commands that assigned the object's state (H, S, or F) based on the Shore hardness scale. The collected data for each object, captured by the three sensors individually connected to the Arduino ports, were organized into separate data frames and saved as CSV files on the Raspberry Pi. Additionally, the string labels (H,S,F) were transformed into numerical values using a label encoder to enable classifier analysis. The connection setup detailed in Figure 5a,b illustrates the integration of the sensors with the gripper system during testing, clarifying how the tactile sensor data were stored in CSV files on the Raspberry Pi. Figure 4 explains the data collection process and the resulting data structure, and the steps of the ML analysis process are illustrated in Figure 3.
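A simplified sketch of how one sensor's 200-samples-per-object recording could be assembled into the labelled CSV structure (synthetic voltages stand in for the Arduino readings; the file and column names are assumptions):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
frames = []
for label, code in [("H", 0), ("S", 1), ("F", 2)]:
    volts = rng.uniform(0, 5, size=200)  # placeholder for analog voltages
    frames.append(pd.DataFrame({"F_volts": volts, "object_type": code}))

# One labelled data frame per sensor, 200 rows per object class.
data = pd.concat(frames, ignore_index=True)
# data.to_csv("force_sensor.csv", index=False)  # as stored on the Raspberry Pi
print(data.shape)  # (600, 2)
```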

5. Results Based on Machine Learning

5.1. Result on Hardness Classification Outcome Based on Two Classes (H,S)

In terms of hardness-based classification with two classes (H,S), the accuracy is comparable to that reported in the literature using customized sensors. Figure 6 indicates that binary classification with the individual sensors (F), (P), and (V) achieves accuracies ranging from 65% to 82%. The accuracy achieved by individual sensors shows that off-the-shelf tactile sensors can be used for hardness-based classification: with two classes (H,S), individual sensors reach up to 80% accuracy when predicting the 20% of data held out for validation. The literature reports 80 to 95% accuracy [2] using customized and complex tactile sensors. Ultrasonic sensors [32] used in the literature to describe hardness and softness take up too much space within the gripping area, making them difficult to adapt to different environments, which is also the case for customized sensors. For the sensor combinations, (F,V) = 85%, (P,F) = 89%, and (V,P) = 87%, the accuracy obtained shows that combining sensors increases prediction accuracy, indicating that more tactile information as features improves machine learning accuracy. Among the multiple algorithms tested, RFC performs best, and it also performs better across the different configurations. The three-sensor configuration (F,P,V) achieves around 93%, the highest among all configurations of sensors as features. This indicates that increasing the number of features (tactile information or values) from different sensors about an object increases the chance of obtaining optimum accuracy, and further demonstrates that COTS sensors are capable of hardness classification.

5.2. Hardness Classification Outcome Based on Three Classes (H,S,F)

Hardness classification using three classes, with results from multiple algorithms, is presented in Figure 7. Compared with two classes (H,S), the three-class outcome (H,S,F) shows lower accuracy across the different configurations. For individual sensors, accuracy drops below 70%; for the two-sensor combinations (F,V), (P,F), and (V,P), accuracy drops below 80%; only the three-sensor combination (F,P,V) reaches 82% accuracy, achieved by the RFC algorithm, the best among all configurations. This outcome shows that accuracy decreases across the ML algorithms as the number of classes increases. Accuracy for three classes (H,S,F) has not been investigated much in the literature; this result shows that data from multiple COTS sensors, used as tactile information/features, can perform multiclass hardness classification with over 80% accuracy on limited data.

5.3. Result from Best Algorithm and Sensors Configuration—Multiclass Output

From the two-class (Section 5.1) and three-class (Section 5.2) results, it can be seen that RFC performs well in most cases, and that multiple features from sensors with more tactile information (F,P,V) perform better overall. The results for three classes (H,S,F) with the three-sensor (F,V,P) configuration motivated a further step: including a fourth class from the Shore scale, ES (extra soft). For the ES object, a white sponge was included, and the ML models were trained again with all configurations. Figure 8 shows the results obtained from RFC: binary classification remains optimal in most configurations, and although accuracy falls for the other class counts, the capability to perform hardness classification remains. For four object classes (H,S,ES,F), RFC with the (F,V,P) combination performs best among all configurations, achieving around 79% accuracy. Overall, this result also indicates that the (F,V,P) combination performs better in all cases.

6. Conclusions and Future Scope

This study demonstrates the feasibility and practicality of using commercial off-the-shelf (COTS) tactile sensors for multiclass hardness classification in robotic applications, specifically showcasing their scalability, cost-effectiveness, and integration ease compared to customized tactile sensors. It particularly underscores the advantages of configurations that incorporate three sensors (F,V,P), which consistently outperform simpler setups with accuracy rates between 80–92% across binary, ternary, and quaternary classifications. These findings highlight not only the feasibility of employing COTS sensors in varied robotic tasks but also their comparability in performance to more complex sensor systems documented in existing literature, where accuracy ranges from 50–97% (binary classification). Notably, the random forest classifier (RFC) was found to be particularly effective, likely due to its robust pattern recognition capabilities within the diverse data sets involving sensor values and target classifications.
In binary classification, the individual sensors (F) and (P) achieved accuracies above 80%; while encouraging, this suggests that a single sensor alone may not be sufficient for comprehensive hardness classification. Nevertheless, individual sensors show accuracy scores comparable to those of the more complex grid or array sensors documented in the literature, which typically achieve accuracies between 50–90% [41]. This comparison highlights the viability of commercial off-the-shelf (COTS) sensors for rapid deployment in robotic applications, emphasizing their potential for broader use in texture classification and other areas. The selectivity of these sensors, inspired by human mechanoreceptors, is crucial. It enables the capture of diverse force, pressure, and vibration signals from various objects, enhancing hardness classification. This mechanoreceptor-based selectivity is especially beneficial when sensor data from multiple sources (F,V,P) are combined, suggesting future exploration into other bio-inspired sensors such as thermoreceptors for temperature, and optical receptors for light detection.
However, there are notable limitations to consider. The current testing procedures, which primarily involve uniformly shaped square objects, do not fully represent the range of real-world scenarios. The data collected from only four to five objects is limited in quantity, which may restrict the applicability of the tests in real-world and real-time scenarios. This limitation poses a challenge for performing hardness classification with COTS sensors, necessitating tests with a larger number of objects. Including more objects based on the Shore scale could either decrease or increase the accuracy of COTS sensors, which remains uncertain and highlights the need for future research. The resolution and sensitivity of the COTS sensors may not be sufficient for detecting fine differences in hardness, requiring tests with materials of various properties. Furthermore, testing conducted under controlled environments does not account for the variability in real-world conditions, such as temperature, humidity, and platform differences, which could affect sensor performance. Additionally, sensor alignment with various robotic grippers and different objects may yield varying outcomes. This underscores the potential for future work exploring the use of COTS sensors with dexterous robotic grippers to understand their performance across diverse applications. The use of single-sensor data, followed by their combination, can be time-consuming in real-time applications, posing a significant drawback. These factors emphasize the need for more comprehensive testing on objects of different sizes and shapes to accurately assess the capabilities and limitations of COTS sensors in various classification applications. Overall, this study has revealed several significant findings. 
Firstly, it demonstrated the feasibility and accessibility of hardness classification using commercial off-the-shelf (COTS) sensors, which require minimal processing time and are readily deployable in various robotics environments. Secondly, configurations using three sensors (F,P,V) consistently outperformed others, proving particularly effective in binary classification; although accuracy fell in the ternary and quaternary scenarios, they still performed best relative to the other configurations. Notably, among the various machine learning algorithms tested, the random forest classifier (RFC) exhibited optimal performance. This is likely due to RFC's ability to effectively discern patterns within the training data, especially within the subset containing sensor values and target classifications. The achieved accuracy underscores the potential of COTS sensors, yielding results comparable to those documented in existing literature. These findings suggest significant potential for the extensive use of COTS sensors in robotic tactile sensing applications. Additionally, the study has shown the potential of exploring layered arrangements or topologies of COTS sensors to identify optimal configurations using a bio-inspired (mechanoreceptor) approach. Future research will focus on analysing tactile information gathered collectively from layered sensors and on performing real-time predictions with unknown or new objects. In future work, we plan to expand the variety of materials tested to include broader parameters such as textures, densities, gradient hardness, and composite structures, to better evaluate the sensors' capabilities in diverse real-world scenarios. We may also explore advanced machine learning models, deep learning approaches, and ensemble techniques to improve the accuracy and robustness of multiclass classification tasks. To improve the ML models, various filtering techniques to enhance the quality of the sensor data can also be considered in future work.

Author Contributions

Conceptualization, Y.S. and P.F.; methodology, Y.S. and P.F.; software, Y.S.; validation, Y.S. and P.F.; formal analysis, Y.S. and P.F.; investigation, Y.S.; resources, Y.S.; data curation, Y.S.; writing—original draft preparation, Y.S.; writing—review and editing, Y.S., P.F. and L.J.; visualization, Y.S.; supervision, P.F. and L.J.; project administration, P.F. and Y.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jin, J.; Wang, S.; Zhang, Z.; Mei, D.; Wang, Y. Progress on flexible tactile sensors in robotic applications on objects properties recognition, manipulation and human-machine interactions. Soft Sci. 2023, 3, 8. [Google Scholar] [CrossRef]
  2. Eguíluz, A.G.; Rañó, I.; Coleman, S.A.; McGinnity, T.M. Multimodal Material identification through recursive tactile sensing. Robot. Auton. Syst. 2018, 106, 130–139. [Google Scholar] [CrossRef]
  3. Yi, Z.; Zhang, Y.; Peters, J. Bioinspired tactile sensor for surface roughness discrimination. Sens. Actuators A Phys. 2017, 255, 46–53. [Google Scholar] [CrossRef]
  4. Li, G.; Liu, S.; Wang, L.; Zhu, R. Skin-inspired quadruple tactile sensors integrated on a robot hand enable object recognition. Sci. Robot. 2020, 5, 46–53. [Google Scholar] [CrossRef]
  5. Li, F.; Wang, R.; Song, C.; Zhao, M.; Ren, H.; Wang, S.; Liang, K.; Li, D.; Ma, X.; Zhu, B.; et al. A Skin-Inspired Artificial Mechanoreceptor for Tactile Enhancement and Integration. ACS Nano 2021, 15, 16422–16431. [Google Scholar] [CrossRef] [PubMed]
  6. Iheanacho, F.; Vellipuram, A.R. Physiology, Mechanoreceptors; StatPearls Publishing: Tampa, FL, USA, 2023. [Google Scholar]
  7. Dahiya, R.; Oddo, C.; Mazzoni, A.; Jörntell, H. Biomimetic tactile sensing. In Biomimetic Technologies; Elsevier: Amsterdam, The Netherlands, 2015; pp. 69–91. [Google Scholar] [CrossRef]
  8. Amin, Y.; Gianoglio, C.; Valle, M. Embedded real-time objects’ hardness classification for robotic grippers. Future Gener. Comput. Syst. 2023, 148, 211–224. [Google Scholar] [CrossRef]
  9. Song, Y.; Lv, S.; Wang, F.; Li, M. Hardness-and-Type Recognition of Different Objects Based on a Novel Porous Graphene Flexible Tactile Sensor Array. Micromachines 2023, 14, 217. [Google Scholar] [CrossRef]
  10. Qian, X.; Li, E.; Zhang, J.; Zhao, S.-N.; Wu, Q.-E.; Zhang, H.; Wang, W.; Wu, Y. Hardness Recognition of Robotic Forearm Based on Semi-supervised Generative Adversarial Networks. Front. Neurorobot. 2019, 13, 73. [Google Scholar] [CrossRef] [PubMed]
  11. Jamali, N.; Sammut, C. Material classification by tactile sensing using surface textures. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; IEEE: New York, NY, USA, 2010; pp. 2336–2341. [Google Scholar]
  12. Konstantinova, J.; Cotugno, G.; Stilli, A.; Noh, Y.; Althoefer, K. Object classification using hybrid fiber optical force/proximity sensor. In 2017 IEEE SENSORS; IEEE: New York, NY, USA, 2017; pp. 1–3. [Google Scholar] [CrossRef]
  13. Dai, K.; Wang, X.; Rojas, A.M.; Harber, E.; Tian, Y.; Paiva, N.; Gnehm, J.; Schindewolf, E.; Choset, H.; Webster-Wood, V.A.; et al. Design of a Biomimetic Tactile Sensor for Material Classification. arXiv 2022, arXiv:2203.15941. [Google Scholar]
  14. Madrigal, D.; Torres, G.; Ramos, F.; Vega, L. Cutaneous mechanoreceptor simulator. In Proceedings of the 2012 IEEE 3rd International Conference on Cognitive Infocommunications (CogInfoCom), Kosice, Slovakia, 2–5 December 2012; IEEE: New York, NY, USA, 2012; pp. 781–786. [Google Scholar] [CrossRef]
  15. Najarian, S.; Dargahi, J.; Mehrizi, A.A. Artificial Tactile Sensing in Biomedical Engineering; McGraw-Hill Education: New York, NY, USA, 2009. [Google Scholar]
Figure 1. Mechanoreceptor architecture and functionality in detecting different types of tactile information, with COTS tactile sensors mapped according to their function: (1) FSR force sensor, (2) Softpot membrane potentiometer, (3) piezoelectric thin-film vibration sensor.
Figure 2. Research approach: first, objects are selected on the basis of the Shore hardness scale, considering three to four object classes; second, the selected sensors are embedded on one side of the gripper; third, the sensors are connected via an Arduino to a Raspberry Pi for data collection; finally, different configurations of the force (F), vibration (V), and position (P) data are created to investigate accuracy scores using multiple machine learning approaches.
Figure 3. (a) Machine learning framework for the COTS sensor data. Diverse machine learning algorithms are trained on these data. The datasets comprise (1) individual sensor data and (2) combinations of sensor data, paired with object class labels denoted (H,S) and (H,S,F). Test values are then input to each algorithm, and the accuracy of the outputs is evaluated to assess their efficacy. (b) Data structure of the various configurations saved on the Raspberry Pi, used to train the ML algorithms for hardness classification.
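As a rough illustration of the configuration data structure described in the caption above, the sketch below assembles individual-sensor and combined-sensor feature sets from simulated readings. The column names (F, V, P), value ranges, and labels are illustrative assumptions, not the authors' actual CSV layout.

```python
# Hypothetical sketch of assembling sensor-data "configurations" for ML.
# Simulated readings stand in for the real CSV files on the Raspberry Pi.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 200  # roughly the number of samples reported per sensor-object pair

df = pd.DataFrame({
    "F": rng.normal(5.0, 1.0, n),    # FSR force sensor (assumed units)
    "V": rng.normal(0.2, 0.05, n),   # piezoelectric vibration sensor
    "P": rng.normal(30.0, 3.0, n),   # Softpot membrane position sensor
    "label": rng.choice(["H", "S"], n),  # hardness class label
})

# Individual-sensor and combined-sensor feature sets, mirroring the
# configurations described in the caption.
configurations = {
    "F": df[["F"]],
    "V": df[["V"]],
    "P": df[["P"]],
    "FV": df[["F", "V"]],
    "FVP": df[["F", "V", "P"]],
}
print(sorted(configurations))  # → ['F', 'FV', 'FVP', 'P', 'V']
```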
Figure 4. Design-of-experiment flow for collecting data from the COTS sensors and preparing the different configuration datasets for machine learning analysis. Object preparation is based on the qualitative Shore hardness scale adopted in this study, highlighting the three to four classes considered: H (hard), S (soft), F (flexible), and ES (extra soft). Objects representing each class are shown on the scale, with silicone rubber representing soft, TPU representing flexible, and wood and PLA representing hard materials.
Figure 5. (a) Overview of the experimental setup, showing the connections among the tools used to develop the prototype; (b) the complete experimental setup, comprising three distinct systems: the gripper system, the control system, and the pressure system. Three different tactile sensors are positioned on the side of the gripper to collect data across the various objects. The collected data were saved on the Raspberry Pi in CSV format; each sensor–object dataset contains around 200 samples.
Figure 6. Binary classification (H,S) outcomes across the different sensor data configurations and combinations, using multiple algorithms. Each algorithm is evaluated on the accuracy of its predictions on test data, which constitute 20% of the overall dataset and remain unseen during training.
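A minimal sketch of the kind of evaluation described above, using scikit-learn with a 20% held-out test split and accuracy as the metric. The synthetic features, class means, and choice of classifiers are assumptions for illustration, not the authors' actual pipeline or data.

```python
# Compare several classifiers on a synthetic binary hardness task (H vs. S)
# with a stratified 80/20 train/test split, scoring each by test accuracy.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 400
# Hypothetical pattern: hard objects yield higher force, less deformation.
X_hard = rng.normal([6.0, 0.1, 28.0], [0.5, 0.02, 2.0], (n // 2, 3))
X_soft = rng.normal([3.0, 0.3, 35.0], [0.5, 0.02, 2.0], (n // 2, 3))
X = np.vstack([X_hard, X_soft])
y = np.array(["H"] * (n // 2) + ["S"] * (n // 2))

# 20% of the data is held out, unseen during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

scores = {}
for name, clf in {
    "RFC": RandomForestClassifier(random_state=0),
    "SVM": SVC(),
    "KNN": KNeighborsClassifier(),
}.items():
    clf.fit(X_train, y_train)
    scores[name] = accuracy_score(y_test, clf.predict(X_test))

print(scores)
```

With well-separated synthetic classes like these, all three classifiers score near 100%; on real sensor data the spread between algorithms is what the figure compares.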
Figure 7. Ternary classification (H,S,F) outcomes across the different sensor data configurations and combinations, using multiple algorithms. Each algorithm is evaluated on the accuracy of its predictions on test data, which constitute 20% of the overall dataset and remain unseen during training.
Figure 8. Multiclass hardness classification accuracy across the various sensor configurations. The best-performing configuration, using all three sensors (F,V,P), achieved the highest accuracy, as shown by the results of the random forest classifier (RFC) algorithm.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.