Tactile Object Recognition for Humanoid Robots Using New Designed Piezoresistive Tactile Sensor and DCNN
Abstract
1. Introduction
2. Related Work
2.1. Tactile Sensor Array
2.2. Tactile Image Recognition for the Humanoid Robot
3. Methodology and Experimental Setup
3.1. Sensor Design and Result
3.1.1. Sensor Resistance and Sensitivity
3.1.2. Sensor Hysteresis and Reproducibility
3.2. Humanoid Robot Hand with Sensor
3.3. Haptic Random Exploration and Objects
3.4. Tactile Image Acquisition
3.4.1. Tactile Image
3.4.2. Resolution Enhancement
3.5. DCNN
3.5.1. Transfer Learning Method
3.5.2. DCNN Parameters
4. Results and Discussion
4.1. Recognition Rate from Resolution Enhancement
4.2. Recognition Results from DCNNs
4.3. Recognition Rate from Multimodal DCNNs
4.4. Recognition Rate from Object Exploration
4.5. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Seward, D.W.; Bradshaw, A.; Margrave, F. The anatomy of a humanoid robot. Robotica 1996, 14, 437–443.
- Wu, Z.; Song, E.; Shen, F.; Xu, D.; Fang, B. The Biological Inspired Somatic Neuron Design and its. In Proceedings of the 2005 IEEE International Conference on Information Acquisition, Hong Kong, China, 27 June–3 July 2005.
- Salisbury, J.K.; Craig, J.J. Articulated hands: Force control and kinematic issues. Int. J. Robot. Res. 1982, 1, 4–17.
- Crisman, J.D.; Kanojia, C.; Zeid, I. Graspar: A flexible, easily controllable robotic hand. IEEE Robot. Autom. Mag. 1996, 3, 32–38.
- Kawasaki, H.; Mouri, T. Humanoid robot hand and its applied research. J. Robot. Mechatron. 2019, 31, 16–26.
- Lederman, S.; Klatzky, R. Haptic perception: A tutorial. Atten. Percept. Psychophys. 2009, 71, 1439–1459.
- Iskarous, M.; Thakor, N. E-skins: Biomimetic sensing and encoding for upper limb prostheses. Proc. IEEE 2019, 107, 2052–2064.
- Saudabayev, A.; Varol, H.A. Sensors for Robotic Hands: A Survey of State-of-the-Art. IEEE Access 2015, 3, 1765–1782.
- Almassri, A.M.; Hasan, W.Z.W.; Ahmad, S.A.; Ishak, A.J.; Ghazali, A.M.; Talib, A.M.; Wada, C. Pressure Sensor: State of the Art, Design, and Application for Robotic Hand. J. Sens. 2015, 2015, 846487.
- Kappassov, Z.; Corrales, J.A.; Perdereau, V. Tactile sensing in dexterous robot hands—Review. Robot. Auton. Syst. 2015, 74, 195–220.
- Dahiya, R.S.; Metta, G.; Valle, M.; Sandini, G. Tactile sensing—From humans to humanoids. IEEE Trans. Robot. 2010, 26, 1–20.
- Hellman, R.B. Haptic Perception, Decision-making, and Learning for Manipulation with Artificial Hands. Ph.D. Thesis, Arizona State University, Tempe, AZ, USA, August 2016.
- Tsutsui, H.; Murashima, Y.; Honma, N.; Akazawa, K. Robot hand with soft tactile sensors and underactuated control. In Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013.
- Robertsson, L.; Iliev, B.; Palm, R.H.; Wide, P. Perception modeling for human-like artificial sensor systems. Int. J. Hum.-Comput. Stud. 2007, 65, 446–459.
- Luo, S.; Bimbo, J.; Dahiya, R.; Liu, H. Robotic tactile perception of object properties: A review. Mechatronics 2017, 48, 54–67.
- Zou, L.; Ge, C.; Wang, Z.J.; Cretu, E.; Li, X. Novel Tactile Sensor Technology and Smart Tactile Sensing Systems: A Review. Sensors 2017, 17, 2653.
- Salim, A.; Lim, S. Review of Recent Inkjet-Printed Capacitive Tactile Sensors. Sensors 2017, 17, 2593.
- Seminara, L.; Gastaldo, P.; Watt, S.J.; Valyear, K.F.; Zuher, F.; Mastrogiovanni, F. Active Haptic Perception in Robots: A Review. Front. Neurorobotics 2019, 13, 1–20.
- Shimonomura, K. Tactile Image Sensors Employing Camera: A Review. Sensors 2019, 19, 3933.
- Al-Handarish, Y.; Omisore, O.M.; Igbe, T.; Han, S.; Li, H.; Du, W.; Zhang, J.; Wang, L. A Survey of Tactile-Sensing Systems and Their Applications in Biomedical Engineering. Adv. Mater. Sci. Eng. 2020, 2020, 4047937.
- Weiss, K.; Worn, H. The working principle of resistive tactile sensor cells. In Proceedings of the IEEE International Conference Mechatronics and Automation, Niagara Falls, ON, Canada, 29 July–1 August 2005.
- Stassi, S.; Cauda, V.; Canavese, G.; Pirri, C.F. Flexible Tactile Sensing Based on Piezoresistive Composites: A Review. Sensors 2014, 14, 5296–5332.
- Wang, X.; Zhong, Y.; Sun, Y.; Li, X. A flexible capacitive tactile sensing array for pressure measurement. In Proceedings of the IEEE International Conference on Robotics and Biomimetics, Bali, Indonesia, 5–10 December 2014.
- Attar, I.; Altintig, K.S.; Bozyel, I.; Gokcen, D. Design of A Highly Sensitive, Flexible and Stretchable Tactile Sensor for Electronic Skin Applications. In Proceedings of the IEEE International Conference on Flexible and Printable Sensors and Systems (FLEPS), Glasgow, UK, 8–10 July 2019.
- Chuang, C.; Dong, W.; Lo, W. Flexible Piezoelectric Tactile Sensor with Structural Electrodes Array for Shape Recognition System. In Proceedings of the 3rd International Conference on Sensing Technology, Tainan, Taiwan, 30 November–3 December 2008.
- Lee, J.; Choi, W.; Yoo, Y.K.; Hwang, K.S.; Lee, S.; Kang, S.; Kim, J.; Lee, J.H. A Micro-Fabricated Force Sensor Using an All Thin Film Piezoelectric Active Sensor. Sensors 2014, 14, 22199–22207.
- Jiang, H.; Yan, Y.; Zhu, X.; Zhang, C. A 3-D Surface Reconstruction with Shadow Processing for Optical Tactile Sensors. Sensors 2018, 18, 2785.
- Yussof, H.; Abdullah, S.C.; Ohka, M. Development of Optical Three-Axis Tactile Sensor and its Application to Robotic Hand for Dexterous Manipulation Tasks. In Proceedings of the 2010 Fourth Asia International Conference on Mathematical/Analytical Modelling and Computer Simulation, Kota Kinabalu, Malaysia, 26–28 May 2010.
- Palli, G.; Pirozzi, S. A Tactile-Based Wire Manipulation System for Manufacturing Applications. Robotics 2019, 8, 46.
- You, Z.; Chen, Y. The Use of Tactile Sensors and PIV Analysis for Understanding the Bearing Mechanism of Pile Groups. Sensors 2018, 18, 476.
- Giovanelli, D.; Farella, E. Force Sensing Resistor and Evaluation of Technology for Wearable Body Pressure Sensing. J. Sens. 2016, 2016, 9391850.
- Nakamoto, H.; Kobayashi, F.; Nobuaki, I.; Hidenori, S.; Kojima, F. Universal robot hand equipped with tactile and joint torque sensors—Development and experiments on stiffness and object recognition. Syst. Cybern. Inform. 2007, 5, 79–84.
- Chang, W.; Fang, T.; Yeh, S.; Lin, Y. Flexible Electronics Sensors for Tactile Multi-Touching. Sensors 2009, 9, 1188–1203.
- van den Heever, D.J.; Schreve, K.; Scheffer, C. Tactile Sensing Using Force Sensing Resistors and a Super-Resolution Algorithm. IEEE Sens. J. 2009, 9, 29–35.
- Castellanos-Ramos, J.; Navas-González, R.; Macicior, H.; Sikora, T.; Ochoteco, E.; Vidal-Verdú, F. Tactile Sensors Based on Conductive Polymers. Microsyst. Technol. 2010, 16, 765–776.
- Drimus, A.; Kootstra, G.; Bilberg, A.; Kragic, D. Classification of rigid and deformable objects using a novel tactile sensor. In Proceedings of the International Conference on Advanced Robotics (ICAR), Tallinn, Estonia, 20–23 June 2011.
- Wang, H.; Zhou, D.; Cao, J. Development of a Skin-Like Tactile Sensor Array for Curved Surface. IEEE Sens. J. 2014, 14, 55–61.
- Khan, S.; Tinku, S.; Lorenzelli, L.; Dahiya, R.S. Flexible tactile sensors using screen-printed P(VDF-TrFE) and MWCNT/PDMS composites. IEEE Sens. J. 2014, 15, 3146–3155.
- Liu, Y.; Hsiao, Y.; Cheng, W.; Liu, Y.; Su, J. Low-Resolution Tactile Image Recognition for Automated Robotic Assembly Using Kernel PCA-Based Feature Fusion and Multiple Kernel Learning-Based Support Vector Machine. Math. Probl. Eng. 2014, 2014, 497275.
- Wang, F.; Song, Y.; Zhang, Z.; Chen, W. Structure Analysis and Decoupling Research of a Novel Flexible Tactile Sensor Array. J. Sens. 2015, 2015, 476403.
- Tsai, Y.; Ma, C.; Lin, Y.; Yang, Y. Development of a Large-Area 8 × 8 Tactile Sensing Array with High Sensitivity. Sens. Mater. 2017, 29, 303–309.
- Gerlach, C.; Sanli, D.; Ramalingame, R.; Kanoun, O. Flexible, dynamic piezoresistive sensor matrix based on carbon nanotube polymer composite for pressure distribution measurement. In Proceedings of the AMA Conferences 2017—SENSOR 2017 and IRS2 2017, Nürnberg, Germany, 30 May–1 June 2017.
- Pizarro, F.; Villavicencio, P.; Yunge, D.; Rodríguez, M.; Hermosilla, G.; Leiva, A. Easy-to-Build Tactile Pressure Sensor. Sensors 2018, 18, 1190.
- Ramalingame, R.; Hu, Z.; Gerlach, C.; Rajendran, D.; Zubkova, T.; Baumann, R.; Kanoun, O. Flexible piezoresistive sensor matrix based on a carbon nanotube PDMS composite for dynamic pressure distribution measurement. J. Sens. Sens. Syst. 2019, 8, 1–7.
- Saccomandi, P.; Schena, E.; Oddo, C.M.; Zollo, L.; Silvestri, S.; Guglielmelli, E. Microfabricated Tactile Sensors for Biomedical Applications: A Review. Biosensors 2014, 4, 422–448.
- Huang, C.; Wang, Q.; Zhao, M.; Chen, C.; Pan, S.; Yuan, M. Tactile Perception Technologies and Their Applications in Minimally Invasive Surgery: A Review. Front. Physiol. 2020, 11, 611596.
- Schneider, A.; Sturm, J.; Stachniss, C.; Reisert, M.; Burkhardt, H.; Burgard, W. Object Identification with Tactile Sensors using Bag-of-Features. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009.
- Pezzementi, Z.; Plaku, E.; Reyda, C.; Hager, G.D. Tactile-Object Recognition From Appearance Information. IEEE Trans. Robot. 2011, 27, 473–487.
- Bhattacharjee, T.; Rehg, J.M.; Kemp, C.C. Haptic classification and recognition of objects using a tactile sensing forearm. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 7–12 October 2012.
- Liu, H.; Greco, J.; Song, X.; Bimbo, J.; Seneviratne, L.; Althoefer, K. Tactile image based contact shape recognition using neural network. In Proceedings of the 2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), Hamburg, Germany, 13–15 September 2012.
- Datta, S.; Khasnobish, A.; Konar, A.; Tibarewala, D.N.; Janarthanan, R. Performance Analysis of Object Shape Classification and Matching from Tactile Images Using Wavelet Energy Features. Procedia Technol. 2013, 10, 805–812.
- Luo, S.; Mou, W.; Althoefer, K.; Liu, H. Novel Tactile-SIFT Descriptor for Object Shape Recognition. IEEE Sens. J. 2015, 15, 5001–5009.
- Cretu, A.; Oliveira, A.E.A.; Fonseca, V.P.; Tawbe, B.; Petriu, M.; Groza, V.C. Computational Intelligence and Mechatronics Solutions for Robotic Haptic Object Recognition. In Proceedings of the 2015 IEEE 9th International Symposium on Intelligent Signal Processing (WISP), Siena, Italy, 15–17 May 2015.
- Schmitz, A.; Bansho, Y.; Noda, K.; Iwata, H.; Ogata, T.; Sugano, S. Tactile Object Recognition using Deep Learning and Dropout. In Proceedings of the 2014 IEEE-RAS International Conference on Humanoid Robots, Madrid, Spain, 18–20 November 2014.
- Liu, H.; Guo, D.; Sun, F. Object Recognition Using Tactile Measurements: Kernel Sparse Coding Methods. IEEE Trans. Instrum. Meas. 2016, 65, 656–665.
- Luo, S.; Mou, W.; Althoefer, K.; Liu, H. Iterative Closest Labeled Point for tactile object shape recognition. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016.
- Albini, A.; Denei, S.; Cannata, G. Human hand recognition from robotic skin measurements in human-robot physical interactions. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017.
- Gandarias, J.M.; Gómez-de-Gabriel, J.M.; García-Cerezo, A. Human and object recognition with a high-resolution tactile sensor. In Proceedings of the 2017 IEEE SENSORS, Glasgow, UK, 29 October–1 November 2017.
- Gandarias, J.M.; García-Cerezo, A.; Gómez-de-Gabriel, J.M. CNN-Based Methods for Object Recognition With High-Resolution Tactile Sensors. IEEE Sens. J. 2019, 19, 6872–6882.
- Zhang, X.F.; Cai, A.J.; Zhao, Y.L. Experimental Investigation of Measurement Error in Networked Resistors Arrays Based on Zero Potential Method. Int. J. Precis. Eng. Manuf. 2018, 19, 473–479.
- Keys, R. Cubic convolution interpolation for digital image processing. IEEE Trans. Acoust. Speech Signal Process. 1981, 29, 1153–1160.
- Chen, Y.; Yang, R.; Zhao, N.; Zhu, W.; Huang, Y.; Zhang, R.; Chen, X.; Liu, J.; Liu, W.; Zuo, Z. Concentration Quantification of Oil Samples by Three-Dimensional Concentration-Emission Matrix (CEM) Spectroscopy. Appl. Sci. 2020, 10, 315.
- Azulay, A.; Weiss, Y. Why do deep convolutional networks generalize so poorly to small image transformations? J. Mach. Learn. Res. 2019, 20, 1–25.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. In Proceedings of the 25th International Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–6 December 2012; pp. 1097–1105.
- Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv 2014, arXiv:1409.1556.
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016.
- Zhou, B.; Khosla, A.; Lapedriza, A.; Torralba, A.; Oliva, A. Places: An Image Database for Deep Scene Understanding. J. Vis. 2016, 17.
- Szegedy, C.; Vanhoucke, V.; Ioffe, S.; Wojna, Z.; Shlens, J. Rethinking the Inception Architecture for Computer Vision. In Proceedings of the Computer Vision and Pattern Recognition 2016, Las Vegas, NV, USA, 27–30 June 2016.
- Tan, M.; Le, Q.V. EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks. In Proceedings of the 36th International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019.
- Iandola, F.N.; Han, S.; Moskewicz, M.W.; Ashraf, K.; Dally, W.J.; Keutzer, K. SqueezeNet: AlexNet-level accuracy with 50× fewer parameters and <0.5 MB model size. arXiv 2016, arXiv:1602.07360.
- Szegedy, C.; Ioffe, S.; Vanhoucke, V.; Alemi, A. Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning. In Proceedings of the AAAI Conference on Artificial Intelligence, Phoenix, AZ, USA, 23 February 2016.
- Huang, G.; Liu, Z.; van der Maaten, L.; Weinberger, K.Q. Densely Connected Convolutional Networks. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017.
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016.
- Redmon, J.; Farhadi, A. YOLO9000: Better, Faster, Stronger. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017.
- Chollet, F. Xception: Deep Learning with Depthwise Separable Convolutions. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017.
- Zoph, B.; Vasudevan, V.; Shlens, J.; Le, Q.V. Learning Transferable Architectures for Scalable Image Recognition. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018.
- Ma, N.; Zhang, X.; Zheng, H.; Sun, J. ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design. In Proceedings of the European Conference on Computer Vision ECCV 2018, Munich, Germany, 8–14 September 2018.
- Sandler, M.; Howard, A.; Zhu, M.; Zhmoginov, A.; Chen, L.C. MobileNetV2: Inverted Residuals and Linear Bottlenecks. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018.
- Pan, S.J.; Yang, Q. A Survey on Transfer Learning. IEEE Trans. Knowl. Data Eng. 2010, 22, 1345–1359.
- Yang, L.; Jiang, D.; Han, W.; Sahli, H. DCNN and DNN Based Multi-modal Depression Recognition. In Proceedings of the 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), San Antonio, TX, USA, 23–26 October 2017.
- Gao, J.; Li, P.; Chen, Z.; Zhang, J. A Survey on Deep Learning for Multimodal Data Fusion. Neural Comput. 2020, 32, 829–864.
Year | Sensor Resolution (Pixel) | Physical Size (mm) | Sensor Size (mm) | Transducer | Application |
---|---|---|---|---|---|
2007 [32] | 9 × 8 | 20 × 30 | 1.8 × 3.4 | Conductive rubber | Finger |
2009 [33] | 25 × 25 | 150 × 150 | 5.0 × 5.0 | Organic resistance | Flex e-Skin |
2009 [34] | 5 × 5 | 32 × 32 | 50 mm2 | FSR | Touch sensing |
2010 [35] | 16 × 16 | 50 × 50 | 3.44 mm2 | Conductive polymer | Flex e-Skin |
2011 [36] | 8 × 8 | 20 × 20 | 2.5 × 2.5 | Conductive rubber | Gripper Finger |
2011 [37] | 8 × 16 | 70 × 120 | 3.0 × 3.0 | Nickel powder + PDMS | Flex e-Skin |
2014 [38] | 8 × 8 | - | 3.0 × 3.0 | P(VDF-TrFE), MWCNT/PDMS | Flex e-Skin |
2014 [39] | 10 × 10 | 20 × 20 | 1.3 × 1.3 | Nanoparticles of carbon and silica | Finger |
2015 [40] | 3 × 3 | 50 × 50 | 10 × 10 | Conductive pillars | Flex e-Skin |
2017 [41] | 8 × 8 | 300 × 220 | 37.5 × 2.5 | Conductive polymer | e-Skin |
2017 [42] | 8 × 8 | 100 × 100 | 480 mm2 | Polymer composite | e-Skin |
2018 [43] | 1 × 1 | 20 × 25 | 20 × 25 | Conductive polymer | Finger tip |
2019 [44] | 4 × 4 | 10 × 10 | 1900 mm2 | Polymer composite | e-Skin |
Year | Sensor Resolution (Pixel) | Number of Class | Descriptor | Classification Method | Number of Grasps | Recognition Rate (%) |
---|---|---|---|---|---|---|
2009 [47] | 6 × 14 (2ea) | 21 | All Data Vector | BoW | 10 | 84.6 |
2011 [36] | 8 × 8 (2ea) | 10 | Mean, SD | kNN | 1 | 92.0 |
2011 [48] | 6 × 6 | 5 | Polar Fourier | PCA, BoW | >50 | 90.0 |
2012 [49] | 24 × 16 | 4 | Maximum Vector | PCA, kNN | 1 | 81.4 |
2012 [50] | 5 × 9, 12 × 10 | 4 | 3 × 3 Segmentation | ANN | 1 | 91.0 |
2013 [51] | 32 × 32 | 10 | Haar Wavelet | kNN | 1 | 86.0 |
2014 [39] | 10 × 10 | 12 | K-PCA, FD, GE | MKL-SVM | 1 | 85.54 |
2015 [52] | 6 × 14 | 18 | Segmentation SIFT | BoW | 15 | 89.9 |
2015 [53] | 16 × 16 | 25 | Reduced Vector | ANN | 1 | 96.0 |
2015 [54] | 108+133 | 20 | All Data Vector | DNN | 1 | 91.1 |
2016 [55] | 8 × 3 (3ea), 7 × 4 | 7 | All Data Vector | KSC | 1 | 94.0 |
2016 [56] | 6 × 14 | 20 | Zernike moment | iCLAP | 20 | 85.36 |
2017 [57] | 768 | 2 | AlexNet-DCNN | DCNN | 1 | 98.3 |
2017 [58] | 28 × 50 | 8 | AlexNet-DCNN | DCNN | 1 | 91.6 |
2019 [59] | 28 × 50 | 22 | ResNet-DCNN | DCNN | 1 | 95.36 |
Class | Object | Size |
---|---|---|
1. | Battery | ∅ = 33 mm, H = 60 mm |
2. | Remote controller | L = 172 mm, W = 45 mm, H = 15 mm |
3. | Plastic tongs | L = 165 mm, W = 85 mm, H = 15 mm |
4. | Screwdriver | ∅ = 38 mm, L = 220 mm |
5. | Coffee Cup | ∅ = 85 mm, W = 120 mm, H = 95 mm |
6. | Scissors | L = 18 mm, W = 80 mm, H = 12 mm |
7. | Fixed wrench | L = 168 mm, W = 30 mm, H = 6 mm |
8. | Allen key | L = 90 mm, W = 35 mm, H = 6 mm |
9. | Golf ball | ∅ = 40 mm |
10. | Measuring tape | ∅ = 70 mm, W = 92 mm, H = 47 mm |
11. | Computer mouse | L = 100 mm, W = 50 mm, H = 35 mm |
12. | Brush | ∅ = 80 mm, L = 390 mm |
13. | Amp meter | ∅ = 35 mm, L = 35 mm |
14. | Cola bottle | ∅ = 60 mm, H = 22 mm |
15. | Pen | ∅ = 12 mm, L = 147 mm |
16. | Charger | L = 90 mm, W = 70 mm, H = 22 mm |
17. | Soda Bottle | ∅ = 65 mm, H = 155 mm |
18. | Variable wrench | L = 200 mm, W = 60 mm, H = 14 mm |
19. | Water bottle | ∅ = 60 mm, H = 175 mm |
20. | Cream bottle | ∅ = 75 mm, H = 160 mm |
Parameters | Values |
---|---|
NumEpochs | 30 |
NumBatchSize | 16 |
Momentum | 0.9 |
LearnRateDropFactor | 0.1 |
LearnRateDropPeriod | 8 |

Initial Learning Rate | DCNN Models
---|---
0.00001 | SqueezeNet
0.0001 | VGGNet16, VGGNet19, DarkNet19, DarkNet53
0.001 | AlexNet, GoogLeNet, Places365GoogLeNet, EfficientNetB0
0.01 | ResNet18, ResNet50, ResNet101, InceptionV3, InceptionResNetV2, DenseNet201, XceptionNet, NASNetMobile, ShuffleNet, MobileNetV2
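
The training parameters above imply a piecewise-constant learning-rate schedule: the rate is multiplied by LearnRateDropFactor (0.1) every LearnRateDropPeriod (8) epochs, starting from the model-specific initial rate. The minimal Python sketch below illustrates that schedule; the function name and 0-based epoch indexing are assumptions for illustration, not the authors' code.

```python
def learning_rate(epoch, initial_lr, drop_factor=0.1, drop_period=8):
    """Learning rate in effect during a given (0-based) epoch under a
    step-decay schedule: multiply by drop_factor every drop_period epochs."""
    return initial_lr * drop_factor ** (epoch // drop_period)

# Example: a model trained with an initial rate of 0.001 over 30 epochs
# uses 0.001 for epochs 0-7, then 1e-4 for epochs 8-15, and so on.
schedule = [learning_rate(e, 0.001) for e in range(30)]
```

With 30 epochs and a drop period of 8, the rate steps down three times over a full run, ending three orders of magnitude below its initial value.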
Model | Min (16 × 16) | Max (16 × 16) | Mean (16 × 16) | SD (16 × 16) | Min (512 × 512) | Max (512 × 512) | Mean (512 × 512) | SD (512 × 512)
---|---|---|---|---|---|---|---|---
AlexNet | 76.40 | 80.55 | 77.66 | 1.27 | 84 | 85.2 | 84.42 | 0.40 |
VGG16 | 79.95 | 81.85 | 80.92 | 0.58 | 85.7 | 88.35 | 86.94 | 0.78 |
VGG19 | 80.10 | 84.20 | 81.84 | 1.11 | 86.4 | 88.6 | 87.62 | 0.66 |
GoogLeNet | 85.15 | 86.30 | 85.57 | 0.36 | 89.2 | 91.01 | 90.09 | 0.59 |
ResNet18 | 80.05 | 83.00 | 81.36 | 0.99 | 85.25 | 87.25 | 86.5 | 0.61 |
ResNet50 | 81.55 | 84.60 | 82.87 | 0.95 | 86.1 | 88.55 | 87.59 | 0.82 |
ResNet101 | 81.75 | 85.40 | 83.61 | 1.12 | 86.75 | 89.65 | 88.07 | 0.85 |
Places365GoogLeNet | 74.80 | 81.00 | 78.40 | 1.84 | 87.2 | 89.75 | 88.56 | 0.80
InceptionNetV3 | 85.25 | 87.45 | 86.47 | 0.75 | 90.1 | 91.8 | 91.06 | 0.50 |
EfficientNetB0 | 71.25 | 76.95 | 74.18 | 2.01 | 71.25 | 76.95 | 74.18 | 2.01
SqueezeNet | 39.02 | 41.50 | 40.21 | 1.71 | 45.5 | 51.6 | 48.33 | 1.71 |
InceptionResNetV2 | 84.55 | 87.20 | 85.15 | 0.91 | 90.7 | 93.05 | 91.86 | 0.70 |
DarkNet19 | 61.60 | 68.05 | 65.01 | 1.74 | 75 | 78.8 | 77.36 | 1.17 |
DarkNet53 | 77.30 | 81.55 | 79.44 | 1.43 | 83.85 | 86.2 | 85.42 | 0.90 |
DenseNet201 | 82.40 | 86.10 | 83.79 | 1.15 | 87.9 | 89.6 | 88.69 | 0.56 |
XceptionNet | 82.45 | 86.15 | 84.48 | 1.06 | 91 | 91.85 | 91.4 | 0.29 |
NASNetMobile | 85.10 | 87.35 | 86.36 | 0.69 | 84.35 | 91.6 | 89.72 | 2.08 |
ShuffleNet | 81.65 | 83.80 | 82.65 | 0.78 | 87.25 | 88.8 | 88.16 | 0.55 |
MobileNetV2 | 82.30 | 85.55 | 83.92 | 0.92 | 87.35 | 89.9 | 88.23 | 0.73 |
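
The accuracy gap between the 16 × 16 and 512 × 512 columns reflects the resolution-enhancement step: the raw 16 × 16 tactile images are interpolated up to 512 × 512 before classification (the paper cites Keys' cubic convolution interpolation). As a hedged, self-contained illustration, the pure-Python sketch below upscales with bilinear interpolation, a simpler stand-in for the bicubic method; the function name and sample data are illustrative only.

```python
def upscale_bilinear(img, out_h, out_w):
    """Upscale a 2-D list of pressure values with bilinear interpolation."""
    in_h, in_w = len(img), len(img[0])
    out = []
    for i in range(out_h):
        # Map the output row back into source-grid coordinates.
        y = i * (in_h - 1) / (out_h - 1)
        y0 = int(y); y1 = min(y0 + 1, in_h - 1); fy = y - y0
        row = []
        for j in range(out_w):
            x = j * (in_w - 1) / (out_w - 1)
            x0 = int(x); x1 = min(x0 + 1, in_w - 1); fx = x - x0
            # Blend the four surrounding taxels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# A synthetic 16 x 16 tactile frame, enlarged to the 512 x 512 input size.
tactile = [[(r * 16 + c) % 7 for c in range(16)] for r in range(16)]
big = upscale_bilinear(tactile, 512, 512)
print(len(big), len(big[0]))  # 512 512
```

Corner taxels map exactly onto the corner output pixels, so the enhanced image preserves the original contact footprint while giving the DCNNs the input resolution they expect.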
Model | Min | Max | Mean | SD
---|---|---|---|---
InceptionResNetV2/XceptionNet | 91.85 | 93.35 | 92.73 | 0.51 |
InceptionResNetV2/InceptionNetV3 | 91.60 | 93.05 | 92.56 | 0.49 |
InceptionResNetV2/GoogLeNet | 91.35 | 93.05 | 92.46 | 0.47 |
InceptionResNetV2/NASNetMobile | 90.80 | 93.10 | 92.29 | 0.82 |
InceptionResNetV2/DenseNet201 | 91.35 | 92.70 | 91.93 | 0.43
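
Each row above pairs two pretrained DCNNs. The paper's exact fusion architecture is not reproduced here; a common decision-level scheme, assumed purely for illustration, averages the two networks' class-probability vectors and takes the argmax:

```python
def fuse_predictions(probs_a, probs_b):
    """Average two per-class probability vectors (e.g. softmax outputs of
    InceptionResNetV2 and XceptionNet) and return the winning class index."""
    fused = [(pa + pb) / 2 for pa, pb in zip(probs_a, probs_b)]
    return max(range(len(fused)), key=fused.__getitem__)

# One network slightly prefers class 2, the other strongly prefers class 0;
# the fused vote goes to class 0.
net_a = [0.30, 0.30, 0.40]
net_b = [0.70, 0.20, 0.10]
print(fuse_predictions(net_a, net_b))  # 0
```

Averaging lets a confident network outvote an uncertain one, which is consistent with the table: fused pairs exceed the best single-model means reported earlier.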
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Pohtongkam, S.; Srinonchat, J. Tactile Object Recognition for Humanoid Robots Using New Designed Piezoresistive Tactile Sensor and DCNN. Sensors 2021, 21, 6024. https://doi.org/10.3390/s21186024