Mixed Reality-Enhanced Intuitive Teleoperation with Hybrid Virtual Fixtures for Intelligent Robotic Welding
Abstract
1. Introduction
2. Related Work
3. Materials and Methods
3.1. Welding Skill Extraction System Design
3.2. MRVF Tele-Welding System Overview
3.3. MRVF Visual/Haptic Workspace
3.4. Welding Experiments
3.4.1. Experimental Design
3.4.2. On-Site Welding Experiment
3.4.3. Tele-Welding Experiment
- Baseline: Perform the tele-welding operation with a non-immersive display using monoscopic streams (Figure 4a). The display was a standard 27-inch PC monitor showing the 2D video transmitted from the monoscopic camera mounted on the welding robot. The welder manipulated the master haptic device to control the welding robot, with no haptic feedback rendered. The non-immersive 2D display served as the baseline condition because it is the visual feedback commonly used in typical remote-controlled welding systems.
- MRnoVF: Conduct the tele-welding task with the immersive MR-HMD, with the monocular camera images overlaid on top of the virtual workpiece (Figure 4c). The MRnoVF scheme is a reduced version of the proposed MRVF module, as it does not provide the participants with haptic cues to support hand maneuvering. The haptic device commanded the UR5 arm for welding but rendered no force feedback to the operator.
- MRVF: Perform the tele-welding task with combined planar prevention and conical guidance haptic cues in the immersive MR workspace (Figure 4d). The user maneuvered the haptic device within the constraints imposed by the guidance and prevention VFs while welding with the remotely located robot, and inspected the real-time pose of the physical welding robot via its scaled virtual replica in the scene. A force-computation sketch for these hybrid fixtures is given after this list.
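The hybrid fixture combines a prevention (forbidden-region) plane near the workpiece surface with a guidance cone that funnels the torch tip toward the weld seam. The paper's controller code is not reproduced here; the following is a minimal Python sketch of how such a combined haptic force could be computed, assuming a simple spring-penalty model. The stiffness values and geometry parameters (`K_PLANE`, `K_CONE`, `half_angle`, etc.) are illustrative, not the authors' actual settings.

```python
import numpy as np

# Illustrative gains; not the paper's actual parameters.
K_PLANE = 800.0   # N/m, stiffness of the prevention (forbidden-region) plane
K_CONE = 300.0    # N/m, stiffness of the conical guidance fixture

def prevention_plane_force(p, plane_point, plane_normal, k=K_PLANE):
    """Push the tool tip back if it penetrates the plane (normal points to the allowed side)."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = np.dot(p - plane_point, n)                   # signed distance to the plane
    return -k * d * n if d < 0.0 else np.zeros(3)

def guidance_cone_force(p, apex, axis, half_angle, k=K_CONE):
    """Pull the tool tip back inside a cone whose apex sits at the weld target."""
    a = axis / np.linalg.norm(axis)
    v = p - apex
    axial = np.dot(v, a)                             # depth along the cone axis
    radial_vec = v - axial * a
    radial = np.linalg.norm(radial_vec)
    allowed = np.tan(half_angle) * max(axial, 0.0)   # cone radius at this depth
    if radial <= allowed or radial < 1e-9:
        return np.zeros(3)
    # Spring force toward the cone surface, acting radially inward.
    return -k * (radial - allowed) * (radial_vec / radial)

def hybrid_fixture_force(p, plane_point, plane_normal, apex, axis, half_angle):
    """Sum of prevention and guidance forces rendered on the haptic device."""
    return (prevention_plane_force(p, plane_point, plane_normal)
            + guidance_cone_force(p, apex, axis, half_angle))
```

A real controller would additionally apply damping and force saturation and stream the resulting force to the haptic device at every servo cycle; those details are omitted from this sketch.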
4. Results
4.1. Onsite Welding Results
4.2. Tele-Welding Results
4.2.1. Objective Measures
4.2.2. Subjective Measures
4.3. Limitations
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Measured Groups | X-Direction Variance | X-Direction RMSE | Z-Direction Variance | Z-Direction RMSE | Y-Direction Mean Velocity |
|---|---|---|---|---|---|
| Skilled Welder | 0.26 | 0.58 | 0.57 | 0.90 | 3.07 |
| Novice Welder | 0.96 | 1.17 | 1.40 | 1.27 | 3.17 |
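The table summarizes the on-site hand-motion measures: variance and RMSE of the torch-tip motion in the X and Z directions and the mean travel velocity along Y. The paper's exact post-processing is not reproduced here; the following is a minimal Python sketch under the assumptions that RMSE is computed against the reference seam trajectory and that velocity is obtained by finite differences (the function and argument names are illustrative).

```python
import numpy as np

def welding_motion_statistics(positions, reference, dt):
    """Per-axis variance/RMSE of the torch-tip path and mean travel speed along Y.

    positions : (N, 3) array of sampled torch-tip coordinates, columns = (X, Y, Z)
    reference : (N, 3) array of matching points on the reference seam trajectory
    dt        : sampling period in seconds
    """
    error = positions - reference
    variance = positions.var(axis=0)                   # motion spread on each axis
    rmse = np.sqrt((error ** 2).mean(axis=0))          # deviation from the reference path
    speed_y = np.abs(np.diff(positions[:, 1])) / dt    # finite-difference speed along Y (travel direction)
    return {
        "variance_x": variance[0], "rmse_x": rmse[0],
        "variance_z": variance[2], "rmse_z": rmse[2],
        "mean_velocity_y": speed_y.mean(),
    }
```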
| Measure | Baseline Mean | Baseline Std. Dev. | MRnoVF Mean | MRnoVF Std. Dev. | MRVF Mean | MRVF Std. Dev. |
|---|---|---|---|---|---|---|
| Time | 46.72 | 8.75 | 42.32 | 11.40 | 18.60 | 5.37 |
| Collisions | 0.50 | 0.73 | 0.25 | 0.45 | 0.00 | 0.00 |
| Mental Demand | 80.31 | 9.41 | 75.31 | 10.08 | 45.00 | 15.71 |
| Physical Demand | 77.94 | 16.02 | 74.44 | 10.58 | 32.25 | 13.87 |
| Temporal Demand | 78.00 | 15.56 | 73.44 | 13.15 | 66.81 | 11.36 |
| Performance | 64.75 | 24.33 | 78.50 | 11.80 | 43.88 | 18.06 |
| Effort | 78.94 | 14.36 | 63.56 | 19.43 | 34.38 | 10.78 |
| Frustration | 92.81 | 5.47 | 69.94 | 12.92 | 38.31 | 11.46 |
| Average Workload | 78.79 | 5.80 | 72.53 | 5.13 | 43.44 | 4.86 |
| Usefulness | 2.25 | 0.78 | 2.69 | 1.08 | 4.19 | 1.42 |
| Ease of Use | 2.63 | 1.03 | 3.13 | 0.96 | 5.50 | 0.82 |
| TAM | 2.44 | 0.48 | 2.91 | 0.78 | 4.85 | 0.77 |
| Measure | Partial Eta Squared | F | p | Post-Hoc p: MRVF vs. MRnoVF | Post-Hoc p: MRVF vs. Baseline | Post-Hoc p: MRnoVF vs. Baseline |
|---|---|---|---|---|---|---|
| Time | 0.76 | F(1.866, 27.995) = 47.279 | <0.001 | <0.001 | <0.001 | 0.478 |
| Collisions | 0.21 | F(1.424, 21.353) = 4.091 | 0.043 | 0.123 | 0.046 | 0.783 |
| Mental Demand | 0.75 | F(1.905, 28.580) = 45.449 | <0.001 | <0.001 | <0.001 | 0.584 |
| Physical Demand | 0.79 | F(1.972, 29.580) = 57.679 | <0.001 | <0.001 | <0.001 | 1.000 |
| Temporal Demand | 0.15 | F(1.486, 22.292) = 2.594 | 0.109 | 0.222 | 0.118 | 1.000 |
| Performance | 0.49 | F(1.505, 22.581) = 14.660 | <0.001 | <0.001 | 0.061 | 0.046 |
| Effort | 0.65 | F(1.590, 23.852) = 27.782 | <0.001 | 0.001 | <0.001 | 0.156 |
| Frustration | 0.88 | F(1.703, 25.552) = 112.067 | <0.001 | <0.001 | <0.001 | <0.001 |
| Overall Workload | 0.94 | F(1.703, 25.552) = 228.777 | <0.001 | <0.001 | <0.001 | 0.023 |
| Usefulness | 0.44 | F(1.683, 25.241) = 11.719 | <0.001 | 0.017 | 0.002 | 0.559 |
| Ease of Use | 0.72 | F(1.641, 24.617) = 38.829 | <0.001 | <0.001 | <0.001 | 0.684 |
| TAM | 0.78 | F(1.916, 28.742) = 54.141 | <0.001 | <0.001 | <0.001 | 0.180 |
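The fractional degrees of freedom in the F column are consistent with a sphericity-corrected (e.g., Greenhouse-Geisser) repeated-measures ANOVA across the three display conditions, followed by pairwise post-hoc comparisons. The analysis software is not stated in this excerpt; the sketch below shows how such a table could be reproduced with the pingouin Python package, where the file name, the column names (`subject`, `condition`, `time`), and the Bonferroni post-hoc adjustment are assumptions, not the authors' documented pipeline.

```python
import pandas as pd
import pingouin as pg

# Long-format results: one row per participant x condition, e.g. for completion time.
# (File and column names are placeholders, not from the paper.)
df = pd.read_csv("tele_welding_results.csv")   # columns: subject, condition, time

# Repeated-measures ANOVA with sphericity (Greenhouse-Geisser) correction
# and partial eta-squared effect size.
aov = pg.rm_anova(data=df, dv="time", within="condition", subject="subject",
                  correction=True, effsize="np2")
print(aov[["Source", "F", "p-GG-corr", "np2"]])

# Pairwise post-hoc comparisons between the three conditions
# (Bonferroni adjustment assumed here).
posthoc = pg.pairwise_tests(data=df, dv="time", within="condition",
                            subject="subject", padjust="bonf")
print(posthoc[["A", "B", "p-corr"]])
```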
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).