Editorial

Visual Servoing in Robotics

Department of Physics, Systems Engineering and Signal Theory, University of Alicante, 03690 Alicante, Spain
Electronics 2019, 8(11), 1298; https://doi.org/10.3390/electronics8111298
Submission received: 4 October 2019 / Accepted: 1 November 2019 / Published: 6 November 2019
(This article belongs to the Special Issue Visual Servoing in Robotics)

1. Introduction

Visual servoing is a well-known approach to guiding robots using visual information. Image processing, robotics, and control theory are combined to control the motion of a robot based on the visual information extracted from images captured by one or several cameras. On the vision side, several problems are currently under research, such as the use of different kinds of image features (or different kinds of cameras), high-speed image processing, and convergence properties. Furthermore, new control schemes allow the system to behave more robustly, efficiently, or compliantly, with smaller delays. Related topics such as optimal and robust approaches, direct control, path tracking, and sensor fusion allow visual servoing systems to be applied in different domains.
In so-called image-based visual servoing systems, the control law is computed directly from visual information, so these systems do not need a complete 3D reconstruction of the environment. For tasks that require high precision, speed, or responsiveness, several works suggest that it may be beneficial to take the dynamics of the robot into account when designing the control law. Visual control systems that consider the robot dynamics in the control law are often referred to as direct or dynamic visual servoing systems. However, for simplicity, indirect visual servoing schemes are the most widely used in the literature.
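To make the image-based idea concrete, the classical IBVS control law computes a camera velocity from the image-feature error alone, v = -λ L⁺ (s - s*), where L is the interaction matrix. The sketch below is a minimal, illustrative simulation (not taken from any of the papers in this issue): the interaction matrix is assumed known and constant, which is a common textbook simplification; in practice L depends on the current feature coordinates and depth estimates.

```python
import numpy as np

def ibvs_step(s, s_star, L, gain=0.5):
    """One iteration of the classical IBVS law: v = -gain * pinv(L) @ (s - s_star)."""
    e = s - s_star                      # image-feature error
    v = -gain * np.linalg.pinv(L) @ e   # commanded camera velocity
    return v

# Toy closed loop: feature motion follows the first-order model s_dot = L @ v.
L = np.array([[1.0, 0.0],
              [0.0, 1.0]])             # illustrative 2x2 interaction matrix
s = np.array([1.0, -0.5])              # current image features
s_star = np.zeros(2)                   # desired image features
dt = 0.1
for _ in range(200):
    v = ibvs_step(s, s_star, L)
    s = s + dt * L @ v                 # integrate the feature dynamics

# The feature error decays exponentially toward zero.
```

Note that the loop above is purely kinematic; the direct/dynamic schemes discussed in this issue replace this velocity command with torque-level control laws that account for the robot dynamics.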
Nowadays, the application fields of visual servoing systems are very wide, including the navigation and localization of mobile robots, guidance of humanoid robots, robust and optimal control of robots, manipulation, intelligent transportation, deep learning and machine learning in visual servoing, and the visual guidance of field robots (aerial robots, assistive robots, medical robots, etc.).

2. The Present Issue

This Special Issue consists of eight papers covering important topics in the field of visual servoing. In [1], an enhanced switch image-based visual servoing system for a six-degree-of-freedom industrial robot is proposed, together with an image feature reconstruction algorithm based on the Kalman filter to handle feature loss during tracking. Visual servoing approaches can be applied to the guidance of different kinds of robots, such as mobile robots, aerial robots, or parallel robots. The latter is the case described in [2], where an optical coordinate measuring machine is employed for the closed-loop dynamic model identification of parallel robots; the dynamic model parameters are identified using a non-linear optimization technique. In [3], a direct image-based visual servoing system is used for the guidance of a mobile manipulator. This approach considers not only the kinematic properties of the robot but also its dynamic ones, guiding both the robot base and the manipulator arm with an optimal control approach.
This Special Issue also includes papers describing new and interesting applications of visual servoing systems, such as those in [4] and [5]. In [4], a visual servoing system is applied to an apple-picking robot: an image-based visual servo control method is adopted to control the manipulator and improve grasping accuracy during picking, while the joint control performance is improved by a proposed adaptive fuzzy neural network sliding-mode control algorithm. Additionally, in [5], a spatial trajectory optimization method for a spray-painting robot is proposed.
Visual servoing approaches are also closely related to the need to estimate parameters or variables used in the control process. For example, in [6], a vision-based method to estimate robot orientation with RGB-D cameras is proposed. In [7], a sliding perturbation observer estimates the reaction forces at the end effector and second link of a three-degree-of-freedom hydraulic servo system with master–slave manipulators under sliding-mode control; bilateral control then provides the estimated reaction force to the operator handling the master device. Finally, in [8], a novel measurement system that visualizes motion-to-photon latency with time-series data in real time is proposed.

3. Future

While visual servoing systems have been an important field of research in recent years, several major challenges remain. Tasks such as tracking, positioning, detection, segmentation, and localization play a critical role in visual servoing, and research is ongoing to increase the robustness of visual controllers. Additionally, new control approaches such as optimal, robust, dynamic, or predictive control will provide these systems with new dynamic properties. Furthermore, advances in computer vision systems, electronics, and computing hardware will offer new and interesting capabilities for applying visual servoing to new kinds of robotic systems, such as autonomous cars, humanoid robots, aerial robots, service robots, parallel robots, and space robotics.

Acknowledgments

First of all, we would like to thank all the researchers who submitted articles to this Special Issue for their excellent contributions. We are also grateful to the reviewers who helped evaluate the manuscripts and made very valuable suggestions to improve the quality of the contributions. We would like to acknowledge the editorial board of Electronics, who invited us to guest edit this Special Issue, and the Electronics Editorial Office staff, who worked thoroughly to maintain the rigorous peer-review schedule and ensure timely publication.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Ghasemi, A.; Li, P.; Xie, W.-F.; Tian, W. Enhanced Switch Image-Based Visual Servoing Dealing with Features Loss. Electronics 2019, 8, 903.
  2. Li, P.; Ghasemi, A.; Xie, W.; Tian, W. Visual Closed-Loop Dynamic Model Identification of Parallel Robots Based on Optical CMM Sensor. Electronics 2019, 8, 836.
  3. Belmonte, Á.; Ramón, J.L.; Pomares, J.; Garcia, G.J.; Jara, C.A. Optimal Image-Based Guidance of Mobile Manipulators using Direct Visual Servoing. Electronics 2019, 8, 374.
  4. Chen, W.; Xu, T.; Liu, J.; Wang, M.; Zhao, D. Picking Robot Visual Servo Control Based on Modified Fuzzy Neural Network Sliding Mode Algorithms. Electronics 2019, 8, 605.
  5. Chen, W.; Wang, X.; Liu, H.; Tang, Y.; Liu, J. Optimized Combination of Spray Painting Trajectory on 3D Entities. Electronics 2019, 8, 74.
  6. Guo, R.; Peng, K.; Zhou, D.; Liu, Y. Robust Visual Compass Using Hybrid Features for Indoor Environments. Electronics 2019, 8, 220.
  7. Kallu, K.D.; Wang, J.; Abbasi, S.J.; Lee, M.C. Estimated Reaction Force-Based Bilateral Control between 3DOF Master and Hydraulic Slave Manipulators for Dismantlement. Electronics 2018, 7, 256.
  8. Choi, S.-W.; Lee, S.; Seo, M.-W.; Kang, S.-J. Time Sequential Motion-to-Photon Latency Measurement System for Virtual Reality Head-Mounted Displays. Electronics 2018, 7, 171.

Share and Cite

Pomares, J. Visual Servoing in Robotics. Electronics 2019, 8, 1298. https://doi.org/10.3390/electronics8111298
