Article

Mobile Visual Servoing Based Control of a Complex Autonomous System Assisting a Manufacturing Technology on a Mechatronics Line

1 Department of Automation and Electrical Engineering, “Dunărea de Jos” University of Galați, 800008 Galați, Romania
2 Doctoral School of Fundamental Sciences and Engineering, “Dunărea de Jos” University of Galați, 800008 Galați, Romania
3 Department of Automation, Computer Science and Electrical Engineering, “Valahia” University of Târgoviște, 130024 Târgoviște, Romania
* Author to whom correspondence should be addressed.
Inventions 2022, 7(3), 47; https://doi.org/10.3390/inventions7030047
Submission received: 15 May 2022 / Revised: 16 June 2022 / Accepted: 19 June 2022 / Published: 22 June 2022

Abstract

The main contribution of this paper is the modeling and control of a complex autonomous system (CAS) equipped with a visual sensor for precise positioning within a technology executed on a laboratory mechatronics line. The technology allows the retrieval of workpieces that do not fully pass the quality test. Another objective of this paper is the implementation of an assisting technology for a laboratory processing/reprocessing mechatronics line (P/RML) containing four workstations, assisted by the following components: a complex autonomous system consisting of an autonomous robotic system (ARS) built on the wheeled mobile robot (WMR) PeopleBot, a robotic manipulator (RM) Cyton 1500 with seven degrees of freedom (7 DOF), and a mobile visual servoing system (MVS) with a Logitech camera as the visual sensor, used in the process of picking, transporting, and placing the workpieces. The purpose of the MVS is to increase the precision of the RM by using the look-and-move principle, since the initial and final positions of the CAS can deviate slightly from the planned trajectory, which increases the possibility of errors during the picking and releasing of the pieces. If the processed piece does not pass the quality test and is deemed defective, it is retrieved from the last station of the P/RML and transported to the first station for reprocessing. The control of the WMR is done using trajectory-tracking sliding-mode control (TTSMC). The RM control is based on the inverse kinematics model, and the MVS control is implemented with the image moments method.

1. Introduction

The main contribution of this paper is the modeling and control of a complex autonomous system (CAS) equipped with a robotic manipulator (RM) and a mobile visual sensor (MVS) to achieve precise positioning in a processing technology on a laboratory mechatronics line (ML). The presented technology allows the recovery of products, through reprocessing, if they partially satisfy the quality requirements. The mechatronics line handles the full manufacturing process of a workpiece and has buffering, handling, processing (drilling, boring), sorting, and quality-testing facilities. Because this technology allows products to be retrieved for reprocessing, the mechatronics line will be further referred to as P/RML.
This paper presents the integration of multiple subsystems for a laboratory mechatronics line: the P/RML is served by a CAS consisting of an ARS equipped with a 7 DOF RM Cyton 1500 and an eye in hand mobile visual servoing (MVS) system located on the end-effector. Task planning, simulation, monitoring, and real-time control are some of the tools used to fulfill the objectives of this study, such as workpiece recovery for reprocessing and the increase of the automation level of the P/RML, so that a human operator is no longer needed after the program is started. Moreover, the CAS improves the reusability of the production line [1,2,3,4,5].
The manufacturing line is characterized by the operations to be performed in the workstations using the monitoring and control systems, CAS, ARS, RM, and MVS, which provide reparability and flexibility for processing and reprocessing operations [4,5,6,7,8]. The structure of the P/RML highlights the system’s overall functions: automatic processing of parts, transport, handling, storing, and automatic control of all the devices integrated in the system using PLCs in centralized or distributed configurations. The processing/reprocessing tasks are the P/RML actions (drilling, reaming, extrusion, bending), executed in series or in parallel, until the final product meets the quality requirements. For this, processing/reprocessing planning also requires complex operations such as performance measurement and evaluation, kinematics control, system planning, and resource management. For a P/RML assisted by a CAS, a planning strategy adapted to the characteristics of the system is more effective than domain-independent techniques [6,7,8,9,10,11]. If the final workpiece does not pass the quality test, the task planner provides the optimal sequence for reprocessing the product.
Visual servoing systems are robust systems that enhance the accuracy and adaptability of the control architecture based on optical feedback [12]. The purpose of a servoing system is to close the motion control loop of a robotic arm that has to move from an initial point to a destination point. The aim of an MVS is to provide information, based on a visual sensor, regarding the actual position. Based on this information, the robot control is able to improve the interaction with the working environment. The features extracted from the images represent the actual position and also the input of the control architecture. The control architectures corresponding to servoing systems are divided into three categories:
  • Position Based Visual Servoing (PBVS) [13,14,15];
  • Image Based Visual Servoing (IBVS) [13,16,17];
  • Hybrid Visual Servoing (HVS) [4,11,16].
An important factor for an MVS is the selection of visual features, which affects the speed, accuracy, and performance of the RM motion control, taking into consideration the degrees of freedom, as shown in [10,18,19].
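For reference, the classical IBVS control law that underlies the architectures listed above can be summarized in the following compact form; this relation is standard in the visual servoing literature (e.g., [13]) and is given here only as an illustration, where $\lambda$ is a positive gain and $\hat{L}_f^{+}$ is the Moore–Penrose pseudo-inverse of an estimate of the interaction matrix:

$$\upsilon_c = -\lambda \, \hat{L}_f^{+} \left( f - f^{*} \right).$$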
Other elements of originality and contributions are concentrated in the following areas: assigning, planning, and synchronizing the tasks of the P/RML assisted by the CAS, RMs, and MVS; designing the architecture of the entire system to allow flexible manufacturing, multifunctionality, communication, synchronization of the signals from sensors, distributed control, image processing for precise positioning of the eye in hand MVS, and real-time control; improving the level of automation and increasing the efficiency of the manufacturing lines using the CAS.
The article is structured as follows:
Section 2 is dedicated to the hardware structure and the functional features of the system:
  • In Section 2.1, the hardware structure of the P/RML assisted by CAS is presented;
  • In Section 2.2, the eye in hand MVS technology is presented;
  • In Section 2.3, the modeling and control of the MVS, based on the moments of the image, are presented;
  • In Section 2.4, the assumptions and task planning are presented;
  • In Section 3, direct and inverse kinematics model of RM Cyton 1500 are presented;
  • In Section 4, trajectory-tracking sliding-mode control of ARS PeopleBot is presented;
  • In Section 5, some results regarding real-time control of P/RML assisted by CAS are laid out;
  • In Section 6, some final remarks and conclusions are presented.

2. P/RML Assisted by the CAS Equipped with RM and MVS

2.1. The Hardware Architecture of the P/RML Assisted by CAS

The P/RML Festo MPS 200, presented in Figure 1, is a configurable laboratory mechatronic system consisting of four workstations. Each workstation is controlled by its own Siemens S7-300 PLC and operates in a different stage: buffering, handling, processing, or sorting:
  • The first station triggers the color tests and then transports the pieces to the warehouse corresponding to the color;
  • The second station is where the pieces are buffered for further processing;
  • The third station performs drilling and boring operations;
  • The fourth station is where the pieces are stored and organized by their colors.
The RM Cyton is used for recovering the pieces, which are then transported by the ARS PeopleBot. The RM is equipped with a camera, mounted on the end-effector, used for the eye in hand subsystem. All the elements are controlled from a computer: the camera via a USB connection, and the ARS PeopleBot and RM Cyton via Wi-Fi with the help of the Advanced Robot Interface for Applications (ARIA) package [20,21,22,23]. The combined subsystems are presented in Figure 2.

2.2. Eye in Hand MVS

An eye in hand MVS is one where the visual sensor is placed on the last joint of the RM, so the RM movements also move the camera. Another commonly used type of MVS is eye to hand, where, contrary to eye in hand, the sensor is fixed in a position relative to the work environment from which both the RM and the workpieces are observed. In image-based visual servoing (IBVS), 2D image information is used to estimate the desired motion of the robot. Standard tasks such as detection, tracking, and positioning are realized by minimizing the error between the features extracted from the current image and the visual features of the desired image [13].
In the IBVS architecture, the visual sensor that extracts information about the work environment can be either an eye in hand implementation, where the movement of the robot induces the movement of the camera, or an eye to hand one, where the RM and its motions are observed from a fixed point. A common method used in object detection, classification, and shape identification is the method of image moments. This method is based on feature extraction from the 2D intensity distribution of the image, together with information about the orientation and the coordinates of the center of gravity. Figure 3 presents the complete transport trajectory of the CAS, from the last to the first P/RML workstation, from the detection and grabbing to the placing of the workpiece.
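For completeness, the centroid and orientation used by the image moments method can be written explicitly; these standard relations are not stated in the original text and are added here only as a reminder, with $m_{pq}$ the raw moments and $\mu_{pq}$ the central moments of the binary image:

$$x_g = \frac{m_{10}}{m_{00}}, \qquad y_g = \frac{m_{01}}{m_{00}}, \qquad \alpha = \frac{1}{2}\arctan\!\left(\frac{2\,\mu_{11}}{\mu_{20}-\mu_{02}}\right).$$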

2.3. Modelling and Control of MVS Based on the Moments of the Image Method

The structure of the eye in hand MVS has the following components: a 7-DOF RM, a visual sensor, and an image-based controller presented in Figure 4.
One of the most important aspects of this design is that the image-based controller needs information about the environment of the system in order to minimize the error between the current configuration of the visual features, $f$, and the desired configuration, $f^*$. For modelling the open loop servoing system, the fixed components of the RM and the visual sensor must be analyzed separately. The signal related to the input control of the RM is $\upsilon_c^*$, and it represents the reference speed of the camera, with the structure $\upsilon_c^* = (\upsilon^*, \omega^*)^T$, where $\upsilon^* = [\upsilon_x^*, \upsilon_y^*, \upsilon_z^*]^T$ and $\omega^* = [\omega_x^*, \omega_y^*, \omega_z^*]^T$ are the linear and angular speed, respectively. The signal $\upsilon_c^*$ is expressed in Cartesian space and must be transformed before being applied to the RM.
The posture is defined by the integration of $\upsilon_c^*$ and is denoted by $s = [s_1, s_2, s_3, s_4, s_5, s_6]^T$, defining the robot Jacobian as follows:

$$J_r = \left[ \partial s_i / \partial q_j \right], \quad i = 1, \ldots, 6, \; j = 1, \ldots, 7,$$
where $q_j$, $j = 1, \ldots, 7$, represent the states of the RM’s joints. The transformation of $\upsilon_c^*$ from Cartesian space to the robotic joint space is done with $J_r^{-1}$ and the interaction matrix. The moments $m_{ij}$ are a set of visual features whose time variation $\dot{m}_{ij}$, associated with the moments of order $(i+j)$, depends on the speed of the visual sensor $\upsilon_c$ according to the following equation:
$$\dot{m}_{ij} = L_{m_{ij}} \, \upsilon_c,$$

where $L_{m_{ij}} = \left[ m_{\upsilon_x} \;\; m_{\upsilon_y} \;\; m_{\upsilon_z} \;\; m_{\omega_x} \;\; m_{\omega_y} \;\; m_{\omega_z} \right]$ is the interaction matrix.
The interaction matrix associated with the set of image moments $f = [x_n, y_n, a_n, \tau, \xi, \alpha]^T$ for $n$ points is obtained as follows [12,13,24,25,26]:

$$L_f = \begin{bmatrix}
-1 & 0 & 0 & a_n e_{11} & -a_n (1 + e_{12}) & y_n \\
0 & -1 & 0 & a_n (1 + e_{21}) & -a_n e_{11} & -x_n \\
0 & 0 & -1 & -e_{31} & e_{32} & 0 \\
0 & 0 & 0 & \tau_{\omega_x} & \tau_{\omega_y} & 0 \\
0 & 0 & 0 & \xi_{\omega_x} & \xi_{\omega_y} & 0 \\
0 & 0 & 0 & \alpha_{\omega_x} & \alpha_{\omega_y} & -1
\end{bmatrix}.$$
The analytical form of the parameters appearing in this interaction matrix can be found in [27].
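A minimal numerical sketch of the control flow described in this subsection is given below, written in Python with NumPy purely for illustration (the experimental implementation relied on C/Visual Studio and MATLAB, as described in Section 5). The feature values, gain, and Jacobian used here are hypothetical placeholders; the sketch only shows how a camera velocity command is computed from the feature error via the pseudo-inverse of the interaction matrix and then mapped to joint velocities through the robot Jacobian.

```python
import numpy as np

def ibvs_camera_velocity(f, f_des, L_f, gain=0.5):
    """Camera velocity command from the image-feature error (classical IBVS law)."""
    error = f - f_des                              # current vs. desired visual features
    return -gain * np.linalg.pinv(L_f) @ error     # v_c* = -lambda * L_f^+ * (f - f*)

def joint_velocities(v_c, J_r):
    """Map the Cartesian camera velocity to the 7 joint velocities of the RM."""
    return np.linalg.pinv(J_r) @ v_c               # q_dot = J_r^+ * v_c (J_r is 6x7)

# Hypothetical values, for illustration only
f     = np.array([0.12, -0.05, 1.10, 0.30, 0.20, 0.08])   # current features
f_des = np.array([0.00,  0.00, 1.00, 0.30, 0.20, 0.00])   # desired features
L_f   = np.eye(6)                                          # placeholder interaction matrix
J_r   = np.random.default_rng(0).normal(size=(6, 7))       # placeholder robot Jacobian

v_c   = ibvs_camera_velocity(f, f_des, L_f)
q_dot = joint_velocities(v_c, J_r)
print("camera velocity:", v_c)
print("joint velocities:", q_dot)
```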

2.4. Assumptions and Task Planning

The processing/reprocessing tasks can be simplified into a sequence of actions combined in parallel with workpiece positioning and transportation along the cells of the mechatronics line. Figure 5 shows the task planning, in the form of a block diagram, with the actions performed by the CAS when a workpiece has failed the quality test. In this case, the workpiece is transported from the storage station (last cell) to the buffer station (first cell). The red lines represent the reprocessing tasks performed if the piece has been considered faulty, and the black lines represent the normal processing assignments when an initial workpiece has been supplied.
Because the P/RML is, at its core, a flexible industrial production line, it is designed to adapt easily to changes in the type and quantity of the product, and it can provide a wide range of products. Through reprocessing, the faulty products can be retrieved and reworked to the necessary quality standard. Since the technology utilized by the P/RML can be altered by aspects such as process modes, type of finished workpiece, and operation times, some assumptions are needed for the entire system to work as intended:
Assumption 1.
The manufacturing technology processes the workpieces in pipeline mode;
Assumption 2.
The first station is provided with one piece at a time for processing;
Assumption 3.
The initial conditions and parameters of the technology are known, such as quantity of pieces and task duration;
Assumption 4.
Only one type of piece can be processed or reprocessed with different colors since the proposed processing operations are specific to a type of product;
Assumption 5.
The number of the workstations involved in processing/reprocessing by the P/RML is previously known and unchanged;
Assumption 6.
The workstations of the P/RML have a linear distribution in the following order: buffer, handling, processing, and storing;
Assumption 7.
The processing/reprocessing tasks are executed on the same workstation and the pieces can be processed simultaneously in different stages;
Assumption 8.
A red workpiece will mean that the quality test is not passed and reprocessing is needed;
Assumption 9.
In the storage, the first level from the top is utilized for rejected workpieces;
Assumption 10.
One CAS assists the P/RML, which is used for picking up, transporting, and releasing the workpieces;
Assumption 11.
One eye in hand visual sensor is mounted on the RM Cyton;
Assumption 12.
To avoid conflict, priority is given to the workpiece for reprocessing;
Assumption 13.
The technology includes two quality tests, which distinguish between a scrap workpiece and one recoverable by reprocessing.

3. Direct and Inverse Kinematics Model of RM Cyton 1500

RM Cyton 1500 offers a robust and precise manipulation for a wide variety of applications. It has been designed to simulate the structure of a human arm; the shoulder has three joints, the elbow has one joint, and the wrist has three joints. All the joints described make up the 7 DOF of the RM. Angle limits of the RM are shown in Table 1.
For testing and simulating the RM, kinematic modeling needs to be performed, with the main objective being the study of RM’s mechanical parts, the direct and the inverse kinematics.
The direct kinematics consists of finding the position of the end-effector in space by knowing the movements of the joints, for example $F(\theta_1, \theta_2, \ldots, \theta_n) = [x, y, z, R]$, and the inverse kinematics consists of determining the value of every joint by knowing the position of the end-effector and its orientation, $F(x, y, z, R) = [\theta_1, \ldots, \theta_n]$.
An RM is composed, in general, of three groups of joints, each defined by one or more angles. In the case of the RM Cyton 1500, it is made of:
  • One shoulder type joint, characterized by three angles;
  • One elbow type joint, characterized by one angle;
  • One wrist type joint, characterized by three angles.
A compact block diagram of the kinematic modeling is presented in Figure 6 [28].
The configuration of the RM and its joints is presented in Figure 7 [29].
One convention used for the selection of reference frames, especially in robotics applications, is the Denavit–Hartenberg (D&H) convention, illustrated in Figure 8 [30]; the D&H parameters of the RM Cyton 1500 are presented in Table 2.
To determine the direct kinematic model, a fixed coordinate frame with index 0 has been placed at the shoulder joints and, for the other joints, frames with index $i$, $i = 1, \ldots, 7$. In this convention, every homogeneous transformation $A_i$ is represented as a product of four basic transformations, where the four variables $\theta_i$, $a_i$, $d_i$, $\alpha_i$ are the parameters associated with every link and joint.
$$A_i = R_{z,\theta_i} \, \mathrm{Trans}_{z,d_i} \, \mathrm{Trans}_{x,a_i} \, R_{x,\alpha_i},$$

$$A_i = \begin{bmatrix}
c_{\theta_i} & -s_{\theta_i} & 0 & 0 \\
s_{\theta_i} & c_{\theta_i} & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & d_i \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 0 & a_i \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & c_{\alpha_i} & -s_{\alpha_i} & 0 \\
0 & s_{\alpha_i} & c_{\alpha_i} & 0 \\
0 & 0 & 0 & 1
\end{bmatrix},$$

$$A_i = \begin{bmatrix}
c_{\theta_i} & -s_{\theta_i} c_{\alpha_i} & s_{\theta_i} s_{\alpha_i} & a_i c_{\theta_i} \\
s_{\theta_i} & c_{\theta_i} c_{\alpha_i} & -c_{\theta_i} s_{\alpha_i} & a_i s_{\theta_i} \\
0 & s_{\alpha_i} & c_{\alpha_i} & d_i \\
0 & 0 & 0 & 1
\end{bmatrix}.$$
With the help of the D&H table, the $A_i$ matrices have been determined for every DOF of the RM:
$$A_1 = \begin{bmatrix} c_1 & 0 & s_1 & 0 \\ s_1 & 0 & -c_1 & 0 \\ 0 & 1 & 0 & d_1 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad
A_2 = \begin{bmatrix} c_2 & -s_2 & 0 & a_2 c_2 \\ s_2 & c_2 & 0 & a_2 s_2 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad
A_3 = \begin{bmatrix} c_3 & 0 & s_3 & 0 \\ s_3 & 0 & -c_3 & 0 \\ 0 & 1 & 0 & d_3 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad
A_4 = \begin{bmatrix} c_4 & -s_4 & 0 & a_4 c_4 \\ s_4 & c_4 & 0 & a_4 s_4 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix},$$

$$A_5 = \begin{bmatrix} c_5 & 0 & s_5 & 0 \\ s_5 & 0 & -c_5 & 0 \\ 0 & 1 & 0 & d_5 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad
A_6 = \begin{bmatrix} c_6 & -s_6 & 0 & a_6 c_6 \\ s_6 & c_6 & 0 & a_6 s_6 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad
A_7 = \begin{bmatrix} c_7 & 0 & -s_7 & a_7 c_7 \\ s_7 & 0 & c_7 & a_7 s_7 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}.$$
The direct kinematics model of the RM Cyton is determined by the composition of the seven matrices:
$$T_1^0 = A_1, \qquad
T_7^0 = A_1 A_2 A_3 A_4 A_5 A_6 A_7 = \begin{bmatrix}
s_x & n_x & a_x & d_x \\
s_y & n_y & a_y & d_y \\
s_z & n_z & a_z & d_z \\
0 & 0 & 0 & 1
\end{bmatrix}.$$
Figure 9 illustrates all the joints and angles composing the RM Cyton. The kinematic model obtained is utilized in the control structure of the RM Cyton.
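As an illustration of the direct kinematics described above, the following Python/NumPy sketch builds the homogeneous transformation $A_i$ from a row of D&H parameters and chains the seven transformations into $T_7^0$. The numerical link lengths and offsets are hypothetical placeholders, not the real Cyton 1500 dimensions; only the structure follows the convention presented here.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transformation A_i = Rz(theta) Transz(d) Transx(a) Rx(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_rows):
    """Chain A_1 ... A_7 to obtain T_7^0 (end-effector pose in the base frame)."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_rows):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Hypothetical D&H rows (d, a, alpha) following the alternating 90/0 pattern of Table 2;
# the numeric lengths are placeholders only.
dh_rows = [(0.12, 0.0,  np.pi / 2), (0.0, 0.14, 0.0), (0.10, 0.0, np.pi / 2),
           (0.0, 0.07, 0.0), (0.13, 0.0, np.pi / 2), (0.0, 0.07, 0.0),
           (0.0, 0.05, -np.pi / 2)]
q = np.deg2rad([10, -20, 15, 30, -10, 25, 5])   # example joint angles within the Table 1 limits
print(forward_kinematics(q, dh_rows))
```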

4. Trajectory Tracking Sliding-Mode Control of ARS PeopleBot

The ARS PeopleBot is controlled to follow a desired trajectory based on continuous time sliding-mode control. In TTSMC, the real ARS follows the trajectory of a virtual one, with the desired trajectory generated by the virtual ARS denoted with
$$q_d(t) = \left[ x_d \;\; y_d \;\; \theta_d \right]^T.$$
The kinematic model of the virtual ARS becomes
$$\begin{cases}
\dot{x}_d = v_d \cos\theta_d \\
\dot{y}_d = v_d \sin\theta_d \\
\dot{\theta}_d = \omega_d,
\end{cases}$$
where $x_d$ and $y_d$ represent the Cartesian coordinates of the geometric center, $v_d$ represents the linear speed, $\theta_d$ represents the orientation, and $\omega_d$ represents the angular speed. When the ARS follows the desired trajectory, tracking and orientation errors appear on the X and Y axes:
$$\begin{bmatrix} x_e \\ y_e \\ \theta_e \end{bmatrix} =
\begin{bmatrix}
\cos\theta_d & \sin\theta_d & 0 \\
-\sin\theta_d & \cos\theta_d & 0 \\
0 & 0 & 1
\end{bmatrix}
\begin{bmatrix} x_r - x_d \\ y_r - y_d \\ \theta_r - \theta_d \end{bmatrix}.$$
The error dynamics are given by the following equations
$$\begin{cases}
\dot{x}_e = -v_d + v_r \cos\theta_e + \omega_d y_e \\
\dot{y}_e = v_r \sin\theta_e - \omega_d x_e \\
\dot{\theta}_e = \omega_r - \omega_d.
\end{cases}$$
Since the ARS orientation is not perpendicular to the desired trajectory, it is assumed that $|\theta_e| < \pi/2$. Given the position errors and their dynamics above, the sliding surfaces are
$$\begin{cases}
s_1 = \dot{x}_e + k_1 x_e \\
s_2 = \dot{y}_e + k_2 y_e + k_0 \, \mathrm{sgn}(y_e) \, \theta_e.
\end{cases}$$
Figure 10 presents the real and virtual ARS, with absolute coordinates and orientation, following a desired trajectory.
The parameters $k_0$, $k_1$, $k_2$ are positive constants. If $s_1$ converges to zero, then $x_e$ converges to zero as well, and if $s_2$ tends to zero, then $\dot{y}_e = -k_2 y_e - k_0 \, \mathrm{sgn}(y_e) \, \theta_e$. The surface derivatives
$$\begin{cases}
\dot{s}_1 = \ddot{x}_e + k_1 \dot{x}_e, \\
\dot{s}_2 = \ddot{y}_e + k_2 \dot{y}_e + k_0 \, \mathrm{sgn}(y_e) \, \dot{\theta}_e,
\end{cases}$$
are written in a compact form:
$$\dot{s} = -Q \, \mathrm{sgn}(s) - P s,$$
where
$$Q = \begin{bmatrix} Q_1 & 0 \\ 0 & Q_2 \end{bmatrix}, \quad
P = \begin{bmatrix} P_1 & 0 \\ 0 & P_2 \end{bmatrix}, \quad Q > 0, \; P > 0,$$

$$s = \left[ s_1 \;\; s_2 \right]^T, \quad
\mathrm{sgn}(s) = \left[ \mathrm{sgn}(s_1) \;\; \mathrm{sgn}(s_2) \right]^T.$$
Thus, from the error dynamics and the sliding surfaces above, the TTSMC law becomes as follows:
$$\dot{v}_c = -Q_1 \, \mathrm{sign}(s_1) - P_1 s_1 - k_1 \dot{x}_e - \dot{\omega}_d y_e - \omega_d \dot{y}_e + v_r \theta_e + \dot{v}_d \cos\theta_e,$$
$$\omega_c = -Q_2 \, \mathrm{sign}(s_2) - P_2 s_2 - k_2 \dot{y}_e - \dot{v}_r \sin\theta_e + \dot{\omega}_d x_e + \omega_d \dot{x}_e + \omega_d v_r \cos\theta_e + k_0 \, \mathrm{sgn}(y_e).$$
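The sketch below illustrates one simulation step of the trajectory-tracking sliding-mode controller described above, written in Python for illustration only. The gains, time step, and reference signals are hypothetical, and the control law follows the sign convention used in the reconstructed equations above; it is a sketch of the technique under these assumptions, not the exact experimental controller.

```python
import numpy as np

def tracking_errors(q_r, q_d):
    """Pose error of the real ARS expressed in the frame of the desired (virtual) ARS."""
    x_r, y_r, th_r = q_r
    x_d, y_d, th_d = q_d
    c, s = np.cos(th_d), np.sin(th_d)
    x_e = c * (x_r - x_d) + s * (y_r - y_d)
    y_e = -s * (x_r - x_d) + c * (y_r - y_d)
    return x_e, y_e, th_r - th_d

def error_rates(x_e, y_e, th_e, v_r, w_r, v_d, w_d):
    """Error dynamics used by the sliding surfaces (see the equations in Section 4)."""
    xdot_e = -v_d + v_r * np.cos(th_e) + w_d * y_e
    ydot_e = v_r * np.sin(th_e) - w_d * x_e
    return xdot_e, ydot_e, w_r - w_d

def ttsmc_step(q_r, q_d, v_r, w_r, v_d, w_d, vdot_d=0.0, wdot_d=0.0, vdot_r=0.0, dt=0.05,
               k0=0.1, k1=1.0, k2=1.0, Q1=0.5, P1=1.0, Q2=0.5, P2=1.0):
    """One discretized TTSMC update; returns the new linear and angular speed commands."""
    x_e, y_e, th_e = tracking_errors(q_r, q_d)
    xdot_e, ydot_e, _ = error_rates(x_e, y_e, th_e, v_r, w_r, v_d, w_d)
    s1 = xdot_e + k1 * x_e
    s2 = ydot_e + k2 * y_e + k0 * np.sign(y_e) * th_e
    # Linear speed command driven through its derivative (reaching law -Q*sgn(s) - P*s).
    vdot_c = (-Q1 * np.sign(s1) - P1 * s1 - k1 * xdot_e
              - wdot_d * y_e - w_d * ydot_e + v_r * th_e + vdot_d * np.cos(th_e))
    v_c = v_r + vdot_c * dt
    # Angular speed command from the second sliding surface.
    w_c = (-Q2 * np.sign(s2) - P2 * s2 - k2 * ydot_e - vdot_r * np.sin(th_e)
           + wdot_d * x_e + w_d * xdot_e + w_d * v_r * np.cos(th_e) + k0 * np.sign(y_e))
    return v_c, w_c

# Hypothetical one-step example: the real ARS starts slightly off the virtual one.
v_c, w_c = ttsmc_step(q_r=(0.02, -0.03, 0.05), q_d=(0.0, 0.0, 0.0),
                      v_r=0.10, w_r=0.0, v_d=0.12, w_d=0.0)
print(v_c, w_c)
```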

5. Real-Time Control of P/RML Assisted by CAS

The assisting technology for P/RML consists of one dynamic robotic system, CAS, used in picking, placing, and transporting the workpieces.
The process is based on three main control loops:
  • Control loop for the P/RML with Siemens PLCs, programmed in Simatic Step 7;
  • Control loop for the ARS PeopleBot based on TTSMC used in workpiece transportation;
  • Control loop for the RM Cyton with eye in hand MVS for precise positioning for both picking and placing workpieces.
Presented in Figure 11 are the control loops connected to a single computer that runs the applications controlling the P/RML and the CAS and its mechanisms, with the help of Simatic Step 7, Microsoft Visual Studio, the C programming language, the ARIA package, and MATLAB.
Figure 12 shows a few real-time pictures of the process of scanning the object; if it is detected, it is picked up by the gripper of the RM and transported with the CAS to the first workstation. The main steps of detecting the objects are presented in Figure 13. Figure 13a shows the conversion from the RGB (Red Green Blue) color model to the HSV (Hue Saturation Value) color model, which is more robust to changes in light; during the experiments, the type of light used, natural sunlight or laboratory light, had a major effect. Figure 13b shows the detected object, once the colors between the HSV limits have been matched and the shape (in this case, a circle) has been found using the Ramer–Douglas–Peucker algorithm and Canny edge detection, implemented using the OpenCV libraries. Finally, in Figure 13c, the object is tracked once both the color and shape conditions have been met; the centroid is tracked using the image moments method for its versatility and efficiency, since deviations appear only in the 2D space: the distance on the Z axis is constant, and rotations of the object do not occur, because the workpiece always arrives at the same place; only the CAS does not have a perfectly repeatable position, which is why an MVS is needed [31].
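A condensed OpenCV sketch of the detection steps from Figure 13 is given below, for illustration only: HSV conversion, color thresholding, Canny edge detection, contour extraction with polygonal approximation (cv2.approxPolyDP implements the Ramer–Douglas–Peucker algorithm), and centroid computation through image moments. The HSV limits, area threshold, and circularity test are hypothetical placeholders, not the tuned experimental values.

```python
import cv2
import numpy as np

def detect_workpiece(frame_bgr, hsv_low=(0, 120, 80), hsv_high=(10, 255, 255)):
    """Return the (cx, cy) centroid of a roughly circular blob in the given HSV range, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)           # BGR -> HSV (more robust to lighting)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    edges = cv2.Canny(mask, 50, 150)                            # Canny edge detection on the color mask
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for cnt in contours:
        approx = cv2.approxPolyDP(cnt, 0.01 * cv2.arcLength(cnt, True), True)  # RDP approximation
        if len(approx) > 8 and cv2.contourArea(cnt) > 200:      # many vertices + enough area ~ circle
            m = cv2.moments(cnt)                                # image moments of the contour
            if m["m00"] > 0:
                return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])  # centroid (m10/m00, m01/m00)
    return None

# Hypothetical usage with a single camera frame
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("centroid:", detect_workpiece(frame))
cap.release()
```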
Likewise, Figure 14 shows pictures of a process similar to that of the previous figure, this time for placing the workpieces rather than picking them up. The main difference between Figure 13 and Figure 15 is that, in Figure 13, the object itself is detected, while, in Figure 15, a reference point specific to the first workstation is detected and the object is placed relative to it. Also, while in the first case the detected shape is a circle, in the second one it is a rectangle.
In Figure 16a, the complete 3D picking-up trajectory is presented: the movement from the home position to the scanning position, then above the object, picking it, and back to the home position, so that the CAS can transport it to the first workstation for reprocessing if it fails the quality test. Presented in Figure 16b is the 3D trajectory for placing the object.
Although the MVS starting and ending points are present in Figure 16a,b, they are hard to visualize, which is why Figure 17 is needed for the case of picking the object, with the X and Z axes in Figure 17a and the Y axis in Figure 17b. The entire MVS process takes about 7 s, and its effect is most apparent on the Y axis, where there is a deviation of up to 10 mm from the desired position.
The time for placing the workpiece, based on MVS control, is about 4 s, due to the fact that the difference between the desired and actual features is much smaller than for picking up. The X and Z trajectories over time are displayed in Figure 18a. In Figure 18b, only the Y axis trajectory is shown, since the distance is much smaller than on the other axes.
The PeopleBot trajectory has some deviations on the X and Y axes that can be observed in Figure 19a. Along the X axis, the error is near zero, while on the Y axis the final error is about 4 mm, as shown in Figure 19b.
The velocity profile of the ARS PeopleBot along the forward trajectory, during transportation of the workpiece, is presented in Figure 20a, with a maximum of 0.12 m/s. The state transition of the workpiece is shown in Figure 20b. While the RM is active for picking and placing the workpiece, the CAS has zero velocity.
In Figure 21, the X and Y trajectories are displayed, so that the differences between the desired and real trajectories can be identified more easily.

6. Conclusions

The presented research is still in the experimental phase, the final step being a fully automated processing technology allowing the recovery, by reprocessing, of a given volume of pieces. By using the SIMATIC STEP 7, Visual C++ and MATLAB platforms, real-time control structures have been implemented, allowing automatic processing/reprocessing assisted by the CAS, an ARS equipped with an RM, and the MVS. Compared to [4], where two autonomous robotic systems and two types of visual servoing systems (eye to hand and eye in hand) have been used, in this paper only one CAS, with one type of MVS (eye in hand), assists the P/RML. As a result, the reliability, flexibility, and robustness of the technology have been increased with respect to the uncertainties that might come from sensor failures, environmental conditions, and disturbances in data transfer and communications. Although this technology has been used at laboratory level, it can be extended to real industry, where high accuracy and precise positioning are needed. In the future, this technology is expected to be improved so as to satisfy the standards of Industry 4.0. From this perspective, the presented manufacturing technology has a dual purpose: one educational, and the other to be as close as possible to the real industrial world. The educational goal aims to familiarize the system designer with everything that defines Industry 4.0: concepts, design, cloud and IoT communication, and real-time monitoring and control. Therefore, we aim to improve the presented technology by integrating the state of the art in the field of Industry 4.0, including smart manufacturing, SCADA systems, task communication and synchronization, sensor data fusion, and high-precision actuators.

Author Contributions

Conceptualization, G.S., A.F. (Adrian Filipescu), D.I., R.Ș., D.C., E.M. and A.F. (Adriana Filipescu); methodology G.S., A.F. (Adrian Filipescu), D.I., D.C., R.Ș. and E.M.; software, G.S. and D.I.; validation, A.F. (Adriana Filipescu), R.Ș., D.C. and E.M.; formal analysis, G.S., D.I. and A.F. (Adrian Filipescu); writing—original draft preparation, G.S., A.F. (Adrian Filipescu) and D.I.; writing—review and editing, A.F. (Adrian Filipescu), R.Ș. and D.C.; supervision, A.F. (Adrian Filipescu); project administration, A.F. (Adrian Filipescu); funding acquisition, A.F. (Adrian Filipescu), G.S. and D.I. All authors have read and agreed to the published version of the manuscript.

Funding

The article processing charge (APC) is supported by the Doctoral School of Fundamental Sciences and Engineering, “Dunărea de Jos” University of Galați.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data availability is not applicable to this article as the study did not report any data.

Acknowledgments

The results of this work were presented at the 10th edition of the Scientific Conference organized by the Doctoral Schools of “Dunarea de Jos” University of Galati (SCDS-UDJG), http://www.cssd-udjg.ugal.ro/ (accessed on 15 March 2022), held on 9 and 10 June 2022, in Galati, Romania.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Stoll, J.T.; Schanz, K.; Pott, A. Mechatronic Control System for a Compliant and Precise Pneumatic Rotary Drive Unit. Actuators 2020, 9, 1.
  2. Syafrudin, M.; Fitriyani, N.L.; Alfian, G.; Rhee, J. An Affordable Fast Early Warning System for Edge Computing in Assembly Line. Appl. Sci. 2019, 9, 84.
  3. de Gea Fernández, J.; Yu, B.; Bargsten, V.; Zipper, M.; Sprengel, H. Design, Modelling and Control of Novel Series-Elastic Actuators for Industrial Robots. Actuators 2020, 9, 6.
  4. Filipescu, A.; Mincă, E.; Filipescu, A.; Coandă, H.-G. Manufacturing Technology on a Mechatronics Line Assisted by Autonomous Robotic Systems, Robotic Manipulators and Visual Servoing Systems. Actuators 2020, 9, 127.
  5. Chen, Z.-Y.; Chen, C.-T. A Remote-Controlled Robotic Arm That Reads Barcodes and Handles Products. Inventions 2018, 3, 17.
  6. Ciubucciu, G.; Filipescu, A., Jr.; Filipescu, S.; Dumitrascu, B. Control and Obstacle Avoidance of a WMR Based on Sliding-Mode, Ultrasounds and Laser. In Proceedings of the 12th IEEE International Conference on Control and Automation (ICCA), Kathmandu, Nepal, 1–3 June 2016; pp. 779–784, ISBN 978-1-5090-1737-9.
  7. Lupu, C.; Popescu, D.; Florea, G. Supervised Solutions for Precise Ratio Control: Applicability in Continuous Production Line. Stud. Inform. Control 2014, 23, 53–64.
  8. Radaschin, A.; Voda, A.; Filipescu, A. Task Planning Algorithm in Hybrid Assembly/Disassembly Process. In Proceedings of the 14th IFAC Symposium on Information Control Problems in Manufacturing, Bucharest, Romania, 23–25 May 2012; Volume 45, pp. 571–576.
  9. Minca, E.; Filipescu, A.; Voda, A. Modelling and control of an assembly/disassembly mechatronics line served by mobile robot with manipulator. Control Eng. Pract. 2014, 31, 50–62.
  10. Filipescu, A., Jr.; Petrea, G.; Filipescu, A.; Filipescu, S. Modeling and Control of a Mechatronics System Served by a Mobile Platform Equipped with Manipulator. In Proceedings of the 33rd Chinese Control Conference, Nanjing, China, 28–30 July 2014; pp. 6577–6582, ISBN 978-988-15638-4-2.
  11. Petrea, G.; Filipescu, A., Jr.; Minca, E. Hybrid Modelling and Simulation of a P/RML with Integrated Complex Autonomous Systems. In Proceedings of the 22nd IEEE International Conference on System Theory, Control and Computing (ICSTCC), Sinaia, Romania, 10–12 October 2018; pp. 439–444, ISBN 978-1-5386-4444-7.
  12. Copot, C. Control Techniques for Visual Servoing Systems. Ph.D. Thesis, Gheorghe Asachi Technical University of Iasi, Iasi, Romania, 2012.
  13. Song, R.; Li, F.; Fu, T.; Zhao, J. A Robotic Automatic Assembly System Based on Vision. Appl. Sci. 2020, 10, 1157.
  14. Lan, C.-W.; Chang, C.-Y. Development of a Low Cost and Path-free Autonomous Patrol System Based on Stereo Vision System and Checking Flags. Appl. Sci. 2020, 10, 974.
  15. Deng, L.; Wilson, W.; Janabi-Sharifi, F. Dynamic performance of the position-based visual servoing method in the cartesian and image spaces. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Las Vegas, NV, USA, 27–31 October 2003; pp. 510–515.
  16. Gans, N.; Hutchinson, S.; Corke, P. Performance tests for visual servo control systems, with application to partitioned approaches to visual servo control. Int. J. Robot. Res. 2003, 22, 955–981.
  17. Corke, P.I.; Spindler, F.; Chaumette, F. Combining Cartesian and polar coordinates in IBVS. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 11 December 2009; pp. 5962–5967.
  18. Dragomir, F.; Mincă, E.; Dragomir, O.E.; Filipescu, A. Modelling and Control of Mechatronics Lines Served by Complex Autonomous Systems. Sensors 2019, 19, 3266.
  19. Petrea, G.; Filipescu, A.; Minca, E.; Voda, A.; Filipescu, A., Jr.; Serbencu, A. Hybrid Modelling Based Control of a Processing/Reprocessing Mechatronics Line Served by an Autonomous Robotic System. In Proceedings of the 17th IEEE International Conference on System Theory, Control and Computing (ICSTCC), Sinaia, Romania, 11–13 October 2013; pp. 410–415, ISBN 978-1-4799-2228-4.
  20. Gasparetto, A.; Zanotto, V. A new method for smooth trajectory planning of robot manipulators. Mech. Mach. Theory 2007, 42, 455–471.
  21. Kim, J.; Park, J.; Chung, W. Self-Diagnosis of Localization Status for Autonomous Mobile Robots. Sensors 2018, 18, 3168.
  22. Brassai, S.T.; Iantovics, B.; Enăchescu, C. Optimization of Robotic Mobile Agent Navigation. Stud. Inform. Control 2012, 21, 6.
  23. Ravankar, A.; Ravankar, A.A.; Kobayashi, Y.; Hoshino, Y.; Peng, C.-C. Path Smoothing Techniques in Robot Navigation: State-of-the-Art, Current and Future Challenges. Sensors 2018, 18, 3170.
  24. Chou, C.-Y.; Juang, C.-F. Navigation of an Autonomous Wheeled Robot in Unknown Environments Based on Evolutionary Fuzzy Control. Inventions 2018, 3, 3.
  25. Filipescu, A.; Ionescu, D.; Filipescu, A.; Mincă, E.; Simion, G. Multifunctional Technology of Flexible Manufacturing on a Mechatronics Line with IRM and CAS, Ready for Industry 4.0. Processes 2021, 9, 864.
  26. Tolio, T. Design of Flexible Production Systems—Methodologies and Tools; Springer: Berlin, Germany, 2009.
  27. Tahri, O.; Chaumette, F. Point-based and region-based image moments for visual servoing of planar objects. IEEE Trans. Robot. 2005, 21, 1116–1127.
  28. Abuqassem, M.R.M. Simulation and Interfacing of 5 DOF Educational Robot Arm. Master’s Thesis, Islamic University of Gaza, Gaza, Palestine, June 2010.
  29. Badler, N.I.; Tolani, D. Real-Time Inverse Kinematics of the Human Arm. In Teleoperators and Virtual Environments; University of Pennsylvania: Philadelphia, PA, USA, 1996; Volume 5, pp. 393–401.
  30. Spong, W.; Hutchinson, S.; Vidyasagar, M. Robot Modeling and Control, 1st ed.; Wiley: Urbana, IL, USA, 2005; pp. 71–93. ISBN 978-0471649908.
  31. Kromanis, R.; Forbes, C. A Low-Cost Robotic Camera System for Accurate Collection of Structural Response. Inventions 2019, 4, 47.
Figure 1. Festo MPS 200 assisted by CAS (ARS PeopleBot equipped with RM Cyton 1500 and MVS).
Figure 2. Structure of the CAS: ARS PeopleBot; RM Cyton 1500; eye in hand MVS.
Figure 3. P/RML assisted by CAS.
Figure 4. Eye in hand MVS closed-loop control.
Figure 5. Processing/Reprocessing cycle for a piece that has failed the quality test.
Figure 6. Block diagram of kinematics models.
Figure 7. Angles that represent every joint: (a) shoulder type, (b) elbow type, and (c) wrist type.
Figure 8. Denavit&Hartenberg Convention.
Figure 9. Joints and angles of 7-DOF RM Cyton 1500.
Figure 10. Real ARS, Virtual ARS, Desired trajectory.
Figure 11. Communication block set of the P/RML, CAS and computer.
Figure 12. Real-time control of the CAS for picking up workpieces, in the following order: (a) home position, (b) scanning position, (c) picking up the object, (d) home position with the workpiece picked up in the gripper.
Figure 13. Object detection of the workpiece, with the following steps: (a) conversion from RGB to HSV; (b) image segmentation after the color has been found between the HSV limits and the shape corresponding to the object has been found; (c) the object's color and shape have been found and the object is being tracked.
Figure 14. Real-time control of the CAS for placing the workpieces, in the following order: (a) parking position, (b) scanning position, (c) placing the object, (d) RM returning to the home position with the object placed on the workstation.
Figure 15. Object detection of the reference point, with the following steps: (a) conversion from RGB to HSV; (b) image segmentation after the color has been found between the HSV limits and the shape corresponding to the reference has been found; (c) the reference object's color and shape have been found and the reference is being tracked.
Figure 16. 3D Trajectories of the RM for (a) picking and (b) placing the workpiece.
Figure 17. Trajectories evolution over time for (a) X and Z axis and (b) Y axis for picking the workpiece.
Figure 18. Trajectories evolution over time for (a) X and Z axes and (b) Y axis in the case of placing the workpiece.
Figure 19. (a) Real and desired trajectories of the ARS PeopleBot and (b) evolution of the errors in time.
Figure 20. (a) ARS PeopleBot velocity evolution over time; (b) state transition of the workpiece.
Figure 21. ARS PeopleBot trajectory: (a) X axis; (b) Y axis.
Table 1. 7-DOF RM Cyton 1500.
Joint | Lower Limit | Upper Limit
Shoulder Roll | −150° | 150°
Shoulder Pitch | −105° | 105°
Shoulder Yaw | −105° | 105°
Elbow Pitch | −105° | 105°
Wrist Yaw | −105° | 105°
Wrist Pitch | −105° | 105°
Wrist Roll | −150° | 150°
Table 2. RM Cyton 1500 D&H parameters.
i | α | a | d | θ
1 | 90° | 0 | d1 | θ1
2 | 0 | a2 | 0 | θ2
3 | 90° | 0 | d3 | θ3
4 | 0 | a4 | 0 | θ4
5 | 90° | 0 | d5 | θ5
6 | 0 | a5 | 0 | θ6
7 | −90° | a6 | 0 | θ7