Article

Shape Sensing for Continuum Robotics Using Optoelectronic Sensors with Convex Reflectors

Department of Mechanical and Aerospace Engineering, Brunel University, London UB8 3PH, UK
*
Author to whom correspondence should be addressed.
Electronics 2024, 13(7), 1253; https://doi.org/10.3390/electronics13071253
Submission received: 21 February 2024 / Revised: 18 March 2024 / Accepted: 19 March 2024 / Published: 28 March 2024
(This article belongs to the Special Issue Robots in Medical and Industrial Applications)

Abstract
Three-dimensional shape sensing is crucial for the stable actuation and control of soft and continuum robots in fields such as minimally invasive surgery, engine repair, and search and rescue, where complex curvatures must be estimated as continuum robotic tools are manipulated through fragile paths. This challenge has been addressed using a range of sensing techniques, for example, Fibre Bragg grating (FBG) technology, inertial measurement unit (IMU) sensor networks, or stretch sensors. Previously, an optics-based method using optoelectronic sensors was explored, offering a simple and cost-effective solution for shape sensing in a flexible tendon-actuated manipulator in two orientations. That method was based on proximity-modulated angle estimation and forms the basis of the shape sensing technique addressed in this paper. The improved and miniaturised technique demonstrated here integrates optoelectronic sensors with a convex-shaped reflector in a tendon-actuated robotic manipulator. Upgraded sensing capability is achieved by optimising the convex reflector shape for sensor range and resolution, and improved calibration is achieved by integrating spherical bearings for friction-free motion. Upon calibration of the sensors, shape estimation is achieved in two orientations with a maximum Root-Mean-Square (RMS) error of 3.37°.

1. Introduction

A multitude of shape sensing systems have been developed for continuum and hyper-redundant robots. For safe and accurate utilisation of a continuum robot, a stable position control system must be integrated. As the distinguishing features of continuum and hyper-redundant robots are narrow dimensions, flexibility, and the use of soft, compliant materials, integrating a shape sensing system that does not compromise these features is challenging. Continuum robots have been utilised in difficult environments, such as search-and-rescue and exploration missions in unknown environments [1,2], and in maintenance for onsite inspection and repair of large machines such as airplane turbine engines, whose confined interiors are difficult to access with rigid tools and which may otherwise be time-consuming and costly to dismantle [3]. A popular field for continuum robots is minimally invasive surgery (MIS) [4], in which a small incision is made in the body through which tools are passed to carry out surgical procedures. Typically, shape sensing systems for MIS-based continuum robots must be miniaturised to allow full motion and flexibility through complex paths; placing actuators and encoders directly within joints along the manipulator, as is done in larger rigid robots, is not feasible in such applications.
A variety of sensor types have been employed in the design of shape sensing systems. Microelectromechanical systems (MEMS) tracking sensors such as inertial sensors comprise accelerometers, a gyroscope, and magnetometers to provide multiaxial orientation measurements. Some examples demonstrate a shape sensing approach for hyper-redundant, snake-like, and pneumatic robots using a network of IMU sensors fixed along the links of the robot, with the sensor information used to calculate the tip pose through a kinematic model [5,6,7]. Although these sensors are relatively small, they are not small enough to fuse easily within a soft manipulator. Inertial sensors have also been used in SensorTape [8], a flexible modular tape embedded with a network of sensors, including inertial sensors, which can be attached to sections of a flexible robotic structure for shape sensing in three-dimensional space. Inertial sensors, however, can suffer from error due to magnetic interference, gyroscopic drift, and position estimation errors from mechanical vibration [9].
Stretchable strain sensors have been employed in some soft robotic manipulators. So, J. et al., for example, used stretchable sensors on a section of a flexible manipulator for use in MIS [10]. The skin-like stretch sensor was composed of carbon nanotube layers within a silicone sheet, placed equidistantly around a silicone manipulator. These sensors respond to bending of the manipulator, as deformation produces a measurable change in resistance. Many materials have been used to develop such stretch sensors, including silver nanowire, carbon black, and eutectic gallium–indium liquid. Although highly flexible, these materials inherently exhibit signal non-linearity and noise due to material inhomogeneities. Furthermore, the geometrical modelling of the soft robot's elastic backbone assumes constant curvature, meaning that more complex curves cannot be measured using such stretch sensors.
Other innovative techniques have made use of marker-tracking cameras. One approach utilised passive tendons attached at only one end to the body of the robotic segment. The lower ends of these passive tendons carried markers that were detected by cameras to measure their linear displacement during bending of the robot [11]. This allowed more accurate measurement of the robot's shape and torsion than conventional methods that measure length changes of the actuating tendons during bending, as the passive tendons are less affected by slack or excess tension. The paper, however, reported an increase in the stiffness of the robot due to the number of passive tendons, and an increase in shape estimation error due to the clearance between the tendons and their routing paths. Additionally, if future prototypes introduce more than one segment, the passive tendons will need to pass through multiple segments; this may cause coupling between the markers detected by the camera, as they will be affected by motion from other robotic segments, which could reduce shape estimation accuracy.
Fibre Bragg gratings fused within optical fibres can measure curves through a shift in the measured wavelength of light under bending-induced strain. The fibres are flexible and can be integrated into narrow robotic structures, allowing use in robotic surgical procedures such as catheterisation [12]. However, these gratings are difficult to engrave and require a specialist for accurate placement, complicating the manufacturing process, and the signal detection equipment, a so-called interrogator, is expensive. Under larger curvatures, shape sensing performance may be reduced, and errors can occur due to temperature effects and central wavelength drift over time [13]. FBG sensors are also highly sensitive to material properties and external conditions: shape sensing errors can arise under large bending curvatures or low-stiffness conditions, and the sensors are limited to bending within the elastic limit of the material, outside of which calibration is not maintained. Furthermore, FBGs are highly sensitive to any external force or load applied to them. For example, if calibration of an FBG-based robotic structure is completed in free space or using known curvatures, any load or external/tangential force applied to the surface of the structure that differs from the calibration conditions may prevent the FBGs from accurately measuring the deformations, leading to shape estimation errors or noise.
Optoelectronic sensors are another optical technique that can be used for shape sensing. Prior work by the authors introduced the integration of optoelectronic reflective sensors into a tendon-actuated robot for shape estimation in three dimensions [14]. Three optoelectronic sensors were placed equidistantly around each rotating link of the manipulator. Each sensor comprises an infrared LED coupled with a phototransistor that detects reflected light. For shape sensing, the change in angle between each pair of links was estimated by detecting light reflected from the reflective surface of the consecutive link. This light intensity modulation with changing proximity between the sensors and the reflecting surface allowed estimation of two orientations through a calibration process, as described in that paper. The results showed the viability of the technique for shape sensing and inspired the new technique demonstrated in the following sections.
Before proceeding with the new shape sensing technique, areas of improvement upon the previous method [14] were identified. Firstly, three sensors were required per link of the robotic manipulator, embedded directly onto each disk. Over a longer tendon-driven robot, this would accumulate into an excessive number of optoelectronic sensors and electrical wires. The sensors also occupied considerable space on the manipulator, limiting the space available for passing tools in future development. Secondly, the links were jointed using a 3D-printed ball joint, with components designed and printed in PLA. This caused substantial friction during motion of the prototype and affected calibration measurements due to excessive vibration. Lastly, each sensor was individually installed in each unit, resulting in excessive wiring. For these reasons, this paper presents a new optoelectronic-based shape sensing mechanism that resolves the flaws of the previous version.

2. Design of a New Shape Sensing Mechanism

2.1. Design Concept

The new design proposed in this paper, as illustrated in Figure 1, uses light intensity modulation between optoelectronic sensors and a convex-shaped reflector for estimation of the orientation of the continuum robot segment. The design of this shape sensing system improves upon the authors' previous work presented in [14]. These improvements target the following goals: (1) measurement of motion in two degrees of freedom, (2) elimination of noise caused by friction during motion of the robotic segment, as well as by external interference, (3) miniaturisation of the shape sensing mechanism, and (4) simplification of the shape sensing system. These points have been addressed in the following ways, in comparison to [14].
Firstly, the ball joints have been replaced by spherical bearings (INA GE6-DO 6 mm Bore Spherical Bearing, 14 mm O.D., Schaeffler, Herzogenaurach, Germany). These spherical bearings are fixed within the rotational unit and linked to the consecutive unit through a shaft that fits into the spherical bearing bore. This vastly improves motion: friction is virtually eliminated, allowing smooth rotation between joints. More importantly, the sensing configuration in the new system is based on two sensors per rotational unit rather than three. These sensors are fitted vertically (Figure 2a) and greatly reduce the occupied space on each link. All sensors were incorporated onto a single flexible circuit (Figure 2b), thereby eliminating wiring for a more simplified design. A new, much smaller model of optoelectronic reflective sensor, the NJL5901R-2 (1 × 1.4 × 0.6 mm, New Japan Radio, Tokyo, Japan) (Figure 2c), was chosen in place of the previously used QRE113 (Onsemi, Phoenix, AZ, USA) (3.6 × 2.9 × 1.7 mm), reducing the space occupied by sensors. As described in [14], the sensing range of the previously used QRE113 optoelectronic sensor was between 2 and 10 mm. This large sensing range meant that during motion there was a possibility of interference in the sensor signal from external sources, or from reflection of infrared light off nearby objects or the cover enclosing the robotic structure, which affected the sensor calibration. For this reason, the NJL5901R-2 optoelectronic sensor was chosen for the current prototype: its much shorter sensing range (0.5–3 mm) (Figure 2d) eliminates the risk of interference from unintended reflection or scattering of light from objects other than the reflector, and further enables miniaturisation. Housing the sensors within the structure of the continuum robot (Figure 2a) also protects the sensor signals from interference from external infrared sources.
In keeping with the same sensing principle, that is, light intensity modulation upon change in proximity, a new reflector surface was designed as part of the tendon-actuated robot links. This new reflector takes on a spherical shape and is essentially a convex reflector (Figure 3a). The inner part of this convex reflector section is concentric with the centre of rotation of the rotational unit. However, the outer surface of the convex reflector section facing the sensors has the centre of its sphere slightly offset from the rotational centre, as seen in Figure 4a. Relative to the sensors, this has the effect of a sphere whose radius gradually changes upon rotation. During rotation, the distances between the sensors and the reflector therefore change, modulating the reflected light intensity, which can be transformed into orientation estimates in both pitch and roll. The sensing principle is based on non-contact light intensity proximity sensing, so the sensors do not directly measure properties such as strain and are not affected by loads or material limitations. Even after calibration of the sensors in free space, the calibration matrix therefore remains valid for accurate shape estimation regardless of any external loads or forces applied to the surface of the robotic structure.
This is further described in Section 3. The previous work by the authors [14] used a sensing configuration in which three optoelectronic sensors were embedded into each unit of the tendon-actuated robot. Proximity variation between the sensors and a reflector was achieved through the tilting of the upper unit relative to the lower unit in which the optoelectronic sensors were embedded. This meant that the proximity change depended on the rotational range of the units (±15°) and required a large sensing range (2–10 mm). In the new sensing configuration presented in this paper, the convex reflector allows use of the new optoelectronic sensors' small proximity range (0.5–3 mm) (Figure 2d) with the same rotational range of each unit of the robotic manipulator (±15°), thereby allowing a reduction in the size of the manipulator.

2.2. Shape Sensing Integration into Robotic Manipulator

The tendon-actuated robot comprises three DC motors actuating three wire tendons. To verify the sensing principle of the new shape sensing mechanism, four consecutive units are used as one segment, as illustrated in Figure 2a, with the fitted flexible sensor circuit shown in Figure 2b. With two sensors allocated to each disk, this totals eight sensors. The four units were linked together using the spherical bearings to form ball joints. Each unit measured 22 mm in diameter and 12 mm in height (Figure 2a) and consisted of the convex reflector with a central shaft fitting into the spherical bearing bore of the subsequent lower unit. Three tendons were routed along the structure, with a set of three springs, each 6 mm in height, fitted between each unit to limit some of the torsion motion. Each tendon was wound around a pulley system attached to the motor horn of one of three DC servo motors (Dynamixel XL430-W250T, ROBOTIS, Seoul, Republic of Korea) at the base platform. These three tendons achieve actuation in both pitch and roll orientations. A channel down one side of the structure allows the optoelectronic PCB strip, shown in Figure 2, to be fitted. This printed flexible sensor circuit strip comprises four pairs of the optoelectronic sensor and was fixed along the channel at each unit of the robotic structure using two small screws (Figure 3a). On the top unit, a frame is fitted for fixing the IMU sensor (LPMS B2, LP-RESEARCH Inc., Tokyo, Japan). The platform components were designed using CAD software (Onshape 2014) and, other than the convex reflector component, were 3D printed using white PLA (polylactic acid) plastic.
The convex reflector component was printed using a UV resin SLA 3D printer (Photon S, ANYCUBIC, Shenzhen, China) with white-coloured resin, giving a high-resolution surface finish to maximise reflectivity and reduce noise in the sensor signal. Motion was controlled using software developed in Python (3.8), with real-time interfacing for motor control, optoelectronic sensor recording, and IMU sensor recording. To achieve motion, an input target orientation of the top plate is chosen, and a vector-based model calculates the required tendon lengths, which are converted to three target motor positions. As the three motors move simultaneously, each pulls a tendon wire over its pulley for continuous motion.
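The tendon-length computation above can be illustrated with a minimal sketch. The paper's own vector-based model is not reproduced here; instead, this assumes a constant-curvature approximation with three tendons spaced 120° around the disks, and the geometry values (pitch-circle radius, slack length, unit count) are illustrative only:

```python
import numpy as np

def tendon_lengths(l0, r, pitch, roll, n_units=4):
    """Approximate tendon lengths for a segment commanded to (pitch, roll),
    in radians, under a constant-curvature assumption (illustrative model,
    not the paper's exact vector formulation).
    l0: tendon length at the neutral pose; r: tendon pitch-circle radius."""
    theta = np.hypot(pitch, roll)              # overall bending angle
    phi = np.arctan2(roll, pitch)              # direction of the bending plane
    tendon_angles = np.deg2rad([0.0, 120.0, 240.0])
    # Each tendon shortens in proportion to the bend projected onto its direction.
    dl = n_units * r * theta * np.cos(tendon_angles - phi)
    return l0 - dl
```

The three returned lengths would then be converted to motor positions via the pulley radius. Note that the three length changes sum to zero, so the segment's centre line length is preserved in this model.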

3. Sensing Principle

3.1. Sensor Design Principle

The spherical convex reflector design is based on the work demonstrated in [15] on the design of a curved joint angle measurement sensor. This one-dimensional angle sensor demonstrated the potential for adapting the output of optoelectronic sensors for increased sensing sensitivity and range by altering the curvature of a reflecting surface. This concept is applied here in three dimensions, where the sensor reflector is designed as a section of a sphere and can be regarded as a convex reflector. Two cross sections of the reflector design are shown in Figure 3b, illustrating the rotations in pitch (about the y axis) and roll (about the x axis). Two optoelectronic sensors are placed circumferentially around the origin of rotation O, on the xy plane. Light is reflected from the convex reflector section (red dotted) (Figure 3b) and recorded by the sensors. Considering the schematic in Figure 4a, the reflector with centre C is placed at a location offset from the rotation origin O. As such, when the reflector rotates about origin O, in either the pitch or roll orientation, the distance between the sensor and the reflector varies. This displacement variation with changing orientation produces changes in sensor voltage readings due to varied light intensity, enabling proximity-based sensing. In the cross sections shown in Figure 3b, the sensor is placed at an original distance of d0 (0°). The rotating reflector is designed to rotate ±15° about both the y axis (pitch) and the x axis (roll). At these limits, the reflector is at distances dmax/dmin and dmax'/dmin' from the sensors, respectively. For pitch rotation, the distances d0, dmin, and dmax correspond to the range highlighted in blue in the graph in Figure 2d, the characteristic optoelectronic sensing curve of sensor voltage output against linear displacement from a reflector.
This response is partly determined by the resistors of the optoelectronic sensors: the LED resistor (R1) and the phototransistor resistor (R2) (Figure 2f), chosen to be 680 Ω and 10 kΩ, respectively. To maximise range and sensitivity, the highlighted (blue) section of Figure 2d identifies the ideal sensor proximity to a reflector, and hence the ideal placement of the optoelectronic sensor relative to the convex reflector. Given these distance ranges, the spherical radius and centre C of the reflector could be determined. During roll motion, the proximity change over ±15° (d0, dmin', dmax') is smaller than the proximity range during pitch rotation (d0, dmin, dmax).
Figure 4. (a) Schematic diagram for calculation of proximity ‘d’ between the sensor and reflector (with radius rs) upon rotation. (b) Simplified schematic of light intensity model.

3.2. Mathematical Theory

The mathematical theory behind this sensing principle is based on the Gaussian reflection of light from a convex reflector [15,16]. Figure 4a illustrates a schematic of the sensing configuration, showing a cross-sectional view of one sensor placed around the rotating convex reflector component and how the distance 'd' between the sensor and reflector is calculated. Given the centre of rotation O, the vector f from O to the sensor and the vector s0 from O to the convex reflector centre C are known. Upon rotation of the unit about the origin in both pitch and roll, a new vector s1 can be calculated using Equation (1), using the rotation matrices Rx and Ry. From this, the distance d can be calculated as shown in Equations (2) and (3).
s1 = Rx Ry s0        (1)
p = f − s1        (2)
d = ‖p‖ − rs        (3)
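Equations (1)–(3) can be sketched numerically as follows. The vector values used in the usage note (sensor position f, reflector-centre vector s0, reflector radius rs) are illustrative placeholders, not the prototype's actual geometry:

```python
import numpy as np

def rotation_x(alpha):
    """Rotation matrix Rx about the x axis (roll), angle in radians."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c, -s],
                     [0.0, s, c]])

def rotation_y(gamma):
    """Rotation matrix Ry about the y axis (pitch), angle in radians."""
    c, s = np.cos(gamma), np.sin(gamma)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def sensor_to_reflector_distance(f, s0, r_s, roll, pitch):
    """Distance d between the sensor face and the reflector surface."""
    s1 = rotation_x(roll) @ rotation_y(pitch) @ s0   # Eq. (1)
    p = f - s1                                       # Eq. (2)
    return np.linalg.norm(p) - r_s                   # Eq. (3)
```

For example, with f = (0, 0, 10), s0 = (0, 0, 5), and rs = 2 (arbitrary units), the neutral-pose distance is 3, and tilting the unit by 15° in pitch increases d, reproducing the proximity modulation described above.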
Considering Figure 4b, the schematic can be simplified with a few assumptions to set up a mathematical model of the sensing principle. As the LED on the sensor is quite small, the beam of light it emits can be treated as coming from a point source. Due to the small scale, the area on which the light is incident on the reflector is also small, so the reflector is taken to be a spherical convex reflector with a constant radius about centre C'. Using this, the values of d, and the known phototransistor area (sarea) (Figure 4b), a theoretical reflected light intensity can be calculated using the Gaussian power equation in Equation (4) [15]. Power is directly proportional to intensity and depends on the cross-sectional area along the light beam. Therefore, the initial power emitted by the LED, PE, indicates the input intensity, and the reflected power, P, indicates the output intensity detected by the PT. This equation essentially integrates the reflected light beam, of cross-sectional radius ω at the distance of the projection plane, over the circular cross section sarea of the PT.
P = PE sarea / (π ω²)        (4)
where ω = f(d) is the cross-sectional radius of the reflected beam at the projection plane.
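Equation (4) can be sketched as follows. Since the exact beam-spread function ω = f(d) is not reproduced here, a simple linear spread is assumed for illustration; the constants in `beam_radius` are placeholders, not values from the paper:

```python
import numpy as np

def beam_radius(d, omega0=0.2, k=0.5):
    """Illustrative beam-spread model for omega = f(d): the reflected
    beam's radius grows with distance d. The true spread function depends
    on the LED emission cone and reflector curvature (assumed, not given)."""
    return omega0 + k * d

def reflected_power(P_E, s_area, omega):
    """Equation (4): power collected by a phototransistor of area s_area
    from a reflected beam of cross-sectional radius omega."""
    return P_E * s_area / (np.pi * omega**2)
```

Because P falls as 1/ω² and ω grows with d, the collected power decreases monotonically with distance, which is the proximity modulation the sensors exploit.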
In order to conceptualise whether light intensity values from two sensors can be used to uniquely estimate the pitch and roll of one rotational unit, a theoretical simulation was carried out using the mathematical model. A series of theoretical pitch and roll values was chosen, giving two sets of distances 'd', one for each sensor in a rotational unit. From these, two sets of theoretical output intensity values were calculated using Equation (4). MATLAB (R2022) was then used to generate a calibration matrix by calculating coefficients through a linear regression algorithm based on a least-squares approach. Equation (5) shows the calibration matrix for estimating orientation (pitch (γ) and roll (α)) from the theoretical output intensities (P1 and P2) using the coefficients k1–k4:
[k1 k2; k3 k4] [P1; P2] = [α; γ]        (5)
Another set of pitch and roll values was then used to obtain output intensity values for the two sensors (Equation (4)); these were multiplied by the coefficient matrix to estimate pitch and roll (α and γ) and plotted against the actual values, as seen in Figure 5. The estimations overlap substantially with the actual values; any deviation is likely due to the simplifications in the model. Nonetheless, orientation motion patterns can be estimated with reasonable accuracy from two theoretical intensity values belonging to two sensors. Therefore, as intensity is proportional to the voltage induced in the phototransistor of the optoelectronic sensor, it can be deduced that the sensing principle should work in practice and that two sensor voltage values can be used to estimate two orientations during motion.
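The least-squares step described above can be sketched in Python (in place of MATLAB). The data here are synthetic: the mixing matrix `true_K` is an arbitrary illustrative choice used only to generate intensities consistent with Equation (5):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth orientations [alpha, gamma] in degrees.
angles = rng.uniform(-15.0, 15.0, size=(200, 2))

# Synthetic intensities chosen so that K @ [P1, P2] = [alpha, gamma] holds
# exactly for the illustrative matrix below (Equation (5)).
true_K = np.array([[0.8, -0.3],
                   [0.2, 0.9]])
P = angles @ np.linalg.inv(true_K).T

# Least-squares estimate of the 2x2 calibration matrix (coefficients k1-k4).
K_fit, *_ = np.linalg.lstsq(P, angles, rcond=None)
K = K_fit.T

# Apply the calibration: recover orientations from intensities.
est = P @ K.T
```

On this noise-free synthetic data the fit recovers `true_K` and the estimated orientations match the actual ones, mirroring the overlap seen in Figure 5.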

4. Experimental Platform

Sensor Calibration Platform

The experimental platform was designed as shown in Figure 2, comprising a section of a flexible tendon-actuated robot. For calibration of each pair of sensors at each unit, the process shown in Figure 6 was used. This utilised 3D-printed rigid fixtures that could be screwed around the robotic structure to limit its motion. For example, to calibrate the two sensors on the top unit, the fixtures were fixed onto the three lower units, removing any capability for motion in these units while allowing tendon-actuated motion in the top unit. A motion pattern was generated to cover all angles in both pitch and roll up to ±15°, in increments of 0.1°: the pitch is incremented by 0.1°, the unit is swept through the full ±15° roll range in 0.1° increments, the pitch is incremented again, and the next roll sweep is performed. This is repeated until all pitch and roll combinations within ±15° are covered for that unit. During the motion, both optoelectronic sensor voltage readings and IMU orientation data were recorded, providing a full set of sensor readings over the full range of motion of that rotational unit. A set of this data for one rotational unit can be seen in Figure 7. The calibration was carried out using the IMU data as ground truth; these data, along with the sensor voltage readings, were input into a non-linear regression algorithm to find coefficients that transform the two sensor voltages into estimates of pitch and roll (γ and α). The data were fed into MATLAB to generate a set of 8 coefficients per orientation estimate, (k1–8) and (j1–8), mapping the two voltage values (v1, v2) to estimates of pitch (γi) and roll (αi). This process was repeated for all four units (i = 1 to i = 4), involving each pair of sensors.
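The nested pitch/roll sweep described above can be sketched as a simple generator of target orientations:

```python
import numpy as np

def calibration_sweep(max_angle=15.0, step=0.1):
    """Generate the nested calibration pattern: for each pitch increment,
    sweep roll across its full range, covering all combinations in ±max_angle."""
    vals = np.arange(-max_angle, max_angle + step / 2, step)
    return [(float(pitch), float(roll)) for pitch in vals for roll in vals]

pattern = calibration_sweep()
# 301 pitch values x 301 roll values = 90,601 target orientations per unit
```

Each (pitch, roll) pair would be commanded to the free unit while the optoelectronic voltages and IMU orientation are logged.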
The equation below (Equation (6)) shows the calculation for the orientation estimations using the sensor voltage data and calibration coefficients. As such, an estimation of pitch and roll could be made at each rotational unit, and therefore, an estimation of the total orientation of the full manipulator could be made. Validation of the calibration results is carried out by setting the full robot segment to move to its maximum angle range of ±60° in both orientations. All sensor voltages are recorded during this motion and multiplied by the calibration coefficients to give estimated orientations during motion, which are compared to the orientations given by the IMU sensor. These results are discussed in the following section. Regarding the light intensity model described in Section 3, we can further evaluate the validity of this by comparing sensor output voltages during motion to estimated light intensities given the same motion orientation pattern. This is shown in Figure 8, while calibration validation is shown in Figure 9 and Table 1.
γi = k1i v1i + k2i v2i + k3i v1i² + k4i v2i² + k5i v1i v2i + k6i v1i v2i² + k7i v2i v1i² + k8i
αi = j1i v1i + j2i v2i + j3i v1i² + j4i v2i² + j5i v1i v2i + j6i v1i v2i² + j7i v2i v1i² + j8i,   i = 1, 2, 3, 4        (6)
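Although the text describes a non-linear regression, the model of Equation (6) is linear in its coefficients, so an ordinary least-squares fit suffices for a sketch of the per-unit calibration (variable names are illustrative):

```python
import numpy as np

def features(v1, v2):
    """The eight polynomial terms of Equation (6) for one sensor pair."""
    v1 = np.asarray(v1, float)
    v2 = np.asarray(v2, float)
    return np.column_stack([v1, v2, v1**2, v2**2, v1 * v2,
                            v1 * v2**2, v2 * v1**2, np.ones_like(v1)])

def fit_unit_calibration(v1, v2, pitch, roll):
    """Least-squares estimates of k1-k8 (pitch) and j1-j8 (roll) from
    recorded sensor voltages and IMU ground-truth orientations."""
    X = features(v1, v2)
    k, *_ = np.linalg.lstsq(X, np.asarray(pitch, float), rcond=None)
    j, *_ = np.linalg.lstsq(X, np.asarray(roll, float), rcond=None)
    return k, j

def estimate_orientation(k, j, v1, v2):
    """Apply Equation (6): map voltages to (pitch, roll) estimates."""
    X = features(v1, v2)
    return X @ k, X @ j
```

Repeating `fit_unit_calibration` for each of the four units yields the full set of coefficients used to estimate the manipulator's overall shape.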

5. Results and Analysis

Figure 7 shows the output of the calibration for one unit of the robotic structure. It shows the recorded IMU data (Figure 7a), with both pitch and roll measurements within the range of ±15° in the incremental steps described previously. These data were recorded along with the two sets of sensor voltages during motion, as shown in Figure 7b. As can be seen, the motion data are more stable than in the previously referenced technique [14], which used a PLA-printed ball joint rather than the spherical bearing used here. The bearing eliminated some of the friction-related noise previously seen in the sensor data, and the motion is much smoother. Next, Figure 8 shows the graphs for sensors 1 and 2 during the calibration motion of one rotational unit, comparing the sensor voltage output to the estimated relative reflected light intensity. The estimated light intensity generally follows the sensor voltage pattern, but inverted. This is because the light intensity collected by the phototransistor is proportional to its collector current, and a larger collector current produces a larger voltage drop at the collector output; the measured output voltage therefore falls as the detected intensity rises. This was also seen when comparing Figure 2d and Figure 2e, which both show the characteristic optoelectronic sensor behaviour over proximity to a reflective surface: Figure 2d displays the output voltage, and Figure 2e displays its inverse, the relative output current. As such, the theoretical model must in future be updated to relate the sensor voltage output to the theoretical intensity it currently describes.
In another aspect of this comparison (Figure 8), the sensor 1 and sensor 2 voltage outputs differ in range and starting level, whereas the estimated theoretical intensity outputs are almost identical in range and starting value, indicating symmetry between the two sensors' outputs during motion, as described by the mathematical model. A reason this symmetry is not reflected in the real sensor output may be the non-symmetric placement of the optoelectronic sensor pairs on the PCB strip (Figure 2g) with regard to the orientation of the LED and PT. This resulted in slightly different sensor responses, which may have affected the range and sensitivity of the sensing system. This can be rectified in future prototypes by reorienting the optoelectronic sensors on one side of the flexible PCB strips, which could yield a sensor voltage output that matches the theoretical model more predictably. Nonetheless, the model can be shown to predict tendencies displayed in the experiments.
For the calibration validation results shown in Figure 9, it can be seen that the orientation estimation was successful. The maximum motion range of ±60° in purely pitch, as well as purely roll, orientations was carried out over four cycles. The orientation estimations based on the calibration coefficients calculated through the non-linear regression model (Equation (6)) closely follow the orientation given by the IMU sensor, with a maximum RMS error of 3.27° and a percentage error of 0.77% (Table 1). The maximum tip orientation estimation error was 6.03° for the pitch motion shown in Figure 9, and 6.86° for the roll motion. As a measure of repeatability, the mean standard deviation over four cycles of motion within ±60° was 1.3128° for pitch and 1.0274° for roll. The shape estimation is relatively successful; however, small errors at individual points sum to larger tip position and orientation estimation errors. These point errors may arise from interference between the sensor pairs, where infrared light emitted from the LED of one optoelectronic sensor is reflected into the phototransistor of the adjacent sensor; this can affect calibration and orientation estimation accuracy. Future prototypes will address this issue with a new PCB strip incorporating a current-switching circuit based on a demultiplexer, alternating power between adjacent optoelectronic sensors at high frequency so that adjacent sensors are never on at the same instant. This would also reduce overall power consumption, which supports the safety of the system for various applications.
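The error metrics reported above can be computed as follows. The RMS error follows the standard definition; the normalisation used for the percentage error is not stated explicitly above, so expressing it relative to the full ±60° (120°) motion range is an assumption made here for illustration:

```python
import numpy as np

def rms_error(estimated, ground_truth):
    """Root-mean-square error between estimated and IMU orientations (degrees)."""
    e = np.asarray(estimated, float) - np.asarray(ground_truth, float)
    return float(np.sqrt(np.mean(e**2)))

def percentage_error(estimated, ground_truth, motion_range=120.0):
    """RMS error as a percentage of the motion range (normalisation assumed)."""
    return 100.0 * rms_error(estimated, ground_truth) / motion_range
```

These functions would be applied per orientation (pitch and roll) over the four validation cycles, comparing the Equation (6) estimates against the IMU readings.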

6. Conclusions and Future Works

In conclusion, the optoelectronic shape sensing technique coupled with a convex-shaped reflector successfully estimates orientation in two degrees of orientation. Compared to previous prototypes, the sensing configuration is greatly reduced in size and uses fewer sensors while covering the same motion range of the robotic manipulator. The single flexible circuit eliminates wiring for a simpler design, and the spherical bearing in the robot structure allows smoother motion and consequently better calibration results. Several features can be addressed in subsequent prototypes to improve suitability for targeted applications. In this prototype, the size was deliberately kept large to allow easy testing of parameters such as sensor placement and to simplify the mechanical calibration process. Now that proof of concept has been achieved and the principles of the shape sensing are understood, future designs will be scaled down, and their safety with respect to the total electric current drawn by multiple optoelectronic sensors across all segments must be guaranteed. Regarding the size issue, the robot structure can feasibly be miniaturised to a diameter of 12 mm through the use of smaller spherical bearings. While most MIS-based continuum robots require a smaller diameter, some applications have used robots of up to 12 mm [17,18], and applications such as airplane engine inspection and repair, as well as exploration, have used continuum robots with diameters between 12 and 25 mm [19,20,21,22].
Regarding the safety and interference issues, a current-switching circuit using a demultiplexer (DEMUX) and transistors is proposed, supplying current to only a subset of the optoelectronic sensors at any instant, thereby reducing overall power consumption in the robotic manipulator and eliminating interference between adjacent optoelectronic sensors. The maximum angle measurement range of this shape sensing mechanism is currently limited by the number of units in one segment (four), so two more units will be added to extend the measurement range up to 90°. A soft cover will enclose the continuum robot to shield it from external light, protect it from damage by chemicals or other substances, and maintain accurate calibration. Temperature compensation circuits will be added to the optoelectronic sensors when future experiments are carried out under more realistic task conditions. A full-length robot integrating this shape sensor will be designed to allow testing in phantom environments. Nonetheless, the results are promising and show potential for further development of shape sensing in flexible robotic applications.
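The proposed DEMUX scheme amounts to a time-multiplexed power schedule in which no two adjacent sensors are ever on in the same slot. The sketch below simulates such a schedule in software; the sensor count and two-phase (even/odd) scheme are assumptions for illustration, not the paper's final circuit design.

```python
# Hypothetical sketch of the proposed DEMUX-based current switching:
# only non-adjacent sensors share a power slot, so the LED of one
# sensor can never illuminate the phototransistor of its neighbour.

SENSORS = [0, 1, 2, 3, 4, 5, 6, 7]  # assumed sensor count along the PCB strip

def demux_schedule(sensors, n_slots):
    """Yield (slot, powered) pairs: even-indexed sensors in even slots,
    odd-indexed sensors in odd slots, alternating at high frequency."""
    for slot in range(n_slots):
        powered = [s for s in sensors if s % 2 == slot % 2]
        yield slot, powered

for slot, powered in demux_schedule(SENSORS, 4):
    # Verify no two adjacent sensors appear in the same slot.
    assert all(b - a >= 2 for a, b in zip(powered, powered[1:]))
    print(f"slot {slot}: power sensors {powered}")
```

Driven fast enough relative to the sampling rate, such alternation halves the instantaneous LED current draw while keeping every sensor's reading available on every full cycle.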

Author Contributions

Validation, D.O. and Y.N.; Investigation, D.O. and Y.N.; Writing—original draft, D.O. and Y.N.; Writing—review & editing, D.O. and Y.N.; Visualization, D.O. and Y.N.; Supervision, Y.N.; Co-Supervision: X.D. and T.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Engineering and Physical Sciences Research Council, grant number (EP/T518116/1).

Data Availability Statement

The data presented in this study are available in this article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Yamauchi, Y.; Ambe, Y.; Nagano, H.; Konyo, M.; Bando, Y.; Ito, E.; Arnold, S.; Yamazaki, K.; Itoyama, K.; Okatani, T.; et al. Development of a continuum robot enhanced with distributed sensors for search and rescue. Robomech. J. 2022, 9, 8. [Google Scholar] [CrossRef]
  2. Dong, X.; Palmer, D.; Axinte, D.; Kell, J. In-situ repair/maintenance with a continuum robotic machine tool in confined space. J. Manuf. Process. 2019, 38, 313–318. [Google Scholar] [CrossRef]
  3. Dong, X.; Axinte, D.; Palmer, D.; Cobos, S.; Raffles, M.; Rabani, A.; Kell, J. Development of a slender continuum robotic system for on-wing inspection/repair of gas turbine engines. Robot. Comput. Integr. Manuf. 2017, 44, 218–229. [Google Scholar] [CrossRef]
  4. Burgner-Kahrs, J.; Rucker, D.C.; Choset, H. Continuum Robots for Medical Applications: A Survey. IEEE Trans. Robot. 2015, 31, 1261–1280. [Google Scholar] [CrossRef]
  5. Zhang, Z.; Shang, J.; Seneci, C.; Yang, G.Z. Snake robot shape sensing using micro-inertial sensors. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 831–836. [Google Scholar] [CrossRef]
  6. Lapusan, C.; Hancu, O.; Rad, C. Shape Sensing of Hyper-Redundant Robots Using an AHRS IMU Sensor Network. Sensors 2022, 22, 373. [Google Scholar] [CrossRef] [PubMed]
  7. Sofla, M.S.; Sadigh, M.J.; Zareinejad, M. Shape estimation of a large workspace continuum manipulator with fusion of inertial sensors. Mechatronics 2021, 80, 102684. [Google Scholar] [CrossRef]
  8. Dementyev, A.; Kao, H.L.C.; Paradiso, J.A. SensorTape: Modular and programmable 3D-aware dense sensor network on a tape. In Proceedings of the UIST 2015—28th Annual ACM Symposium on User Interface Software and Technology, Charlotte, NC, USA, 11–15 November 2015; pp. 649–658. [Google Scholar] [CrossRef]
  9. Shi, C.; Luo, X.; Qi, P.; Li, T.; Song, S.; Najdovski, Z.; Fukuda, T.; Ren, H. Shape Sensing Techniques for Continuum Robots in Minimally Invasive Surgery: A Survey. IEEE Trans. Biomed. Eng. 2017, 64, 1665–1678. [Google Scholar] [CrossRef] [PubMed]
  10. So, J.; Kim, U.; Kim, Y.B.; Seok, D.-Y.; Yang, S.Y.; Kim, K.; Park, J.H.; Hwang, S.T.; Gong, Y.J.; Choi, H.R. Shape Estimation of Soft Manipulator Using Stretchable Sensor. Cyborg Bionic Syst. 2021, 2021, 9843894. [Google Scholar] [CrossRef] [PubMed]
  11. Li, J.; Zhang, F.; Yang, Z.; Jiang, Z.; Wang, Z.; Liu, H. Shape Sensing for Continuum Robots by Capturing Passive Tendon Displacements with Image Sensors. IEEE Robot. Autom. Lett. 2022, 7, 3130–3137. [Google Scholar] [CrossRef]
  12. Jäckle, S.; Eixmann, T.; Schulz-Hildebrandt, H.; Hüttmann, G.; Pätz, T. Fiber optical shape sensing of flexible instruments for endovascular navigation. Int. J. Comput. Assist. Radiol. Surg. 2019, 14, 2137–2145. [Google Scholar] [CrossRef] [PubMed]
  13. Li, R.; Tan, Y.; Chen, Y.; Hong, L.; Zhou, Z. Investigation of sensitivity enhancing and temperature compensation for fiber Bragg grating (FBG)-based strain sensor. Opt. Fiber Technol. 2019, 48, 199–206. [Google Scholar] [CrossRef]
  14. Osman, D.; Du, X.; Li, W.; Noh, Y. Development of an Optical Shape Sensing Method Using Optoelectronic Sensors for Soft Flexible Robotic Manipulators in MIS. IEEE Trans. Med. Robot. Bionics 2022, 4, 343–347. [Google Scholar] [CrossRef]
  15. Osman, D.; Li, W.; Du, X.; Minton, T.; Noh, Y. Miniature Optical Joint Measurement Sensor for Robotic Application Using Curvature-Based Reflecting Surfaces. IEEE Sens. Lett. 2022, 6, 3501304. [Google Scholar] [CrossRef]
  16. Gaikwad, A.D.; Gawande, J.P.; Joshi, A.K.; Chile, R.H. An intensity-modulated optical fiber sensor with concave mirror for measurement of displacement. J. Opt. 2013, 42, 300–306. [Google Scholar] [CrossRef]
  17. Degani, A.; Choset, H.; Wolf, A.; Zenati, M.A. Highly articulated robotic probe for minimally invasive surgery. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation, 2006, ICRA 2006, Orlando, FL, USA, 15–19 May 2006. [Google Scholar] [CrossRef]
  18. Garbin, N.; Di Natali, C.; Buzzi, J.; De Momi, E.; Valdastri, P. Laparoscopic tissue retractor based on local magnetic actuation. J. Med. Devices 2015, 9, 011005. [Google Scholar] [CrossRef]
  19. Wang, Y.; Ju, F.; Cao, Y.; Yun, Y.; Wang, Y.; Bai, D.; Chen, B. An aero-engine inspection continuum robot with tactile sensor based on EIT for exploration and navigation in unknown environment. In Proceedings of the 2019 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Hong Kong, China, 8–12 July 2019. [Google Scholar] [CrossRef]
  20. Dong, X.; Wang, M.; Mohammad, A.; Ba, W.; Russo, M.; Norton, A.; Kell, J.; Axinte, D. Continuum robots collaborate for safe manipulation of high-temperature flame to enable repairs in challenging environments. IEEE/ASME Tran. Mechatron. 2022, 27, 4217–4220. [Google Scholar] [CrossRef]
  21. Liu, B.; Ozkan-Aydin, Y.; Goldman, D.I.; Hammond, F.L. Kirigami Skin Improves Soft Earthworm Robot Anchoring and Locomotion Under Cohesive Soil. In Proceedings of the 2019 2nd IEEE International Conference on Soft Robotics (RoboSoft), Seoul, Republic of Korea, 14–18 April 2019; pp. 828–833. [Google Scholar] [CrossRef]
  22. Das, R.; Babu, S.P.M.; Palagi, S.; Mazzolai, B. Soft Robotic Locomotion by Peristaltic Waves in Granular Media. In Proceedings of the 2020 3rd IEEE International Conference on Soft Robotics (RoboSoft), New Haven, CT, USA, 15 May–15 July 2020; pp. 223–228. [Google Scholar] [CrossRef]
Figure 1. Optoelectronic shape sensing for a tendon-actuated robot. A spherically convex-shaped reflector is shown along with the optoelectronic sensor placement (designed in Onshape 2014).
Figure 2. Tendon-actuated robot (a) with integrated optoelectronic sensor strip (b) for shape sensing, using NJL5901R-2 (c,f) optoelectronic sensors. (d,e) show characteristic sensor output. (g) shows optoelectronic sensors positioning on PCB.
Figure 3. (a) Integration of optoelectronic PCB strip into robotic structure (left), with optoelectronic sensor placement around the reflector shown in the transparent image on the right. (b) Cross-sectional views of light path from sensor to reflector, under different orientation rotations.
Figure 5. Estimated pitch and roll based on the theoretical light intensity values P1 and P2, compared to simulated motion pitch and roll orientations (orange), shown in (a) and (b), respectively.
Figure 6. Calibration platform using motion-locking fixtures (orange) to allow independent calibration of each unit’s sensors during a full range of motion in two orientations. The IMU sensor (blue) is attached to the tip unit.
Figure 7. Output for single unit calibration, displaying a full set of recorded IMU orientation data (a), along with sensor voltage output for the pair of sensors (b).
Figure 8. Comparing sensor voltage output with theoretically estimated light intensity for both sensors in one rotational unit of the robotic manipulator, for sensor 1 (a), and sensor 2 (b).
Figure 9. Calibration validation results. The robotic manipulator was moved freely within the range of ±60° in both orientations, and estimated orientation using calibration coefficients is compared to IMU orientation.
Table 1. Calibration validation error.
| Orientation | % Error | RMS Tip Error (°) | Maximum Tip Error (°) |
|-------------|---------|-------------------|-----------------------|
| Pitch       | 0.77    | 2.45              | 6.03                  |
| Roll        | 0.21    | 3.27              | 6.86                  |
