Review

A Suite of Robotic Solutions for Nuclear Waste Decommissioning

1 School of Electronic Engineering and Computer Science (EECS), Queen Mary University of London, Mile End Road, London E1 4NS, UK
2 School of Engineering and Materials Science (SEMS), Queen Mary University of London, Mile End Road, London E1 4NS, UK
3 Department of Engineering and Design, University of Sussex, Falmer, Brighton BN1 9RH, UK
* Author to whom correspondence should be addressed.
Robotics 2021, 10(4), 112; https://doi.org/10.3390/robotics10040112
Submission received: 23 August 2021 / Revised: 27 September 2021 / Accepted: 29 September 2021 / Published: 7 October 2021
(This article belongs to the Special Issue Advances in Robots for Hazardous Environments in the UK)

Abstract

Dealing safely with nuclear waste is an imperative for the nuclear industry. Increasingly, robots are being developed to carry out complex tasks such as perceiving, grasping, cutting, and manipulating waste. Radioactive material can be sorted, and either stored safely or disposed of appropriately, entirely through the actions of remotely controlled robots. Radiological characterisation is also critical during the decommissioning of nuclear facilities. It involves the detection and labelling of radiation levels, waste materials, and contaminants, as well as determining other related parameters (e.g., thermal and chemical), with the data visualised as 3D scene models. This paper overviews work by researchers at the QMUL Centre for Advanced Robotics (ARQ), a partner in the UK EPSRC National Centre for Nuclear Robotics (NCNR), a consortium working on the development of radiation-hardened robots fit to handle nuclear waste. Three areas of nuclear-related research are covered here: human–robot interfaces for remote operations, sensor delivery, and intelligent robotic manipulation.

1. Introduction

As robotic technologies and artificial intelligence have advanced, roboticists have turned their attention to the challenge of extending autonomous operations to ever more complex environments. The big “success story” in terms of robot deployment in recent decades has undoubtedly been industrial automation. Reaching the same levels of reliability and consistency in performance remains a challenge in more extreme environments—the nuclear industry being a prime example.
The harsh environment and the precarious nature of the physical tasks involved pose several challenges for deploying robots in nuclear clean-up. Nuclear decommissioning use cases present requirements that a standard commercial system would not be able to meet: factors such as radiation effects, remote maintenance, and deployment constraints rarely need to be reckoned with in commercial settings. This paper reviews recent research by the authors that addresses some of these challenges from the perspective of the nuclear sector. It also examines the adaptations required to facilitate crossover from mainstream robotics.
The robotics teams at QMUL, a partner in the UK EPSRC National Centre for Nuclear Robotics, have contributed to the three areas of research summarised in Figure 1: human–robot interfaces, radiation sensor delivery, and robotic manipulation.
Figure 1 additionally outlines a possible route to the implementation of robotic tasks in nuclear decommissioning: a human operator programmes the robot through a user-friendly GUI, or directly teleoperates the robot to perform a required task. The task could be the delivery of appropriate sensors, e.g., radiation sensors, to a required location; or the physical exploration of an environment using visual, proximity, and tactile sensors; or the grasping and manipulation of objects in a remote, potentially radioactive environment.
This paper is structured as follows. Section 2 motivates the discussion and introduces an illustrative use case for the integration of the proposed solutions. Section 3 focuses on robotic grasping and manipulation. Section 4 covers sensors and sensor deployment with the help of robotic manipulators. Section 5 overviews human–robot interfaces for performing robotic tasks (either automatic or teleoperated). Section 6 concludes the paper and outlines future challenges.

2. Robotics Research for the Nuclear Environment—Overview and Applications

The clean-up of nuclear power plants involves the dismantling and decontamination of industrial infrastructure, along with the decommissioning of nuclear waste. High-energy radiation sources hasten the degradation of industrial equipment through exposure to ionising radiation, with the equipment eventually becoming inoperative and having to be dismantled. Surfaces and machinery that have come into contact with radioactive material will, on the other hand, need to be decontaminated. The hazardous nature of this activity and the toxicity of all substances and contaminated infrastructure within the environment necessitate the wearing of cumbersome protective gear, such as hazmat suits, by human operatives in the field. The work is therefore not only dangerous but difficult. A label often attached to nuclear sites subject to decommissioning is that of “extreme environment”, a descriptor also used to designate other environments that share some of the same operating characteristics—typically outer space, deep-sea and deep mines [1]. These kinds of environments present a strong case for increased automation and remote operation as well as the replacement of human workers by robots [2]. As a consequence, there has been a surge of interest from roboticists and industry stakeholders as regards extreme environments. Despite this, there remains little provision for automation, most notably in older facilities, and manual operations still make up the bulk of the clean-up effort.
Robots have played a key role in the nuclear sector dating back to atmospheric nuclear weapons testing and the clean-up of Three Mile Island [3]. Typically, they have been used to access radiologically hazardous areas and to collect samples from which to ascertain the scope and magnitude of the risk. More recently, in the wake of robotic deployment related to the Fukushima nuclear disaster, public interest was piqued by the idea that robots could potentially stand in for human rescuers in certain situations. Despite some negativity and scepticism apparent in media coverage (“dying” robots “failing” the clean-up effort), robots have shown considerable value to the US DOE’s clean-up efforts, especially in decommissioning nuclear waste that had accumulated as a by-product of its weapons programme [4].
For industrial applications more generally, the introduction of robotic systems has seen steady increase over the last half-century, and the trend shows no sign of abating. Incremental improvements in robotic technologies have led to a rise in productivity, improved safety, and a reduction in costs, among other frequently cited benefits, largely confined to robots in highly structured environments [5]. The hope is that the same benefits can transfer over to more complex, less structured working environments. As robotic technologies continue to advance, so too will the opportunities for technology transfer from one sector to another, be it from manufacturing to nuclear or, even, vice versa.
Several ground rules have been established in relation to the development of nuclear robotics [5]. These include utilising existing equipment as much as possible rather than creating entirely new robotic systems; sticking to tethered rather than wireless control, so that a robot can always be retrieved should problems arise; ensuring robots are small enough to fit through interior hatch openings; and making them sufficiently robust and waterproof to cope with underwater work and withstand high-pressure spraying during post-work decontamination.
In the past, developing bespoke equipment has generally been the default route for nuclear applications. Experience suggests, however, that this is not always optimal, as pre-existing commercial developments and applications can sometimes provide directly transferable technologies and solutions. An example of this is teleoperated robot hand control: a robust, ergonomically designed, multi-function handheld joystick controller can be used for many hours without causing operator discomfort [6]. While developing and testing a device from scratch would be both costly and time-consuming, the gaming industry has already made huge advances in this regard, and it would therefore make sense to look into how best to transfer these kinds of commercial off-the-shelf (COTS) technologies to nuclear applications.

An Illustrative Use Case

Let us consider the integrated solution depicted in Figure 1 and the use case illustrated in Figure 2. Nuclear facilities are often housed in large buildings, where nuclear waste is stored. Such buildings are often completely sealed—no door, window, or any other aperture is present. The stored waste can remain inside these buildings for a long time, without any monitoring, and it, therefore, remains impossible to ascertain the precise state of the waste containers or the building itself. A way for our integrated solution to address this scenario would involve creating a small aperture in the outer wall and introducing sensors mounted on a robot arm; the arm would squeeze through to the other side and proceed to monitor the inside of the building. More specifically:
- A soft eversion robot (see Section 4.3) can enter the building through the small aperture;
- The robot can be equipped with sensors, and specific protocols can be employed to inspect the surfaces of the walls and the waste containers, either using touch (see Section 4.2) or a combination of vision and touch (see Section 4.1), to evaluate their structural integrity;
- The robot can be equipped with a gripper (see Section 3.1) to grasp and manipulate objects (see Section 3.2) inside the building;
- The movements of the robot can be controlled by human teleoperation with haptic feedback (see Section 5) or can be programmed to be autonomous (see Section 3.3).

3. Robotic Grasping and Manipulation

3.1. Task-Oriented Design and System Development of a Dexterous Robotic Gripper

Deployable manipulators equipped with grippers are key to enabling robots to replace human operators within a radiation-exposed nuclear environment (Figure 2). Although rigid-bodied robots have been specially developed to perform essential tasks in extreme environments, they come equipped with electronic and electrical components that are not radiation-proof, and are therefore susceptible to the ill-effects of radiation [7].
Despite the extremely limited choice of rad-hard components (a combined consequence of extremely high material costs and a relative dearth of suppliers), electronic-component-free robotic devices have enormous potential for applications in radioactive environments. The key factors in the development of radiation-resistant robotic grippers for remote operation using mechanical and materials intelligence can be classified as follows [8]:
Leader/follower operation: Though the traditional master/slave system for manipulation of hazardous material may fall short in terms of the required distance between operator and robot, this system has advantages in terms of its simplicity and affordability [9].
Underactuated design for higher affordance: In the development of robotic grippers, the concept of underactuation has been widely adopted as an effective approach to embedding a high number of degrees of freedom (DOFs) without increasing the number of actuators that need to be controlled. An underactuated robotic gripper not only reduces the complexity of its actuation and control systems but also lends itself to inherent compliance and adaptability in relation to how it interacts with its environment [10,11].
Materials intelligence: The use of deformable materials could also contribute to inherent compliance, thereby engendering greater self-adaptability and dexterity. Deformable soft-bodied robots are lightweight and flexible while offering high payload capacity and resistance, a combination ideally suited to extreme scenarios, among them certain nuclear applications requiring power and flexibility [12,13].
High power actuation and variable stiffness: Pneumatic actuators operate by channelling compressed air through tubes into soft materials, effectively acting as muscles. The principal benefits of this type of system are its simplicity and its ability to generate large forces. To increase dexterity, pneumatic pouch actuator-driven variable stiffness control can be achieved by using long tubes at a safe distance that allow the gripper to better adapt to objects of varying shape [14,15].
Waterproof and low cost: With new robotic technologies such as origami-inspired design/production and pneumatic actuation, soft robotic systems, made from deformable PVC materials, are capable of high payload/mass ratio, demonstrate high reliability and scalability, are waterproof, and can be easily integrated into existing equipment. They can also be built at a relatively low cost [16,17,18].
As with all rigid-bodied robotic devices, conventional grippers have limited capability when interacting with irregularly shaped objects, unless equipped with a considerable number of actuators and high-resolution sensors. In contrast, soft robotic grippers have a proven ability to grasp a wide variety of objects. Bearing in mind the dual objectives of minimising the use of electronic components and keeping controllers and power sources away from any radiation-contaminated environment, we proposed a novel design of flexure hinges and incorporated them into a robotic gripper. These hinges had adjustable stiffness (realised via shape morphing—see below) and were built into the gripper fingers, along with pneumatic pouch actuators. This enabled the deployment of the gripper within confined spaces, and indeed even in water ponds within the nuclear plant [19].
The stiffness variation described above is, as suggested, achieved through shape morphing. For a homogeneous beam of uniform cross-section, the flexural stiffness about any specific axis of bending is a function of the shape of that cross-section: the cross-sectional area and the distribution of material about the centroid determine the flexural stiffness of the beam. The cross-section of the flexure hinges in the fingers shows two distinct elements—a thick central region and two thin, bendable adjoining flaps (shown in Figure 3). An embedded pneumatic actuator attached to the two flaps can be pressurised to vary the flap angle and thus control the overall flexural stiffness of the hinge.
When a moment M is applied to the beam, the curvature k about the moment axis is given by k = M/(EI), where E is the Young's modulus of the material and I is the second moment of area about the neutral axis in the plane of the moment axis. Initially, the flap is horizontal and aligned with the longer axis of symmetry of the flexure hinge's cross-section (as seen in Figure 3 for a flap angle of 0°). The second moment of area is at its lowest in this condition and the flexure hinge offers little resistance to bending. When the flap bends, the second moment of area of the cross-section increases overall. In addition to the change in orientation of the flap, the shift of the neutral axis away from its initial position also increases the second moment of area. For a flap of width 1 cm and height 2 mm, the second moment of area about the horizontal axis of symmetry is 25 times that about the vertical axis of symmetry. Thus, changing the orientation of the flaps can effect a substantial change in the stiffness of the beam. Figure 3 shows a representational drawing of the change in flexural stiffness with varying flap angle. As the flap angle increases from 0° to 60°, the ratio of the flexural stiffness to the initial flexural stiffness, EI/(EI)_0, increases due to the increase in the second moment of area; resistance to bending increases accordingly.
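As a quick check of the factor of 25 quoted above, one can treat the flap as a plain rectangle of width b = 10 mm and height h = 2 mm (this ignores the thick central region; the axis labels follow Figure 3):

```latex
% Second moments of area of a b x h rectangular flap about its two centroidal axes:
I_{\mathrm{weak}}   = \frac{b h^{3}}{12} = \frac{10 \times 2^{3}}{12} \approx 6.7\ \mathrm{mm}^{4}
I_{\mathrm{strong}} = \frac{h b^{3}}{12} = \frac{2 \times 10^{3}}{12} \approx 166.7\ \mathrm{mm}^{4}
\frac{I_{\mathrm{strong}}}{I_{\mathrm{weak}}} = \left(\frac{b}{h}\right)^{2} = \left(\frac{10}{2}\right)^{2} = 25
```

The ratio depends only on the aspect ratio b/h, which is why even a thin flap yields a large stiffness change when reoriented.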
As Figure 3 shows, the actuation of different pouches, and the consequent tendon response, leads to different gripper configurations. A “1” represents an actuated pouch at the flexure hinge, while a “0” represents an unactuated one. The ability of these pneumatic pouches to control the flexural stiffness of the flexible fingers shows great potential for producing a variety of grasp modes. Various combinations were tested by actuating the pouches embedded in the two flexible fingers and identifying the different configurations achievable by the underactuated gripper.
Given the wide range of configurations that can be achieved by the gripper and the high conformability of the flexible fingers, as shown in Figure 3, we can demonstrate the gripper’s versatility in grasping different objects encountered in daily life—indeed, the gripper proves capable of handling objects of different shapes and sizes and of variable stiffness (rigid to soft). This suggests that the new gripper could be used in storage and other confined environments for picking and placing tasks.

3.2. Safe Object Grasping and Manipulation

Manipulating objects in nuclear environments brings with it two key challenges: objects are often unknown (i.e., we do not have access to analytical models or experience from previous learning), and these objects need to be handled safely (i.e., without breaking or dropping them). Haptic intelligence, the use of tactile sensing in exploratory procedures, is therefore vitally important for a robot hand employed in these settings. Grasp safety can be maximised through appropriate haptic procedures both before and during the manipulation of an object.
Before manipulation, the object can be explored haptically to identify an ideal grasp metric. To minimise the number of exploratory actions required, unscented Bayesian optimisation has proven very effective [20,21]. We implemented a full perception–action pipeline [22] in which we employed unscented Bayesian optimisation to identify grasps that maximise a force closure metric, in order to determine a configuration that has a high probability of being robust, before using it to pick up and transport an object. This approach works in applications in which time is not critical but safety is; indeed, haptic exploration does require a certain amount of time (ultimately depending on the complexity of the explored object) but, as our experiments demonstrate, it dramatically increases the chances of keeping the object stable within the grasp once it has been picked up.
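The loop below is a minimal, illustrative sketch of this idea, not the authors' implementation: a Gaussian process models the grasp-quality landscape, and candidate grasps are scored by averaging the posterior mean over sigma points that model pose noise—a simplification of the full unscented expected improvement of [20,21]. The toy metric `force_closure`, the 2D grasp parameterisation, and all constants are assumptions for the example.

```python
# Minimal sketch of noise-aware Bayesian optimisation of a grasp metric,
# in the spirit of the unscented Bayesian optimisation used in [20,21,22].
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def force_closure(x):
    # Placeholder grasp-quality metric over a 2D grasp parameterisation
    # (e.g., approach angle and wrist rotation); a stand-in for a real
    # evaluation obtained through haptic exploration.
    return float(np.exp(-np.sum((x - 0.6) ** 2) / 0.05))

def sigma_points(x, noise_std, k=1.0):
    # 2n+1 symmetric sigma points modelling execution noise on the grasp pose.
    n = len(x)
    pts = [x]
    for i in range(n):
        e = np.zeros(n); e[i] = k * noise_std
        pts += [x + e, x - e]
    return np.array(pts)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(5, 2))             # initial random grasps
y = np.array([force_closure(x) for x in X])    # their measured quality

gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-3)
for it in range(20):
    gp.fit(X, y)
    cand = rng.uniform(0, 1, size=(500, 2))    # random candidate grasps
    best, best_score = None, -np.inf
    for c in cand:
        # Score = GP mean averaged over sigma points, so grasps that stay
        # good under pose noise are preferred, plus an exploration bonus
        # from the predictive std at the nominal point.
        mu = gp.predict(sigma_points(c, noise_std=0.05)).mean()
        _, s = gp.predict(c.reshape(1, -1), return_std=True)
        score = mu + 0.3 * s[0]
        if score > best_score:
            best, best_score = c, score
    X = np.vstack([X, best]); y = np.append(y, force_closure(best))

print("best safe grasp:", X[np.argmax(y)], "quality:", y.max())
```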
During manipulation, real-time feedback from tactile/force sensors in the robot fingertips helps keep a stable grasp of the object. We demonstrated this in a set of experiments in which we learned in-hand manipulation actions from human demonstrations and executed them on novel unknown objects [23]. What we learn from demonstration is the intended motion of the object rather than the profile of forces applied to it. However, with the aid of a compliant controller that leverages real-time force feedback from the fingertip sensors, these actions can be executed successfully without drops or breakages.
In a different set of experiments, we used a parallel robotic gripper equipped with a recently developed tactile skin that can measure 3D contact forces at multiple contact points [24]. We initially recorded tactile data during several pick-and-place actions using different objects, grasped in different configurations. In these instances, the gripper applied a constant force on the object, and on some occasions we observed object slips. We then used these labelled data to train a classifier that would detect slip events and were able to report a high degree of classification accuracy [25].
Notably, we were able to show that the trained classifier could also be applied to a set of novel unknown objects, detecting slip events when handling objects that had not been included in the training process, albeit with a lower degree of accuracy. Interestingly, a different version of this tactile skin has been used to sensorise the fingertips and fingers of a dexterous robotic hand [26], suggesting that this approach could equally be applied to dexterous hands.
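The following sketch illustrates the shape of such a slip-detection pipeline. The sensor layout, window length, feature set, and the choice of a random forest are assumptions for illustration, not the published setup of [25], and the data here are random stand-ins for the recorded tactile streams.

```python
# Illustrative slip-detection pipeline: windows of 3-axis contact forces
# from a tactile skin are labelled slip/no-slip and fed to a classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

WINDOW = 32          # samples per window (assumed)
TAXELS = 16          # contact points on the skin, each giving (fx, fy, fz)

def features(window):
    # window: (WINDOW, TAXELS, 3) forces. Slip shows up mainly as rapid
    # changes in the shear components, so use per-taxel force derivatives.
    d = np.diff(window, axis=0)
    return np.concatenate([d.mean(axis=0).ravel(),
                           d.std(axis=0).ravel(),
                           np.abs(d).max(axis=0).ravel()])

# X_raw would hold windows recorded during pick-and-place trials, and y
# the slip annotations; random stand-ins are used here.
rng = np.random.default_rng(1)
X_raw = rng.normal(size=(400, WINDOW, TAXELS, 3))
y = rng.integers(0, 2, size=400)

X = np.array([features(w) for w in X_raw])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```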

3.3. A Software Framework for Robot Manipulation

We have recently introduced the Grasping Robot Integration & Prototyping (GRIP) framework [27], a novel open-source software platform that facilitates robot programming and deployment. GRIP is hardware-agnostic, and its main aim is to reduce the time and effort required to programme complex robotic manipulation tasks by enabling the integration of disparate components. Its graphical user interface (GUI), for example, guides the user through all the necessary steps, from hardware and software integration to the design and execution of complex tasks.
GRIP provides a systems framework in which robots can be operated with ROS-compatible software components from various sources, in addition to those that are custom-made. Both low-level and high-level components (kinematic libraries, controllers, motion planners, sensors, and learning methods) can be integrated in a relatively straightforward way (see Figure 4). The GRIP framework, we believe, moves us closer to the application of pre-existing software components within the nuclear setting.
To ensure the framework is as accessible as possible, we needed to keep the requisite programming to a minimum. In doing so we focused, in particular, on the following issues:
  • Robot interfacing—using components widely available online;
  • Software and sensor integration—regardless of implementation details;
  • Definition of variables to be used during robot execution;
  • Task design and execution using integrated components—via a visual drag and drop interface.
Videos of typical use cases implemented using this procedure are available online (https://sr-grip.readthedocs.io/en/latest/ (accessed on 28 September 2021)).
GRIP enables the rapid integration and deployment of a broad range of components onto a robot, imposing minimal limitations on the execution of a specific task. The various integration options available within the framework enable components of different origins to be linked together. By way of example, a specially designed controller for a robot end-effector could be made to work alongside MoveIt! [28] in the operation of the arm.
The integration of different components (such as sensors, actuators, and controllers) even in moderately complex robotic scenarios requires clear communication pathways and agile management of information flow. GRIP provides an interface that allows for the management and propagation of custom ROS messages. Per Figure 5, GRIP abstracts a given robot prototype as a set of configuration-dependent building blocks, which the user manipulates via an interactive GUI that effectively walks them through each stage of the workflow—from robot integration to task execution.
To further facilitate usability, especially when integration involves unfamiliar components (and therefore unknown syntax), we incorporated a real-time parser, which flags up any invalid input.
Having interfaced a robot in GRIP, we can turn our attention to behavioural design using a set of integrated components. As with other robot programming frameworks [29,30], this is carried out using state machines in a drag-and-drop programming environment, making the process more intuitive than text-based programming, with no specific prior knowledge required. As per Figure 6, task generation involves the following steps (a state-machine sketch follows the list):
  • Dragging and dropping states or state machines in the window;
  • Configuring each state;
  • Connecting the outcomes of the various elements to define the behaviour of the robot.
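To make this outcome-driven structure concrete, here is a minimal state machine of the kind the task editor generates conceptually: grasp, check for slip, re-grasp on slip, then place. GRIP's internal classes are not shown; the ROS SMACH library is used as a stand-in for illustration.

```python
# Minimal SMACH state machine mirroring the drag-and-drop task structure.
import smach

class Grasp(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['grasped', 'failed'])
    def execute(self, userdata):
        # Call the integrated gripper controller here.
        return 'grasped'

class CheckSlip(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['stable', 'slipping'])
    def execute(self, userdata):
        # Query the tactile slip classifier here.
        return 'stable'

class Place(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['placed'])
    def execute(self, userdata):
        # Call the motion planner (e.g., MoveIt!) here.
        return 'placed'

sm = smach.StateMachine(outcomes=['task_done', 'task_failed'])
with sm:
    smach.StateMachine.add('GRASP', Grasp(),
                           transitions={'grasped': 'CHECK_SLIP',
                                        'failed': 'task_failed'})
    smach.StateMachine.add('CHECK_SLIP', CheckSlip(),
                           transitions={'stable': 'PLACE',
                                        'slipping': 'GRASP'})  # re-grasp
    smach.StateMachine.add('PLACE', Place(),
                           transitions={'placed': 'task_done'})

outcome = sm.execute()
```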
This graphical programming approach simplifies the logic of the task, presenting it in a clear visual format. The ability to modify states within the task editor strips away much of the complexity of configuring challenging robotic tasks with a text editor. GRIP’s visual programming editor thus lowers the configuration overhead when setting up grasping and manipulation tasks. It has also been shown to be usable, and indeed beneficial, for both experienced and novice users, as revealed by a user study involving participants with different levels of expertise in robotics [27].
As a system integration platform for solving complex robotic tasks, GRIP has, in our opinion, considerable potential value for developing engineering systems for hazardous environments. A typical task in nuclear decommissioning is to pick and sort lightly irradiated elements into various containers—a task that demands the maintenance of a stable object pose throughout the procedure. We therefore used GRIP to integrate a varied set of software and hardware components (e.g., uSkin sensors [24]) into the robot arm and gripper, and then used the task editor to implement an autonomous pick-and-place task with added slip detection. A tactile sensor detects any slippage during the grasping phase; where slip is identified, the robot replaces the object onto the surface and re-grasps it, maintaining its pose.
Another typical decommissioning use case involves manoeuvring a soft robot into a building containing nuclear material, to detect and assess cracking in any vessels housing nuclear material. In this situation, GRIP takes care of the integration of the soft robot controller, with vision and force/tactile sensors to supply observational data that is entered into a crack detection algorithm. GRIP enables the rapid prototyping of different exploratory strategies, facilitating evaluation comparisons.

4. Sensors and Sensor Deployment

4.1. Using Visual-Tactile Sensing for Surface Inspection

Automatic inspection of pipework [31], vats, tanks, and other vessels for mechanical fractures could be expected to form an important part of an early warning system in relation to chemical/radioactive waste management in nuclear environments. Established industry-standard techniques for inspecting large structures include X-ray scanning [32], eddy-current techniques [33], those exploiting fluid motion dynamics [34], and vision-based crack detection [35]. Methods such as these bring a certain overhead in terms of needing specialist equipment (and its potentially high cost) along with in situ experienced personnel, militating against their use in nuclear environments. A further drawback in environments that have limited luminosity and are subject to strong radiation is that electronic instruments (e.g., cameras) can be rendered inoperable or prone to degradation and failure.
In light of this, we advocate a new technique that uses fibre-optic tactile sensors to detect surface cracks. It has been developed with a nuclear decommissioning use case in mind that involves remotely operated robots. Fibre optics bring certain advantages, such as being unaffected by gamma radiation [36,37,38]; they therefore present a potential route to replacing electrical cables in nuclear power plants [39,40].
In [41,42], we propose a software framework for an integrated force and proximity sensor in the shape of a finger, as detailed in [43]. The sensor comprises 3D-printed rigid (VeroClear Glossy) and soft (Nylon PA2200) components, allowing a certain amount of flexure when pressed up against objects in the environment.
As per Figure 7a, the sensor uses three fibre-optic light guides (D1, D2, and D3), each consisting of a pair of optical fibres and relying on light intensity modulation to determine the deformation of the flexible mid-section. A fourth pair of optical fibres (P) is responsible for proximity sensing, measuring the distance between the sensor’s tip and objects in its vicinity. A Keyence FS-N11MN light-to-voltage transducer is attached to each light guide to register the changes in light intensity. Further details regarding the functioning of the device can be found in [43,44].
In [41,42], we establish a process that combines tactile and optical proximity sensing as the basis for efficient automatic crack detection. We leverage the Keyence sensor coupled with learning algorithms for the detection of cracks and protuberances using deformation and proximity readings. When a particular crack is identified, the system automatically classifies it according to its width, this process running both on- and off-line. Data collection and testing of the proposed algorithm involved mounting the sensor onto the end-effector of a Touch desktop haptic interface (previously called Phantom Omni, latterly Geomagic), as depicted in Figure 7b. We had the Geomagic execute a periodic sliding movement tracing the tactile sensor over a sample surface. An Arduino Mega ADK, connected to the computer via USB port, was used to acquire the data at 400 Hz via 4 analogue pins. The data were later matched against the absolute tip position of the tactile and proximity sensor, as Figure 7d shows. Being fibre-optic in design, the sensing instrument is resilient to gamma radiation, and should therefore readily work in a nuclear environment [36]. Moreover, subject to certain constraints, the nylon parts of the sensor are also usable within certain radiation parameters, as explained in [45].
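A minimal acquisition loop of this kind might look as follows; the port name, baud rate, comma-separated message format, and the proximity threshold are all assumptions for the sketch, not the actual firmware protocol used in [41,42].

```python
# Illustrative acquisition loop for the fibre-optic sensor described above:
# an Arduino streams the four transducer channels (D1, D2, D3, P) over USB
# serial, and a simple threshold flags candidate cracks.
import serial  # pyserial

PORT, BAUD = '/dev/ttyACM0', 115200   # assumed port and baud rate
PROX_DROP = 0.15   # proximity step suggesting a surface gap (assumed)

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    prev_prox = None
    while True:
        line = ser.readline().decode(errors='ignore').strip()
        if not line:
            continue
        try:
            d1, d2, d3, prox = (float(v) for v in line.split(','))
        except ValueError:
            continue  # skip malformed frames
        if prev_prox is not None and abs(prox - prev_prox) > PROX_DROP:
            print('candidate crack: proximity step', prev_prox, '->', prox)
        prev_prox = prox
```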
The mean detection rate for cracks was ∼94%, while the mean correct classification rate for crack widths was ∼80%. A technique for online classification has also been developed, allowing for surface exploration in real time. The technique presented in [46] for multi-modal visuo-tactile surface inspection, which enables the detection and classification of fractures, can reasonably be applied to a remote inspection use case in a nuclear environment. As an example, a teleoperated robot manipulator equipped with this sensor would be able to scan for areas of interest. Visual imagery captured by such a robot could then be processed by the proposed algorithm to pinpoint those areas that have a high probability of containing mechanical fractures [41]. The robot could subsequently home in on an identified area, utilising its sensor-embedded manipulator to perform in situ surface exploration at close quarters and gather further information on potential damage.
The presented technique employs object detection to determine the location of surface fractures, which are then further inspected and verified via optical-fibre proximity sensing. The sensor data are captured during physical interaction between a customised robotic finger and the environment. A complete description of the model and datasets can be found in [46]. Two experiments were conducted to evaluate the efficacy of the multi-modal solution: one to assess the online detection rate and another to gauge the time taken to perform surface exploration and identify cracks. Figure 7c presents a data sample processed by the aforementioned algorithm. Following visuo-tactile fusion, the model achieved a detection rate of 92.85% across all cracks. Surface exploration by tactile sensing alone took an average of 199 s; this figure fell to 31 s when leveraging both modalities.

4.2. Surface Characterisation with Highly Sensitive Bio-Inspired Tactile Cilia

Finer characterisation of the texture of a surface can be obtained by employing a more sensitive tactile or force sensor. To this end, we developed a miniaturised sensor [47] inspired by the physical structure of biological cilia, which are found in many living organisms [48], e.g., the hair on our skin, the whiskers of a rat, and the trichomes of ciliated cells. Our sensor is composed of a flexible magnetic cilium whose magnetic field is measured by a Giant MagnetoResistive (GMR) sensor. When the cilium is deformed through external contact, the magnetic field changes, providing information on the nature of that contact; indeed, both the physical structure of the flexible cilium and the high sensitivity of the GMR sensor allow for the measurement of very small deformations, rendering this physical scanning method able to extract very detailed information about the texture of a surface. We applied this idea to two different scenarios. In one case, we scanned a thin metallic sheet into which we had intentionally worked some defects in the form of holes and cavities, and were able to precisely detect the position and size of those defects using a simple signal processing analysis of the sensor data [49]. In another set of experiments, we scanned two different types of fruit (apples and strawberries) that were labelled as either ripe or senescent, and trained classifiers based on the data collected with the sensor. We were able to report high classification accuracy for both apples and strawberries, using different versions of the sensor with either a single cilium or a matrix of cilia [50].
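As an illustration of the first scenario, the sketch below detrends a (synthetic) GMR scan signal and flags defects as high-prominence excursions. The sampling rate, scan speed, and prominence threshold are assumptions for the example; the actual analysis in [49] may differ.

```python
# Sketch of defect localisation from a cilium scan: remove drift from the
# GMR output, then detect anomalous excursions as prominent peaks.
import numpy as np
from scipy.signal import find_peaks, detrend

FS = 1000.0        # samples per second (assumed)
SPEED = 5.0        # scan speed in mm/s (assumed)

def locate_defects(gmr_signal):
    flat = detrend(gmr_signal)                 # remove slow drift
    mag = np.abs(flat)
    peaks, props = find_peaks(mag, prominence=3 * mag.std(), distance=50)
    positions_mm = peaks / FS * SPEED          # sample index -> scan position
    return positions_mm, props['prominences']

# Synthetic scan: linear drift + noise + two "holes" at 10 mm and 25 mm.
t = np.arange(0, 8, 1 / FS)
sig = 0.05 * t + 0.01 * np.random.default_rng(2).normal(size=t.size)
for pos in (10.0, 25.0):
    i = int(pos / SPEED * FS)
    sig[i - 20:i + 20] += 0.3
print(locate_defects(sig)[0])   # ~ [10. 25.]
```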

4.3. Highly Manoeuvrable Eversion Robot for Sensor Delivery

Soft robots are characterised by flexible and compliant bodies and an associated high number of degrees of freedom. They also tend to be lighter in weight and lower in cost than their rigid-body counterparts, the former of these two properties making them relatively easy to transport and deploy. Being compliant, soft robots can re-shape and mould themselves to the environment, such as when colliding with an obstacle. A variety of means of actuation exist to steer (soft) continuum manipulators towards a target or region of interest, such as tendon mechanisms [51,52,53], McKibben muscles [54], pouch motors [55,56], inflatable pleats [57], inflatable bladders [58,59,60], and pneumatic artificial muscles [61,62]. Despite these advances in actuation, most soft robots currently still face limitations in relation to elongation or extension of their main structure and are therefore unable to make traverses over long distances.
Eversion robots, a new class of soft robot, can, however, grow longitudinally, exploiting the principle of eversion. This, combined with their ability to squeeze through narrow openings, opens up the possibility of robots being able to access locations that would otherwise have remained inaccessible. These inflatable robots overcome the elongation issue, achieving extension ratios of 1 (initial length) to 100 (fully extended) [63], enabling access to remote environments [64].
An eversion robot resembles a sleeve of circular cross-section which is folded in on itself [65]. Under pressure the folds near the tip start to unravel, turning inside out and forcing longitudinal displacement or growth. In free space, growth is linear; when butting up against obstacles, the expanding eversion robot conforms to its surroundings. This property renders it well suited to negotiating an unstructured environment or confined space. The principal downside is the limited capacity for bending inherent to most designs of this type.
We have proposed a novel way to enhance bending actuation of the robot’s structure [66]. The augmented design features an eversion robot body that integrates a central chamber acting as the backbone with actuators that enable bending and help manoeuvre the manipulator. The proposed method results in greatly improved bending capacity (133% improvement in bending angle) in an eversion robot with externally attached actuators. The added manoeuvrability represents a step-change in the development of eversion robots for use in remote and hard-to-reach environments. These enhancements, as well as ongoing work to prime an eversion robot for sensor deployment, allow for delivery of sensor loads to cluttered scenes in nuclear environments inaccessible to humans.
Eversion robots can deploy sensors to remote locations in each of the following three ways:
  • (For unknown environments or destinations) Because the tip extends continuously, attaching a gripper or indeed any end-effector to it would enable relevant tasks to be carried out once the target destination is reached.
  • (For unknown environments or destinations) The eversion robot has a longitudinal hollow, so once it has extended and reached its target, the sensor can be passed from one end of the robot (base) to the other end of the robot (tip)—as shown in Figure 8.
  • (For known environments or destinations) Attaching the sensor to a predetermined position within the body of the robot. Provided we know the precise location of the target, we can place the sensor within the robot at the exact point that will unfold upon reaching that target. In this way the sensor can be deployed to the correct position.

5. Human–Machine Interfaces for Efficient Robot Teleoperation in Extreme Environments

Reliable, easy to learn, and easy to operate human–machine interfaces are critical for the efficient and safe teleoperation of robotic systems performing tasks in extreme environments. At QMUL we have proposed several novel interaction methods [6,67,68,69] that utilise virtual reality and haptic technologies (as outlined in the following subsections) to efficiently teleoperate robots located in remote and hazardous environments. The proposed human–machine interfaces can be employed within the integrated telerobotic system shown in Figure 1.

5.1. Virtual Reality-Based Teleoperation

We present an overview of a virtual reality (VR)-based robot teleoperation interface that can facilitate multiple exploration and manipulation tasks in extreme environments. In comparison to conventional robot teleoperation interfaces (2D displays, keyboards, and joysticks), interfaces using VR headsets and handheld wireless controllers provide a human operator with improved depth perception [70], more intuitive control, and better remote-environment exploratory capacity [71,72].
In our recent work, we compared a human operator’s ability to perceive and navigate a remote environment with the help of VR-based teleoperation interfaces employing different visualisation and remote-camera operation modes [6]. We considered the following video camera configurations for the remote environment: a single external static camera; a camera attached to and manoeuvred by a robotic manipulator (in-hand dynamic camera); a combination of in-hand dynamic and external static cameras; and an in-hand dynamic camera in conjunction with scene visualisation based on OctoMap occupancy mapping. These four remote-scene representation modes were compared in an experimental study with human participants. The experimental task was to explore the remote environment and to detect and identify objects placed randomly in the vicinity of the teleoperated robot manipulator. Performance on each task was assessed in terms of completion time, number of correctly identified objects, and the NASA Task Load Index.
Our study showed that the in-hand dynamic camera operated by a robot manipulator, combined with OctoMap visualisation, provided a human operator with a better understanding of the remote environment whilst requiring relatively little communication bandwidth. However, an in-hand camera hindered grasping tasks, as the RGB-D camera could not properly register objects that were very close to it (the minimal distance varies in the approximate range of 10–15 cm). To improve the performance of grasping tasks when RGB-D cameras are used, we have proposed several VR-based manipulation techniques that utilise remote scene segmentation and cloning for VR representation and a set of VR gestures to control remote grasping (see Figure 9) [6]. A video demonstration of the system is available online (https://youtu.be/3vZaEykMS_E (accessed on 28 September 2021)). Additionally, we have demonstrated that VR-based teleoperation interfaces can benefit from workspace scaling when rate-mode control is used, i.e., when the human operator’s joystick displacement is mapped onto the desired speed of the remote robot’s end-effector [73]. The commands for the teleoperated robotic manipulator were sent using the interoperability teleoperation protocol, allowing switching between position and rate control modes [74,75,76].
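A minimal sketch of such a rate-mode mapping with workspace scaling is given below; the dead zone, speed limit, and scaling values are illustrative assumptions rather than the parameters used in [73].

```python
# Rate-mode control sketch: joystick displacement maps to end-effector
# velocity, with a dead zone and a workspace-dependent gain.
import numpy as np

DEAD_ZONE = 0.05       # normalised joystick units ignored as noise (assumed)
V_MAX = 0.10           # max end-effector speed, m/s (assumed)

def rate_command(joy, workspace_scale=1.0):
    """joy: 3-vector in [-1, 1]^3; returns desired velocity in m/s."""
    joy = np.asarray(joy, dtype=float)
    mag = np.linalg.norm(joy)
    if mag < DEAD_ZONE:
        return np.zeros(3)
    # Rescale so speed ramps smoothly from zero at the dead-zone edge.
    scaled = (mag - DEAD_ZONE) / (1.0 - DEAD_ZONE)
    return joy / mag * min(scaled, 1.0) * V_MAX * workspace_scale

print(rate_command([0.5, 0.0, 0.0]))                        # coarse motion
print(rate_command([0.5, 0.0, 0.0], workspace_scale=0.2))   # fine motion
```

Scaling down the gain near the target (fine motion) while keeping a larger gain for free-space traverses (coarse motion) is what makes the workspace scaling useful in practice.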

5.2. Haptic Feedback for Robot Teleoperation

Providing tactile feedback to a human operator is another key factor in achieving safe and successful task completion in remote robotic manipulation. Traditional haptic interfaces can generate force or vibrotactile feedback to characterise the haptic properties of the remote environment; in this context, we have explored the use of simple vibration motors as a very affordable and portable solution [77,78,79,80,81]. However, very few of these solutions can simultaneously render force, texture, and shape. To address this limitation, several novel haptic devices based on particle jamming have been developed, in which air pressure control drives the jamming transition and inflates the touch surface, with vibrations generated via eccentric motors or linear resonant actuators. The prototype interface [67,82] and the proposed joystick interface [83] are shown in Figure 10. Our solution uses a vacuum pump to generate an area of low pressure within a rigid casing, forcing the soft cover into the device and causing the particles to jam. Two materials were considered for the soft haptic pad—Polyvinyl Chloride (PVC) and vinyl. During testing, the PVC sheet proved too stiff to effectively deform and jam the particles, though vinyl was moderately effective. The particle filling was also selected following initial experimentation, with plastic balls ultimately replaced by quinoa seeds.
We have tested how the vibrotactile waves propagate through the particle jamming interface at different air pressure levels. An accelerometer attached to the tactile surface of the prototype was used to measure the vibrations. Experimental results show that the amplitude of vibration drops from 50 m/s² to 20 m/s² over the range of pressures used in the experiment. Another interesting observation is that the shape of the vibration waveform has a noticeable double peak when the particle fluid is soft but rapidly becomes smoother as the fluid stiffens. Testing showed that under low vacuum pressure, and thus with the particles in their soft state, increasing the vibration motor’s power created a more pronounced periodic vibration and a slight increase in the measured amplitude; frequency remained fairly consistent under this condition. Under higher vacuum pressure, with the particles forming a rigid body, increasing motor power increased the amplitude of vibration by about 50%, after which point the vibration amplitude remained consistent [67].
A joystick-shaped implementation (Figure 10b) of the existing particle jamming interface uses two soft pouches to contain the particle fluid and vibrotactile actuator (mounted directly at the back of the soft pouch to keep it in place and allow vibrations to propagate outwards to the user’s hand). Stiffness and shape can be controlled by the existing control unit, enhanced by a second pressure regulator and driver for the vibrotactile actuator.

5.3. Teleoperation of Legged and Wheeled Mobile Robots

In addition to the teleoperation of robotic manipulators, it is often necessary to employ mobile robotic systems to safely deliver sensors and tools to remote environments with larger workspaces [84,85]. We have developed a novel lower-limb teleoperation interface that can be used for the teleoperation of legged and wheeled robotic systems [69,86,87]. Our interface uses input from the human operator’s legs to control a remote mobile robot, which is crucial for creating a genuinely immersive teleoperation experience.
The designed ankle interface is shown in Figure 11. The device consists of a single actuated foot platform that rotates around the ankle’s coronal axis. The platform is actively impedance-controlled around the horizontal state [68]. A seated human operator uses alternate left/right ankle plantar-/dorsi-flexion (foot tapping) as the input walking command for robot teleoperation. The platform’s periodic angular displacements are captured by a shaft encoder and, via a dedicated gait extraction algorithm, mapped to a continuous reference that dictates the remotely controlled robot’s gait. As a result, the remote robot can imitate the walking commands of its human operator. Significantly, since the ankle platform is actuated, it can render haptic feedback that can be programmed to reflect the properties of a remote robot’s terrain.
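The sketch below illustrates the gait-extraction idea under stated assumptions: the sampling rate, the gain, and the zero-crossing frequency estimate are ours for illustration, not the published algorithm. The oscillating platform angle is converted into a continuous speed reference, with larger and faster taps commanding a faster gait.

```python
# Sketch of mapping periodic foot-tap angles to a robot speed reference.
import numpy as np

FS = 100.0           # encoder sampling rate, Hz (assumed)
GAIN = 0.5           # robot speed per unit (frequency x amplitude) (assumed)

def walking_speed_reference(theta):
    """theta: recent window of platform angles (rad); returns speed command."""
    theta = np.asarray(theta, dtype=float)
    theta = theta - theta.mean()               # remove posture offset
    # Tapping frequency from zero crossings of the oscillating angle.
    sign = np.signbit(theta)
    crossings = np.where(sign[:-1] != sign[1:])[0]
    if len(crossings) < 2:
        return 0.0                             # feet still: robot stops
    period = 2 * np.mean(np.diff(crossings)) / FS   # two crossings per cycle
    freq = 1.0 / period                        # taps per second
    amplitude = theta.std()
    return GAIN * freq * amplitude             # faster/larger taps -> faster gait

t = np.arange(0, 2, 1 / FS)
taps = 0.2 * np.sin(2 * np.pi * 1.5 * t)       # 1.5 Hz tapping, 0.2 rad
print(walking_speed_reference(taps))
```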
We have evaluated the designed ankle-based teleoperation interface with a group of human participants, demonstrating that the ankle gestures required by the interface can be learned [69] and can be used efficiently to control the speed of a remotely teleoperated legged robot [87]. It was also demonstrated that the platform can render different types of haptic terrain feedback [69]. A more recent version of the interface is presented in [88]. The proposed ankle-based input interfaces can be used to teleoperate wheeled mobile robots as well.

6. Conclusions and Future Challenges

This paper has provided an overview of recent advances in robot technologies developed at the Centre for Advanced Robotics @ Queen Mary (ARQ) in relation to dealing with nuclear waste. Recognising the need for remedial solutions for the decommissioning of radioactive waste, ARQ set out to research and evaluate the role of soft robotics in this area of the nuclear industry, with a focus on robot design, sensors and their delivery, grasping and related manipulation tasks, and human–machine interfaces for teleoperated systems. Advances in these key areas show that reasonable approaches can be found to solve specific problems. With this in mind, initial steps have been taken to create a software framework that allows us to integrate these varied components in a straightforward, user-friendly way. We see this framework as a potential catalyst for bringing new technological solutions to what has historically been a somewhat conservative industry.
Despite the many and varied advances in the field, significant challenges remain. On the upside, soft robots can penetrate relatively inaccessible areas of the nuclear world and, by keeping their electronics outside the radioactive environment, can operate close to sources of nuclear radiation. However, these types of robots bring their own challenges, and further progress in their control and navigation, possibly using data-driven, learning-based methods, is much needed. The proposed sensor solutions for inspecting surfaces in the rough environment of nuclear plants make use of optical fibres, as they are somewhat more immune to radiation and have a longer lifespan than their electronic counterparts. However, the accurate and reliable measurement of surface properties over long periods needs to be further improved. Grasping and manipulation techniques have advanced considerably, and the robust handling of a range of objects has been demonstrated in realistic environments. The next step, i.e., conducting intelligent manipulation in actual nuclear environments, needs to be taken to ultimately demonstrate the feasibility of the proposed approach. Retaining a human element in the loop of any such operation remains crucial, as we currently do not have machines working at high enough automation levels to independently carry out complex tasks in complex environments. We therefore remain reliant on human–machine interfaces, and to this end ARQ has made notable progress in its work on haptic and virtual/augmented reality interfaces. However, creating a completely immersive ‘feel’ and a truly intuitive interactive system for the user remains a challenge.

Author Contributions

Conceptualisation, I.V., I.F., B.D., F.P., A.O., J.B., B.O., T.A., M.H., C.O., S.P., C.L., H.G., K.Z., L.J. and K.A.; writing (draft preparation), I.V., I.F., B.D., F.P., A.O., J.B., B.O., C.L., K.Z., L.J. and K.A.; writing (review and editing), I.V., I.F., L.J., S.P. and K.A. All authors have read and agreed to the published version of the manuscript.

Funding

The research in this work was funded under the Robotics and AI for Extreme Environments programme’s NCNR grant: National Centre for Nuclear Robotics (EP/R02572X/1).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analysed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Yang, G.Z.; Bellingham, J.; Dupont, P.E.; Fischer, P.; Floridi, L.; Full, R.; Jacobstein, N.; Kumar, V.; McNutt, M.; Merrifield, R.; et al. The grand challenges of Science Robotics. Sci. Robot. 2018, 3, eaar7650. [Google Scholar] [CrossRef] [PubMed]
  2. Fisher, M.; Cardoso, R.C.; Collins, E.C.; Dadswell, C.; Dennis, L.A.; Dixon, C.; Farrell, M.; Ferrando, A.; Huang, X.; Jump, M.; et al. An Overview of Verification and Validation Challenges for Inspection Robots. Robotics 2021, 10, 67. [Google Scholar] [CrossRef]
  3. Lovering, D. Radioactive Robot: The Machines That Cleaned up Three Mile Island; Scientific American: New York, NY, USA, 2009. [Google Scholar]
  4. Oshiro, T.; Palmer, C.; Hollinger, G.; Menguc, Y.; Palmer, T.; Courier, T.; Yirmibesoglu, O.D.; Morrell, S.; Rynes, A. Soft Robotics in Radiation Environments for Safeguard Applications. Available online: http://research.engr.oregonstate.edu/rdml/sites/research.engr.oregonstate.edu.rdml/files/soft_robotics_inmm_annual_2017.pdf (accessed on 28 September 2021).
  5. Smith, R.; Cucco, E.; Fairbairn, C. Robotic Development for the Nuclear Environment: Challenges and Strategy. Robotics 2020, 9, 94. [Google Scholar] [CrossRef]
  6. Omarali, B.; Denoun, B.; Althoefer, K.; Jamone, L.; Valle, M.; Farkhatdinov, I. Virtual Reality based Telerobotics Framework with Depth Cameras. In Proceedings of the IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 31 August–4 September 2020; pp. 1217–1222. [Google Scholar]
  7. Marturi, N.; Rastegarpanah, A.; Rajasekaran, V.; Ortenzi, V.; Bekiroglu, Y.; Kuo, J.; Stolkin, R. Towards advanced robotic manipulations for nuclear decommissioning. In Robots Operating in Hazardous Environments; InTech: London, UK, 2017. [Google Scholar]
  8. Voinov, I.; Nosikov, M. Automatic and Manual Control Algorithms of Radiation-Proof Manipulators. In Proceedings of the IEEE 2018 Global Smart Industry Conference (GloSIC), Chelyabinsk, Russia, 13–15 November 2018; pp. 1–6. [Google Scholar]
  9. Prikhodko, V.; Sobolev, A.; Zhukov, A.; Chavkin, E.; Fomin, A.; Levshchanov, V.; Pavlov, S.; Svetukhin, V. Radiation-resistant robotic manipulator controlled by 6-DoF haptic control device to perform technological tasks in hot cells. In Journal of Physics: Conference Series; IOP Publishing: Bristol, UK, 2019; Volume 1353, p. 012045. [Google Scholar]
  10. Lee, K.; Wang, Y.; Zheng, C. Twister hand: Underactuated robotic gripper inspired by origami twisted tower. IEEE Trans. Robot. 2020, 36, 488–500. [Google Scholar] [CrossRef]
  11. Chen, T.; Haas-Heger, M.; Ciocarlie, M. Underactuated hand design using mechanically realizable manifolds. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 7392–7398. [Google Scholar]
  12. Phillips, B.T.; Becker, K.P.; Kurumaya, S.; Galloway, K.C.; Whittredge, G.; Vogt, D.M.; Teeple, C.B.; Rosen, M.H.; Pieribone, V.A.; Gruber, D.F.; et al. A dexterous, glove-based teleoperable low-power soft robotic arm for delicate deep-sea biological exploration. Sci. Rep. 2018, 8, 14779. [Google Scholar] [CrossRef] [Green Version]
  13. Yirmibeşoğlu, O.D.; Oshiro, T.; Olson, G.; Palmer, C.; Mengüç, Y. Evaluation of 3D printed soft robots in radiation environments and comparison with molded counterparts. Front. Robot. AI 2019, 6, 40. [Google Scholar] [CrossRef] [Green Version]
  14. Yang, Y.; Zhang, Y.; Kan, Z.; Zeng, J.; Wang, M.Y. Hybrid jamming for bioinspired soft robotic fingers. Soft Robot. 2020, 7, 292–308. [Google Scholar] [CrossRef]
  15. Shintake, J.; Cacucciolo, V.; Floreano, D.; Shea, H. Soft robotic grippers. Adv. Mater. 2018, 30, 1707035. [Google Scholar] [CrossRef] [Green Version]
  16. Li, H.; Yao, J.; Wei, C.; Zhou, P.; Xu, Y.; Zhao, Y. An untethered soft robotic gripper with high payload-to-weight ratio. Mech. Mach. Theory 2021, 158, 104226. [Google Scholar] [CrossRef]
  17. Sui, D.; Zhu, Y.; Zhao, S.; Wang, T.; Agrawal, S.K.; Zhang, H.; Zhao, J. A Bioinspired Soft Swallowing Gripper for Universal Adaptable Grasping. Soft Robot. 2020. [Google Scholar] [CrossRef]
  18. Licht, S.; Collins, E.; Badlissi, G.; Rizzo, D. A partially filled jamming gripper for underwater recovery of objects resting on soft surfaces. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 6461–6468. [Google Scholar]
  19. Godaba, H.; Sajad, A.; Patel, N.; Althoefer, K.; Zhang, K. A Two-Fingered Robot Gripper with Variable Stiffness Flexure Hinges Based on Shape Morphing. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 25–29 October 2020; pp. 8716–8721. [Google Scholar]
  20. Nogueira, J.; Martinez-Cantin, R.; Bernardino, A.; Jamone, L. Unscented Bayesian optimization for safe robot grasping. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 1967–1972. [Google Scholar]
  21. Castanheira, J.; Vicente, P.; Martinez-Cantin, R.; Jamone, L.; Bernardino, A. Finding safe 3D robot grasps through efficient haptic exploration with unscented Bayesian optimization and collision penalty. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1643–1648. [Google Scholar]
  22. Siddiqui, M.S.; Coppola, C.; Solak, G.; Jamone, L. Grasp Stability Prediction for a Dexterous Robotic Hand combining Depth Vision and Haptic Bayesian Exploration. Front. Robot. AI 2021. in print. [Google Scholar] [CrossRef] [PubMed]
  23. Solak, G.; Jamone, L. Learning by Demonstration and Robust Control of Dexterous In-Hand Robotic Manipulation Skills. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 4–8 November 2019; pp. 8246–8251. [Google Scholar]
  24. Tomo, T.P.; Regoli, M.; Schmitz, A.; Natale, L.; Kristanto, H.; Somlor, S.; Jamone, L.; Metta, G.; Sugano, S. A New Silicone Structure for uSkin—A Soft, Distributed, Digital 3-Axis Skin Sensor and Its Integration on the Humanoid Robot iCub. IEEE Robot. Autom. Lett. 2018, 3, 2584–2591. [Google Scholar] [CrossRef]
  25. Zenha, R.; Denoun, B.; Coppola, C.; Jamone, L. Tactile Slip Detection in the Wild Leveraging Distributed Sensing of both Normal and Shear Forces. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021. [Google Scholar]
  26. Tomo, T.P.; Schmitz, A.; Wong, W.K.; Kristanto, H.; Somlor, S.; Hwang, J.; Jamone, L.; Sugano, S. Covering a Robot Fingertip With uSkin: A Soft Electronic Skin With Distributed 3-Axis Force Sensitive Elements for Robot Hands. IEEE Robot. Autom. Lett. 2018, 3, 124–131. [Google Scholar] [CrossRef]
  27. Denoun, B.; Leon, B.; Hansard, M.; Jamone, L. Grasping Robot Integration and Prototyping: The GRIP Software Framework. IEEE Robot. Autom. Mag. 2021, 28, 101–111. [Google Scholar] [CrossRef]
  28. Chitta, S.; Sucan, I.; Cousins, S. Moveit! IEEE Robot. Autom. Mag. 2012, 19, 18–19. [Google Scholar] [CrossRef]
  29. Schillinger, P.; Kohlbrecher, S.; von Stryk, O. Human-robot collaborative high-level control with application to rescue robotics. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 2796–2802. [Google Scholar]
  30. Brunner, S.G.; Steinmetz, F.; Belder, R.; Dömel, A. RAFCON: A graphical tool for engineering complex, robotic tasks. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 3283–3290. [Google Scholar]
  31. Chablat, D.; Venkateswaran, S.; Boyer, F. Dynamic model of a bio-inspired robot for piping inspection. In ROMANSY 22–Robot Design, Dynamics and Control; Springer: Cham, Switzerland, 2019; pp. 42–51. [Google Scholar]
  32. Barhli, S.; Saucedo-Mora, L.; Jordan, M.; Cinar, A.; Reinhard, C.; Mostafavi, M.; Marrow, T. Synchrotron X-ray characterization of crack strain fields in polygranular graphite. Carbon 2017, 124, 357–371. [Google Scholar] [CrossRef] [Green Version]
  33. Yao, Y.; Tung, S.T.E.; Glisic, B. Crack detection and characterization techniques—An overview. Struct. Control Health Monit. 2014, 21, 1387–1413. [Google Scholar] [CrossRef]
  34. Nicoletti, R.; Cavalini, A.A.; Steffen, V. Detection of Cracks in Rotating Shafts by Using the Combination Resonances Approach and the Approximated Entropy Algorithm. Shock Vib. 2018, 2018, 4094631. [Google Scholar] [CrossRef] [Green Version]
  35. Mohan, A.; Poobal, S. Crack detection using image processing: A critical review and analysis. Alex. Eng. J. 2018, 57, 787–798. [Google Scholar] [CrossRef]
  36. Berghmans, F.; Fernandez, A.F.; Brichard, B.; Vos, F.; Decreton, M.C.; Gusarov, A.I.; Deparis, O.; Megret, P.; Blondel, M.; Caron, S.; et al. Radiation hardness of fiber optic sensors for monitoring and remote handling applications in nuclear environments. In Process Monitoring with Optical Fibers and Harsh Environment Sensors; International Society for Optics and Photonics, Photonics East: Boston, MA, USA, 1999; Volume 3538, pp. 28–39. [Google Scholar]
  37. Phéron, X.; Girard, S.; Boukenter, A.; Brichard, B.; Delepine-Lesoille, S.; Bertrand, J.; Ouerdane, Y. High γ-ray dose radiation effects on the performances of Brillouin scattering based optical fiber sensors. Opt. Express 2012, 20, 26978–26985. [Google Scholar] [CrossRef]
  38. Inaudi, D.; Glisic, B.; Fakra, S.; Billan, J.; Redaelli, S.; Perez, J.G.; Scandale, W. Development of a displacement sensor for the CERN-LHC superconducting cryodipoles. Meas. Sci. Technol. 2001, 12, 887. [Google Scholar] [CrossRef] [Green Version]
  39. Hashemian, H. The state of the art in nuclear power plant instrumentation and control. Int. J. Nucl. Energy Sci. Technol. 2009, 4, 330–354. [Google Scholar] [CrossRef]
  40. Berthold III, J.W. Overview of prototype fiber optic sensors for future application in nuclear environments. In Optical Fibre Sensing and Systems in Nuclear Environments; International Society for Optics and Photonics, Society of Photo-Optical Instrumentation Engineers: Mol, Belgium, 1994; Volume 2425, pp. 74–83. [Google Scholar]
  41. Palermo, F.; Konstantinova, J.; Althoefer, K.; Poslad, S.; Farkhatdinov, I. Automatic fracture characterization using tactile and proximity optical sensing. Front. Robot. AI 2020, 7, 174. [Google Scholar] [CrossRef]
42. Palermo, F.; Konstantinova, J.; Althoefer, K.; Poslad, S.; Farkhatdinov, I. Implementing tactile and proximity sensing for crack detection. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Online Virtual Conference, 31 May–15 June 2020; pp. 632–637. [Google Scholar]
  43. Konstantinova, J.; Cotugno, G.; Stilli, A.; Noh, Y.; Althoefer, K. Object classification using hybrid fiber optical force/proximity sensor. In Proceedings of the 2017 IEEE Sensors, Glasgow, UK, 29 October–1 November 2017; pp. 1–3. [Google Scholar]
  44. Konstantinova, J.; Stilli, A.; Althoefer, K. Fingertip fiber optical tactile array with two-level spring structure. Sensors 2017, 17, 2337. [Google Scholar] [CrossRef] [Green Version]
  45. Morita, Y.; Seguchi, T. Radiation resistance of nylon. Denki Gakkai Zetsuen Zairyo Kenkyukai Shiryo 1983, EIM-83, 47–52. [Google Scholar]
  46. Palermo, F.; Rincon-Ardila, L.; Oh, C.; Althoefer, K.; Poslad, S.; Venture, G.; Farkhatdinov, I. Multi-modal robotic visual-tactile localisation and detection of surface cracks. In Proceedings of the 2021 IEEE International Conference on Automation Science and Engineering, Lyon, France, 23–27 August 2021. [Google Scholar]
  47. Ribeiro, P.; Khan, M.A.; Alfadhel, A.; Kosel, J.; Franco, F.; Cardoso, S.; Bernardino, A.; Schmitz, A.; Santos-Victor, J.; Jamone, L. Bioinspired Ciliary Force Sensor for Robotic Platforms. IEEE Robot. Autom. Lett. 2017, 2, 971–976. [Google Scholar] [CrossRef]
  48. Seale, M.; Cummins, C.; Viola, I.; Mastropaolo, E.; Nakayama, N. Design principles of hair-like structures as biological machines. J. R. Soc. Interface 2018, 15, 20180206. [Google Scholar] [CrossRef] [Green Version]
  49. Ribeiro, P.; Cardoso, S.; Bernardino, A.; Jamone, L. Highly sensitive bio-inspired sensor for fine surface exploration and characterization. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Online Virtual Conference, 31 May–15 June 2020. [Google Scholar]
  50. Ribeiro, P.; Cardoso, S.; Bernardino, A.; Jamone, L. Fruit quality control by surface analysis using a bio-inspired soft tactile sensor. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 25–29 October 2020. [Google Scholar]
  51. Althoefer, K. Antagonistic actuation and stiffness control in soft inflatable robots. Nat. Rev. Mater. 2018, 3, 76–77. [Google Scholar] [CrossRef]
  52. Shiva, A.; Stilli, A.; Noh, Y.; Faragasso, A.; De Falco, I.; Gerboni, G.; Cianchetti, M.; Menciassi, A.; Althoefer, K.; Wurdemann, H.A. Tendon-based stiffening for a pneumatically actuated soft manipulator. IEEE Robot. Autom. Lett. 2016, 1, 632–637. [Google Scholar] [CrossRef] [Green Version]
  53. Zhang, Z.; Chen, G.; Wu, H.; Kong, L.; Wang, H. A pneumatic/cable-driven hybrid linear actuator with combined structure of origami chambers and deployable mechanism. IEEE Robot. Autom. Lett. 2020, 5, 3564–3571. [Google Scholar] [CrossRef]
  54. Al-Fahaam, H.; Davis, S.; Nefti-Meziani, S. The design and mathematical modelling of novel extensor bending pneumatic artificial muscles (EBPAMs) for soft exoskeletons. Robot. Auton. Syst. 2018, 99, 63–74. [Google Scholar] [CrossRef]
  55. Chang, S.Y.; Takashima, K.; Nishikawa, S.; Niiyama, R.; Someya, T.; Onodera, H.; Kuniyoshi, Y. Design of small-size pouch motors for rat gait rehabilitation device. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 4578–4581. [Google Scholar]
  56. Niiyama, R.; Sun, X.; Sung, C.; An, B.; Rus, D.; Kim, S. Pouch motors: Printable soft actuators integrated with computational design. Soft Robot. 2015, 2, 59–70. [Google Scholar] [CrossRef]
  57. Voisembert, S.; Mechbal, N.; Riwan, A.; Aoussat, A. Design of a novel long-range inflatable robotic arm: Manufacturing and numerical evaluation of the joints and actuation. J. Mech. Robot. 2013, 5, 045001. [Google Scholar] [CrossRef] [Green Version]
  58. Realmuto, J.; Sanger, T. A robotic forearm orthosis using soft fabric-based helical actuators. In Proceedings of the 2019 2nd IEEE International Conference on Soft Robotics (RoboSoft), Seoul, Korea, 14–18 April 2019; pp. 591–596. [Google Scholar]
  59. Gillespie, M.T.; Best, C.M.; Killpack, M.D. Simultaneous position and stiffness control for an inflatable soft robot. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1095–1101. [Google Scholar]
  60. Nguyen, P.H.; Mohd, I.I.; Sparks, C.; Arellano, F.L.; Zhang, W.; Polygerinos, P. Fabric soft poly-limbs for physical assistance of daily living tasks. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 8429–8435. [Google Scholar]
  61. Hawkes, E.W.; Christensen, D.L.; Okamura, A.M. Design and implementation of a 300% strain soft artificial muscle. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 4022–4029. [Google Scholar]
  62. Usevitch, N.S.; Okamura, A.M.; Hawkes, E.W. APAM: Antagonistic pneumatic artificial muscle. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 1539–1546. [Google Scholar]
  63. Hawkes, E.W.; Blumenschein, L.H.; Greer, J.D.; Okamura, A.M. A soft robot that navigates its environment through growth. Sci. Robot. 2017. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  64. Greer, J.D.; Blumenschein, L.H.; Alterovitz, R.; Hawkes, E.W.; Okamura, A.M. Robust navigation of a soft growing robot by exploiting contact with the environment. Int. J. Robot. Res. 2020, 39, 1724–1738. [Google Scholar] [CrossRef] [Green Version]
  65. Putzu, F.; Abrar, T.; Althoefer, K. Plant-inspired soft pneumatic eversion robot. In Proceedings of the 2018 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (Biorob), Enschede, The Netherlands, 26–29 August 2018; pp. 1327–1332. [Google Scholar]
  66. Abrar, T.; Putzu, F.; Rizqi, A.; Godaba, H.; Althoefer, K. Highly Manoeuvrable Eversion Robot Based on Fusion of Function with Structure. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021. [Google Scholar]
67. Brown, J.P.; Farkhatdinov, I. Soft Haptic Interface based on Vibration and Particle Jamming. In Proceedings of the 2020 IEEE Haptics Symposium (HAPTICS), Crystal City, VA, USA, 28–31 March 2020; pp. 1–6. [Google Scholar] [CrossRef]
  68. Otaran, A.; Farkhatdinov, I. Modeling and Control of Ankle Actuation Platform for Human-Robot Interaction. In Towards Autonomous Robotic Systems; Springer: Berlin/Heidelberg, Germany, 2019; pp. 338–348. [Google Scholar]
  69. Otaran, A.; Farkhatdinov, I. Haptic Ankle Platform for Interactive Walking in Virtual Reality. IEEE Trans. Vis. Comput. Graph. 2021. [Google Scholar] [CrossRef] [PubMed]
  70. Fung, W.K.; Lo, W.T.; Liu, Y.H.; Xi, N. A case study of 3D stereoscopic vs. 2D monoscopic tele-reality in real-time dexterous teleoperation. In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, 2–6 August 2005; pp. 181–186. [Google Scholar] [CrossRef]
  71. Rakita, D. Methods for Effective Mimicry-based Teleoperation of Robot Arms. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017; pp. 371–372. [Google Scholar] [CrossRef]
  72. Whitney, D.; Rosen, E.; Ullman, D.; Phillips, E.; Tellex, S. ROS Reality: A Virtual Reality Framework Using Consumer-Grade Hardware for ROS-Enabled Robots. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1–9. [Google Scholar] [CrossRef]
  73. Omarali, B.; Althoefer, K.; Fulvio, M.; Valle, M.; Farkhatdinov, I. Workspace Scaling and Rate Mode Control for Virtual Reality based Robot Teleoperation. In Proceedings of the 2021 IEEE International Conference on Systems, Man, and Cybernetics, Melbourne, Australia, 17–20 October 2021. [Google Scholar]
74. Farkhatdinov, I.; Ryu, J.H. Switching of control signals in teleoperation systems: Formalization and application. In Proceedings of the 2008 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Xi'an, China, 2–5 July 2008; pp. 353–358. [Google Scholar]
  75. King, H.H.; Hannaford, B.; Kwok, K.W.; Yang, G.Z.; Griffiths, P.; Okamura, A.; Farkhatdinov, I.; Ryu, J.H.; Sankaranarayanan, G.; Arikatla, V.; et al. Plugfest 2009: Global interoperability in telerobotics and telemedicine. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–8 May 2010; pp. 1733–1738. [Google Scholar]
  76. Omarali, B.; Palermo, F.; Valle, M.; Poslad, S.; Althoefer, K.; Farkhatdinov, I. Position and velocity control for telemanipulation with interoperability protocol. In Annual Conference Towards Autonomous Robotic Systems; Springer: Berlin/Heidelberg, Germany, 2019; pp. 316–324. [Google Scholar]
  77. Junput, B.; Wei, X.; Jamone, L. Feel It on Your Fingers: Dataglove with Vibrotactile Feedback for Virtual Reality and Telerobotics. In Towards Autonomous Robotic Systems (TAROS); Springer: Basingstoke, UK, 2019. [Google Scholar]
  78. Junput, B.; Farkhatdinov, I.; Jamone, L. Touch it, rub it, feel it! Haptic rendering of physical textures with a low cost wearable system. In Towards Autonomous Robotic Systems (TAROS); Springer: Basingstoke, UK, 2020. [Google Scholar]
  79. Duvernoy, B.; Farkhatdinov, I.; Topp, S.; Hayward, V. Electromagnetic actuator for tactile communication. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications; Springer: Berlin/Heidelberg, Germany, 2018; pp. 14–24. [Google Scholar]
  80. Ogrinc, M.; Farkhatdinov, I.; Walker, R.; Burdet, E. Sensory integration of apparent motion speed and vibration magnitude. IEEE Trans. Haptics 2017, 11, 455–463. [Google Scholar] [CrossRef] [Green Version]
  81. Ogrinc, M.; Farkhatdinov, I.; Walker, R.; Burdet, E. Horseback riding therapy for a deafblind individual enabled by a haptic interface. Assist. Technol. 2018, 30, 143–150. [Google Scholar] [CrossRef] [PubMed]
  82. Brown, J.; Farkhatdinov, I. Shape-Changing Touch Pad based on Particle Jamming and Vibration. In Proceedings of the 2021 IEEE World Haptics Conference (WHC), Online Virtual Conference, 6–9 July 2021; p. 337. [Google Scholar]
  83. Brown, J.; Farkhatdinov, I. A Soft, Vibrotactile, Shape-Changing Joystick for Telerobotics. In Proceedings of the 2021 IEEE World Haptics Conference (WHC), Online Virtual Conference, 6–9 July 2021; p. 1158. [Google Scholar]
  84. Farkhatdinov, I.; Ryu, J.H.; Poduraev, J. Control strategies and feedback information in mobile robot teleoperation. IFAC Proc. Vol. 2008, 41, 14681–14686. [Google Scholar] [CrossRef] [Green Version]
  85. Farkhatdinov, I.; Ryu, J.H.; Poduraev, J. A user study of command strategies for mobile robot teleoperation. Intell. Serv. Robot. 2009, 2, 95–104. [Google Scholar] [CrossRef]
  86. Otaran, A.; Farkhatdinov, I. A Short Description of an Ankle-Actuated Seated VR Locomotion Interface. In Proceedings of the 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Online Virtual Conference, 27 March–2 April 2021; pp. 64–66. [Google Scholar]
87. Otaran, A.; Farkhatdinov, I. Walking-in-Place Foot Interface for Locomotion Control and Telepresence of Humanoid Robots. In Proceedings of the 2021 IEEE-RAS International Conference on Humanoid Robots, Munich, Germany, 20–21 July 2021. [Google Scholar]
  88. Otaran, A.; Farkhatdinov, I. A Cable-Driven Walking Interface with Haptic Feedback for Seated VR. In Proceedings of the 2021 IEEE World Haptics Conference (WHC), Online Virtual Conference, 6–9 July 2021; p. 592. [Google Scholar]
Figure 1. Overview of our research contributions.
Figure 2. Remotely operated low-cost gripper with variable stiffness flexure hinges, actuated by pneumatic and tendon-driven systems.
Figure 3. (a) Variation of flexural stiffness as a function of the flap angle. (b) Actuating different pouches leads to different gripper configurations upon tendon actuation (Reprinted with permission from ref. [19]. Copyright 2020 IEEE).
Figure 4. Overview of the GRIP framework. The architecture allows users to operate robots with integrated components (hardware and/or software) for grasping and manipulation tasks. The task editor provides an intuitive interface for designing and tuning the robot’s behaviour. Arrows indicate the different (mutually compatible) ways to interface external components (Reprinted with permission from ref. [27]. Copyright 2020 IEEE).
Figure 5. Diagram showing the different ways of interfacing a robot with GRIP. Colours indicate the integration modalities. Robots can be configured through MoveIt! (blue) or using an existing launch file that gathers all components to be run (orange). External low-level components can also be integrated into our framework by wrapping them into ROS actions or services (red). Black arrows indicate consistent operations across the integration modalities (Reprinted with permission from ref. [27]. Copyright 2020 IEEE).
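To make the "wrap external components into ROS actions or services" pathway of Figure 5 concrete, the sketch below shows one common way of exposing an external low-level component as a ROS service that a framework such as GRIP could then call. This is a minimal illustration under assumptions: the node name, the close_gripper service, and the handler are hypothetical and are not part of GRIP's actual API.

```python
#!/usr/bin/env python
# Illustrative only (not GRIP's API): wrap an external low-level component
# (here, a hypothetical gripper driver) as a ROS service, following the
# service-wrapping integration modality described in Figure 5.
import rospy
from std_srvs.srv import Trigger, TriggerResponse

def handle_close_gripper(_request):
    # A real wrapper would call the vendor's low-level gripper driver here.
    rospy.loginfo("close_gripper called; commanding external driver")
    return TriggerResponse(success=True, message="gripper closed")

if __name__ == "__main__":
    rospy.init_node("external_gripper_wrapper")
    rospy.Service("close_gripper", Trigger, handle_close_gripper)
    rospy.spin()
```

Longer-running behaviours that need feedback or preemption would typically follow the same pattern with ROS actions (actionlib) rather than services.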
Figure 6. The appearance of the task editor when designing a bimanual operation. The user can navigate both between and within hierarchies, via the sub-windows that are created when new containers are added. Different levels of zoom will show or hide the configuration data of each state, to ensure an appropriate visualisation (Reprinted with permission from ref. [27]. Copyright 2020 IEEE).
Figure 7. Hybrid fibre optic force/proximity fingertip sensor: (a) close-up visualisation of the fibre optic operating principles; D1, D2, and D3 indicate the three deformation optical fibres, and P the proximity optical fibre. (b) Hybrid fibre optic force/proximity sensor. (c) Examples of online testing with multi-modal vision and tactile features for crack exploration: on the left, the frames captured by the webcam; in the centre, the object detection results on the previously acquired frames; on the right, the tactile results. (d) Raw measurements of multiple runs from the four sensing elements of the sensor (deformations D1, D2, D3, and proximity P) for the "crack" surface. (Adapted from ref. [41]).
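For intuition about how the four raw signals in panel (d) might be combined, the following deliberately naive sketch flags a crack when the proximity reading dips while the deformation fibres still register contact. The thresholding logic and the parameter k are invented for illustration; the published approach in refs. [41,42] is more sophisticated than this heuristic.

```python
# Naive illustrative heuristic (thresholds assumed, not from refs. [41,42]):
# flag a crack when the proximity signal P dips while the deformation
# fibres D1-D3 still indicate contact with the surface.
import numpy as np

def detect_crack(d1, d2, d3, p, k=1.5):
    """d1..d3, p: 1-D arrays of raw fibre readings from one sliding run."""
    deformation = np.mean([d1, d2, d3], axis=0)
    in_contact = deformation > deformation.mean()   # crude contact mask
    proximity_dip = p < (p.mean() - k * p.std())    # outlier-style dip in P
    return bool(np.any(in_contact & proximity_dip))
```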
Figure 8. The two ways an eversion robot can deliver sensors in an unknown environment. (i–iv) Passing the sensor load through the central chamber of the eversion robot. (v) Cap mounted on the eversion robot's tip, to which a sensor load can be attached for transportation as growth occurs.
Figure 9. Overview of the proposed virtual reality-based robot teleoperation. (a) A human operator uses VR to command a robot. (b) A human operator’s view of the VR representation of the robot and its environment used for direct teleoperation. (c) Hand gestures used for rotating and scaling the VR scene. (Reprinted with permission from ref. [73]. Copyright 2020 IEEE).
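As a rough sketch of the two command mappings behind Figure 9 and ref. [73] (position control under workspace scaling, and rate mode for reaching targets beyond the scaled workspace), consider the following. The scale factor, gain, and function names are illustrative assumptions, not the parameters of the actual system.

```python
# Illustrative sketch only (parameters assumed, not those of ref. [73]):
# mapping an operator's hand pose to robot commands under workspace
# scaling (position mode) and rate mode control.
import numpy as np

SCALE = 2.5       # assumed operator-to-robot workspace scaling factor
RATE_GAIN = 0.5   # assumed velocity gain for rate mode (1/s)
DT = 0.01         # control period (s)

def position_mode(hand_pos, hand_origin, robot_origin):
    """Scale the operator's hand displacement into the robot's workspace."""
    return robot_origin + SCALE * (hand_pos - hand_origin)

def rate_mode(robot_pos, hand_offset):
    """Integrate the hand offset as a velocity command for one step."""
    return robot_pos + RATE_GAIN * hand_offset * DT

# Example: a 10 cm hand displacement becomes a 25 cm robot displacement.
print(position_mode(np.array([0.1, 0.0, 0.0]), np.zeros(3), np.zeros(3)))
```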
Figure 10. The particle jamming interface technology used as: (a) a tabletop touch-pad type device; and (b) a joystick.
Figure 11. Teleoperation system based on a seated ankle interface for controlling the locomotion of remote mobile robotic systems. A human operator uses foot-tapping movements to control the walking of a humanoid robot, i.e., its gait. Terrain feedback from the remote environment is rendered to the human operator.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
