A ROS2-Based Gateway for Modular Hardware Usage in Heterogeneous Environments
Abstract
1. Introduction
2. State of the Art
2.1. Background
2.1.1. ROS2 Architecture
2.1.2. ROS2 Package
2.1.3. ROS2 Topics
2.1.4. ROS2 Launch File
2.2. Related Work
3. Architecture and Prototype
3.1. Logical Architecture
- Client Layer—this layer represents the clients. Entities in this layer are limited to the external control of the target robot; the underlying processes and remaining features fall outside their scope. The endpoints and topics visible to the client side also form part of this layer.
- Middleware Layer—this is the main layer of the architecture, housing the core components of the solution. Everything from the initial launcher script, which defines how the robot will behave, to the orchestration of the ROS2 packages according to user input is handled in this layer. The aim is for all of these processes to be transparent to the users.
- Physical Layer—this layer represents the physically attachable components, which are used as the robot’s payloads through ROS2 packages. These packages are responsible for converting ROS2 messages to the applicable payload’s low-level communication protocol.
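As a minimal illustration of the Physical Layer's role, the sketch below models how a payload package might translate a high-level ROS2-style command into the low-level frame a payload expects. The frame layout, topic callback, and class names are assumptions for illustration, not the actual package API (a real package would use an `rclpy` node and a serial port).

```python
# Illustrative sketch: a payload package mapping a ROS2-style message
# onto a payload's low-level protocol. Frame layout and names are
# assumptions, not the real OpenManipulator package interface.

def pose_to_frame(pose):
    """Encode a joint-angle command as a simple byte frame:
    header 0xAA, joint count, then one byte per joint angle (degrees)."""
    frame = bytearray([0xAA, len(pose)])
    for angle in pose:
        frame.append(angle & 0xFF)  # two's-complement byte per joint
    return bytes(frame)

class ArmPayloadNode:
    """Stand-in for a ROS2 node subscribed to an arm command topic."""
    def __init__(self, transport):
        self.transport = transport  # a serial port in a real package

    def on_command(self, msg):
        # Callback a real node would register on its command topic
        self.transport.write(pose_to_frame(msg["joints"]))

class FakeSerial:
    """Records written frames in place of real payload hardware."""
    def __init__(self):
        self.sent = []
    def write(self, data):
        self.sent.append(data)

port = FakeSerial()
node = ArmPayloadNode(port)
node.on_command({"joints": [10, 20, 30]})
print(port.sent[0].hex())  # aa030a141e
```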
3.2. Non-ROS2 Important Concepts
3.2.1. Gateway Launcher Script
3.2.2. Bridging Process
3.3. Implementation
3.3.1. Prototype’s Architecture
3.3.2. ROS2 Launch File Development
3.3.3. ROS2 Payload Packages
3.3.4. ROS2 Communication Packages and Bridging Process
3.3.5. Bash User Interface
4. Results
4.1. The Select and Interact with Payloads Test Case
- Purpose: The aim of this test was to ascertain whether the gateway functions as intended, enabling the selection of and interaction with a chosen payload. This should demonstrate that the mechanism allows for payload modularity and adapts the robot to an on-demand communication technology.
- Test bed: Table 1 presents a list of resources required to perform this test.
- Results: The test case showed that the gateway is able to provide on-demand payload usage as well as a high-level communication protocol.
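The selection step can be pictured as a mapping from the user's choices to the ROS2 launch commands the gateway starts. The sketch below is hypothetical: the package and launch-file names are placeholders, not the gateway's actual configuration.

```python
# Hypothetical sketch of the gateway's selection step: user-chosen
# payloads and communication technologies become an ordered list of
# launch commands. All package/launch-file names are illustrative.

PAYLOAD_LAUNCHES = {
    "arm": "ros2 launch arm_payload arm.launch.py",
    "camera": "ros2 launch camera_payload camera.launch.py",
}
COMM_LAUNCHES = {
    "mqtt": "ros2 launch gateway_bridges mqtt_bridge.launch.py",
    "websockets": "ros2 launch gateway_bridges ws_bridge.launch.py",
}

def build_launch_plan(payloads, comms):
    """Return the launch commands for the chosen setup, payloads first."""
    plan = [PAYLOAD_LAUNCHES[p] for p in payloads]
    plan += [COMM_LAUNCHES[c] for c in comms]
    return plan

plan = build_launch_plan(["arm"], ["mqtt"])
```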
4.2. The Arm Manipulation through Multiple Communication Technologies Test Case
- Purpose: The aim of this test case is to ascertain whether the user can launch the UI script and select the arm payload and multiple high-level communication protocols in a seamless manner, and whether the arm can be controlled by all the selected communication technologies simultaneously.
- Test Bed: Table 2 refers to the resources used for this test case.
- Results: In this test case, the user chose MQTT, WebSockets and Kafka, as can be observed in Figure 11 and Figure 12. Once everything was running, MQTTX (Figure 13) was used to communicate through MQTT, Postman was used for WebSockets (Figure 14) and a Python script was employed for Kafka communication (Figure 15); everything worked as expected.
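Whichever transport carries it (MQTT, WebSockets or Kafka), the arm command ultimately has to be serialised into a message the bridge can map onto ROS2. A plausible sketch of the Python script's role follows; the JSON schema and topic name are assumptions, not the test bed's actual message format.

```python
import json

# Sketch of a client-side command script: build a JSON arm command that
# any of the three transports could carry. The schema is an assumption.

def make_arm_command(joint_angles, path_time=2.0):
    """Serialise an arm command as UTF-8 JSON bytes."""
    return json.dumps({
        "payload": "arm",
        "joint_angles": joint_angles,
        "path_time": path_time,
    }).encode("utf-8")

msg = make_arm_command([0.0, -1.05, 0.35, 0.70])
# With the kafka-python library this could then be published as, e.g.:
#   KafkaProducer(bootstrap_servers="broker:9092").send("arm_commands", msg)
```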
4.3. The Simulation of an Object Grab Test Case
- Purpose: This test case aims to observe and touch an object. To this end, a web interface was created that allows the user to visualise the images acquired by an Intel RealSense camera attached to the robot arm and to manipulate the arm in order to reach the object. In this test case, WebSockets were used for the interaction between the user and the robot.
- Test Bed: Table 3 contains all the resources used for the execution of this test.
- Results: As part of this preliminary test, once the WebSockets server was activated, the arm and camera payloads and WebSockets were selected in the gateway’s initial configuration. Subsequently, all the connection settings were provided and everything proceeded without incident. Once all the necessary packages had been launched, the UI script printed the connection data for each of the launched payloads. At this point, an HTML web client was created, in which an iframe object’s source attribute was set to the stream’s URL. HTML buttons were then created for each of the arm’s defined poses, with each button sending a different message through WebSockets. The result was a web page (Figure 16) receiving the camera stream, in which the target object was visible, with three buttons that, once clicked, would move the arm until it reached the object (Figure 17).
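The web client described above (stream iframe plus one button per pose) can be sketched as a generated HTML page. The pose names, stream URL and WebSockets endpoint below are placeholders, not the actual test-bed values.

```python
# Sketch of the test's web client, generated as an HTML string: an
# iframe showing the camera stream and one button per arm pose, each
# sending a different message over WebSockets. URLs and pose names
# are illustrative placeholders.

POSES = ["home", "approach", "grab"]

def build_client_page(stream_url, ws_url):
    """Return the HTML for the client page."""
    buttons = "\n".join(
        f'<button onclick=\'ws.send("{p}")\'>{p}</button>' for p in POSES
    )
    return f"""<!DOCTYPE html>
<html><body>
<iframe src="{stream_url}" width="640" height="480"></iframe>
{buttons}
<script>const ws = new WebSocket("{ws_url}");</script>
</body></html>"""

page = build_client_page("http://robot:8080/stream", "ws://robot:9090")
```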
4.4. The SmartFarm Project Gateway Test Case
- Purpose: This test brings together the work of different researchers in the same research group to create a controlled multi-component real-world scenario. The objective is to integrate cloud robotics and computer vision research with artificial intelligence through the ROS2-based gateway to enable a robot prototype to move towards an orange tree and grab oranges with a robotic arm. The user must first connect to the robot via the cloud robotics platform, activate the wheels, camera and arm payloads, and then remotely control the wheels from the cloud robotics platform with the aid of the gateway. Once the robot reaches the tree, the camera payload observes the orange tree and sends the captured frames to an external WebSockets server via the ROS2-based gateway. The server then uses the computer vision AI model to detect all the oranges and extract their relative coordinates, which are sent back via WebSockets to the ROS2-based gateway. The gateway then converts the coordinate messages into the ROS2 message structure supported by the arm so that the arm can move towards each orange.
- Test Bed: Table 4 enumerates all the resources required when conducting this test.
- Results: In this test case, the gateway was responsible for interacting with the robotic arm, the robot’s wheels and the camera, and for sending and receiving WebSockets messages, while being integrated into a more complex scenario in which a cloud robotics platform was used as well as an AI model to recognise oranges in an orange tree. The robot was manually guided to approach the small orange tree; the orange picking/touching operation was then fully automated and completed successfully (Figure 18 and Figure 19). Employing an AI model to detect orange trees would also replace the manual approach with a fully automated task of visiting orange trees in an orchard and picking/touching the oranges.
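The gateway's conversion step in this test (detected orange coordinates arriving over WebSockets, turned into an arm goal per orange) can be sketched as follows. Both message schemas here are assumptions for illustration, not the actual formats used by the AI server or the arm package.

```python
import json

# Sketch of the SmartFarm conversion step: a WebSockets JSON message
# with detected orange coordinates becomes one ROS2-style kinematics
# goal per orange. Both schemas are illustrative assumptions.

def oranges_to_goals(ws_message, path_time=3.0):
    """Return one position goal (arm frame, metres) per detection."""
    detections = json.loads(ws_message)["oranges"]
    return [
        {
            "kinematics_pose": {
                "pose": {"position": {"x": d["x"], "y": d["y"], "z": d["z"]}}
            },
            "path_time": path_time,
        }
        for d in detections
    ]

incoming = json.dumps({"oranges": [{"x": 0.20, "y": 0.05, "z": 0.15}]})
goals = oranges_to_goals(incoming)
```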
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Villa, D.; Song, X.; Heim, M.; Li, L. Internet of Robotic Things: Current Technologies, Applications, Challenges and Future Directions. arXiv 2021, arXiv:2101.06256. [Google Scholar]
- Vermesan, O.; Bahr, R.; Ottella, M.; Serrano, M.; Karlsen, T.; Wahlstrom, T.; Sand, H.E.; Ashwathnarayan, M.; Gamba, M.T. Internet of Robotic Things Intelligent Connectivity and Platforms. Front. Robot AI 2020, 7, 104. [Google Scholar] [CrossRef] [PubMed]
- Romeo, L.; Petitti, A.; Marani, R.; Milella, A. Internet of Robotic Things in Smart Domains: Applications and Challenges. Sensors 2020, 20, 12. [Google Scholar] [CrossRef] [PubMed]
- Post, M.A.; Yan, X.-T.; Letier, P. Modularity for the future in space robotics: A review. Acta Astronaut. 2021, 189, 530–547. [Google Scholar] [CrossRef]
- Zou, Y.; Kim, D.; Norman, P.; Espinosa, J.; Wang, J.-C.; Virk, G.S. Towards robot modularity—A review of international modularity standardization for service robots. Robot. Auton. Syst. 2022, 148, 103943. [Google Scholar] [CrossRef]
- ISO 22166-1:2021. Available online: https://www.iso.org/standard/72715.html (accessed on 26 April 2024).
- Macenski, S.; Foote, T.; Gerkey, B.; Lalancette, C.; Woodall, W. Robot Operating System 2: Design, Architecture, and Uses In The Wild. Sci. Robot. 2022, 7, abm6074. [Google Scholar] [CrossRef] [PubMed]
- Gupta, B.B.; Nedjah, N. Safety, Security, and Reliability of Robotic Systems: Algorithms, Applications, and Technologies; CRC Press: Boca Raton, FL, USA, 2020. [Google Scholar]
- Boubaker, O. (Ed.) Front Matter. In Medical and Healthcare Robotics, in Medical Robots and Devices: New Developments and Advances; Academic Press: Cambridge, MA, USA, 2023; pp. i–iii. [Google Scholar] [CrossRef]
- Chennareddy, S.S.R.; Agrawal, A.; Karuppiah, A. Modular Self-Reconfigurable Robotic Systems: A Survey on Hardware Architectures. J. Robot. 2017, 2017, 5013532. [Google Scholar] [CrossRef]
- Daudelin, J.; Jing, G.; Tosun, T.; Yim, T.; Kress-Gazit, H.; Campbell, M. An Integrated System for Perception-Driven Autonomy with Modular Robots. Sci. Robot. 2018, 3. [Google Scholar] [CrossRef] [PubMed]
- Ha, S.; Kim, J.; Yamane, K. Automated Deep Reinforcement Learning Environment for Hardware of a Modular Legged Robot. In Proceedings of the 5th International Conference on Ubiquitous Robots (UR), Honolulu, HI, USA, 26–30 June 2018; pp. 348–354. [Google Scholar] [CrossRef]
- Seo, J.; Paik, J.; Yim, M. Modular Reconfigurable Robotics. Annu. Rev. Control. Robot. Auton. Syst. 2019, 2, 63–88. [Google Scholar] [CrossRef]
- Roach, M.A.; Penney, J.; Jared, B.H. Exploring a supervisory control system using ROS2 and IOT sensors. In Proceedings of the 2023 International Solid Freeform Fabrication Symposium, Austin TX, USA, 14–16 August 2023. [Google Scholar]
- Kim, Y.; Lee, D.; Jeong, S.; Moon, H.; Yu, C. Development of ROS2-on-Yocto-based Thin Client Robot for Cloud Robotics. J. Korea Robot. Soc. 2021, 16, 327–335. [Google Scholar] [CrossRef]
- IEEE Xplore Full-Text PDF. Available online: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9945526 (accessed on 5 June 2024).
- IEEE Xplore Full-Text PDF. Available online: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10161307 (accessed on 5 June 2024).
- Integrating Launch Files into ROS2 Packages—ROS2 Documentation: Foxy Documentation. Available online: https://docs.ros.org/en/foxy/Tutorials/Intermediate/Launch/Launch-system.html (accessed on 15 May 2024).
- RobotWebTools. RobotWebTools/web_video_server. 2024. Available online: https://github.com/RobotWebTools/web_video_server (accessed on 15 May 2024).
- ROBOTIS. ROBOTIS-GIT/open_manipulator. 2024. Available online: https://github.com/ROBOTIS-GIT/open_manipulator (accessed on 14 June 2024).
- Intel® RealSense™. IntelRealSense/Realsense-ros. 2024. Available online: https://github.com/IntelRealSense/realsense-ros (accessed on 14 June 2024).
| Category | Resource |
|----------|----------|
| Hardware | Raspberry Pi 4 Compute Module |
| | Turtlebot Waffle |
| | Open Manipulator X |
| | OpenCR Microcontroller |
| | Windows 10 Machine |
| Software | Linux Ubuntu 20.04 |
| | ROS2 Foxy |
| | MQTT Broker |
| | MQTTX |
| | Robotis OpenManipulator ROS2 package [20] |
| | Visual Studio Code |
| Category | Resource |
|----------|----------|
| Hardware | Raspberry Pi 4 Compute Module |
| | Turtlebot Waffle |
| | Open Manipulator X |
| | OpenCR Microcontroller |
| | Windows 10 Machine |
| Software | Linux Ubuntu 20.04 |
| | ROS2 Foxy |
| | MQTT Broker |
| | MQTTX |
| | Postman |
| | Python Script |
| | Robotis OpenManipulator ROS2 package [20] |
| | Visual Studio Code |
| Category | Resource |
|----------|----------|
| Hardware | Raspberry Pi 4 Compute Module |
| | Turtlebot Waffle |
| | Open Manipulator X |
| | Camera Intel RealSense D435i |
| | OpenCR Microcontroller |
| | Windows 10 Machine |
| Software | Linux Ubuntu 20.04 |
| | ROS2 Foxy |
| | HTML Web page |
| | Webserver ROS2 package [19] |
| | Robotis OpenManipulator ROS2 package [20] |
| | IntelRealSense ROS2 package [21] |
| | Visual Studio Code |
| Category | Resource |
|----------|----------|
| Hardware | Raspberry Pi 4 Compute Module |
| | Turtlebot Waffle |
| | Open Manipulator X |
| | Camera Intel RealSense D435i |
| | OpenCR Microcontroller |
| | ESP32 Microcontroller |
| | Developed Wheels prototype |
| | Fedora Linux Machine |
| | 2x Windows 10 Machine |
| Software | Linux Ubuntu 20.04 |
| | ROS2 Foxy |
| | Cloud Robotics Platform |
| | Artificial Intelligence Model |
| | Robotis OpenManipulator ROS2 package [20] |
| | IntelRealSense ROS2 package [21] |
| | Visual Studio Code |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Carreira, R.; Costa, N.; Ramos, J.; Frazão, L.; Pereira, A. A ROS2-Based Gateway for Modular Hardware Usage in Heterogeneous Environments. Sensors 2024, 24, 6341. https://doi.org/10.3390/s24196341