Editorial

Special Issue on Trends and Challenges in Robotic Applications

1 Instituto de Diseño y Fabricación (IDF), Universitat Politècnica de València, 46022 Valencia, Spain
2 Instituto de Investigación en Ingeniería I3E, Universidad Miguel Hernández de Elche, 03202 Elche, Spain
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(16), 9131; https://doi.org/10.3390/app13169131
Submission received: 4 August 2023 / Accepted: 9 August 2023 / Published: 10 August 2023
(This article belongs to the Special Issue Trends and Challenges in Robotic Applications)

1. Introduction

The world of robotics has evolved rapidly in recent years, with groundbreaking advancements and innovative applications becoming increasingly prevalent. Robots are no longer limited to traditional industrial environments; they are now being deployed across diverse sectors, revolutionizing processes and transforming how humans perceive robotics. This Special Issue explores the latest trends and challenges in robotic applications, shedding light on how these technological advances are shaping the future of automation.
This compendium of papers focuses on the current research fields, trends and challenges in robotic applications and includes a set of review papers that allow us to frame this Special Issue in a general context. Four of the presented articles describe areas of robotic application and their associated trends and challenges, including service robots [1], advanced applications in industry [2], multiple object tracking [3] and drone control and localization [4].

2. Fields of Application

Robots are continuously being introduced to new environments, and their applications are becoming more diverse. This Special Issue contains articles from the following research fields: collaborative robots, service robotics, computer vision, mobile robots, and other advanced tools in robotics.

2.1. Collaborative Robots

Collaborative robots, or cobots, are designed to work alongside humans, creating safer and more efficient work environments and enabling the development of new strategic approaches to problems [5] and alternative technical solutions. Unlike their predecessors, cobots are equipped with advanced sensors and sophisticated algorithms that enable them to perceive and respond to human movements. As a result, they can assist in tasks that require human dexterity and decision making while reducing the risk of workplace accidents. From manufacturing assembly lines and assembly techniques [6] to healthcare settings, cobots are being adopted across multiple industries to enhance human capabilities.
The authors of [7] developed a compliant human–robot collaboration system with accurate path-tracking ability, endowing an industrial robot manipulator with enhanced capabilities and improved system behavior.

2.2. Service Robotics

Service robots [1] are gaining popularity and significantly impacting how businesses operate and humans interact with technology. In the hospitality industry, robots are now employed to carry out room service, reception duties and concierge services. In healthcare, robots assist in patient care, ranging from nanoelectromechanical devices for medical applications [8] to rehabilitation support [9,10] and surgical robotics [11]. Moreover, service robots are also making inroads in agriculture [12], retail and public spaces, presenting new opportunities for automation in areas that previously relied solely on human labor.
Robotic rehabilitation is a challenging field that involves a transversal flow of knowledge with other robotic technologies. For example, lower limb rehabilitation robotics and humanoid auto-balanced walking robotics [13] share a common concern: both must maintain stability and balance during movement. In lower limb rehabilitation robotics, ensuring the patient’s safety and preventing falls are crucial aspects of the rehabilitation process, so the robot’s control algorithms and mechanical design are geared towards providing stable and controlled movements during therapy sessions. Humanoid auto-balanced walking robotics, on the other hand, aims to replicate human-like walking patterns while autonomously maintaining balance. These robots typically rely on advanced sensors, such as inertial measurement units (IMUs), cameras and force/torque sensors, to continually assess their orientation and stability during walking, and advanced control algorithms adjust the robot’s posture and foot placement in real time to preserve balance even on uneven or challenging terrain. Technologies and insights developed in one field can therefore inform and improve the other, fostering cross-disciplinary advancements in robotics research [14].
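As a concrete illustration of the feedback principle described above, the following is a minimal sketch, not taken from [13] or [14], of a proportional-derivative balance loop that converts an IMU pitch estimate into a corrective ankle torque for an inverted-pendulum model of standing; the gains and physical parameters are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration: a PD balance loop that maps an IMU pitch estimate
# to a corrective ankle torque for an inverted-pendulum standing model.
# All gains and physical parameters below are assumed, not from [13] or [14].
KP, KD = 800.0, 80.0          # PD gains (N*m/rad, N*m*s/rad)
DT = 0.005                    # 200 Hz control period (s)
M, L, G = 60.0, 0.9, 9.81     # mass (kg), pendulum height (m), gravity (m/s^2)

def ankle_torque(pitch, pitch_rate, pitch_ref=0.0):
    """Corrective torque that drives the measured pitch back to the reference."""
    return KP * (pitch_ref - pitch) - KD * pitch_rate

# Toy simulation: start 0.05 rad off vertical and let the controller recover.
pitch, pitch_rate = 0.05, 0.0
for _ in range(1000):                                   # 5 s of simulated time
    tau = ankle_torque(pitch, pitch_rate)
    pitch_acc = (M * G * L * np.sin(pitch) + tau) / (M * L * L)  # I = M*L^2
    pitch_rate += pitch_acc * DT
    pitch += pitch_rate * DT
print(f"pitch after 5 s: {pitch:+.4f} rad")
```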

2.3. Computer Vision and Other Advanced Techniques

Computer vision focuses on enabling computers to interpret and understand visual information from the world, including objects and humans [15]. It involves the development of algorithms and techniques that allow machines to process, analyze and extract meaningful insights from images and videos. In recent years, computer vision has undergone significant advancements, leading to the emergence of new technologies that allow for new interaction applications; for example, in [16], a moving object is manipulated by means of a batting primitive to play table tennis with a human player.
Pose estimation is a fundamental concept in computer vision that involves determining the position and orientation of an object or a camera relative to a specific coordinate system. It plays a crucial role in various applications, such as augmented reality [17], robotics, 3D scene reconstruction and human–computer interaction. In simple terms, pose estimation aims to answer the question: “Where is the object or camera located, and how is it oriented in 3D space?” To achieve this, computer vision algorithms analyze visual data, typically in the form of images or videos, and extract relevant features or keypoints from the objects of interest. Similarly, [18] presents an interesting approach to pose computation using a 3D reconstruction of data obtained using an RGB-D multi-camera. A groundbreaking piece of technology that is currently attracting attention due to its relatively low cost is Time-of-Flight (ToF) sensing, and the authors of [19] present a multi-perspective ToF laser ranging system using prisms and mirrors.
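To make the keypoint-based formulation concrete, the following is a minimal sketch of single-object pose estimation with OpenCV’s solvePnP. It is a generic example rather than the pipeline of [18] or [19]; the 3D model points, 2D detections and camera intrinsics are hypothetical placeholders that, in practice, would come from a CAD model, a keypoint detector and a camera calibration step.

```python
import numpy as np
import cv2

# Generic sketch of keypoint-based 6D pose estimation (hypothetical data).
object_points = np.array([   # 3D keypoints of the object in its own frame (m)
    [0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.1, 0.1, 0.0], [0.0, 0.1, 0.0],
    [0.0, 0.0, 0.05], [0.1, 0.1, 0.05]], dtype=np.float64)
image_points = np.array([    # matching 2D detections in the image (px)
    [320.0, 240.0], [400.0, 242.0], [398.0, 320.0], [318.0, 318.0],
    [322.0, 200.0], [396.0, 280.0]], dtype=np.float64)
K = np.array([[800.0, 0.0, 320.0],    # pinhole intrinsics: fx, fy, cx, cy
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                    # assume an undistorted image

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)            # rotation of the object in the camera frame
print("rotation:\n", R, "\ntranslation (m):", tvec.ravel())
```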
The authors of [20] discuss the significance of 6D pose estimation in industry and its application in tasks such as bin picking and autopilot. They provide an up-to-date, thorough review of 6D pose estimation methods, categorized into non-learning-based and learning-based approaches, compare their performance, and outline the challenges and future trends in this field.
Using data from external sensors that measure 3D/6D locations is crucial to determining robots’ kinematic parameters and the transformation between the world coordinate frame and the robot base. The study presented in [21] shows that full pose measurements result in significantly smaller robot orientation errors compared to using 3D data alone, while the robot position errors remain similar in both cases.
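In its simplest point-based form, this world-to-base identification can be solved in closed form with the SVD-based (Kabsch) alignment; the sketch below shows that generic textbook solution rather than the calibration procedure of [21], and all numbers are synthetic.

```python
import numpy as np

def rigid_transform(P_world, P_base):
    """Return R, t such that P_base ~= R @ p_world + t (Kabsch/SVD solution)."""
    cw, cb = P_world.mean(axis=0), P_base.mean(axis=0)
    H = (P_world - cw).T @ (P_base - cb)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = cb - R @ cw
    return R, t

# Toy check: recover a known world-to-base transform from noisy 3D points.
rng = np.random.default_rng(0)
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 0.1])
P_world = rng.uniform(-1.0, 1.0, size=(20, 3))
P_base = (R_true @ P_world.T).T + t_true + 0.001 * rng.standard_normal((20, 3))
R_est, t_est = rigid_transform(P_world, P_base)
print("rotation error:", np.linalg.norm(R_est - R_true))
```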

2.4. Mobile Robots

Mobile robots, including autonomous vehicles, are an exciting and rapidly evolving technology that has the potential to revolutionize robotic transportation. As research and development in this field progress, several new topics are emerging that address the challenges and enhance the capabilities of autonomous mobile robots, including swarm robotics and its tools [22]. Examples of this flourishing field of research include [23], which focuses on automatically allocating parking space for autonomous and human-operated vehicles, and [24], which presents a control system developed to achieve mobile robot formations based on the leader–follower method. In [25], LiDAR-based navigation is proposed for environments with sparse features.
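As a toy illustration of the leader–follower idea, and not the attitude-measurement-free controller of [24], the sketch below computes velocity commands for a unicycle follower holding a fixed offset behind a leader; the gains, offset and example poses are assumed values.

```python
import numpy as np

# Hypothetical leader-follower sketch: a unicycle follower keeps a fixed
# offset behind the leader using proportional feedback on the position error.
K_V, K_W = 0.8, 1.5             # linear and angular gains (illustrative)
OFFSET = np.array([-0.5, 0.0])  # desired position: 0.5 m behind the leader

def follower_cmd(leader_pose, follower_pose):
    """Return (v, w) commands; poses are [x, y, theta] in the world frame."""
    xl, yl, thl = leader_pose
    xf, yf, thf = follower_pose
    c, s = np.cos(thl), np.sin(thl)
    # Target point: the offset expressed in the leader frame, mapped to world.
    target = np.array([xl + c * OFFSET[0] - s * OFFSET[1],
                       yl + s * OFFSET[0] + c * OFFSET[1]])
    err = target - np.array([xf, yf])
    v = K_V * (np.cos(thf) * err[0] + np.sin(thf) * err[1])  # error along heading
    ang_err = np.arctan2(err[1], err[0]) - thf
    w = K_W * np.arctan2(np.sin(ang_err), np.cos(ang_err))   # wrapped heading error
    return v, w

v, w = follower_cmd([1.0, 0.5, 0.3], [0.0, 0.0, 0.0])
print(f"v = {v:.2f} m/s, w = {w:.2f} rad/s")
```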
A special case of this technology is robotic navigation inside tunnels and mines. The authors of [26] present a localization and navigation system for autonomous load-haul-dump (LHD) vehicles in tunnels, and a related work presents an autonomous loading system for this kind of vehicle [27].

2.5. Other Advanced Tools in Robotics

Advanced tools for robotics applications encompass a wide range of software and hardware technologies that enhance the capabilities and efficiency of robotic systems. These tools are designed to streamline development, improve control, and enable robots to carry out complex tasks with higher precision. For instance, the authors of [28] use robot audition methods in order to estimate the elevation and azimuth angles of birds’ vocalizations.
In [29], the authors propose a new stochastic method that efficiently tracks desired end-effector task-space motions for mechanisms with redundant actuation and is applicable to industrial and collaborative robots. It exploits manipulability measures and null-space configurations to achieve better manipulability together with a collision-free task-space trajectory, providing a computationally tractable alternative to optimal motion planning, with promising results in simulations and on a real robot.
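The null-space mechanism mentioned above is classically written as qdot = J^+ xdot + (I - J^+ J) k0 grad(w); the sketch below implements that standard resolved-rate formulation, not the stochastic planner of [29], and uses a random Jacobian as a stand-in for a real 7-DOF kinematic model.

```python
import numpy as np

def redundant_velocity(J, xdot, grad_secondary, k0=0.1):
    """qdot = J^+ xdot + (I - J^+ J) * k0 * grad_secondary."""
    J_pinv = np.linalg.pinv(J)
    null_proj = np.eye(J.shape[1]) - J_pinv @ J     # null-space projector
    return J_pinv @ xdot + null_proj @ (k0 * grad_secondary)

# Toy usage: random 6x7 Jacobian standing in for a 7-DOF arm model.
rng = np.random.default_rng(1)
J = rng.standard_normal((6, 7))
xdot = np.array([0.1, 0.0, 0.0, 0.0, 0.0, 0.0])     # desired end-effector twist
grad_w = rng.standard_normal(7)                     # e.g., a manipulability gradient
qdot = redundant_velocity(J, xdot, grad_w)
print("task-space tracking residual:", np.linalg.norm(J @ qdot - xdot))
```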
Kinematics is a fundamental aspect of robotics because it deals with the study of robots’ mechanical link motions. The work presented in [30] discusses the challenges in selecting structural parameters for artificial neural networks, which often relies on trial-and-error procedures. It presents a design method based on neural networks and Genichi Taguchi’s approach, which is applied to solving inverse kinematics in a robotic manipulator, leading to improved result accuracy with a prediction percentage above 90% and a margin of error under 5%.
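For readers less familiar with this neural-network formulation, the sketch below trains a tiny multilayer perceptron to invert the forward kinematics of a planar two-link arm; the layer size, learning rate and sampling range are ad hoc choices standing in for the structural parameters that [30] selects systematically with the Taguchi approach.

```python
import numpy as np

# Hypothetical sketch: a small MLP learns the inverse kinematics of a planar
# 2-link arm (link lengths, network size and learning rate are assumed).
L1, L2 = 1.0, 0.8
rng = np.random.default_rng(0)

# Training data: sample joint angles, compute forward kinematics as inputs.
q = rng.uniform([0.1, 0.1], [np.pi - 0.1, np.pi - 0.1], size=(2000, 2))
x = np.stack([L1 * np.cos(q[:, 0]) + L2 * np.cos(q[:, 0] + q[:, 1]),
              L1 * np.sin(q[:, 0]) + L2 * np.sin(q[:, 0] + q[:, 1])], axis=1)

# One hidden layer with tanh activation, trained by full-batch gradient descent.
W1, b1 = 0.5 * rng.standard_normal((2, 32)), np.zeros(32)
W2, b2 = 0.5 * rng.standard_normal((32, 2)), np.zeros(2)
lr = 0.05
for _ in range(3000):
    h = np.tanh(x @ W1 + b1)
    q_hat = h @ W2 + b2
    err = q_hat - q                                   # prediction error
    # Backpropagation of the mean-squared-error loss.
    dW2 = h.T @ err / len(x); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = x.T @ dh / len(x); db1 = dh.mean(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

print("mean joint error (rad):", np.abs(q_hat - q).mean())
```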

3. Conclusions

The future of robotics is exciting and promising, with the discussed trends in robotic applications continuing to unfold. From collaborative robots enhancing workplace safety to AI-powered robots driving intelligent automation and service robots revolutionizing industries, the impact of robotics is becoming increasingly pervasive. As technology advances and becomes more accessible, we can expect robots to play an even more significant role in transforming various aspects of human life, contributing to a more efficient, productive, and innovative world. However, with these advancements also come ethical considerations and the need for thoughtful regulations to ensure the responsible and beneficial integration of robots into society.

4. Statistical Data

This Special Issue on “Trends and Challenges in Robotic Applications” covers a wide range of robotic applications. In total, 39 papers were received between 20 August 2020 and 16 February 2023. After the review process, 30 papers were accepted and 9 were rejected, giving an acceptance rate of 76.9%. The accepted papers originated from research centers based mainly in Latin America, Europe and China.

Author Contributions

Conceptualization, L.G. and C.P.-V.; investigation, L.G. and C.P.-V.; writing—original draft preparation, L.G. and C.P.-V.; writing—review and editing, L.G. and C.P.-V.; visualization, L.G. and C.P.-V.; supervision, L.G. and C.P.-V.; funding acquisition, L.G. and C.P.-V. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by the Spanish Government (Grant PID2020-117421RB-C21, funded by MCIN/AEI/10.13039/501100011033).

Acknowledgments

The Guest Editors would like to thank all the authors and reviewers who contributed to this Special Issue for their valuable work in disseminating and curating the information contained within it. The Guest Editors also thank the Applied Sciences journal for trusting them to manage this publication.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Gonzalez-Aguirre, J.A.; Osorio-Oliveros, R.; Rodríguez-Hernández, K.L.; Lizárraga-Iturralde, J.; Morales Menendez, R.; Ramírez-Mendoza, R.A.; Ramírez-Moreno, M.A.; Lozoya-Santos, J.d.J. Service Robots: Trends and Technology. Appl. Sci. 2021, 11, 10702. [Google Scholar] [CrossRef]
  2. Dzedzickis, A.; Subaciute-Žemaitiene, J.; Šutinys, E.; Samukaite-Bubniene, U.; Bucinskas, V. Advanced Applications of Industrial Robotics: New Trends and Possibilities. Appl. Sci. 2022, 12, 135. [Google Scholar] [CrossRef]
  3. Gad, A.; Basmaji, T.; Yaghi, M.; Alheeh, H.; Alkhedher, M.; Ghazal, M. Multiple Object Tracking in Robotic Applications: Trends and Challenges. Appl. Sci. 2022, 12, 9408. [Google Scholar] [CrossRef]
  4. Yousaf, J.; Zia, H.; Alhalabi, M.; Yaghi, M.; Basmaji, T.; Shehhi, E.A.; Gad, A.; Alkhedher, M.; Ghazal, M. Drone and Controller Detection and Localization: Trends and Challenges. Appl. Sci. 2022, 12, 12612. [Google Scholar] [CrossRef]
  5. Gutman, D.; Olatunji, S.; Edan, Y. Evaluating Levels of Automation in Human—Robot Collaboration at Different Workload Levels. Appl. Sci. 2021, 11, 7340. [Google Scholar] [CrossRef]
  6. Ortega-Aranda, D.; Jimenez-Vielma, J.F.; Saha, B.N.; Lopez-Juarez, I. Dual-Arm Peg-in-Hole Assembly Using DNN with Double Force/Torque Sensor. Appl. Sci. 2021, 11, 6970. [Google Scholar] [CrossRef]
  7. Reyes-Uquillas, D.; Hsiao, T. Compliant Human—Robot Collaboration with Accurate Path-Tracking Ability for a Robot Manipulator. Appl. Sci. 2021, 11, 5914. [Google Scholar] [CrossRef]
  8. Šajic, J.L.; Langthaler, S.; Baumgartner, C. Creating a Novel Mathematical Model of the Kv10.1 Ion Channel and Controlling Channel Activity with Nanoelectromechanical Systems. Appl. Sci. 2022, 12, 3836. [Google Scholar] [CrossRef]
  9. Menga, G. The Spherical Inverted Pendulum: Exact Solutions of Gait and Foot Placement Estimation Based on Symbolic Computation. Appl. Sci. 2021, 11, 1588. [Google Scholar] [CrossRef]
  10. Bressi, F.; Santacaterina, F.; Cricenti, L.; Campagnola, B.; Nasto, F.; Assenza, C.; Morelli, D.; Cordella, F.; Lapresa, M.; Zollo, L.; et al. Robotic-Assisted Hand Therapy with Gloreha Sinfonia for the Improvement of Hand Function after Pediatric Stroke: A Case Report. Appl. Sci. 2022, 12, 4206. [Google Scholar] [CrossRef]
  11. Rosa, M.; Liu, R.; Pitruzzello, G.; Tortora, G. A Smart Modular IoT Sensing Device for Enhancing Sensory Feedbacks in Surgical Robotics. Appl. Sci. 2022, 12, 8083. [Google Scholar] [CrossRef]
  12. Mao, W.; Liu, Z.; Liu, H.; Yang, F.; Wang, M. Research Progress on Synergistic Technologies of Agricultural Multi-Robots. Appl. Sci. 2021, 11, 1448. [Google Scholar] [CrossRef]
  13. Polakovic, D.; Juhás, M.; Juhásová, B.; Cervenanská, Z. Bio-Inspired Model-Based Design and Control of Bipedal Robot. Appl. Sci. 2022, 12, 10058. [Google Scholar] [CrossRef]
  14. Nakamura, K.; Saga, N. A Symmetry Evaluation Method, Using Elevation Angle, for Lower Limb Movement Patterns during Sitting-to-Standing. Appl. Sci. 2022, 12, 9454. [Google Scholar] [CrossRef]
  15. Khalifa, A.; Abdelrahman, A.A.; Strazdas, D.; Hintz, J.; Hempel, T.; Al-Hamadi, A. Face Recognition and Tracking Framework for Human–Robot Interaction. Appl. Sci. 2022, 12, 5568. [Google Scholar] [CrossRef]
  16. Joe, H.-M.; Lee, J.; Oh, J.-H. Dynamic Nonprehensile Manipulation of a Moving Object Using a Batting Primitive. Appl. Sci. 2021, 11, 3920. [Google Scholar] [CrossRef]
  17. García, A.; Solanes, J.E.; Muñoz, A.; Gracia, L.; Tornero, J. Augmented Reality-Based Interface for Bimanual Robot Teleoperation. Appl. Sci. 2022, 12, 4379. [Google Scholar] [CrossRef]
  18. de Medeiros Esper, I.; Smolkin, O.; Manko, M.; Popov, A.; From, P.J.; Mason, A. Evaluation of RGB-D Multi-Camera Pose Estimation for 3D Reconstruction. Appl. Sci. 2022, 12, 4134. [Google Scholar] [CrossRef]
  19. Pogacnik, L.; Munih, M. Towards a Multi-Perspective Time of Flight Laser Ranging Device Based on Mirrors and Prisms. Appl. Sci. 2022, 12, 7121. [Google Scholar] [CrossRef]
  20. He, Z.; Feng, W.; Zhao, X.; Lv, Y. 6D Pose Estimation of Objects: Recent Technologies and Challenges. Appl. Sci. 2021, 11, 228. [Google Scholar] [CrossRef]
  21. Franaszek, M.; Marvel, J.A. Using Full Pose Measurement for Serial Robot Calibration. Appl. Sci. 2022, 12, 3680. [Google Scholar] [CrossRef] [PubMed]
  22. Aurecianus, S.; Ha, G.-H.; Park, H.-C.; Kang, T.-S. Longitudinal Mode System Identification of an Insect-like Tailless Flapping-Wing Micro Air Vehicle Using Onboard Sensors. Appl. Sci. 2022, 12, 2486. [Google Scholar] [CrossRef]
  23. Wu, M.; Jiang, H.; Tan, C.-A. Automated Parking Space Allocation during Transition with both Human-Operated and Autonomous Vehicles. Appl. Sci. 2021, 11, 855. [Google Scholar] [CrossRef]
  24. Hirata-Acosta, J.; Pliego-Jiménez, J.; Cruz-Hernández, C.; Martínez-Clark, R. Leader-Follower Formation Control of Wheeled Mobile Robots without Attitude Measurements. Appl. Sci. 2021, 11, 5639. [Google Scholar] [CrossRef]
  25. Nguyen, P.T.-T.; Yan, S.-W.; Liao, J.-F.; Kuo, C.-H. Autonomous Mobile Robot Navigation in Sparse LiDAR Feature Environments. Appl. Sci. 2021, 11, 5963. [Google Scholar] [CrossRef]
  26. Mascaró, M.; Parra-Tsunekawa, I.; Tampier, C.; Ruiz-del-Solar, J. Topological Navigation and Localization in Tunnels—Application to Autonomous Load-Haul-Dump Vehicles Operating in Underground Mines. Appl. Sci. 2021, 11, 6547. [Google Scholar] [CrossRef]
  27. Tampier, C.; Mascaró, M.; Ruiz-del-Solar, J. Autonomous Loading System for Load-Haul-Dump (LHD) Machines Used in Underground Mining. Appl. Sci. 2021, 11, 8718. [Google Scholar] [CrossRef]
  28. Suzuki, R.; Hayashi, K.; Osaka, H.; Matsubayashi, S.; Arita, T.; Nakadai, K.; Okuno, H.G. Estimating the Soundscape Structure and Dynamics of Forest Bird Vocalizations in an Azimuth-Elevation Space Using a Microphone Array. Appl. Sci. 2023, 13, 3607. [Google Scholar] [CrossRef]
  29. Gil Aparicio, A.; Valls Miro, J. An Efficient Stochastic Constrained Path Planner for Redundant Manipulators. Appl. Sci. 2021, 11, 10636. [Google Scholar] [CrossRef]
  30. Ibarra-Pérez, T.; Ortiz-Rodríguez, J.M.; Olivera-Domingo, F.; Guerrero-Osuna, H.A.; Gamboa-Rosales, H.; Martínez-Blanco, M.d.R. A Novel Inverse Kinematic Solution of a Six-DOF Robot Using Neural Networks Based on the Taguchi Optimization Technique. Appl. Sci. 2022, 12, 9512. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
