
Instrumentation in Interactive Robotic and Automation

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: closed (15 September 2023) | Viewed by 19933

Special Issue Editors


Prof. Dr. Giovanni Beltrame
Guest Editor
Department of Computer and Software Engineering, Polytechnique Montréal, 2900 Boul. Édouard-Montpetit, Montreal, QC, Canada
Interests: mobile robots; multi-robot systems; internet of things; SLAM (robots); collision avoidance; decentralised control; embedded systems; position control; robot vision; wireless sensor networks

Prof. Dr. Martin Otis
Guest Editor
RISUQ Member, CISD Member, Automation and Interactive Robotics Lab (LAR.i, Laboratoire d'automatique et de robotique interactive), Applied Sciences Department, Université du Québec à Chicoutimi, Chicoutimi, QC, Canada
Interests: interactive mechatronic systems; robot; IMU; wearable device

Special Issue Information

Dear Colleagues,

In the current industrial context, the high demand for small-batch customized products is leading to the development of flexible, responsive, and highly autonomous manufacturing systems.

These new requirements in industrial automation must be met with new sensing technologies that bridge different machines and systems. Smart sensing allows the collaboration of networked systems (such as those in smart factories), promoting just-in-time production based on data-driven demand models, for a transparent and intelligent supply chain. The aim of this Special Issue is to explore new theory, technologies, and applications of smart sensing in robotics and automation, bringing useful knowledge to practitioners.

We invite high-quality, novel, innovative, and unpublished contributions on any topic focused on sensors, drawing on concepts from the fields of robotics, industrial automation, flexible manufacturing systems, cyber-physical (social) systems, process optimization, and diagnosis, including (but not limited to) the following:

  • Microelectronic smart and virtual sensors;
  • Sensor fusion;
  • Sensors for human–robot collaboration and interaction;
  • Advanced techniques for diagnostics; predictive maintenance; and self-organized, self-maintaining systems;
  • Haptics and teleoperation;
  • Cyber-physical social systems and human-in-the-loop flexible manufacturing systems;
  • Digital twins;
  • Multirobot systems;
  • Robot swarms;
  • Data-driven control, learning-based control, robust and predictive control;
  • Data-driven, demand-driven, and Lean Six Sigma approaches;
  • Human safety using wearable smart devices;
  • Sensors for intelligent supply chain and process optimization;
  • Applications: welding, assembly, grasping, painting, etc.

Prof. Dr. Giovanni Beltrame
Prof. Dr. Martin Otis
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (5 papers)


Research

21 pages, 35881 KiB  
Article
Minimize Tracking Occlusion in Collaborative Pick-and-Place Tasks: An Analytical Approach for Non-Wrist-Partitioned Manipulators
by Hamed Montazer Zohour, Bruno Belzile, Rafael Gomes Braga and David St-Onge
Sensors 2022, 22(17), 6430; https://doi.org/10.3390/s22176430 - 26 Aug 2022
Cited by 1 | Viewed by 2054
Abstract
Several industrial pick-and-place applications, such as collaborative assembly lines, rely on visual tracking of the parts. Recurrent occlusions caused by the manipulator's motion decrease line productivity and can provoke failures. This work provides a complete solution for maintaining an occlusion-free line of sight between a variable-pose camera and the object to be picked by a 6R manipulator that is not wrist-partitioned. We consider potential occlusions by the manipulator as well as by the operator working at the assembly station. An actuated camera detects the goal object (the part to pick) and keeps track of the operator. The approach uses the complete set of solutions obtained from the univariate polynomial formulation of the inverse kinematics (IK). Compared to iterative numerical solvers, our strategy yields a set of joint positions (a posture) for each root of the equation, from which we extract the best one (minimizing the risk of occlusion). Our analytical method, integrating collision- and occlusion-avoidance optimizations, can greatly enhance the efficiency and safety of collaborative assembly workstations. We validate our approach with simulations as well as with physical deployments on commercial hardware. Full article
(This article belongs to the Special Issue Instrumentation in Interactive Robotic and Automation)
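The posture-selection idea in the abstract above can be illustrated with a small sketch: given the finite set of postures returned by an analytical IK, score each one against the camera-target line of sight and keep the best. This is a hypothetical illustration, not the authors' implementation; the `occlusion_cost` function and its inverse-distance penalty are assumptions.

```python
import math

def occlusion_cost(posture, camera, target):
    """Hypothetical cost: penalize joint positions that lie close to the
    camera-target line of sight, using an inverse-distance penalty."""
    d = tuple(t - c for t, c in zip(target, camera))
    norm = math.sqrt(sum(c * c for c in d))
    d = tuple(c / norm for c in d)  # unit direction of the line of sight
    cost = 0.0
    for joint in posture:
        v = tuple(j - c for j, c in zip(joint, camera))
        t = sum(vi * di for vi, di in zip(v, d))  # projection onto the line
        closest = tuple(c + t * di for c, di in zip(camera, d))
        dist = math.dist(joint, closest)          # joint-to-line distance
        cost += 1.0 / (dist + 1e-6)               # closer joints dominate
    return cost

def best_posture(ik_solutions, camera, target):
    """Pick the IK solution whose links stay farthest from the line of sight."""
    return min(ik_solutions, key=lambda p: occlusion_cost(p, camera, target))
```

A real system would also fold collision-avoidance terms into the cost, as the abstract describes.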

28 pages, 11282 KiB  
Article
Affordable Motion Tracking System for Intuitive Programming of Industrial Robots
by Martin Švejda, Martin Goubej, Arnold Jáger, Jan Reitinger and Ondřej Severa
Sensors 2022, 22(13), 4962; https://doi.org/10.3390/s22134962 - 30 Jun 2022
Cited by 1 | Viewed by 4339
Abstract
The paper deals with a lead-through method of programming for industrial robots. The goal is to automatically reproduce 6DoF trajectories of a tool wielded by a human operator demonstrating a motion task. We present a novel motion-tracking system built around the HTC Vive pose estimation system. Our solution allows complete automation of the robot teaching process. Specific algorithmic issues of system calibration and motion data post-processing are also discussed, constituting the paper’s theoretical contribution. The motion tracking system is successfully deployed in a pilot application of robot-assisted spray painting. Full article
(This article belongs to the Special Issue Instrumentation in Interactive Robotic and Automation)
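Once a motion-tracking system like the one above has been calibrated, tracker poses must be mapped into the robot's base frame through a fixed rigid transform. The sketch below shows only that final mapping step with homogeneous coordinates; it is a generic illustration under assumed names (`T_base_tracker`), not the paper's calibration algorithm.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def tracker_to_robot(T_base_tracker, p_tracker):
    """Map a point expressed in the tracker frame into the robot base frame."""
    p = np.append(p_tracker, 1.0)        # homogeneous coordinates
    return (T_base_tracker @ p)[:3]
```

In practice the transform itself comes from a calibration procedure such as the one the paper discusses; here it is simply assumed to be known.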

17 pages, 5404 KiB  
Article
Realization of Force Detection and Feedback Control for Slave Manipulator of Master/Slave Surgical Robot
by Hu Shi, Boyang Zhang, Xuesong Mei and Qichun Song
Sensors 2021, 21(22), 7489; https://doi.org/10.3390/s21227489 - 11 Nov 2021
Cited by 9 | Viewed by 2967
Abstract
Robot-assisted minimally invasive surgery (MIS) has received increasing attention, both in academic research and in clinical operation. Master/slave control is the most widely adopted manipulation mode for surgical robots, so sensing the force of the surgical instruments located at the end of the slave manipulator through the master manipulator is critical to the operation. This study addresses force detection for the surgical instrument and force feedback control of the serial surgical robotic arm. A measurement device was developed to record the tool-end force from the slave manipulator. An elastic element with an orthogonal beam structure was designed to sense the strain induced by force interactions. The relationship between the acting force and the output voltage was obtained experimentally, and the three-dimensional force output was decomposed using an extreme learning machine algorithm that accounts for the nonlinearity. Force control at the end of the slave manipulator was thus achieved, and an impedance control strategy was adopted to limit the amplitude of the force interaction. Modeling, simulation, and experimental verification were completed on the serial robotic manipulator platform, along with virtual control in the MATLAB/Simulink software environment. The experimental results show that the measured force from the slave manipulator can provide feedback for impedance control with a delay of 0.15 s. Full article
(This article belongs to the Special Issue Instrumentation in Interactive Robotic and Automation)
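The impedance control strategy mentioned in the abstract shapes the robot's response to a measured external force as a virtual mass-spring-damper. A minimal 1-DoF sketch follows; the gains m, b, k are illustrative values (here critically damped), not those used in the paper.

```python
def impedance_step(x, v, f_ext, dt, m=1.0, b=20.0, k=100.0):
    """One explicit Euler step of a 1-DoF impedance model
        m*a + b*v + k*x = f_ext,
    returning the updated (position, velocity) of the compliant reference.
    With m=1, b=20, k=100 the model is critically damped."""
    a = (f_ext - b * v - k * x) / m
    v = v + a * dt
    x = x + v * dt
    return x, v
```

Under a constant external force f the reference settles at the spring equilibrium x = f/k, which is how the impedance law bounds the interaction amplitude.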

28 pages, 4876 KiB  
Article
A CAN-Bus Lightweight Authentication Scheme
by Jia-Ning Luo, Chang-Ming Wu and Ming-Hour Yang
Sensors 2021, 21(21), 7069; https://doi.org/10.3390/s21217069 - 25 Oct 2021
Cited by 12 | Viewed by 5321
Abstract
The design of the Controller Area Network (CAN bus) did not account for security issues; consequently, attacks often use external mobile communication interfaces to conduct eavesdropping, replay, spoofing, and denial-of-service attacks on a CAN bus, posing a risk to driving safety. Numerous studies have proposed CAN bus security improvements that modify the original method of transmitting frames. These changes place an additional computational burden on electronic control units and cause the CAN bus to lose its delay-guarantee feature. We therefore propose a method that solves both the compatibility and the security issues. Simple and efficient frame authentication algorithms are used to prevent spoofing and replay attacks. The method is compatible with both the CAN bus and CAN-FD protocols and has a lower computational cost than other methods. Full article
(This article belongs to the Special Issue Instrumentation in Interactive Robotic and Automation)
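To make the frame-authentication idea concrete, here is a generic sketch of a truncated MAC with a monotonic counter, the standard pattern for lightweight CAN authentication. This is not the paper's scheme: the tag length, key handling, and message layout are assumptions for illustration only.

```python
import hmac
import hashlib
import struct

def authenticate_frame(key: bytes, can_id: int, payload: bytes, counter: int) -> bytes:
    """Compute a 4-byte truncated HMAC-SHA256 over (ID, counter, payload).
    The monotonic counter defeats replay; truncation keeps the tag small
    enough to carry alongside the data in a CAN-FD frame."""
    msg = struct.pack(">IQ", can_id, counter) + payload
    return hmac.new(key, msg, hashlib.sha256).digest()[:4]

def verify_frame(key: bytes, can_id: int, payload: bytes, counter: int, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = authenticate_frame(key, can_id, payload, counter)
    return hmac.compare_digest(expected, tag)
```

A receiver would additionally reject any counter value it has already seen, so a replayed frame fails even if its tag was once valid.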

24 pages, 20467 KiB  
Article
Image Generation for 2D-CNN Using Time-Series Signal Features from Foot Gesture Applied to Select Cobot Operating Mode
by Fadwa El Aswad, Gilde Vanel Tchane Djogdom, Martin J.-D. Otis, Johannes C. Ayena and Ramy Meziane
Sensors 2021, 21(17), 5743; https://doi.org/10.3390/s21175743 - 26 Aug 2021
Cited by 7 | Viewed by 3475
Abstract
Advances in robotics help reduce the burden that manufacturing tasks place on workers. For example, a cobot can serve as a "third arm" during assembly tasks, which raises the need for new, intuitive control modalities. This paper presents a foot-gesture approach centered on robot control constraints to switch between four operating modes. The control scheme is based on raw data acquired by an instrumented insole worn on the operator's foot, composed of an inertial measurement unit (IMU) and four force sensors. First, a gesture dictionary was proposed; from the acquired data, a set of 78 features was computed with a statistical approach and later reduced to 3 via analysis of variance (ANOVA). The collected time-series data were then converted into a 2D image and provided as input to a 2D convolutional neural network (CNN) for the recognition of foot gestures. Each gesture was mapped to a predefined cobot operating mode. The offline recognition rate appears to be highly dependent on the features considered and their spatial representation in the 2D image. We achieved a higher recognition rate for a specific representation of features as sets of triangular and rectangular forms. These results are encouraging for the use of CNNs to recognize foot gestures, which will then be associated with commands to control an industrial robot. Full article
(This article belongs to the Special Issue Instrumentation in Interactive Robotic and Automation)
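The core preprocessing step described above, turning selected time-series features into a 2D image a CNN can consume, can be sketched generically. The paper's specific triangular and rectangular layouts are not reproduced here; this assumed version simply resamples each feature channel, normalizes it to 8-bit intensity, and stacks the channels as horizontal bands.

```python
import numpy as np

def features_to_image(features, height=32, width=32):
    """Convert a list of 1-D feature signals into a 2D uint8 image.
    Each feature is resampled to `width` samples, scaled to 0..255,
    and repeated vertically to fill its band of rows."""
    rows = []
    per = height // len(features)          # rows per feature band
    for f in features:
        f = np.asarray(f, dtype=float)
        idx = np.linspace(0, len(f) - 1, width)
        f = np.interp(idx, np.arange(len(f)), f)   # resample to fixed width
        lo, hi = f.min(), f.max()
        f = (f - lo) / (hi - lo + 1e-9) * 255.0    # normalize to 0..255
        rows.append(np.tile(f, (per, 1)))          # fill the band
    return np.vstack(rows).astype(np.uint8)
```

The resulting array can be fed directly to a 2D CNN; the choice of spatial layout is exactly the design variable the paper reports as decisive for recognition rate.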
