COBOT Applications—Recent Advances and Challenges
Abstract
1. Introduction
2. Collaborative Robot Architecture Frame
- Coexistence: the work areas are defined without overlapping zones; the human operator and the robot perform their activities separately.
- Synchronization: the human and the robot share the work environment with independent tasks.
- Cooperation: the human and the robot share the work environment and the task execution is in a step-by-step procedure.
- Collaboration: the human and the robot share the work area and execute the task concurrently (these four interaction levels are illustrated in the code sketch after this list).
- Safety: COBOTs are designed to work safely in the same workspace as a human operator, detecting and reacting to the risk of accidents or injuries.
- Flexibility: COBOTs can be reconfigured to execute new tasks that were not planned at design time.
- User-friendliness: COBOTs are equipped with intuitive interfaces that allow them to be programmed and operated without extensive technical knowledge.
- Manufacturing processes: articles that focus on the use of collaborative robots in various manufacturing processes, such as assembly lines and welding.
- Material handling: articles that specifically address the application of collaborative robots in material handling tasks, including picking, sorting, and transporting objects.
- Personal assistance: articles that explore the use of collaborative robots in providing assistance to individuals in tasks such as household chores or caregiving.
- Security and inspections: articles that examine the application of collaborative robots in security-related tasks, such as surveillance, monitoring, and inspections in various settings.
- Medicare: articles that discuss the utilization of collaborative robots in healthcare and medical environments, including patient care, surgical assistance, and rehabilitation.
- Control interface: articles that investigate different interfaces and control mechanisms for humans to interact and communicate with collaborative robots effectively.
- Intention recognition: articles that study how to define techniques, algorithms, and sensor systems used to enable robots to recognize and understand human intentions.
- Programming and learning: articles that explore methods and techniques for programming and teaching collaborative robots, including machine learning, programming languages, and algorithms.
- Virtual reality perspectives: articles that discuss the potential of virtual reality systems in enhancing human–robot interactions and collaboration, such as immersive training environments and augmented reality interfaces.
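As a minimal sketch of the four interaction levels listed above, the Python snippet below encodes them as an enumeration and maps a simple workcell description (whether workspace, task, and timing are shared) to a level. The `Workcell` structure and the classification rules are illustrative assumptions of this example, not definitions taken from the paper.

```python
from dataclasses import dataclass
from enum import Enum, auto


class InteractionLevel(Enum):
    """Four levels of human-robot interaction, from separate to fully shared."""
    COEXISTENCE = auto()      # separate work areas, separate activities
    SYNCHRONIZATION = auto()  # shared environment, independent tasks
    COOPERATION = auto()      # shared environment, task executed step by step
    COLLABORATION = auto()    # shared work area and task, executed concurrently


@dataclass
class Workcell:
    shared_workspace: bool  # do human and robot occupy the same area?
    shared_task: bool       # do they contribute to the same task?
    concurrent: bool        # do they act on the task at the same time?


def classify(cell: Workcell) -> InteractionLevel:
    """Map a workcell description to an interaction level (illustrative rules only)."""
    if not cell.shared_workspace:
        return InteractionLevel.COEXISTENCE
    if not cell.shared_task:
        return InteractionLevel.SYNCHRONIZATION
    if not cell.concurrent:
        return InteractionLevel.COOPERATION
    return InteractionLevel.COLLABORATION


if __name__ == "__main__":
    # Human and cobot assembling the same part at the same time -> COLLABORATION
    print(classify(Workcell(shared_workspace=True, shared_task=True, concurrent=True)))
```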
3. Classification of COBOT Applications
3.1. Industrial Application of Collaborative Robots: Assembly
3.2. Industrial Application of Collaborative Robots: Material Handling
3.3. Service Application of Collaborative Robots: Personal Assistance
3.4. Service Application of Collaborative Robots: Security and Inspection
3.5. Service Application of Collaborative Robots: Medicare
3.6. Supernumerary Robotics
4. Interactions with Human Beings: Practical Implications
4.1. Control Interface
4.2. Intention Recognition
4.3. Programming and Learning
4.4. Virtual Reality (VR)-Based COBOT
5. COBOT Market Analyses: Potentialities and Limits
5.1. COBOT Assessment: Degrees of Freedom
5.2. COBOT Assessment: Reach and Payload
5.3. COBOT Assessment: Accuracy
5.4. COBOT Assessment: Energy Consumption vs. Tool Center Point (TCP) Velocity
6. Discussion
7. Conclusions
Author Contributions
Funding
Conflicts of Interest
Appendix A
Producer | Model | Class | DoF | Payload [kg] | Reach [mm] | Accuracy [mm] |
---|---|---|---|---|---|---|
ABB | CRB 1100 SWIFTI | Anthropomorphic | 6 | 4.0 | 580 | 0.01
CRB 15000 GoFa | Anthropomorphic | 6 | 5.0 | 950 | 0.05 | |
IRB 14000 Yumi | Torso | 14 | 0.5 | 1200 | 0.02 | |
IRB 14050 Yumi | Anthropomorphic | 7 | 0.5 | 559 | 0.02 | |
Acutronics | MARA | Anthropomorphic | 6 | 3.0 | 656 | 0.10 |
AIRSKIN | Kuka Agilus Fenceless | Anthropomorphic | 6 | 10.0 | 1100 | 0.02 |
Airskin | Kuka Cybertech Fenceless | Anthropomorphic | 6 | 24.0 | 2020 | 0.04 |
AUBO Robotics | I10 | Anthropomorphic | 6 | 10.0 | 1350 | 0.10 |
I3 | Anthropomorphic | 6 | 3.0 | 625 | 0.03 | |
I5 | Anthropomorphic | 6 | 5.0 | 924 | 0.05 | |
I7 | Anthropomorphic | 6 | 7.0 | 1150 | 0.05 | |
Automata | EVA | Anthropomorphic | 6 | 1.3 | 600 | 0.50 |
Automationware | AW-Tube 5 | Anthropomorphic | 6 | 5.0 | 900 | 0.03 |
AW-Tube 8 | Anthropomorphic | 6 | 8.0 | 1000 | 0.04 | |
AW-Tube 12 | Anthropomorphic | 6 | 13.0 | 1300 | 0.05 | |
AW-Tube 15 | Anthropomorphic | 6 | 15.0 | 1000 | 0.05 | |
AW-Tube 18 | Anthropomorphic | 6 | 18.0 | 1700 | 0.06 | |
AW-Tube 20 | Anthropomorphic | 6 | 20.0 | 1500 | 0.07 | |
Bosch | APAS | Anthropomorphic | 6 | 4.0 | 911 | 0.03 |
Comau | Aura | Anthropomorphic | 6 | 170.0 | 2790 | 0.10 |
e.Do | Anthropomorphic | 6 | 1.0 | 478 | 0.01 | |
Racer 5 0.80 Cobot | Anthropomorphic | 6 | 5.0 | 809 | 0.03 | |
Denso | Cobotta | Anthropomorphic | 6 | 0.5 | 342 | 0.05 |
Dobot | CR10 | Anthropomorphic | 6 | 10.0 | 1525 | 0.03 |
CR16 | Anthropomorphic | 6 | 16.0 | 1223 | 0.03 | |
CR3 | Anthropomorphic | 6 | 3.0 | 795 | 0.02 | |
CR5 | Anthropomorphic | 6 | 5.0 | 1096 | 0.03 | |
M1 | SCARA | 4 | 1.5 | 400 | 0.02 | |
Magician | Anthropomorphic | 4 | 0.3 | 320 | 0.20 | |
Magician | Anthropomorphic | 4 | 0.5 | 340 | 0.20 | |
MG 400 | Anthropomorphic | 4 | 0.8 | 440 | 0.05 | |
Doosan Robotics | A0509 | Anthropomorphic | 6 | 5.0 | 900 | 0.03 |
A0912 | Anthropomorphic | 6 | 9.0 | 1200 | 0.05 | |
H2017 | Anthropomorphic | 6 | 20.0 | 1700 | 0.10 | |
H2515 | Anthropomorphic | 6 | 25.0 | 1500 | 0.10 | |
M0609 | Anthropomorphic | 6 | 6.0 | 900 | 0.10 | |
M0617 | Anthropomorphic | 6 | 6.0 | 1700 | 0.10 | |
M1013 | Anthropomorphic | 6 | 10.0 | 1300 | 0.10 | |
M1509 | Anthropomorphic | 6 | 15.0 | 900 | 0.10 | |
Efort | ECR5 | Anthropomorphic | 6 | 5.0 | 928 | 0.03 |
Elephant Robotics | C3 | Anthropomorphic | 6 | 3.0 | 500 | 0.50 |
E5 | Anthropomorphic | 6 | 5.0 | 810 | 0.50 | |
myCobot | Anthropomorphic | 6 | 0.3 | 280 | 0.20 | |
Panda 3 | Anthropomorphic | 6 | 3.0 | 550 | 0.50 | |
Panda 5 | Anthropomorphic | 6 | 5.0 | 850 | 0.50 | |
Elite Robot | CS612 | Anthropomorphic | 6 | 12.0 | 1304 | 0.05 |
CS63 | Anthropomorphic | 6 | 3.0 | 624 | 0.02 | |
CS66 | Anthropomorphic | 6 | 6.0 | 914 | 0.03 | |
EC612 | Anthropomorphic | 6 | 12.0 | 1304 | 0.03 | |
EC63 | Anthropomorphic | 6 | 3.0 | 624 | 0.02 | |
EC66 | Anthropomorphic | 6 | 6.0 | 914 | 0.03 | |
ESI | C-15 | Anthropomorphic | 6 | 15.0 | 1323 | 0.05 |
C-7 | Anthropomorphic | 6 | 7.0 | 900 | 0.05 | |
F&P Personal Robotics | 2R 24V | Anthropomorphic | 6 | 3.0 | 775 | 0.10 |
2R 48V | Anthropomorphic | 6 | 5.0 | 775 | 0.10 | |
Fanuc | CR14iAL | Anthropomorphic | 6 | 14.0 | 911 | 0.03
CR15iA | Anthropomorphic | 6 | 15.0 | 1411 | 0.02 | |
CR35iA | Anthropomorphic | 6 | 35.0 | 1813 | 0.08 | |
CR4iA | Anthropomorphic | 6 | 4.0 | 550 | 0.02 | |
CR7iA | Anthropomorphic | 6 | 7.0 | 717 | 0.02 | |
CR7iAL | Anthropomorphic | 6 | 7.0 | 911 | 0.02 | |
CRX10iA | Anthropomorphic | 6 | 10.0 | 1249 | 0.05 | |
CRX10iAL | Anthropomorphic | 6 | 10.0 | 1418 | 0.05 | |
Flexiv | Rizon 4 | Anthropomorphic | 7 | 4.0 | 780 | 0.01 |
Franka Emika | Robot | Anthropomorphic | 7 | 3.0 | 855 | 0.10 |
Hans Robot | E10 | Anthropomorphic | 6 | 10.0 | 1000 | 0.05 |
E15 | Anthropomorphic | 6 | 15.0 | 700 | 0.05 | |
E3 | Anthropomorphic | 6 | 3.0 | 590 | 0.05 | |
E5 | Anthropomorphic | 6 | 5.0 | 800 | 0.05 | |
E5-L | Anthropomorphic | 6 | 3.5 | 950 | 0.05 | |
Hanwha | HCR-12 | Anthropomorphic | 6 | 12.0 | 1300 | 0.10 |
HCR-12A | Anthropomorphic | 6 | 12.0 | 1300 | 0.05 | |
HCR-3 | Anthropomorphic | 6 | 3.0 | 630 | 0.10 | |
HCR-3A | Anthropomorphic | 6 | 3.0 | 630 | 0.05 | |
HCR5 | Anthropomorphic | 6 | 5.0 | 915 | 0.10 | |
HCR-5A | Anthropomorphic | 6 | 5.0 | 915 | 0.05 | |
HIT Robot Group | T5 | Anthropomorphic | 6 | 5.0 | 850 | 0.10 |
HITBOT | Z-Arm 1632 | SCARA | 4 | 1.0 | 452 | 0.02 |
Z-Arm 1832 | SCARA | 4 | 3.0 | 455 | 0.02 | |
Z-Arm 2140 | SCARA | 4 | 3.0 | 532 | 0.03 | |
Z-Arm 2442 | SCARA | 4 | 1.0 | 617 | 0.03 | |
Z-Arm 6140 | SCARA | 4 | 1.0 | 532 | 0.02 | |
Z-Arm mini | SCARA | 4 | 1.0 | 320 | 0.10 | |
Hyundai | YL005 | Anthropomorphic | 6 | 5.0 | 916 | 0.10 |
YL012 | Anthropomorphic | 6 | 12.0 | 1350 | 0.10 | |
YL015 | Anthropomorphic | 6 | 15.0 | 963 | 0.10 | |
Inovo Robotics | Robotic Arm 1300 | Anthropomorphic | 6 | 3.0 | 1340 | 0.25
Robotics Arm 650 | Anthropomorphic | 6 | 10.0 | 690 | 0.25 | |
Robotics Arm 850 | Anthropomorphic | 6 | 6.0 | 990 | 0.25 | |
Isybot | SYB3 | Anthropomorphic | 4 | 10.0 | 1600 | 0.20 |
JAKA | Zu 12 | Anthropomorphic | 6 | 12.0 | 1300 | 0.03
Zu 18 | Anthropomorphic | 6 | 18.0 | 1073 | 0.03 | |
Zu 3 | Anthropomorphic | 6 | 3.0 | 498 | 0.03 | |
Zu 7 | Anthropomorphic | 6 | 7.0 | 796 | 0.03 | |
Kassow Robots | KR1018 | Anthropomorphic | 7 | 10.0 | 1000 | 0.10
KR1205 | Anthropomorphic | 7 | 5.0 | 1200 | 0.10 | |
KR1410 | Anthropomorphic | 7 | 10.0 | 1400 | 0.10 | |
KR1805 | Anthropomorphic | 7 | 5.0 | 1800 | 0.10 | |
KR810 | Anthropomorphic | 7 | 10.0 | 850 | 0.10 | |
Kawasaki Robotics | Duaro | SCARA | 8 | 4.0 | 760 | 0.05 |
Duaro 2 | SCARA | 8 | 6.0 | 760 | 0.05 | |
Kinetic Systems | 6 Axes Robot | Anthropomorphic | 6 | 16.0 | 1900 | 0.05 |
SCARA Robot | SCARA | 4 | 5.0 | 1200 | 0.05 | |
Kinova | Gen2 | Anthropomorphic | 7 | 2.4 | 985 | 0.15 |
Gen3 | Anthropomorphic | 7 | 4.0 | 902 | 0.15 | |
Gen3 Lite | Anthropomorphic | 6 | 0.5 | 760 | 0.15 | |
KUKA | LBR iisy 3 R760 | Anthropomorphic | 6 | 3.0 | 760 | 0.01 |
LBR iisy 11 R1300 | Anthropomorphic | 6 | 11.0 | 1300 | 0.15 | |
LBR iisy 15 R930 | Anthropomorphic | 6 | 15.0 | 930 | 0.15 | |
LBR iiwa 14 R820 | Anthropomorphic | 7 | 14.0 | 820 | 0.15 | |
LBR iiwa 7 R800 | Anthropomorphic | 7 | 7.0 | 800 | 0.10 | |
LWR | Anthropomorphic | 7 | 7.0 | 790 | 0.05 | |
Life Robotics | CORO | Anthropomorphic | 6 | 2.0 | 800 | 1.00 |
Mabi | Speedy 12 | Anthropomorphic | 6 | 12.0 | 1250 | 0.10 |
Speedy 6 | Anthropomorphic | 6 | 6.0 | 800 | 0.10 | |
Megarobo | MRX-T4 | Anthropomorphic | 4 | 3.0 | 505 | 0.05 |
MIP Robotics | Junior 200 | SCARA | 4 | 3.0 | 400 | 0.50 |
Junior 300 | SCARA | 4 | 5.0 | 600 | 0.40 | |
Mitsubishi Electric | RV-5AS-D MELFA ASSISTA | Anthropomorphic | 6 | 5.0 | 910 | 0.03 |
MRK Systeme | KR 5 SI | Anthropomorphic | 6 | 5.0 | 1432 | 0.04 |
Nachi | CZ 10 | Anthropomorphic | 6 | 10.0 | 1300 | 0.10 |
Neura Robotics | LARA 10 | Anthropomorphic | 6 | 10.0 | 1000 | 0.02 |
LARA 5 | Anthropomorphic | 6 | 5.0 | 800 | 0.02 | |
Neuromeka | Indy 10 | Anthropomorphic | 6 | 10.0 | 1000 | 0.10 |
Indy 12 | Anthropomorphic | 6 | 12.0 | 1200 | 0.50 | |
Indy 3 | Anthropomorphic | 6 | 3.0 | 590 | 0.10 | |
Indy 5 | Anthropomorphic | 6 | 3.0 | 800 | 0.10 | |
Indy 7 | Anthropomorphic | 6 | 7.0 | 800 | 0.05 | |
Indy RP | Anthropomorphic | 6 | 5.0 | 950 | 0.05 | |
Indy RP 2 | Anthropomorphic | 7 | 5.0 | 800 | 0.05 | |
Opti 10 | Anthropomorphic | 6 | 10.0 | 1216 | 0.10 | |
Opti 5 | Anthropomorphic | 6 | 5.0 | 880 | 0.10 | |
Niryo | One | Anthropomorphic | 6 | 0.3 | 440 | 0.10 |
Pilz | PRBT | Anthropomorphic | 6 | 6.0 | 741 | 0.20 |
Precise Automation | Direct Drive 6 Axes | SCARA | 6 | 6.0 | 1793 | 0.02 |
PAVP6 | Anthropomorphic | 6 | 2.5 | 432 | 0.02 | |
PAVS6 | Anthropomorphic | 6 | 37.0 | 770 | 0.03 | |
PF3400 | SCARA | 4 | 23.0 | 588 | 0.05 | |
PP100 | Cartesian | 4 | 2.0 | 1270 | 0.10 | |
Productive Robotics | OB7 | Anthropomorphic | 7 | 5.0 | 1000 | 0.10 |
OB7 Max 12 | Anthropomorphic | 7 | 12.0 | 1300 | 0.10 | |
OB7 Max 8 | Anthropomorphic | 7 | 8.0 | 1700 | 0.10 | |
OB7 Stretch | Anthropomorphic | 7 | 4.0 | 1250 | 0.10 | |
Rainbow Robotics | RB10 1200 | Anthropomorphic | 6 | 10.0 | 1200 | 0.10 |
RB3 1300 | Anthropomorphic | 6 | 3.0 | 1300 | 0.10 | |
RB5 850 | Anthropomorphic | 6 | 5.0 | 850 | 0.10 | |
Rethink Robotics | Baxter | Torso | 14 | 2.2 | 1210 | 3.00 |
Sawyer | Anthropomorphic | 7 | 4.0 | 1260 | 0.10 | |
Sawyer Black Edition | Anthropomorphic | 7 | 4.0 | 1260 | 0.10 | |
Robut Tecnology | Armobot | Anthropomorphic | 6 | 3.0 | 1500 | 0.10 |
Rokae | X Mate 3 | Anthropomorphic | 7 | 3.0 | 760 | 0.03 |
X Mate 7 | Anthropomorphic | 7 | 7.0 | 850 | 0.03 | |
Rozum Robotics | Pulse 75 | Anthropomorphic | 6 | 6.0 | 750 | 0.10 |
Pulse 90 | Anthropomorphic | 6 | 4.0 | 900 | 0.10 | |
Siasun | DSCR3 Duco | Torso | 7 | 3.0 | 800 | 0.02 |
DSCR5 | Torso | 7 | 5.0 | 800 | 0.02 | |
GCR14 1400 | Anthropomorphic | 6 | 14.0 | 1400 | 0.05 | |
GCR20 1100 | Anthropomorphic | 6 | 20.0 | 1100 | 0.05 | |
GCR5 910 | Anthropomorphic | 6 | 5.0 | 910 | 0.05 | |
SCR3 | Anthropomorphic | 7 | 3.0 | 600 | 0.02 | |
SCR5 | Anthropomorphic | 7 | 5.0 | 800 | 0.02 | |
TCR 0.5 | Anthropomorphic | 6 | 0.5 | 300 | 0.05 | |
TCR 1 | Anthropomorphic | 6 | 1.0 | 500 | 0.05 | |
ST Robotics | R12 | Anthropomorphic | 6 | 1.0 | 500 | 0.10 |
R17 | Anthropomorphic | 6 | 3.0 | 750 | 0.20 | |
Staubli | TX2 Touch 60 | Anthropomorphic | 6 | 4.5 | 670 | 0.02 |
TX2 Touch 60L | Anthropomorphic | 6 | 3.7 | 920 | 0.03 | |
TX2 Touch 90 | Anthropomorphic | 6 | 14.0 | 1000 | 0.03 | |
TX2 Touch 90L | Anthropomorphic | 6 | 12.0 | 1200 | 0.04 | |
TX2 Touch 90XL | Anthropomorphic | 6 | 7.0 | 1450 | 0.04 | |
Yamaha | YA-U5F | Anthropomorphic | 7 | 5.0 | 559 | 0.06 |
YA-U10F | Anthropomorphic | 7 | 10.0 | 720 | 0.10 | |
YA-U20F | Anthropomorphic | 7 | 20.0 | 910 | 0.10 | |
Techman | Techman TM12 | Anthropomorphic | 6 | 12.0 | 1300 | 0.10 |
Techman TM14 | Anthropomorphic | 6 | 14.0 | 1100 | 0.10 | |
Techman TM5 700 | Anthropomorphic | 6 | 6.0 | 700 | 0.05 | |
Techman TM5 900 | Anthropomorphic | 6 | 4.0 | 900 | 0.05 | |
Tokyo Robotics | Torobo Arm | Anthropomorphic | 7 | 6.0 | 600 | 0.05 |
Torobo Arm Mini | Anthropomorphic | 7 | 3.0 | 600 | 0.05 | |
UFACTORY | uArm Swift Pro | Anthropomorphic | 4 | 0.5 | 320 | 0.20 |
xArm 5 Lite | Anthropomorphic | 5 | 3.0 | 700 | 0.10 | |
xArm 6 | Anthropomorphic | 6 | 3.0 | 700 | 0.10 | |
xArm 7 | Anthropomorphic | 7 | 3.5 | 700 | 0.10 | |
Universal Robots | UR10 CB3 | Anthropomorphic | 6 | 10.0 | 1300 | 0.10 |
UR10e | Anthropomorphic | 6 | 10.0 | 1300 | 0.03 | |
UR16e | Anthropomorphic | 6 | 16.0 | 900 | 0.05 | |
UR3 CB3 | Anthropomorphic | 6 | 3.0 | 500 | 0.10 | |
UR3e | Anthropomorphic | 6 | 3.0 | 500 | 0.03 | |
UR5 CB3 | Anthropomorphic | 6 | 5.0 | 850 | 0.10 | |
UR5e | Anthropomorphic | 6 | 5.0 | 850 | 0.03 | |
Yaskawa | Motoman HC10 | Anthropomorphic | 6 | 10.0 | 1200 | 0.10 |
Motoman HC10 DT | Anthropomorphic | 6 | 10.0 | 1200 | 0.10 | |
Motoman HC20 | Anthropomorphic | 6 | 20.0 | 1700 | 0.05 | |
Yuanda | Robotics Arm | Anthropomorphic | 6 | 7.0 | 1000 | 0.10 |
Svaya Robotics | SR-L3 | Anthropomorphic | 6 | 3.0 | 600 | 0.03 |
SR-L6 | Anthropomorphic | 6 | 6.0 | 850 | 0.03 | |
SR-L10 | Anthropomorphic | 6 | 10.0 | 1300 | 0.05 | |
SR-L12 | Anthropomorphic | 6 | 12.0 | 1100 | 0.05 | |
SR-L16 | Anthropomorphic | 6 | 16.0 | 900 | 0.05 |
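To make the catalogue in Appendix A easier to interrogate, the short Python sketch below encodes a few rows excerpted from the table above and filters them by payload and reach. The `CobotSpec` structure, field names, and `shortlist` helper are assumptions introduced for this example only.

```python
from dataclasses import dataclass


@dataclass
class CobotSpec:
    producer: str
    model: str
    robot_class: str
    dof: int
    payload_kg: float
    reach_mm: float
    accuracy_mm: float


# A small excerpt of Appendix A; the full table lists about 195 models.
CATALOGUE = [
    CobotSpec("ABB", "CRB 15000 GoFa", "Anthropomorphic", 6, 5.0, 950, 0.05),
    CobotSpec("Universal Robots", "UR10e", "Anthropomorphic", 6, 10.0, 1300, 0.03),
    CobotSpec("KUKA", "LBR iiwa 14 R820", "Anthropomorphic", 7, 14.0, 820, 0.15),
    CobotSpec("Doosan Robotics", "H2515", "Anthropomorphic", 6, 25.0, 1500, 0.10),
    CobotSpec("Dobot", "M1", "SCARA", 4, 1.5, 400, 0.02),
]


def shortlist(min_payload_kg: float, min_reach_mm: float) -> list[CobotSpec]:
    """Return the models that satisfy both a payload and a reach requirement."""
    return [c for c in CATALOGUE
            if c.payload_kg >= min_payload_kg and c.reach_mm >= min_reach_mm]


if __name__ == "__main__":
    for cobot in shortlist(min_payload_kg=10.0, min_reach_mm=1200):
        print(f"{cobot.producer} {cobot.model}: {cobot.payload_kg} kg, {cobot.reach_mm} mm")
```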
Appendix B
Model | Class | Payload [kg] | TCP Velocity [m/s] | Power Consumption [kW] |
---|---|---|---|---|
OB7 Max 12 | Anthropomorphic | 12.0 | 2.0 | 0.90 |
OB7 Max 8 | Anthropomorphic | 8.0 | 2.0 | 0.90 |
AW-Tube 5 | Anthropomorphic | 5.0 | | 0.75
AW-Tube 8 | Anthropomorphic | 8.0 | | 0.75
AW-Tube 12 | Anthropomorphic | 13.0 | | 0.75
AW-Tube 15 | Anthropomorphic | 15.0 | | 0.75
AW-Tube 18 | Anthropomorphic | 18.0 | | 0.75
AW-Tube 20 | Anthropomorphic | 20.0 | | 0.75
SYB3 | Anthropomorphic | 10.0 | 1.0 | 0.70 |
OB7 Stretch | Anthropomorphic | 4.0 | 2.0 | 0.65 |
Zu 18 | Anthropomorphic | 18.0 | 3.5 | 0.60 |
RV-5AS-D MELFA ASSISTA | Anthropomorphic | 5.0 | 1.0 | 0.60 |
GCR20 1100 | Anthropomorphic | 20.0 | 1.0 | 0.60 |
I10 | Anthropomorphic | 10.0 | 4.0 | 0.50 |
Racer 5 0.80 Cobot | Anthropomorphic | 5.0 | 6.0 | 0.50 |
CS612 | Anthropomorphic | 12.0 | 3.0 | 0.50 |
EC612 | Anthropomorphic | 12.0 | 3.2 | 0.50 |
Zu 12 | Anthropomorphic | 12.0 | 3.0 | 0.50 |
I7 | Anthropomorphic | 7.0 | | 0.40
SCR5 | Anthropomorphic | 5.0 | 1.0 | 0.40 |
Gen3 | Anthropomorphic | 4.0 | 0.5 | 0.36 |
E10 | Anthropomorphic | 10.0 | 1.0 | 0.35 |
E15 | Anthropomorphic | 15.0 | 1.0 | 0.35 |
Zu 7 | Anthropomorphic | 7.0 | 2.5 | 0.35 |
Indy 10 | Anthropomorphic | 10.0 | 1.0 | 0.35 |
Indy 12 | Anthropomorphic | 12.0 | 1.0 | 0.35 |
Indy 3 | Anthropomorphic | 3.0 | 1.0 | 0.35 |
Indy 5 | Anthropomorphic | 3.0 | 1.0 | 0.35 |
Indy 7 | Anthropomorphic | 7.0 | 1.0 | 0.35 |
Indy RP 2 | Anthropomorphic | 5.0 | 1.0 | 0.35 |
UR10e | Anthropomorphic | 10.0 | 2.0 | 0.35 |
UR16e | Anthropomorphic | 16.0 | 1.0 | 0.35 |
X Mate 3 | Anthropomorphic | 3.0 | | 0.30
Techman TM12 | Anthropomorphic | 12.0 | 1.3 | 0.30 |
Techman TM14 | Anthropomorphic | 14.0 | 1.1 | 0.30 |
EVA | Anthropomorphic | 1.3 | 0.8 | 0.28 |
E5 | Anthropomorphic | 5.0 | 1.0 | 0.26 |
Panda 5 | Anthropomorphic | 5.0 | 1.0 | 0.26 |
CS66 | Anthropomorphic | 6.0 | 2.6 | 0.25 |
EC66 | Anthropomorphic | 6.0 | 2.8 | 0.25 |
Gen2 | Anthropomorphic | 2.4 | 0.2 | 0.25 |
KR 5 SI | Anthropomorphic | 5.0 | | 0.25
Pulse 90 | Anthropomorphic | 4.0 | 2.0 | 0.25 |
SCR3 | Anthropomorphic | 3.0 | 0.8 | 0.25 |
UR10 CB3 | Anthropomorphic | 10.0 | 1.0 | 0.25 |
MRX-T4 | Anthropomorphic | 3.0 | | 0.24
Techman TM5 700 | Anthropomorphic | 6.0 | 1.1 | 0.22 |
Techman TM5 900 | Anthropomorphic | 4.0 | 1.4 | 0.22 |
I5 | Anthropomorphic | 5.0 | 2.8 | 0.20 |
CR10 | Anthropomorphic | 10.0 | 3.0 | 0.20 |
CR16 | Anthropomorphic | 16.0 | 3.0 | 0.20 |
CR3 | Anthropomorphic | 3.0 | 3.0 | 0.20 |
CR5 | Anthropomorphic | 5.0 | 3.0 | 0.20 |
ECR5 | Anthropomorphic | 5.0 | 2.8 | 0.20 |
E3 | Anthropomorphic | 3.0 | 1.0 | 0.20 |
Gen3 Lite | Anthropomorphic | 0.5 | 0.3 | 0.20 |
PAVP6 | Anthropomorphic | 2.5 | | 0.20
GCR5 910 | Anthropomorphic | 5.0 | | 0.20
UR5e | Anthropomorphic | 5.0 | 1.0 | 0.20 |
C3 | Anthropomorphic | 3.0 | 1.0 | 0.18 |
E5 | Anthropomorphic | 5.0 | 1.0 | 0.18 |
E5-L | Anthropomorphic | 3.5 | 1.0 | 0.18 |
IRB 14050 Yumi | Anthropomorphic | 0.5 | 1.5 | 0.17 |
Panda 3 | Anthropomorphic | 3.0 | 1.0 | 0.16 |
I3 | Anthropomorphic | 3.0 | 1.9 | 0.15 |
CS63 | Anthropomorphic | 3.0 | 2.0 | 0.15 |
EC63 | Anthropomorphic | 3.0 | 2.0 | 0.15 |
Zu 3 | Anthropomorphic | 3.0 | 1.5 | 0.15 |
Pulse 75 | Anthropomorphic | 6.0 | 2.0 | 0.15 |
UR5 CB3 | Anthropomorphic | 5.0 | 1.0 | 0.15 |
xArm 5 Lite | Anthropomorphic | 3.0 | 0.3 | 0.12 |
UR3 CB3 | Anthropomorphic | 3.0 | 1.0 | 0.12 |
2R 48V | Anthropomorphic | 5.0 | | 0.10
T5 | Anthropomorphic | 5.0 | | 0.10
UR3e | Anthropomorphic | 3.0 | 1.0 | 0.10 |
OB7 | Anthropomorphic | 5.0 | 2.0 | 0.09 |
2R 24V | Anthropomorphic | 3.0 | | 0.08
CORO | Anthropomorphic | 2.0 | | 0.08
Robot | Anthropomorphic | 3.0 | 2.0 | 0.06 |
One | Anthropomorphic | 0.3 | | 0.06
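As an illustration of how the Appendix B data can be compared for the energy-consumption analysis of Section 5.4, the sketch below computes a simple power-per-payload ratio for a few models excerpted from the table above. The `power_per_payload` helper and the idea of ranking by this ratio are assumptions of this example, not metrics defined in the paper.

```python
# Rated power consumption [kW], payload [kg], and TCP velocity [m/s]
# for a few models excerpted from Appendix B.
APPENDIX_B = {
    "OB7 Max 12":   {"payload_kg": 12.0, "tcp_velocity_ms": 2.0, "power_kw": 0.90},
    "UR10e":        {"payload_kg": 10.0, "tcp_velocity_ms": 2.0, "power_kw": 0.35},
    "Techman TM12": {"payload_kg": 12.0, "tcp_velocity_ms": 1.3, "power_kw": 0.30},
    "UR3e":         {"payload_kg": 3.0,  "tcp_velocity_ms": 1.0, "power_kw": 0.10},
}


def power_per_payload(entry: dict) -> float:
    """Rated power divided by payload: a rough, illustrative efficiency indicator."""
    return entry["power_kw"] / entry["payload_kg"]


if __name__ == "__main__":
    # Rank the excerpted models from lowest to highest power per kilogram of payload.
    ranked = sorted(APPENDIX_B.items(), key=lambda kv: power_per_payload(kv[1]))
    for model, entry in ranked:
        print(f"{model}: {power_per_payload(entry) * 1000:.0f} W/kg "
              f"at {entry['tcp_velocity_ms']} m/s TCP velocity")
```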
References
Class | No. |
---|---|
Anthropomorphic | 176 |
Cartesian | 1 |
SCARA | 14 |
Torso | 4 |
COBOT Cluster | Payload (kg) | Reach (mm) |
---|---|---|
Group 1 | P ≤ 5.0 | R ≤ 500
Group 2 | 5.0 < P ≤ 10.0 | 500 < R ≤ 1000 |
Group 3 | 10.0 < P ≤ 15.0 | 1000 < R ≤ 1500 |
Group 4 | 15.0 < P ≤ 20.0 | 1500 < R ≤ 2000 |
Group 5 | P > 20.0 | R > 2000 |
Class | Group 1 | Group 2 | Group 3 | Group 4 | Group 5 | Total |
---|---|---|---|---|---|---|
Anthropomorphic | 16 | 91 | 50 | 15 | 4 | 176 |
Cartesian | 1 | 1 | ||||
SCARA | 5 | 6 | 1 | 1 | 1 | 14 |
Torso | 2 | 2 | 4 | |||
Total | 21 | 99 | 54 | 16 | 5 | 195 |
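The clustering thresholds above can also be applied programmatically. The sketch below assigns a model to a payload group and a reach group independently; since the paper does not state how the two criteria are combined into a single cluster, combining them is left out, and the function names are assumptions of this example.

```python
def payload_group(payload_kg: float) -> int:
    """Group index (1-5) from the payload thresholds in the cluster table."""
    if payload_kg <= 5.0:
        return 1
    if payload_kg <= 10.0:
        return 2
    if payload_kg <= 15.0:
        return 3
    if payload_kg <= 20.0:
        return 4
    return 5


def reach_group(reach_mm: float) -> int:
    """Group index (1-5) from the reach thresholds in the cluster table."""
    if reach_mm <= 500:
        return 1
    if reach_mm <= 1000:
        return 2
    if reach_mm <= 1500:
        return 3
    if reach_mm <= 2000:
        return 4
    return 5


if __name__ == "__main__":
    # Example: UR5e from Appendix A (5.0 kg payload, 850 mm reach) -> groups 1 and 2
    print(payload_group(5.0), reach_group(850))
```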
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).