Control System Design for Collaborative Robots

Human–robot collaboration is an innovative area that aims to construct an environment for safe and efficient collaboration between humans and robots to accomplish a specific task. Collaborative robots cooperate with humans to assist them in undertaking simple-to-complex tasks in several fields, including industry, education, agriculture, healthcare, security, and space exploration. These robots play a vital role in the Industry 4.0 revolution, which defines new standards for manufacturing and the organization of production. Incorporating collaborative robots into the workspace improves efficiency, but it also introduces several safety risks.

  • collaborative control
  • robots
  • architectures

1. Human–Robot Collaboration

1.1. Human–Robot Interaction

Human–robot interactions are divided into three subcategories: (i) Human–robot co-existence; (ii) Human–robot cooperation; and (iii) Human–robot collaboration. This classification is based on four criteria: (i) the workspace; (ii) the working time; (iii) the working aim or task; and (iv) the existence of contact (contactless or with contact).
The workspace can be described as the working area surrounding humans and robots in which they can perform their tasks individually, as shown in Figure 1. The time during which a human is working in the collaborative workspace is known as the working time. Humans and robots interact in a workspace to achieve a common goal or distinct goals. If the two entities share the workspace and act simultaneously, the interaction is known as HR coexistence [13][1]. HR cooperation refers to an interaction in which they work simultaneously towards the same aim in a shared workspace. HR collaboration, in turn, covers scenarios in which there is direct contact between humans and robots to accomplish a shared aim or goal. Examples of these three interactions are classical industrial robots, cooperative robots, and collaborative robots, respectively.
Figure 1. Classification of human–robot interactions.
It is important to note that the term HR collaboration is ambiguously defined in the literature [14,15][2][3]. In Figure 1, HR collaboration is shown as the final category of HR interaction, describing a human and a robot executing the same task together, wherein the actions of one have an immediate impact on the other.
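
To make the classification concrete, the following Python sketch (illustrative only, not part of the original text; the field names are assumptions) maps the four criteria, i.e., shared workspace, working time, working aim, and contact, to the three interaction classes described above.

```python
from dataclasses import dataclass

@dataclass
class InteractionState:
    """Snapshot of a human-robot work situation (illustrative fields only)."""
    shared_workspace: bool   # human and robot occupy the same working area
    simultaneous_work: bool  # both are active at the same time
    shared_task: bool        # both pursue the same aim or task
    physical_contact: bool   # direct contact between human and robot

def classify_interaction(state: InteractionState) -> str:
    """Map the four criteria (workspace, time, aim, contact) to an interaction class."""
    if not (state.shared_workspace and state.simultaneous_work):
        return "separated operation (classical industrial robot)"
    if state.shared_task and state.physical_contact:
        return "human-robot collaboration"
    if state.shared_task:
        return "human-robot cooperation"
    return "human-robot coexistence"

# Example: shared cell, simultaneous work, same task, with direct contact
print(classify_interaction(InteractionState(True, True, True, True)))
# -> human-robot collaboration
```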

1.2. Human–Robot Collaboration Types

Human–robot collaboration is an advanced capability of robots that allows them to execute a challenging task involving human interaction in two ways: (i) physical collaboration; and (ii) contactless collaboration [14][2]. Physical collaboration entails direct physical contact, i.e., forces exerted by the human hand on the robot’s end-effector. These forces/torques are used to assist the robot or to predict its motion accordingly [16][4]. Contactless collaboration, in contrast, does not involve physical interaction; it is carried out through direct (speech or gestures) or indirect (eye-gaze direction, intention recognition, or facial expressions) communication [15][3]. In both types of collaboration, the cognitive skills and decision-making abilities of the human operator are combined with the robot’s capacity to perform the job repetitively and with high precision.
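
In physical collaboration, the measured hand forces must be turned into compliant robot motion. One common way to realize this, not explicitly named above, is admittance control, in which a virtual mass–damper maps the measured wrench to a commanded end-effector velocity. The 1-DoF Python sketch below is a minimal illustration; the parameters, control period, and force profile are assumed values, not taken from the cited works.

```python
# Minimal 1-DoF admittance law: M * dv/dt + D * v = f_ext
# The measured human force f_ext is rendered as motion of a virtual
# mass-damper; the resulting velocity v would be sent to the robot's
# velocity controller. M, D and the force profile are illustrative.
M, D = 2.0, 25.0          # virtual mass [kg] and damping [N s/m] (assumed)
dt = 0.002                # control period [s] (500 Hz, assumed)
v = 0.0                   # commanded end-effector velocity [m/s]

def human_force(t: float) -> float:
    """Hypothetical force profile: the operator pushes for one second."""
    return 10.0 if 0.5 <= t < 1.5 else 0.0

for k in range(int(3.0 / dt)):
    t = k * dt
    f_ext = human_force(t)            # would come from a wrist force/torque sensor
    dv = (f_ext - D * v) / M          # virtual mass-damper dynamics
    v += dv * dt                      # integrate (explicit Euler)
    # send_velocity_command(v)        # placeholder for the robot interface
print(f"final commanded velocity: {v:.4f} m/s")
```
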
Contactless collaboration faces several issues, e.g., communication-channel delay, input actuator saturation, bounded inputs and outputs, and data-transmission delay in bilateral teleoperation systems. Various control methods have therefore been reported in the literature to deal with these issues, such as output feedback control [17][5], fuzzy control [18][6], adaptive robust control [19][7], model predictive control [20][8], and sliding mode control [21,22][9][10]. However, this survey focuses on the critical issues faced by collaborative robots during physical HR collaboration. The key challenges in this regard include predicting human intentions, synchronizing motion in the presence of human-caused disturbances, and ensuring human safety for efficient physical HR interaction. The following section introduces the different robotic operations of a collaborative robot during HR collaboration.
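
Of the methods listed above, sliding mode control is frequently used to reject bounded, unknown disturbances. The cited designs target bilateral teleoperation and are considerably more involved; the sketch below is only a generic, textbook-style sliding mode tracking controller for a 1-DoF second-order system, with illustrative gains, intended to show the core idea of a sliding surface and a switching term.

```python
import numpy as np

# Textbook sliding mode tracking control for a 1-DoF system
#   x_ddot = u + d(t),  |d| <= d_max  (bounded, unknown disturbance)
# Sliding surface s = e_dot + lam * e drives the tracking error e = x - x_d
# to (a neighborhood of) zero despite the disturbance. Gains are illustrative.
lam, K, d_max = 5.0, 3.0, 2.0
dt, T = 0.001, 4.0
x, x_dot = 0.0, 0.0

def ref(t):
    """Desired trajectory and its first two derivatives."""
    return np.sin(t), np.cos(t), -np.sin(t)

for k in range(int(T / dt)):
    t = k * dt
    x_d, xd_dot, xd_ddot = ref(t)
    e, e_dot = x - x_d, x_dot - xd_dot
    s = e_dot + lam * e
    # Equivalent control plus switching term (tanh softens chattering)
    u = xd_ddot - lam * e_dot - (K + d_max) * np.tanh(s / 0.05)
    d = 1.5 * np.sin(3.0 * t)          # bounded disturbance, |d| <= d_max
    x_ddot = u + d
    x_dot += x_ddot * dt
    x += x_dot * dt
print(f"tracking error at t={T}s: {x - np.sin(T):.2e}")
```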

1.3. Collaborative Robotic Operations

The technical specification ISO/TS 15066 describes four operating modes for collaborative robots to ensure human safety: (1) power and force limiting; (2) speed and separation monitoring; (3) a safety-rated monitored stop; and (4) hand guiding [23,24][11][12]. In these operating modes, collaborative robots work in collaboration with a human operator depending on the application. Table 1 compares the four modes of collaborative operation in terms of features, speed monitoring, torque sensing, operator control, and workspace limits for safe HR collaboration.
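
Of these modes, speed and separation monitoring is governed by a protective separation distance that the measured human–robot distance must not fall below. A simplified constant-velocity form of this distance is S_p = v_h (T_r + T_s) + v_r T_r + S_s + C + Z_d + Z_r. The Python sketch below computes it and selects a reaction level; the numeric values and the three-level reaction logic are illustrative assumptions, not normative requirements.

```python
def protective_separation_distance(v_h, v_r, t_r, t_s, s_s, c, z_d, z_r):
    """Simplified constant-velocity form of the protective separation
    distance used in speed and separation monitoring (ISO/TS 15066).
      v_h : human speed towards the robot [m/s]
      v_r : robot speed towards the human [m/s]
      t_r : robot reaction time [s],  t_s : robot stopping time [s]
      s_s : robot stopping distance [m]
      c   : intrusion distance [m]
      z_d, z_r : human / robot position measurement uncertainties [m]
    """
    return v_h * (t_r + t_s) + v_r * t_r + s_s + c + z_d + z_r

def ssm_action(measured_distance, s_p):
    """Illustrative three-level reaction: full speed, slow down, or stop."""
    if measured_distance > 1.5 * s_p:
        return "full speed"
    if measured_distance > s_p:
        return "reduced speed"
    return "safety-rated monitored stop"

# Example with illustrative (non-normative) values
s_p = protective_separation_distance(v_h=1.6, v_r=0.5, t_r=0.1, t_s=0.3,
                                     s_s=0.15, c=0.2, z_d=0.1, z_r=0.04)
print(f"S_p = {s_p:.2f} m ->", ssm_action(measured_distance=1.0, s_p=s_p))
```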

References

  1. Schmidtler, J.; Knott, V.; Hölzel, C.; Bengler, K. Human Centered Assistance Applications for the working environment of the future. Occup. Ergon. 2015, 12, 83–95.
  2. Matheson, E.; Minto, R.; Zampieri, E.G.G.; Faccio, M.; Rosati, G. Human–Robot Collaboration in Manufacturing Applications: A Review. Robotics 2019, 8, 100.
  3. Wang, L.; Gao, R.; Váncza, J.; Krüger, J.; Wang, X.; Makris, S.; Chryssolouris, G. Symbiotic human–robot collaborative assembly. CIRP Ann. 2019, 68, 701–726.
  4. Al-Yacoub, A.; Zhao, Y.; Eaton, W.; Goh, Y.; Lohse, N. Improving human robot collaboration through Force/Torque based learning for object manipulation. Robot. Comput.-Integr. Manuf. 2021, 69, 102111.
  5. Rahman, S.M.; Wang, Y. Mutual trust-based subtask allocation for human–robot collaboration in flexible lightweight assembly in manufacturing. Mechatronics 2018, 54, 94–109.
  6. Jiang, J.; Huang, Z.; Bi, Z.; Ma, X.; Yu, G. State-of-the-Art control strategies for robotic PiH assembly. Robot. Comput.-Integr. Manuf. 2020, 65, 101894.
  7. Cheng, C.; Liu, S.; Wu, H.; Zhang, Y. Neural network–based direct adaptive robust control of unknown MIMO nonlinear systems using state observer. Int. J. Adapt. Control Signal Process. 2020, 34, 1–14.
  8. Li, S.; Wang, H.; Zhang, S. Human-Robot Collaborative Manipulation with the Suppression of Human-caused Disturbance. J. Intell. Robot. Syst. 2021, 102, 1–11.
  9. Chen, Z.; Huang, F.; Chen, W.; Zhang, J.; Sun, W.; Chen, J.; Gu, J.; Zhu, S. RBFNN-Based Adaptive Sliding Mode Control Design for Delayed Nonlinear Multilateral Telerobotic System With Cooperative Manipulation. IEEE Trans. Ind. Inform. 2020, 16, 1236–1247.
  10. Abadi, A.S.S.; Hosseinabadi, P.A.; Mekhilef, S. Fuzzy adaptive fixed-time sliding mode control with state observer for a class of high-order mismatched uncertain systems. Int. J. Control Autom. Syst. 2020, 18, 2492–2508.
  11. Scholtz, J. Theory and evaluation of human robot interactions. In Proceedings of the 36th Annual Hawaii International Conference on System Sciences, Big Island, HI, USA, 6–9 January 2003; p. 10.
  12. Cherubini, A.; Passama, R.; Meline, A.; Crosnier, A.; Fraisse, P. Multimodal control for human–robot cooperation. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 2202–2207.
  13. Hua, C.; Yang, Y.; Liu, P.X. Output-feedback adaptive control of networked teleoperation system with time-varying delay and bounded inputs. IEEE/ASME Trans. Mechatron. 2014, 20, 2009–2020.
  14. Zhai, D.H.; Xia, Y. Adaptive fuzzy control of multilateral asymmetric teleoperation for coordinated multiple mobile manipulators. IEEE Trans. Fuzzy Syst. 2015, 24, 57–70.
  15. Chen, Z.; Huang, F.; Sun, W.; Gu, J.; Yao, B. RBF-neural-network-based adaptive robust control for nonlinear bilateral teleoperation manipulators with uncertainty and time delay. IEEE/ASME Trans. Mechatron. 2019, 25, 906–918.
  16. Rosenstrauch, M.J.; Krüger, J. Safe human–robot-collaboration-introduction and experiment using ISO/TS 15066. In Proceedings of the 3rd International Conference on Control, Automation and Robotics (ICCAR), Nagoya, Japan, 24–26 April 2017; pp. 740–744.
  17. Villani, V.; Pini, F.; Leali, F.; Secchi, C. Survey on human–robot collaboration in industrial settings: Safety, intuitive interfaces and applications. Mechatronics 2018, 55, 248–266.
  18. Asfour, T.; Kaul, L.; Wächter, M.; Ottenhaus, S.; Weiner, P.; Rader, S.; Grimm, R.; Zhou, Y.; Grotz, M.; Paus, F.; et al. ARMAR-6: A Collaborative Humanoid Robot for Industrial Environments. In Proceedings of the 2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids), Beijing, China, 6–9 November 2018; pp. 447–454.
  19. Tang, L.; Jiang, Y.; Lou, J. Reliability architecture for collaborative robot control systems in complex environments. Int. J. Adv. Robot. Syst. 2016, 13, 17.
  20. Ye, Y.; Li, P.; Li, Z.; Xie, F.; Liu, X.J.; Liu, J. Real-Time Design Based on PREEMPT_RT and Timing Analysis of Collaborative Robot Control System. In Proceedings of the Intelligent Robotics and Applications; Liu, X.J., Nie, Z., Yu, J., Xie, F., Song, R., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 596–606.
  21. Dumonteil, G.; Manfredi, G.; Devy, M.; Confetti, A.; Sidobre, D. Reactive planning on a collaborative robot for industrial applications. In Proceedings of the 2015 12th International Conference on Informatics in Control, Automation and Robotics (ICINCO), Colmar, France, 21–23 July 2015; Volume 2, pp. 450–457.
  22. Parusel, S.; Haddadin, S.; Albu-Schäffer, A. Modular state-based behavior control for safe human-robot interaction: A lightweight control architecture for a lightweight robot. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 4298–4305.
  23. Xi, Q.; Zheng, C.W.; Yao, M.Y.; Kou, W.; Kuang, S.L. Design of a Real-time Robot Control System oriented for Human-Robot Cooperation. In Proceedings of the 2021 International Conference on Artificial Intelligence and Electromechanical Automation (AIEA), Guangzhou, China, 14–16 May 2021; pp. 23–29.
  24. Gambao, E.; Hernando, M.; Surdilovic, D. A new generation of collaborative robots for material handling. In Proceedings of the International Symposium on Automation and Robotics in Construction; IAARC Publications: Eindhoven, The Netherlands, 2012; Volume 29, p. 1.
  25. Fong, T.; Thorpe, C.; Baur, C. Advanced Interfaces for Vehicle Teleoperation: Collaborative Control, Sensor Fusion Displays, and Remote Driving Tools. Auton. Robot. 2001, 11, 77–85.
  26. Haddadin, S.; Croft, E. Physical human–robot interaction. In Springer Handbook of Robotics; Springer: Cham, Switzerland, 2016; pp. 1835–1874.
  27. Skrinjar, L.; Slavič, J.; Boltežar, M. A review of continuous contact-force models in multibody dynamics. Int. J. Mech. Sci. 2018, 145, 171–187.
  28. Ahmadizadeh, M.; Shafei, A.; Fooladi, M. Dynamic analysis of multiple inclined and frictional impact-contacts in multi-branch robotic systems. Appl. Math. Model. 2021, 91, 24–42.
  29. Korayem, M.; Shafei, A.; Seidi, E. Symbolic derivation of governing equations for dual-arm mobile manipulators used in fruit-picking and the pruning of tall trees. Comput. Electron. Agric. 2014, 105, 95–102.
  30. Shafei, A.; Shafei, H. Planar multibranch open-loop robotic manipulators subjected to ground collision. J. Comput. Nonlinear Dyn. 2017, 12, 06100.