Human Operation Augmentation through Wearable Robotic Limb

The supernumerary robotic limb (SRL) is a new type of wearable robot that improves the human body’s ability to move, perceive, and operate through the integration, mutual assistance, and cooperation of mechanical and human limbs. Unlike traditional collaborative robots, SRLs feature a closer human–robot interaction mode and a cooperative mode in which the robot moves with the human body.

Keywords: human augmentation; wearable robotic limb; supernumerary robotic limb; mixed reality

1. Introduction

The supernumerary robotic limb (SRL) is a new type of wearable robot, distinct from prosthetic and exoskeleton robots. It improves the human body’s ability to move, perceive, and operate through the integration, mutual assistance, and cooperation of mechanical and human limbs [1][2][3]. For different application scenarios, scholars from various countries have designed SRLs worn at different positions on the body: extra robotic limbs worn on the waist for auxiliary support tasks [4] or for tool delivery and remote assistance [5]; extra robotic limbs worn on the shoulders for overhead support tasks [6]; and extra robotic fingers used to improve the independent living ability of disabled people [7] or to expand the grasping capacity of able-bodied people [8]. In these application scenarios, SRLs play a vital role in expanding a single person’s operational capability and improving work efficiency.
It is also imperative to design interaction methods that do not impair the wearer’s ability to operate and that make rational use of human body information. Ref. [9] divided the command interfaces of supernumerary effectors into three categories: body, muscle, and neural. The body interface refers to interaction methods based on body motion mapping, such as finger [10] and foot [11] motion, and methods using changes in hand and finger force [12]. Surface electromyography (EMG) interaction methods based on redundant human muscles have been proposed, such as chest and abdominal EMG signals [13] and forehead EMG signals [14]. Electroencephalography/magnetoencephalography (EEG/MEG) interaction methods [15][16] have also been proposed but have not been applied to actual task scenarios. Although these interaction methods can realize basic communication with SRLs, some problems remain unsolved, such as increased cognitive load on the wearer, complex sensor data processing, and limited adaptability (to the wearer and to the application scene). Therefore, it is important to construct a simple, reliable, and natural human–robot interaction interface for SRLs and to design an efficient interaction strategy.
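To make the muscle-channel interface concrete, the following is a minimal sketch of how a pre-processed surface EMG envelope could be mapped to a discrete SRL command with hysteresis. The thresholds, window length, and command names are illustrative assumptions, not values taken from the cited systems.

```python
# Minimal sketch of a muscle-interface command channel for an SRL.
# Assumes a pre-filtered, normalized EMG envelope sampled at 1 kHz.
# Thresholds, window length, and the command set are illustrative only.
import numpy as np

def emg_to_command(envelope: np.ndarray, on_threshold: float = 0.4,
                   off_threshold: float = 0.2) -> str:
    """Map a short EMG envelope window to a discrete SRL command.

    Hysteresis (separate on/off thresholds) reduces chattering when the
    muscle activation hovers near a single threshold.
    """
    level = float(np.mean(envelope))   # average activation in the window
    if level >= on_threshold:
        return "grasp"                 # strong contraction -> close gripper
    if level <= off_threshold:
        return "release"               # relaxed muscle -> open gripper
    return "hold"                      # in-between -> keep current state

# Usage: a 100 ms window (100 samples at 1 kHz) of a normalized envelope.
window = np.clip(np.random.rand(100) * 0.6, 0.0, 1.0)
print(emg_to_command(window))
```

Separating the "on" and "off" thresholds is one simple way to keep such a single-channel interface from toggling the robot on every small fluctuation in muscle activity.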
The use of collaborative robots in industry has grown over the past few years. Augmented Reality (AR), a prominent and promising tool for helping human operators understand and interact with robots, has been applied in many industrial human–robot collaboration and cooperation applications. Research on human–robot collaboration using head-mounted displays (HMDs) and projectors has attracted increasing attention [17]. Compared with industrial collaborative robots, SRLs are characterized by closer human–robot interaction, moving with the wearer, and not affecting the wearer’s ability to operate. Mixed Reality (MR) is a broader term encompassing AR and Augmented Virtuality (AV); it is a general term for blending the virtual and physical worlds [18][19]. HoloLens 2 is an MR device developed by Microsoft Corp.

2. Human–Robot Interaction Method of SRLs

Unlike traditional collaborative robots, SRLs have a closer human–robot interaction mode and a cooperative mode of moving with the human body. Therefore, current human–robot interaction research for SRLs focuses on natural interaction methods that do not affect the wearer’s ability to operate. Refs. [12][13] proposed an interaction method based on task-redundant finger force for a door-opening task and an interaction method based on EMG signals. Ref. [20] proposed an interaction method based on eye-gaze information and quantified its manipulation accuracy. Ref. [21] proposed a method for predicting foot placement during assisted walking that fuses continuous 3D gaze with environmental context; combining human gaze with environmental point cloud information is significant for wearable robotic limb control in assisted walking. Ref. [22] proposed a gaze-based natural human–robot interaction method that suppresses the noise of the gaze signal to extract human grasping intent; this differs from previous research on gaze intention and provides a useful reference for natural human–robot interaction with wearable robotic limbs. Ref. [15] studied the manipulation of external limbs by EEG signals and evaluated how various SRL factors affect human–robot interaction. Ref. [8] analyzed the ability to control extra robotic fingers with the toes and the effects on human neural body representation. Ref. [23] proposed a task model for overhead tasks that realizes human–robot collaboration of an SRL according to the operator’s actions; the model adapts to the task and provides a reference for constructing interaction strategies for similar tasks.

The interaction between SRLs and the wearer mainly converts the wearer’s intentions into robot task-execution information. This can be understood as a “decomposition” and “synthesis” process: the person decomposes their intentions into various external signals, and the SRL collects this external information and synthesizes the task-execution information required by the robot. The key challenges lie in collecting multimodal human information, synthesizing and transforming multi-source information, reducing human cognitive load, and designing the SRL cooperation strategy. Existing work has not applied MR to human–robot interaction for SRLs. Moreover, most current approaches rely on a single interaction modality, and the user experience is not comfortable enough.
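As an illustration of the “decomposition” and “synthesis” view above, the sketch below fuses two hypothetical intent channels (a gaze target with dwell time and a separate confirmation gesture) into one task command. The channel names, fusion rule, and Task fields are assumptions made for illustration and do not reproduce any of the cited methods.

```python
# Minimal sketch of intent "decomposition/synthesis": the wearer's intent is
# decomposed into observable channels (gaze target + confirmation gesture),
# and the SRL controller synthesizes them into a single task command.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Task:
    action: str             # e.g. "pick", "hold", "hand_over"
    target: Optional[str]   # object identifier resolved from gaze

def synthesize_task(gaze_target: Optional[str],
                    gaze_dwell_s: float,
                    confirm_gesture: bool) -> Optional[Task]:
    """Fuse gaze and a low-cognitive-load confirmation into a task command."""
    # Require a stable gaze (dwell time) so brief glances do not trigger the robot.
    if gaze_target is None or gaze_dwell_s < 0.8:
        return None
    # A separate confirmation channel keeps the interface from acting on gaze alone.
    if not confirm_gesture:
        return None
    return Task(action="pick", target=gaze_target)

# Usage: the wearer fixates a tool for 1.2 s and confirms with a gesture.
print(synthesize_task("hex_wrench", 1.2, True))
```

Requiring two independent channels before acting is one way to reduce false triggers without adding cognitive load, since each channel on its own is cheap for the wearer to produce.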

3. AR-Based Operator Support Systems

Advances in display and vision technologies create new interaction methods that enable information-rich, real-time communication in shared workspaces. The visualization methods applied in AR-based human–robot collaboration mainly include HMDs, spatial augmented reality projectors, fixed screens, and hand-held displays (HHDs) [17]. Current research mainly implements Safety, Guidance, Feedback, Programming, and Quality Control through HMDs or projectors. Ref. [24] communicates the motion intent of a robotic arm through a mixed reality head-mounted display. Ref. [25] applied AR to the human–robot collaborative manufacturing process to address human–robot safety issues and improve operational efficiency. Ref. [26] presents the design and implementation of an AR tool that provides production- and process-related feedback information and enhances the operator’s immersion in safety mechanisms. Refs. [27][28] also researched the application of AR to industrial robot programming and collaboration models. These studies have shown the advantages of AR devices in human–robot interaction through qualitative or quantitative methods. The related works are summarized in Table 1.
Table 1. Summary and classification of related work.
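As a rough illustration of how motion-intent visualization of the kind reported in Ref. [24] could be wired up, the sketch below serializes a planned end-effector path and streams it to an HMD client over a plain TCP/JSON link. The message schema, host address, and port are assumptions; a real deployment would use the HMD vendor’s SDK and the robot’s middleware rather than raw sockets.

```python
# Minimal sketch of streaming planned robot motion to an AR HMD for intent
# visualization. The JSON schema, host, and port are illustrative assumptions.
import json
import socket

def send_planned_path(waypoints, host: str = "192.168.1.50", port: int = 9000) -> None:
    """Send a planned end-effector path to an HMD client as one JSON message."""
    message = {
        "type": "planned_path",
        "frame": "robot_base",   # coordinate frame the HMD must transform from
        "waypoints": [{"x": x, "y": y, "z": z} for (x, y, z) in waypoints],
    }
    payload = json.dumps(message).encode("utf-8")
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(payload + b"\n")   # newline-delimited JSON stream

# Usage: three waypoints of a pick motion the wearer would see before the SRL moves.
# send_planned_path([(0.30, 0.10, 0.20), (0.35, 0.05, 0.15), (0.40, 0.00, 0.10)])
```

Publishing the planned path before execution lets the wearer anticipate where the limb will move, which is the core benefit that the HMD-based intent-visualization studies above report.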

References

  1. Jing, H.; Zhu, Y.; Zhao, S.; Zhang, Q.; Zhao, J. Research status and development trend of supernumerary robotic limbs. J. Mech. Eng. 2020, 26, 1–9.
  2. Tong, Y.; Liu, J. Review of research and development of supernumerary robotic limbs. IEEE/CAA J. Autom. Sin. 2021, 8, 929–952.
  3. Yang, B.; Huang, J.; Chen, X.; Xiong, C.; Hasegawa, Y. Supernumerary robotic limbs: A review and future outlook. IEEE Trans. Med Robot. Bionics 2021, 3, 623–639.
  4. Parietti, F.; Asada, H.H. Supernumerary robotic limbs for aircraft fuselage assembly: Body stabilization and guidance by bracing. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 1176–1183.
  5. Véronneau, C.; Denis, J.; Lebel, L.P.; Denninger, M.; Blanchard, V.; Girard, A.; Plante, J.S. Multifunctional remotely actuated 3-DOF supernumerary robotic arm based on magnetorheological clutches and hydrostatic transmission lines. IEEE Robot. Autom. Lett. 2020, 5, 2546–2553.
  6. Bonilla, B.L.; Asada, H.H. A robot on the shoulder: Coordinated human-wearable robot control using coloured petri nets and partial least squares predictions. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 119–125.
  7. Hussain, I.; Salvietti, G.; Spagnoletti, G.; Malvezzi, M.; Cioncoloni, D.; Rossi, S.; Prattichizzo, D. A soft supernumerary robotic finger and mobile arm support for grasping compensation and hemiparetic upper limb rehabilitation. Robot. Auton. Syst. 2017, 93, 1–12.
  8. Kieliba, P.; Clode, D.; Maimon-Mor, R.O.; Makin, T.R. Robotic hand augmentation drives changes in neural body representation. Sci. Robot. 2021, 6, eabd7935.
  9. Eden, J.; Bräcklein, M.; Ibáñez, J.; Barsakcioglu, D.Y.; Di Pino, G.; Farina, D.; Burdet, E.; Mehring, C. Principles of human movement augmentation and the challenges in making it a reality. Nat. Commun. 2022, 13, 1345.
  10. Wu, F.Y.; Asada, H.H. Implicit and intuitive grasp posture control for wearable robotic fingers: A data-driven method using partial least squares. IEEE Trans. Robot. 2016, 32, 176–186.
  11. Sasaki, T.; Saraiji, M.Y.; Fernando, C.L.; Minamizawa, K.; Inami, M. MetaLimbs: Multiple arms interaction metamorphism. In ACM SIGGRAPH 2017 Emerging Technologies; ACM: New York, NY, USA, 2017; pp. 1–2.
  12. Guggenheim, J.; Hoffman, R.; Song, H.; Asada, H.H. Leveraging the human operator in the design and control of supernumerary robotic limbs. IEEE Robot. Autom. Lett. 2020, 5, 2177–2184.
  13. Parietti, F.; Asada, H.H. Independent, voluntary control of extra robotic limbs. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 5954–5961.
  14. Salvietti, G.; Hussain, I.; Cioncoloni, D.; Taddei, S.; Rossi, S.; Prattichizzo, D. Compensating hand function in chronic stroke patients through the robotic sixth finger. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 25, 142–150.
  15. Penaloza, C.I.; Nishio, S. BMI control of a third arm for multitasking. Sci. Robot. 2018, 3, eaat1228.
  16. Tang, Z.; Zhang, L.; Chen, X.; Ying, J.; Wang, X.; Wang, H. Wearable supernumerary robotic limb system using a hybrid control approach based on motor imagery and object detection. IEEE Trans. Neural Syst. Rehabil. Eng. 2022, 30, 1298–1309.
  17. Costa, G.d.M.; Petry, M.R.; Moreira, A.P. Augmented reality for human–robot collaboration and cooperation in industrial applications: A systematic literature review. Sensors 2022, 22, 2725.
  18. Chang, C.Y.; Debra Chena, C.L.; Chang, W.K. Research on immersion for learning using virtual reality, augmented reality and mixed reality. Enfance 2019, 3, 413–426.
  19. Fast-Berglund, Å.; Gong, L.; Li, D. Testing and validating Extended Reality (xR) technologies in manufacturing. Procedia Manuf. 2018, 25, 31–38.
  20. Fan, Z.; Lin, C.; Fu, C. A gaze signal based control method for supernumerary robotic limbs. In Proceedings of the 2020 3rd International Conference on Control and Robots (ICCR), Tokyo, Japan, 26–29 December 2020; pp. 107–111.
  21. Zhang, K.; Liu, H.; Fan, Z.; Chen, X.; Leng, Y.; de Silva, C.W.; Fu, C. Foot placement prediction for assistive walking by fusing sequential 3D gaze and environmental context. IEEE Robot. Autom. Lett. 2021, 6, 2509–2516.
  22. Yang, B.; Huang, J.; Chen, X.; Li, X.; Hasegawa, Y. Natural Grasp Intention Recognition Based on Gaze in Human–Robot Interaction. IEEE J. Biomed. Health Inf. 2023, 27, 2059–2070.
  23. Tu, Z.; Fang, Y.; Leng, Y.; Fu, C. Task-based Human-Robot Collaboration Control of Supernumerary Robotic Limbs for Overhead Tasks. IEEE Robot. Autom. Lett. 2023, 8, 4505–4512.
  24. Rosen, E.; Whitney, D.; Phillips, E.; Chien, G.; Tompkin, J.; Konidaris, G.; Tellex, S. Communicating robot arm motion intent through mixed reality head-mounted displays. In Proceedings of the Robotics Research: The 18th International Symposium ISRR, Puerto Varas, Chile, 11–14 December 2017; Springer: Berlin/Heidelberg, Germany, 2020; pp. 301–316.
  25. Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.K. AR-based interaction for human-robot collaborative manufacturing. Robot. Comput. Integr. Manuf. 2020, 63, 101891.
  26. Makris, S.; Karagiannis, P.; Koukas, S.; Matthaiakis, A.S. Augmented reality system for operator support in human–robot collaborative assembly. CIRP Ann. 2016, 65, 61–64.
  27. Dimitropoulos, N.; Togias, T.; Zacharaki, N.; Michalos, G.; Makris, S. Seamless human–robot collaborative assembly using artificial intelligence and wearable devices. Appl. Sci. 2021, 11, 5699.
  28. Chan, W.P.; Quintero, C.P.; Pan, M.K.; Sakr, M.; Van der Loos, H.M.; Croft, E. A multimodal system using augmented reality, gestures, and tactile feedback for robot trajectory programming and execution. In Virtual Reality; River Publishers: Gistrup, Denmark, 2022; pp. 142–158.