Cite
Cogurcu, Y.E.; Douthwaite, J.A.; Maddock, S. Virtual and Physical Robot Arms Using Augmented Reality. Encyclopedia. Available online: https://encyclopedia.pub/entry/50965 (accessed on 20 May 2024).
Virtual and Physical Robot Arms Using Augmented Reality

The use of robot arms in various industrial settings has changed the way tasks are completed. However, safety concerns for both humans and robots in these collaborative environments remain a critical challenge. Traditional approaches to visualising safety zones, including physical barriers and warning signs, may not always be effective in dynamic environments or where multiple robots and humans are working simultaneously. Mixed reality technologies offer dynamic and intuitive visualisations of safety zones in real time, with the potential to overcome these limitations.

Keywords: augmented reality; human–robot collaboration; safety zone visualisations

1. Introduction

Robotics technology has changed industrial operations, enabling the efficient and accurate completion of tasks. One of the most significant applications of robotics is the use of robot arms for repetitive, complex or hazardous tasks in various industrial settings, including manufacturing, construction, and healthcare. However, ensuring the safety of both humans and robots in these environments is a critical challenge that needs to be addressed. To ensure the safe operation of robot arms in industrial settings, it is essential to provide clear visual representations of safety zones—areas around the robot arm that are free from obstacles and where human workers should not enter to avoid collisions or accidents. Traditional approaches to visualising safety zones include physical barriers, warning signs, and colour-coded floor markings [1]. However, these methods may not always be effective in dynamic environments, where the layout of the workspace may change frequently, or in settings where multiple robots and humans are working simultaneously [2].
Virtual and augmented reality technologies have the potential to overcome these limitations by providing dynamic and intuitive visualisations of safety zones in real time. Virtual reality (VR) technologies can create fully immersive environments, where users can simulate the operation of robot arms without the need for physical equipment [3]. Augmented reality (AR) technologies, on the other hand, can overlay holographic-like images onto the real world, providing a more realistic and contextually relevant representation of safety zones [2][4][5][6][7]. Recent advances in mixed reality devices, such as the Microsoft HoloLens 2, have enabled users to interact with such holographic images overlaid onto the real world. The HoloLens 2 has been used in various applications, including remote assistance [8], training [9], and manufacturing [10] in industrial settings. However, little research has been conducted to evaluate the effectiveness of safety zone visualisations in virtual and real robot arm environments using the HoloLens 2.

2. Safety Zone Visualisations for Virtual and Physical Robot Arms Using Augmented Reality

A range of AR display technologies have been used in studying the safety and efficiency of robot arm operations. For example, Hietanen et al. [4] used both projection-based AR and the Microsoft HoloLens 1 in the context of a human–robot collaboration task involving a diesel engine assembly, utilising depth sensor information for workspace monitoring in both cases. The results showed that AR interaction reduced task completion time by up to 21–24% and robot idle time by up to 57–64%. Moreover, the user survey indicated that the projector–mirror setup was deemed more comfortable and easier to use than the HoloLens 1, which users found bulky and uncomfortable to wear for extended periods. The limited field of view of the HoloLens 1 was also identified as an issue. Lotsaris et al. [2] also used a HoloLens 1 in a system that displayed robot sensor information and safety zones to a user. Feedback from the study revealed that, consistent with previous research, the HoloLens 1 was considered uncomfortable to wear for prolonged periods, rendering it unfit for extensive use in a work environment.
Hoang et al. [5] presented a virtual barrier system equipped with an AR interface to ensure safe human–robot collaboration (HRC). The system offers two distinct virtual barriers: a person barrier that surrounds and tracks the user, and a user-defined virtual barrier for objects or areas. The study evaluated the performance of a pick-and-place task that simulated an industrial manufacturing procedure using the Microsoft HoloLens 2 compared to a standard 2D display interface. They concluded that the proposed configurable virtual barrier system, coupled with AR, enhances HRC safety compared to a conventional 2D display interface. Cogurcu and Maddock [7] considered user-configurable virtual geometric objects, such as cubes and cylinders, that enveloped a real Universal Robots 10 (UR10) robot arm. These objects were either a single shape that adapted dynamically to the robot arm's movements or individual shapes that encapsulated the robot arm's components. The sizes of the virtual safety zones are calculated from a combination of ISO standards [11], hardware latencies, and network latencies, which were not taken into account in prior studies.
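The idea of sizing a safety zone from latencies can be illustrated with the speed-and-separation monitoring approach of ISO/TS 15066: the protective distance must cover the ground the human and the robot can each cover during the system's reaction time plus the robot's stopping time. The sketch below is a minimal illustration of that reasoning, not the authors' implementation; all parameter values (speeds, latencies, the clearance margin) are assumed for demonstration.

```python
# Illustrative sketch (assumed values, not the authors' implementation):
# sizing a virtual safety zone radius from robot and human speeds plus
# sensing and network latencies, in the spirit of ISO/TS 15066
# speed-and-separation monitoring.

def safety_zone_radius(human_speed, robot_speed, stop_time,
                       sensor_latency, network_latency, clearance=0.1):
    """Return a protective separation distance in metres.

    human_speed     -- assumed human approach speed (m/s)
    robot_speed     -- robot tool speed (m/s)
    stop_time       -- time for the robot to stop after a stop command (s)
    sensor_latency  -- tracking/sensing delay (s)
    network_latency -- AR-device-to-robot communication delay (s)
    clearance       -- extra margin for measurement uncertainty (m)
    """
    reaction_time = sensor_latency + network_latency
    # Distance the human covers while the system reacts and the robot stops
    human_travel = human_speed * (reaction_time + stop_time)
    # Distance the robot covers over the same interval
    robot_travel = robot_speed * (reaction_time + stop_time)
    return human_travel + robot_travel + clearance

# Example: 1.6 m/s walking speed, 1.0 m/s robot speed, 0.5 s stop time,
# 60 ms sensing delay, 40 ms network delay -> 1.66 m radius
radius = safety_zone_radius(human_speed=1.6, robot_speed=1.0,
                            stop_time=0.5, sensor_latency=0.06,
                            network_latency=0.04)
```

Note how the network latency term enlarges the zone: a slower AR-device-to-robot link directly translates into a larger required separation, which is why latencies that prior studies ignored matter for zone sizing.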
Other work has considered the use of virtual and real environments for testing and optimising HRC. For example, [12] explored the use of virtual environments for testing safety systems in HRC, and [13] investigated whether VR technology used in manufacturing could provide faster design options. While these studies have provided valuable insights into the potential benefits of virtual environments, the overall picture is less clear. Some research suggests that virtual environments are not advanced enough to accurately simulate real-world conditions, and real-world testing is necessary for a comprehensive assessment of systems involving robots [9], while other research argues that virtual environments can provide a suitable approximation of real-world conditions, with some limitations [14]. For example, according to [15], the use of VR technology has been shown to provide users with a similar experience to physical interaction with a robot arm. In their study, the authors found that users were able to perform tasks with a virtual robot arm just as accurately and efficiently as they could with a physical robot arm.
A further aspect of the virtual robot arm experiment that could affect transferability is the sense of touch. In the virtual robot arm experiment, virtual wooden blocks were manipulated by the user. Comments from some users suggested that they had difficulty grabbing the virtual blocks, which was not the case for the real wooden blocks. This could indicate a potential issue for the transferability of precision and efficiency skills in training situations. Overall, transferability remains an open research question [9][14][15].

References

  1. Villani, V.; Pini, F.; Leali, F.; Secchi, C. Survey on human–robot collaboration in industrial settings: Safety, intuitive interfaces and applications. Mechatronics 2018, 55, 248–266.
  2. Lotsaris, K.; Fousekis, N.; Koukas, S.; Aivaliotis, S.; Kousi, N.; Michalos, G.; Makris, S. Augmented Reality (AR) based framework for supporting human workers in flexible manufacturing. Procedia CIRP 2021, 96, 301–306.
  3. Nenna, F.; Orso, V.; Zanardi, D.; Gamberini, L. The virtualization of human–robot interactions: A user-centric workload assessment. Virtual Real. 2022, 1–19.
  4. Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J. AR-based interaction for human-robot collaborative manufacturing. Robot. Comput.-Integr. Manuf. 2020, 63, 101891.
  5. Hoang, K.; Chan, W.; Lay, S.; Cosgun, A.; Croft, E. Virtual Barriers in Augmented Reality for Safe and Effective Human-Robot Cooperation in Manufacturing. arXiv 2021, arXiv:2104.05211.
  6. Cogurcu, Y.; Maddock, S. An augmented reality system for safe human-robot collaboration. In Proceedings of the 4th UK-RAS Conference for PhD Students and Early-Career Researchers on "Robotics At Home", Online, 2 June 2021.
  7. Cogurcu, Y.; Douthwaite, J.; Maddock, S. Augmented reality for safety zones in human-robot collaboration. In Proceedings of the Computer Graphics & Visual Computing (CGVC) 2022, Virtual, 15–16 September 2022.
  8. Druta, R.; Druta, C.; Negirla, P.; Silea, I. A review on methods and systems for remote collaboration. Appl. Sci. 2021, 11, 10035.
  9. Harris, D.; Bird, J.; Smart, P.; Wilson, M.; Vine, S. A framework for the testing and validation of simulated environments in experimentation and training. Front. Psychol. 2020, 11, 605.
  10. Gallala, A.; Kumar, A.; Hichri, B.; Plapper, P. Digital Twin for human–robot interactions by means of Industry 4.0 Enabling Technologies. Sensors 2022, 22, 4950.
  11. ISO/TS 15066:2016; Robots and Robotic Devices – Collaborative Robots. International Organization for Standardization: Geneva, Switzerland, 2016.
  12. Oyekan, J.; Hutabarat, W.; Tiwari, A.; Grech, R.; Aung, M.; Mariani, M.; López-Dávalos, L.; Ricaud, T.; Singh, S.; Dupuis, C. The effectiveness of virtual environments in developing collaborative strategies between industrial robots and humans. Robot. Comput.-Integr. Manuf. 2019, 55, 41–54.
  13. Malik, A.; Masood, T.; Bilberg, A. Virtual reality in manufacturing: Immersive and collaborative artificial-reality in design of human-robot workspace. Int. J. Comput. Integr. Manuf. 2020, 33, 22–37.
  14. Eswaran, M.; Bahubalendruni, M. Challenges and opportunities on AR/VR technologies for manufacturing systems in the context of industry 4.0: A state of the art review. J. Manuf. Syst. 2022, 65, 260–278.
  15. Han, Z.; Zhu, Y.; Phan, A.; Garza, F.; Castro, A.; Williams, T. Crossing Reality: Comparing Physical and Virtual Robot Deixis. In Proceedings of the 2023 ACM/IEEE International Conference On Human-Robot Interaction (HRI), Stockholm, Sweden, 13–16 March 2023.
Update Date: 31 Oct 2023