UX in AR-Supported Industrial Human–Robot Collaborative Tasks

The fourth industrial revolution is promoting the Operator 4.0 paradigm, which stems from a renewed attention to human factors, increasingly involved in the design of modern, human-centered processes. New technologies, such as augmented reality and collaborative robotics, are thus increasingly studied and progressively applied to address modern operators’ needs. Human-centered design approaches can help to identify user needs and functional requirements, solve usability issues, and reduce cognitive or physical stress.

  • User eXperience
  • human–robot interaction
  • human–robot collaboration
  • human-centered design
  • augmented reality
  • human factors

1. Introduction

The creation of intelligent, assisted, and automated machines characterizes the modern factory, aiming at two main aspects: a more conscious distribution of roles between machines and humans, and a more flexible process control to achieve efficient, optimized production. In this context, high standards of quality, production flexibility, and innovation push towards human-centered design (HCD) approaches, focused on the centrality of human factors (HF). HF refers to environmental, organizational, and job-related aspects, as well as individual human characteristics, which can highly affect health and safety during the interaction with current technologies. Introducing HF into the design process is the scope of HCD, which is defined as “an approach to systems design and development that aims to make interactive systems more usable by focusing on the use of the system and applying human factors/ergonomics and usability knowledge and techniques” [1]. Today, HCD can be applied to virtually any type of application to guarantee the satisfaction of user needs and coherence with ergonomics principles while designing any human–system interaction. HCD enables new ways to define requirements and recommendations to properly design complex systems according to a user-oriented approach. The final goal is to guarantee a valuable User eXperience (UX), which involves “the user’s perceptions and responses that result from the use and/or anticipated use of a system, product or service” [1], including usability in terms of “the achievement of specified goals with effectiveness, efficiency and satisfaction in a specified context of use”, but also considering users’ emotions and affective responses [2].
The current frameworks for applying HF and HCD in system design need to be further developed with the advent of the Operator 4.0 (O4.0) concept, which frames a smart, skilled operator performing highly specialized tasks aided by emerging technologies as and if needed [3], reshaping industrial tasks around the human–machine partnership and renovating industrial systems according to the Industry 4.0 paradigm. Indeed, the O4.0 idea is introducing new assistive technologies, such as augmented reality (AR), virtual reality (VR), and mixed reality (MR), into modern industries, making them enabling technologies for the design and development of effective human–machine cooperation. However, to achieve such challenging objectives, technologies must be centered on the figure of the modern Operator 4.0 according to a new framework able to focus on interface design for collaborative tasks involving humans and robots. First, a precise distinction among these technologies can be summarized as follows:
  • Augmented reality, as defined by Azuma et al., “supplements the real world with virtual (computer-generated) objects that appear to coexist in the same space as the real world” [4];
  • Virtual reality implies full immersion into a fictitious, digitally generated world which completely shuts out the physical world [5];
  • Mixed reality combines both the previous technologies while enabling a strict interaction between the digital and physical worlds. Thus, the user’s interaction with the computer-generated environment produces feedback in the physical one and vice versa [6].

2. What Are the State-of-the-Art UX Approaches in AR-Supported Collaborative Solutions?

The current trends in the design of collaborative tasks supported by AR technologies do not systematically show great attention to UX topics. Hietanen et al. [7] proposed an interactive user interface to assist the O4.0 in performing robot-assisted tasks, comparing two implementations of the same system: a projection-mirror setup and a wearable device (i.e., Microsoft HoloLens). No prior UX assessment was proposed; as part of the subjective evaluation, a final questionnaire with 13 questions divided into six categories (safety, information processing, ergonomics, autonomy, competence, and relatedness) was administered to roughly gauge mental and physical stress. Comments from users were collected to deepen the subjective impression, without any structured method for measuring perceived workload, as is used, for instance, in other contexts. A more structured approach is presented by Papanastasiou et al. [8]: the paper emphasizes the need for seamless integration between the human operator and the robotic counterpart by monitoring both working entities through sensors and wearable devices. This led to a re-design of the workplace from the human point of view to promote both the robot’s operability and the operator’s mobility, without any barrier separating them; a multi-stage iterative process was followed, starting from technical and functional specifications as well as safety requirements. A digital simulation is included to support cell setup and risk assessment.
De Pace et al. [9] placed attention on the usability of AR devices as enabling tools. The authors reported how usability, workload, and likability can be investigated through standardized questionnaires (e.g., NASA-TLX [10], the System Usability Scale (SUS) [11], and AttrakDiff [12]). The same intention is expressed by Huy et al. [13], who introduced a novel AR handheld device inspired by the abovementioned multimodal perceptive interface, incorporating hand gesture mapping, haptic buttons, and laser pointers. The system can suggest available options to the operator and wait for a response instead of requiring traditional keyboard or mouse input; a usability investigation is eventually foreseen to improve the interface’s effectiveness with the help of users’ feedback.
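Among these instruments, the SUS has a fully published scoring rule [11]: ten items rated on a 1–5 Likert scale, with odd-numbered items positively worded and even-numbered items negatively worded, and the total mapped onto a 0–100 range. The following minimal Python sketch shows that computation; the example response pattern is invented for illustration only.

```python
def sus_score(responses):
    """Compute the System Usability Scale score (Brooke, 1996).

    `responses` is a list of 10 Likert ratings (1-5), in questionnaire
    order. Odd-numbered items are positively worded, even-numbered
    items negatively worded. The result ranges from 0 to 100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects 10 ratings in the range 1-5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based: even index = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Hypothetical, fairly positive response pattern
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```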
Materna et al. [14] evaluated the idea of spatial augmented reality (SAR) through a UX study. The outlined approach aims to avoid continuous switching of attention during demanding tasks, thanks to a correct distribution of information along the operations and a shared workplace, so as to be usable also by non-expert users. Process simplification was also addressed by Aschenbrenner et al. [15], to reduce the installation time of hybrid robot–human production lines, and by De Tommaso et al. [16], who defined a new process of skill transfer between human workers and robots. Similarly, Fuste et al. [17] presented a holistic UX framework (called “Kinetic AR”) for visual programming of robotic motions using AR: the goal was to guarantee a low entry barrier to intricate spatial hardware programming. The UX approach was pursued through interviews with robotic system integrators, manufacturers, and end-users of different expertise, to identify the goals and requirements to be accomplished. Communication and interaction were also investigated by Bazzano et al. [18], who used 3D immersive simulation to support the design and validation of natural HRI in generic usage contexts, comparing an AR interface with a non-AR one. Among other measures, subjective observations were gathered through the SASSI methodology [19] to evaluate speech interaction in both interfaces. Data on completion times, overall satisfaction, ease of use, perceived time demand, and support information were collected, and their statistical significance was assessed with an independent-sample t-test.
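As a reference for how such between-subjects comparisons are typically run, the sketch below applies an independent-sample t-test to two hypothetical sets of completion times using SciPy. The numbers are invented, and Welch’s variant is chosen only to avoid assuming equal variances, not because the cited study necessarily did so.

```python
import numpy as np
from scipy import stats

# Hypothetical task-completion times (seconds) for two interface
# conditions in a between-subjects user test.
ar_times = np.array([41.2, 38.5, 44.0, 39.8, 42.1, 37.9, 40.4])
non_ar_times = np.array([48.7, 52.3, 46.1, 50.9, 47.5, 53.0, 49.2])

# Independent-sample t-test; Welch's variant does not assume equal variances.
t_stat, p_value = stats.ttest_ind(ar_times, non_ar_times, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 -> difference unlikely due to chance
```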
As a result of the review, one can state that there are a few preliminary attempts to include UX in the design of AR applications for HRI purposes, as summarized in this section, but a reference, ready-to-use model able to integrate users’ subjective evaluations with the analysis of quantitative human–robot performance is still missing.
The main weaknesses of the current attempts are:
  • User testing is usually based on the collection of deconstructed data regarding device or interface usability, system likability, cognitive and physical workload, or the overall subjective sense of safety in performing the selected operation, without a robust reference model;
  • Although attention to multimodal interfaces for optimizing HRI is growing, this trend is not yet mature enough to enhance human sensorial capabilities by integrating different sensors (e.g., force/torque sensors, microphones, cameras, smartwatches, and AR glasses);
  • AR application design does not consider the user perspective and does not help improve the ease of use of industrial workplaces by avoiding uncomfortable conditions (e.g., excess lighting and noise).
These results highlight the need for a structured framework to design AR interfaces for HRI and push towards its definition.

3. What Are the Main Benefits of Adopting UX Approaches in Designing AR-Supported Collaborative Solutions?

After this first analysis, the review focused on the benefits related to the adoption of UX-based approaches in the design of AR applications for HRI: these approaches generally include a detailed evaluative UX phase, where subjective questionnaires represent the main source of information. Table 1 summarizes the most significant papers dealing with this aspect, also reporting the main areas of application.
Table 1. Papers focusing on the added value related to the adoption of UX approaches in HRI.

| Paper | Benefits | Adopted UX Tools | Area of Application |
| --- | --- | --- | --- |
| Frank, Moorhead, and Kapila [20] | Understanding end-user intentions to reduce the operator’s cognitive burden | Custom questionnaire | Object manipulation |
| Chan, Hanks, Sakr, Zuo, Van der Loos, and Croft [21] | The system’s final application must be considered to prevent wrong interface choices and to avoid physical and cognitive repercussions on the user | NASA-TLX | Large-scale, labor-intensive manufacturing tasks |
| Quintero, Li, Pan, Chan, Van der Loos, and Croft [22] | Reducing robot programming time and cognitive demand | Custom questionnaire | Robot programming |
Within the context of laboratory object manipulation tests, Frank et al. [20] focused on the user interaction effectiveness of a mobile augmented interface and on virtual graphics appearing as visual cues for the task, to reduce the cognitive burden on end-users. The proposed system can automatically intercept an operator’s intention on virtual objects (i.e., drag and drop of models in space), thus reducing human involvement while operating with the collaborative companion. No defined UX approach was adopted: a revision of the overall interface was conducted through a final questionnaire after the user-testing phase to identify possible criticalities. Simplifying the interface without losing functionality in the human–robot collaboration is indeed of extreme importance, in opposition to what was proposed in [23], where high cognitive functionalities are purposely omitted from the interface.
A further critical point in AR-supported collaborative tasks is the choice of the correct interface, which is usually made without a precise validation tool or methodology. De Pace et al. [9] collected a series of interesting UI studies resembling HCD approaches, concerning whether exocentric or egocentric interfaces are best at limiting the mental and physical involvement required to control the manipulator. Another study by Chan et al. [21] reconsidered AR-based interfaces for human–robot collaboration in large-scale, labor-intensive manufacturing tasks (carbon-fiber-reinforced-polymer production), where the accent is on perceived workload and efficiency. Indeed, as stated in other studies [9], the lowest physical and temporal demand is registered with appropriately designed AR solutions, which reduce the user’s effort and sense of frustration while cutting down operational time. This approach does not explicitly reference a structured, systematic HCD methodology but relies on NASA-TLX questionnaire results. Similar conclusions were reported by Diehl et al. [24], who examined the application circumstances driving the choice of the best device, from users’ feedback on the robot’s manipulation time and area up to the user’s sense of safety.
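For context, the weighted NASA-TLX score referenced above follows a published recipe (Hart and Staveland): six subscales rated on a 0–100 scale, weighted by 15 pairwise comparisons between subscales. A minimal sketch follows, with entirely hypothetical ratings and weights.

```python
# Minimal sketch of the weighted NASA-TLX computation (Hart & Staveland).
# Ratings are on a 0-100 scale; weights come from the 15 pairwise
# comparisons between the six subscales and must sum to 15.
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def nasa_tlx(ratings, weights):
    assert set(ratings) == set(weights) == set(SUBSCALES)
    assert sum(weights.values()) == 15, "pairwise-comparison weights must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Hypothetical post-task ratings for an AR-guided collaborative assembly step
ratings = {"mental": 55, "physical": 30, "temporal": 40,
           "performance": 20, "effort": 45, "frustration": 25}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
print(f"Overall workload: {nasa_tlx(ratings, weights):.1f} / 100")  # -> 41.7
```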
In Xin et al. [25], a collaborative board-game-playing task was explored and evaluated by examining the various interaction opportunities arising when humans and robots collaborate. Two contrasting robotic behavioral conditions were tested: a human-centric condition, where the robot defers to the human, and a robot-centric one, where suggestions coming from the operator are neglected. Statistical results from the final user-testing phase, based on a custom questionnaire, reinforce the centrality of the human-centric condition in increasing the O4.0’s sense of collaboration.
Moreover, Palmarini et al. [26] stressed that safety is deemed one of the most relevant aspects of human–robot collaborative systems and that context-awareness information is crucial to enhancing user perception. Analogously, Quintero et al. [22] proposed two separate approaches to drawing AR paths: a free-space and a surface-trajectory one. Such proposals could be effectively integrated to optimize robot programming phases with a UX sensibility, reducing programming time and allowing the worker to selectively inspect different robot trajectories and work on them in a user-friendly interface. For optimal collaboration, robot intention is another essential source of information within an HCD approach: the current literature shows a general indifference to the topic, although Liu et al. [27] described a temporal and-or graph (T-AOG) that allows the human to understand the robot’s internal decision-making process, supervise its action planner, or monitor its latent states (i.e., forces and moments exerted while interacting).
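To make the graph idea concrete, the toy sketch below shows the general AND/OR task decomposition that a T-AOG builds on: AND nodes list ordered sub-steps, while OR nodes list alternative ways to achieve a step. This is an illustration of the concept only, not the implementation of Liu et al. [27]; all task labels are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    """Illustrative AND/OR graph node for explaining a robot's task plan."""
    label: str
    kind: str = "leaf"              # "and" = ordered sub-steps, "or" = alternatives
    children: List["Node"] = field(default_factory=list)

def explain(node, depth=0):
    """Flatten the plan into indented lines, e.g., for an AR text overlay.

    The tag after a label shows how that node's children combine.
    """
    tag = {"and": "steps in sequence", "or": "pick one"}.get(node.kind, "")
    print("  " * depth + node.label + (f"  [{tag}]" if tag else ""))
    for child in node.children:
        explain(child, depth + 1)

# Hypothetical "open bottle" task decomposition
task = Node("open bottle", "and", [
    Node("grasp lid"),
    Node("unlock lid", "or", [Node("push and twist"), Node("pinch and twist")]),
    Node("twist off lid"),
])
explain(task)
```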
As emerged from the review findings, a UX-based approach to AR-supported collaborative tasks brings several benefits, both objective and subjective: systematic cognitive and physical relief for the operator, increased working efficiency, reduced operational time and sense of frustration when interacting with shop-floor interfaces, and an improved sense of safety and inclusiveness while collaborating with the robotic counterpart. These conclusions were mainly drawn from user-testing campaigns in which standard or customized questionnaires collected final testers’ impressions, subsequently reanalyzed by the papers’ authors.

4. What Are the Main Challenges in Designing AR-Supported Collaborative Solutions?

The literature review highlighted that the design, development, and use of AR technologies to improve HRI in industrial contexts is a hot topic from a technological point of view; however, there is a lack of models to deepen the UX, and only a limited number of papers have proposed the adoption of UX methods to support the design of AR applications in this field according to user-centered principles. As reported by recent market forecasts, the mixed reality market size (including both augmented and virtual reality technologies) is expected to grow by USD 125.19 billion during 2020–2024 [28], reaching up to USD 1.45 trillion by 2030 [29]. This rapid growth entails big challenges from both a technological and a human viewpoint. Some issues, only marginally considered so far, need to be investigated and faced, from privacy problems to safety requirements. This means that the design of AR-supported applications in the context of HRI will have to consider how to manage the collected robot and operator data and how to assure proper privacy and safety levels. Considering current applications [30], one can reflect on both the critical success factors and the challenges related to future robust industrialization. Compared to industrial software systems, current AR hardware readiness still seems far from mature adoption in industry. Thus, human-centered design methods are required to balance industrial system requirements with human needs and social concerns; in this sense, AR operates very close to human abilities, affecting and empowering them. Another challenge is the integration of AR devices within modern manufacturing systems: data exchange to and from the AR application should be compliant with robotics and automation standards to assure full adoption in industry. On this topic, only a few research attempts have been made (e.g., AutomationML [31]), and they are still far from including AR data.
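As an indication of what such standards-compliant data exchange could look like, the sketch below assembles a minimal CAEX-style fragment (the XML vocabulary underlying AutomationML) describing a robot pose for an AR client. It is a schematic illustration only: the element nesting is simplified, and the attribute names, units, and values are assumptions, not a validated AutomationML document.

```python
import xml.etree.ElementTree as ET

# Schematic, hypothetical AutomationML/CAEX-style fragment describing a
# robot flange pose for consumption by an AR client. Element names follow
# the CAEX vocabulary (CAEXFile, InstanceHierarchy, InternalElement,
# Attribute), but this toy fragment is not validated against the schema.
root = ET.Element("CAEXFile", FileName="cell.aml")
hierarchy = ET.SubElement(root, "InstanceHierarchy", Name="AssemblyCell")
robot = ET.SubElement(hierarchy, "InternalElement", Name="Robot1")
pose = ET.SubElement(robot, "Attribute", Name="FlangePose", Unit="mm/deg")
for axis, value in zip(("x", "y", "z", "rx", "ry", "rz"),
                       (412.0, -88.5, 630.2, 0.0, 90.0, 45.0)):
    attr = ET.SubElement(pose, "Attribute", Name=axis)
    ET.SubElement(attr, "Value").text = str(value)

print(ET.tostring(root, encoding="unicode"))
```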
Moreover, a proper UX evaluation framework for AR-supported collaborative tasks needs to be defined. A first attempt considered UX analysis in the design of HRI applications using a structured approach [32], but did not include AR tools. On this topic, the main challenge is to define a systematic and coherent way to interpret data coming from different equipment and to return AR digital contents to the O4.0 in an adaptive and intelligent way, considering the UX and further enhancing the human physical, sensorial, and cognitive capabilities by means of human cyber–physical system integration [33]. In this direction, a further challenge is promoting socially embedded human–robot collaboration, where human communications are used to adapt service robots to user needs: this consists of giving robots a notion of emotional tuning so that humans can empathetically communicate with machines [34].
Moreover, estimating the variables affecting trust in HRI is necessary to design new, effective AR interfaces providing situational awareness and spatial dialog, and to determine functional elements that improve human confidence in robots. This evaluation should be part of a comprehensive approach with validated metrics for an overall UX assessment [35]. Contextually, assessing the human cognitive and physical effort involved in performing collaborative tasks is of absolute relevance.
Finally, in the context of AR-supported human–robot collaborative operations, user testing needs a more statistically reliable base, including both academic and industrial studies and increasing the number and variety of people involved, to assess the effectiveness of AR in HRI tasks [9]. This mainly implies defining new UX-based methods to design AR interfaces from multiple users’ points of view, involving both novice and expert users, and benchmarking the most suitable wearable interfaces to be used together with industrial robots.

References

  1. UNI EN ISO 9241-210:2019. Ergonomics of Human-System Interaction—Part 210: Human-Centred Design for Interactive Systems; ISO: Geneva, Switzerland, 2019.
  2. UNI EN ISO 9241-11:2018. Ergonomics of Human-System Interaction—Part 11: Usability: Definitions and Concepts; ISO: Geneva, Switzerland, 2018.
  3. Romero, D.; Stahre, J.; Wuest, T.; Noran, O.; Bernus, P.; Fasth, A.; Gorecky, D. Towards an operator 4.0 typology: A human-centric perspective on the fourth industrial revolution technologies. In Proceedings of the International Conference on Computers & Industrial Engineering (CIE46), Tianjin, China, 29–31 October 2016; pp. 1–11, ISSN 2164-8670 CD-ROM, ISSN 2164-8689 ONLINE.
  4. Azuma, R.; Baillot, Y.; Behringer, R.; Feiner, S.; Julier, S.; MacIntyre, B. Recent advances in augmented reality. IEEE Comput. Graph. Appl. 2001, 21, 34–47.
  5. Peruzzini, M.; Grandi, F.; Cavallaro, S.; Pellicciari, M. Using virtual manufacturing to design human-centric factories: An industrial case. Int. J. Adv. Manuf. Technol. 2020, 115, 873–887.
  6. Milgram, P.; Kishino, F. A Taxonomy of Mixed Reality Visual Displays. IEICE Trans. Inf. Syst. 1994, E77-D, 1321–1329.
  7. Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.-K. AR-based interaction for human-robot collaborative manufacturing. Robot. Comput. Manuf. 2020, 63, 101891.
  8. Papanastasiou, S.; Kousi, N.; Karagiannis, P.; Gkournelos, C.; Papavasileiou, A.; Dimoulas, K.; Baris, K.; Koukas, S.; Michalos, G.; Makris, S. Towards seamless human robot collaboration: Integrating multimodal interaction. Int. J. Adv. Manuf. Technol. 2019, 105, 3881–3897.
  9. De Pace, F.; Manuri, F.; Sanna, A.; Fornaro, C. A systematic review of Augmented Reality interfaces for collaborative industrial robots. Comput. Ind. Eng. 2020, 149, 106806.
  10. Scafà, M.; Serrani, E.B.; Papetti, A.; Brunzini, A.; Germani, M. Assessment of Students’ Cognitive Conditions in Medical Simulation Training: A Review Study. In Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2017; pp. 224–233.
  11. Brooke, J. SUS: A ’Quick and Dirty’ Usability Scale. Usability Eval. Ind. 1996, 189, 4–7.
  12. AttrakDiff. Available online: http://attrakdiff.de/index-en.html (accessed on 5 July 2021).
  13. Huy, D.Q.; Vietcheslav, I.; Lee, G.S.G. See-through and spatial augmented reality—A novel framework for human-robot interaction. In Proceedings of the 2017 3rd International Conference on Control, Automation and Robotics (ICCAR), Nagoya, Japan, 24–26 April 2017; pp. 719–726.
  14. Materna, Z.; Kapinus, M.; Beran, V.; Smrz, P.; Zemcik, P. Interactive Spatial Augmented Reality in Collaborative Robot Programming: User Experience Evaluation. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 80–87.
  15. Aschenbrenner, D.; Li, M.; Dukalski, R.; Verlinden, J.; Lukosch, S. Collaborative Production Line Planning with Augmented Fabrication. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Tuebingen/Reutlingen, Germany, 18–22 March 2018; pp. 509–510.
  16. De Tommaso, D.; Calinon, S.; Caldwell, D.G. A Tangible Interface for Transferring Skills. Int. J. Soc. Robot. 2012, 4, 397–408.
  17. Fuste, A.; Reynolds, B.; Hobin, J.; Heun, V.; Ptc, B.A.F. Kinetic AR: A Framework for Robotic Motion Systems in Spatial Computing. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–8.
  18. Bazzano, F.; Gentilini, F.; Lamberti, F.; Sanna, A.; Paravati, G.; Gatteschi, V.; Gaspardone, M. Immersive Virtual Reality-Based Simulation to Support the Design of Natural Human-Robot Interfaces for Service Robotic Applications. Lect. Notes Comput. Sci. 2016, 9768, 33–51.
  19. Hone, K.S.; Graham, R. Towards a tool for the Subjective Assessment of Speech System Interfaces (SASSI). Nat. Lang. Eng. 2000, 6, 287–303.
  20. Frank, J.A.; Moorhead, M.; Kapila, V. Realizing mixed-reality environments with tablets for intuitive human-robot collaboration for object manipulation tasks. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 302–307.
  21. Chan, W.P.; Hanks, G.; Sakr, M.; Zuo, T.; Van der Loos, H.M.; Croft, E. An Augmented Reality Human-Robot Physical Collaboration Interface Design for Shared, Large-Scale, Labour-Intensive Manufacturing Tasks. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020; pp. 11308–11313.
  22. Quintero, C.P.; Li, S.; Pan, M.K.; Chan, W.P.; Van der Loos, H.M.; Croft, E. Robot Programming Through Augmented Trajectories in Augmented Reality. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1838–1844.
  23. Kyjanek, O.; Al Bahar, B.; Vasey, L.; Wannemacher, B.; Menges, A. Implementation of an Augmented Reality AR Workflow for Human Robot Collaboration in Timber Prefabrication. In Proceedings of the 36th International Symposium on Automation and Robotics in Construction (ISARC), Banff, AB, Canada, 21–24 May 2019; pp. 1223–1230.
  24. Diehl, M.; Plopski, A.; Kato, H.; Ramirez-Amaro, K. Augmented Reality interface to verify Robot Learning. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 31 August–4 September 2020; pp. 378–383.
  25. Xin, M.; Sharlin, E. Sheep and wolves: Test bed for human-robot interaction. In Proceedings of the CHI’ 06 Extended Abstracts on Human Factors in Computing Systems, Montréal, QC, Canada, 22–27 April 2006; pp. 1553–1558.
  26. Palmarini, R.; Erkoyuncu, J.A.; Roy, R.; Torabmostaedi, H. A systematic review of augmented reality applications in maintenance. Robot. Comput. Manuf. 2018, 49, 215–228.
  27. Liu, H.; Zhang, Y.; Si, W.; Xie, X.; Zhu, Y.; Zhu, S.-C. Interactive Robot Knowledge Patching Using Augmented Reality. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 13 September 2018; pp. 1947–1954.
  28. Bloomberg. Available online: https://www.bloomberg.com/press-releases/2021-02-02/-125-billion-growth-in-global-augmented-reality-ar-and-virtual-reality-vr-market-2020-2024-apac-to-emerge-as-major-market (accessed on 2 July 2021).
  29. Why We Believe VR/AR will Boost Global GDP by $1.5 Trillion. Available online: https://www.pwc.co.uk/services/economics/insights/vr-ar-to-boost-global-gdp.html (accessed on 2 July 2021).
  30. Masood, T.; Egger, J. Augmented reality in support of Industry 4.0—Implementation challenges and success factors. Robot. Comput. Manuf. 2019, 58, 181–195.
  31. Gonçalves, E.M.N.; Freitas, A.; Botelho, S. An AutomationML Based Ontology for Sensor Fusion in Industrial Plants. Sensors 2019, 19, 1311.
  32. Prati, E.; Peruzzini, M.; Pellicciari, M.; Raffaeli, R. How to include User eXperience in the design of Human-Robot Interaction. Robot. Comput. Manuf. 2021, 68, 102072.
  33. Romero, D.; Bernus, P.; Noran, O.; Stahre, J.; Berglund, Å.F. The operator 4.0: Human cyber-physical systems & adaptive automation towards human-automation symbiosis work systems. In IFIP Advances in Information and Communication Technology; Springer: New York, NY, USA, 2016; Volume 488, pp. 677–686.
  34. Woo, J.; Ohyama, Y.; Kubota, N. Robot Partner Development Platform for Human-Robot Interaction Based on a User-Centered Design Approach. Appl. Sci. 2020, 10, 7992.
  35. Palmarini, R.; del Amo, I.F.; Bertolino, G.; Dini, G.; Erkoyuncu, J.A.; Roy, R.; Farnsworth, M. Designing an AR interface to improve trust in Human-Robots collaboration. Procedia CIRP 2018, 70, 350–355.