Human Decision Making in Human–Robot Collaboration

The advent of Industry 4.0 has heralded advancements in human–robot collaboration (HRC), necessitating a deeper understanding of the factors that influence human decision making in this domain. An HRC system combines human soft skills such as decision making, intelligence, problem-solving, adaptability, and flexibility with robots' precision, repeatability, and ability to work in dangerous environments.

  • human decision making
  • human–robot collaboration
  • human–robot interaction
  • human factors
  • Industry 4.0

1. Introduction

Collaborative robots (cobots) are socio-technical systems designed to improve productivity, flexibility, and ergonomics, and to support customised rather than mass production. Since the fourth industrial revolution (Industry 4.0), concerns regarding human workers' role in the production environment have grown [1], making human–robot collaboration (HRC) an emerging area of robotics and cobotics research in recent years. A prominent theme in discussions of the next industrial revolution (Industry 5.0) is human–robot co-working [2], which emphasises bringing human workers back into the production loop [3]. An HRC system combines human soft skills such as decision making, intelligence, problem-solving, adaptability, and flexibility with robots' precision, repeatability, and ability to work in dangerous environments [4].
For this reason, cobots are adopted to work and interact safely with humans on shared tasks in a shared workspace, simultaneously [5,6,7]. Cobots have enormous potential for wider use across many industries. To introduce industrial cobots clearly, the authors of [8] proposed a framework that categorises the interaction between humans and robots into four types (Figure 1). The first type is full automation with conventional industrial robots; the remaining three are distinguished by the degree of interaction between humans and cobots: coexistence, cooperation, and collaboration. In the coexistence scenario, humans and cobots work sequentially in separate workspaces. In the cooperation scenario, humans and cobots work in a shared space on linked tasks. In the collaboration scenario, the highest level of interaction, humans and cobots work simultaneously in a shared space on shared tasks.
Figure 1. Types of human–robot relationships, from full automation to coexistence, cooperation, and collaboration [9]. Adapted from [8] and ISO 10218-1 [10].
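For readers who find a structured restatement helpful, the following minimal Python sketch encodes the four interaction types and the distinctions drawn above (shared workspace, linked tasks, shared tasks). It is purely illustrative and is not an implementation from [8] or ISO 10218-1.

```python
from dataclasses import dataclass
from enum import Enum, auto


class InteractionType(Enum):
    """Four human-robot relationship types, following the framework in [8]."""
    FULL_AUTOMATION = auto()  # conventional industrial robot, no interaction with humans
    COEXISTENCE = auto()      # humans and cobots work sequentially in separate workspaces
    COOPERATION = auto()      # shared workspace, linked (but distinct) tasks
    COLLABORATION = auto()    # shared workspace, shared tasks, performed simultaneously


@dataclass(frozen=True)
class InteractionProfile:
    shared_workspace: bool
    linked_tasks: bool
    shared_tasks: bool


# Illustrative mapping of each type to the properties described in the text.
PROFILES = {
    InteractionType.FULL_AUTOMATION: InteractionProfile(False, False, False),
    InteractionType.COEXISTENCE:     InteractionProfile(False, False, False),
    InteractionType.COOPERATION:     InteractionProfile(True,  True,  False),
    InteractionType.COLLABORATION:   InteractionProfile(True,  True,  True),
}
```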

2. Human Decision Making

Since humans and robots work together as a team in HRC, both can hold decision-making authority. Robots make decisions using algorithms based on models such as Markov decision processes (MDPs) [11], partially observable Markov decision processes (POMDPs) [12], the Bayesian Decision Model (BDM) [13], Adaptive Bayesian Policy Selection (ABPS) [14], and others; the performance of robot decision making therefore depends on the underlying algorithms. With the emergence of technologies such as Artificial Intelligence (AI) and Machine Learning (ML), robots can make decisions autonomously and behave more intelligently in specific situations [15]. Unlike robots, humans make decisions based on normative inference, influenced by their previous experiences, unconscious drives, and emotions [16]. Robots are good at making decisions in stable and predictable situations, whereas human decision making is essential for handling complex and dynamic situations [17].
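To make the model-based side of robot decision making concrete, the sketch below runs value iteration on a hypothetical two-state, two-action MDP. The states, rewards, transition probabilities, and discount factor are invented for illustration and are not taken from any of the cited studies.

```python
import numpy as np

# Hypothetical 2-state, 2-action MDP.
# P[a][s][s'] is the probability of moving from state s to s' under action a.
P = np.array([
    [[0.9, 0.1], [0.2, 0.8]],   # action 0
    [[0.5, 0.5], [0.1, 0.9]],   # action 1
])
# R[s][a] is the immediate reward for taking action a in state s.
R = np.array([
    [1.0, 0.0],
    [0.0, 2.0],
])
gamma = 0.95   # discount factor


def value_iteration(P, R, gamma, tol=1e-6):
    """Return optimal state values and a greedy policy via the Bellman optimality update."""
    n_actions, n_states, _ = P.shape
    V = np.zeros(n_states)
    while True:
        # Q[s, a] = R[s, a] + gamma * sum_s' P[a, s, s'] * V[s']
        Q = R + gamma * np.einsum("ast,t->sa", P, V)
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new


values, policy = value_iteration(P, R, gamma)
print("state values:", values, "greedy policy:", policy)
```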
One example of a dynamic situation in which humans work with robots is robot-assisted surgery. In this scenario, a surgeon makes decisions based on their knowledge, experience, and the current situation in the operating theatre. Other factors, such as communication within the surgical team, situation awareness, and workload, also shape the surgeon's decision making [18]. For instance, during a procedure, a surgeon must choose the right instrument for the specific needs of the task and the current conditions, evaluate whether each step has been completed satisfactorily, keep an alternative plan ready if necessary, and continually consider the most appropriate next action. The quality of the surgeon's decisions regarding patient care, incision placement, and procedure steps directly affects the overall success of the surgery. Another dynamic example is a search and rescue task, in which humans make decisions based on data collected by a robot, and effective information helps them make the decisions necessary for the search. In an Urban Search and Rescue (USAR) task [19], robots continuously provide data to human operators, who analyse the data, update the search strategy, and reassign tasks to ensure efficiency and safety.
Collaboration between humans and robots therefore requires human operators to apply their expertise to make situationally appropriate decisions. This decision-making process aligns with Naturalistic Decision Making (NDM) theory, proposed by Klein et al. in 1993 [20]. The theory describes decision making in environments that are both significant and familiar to humans, portraying them as expert decision makers with domain-specific knowledge and experience. Klein et al. further introduced the Recognition-Primed Decision (RPD) model as a framework for understanding how effective decisions are made [20,21].
This model starts with an assessment of whether the situation is familiar. If it is not, the individual seeks more information and re-assesses the situation. If the situation is familiar, the model predicts that the individual will have expectancies about what is normal for that situation. If these expectancies are not violated, the individual engages in a mental simulation of the action, essentially predicting the outcome of an action without actually performing it. If the mental simulation suggests that the action will succeed, the individual implements the action. If not, they modify the plan and re-evaluate its potential success through another mental simulation. This process repeats until a workable plan is formulated.
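The loop described above can be summarised schematically in code. In the sketch below, all helper callables (`is_familiar`, `gather_information`, `expectancies_hold`, `generate_typical_action`, `mentally_simulate`, `modify_plan`) are hypothetical placeholders for the cognitive steps named in the RPD model, not functions from any cited implementation.

```python
def rpd_decide(situation, is_familiar, gather_information, expectancies_hold,
               generate_typical_action, mentally_simulate, modify_plan):
    """Schematic Recognition-Primed Decision (RPD) loop after Klein et al. [20,21]."""
    while True:
        # 1. Recognise the situation; if unfamiliar, seek more information and reassess.
        if not is_familiar(situation):
            situation = gather_information(situation)
            continue
        # 2. A familiar situation brings expectancies about what is normal;
        #    if they are violated, gather more information and reassess.
        if not expectancies_hold(situation):
            situation = gather_information(situation)
            continue
        # 3. Mentally simulate the typical action; modify the plan until it is workable.
        action = generate_typical_action(situation)
        while not mentally_simulate(situation, action):
            action = modify_plan(situation, action)
        # 4. Implement the workable course of action.
        return action
```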
The RPD model, along with NDM theory, has been applied to various real-world domains, including Unmanned Air Vehicle (UAV) operations by Yesilbas and Cotter in 2019 [22] and human–agent collaboration by Fan et al. in 2005 [23], demonstrating its broad applicability.

3. Cognitive Workload and Human Decision Making

In human decision making, cognitive workload plays a critical role, especially in environments where humans interact with complex systems or technology, such as robotics [24]. This factor is frequently examined alongside human decision making because of its profound impact on performance and outcomes [25]. Effective decision making is a complex cognitive process that requires the optimal distribution of an individual's attention and mental capacity, and the quality of decisions depends heavily on the ability to analyse information, evaluate possible outcomes, and choose the best course of action [26,27]. Cognitive overload, however, can significantly impede this process. When operators face an excess of information or task demands that exceed their cognitive resources, they are likely to experience mental fatigue [28,29], which can lead to reliance on simplifying strategies known as heuristics. While heuristics can be useful for quick judgments, they often ignore much of the available data and the nuance required for high-quality decision making [30,31].
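To make the overload-to-heuristics argument concrete, the toy simulation below (with invented numbers and thresholds, purely for illustration) contrasts exhaustive evaluation of options with a satisficing heuristic that kicks in once the number of options exceeds a nominal cognitive budget.

```python
import random

random.seed(0)


def exhaustive_choice(options, evaluate):
    """Score every option and pick the best (high cognitive cost)."""
    return max(options, key=evaluate)


def satisficing_choice(options, evaluate, threshold):
    """Heuristic: accept the first option whose score clears a threshold."""
    for option in options:
        if evaluate(option) >= threshold:
            return option
    return options[0]   # fall back to the first option if none is acceptable


# Invented scenario: each option has a true utility in [0, 1].
options = [random.random() for _ in range(20)]
evaluate = lambda utility: utility   # direct lookup in this toy example
cognitive_budget = 8                 # max evaluations the operator can afford

if len(options) <= cognitive_budget:
    choice = exhaustive_choice(options, evaluate)
else:
    # Overloaded: only a subset can be examined, so a heuristic is used.
    choice = satisficing_choice(options[:cognitive_budget], evaluate, threshold=0.7)

print(f"chosen utility: {choice:.2f}, best available: {max(options):.2f}")
```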
For example, tools designed to minimise cognitive effort in pattern recognition can significantly improve group decision outcomes by enabling better resource-allocation decisions in dispersed groups [32]. Similarly, the frontal network in the brain responds to uncertainty in decision-making tasks by modulating cognitive resource allocation, underscoring the importance of cognitive control in navigating uncertain situations [33]. In complex operations, cognitive readiness, which includes situation awareness, problem-solving, and decision making, is essential for effective resource allocation. In addition, supporting long-term anticipation in decision making can significantly improve performance in complex environments, and cognitive support tools can enhance the anticipation of future outcomes [34].

4. Factors Related to Human Decision Making

Within the selected studies, 24 factors that influence human decision making during tasks were identified. These factors are categorised into four groups: human factors, robot factors, communication factors, and environmental factors (Figure 2, Table 1).
Figure 2. Factors related to human decision making in HRC.
Table 1. Factors related to human decision making.

4.1. Human Factors

In the realm of human factors impacting decision making in HRC, cognitive workload receives the most attention, as evidenced by its focus in 18 studies. Trust, operator ability, human characteristics, and acceptance are also prominent, having been extensively studied across multiple research works. Additionally, physical workload, stress levels, and emotional responses such as satisfaction or frustration are deemed critical. These factors, together with the perception of the situation and environment, play significant roles in influencing human decision making during collaborative tasks with robots.

4.2. Robot Factors

Robot factors that influence human decision making in collaborative tasks encompass a spectrum of the robot’s physical characteristics and actions. Key aspects such as the force exerted by the robot, its speed, and the distance maintained from human operators are crucial, directly impacting task performance and safety protocols. Additionally, the frequency and nature of robot errors, as well as the trajectory and movement patterns, are vital considerations. The role of the robot, whether as a leader or a follower, alongside the degree of automation implemented, plays a significant part in shaping the human–robot interaction dynamic.

4.3. Communication Factors

Communication factors include user interface design, control modality, feedback mechanisms, usability, human intent prediction, and mutual awareness. These elements facilitate interaction and are foundational for intuitive operation and effective teamwork between humans and robots, as evidenced by numerous studies. The user interface is particularly emphasized, as it directly impacts the efficiency and satisfaction of the operator. Feedback and mutual awareness are also integral, ensuring that both humans and robots can respond adaptively to each other’s actions and intentions.

4.4. Environmental Factors

Environmental factors include the dynamics of the situation, task complexity, workspace design, and physical safety. Among the selected studies, task complexity received the most emphasis. The design of the workspace was also deemed crucial, while the dynamics of the situation and physical safety were likewise noted as important considerations.
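The four categories above can be restated compactly in code. The mapping below simply lists representative factors named in Sections 4.1–4.4 and is not an exhaustive reproduction of Table 1.

```python
# Representative factors from Sections 4.1-4.4 (not the full 24-factor table).
DECISION_MAKING_FACTORS = {
    "human": [
        "cognitive workload", "trust", "operator ability", "human characteristics",
        "acceptance", "physical workload", "stress", "emotional response",
        "perception of the situation and environment",
    ],
    "robot": [
        "exerted force", "speed", "distance from operator", "error frequency",
        "trajectory and movement patterns", "robot role (leader/follower)",
        "degree of automation",
    ],
    "communication": [
        "user interface design", "control modality", "feedback mechanisms",
        "usability", "human intent prediction", "mutual awareness",
    ],
    "environmental": [
        "situation dynamics", "task complexity", "workspace design",
        "physical safety",
    ],
}

# Example query: which categories mention workload-related factors?
workload_related = [
    category for category, factors in DECISION_MAKING_FACTORS.items()
    if any("workload" in factor for factor in factors)
]
print(workload_related)   # ['human']
```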

5. User Interface

Figure 3 shows that, among all selected studies, the Graphical User Interface (GUI) [37,38,39,40,42,43,46,48,49,51,52,56,57,58,76] was the most utilised type of user interface, implemented in 15 studies. Other interfaces include physical interaction [37,38,39,42,52,57,75] and touch interfaces [46,52,54,55], which facilitate direct engagement with robots. Additional methods such as gesture recognition [38,40,42,44,49,55], voice control [44,49,50,53,55], and eye gaze tracking [55,70] were employed, enhancing the versatility of interactions. Haptic feedback [62] and extended reality (XR) interfaces [40,44,46,48,49,61,69] were noted for their immersive and tactile capabilities. Furthermore, EMG-based interfaces [40,43] and Brain–Computer Interfaces (BCIs) [40,53,66] have been adopted, indicating progress in how users command and collaborate with robots. Beyond the commonly used GUI, XR-based interfaces have garnered more attention than other emerging technologies.
Figure 3. Types of user interfaces.

6. Technologies Related to Human Decision Making

Numerous studies have highlighted a variety of advanced technologies such as XR, gesture control, voice control, and AI-based perception, which significantly influence human decision making in collaborative environments.
Several studies described multimodal interfaces that combine different sensory inputs to improve interaction, thereby enhancing human decision making and collaboration. For example, gesture recognition [37,42,44,49,55], speech recognition [44,49,55], cognitive signals such as EEG [53], ECG [54], and fNIRS [51], and force sensing [53] were used in some studies.
XR-based interfaces, including AR [44,49,61,69], VR [46], and MR [48], were discussed in several studies. These technologies can help improve human decision making in HRC by augmenting visual and other forms of perception.
Other technologies, such as Brain–Computer Interfaces (BCIs) [66], eye gaze tracking [55,70], and AI-based perception [37,42], were also mentioned. Such technologies have the potential to reduce cognitive workload and better accommodate users' preferences and needs, thereby improving their decision making.
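As a sketch of how a multimodal interface might combine the channels listed above, the example below fuses hypothetical gesture, voice, and eye-gaze inputs into a single robot command, gated by an estimated workload index. The channel names, confidence scores, thresholds, and fusion rule are assumptions made for illustration; they do not reflect the API of any cited system.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ModalityInput:
    channel: str        # e.g. "gesture", "voice", "eye_gaze"
    command: str        # interpreted command, e.g. "stop", "handover"
    confidence: float   # recogniser confidence in [0, 1]


def fuse_commands(inputs: List[ModalityInput],
                  workload_index: float,
                  min_confidence: float = 0.6) -> Optional[str]:
    """Pick the most confident command; demand cross-channel agreement under high workload.

    `workload_index` in [0, 1] stands in for a cognitive-load estimate
    (e.g. one derived from EEG or fNIRS signals); the fusion rule itself is invented.
    """
    candidates = [i for i in inputs if i.confidence >= min_confidence]
    if not candidates:
        return None  # no channel is confident enough; ask the user to repeat
    if workload_index > 0.7:
        # Under high estimated workload, require at least two channels to agree.
        commands = [c.command for c in candidates]
        candidates = [c for c in candidates if commands.count(c.command) >= 2]
        if not candidates:
            return None
    return max(candidates, key=lambda c: c.confidence).command


inputs = [ModalityInput("gesture", "stop", 0.8),
          ModalityInput("voice", "stop", 0.7),
          ModalityInput("eye_gaze", "handover", 0.5)]
print(fuse_commands(inputs, workload_index=0.8))  # -> stop
```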

References

  1. Baratta, A.; Cimino, A.; Gnoni, M.G.; Longo, F. Human Robot Collaboration in Industry 4.0: A Literature Review. Procedia Comput. Sci. 2023, 217, 1887–1895.
  2. Demir, K.A.; Halil, C. The next Industrial Revolution: Industry 5.0 and Discussions on Industry 4.0, Industry 4.0 from the Management Information Systems Perspectives. In Industry 4.0 from the MIS Perspective; Peter Lang GmbH: Frankfurt, Germany, 2018.
  3. Raja Santhi, A.; Muthuswamy, P. Industry 5.0 or Industry 4.0S? Introduction to Industry 4.0 and a Peek into the Prospective Industry 5.0 Technologies. Int. J. Interact. Des. Manuf. 2023, 17, 947–979.
  4. Gervasi, R.; Mastrogiacomo, L.; Franceschini, F. A Conceptual Framework to Evaluate Human-Robot Collaboration. Int. J. Adv. Manuf. Technol. 2020, 108, 841–865.
  5. Colgate, J.E.; Wannasuphoprasit, W.; Peshkin, M.A. Cobots: Robots for Collaboration with Human Operators; Kwon, Y.W., Davis, D., Chung, H.H., Eds.; ASME: New York, NY, USA, 1996; Volume 58.
  6. Akella, P.; Peshkin, M.; Colgate, E.; Wannasuphoprasit, W.; Nagesh, N.; Wells, J.; Holland, S.; Pearson, T.; Peacock, B. Cobots for the Automobile Assembly Line. In Proceedings of the 1999 IEEE International Conference on Robotics and Automation (Cat. No.99CH36288C), Detroit, MI, USA, 10–15 May 1999; Volume 1, pp. 728–733.
  7. Guertler, M.; Tomidei, L.; Sick, N.; Carmichael, M.; Paul, G.; Wambsganss, A.; Moreno, V.H.; Hussain, S. When Is a Robot a Cobot? Moving beyond Manufacturing and Arm-Based Cobot Manipulators. Proc. Des. Soc. 2023, 3, 3889–3898.
  8. Kopp, T.; Baumgartner, M.; Kinkel, S. Success Factors for Introducing Industrial Human-Robot Interaction in Practice: An Empirically Driven Framework. Int. J. Adv. Manuf. Technol. 2021, 112, 685–704.
  9. Burden, A.G.; Caldwell, G.A.; Guertler, M.R. Towards Human–Robot Collaboration in Construction: Current Cobot Trends and Forecasts. Constr. Robot. 2022, 6, 209–220.
  10. ISO 10218-1:2011. Available online: https://www.iso.org/standard/51330.html (accessed on 29 December 2023).
  11. Puterman, M.L. Chapter 8 Markov Decision Processes. In Handbooks in Operations Research and Management Science; Elsevier: Amsterdam, The Netherlands, 1990; Volume 2, pp. 331–434.
  12. Kaelbling, L.P.; Littman, M.L.; Cassandra, A.R. Planning and Acting in Partially Observable Stochastic Domains. Artif. Intell. 1998, 101, 99–134.
  13. Ma, W.J. Bayesian Decision Models: A Primer. Neuron 2019, 104, 164–175.
  14. Can Görür, O.; Rosman, B.; Sivrikaya, F.; Albayrak, S. FABRIC: A Framework for the Design and Evaluation of Collaborative Robots with Extended Human Adaptation. ACM Trans. Hum.-Robot. Interact. 2023, 12, 38.
  15. Soori, M.; Arezoo, B.; Dastres, R. Artificial Intelligence, Machine Learning and Deep Learning in Advanced Robotics, a Review. Cogn. Robot. 2023, 3, 54–70.
  16. Jonassen, D.H. Designing for Decision Making. Educ. Technol. Res. Dev. 2012, 60, 341–359.
  17. Baltrusch, S.J.; Krause, F.; de Vries, A.W.; van Dijk, W.; de Looze, M.P. What about the Human in Human Robot Collaboration?: A Literature Review on HRC’s Effects on Aspects of Job Quality. Ergonomics 2022, 65, 719–740.
  18. Randell, R.; Honey, S.; Alvarado, N.; Pearman, A.; Greenhalgh, J.; Long, A.; Gardner, P.; Gill, A.; Jayne, D.; Dowding, D. Embedding Robotic Surgery into Routine Practice and Impacts on Communication and Decision Making: A Review of the Experience of Surgical Teams. Cogn. Technol. Work 2016, 18, 423–437.
  19. Liu, Y.; Nejat, G. Multirobot Cooperative Learning for Semiautonomous Control in Urban Search and Rescue Applications. J. Field Robot. 2016, 33, 512–536.
  20. Klein, G.A.; Orasanu, J.; Caldenwood, R. Decision Making in Action: Models and Methods; Praeger: Westport, CT, USA, 1993; ISBN 9780893919436.
  21. Klein, G. Naturalistic Decision Making. Hum. Factors 2008, 50, 456–460.
  22. Yesilbas, V.; Cotter, T.S. Application of Naturalistic Decision Making to the Domain of Unmanned Air Vehicles Operations. In Proceedings of the International Annual Conference of the American Society for Engineering Management, Philadelphia, PA, USA, 23–26 October 2019; American Society for Engineering Management (ASEM): Huntsville, AL, USA, 2019; pp. 1–7.
  23. Fan, X.; Sun, S.; McNeese, M.; Yen, J. Extending the Recognition-Primed Decision Model to Support Human-Agent Collaboration. In Proceedings of the Fourth International Joint Conference on Autonomous Agents and Multiagent Systems, Utrecht, The Netherlands, 25–29 July 2005; Association for Computing Machinery: New York, NY, USA; pp. 945–952.
  24. Gonzalez, C. Task Workload and Cognitive Abilities in Dynamic Decision Making. Hum. Factors 2005, 47, 92–101.
  25. Simone, V.D.; Pasquale, V.D.; Giubileo, V.; Miranda, S. Human-Robot Collaboration: An Analysis of Worker’s Performance. Procedia Comput. Sci. 2022, 200, 1540–1549.
  26. Ewusi-Mensah, K. Evaluating Information Systems Projects: A Perspective on Cost-Benefit Analysis. Inf. Syst. 1989, 14, 205–217.
  27. Kumar, A.; Maskara, S. Coping up with the Information Overload in the Medical Profession. J. Biosci. Med. 2015, 3, 124–127.
  28. Amadori, P.V.; Fischer, T.; Wang, R.; Demiris, Y. Predicting Secondary Task Performance: A Directly Actionable Metric for Cognitive Overload Detection. IEEE Trans. Cogn. Dev. Syst. 2022, 14, 1474–1485.
  29. Edgcumbe, D.R. Transcranial Direct Current Stimulation and Decision-Making: The Neuromodulation of Cognitive Reflection. Ph.D. Thesis, University of East London, London, UK, 2018.
  30. Gigerenzer, G.; Brighton, H. Homo Heuristicus: Why Biased Minds Make Better Inferences. Top. Cogn. Sci. 2009, 1, 107–143.
  31. Parpart, P.; Jones, M.; Love, B.C. Heuristics as Bayesian Inference under Extreme Priors. Cogn. Psychol. 2018, 102, 127–144.
  32. Hayne, S.C.; Smith, C.A.P.; Turk, D. The Effectiveness of Groups Recognizing Patterns. Int. J. Hum. Comput. Stud. 2003, 59, 523–543.
  33. Satterthwaite, T.D.; Green, L.; Myerson, J.; Parker, J.; Ramaratnam, M.; Buckner, R.L. Dissociable but Inter-Related Systems of Cognitive Control and Reward during Decision Making: Evidence from Pupillometry and Event-Related FMRI. Neuroimage 2007, 37, 1017–1031.
  34. Lafond, D.; DuCharme, M.B.; Gagnon, J.-F.; Tremblay, S. Support Requirements for Cognitive Readiness in Complex Operations. J. Cogn. Eng. Decis. Mak. 2012, 6, 393–426.
  35. Hopko, S.; Wang, J.; Mehta, R. Human Factors Considerations and Metrics in Shared Space Human-Robot Collaboration: A Systematic Review. Front. Robot. AI 2022, 9, 799522.
  36. Sadrfaridpour, B.; Wang, Y. Collaborative Assembly in Hybrid Manufacturing Cells: An Integrated Framework for Human–Robot Interaction. IEEE Trans. Autom. Sci. Eng. 2018, 15, 1178–1192.
  37. Gualtieri, L.; Fraboni, F.; De Marchi, M.; Rauch, E. Development and Evaluation of Design Guidelines for Cognitive Ergonomics in Human-Robot Collaborative Assembly Systems. Appl. Ergon. 2022, 104, 103807.
  38. Gualtieri, L.; Fraboni, F.; De Marchi, M.; Rauch, E. Evaluation of Variables of Cognitive Ergonomics in Industrial Human-Robot Collaborative Assembly Systems. In Proceedings of the 21st Congress of the International Ergonomics Association (IEA 2021) Virtual, 13–18 June 2021; Volume 223.
  39. Panchetti, T.; Pietrantoni, L.; Puzzo, G.; Gualtieri, L.; Fraboni, F. Assessing the Relationship between Cognitive Workload, Workstation Design, User Acceptance and Trust in Collaborative Robots. Appl. Sci. 2023, 13, 1720.
  40. Wang, L.; Gao, R.; Váncza, J.; Krüger, J.; Wang, X.V.; Makris, S.; Chryssolouris, G. Symbiotic Human-Robot Collaborative Assembly. CIRP Ann. 2019, 68, 701–726.
  41. Amanhoud, W.; Hernandez Sanchez, J.; Bouri, M.; Billard, A. Contact-Initiated Shared Control Strategies for Four-Arm Supernumerary Manipulation with Foot Interfaces. Int. J. Rob. Res. 2021, 40, 986–1014.
  42. Fraboni, F.; Gualtieri, L.; Millo, F.; De Marchi, M.; Pietrantoni, L.; Rauch, E. Human-Robot Collaboration During Assembly Tasks: The Cognitive Effects of Collaborative Assembly Workstation Features. In Proceedings of the 21st Congress of the International Ergonomics Association (IEA 2021), Virtual, 13–18 June 2021; Volume 223.
  43. Ajoudani, A.; Zanchettin, A.M.; Ivaldi, S.; Albu-Schäffer, A.; Kosuge, K.; Khatib, O. Progress and Prospects of the Human–Robot Collaboration. Auton. Robots 2018, 42, 957–975.
  44. Villani, V.; Pini, F.; Leali, F.; Secchi, C. Survey on Human–Robot Collaboration in Industrial Settings: Safety, Intuitive Interfaces and Applications. Mechatronics 2018, 55, 248–266.
  45. Vicentini, F. Collaborative Robotics: A Survey. J. Mech. Des. 2020, 143, 1–29.
  46. Szafir, D.; Szafir, D.A. Connecting Human-Robot Interaction and Data Visualization; Association for Computing Machinery: New York, NY, USA, 2021; pp. 281–292.
  47. Lemasurier, G.; Bejerano, G.; Albanese, V.; Parrillo, J.; Yanco, H.A.; Amerson, N.; Hetrick, R.; Phillips, E. Methods for Expressing Robot Intent for Human–Robot Collaboration in Shared Workspaces. J. Hum.-Robot Interact. 2021, 10, 40.
  48. Kalatzis, A.; Rahman, S.; Prabhu, V.G.; Stanley, L.; Wittie, M. A Multimodal Approach to Investigate the Role of Cognitive Workload and User Interfaces in Human-Robot Collaboration; Association for Computing Machinery: New York, NY, USA, 2023; pp. 5–14.
  49. Ruiz Garcia, M.A.; Rojas, R.; Gualtieri, L.; Rauch, E.; Matt, D. A Human-in-the-Loop Cyber-Physical System for Collaborative Assembly in Smart Manufacturing; Butala, P., Govekar, E., Vrabic, R., Eds.; Elsevier B.V.: Amsterdam, The Netherlands, 2019; Volume 81.
  50. Lambrecht, J.; Nimpsch, S. Human Prediction for the Natural Instruction of Handovers in Human Robot Collaboration. In Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India, 14–18 October 2019.
  51. Howell-Munson, A.; Doherty, E.; Gavriel, P.; Nicolas, C.; Norton, A.; Neamtu, R.; Yanco, H.; Wu, Y.-N.; Solovey, E.T. Towards Brain Metrics for Improving Multi-Agent Adaptive Human-Robot Collaboration: A Preliminary Study; Association for Computing Machinery: New York, NY, USA, 2022; p. 11.
  52. Rossato, C.; Pluchino, P.; Cellini, N.; Jacucci, G.; Spagnolli, A.; Gamberini, L. Facing with Collaborative Robots: The Subjective Experience in Senior and Younger Workers. Cyberpsychol. Behav. Soc. Netw. 2021, 24, 349–356.
  53. Rabby, K.M.; Khan, M.; Karimoddini, A.; Jiang, S.X. An Effective Model for Human Cognitive Performance within a Human-Robot Collaboration Framework. In Proceedings of the 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019.
  54. Lagomarsino, M.; Lorenzini, M.; De Momi, E.; Ajoudani, A. Robot Trajectory Adaptation to Optimise the Trade-off between Human Cognitive Ergonomics and Workplace Productivity in Collaborative Tasks. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022.
  55. Saren, S.; Mukhopadhyay, A.; Ghose, D.; Biswas, P. Comparing Alternative Modalities in the Context of Multimodal Human–Robot Interaction. J. Multimodal User Interfaces 2023, 18, 69–85.
  56. Memar, A.H.; Esfahani, E.T. Objective Assessment of Human Workload in Physical Human-Robot Cooperation Using Brain Monitoring. J. Hum.-Robot Interact. 2019, 9, 13.
  57. Zhou, T.; Cha, J.S.; Gonzalez, G.; Wachs, J.P.; Sundaram, C.P.; Yu, D. Multimodal Physiological Signals for Workload Prediction in Robot-Assisted Surgery. J. Hum.-Robot Interact. 2020, 9, 12.
  58. Karakikes, M.; Nathanael, D. The Effect of Cognitive Workload on Decision Authority Assignment in Human–Robot Collaboration. Cogn. Technol. Work 2023, 25, 31–43.
  59. Messeri, C.; Masotti, G.; Zanchettin, A.M.; Rocco, P. Human-Robot Collaboration: Optimizing Stress and Productivity Based on Game Theory. IEEE Robot. Autom. Lett. 2021, 6, 8061–8068.
  60. Hostettler, D.; Bektaş, K.; Mayer, S. Pupillometry for Measuring User Response to Movement of an Industrial Robot; Association for Computing Machinery: New York, NY, USA, 2023; p. 52.
  61. Cleaver, A.; Tang, D.V.; Chen, V.; Short, E.S.; Sinapov, J. Dynamic Path Visualization for Human-Robot Collaboration; Association for Computing Machinery: New York, NY, USA, 2021; pp. 339–343.
  62. Zhu, Y.; Yang, C.; Tu, Z.; Ling, Y.; Chen, Y. A Haptic Shared Control Architecture for Tracking of a Moving Object. IEEE Trans. Ind. Electron. 2023, 70, 5034–5043.
  63. Riedelbauch, D.; Luthardt-Bergmann, D.; Henrich, D. A Cognitive Human Model for Virtual Commissioning of Dynamic Human-Robot Teams. In Proceedings of the 2021 Fifth IEEE International Conference on Robotic Computing (IRC), Taichung, Taiwan, 15–17 November 2021.
  64. Puttero, S.; Verna, E.; Genta, G.; Galetto, M. Towards the Modelling of Defect Generation in Human-Robot Collaborative Assembly. Procedia CIRP 2023, 118, 247–252.
  65. Messeri, C. Enhancing the Quality of Human-Robot Cooperation Through the Optimization of Human Well-Being and Productivity. In SpringerBriefs in Applied Sciences and Technology; Springer: Cham, Switzerland, 2023; pp. 15–25.
  66. Alimardani, M.; Hiraki, K. Passive Brain-Computer Interfaces for Enhanced Human-Robot Interaction. Front. Robot. AI 2020, 7, 125.
  67. Norton, A.; Admoni, H.; Crandall, J.; Fitzgerald, T.; Gautam, A.; Goodrich, M.; Saretsky, A.; Scheutz, M.; Simmons, R.; Steinfeld, A.; et al. Metrics for Robot Proficiency Self-Assessment and Communication of Proficiency in Human-Robot Teams. J. Hum.-Robot Interact. 2022, 11, 29.
  68. Selvaggio, M.; Cognetti, M.; Nikolaidis, S.; Ivaldi, S.; Siciliano, B. Autonomy in Physical Human-Robot Interaction: A Brief Survey. IEEE Robot. Autom. Lett. 2021, 6, 7989–7996.
  69. Eze, C.; Crick, C. Enhancing Human-Robot Collaboration by Exploring Intuitive Augmented Reality Design Representations; Association for Computing Machinery: New York, NY, USA, 2023; pp. 282–286.
  70. Newman, B.A.; Biswas, A.; Ahuja, S.; Girdhar, S.; Kitani, K.K.; Admoni, H. Examining the Effects of Anticipatory Robot Assistance on Human Decision Making. In International Conference on Social Robotics; Springer: Cham, Switzerland, 2020; Volume 12483.
  71. Parra, P.S.; Calleros, O.L.; Ramirez-Serrano, A. Human-Robot Collaboration Systems: Components and Applications. In Proceedings of the 7th International Conference of Control Systems, and Robotics (CDSR’20), Niagara Falls, ON, Canada, 9–11 November 2020.
  72. Klaer, V.; Wibranek, B. Human Decisions for Robot Integration Task Allocation in a Plan Based Building Assignment. In Proceedings of the Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020.
  73. Bales, G.; Kong, Z. Neurophysiological and Behavioral Differences in Human-Multiagent Tasks: An EEG Network Perspective. J. Hum.-Robot Interact. 2022, 11, 42.
  74. Luo, Y.; Chen, Y.; Hu, B. Multisensory Evaluation of Human-Robot Interaction in Retail Stores—The Effect of Mobile Cobots on Individuals’ Physical and Neurophysiological Responses; Association for Computing Machinery: New York, NY, USA, 2023; pp. 403–406.
  75. Rahman, S.M.M. Human Features-Based Variable Admittance Control for Improving HRI and Performance in Power-Assisted Heavy Object Manipulation. In Proceedings of the 2019 12th International Conference on Human System Interaction (HSI), Richmond, VA, USA, 25–27 June 2019.
  76. Lagomarsino, M.; Lorenzini, M.; Balatti, P.; De Momi, E.; Ajoudani, A. Pick the Right Co-Worker: Online Assessment of Cognitive Ergonomics in Human-Robot Collaborative Assembly. IEEE Trans. Cogn. Dev. Syst. 2022, 15, 1928–1937.