Social Exclusion in Human-Robot Interaction

As the technology for social robots matures at a rapid rate, one expects that scenarios for human–robot teams, which have so far remained in the realm of fiction (for example, The Murderbot Diaries by Martha Wells), will soon become a reality. Philosophers are already discussing ways to redefine the concept of friendship to include robots. To prepare for this inevitable future, it is necessary to study the cognitive and affective aspects of how humans respond to robot team members.

bomb defusal task; human–robot teams; social exclusion in HRI

1. Introduction

The presence of robots in everyday activities is becoming a reality to which we humans need to adapt. Nowadays, robots are not just present in factories but are playing an increasingly important role in social contexts. A major issue is how to design and program these social robots so that they are trustworthy, safe, and useful for humans. Though humans vary widely in terms of gender, age, culture, personality, and so on, we can still try to find some general design principles that lead to trustworthy and smooth human–robot interaction.
One issue that needs to be investigated thoroughly is how a robot's presence in a team of humans engaged in a cooperative task affects the feelings and behavior of the human team members. This is important because robots are increasingly being used in environments where they must interact with more than just a single person: hospitals, care homes, offices, and even households, where robots are expected to interact with groups of different people [1][2][3][4][5][6].
Previous research has shown that people's preconceptions and previous experiences with robots affect their attitudes towards robots and artificial agents. For example, it has been shown that people who do not have extensive experience with social robots generally have positive feelings towards them [7]. In another recent study, Esterwood and Robert [8] showed that an individual's attitude towards working with robots (AWOR) significantly affects how well trust repair strategies work in human–robot interaction, and that this effect can change over the course of repeated trust violations (see also [9][10]). In this context, it is important to study cooperation in human–robot teams.
One such study [11] examined how the social structure between humans and machines is affected when the group size increases from two (a human and a machine) to three (two humans and a machine). It was found that adding one more human to the social situation raised doubts about the machine's non-subjectivity while simultaneously consolidating its position in a secondary relationship.
In another study, Rosenthal-von der Pütten and Abrams [12] pointed out that, besides the social dynamics in human–robot groups, the social consequences of their cooperation are also important. As robots adapt their behavior based on data gathered through interactions with people, it is very likely that we will have to deal with unequal adaptation of robots to different members of a human group. The underlying reason is that a robot's actions and behaviors are driven by machine learning algorithms, so the robot will adapt better to, and more accurately predict and match the preferences of, those group members for whom it has a larger data set. This might be perceived as unfairness or even discrimination by the other group members [13]. One way to avoid such a situation is to first investigate more carefully how humans behave when a robot is seen as a group member and how they might react when it shows biased behavior and seems to favor a certain group member. Only after better understanding the effects of unequal treatment of group members can strategies aimed at reducing such problems be developed.
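To make the unequal-adaptation argument concrete, here is a minimal simulation sketch (purely hypothetical; the member names, preference values, and interaction counts are invented for illustration): a robot estimates each member's preference from noisy interaction samples, and the member with more logged interactions ends up with the more accurate estimate.

```python
import random
import statistics

random.seed(42)

def observe(true_preference: float) -> float:
    """Return one noisy observation of a member's preference."""
    return true_preference + random.gauss(0.0, 1.0)

# name: (true preference value, number of logged interactions) -- all
# values are hypothetical, chosen only to show the effect of data-set size.
members = {
    "frequent_user": (5.0, 200),  # interacts with the robot daily
    "new_member": (5.0, 5),       # has just joined the group
}

for name, (true_pref, n) in members.items():
    estimate = statistics.mean(observe(true_pref) for _ in range(n))
    print(f"{name:13s} n={n:3d} estimate={estimate:5.2f} "
          f"error={abs(estimate - true_pref):.2f}")
```

Because the standard error of the mean shrinks roughly as 1/√n, the frequent user's preference estimate is several times more accurate than the newcomer's, and this asymmetry in service quality is exactly what other group members could read as favoritism.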

2. Social Exclusion

The term social exclusion was first coined in France in 1974 by René Lenoir, so it is quite a recent notion. It was originally used in the context of the marginalization of certain social groups, such as the handicapped, the unemployed, or single parents, addressing serious social issues facing governments all over the world. There are many different definitions of this phenomenon in the literature [14][15], but for this research, we characterize it as a multidimensional process of depriving and preventing an individual from engaging in fair, equal, and unbiased interaction within a social group, as well as not perceiving such a person as a full-fledged member of the group. The result is that the rejected person has no opportunity to participate in the social group under the same conditions as others.
Even though some individuals are more prone to being socially excluded because of, for example, their socio-economic status or poor health, it should be emphasized that social exclusion can affect practically anyone. Consider a hypothetical school trip: a child named John gets sick and has to stay home while all his friends join the trip. After his friends get back from the excursion, they constantly talk about all the exciting things that happened during the trip. As John cannot relate to these experiences, he starts to feel like an outsider, and he might no longer be perceived as a full-fledged member of the group.
A series of studies on the psychological effects of social exclusion conducted by Jones et al. [16][17][18] revealed that participants in the out-of-the-loop condition, i.e., those who, unlike their team members, did not receive the information needed to solve the main task, reported decreased fulfillment of fundamental needs such as belonging, self-esteem, control, and meaningful existence. They also reported a worse mood and rated their team members lower in terms of liking and trustworthiness. These findings suggest that depriving people of the opportunity to interact with others on the same level has many negative psychological outcomes.

2.1. Social Exclusion in HRI

The question of how social exclusion might affect people interacting with artificial agents such as robots needs more thorough investigation, though the existing research suggests that it, too, has a negative impact on human wellbeing. In a study conducted by Ruijten, Ham, and Midden [19], the participants first played the Cyberball game, during which the ball was tossed fewer times to some participants (the exclusion group) than to others (the inclusion group). This was followed by the main washing-machine task, during which a virtual agent gave the participants feedback about their choices. The results showed that the participants who had been excluded, especially women, were more sensitive to the feedback provided by the agent.
In a study on ostracism in robot–robot–human interaction, Erel et al. [20] used the Cyberball paradigm in its physical form to investigate whether interacting with two non-humanoid robots can lead to the experience of being excluded from the team and affect the participants' fundamental psychological needs. As predicted, those in the exclusion condition (only about 10% of the ball tosses were directed at them) reported worse moods and more negative ratings regarding three needs (belonging, control, and meaningful existence) compared to the participants in the inclusion and over-inclusion conditions.
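The manipulation in such Cyberball-style studies reduces, in essence, to the probability that any given toss is directed at the participant. The sketch below is illustrative only: the 10% exclusion share comes from the study described above, while the inclusion and over-inclusion shares, the number of tosses, and the function names are assumed values chosen for the example.

```python
import random

random.seed(0)

def simulate_tosses(n_tosses: int, participant_share: float) -> int:
    """Count how many of n_tosses go to the participant.

    participant_share is the probability that any toss is directed at
    the participant; the remaining tosses go to the (robot) co-players.
    """
    return sum(random.random() < participant_share for _ in range(n_tosses))

for condition, share in [("exclusion", 0.10), ("inclusion", 0.33),
                         ("over-inclusion", 0.60)]:
    received = simulate_tosses(30, share)
    print(f"{condition:14s} participant received {received}/30 tosses")
```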
As observed by Claure et al. [21], the problem of unfairness can also arise when a robot assigns more resources to the team member whose performance is highest, which in turn might decrease the other workers' trust in the system. To investigate this issue, they used a Tetris game in which an algorithm assigned blocks to two players. The algorithm implemented three levels of fairness, defined by the minimum allocation rate guaranteed to each participant. Even though the team's performance did not change across conditions, the weaker performers reported a decreased level of trust. Another study that investigated the perceived fairness of decisions made by an algorithm showed that the degree to which a system is perceived as fair might also depend on how the person was evaluated by that system [22]. In this study, an algorithm was perceived as fairer when participants received favorable feedback from it than when the outcome was unfavorable. It was also emphasized that perceived fairness might depend on individual differences such as gender or level of education.
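The core mechanism behind such a fairness-constrained allocator can be sketched as follows. This is an illustrative epsilon-greedy variant written for this summary, not the exact multi-armed-bandit algorithm of Claure et al. [21]; it shows how a minimum allocation rate bounds how strongly the allocator can favor the stronger player.

```python
import random

random.seed(1)

def allocate(n_rounds: int, true_skill: list, min_share: float) -> list:
    """Allocate blocks between two players with a minimum-share constraint."""
    counts = [0, 0]       # blocks assigned to each player so far
    rewards = [0.0, 0.0]  # cumulative observed performance per player
    for t in range(1, n_rounds + 1):
        lagging = [i for i in (0, 1) if counts[i] < min_share * t]
        if lagging:
            # Fairness constraint: serve whoever is below the minimum share.
            player = min(lagging, key=lambda i: counts[i])
        elif random.random() < 0.1:
            player = random.randrange(2)  # occasional exploration
        else:
            # Exploit: favor the player with higher observed performance.
            player = max((0, 1), key=lambda i: rewards[i] / max(counts[i], 1))
        counts[player] += 1
        rewards[player] += random.gauss(true_skill[player], 0.1)
    return counts

# Player 0 is the stronger performer; with min_share = 0.3 the weaker
# player is still guaranteed roughly 30% of the blocks.
print(allocate(100, true_skill=[0.9, 0.5], min_share=0.3))
```

Raising min_share toward 0.5 makes the allocation more equal at the cost of team throughput; setting it near zero lets the allocator concentrate resources on the top performer, the regime in which the weaker performers in the study reported lower trust.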
Even though such results provide some clues about how humans behave in discriminating settings while interacting with artificial systems, it is still unclear how susceptible they are to unfair behavior exhibited by such a system. (For effects of algorithms behaving unfairly, see [23].) In a setting where several people interact with a robot, such as at work, in a hospital, or even at home, the robot may become better at predicting and understanding the requests and preferences of those who spend more time with it, because it has been able to gather more data on these people's habits and behavior. Those who do not spend as much time with the robot, especially new members of a group that includes a robot, might then feel not only irritated at not being understood appropriately by the robot but also excluded from interactions with such a system.

2.2. Consequences of Social Exclusion

Research shows that social exclusion can have several negative consequences for humans. It is not surprising that those who experience rejection report a worse mood than others, but what might not seem so obvious is that, as shown by van Beest and Williams [24], this effect remains even when being excluded by others is beneficial. In their experiment, the participants played a Euroball game modeled after the commonly used Cyberball paradigm. Even when the ostracized participants received more money than the other players, they reported lower mood ratings and satisfaction levels.
It has also been shown that individuals are more aggressive after being socially excluded. In the studies conducted by Twenge et al. [25], experimenters manipulated the participants' feelings of belonging by giving them predictions about their future social lives that were allegedly based on their personality tests. The participants were also asked to write an essay on abortion, after which they were given feedback on its quality, allegedly provided by another participant. To measure aggressiveness, the participants then evaluated the person who had rated their essay. As predicted, the ratings given by those who were told that their future social life would be lonely were more negative than those given by people in the other conditions. Moreover, in a follow-up experiment, participants who were made to feel rejected by their peers (they were told that no one had chosen them as a desired work partner) set a higher intensity and a longer duration for a noise blast that was allegedly delivered to the person with whom they played a computer game.
Receiving predictions about a lonely future also seems to affect both pain tolerance and pain threshold, as shown by DeWall and Baumeister [26]. In this research, the participants who received a pessimistic future-life scenario had a significantly higher physical pain threshold and tolerance compared to those in the other three control conditions. This supports the view that feelings of rejection and exclusion might result in emotional numbness.
A similar procedure was used by Baumeister et al. [27], who asked participants to complete a general mental-abilities test after receiving predictions about their future lives. Those who were led to anticipate a lonely future answered fewer test questions correctly, which suggests that a feeling of social rejection might impair intelligent thought. However, no significant decrease in cognitive abilities was observed for simple information processing. A possible explanation lies in the impairment of executive function. This suggestion was supported by Baumeister et al. [28], who found that social exclusion might indeed have a detrimental effect on self-regulation. After receiving feedback about their anticipated future social lives, participants were encouraged to consume a bad-tasting vinegar drink. This task required a high level of self-regulation, as the participants had to force themselves to drink a beverage that was said to be beneficial for their health even though it was unpalatable. Participants who received a prediction of a lonely future drank fewer ounces of the beverage than those who were told they would be accepted by others in the future, and also fewer than those who were predicted to experience misfortunes in their future.

References

  1. Jevtić, A.; Valle, A.F.; Alenya, G.; Chance, G.; Caleb-Solly, P.; Dogramadzi, S.; Torras, C. Personalized Robot Assistant for Support in Dressing. IEEE Trans. Cogn. Dev. Syst. 2019, 11, 363–374.
  2. Duret, C.; Grosmaire, G.; Krebs, H.I. Robot-Assisted Therapy in Upper Extremity Hemiparesis: Overview of an Evidence-Based Approach. Front. Neurol. 2019, 10, 412.
  3. Cao, H.-L.; Esteban, P.G.; Bartlett, M.; Baxter, P.; Belpaeme, T.; Billing, E.; Cai, H.; Coeckelbergh, M.; Costescu, C.; David, D.; et al. Robot-Enhanced Therapy: Development and Validation of Supervised Autonomous Robotic System for Autism Spectrum Disorders Therapy. IEEE Robot. Autom. Mag. 2019, 26, 49–58.
  4. Arvin, F.; Espinosa, J.; Bird, B.; West, A.; Watson, S.; Lennox, B. Mona: An Affordable Open-Source Mobile Robot for Education and Research. J. Intell. Robot. Syst. 2019, 94, 761–775.
  5. Mondada, F.; Bonani, M.; Riedo, F.; Briod, M.; Pereyre, L.; Re, P.; Magnenat, S. Bringing Robotics to Formal Education: The Thymio Open-Source Hardware Robot. IEEE Robot. Autom. Mag. 2017, 24, 77–85.
  6. Bouchard, K.; Liu, P.P.; Tulloch, H. The Social Robots Are Coming: Preparing for a New Wave of Virtual Care in Cardiovascular Medicine. Circulation 2022, 145, 1291–1293.
  7. Naneva, S.; Gou, M.S.; Webb, T.L.; Prescott, T.J. A Systematic Review of Attitudes, Anxiety, Acceptance, and Trust Towards Social Robots. Int. J. Soc. Robot. 2020, 12, 1179–1201.
  8. Esterwood, C.; Robert, L.P. Having The Right Attitude: How Attitude Impacts Trust Repair in Human-Robot Interaction. In Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction (HRI 2022), Sapporo, Japan, 7–10 March 2022.
  9. Staffa, M.; Rossi, S. Recommender Interfaces: The More Human-Like, the More Humans Like. In Social Robotics. ICSR 2016; Agah, A., Cabibihan, J.J., Howard, A., Salichs, M., He, H., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2016; Volume 9979.
  10. Abate, A.F.; Barra, P.; Bisogni, C.; Cascone, L.; Passero, I. Contextual Trust Model with a Humanoid Robot Defense for Attacks to Smart Eco-Systems. IEEE Access 2020, 8, 207404–207414.
  11. Etzrodt, K. The third party will make a difference—A study on the impact of dyadic and triadic social situations on the relationship with a voice-based personal agent. Int. J. Hum. Comput. Stud. 2022, 168, 102901.
  12. Rosenthal-von der Pütten, A.M.; Abrams, A. Social Dynamics in Human-Robot Groups—Possible Consequences of Unequal Adaptation to Group Members through Machine Learning in Human-Robot Groups; Springer: Cham, Switzerland, 2020; pp. 396–411.
  13. Misztal-Radecka, J.; Indurkhya, B. Bias-Aware Hierarchical Clustering for Detecting the Discriminated Groups of Users in Recommendation Systems. Inf. Process. Manag. 2021, 58, 102519.
  14. Millar, J. Social Exclusion and Social Policy Research: Defining Exclusion. In Multidisciplinary Handbook of Social Exclusion Research; Abrams, D., Christian, J., Gordon, D., Eds.; John Wiley & Sons: Hoboken, NJ, USA, 2007; pp. 1–15.
  15. Daly, M. Social Exclusion as Concept and Policy Template in the European Union. CES Working Paper, No. 135. 2006. Available online: http://aei.pitt.edu/9026/1/Daly135.pdf (accessed on 18 March 2023).
  16. Jones, E.E.; Carter-Sowell, A.R.; Kelly, J.R.; Williams, K.D. ‘I’m Out of the Loop’: Ostracism Through Information Exclusion. Group Process. Intergroup Relat. 2009, 12, 157–174.
  17. Jones, E.E.; Kelly, J.R. ‘Why Am I Out of the Loop?’ Attributions Influence Responses to Information Exclusion. Personal. Soc. Psychol. Bull. 2010, 36, 1186–1201.
  18. Jones, E.E.; Carter-Sowell, A.R.; Kelly, J.R. Participation Matters: Psychological and Behavioral Consequences of Information Exclusion in Groups. Group Dyn. Theory Res. Pract. 2011, 15, 311–325.
  19. Ruijten, P.A.M.; Ham, J.; Midden, C.J.H. Investigating the Influence of Social Exclusion on Persuasion by a Virtual Agent. In Persuasive Technology, Persuasive 2014; Spagnolli, A., Chittaro, L., Gamberini, L., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2014; Volume 8462, pp. 191–200.
  20. Erel, H.; Cohen, Y.; Shafrir, K.; Levy, S.D.; Vidra, I.D.; Tov, T.S.; Zuckerman, O. Excluded by Robots: Can Robot-Robot-Human Interaction Lead to Ostracism? In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, Boulder, CO, USA, 9–11 March 2021; pp. 312–321.
  21. Claure, H.; Chen, Y.; Modi, J.; Jung, M.; Nikolaidis, S. Multi-Armed Bandits with Fairness Constraints for Distributing Resources to Human Teammates. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; pp. 299–308.
  22. Wang, R.; Harper, F.M.; Zhu, H. Factors Influencing Perceived Fairness in Algorithmic Decision-Making: Algorithm Outcomes, Development Procedures, and Individual Differences. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–14.
  23. O’Neil, C. Weapons of Math Destruction; Crown Books: New York, NY, USA, 2016.
  24. van Beest, I.; Williams, K.D. When Inclusion Costs and Ostracism Pays, Ostracism Still Hurts. J. Personal. Soc. Psychol. 2006, 91, 918–928.
  25. Twenge, J.M.; Baumeister, R.F.; Tice, D.M.; Stucke, T.S. If You Can’t Join Them, Beat Them: Effects of Social Exclusion on Aggressive Behavior. J. Personal. Soc. Psychol. 2001, 81, 1058–1069.
  26. DeWall, C.N.; Baumeister, R.F. Alone but Feeling No Pain: Effects of Social Exclusion on Physical Pain Tolerance and Pain Threshold, Affective Forecasting, and Interpersonal Empathy. J. Personal. Soc. Psychol. 2006, 91, 1–15.
  27. Baumeister, R.F.; Twenge, J.M.; Nuss, C.K. Effects of Social Exclusion on Cognitive Processes: Anticipated Aloneness Reduces Intelligent Thought. J. Personal. Soc. Psychol. 2002, 83, 817–827.
  28. Baumeister, R.; Twenge, J.M.; Ciarocco, N.J. Social Exclusion Impairs Self-Regulation. J. Personal. Soc. Psychol. 2005, 88, 589–604.