Human-like Behavior of Service Robot and Social Distance: Comparison

Human likeness refers to the degree to which a robot looks and behaves like a human. Human likeness includes two categories: human-like appearance and human-like behavior. Appearance describes the static aspects of the robot (look, sound, sense of touch, etc.), while behavior describes the dynamic aspects of the robot (actions, expressions, emotions, etc.). In the process of human–robot interaction, humans perceive a social distance from the robot. Social distance can be understood as the closeness of the relationship between two individuals.

  • human-like behavior
  • service robot
  • social distance
  • perceived competence
  • perceived warmth

1. Introduction

Robots can be used to perform a series of complex actions [1]. A service robot performs service tasks for humans or devices [2]. It is an autonomous robot capable of interacting with people and completing specific service tasks [1]. The development of artificial-intelligence technology has popularized service robots, such as educational robots, therapeutic robots, and entertainment robots [3]. However, human acceptance of service robots is the main obstacle to their popularization [4]. Service robots have certain social attributes [5], and their human-like characteristics encourage humans to treat them as social participants; these characteristics can therefore influence the robots' service effectiveness [5][6]. The human-like characteristics of robots can effectively influence human attitudes toward robots [6]. Acceptance of these human-like characteristics promotes human acceptance of service robots [7], whereas rejection of them inhibits it [6].
However, scholars have different views on human acceptance of human-like robots. Some scholars believe that humans have positive emotions toward human-like robots and are more willing to deal with a robot that has more human-like features [7][8][9], while others believe that more human-like robots can cause fear and anxiety in people, decreasing their willingness to interact with the robot [6][10]. This entry focused on how the human likeness of a service robot affects human acceptance of it.
Human likeness refers to the degree to which a robot looks and behaves like a human [11]. Human likeness includes two categories: human-like appearance and human-like behavior [12][13]. Appearance describes the static aspects of the robot (look, sound, sense of touch, etc.) [14][15][16], while behavior describes the dynamic aspects of the robot (actions, expressions, emotions, etc.) [11][12]. To enhance the human likeness of a service robot, designers endow it with more human characteristics; for example, they may make the robot's face look like a human's or add more human characteristics to its actions [13]. Few previous empirical studies have explored the human-like behavior of service robots (HLBR) [8], even though this factor has an important effect on human–robot interaction [8]. Therefore, this entry focused on the effects of HLBR on human acceptance of a service robot.
In previous studies, scholars used two types of constructs to measure human acceptance of a service robot: (i) psychological constructs, such as trust [16][17][18], liking [11], use intention [19], and satisfaction [20]; and (ii) sociological constructs, such as social distance [21]. Most studies have employed psychological constructs, while few have employed sociological ones. However, it is important to examine human acceptance of a service robot from a sociological perspective. The previous literature has shown that the social rules of people-to-people interaction apply to human–robot interaction [22] and that robots can be viewed as social actors with specific behavioral patterns [23]. This entry focused on the sociological aspect of human acceptance of service robots, i.e., social distance. Social distance refers to the closeness of the relationship between two individuals in people-to-people interactions [24]. The social distance between humans and service robots (SDHR) measures the closeness of the relationship between humans and service robots [25]. Thus, SDHR can indicate human acceptance of a robot [21].
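As a rough illustration of how a construct such as SDHR is typically operationalized in survey research, the sketch below scores a short Likert-type social-distance scale. The items, scale points, and reverse coding are hypothetical assumptions for illustration only, not an instrument taken from the cited studies.

```python
# A minimal, hypothetical sketch of scoring a Likert-type SDHR scale.
# Items, scale points, and the reverse-coded item are illustrative assumptions,
# not an instrument taken from the cited studies.
from statistics import mean

SCALE_MAX = 7  # assume 7-point items (1 = strongly disagree, 7 = strongly agree)

def sdhr_score(responses: dict[str, int], reverse_items: set[str]) -> float:
    """Average the items into one score; higher = greater social distance."""
    scored = [
        (SCALE_MAX + 1 - value) if item in reverse_items else value
        for item, value in responses.items()
    ]
    return mean(scored)

# Example respondent: three hypothetical items, one of them reverse-coded.
answers = {
    "I would feel uneasy working next to this robot": 5,
    "I would keep my interactions with this robot brief": 4,
    "I would treat this robot like a close colleague": 6,  # reverse-coded
}
print(sdhr_score(answers, reverse_items={"I would treat this robot like a close colleague"}))
```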
In addition, previous studies have shown that cultural background can affect human responses to robots [26][27][28]. In the US and China, robots are widely used in various fields, including the service industry [29]; examples include Sony's entertainment robot AIBO and Takara's home-care robot TERA [26]. However, the two countries differ in their views on robots: Americans tend to regard robots as assistants, while Chinese people tend to regard robots as friends [30]. It is generally believed that the US is an individualist country and China is a collectivist country [31][32]. Interpersonal relationships tend to be more intimate in collectivist contexts than in individualist ones [33]. Social rules in interpersonal communication can also apply to human–robot interaction [22]. A cross-cultural study of human–robot interaction found that Chinese people have a higher sense of intimacy with robots than Americans do [26].

2. Human Likeness and Social Distance

In the process of human–robot interaction, humans perceive a social distance from the robot [25]. Social distance can be understood as the closeness of the relationship between two individuals [34]. SDHR is the result of the dynamic interaction between human attributes (gender, age, and experience in dealing with the robot) and robot attributes (appearance and interaction cues) [25]. Previous studies have found that humans naturally attribute human characteristics to non-human objects [14]. Consumers spontaneously attribute human qualities to, for example, cars [35] or brands [36]. Human-like service robots have some characteristics of humans [4]. The higher the human likeness of the robot, the richer its human characteristics and the stronger the perceived similarity between the robot and a human [37]. Perceived similarity can affect an individual's perceived social distance; the higher the similarity, the smaller the social distance [38][39]. As an aspect of the robot's human likeness, a higher HLBR can therefore also lead to a smaller SDHR.

3. Human Likeness, Competence, and Warmth

Anthropomorphism is the tendency to attribute human-like qualities to non-human objects [40][41]. A robot's human-like appearance can promote its anthropomorphism [8]. Anthropomorphism can enhance human emotional attachment to non-human objects in service scenarios [42]. When humans interact with anthropomorphized robots, they may feel an affinity with them [43]. Warmth and competence are the basic dimensions used to characterize others [44]. Human perception of a robot's competence relates to the capabilities, intelligence, skills, and other such characteristics of the service robot, while perception of its warmth relates to its caring, friendliness, sociability, and other such characteristics [8][44]. Anthropomorphism affects these two basic judgment dimensions [6][45][46]. Studies have found that if a robot is anthropomorphized through HLBR, human perception of the robot's competence may increase [6], and human perception of its warmth may become more positive [8][45]. Therefore, researchers can speculate that HLBR may affect human perceptions of the competence and warmth of the service robot: a service robot with more human-like behavior should be perceived as more competent and warmer.

4. Competence, Warmth, and Social Distance

Social distance reflects the consciousness of kind in human sociological attributes [25][47]. Social-identity theory holds that humans categorize individuals based on social-categorization cues [44]. Human perceptions of the competence and warmth of robots serve as such cues [48][49] and affect how humans categorize robots into social groups [42]. The subjective categorization of others into in-groups and out-groups affects social distance [38]. When humans regard other individuals as members of the same group, social distance tends to be smaller [39]. Therefore, researchers can speculate that the stronger the human perception of the competence and warmth of the robot, the smaller the SDHR.

5. Mediating Effects of Perceived Competence and Perceived Warmth

Studies suggest that humans tend to be attracted to human-like objects because of their similarity to humans [35][50]. Competence and warmth are the two universal dimensions of human impression formation [44][48], accounting for almost 80% of human impressions of others [48]. A robot's human likeness can significantly affect these two basic judgment dimensions [46]. Van Doorn et al. found that perceived competence and perceived warmth mediate the relationship between a robot's human likeness and its service performance (such as customer satisfaction and loyalty) [51]. Kim et al. found that the human likeness of a service robot affects consumer attitudes toward it indirectly through competence and warmth [8]. Social distance is a construct close to satisfaction and attitude [25]. Therefore, researchers speculated that perceived competence and perceived warmth might mediate the relationship between HLBR and SDHR.
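The mediation speculated here could, in principle, be probed with a standard regression-based indirect-effect test. The sketch below is a minimal illustration of that idea, not the authors' analysis: it simulates hypothetical HLBR, perceived-competence, perceived-warmth, and SDHR scores and bootstraps the two indirect effects. All variable names, effect sizes, and the parallel-mediator setup are assumptions made for illustration.

```python
# A minimal sketch (not the authors' analysis) of testing perceived competence
# and perceived warmth as parallel mediators between HLBR and SDHR.
# The simulated data and effect sizes are purely hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 300

# Hypothetical scores: higher HLBR raises perceived competence and warmth,
# which in turn reduce social distance (SDHR).
hlbr = rng.normal(size=n)
competence = 0.5 * hlbr + rng.normal(scale=0.8, size=n)
warmth = 0.4 * hlbr + rng.normal(scale=0.8, size=n)
sdhr = -0.3 * competence - 0.4 * warmth + 0.1 * hlbr + rng.normal(scale=0.8, size=n)

def indirect_effects(x, m1, m2, y):
    """Regression-based indirect effects (a*b paths) for two parallel mediators."""
    a1 = sm.OLS(m1, sm.add_constant(x)).fit().params[1]   # x -> m1
    a2 = sm.OLS(m2, sm.add_constant(x)).fit().params[1]   # x -> m2
    X = sm.add_constant(np.column_stack([x, m1, m2]))
    b = sm.OLS(y, X).fit().params                         # x, m1, m2 -> y
    return a1 * b[2], a2 * b[3]

# Percentile bootstrap confidence intervals for the two indirect effects.
boot = np.array([
    indirect_effects(hlbr[idx], competence[idx], warmth[idx], sdhr[idx])
    for idx in (rng.integers(0, n, n) for _ in range(2000))
])
for name, col in zip(["competence", "warmth"], boot.T):
    lo, hi = np.percentile(col, [2.5, 97.5])
    print(f"indirect effect via {name}: mean={col.mean():.3f}, 95% CI=({lo:.3f}, {hi:.3f})")
```

Confidence intervals for the indirect effects that exclude zero would be consistent with the speculated mediation; in practice, validated scales and model checks would of course be required.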

References

  1. Wirtz, J.; Patterson, P.G.; Kunz, W.H. Brave new world: Service robots in the frontline. J. Serv. Manag. 2018, 29, 907–931.
  2. IFR. “Service Robots”. Available online: https://www.ifr.org/service-robots/ (accessed on 10 March 2019).
  3. Lee, I. Service Robots: A Systematic Literature Review. Electronics 2021, 10, 2658.
  4. Castelo, N.; Schmitt, B.; Sarvary, M. Human or robot? Consumer responses to radical cognitive enhancement products. J. Assoc. Consum. Res. 2019, 4, 217–230.
  5. Choi, S.; Liu, S.Q.; Mattila, A.S. “How may I help you?” Says a robot: Examining language styles in the service encounter. Int. J. Hosp. Manag. 2019, 82, 32–38.
  6. Duffy, B.R. Anthropomorphism and the Social Robot. Robot. Auton. Syst. 2003, 42, 177–190.
  7. Kiesler, S.; Powers, A.; Fussell, S.R.; Torrey, C. Anthropomorphic interactions with a robot and robot–like agent. Soc. Cognit. 2008, 26, 169–181.
  8. Kim, S.Y.; Schmitt, B.H.; Thalmann, N.M. Eliza in the uncanny valley: Anthropomorphizing consumer robots increases their perceived warmth but decreases liking. Mark. Lett. 2019, 30, 1–12.
  9. Hancock, P.A.; Billings, D.R.; Schaefer, K.E.; Chen, J.Y.; De Visser, E.J.; Parasuraman, R. A Meta-Analysis of Factors Affecting Trust in Human-Robot Interaction. Hum. Factors 2011, 53, 517–527.
  10. Murphy, J.; Gretzel, U.; Pesonen, J. Marketing robot services in hospitality and tourism: The role of anthropomorphism. J. Travel Tour. Mark. 2019, 36, 784–795.
  11. Zitzewitz, J.V.; Boesch, P.M.; Wolf, P.; Riener, R. Quantifying the human likeness of a humanoid robot. Int. J. Soc. Robot. 2013, 5, 263–276.
  12. Minato, T.; Shimada, M.; Itakura, S.; Lee, K.; Ishiguro, H. Evaluating the human likeness of an android by comparing gaze behaviors elicited by the android and a person. Adv. Robot. 2006, 20, 1147–1163.
  13. Choi, J.; Kim, M. The usage and evaluation of anthropomorphic form in robot design. In Proceedings of the Design Research Society Conference, Sheffield, UK, 16–19 July 2008.
  14. Epley, N.; Waytz, A.; Cacioppo, J.T. On Seeing a Human: A Three-Factor Theory of Anthropomorphism. Psychol. Rev. 2007, 114, 864–886.
  15. Yao, S.; Luximon, A.; Yan, L. The Effect of Facial Features on Facial Anthropomorphic Trustworthiness in Social Robots. Appl. Ergon. 2021, 94, 103420.
  16. Bernotat, J.; Eyssel, F.; Sachse, J. The (Fe)male Robot: How Robot Body Shape Impacts First Impressions and Trust Towards Robots. Int. J. Soc. Robot. 2021, 13, 477–489.
  17. Kim, W.; Kim, N.; Lyons, J.B.; Chang, S.N. Factors affecting trust in high-vulnerability human-robot interaction contexts: A structural equation modelling approach. Appl. Ergon. 2020, 85, 103056.
  18. Christoforakos, L.; Gallucci, A.; Surmava-Große, T.; Ullrich, D.; Diefenbach, S. Can Robots Earn Our Trust the Same Way Humans Do? A Systematic Exploration of warmth, competence, and Anthropomorphism as Determinants of Trust Development in HRI. Front. Robot. AI 2021, 8, 640444.
  19. Pinxteren, M.V.; Wetzels, R.J.; Pluymaekers, M. Trust in humanoid robots: Implications for services marketing. J. Serv. Mark. 2019, 33, 507–518.
  20. Jia, J.W.; Chung, N.; Hwang, J. Assessing the hotel service robot interaction on tourists’ behaviour: The role of anthropomorphism. Ind. Manag. Data Syst. 2021, 121, 1457–1478.
  21. Kim, Y.; Mutlu, B. How social distance shapes human–robot interaction. Int. J. Hum.-Comput. Stud. 2014, 72, 783–795.
  22. Lee, M.K.; Kiesler, S.; Forlizzi, J.; Srinivasa, S.; Rybski, P. Gracefully mitigating breakdowns in robotic services. In Proceedings of the 5th ACM/IEEE International Conference on Human Robot Interaction, Osaka, Japan, 2–5 March 2010.
  23. Rahwan, I.; Cebrian, M.; Obradovich, N.; Bongard, J.; Bonnefon, J.F.; Breazeal, C.; Crandall, J.W.; Christakis, N.A.; Couzin, I.D.; Jackson, M.O.; et al. Machine behaviour. Nature 2019, 568, 477–486.
  24. Liviatan, I.; Trope, Y.; Liberman, N. Interpersonal similarity as a social distance dimension: Implications for perception of others’ actions. J. Exp. Soc. Psychol. 2008, 44, 1256–1269.
  25. Kim, Y.; Kwak, S.S.; Kim, M. Am I acceptable to you? Effect of a robot’s verbal language forms on people’s social distance from robots. Comput. Hum. Behav. 2013, 29, 1091–1101.
  26. Li, D.; Rau, P.; Ye, L. A Cross-cultural Study: Effect of Robot Appearance and Task. Int. J. Soc. Robot. 2010, 2, 175–186.
  27. Eresha, G.; Häring, M.; Endrass, B.; André, E.; Obaid, M. Investigating the influence of culture on proxemic behaviors for humanoid robots. In Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication, Gyeongju, Korea, 26–29 August 2013.
  28. Ho, Y.; Sato-Shimokawara, E.; Yamaguchi, T.; Tagawa, N. Interaction robot system considering culture differences. In Proceedings of the IEEE Workshop on Advanced Robotics and Its Social Impacts, Tokyo, Japan, 7–9 November 2013.
  29. Cette, G.; Devillard, A.; Spiezia, V. The contribution of robots to productivity growth in 30 OECD countries over 1975–2019. Econ. Lett. 2021, 200, 109762.
  30. Evers, V.; Maldonado, H.C.; Brodecki, T.L.; Hinds, P.J. Relational vs. group self-construal: Untangling the role of national culture in HRI. In Proceedings of the 2008 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI), New York, NY, USA, 12–15 March 2008.
  31. Germani, A.; Delvecchio, E.; Nartova-Bochaver, S.K.; Li, J.B.; Lis, A.; Vazsonyi, A.T.; Mazzeschi, C. The link between individualism–collectivism and life satisfaction among emerging adults from four countries. Appl. Psychol. Health Well Being 2021, 13, 437–453.
  32. Chang, J.; Panjwani, A.; Perera, S.; Steinberg, H. Information Technology Customer Service, Cultural Differences, & the Big 5 in China and the USA. In Allied Academies International Conference: Proceedings of the Academy of Management Information & Decision Sciences; Jordan Whitney Enterprises, Inc.: Candler, NC, USA, 2016; Volume 20, pp. 1–5.
  33. Oyserman, D.; Coon, H.M.; Kemmelmeier, M. Rethinking individualism and collectivism: Evaluation of theoretical assumptions and meta-analyses. Psychol. Bull. 2002, 128, 3.
  34. Bogardus, E.S. Measurement of personal-group relations. Sociometry 1947, 10, 306–311.
  35. Aggarwal, P.; McGill, A.L. Is that car smiling at me? Schema congruity as a basis for evaluating anthropomorphized products. J. Consum. Res. 2007, 34, 468–479.
  36. Dennett, D.C. Kinds of Minds: Towards an Understanding of Consciousness; Basic Books: New York, NY, USA, 1996; p. 184.
  37. Seyama, J.I.; Nagayama, R.S. The uncanny valley: Effect of realism on the impression of artificial human faces. Presence 2007, 16, 337–351.
  38. Bar-Anan, Y.; Liberman, N.; Trope, Y. The association between psychological distance and construal level: Evidence from an implicit association test. J. Exp. Psychol. 2006, 135, 609–622.
  39. Kruglanski, A.W.; Higgins, E.T. Social Psychology: Handbook of Basic Principles; Guilford Press: New York, NY, USA, 2007; pp. 540–561.
  40. Guido, G.; Peluso, A.M. Brand anthropomorphism: Conceptualization, measurement, and impact on brand personality and loyalty. J. Brand Manag. 2015, 22, 1–19.
  41. DiSalvo, C.; Gemperle, F. From seduction to fulfillment: The use of anthropomorphic form indesign. In Proceedings of the Designing Pleasurable Products and Interfaces Conference, Pittsburgh, PA, USA, 23–26 June 2003; Available online: https://dl.acm.org/doi/10.1145/782896.782913 (accessed on 22 June 2021).
  42. Belanche, D.; Casaló, L.V.; Schepers, J.; Flavián, C. Examining the effects of robots’ physical appearance, warmth, and competence in frontline services: The Humanness-Value-Loyalty model. Psychol. Mark. 2021, 38, 2357–2376.
  43. Komatsu, T.; Takahashi, H. How does unintentional eye contact with a robot affect users’ emotional attachment to it? Investigation on the effects of eye contact and joint attention on users’ emotional attachment to a robot. In Proceedings of the International Conference on Universal Access in Human-Computer Interaction: User & Context Diversity, Las Vegas, NV, USA, 21–26 July 2013.
  44. Fiske, S.T.; Cuddy, A.J.; Glick, P. Universal dimensions of social cognition: Warmth and competence. Trends Cogn. Sci. 2007, 11, 77–83.
  45. Zhu, D.H.; Chang, Y.P. Robot with Humanoid Hands Cooks Food Better? Effect of Robotic Chef Anthropomorphism on Food Quality Prediction. Int. J. Contemp. Hosp. Manag. 2020, 32, 1367–1383.
  46. Scott, M.L.; Martin, M.; Lisa, E.B. Judging the Book by Its Cover? How Consumers Decode Conspicuous Consumption Cues in Buyer-Seller Relationships. J. Mark. Res. 2013, 50, 334–347.
  47. Duvall, R. The Conflict Helix. In Understanding Conflict and War; Rummel, R.J., Ed.; Halsted Press: Beverly Hills, CA, USA, 1976; Volume 2, p. 400.
  48. Cuddy, A.J.; Glick, P.; Beninger, A. The dynamics of warmth and competence judgments, and their outcomes in organizations. Res. Organ. Behav. 2011, 31, 73–98.
  49. Rosenthal-von der Pütten, A.M.; Krämer, N.C. How design characteristics of robots determine evaluation and uncanny valley related responses. Comput. Hum. Behav. 2014, 36, 422–439.
  50. Bartneck, C.; Kulic, D.; Croft, E.; Zoghbi, S. Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. Int. J. Soc. Robot. 2009, 1, 71–81.
  51. Doorn, J.V.; Mende, M.; Noble, S.M.; Hulland, J.; Ostrom, A.L.; Grewal, D.; Petersen, J.A. Domo Arigato Mr. Roboto: Emergence of automated social presence in organizational frontlines and customers’ service experiences. J. Serv. Res. 2016, 20, 43–58.