Applications of Social Robotics

Social robots are being proposed for telepresence, medicine, education, entertainment, assistance, and other domains. Benefiting from their capacities for information acquisition, processing, and actuation, social robots are conceived to either replace or assist humans in daily social interaction contexts.

  • social robotics
  • assistive robotics
  • artificial intelligence

1. Telepresence

In telepresence applications, a user can rely on a robotic platform to ensure a certain extent of social interaction with other persons while being at a distant location from them. Different technologies have been implemented on social robots to ensure a realistic interaction from both the user's and the interaction partner's sides. Telepresence robots require features such as autonomy, controllability, maneuverability, and stability to ensure safe interaction with humans [1]. For instance, in [2], a deep-learning approach has been proposed for a telepresence robot to learn, from demonstrations, how to maintain an appropriate position and orientation within a group of people, and how to follow moving interaction targets. Herein, the robot was qualified as semi-autonomous, as its pilot still had control over certain high-level tasks. A similar platform was used in [3] for the interaction between users inside and outside an elderly day center. In [1], a robotic telepresence system design and a control approach for social interaction were presented. The robot was equipped with capabilities of vision, hearing, speaking, and moving, all controlled remotely by a user. In [4], a study was presented in which a Double telepresence robot was installed in rooms of care homes to allow older persons to communicate with their family members. Despite some technical difficulties, the experience of using this system was positively evaluated by the persons involved in the study. Figure 1 shows a Double telepresence robot [5]. A telepresence robotic system for people with motor disabilities was proposed in [6]. Eye gaze was used as input to the system: an eye-tracking mechanism involving a virtual reality head-mounted display was intended to provide driving commands to the robotic platform, as sketched below.
Figure 1. A Double telepresence robot at the American University of the Middle East.
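To make the gaze-driving idea in [6] more concrete, the following minimal Python sketch maps a normalized gaze point to velocity commands for a telepresence base. It is an illustration only: the function names, gains, dead zone, and mapping are assumptions, not the interface of the cited system.

```python
# Illustrative sketch: mapping eye-gaze input to driving commands for a
# telepresence robot, loosely inspired by the gaze-controlled system in [6].
# All names, gains, and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class VelocityCommand:
    linear: float   # forward speed, m/s
    angular: float  # turn rate, rad/s

def gaze_to_command(gaze_x: float, gaze_y: float,
                    max_linear: float = 0.5,
                    max_angular: float = 1.0,
                    dead_zone: float = 0.1) -> VelocityCommand:
    """Convert a normalized gaze point (x, y in [-1, 1], screen-centered)
    into a velocity command. Looking up drives forward, looking left or right
    turns; a central dead zone keeps the robot still while the user reads."""
    if abs(gaze_x) < dead_zone and abs(gaze_y) < dead_zone:
        return VelocityCommand(0.0, 0.0)
    linear = max(0.0, gaze_y) * max_linear   # only forward motion
    angular = -gaze_x * max_angular          # left gaze -> left turn
    return VelocityCommand(linear, angular)

if __name__ == "__main__":
    print(gaze_to_command(0.0, 0.8))   # look up   -> drive forward
    print(gaze_to_command(-0.6, 0.2))  # look left -> turn left slowly
```

In a real deployment, the gaze point would come from the head-mounted display's eye tracker and the command would be forwarded to the robot's motion controller.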

2. Education

Robots have also been involved in education, where they have applications in language teaching, teaching assistance, writing, and vocabulary enhancement, for example [7][8][9]. Indeed, they can facilitate learning and improve the educational performance of students, adding social interaction to the learning context in certain cases [10]. Herein, the attitudes of teachers and students towards robots are important to evaluate, and the study shown in [11] tackled the Negative Attitudes toward Robots Scale (NARS), which was developed in [12] to measure general human attitudes towards robots. The study pointed to the importance of knowing the attitudes of teachers towards robots used in classes. Additionally, the study in [13] addressed the acceptability of robots by teaching second-language words to adults with robot and human tutors. A negative attitude toward robots was shown to have a possible negative impact on the ability of individuals to learn vocabulary. Second-language tutoring was addressed in [14][15] with children and a social robot with a tablet. Children were not shown to learn more words from a robot and a tablet than from a tablet without a robot. Additionally, iconic gestures from the robot were not shown to help children learn more words. A closely related topic in second-language learning was addressed in [16], where different interaction styles of human moderators in language-café-style conversations were developed for a robot in the role of a host in spoken conversation practice. These styles were rated differently by human participants, owing not only to the robot interaction but also to factors related to the participants themselves, which suggests that interaction can be improved by taking these factors into account. In [17], the humanoid robot Pepper was used for vocabulary enhancement with children in a game scenario intended to improve their abilities to manipulate and learn words. The capabilities of the Pepper robot, such as connecting to the Internet and acquiring and emitting sound signals, were exploited to accomplish a proper interaction with children. A robotic platform was used to support professors in mathematics classes in [18]. The Nao robot was used for giving theoretical explanations, as well as the instructions and evaluation of the activities made during class. It was programmed before each session to move as humanly as possible and to generate accurate visual gestures. Moreover, the vision and sound generation capacities of Nao were exploited for the interaction. In [19], a robotic platform was used for transferring students' opinions or questions to the lecturer. The desktop humanoid robot was collaboratively controlled and provided a messaging interface where messages consisted of questions or opinions of students to be uttered by the robot. This approach was shown to improve the participation of students during lectures. Another paradigm of learning in children is learning by teaching, which engages a student in the act of teaching another. In [20], a humanoid robot was used as a handwriting partner that intentionally made simulated handwriting mistakes typical of children learning to handwrite. Children taught the robot how to write, learning through their teaching.
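As an illustration of how an attitude instrument such as the NARS discussed in [11][12] is typically scored, the sketch below aggregates Likert-type responses into subscale means. The item-to-subscale mapping, the reverse-coded items, and the 5-point scale shown here are placeholders; the validated item assignment is given in [12].

```python
# Illustrative scoring of a Likert-type attitude questionnaire such as the
# NARS used in [11][12]. The item grouping and reverse-coded items below are
# placeholders for illustration only; consult [12] for the validated scale.

from statistics import mean

SUBSCALES = {            # hypothetical grouping, not the published one
    "interaction": [1, 2, 3, 4],
    "social_influence": [5, 6, 7, 8, 9],
    "emotion": [10, 11, 12, 13, 14],
}
REVERSE_CODED = {3, 5, 6}  # placeholder set of reverse-coded items
LIKERT_MAX = 5             # 1 = strongly disagree ... 5 = strongly agree

def score_nars(responses: dict[int, int]) -> dict[str, float]:
    """Return the mean score of each subscale, reversing flagged items."""
    def value(item: int) -> int:
        raw = responses[item]
        return (LIKERT_MAX + 1 - raw) if item in REVERSE_CODED else raw
    return {name: mean(value(i) for i in items)
            for name, items in SUBSCALES.items()}

if __name__ == "__main__":
    example = {i: 3 for i in range(1, 15)}  # neutral answers to all 14 items
    print(score_nars(example))              # -> 3.0 on every subscale
```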

3. Care and Assistance

Another domain of application where social robots have emerged is assistance in health and daily care services. In [21][22], the increasing demand for elderly care and the roles of socially assistive robot (SAR) technology in this field are highlighted. It was proposed in [23] that socially assistive robots could support people with health conditions in social interactions, with the aim of improving their health and well-being. For elderly people, social robots can be embedded into their homes or care facilities and play different roles. In [24], the field of in-house assistance for older adults was reviewed. It is suggested that the deployments that have been made for robots for in-house assistance are mostly prototypes and that robots have yet to succeed as personal assistants. It is reported that in healthcare, robots have a variety of applications and can be classified into three categories: surgical, rehabilitation, and social robots. Social robots were herein further divided into service and companion categories, dedicated to assistance in one or more tasks or to user companionship, respectively. In the latter context, the appearance of the robot and the extent to which it resembles a human being were reported to affect its acceptability by end users [25][26]. Herein, the study made in [27] focused on the perception of care robots among end users. It addressed the end users' understandings, assumptions, and expectations of care robots. The study covered different groups of stakeholders such as line managers, frontline care staff, older people, and students training to become carers. Congruent frames were found between the groups regarding the understanding of the nature of care robots, but incongruent frames emerged between the groups when they were asked to sketch the ideal robot. The study identified adequate training, usability, and finances among potential criteria for the successful use of care robots. Perceptions of stakeholders of socially assistive robots were also addressed in [28], where a study on a robot known as Stevie was conducted with older adults and professional care workers in a retirement community. Focus groups were formed in which the robot was teleoperated, and observations and discussions were made. Staff and residents had different views and concerns about the robot, but both saw its potential utility and suggested many possible use cases. Care for older people has also been addressed in [29], where a user-centered, soft, bedside communication robot was developed and evaluated. A collaborative and iterative development process was adopted, involving different stakeholders. The resulting system helped improve the mood and behavior of participants and obtained positive engagement from their side. Another aspect of assistance was addressed in [30], where an approach to develop a robot for the assistance of workers with intellectual and developmental disabilities was presented.
For the care and assistance of older persons, Ambient Assisted Living (AAL) has been defined, as reported in [31] from [32], as “the use of information and communication technologies (ICT) in a person’s daily living and working environment to enable them to stay active longer, remain socially connected and live independently into old age”. AAL covers tasks such as observation and the detection of events such as falls, but goes beyond that to interact with users [31]. Herein, the integration of socially assistive robots into AAL has been shown in [33] to succeed in providing long-term support to older adults. It was reported that the robot incentivized the usage of the system but slightly lowered its overall acceptability. The Giraff-X (a version of the Giraff teleoperated robot [34][35]) was used as an embodiment for a virtual caregiver at the older person’s house.
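The following minimal sketch illustrates the AAL idea described above: monitor a sensor stream, detect an event such as a fall, and follow up with a social interaction rather than only an alarm. The threshold, the sensor format, and the robot's responses are hypothetical placeholders, not the design of the cited systems.

```python
# Minimal AAL-style sketch: detect a fall from accelerometer samples and let
# the robot check on the user before escalating. Threshold and responses are
# hypothetical placeholders.

import math

FALL_THRESHOLD_G = 2.5  # placeholder impact threshold in g

def detect_fall(accel_samples):
    """Return True if any acceleration magnitude exceeds the impact threshold."""
    return any(math.sqrt(x * x + y * y + z * z) > FALL_THRESHOLD_G
               for x, y, z in accel_samples)

def respond_to_event(fall_detected: bool) -> str:
    """Choose a follow-up action: check on the user before alerting caregivers."""
    if fall_detected:
        return "robot: navigate to user, ask 'Are you okay?', escalate if no answer"
    return "robot: continue routine check-ins"

if __name__ == "__main__":
    quiet = [(0.0, 0.0, 1.0)] * 50        # roughly 1 g of gravity, no event
    impact = quiet + [(2.1, 1.8, 2.4)]    # sudden spike above the threshold
    print(respond_to_event(detect_fall(quiet)))
    print(respond_to_event(detect_fall(impact)))
```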

4. Medicine

Another field of application of social robots is health care. Different usages can be found for social robots herein, ranging from the assistance of nurses to rehabilitation [36]. The interventions of socially assistive robots in supporting mental health in children have been reviewed in [37]. It was found that the contexts of the interventions affect their impacts. Indeed, the place, the culture of the user, and the robot used are all factors contributing to the outcomes of the intervention. The review covered different robotic platforms and reported consistent positive outcomes, such as relief of distress and an increase in positive affect, regardless of the robot used. However, disparities were seen between outcome measures, robots used, and study quality. The usage of a social robot was shown to have possible benefits in attention improvement for children with cognitive impairment [38]. Herein, a child-robot interaction was designed and implemented, consisting of several modules during which the children played short games with the robot, taking the capacities of the robot into account. Additionally, dementia was addressed in [39][40]. The research work in [40] focused on designing robots for dementia caregiving, addressing the needs of both the caregiver and the care recipient. This covered the intended purpose and functions of the robots, as “robots for joy”, “robots for repetition”, and “robots for wellness” were designed. Additionally, different morphologies and modalities for interacting with the robots, such as voice interaction using the voices of people with whom caregivers were familiar, were discussed. Moreover, different roles were assigned to robots, such as “the bad guy”, “the facilitator”, and “the counselor”. The SoftBank robot Pepper was used in [39] for encouraging exercise in dementia patients. Specifically, the study used simple dance moves as the exercise modality due to the engagement and repetitiveness of dancing. A heart-rate monitor was used to send feedback to the robot to adjust the intensity of the exercise, as sketched below, and preliminary results were reported to be promising. Pepper was also used in [41] in a system developed for the audiometry tests and rehabilitation of children with hearing disabilities. Positive and negative emotions of children were shown to be better distinguished when they interacted with the robot than in setups without the robot. Social anxiety disorder, a condition pushing people to fear social situations, was addressed in [42], where an overview of certain usages of social robots in clinical interventions was made. It was proposed that social robots can be used to complement the work of clinicians. Additionally, Pepper’s usage in health data acquisition was explored in [43], acting as a nurse assistant to reduce data registration workloads on nurses. A multimodal dialogue involving verbal, gesture, and screen display aspects was designed to facilitate the interaction between the robot and the patient. Evaluations made by patients and nurses showed the possible acceptability of the robot. Another usage of robots was shown in [22], not directly as a health assistant, but as an assistant in enhancing the skills of nursing students, specifically in patient transfer, where a patient is moved from a bed to a wheelchair and vice versa. The robot in this work simulated a patient to be transferred while measuring different motion parameters during the transfer and assessing whether the transfer was performed accurately by the nursing student.
Indeed, proper body mechanics need to be used in this task, which is indispensable to the patient’s daily life, especially with elderly patients affected by weakness in their limbs. Results showed that the robot can be a good substitute for an actual patient when practicing this task. Rehabilitation was addressed in [44], specifically the trust of users interacting with a rehabilitation robot. Exercises were performed at different velocities of robot motion, and data on participants’ heart rates and perception of safety were collected. Notably, the perception of safety was negatively affected by increasing velocity and exercise extent. Another application of socially assistive robots has been shown in [45], where robot prototypes that assist persons in sorting their medications, organizing the days and times pills should be taken, were developed and tested.
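The heart-rate feedback loop in [39] can be pictured with the simple sketch below: a reading from the monitor adjusts the tempo of the dance moves so that the participant stays in a target zone. The zone bounds, step size, and tempo limits are hypothetical values, not those of the cited study, and any real use would require clinical validation.

```python
# Illustrative feedback loop inspired by [39]: adjust exercise intensity
# (dance tempo) from heart-rate readings. All numeric values are placeholders.

def adjust_tempo(current_tempo: float, heart_rate: float,
                 hr_low: float = 90.0, hr_high: float = 110.0,
                 step: float = 5.0,
                 tempo_min: float = 60.0, tempo_max: float = 120.0) -> float:
    """Lower the tempo if the heart rate is above the target zone, raise it
    if below, and keep it unchanged while the heart rate is inside the zone."""
    if heart_rate > hr_high:
        tempo = current_tempo - step
    elif heart_rate < hr_low:
        tempo = current_tempo + step
    else:
        tempo = current_tempo
    return max(tempo_min, min(tempo_max, tempo))

if __name__ == "__main__":
    tempo = 90.0
    for hr in [85, 95, 112, 118, 104]:   # simulated heart-rate readings
        tempo = adjust_tempo(tempo, hr)
        print(f"heart rate {hr:>3} bpm -> exercise tempo {tempo:.0f} bpm")
```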

5. Autism Spectrum Disorders

Autism Spectrum Disorders (ASD) are characterized by impaired development of social communication and interaction, and by restricted and repetitive patterns of behavior, interests, or activities. Individuals with ASD have difficulties interacting and communicating with others [46][47], largely due to difficulties in understanding social cues and the behaviors and feelings of others. Research on information and communication technology (ICT) has been active in the domain of the education of people with autism [48]. In the same context, different works in socially assistive robotics have tackled the treatment of individuals with ASD, increasingly since 2000, with different directions of research [49]. Such works target the improvement of the social functioning of children with ASD [50]. One of these directions is imitation, as a deficit in imitation is a symptom of ASD. The study in [51] compared the body gesture imitation performance of participants with ASD and typically developing participants. It also compared this performance in robot-child and adult-child imitation tasks. In the presented experimental setup, both participants with typical development and participants with ASD performed better in the adult-child mode than in the robot-child mode, and participants with typical development showed better performance than participants with ASD. Similarly, in [52], children with typical development and children with ASD performed better with a therapist than with a robot in a joint attention elicitation task, and children with typical development performed better than children with ASD with the robot. Joint attention is related to social relationships and has been defined in [53] as a triadic interaction between two agents focusing on a single object. Another direction of research, where the usage of robots proved efficient in the enhancement of the social skills of children with ASD, was shown in [50]. Herein, a robot was used for an intervention in a social skill training program that consisted of different phases with and without the robot. Results showed that the intervention of the robot improved the social motivation and skills of children with ASD, measured by eye contact frequency and duration, and verbal initiation frequency. This improvement lasted even after the robot was withdrawn from the program. A similar result was obtained in [54], where children with ASD participated in sessions divided into sessions with the Kaspar robot and sessions with a human teacher. The usage of the robot increased interactions among children such as non-verbal imitation, touching, and attention duration. The Kaspar robot’s evolution is shown in [55]. Initially, the Kaspar robot was developed in 2005 for research in human-robot interaction; it was then adopted for investigation as a therapeutic device for children with ASD, which has been its primary application and target of improvement since then. The evolution of this robot benefited from the improved knowledge in the therapy of children with ASD and involved hardware modifications and improvements of sensory devices aimed at improving its usability and autonomy for child-robot interaction.
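A simplified geometric reading of the joint-attention definition cited from [53] is sketched below: two agents jointly attend to an object if both of their gaze directions point at it within an angular tolerance. The positions, gaze vectors, and tolerance are illustrative assumptions, not the detection method of the cited studies.

```python
# Toy check of joint attention per the definition in [53]: both agents'
# gaze rays point at the same object within a tolerance. Values are illustrative.

import math

def _angle_between(v1, v2) -> float:
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def looks_at(eye_pos, gaze_dir, target_pos, tol_deg: float = 10.0) -> bool:
    """True if the gaze direction deviates from the eye-to-target line by
    less than the tolerance."""
    to_target = tuple(t - e for e, t in zip(eye_pos, target_pos))
    return math.degrees(_angle_between(gaze_dir, to_target)) < tol_deg

def joint_attention(child, robot, obj) -> bool:
    """child and robot are (eye_position, gaze_direction) tuples."""
    return looks_at(*child, obj) and looks_at(*robot, obj)

if __name__ == "__main__":
    toy = (1.0, 0.0, 0.0)
    child = ((0.0, 0.5, 0.0), (1.0, -0.5, 0.0))   # looking roughly at the toy
    robot = ((0.0, -0.5, 0.0), (1.0, 0.5, 0.0))
    print(joint_attention(child, robot, toy))      # -> True
```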

6. Other Applications in Child Companionship

As reported, social robots have been used with children in applications such as education and ASD interventions. Nevertheless, other applications of social robots used with children can be reported from other domains such as entertainment, awareness raising, and cognition, perception, and behavioral studies. In [56], a social robot was used in a game intended to make children more aware of the importance of waste recycling. The game had two players, the child and the SoftBank robot Pepper, in addition to a human judge. The study reported promising results in changing children’s attitudes toward recycling and showed a positive evaluation of Pepper by children. Children’s gaze aversion was addressed in [57], where gaze aversion was reported, from other sources, to refer to a human’s reflexive redirection of the gaze away from a potentially distracting visual stimulus while solving a mentally demanding task, which facilitates thinking. The study evaluated the influence of the interaction with a humanoid robot on children’s gaze aversion. Results showed that gaze aversion rates increased when children interacted with other humans, or with robots that they were told were human-controlled, in contrast with their interactions with robots controlled by computers. These findings were linked to children’s perception of minds in their interaction agents. Child-robot interaction was also explored in [58] to develop a method for emotion recognition relying on functional infrared imaging. It allowed for the assessment of the level of child engagement while interacting with an artificial agent, and the presented work was said to constitute a step toward a more natural interaction between a child and an artificial agent, based on physiological signals. In [59], the focus was on implementing the behavior of a robot storyteller using an analysis of human storytellers. The effects of implementing emotions in the storytelling, contextual storyteller head movements, and fitting voices to characters were evaluated. Positive impacts on listeners were found for the emotional and voice-acting robot storytellers. Contextual head movements, on the other hand, did not affect the perception users formed of the robot storyteller. The usage of robots in childcare was addressed in [60], where the requirements, needs, and attitudes of working parents toward childcare social robots were identified. The study suggested socialization, education, entertainment, and expert counseling as childcare functions of social robots and created questionnaire items to explore different aspects of the parents’ views of these functions. The results suggested positive impacts of social robots in childcare through aspects such as social interactions and entertainment. Different parenting conditions, such as parenting styles (work-oriented, dominant, and so on) and children’s ages, were reported to change parents’ needs for specific childcare functions. This implies that robots can be strategically designed and introduced to customer groups in line with their characteristics. The study shown in [61] focused on games involving humans and robots in physical and demanding activities. Herein, robots need to be perceived as rational agents aiming to win the game, and the study focused on deciding and communicating deceptive behaviors in robots. This strategy improved human-robot interaction by helping robots meet the expectation of interacting people, who attribute rationality to their robot companion.
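As a small illustration of the kind of measure analyzed in [57], the sketch below computes a gaze-aversion rate as the share of annotated frames in which a child looks away from the interaction partner while formulating an answer. The frame labels and the example numbers are hypothetical, not data from the cited study.

```python
# Illustrative gaze-aversion rate of the kind analyzed in [57]. The frame
# annotations and example values are hypothetical placeholders.

def gaze_aversion_rate(frames: list[str]) -> float:
    """frames holds one label per video frame, e.g. 'partner' or 'away';
    the rate is the share of frames labeled as averted gaze."""
    if not frames:
        return 0.0
    return sum(1 for f in frames if f == "away") / len(frames)

if __name__ == "__main__":
    human_partner = ["partner"] * 6 + ["away"] * 4   # 40% aversion
    computer_robot = ["partner"] * 9 + ["away"] * 1  # 10% aversion
    print(gaze_aversion_rate(human_partner))
    print(gaze_aversion_rate(computer_robot))
```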
Another field of research where social robotics has applications is affective computing, which aims to understand the affect of a person using specific signals and modalities, and which has been applied in education, for example [62]. In [63], a children’s companion robot was equipped with the capacity for real-time affective computing, allowing the robot to adapt its behavior to the affect of the child it is interacting with, thereby improving the interaction.
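A minimal sketch of such an adaptation loop is given below, assuming a detected affective state expressed as valence and arousal that is mapped to a high-level behavior for the next interaction turn. The states, behaviors, and selection rule are illustrative placeholders, not the system described in [63].

```python
# Minimal affect-adaptive behavior loop for a companion robot, in the spirit
# of [63]. States, thresholds, and behaviors are illustrative placeholders.

from typing import NamedTuple

class Affect(NamedTuple):
    valence: float   # -1 (negative) .. +1 (positive)
    arousal: float   #  0 (calm)     .. 1 (excited)

def choose_behavior(affect: Affect) -> str:
    """Pick a high-level behavior for the next interaction turn."""
    if affect.valence < -0.3:
        return "comfort: soft voice, slow gestures, offer an easier activity"
    if affect.arousal < 0.3:
        return "stimulate: upbeat voice, propose a new game"
    return "maintain: continue the current activity at the same pace"

if __name__ == "__main__":
    for state in [Affect(-0.6, 0.7), Affect(0.4, 0.1), Affect(0.5, 0.6)]:
        print(state, "->", choose_behavior(state))
```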

7. Other Domains of Research and Application

As stated in [64], human-robot interaction has been explored with children, adults, and seniors, but it has been less explored with teens. The authors in [64] state that designing robots for interaction with teens is different from other types of human-robot interaction. Additionally, aside from the different domains of research and application that have already been shown, social robots have been used and explored in different contexts and for different objectives. For instance, “edutainment”, where robots participate in people’s education and entertainment, can be mentioned [65]. Additionally, several studies have been made to improve human-robot interaction by embedding human social skills in robots. For example, in [66], instead of using pre-programmed, manually crafted gestures, a humanoid robot learned, using neural networks and a database of TED talks, to generate gestures from the uttered speech as humans would. Storytelling is also a field of human-robot interaction where different aspects of robot behavior can be explored [67]. A service robot’s ability to adapt its behavior was also addressed by implementing a human-like thought process, where behavior can be defined as a combination of a facial expression, a gesture, and a movement. Social intelligence and familiarity with robots have also been the objective in [68][69]. In [70], a robot was equipped with the ability to assess whether the human interaction partner is lying or not, for the purpose of assessing their trustworthiness and improving the interaction. In [68], a robot used deep neural networks to learn human behavior based on data it gathered during its interactions. The purpose was to choose the most appropriate action among waiting, looking toward the human, waving, and handshaking, as illustrated in the sketch after this paragraph. Additionally, in the context of social intelligence, a vision-based framework allowing robots to recognize and respond to hand-waving gestures was presented in [71], increasing their social believability. Furthermore, a humanoid robot was endowed with human-like, enthusiastic welcoming behaviors for the purpose of drawing the attention of persons entering a building in [72]. In a related application in terms of constraints, a flyer-distributing robot for pedestrians in a shopping mall was presented in [73]. Indeed, the robot needed to draw the attention of pedestrians, plan its motions, and behave in a manner that helped make the pedestrians accept the flyers. Additionally, a guide robot was developed in [74] for a science museum. The robot had the ability to build relationships with humans through friendly attitudes and was positively evaluated by visitors. Finally, a design and framework were shown in [75] for a robot intended to be used as a receptionist in a university. The platform consisted of an animatronic head with several degrees of freedom and the capacity to engage in conversations without a priori information about the questions it may have to answer. Such an application is an example of how a social robot can combine aspects of design, hardware, software, artificial intelligence, and communication to play roles that are usually attributed to humans.
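The action-selection step in a system like [68] can be hinted at with the hedged sketch below, where the deep network of the original work is replaced by a placeholder scoring function over a toy state and only the epsilon-greedy choice among waiting, looking toward the human, waving, and handshaking is illustrated.

```python
# Hedged sketch of epsilon-greedy action selection among social actions, in
# the spirit of [68]. The Q-function below is a hand-written placeholder for
# a learned network; state features and scores are illustrative.

import random

ACTIONS = ["wait", "look_toward_human", "wave", "handshake"]

def q_values(state: dict) -> list[float]:
    """Placeholder scores standing in for a learned network's output."""
    person_close = 1.0 if state["distance_m"] < 1.5 else 0.0
    facing = 1.0 if state["facing_robot"] else 0.0
    return [
        0.2,                          # wait
        0.4 + 0.3 * (1 - facing),     # look toward the human
        0.5 * facing,                 # wave
        0.6 * person_close * facing,  # handshake only when close and facing
    ]

def select_action(state: dict, epsilon: float = 0.1) -> str:
    """Epsilon-greedy choice: mostly exploit the best-scored action."""
    if random.random() < epsilon:
        return random.choice(ACTIONS)
    scores = q_values(state)
    return ACTIONS[scores.index(max(scores))]

if __name__ == "__main__":
    random.seed(0)
    print(select_action({"distance_m": 1.0, "facing_robot": True}))   # likely 'handshake'
    print(select_action({"distance_m": 3.0, "facing_robot": False}))  # likely 'look_toward_human'
```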

References

  1. Belay Tuli, T.; Olana Terefe, T.; Ur Rashid, M.M. Telepresence Mobile Robots Design and Control for Social Interaction. Int. J. Soc. Robot. 2021, 13, 877–886.
  2. Shiarlis, K.; Messias, J.; Whiteson, S. Acquiring Social Interaction Behaviours for Telepresence Robots via Deep Learning from Demonstration. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017.
  3. Shiarlis, K.; Messias, J.; van Someren, M.; Whiteson, S.; Kim, J.; Vroon, J.; Englebienne, G.; Truong, K.; Evers, V.; Pérez-Higueras, N.; et al. TERESA: A Socially Intelligent SEmi-autonomous Telepresence System. In Proceedings of the International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015.
  4. Niemela, M.; van Aerschot, L.; Tammela, A.; Aaltonen, L.; Lammi, H. Towards Ethical Guidelines of Using Telepresence Robots in Residential Care. Int. J. Soc. Robot. 2019, 13, 431–439.
  5. Double Robotics—Telepresence Robot for the Hybrid Office. Available online: https://www.doublerobotics.com/ (accessed on 1 March 2022).
  6. Zhang, G.; Hansen, J.P.; Minakata, K.; Alapetite, A.; Wang, Z. Eye-Gaze-Controlled Telepresence Robots for People with Motor Disabilities. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019.
  7. Mubin, O.; Alhashmi, M.; Baroud, R.; Alnajjar, F.S. Humanoid Robots as Teaching Assistants in an Arab School. In Proceedings of the 31st Australian Conference on Human-Computer Interaction, Fremantle, Australia, 2–5 December 2019.
  8. Mispa, T.A.; Sojib, N. Educational Robot Kiddo Learns to Draw to Enhance Interactive Handwriting Scenario for Primary School Children. In Proceedings of the 3rd International Conference on Intelligent Robotic and Control Engineering (IRCE), Oxford, UK, 10–12 August 2020.
  9. Schodde, T.; Bergmann, K.; Kopp, S. Adaptive Robot Language Tutoring Based on Bayesian Knowledge Tracing and Predictive Decision-Making. In Proceedings of the 12th ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017.
  10. Mubin, O.; Stevens, C.J.; Shahid, S.; Al Mahmud, A.; Dong, J.J. A Review of the Applicability of Robots in Education. Technol. Educ. Learn. 2013, 1, 13.
  11. Xia, Y.; LeTendre, G. Robots for Future Classrooms: A Cross-Cultural Validation Study of “Negative Attitudes Toward Robots Scale” in the U.S. Context. Int. J. Soc. Robot. 2021, 13, 703–714.
  12. Nomura, T.; Kanda, T.; Suzuki, T. Experimental investigation into influence of negative attitudes toward robots on human-robot interaction. Ai Soc. 2006, 20, 138–150.
  13. Kanero, J.; Oranc, C.; Koskulu, S.; Kumkale, G.T.; Goksun, T.; Kuntay, A.C. Are Tutor Robots for Everyone? The Influence of Attitudes, Anxiety, and Personality on Robot-Led Language Learning. Int. J. Soc. Robot. 2022, 14, 297–312.
  14. Vogt, P.; van den Berghe, R.; de Haas, M.; Hoffman, L.; Kanero, J.; Mamus, E.; Montanier, J.M.; Oranc, C.; Oudgenoeg-Paz, O.; Hernandez Garcia, D.; et al. Second language tutoring using social robots: L2TOR—The movie. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019.
  15. Vogt, P.; van den Berghe, R.; de Haas, M.; Hoffman, L.; Kanero, J.; Mamus, E.; Montanier, J.M.; Oranc, C.; Oudgenoeg-Paz, O.; Hernandez Garcia, D.; et al. Second Language Tutoring Using Social Robots: A Large-Scale Study. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019.
  16. Engwall, O.; Lopes, J.; Ahlund, A. Robot Interaction Styles for Conversation Practice in Second Language Learning. Int. J. Soc. Robot. 2020, 13, 251–276.
  17. Schicchi, D.; Pilato, G. A Social Humanoid Robot as a Playfellow for Vocabulary Enhancement. In Proceedings of the Second IEEE International Conference on Robotic Computing, Laguna Hills, CA, USA, 31 January–2 February 2018.
  18. Reyes, G.E.B.; Lopez, E.; Ponce, P.; Mazon, N. Role Assignment Analysis of an Assistive Robotic Platform in a High School Mathematics Class, Through a Gamification and Usability Evaluation. Int. J. Soc. Robot. 2021, 13, 1063–1078.
  19. Shimaya, J.; Yoshikawa, Y.; Palinko, O.; Ogawa, K.; Jinnai, N.; Ishiguro, H. Active Participation in Lectures via a Collaboratively Controlled Robot. Int. J. Soc. Robot. 2021, 13, 587–598.
  20. Hood, D.; Lemaignan, S.; Dillenbourg, P. When Children Teach a Robot to Write: An Autonomous Teachable Humanoid Which Uses Simulated Handwriting. In Proceedings of the 10th ACM/IEEE International Conference on Human-Robot Interaction, Portland, OR, USA, 2–5 March 2015.
  21. Abdi, J.; Al-Hindawi, A.; Ng, T.; Vizcaychipi, M.P. Scoping review on the use of socially assistive robot technology in elderly care. BMJ Open 2017, 8, e018815.
  22. Lin, C.; Ogata, T.; Zhong, Z.; Kanai-Pak, M.; Maeda, J.; Kitajima, Y.; Nakamura, M.; Kuwahara, N.; Ota, J. Development and Validation of Robot Patient Equipped with an Inertial Measurement Unit and Angular Position Sensors to Evaluate Transfer Skills of Nurses. Int. J. Soc. Robot. 2021, 13, 899–917.
  23. Meia, C.T.; Scheutz, M. Assistive Robots for the Social Management of Health: A Framework for Robot Design and Human–Robot Interaction Research. Int. J. Soc. Robot. 2021, 13, 197–217.
  24. Bardaro, G.; Antonini, A.; Motta, E. Robots for Elderly Care in the Home: A Landscape Analysis and Co-Design Toolkit. Int. J. Soc. Robot. 2022, 14, 657–681.
  25. Obayashi, K.; Kodate, N.; Masuyama, S. Enhancing older people’s activity and participation with socially assistive robots: A multicentre quasi-experimental study using the ICF framework. Adv. Robot. 2018, 32, 1207–1216.
  26. Broadbent, E.; Stafford, R.; MacDonald, B. Acceptance of Healthcare Robots for the Older Population: Review and Future Directions. Int. J. Soc. Robot. 2009, 1, 319–330.
  27. Frennert, S.; Aminoff, H.; Ostlund, B. Technological Frames and Care Robots in Eldercare. Int. J. Soc. Robot. 2021, 13, 317–325.
  28. McGinn, C.; Bourke, E.; Murtagh, A.; Donovan, C.; Cullinan, M.F. Meeting Stevie: Perceptions of a Socially Assistive Robot by Residents and Staff in a Long-term Care Facility. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019.
  29. Obayashi, K.; Kodate, N.; Masuyama, S. Assessing the Impact of an Original Soft Communicative Robot in a Nursing Home in Japan: Will Softness or Conversations Bring more Smiles to Older People? Int. J. Soc. Robot. 2022, 14, 645–656.
  30. Williams, A.B.; Williams, R.M.; Moore, R.E.; McFarlane, M. AIDA: A Social Co-Robot to Uplift Workers with Intellectual and Developmental Disabilities. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019.
  31. Monekosso, D.; Florez-Revuelta, F.; Remagnino, P. Ambient Assisted Living. IEEE Intell. Syst. 2015, 30, 2–6.
  32. AAL Home 2020—AAL Programme. Available online: www.aal-europe.eu (accessed on 15 February 2022).
  33. Luperto, M.; Monroy, J.; Renoux, J.; Lunardini, F.; Basilico, N.; Bulgheroni, M.; Cangelosi, A.; Cesari, M.; Cid, M.; Ianes, A.; et al. Integrating Social Assistive Robots, IoT, Virtual Communities and Smart Objects to Assist at-Home Independently Living Elders: The MoveCare Project. Int. J. Soc. Robot. 2022, 14, 1–31.
  34. Casiddu, N.; Cesta, A.; Cortellessa, G.; Orlandini, A.; Porfirione, C.; Divano, A.; Micheli, E.; Zallio, M. Robot Interface Design: The Giraff Telepresence Robot for Social Interaction. Biosyst. Biorobot. 2015, 11, 499–509.
  35. Coradeschi, S.; Cesta, A.; Cortellessa, G.; Coraci, L.; Galindo, C.; González-Jiménez, J.; Karlsson, L.; Forsberg, A.; Frennert, S.; Furfari, F.; et al. GiraffPlus: A System for Monitoring Activities and Physiological Parameters and Promoting Social Interaction for Elderly. Adv. Intell. Syst. Comput. 2014, 300, 261–271.
  36. Karar, A.; Said, S.; Beyrouthy, T. Pepper Humanoid Robot as a Service Robot: A Customer Approach. In Proceedings of the 2019 3rd International Conference on Bio-Engineering for Smart Technologies (BioSMART), Paris, France, 24–26 April 2019; pp. 1–4.
  37. Kabacinska, K.; Prescott, T.J.; Robillard, J.M. Socially Assistive Robots as Mental Health Interventions for Children: A Scoping Review. Int. J. Soc. Robot. 2021, 13, 919–935.
  38. Ismail, L.I.; Hanapiah, F.A.; Belpaeme, T.; Dambre, J.; Wyffels, F. Analysis of Attention in Child-Robot Interaction Among Children Diagnosed with Cognitive Impairment. Int. J. Soc. Robot. 2021, 13, 141–152.
  39. Schrum, M.; Park, C.H.; Howard, A. Humanoid Therapy Robot for Encouraging Exercise in Dementia Patients. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019.
  40. Moharana, S.; Panduro, A.E.; Lee, H.R.; Rick, L.D. Robots for Joy, Robots for Sorrow: Community Based Robot Design for Dementia Caregivers. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019.
  41. Uluer, P.; Kose, H.; Gumuslu, E.; Erol Barkana, D. Experience with an Affective Robot Assistant for Children with Hearing Disabilities. Int. J. Soc. Robot. 2021, 16, 1–8.
  42. Rasouli, S.; Gupta, G.; Nilsen, E.; Dautenhahn, K. Potential Applications of Social Robots in Robot-Assisted Interventions for Social Anxiety. Int. J. Soc. Robot. 2022. ahead of printing.
  43. Van der Putte, D.; Boumans, R.; Neerincx, M.; Rikkert, M.O.; De Mul, M. A Social Robot for Autonomous Health Data Acquisition among Hospitalized Patients: An Exploratory Field Study. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019.
  44. Nielsen, C.; Mathiesen, M.; Nielsen, J.; Jensen, L.C. Changes in Heart Rate and Feeling of Safety when Led by a Rehabilitation Robot. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019.
  45. Wilson, J.R.; Lee, N.Y.; Saechao, A.; Tickle-Degnen, L.; Scheutz, M. Supporting Human Autonomy in a Robot-Assisted Medication Sorting Task. Int. J. Soc. Robot. 2018, 10, 621–641.
  46. Chatbots|GPT-3 Demo. Available online: https://gpt3demo.com/category/chatbots (accessed on 13 April 2022).
  47. Delaherche, E.; Chetouani, M.; Bigouret, F.; Xavier, J.; Plaza, M.; Cohen, D. Assessment of the communicative and coordination skills of children with Autism Spectrum Disorders and typically developing children using social signal processing. Res. Autism Spectr. Disord. 2013, 7, 741–756.
  48. Boucenna, S.; Narzisi, A.; Tilmont, E.; Muratori, F.; Pioggia, G.; Cohen, D.; Chetouani, M. Interactive Technologies for Autistic Children: A Review. Cogn. Comput. 2014, 6, 722–740.
  49. Chetouani, M.; Boucenna, S.; Chaby, L.; Plaza, M.; Cohen, D. Social Signal Processing and Socially Assistive Robotics in Developmental Disorders; Cambridge University Press: Cambridge, UK, 2017; pp. 389–403.
  50. Chung, E.Y.H. Robot-Mediated Social Skill Intervention Programme for Children with Autism Spectrum Disorder: An ABA Time-Series Study. Int. J. Soc. Robot. 2021, 13, 1095–1107.
  51. Taheri, A.; Meghdari, A.; Mahoor, M.H. A Close Look at the Imitation Performance of Children with Autism and Typically Developing Children Using a Robotic System. Int. J. Soc. Robot. 2021, 13, 1125–1147.
  52. Anzalone, S.M.; Tilmont, E.; Boucenna, S.; Xavier, J.; Jouen, A.L.; Bodeau, N.; Maharatna, K.; Chetouani, M.; Cohen, D.; the MICHELANGELO Study Group. How children with autism spectrum disorder behave and explore the 4-dimensional (spatial 3D + time) environment during a joint attention induction task with a robot. Res. Autism Spectr. Disord. 2014, 8, 814–826.
  53. Emery, N. The eyes have it: The neuroethology, function and evolution of social gaze. Neurosci. Biobehav. Rev. 2000, 24, 581–604.
  54. Huijnen, C.A.G.J.; Verreussel-Willen, H.A.M.D.; Lexis, M.A.S.; de Witte, L.P. Robot KASPAR as Mediator in Making Contact with Children with Autism: A Pilot Study. Int. J. Soc. Robot. 2021, 13, 237–249.
  55. Wood, L.J.; Zaraki, A.; Robins, B.; Dautenhahn, K. Developing Kaspar: A Humanoid Robot for Children with Autism. Int. J. Soc. Robot. 2021, 13, 491–508.
  56. Castellano, G.; De Carolis, B.; D’Errico, F.; Macchiarulo, N.; Rossano, V. PeppeRecycle: Improving Children’s Attitude Toward Recycling by Playing with a Social Robot. Int. J. Soc. Robot. 2021, 13, 97–111.
  57. Desideri, L.; Bonifacci, P.; Croati, G.; Dalena, A.; Gesualdo, M.; Molinario, G.; Gherardini, A.; Cesario, L.; Ottaviani, C. The Mind in the Machine: Mind Perception Modulates Gaze Aversion During Child-Robot Interaction. Int. J. Soc. Robot. 2021, 13, 599–614.
  58. Filippini, C.; Spadolini, E.; Cardone, D.; Bianchi, D.; Preziuso, M.; Sciarretta, C.; del Cimmuto, V.; Lisciani, D.; Merla, A. Facilitating the Child-Robot Interaction by Endowing the Robot with the Capability of Understanding the Child Engagement: The Case of Mio Amico Robot. Int. J. Soc. Robot. 2021, 13, 677–689.
  59. Striepe, H.; Donnermann, M.; Lein, M.; Lugrin, B. Modeling and Evaluating Emotion, Contextual Head Movement and Voices for a Social Robot Storyteller. Int. J. Soc. Robot. 2021, 13, 441–457.
  60. Lee, J.; Lee, D.; Lee, J.G. Can Robots Help Working Parents with Childcare? Optimizing Childcare Functions for Different Parenting Characteristics. Int. J. Soc. Robot. 2022, 14, 193–201.
  61. de Oliveira, E.; Donadoni, L.; Boriero, S.; Bonarini, A. Deceptive Actions to Improve the Attribution of Rationality to Playing Robotic Agents. Int. J. Soc. Robot. 2021, 13, 391–405.
  62. Wu, C.H.; Huang, Y.M.; Hwang, J.P. Review of affective computing in education/learning: Trends and challenges. Br. J. Educ. Technol. 2016, 47, 1304–1323.
  63. Zheng, M.; She, Y.; Chen, J.; Shu, Y.; XiaHou, J. BabeBay—A Companion Robot for Children Based on Multimodal Affective Computing. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019.
  64. Bjorling, E.A.; Rose, E.; Davidson, A.; Ren, R.; Wong, D. Can We Keep Him Forever? Teens’ Engagement and Desire for Emotional Connection with a Social Robot. Int. J. Soc. Robot. 2020, 12, 65–77.
  65. Gonzalez-Pacheco, V.; Ramey, A.; Alonso-Martin, F.; Castro-Gonzalez, A.; Salichs, M.A. Maggie: A Social Robot as a Gaming Platform. Int. J. Soc. Robot. 2011, 3, 371–381.
  66. Yoon, Y.; Ko, W.R.; Jang, M.; Lee, J.; Kim, J.; Lee, G. Robots Learn Social Skills: End-to-End Learning of Co-Speech Gesture Generation for Humanoid Robots. In Proceedings of the International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019.
  67. Mutlu, B.; Forlizzi, J.; Hodgins, J. A Storytelling Robot: Modeling and Evaluation of Human-like Gaze Behavior. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots, Genova, Italy, 4–6 December 2006.
  68. Qureshi, A.H.; Nakamura, Y.; Yoshikawa, Y.; Ishiguro, H. Robot gains Social Intelligence through Multimodal Deep Reinforcement Learning. In Proceedings of the IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids), Cancun, Mexico, 15–17 November 2016.
  69. Hsieh, W.F.; Sato-Shimokawara, E.; Yamaguchi, T. Enhancing the Familiarity for Humanoid Robot Pepper by Adopting Customizable Motion. In Proceedings of the IECON 2017—43rd Annual Conference of the IEEE Industrial Electronics Society, Beijing, China, 29 October–1 November 2017.
  70. Pasquali, D.; Gonzalez-Billandon, J.; Aroyo, A.M.; Sandini, G.; Sciutti, A.; Rea, F. Detecting Lies is a Child (Robot)’s Play: Gaze-Based Lie Detection in HRI. Int. J. Soc. Robot. 2021.
  71. Castellano, G.; Cervelione, A.; Cianciotta, M.; De Carolis, B.; Vessio, G. Recognizing the Waving Gesture in the Interaction with a Social Robot. In Proceedings of the 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 31 August–4 September 2020.
  72. Saad, E.; Broekens, J.; Neerincx, M.A.; Hindriks, K.V. Enthusiastic Robots Make Better Contact. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019.
  73. Shi, C.; Satake, S.; Kanda, T.; Ishiguro, H. A Robot that Distributes Flyers to Pedestrians in a Shopping Mall. Int. J. Soc. Robot. 2018, 10, 421–437.
  74. Iio, T.; Satake, S.; Kanda, T.; Hayashi, K.; Ferreri, F.; Hagita, N. Human-Like Guide Robot that Proactively Explains Exhibits. Int. J. Soc. Robot. 2020, 12, 549–566.
  75. Youssef, K.; Said, S.; Beyrouthy, T.; Alkork, S. A Social Robot with Conversational Capabilities for Visitor Reception: Design and Framework. In Proceedings of the 2021 4th International Conference on Bio-Engineering for Smart Technologies (BioSMART), Paris/Créteil, France, 8–10 December 2021; pp. 1–4.