Mohd Yamin, M.N.; Ab. Aziz, K.; Gek Siang, T.; Ab. Aziz, N.A. Emotion Recognition Systems. Encyclopedia. Available online: https://encyclopedia.pub/entry/51501 (accessed on 19 May 2024).
Emotion Recognition Systems

Emotion recognition systems (ERS) are an emerging technology with immense potential, exemplifying the innovative utilization of artificial intelligence (AI) within the context of the fourth industrial revolution (IR 4.0). Given that personalization is a key feature of the fifth industrial revolution (IR 5.0), ERS has the potential to serve as an enabler for IR 5.0. Furthermore, the COVID-19 pandemic has increased the relevance of this technology as work processes were adapted for social distancing and the use of face masks. Even in the post-pandemic era, many individuals continue to wear face masks. Therefore, ERS offers a technological solution to address communication challenges in a masked world. The existing body of knowledge on ERS primarily focuses on exploring modalities or modes for emotion recognition, system development, and the creation of applications utilizing emotion recognition functions.

emotion recognition system; fourth industrial revolution; fifth industrial revolution; artificial intelligence

1. Introduction

Artificial intelligence (AI) has evolved from an interesting theoretical concept into a tangible reality, with recent applications of AI making significant impacts across businesses, industries, and societies [1]. Inspired by human intelligence, AI aims to learn, reason, and make decisions like humans, reducing the need for human intervention [2]. Trustworthy AI requires that AI systems uphold the principles of beneficence, non-maleficence, autonomy, justice, and explicability [2][3][4]. AI systems are designed to operate with varying levels of autonomy and, for defined objectives, to make predictions and recommendations that influence real or virtual environments. Furthermore, AI offers benefits for businesses and industries, such as the automation of repetitive and time-consuming tasks, which frees humans to focus on higher-value work [4]. For example, massive datasets that were once challenging to analyze are now easily processed by AI, and complex problems can be tackled more efficiently by integrating thousands of computers and other resources [2][3][4].
Moving ahead to the present, we find ourselves in the era of the fourth industrial revolution (IR 4.0), characterized by digitalization and the integration of AI and computers in collaboration with societies [5]. IR 4.0 primarily emphasizes the manufacturing industry, enabling smart manufacturing through technologies such as AI [6]. The constant nature of technological change is propelling us towards the fifth industrial revolution (IR 5.0). What distinguishes IR 5.0 from IR 4.0 is the specialization of machines and computers, endowed with the capability to comprehend human actions [7][8]. Aspects of human–computer interaction (HCI) will become more significant as we move into IR 5.0. Emotion recognition systems (ERS) are well positioned to be a key enabling technology for this transition, as they can equip AI with the ability to understand human emotions and behavioral responses. To facilitate HCI, the computer system must be able to communicate with humans in some form [8]. Since the use cases of IR 5.0 are still in their formative years, manufacturers must actively consider strategies to integrate humans and machines and maximize the opportunities that IR 5.0 presents. Hence, the introduction of ERS may enhance robots' and machines' understanding of human emotions, in line with the concept of collaborative robots.
ERS is an emerging technology in the field of AI that allows machines to recognize human emotions by learning from various data modalities. It has gained significance due to technological advancements and potential applications. ERS was initially introduced as part of affective computing (AC) by [9] to predict and understand human behavior, and AC has since evolved to achieve recent advancements in recognizing emotions. Over the past decade, researchers have developed several emotion recognition systems, which are now commonly embedded in various AI applications [10]. ERS offers a wide range of potential innovative solutions based on modalities introduced by previous researchers [11]. It has attracted significant interest from researchers, as evidenced by the increasing trend in studies related to ERS over the past decade, as shown in Figure 1. This underscores the importance of the research area. For the development of complex systems such as human-interacting robots, a sub-system capable of understanding and expressing human emotions has been proposed [6]. Previous studies indicate that ERS holds promise as a significant technology, offering advantages to individuals, societies, organizations, businesses, and industries across various platforms and applications. Examples include ERS in healthcare [12], driving assistance [13], and enhancing teaching and learning technologies in the education sector [14].
Figure 1. ERS trend (https://link.lens.org/MjQS1wQaWkj), accessed on 10 June 2023.
ERS is an advanced AI application that utilizes affective computing to understand and respond to human cues [15] and has become increasingly significant in its field of study since being introduced by [9]. ERS enhances AI in human–computer interaction and represents a further step in technological progress [16]. AI, as the basis for intelligent machines and computers that enhance productivity in different settings [17], forms the foundation for incorporating emotion recognition as a subset of AI technology. Over the last decade, researchers and innovators have explored ERS modalities comprising physiological signals, physical cues, and data mining from text or documents. These modalities have been suggested as part of the innovation that enables AI to perform tasks such as learning and understanding human emotions [18].
Physiological modalities are commonly found within the healthcare industry and include electroencephalography (EEG), electrocardiography (ECG), and photoplethysmography (PPG). EEG is an analytical tool used in neuroscience, neural engineering, and biomedical engineering to measure human brain signals by recording the brain's electrical activity [19][20]. EEG is a preferred modality for accurate data in automated emotion recognition, as it aligns well with AI systems that employ convolutional neural networks and deep learning [21][22]. It has been tested in detecting human emotions and is considered a cost-effective, portable, and simple method of identifying emotions [23]. ECG is one of the best-known modalities and is commonly used in emotion recognition and affective computing research. Previous studies have utilized ECG to detect stress and emphasize the importance of monitoring emotional stress levels to prevent negative outcomes [12]. Machine-based ERS utilizing ECG provides an alternative to physical modalities. PPG, along with the galvanic skin response (GSR), is considered a practical and suitable modality for real-life applications [24].
Physical modalities include facial recognition, speech recognition, body movement, and others. Facial recognition and speech recognition are well-known physical modalities among ERS researchers and have been extensively utilized in previous works [25][26]. Facial recognition in particular has gained attention among ERS practitioners due to its wide range of real-world applications, including security supervision, online learning, and gaming experiences [26]. Speech recognition, as an ERS modality, is capable of identifying human feelings and “makes conventional speech emotion recognition (SER) more suitable for real-life applications” [27] (p. 1). According to [28], one of the earliest instances of detecting human emotion was through speech recognition: based on a person’s voice, the computer can identify emotive cues and determine the person’s emotion. Combining modalities also yields better results in enabling ERS. For example, a study by [29] suggested that combining modalities such as EEG and facial recognition compensates for their defects as single information sources.
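The multimodal combination described above is often realized as a late fusion of per-modality predictions. The following sketch is illustrative only (the emotion labels, scores, and equal weighting are assumptions, not taken from the cited studies): it averages the probability distributions produced by two hypothetical models, one per modality, and picks the most likely emotion.

```python
# Hypothetical late-fusion sketch: average emotion probabilities from
# two modalities (e.g., an EEG model and a facial-expression model).
EMOTIONS = ["happy", "sad", "angry", "neutral"]  # assumed label set

def fuse(eeg_probs, face_probs, w_eeg=0.5):
    """Weighted average of two probability distributions over EMOTIONS."""
    fused = {e: w_eeg * eeg_probs[e] + (1 - w_eeg) * face_probs[e]
             for e in EMOTIONS}
    total = sum(fused.values())          # renormalize for safety
    return {e: p / total for e, p in fused.items()}

def predict(probs):
    """Return the emotion with the highest fused probability."""
    return max(probs, key=probs.get)

# Toy per-modality outputs (assumed, for illustration only):
eeg = {"happy": 0.6, "sad": 0.1, "angry": 0.1, "neutral": 0.2}
face = {"happy": 0.3, "sad": 0.4, "angry": 0.2, "neutral": 0.1}
print(predict(fuse(eeg, face)))  # → happy
```

The design choice here is the simplest fusion rule; the studies cited above use more elaborate schemes (e.g., feature-level fusion or evidence combination), but the principle of letting one modality compensate for another's weak prediction is the same.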
Text data mining refers to machine learning techniques that involve learning-based algorithms and feature extraction to describe the main characteristics of textual data [30]. In a recent study by [31], text mining using emotion-label lexicons, such as a small set of seed words, was employed. For example, the text “Hurray!” can be labeled as indicating happiness, while “Argh!” may represent anger and frustration. Nevertheless, certain words may carry overlapping emotions; for instance, the word “Aww” can convey both pleasant sentiment and expressions of pity and sympathy [31][32]. Various applications leveraging data and text mining for the automatic recognition of sentiments or emotions can be observed, particularly in eliciting opinions related to marketing or promotional content from sources such as blog posts, social media, articles, and surveys [33]. This can be applied on the web, for instance to chats on social networks, by analyzing their sentiments and emotions. Moreover, deep learning applied to text-based emojis, such as smilies, symbols, and characters, can be used to further classify emotions [34].
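The seed-word approach described above can be sketched in a few lines. This is a minimal illustration under assumed labels (the tiny lexicon and its emotion tags are invented for this example, not a published resource such as the one used in [31]); it simply maps tokens to candidate emotions, including ambiguous words that map to several.

```python
import re

# Illustrative seed-word lexicon (assumed labels, not a real resource).
# Ambiguous words map to a tuple of candidate emotions.
LEXICON = {
    "hurray": "happy",
    "argh": "anger",
    "aww": ("pleasant", "pity"),  # overlapping emotions, as noted above
}

def label_emotions(text):
    """Return the set of candidate emotions cued by seed words in text."""
    found = set()
    for token in re.findall(r"[a-z']+", text.lower()):
        emo = LEXICON.get(token)
        if emo is None:
            continue
        found.update(emo if isinstance(emo, tuple) else (emo,))
    return found

print(label_emotions("Hurray! We did it"))  # → {'happy'}
```

A real lexicon-based system would expand the seed set (e.g., via word embeddings) and weight or disambiguate overlapping labels using context, which is exactly where the machine learning techniques cited above come in.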

2. Emotion Recognition Systems Applications

ERS holds the potential to be applied in, and bring benefits to, various sectors due to its adaptability as an embedded technological function within a system. In other words, it can be one of the functionalities used to process the inputs of AI-enabled smart machines and computers to enable higher levels of HCI. ERS has been identified as having the potential to benefit the education sector, as it can enable better engagement between instructors and learners [14][35]. Emotions exert a noteworthy influence on academic performance, with positive emotions being particularly instrumental in enhancing student interest and focus and increasing the likelihood of academic success [35]. Instructors have benefited from using a webcam equipped with facial recognition technology to identify students’ moods [14]. Another example of an ERS application is its integration into a smart car [13]. A driver’s performance can be influenced by their emotions, particularly given their impact on the driver’s focus. Therefore, ERS is significant for applications that ensure driving safety. Specifically, [13] used a driving simulation with an ECG modality built into the steering wheel to capture cardiac signals indicating emotional states of stress and fatigue.
Similarly, the use of facial expressions for ERS in video surveillance was proposed in [36]. It has been highlighted that video surveillance systems today rely on human operators to interpret behavior, which leads to delays in responding to emergencies [36]. The experiments concluded that implementing facial recognition as part of an ERS for video surveillance can improve the reliability of abnormal behavior detection via facial expressions under different emotions and environmental conditions. Furthermore, facial expressions can be used to identify pain, which will benefit the healthcare industry [37]. Assessing a patient’s pain levels over time is important, particularly for gauging the effectiveness of medical treatments. Therefore, facial expression recognition can be widely anticipated in the healthcare industry.
Physiological modalities have gained increased attention for the successful implementation of ERS, since physiological factors are more useful in understanding human emotions through neural activity [38]. Most physiological methods have been assessed in healthcare facilities; therefore, implementing ERS through physiological modalities is most likely to benefit the healthcare industry. Moreover, ERS applications in healthcare can serve as a supportive aid for people with conditions such as Down’s syndrome and autism and for the elderly [39]. Multi-modal approaches combining facial expressions for automated emotion recognition with computer advisors guiding appropriate reactions to specific situations have been explored [39][40]. Additionally, a communication aid using speech recognition was proposed to identify the tone and voice of special needs patients with conditions such as autism or Down’s syndrome [40]. Furthermore, as these technologies become more accessible, such components can be embedded in everyday devices, extending their reach to users worldwide.
During the COVID-19 pandemic, potential innovative solutions were introduced to bring technologies into daily life, supported by virtual videoconferencing, which enabled working arrangements such as work from home, online classes, virtual event gatherings, and more [41]. Ref. [41] suggested that facial emotion recognition may significantly reduce videoconferencing fatigue; analyzing participants in Zoom videoconferences, they tracked users with a facial recognition modality to recognize six emotions. Furthermore, in the marketing sector, ERS has significant applications in increasing brand awareness through image, video, and text mining [42]. For instance, text mining implemented in web browsers can analyze feedback and comments from potential users, revealing their sentiments and emotions towards a certain product [34][42]; other studies have gathered small groups of individuals in a room, introduced a product, and recorded their reactions to evaluate their emotions [43].
Given such potential of ERS and its innovative applications, there is a need to understand whether individuals are ready for the technology. Considering that ERS will become available across industries and be implemented in daily use, investigating its adoption is crucial; this can help ERS scientists, engineers, practitioners, and technology developers understand the factors influencing users in adopting ERS. To identify these factors, previous studies have suggested technology adoption theories and concepts that capture users’ behavior, intention, adoption, and readiness for such technologies.

References

  1. Choung, H.; David, P.; Ross, A. Trust in AI and Its Role in the Acceptance of AI Technologies. Int. J. Hum.-Comput. Interact. 2022, 39, 1727–1739.
  2. Thiebes, S.; Lins, S.; Sunyaev, A. Trustworthy artificial intelligence. Electron. Mark. 2021, 31, 447–464.
  3. Krafft, P.M.; Young, M.; Katell, M.; Huang, K.; Bugingo, G. Defining AI in policy versus practice. In Proceedings of the AIES 2020—AAAI/ACM Conference on AI, Ethics, and Society, New York, NY, USA, 7–9 February 2020; pp. 72–78.
  4. Nishant, R.; Kennedy, M.; Corbett, J. Artificial intelligence for sustainability: Challenges, opportunities, and a research agenda. Int. J. Inf. Manag. 2020, 53, 102104.
  5. Ghobakhloo, M. Industry 4.0, digitization, and opportunities for sustainability. J. Clean. Prod. 2020, 252, 119869.
  6. Yoshitomi, Y. Human–computer communication using recognition and synthesis of facial expression. J. Robot. Netw. Artif. Life 2021, 8, 10–13.
  7. Sumi, K. Affective Human Computer Interaction. In Proceedings of the International Conference on Artificial Life and Robotics, Okinawa, Japan, 29–31 January 2016; Volume 21, pp. 244–248.
  8. George, A.S.; George, A.S.H. Industrial Revolution 5.0: The Transformation of the Modern Manufacturing Process to Enable Man and Machine to Work Hand in Hand. J. Seybold Rep. 2020, 15, 214–234.
  9. Picard, R.W. Affective Computing for HCI. In Proceedings of the 8th HCI International on Human-Computer Interaction: Ergonomics and User Interfaces, Munich, Germany, 22–26 August 1999; pp. 829–833. Available online: http://dl.acm.org/citation.cfm?id=647943.742338 (accessed on 12 January 2022).
  10. Kodhai, E.; Pooveswari, A.; Sharmila, P.; Ramiya, N. Literature Review on Emotion Recognition System. In Proceedings of the 2020 International Conference on System, Computation, Automation and Networking, ICSCAN, Pondicherry, India, 3–4 July 2020; pp. 18–21.
  11. Hippe, Z.S.; Kulikowski, J.L.; Mroczek, T.; Wtorek, J. Human-Computer Systems Interaction: Backgrounds and Applications 3; Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2014; Volume 300, pp. 51–62.
  12. Hasnul, M.A.; Aziz, N.A.A.; Alelyani, S.; Mohana, M.; Aziz, A.A. Electrocardiogram-Based Emotion Recognition Systems and Their Applications in Healthcare—A Review. Sensors 2021, 21, 5015.
  13. Wang, X.; Guo, Y.; Ban, J.; Xu, Q.; Bai, C.; Liu, S. Driver emotion recognition of multiple-ECG feature fusion based on BP network and D-S evidence. IET Intell. Transp. Syst. 2020, 14, 815–824.
  14. Putra, W.B.; Arifin, F. Real-Time Emotion Recognition System to Monitor Student’s Mood in a Classroom. J. Phys. Conf. Ser. 2019, 1413, 012021.
  15. Landowska, A. Uncertainty in emotion recognition. J. Inf. Commun. Ethics Soc. 2019, 17, 273–291.
  16. Kratzwald, B.; Ilić, S.; Kraus, M.; Feuerriegel, S.; Prendinger, H. Deep learning for affective computing: Text-based emotion recognition in decision support. Decis. Support Syst. 2018, 115, 24–35.
  17. Peres, R.S.; Jia, X.; Lee, J.; Sun, K.; Colombo, A.W.; Barata, J. Industrial Artificial Intelligence in Industry 4.0—Systematic Review, Challenges and Outlook. IEEE Access 2020, 8, 220121–220139.
  18. Gloor, P.A.; Colladon, A.F.; Altuntas, E.; Cetinkaya, C.; Kaiser, M.F.; Ripperger, L.; Schaefer, T. Your Face Mirrors Your Deepest Beliefs—Predicting Personality and Morals through Facial Emotion Recognition. Future Internet 2022, 14, 5.
  19. Alhalaseh, R.; Alasasfeh, S. Machine-Learning-Based Emotion Recognition System Using EEG Signals. Computers 2020, 9, 95.
  20. Craik, A.; He, Y.; Contreras-Vidal, J.L. Deep learning for electroencephalogram (EEG) classification tasks: A review. J. Neural Eng. 2019, 16, 031001.
  21. Fang, W.-C.; Wang, K.-Y.; Fahier, N.; Ho, Y.-L.; Huang, Y.-D. Development and Validation of an EEG-Based Real-Time Emotion Recognition System Using Edge AI Computing Platform with Convolutional Neural Network System-on-Chip Design. IEEE J. Emerg. Sel. Top. Circuits Syst. 2019, 9, 645–657.
  22. Tong, Z.; Chen, X.; He, Z.; Tong, K.; Fang, Z.; Wang, X. Emotion Recognition Based on Photoplethysmogram and Electroencephalogram. In Proceedings of the 2018 IEEE 42nd Annual Computer Software and Applications Conference (COMPSAC), Tokyo, Japan, 23–27 July 2018; Volume 2, pp. 402–407.
  23. Suhaimi, N.S.; Mountstephens, J.; Teo, J. EEG-Based Emotion Recognition: A State-of-the-Art Review of Current Trends and Opportunities. Comput. Intell. Neurosci. 2020, 2020, 8875426.
  24. Udovičić, G.; Ðerek, J.; Russo, M.; Sikora, M. Wearable Emotion Recognition system based on GSR and PPG signals. In Proceedings of the MMHealth 2017—2nd International Workshop on Multimedia for Personal Health and Health Care, Co-Located with MM, Mountain View, CA, USA, 23 October 2017; pp. 53–59.
  25. Rathour, N.; Khanam, Z.; Gehlot, A.; Singh, R.; Rashid, M.; AlGhamdi, A.S.; Alshamrani, S.S. Real-Time Facial Emotion Recognition Framework for Employees of Organizations Using Raspberry-Pi. Appl. Sci. 2021, 11, 10540.
  26. Kundu, T.; Saravanan, C. Advancements and recent trends in emotion recognition using facial image analysis and machine learning models. In Proceedings of the International Conference on Electrical, Electronics, Communication Computer Technologies and Optimization Techniques, ICEECCOT, Mysuru, India, 15–16 December 2017; pp. 1–6.
  27. Shinde, A.S.; Patil, V.V. Speech Emotion Recognition System: A Review. In Proceedings of the 4th International Conference on Advances in Science and Technology (ICAST, 2021), Mumbai, India, 7 May 2021.
  28. El Ayadi, M.; Kamel, M.S.; Karray, F. Survey on speech emotion recognition: Features, classification schemes, and databases. Pattern Recognit. 2011, 44, 572–587.
  29. Huang, Y.; Yang, J.; Liu, S.; Pan, J. Combining facial expressions and electroencephalography to enhance emotion recognition. Future Internet 2019, 11, 105.
  30. Zucco, C.; Calabrese, B.; Agapito, G.; Guzzi, P.H.; Cannataro, M. Sentiment analysis for mining texts and social networks data: Methods and tools. WIREs Data Min. Knowl. Discov. 2020, 10, e1333.
  31. Murthy, A.R.; Anil Kumar, K.M. A Review of Different Approaches for Detecting Emotion from Text. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1110, 012009.
  32. GeethaRamani, R.; Kumar, M.N.; Balasubramanian, L. Identification of emotions in text articles through data pre-processing and data mining techniques. In Proceedings of the 2016 International Conference on Advanced Communication Control and Computing Technologies, ICACCCT, Ramanathapuram, India, 25–27 May 2016; Volume 978, pp. 611–615.
  33. Ortega, M.G.S.; Rodríguez, L.-F.; Gutierrez-Garcia, J.O. Towards emotion recognition from contextual information using machine learning. J. Ambient. Intell. Humaniz. Comput. 2019, 11, 3187–3207.
  34. Atif, M.; Franzoni, V. Tell Me More: Automating Emojis Classification for Better Accessibility and Emotional Context Recognition. Future Internet 2022, 14, 142.
  35. Bouhlal, M.; Aarika, K.; Abdelouahid, R.A.; Elfilali, S.; Benlahmar, E. Emotions recognition as innovative tool for improving students’ performance and learning approaches. Procedia Comput. Sci. 2020, 175, 597–602.
  36. Kalyta, O.; Barmak, O.; Radiuk, P.; Krak, I. Facial Emotion Recognition for Photo and Video Surveillance Based on Machine Learning and Visual Analytics. Appl. Sci. 2023, 13, 9890.
  37. Alghamdi, T.; Alaghband, G. SAFEPA: An Expandable Multi-Pose Facial Expressions Pain Assessment Method. Appl. Sci. 2023, 13, 7206.
  38. Duville, M.M.; Pérez, Y.; Hugues-Gudiño, R.; Naal-Ruiz, N.E.; Alonso-Valerdi, L.M.; Ibarra-Zarate, D.I. Systematic Review: Emotion Recognition Based on Electrophysiological Patterns for Emotion Regulation Detection. Appl. Sci. 2023, 13, 6896.
  39. Vinola, C.; Vimaladevi, K. A survey on human emotion recognition approaches, databases and applications. Electron. Lett. Comput. Vis. Image Anal. 2015, 14, 24–44.
  40. Haridas, A.V.; Marimuthu, R.; Chakraborty, B. Emotion Recognition System for Specially Needed People with Optimized Deep Learning Algorithm. In Proceedings of the 2020 Fourth International Conference on Inventive Systems and Control (ICISC), Coimbatore, India, 8–10 January 2020.
  41. Rößler, J.; Sun, J.; Gloor, P. Reducing Videoconferencing Fatigue through Facial Emotion Recognition. Future Internet 2021, 13, 126.
  42. Santamaria-Granados, L.; Mendoza-Moreno, J.F.; Ramirez-Gonzalez, G. Tourist Recommender Systems Based on Emotion Recognition—A Scientometric Review. Future Internet 2021, 13, 2.
  43. Arrais, J.P.; Laranjeira, A.; Oliveira, G.; Ribeiro, B. Deep learning in digital marketing: Brand detection and emotion recognition. Int. J. Mach. Intell. Sens. Signal Process. 2017, 2, 32.
Update Date: 14 Nov 2023