Continuous Pain Intensity Monitoring

This research focuses on improving healthcare quality by introducing an automated system that continuously monitors patient pain intensity. The system analyzes the Electrodermal Activity (EDA) sensor modality, compares the results obtained from the EDA and facial expression modalities, and applies late fusion to combine the two modalities.

  • continuous pain intensity recognition
  • electrodermal activity
  • facial expressions

1. Introduction

A reliable assessment of pain is crucial for determining proper and prompt treatment, especially for vulnerable patients who cannot communicate their pain, such as those in intensive care, people with dementia, or adults with cognitive impairment. An automated system is a promising complement to clinical observation because it enables objective, robust measurement and continuous monitoring of pain [1]. The COVID-19 pandemic further highlighted the importance of such systems; several countries, including China, adopted automated systems to manage patients effectively [2]. Thus, this research aims to develop an automated system for clinical settings that can rapidly and objectively monitor patients’ pain levels by analyzing the informative modalities in the X-ITE Pain Dataset. This database was created to complement existing databases and to provide valuable information for more advanced discrimination between pain (or pain intensities) and no pain.
Physical expressions of pain encompass visual cues (facial expressions and body movements), vocal cues (verbal and non-verbal), and physiological cues (electrocardiography (ECG), electromyography (EMG), Electrodermal Activity (EDA), and brain activity) [3][4][5]; these cues play a significant role in assessing pain in individuals [5]. Features extracted from the EDA and facial expression modalities capture the spontaneous pain expression, stress, and anxiety caused by different pain levels; both modalities are good measures for pain assessment [6][7]. This research presents the findings obtained from analyzing these two modalities with both classification and regression approaches. For the regression approaches, the pain intensity stimuli were treated as continuous labels and normalized to the range [0, 1] so that all regression approaches could be fitted on a common scale.
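To make the label handling concrete, below is a minimal sketch of min-max normalization of ordinal stimulus labels to [0, 1]; the helper name and the assumption of integer intensity levels (e.g., 0 = no pain, 1–3 = increasing intensity) are illustrative, as the entry does not specify the exact scheme used.

```python
import numpy as np

def normalize_labels(labels):
    """Min-max scale ordinal pain-intensity labels to [0, 1].
    Hypothetical helper: the exact normalization used is not stated."""
    labels = np.asarray(labels, dtype=float)
    lo, hi = labels.min(), labels.max()
    return (labels - lo) / (hi - lo)

# e.g., no pain plus three stimulus intensities
print(normalize_labels([0, 1, 2, 3]))  # [0. 0.333... 0.666... 1.]
```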
EDA records changes in the skin’s electrical activity using two electrodes attached to the index and ring fingers. It correlates significantly with pain intensity ratings, as it reflects the intense bodily reactions that follow a painful stimulus [8][9][10]. An increasing number of studies [11][12][13][14][15] have explored physiological signals and machine learning models for objective assessment of pain intensity; their findings demonstrate that EDA signals tend to outperform other physiological signals in terms of accurate pain assessment. Thus, many studies [16][17][18] have focused on EDA for pain assessment. Furthermore, the temporal integration of EDA features was investigated to improve the performance of pain assessment [14][19][20]. This temporal integration was represented as a time series statistics descriptor (EDA-D) computed from several statistical measures of each time series along with its first and second derivatives.
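As an illustration, the sketch below computes such a descriptor for one window: a handful of statistics over the signal and over its first and second derivatives. The particular statistics chosen (mean, standard deviation, minimum, maximum, range, median) are an assumption; the original EDA-D feature set may differ.

```python
import numpy as np

def stat_features(x):
    """Summary statistics of one time series (an illustrative choice)."""
    return np.array([x.mean(), x.std(), x.min(), x.max(),
                     np.ptp(x), np.median(x)])

def eda_descriptor(signal):
    """Time series statistics descriptor in the spirit of EDA-D:
    statistics of the signal plus its first and second derivatives."""
    d1 = np.diff(signal)       # first derivative (finite differences)
    d2 = np.diff(signal, n=2)  # second derivative
    return np.concatenate([stat_features(s) for s in (signal, d1, d2)])

# Toy example: a 7 s window sampled at 100 Hz
window = np.sin(np.linspace(0, 3, 700)) + 0.05 * np.random.randn(700)
print(eda_descriptor(window).shape)  # (18,)
```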
Ekman and Friesen [21] decomposed facial expressions into individual facial Action Units (AUs) with the Facial Action Coding System (FACS). Combinations of some of these AUs express pain behaviors [22]. Prior studies [13][14][23][24][25][26][27] using facial expressions have explored machine learning approaches to recognize pain intensity. Using the temporal integration of frame-level features represented by the Facial Activity Descriptor (FAD), the Random Forest (RF) showed superior performance compared to the linear Support Vector Machine (SVM) and the SVM with a Radial Basis Function kernel (RBF-SVM) [24]; it was therefore used in [27] and in this research as the baseline approach for both classification and regression. Approaches that use FAD to recognize pain intensity showed better results than those relying on facial features extracted independently from each frame of a given sequence [24], because FAD captures the dynamics among neighboring frames.
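The RF baseline can be sketched with scikit-learn as follows; everything except the 100-tree forest (which follows [24]) is an illustrative assumption, including the synthetic stand-in descriptors and the cross-validation setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for one descriptor vector (FAD/EDA-D) per window.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 18))
y_class = rng.integers(0, 4, size=200)  # ordinal pain levels 0..3
y_reg = y_class / 3.0                   # labels normalized to [0, 1]

# 100 trees as in Werner et al. [24]; one model per task.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
reg = RandomForestRegressor(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y_class, cv=5).mean())  # accuracy
print(cross_val_score(reg, X, y_reg, cv=5,
                      scoring="neg_mean_absolute_error").mean())
```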

2. Continuous Pain Intensity Recognition

Several studies of pain have focused on physiological signals because of the strong correlation between these signals and pain [28][29][30]. In [5][14][15], it was reported that the EDA signal obtained the best performance compared to other single physiological signals; EDA has therefore gained attention in automatic pain recognition systems. EDA records changes in the electrical activity of the skin of the hands, which is controlled by the autonomic nervous system [31][32]. Sweat on the skin’s surface changes the skin’s electrical conductivity (e.g., people sweat when they are scared, nervous, or in pain). EDA is composed of a phasic and a tonic signal. The phasic signal is a quick response caused by external stimuli such as pain stimuli; the tonic signal is a slower component that includes the signal’s baseline and arises from unconscious activity [33].
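A minimal way to perform this decomposition is to treat a low-pass filtered version of the signal as the tonic component and the residual as the phasic component; the cutoff frequency and sampling rate below are assumptions, and dedicated EDA toolboxes (e.g., cvxEDA, NeuroKit2) use more elaborate models.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def split_eda(eda, fs, cutoff_hz=0.05):
    """Split EDA into tonic (slow baseline) and phasic (fast response)
    components with a simple low-pass filter; illustrative cutoff."""
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    tonic = filtfilt(b, a, eda)  # slow drift of skin conductance level
    phasic = eda - tonic         # stimulus-driven skin conductance responses
    return tonic, phasic

fs = 32  # Hz, an assumed sampling rate
t = np.arange(0, 60, 1 / fs)
eda = 2 + 0.01 * t + 0.3 * np.exp(-((t - 5) % 15) / 2)  # toy signal
tonic, phasic = split_eda(eda, fs)
```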
Recent studies have focused on deep-learning methods due to their success in classifying pain using EDA, such as 1D Convolutional Neural Networks (CNNs) [13], a multi-task learning method based on neural networks [34], and a Recurrent Convolutional Neural Network (RCNN) [12]. These methods were chosen for their ability to mine the sequential relationships between different periods of EDA signals. Posada-Quintero et al. [17] presented classification and regression machine learning models to estimate pain sensation in healthy subjects using EDA; they computed the extracted EDA features based on time-domain decomposition, spectral analysis, and differential features, and the maximum macro-averaged geometric mean scores of the models were 69.7% and 69.2%, respectively. Kong et al. [18] analyzed the spectral characteristics of EDA because spectral analysis is more sensitive and reproducible for assessing sympathetic arousal than the traditional indices (tonic and phasic components). Bhatkar et al. [16] reported a successful novel method for detecting the reduction in pain produced by clinically effective analgesics, combining self-reports with continuous physiological data in a structured, pain-specific protocol.
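For orientation, below is a minimal PyTorch sketch of a 1D CNN that classifies pain levels from raw EDA windows; the architecture is illustrative only and does not reproduce the networks of [12][13][34].

```python
import torch
import torch.nn as nn

class EDA1DCNN(nn.Module):
    """Toy 1D CNN for window-wise pain classification from raw EDA."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):  # x: (batch, 1, samples)
        return self.head(self.features(x).squeeze(-1))

model = EDA1DCNN()
logits = model(torch.randn(8, 1, 700))  # eight 7 s windows at 100 Hz
print(logits.shape)                     # torch.Size([8, 4])
```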
It is well known that pain databases have a significant impact on the performance of automatic pain assessment systems. The above-mentioned studies of EDA signals for pain intensity recognition used databases that cover only a limited range of stimulus qualities and durations. Analyzing pain in terms of quality and duration provides additional valuable information for more advanced discrimination between pain (or pain intensities) and no pain. Thus, the X-ITE Pain Database [35] was designed to complement existing databases. It includes behavioral and physiological data recorded while healthy participants (subjects) were exposed to pain stimuli of different qualities and durations. The use of healthy subjects in a medical study has always played a vital role in evaluating safety and tolerability without interference from concomitant pathological conditions [36].
Werner et al. [24] introduced a novel feature set for describing facial actions and their dynamics, which they call facial activity descriptors (FAD). They trained FAD (extracted from the BioVid Heat Pain Dataset) with SVM and a Random Forest classifier (RFc), and the results showed that the RFc with 100 trees outperformed the SVM. They focused on the video level using temporal integration for pain recognition because it was more effective at describing the dynamic information beneficial for pain intensity recognition [23]. This approach often involves the temporal integration of frame-level features; for example, video content can be condensed into high-level features using a time series statistics descriptor consisting of several statistical measures of the time series. In [14], the same RFc was trained to recognize pain levels using the features extracted from facial expressions, audio, ECG, EMG, and EDA introduced in [35]; it classified pre-segmented time windows (7 s) cut out from the continuous recording of the main stimulation phase of the X-ITE Pain Database. Given the demonstrated ability of the RF [37] for pain detection using facial expressions [14][23], researchers introduced an RFc trained on the temporal information of facial expressions represented by the time-series statistics descriptor FAD [25][26], computed as several statistical measures together with their first and second derivatives per time series. The performance of a reduced MobileNetV2 and of a simple CNN was better than that of the RFc, and the CNN’s accuracy improved by about 1% when using the sample weighting method. This method was suggested to reduce the weight of misclassified samples by duplicating some training samples with more facial responses if their classification scores are above 0.3, improving pain intensity recognition performance [26]; a sketch is given below.
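One straightforward reading of that scheme is oversampling by duplication, sketched below; the per-window `scores` input and the helper itself are hypothetical, as [26] may implement the weighting differently.

```python
import numpy as np

def duplicate_expressive_samples(X, y, scores, threshold=0.3):
    """Duplicate training windows whose facial-response score exceeds
    the threshold (a simple reading of the method in [26])."""
    mask = np.asarray(scores) > threshold
    return np.concatenate([X, X[mask]]), np.concatenate([y, y[mask]])

# Toy usage: 5 windows, 2 of which score above 0.3 and get duplicated.
X = np.arange(10).reshape(5, 2)
y = np.array([0, 1, 2, 1, 0])
X_aug, y_aug = duplicate_expressive_samples(X, y, [0.1, 0.6, 0.2, 0.4, 0.0])
print(len(X_aug), len(y_aug))  # 7 7
```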
In [5][6][13][14][20][23][34][35][38], the authors reported that fusing modalities could improve pain recognition results. Some of these studies fused only physiological modalities, while others fused both behavioral and physiological modalities. Models combining the fused EMG and EDA modalities were the most successful in the majority of the aforementioned studies. However, physiological signals can also be indicative of other pathological conditions unrelated to pain. In the study by Werner et al. [14], fusion was applied across multiple modalities (frontal RGB camera, audio, ECG, EMG, and EDA). First, they trained an RF individually on the features of each modality. Second, they concatenated the feature vectors of all modalities and trained and tested an RF on the result (feature fusion). Third, they applied decision fusion by training an RF per modality and then aggregating the RF scores into final decisions, using two types of aggregation: a fixed mapping and a trained mapping.
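The decision fusion step can be sketched as follows: one RF per modality produces a pain score, and the scores are aggregated either by a fixed mapping (here, their mean) or a trained mapping (here, a logistic regression). The synthetic data, binary task, and choice of meta-model are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X_eda, X_face = rng.normal(size=(300, 18)), rng.normal(size=(300, 30))
y = rng.integers(0, 2, size=300)  # pain vs. no pain for simplicity

# One RF per modality, trained on the first 200 samples.
rf_eda = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_eda[:200], y[:200])
rf_face = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_face[:200], y[:200])

# Per-modality pain scores on the held-out samples.
scores = np.column_stack([rf_eda.predict_proba(X_eda[200:])[:, 1],
                          rf_face.predict_proba(X_face[200:])[:, 1]])

# Fixed mapping: average the two scores and threshold.
fused_fixed = scores.mean(axis=1) > 0.5

# Trained mapping: fit a meta-model on scores (in practice, use
# out-of-fold scores from the training set to avoid leakage).
train_scores = np.column_stack([rf_eda.predict_proba(X_eda[:200])[:, 1],
                                rf_face.predict_proba(X_face[:200])[:, 1]])
fused_trained = LogisticRegression().fit(train_scores, y[:200]).predict(scores)
```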

This entry is adapted from the peer-reviewed paper 10.3390/life13091828

References

  1. Mieronkoski, R.; Syrjälä, E.; Jiang, M.; Rahmani, A.; Pahikkala, T.; Liljeberg, P.; Salanterä, S. Developing a pain intensity prediction model using facial expression: A feasibility study with electromyography. PLoS ONE 2020, 15, e0235545.
  2. Ye, M.Q.; Zhou, B.J.; Wu, H. Using Information Technology to Manage the COVID-19 Pandemic: Development of a Technical Framework Based on Practical Experience in China. JMIR Med. Inform. 2020, 8, e19515.
  3. Williams, A.C.d.C. Facial Expression of Pain: An Evolutionary Account. Behav. Brain Sci. 2002, 25, 439–455.
  4. Kunz, M.; Lautenbacher, S.; LeBlanc, N.; Rainville, P. Are both the sensory and the affective dimensions of pain encoded in the face? Pain 2012, 153, 350–358.
  5. Thiam, P.; Kestler, H.; Schwenker, F. Multimodal deep denoising convolutional autoencoders for pain intensity classification based on physiological signals. In Proceedings of the International Conference on Pattern Recognition Applications and Methods (ICPRAM), Valletta, Malta, 22–24 February 2020; pp. 289–296.
  6. Susam, B.T.; Riek, N.T.; Akcakaya, M.; Xu, X.; Sa, V.R.d.; Nezamfar, H.; Diaz, D.; Craig, K.D.; Goodwin, M.S.; Huang, J.S. Automated Pain Assessment in Children Using Electrodermal Activity and Video Data Fusion via Machine Learning. IEEE Trans. Biomed. Eng. 2022, 69, 422–431.
  7. Rokicki, J.P.; Ivanauskas, A.; Janužis, G. The Role of Facial Expression Analysis and Electrodermal Activity as an Objective Evaluation of Persistent Idiopathic Facial Pain. J. Craniofac. Surg. 2022, 33, e14–e16.
  8. Storm, H. Changes in skin conductance as a tool to monitor nociceptive stimulation and pain. Curr. Opin. Anaesthesiol. 2008, 12, 796–804.
  9. Ledowski, T.; Bromilow, J.; Paech, M.J.; Storm, H.; Hacking, R.; Schug, S.A. Monitoring of skin conductance to assess postoperative pain intensity. Br. J. Anaesth. 2006, 97, 862–865.
  10. Loggia, M.L.; Juneau, M.; Bushnell, C.M. Autonomic responses to heat pain: Heart rate, skin conductance, and their relation to verbal ratings and stimulus intensity. Pain 2011, 152, 592–598.
  11. Chu, Y.; Zhao, X.; Han, J.; Su, Y. Physiological Signal-Based Method for Measurement of Pain Intensity. Front. Neurosci. 2017, 11, 279.
  12. Lopez-Martinez, D.; Picard, R. Continuous Pain Intensity Estimation from Autonomic Signals with Recurrent Neural Networks. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 17–21 July 2018; pp. 5624–5627.
  13. Thiam, P.; Kessler, V.; Amirian, M.; Bellmann, P.; Layher, G.; Zhang, Y.; Velana, M.; Gruss, S.; Walter, S.; Traue, H.C.; et al. Multi-modal Pain Intensity Recognition based on the SenseEmotion Database. IEEE Trans. Affect. Comput. 2019, 12, 743–760.
  14. Werner, P.; Al-Hamadi, A.; Gruss, S.; Walter, S. Twofold-Multimodal Pain Recognition with the X-ITE Pain Database. In Proceedings of the 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), Cambridge, UK, 3–6 September 2019.
  15. Pouromran, F.; Radhakrishnan, S.; Kamarthi, S. Exploration of Physiological Sensors, Features, and Machine Learning Models for Pain Intensity Estimation. PLoS ONE 2021, 16, e0254108.
  16. Bhatkar, V.; Picard, R.; Staahl, C. Combining Electrodermal Activity With the Peak-Pain Time to Quantify Three Temporal Regions of Pain Experience. Front. Pain Res. 2022, 3, 764128.
  17. Posada-Quintero, H.F.; Kong, Y.; Chon, K.H. Objective pain stimulation intensity and pain sensation assessment using machine learning classification and regression based on electrodermal activity. Am. J. Physiol. Regul. Integr. Comp. Physiol. 2021, 321, R186–R196.
  18. Kong, Y.; Posada-Quintero, H.F.; Chon, K.H. Sensitive Physiological Indices of Pain Based on Differential Characteristics of Electrodermal Activity. IEEE Trans. Biomed. Eng. 2021, 68, 3122–3130.
  19. Walter, S.; Gruss, S.; Limbrecht-Ecklundt, K.; Traue, H.C. Automatic Pain Quantification using Autonomic Parameters. Psychol. Neurosci. 2014, 7, 363–380.
  20. Walter, S.; Al-Hamadi, A.; Gruss, S.; Frisch, S.; Traue, H.C.; Werner, P. Multimodale Erkennung von Schmerzintensität und -modalität mit maschinellen Lernverfahren. Der. Schmerz. 2020, 34, 400–409.
  21. Craig, K.D. The facial expression of pain: Better than a thousand words? APS J. 1992, 1, 153–162.
  22. Werner, P.; Al-Hamadi, A.; Limbrecht-Ecklundt, K.; Walter, S.; Traue, H.C. Head movements and postures as pain behavior. PLoS ONE 2018, 13, e0192767.
  23. Kächele, M.; Thiam, P.; Amirian, M.; Werner, P.; Walter, S.; Schwenker, F.; Palm, G. Multimodal Data Fusion for Person-Independent, Continuous Estimation of Pain Intensity. In Proceedings of the Engineering Applications of Neural Networks: 16th International Conference, Rhodes, Greece, 25–28 September 2015.
  24. Werner, P.; Al-Hamadi, A.; Limbrecht-Ecklundt, K.; Walter, S.; Gruss, S.; Traue, H.C. Automatic Pain Assessment with Facial Activity Descriptors. IEEE Trans. Affect. Comput. 2017, 8, 286–299.
  25. Othman, E.; Werner, P.; Saxen, F.; Al-Hamadi, A.; Walter, S. Cross-Database Evaluation of Pain Recognition from Facial Video. In Proceedings of the 11th International Symposium on Image and Signal Processing and Analysis (ISPA), Dubrovnik, Croatia, 23–25 September 2019.
  26. Othman, E.; Werner, P.; Saxen, F.; Al-Hamadi, A.; Gruss, S.; Walter, S. Automatic vs. Human Recognition of Pain Intensity from Facial Expression on the X-ITE Pain Database. Sensors 2021, 21, 3273.
  27. Othman, E.; Werner, P.; Saxen, F.; Al-Hamadi, A.; Walter, S. Regression Networks for Automatic Pain Intensity Recognition in Video Using Facial Expression on the X-ITE Pain Database. In Proceedings of the 25th International Conference on Image Processing, Computer Vision, & Pattern Recognition (IPCV’21), Las Vegas, NV, USA, 26–29 July 2021.
  28. Chu, Y.; Zhao, X.; Yao, J.; Zhao, Y.; Wu, Z. Physiological Signals Based Quantitative Evaluation Method of the Pain. IFAC Proc. Vol. 2014, 47, 2981–2986.
  29. Naranjo-Hernández, D.; Reina-Tosina, J.; Roa, L.M. Sensor Technologies to Manage the Physiological Traits of Chronic Pain: A Review. Sensors 2020, 20, 365.
  30. Moscato, S.; Cortelli, P.; Lorenzo, C. Physiological responses to pain in cancer patients: A systematic review. Comput. Methods Programs Biomed. 2022, 217, 106682.
  31. Christie, M.J. Electrodermal activity in the 1980s: A review. J. R. Soc. Med. 1981, 74, 616–622.
  32. Boucsein, W.; Fowles, D.C.; Grimnes, S.; Ben-Shakhar, G.; Roth, W.T.; Dawson, M.E.; Filion, D.L.; Society for Psychophysiological Research Ad Hoc Committee on Electrodermal Measures. Publication Recommendations for Electrodermal Measurements. Psychophysiology 2012, 49, 232–239.
  33. Yu, D.; Sun, S. A Systematic Exploration of Deep Neural Networks for EDA-Based Emotion Recognition. Information 2020, 11, 212.
  34. Lopez-Martinez, D.; Picard, R. Multi-task Neural Networks for Personalized Pain Recognition from Physiological Signals. In Proceedings of the 7th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), San Antonio, TX, USA, 23–26 October 2017.
  35. Gruss, S.; Geiger, M.; Werner, P.; Wilhelm, O.; Traue, H.C.; Al-Hamadi, A.; Walter, S. Multi-Modal Signals for Analyzing Pain Responses to Thermal and Electrical Stimuli. J. Vis. Exp. (JoVE) 2019, 146, e59057.
  36. Pasqualetti, G.; Gori, G.; Blandizzi, C.; Tacca, M.D. Healthy volunteers and early phases of clinical experimentation. Eur. J. Clin. Pharmacol. 2010, 66, 647–653.
  37. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
  38. Subramaniam, S.D.; Dass, B. Automated Nociceptive Pain Assessment using Physiological Signals and a Hybrid Deep Learning Network. IEEE Sens. J. 2021, 21, 3335–3343.