Human Activity Recognition in Telemedicine: Comparison

Telemedicine has the potential to improve access to and delivery of healthcare for diverse and aging populations. Recent advances in technology allow for remote monitoring of physiological measures such as heart rate, oxygen saturation, blood glucose, and blood pressure. However, the ability to accurately detect falls and monitor physical activity remotely, without invading privacy or requiring users to remember to wear a costly device, remains an ongoing concern. Human activity comprises a series of actions carried out by one or more individuals to perform a task, such as sitting, lying, walking, standing, and falling. The field of human activity recognition (HAR) has made remarkable advancements. The primary objective of HAR is to discern a user’s behavior, enabling computing systems to accurately classify and measure human activity.

  • telemedicine
  • human activity recognition
  • healthcare
  • wearable devices
  • sensors
  • mmwave
  • continuous human activity monitoring
  • PointNet
  • fall alert
  • non-wearable devices

1. Introduction

Demands on the healthcare system associated with an aging population pose a significant challenge to nations across the world. Addressing these issues will require the ongoing adaptation of healthcare and social systems [1]. According to the United States Department of Health and Human Services, those aged 65 and older comprised 17% of the population in 2020, but this proportion is projected to rise to 22% by 2040 [2]. Further, the population of those aged 85 and above is projected to double. Older adults are more prone to chronic and degenerative diseases such as Alzheimer’s, respiratory diseases, diabetes, cardiovascular disease, osteoarthritis, stroke, and other chronic ailments [3], which require frequent medical care, monitoring, and follow-up. Further, many seniors choose to live independently and are often alone for extended periods of time. For example, in 2021, over 27% (15.2 million) of older adults residing in the community lived alone [2]. One major problem for older adults who choose to live alone is their vulnerability to accidental falls, which are experienced by over a quarter of those aged 65 and older annually, leading to three million emergency visits [4]. Recent studies confirm that preventive measures through active monitoring could help curtail these incidents [5].
Clinicians who treat patients with chronic neurological conditions such as stroke, Parkinson’s disease, and multiple sclerosis also encounter challenges in providing effective care. This can be due to difficulty in monitoring and measuring changes in function and activity levels over time and assessing patient compliance with treatment outside of scheduled office visits [6]. Therefore, it would be beneficial if there were accurate and effective ways to continuously monitor patient activity over extended periods of time without infringing on patient privacy. For these reasons, telemedicine and continuous human activity monitoring have become increasingly important components of today’s healthcare system because they can allow clinicians to engage remotely using objective data [7][8].
Telemedicine systems allow for the transmission of patient data from home to healthcare providers, enabling data analysis, diagnosis, and treatment planning [9][10]. Because many older people prefer to live independently in their homes, incorporating and improving telemedicine services has become crucial for many healthcare organizations [11], a trend reflected in the 37% of the population that utilized telemedicine services in 2021 [12]. Telemedicine monitoring facilitates the collection of long-term data, provides analysis reports to healthcare professionals, and enables them to discern both positive and negative trends and patterns in patient behavior. These data are also essential for real-time patient safety monitoring, alerting caregivers and emergency services during incidents such as a fall [13]. This capability is valuable for assessing patient adherence and responses to medical and rehabilitation interventions [14]. Various technologies have been developed for human activity recognition (HAR) and fall detection [15]. However, non-contact mmwave-based radar technology has garnered considerable attention in recent years due to its numerous advantages [16], such as its portability, low cost, and ability to operate in different ambient and temperature conditions. Furthermore, it provides more privacy compared to traditional cameras and is more convenient than wearable devices [17][18].
The integration of mmwave-based radar systems in healthcare signifies notable progress, specifically in improving access to high-quality medical care for patients in distant areas, thus narrowing the disparity between healthcare services in rural and urban regions. This technological transition allows healthcare facilities to allocate resources more efficiently to higher-priority situations, reducing the burden of repeated hospital visits for patients with chronic illnesses. These systems also enhance in-home nursing services for elderly and disabled communities, encouraging compliance with therapeutic treatments and improving the distribution of healthcare resources. Crucially, such monitoring not only enhances the quality and effectiveness of treatment but also yields significant cost reductions, helping healthcare systems address the changing requirements of an aging population and representing a major step forward in modern healthcare delivery.
While mmwave-based radar technology offers significant advantages for HAR and fall detection, the complexity of the data it generates presents a formidable challenge [19]. Typically, radar signals are composed of high-dimensional point cloud data that is inherently information-rich, requiring advanced processing techniques to extract meaningful insights. Qi et al. [20] proposed PointNet, a deep learning architecture that enables the direct classification of point cloud data such as that produced by mmwave-based radar. Their model preserves spatial information by processing point clouds in their original form. The combination of mmwave radar and PointNet can help HAR applications by improving their performance in terms of precision, responsiveness, and versatility across a wide range of scenarios [21].
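PointNet's key idea, lifting each point with a shared per-point transform and then max-pooling across points so the result is order-invariant, can be illustrated with a minimal sketch (the weights and point values below are toy numbers, not taken from the cited model):

```python
import random

def shared_mlp(point, weights):
    # Apply the same per-point transform (a "shared MLP" layer with ReLU)
    # to one 3D point, producing a feature vector.
    return [max(0.0, sum(w * x for w, x in zip(row, point))) for row in weights]

def pointnet_global_feature(points, weights):
    # Lift every point independently, then max-pool each feature channel
    # across all points. Max is symmetric, so the global feature is
    # invariant to the ordering of points in the cloud.
    features = [shared_mlp(p, weights) for p in points]
    return [max(f[i] for f in features) for i in range(len(weights))]

# Toy weights (4 feature channels) and a toy 5-point cloud.
W = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [1.0, 1.0, 1.0]]
cloud = [[0.1, 0.5, 0.2], [0.9, 0.1, 0.3], [0.4, 0.4, 0.8],
         [0.2, 0.7, 0.1], [0.6, 0.2, 0.5]]

g1 = pointnet_global_feature(cloud, W)
shuffled = cloud[:]
random.Random(42).shuffle(shuffled)
g2 = pointnet_global_feature(shuffled, W)
```

Because the max operation is symmetric, shuffling the cloud leaves the global feature unchanged, which is why a PointNet-style network can consume raw, unordered radar point clouds directly.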

2. Human Activity Recognition Approaches

Human activity involves a series of actions carried out by one or more individuals to perform a task, such as sitting, lying, walking, standing, and falling [22]. The field of HAR has made remarkable advancements over the past decade. The primary objective of HAR is to discern a user’s behavior, enabling computing systems to accurately classify and measure human activity [23]. Today, smart homes incorporate HAR to support the health of the elderly, disabled, and children by continuously monitoring their daily behavior [24]. HAR is useful for observing daily routines, evaluating health conditions, and assisting elderly or disabled individuals; it contributes to automatic health tracking and improved diagnostic methods and care, and it enables remote monitoring in home and institutional settings, thereby improving safety and well-being [25]. Existing literature in this area often categorizes research based on the features of the devices used, distinguishing between wearable and non-wearable devices, as depicted in Figure 1. Wearable devices encompass smartphones, smartwatches, and smart gloves [26], all capable of tracking human movements. In contrast, non-wearable devices comprise various tools like visual-based systems, intelligent flooring, and radar systems. An illustrative summary of these methodologies is presented in this section, offering a snapshot of the investigations undertaken and a brief overview of diverse applications utilizing these techniques.
Figure 1. Classification of human activity recognition approaches.
Wearable technology has become increasingly useful in capturing detailed data on an individual’s movements and activity patterns through the utilization of sensors placed on the body [15]. This technology includes various devices such as Global Positioning System (GPS) devices, smartwatches, smartphones, smart shirts, and smart gloves. Its application has made notable contributions to the domains of HAR and human–computer interfaces (HCIs) [26]. Nevertheless, it is important to acknowledge that each type of device presents its own advantages and disadvantages. For instance, GPS-based systems struggle to accurately identify specific human poses and experience signal loss in indoor environments [27]. Smartwatches and smartphones can provide real-time tracking to monitor physical activity and location. They feature monitoring applications that can identify health fluctuations and potentially life-threatening events [28][29]. However, smartwatches have disadvantages such as limited battery life, and users must remember to wear them continuously [30]. Further, smartphones encounter issues with sensor inaccuracy when they are kept in pockets or purses [31] and have difficulty monitoring functions that require direct contact with the body. Other wearable devices, such as the Hexoskin smart shirt [32] and the iTex smart textile gloves [33], present alternative options for HAR. However, the persistent need to wear these devices limits their use in situations such as showering or sleeping [34]. As noted above, failure to wear a monitoring device continuously, especially among older adults, can lead to missed events such as falls [35]. Non-wearable approaches for HAR utilize ambient sensors like camera-based devices, smart floors, and radar systems.
Vision-based systems have shown promise in classifying human poses and detecting falls, leveraging advanced computer vision algorithms and high-quality optical sensors [36]. However, challenges like data storage needs, processing complexity, ambient light sensitivity, and privacy concerns hinder their general acceptance [37]. Intelligent floor systems such as carpets and floor tiles provide alternative means for monitoring human movement and posture [38]. A study on a carpet system displayed its ability to use surface force information for 3D human pose analysis but revealed limitations in detecting certain body positions and differentiating similar movements [39]. Recently, radar-based HAR has gained interest due to its ease of deployment in diverse environments, insensitivity to ambient lighting conditions, and maintenance of user privacy [18][40]. Mmwave radar is a subset of radar technology [41] that is relatively low cost, compact, and capable of high-resolution detection [42]. Further, it can penetrate thin layers of some materials such as fabrics, allowing seamless indoor placement in complex living environments [43]. Commercially available mmwave devices have the capability to create detailed 3D point cloud models of objects. The collected data can be effectively analyzed using edge Artificial Intelligence (AI) algorithms to accurately recreate human movements for HAR applications [44]. The mmwave radar generates point clouds by emitting electromagnetic waves and capturing their reflections as they interact with the object or person. These point clouds represent the spatial distribution of objects and movements, which are then processed to decipher human activities. However, the fluctuating count of cloud points in each frame from mmwave radar introduces challenges in crafting precise activity classifiers, as these typically require fixed input dimensions and order [35].
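One common workaround for the varying per-frame point count is to resample every frame to a fixed size before classification. The sketch below is a generic illustration (not the method of any cited study): crowded frames are randomly downsampled, and sparse frames are padded by repeating randomly chosen points.

```python
import random

def resample_frame(points, n, seed=0):
    # Radar frames contain a varying number of detected points, while
    # most classifiers expect a fixed input size. Downsample crowded
    # frames and pad sparse ones by repeating randomly chosen points.
    rng = random.Random(seed)
    if len(points) >= n:
        return rng.sample(points, n)
    return points + [rng.choice(points) for _ in range(n - len(points))]

# Two frames of different sizes, both normalized to 8 points each.
dense = [(i * 0.1, 0.0, 0.5) for i in range(12)]
sparse = [(0.1, 0.2, 0.3), (0.4, 0.5, 0.6)]
fixed = [resample_frame(f, 8) for f in (dense, sparse)]
```

Seeding the generator keeps the resampling reproducible across runs, which simplifies debugging a real-time pipeline.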
To address this, researchers commonly standardize the data into forms like micro-Doppler signatures [45][46], image sequences [47][48][49], or 3D voxel grids [19][50] before employing machine learning. This standardization often results in the loss of spatial features [51] and can cause data bloat and related challenges [20]. The proposed approach uses the PointNet network to overcome these constraints by directly processing raw point cloud data, thereby retaining the fine-grained spatial relationships essential for object tracking [52]. As shown in Table 1, the proposed system achieved higher accuracy than prior studies and extracted accurate tracking maps using spatial features. PointNet’s architecture, leveraging a shared Multi-Layer Perceptron (MLP), is computationally efficient and lightweight, making it well-suited for real-time HAR applications [20].
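The trade-off of grid-based standardization can be made concrete with a toy voxelizer (the grid size and coordinates below are illustrative assumptions): nearby points collapse into the same cell, quantizing away spatial detail, while the dense grid stores many empty cells for only a handful of points.

```python
def voxelize(points, grid=8, lo=-1.0, hi=1.0):
    # Map a sparse 3D point cloud into a dense occupancy grid.
    # Note the data bloat: grid**3 cells are stored even when only a
    # handful are occupied, and exact coordinates are quantized away.
    size = (hi - lo) / grid
    vox = [[[0 for _ in range(grid)] for _ in range(grid)] for _ in range(grid)]
    for x, y, z in points:
        i = min(grid - 1, max(0, int((x - lo) / size)))
        j = min(grid - 1, max(0, int((y - lo) / size)))
        k = min(grid - 1, max(0, int((z - lo) / size)))
        vox[i][j][k] = 1
    return vox

# Two nearby points merge into a single one of the 512 cells,
# while a third point lands in a different cell.
cloud = [(0.10, 0.20, -0.30), (0.15, 0.22, -0.31), (-0.50, 0.90, 0.00)]
vox = voxelize(cloud)
occupied = sum(v for plane in vox for row in plane for v in row)
```

Here three points occupy only two of 512 cells: the grid is mostly empty storage, and the two close points are no longer distinguishable, which is the spatial-feature loss the raw point cloud approach avoids.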
Table 1. Overview on the mmwave radar with machine learning for detecting simple HAR studies.
Ref. Preprocessing Methods Sensor Type Model Activity Detection Overall Accuracy
[45] Micro-Doppler Signatures TI AWR1642 CNN 1 Walking, swinging hands, sitting, and shifting. 95.19%
[46] Micro-Doppler Signatures TI AWR1642 CNN Standing, walking, falling, swing, seizure, restless. 98.7%
[53] Micro-Doppler Signatures TI IWR6843 DNN 2 Standing, running, jumping jacks, jumping, jogging, squats. 95%
[54] Micro-Doppler Signatures TI IWR6843ISK CNN Stand, sit, move toward, away, pick up something from ground, left, right, and stay still. 91%
[55] Micro-Doppler Signatures TI xWR14xx, TI xWR68xx RNN 3 Stand up, sit down, walk, fall, get in, lie down, roll in, sit in, and get out of bed. NA 4
[56] Dual-Micro Motion Signatures TI AWR1642 CNN Standing, sitting, walking, running, jumping, punching, bending, and climbing. 98%
[57] Reflection Heatmap Two TI IWR1642 LSTM 5 Falling, walking, pickup, stand up, boxing, sitting, and jogging. 80%
[58] Doppler Maps TI AWR1642 PCA 6 Fast walking, slow walking (with swinging hands, or without swinging hands), and limping. 96.1%
[59] Spatial-Temporal Heatmaps TI AWR1642 CNN 14 Common in-home full-body workout. 97%
[47] Heatmap Images TI IWR1443 CNN Standing, walking, and sitting. 71%
[48] Doppler Images TI AWR1642 SVM 7 Stand up, pick up, drink while standing, walk, sit down. 95%
[49] Doppler Images TI AWR1642 SVM Shoulder press, lateral raise, dumbbell, squat, boxing, right and left triceps. NA
[19] Voxelization TI IWR1443 T-D 8 CNN, B-D 9 LSTM Walking, jumping, jumping jacks, squats, and boxing. 90.47%
[50] Voxelization TI IWR1443 CNN Sitting posture with various directions. 99%
[21] Raw Point Cloud TI IWR1843 PointNet Walking, rotating, waving, stooping, and falling. 95.40%
This work Raw Point Cloud TI IWR6843 PointNet Standing, walking, sitting, lying, and falling. 99.5%
1 CNN: Convolutional Neural Network. 2 DNN: Deep Neural Network. 3 RNN: Recurrent Neural Network. 4 NA: Not Available. 5 LSTM: Long Short-Term Memory. 6 PCA: Principal Component Analysis. 7 SVM: Support Vector Machine. 8 T-D: Time-Distributed. 9 B-D: Bi-Directional.


  1. World Health Organization (WHO). National Programmes for Age-Friendly Cities and Communities: A Guide. 2023. Available online: (accessed on 1 June 2023).
  2. Administration for Community Living (ACL). 2021 Profile of Older Americans; The Administration for Community Living: Washington, DC, USA, 2022; Available online: (accessed on 5 May 2023).
  3. Debauche, O.; Mahmoudi, S.; Manneback, P.; Assila, A. Fog IoT for Health: A new Architecture for Patients and Elderly Monitoring. Procedia Comput. Sci. 2019, 160, 289–297.
  4. Burns, E.; Kakara, R.; Moreland, B. A CDC Compendium of Effective Fall Interventions: What Works for Community-Dwelling Older Adults, 4th ed.; Centers for Disease Control and Prevention, National Center for Injury Prevention and Control: Atlanta, GA, USA, 2023; Available online: (accessed on 10 July 2023).
  5. Bargiotas, I.; Wang, D.; Mantilla, J.; Quijoux, F.; Moreau, A.; Vidal, C.; Barrois, R.; Nicolai, A.; Audiffren, J.; Labourdette, C.; et al. Preventing falls: The use of machine learning for the prediction of future falls in individuals without history of fall. J. Neurol. 2023, 270, 618–631.
  6. Chakraborty, C.; Ghosh, U.; Ravi, V.; Shelke, Y. Efficient Data Handling for Massive Internet of Medical Things: Healthcare Data Analytics; Springer: Berlin/Heidelberg, Germany, 2021.
  7. Sakamaki, T.; Furusawa, Y.; Hayashi, A.; Otsuka, M.; Fernandez, J. Remote patient monitoring for neuropsychiatric disorders: A scoping review of current trends and future perspectives from recent publications and upcoming clinical trials. Telemed. e-Health 2022, 28, 1235–1250.
  8. Alanazi, M.A.; Alhazmi, A.K.; Alsattam, O.; Gnau, K.; Brown, M.; Thiel, S.; Jackson, K.; Chodavarapu, V.P. Towards a low-cost solution for gait analysis using millimeter wave sensor and machine learning. Sensors 2022, 22, 5470.
  9. Palanisamy, P.; Padmanabhan, A.; Ramasamy, A.; Subramaniam, S. Remote Patient Activity Monitoring System by Integrating IoT Sensors and Artificial Intelligence Techniques. Sensors 2023, 23, 5869.
  10. World Health Organization. Telemedicine: Opportunities and Developments in Member States. Report on the Second Global Survey on eHealth; World Health Organization: Geneva, Switzerland, 2010.
  11. Zhang, X.; Lin, D.; Pforsich, H.; Lin, V.W. Physician workforce in the United States of America: Forecasting nationwide shortages. Hum. Resour. Health 2020, 18, 8.
  12. Lucas, J.W.; Villarroel, M.A. Telemedicine Use among Adults: United States, 2021; US Department of Health and Human Services, Centers for Disease Control and Prevention, National Center for Health Statistics: Hyattsville, MD, USA, 2022.
  13. Alanazi, M.A.; Alhazmi, A.K.; Yakopcic, C.; Chodavarapu, V.P. Machine learning models for human fall detection using millimeter wave sensor. In Proceedings of the 2021 55th Annual Conference on Information Sciences and Systems (CISS), Baltimore, MD, USA, 24–26 March 2021; pp. 1–5.
  14. Seron, P.; Oliveros, M.J.; Gutierrez-Arias, R.; Fuentes-Aspe, R.; Torres-Castro, R.C.; Merino-Osorio, C.; Nahuelhual, P.; Inostroza, J.; Jalil, Y.; Solano, R.; et al. Effectiveness of telerehabilitation in physical therapy: A rapid overview. Phys. Ther. 2021, 101, pzab053.
  15. Usmani, S.; Saboor, A.; Haris, M.; Khan, M.A.; Park, H. Latest research trends in fall detection and prevention using machine learning: A systematic review. Sensors 2021, 21, 5134.
  16. Li, X.; He, Y.; Jing, X. A survey of deep learning-based human activity recognition in radar. Remot. Sens. 2019, 11, 1068.
  17. Texas Instruments. IWR6843, IWR6443 Single-Chip 60- to 64-GHz mmWave Sensor. 2021. Available online: (accessed on 25 June 2023).
  18. Alhazmi, A.K.; Alanazi, M.A.; Liu, C.; Chodavarapu, V.P. Machine Learning Enabled Fall Detection with Compact Millimeter Wave System. In Proceedings of the NAECON 2021-IEEE National Aerospace and Electronics Conference, Dayton, OH, USA, 16–19 August 2021; pp. 217–222.
  19. Singh, A.D.; Sandha, S.S.; Garcia, L.; Srivastava, M. Radhar: Human activity recognition from point clouds generated through a millimeter-wave radar. In Proceedings of the 3rd ACM Workshop on Millimeter-Wave Networks and Sensing Systems, Los Cabos, Mexico, 25 October 2019; pp. 51–56.
  20. Qi, C.R.; Su, H.; Mo, K.; Guibas, L.J. Pointnet: Deep learning on point sets for 3d classification and segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 652–660.
  21. Huang, T.; Liu, G.; Li, S.; Liu, J. RPCRS: Human Activity Recognition Using Millimeter Wave Radar. In Proceedings of the 2022 IEEE 28th International Conference on Parallel and Distributed Systems (ICPADS), Nanjing, China, 10–12 January 2023; pp. 122–129.
  22. Beddiar, D.R.; Nini, B.; Sabokrou, M.; Hadid, A. Vision-based human activity recognition: A survey. Multimed. Tools Appl. 2020, 79, 30509–30555.
  23. Bulling, A.; Blanke, U.; Schiele, B. A tutorial on human activity recognition using body-worn inertial sensors. Acm Comput. Surv. (CSUR) 2014, 46, 1–33.
  24. Bouchabou, D.; Nguyen, S.M.; Lohr, C.; LeDuc, B.; Kanellos, I. A survey of human activity recognition in smart homes based on IoT sensors algorithms: Taxonomies, challenges, and opportunities with deep learning. Sensors 2021, 21, 6037.
  25. Kim, K.; Jalal, A.; Mahmood, M. Vision-based human activity recognition system using depth silhouettes: A smart home system for monitoring the residents. J. Electr. Eng. Technol. 2019, 14, 2567–2573.
  26. Zhang, S.; Li, Y.; Zhang, S.; Shahabi, F.; Xia, S.; Deng, Y.; Alshurafa, N. Deep learning in human activity recognition with wearable sensors: A review on advances. Sensors 2022, 22, 1476.
  27. Bibbò, L.; Carotenuto, R.; Della Corte, F. An overview of indoor localization system for human activity recognition (HAR) in healthcare. Sensors 2022, 22, 8119.
  28. Tarafdar, P.; Bose, I. Recognition of human activities for wellness management using a smartphone and a smartwatch: A boosting approach. Decis. Support Syst. 2021, 140, 113426.
  29. Tan, T.H.; Shih, J.Y.; Liu, S.H.; Alkhaleefah, M.; Chang, Y.L.; Gochoo, M. Using a Hybrid Neural Network and a Regularized Extreme Learning Machine for Human Activity Recognition with Smartphone and Smartwatch. Sensors 2023, 23, 3354.
  30. Ramezani, R.; Cao, M.; Earthperson, A.; Naeim, A. Developing a Smartwatch-Based Healthcare Application: Notes to Consider. Sensors 2023, 23, 6652.
  31. Kheirkhahan, M.; Nair, S.; Davoudi, A.; Rashidi, P.; Wanigatunga, A.A.; Corbett, D.B.; Mendoza, T.; Manini, T.M.; Ranka, S. A smartwatch-based framework for real-time and online assessment and mobility monitoring. J. Biomed. Inform. 2019, 89, 29–40.
  32. Montes, J.; Young, J.C.; Tandy, R.; Navalta, J.W. Reliability and validation of the hexoskin wearable bio-collection device during walking conditions. Int. J. Exerc. Sci. 2018, 11, 806.
  33. Ravichandran, V.; Sadhu, S.; Convey, D.; Guerrier, S.; Chomal, S.; Dupre, A.M.; Akbar, U.; Solanki, D.; Mankodiya, K. iTex Gloves: Design and In-Home Evaluation of an E-Textile Glove System for Tele-Assessment of Parkinson’s Disease. Sensors 2023, 23, 2877.
  34. di Biase, L.; Pecoraro, P.M.; Pecoraro, G.; Caminiti, M.L.; Di Lazzaro, V. Markerless radio frequency indoor monitoring for telemedicine: Gait analysis, indoor positioning, fall detection, tremor analysis, vital signs and sleep monitoring. Sensors 2022, 22, 8486.
  35. Rezaei, A.; Mascheroni, A.; Stevens, M.C.; Argha, R.; Papandrea, M.; Puiatti, A.; Lovell, N.H. Unobtrusive Human Fall Detection System Using mmWave Radar and Data Driven Methods. IEEE Sensors J. 2023, 23, 7968–7976.
  36. Pareek, P.; Thakkar, A. A survey on video-based human action recognition: Recent updates, datasets, challenges, and applications. Artif. Intell. Rev. 2021, 54, 2259–2322.
  37. Xu, D.; Qi, X.; Li, C.; Sheng, Z.; Huang, H. Wise information technology of med: Human pose recognition in elderly care. Sensors 2021, 21, 7130.
  38. Lan, G.; Liang, J.; Liu, G.; Hao, Q. Development of a smart floor for target localization with bayesian binary sensing. In Proceedings of the 2017 IEEE 31st International Conference on Advanced Information Networking and Applications (AINA), Taipei, Taiwan, 27–29 March 2017; pp. 447–453.
  39. Luo, Y.; Li, Y.; Foshey, M.; Shou, W.; Sharma, P.; Palacios, T.; Torralba, A.; Matusik, W. Intelligent carpet: Inferring 3d human pose from tactile signals. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 11255–11265.
  40. Zhao, Y.; Zhou, H.; Lu, S.; Liu, Y.; An, X.; Liu, Q. Human activity recognition based on non-contact radar data and improved PCA method. Appl. Sci. 2022, 12, 7124.
  41. Iovescu, C.; Rao, S. The Fundamentals of Millimeter Wave Sensors; Texas Instrument: Dallas, TX, USA, 2017; pp. 1–8.
  42. Jin, F.; Sengupta, A.; Cao, S. mmfall: Fall detection using 4-d mmwave radar and a hybrid variational rnn autoencoder. IEEE Trans. Autom. Sci. Eng. 2020, 19, 1245–1257.
  43. Broeder, G. Human Activity Recognition Using a mmWave Radar. Bachelor’s Thesis, University of Twente, Enschede, The Netherlands, 2022.
  44. An, S.; Ogras, U.Y. Mars: Mmwave-based assistive rehabilitation system for smart healthcare. Acm Trans. Embed. Comput. Syst. (TECS) 2021, 20, 1–22.
  45. Zhang, R.; Cao, S. Real-time human motion behavior detection via CNN using mmWave radar. IEEE Sens. Lett. 2018, 3, 3500104.
  46. Jin, F.; Zhang, R.; Sengupta, A.; Cao, S.; Hariri, S.; Agarwal, N.K.; Agarwal, S.K. Multiple patients behavior detection in real-time using mmWave radar and deep CNNs. In Proceedings of the 2019 IEEE Radar Conference (RadarConf), Boston, MA, USA, 22–26 April 2019; pp. 1–6.
  47. Cui, H.; Dahnoun, N. Real-time short-range human posture estimation using mmWave radars and neural networks. IEEE Sens. J. 2021, 22, 535–543.
  48. Liu, K.; Zhang, Y.; Tan, A.; Sun, Z.; Ding, C.; Chen, J.; Wang, B.; Liu, J. Micro-doppler feature and image based human activity classification with FMCW radar. In Proceedings of the IET International Radar Conference (IET IRC 2020), Online, 4–6 November 2020; Volume 2020, pp. 1689–1694.
  49. Tiwari, G.; Gupta, S. An mmWave radar based real-time contactless fitness tracker using deep CNNs. IEEE Sens. J. 2021, 21, 17262–17270.
  50. Wu, J.; Cui, H.; Dahnoun, N. A voxelization algorithm for reconstructing MmWave radar point cloud and an application on posture classification for low energy consumption platform. Sustainability 2023, 15, 3342.
  51. Li, Z.; Ni, H.; He, Y.; Li, J.; Huang, B.; Tian, Z.; Tan, W. mmBehavior: Human Activity Recognition System of millimeter-wave Radar Point Clouds Based on Deep Recurrent Neural Network. 2023; preprint.
  52. Li, Z.; Li, W.; Liu, H.; Wang, Y.; Gui, G. Optimized pointnet for 3d object classification. In Proceedings of the Advanced Hybrid Information Processing: Third EAI International Conference, ADHIP 2019, Nanjing, China, 21–22 September 2019; Proceedings, Part I. Springer: Berlin/Heidelberg, Germany, 2019; pp. 271–278.
  53. Rajab, K.Z.; Wu, B.; Alizadeh, P.; Alomainy, A. Multi-target tracking and activity classification with millimeter-wave radar. Appl. Phys. Lett. 2021, 119, 034101.
  54. Ahmed, S.; Park, J.; Cho, S.H. FMCW radar sensor based human activity recognition using deep learning. In Proceedings of the 2022 International Conference on Electronics, Information, and Communication (ICEIC), Jeju, Republic of Korea, 6–9 February 2022; pp. 1–5.
  55. Werthen-Brabants, L.; Bhavanasi, G.; Couckuyt, I.; Dhaene, T.; Deschrijver, D. Split BiRNN for real-time activity recognition using radar and deep learning. Sci. Rep. 2022, 12, 7436.
  56. Hassan, S.; Wang, X.; Ishtiaq, S.; Ullah, N.; Mohammad, A.; Noorwali, A. Human Activity Classification Based on Dual Micro-Motion Signatures Using Interferometric Radar. Remote Sens. 2023, 15, 1752.
  57. Sun, Y.; Hang, R.; Li, Z.; Jin, M.; Xu, K. Privacy-preserving fall detection with deep learning on mmWave radar signal. In Proceedings of the 2019 IEEE Visual Communications and Image Processing (VCIP), Sydney, NSW, Australia, 1–4 December 2019; pp. 1–4.
  58. Senigagliesi, L.; Ciattaglia, G.; Gambi, E. Contactless walking recognition based on mmWave radar. In Proceedings of the 2020 IEEE Symposium on Computers and Communications (ISCC), Rennes, France, 7–10 July 2020; pp. 1–4.
  59. Xie, Y.; Jiang, R.; Guo, X.; Wang, Y.; Cheng, J.; Chen, Y. mmFit: Low-Effort Personalized Fitness Monitoring Using Millimeter Wave. In Proceedings of the 2022 International Conference on Computer Communications and Networks (ICCCN), Honolulu, HI, USA, 25–28 July 2022; pp. 1–10.