
Yoga Using Intelligent Internet of Things

The detection and monitoring of yoga postures are possible with the Intelligent Internet of Things (IIoT), the integration of intelligent approaches (such as machine learning) with the Internet of Things (IoT). Given the growing number of yoga practitioners, combining IIoT with yoga has led to the successful implementation of IIoT-based yoga training systems.

Keywords: future intelligence; health intelligence; IoT; IoMT; medical intelligence; exercise

1. Sensor-Based Approach

Multiple sensor types are used to detect yoga postures, including wearable sensors, infrared sensors, RFID, and smart mats, which are explained below.

1.1. Wearable Sensor

Wearable sensors are small, lightweight, inexpensive, and portable medical devices that can acquire large amounts of daily-life data unobtrusively, which makes them well suited to monitoring and detecting yoga postures. Pal et al. [1] used a smart belt to analyze yoga performance. Puranik and Kanthi [2] designed wearable devices to monitor heart rate, yogic breathing, and posture. Gupta and Gupta [3] designed a wearable-based yoga help system that can guide practitioners without requiring a trainer.
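
As a minimal illustration of how a torso-worn wearable can distinguish coarse posture classes, the sketch below classifies body orientation from a single 3-axis accelerometer sample. The thresholds, axis convention, and labels are illustrative assumptions, not details of any cited system.

```python
import math

# Hypothetical sketch: classify coarse body orientation from one sample
# of a torso-worn 3-axis accelerometer (readings in units of g).
# Assumption: the sensor z-axis points along the spine, so az ~ 1 g
# when the wearer is upright. Thresholds are invented for the example.
def classify_orientation(ax, ay, az):
    # Torso tilt relative to gravity: 0 deg = upright, 90 deg = lying.
    norm = math.sqrt(ax ** 2 + ay ** 2 + az ** 2)
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, az / norm))))
    if tilt < 30:
        return "upright"   # e.g. standing poses such as Tadasana
    elif tilt < 70:
        return "inclined"  # e.g. forward bends
    return "lying"         # e.g. Savasana

print(classify_orientation(0.0, 0.0, 1.0))  # gravity along z: upright
print(classify_orientation(1.0, 0.0, 0.0))  # gravity along x: lying
```

A real system such as the smart belt in [1] would fuse several such channels over time rather than classify single samples.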

1.2. Infrared Sensors

Infrared sensors are also widely used for privacy-preserving yoga detection and monitoring [4]. Rishan et al. [5] designed the infrared-sensor-based Infinity Yoga Tutor, which identifies the yoga posture and guides the practitioner through visual feedback. The system uses a CNN and an LSTM to learn and predict yoga postures and can also capture the practitioner's movement [6]. A self-assistance-based yoga pose recognition and real-time feedback system using an infrared sensor is designed in [7]. The system can also identify hand gestures, commonly called yogic mudras, and uses XGBoost with randomized cross-validation as its learning approach. Experimental results show that the system detected yoga postures with high accuracy. YogNet, designed in [8] and based on a CNN and an LSTM, detects yoga postures and also provides feedback for their correction.
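
The CNN-plus-LSTM pattern used by these systems can be sketched at its core: per-frame features (here random vectors standing in for CNN output) are aggregated over time by a single LSTM cell. The weights below are random, so this shows only the data flow, not a trained model from any cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    # All four gates computed jointly: input, forget, output, candidate.
    z = W @ x + U @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c = f * c + i * g          # update cell state
    h = o * np.tanh(c)         # emit hidden state
    return h, c

feat_dim, hidden = 8, 4        # assumed sizes for the illustration
W = rng.normal(size=(4 * hidden, feat_dim))
U = rng.normal(size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)

h = c = np.zeros(hidden)
for frame_feat in rng.normal(size=(10, feat_dim)):  # 10 frames of "CNN" features
    h, c = lstm_step(frame_feat, h, c, W, U, b)

posture_scores = h  # in a full system this would feed a softmax over posture classes
print(posture_scores.shape)
```

The temporal recurrence is what lets such systems track movement across frames rather than classify each frame in isolation.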

1.3. RFID

Progress in RFID technology has enabled many human-action-recognition tasks, and the use of active and passive tags has overcome the limitations of early RFID systems. Yoga posture detection using RFID has become common practice. Sun [9] implemented an RFID-based yoga mat that can detect and estimate yoga postures, deploying deep learning to predict them. Yao et al. [10] built an RFID-based human-activity-recognition system to detect human postures; experimental analysis shows that the method detected multiple postures with high accuracy. The system in [11] detects yoga postures based on RFID and, along with pose detection, also evaluates the practitioner's stress level.
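
The underlying idea, in the spirit of [9][10], is that the body occludes the tags in a mat or array differently for each posture, so the vector of per-tag signal strengths acts as a posture fingerprint. The sketch below uses nearest-centroid matching over invented calibration RSSI values; the cited systems use learned models instead.

```python
# Hypothetical per-posture RSSI centroids (dBm) for a 4-tag mat,
# invented for illustration: lower values = tag more occluded by the body.
POSTURE_CENTROIDS = {
    "mountain": [-45, -60, -62, -61],  # standing: mat tags mostly clear
    "seated":   [-58, -52, -70, -69],
    "corpse":   [-72, -71, -73, -74],  # lying: most tags occluded
}

def detect_posture(rssi):
    # Nearest centroid by squared Euclidean distance over the tag vector.
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(rssi, centroid))
    return min(POSTURE_CENTROIDS, key=lambda p: dist(POSTURE_CENTROIDS[p]))

print(detect_posture([-71, -72, -74, -73]))  # closest to the "corpse" fingerprint
```

Replacing the centroid lookup with a trained classifier over many tags and time windows is essentially what the deep-learning variants in [9] do.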

1.4. Smart Mat

The mat is a convenient tool for practicing yoga, and there have been numerous attempts to detect yoga postures using a smart mat: a mat equipped with sensors and intelligent techniques that collects data from practitioners and learns from it to make predictions. Smart-mat design is still in its infancy, and considerable research is required before such mats can be widely deployed. Smart mats are usually built by embedding numerous sensors in the mat, with force-sensitive resistors (FSRs) being the most common choice [12]. Chinnaaiah et al. [13] deployed FSRs to design a smart yoga mat; it could detect lying and sitting yoga postures, but standing postures could not be detected with FSR sensors alone. A smart prayer mat is designed in [14], wherein arrays of FSRs were used to detect prayer postures.
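
A toy sketch of FSR-based mat sensing makes the limitation noted for [13] concrete: postures that contact the mat broadly (lying) or partially (sitting) leave distinct pressure footprints, while a standing pose activates too few cells to distinguish. Grid values and thresholds are invented for the example.

```python
THRESHOLD = 0.2  # normalized FSR reading treated as "in contact" (assumed)

def footprint_class(grid):
    # Count cells under pressure and classify by mat coverage.
    active = sum(1 for row in grid for v in row if v > THRESHOLD)
    coverage = active / (len(grid) * len(grid[0]))
    if coverage > 0.5:
        return "lying"
    if coverage > 0.15:
        return "sitting"
    return "standing (unreliable with FSRs alone)"

lying_grid = [[0.8] * 4 for _ in range(4)]                       # broad contact
sitting_grid = [[0.0] * 4 for _ in range(3)] + [[0.9, 0.9, 0.9, 0.0]]
print(footprint_class(lying_grid))
print(footprint_class(sitting_grid))
```

Real systems replace the hand-set coverage thresholds with a classifier trained on the full pressure image.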

2. Vision-Based Approach

The vision-based approach relies on a camera for input, whose frames are processed using intelligent approaches to detect yoga postures [15][16]. The intelligent approaches used in vision-based systems are machine learning, deep learning, and hybrid approaches; a comparison is shown in Table 1.
Several algorithms make use of intelligent and hybrid models. A notable example is [39], which proposes a hybrid approach combining two algorithms, an SVM and Inception V3. Before categorization, the images in the posture dataset were normalized and enhanced; features were then selected using the LASSO feature-selection technique, and the dataset was submitted for model training and validation. To achieve hybridization, the final layer of the Inception V3 transfer-learning model was replaced with an SVM classifier.
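
The shape of this pipeline can be sketched with scikit-learn. In [39] a pretrained Inception V3 supplies the image features; here random features with two informative dimensions stand in, and the data are entirely synthetic, so this shows the LASSO-then-SVM structure rather than the published system.

```python
import numpy as np
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))           # stand-in for CNN (Inception V3) features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # synthetic labels: 2 informative features

model = make_pipeline(
    SelectFromModel(Lasso(alpha=0.05)),  # LASSO feature selection
    SVC(kernel="rbf"),                   # SVM replaces the final TL layer
)
model.fit(X[:150], y[:150])
acc = model.score(X[150:], y[150:])      # held-out accuracy
print(acc)
```

Because only two feature dimensions carry signal, LASSO prunes most of the noise features before the SVM ever sees them, which is the point of the hybridization.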
Using portable systems and intelligent technology to anticipate and manage human health is an essential feature of smart cities. Accordingly, posture recognition in [40] is accomplished using multiple sensors and LoRa technology, whose two main benefits are low cost and a wide communication range. These two technologies are combined to create wearable clothing that is comfortable in any setting. Because of LoRa's low transmission frequency and small data-transfer size, multiprocessing was used in that study: sliding windows are utilized for multiprocessing, while a random forest (RF) handles data processing, feature extraction, and selection. Three testers and a group of 500 datasets are employed to enhance functionality and accuracy [40].

In addition to body language, nonverbal communication also includes gestures and postures. The study in [41] uses augmented reality and state-of-the-art body-tracking techniques to identify static posture; moreover, Kinect body-position sensors and unsupervised machine learning are applied to detect group participation and learning.

Posture detection has made it feasible to practice yoga correctly, but few datasets exist and real-time operation is required, which makes posture detection challenging. To address this, a sizable dataset with at least 5500 images of different yoga postures was produced. The tf-pose estimation method, which renders the human body's skeleton in real time, was employed for posture detection, and many ML algorithms (KNN, logistic regression, SVM, NB, DT, and RF) use the tf-pose skeleton as a feature to extract the locations of the body joints. The RF model achieved the highest accuracy [42][43][44][48]. Another posture issue affecting people is that they spend most of their time sitting down.
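
The sliding-window preprocessing described for the LoRa-based wearable in [40] can be sketched as follows: raw sensor streams are cut into overlapping windows and summarized into small feature vectors, keeping payloads compatible with LoRa's limited data rate. The window size, step, and feature set below are assumptions for illustration.

```python
import numpy as np

def window_features(signal, win=50, step=25):
    # Slide a window of length `win` over the stream with overlap `win - step`,
    # summarizing each window into a compact 4-value feature row.
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)

stream = np.sin(np.linspace(0, 20, 500))  # stand-in for one sensor channel
F = window_features(stream)
print(F.shape)  # one feature row per window
```

Each row, not the raw stream, is what a random forest would consume downstream, and each row is small enough to fit comfortably in a LoRa payload.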
In addition, [45][46] created a hybrid machine-learning strategy for posture recognition by fusing deep neural networks with conventional machine-learning techniques: the final class prediction combines the weights learned by the deep-learning method with the standard model's prediction. Another study [47] classified data using a hybrid CNN–LSTM model after extracting key points with OpenPose; a total of 88 videos of six distinct yoga poses were used to construct the model.
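
One common step in keypoint-based pipelines like the OpenPose approach in [47] is converting raw 2D keypoints into joint angles, which are invariant to the practitioner's position and scale in the frame. The keypoint names and coordinates below are invented for the example; the cited work's exact feature set may differ.

```python
import math

def joint_angle(a, b, c):
    # Angle (degrees) at joint b, formed by segments b->a and b->c.
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

# Hypothetical keypoints (normalized image coordinates).
hip, knee, ankle = (0.0, 1.0), (0.0, 0.5), (0.5, 0.5)
print(round(joint_angle(hip, knee, ankle)))  # bent knee: 90 degrees
```

A vector of such angles (knee, hip, elbow, shoulder, ...) per frame, fed through a CNN–LSTM over time, mirrors the structure of the hybrid model described above.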

References

  1. Pal, R.; Adhikari, D.; Heyat, M.B.B.; Guragai, B.; Lipari, V.; Brito Ballester, J.; De la Torre Díez, I.; Abbas, Z.; Lai, D. A Novel Smart Belt for Anxiety Detection, Classification, and Reduction Using IIoMT on Students' Cardiac Signal and MSY. Bioengineering 2022, 9, 793.
  2. Puranik, K.A.; Kanthi, M. Wearable Device for Yogic Breathing. In Proceedings of the 2019 Amity International Conference on Artificial Intelligence (AICAI), Dubai, United Arab Emirates, 4–6 February 2019; pp. 605–610.
  3. Gupta, A.; Gupta, H.P. YogaHelp: Leveraging Motion Sensors for Learning Correct Execution of Yoga with Feedback. IEEE Trans. Artif. Intell. 2021, 2, 362–371.
  4. Gochoo, M.; Tan, T.H.; Huang, S.C.; Batjargal, T.; Hsieh, J.W.; Alnajjar, F.S.; Chen, Y.F. Novel IoT-based privacy-preserving yoga posture recognition system using low-resolution infrared sensors and deep learning. IEEE Internet Things J. 2019, 6, 7192–7200.
  5. Rishan, F.; De Silva, B.; Alawathugoda, S.; Nijabdeen, S.; Rupasinghe, L.; Liyanapathirana, C. Infinity yoga tutor: Yoga posture detection and correction system. In Proceedings of the 2020 5th International Conference on Information Technology Research (ICITR), Moratuwa, Sri Lanka, 2–4 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6.
  6. Bhagat, A.; Ansarullah, S.I.; Othman, M.T.B.; Hamid, Y.; Alkahtani, H.K.; Ullah, I.; Hamam, H. A Novel Framework for Classification of Different Alzheimer’s Disease Stages Using CNN Model. Electronics 2023, 12, 469.
  7. Sharma, A.; Agrawal, Y.; Shah, Y.; Jain, P. iYogacare: Real-time Yoga recognition and self-correction for smart healthcare. IEEE Consum. Electron. Mag. 2022.
  8. Yadav, S.K.; Agarwal, A.; Kumar, A.; Tiwari, K.; Pandey, H.M.; Akbar, S.A. YogNet: A two-stream network for realtime multiperson yoga action recognition and posture correction. Knowl.-Based Syst. 2022, 250, 109097.
  9. Sun, W. Rfitness: Enabling smart yoga mat for fitness posture detection with commodity passive rfids. In Proceedings of the 2021 IEEE International Conference on RFID (RFID), Delhi, India, 6–8 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–8.
  10. Yao, L.; Sheng, Q.; Ruan, W.; Gu, T.; Li, X.; Falkner, N.; Yang, Z. Rf-care: Device-free posture recognition for elderly people using a passive rfid tag array. In Proceedings of the 12th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services (MOBIQUITOUS), Coimbra, Portugal, 22–24 July 2015.
  11. Nagalakshmi Vallabhaneni, D.P.P. The analysis of the impact of yoga on healthcare and conventional strategies for human pose recognition. Turk. J. Comput. Math. Educ. (TURCOMAT) 2021, 12, 1772–1783.
  12. Tangkongchitr, P.; Buathang, M.; Unsuwan, T.; Wongpatikaseree, K. SAFLOOR: Smart Fall Detection System for the Elderly. In Proceedings of the 2018 International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP), Pattaya, Thailand, 15–17 November 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–6.
  13. Anusha, M.; Dubey, S.; Raju, P.S.; Pasha, I.A. Real–Time Yoga Activity with Assistance of Embedded based Smart Yoga Mat. In Proceedings of the 2019 2nd International Conference on Innovations in Electronics, Signal Processing and Communication (IESC), Shillong, India, 1–2 March 2019; pp. 1–6.
  14. Kasman, K.; Moshnyaga, V.G. New technique for posture identification in smart prayer mat. Electronics 2017, 6, 61.
  15. Islam, M.U.; Mahmud, H.; Ashraf, F.B.; Hossain, I.; Hasan, M.K. Yoga posture recognition by detecting human joint points in real time using microsoft kinect. In Proceedings of the 2017 IEEE Region 10 Humanitarian Technology Conference (R10-HTC), Dhaka, Bangladesh, 21–23 December 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 668–673.
  16. Tang, C.; Chen, X.; Gong, J.; Occhipinti, L.G.; Gao, S. WMNN: Wearables-Based Multi-Column Neural Network for Human Activity Recognition. IEEE J. Biomed. Health Inform. 2022, 27, 339–350.
  17. Kishore, D.M.; Bindu, S.; Manjunath, N.K. Estimation of yoga postures using machine learning techniques. Int. J. Yoga 2022, 15, 137.
  18. Trejo, E.W.; Yuan, P. Recognition of yoga poses through an interactive system with kinect device. In Proceedings of the 2018 2nd International Conference on Robotics and Automation Sciences (ICRAS), Wuhan, China, 23–25 June 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–5.
  19. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32.
  20. Dantone, M.; Gall, J.; Leistner, C.; Van Gool, L. Human pose estimation using body parts dependent joint regressors. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013; pp. 3041–3048.
  21. Gong, W.; Zhang, X.; Gonzàlez, J.; Sobral, A.; Bouwmans, T.; Tu, C.; Zahzah, E.H. Human pose estimation from monocular images: A comprehensive survey. Sensors 2016, 16, 1966.
  22. Kothari, S. Yoga Pose Classification Using Deep Learning. Master’s Thesis, San Jose State University, San Jose, CA, USA, 2020.
  23. Pal, R.; Heyat, M.B.B.; You, Z.; Pardhan, B.; Akhtar, F.; Abbas, S.J.; Guragai, B.; Acharya, K. Effect of Maha Mrityunjaya HYMN recitation on human brain for the analysis of single EEG channel C4-A1 using machine learning classifiers on yoga practitioner. In Proceedings of the 2020 17th International Computer Conference on Wavelet Active Media Technology and Information Processing (ICCWAMTIP), Chengdu, China, 18–20 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 89–92.
  24. Shah, D.; Rautela, V.; Sharma, C.; Florence A, A. Yoga Pose Detection and Correction using Posenet and KNN. In Proceedings of the International Conference on Computing, Communication and Green Engineering (CCGE), Pune, India, 23–25 September 2021; Volume 9, pp. 1290–1293.
  25. Mohanty, A.; Ahmed, A.; Goswami, T.; Das, A.; Vaishnavi, P.; Sahay, R.R. Robust pose recognition using deep learning. In Proceedings of the International Conference on Computer Vision and Image Processing: CVIP 2016; Springer: Berlin/Heidelberg, Germany, 2017; Volume 2, pp. 93–105.
  26. Li, B.; Sano, A. Extraction and interpretation of deep autoencoder-based temporal features from wearables for forecasting personalized mood, health, and stress. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2020, 4, 1–26.
  27. Lee, R.C.; Chen, I.Y. A Deep Dive of Autoencoder Models on Low-Contrast Aquatic Images. Sensors 2021, 21, 4966.
  28. Anand Thoutam, V.; Srivastava, A.; Badal, T.; Kumar Mishra, V.; Sinha, G.; Sakalle, A.; Bhardwaj, H.; Raj, M. Yoga pose estimation and feedback generation using deep learning. Comput. Intell. Neurosci. 2022, 2022, 4311350.
  29. Trumble, M.; Gilbert, A.; Hilton, A.; Collomosse, J. Deep autoencoder for combined human pose estimation and body model upscaling. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 784–800.
  30. Yingdong, R. Research on Different Convolutional Neural Networks in the Classification Scenes of Yoga Poses based on OpenPose Extraction. In Proceedings of the 2022 IEEE International Conference on Advances in Electrical Engineering and Computer Applications (AEECA), Dalian, China, 20–21 August 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1532–1535.
  31. Haque, S.; Rabby, A.S.A.; Laboni, M.A.; Neehal, N.; Hossain, S.A. ExNET: Deep neural network for exercise pose detection. In Proceedings of the Recent Trends in Image Processing and Pattern Recognition: Second International Conference, RTIP2R 2018, Solapur, India, 21–22 December 2018; Springer: Berlin/Heidelberg, Germany, 2019; pp. 186–193.
  32. Belagiannis, V.; Zisserman, A. Recurrent human pose estimation. In Proceedings of the 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017), Washington, DC, USA, 30 May–3 June 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 468–475.
  33. Kuppusamy, P.; Harika, C. Human action recognition using CNN and LSTM-RNN with attention model. Int. J. Innov. Technol. Explor. Eng 2019, 8, 1639–1643.
  34. Janocha, K.; Czarnecki, W.M. On loss functions for deep neural networks in classification. arXiv 2017, arXiv:1702.05659.
  35. Toshev, A.; Szegedy, C. Deeppose: Human pose estimation via deep neural networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 1653–1660.
  36. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Commun. ACM 2017, 60, 84–90.
  37. Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich feature hierarchies for accurate object detection and semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 580–587.
  38. Szegedy, C.; Toshev, A.; Erhan, D. Object detection via deep neural networks. In Proceedings of the Annual Conference on Neural Information Processing Systems NIPS 2013, Lake Tahoe, NV, USA, 5–10 December 2013; Volume 26.
  39. Ogundokun, R.O.; Maskeliūnas, R.; Misra, S.; Damasevicius, R. Hybrid InceptionV3-SVM-Based Approach for Human Posture Detection in Health Monitoring Systems. Algorithms 2022, 15, 410.
  40. Han, J.; Song, W.; Gozho, A.; Sung, Y.; Ji, S.; Song, L.; Wen, L.; Zhang, Q. Lora-based smart IoT application for smart city: An example of human posture detection. Wirel. Commun. Mob. Comput. 2020, 2020, 8822555.
  41. Radu, I.; Tu, E.; Schneider, B. Relationships between body postures and collaborative learning states in an augmented reality study. In Proceedings of the Artificial Intelligence in Education: 21st International Conference, AIED 2020, Ifrane, Morocco, 6–10 July 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 257–262.
  42. Imran, M.A.; Ghannam, R.; Abbasi, Q.H. Engineering and Technology for Healthcare; John Wiley & Sons: New York, NY, USA, 2020.
  43. Swain, D.; Satapathy, S.; Acharya, B.; Shukla, M.; Gerogiannis, V.C.; Kanavos, A.; Giakovis, D. Deep Learning Models for Yoga Pose Monitoring. Algorithms 2022, 15, 403.
  44. Agrawal, Y.; Shah, Y.; Sharma, A. Implementation of machine learning technique for identification of yoga poses. In Proceedings of the 2020 IEEE 9th International Conference on Communication Systems and Network Technologies (CSNT), Gwalior, India, 10–12 April 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 40–43.
  45. Liaqat, S.; Dashtipour, K.; Arshad, K.; Assaleh, K.; Ramzan, N. A hybrid posture detection framework: Integrating machine learning and deep neural networks. IEEE Sens. J. 2021, 21, 9515–9522.
  46. Ashraf, F.B.; Islam, M.U.; Kabir, M.R.; Uddin, J. YoNet: A Neural Network for Yoga Pose Classification. SN Comput. Sci. 2023, 4, 198.
  47. Kumar, D.; Sinha, A. Yoga Pose Detection and Classification Using Deep Learning; LAP LAMBERT Academic Publishing: London, UK, 2020.
  48. Tufail, A.B.; Ullah, K.; Khan, R.A.; Shakir, M.; Khan, M.A.; Ullah, I.; Ma, Y.K.; Ali, M. On improved 3D-CNN-based binary and multiclass classification of alzheimer’s disease using neuroimaging modalities and data augmentation methods. J. Healthc. Eng. 2022, 2022, 1302170.
Update Date: 24 Apr 2023