Microsoft HoloLens 2 in Medical and Healthcare Context

Microsoft® HoloLens, developed and manufactured by Microsoft (MS), is a pair of mixed reality (MR) smart glasses that creates an environment in which real and virtual elements appear to coexist. More specifically, the Microsoft® HoloLens is an MR-based head-mounted display (HMD) that places the user at the centre of an immersive experience and allows them to interact with the surrounding environment through holograms while engaging their senses throughout. It is used in a variety of applications, such as medical and surgical aids and systems, medical education and simulation, architecture and several engineering fields (civil, industrial and so on). The use of HoloLens 2 in a medical and healthcare context was analyzed by dividing contributions into the following sub-field applications: surgical navigation, AR-BCI (Brain-Computer Interface) systems integration and human-computer interaction (HCI), gait analysis and rehabilitation, medical education and training/virtual teaching/tele-mentoring/tele-consulting, and other applications.

Keywords: HoloLens; head-mounted display; augmented reality; mixed reality; telemedicine

1. Surgical Navigation

Most of the studies [1][2][3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18][19][20][21][22][23][24][25][26][27][28] herein focus on the application of HoloLens 2 to surgical navigation in the operating room and in the emergency department. The AR/MR-based systems described provide the user with computer-generated information superimposed on the real-world environment and improve the accuracy, safety and efficacy of surgical procedures [29]. The AR-based HoloLens 2 is mainly used as a surgical aid for the visualization of medical data, blood vessel search, and targeting support for the precise positioning of mechanical elements [29].
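The registration step that underlies such overlays can be illustrated with a short sketch. The following is a minimal, generic example (not taken from any of the reviewed systems), assuming paired fiducial points measured in the image/model frame and in the headset's world frame; the Kabsch/SVD solution recovers the rigid transform used to anchor the hologram, and all point values are hypothetical.

# Minimal sketch (not from the reviewed studies): point-based rigid registration,
# the step that lets an AR headset superimpose a preoperative 3D model on the patient.
import numpy as np

def rigid_registration(image_pts: np.ndarray, world_pts: np.ndarray):
    """Estimate R, t such that world_pts ~= R @ image_pts + t (Kabsch/SVD)."""
    ci, cw = image_pts.mean(axis=0), world_pts.mean(axis=0)
    H = (image_pts - ci).T @ (world_pts - cw)                    # 3x3 covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cw - R @ ci
    return R, t

# Example with four hypothetical fiducials (millimetres) in both coordinate frames.
img = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]], float)
wld = (np.array([[0, 0, 1], [0, 1, 0], [-1, 0, 0]]) @ img.T).T + np.array([100, 20, 5])
R, t = rigid_registration(img, wld)
fre = np.linalg.norm((img @ R.T + t) - wld, axis=1).mean()       # fiducial registration error
print(f"mean FRE: {fre:.3f} mm")

In practice the fiducial registration error printed at the end is the kind of accuracy figure the cited navigation studies report.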

2. Human Computer Interaction and AR-BCI (Brain-Computer Interface) Systems Integration

In recent years, Brain-Computer Interface (BCI) applications have grown rapidly, establishing themselves as an emerging technology [30] tested in several scenarios such as rehabilitation [31], robotics [32], precision surgery and speech recognition. However, the usability of much professional brain-sensing equipment remains limited: these systems are still expensive, bulky, tethered, uncomfortable to wear because of the gel applied to the electrodes, and prone to classification errors. Thus, the current trend in the scientific community is toward using BCI systems in combination with other input modalities such as gaze trackers [33] or head-mounted displays (HMDs) such as Virtual Reality (VR) [34] and Augmented Reality (AR) headsets [35][36].
Two research contributions [37][38] integrated BCI and AR HMD systems within the same physical prototype. More specifically, a first pilot study [37] proposed a prototype that combines the Microsoft HoloLens 2 with an EEG BCI system based on covert visuospatial attention (CVSA), a process of focusing attention on different regions of the visual field without overt eye movements. Fourteen participants were enrolled to test the system over the course of two days using a CVSA paradigm. Another study [38], building on the clip-on solution introduced for the AR-BCI integration, designed a simple 3D game that changed in real time according to the user's state of attention measured via EEG, coupling the prototype with a real-time attention classifier. The results of these studies, though promising, must be considered preliminary because of the small number of participants (n = 14).
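For illustration, the sketch below shows a generic EEG attention classifier of the kind such AR-BCI prototypes couple to the headset; it is not the authors' pipeline. Alpha-band power per channel is used as the feature and linear discriminant analysis as the model; the sampling rate, the synthetic epochs and the labels are all assumptions.

# Hedged sketch of a generic EEG attention classifier (band power + LDA).
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 250  # assumed EEG sampling rate in Hz

def alpha_band_power(epochs: np.ndarray, lo=8.0, hi=13.0) -> np.ndarray:
    """Mean 8-13 Hz power per channel for each epoch (trials x channels x samples)."""
    freqs, psd = welch(epochs, fs=FS, nperseg=FS, axis=-1)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[..., band].mean(axis=-1)          # shape: trials x channels

# Synthetic stand-in data: 80 trials, 8 channels, 2 s each.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((80, 8, 2 * FS))
labels = rng.integers(0, 2, size=80)             # 0 = attend left, 1 = attend right

X = alpha_band_power(epochs)
clf = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())

With real CVSA recordings, the cross-validated accuracy printed here would correspond to the online classification performance such prototypes report.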
In addition to the contributions described above [37][38], the work of Wolf et al. [39] falls within the human-computer interaction (HCI) field. More specifically, the authors analyzed hand-eye coordination in real time to predict hand actions during target selection, making it possible to intercept users' potential errors before they occur. In a first user study, 10 participants were enrolled and recorded while playing a memory card game, which involves frequent hand-eye coordination with little task-relevant information. In a second user study, with a group of 12 participants, the real-time effectiveness of the method in stopping participants' motions in time (i.e., before they reach and start manipulating a target) was evaluated. Although this contribution is limited by the small number of participants, the results demonstrated that the implemented method provided effective support, with a mean accuracy of 85.9%.
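As a rough illustration (not Wolf et al.'s implementation), the sketch below flags a likely erroneous selection when the target the hand is heading toward differs from the target the eyes fixate; the target positions, gaze hit and hand state are hypothetical inputs that a headset's eye and hand tracking would supply.

# Illustrative sketch only: gaze target vs. the target the hand is heading toward.
import numpy as np

def predicted_hand_target(hand_pos, hand_vel, targets, min_speed=0.05):
    """Index of the target best aligned with the hand's motion direction, or None."""
    speed = np.linalg.norm(hand_vel)
    if speed < min_speed:                        # hand not yet reaching
        return None
    direction = hand_vel / speed
    to_targets = targets - hand_pos
    cosines = (to_targets @ direction) / np.linalg.norm(to_targets, axis=1)
    return int(np.argmax(cosines))

targets = np.array([[0.2, 0.0, 0.5], [-0.2, 0.0, 0.5], [0.0, 0.2, 0.5]])  # metres
gaze_target = 0                                  # target currently fixated
hand_pos = np.array([0.0, -0.1, 0.3])
hand_vel = np.array([-0.4, 0.1, 0.4])            # heading toward target 1

hand_target = predicted_hand_target(hand_pos, hand_vel, targets)
if hand_target is not None and hand_target != gaze_target:
    print(f"warning: hand heading to target {hand_target}, gaze on target {gaze_target}")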
Another hot topic in virtual reality research is the use of embodied avatars (i.e., 3D models of human beings controlled by the user), or so-called full-body illusions, a promising tool for enhancing users' mental health. Complementing this research, augmented reality can incorporate real elements, such as the therapist or the user's real body, into therapeutic scenarios. Wolf et al. [40] presented a holographic AR mirror system based on an optical see-through (OST) device and markerless body tracking and collected qualitative feedback on its user experience. Additionally, the authors compared quantitative results in terms of presence, embodiment and body weight perception with similar systems using video see-through (VST) AR and VR. The comparative evaluation between OST AR, VST AR and VR revealed significant differences in relevant measures (lower feelings of presence and higher perceived body weight of the generic avatar when using the OST AR system).

3. Gait Analysis and Rehabilitation

Augmented reality may be a technological solution for the assessment of gait and functional mobility metrics in clinical settings. Indeed, AR headsets provide interactive digital stimuli in the context of ecologically valid daily activities while allowing the user's movements to be objectively quantified through the built-in inertial measurement units (IMUs). The project of Koop et al. [41] aimed to determine the equivalency of kinematic outcomes characterizing lower-extremity function derived from the HoloLens 2 and from three-dimensional (3D) motion capture systems (MoCap). Kinematic data from sixty-six healthy adults were collected using the HoloLens 2 and MoCap while they completed two lower-extremity tasks: (1) continuous walking and (2) the timed up-and-go (TUG). The authors demonstrated that the TUG metrics, including turn duration and velocity, were statistically equivalent between the two systems.
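Statistical equivalence of this kind is commonly assessed with a two one-sided tests (TOST) procedure; the sketch below shows one way to run it on paired turn-duration measurements. The equivalence margin and the synthetic data are illustrative assumptions, not values from the study.

# Hedged sketch: paired TOST for equivalence of HoloLens 2 and MoCap turn durations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mocap = rng.normal(2.0, 0.3, size=66)              # turn duration (s), MoCap
hololens = mocap + rng.normal(0.0, 0.05, size=66)  # HoloLens 2 estimates

def tost_paired(a, b, delta):
    """p-value of the paired TOST: equivalence if both one-sided tests reject."""
    d = a - b
    p_lower = stats.ttest_1samp(d, -delta, alternative="greater").pvalue
    p_upper = stats.ttest_1samp(d, delta, alternative="less").pvalue
    return max(p_lower, p_upper)

p = tost_paired(hololens, mocap, delta=0.1)        # +/-0.1 s equivalence margin (assumed)
print(f"TOST p-value: {p:.4f} -> {'equivalent' if p < 0.05 else 'not shown equivalent'}")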
In the rehabilitation context, technologies such as virtual and augmented reality can also enable gait and balance training outside the clinic. The study of Held et al. [42] aimed to investigate the manipulation of the gait pattern of persons who have had a stroke based on virtual augmentation during overground walking, compared with walking without AR performance feedback. Subsequently, the usability of the AR feedback prototype was evaluated in a chronic stroke subject with minor gait and balance impairments. The results provided the first evidence of gait adaptation during overground walking based on real-time feedback through visual and auditory augmentation.

4. Medical Education and Training/Virtual Teaching/Tele-Mentoring/Tele-Consulting

During the COVID-19 pandemic, undergraduate medical training was significantly restricted, starting with the suspension of medical student clerkships. To keep delivering training to medical students, augmented reality began to emerge as a medical education and training tool, opening new and promising possibilities for visualization of and interaction with digital content.
Nine contributions [43][44][45][46][47][48][49][50][51] described the use of AR technology and the feasibility of using the HoloLens 2 headset to deliver remote bedside teaching, to enable 3D displays that facilitate the learning process, or to support tele-mentoring and tele-consulting.
Wolf et al. [43], for example, investigated the potential benefits of AR-based, step-by-step contextual instructions for ECMO cannulation training and compared them with the conventional training instructions regularly used at a university hospital. A comparative study between conventional and AR-based instructions for ECMO cannulation training was conducted with 21 medical students. The results demonstrated the high potential of AR instructions to improve ECMO cannulation training outcomes as a result of better information acquisition by participants during task execution.
Several studies [49][51] confirmed that the use of AR technology also enhances the performance of tele-mentoring and tele-consulting systems in healthcare environments [49]. Tele-mentoring can be considered an approach in which a mentor interactively guides a mentee at a different geographic location using a technological communication device.
Bui et al. [49] demonstrated the usability of AR technology for tele-mentoring clinical healthcare professionals in managing clinical scenarios. In a quasi-experimental study, four experienced health professionals and a minimum of 12 novice health practitioners were recruited for the roles of mentors and mentees, respectively. Each mentee wore the AR headset and performed up to four different clinical scenarios (Acute Coronary Syndrome, Acute Myocardial Infarction, Pneumonia Severe Reaction to Antibiotics, and Hypoglycaemic Emergency) in a simulated learning environment. The mentor, who stayed in a separate room, used a laptop to provide the mentee with remote instruction and guidance following the standard protocols for each scenario. The mentors' and mentees' perception of the AR system's usability, the mentorship effectiveness, and the mentees' self-confidence and skill performance were considered as outcome measures.
Bala et al. [50] presented a proof-of-concept study at a London teaching hospital using mixed reality (MR) technology (HoloLens 2™) to deliver a remote-access teaching ward round. The feasibility, acceptability and effectiveness of this technology for educational purposes were evaluated from the perspectives of students, faculty members and patients.

5. Other Applications

Four contributions [52][53][54][55] have been included in the "other applications" subgroup since, due to their characteristics, they cannot be assigned to the subgroups mentioned above. More specifically, the study of Onishi et al. [52] implemented a prototype system, named Gaze-Breath, in which gaze and breathing are captured with an MR headset and a thermal camera, respectively, to provide hands-free, intuitive input that controls the cursor three-dimensionally and facilitates switching between pointing and selection. Johnson et al. [53] developed and preliminarily tested a radiotherapy system for patient posture correction and alignment using mixed reality visualization. Kurazume et al. [54] presented a comparative study of two AR training systems for Humanitude dementia care, a multimodal comprehensive care methodology for patients with dementia; they introduced a new prototype called HEARTS 2, consisting of a Microsoft HoloLens 2 together with realistic, animated computer graphics (CG) models of older women. Finally, Matyash et al. [55] investigated the accuracy of the HoloLens 2 inertial measurement units (IMUs) in medical environments, analyzing the accuracy and repeatability of HoloLens 2 position finding to provide a quantitative measure of pose repeatability and of deviation from a path while in motion.
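As an illustration of the last point, the sketch below computes two metrics one could derive from logged HoloLens 2 head poses, repeatability at a fixed station and mean deviation from a straight reference path; it is not Matyash et al.'s protocol, and all data are synthetic placeholders.

# Minimal sketch: repeatability and path-deviation metrics from logged head poses.
import numpy as np

def repeatability(positions: np.ndarray) -> float:
    """RMS distance of repeated position fixes from their centroid (metres)."""
    centroid = positions.mean(axis=0)
    return float(np.sqrt(((positions - centroid) ** 2).sum(axis=1).mean()))

def path_deviation(track: np.ndarray, a: np.ndarray, b: np.ndarray) -> float:
    """Mean distance of a walked track from the reference segment a-b (metres)."""
    ab = b - a
    t = np.clip((track - a) @ ab / (ab @ ab), 0.0, 1.0)   # projection parameter
    nearest = a + t[:, None] * ab
    return float(np.linalg.norm(track - nearest, axis=1).mean())

rng = np.random.default_rng(2)
station_fixes = np.array([0.0, 1.6, 2.0]) + rng.normal(0, 0.003, size=(20, 3))
walk = np.linspace([0, 1.6, 0], [5, 1.6, 0], 200) + rng.normal(0, 0.01, size=(200, 3))

print(f"repeatability: {repeatability(station_fixes) * 1000:.1f} mm")
print(f"path deviation: {path_deviation(walk, np.array([0, 1.6, 0.0]), np.array([5, 1.6, 0.0])) * 1000:.1f} mm")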
The number of publications on the Microsoft® HoloLens 2 application in a medical and healthcare context is shown in Figure 1.
Figure 1. Reviewed publications related to HoloLens 2 research by year.
Starting from the year following the release of the second-generation product, demand for HoloLens 2 in the medical sector has grown rapidly, and the research is expected to expand further in the future. Indeed, the number of publications rose from 3 in 2020 to 19 in 2021 and 25 in 2022.
In addition, the use of HoloLens 2 in a medical and healthcare context was analyzed by dividing contributions into the following sub-field applications: surgical navigation, AR-BCI systems integration and human-computer interaction, gait analysis and rehabilitation, medical education and training/virtual teaching/tele-mentoring/tele-consulting, and other applications. Figure 2 illustrates that surgical navigation is the most common application of HoloLens 2 (60%, n = 28) and that the use of this technology in medical training/virtual teaching/tele-mentoring and tele-consulting contexts is also increasing considerably (19%, n = 9).
Figure 2. Reviewed publications related to HoloLens 2 research by sub-field applications.
Despite the enormous potential of augmented reality in gait analysis and rehabilitation as well as in brain-computer interfaces, few studies have been published so far (4%, n = 2 and 8%, n = 4, respectively).
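For reference, the sub-field shares reported above can be reproduced from the raw counts given in the text; the short tabulation below does so (rounding explains the small differences from the stated percentages).

# Quick tabulation of the reviewed contributions by sub-field (counts from the text).
counts = {
    "surgical navigation": 28,
    "education/teaching/tele-mentoring/tele-consulting": 9,
    "AR-BCI integration and HCI": 4,
    "gait analysis and rehabilitation": 2,
    "other applications": 4,
}
total = sum(counts.values())                     # 47 reviewed contributions
for name, n in counts.items():
    print(f"{name}: n = {n} ({100 * n / total:.0f}%)")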
Concerning the type of publication, most of the reviewed papers were research articles (77%, n = 36), while a smaller percentage (23%, n = 11) consisted of conference proceedings (Figure 3).
Figure 3. Reviewed publications related to HoloLens 2 research by type of publication.
Analyzing the review results in terms of visualization technologies (Figure 4), two types of approaches, AR and MR, were used for applications in a medical and healthcare context. More specifically, AR applications were the most common, appearing in 33 research papers (70%), while MR was present in only 14 contributions (30%).
Figure 4. Reviewed publications related to HoloLens 2 research by types of visualization technologies.
Considering the huge application potential of this technology, also demonstrated during the COVID-19 pandemic, this systematic literature review aims to demonstrate the feasibility and applicability of HoloLens 2 in a medical and healthcare context, to highlight the limitations of this innovative approach, and to bring focus to emerging research topics such as telemedicine, remote control and motor rehabilitation.

References

  1. Wang, L.; Zhao, Z.; Wang, G.; Zhou, J.; Zhu, H.; Guo, H.; Huang, H.; Yu, M.; Zhu, G.; Li, N.; et al. Application of a three-dimensional visualization model in intraoperative guidance of percutaneous nephrolithotomy. Int. J. Urol. 2022, 29, 838–844.
  2. Liu, X.; Sun, J.; Zheng, M.; Cui, X. Application of Mixed Reality Using Optical See-Through Head-Mounted Displays in Transforaminal Percutaneous Endoscopic Lumbar Discectomy. BioMed Res. Int. 2021, 2021, 9717184.
  3. Eom, S.; Kim, S.; Rahimpour, S.; Gorlatova, M. AR-Assisted Surgical Guidance System for Ventriculostomy. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2022, Christchurch, New Zealand, 12–16 March 2022; pp. 402–405.
  4. Kitagawa, M.; Sugimoto, M.; Haruta, H.; Umezawa, A.; Kurokawa, Y. Intraoperative holography navigation using a mixed-reality wearable computer during laparoscopic cholecystectomy. Surgery 2022, 171, 1006–1013.
  5. Doughty, M.; Ghugre, N.R. Head-Mounted Display-Based Augmented Reality for Image-Guided Media Delivery to the Heart: A Preliminary Investigation of Perceptual Accuracy. J. Imaging 2022, 8, 33.
  6. Torabinia, M.; Caprio, A.; Fenster, T.B.; Mosadegh, B. Single Evaluation of Use of a Mixed Reality Headset for Intra-Procedural Image-Guidance during a Mock Laparoscopic Myomectomy on an Ex-Vivo Fibroid Model. Appl. Sci. 2022, 12, 563.
  7. Gsaxner, C.; Pepe, A.; Schmalstieg, D.; Li, J.; Egger, J. Inside-out instrument tracking for surgical navigation in augmented reality. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST, Osaka, Japan, 8–10 December 2021.
  8. García-sevilla, M.; Moreta-martinez, R.; García-mato, D.; Pose-diez-de-la-lastra, A.; Pérez-mañanes, R.; Calvo-haro, J.A.; Pascau, J. Augmented reality as a tool to guide psi placement in pelvic tumor resections. Sensors 2021, 21, 7824.
  9. Amiras, D.; Hurkxkens, T.J.; Figueroa, D.; Pratt, P.J.; Pitrola, B.; Watura, C.; Rostampour, S.; Shimshon, G.J.; Hamady, M. Augmented reality simulator for CT-guided interventions. Eur. Radiol. 2021, 31, 8897–8902.
  10. Park, B.J.; Hunt, S.J.; Nadolski, G.J.; Gade, T.P. Augmented reality improves procedural efficiency and reduces radiation dose for CT-guided lesion targeting: A phantom study using HoloLens 2. Sci. Rep. 2020, 10, 18620.
  11. Benmahdjoub, M.; Niessen, W.J.; Wolvius, E.B.; Van Walsum, T. Virtual extensions improve perception-based instrument alignment using optical see-through devices. IEEE Trans. Vis. Comput. Graph. 2021, 27, 4332–4341.
  12. Benmahdjoub, M.; Niessen, W.J.; Wolvius, E.B.; Walsum, T. Multimodal markers for technology-independent integration of augmented reality devices and surgical navigation systems. Virtual Real 2022.
  13. Farshad, M.; Spirig, J.M.; Suter, D.; Hoch, A.; Burkhard, M.D.; Liebmann, F.; Farshad-Amacker, N.A.; Fürnstahl, P. Operator independent reliability of direct augmented reality navigated pedicle screw placement and rod bending. N. Am. Spine Soc. J. 2021, 8, 100084.
  14. Doughty, M.; Singh, K.; Ghugre, N.R. SurgeonAssist-Net: Towards Context-Aware Head-Mounted Display-Based Augmented Reality for Surgical Guidance. In Medical Image Computing and Computer Assisted Intervention; Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Cham, Switzerland, 2021; pp. 667–677.
  15. Nagayo, Y.; Saito, T.; Oyama, H. Augmented reality self-training system for suturing in open surgery: A randomized controlled trial. Int. J. Surg. 2022, 102, 106650.
  16. Nagayo, Y.; Saito, T.; Oyama, H. A Novel Suture Training System for Open Surgery Replicating Procedures Performed by Experts Using Augmented Reality. J. Med. Syst. 2021, 45, 60.
  17. Haxthausen, F.V.; Chen, Y.; Ernst, F. Superimposing holograms on real world objects using HoloLens 2 and its depth camera. Curr. Dir. Biomed. Eng. 2021, 7, 20211126.
  18. Wierzbicki, R.; Pawłowicz, M.; Job, J.; Balawender, R.; Kostarczyk, W.; Stanuch, M.; Janc, K.; Skalski, A. 3D mixed-reality visualization of medical imaging data as a supporting tool for innovative, minimally invasive surgery for gastrointestinal tumors and systemic treatment as a new path in personalized treatment of advanced cancer diseases. J. Cancer Res. Clin. Oncol. 2022, 148, 237–243.
  19. Brunzini, A.; Mandolini, M.; Caragiuli, M.; Germani, M.; Mazzoli, A.; Pagnoni, M. HoloLens 2 for Maxillofacial Surgery: A Preliminary Study. In Design Tools and Methods in Industrial Engineering II; Lecture Notes in Mechanical Engineering; Springer: Cham, Switzerland, 2022; pp. 133–140.
  20. Thabit, A.; Benmahdjoub, M.; van Veelen, M.L.C.; Niessen, W.J.; Wolvius, E.B.; van Walsum, T. Augmented reality navigation for minimally invasive craniosynostosis surgery: A phantom study. Int. J. Comput. Assist. Radiol. Surg. 2022, 17, 1453–1460.
  21. Cercenelli, L.; Babini, F.; Badiali, G.; Battaglia, S.; Tarsitano, A.; Marchetti, C.; Marcelli, E. Augmented Reality to Assist Skin Paddle Harvesting in Osteomyocutaneous Fibular Flap Reconstructive Surgery: A Pilot Evaluation on a 3D-Printed Leg Phantom. Front. Oncol. 2021, 11, 804748.
  22. Felix, B.; Kalatar, S.B.; Moatz, B.; Hofstetter, C.; Karsy, M.; Parr, R.; Gibby, W. Augmented Reality Spine Surgery Navigation: Increasing Pedicle Screw Insertion Accuracy for Both Open and Minimally Invasive Spine Surgeries. Spine 2022, 47, 865–872.
  23. Tu, P.; Gao, Y.; Lungu, A.J.; Li, D.; Wang, H.; Chen, X. Augmented reality based navigation for distal interlocking of intramedullary nails utilizing Microsoft HoloLens 2. Comput. Biol. Med. 2021, 133, 104402.
  24. Zhou, Z.; Yang, Z.; Jiang, S.; Zhuo, J.; Zhu, T.; Ma, S. Augmented reality surgical navigation system based on the spatial drift compensation method for glioma resection surgery. Med. Phys. 2022, 49, 3963–3979.
  25. Ivanov, V.M.; Krivtsov, A.M.; Strelkov, S.V.; Kalakutskiy, N.V.; Yaremenko, A.I.; Petropavlovskaya, M.Y.; Portnova, M.N.; Lukina, O.V.; Litvinov, A.P. Intraoperative use of mixed reality technology in median neck and branchial cyst excision. Future Internet 2021, 13, 214.
  26. Heinrich, F.; Schwenderling, L.; Joeres, F.; Hansen, C. 2D versus 3D: A Comparison of Needle Navigation Concepts between Augmented Reality Display Devices. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2022, Christchurch, New Zealand, 12–16 March 2022; pp. 260–269.
  27. Morita, S.; Suzuki, K.; Yamamoto, T.; Kunihara, M.; Hashimoto, H.; Ito, K.; Fujii, S.; Ohya, J.; Masamune, K.; Sakai, S. Mixed Reality Needle Guidance Application on Smartglasses Without Pre-procedural CT Image Import with Manually Matching Coordinate Systems. CardioVascular Interv. Radiol. 2022, 45, 349–356.
  28. Mitani, S.; Sato, E.; Kawaguchi, N.; Sawada, S.; Sakamoto, K.; Kitani, T.; Sanada, T.; Yamada, H.; Hato, N. Case-specific three-dimensional hologram with a mixed reality technique for tumor resection in otolaryngology. Laryngoscope Investig. Otolaryngol. 2021, 6, 432–437.
  29. Vávra, P.; Roman, J.; Zonča, P.; Ihnát, P.; Němec, M.; Kumar, J.; Habib, N.; El-Gendi, A. Recent Development of Augmented Reality in Surgery: A Review. J. Healthc. Eng. 2017, 2017, 4574172.
  30. Zabcikova, M.; Koudelkova, Z.; Jasek, R.; Lorenzo Navarro, J.J. Recent advances and current trends in brain-computer interface research and their applications. Int. J. Dev. Neurosci. 2022, 82, 107–123.
  31. van Dokkum, L.E.H.; Ward, T.; Laffont, I. Brain computer interfaces for neurorehabilitation-its current status as a rehabilitation strategy post-stroke. Ann. Phys. Rehabil. Med. 2015, 58, 3–8.
  32. Palumbo, A.; Gramigna, V.; Calabrese, B.; Ielpo, N. Motor-imagery EEG-based BCIs in wheelchair movement and control: A systematic literature review. Sensors 2021, 21, 6285.
  33. Kos’Myna, N.; Tarpin-Bernard, F. Evaluation and comparison of a multimodal combination of BCI paradigms and eye tracking with affordable consumer-grade hardware in a gaming context. IEEE Trans. Comput. Intell. AI Games 2013, 5, 150–154.
  34. Amores, J.; Richer, R.; Zhao, N.; Maes, P.; Eskofier, B.M. Promoting relaxation using virtual reality, olfactory interfaces and wearable EEG. In Proceedings of the 2018 IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks, BSN 2018, Las Vegas, NV, USA, 4–7 March 2018; pp. 98–101.
  35. Semertzidis, N.; Scary, M.; Andres, J.; Dwivedi, B.; Kulwe, Y.C.; Zambetta, F.; Mueller, F.F. Neo-Noumena: Augmenting Emotion Communication. In Proceedings of the Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020.
  36. Kohli, V.; Tripathi, U.; Chamola, V.; Rout, B.K.; Kanhere, S.S. A review on Virtual Reality and Augmented Reality use-cases of Brain Computer Interface based applications for smart cities. Microprocess. Microsyst. 2022, 88, 104392.
  37. Kosmyna, N.; Hu, C.Y.; Wang, Y.; Wu, Q.; Scheirer, C.; Maes, P. A Pilot Study using Covert Visuospatial Attention as an EEG-based Brain Computer Interface to Enhance AR Interaction. In Proceedings of the International Symposium on Wearable Computers, ISWC, Cancun, Mexico, 12–16 September 2020; pp. 43–47.
  38. Kosmyna, N.; Wu, Q.; Hu, C.Y.; Wang, Y.; Scheirer, C.; Maes, P. Assessing Internal and External Attention in AR using Brain Computer Interfaces: A Pilot Study. In Proceedings of the 2021 IEEE 17th International Conference on Wearable and Implantable Body Sensor Networks, BSN 2021, Athens, Greece, 27–30 July 2021.
  39. Wolf, J.; Lohmeyer, Q.; Holz, C.; Meboldt, M. Gaze comes in Handy: Predicting and preventing erroneous hand actions in ar-supported manual tasks. In Proceedings of the 2021 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2021, Bari, Italy, 4–8 October 2021; pp. 166–175.
  40. Wolf, E.; Fiedler, M.L.; Dollinger, N.; Wienrich, C.; Latoschik, M.E. Exploring Presence, Avatar Embodiment, and Body Perception with a Holographic Augmented Reality Mirror. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2022, Christchurch, New Zealand, 12–16 March 2022; pp. 350–359.
  41. Koop, M.M.; Rosenfeldt, A.B.; Owen, K.; Penko, A.L.; Streicher, M.C.; Albright, A.; Alberts, J.L. The Microsoft HoloLens 2 Provides Accurate Measures of Gait, Turning, and Functional Mobility in Healthy Adults. Sensors 2022, 22, 2009.
  42. Held, J.P.O.; Yu, K.; Pyles, C.; Bork, F.; Heining, S.M.; Navab, N.; Luft, A.R. Augmented reality-based rehabilitation of gait impairments: Case report. JMIR mHealth uHealth 2020, 8, e17804.
  43. Wolf, J.; Wolfer, V.; Halbe, M.; Maisano, F.; Lohmeyer, Q.; Meboldt, M. Comparing the effectiveness of augmented reality-based and conventional instructions during single ECMO cannulation training. Int. J. Comput. Assist. Radiol. Surg. 2021, 16, 1171–1180.
  44. Mill, T.; Parikh, S.; Allen, A.; Dart, G.; Lee, D.; Richardson, C.; Howell, K.; Lewington, A. Live streaming ward rounds using wearable technology to teach medical students: A pilot study. BMJ Simul. Technol. Enhanc. Learn. 2021, 7, 494–500.
  45. Levy, J.B.; Kong, E.; Johnson, N.; Khetarpal, A.; Tomlinson, J.; Martin, G.F.; Tanna, A. The mixed reality medical ward round with the MS HoloLens 2: Innovation in reducing COVID-19 transmission and PPE usage. Future Healthc. J. 2021, 8, e127–e130.
  46. Sivananthan, A.; Gueroult, A.; Zijlstra, G.; Martin, G.; Baheerathan, A.; Pratt, P.; Darzi, A.; Patel, N.; Kinross, J. Using Mixed Reality Headsets to Deliver Remote Bedside Teaching during the COVID-19 Pandemic: Feasibility Trial of HoloLens 2. JMIR Form. Res. 2022, 6, e35674.
  47. Rafi, D.; Stackhouse, A.A.; Walls, R.; Dani, M.; Cowell, A.; Hughes, E.; Sam, A.H. A new reality: Bedside geriatric teaching in an age of remote learning. Future Healthc. J. 2021, 8, e714–e716.
  48. Dolega-Dolegowski, D.; Proniewska, K.; Dolega-Dolegowska, M.; Pregowska, A.; Hajto-Bryk, J.; Trojak, M.; Chmiel, J.; Walecki, P.; Fudalej, P.S. Application of holography and augmented reality based technology to visualize the internal structure of the dental root—A proof of concept. Head Face Med. 2022, 18, 12.
  49. Bui, D.T.; Barnett, T.; Hoang, H.; Chinthammit, W. Usability of augmented reality technology in tele-mentorship for managing clinical scenarios-A study protocol. PLoS ONE 2022, 17, e0266255.
  50. Bala, L.; Kinross, J.; Martin, G.; Koizia, L.J.; Kooner, A.S.; Shimshon, G.J.; Hurkxkens, T.J.; Pratt, P.J.; Sam, A.H. A remote access mixed reality teaching ward round. Clin. Teach. 2021, 18, 386–390.
  51. Mentis, H.M.; Avellino, I.; Seo, J. AR HMD for Remote Instruction in Healthcare. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2022, Christchurch, New Zealand, 12–16 March 2022; pp. 437–440.
  52. Onishi, R.; Morisaki, T.; Suzuki, S.; Mizutani, S.; Kamigaki, T.; Fujiwara, M.; Makino, Y.; Shinoda, H. GazeBreath: Input Method Using Gaze Pointing and Breath Selection. In Proceedings of the Augmented Humans 2022, Kashiwa, Chiba, Japan, 13–15 March 2022; pp. 1–9.
  53. Johnson, P.B.; Jackson, A.; Saki, M.; Feldman, E.; Bradley, J. Patient posture correction and alignment using mixed reality visualization and the HoloLens 2. Med. Phys. 2022, 49, 15–22.
  54. Kurazume, R.; Hiramatsu, T.; Kamei, M.; Inoue, D.; Kawamura, A.; Miyauchi, S.; An, Q. Development of AR training systems for Humanitude dementia care. Adv. Robot. 2022, 36, 344–358.
  55. Matyash, I.; Kutzner, R.; Neumuth, T.; Rockstroh, M. Accuracy measurement of HoloLens2 IMUs in medical environments. Curr. Dir. Biomed. Eng. 2021, 7, 633–636.