Volumetric Haptic Displays

Volumetric interfaces and displays have been an area of substantial focus in human–computer interaction (HCI), bringing about new ways to enhance interactivity, learning, and understanding as an extension to existing modalities. While much research in the area exists, current haptic display projects tend to be difficult to implement as part of existing visual display technologies. 

Keywords: communication; hardware interfaces; haptic devices

1. Introduction

Volumetric interfaces and displays have been an area of substantial focus in human–computer interaction (HCI), bringing about new ways to enhance interactivity, learning, and understanding as an extension to existing modalities. While much research in the area exists, current haptic display projects tend to be difficult to implement as part of existing visual display technologies. Moreover, visual imagery is often an afterthought when designing haptic displays; when it is included, it often relies on techniques not suited for portability, such as overhead projection. This limits the places and situations in which the current state of volumetric haptic displays can be used. Existing commercial volumetric haptic technologies tend to separate the visual (stereoscopic) component from the volumetric haptic component (afferent flow), with the visual imagery typically provided by a nearby display or a VR/AR headset. While this may work well in a controlled environment with fixed equipment to provide an enhanced experience, these products are not designed to be used with or integrated into the more common devices and displays we use daily.

2. Volumetric Haptic Displays

A good example of this is the inFORCE shape display, a proof-of-concept prototype implemented by Ken Nakagaki and his team at the MIT Media Lab [1]. It uses the bed-of-pins design, a mechanical arrangement proven to work in many other implementations [2][3][4][5][6]. Nakagaki's prototype is complemented with visuals projected onto the bed of pins, whose pins are raised in proportion to the top surface of the simulated 3D object so that it can be explored by hand. While the bed-of-pins display is quite effective at simulating volume, it is held back by its often cumbersome design requirements related to the physical size of the pins themselves and the accompanying actuator technology [7], as well as by its current inability to integrate with modern display technologies.
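To make the rendering principle concrete, the following is a minimal, illustrative sketch (not code from the cited work) of how a height map of a virtual object could be mapped onto a grid of pin extensions; the grid size, pin travel, and height-map source are assumptions.

```python
import numpy as np

PIN_GRID = (16, 24)      # pins along each axis (assumed)
MAX_TRAVEL_MM = 50.0     # maximum pin extension in mm (assumed)

def heightmap_to_pin_targets(heightmap: np.ndarray) -> np.ndarray:
    """Downsample a normalized height map (values in [0, 1]) to the pin grid
    and scale it to physical pin extensions in millimetres."""
    rows = np.array_split(heightmap, PIN_GRID[0], axis=0)
    grid = np.array([[block.mean() for block in np.array_split(r, PIN_GRID[1], axis=1)]
                     for r in rows])
    return np.clip(grid, 0.0, 1.0) * MAX_TRAVEL_MM

# Example: a hemispherical bump rendered as per-pin target heights.
y, x = np.mgrid[-1:1:64j, -1:1:96j]
bump = np.sqrt(np.clip(1.0 - x**2 - y**2, 0.0, None))
targets_mm = heightmap_to_pin_targets(bump)
```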
Ultrasound has also been well studied as a method of rendering volumetric shapes in mid-air. In one of the most notable examples, Benjamin Long and his team at the University of Bristol successfully presented a method for creating haptic shapes in mid-air [8]. This method has been further developed into the UltraHaptics system, which other researchers in the field have used to explore applications where volumetric haptics could be helpful [9][10][11][12][13][14]. Its advantage is that it allows feelable shapes to be rendered within a wide space in mid-air. Once again, however, there is a limitation to the display technologies that can be incorporated. Due to the arrangement and placement of the ultrasonic actuators, there is no reflective surface or visual plane onto which an image can be projected, nor can a display be placed over the actuators, as it would impede the ultrasonic output. While mid-air haptics based on a combined ultrasonic array can create explorable and feelable objects, the low intensity of the haptic points does not provide the adjustable physical response one would expect from the repulsion force of a virtual surface (>0.5 N), and thus does not convey a feeling of volumetric feedback [15][16]. Seki Inoue and his team implemented a similar combined array of ultrasonic actuators within an orthogonal cavity to create feelable haptic objects [17], with similarly documented effects and conclusions.
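The core mechanism behind such arrays is phased focusing: each transducer is driven with a phase delay chosen so that all waves arrive in phase at a chosen focal point. The sketch below illustrates this idea only; the array geometry, pitch, and 40 kHz carrier are assumptions, and it is not the UltraHaptics implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C
FREQ = 40_000.0          # 40 kHz carrier (assumed, typical for such arrays)

def focus_phases(transducer_pos: np.ndarray, focal_point: np.ndarray) -> np.ndarray:
    """Return a per-transducer phase delay (radians) so that waves emitted by
    a planar array arrive in phase at `focal_point`.

    transducer_pos: (N, 3) element positions in metres (z = 0 plane).
    focal_point:    (3,) target point in metres above the array.
    """
    distances = np.linalg.norm(transducer_pos - focal_point, axis=1)
    # Elements closer to the focus are delayed by their extra travel-time margin
    # relative to the farthest element, expressed as a phase at the carrier.
    extra_path = distances.max() - distances
    return (2.0 * np.pi * FREQ * extra_path / SPEED_OF_SOUND) % (2.0 * np.pi)

# Example: a 16x16 grid with 10 mm pitch focusing 20 cm above its centre.
grid = np.stack(np.meshgrid(np.arange(16), np.arange(16)), axis=-1).reshape(-1, 2) * 0.01
elements = np.column_stack([grid - grid.mean(axis=0), np.zeros(len(grid))])
phases = focus_phases(elements, np.array([0.0, 0.0, 0.2]))
```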
The 3D Tractus and TouchMover [18][19][20] are earlier examples of devices for the interaction with and exploration of 3D data that use a wide single-axis range of movement to successfully emulate the feeling of volume on a laptop and a large touchscreen display. The cited works show that by actuating a single depth axis in combination with stereoscopic 3D visual cues, the volume of virtual shapes can be successfully conveyed to users without the need for additional handheld peripherals. The limitation to a single axis raises the question of how much more immersive similar devices could be if they actuated with six degrees of freedom.
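As a rough illustration of the single-axis principle (an assumed simplification, not the published control loop of either device), the display can be driven to the depth of the virtual surface under the current touch point and render a spring-like resistance when the user pushes past it:

```python
import numpy as np

STIFFNESS_N_PER_MM = 0.8   # virtual surface stiffness (assumed)

def render_depth_axis(depth_map_mm: np.ndarray, touch_xy: tuple,
                      screen_depth_mm: float) -> tuple:
    """Return (target_depth_mm, resistive_force_N) for a single-axis actuator.

    depth_map_mm:    per-pixel depth of the virtual scene behind the rest plane
                     of the screen (larger values = further from the user).
    touch_xy:        (column, row) pixel coordinates of the current touch.
    screen_depth_mm: current actuated position of the display along its axis.
    """
    surface_depth = float(depth_map_mm[touch_xy[1], touch_xy[0]])
    penetration = max(0.0, screen_depth_mm - surface_depth)  # pushed past the surface
    force = STIFFNESS_N_PER_MM * penetration                 # spring-like resistance
    return surface_depth, force
```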
For volumetric haptic surfaces on a flat display, the work of Dongbum Pyo's team on dielectric elastomers [21] is of particular interest. Dielectric elastomers enable an array of actuators within a transparent thin film that can be placed in the top layer of a display stack. While some loss of brightness does occur, the approach has been shown to convey depth and texture successfully. The thin, roughly 500 µm implementation means that this type of display suits modern display technologies. Its main limitation is the small displacement, with an amplitude of around 12 µm; while this is sufficient to suggest changes in shape or volume, the volume characteristics that can be rendered are somewhat limited [15].
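As general background on the operating principle (a standard electromechanical model, not a formula taken from the cited paper), a dielectric elastomer film of thickness $t$, relative permittivity $\varepsilon_r$, and elastic modulus $Y$ driven at voltage $V$ experiences an effective electrostatic pressure and a resulting thickness strain of roughly

\[
p = \varepsilon_0 \varepsilon_r \left(\frac{V}{t}\right)^{2}, \qquad s_t \approx \frac{p}{Y},
\]

so the absolute out-of-plane displacement scales as $s_t \, t$. With a film only a few hundred micrometres thick, strains of a few percent translate into displacements of roughly ten to a few tens of micrometres, consistent with the reported ~12 µm amplitude.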
Other techniques that aim to emulate volumetric feedback come in the form of wearable haptic displays. For example, the work of Adel et al. introduces a finger splint with an attached magnet, where a feelable and explorable volumetric space is created above an array of powerful electromagnets [22], an arrangement reminiscent of the array-of-pins methods. Because the feedback is delivered through wearable technology, the force applied to the fingertip can be precisely adjusted using a controllable magnetic field. As there is no need for direct physical contact between the fingertip and an actuated surface, such an implementation opens up new possibilities: volumetric haptic feedback could be provided on regular displays if the magnetic field source were placed behind them.
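As a note on the underlying physics (standard magnetostatics, not a formula from the cited work), the force on the finger-mounted permanent magnet, modeled as a point dipole with moment $\mathbf{m}$ placed in the field $\mathbf{B}(\mathbf{r})$ generated by the coil array, is

\[
\mathbf{F} = \nabla\!\left(\mathbf{m} \cdot \mathbf{B}(\mathbf{r})\right),
\]

so modulating the coil currents, and hence the field gradient at the fingertip, lets the array control both the magnitude and the direction of the rendered force without any mechanical contact.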
Yet, looking at the shift from a base device to wearable gloves, several emerging technologies from the private sector use similar finger restraint systems [23][24][25][26][27][28], in which a glove provides a resistive force that prevents movement of the user's fingers. This can be useful when the goal is to simulate holding an object. These techniques are, however, limited in their ability to simulate pushing an object or activities such as typing, which does not require the user to hold the keys but rather to press them down. For the moment, they are also demonstrably limited to VR and AR environments. There have been attempts to incorporate haptic feedback via holographic displays [29], but these displays require additional equipment that may not integrate with today's devices. An interesting hand-worn device by Trinitatova et al. [30] allows pressure to be applied to the palm, improving the sensation of weight expected when interacting with an object. These principles of repulsive and attractive forces found in the mentioned products and research are essential to consider when attempting to simulate mechanical feedback.
The use of air as an interaction medium continues to be of interest because it can be implemented without requiring the user to wear cumbersome equipment such as haptic gloves or fingertip caps [31][32]. For this reason, implementations that make better use of this medium continue to appear. Christou et al. implemented a spatial haptics system dubbed aerohaptics, which uses an air blower mounted on a servo configuration that allows the airflow to be directed along an XYZ offset [33]. It shows the field's continued interest in creating spatial volumetric haptics that can be felt without the user needing to wear additional hardware.
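A minimal sketch of the aiming geometry such a setup implies (assumed here, not the published controller): converting a target point, given as an XYZ offset from the nozzle pivot, into pan and tilt servo angles.

```python
import math

def aim_nozzle(target_xyz: tuple) -> tuple:
    """Return (pan_deg, tilt_deg) that point a pan/tilt-mounted nozzle at
    target_xyz, given in metres in the nozzle frame (x forward, y left, z up)."""
    x, y, z = target_xyz
    pan = math.degrees(math.atan2(y, x))                  # rotation about the vertical axis
    tilt = math.degrees(math.atan2(z, math.hypot(x, y)))  # elevation above the horizontal plane
    return pan, tilt

# Example: aim at a point 30 cm ahead, 10 cm to the left, 20 cm above the nozzle.
pan_deg, tilt_deg = aim_nozzle((0.30, 0.10, 0.20))
```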
In general, there are many novel ways to produce volumetric feedback, including using unmanned drones to simulate objects in virtual space or repurposing robotic arms for similar purposes [34]. It should be noted that kinesthetic devices available on the market (e.g., Novint's Falcon Haptic Device) can accurately generate vector torque transferred to the user's fingertips through a manipulandum or an attached display [35][36]. While many systems have been proposed, there is as yet no low-cost implementation that can complement standard consumer devices such as phones, tablets, and other touch-sensitive surfaces equipped with onscreen keyboards or naked-eye stereoscopic display technologies.

References

  1. Nakagaki, K.; Fitzgerald, D.; Ma, Z.; Vink, L.; Levine, D.; Ishii, H. inFORCE: Bi-directional 'force' shape display for haptic interaction. In Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction, Tempe, AZ, USA, 17–20 March 2019; pp. 615–623.
  2. Saga, S.; Deguchi, K. Lateral-force-based 2.5-dimensional tactile display for touch screen. In Proceedings of the 2012 IEEE Haptics Symposium (HAPTICS), Vancouver, BC, Canada, 4–7 March 2012; pp. 15–22.
  3. Follmer, S.; Leithinger, D.; Olwal, A.; Hogge, A.; Ishii, H. inFORM: Dynamic Physical Affordances and Constraints through Shape and Object Actuation (UIST’13). Uist 2013, 13, 2501–2988.
  4. Evreinova, T.V.; Evreinov, G.; Raisamo, R. From kinesthetic sense to new interaction concepts: Feasibility and constraints. Int. J. Adv. Comput. Technol. 2014, 3, 1–33.
  5. Perry, D.A.; Wright, H.E. Touch Enhancing Pad. US Patent 4,657,021, 14 April 1987.
  6. Martinez, J.S.; Holt, L.L.; Reed, C.M.; Tan, H.Z. Incidental Categorization of Vibrotactile Stimuli. IEEE Trans. Haptics 2020, 13, 73–79.
  7. Culbertson, H.; Schorr, S.B.; Okamura, A.M. Haptics: The present and future of artificial touch sensation. Annu. Rev. Control. Robot. Auton. Syst. 2018, 1, 385–409.
  8. Long, B.; Seah, S.A.; Carter, T.; Subramanian, S. Rendering volumetric haptic shapes in mid-air using ultrasound. ACM Trans. Graph. (TOG) 2014, 33, 1–10.
  9. Kervegant, C.; Raymond, F.; Graeff, D.; Castet, J. Touch hologram in mid-air. In ACM SIGGRAPH 2017 Emerging Technologies; ACM: New York, NY, USA, 2017; pp. 1–2.
  10. Carter, T.; Seah, S.A.; Long, B.; Drinkwater, B.; Subramanian, S. UltraHaptics: Multi-point mid-air haptic feedback for touch surfaces. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, St. Andrews, UK, 8–11 October 2013; pp. 505–514.
  11. Rutten, I.; Frier, W.; Van den Bogaert, L.; Geerts, D. Invisible touch: How identifiable are mid-air haptic shapes? In Proceedings of the Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–6.
  12. Abdouni, A.; Clark, R.; Georgiou, O. Seeing is believing but feeling is the truth: Visualising mid-air haptics in oil baths and lightboxes. In Proceedings of the 2019 International Conference on Multimodal Interaction, Suzhou, China, 14–18 October 2019; pp. 504–505.
  13. Subramanian, S.; Seah, S.A.; Shinoda, H.; Hoggan, E.; Corenthy, L. Mid-air haptics and displays: Systems for un-instrumented mid-air interactions. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 3446–3452.
  14. Rakkolainen, I.; Freeman, E.; Sand, A.; Raisamo, R.; Brewster, S. A survey of mid-air ultrasound haptics and its applications. IEEE Trans. Haptics 2020, 14, 2–19.
  15. van Kuilenburg, J.; Masen, M.A.; van der Heide, E. A review of fingerpad contact mechanics and friction and how this affects tactile perception. Proc. Inst. Mech. Eng. Part J J. Eng. Tribol. 2015, 229, 243–258.
  16. Smith, A.M.; Gosselin, G.; Houde, B. Deployment of fingertip forces in tactile exploration. Exp. Brain Res. 2002, 147, 209–218.
  17. Inoue, S.; Makino, Y.; Shinoda, H. Active touch perception produced by airborne ultrasonic haptic hologram. In Proceedings of the 2015 IEEE World Haptics Conference (WHC), Evanston, IL, USA, 22–26 June 2015; pp. 362–367.
  18. Lapides, P.; Sharlin, E.; Sousa, M.C.; Streit, L. The 3D tractus: A three-dimensional drawing board. In Proceedings of the First IEEE International Workshop on Horizontal Interactive Human-Computer Systems (TABLETOP’06), Adelaide, Australia, 5–7 January 2006; p. 8.
  19. Sinclair, M.; Pahud, M.; Benko, H. TouchMover: Actuated 3D touchscreen with haptic feedback. In Proceedings of the 2013 ACM International Conference on Interactive Tabletops and Surfaces, St Andrews, UK, 6–9 October 2013; pp. 287–296.
  20. Sinclair, M.; Pahud, M.; Benko, H. TouchMover 2.0: 3D touchscreen with force feedback and haptic texture. In Proceedings of the 2014 IEEE Haptics Symposium (HAPTICS), Houston, TX, USA, 23–26 February 2014; pp. 1–6.
  21. Pyo, D.; Ryu, S.; Kim, S.C.; Kwon, D.S. A new surface display for 3D haptic rendering. In Proceedings of the Haptics: Neuroscience, Devices, Modeling, and Applications: 9th International Conference, EuroHaptics 2014, Versailles, France, 24–26 June 2014; Proceedings, Part I 9. Springer: Berlin/Heidelberg, Germany, 2014; pp. 487–495.
  22. Adel, A.; Micheal, M.M.; Abou Self, M.; Abdennadher, S.; Khalil, I.S. Rendering of virtual volumetric shapes using an electromagnetic-based haptic interface. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1–9.
  23. Needleman, S.E. Virtual Reality, Now with the Sense of Touch. Wall Str. J.. Available online: https://www.wsj.com/articles/virtual-reality-now-with-the-sense-of-touch-1522764377 (accessed on 1 October 2023).
  24. Matteson, L.; Jones, M.; Hinojosa, S. HaptX Team Final Project Report. Available online: https://digitalcommons.trinity.edu/engine_designreports/42/ (accessed on 1 October 2023).
  25. Zuckerberg, M. Zuckerberg Facebook Post and Video about Meta Reality Labs Haptic Gloves. Available online: https://www.facebook.com/zuck/posts/10114081291949851 (accessed on 1 October 2023).
  26. de Vries, J. Redesigning a Haptic Glove for New Features and Improved Assembly. Ph.D. Thesis, Delft University of Technology, Delft, The Netherlands, 2023.
  27. Caeiro-Rodríguez, M.; Otero-González, I.; Mikic-Fonte, F.A.; Llamas-Nistal, M. A systematic review of commercial smart gloves: Current status and applications. Sensors 2021, 21, 2667.
  28. Wokke, S. Calibrating the Manus VR Glove: Improving Calibration for the Manus VR Flex Sensor Glove Using Ground Truths. Master’s Thesis, Eindhoven University of Technology, Eindhoven, The Netherlands, 24 April 2017.
  29. Yoshida, T.; Shimizu, K.; Kurogi, T.; Kamuro, S.; Minamizawa, K.; Nii, H.; Tachi, S. RePro3D: Full-parallax 3D display with haptic feedback using retro-reflective projection technology. In Proceedings of the 2011 IEEE International Symposium on VR Innovation, Singapore, 19–23 March 2011; pp. 49–54.
  30. Trinitatova, D.; Tsetserukou, D. Deltatouch: A 3D haptic display for delivering multimodal tactile stimuli at the palm. In Proceedings of the 2019 IEEE World Haptics Conference (WHC), Tokyo, Japan, 9–12 July 2019; pp. 73–78.
  31. Abad, A.C.; Reid, D.; Ranasinghe, A. A Novel Untethered Hand Wearable with Fine-Grained Cutaneous Haptic Feedback. Sensors 2022, 22, 1924.
  32. Tsai, H.R.; Tsai, C.; Liao, Y.S.; Chiang, Y.T.; Zhang, Z.Y. FingerX: Rendering Haptic Shapes of Virtual Objects Augmented by Real Objects Using Extendable and Withdrawable Supports on Fingers. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 30 April–5 May 2022; pp. 1–14.
  33. Christou, A.; Chirila, R.; Dahiya, R. Pseudo-Hologram with Aerohaptic Feedback for Interactive Volumetric Displays. Adv. Intell. Syst. 2022, 4, 2100090.
  34. Mercado, V.R.; Marchal, M.; Lécuyer, A. “Haptics On-Demand”: A Survey on Encountered-Type Haptic Displays. IEEE Trans. Haptics 2021, 14, 449–464.
  35. Costes, A.; Danieau, F.; Argelaguet-Sanz, F.; Lécuyer, A.; Guillotel, P. Kinestouch: 3D force-feedback rendering for tactile surfaces. In Proceedings of the Virtual Reality and Augmented Reality: 15th EuroVR International Conference, EuroVR 2018, London, UK, 22–23 October 2018; Proceedings 15. Springer: Berlin/Heidelberg, Germany, 2018; pp. 97–116.
  36. Kim, S.C.; Han, B.K.; Kwon, D.S. Haptic rendering of 3D geometry on 2D touch surface based on mechanical rotation. IEEE Trans. Haptics 2017, 11, 140–145.