Cite
Zhang, X.; Pan, Z.; Song, Z.; Zhang, Y.; Li, W.; Ding, S. Indoor Electronic Travel Aid for Visually Impaired Individuals. Encyclopedia. Available online: https://encyclopedia.pub/entry/54011 (accessed on 15 November 2024).
Indoor Electronic Travel Aid for Visually Impaired Individuals

Most navigation aids for visually impaired individuals require users to pay close attention and actively interpret guidance instructions or feedback, which imposes a considerable cognitive load in long-term use. The “Aerial Guide Dog” is a helium-balloon aerostat drone designed for indoor guidance that provides directional cues through gentle tugs in real time, ensuring a seamless and intuitive guiding experience. The Aerial Guide Dog has been evaluated in terms of directional guidance and path following in a pilot study focused on assessing its orientation accuracy and overall navigation performance.

Keywords: indoor electronic travel aid; visual impairment; wearable assistive devices; cognitive load

1. Introduction

According to August 2023 statistics from the World Health Organization (WHO), at least 1 billion individuals suffer from vision impairment or blindness. Vision disorders and blindness affect people of all ages, potentially limiting educational development, reducing labor participation, social interaction, and independence, and often leading to a high prevalence of depression. These issues significantly impact the quality of life of individuals with visual impairments [1][2][3][4][5].
Vision is one of humans’ most important senses, essential for daily living and independent mobility. When travelling in unknown environments, it helps individuals recognize environmental features in order to find the correct path and avoid potential hazards along the way [6][7]. For individuals with visual impairments, however, navigating unfamiliar environments and complex buildings is particularly challenging. They often cannot identify the key features needed to guide their movements, such as negotiating stairs, steps, and doors, or avoiding obstacles such as walls, people, and furniture, in order to reach their desired destination, resulting in feelings of insecurity and anxiety. In fact, up to 70% of individuals with visual impairments tend to avoid moving around independently in indoor spaces, perceiving shopping malls as one of the most challenging environments. When they must go shopping, they have to rely on help from sighted people, which not only undermines their confidence and independence but also significantly limits their ability to gain experience in carrying out everyday indoor activities [7][8][9][10][11][12].
Because visually impaired individuals have difficulty recognizing their surroundings, white canes and guide dogs have become the preferred solutions owing to their simplicity and intuitive nature. However, both have limitations: they primarily help in identifying objects near the user and are thus mainly suitable for individuals with reasonable confidence and the ability to move around effectively [5][7][13]. The cane relies on tactile feedback transmitted mainly through its tip when swung at ground level, making it difficult to detect obstacles above the swinging range and placing users in potentially hazardous situations [6][10][13]. While guide dogs offer an intuitive and effective solution, they are limited by an insufficient supply of trained dogs, whose working lifespan is relatively short (about 6 to 8 years), and by high training costs (≈USD 42,000) [5].
Systems designed to enhance the walking autonomy of blind individuals through technological solutions are generally referred to as Electronic Travel Aids (ETAs). Designing ETAs is particularly challenging [10], because very demanding requirements must be met, such as real-time guidance, portability, power limitations, suitable interfaces, continuous availability, independence from infrastructure, low cost, and minimal training. At the same time, the system should be easy to use, clear, and user-friendly [5][6][12][13][14]. However, studies on assistive technology for the blind have primarily focused on object recognition, navigation, and mobility [7], exploring the diverse needs of visually impaired individuals in different activity scenarios and aiming to solve context-specific challenges with various technological solutions. To date, no single assistive device has been developed that can be used as extensively and for as long in the lives of visually impaired individuals as traditional white canes and guide dogs. Therefore, the focus should be on developing cost-effective, user-friendly, long-term solutions usable in real-world situations, rather than solely on advancing technology [11].
In the past decade of research on indoor ETAs, substituting visual perception via alternative senses has been a mainstream approach [15], stemming from the neuroplasticity theory underlying sensory substitution. This refers to the brain’s capability to assimilate specific sensory information in alternative ways [12] and requires individuals to consciously compensate for their sensory disability with their other functioning senses; e.g., visual input can be replaced with auditory and/or tactile cues [6][11].
Based on such sensory substitution approaches, indoor ETAs have been designed around methods for generating stimuli that substitute for vision, and users must learn to interpret the (auditory/tactile) signals to successfully complete activities such as travelling through complex environments. The complexity of understanding this environmental information has been recognized as placing a large cognitive load on the user [15]. Cognitive Load Theory suggests that working memory can hold only a small amount of information at any one time and that instructional methods should avoid overloading it in order to maximize learning [16]. Accordingly, researchers such as Giudice et al. have suggested that developers should focus on assisting users in performing specific, necessary tasks while minimizing the amount of information passed to the user. Indoor ETAs involve “perceptual and cognitive factors related to processing non-visual information”; however, the bandwidth of non-visual senses such as hearing, touch, and smell is much smaller than that of vision [17]. Hence, balancing the minimum against the necessary information becomes crucial [12][18].
The main sensory approaches for replacing visual information are via auditory and tactile methods. Audio methods can be divided into audio description and spatial audio; audio description can provide general guidance but often lacks the detail needed for precise movements [19][20][21][22][23], whereas spatial audio, which links sound source locations to intended directions, is more intuitive for the user. However, spatial audio can interfere with environmental sounds, which can cause hazardous situations to arise [24][25][26][27][28][29][30]. Tactile methods involve vibrotactile and kinesthetic approaches. Vibrotactile feedback uses vibrations to convey environmental information and can be felt on different body parts, but its effectiveness varies due to factors like body part sensitivity and clothing thickness [30][31][32][33][34][35][36][37][38]. Kinesthetic devices use traction force for providing directional cues. For instance, Antolini et al. proposed a method of providing kinesthetic stimulation to users by tilting a flywheel inside the device, allowing users to determine left or right directions based on the sensation of motion simulated by the flywheel, thereby guiding user navigation [39]. Another method includes devices that change shape to provide directional clues [38]. For example, Spiers et al. proposed a cube-like device with a top section that rotates and extends, providing tactile feedback on various finger areas to indicate direction [38][40][41].
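The vibrotactile approach described above, selecting a body-worn motor by obstacle direction and scaling vibration with proximity, can be sketched as follows. This is a hypothetical illustration only; the motor layout, sensing range, and linear intensity ramp are assumptions for exposition, not parameters of any cited device.

```python
# Hypothetical sketch: map an obstacle's bearing and distance to a
# vibrotactile cue (which motor to drive, and how strongly).
# Motor layout, range, and scaling are illustrative assumptions,
# not taken from any specific device cited in the text.

def vibrotactile_cue(bearing_deg, distance_m, n_motors=8, max_range_m=3.0):
    """Return (motor_index, intensity in [0, 1]) for one obstacle.

    bearing_deg: obstacle direction relative to the user, 0 = straight
    ahead, positive clockwise; distance_m: obstacle distance.
    """
    if distance_m >= max_range_m:
        return None  # out of range: no vibration
    # Motors are assumed evenly spaced around the waist.
    sector = 360.0 / n_motors
    motor = int(((bearing_deg % 360.0) + sector / 2) // sector) % n_motors
    # Closer obstacles produce stronger vibration (linear ramp).
    intensity = 1.0 - distance_m / max_range_m
    return motor, round(intensity, 2)

print(vibrotactile_cue(0.0, 1.5))   # obstacle dead ahead at 1.5 m -> (0, 0.5)
print(vibrotactile_cue(95.0, 0.5))  # obstacle to the right, very close
```

The body-part sensitivity and clothing-thickness issues noted above would, in a real device, require per-motor calibration of the intensity ramp.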

2. A Low-Cognitive-Load Indoor Electronic Travel Aid for Visually Impaired Individuals

The preferences, suggestions, and actual needs of visually impaired individuals regarding ETAs in both indoor and outdoor environments are crucial references for researchers developing suitable commercial ETA solutions. In this regard, Plikynas et al. [6] conducted comprehensive interviews with 25 blind experts, revealing that 16 of them avoided using any ETAs for indoor navigation due to the absence of suitable and convenient commercial solutions. Therefore, compared to existing outdoor solutions, the market still demands further development and enhancement of suitable tactile and auditory devices for indoor orientation and navigation [8].
The pros and cons of navigation-system feedback methods are a matter of qualitative assessment, varying according to the specific needs and capabilities of users in different environments. Plikynas et al. indicate that, taking voice commands as an example, visually impaired individuals tend to prefer this type of audio feedback for outdoor navigation rather than for indoor environments [6]. It is therefore crucial to provide users with feedback methods tailored to specific situations and needs. Although tactile feedback may become harder to fully interpret after prolonged use, visually impaired individuals still prefer receiving commands or information through this channel in indoor environments [6]. Hence, as tactile feedback is the more accepted method for visually impaired individuals indoors, it should be prioritized as a vital sensory alternative in the development of indoor ETAs.
Enhancing vibration-based signals through advanced signal processing to convey directional information to visually impaired users is a common and effective tactile feedback method. Katzschmann et al. proposed a method called ALVU (Array of Lidars and Vibrotactile Units), which includes a sensor belt worn around the waist and a separate tactile belt worn around the upper abdomen [42]. The sensor belt emits infrared light pulses to measure the distance between the person and nearby obstacles, effectively detecting obstacles around the individual. The tactile belt, in turn, uses vibrating motors to provide feedback: the motors adjust their vibration frequency and intensity based on the distance to detected obstacles, as measured by the sensor belt, thus conveying the distance information of these obstacles to the user [42]. This system has been identified as an effective method of feedback. Khusro et al. developed a real-time feedback system for indoor navigation that uses the vibration motors within smartphones to deliver rich tactile information based on vibration characteristics such as frequency, rhythm, and duration. By systematically arranging patterns of different lengths in the manner of Morse code, the system mimics natural rhythms familiar to users, such as a ‘heartbeat’ or ‘knocking’, thereby greatly improving the learnability and understandability of the information received [43]. See et al. used the Robot Operating System to integrate depth-camera sensors and obstacle-localization algorithms, employing tactile feedback to detect obstacles surrounding the user. This wearable device, equipped with vibration motors at various locations on the user’s body, conveys the location of obstacles by activating the corresponding directional motor and indicates their distance through the intensity of the vibrations. Users can stop and make necessary adjustments based on the specific vibration cues to navigate around all types of obstacles [44].
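Rhythm-coded vibration feedback of the kind Khusro et al. describe can be sketched as a table of on/off durations per message. The pattern names and timings below are illustrative assumptions, not the published encodings; the flattening helper targets the millisecond-array format used by Android's `Vibrator.vibrate(long[])` API.

```python
# Hypothetical sketch of rhythm-coded vibration patterns in the spirit of
# Khusro et al.'s smartphone feedback: each message is a sequence of
# (vibrate_ms, pause_ms) pairs.  Names and timings are illustrative
# assumptions, not the published encodings.

PATTERNS = {
    # short double pulse, like a heartbeat: "destination ahead"
    "heartbeat": [(120, 80), (120, 600)],
    # two firm knocks: "obstacle detected"
    "knocking": [(250, 150), (250, 800)],
    # one long buzz: "stop"
    "stop": [(800, 400)],
}

def pattern_duration_ms(name):
    """Total time one repetition of the pattern takes."""
    return sum(on + off for on, off in PATTERNS[name])

def to_android_vibrate_array(name):
    """Flatten to the [delay, on, off, on, ...] millisecond format used by
    Android's Vibrator API (leading 0 = start vibrating immediately)."""
    timings = [0]
    for on, off in PATTERNS[name]:
        timings += [on, off]
    return timings

print(pattern_duration_ms("heartbeat"))      # 920
print(to_android_vibrate_array("knocking"))  # [0, 250, 150, 250, 800]
```

Keeping the pattern set small and rhythmically distinct is what makes such codes learnable; each added pattern increases the memorization burden discussed in the next paragraph.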
In most approaches, the tactile signals of these assistive devices rely on coding, requiring users to learn the “coded information” corresponding to different vibration signals, which can demand significant effort to learn and memorize. Additionally, while the information provided via vibration-based mechanisms is generally effective, prolonged use can lead to fatigue and numbness, leaving individuals unable to comprehend all the information needed for effective use [6]. Another approach, using force feedback for guidance, has been found to be more intuitive and less cognitively demanding. Barontini et al. [10] proposed an ETA system in which users receive directional haptic feedback through forces applied by motors worn in an armband. The motors spin in opposite directions to tighten or loosen the armband, advising the user to walk or stop, and spin in the same direction to slide the armband up or down the arm, advising the user to turn left or right. This simple method has been shown to convey clear directional information through pressure and skin stretch on specific body parts, akin to a volunteer holding a visually impaired person’s arm for guidance. Navigating by replicating such familiar experiences of visually impaired persons is clearly a valid approach to realizing effective user-centered designs that work well in real-world situations. However, the thickness of clothing must be taken into account, as it can affect the user’s perception of the signals. Therefore, rather than reproducing the method of a volunteer guiding an individual by the arm, the Aerial Guide Dog delivers tactile feedback to the more sensitive finger-pulp area. By emulating the way guide dogs lead the visually impaired, it enhances the effectiveness of perception and reduces the impact of external factors.
Avila et al. [24] demonstrated that an assistive navigation system with a drone as the guidance module is an efficient and accurate guidance method, as it provides continuous directional feedback [45]. Notably, the drones developed use a soft rope to relay forces to the user in order to enhance the independent navigation abilities of visually impaired users. However, because of the soft-rope connection, users must maintain a strict relative spatial position with the drone to fully perceive the traction force; any change in relative position renders it ineffective. When users follow the drone, a change in walking speed can cause a mismatch between the expected and actual traction forces, leading to ambiguous directional guidance [46]. Compared to guidance systems based on commercial drones, the Aerial Guide Dog uses a quieter helium-balloon aerostat and replaces the traction rope with a flexible carbon rod, ensuring that users can clearly perceive directional signals by merely holding the handle. This makes the approach presented here better aligned with visually impaired users’ requirements, as well as cost-effective [7][11].
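The traction-based guidance idea, pulling the handle toward the next waypoint with a gentle force, can be sketched as a simple planar controller. This is a hypothetical illustration under assumed coordinates and a proportional force law; it is not the Aerial Guide Dog's actual control scheme.

```python
# Hypothetical sketch: compute the horizontal tug a tethered aerial guide
# would apply so the handle pulls toward the next waypoint.  The
# proportional ramp, force cap, and radius are illustrative assumptions,
# not parameters of the system described in the text.
import math

def tug_command(user_xy, waypoint_xy, max_force_n=1.0, slow_radius_m=1.5):
    """Return (heading_deg, force_n): direction of the gentle tug and a
    force that ramps down as the user nears the waypoint."""
    dx = waypoint_xy[0] - user_xy[0]
    dy = waypoint_xy[1] - user_xy[1]
    dist = math.hypot(dx, dy)
    if dist < 1e-6:
        return None  # at the waypoint: no tug
    heading = math.degrees(math.atan2(dy, dx)) % 360.0
    # Proportional ramp inside slow_radius, capped at max_force_n outside.
    force = max_force_n * min(1.0, dist / slow_radius_m)
    return round(heading, 1), round(force, 2)

print(tug_command((0.0, 0.0), (3.0, 0.0)))   # far waypoint: full tug
print(tug_command((0.0, 0.0), (0.0, 0.75)))  # near waypoint: reduced tug
```

Ramping the force down near the waypoint is one plausible way to avoid the speed-mismatch ambiguity noted above for soft-rope tethers, since a rigid rod can transmit a reduced but still perceptible pull.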
Compared to the traditional robotic guide dog developed by Hwang et al. [47], the advantage of the Aerial Guide Dog lies in its aerial guidance approach, which reduces the challenges of interacting with the ground environment [7]. Operating above the ground also gives it a wider field of view, improving its sensing range and allowing it to provide more complete environmental information to the user. Furthermore, the aerial approach avoids the wear and tear that ground-based systems suffer from continuous contact with irregular ground surfaces.
The introduction of the Aerial Guide Dog as an indoor ETA represents a significant technological advancement in helping individuals with visual impairments move around effectively. It also underscores the importance and practical applicability of the Aerial Guide Dog’s tactile sensory-substitution approach, which merits investigation in future research on indoor ETAs.

References

  1. Bourne, R.R.A.; Flaxman, S.R.; Braithwaite, T.; Cicinelli, M.V.; Das, A.; Jonas, J.B.; Keeffe, J.; Kempen, J.H.; Leasher, J.; Limburg, H.; et al. Magnitude, Temporal Trends, and Projections of the Global Prevalence of Blindness and Distance and near Vision Impairment: A Systematic Review and Meta-Analysis. Lancet Glob. Health 2017, 5, e888–e897.
  2. Vu, H.T.V. Impact of Unilateral and Bilateral Vision Loss on Quality of Life. Br. J. Ophthalmol. 2005, 89, 360–363.
  3. Vision Impairment and Blindness. Available online: https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment (accessed on 14 November 2023).
  4. Crewe, J.M.; Morlet, N.; Morgan, W.H.; Spilsbury, K.; Mukhtar, A.; Clark, A.; Ng, J.Q.; Crowley, M.; Semmens, J.B. Quality of Life of the Most Severely Vision-Impaired. Clin. Exp. Ophthalmol. 2011, 39, 336–343.
  5. Slade, P.; Tambe, A.; Kochenderfer, M.J. Multimodal Sensing and Intuitive Steering Assistance Improve Navigation and Mobility for People with Impaired Vision. Sci. Robot. 2021, 6, eabg6594.
  6. Plikynas, D.; Žvironas, A.; Budrionis, A.; Gudauskis, M. Indoor Navigation Systems for Visually Impaired Persons: Mapping the Features of Existing Technologies to User Needs. Sensors 2020, 20, 636.
  7. Masal, K.M.; Bhatlawande, S.; Shingade, S.D. Development of a Visual to Audio and Tactile Substitution System for Mobility and Orientation of Visually Impaired People: A Review. Multimed. Tools Appl. 2023, 82, 1–41.
  8. Nair, V.; Olmschenk, G.; Seiple, W.H.; Zhu, Z. ASSIST: Evaluating the Usability and Performance of an Indoor Navigation Assistant for Blind and Visually Impaired People. Assist. Technol. 2022, 34, 289–299.
  9. Gharghan, S.K.; Al-Kafaji, R.D.; Mahdi, S.Q.; Zubaidi, S.L.; Ridha, H.M. Indoor Localization for the Blind Based on the Fusion of a Metaheuristic Algorithm with a Neural Network Using Energy-Efficient WSN. Arab. J. Sci. Eng. 2023, 48, 6025–6052.
  10. Barontini, F.; Catalano, M.G.; Pallottino, L.; Leporini, B.; Bianchi, M. Integrating Wearable Haptics and Obstacle Avoidance for the Visually Impaired in Indoor Navigation: A User-Centered Approach. IEEE Trans. Haptics 2021, 14, 109–122.
  11. Tapu, R.; Mocanu, B.; Zaharia, T. Wearable Assistive Devices for Visually Impaired: A State of the Art Survey. Pattern Recognit. Lett. 2020, 137, 37–52.
  12. Real, S.; Araujo, A. Navigation Systems for the Blind and Visually Impaired: Past Work, Challenges, and Open Problems. Sensors 2019, 19, 3404.
  13. Kassim, A.M.; Yasuno, T.; Suzuki, H.; Jaafar, H.I.; Aras, M.S.M. Indoor Navigation System Based on Passive RFID Transponder with Digital Compass for Visually Impaired People. Int. J. Adv. Comput. Sci. Appl. 2016, 7, 604–611.
  14. Wise, E.; Li, B.; Gallagher, T.; Dempster, A.G.; Rizos, C.; Ramsey-Stewart, E.; Woo, D. Indoor Navigation for the Blind and Vision Impaired: Where Are We and Where Are We Going? In Proceedings of the 2012 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sydney, Australia, 13–15 November 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 1–7.
  15. Romeo, K.; Pissaloux, E.; Gay, S.L.; Truong, N.-T.; Djoussouf, L. The MAPS: Toward a Novel Mobility Assistance System for Visually Impaired People. Sensors 2022, 22, 3316.
  16. The Importance of Cognitive Load Theory|Society for Education and Training. Available online: https://set.et-foundation.co.uk/resources/the-importance-of-cognitive-load-theory (accessed on 17 December 2023).
  17. Chanana, P.; Paul, R.; Balakrishnan, M.; Rao, P. Assistive Technology Solutions for Aiding Travel of Pedestrians with Visual Impairment. J. Rehabil. Assist. Technol. Eng. 2017, 4, 205566831772599.
  18. Loomis, J.; Klatzky, R.; Giudice, N. Sensory Substitution of Vision: Importance of Perceptual and Cognitive Processing. In Assistive Technology for Blindness and Low Vision; CRC Press: Boca Raton, FL, USA, 2012; pp. 162–191.
  19. Liu, G.; Yu, T.; Yu, C.; Xu, H.; Xu, S.; Yang, C.; Wang, F.; Mi, H.; Shi, Y. Tactile Compass: Enabling Visually Impaired People to Follow a Path with Continuous Directional Feedback. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 1–13.
  20. Ahmetovic, D.; Gleason, C.; Ruan, C.; Kitani, K.; Takagi, H.; Asakawa, C. NavCog: A Navigational Cognitive Assistant for the Blind. In Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, Florence, Italy, 6 September 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 90–99.
  21. Fiannaca, A.; Apostolopoulous, I.; Folmer, E. Headlock: A Wearable Navigation Aid That Helps Blind Cane Users Traverse Large Open Spaces. In Proceedings of the 16th International ACM SIGACCESS Conference on Computers & Accessibility, Rochester, NY, USA, 20–24 October 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 19–26.
  22. Guerreiro, J.; Ahmetovic, D.; Sato, D.; Kitani, K.; Asakawa, C. Airport Accessibility and Navigation Assistance for People with Visual Impairments. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–14.
  23. Sato, D.; Oh, U.; Naito, K.; Takagi, H.; Kitani, K.; Asakawa, C. NavCog3: An Evaluation of a Smartphone-Based Blind Indoor Navigation Assistant with Semantic Features in a Large-Scale Environment. In Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility, Baltimore, MD, USA, 29 October–1 November 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 270–279.
  24. Avila Soto, M.; Funk, M.; Hoppe, M.; Boldt, R.; Wolf, K.; Henze, N. DroneNavigator: Using Leashed and Free-Floating Quadcopters to Navigate Visually Impaired Travelers. In Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility, Baltimore, MD, USA, 29 October–1 November 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 300–304.
  25. Blum, J.R.; Bouchard, M.; Cooperstock, J.R. What’s around Me? Spatialized Audio Augmented Reality for Blind Users with a Smartphone. In Proceedings of the Mobile and Ubiquitous Systems: Computing, Networking, and Services, Beijing, China, 12–14 December 2012; Puiatti, A., Gu, T., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; Volume 104, pp. 49–62.
  26. Katz, B.F.G.; Kammoun, S.; Parseihian, G.; Gutierrez, O.; Brilhault, A.; Auvray, M.; Truillet, P.; Denis, M.; Thorpe, S.; Jouffrais, C. NAVIG: Augmented Reality Guidance System for the Visually Impaired. Virtual Real. 2012, 16, 253–269.
  27. Kay, L. A Sonar Aid to Enhance Spatial Perception of the Blind: Engineering Design and Evaluation. Radio Electron. Eng. 1974, 44, 605–627.
  28. Avila, M.; Funk, M.; Henze, N. DroneNavigator: Using Drones for Navigating Visually Impaired Persons. In Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility, Lisbon, Portugal, 26–28 October 2015; Association for Computing Machinery: Lisbon, Portugal, 2015; pp. 327–328.
  29. Fernandes, H.; Costa, P.; Filipe, V.; Paredes, H.; Barroso, J. A Review of Assistive Spatial Orientation and Navigation Technologies for the Visually Impaired. Univ. Access. Inf. Soc. 2019, 18, 155–168.
  30. Xu, S.; Yang, C.; Ge, W.; Yu, C.; Shi, Y. Virtual Paving: Rendering a Smooth Path for People with Visual Impairment through Vibrotactile and Audio Feedback. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2020, 4, 1–25.
  31. Ryu, D.; Yang, G.-H.; Kang, S. T-Hive: Bilateral Haptic Interface Using Vibrotactile Cues for Presenting Spatial Information. IEEE Trans. Syst. Man Cybern. C 2012, 42, 1318–1325.
  32. Heuten, W.; Henze, N.; Boll, S.; Pielot, M. Tactile Wayfinder: A Non-Visual Support System for Wayfinding. In Proceedings of the 5th Nordic Conference on Human-Computer Interaction: Building Bridges, Lund, Sweden, 20–22 October 2008; Association for Computing Machinery: Lund, Sweden, 2008; pp. 172–181.
  33. Kammoun, S.; Jouffrais, C.; Guerreiro, T.; Nicolau, H.; Jorge, J. Guiding Blind People with Haptic Feedback. Front. Access. Pervasive Comput. 2012, 3, 18–22.
  34. Tsukada, K.; Yasumura, M. ActiveBelt: Belt-Type Wearable Tactile Display for Directional Navigation. In Proceedings of the UbiComp 2004: Ubiquitous Computing, Nottingham, UK, 7–10 September 2004; Davies, N., Mynatt, E.D., Siio, I., Eds.; Springer: Berlin/Heidelberg, Germany, 2004; pp. 384–399.
  35. Erp, J.B.F.V.; Veen, H.A.H.C.V.; Jansen, C.; Dobbins, T. Waypoint Navigation with a Vibrotactile Waist Belt. ACM Trans. Appl. Percept. 2005, 2, 106–117.
  36. Petrausch, V.; Schwarz, T.; Stiefelhagen, R. Prototype Development of a Low-Cost Vibro-Tactile Navigation Aid for the Visually Impaired. In Proceedings of the Computers Helping People with Special Needs, Linz, Austria, 11–13 July 2018; Miesenberger, K., Kouroupetroglou, G., Eds.; Springer International Publishing: Cham, Switzerland, 2018; Volume 10897, pp. 63–69.
  37. Kammoun, S.; Bouhani, W.; Jemni, M. Sole Based Tactile Information Display for Visually Impaired Pedestrian Navigation. In Proceedings of the 8th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Corfu, Greece, 1–3 July 2015; ACM: Corfu, Greece, 2015; pp. 1–4.
  38. Spiers, A.J.; Van Der Linden, J.; Wiseman, S.; Oshodi, M. Testing a Shape-Changing Haptic Navigation Device with Vision-Impaired and Sighted Audiences in an Immersive Theater Setting. IEEE Trans. Human-Mach. Syst. 2018, 48, 614–625.
  39. Antolini, M.; Bordegoni, M.; Cugini, U. A Haptic Direction Indicator Using the Gyro Effect. In Proceedings of the 2011 IEEE World Haptics Conference, Istanbul, Turkey, 21–24 June 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 251–256.
  40. Spiers, A.J.; Dollar, A.M. Design and Evaluation of Shape-Changing Haptic Interfaces for Pedestrian Navigation Assistance. IEEE Trans. Haptics 2017, 10, 17–28.
  41. Spiers, A.J.; van Der Linden, J.; Oshodi, M.; Dollar, A.M. Development and Experimental Validation of a Minimalistic Shape-Changing Haptic Navigation Device. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 2688–2695.
  42. Katzschmann, R.K.; Araki, B.; Rus, D. Safe Local Navigation for Visually Impaired Users with a Time-of-Flight and Haptic Feedback Device. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 583–593.
  43. Khusro, S.; Shah, B.; Khan, I.; Rahman, S. Haptic Feedback to Assist Blind People in Indoor Environment Using Vibration Patterns. Sensors 2022, 22, 361.
  44. See, A.R.; Costillas, L.V.M.; Advincula, W.D.C.; Bugtai, N.T. Haptic Feedback to Detect Obstacles in Multiple Regions for Visually Impaired and Blind People. Sens. Mater. 2021, 33, 1799.
  45. Tan, H.; Chen, C.; Luo, X.; Zhang, J.; Seibold, C.; Yang, K.; Stiefelhagen, R. Flying Guide Dog: Walkable Path Discovery for the Visually Impaired Utilizing Drones and Transformer-Based Semantic Segmentation. In Proceedings of the 2021 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 6–10 December 2021; pp. 1123–1128.
  46. Tognon, M.; Alami, R.; Siciliano, B. Physical Human-Robot Interaction with a Tethered Aerial Vehicle: Application to a Force-Based Human Guiding Problem. IEEE Trans. Robot. 2021, 37, 723–734.
  47. Hwang, H.; Xia, T.; Keita, I.; Suzuki, K.; Biswas, J.; Lee, S.I.; Kim, D. System Configuration and Navigation of a Guide Dog Robot: Toward Animal Guide Dog-Level Guiding Work. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 9778–9784.