Modern navigation aids for the visually impaired are an applied technology for people with special needs and a key sociotechnical means of helping users independently navigate and access needed resources both indoors and outdoors.
Visual impairment refers to the congenital or acquired impairment of visual function, resulting in decreased visual acuity or an impaired visual field. According to the World Health Organization, approximately 188.5 million people worldwide have mild visual impairment, 217 million have moderate to severe visual impairment, and 36 million are blind, with the number of blind people estimated to reach 114.6 million by 2050 [1]. In daily life, it is challenging for people with visual impairments (PVI) to travel, especially in unfamiliar places. Although there have been remarkable efforts worldwide toward barrier-free infrastructure and ubiquitous services, in most cases people with visual impairments still have to rely on relatives or personal travel aids to navigate. In the post-pandemic era, independent living and independent travel have become even more important, since people must maintain social distance from one another. Thus, sustained research has concentrated on coupling technology and tools with human-centric design to extend the guidance capabilities of navigation aids.
CiteSpace is a graphical user interface (GUI) bibliometric analysis tool developed by Chen [2]. It has been widely adopted to analyze co-occurrence networks with rich elements, including authors, keywords, institutions, countries, and subject categories, as well as cited authors, cited literature, and citation networks of cited journals [3][4], and it has been applied to characterize research features and trends in information science, regenerative medicine, lifecycle assessment, and other active research fields. Burst detection, betweenness centrality, and heterogeneous networks are the three core concepts of CiteSpace; they help to identify research frontiers, influential keywords, emerging trends, and sudden changes over time [3]. The visual knowledge graph produced by CiteSpace consists of nodes and relational links. In this graph, the size of a node indicates the co-occurrence frequency of an element, the thickness and color of its rings indicate the time slices in which the element co-occurs, and the thickness of a link between two nodes indicates how frequently the two elements co-occur [5]. Additionally, a purple ring represents the centrality of an element: the thicker the purple ring, the higher the centrality. Nodes with high centrality are usually regarded as turning points or pivotal points in the field [6].

In this work, we answer the following questions: What are the most influential publication sources? Who are the most active and influential authors? What are their research interests and primary contributions? What are the featured key studies in the field? What are the most popular topics and research trends, as described by keywords? Moreover, we closely investigate milestone works that use different multisensor fusion methods, which helps to better illustrate machine perception, intelligence, and human-machine interaction in renowned cases and reveals how frontier technologies influence navigation aids for PVI. By conducting narrative studies on representative works with unique multisensor combinations or representative multimodal interaction mechanisms, we aim to inform upcoming researchers by illustrating the state-of-the-art multimodal work conducted by their predecessors. These representative works are listed below.
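To make betweenness centrality concrete, the following minimal Python sketch computes it on a toy keyword co-occurrence network using the networkx library (a stand-in for CiteSpace's internal computation); the keywords, edge weights, and the `distance` attribute are illustrative assumptions, not data from this study.

```python
import networkx as nx

# Toy keyword co-occurrence network; edge weights are hypothetical
# co-occurrence frequencies, not data from this study.
G = nx.Graph()
G.add_weighted_edges_from([
    ("navigation",         "visually impaired",  12),
    ("navigation",         "obstacle avoidance",  7),
    ("obstacle avoidance", "depth camera",        5),
    ("visually impaired",  "wearable device",     6),
    ("wearable device",    "haptic feedback",     4),
    ("depth camera",       "wearable device",     3),
])

# networkx treats edge weights as distances when computing shortest
# paths, so convert co-occurrence counts into distances: stronger
# co-occurrence => shorter distance.
for u, v, data in G.edges(data=True):
    data["distance"] = 1.0 / data["weight"]

# Betweenness centrality: the fraction of all-pairs shortest paths
# that pass through a node. In a CiteSpace-style map, high-centrality
# nodes are the "turning points" bridging otherwise distant topics.
centrality = nx.betweenness_centrality(G, weight="distance")
for keyword, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{keyword:20s} {score:.3f}")
```

In a real analysis, the co-occurrence counts would be extracted from the bibliographic records of the dataset; CiteSpace performs this computation internally and renders the result as the purple rings described above.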
Year | Title of Reference |
---|---|
2012 | NAVIG: augmented reality guidance system for the visually impaired [7] |
2012 | An indoor navigation system for the visually impaired [8] |
2013 | Multichannel ultrasonic range finder for blind people navigation [9] |
2013 | New indoor navigation system for visually impaired people using visible light communication [10] |
2013 | Blind navigation assistance for visually impaired based on local depth hypothesis from a single image [11] |
2013 | A system-prototype representing 3D space via alternative-sensing for visually impaired navigation [12] |
2014 | Navigation assistance for the visually impaired using RGB-D sensor with range expansion [13] |
2015 | Design, implementation and evaluation of an indoor navigation system for visually impaired people [14] |
2015 | An assistive navigation framework for the visually impaired [15] |
2016 | NavCog: turn-by-turn smartphone navigation assistant for people with visual impairments or blindness [16] |
2016 | ISANA: wearable context-aware indoor assistive navigation with obstacle avoidance for the blind [17] |
2018 | PERCEPT navigation for visually impaired in large transportation hubs [18] |
2018 | Safe local navigation for visually impaired users with a time-of-flight and haptic feedback device [19] |
2019 | An astute assistive device for mobility and object recognition for visually impaired people [20] |
2019 | Wearable travel aid for environment perception and navigation of visually impaired people [21] |
2019 | An ARCore based user centric assistive navigation system for visually impaired people [22] |
2020 | Integrating wearable haptics and obstacle avoidance for the visually impaired in indoor navigation: A user-centered approach [23] |
2020 | ASSIST: Evaluating the usability and performance of an indoor navigation assistant for blind and visually impaired people [24] |
2020 | V-eye: A vision-based navigation system for the visually impaired [25] |