Electric-Powered Wheelchairs Driving

Electric-powered wheelchairs (EPWs) enhance the mobility of the elderly and the disabled, while human-machine interaction (HMI) determines how precisely the human intention is delivered and how efficiently human-machine cooperation is conducted. A quantitative bibliometric analysis of 1154 publications related to this research field, published between 1998 and 2020, was conducted to identify the development status, contributors, hot topics, and potential future research directions of the field.

Keywords: EPW driving; HMI methodology; bibliometric analysis; research status; emerging trends

1. Introduction

HMI is an essential technology for the EPW. With the rising need for a better quality of life for all, there has been increasing demand for affordability, ease of use, safety, and humanization in the EPW and its HMI. The vigorous development of autonomous ground vehicles and multimodal perception technology has upgraded the potential capability of the human-machine system and, accordingly, its driving behavior. Moreover, the EPW has gradually transformed from assistive equipment that simply carries people from place to place into a hybrid human-machine collaboration partner with diverse merits in terms of extended functionality, usability, feasibility, and social technology. In most cases, the HMI defines how people cooperate with the EPW; its quality can affect the adoption of the entire system and even determine the overall success or failure of an EPW. Therefore, it is important to conduct HMI research for the EPW, since a dedication to HMI implies not only a positive attitude toward the latest advances in perception and information-conveying technology but also an understanding of users and their needs.

Owing to the advent of powerful low-cost computing equipment, the expanding application and in-depth study of HMI technology, the development of robotics, and the growing recognition of the needs and potential of the disabled, more attention has been focused on EPW HMI methodology, and numerous academic outputs have been published accordingly. These studies apply the latest technological progress or human-centered design frameworks to conduct direct, in-depth investigations of specific EPW HMI scenarios. A systematic analysis that presents an overview of the evolving research over the past few decades and conveys the latest progress of EPW HMI is therefore essential. However, to the best of our knowledge, systematic quantitative review work on EPW HMI is still lacking. Thus, this paper conducts a visualized bibliometric analysis of academic publications from the Web of Science (WOS) to determine the development status, contributors, hot topics, and latest trends.

The contributions of this work are as follows:
(1) To the best of our knowledge, this is the first up-to-date systematic review of the HMIs relevant to the EPW, which are essential to the EPW human-machine loop and carry socio-technical meaning.
(2) This work interprets EPW HMI through an objective, quantitative bibliometric analysis, which differs from conventional qualitative reviews. It presents and analyzes the essential bibliometric aspects, including the journal co-citation map, collaboration and co-authorship networks, author and reference co-citation networks, and keyword co-occurrence.
(3) This work focuses not only on engineering and intelligence but also includes relevant works from socio-technical system design and interaction design. Compared with reviews written from a purely engineering viewpoint, our work is solid with respect to interdisciplinary research and cooperation.

The remainder of this paper is organized as follows. In the next section, we provide a literature review of EPW HMI methodology. Then, we describe the data collection procedure and analysis methods. Next, we present the main results of the bibliometric analysis and a discussion of the findings. Finally, we summarize the key findings and explain the contributions and limitations of this work.

2. Literature Review on EPW HMI Methodology

Owing to advances in speech recognition, pupil detection, computer vision, electrooculography (EOG), electroencephalography (EEG), and electromyography (EMG) technologies, numerous academic reports on EPW HMI research have been published. Cooper et al. compared an isometric joystick with a conventional position-sensing joystick during driving tasks in virtual and real environments. The isometric joystick converts the force vector exerted by the operator's hand into the magnitude and direction of the input, with the possible advantage of reducing the cognitive overhead normally required to monitor joint orientations and torques, as well as the inertia generated by the moving limb. Their study found that performance with the isometric joystick and the conventional position-sensing joystick was similar while performing selected driving tasks in both virtual and real environments, supporting further testing of the isometric joystick as an interface device [1]. Barea et al. introduced an EOG-based eye-control method that can be used to guide and control wheelchairs [2]. By combining EOG and EMG technologies, Tsui et al. proposed a hands-free EPW control system that analyzes EMG signals of eyebrow movement and EOG signals of eye movement and converts them into steering commands (forward, left, right, etc.) for wheelchair driving [3]. Faria et al. used voice commands, facial expressions, head movements, and joysticks as the main inputs to control wheelchairs through a flexible multimodal interactive interface; they evaluated the smart wheelchair (SW) in real and simulated environments to demonstrate its practicability and usability [4]. Grewal et al. introduced a new type of automatic wheelchair with a sip-and-puff (SnP) user interface, which can alleviate user fatigue compared with traditional SnP-controlled wheelchairs [5]. Kim et al. applied the tongue drive system (TDS) to drive a wheelchair, giving people with severe motor impairment access to a computer and a wheelchair; they showed that the TDS outperforms the SnP system in terms of speed and accuracy [6][7]. Iturrate et al. introduced a non-invasive brain-actuated wheelchair using a P300 neurophysiological protocol and automated navigation [8]. Long et al. introduced a hybrid brain-computer interface (BCI) to address the problem that previous BCI systems did not provide the multiple independent control signals needed for continuous wheelchair control; the hybrid BCI provides control commands with higher accuracy [9]. These studies prove the usability of EPW driving for severely disabled people, improving the mobility experience of users and helping the disabled and the elderly achieve independence.
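As an illustration of the joystick-to-command mapping described above, the following minimal Python sketch converts a 2D hand-force vector into linear and angular velocity commands, with force magnitude setting the command strength and force direction setting the heading. The parameter names and limits (max_force, max_speed, max_turn_rate) are illustrative assumptions, not values from the cited studies.

```python
import math

def force_to_command(fx: float, fy: float,
                     max_force: float = 10.0,     # N, sensor saturation (assumed)
                     max_speed: float = 1.2,      # m/s, forward speed limit (assumed)
                     max_turn_rate: float = 1.0): # rad/s, turning limit (assumed)
    """Map a 2D hand-force vector to (linear, angular) velocity commands.

    Force magnitude sets the command strength and force direction sets the
    heading, in the spirit of the isometric joystick described above.
    """
    magnitude = min(math.hypot(fx, fy), max_force) / max_force  # 0..1
    direction = math.atan2(fx, fy)  # 0 rad = straight ahead, positive = rightward
    linear = magnitude * max_speed * math.cos(direction)       # forward component
    angular = magnitude * max_turn_rate * math.sin(direction)  # turning component
    return linear, angular

# Example: a push that is mostly forward with a slight rightward component.
print(force_to_command(fx=2.0, fy=8.0))
```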

There are also several reviews of HMI methods for EPWs. Lebedev reported that brain-machine interfaces (BMIs) have undergone rapid development in recent years and serve a broad range of clinical goals, in addition to enhancing normal brain functions [10]. Phinyomark et al. reviewed the latest EMG-controlled EPW technologies and various EMG-based control methods and summarized the achievements of EMG [11]. Simpson summarized the technical status of SWs and the direction of future research, and Leaman et al. gave a complete overview of research trends in SWs, including input methods [12][13]. These reviews integrated and analyzed numerous individual studies. Most, however, were expert-dependent, and to a certain degree this individual preference has led to a lack of objective, systematic quantitative analysis in the field. Thus, this paper concentrates on filling this gap in EPW HMI methodology research by performing a visualized bibliometric analysis of academic publications in the field.

3. Bibliometric Results and Discussion

These analyses show that the landmark studies in EPW HMI methodology research mostly focus on brain-controlled wheelchairs (BCWs). These studies confirm the critical role and widespread application of BCI in EPW HMI, indicate the main neurophysiological protocols used in BCI-based HMI, and reflect the continuous improvement of BCI-based EPW HMI performance. Two studies introducing the shared control architecture demonstrate that human-machine collaborative control is an adaptive form of cooperation between humans and machines in EPW HMI, because it fuses multiple information sources, decides on appropriate maneuvers for execution, takes advantage of the respective strengths of humans and machines, and reduces the workload of the human operator. In addition, safety, efficiency, accuracy, and user workload were important metrics for EPW HMI evaluation in these studies, showing that both traditional machine performance and human factors are considered in determining the success of EPW HMI design.
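The shared control idea discussed above can be illustrated with a minimal sketch that blends a user velocity command with an autonomous planner's command according to obstacle proximity. This is a simplification for illustration only; the blending rule, the proximity measure, and the function names are assumptions, not the architectures of the cited studies.

```python
import numpy as np

def shared_control(user_cmd: np.ndarray,
                   autonomy_cmd: np.ndarray,
                   obstacle_proximity: float) -> np.ndarray:
    """Blend user and autonomous velocity commands [linear, angular].

    obstacle_proximity is in [0, 1]: 0 means a clear path, 1 an imminent
    collision. The closer the obstacle, the more authority shifts to the
    machine, reducing the user's workload while preserving intent when safe.
    """
    alpha = float(np.clip(obstacle_proximity, 0.0, 1.0))  # machine authority
    return (1.0 - alpha) * user_cmd + alpha * autonomy_cmd

# Example: the user drives straight ahead, the planner suggests slowing down
# and turning left because an obstacle is fairly close.
user = np.array([1.0, 0.0])     # [linear m/s, angular rad/s]
machine = np.array([0.4, 0.6])
print(shared_control(user, machine, obstacle_proximity=0.7))
```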

Top 10 most cited references in EPW HMI methodology research.

Keyword analysis is used to gain more insight into the substance of a field and can identify current research hotspots and future directions [14]. CiteSpace can detect the keywords with the highest frequency and align them according to their time of appearance; these functions can be used to visually depict the knowledge structure of the field and its potential future trends. To reduce noise, we first merged terms with the same meaning, such as variants of brain-computer interface; electroencephalogram (EEG) and electroencephalogram; people and individual; and disabled person and disabled people.
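A minimal sketch of this synonym-merging step is shown below, assuming keyword lists have already been extracted from the WOS records; the synonym-map entries beyond the merges named above are hypothetical examples.

```python
from collections import Counter

# Synonym map: the merges named in the text (EEG/electroencephalogram,
# people/individual, disabled person/disabled people) plus hypothetical
# brain-computer interface variants added for illustration.
SYNONYMS = {
    "brain computer interface": "brain-computer interface",
    "bci": "brain-computer interface",
    "electroencephalogram": "eeg",
    "individual": "people",
    "disabled person": "disabled people",
}

def normalize(keyword: str) -> str:
    key = keyword.strip().lower()
    return SYNONYMS.get(key, key)

def keyword_frequencies(records: list[list[str]]) -> Counter:
    """Count how many records mention each keyword after merging synonyms."""
    counts: Counter = Counter()
    for keywords in records:
        counts.update({normalize(k) for k in keywords})  # one count per record
    return counts

# Toy example: two records that name the same concepts differently.
records = [
    ["BCI", "EEG", "wheelchair"],
    ["Brain computer interface", "electroencephalogram", "shared control"],
]
print(keyword_frequencies(records).most_common())
```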

To trace the evolution of the research focus, a time zone view of the keywords is illustrated; this visualization arranges the keywords according to the time of their first appearance. EOG and EMG were used as input interfaces for EPW HMI before 2007 [15][16]. The extensive appearance of intelligent wheelchairs, BCI, and EEG in 2007, consistent with the first sharp increase in EPW HMI methodology publications noted earlier, indicates that research on brain-controlled wheelchairs has flourished since 2007 owing to advances in BCI technology and EEG application. Motor imagery appeared in 2012, and P300 and SSVEP both appeared in 2014, indicating that research has focused on further exploring BCI-based HMI to realize better-humanized wheelchair control (continuous control and prevention of driving fatigue) [8][9][17][18][19][20][21]. The appearance of signals, recognition, feature extraction, classification, and neural networks (focused on information processing) and of shared control (focused on control strategies) shows that not only machine performance (efficiency and accuracy) but also human factors (comfort and independence) are increasingly considered in EPW HMI research [22][23][24][25][26][27][28][29]. The relatively large nodes of the time zone map in recent years, such as speed, tetraplegia, brain, switch, eye movement, and human-robot interaction, demonstrate that speed control (high speed, low speed), state (moving forward, stopped) or control-mode switching [30][31][32][33], the application of computer technology (neural networks, deep learning) [34], the development of novel EOG-based HMIs [35], and accessibility for people with tetraplegia [36] have become hot topics in recent years.
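The ordering principle of the time zone view, placing each keyword at the year of its first appearance, can be approximated with a short sketch like the following; the record structure and the sample data are assumptions for illustration, not the actual WOS dataset.

```python
def first_appearance(records: list[tuple[int, list[str]]]) -> dict[str, int]:
    """Return the earliest year in which each keyword appears, which is the
    ordering principle behind the time zone view."""
    first_year: dict[str, int] = {}
    for year, keywords in records:
        for kw in keywords:
            key = kw.strip().lower()
            first_year[key] = min(year, first_year.get(key, year))
    return first_year

# Toy records as (publication year, author keywords); not the real dataset.
records = [
    (2002, ["EOG", "wheelchair"]),
    (2007, ["BCI", "EEG", "intelligent wheelchair"]),
    (2014, ["P300", "SSVEP"]),
]
for kw, year in sorted(first_appearance(records).items(), key=lambda item: item[1]):
    print(year, kw)
```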

References

  1. Cooper, R.A.; Spaeth, D.M.; Jones, D.K.; Boninger, M.L.; Fitzgerald, S.G.; Guo, S. Comparison of virtual and real electric powered wheelchair driving using a position sensing joystick and an isometric joystick. Med. Eng. Phys. 2002, 24, 703–708.
  2. Barea, R.; Boquete, L.; Mazo, M.; López, E. Wheelchair Guidance Strategies Using EOG. J. Intell. Robot. Syst. 2002, 34, 279–299.
  3. Tsui, C.S.L.; Jia, P.; Gan, J.Q.; Hu, H.; Yuan, K. EMG-based hands-free wheelchair control with EOG attention shift detection. In Proceedings of the 2007 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 15–18 December 2007; pp. 1266–1271.
  4. Mónica Faria, B.; Vasconcelos, S.; Paulo Reis, L.; Lau, N. Evaluation of distinct input methods of an intelligent wheel-chair in simulated and real environments: A performance and usability study. Assist. Technol. 2013, 25, 88–98.
  5. Grewal, H.S.; Matthews, A.; Tea, R.; Contractor, V.; George, K. Sip-and-Puff Autonomous Wheelchair for Individuals with Severe Disabilities. In Proceedings of the 9th IEEE Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON), New York, NY, USA, 8–10 November 2018; pp. 705–710.
  6. Kim, J.; Park, H.; Bruce, J.; Sutton, E.; Rowles, D.; Pucci, D.; Holbrook, J.; Minocha, J.; Nardone, B.; West, D.; et al. The Tongue Enables Computer and Wheelchair Control for People with Spinal Cord Injury. Sci. Transl. Med. 2013, 5, 213ra166.
  7. Kim, J.; Park, H.; Bruce, J.; Rowles, D.; Holbrook, J.; Nardone, B.; West, D.P.; Laumann, A.E.; Roth, E.; Veledar, E.; et al. Qualitative assessment of tongue drive system by people with high-level spinal cord injury. J. Rehabil. Res. Dev. 2014, 51, 451–465.
  8. Iturrate, I.; Antelis, J.M.; Kubler, A.; Minguez, J. A Noninvasive Brain-Actuated Wheelchair Based on a P300 Neurophysiological Protocol and Automated Navigation. IEEE Trans. Robot. 2009, 25, 614–627.
  9. Long, J.; Li, Y.; Wang, H.; Yu, T.; Pan, J.; Li, F. A Hybrid Brain Computer Interface to Control the Direction and Speed of a Simulated or Real Wheelchair. IEEE Trans. Neural Syst. Rehabil. Eng. 2012, 20, 720–729.
  10. Lebedev, M. Brain-machine interfaces: An overview. Transl. Neurosci. 2014, 5, 99–110.
  11. Phinyomark, A.; Limsakul, C.; Phukpattaranont, P. A Review of Control Methods for Electric Power Wheelchairs Based on Electromyography Signals with Special Emphasis on Pattern Recognition. IETE Tech. Rev. 2011, 28, 316.
  12. Simpson, R.C. Smart wheelchairs: A literature review. J. Rehabil. Res. Dev. 2005, 42, 423.
  13. Leaman, J.; La, H.M. A comprehensive review of smart wheelchairs: Past, present and future. IEEE Trans. Hum. Mach. Syst. 2017, 47, 486–499.
  14. Chen, C.; Hu, Z.; Liu, S.; Tseng, H. Emerging trends in regenerative medicine: A scientometric analysis in CiteSpace. Expert Opin. Biol. Ther. 2012, 12, 593–608.
  15. Barea, R.; Boquete, L.; Mazo, M.; Lopez, E. System for assisted mobility using eye movements based on electrooculography. IEEE Trans. Neural Syst. Rehabil. Eng. 2002, 10, 209–218.
  16. Han, J.S.; Bien, Z.Z.; Kim, D.J.; Lee, H.E.; Kim, J.S. Human-machine interface for wheelchair control with EMG and its evaluation. In Proceedings of the 25th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Cancun, Mexico, 17–21 September 2003; Volume 2, pp. 1602–1605.
  17. Li, Y.; Pan, J.; Wang, F.; Yu, Z. A Hybrid BCI System Combining P300 and SSVEP and Its Application to Wheelchair Control. IEEE Trans. Biomed. Eng. 2013, 60, 3156–3166.
  18. Carlson, T.; Millan, J.D.R. Brain-controlled wheelchairs: A robotic architecture. IEEE Robot. Autom. Mag. 2013, 20, 65–73.
  19. Diez, P.F.; Müllerc, S.M.T.; Mut, V.A.; Laciar, E.; Avila, E.; Bastos-Filho, T.F.; Sarcinelli-Filho, M. Commanding a robotic wheelchair with a high-frequency steady-state visual evoked potential based brain–computer interface. Med. Eng. Phys. 2013, 35, 1155–1164.
  20. Cao, L.; Li, J.; Ji, H.; Jiang, C. A hybrid brain computer interface system based on the neurophysiological protocol and brain-actuated switch for wheelchair control. J. Neurosci. Methods 2014, 229, 33–43.
  21. Zhang, R.; Li, Y.; Yan, Y.; Zhang, H.; Wu, S.; Yu, T.; Gu, Z. Control of a Wheelchair in an Indoor Environment Based on a Brain–Computer Interface and Automated Navigation. IEEE Trans. Neural Syst. Rehabil. Eng. 2015, 24, 128–139.
  22. Pushp, S.; Saikia, A.; Khan, A.; Hazarika, S.M. A cognitively enhanced collaborative control architecture for an intelligent wheelchair: Formalization, implementation and evaluation. Cogn. Syst. Res. 2018, 49, 114–127.
  23. Saleh, A.I.; Shehata, S.A.; Labeeb, L.M. A fuzzy-based classification strategy (FBCS) based on brain–computer interface. Soft Comput. 2019, 23, 2343–2367.
  24. Belwafi, K.; Gannouni, S.; Aboalsamh, H.; Mathkour, H.; Belghith, A. A dynamic and self-adaptive classification algorithm for motor imagery EEG signals. J. Neurosci. Methods 2019, 327, 108346.
  25. Lotte, F.; Congedo, M.; Lécuyer, A.; Lamarche, F.; Arnaldi, B. A review of classification algorithms for EEG-based brain–computer interfaces. J. Neural Eng. 2007, 4, R1–R13.
  26. Nicolas-Alonso, L.F.; Gomez-Gil, J. Brain computer interfaces: A review. Sensors 2012, 12, 1211–1279.
  27. Lopes, A.C.; Pires, G.; Nunes, U. Assisted navigation for a brain-actuated intelligent wheelchair. Robot. Auton. Syst. 2013, 61, 245–258.
  28. Millán, J.D.R.; Galán, F.; Vanhooydonck, D.; Lew, E.; Philips, J.; Nuttin, M. Asynchronous non-invasive brain-actuated control of an intelligent wheelchair. In Proceedings of the 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 3–6 September 2009; pp. 3361–3364.
  29. Bi, L.; Fan, X.-A.; Liu, Y. EEG-Based Brain-Controlled Mobile Robots: A Survey. IEEE Trans. Hum. Mach. Syst. 2013, 43, 161–176.
  30. Yu, Y.; Zhou, Z.; Liu, Y.; Jiang, J.; Yin, E.; Zhang, N.; Wang, Z.; Liu, Y.; Wu, X.; Hu, D. Self-Paced Operation of a Wheelchair Based on a Hybrid Brain-Computer Interface Combining Motor Imagery and P300 Potential. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 2516–2526.
  31. Bi, L.; Lu, Y.; Fan, X.; Lian, J.; Liu, Y. Queuing Network Modeling of Driver EEG Signals-Based Steering Control. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 25, 1117–1124.
  32. Li, Y.; He, S.; Huang, Q.; Gu, Z.; Yu, Z.L. A EOG-based switch and its application for “start/stop” control of a wheelchair. Neurocomputing 2018, 275, 1350–1357.
  33. Yu, Y.; Liu, Y.; Jiang, J.; Yin, E.; Zhou, Z.; Hu, D. An Asynchronous Control Paradigm Based on Sequential Motor Imagery and Its Application in Wheelchair Navigation. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 2367–2375.
  34. Zhou, X.; Wang, F.; Wang, J.; Wang, Y.; Yan, J.; Zhou, G. Deep Learning Based Gesture Recognition and Its Application in Interactive Control of Intelligent Wheelchair. In Petri Nets and Other Models of Concurrency XV; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2019; pp. 547–557.
  35. Choudhari, A.M.; Porwal, P.; Jonnalagedda, V.; Mériaudeau, F. An Electrooculography based Human Machine Interface for wheelchair control. Biocybern. Biomed. Eng. 2019, 39, 673–685.
  36. Sahadat, N.; Sebkhi, N.; Ghovanloo, M. Simultaneous multimodal access to wheelchair and computer for people with tetraplegia. In Proceedings of the 20th ACM International Conference on Multimodal Interaction, Boulder, CO, USA, 16–20 October 2018; pp. 393–399.