Applications of Brain–Computer Interfaces to Control and Automation

A brain–computer interface (BCI) is a real-time communication system that connects the brain to external devices. A BCI system can directly convert the information sent by the brain into commands that drive external devices and can stand in for limbs or phonation organs to achieve communication with the outside world and control over the external environment. In other words, a BCI system can bypass the normal peripheral nerve and muscle pathways to achieve communication between a human and a computer or between a human and the external environment. BCIs have been validated in various noisy, structured environments such as homes, hospitals, and exhibitions, and their direct application is gaining popularity with regular consumers.

  • brain–computer interfacing
  • automation and control
  • electrophysiological recordings

1. Application to Unmanned Vehicles and Robotics

In the study of [228][1], brain activity was used to fly a quadcopter through an obstacle course, with visual feedback provided by a forward-facing camera mounted on the hull of the drone. The subjects were able to quickly pursue a series of foam ring targets by passing through them in a real-world environment, acquiring up to 90.5% of all valid targets along the course with accurate and continuous movement. The performance of the system was quantified using metrics suitable for asynchronous BCIs. The results provide an affordable framework for the development of multidimensional BCI control in telepresence robotics, and the study showed that a BCI can be effectively used to accomplish complex control tasks in three-dimensional space. Such an application can be beneficial both to people with severe disabilities and in industrial environments. The authors of [228][1] addressed problems typical of control applications in which the BCI acts as a controller that moves a simple object in a structured environment. The study follows previous research endeavors: the works in [229,230][2][3] showed the ability of users to control the flight of a virtual helicopter with 2D control, and the work of [231][4] demonstrated 3D control by leveraging a motor imagery paradigm with intelligent control strategies.
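As a rough illustration of how sensorimotor-rhythm activity can be turned into continuous velocity commands for such a telepresence platform, the following Python sketch maps mu-band power over two motor-cortex channels to lateral and vertical commands. This is not the decoder used in [228][1]: the channel pair, band limits, gains, and the crude altitude rule are assumptions made only for illustration.

```python
# Hedged sketch: sensorimotor-rhythm band power -> continuous velocity commands.
# Not the published decoder of [228][1]; parameters below are assumptions.
import numpy as np
from scipy.signal import welch

FS = 256               # assumed sampling rate (Hz)
MU_BAND = (8.0, 12.0)  # assumed mu-rhythm band (Hz)

def band_power(x, fs, band):
    """Average power of signal x within the given frequency band (Welch PSD)."""
    f, pxx = welch(x, fs=fs, nperseg=fs)
    mask = (f >= band[0]) & (f <= band[1])
    return pxx[mask].mean()

def velocity_command(eeg_c3, eeg_c4, gain=0.5):
    """Map left/right sensorimotor activity to lateral and vertical velocities.
    Imagined right-hand movement attenuates mu power over C3 and vice versa."""
    p_c3 = band_power(eeg_c3, FS, MU_BAND)
    p_c4 = band_power(eeg_c4, FS, MU_BAND)
    lateral = gain * (p_c3 - p_c4) / (p_c3 + p_c4)   # signed left/right command
    vertical = gain * (1.0 - (p_c3 + p_c4) / 2.0)    # crude "both hands" altitude cue
    return float(np.clip(lateral, -1, 1)), float(np.clip(vertical, -1, 1))

# one-second window of two motor-cortex channels (placeholder random data)
c3, c4 = np.random.randn(FS), np.random.randn(FS)
print(velocity_command(c3, c4))
```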
Applications to different autonomous robots are under investigation. For example, the study of [232][5] proposed a new humanoid navigation system directly controlled through an asynchronous sensorimotor-rhythm-based BCI system. The approach allows for flexible robotic motion control in unknown environments using camera vision. The proposed navigation system includes a posture-dependent control architecture and is comparable with previous mobile-robot navigation systems that depend on an agent-based model.

2. Application to “Smart Home” and Virtual Reality

Applications of EEG-based brain–computer interfaces are emerging in “Smart Homes”. BCI technology can be used by disabled people to improve their independence and to maximize their residual capabilities at home. In recent years, novel BCI systems have been developed to control home appliances. A prototypical three-wheel, small-sized robot for smart-home applications used to perform experiments is shown in Figure 12.
Figure 12. A prototypical three-wheel, small-sized robot for smart-home applications used to perform experiments (from the Department of Information Engineering at the Università Politecnica delle Marche).
The aim of the study [233][6] was to improve the quality of life of disabled people through BCI control systems during daily-life activities such as opening/closing doors, switching lights on and off, controlling televisions, using mobile phones, sending messages to people in their community, and operating a video camera. To accomplish such goals, the authors of [233][6] proposed a real-time wireless EEG-based BCI system built around a commercial EMOTIV EPOC headset. EEG signals were acquired by the headset and transmitted through a Bluetooth module to a personal computer. The received EEG data were processed by the software provided by EMOTIV, and the results were transmitted to an embedded system that controlled the appliances through a Wi-Fi module. A dedicated graphical user interface (GUI) was developed to detect a keystroke and to convert it into a predefined command.
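The final dispatch stage of such a pipeline, in which a detected command label is mapped to a predefined appliance action and forwarded to the embedded controller over the network, might look like the following minimal Python sketch. The labels, messages, IP address, and port are hypothetical; [233][6] does not publish this code.

```python
# Hedged sketch of command dispatch: detected BCI label -> predefined appliance command.
# Labels, messages, host, and port are assumptions; a controller must be listening.
import socket

COMMANDS = {               # hypothetical mapping of detected labels to actions
    "push": b"DOOR_OPEN",
    "pull": b"DOOR_CLOSE",
    "lift": b"LIGHT_ON",
    "drop": b"LIGHT_OFF",
}

def dispatch(label, host="192.168.1.50", port=5000):
    """Send the appliance command associated with a detected BCI label."""
    message = COMMANDS.get(label)
    if message is None:
        return             # unrecognized or idle state: do nothing
    try:
        with socket.create_connection((host, port), timeout=2.0) as sock:
            sock.sendall(message)
    except OSError as err:
        print("dispatch failed:", err)

dispatch("lift")  # e.g., the user's "lift" command switches the light on
```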
In the studies of [234,235][7][8], the authors proposed the integration of the BCI technique with universal plug and play (UPnP) home networking for smart-house applications. The proposed system can process EEG signals without transmitting them to back-end personal computers. This flexibility, together with the low power consumption and small volume of the wireless physiological-signal acquisition modules and of the embedded signal-processing modules, makes the technology suitable for various kinds of smart applications in daily life.
The study of [236][9] evaluated the performance of an EEG-based BCI system to control smart-home applications with high accuracy and high reliability. In that study, a P300-based BCI system was connected to a virtual reality system that can be easily reconfigured and therefore constitutes a favorable testing environment for real smart homes for disabled people. The authors of [237][10] proposed an implementation of a BCI system for controlling wheelchairs and electric appliances in a smart house to assist the daily-life activities of its users. Tests performed by a subject achieved satisfactory results.
Virtual reality is another area of human–computer interaction in which the signals extracted from the brain are used to interact with a computer. With advances in this kind of interaction, new applications have appeared: video games [227][11] and virtual reality environments developed with noninvasive techniques [238,239][12][13].

3. Application to Mobile Robotics and Interaction with Robotic Arms

The EEG signals of a subject can be recorded and processed appropriately in order to differentiate between several cognitive processes or “mental tasks”. BCI-based control systems use such mental activity to generate control commands for a device such as a robotic arm or a wheelchair [132,240][14][15]. As previously said, BCIs are systems that can bypass conventional channels of communication (i.e., muscles and speech) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. This kind of control can be successfully applied to support people with motor disabilities, to improve their quality of life, to enhance their residual abilities, or to replace lost functionality [78][16]. For example, for individuals affected by neurological disabilities, the operation of an external robotic arm to facilitate handling activities could take advantage of these new communication modalities between humans and physical devices [22][17]. Some functions, such as selecting items on a screen by moving a cursor in a three-dimensional scene, are straightforward to implement with BCI-based control [77,241][18][19]. However, a more sophisticated control strategy is required to accomplish control tasks at more complex levels because most external effectors (mechanical prostheses, mobile robots, and wheelchairs) possess more degrees of freedom. Moreover, a major requirement of brain-controlled mobile robotic systems is higher safety, since these robots are used to transport disabled people [78][16]. In BCI-based control, EEG signals are translated into user intentions.
Synchronous protocols usually adopt P300- and SSVEP-based BCIs, which rely on external stimulation. Asynchronous protocols adopt interfaces based on event-related desynchronization (ERD) and event-related synchronization (ERS), which are independent of external stimuli. In fact, since asynchronous BCIs do not require any external stimulus, they appear more suitable and natural for brain-controlled mobile robots, where users need to focus their attention on driving the robot rather than on external stimuli.
Another aspect is related to the two operational modes that can be adopted in brain-controlled mobile robots [78][16]. The first category is “direct control by the BCI”, in which the BCI translates EEG signals into motion commands that control the robot directly. This method is computationally less complex and does not require additional intelligence. However, the overall performance of these brain-controlled mobile robots mainly depends on the performance of noninvasive BCIs, which are currently slow and uncertain [78][16]. In other words, the performance of the BCI system limits that of the robot. In the second category of brain-controlled robots, a shared control scheme is adopted, where a user (through a BCI) and an intelligent controller (such as an autonomous navigation system) share control over the robot. In this case, the performance of the robot also depends on its own intelligence. Thus, the safety of driving these robots can be better ensured, and even the accuracy with which the user's intention is inferred can be improved. This kind of approach is less demanding for the users, but their reduced effort translates into a higher computational cost. The use of sensors (such as laser sensors) is often required.
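The arbitration between the user's decoded command and the autonomous controller can be pictured with the following minimal Python sketch. The blending rule, thresholds, and variable names are assumptions for illustration, not a published shared-control algorithm.

```python
# Hedged sketch of shared control: blend a slow, uncertain BCI command with an
# autonomous safety layer. Thresholds and the blending rule are assumptions.
def shared_control(bci_turn, bci_confidence, obstacle_distance,
                   min_safe_distance=0.5, confidence_threshold=0.6):
    """Return (linear_velocity, angular_velocity) for a mobile robot.

    bci_turn:          turn rate decoded from EEG, in [-1, 1]
    bci_confidence:    decoder confidence, in [0, 1]
    obstacle_distance: closest obstacle from a range sensor, in meters
    """
    if obstacle_distance < min_safe_distance:
        return 0.0, 0.0        # safety layer overrides the user near obstacles
    if bci_confidence < confidence_threshold:
        return 0.3, 0.0        # low confidence: creep forward, ignore the turn
    return 0.5, bci_turn       # confident command: follow the user's intent

print(shared_control(bci_turn=0.8, bci_confidence=0.9, obstacle_distance=2.0))
```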

4. Application to Robotic Arms, Robotic Tele-Presence, and Electrical Prosthesis

Manipulator control requires more accuracy in reaching spatial targets than the control of wheelchairs and other devices. Control of the movement of a cursor in a three-dimensional scene is the most significant paradigm in BCI-based control studies [123,242][20][21]. EEG changes, normally associated with left-hand, right-hand, or foot movement imagery, can be used to control cursor movement [242][21].
Several research studies [240,243,244,245,246,247][15][22][23][24][25][26] presented applications aimed at the control of a robot or a robotic arm to assist people with severe disabilities in a variety of tasks in their daily life. In most cases, the focus of these papers is on the different methods adopted to classify the action that the robot arm has to perform with respect to the mental activity recorded by the BCI. In the contributions [240,243][15][22], a brain–computer interface is used to control a robot's end-effector to achieve a desired trajectory or to perform pick/place tasks. The authors use an asynchronous protocol and a new LDA-based classifier to differentiate between three mental tasks. In [243][22], in particular, the system uses radio-frequency identification (RFID) to automatically detect objects that are close to the robot. A simple visual interface with two choices, “move” and “pick/place”, allows the user to pick/place the objects or to move the robot. The same approach is adopted in the research framework described in [245][24], where the user has to concentrate his/her attention on the required option in order to execute the action visualized on the menu screen. In the work of [244][23], an interactive, noninvasive, synchronous BCI system is developed to control a manipulator with several degrees of freedom over a whole three-dimensional workspace.
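To make the classification step concrete, the sketch below trains a linear discriminant analysis (LDA) classifier to separate three mental tasks from per-trial feature vectors, in the spirit of the LDA-based classifiers mentioned above. The features and labels are synthetic, and the actual feature extraction and classifier variants of [240,243][15][22] are not reproduced here.

```python
# Hedged sketch: LDA classification of three mental tasks from synthetic features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_features = 120, 16               # e.g., band power on a few channels/bands
X = rng.normal(size=(n_trials, n_features))  # synthetic feature matrix
y = rng.integers(0, 3, size=n_trials)        # three mental tasks labeled 0, 1, 2

clf = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

clf.fit(X, y)
new_trial = rng.normal(size=(1, n_features))
print("predicted mental task:", clf.predict(new_trial)[0])
```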
In the work of [248][27], which uses a robot-assisted upper-limb rehabilitation system, the patient's intention is translated into direct control of the rehabilitation robot. The acquired signal is processed (through wavelet transform and LDA) to classify the pattern of left- and right-upper-limb motor imagery. Finally, a personal computer triggers the upper-limb rehabilitation robot to perform motor therapy and provides virtual feedback.
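A possible wavelet-based feature extraction step for such left/right motor-imagery trials is sketched below; the wavelet family, decomposition level, and channel layout are assumptions and do not reproduce the exact pipeline of [248][27].

```python
# Hedged sketch: wavelet sub-band energies as features for motor-imagery trials.
import numpy as np
import pywt

def wavelet_features(trial, wavelet="db4", level=4):
    """trial: array of shape (n_channels, n_samples).
    Returns one feature vector holding the energy of each wavelet sub-band per channel."""
    features = []
    for channel in trial:
        coeffs = pywt.wavedec(channel, wavelet, level=level)
        features.extend(float(np.sum(c ** 2)) for c in coeffs)
    return np.array(features)

trial = np.random.randn(2, 512)   # e.g., channels C3 and C4, 2 s at 256 Hz
print(wavelet_features(trial).shape)
```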
In the study of [249][28], the authors showed how BCI-based control of a robot moving in a user's home can be successfully achieved after a training period. P300 is used in [247][26] to discern which object the robot should pick up and which location the robot should take the object to. The robot is equipped with a camera to frame objects. The user is instructed to attend to the image of the desired object while the border around each image is flashed in random order. A similar procedure is used to select a destination location. From a communication viewpoint, the approach provides cues in a synchronous way. The research study [232][5] deals with a similar task but with an asynchronous BCI-based direct-control system for humanoid robot navigation. The experimental procedures consist of offline training, online feedback testing, and real-time control sessions. Five healthy subjects controlled the navigation of a humanoid robot to reach a target in an indoor maze by using their EEG signals and real-time images obtained from a camera on the head of the robot.
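The object-selection step of such a P300 protocol can be illustrated as follows: the post-stimulus epochs recorded after each object's flashes are averaged, and the object whose average response is largest in a late positive window is selected. The scoring rule below (mean amplitude around 300 ms on a single channel) is a deliberate simplification of what a trained P300 classifier would do, and the window and sampling rate are assumptions.

```python
# Hedged sketch of P300-based target selection by averaging flash-locked epochs.
import numpy as np

FS = 256  # assumed sampling rate (Hz)

def select_target(epochs_per_object, window=(0.25, 0.45)):
    """epochs_per_object: dict mapping object id -> array (n_flashes, n_samples)
    of single-channel epochs time-locked to that object's flash onsets."""
    lo, hi = int(window[0] * FS), int(window[1] * FS)
    scores = {obj: epochs.mean(axis=0)[lo:hi].mean()
              for obj, epochs in epochs_per_object.items()}
    return max(scores, key=scores.get)    # object with the strongest late positivity

epochs = {obj: np.random.randn(10, FS) for obj in ("cup", "book", "phone")}
print("selected object:", select_target(epochs))
```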
Brain–computer interface-based control has also been adopted to manage hand or arm prostheses [1,250][29][30]. In such cases, patients underwent a training period during which they learned to use their motor imagery. In particular, in the work described in [1][29], tetraplegic patients were trained to control the opening and closing of their paralyzed hand by means of an orthosis driven by EEG activity recorded over the sensorimotor cortex.

5. Application to Wheelchair Control and Autonomous Vehicles

Power wheelchairs are traditionally operated by a joystick, with one or more switches changing the function that the joystick controls. Not all persons who could experience increased mobility by using a powered wheelchair possess the cognitive and neuromuscular capacity needed to navigate a dynamic environment with a joystick. For these users, a “shared control” approach coupled with an alternative interface is indicated. In a traditional shared control system, the assistive technology assists the user in path navigation. Shared control systems typically can work in several modes that vary the assistance provided (i.e., the user's autonomy) and rely on several movement algorithms. The authors of [251][31] suggest that shared control approaches can be classified in two ways: (1) mode changes triggered by the user via a button and (2) mode changes hard-coded to occur when specific conditions are detected.
Most of the current research related to BCI-based control of wheelchairs shows applications of synchronous protocols [252,253,254,255,256,257][32][33][34][35][36][37]. Although synchronous protocols have shown high accuracy and safety [253][33], low response efficiency and inflexible path options can limit wheelchair control in real environments.
Minimization of user involvement is addressed by the work in [251][31] through a novel semi-autonomous navigation strategy. Instead of requiring user control commands at each step, the robot proposes actions (e.g., turning left or going forward) based on environmental information. The subject may reject the proposed action if he/she disagrees with it; upon rejection, the robot takes a different decision based on the user's intention. The system relies on the automatic detection of interesting navigational points and on a human–robot dialog aimed at inferring the user's intended action.
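The propose/reject interaction can be summarized by a short loop: the robot offers its ranked proposals one at a time and executes the first one that the user does not veto. The action names and fallback behavior below are assumptions used only to illustrate the idea, not the implementation of [251][31].

```python
# Hedged sketch of the propose/reject dialog for semi-autonomous navigation.
def choose_action(proposals, user_rejects):
    """proposals: actions ranked by the robot's own preference (best first).
    user_rejects: callable returning True when the BCI detects a rejection."""
    for action in proposals:
        if not user_rejects(action):
            return action          # user accepted (did not object)
    return "stop"                  # every proposal rejected: halt and re-plan

ranked = ["go_forward", "turn_left", "turn_right"]
decision = choose_action(ranked, user_rejects=lambda a: a == "go_forward")
print(decision)   # "turn_left": the first proposal was vetoed by the user
```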
The authors of the research work [252][32] used a discrete approach for the navigation problem, in which the environment is discretized into two regions (rectangles of 1 m², one to the left and the other to the right of the start position), and the user decides where to move next by imagining left or right limb movements. In [253,254][33][34], a P300-based (slow) BCI is used to select the destination from a list of predefined locations. While the wheelchair moves on virtual guiding paths ensuring smooth, safe, and predictable trajectories, the user can stop the wheelchair by means of a faster BCI; the system switches between the fast and the slow BCIs depending on the state of the wheelchair. The paper [255][35] describes a brain-actuated wheelchair based on a synchronous P300 neurophysiological protocol integrated in a real-time graphical scenario builder, which incorporates advanced autonomous navigation capabilities (shared control). In the experiments, the task of the autonomous navigation system was to drive the vehicle to a given destination while avoiding obstacles (both static and dynamic) detected by a laser sensor. The goal location was provided by the user by means of a brain–computer interface.
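The switching between the slow destination-selection BCI and the fast stopping BCI, depending on the state of the wheelchair, can be pictured as a small state machine. The state names and decoder placeholders below are assumptions, not the implementation of [253,254][33][34].

```python
# Hedged sketch: state machine alternating between a slow P300 destination selector
# (while idle) and a fast stop detector (while moving).
class WheelchairBCI:
    def __init__(self):
        self.state = "IDLE"
        self.destination = None

    def step(self, p300_choice=None, stop_detected=False):
        if self.state == "IDLE" and p300_choice is not None:
            self.destination = p300_choice   # slow BCI: pick a predefined location
            self.state = "MOVING"
        elif self.state == "MOVING" and stop_detected:
            self.state = "IDLE"              # fast BCI: stop the wheelchair
            self.destination = None
        return self.state, self.destination

bci = WheelchairBCI()
print(bci.step(p300_choice="kitchen"))   # ('MOVING', 'kitchen')
print(bci.step(stop_detected=True))      # ('IDLE', None)
```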
The contributions [256,257][36][37] describe a BCI based on SSVEPs to control the movement of an autonomous robotic wheelchair. The signals used in these works are recorded from visually stimulated individuals; the stimuli are black-and-white checkerboards flickering at different frequencies.
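A simple way to picture SSVEP command decoding is to associate each command with a flicker frequency and to select the frequency whose spectral power over an occipital channel is strongest, as in the sketch below. The frequencies, channel, and plain power criterion are assumptions; many SSVEP systems use more robust detectors such as canonical correlation analysis.

```python
# Hedged sketch: SSVEP command detection from the dominant flicker frequency.
import numpy as np
from scipy.signal import welch

FS = 256
STIM_FREQS = {8.0: "forward", 10.0: "left", 12.0: "right", 15.0: "backward"}

def detect_ssvep(occipital_signal):
    f, pxx = welch(occipital_signal, fs=FS, nperseg=2 * FS)
    def power_at(freq):
        return pxx[np.argmin(np.abs(f - freq))]
    return STIM_FREQS[max(STIM_FREQS, key=power_at)]

# synthetic 4-second trial dominated by a 10 Hz response
t = np.arange(4 * FS) / FS
signal = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(t.size)
print(detect_ssvep(signal))   # expected: "left"
```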
Asynchronous protocols have been suggested for BCI-based wheelchair control in [258,259,260][38][39][40]. The authors of [258][38] used beta oscillations in the EEG, elicited by the imagination of movements, of a paralyzed subject for self-paced asynchronous BCI control. The subject, immersed in a virtual street populated with avatars, was asked to move among the avatars toward the end of the street, to stop by each avatar, and to talk to them. In the experiments described in [259][39], a human user performs path planning and fully controls the wheelchair except for automatic obstacle avoidance based on a laser range finder. In the experiments reported in [260][40], two human subjects were asked to mentally drive both a real and a simulated wheelchair from a starting point to a goal along a prespecified path.
Several recent papers describe BCI applications where wheelchair control is multidimensional. In fact, control commands from a single modality are not enough to meet the criteria of multidimensional control, whereas the combination of different EEG signals can be used to issue multiple (simultaneous or sequential) control commands. The authors of [261,262][41][42] showed that hybrid EEG signals, such as SSVEP and motor imagery, could improve the classification accuracy of brain–computer interfaces. The authors of [263,264][43][44] adopted the combination of the P300 potential with MI or SSVEP to control a brain-actuated wheelchair; in this case, multidimensional control (direction and speed) is provided by multiple commands. In the paper [265][45], the authors proposed a hybrid BCI system that combines MI and SSVEP to control the speed and direction of a wheelchair synchronously. In this system, the direction of the wheelchair was given by left- and right-hand imagery, and the idle state without mental activity was decoded to keep the wheelchair moving straight ahead. Synchronously, SSVEP signals induced by gazing at specific flashing buttons were used to accelerate or decelerate the wheelchair. To make it easier for the reader to identify the references in this section, Table 1 summarizes the papers about BCI applications presented in each subsection.
Table 1. BCI applications to automation.
| Reference Paper | Application to Unmanned Vehicles and Robotics | Application to Mobile Robotics and Interaction with Robotic Arms | Application to Robotic Arms, Robotic Tele-Presence, and Electrical Prosthesis | Application to Wheelchair Control and Autonomous Vehicles |
|---|:---:|:---:|:---:|:---:|
| [115][46] | X |  |  |  |
| [216][47] | X |  |  |  |
| [217][48] | X |  |  |  |
| [218][49] | X |  |  |  |
| [219][50] | X |  |  |  |
| [220][51] | X |  |  |  |
| [221][52] | X |  |  |  |
| [222][53] | X |  |  |  |
| [223][54] | X |  |  |  |
| [224][55] | X |  |  |  |
| [225][56] | X |  |  |  |
| [226][57] | X |  |  |  |
| [227][11] | X |  |  |  |
| [78][16] | X |  |  |  |
| [34][58] | X |  |  |  |
| [228][1] | X |  |  |  |
| [229][2] | X |  |  |  |
| [230][3] | X |  |  |  |
| [231][4] | X |  |  |  |
| [232][5] | X |  |  |  |
| [132][14] |  | X |  |  |
| [240][15] |  | X |  |  |
| [78][16] |  | X |  |  |
| [22][17] |  | X |  |  |
| [77][18] |  | X |  |  |
| [241][19] |  | X |  |  |
| [123][20] |  |  | X |  |
| [242][21] |  |  | X |  |
| [240][15] |  |  | X |  |
| [243][22] |  |  | X |  |
| [244][23] |  |  | X |  |
| [245][24] |  |  | X |  |
| [246][25] |  |  | X |  |
| [247][26] |  |  | X |  |
| [248][27] |  |  | X |  |
| [249][28] |  |  | X |  |
| [1][29] |  |  | X |  |
| [250][30] |  |  | X |  |
| [251][31] |  |  |  | X |
| [252][32] |  |  |  | X |
| [253][33] |  |  |  | X |
| [255][35] |  |  |  | X |
| [256][36] |  |  |  | X |
| [257][37] |  |  |  | X |
| [258][38] |  |  |  | X |
| [259][39] |  |  |  | X |
| [261][41] |  |  |  | X |
| [262][42] |  |  |  | X |
| [263][43] |  |  |  | X |
| [264][44] |  |  |  | X |
| [265][45] |  |  |  | X |

References

  1. LaFleur, K.; Cassady, K.; Doud, A.; Shades, K.; Rogin, E.; He, B. Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface. J. Neural Eng. 2013, 10, 046003.
  2. Royer, A.; He, B. Goal selection versus process control in a brain-computer interface based on sensorimotor rhythms. J. Neural Eng. 2009, 6, 016005.
  3. Royer, A.S.; Doud, A.J.; Rose, M.L.; He, B. EEG control of a virtual helicopter in 3-dimensional space using intelligent control strategies. IEEE Trans. Neural Syst. Rehabil. Eng. 2010, 18, 581–589.
  4. Doud, A.J.; Lucas, J.P.; Pisansky, M.T.; He, B. Continuous Three-Dimensional Control of a Virtual Helicopter Using a Motor Imagery Based Brain-Computer Interface. PLoS ONE 2011, 6, e26322.
  5. Chae, Y.; Jeong, J.; Jo, S. Toward Brain-Actuated Humanoid Robots: Asynchronous Direct Control Using an EEG-Based BCI. IEEE Trans. Robot. 2012, 28, 1131–1144.
  6. Alshabatat, A.; Vial, P.; Premaratne, P.; Tran, L. EEG-based brain-computer interface for automating home appliances. J. Comput. 2014, 9, 2159–2166.
  7. Lin, C.T.; Lin, B.S.; Lin, F.C.; Chang, C.J. Brain Computer Interface-Based Smart Living Environmental Auto-Adjustment Control System in UPnP Home Networking. IEEE Syst. J. 2014, 8, 363–370.
  8. Ou, C.Z.; Lin, B.S.; Chang, C.J.; Lin, C.T. Brain Computer Interface-based Smart Environmental Control System. In Proceedings of the Eighth International Conference on Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP), Piraeus-Athens, Greece, 18–20 July 2012; pp. 281–284.
  9. Edlinger, G.; Holzner, C.; Guger, C.; Groenegress, C.; Slater, M. Brain-computer interfaces for goal orientated control of a virtual smart home environment. In Proceedings of the 4th International IEEE/EMBS Conference on Neural Engineering, Antalya, Turkey, 29 April–2 May 2009; pp. 463–465.
  10. Kanemura, A.; Morales, Y.; Kawanabe, M.; Morioka, H.; Kallakuri, N.; Ikeda, T.; Miyashita, T.; Hagita, N.; Ishii, S. A waypoint-based framework in brain-controlled smart home environments: Brain interfaces, domotics, and robotics integration. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan, 3–7 November 2013; pp. 865–870.
  11. Marshall, D.; Coyle, D.; Wilson, S.; Callaghan, M. Games, gameplay, and BCI: The state of the art. IEEE Trans. Comput. Intell. AI Games 2013, 5, 82–99.
  12. Bayliss, J.; Ballard, D. A virtual reality testbed for brain-computer interface research. IEEE Trans. Rehabil. Eng. 2000, 8, 188–190.
  13. Bayliss, J. Use of the evoked potential P3 component for control in a virtual apartment. IEEE Trans. Neural Syst. Rehabil. Eng. 2003, 11, 113–116.
  14. Dornhege, G.; Millan, J.d.R.; Hinterberger, T.; McFarland, D.; Muller, K.R. (Eds.) Toward Brain-Computer Interfacing; A Bradford Book; The MIT Press: Cambridge, MA, USA, 2007.
  15. Ianez, E.; Azorin, J.M.; Ubeda, A.; Ferrandez, J.M.; Fernandez, E. Mental tasks-based brain–robot interface. Robot. Auton. Syst. 2010, 58, 1238–1245.
  16. Luzheng, B.; Xin-An, F.; Yili, L. EEG-Based Brain-Controlled Mobile Robots: A Survey. IEEE Trans. Hum. Mach. Syst. 2013, 43, 161–176.
  17. Bamdad, M.; Zarshenas, H.; Auais, M.A. Application of BCI systems in neurorehabilitation: A scoping review. Disabil. Rehabil. Assist. Technol. 2014, 10, 355–364.
  18. Wolpaw, J.R.; Birbaumer, N.; Heetderks, W.J.; McFarland, D.J.; Peckham, H.P.; Gerwin, S.; Emanuel, D.; Quatrano, L.A.; Robinson, C.J.; Vaughan, T.M. Brain-Computer Interface Technology: A Review of the First International Meeting. IEEE Trans. Rehabil. Eng. 2000, 8, 222–226.
  19. Schalk, G. Brain-computer symbiosis. J. Neural Eng. 2008, 5, P1–P15.
  20. Wolpaw, J.R.; McFarland, D.J. Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans. Proc. Natl. Acad. Sci. USA 2004, 101, 17849–17854.
  21. McFarland, D.J.; Sarnacki, W.A.; Wolpaw, J.R. Electroencephalographic (EEG) control of three-dimensional movement. J. Neural Eng. 2010, 7, 036007.
  22. Ubeda, A.; Ianez, E.; Azorin, J.M. Shared control architecture based on RFID to control a robot arm using a spontaneous brain–machine interface. Robot. Auton. Syst. 2013, 61, 768–774.
  23. Li, T.; Hong, J.; Zhang, J.; Guo, F. Brain-machine interface control of a manipulator using small-world neural network and shared control strategy. J. Neurosci. Methods 2014, 224, 26–38.
  24. Sirvent Blasco, J.L.; Ianez, E.; Ubeda, A.; Azorin, J.M. Visual evoked potential-based brain-machine interface applications to assist disabled people. Expert Syst. Appl. 2012, 39, 7908–7918.
  25. Hortal, E.; Planelles, D.; Costa, A.; Ianez, E.; Ubeda, A.; Azorin, J.; Fernandez, E. SVM-based Brain-Machine Interface for controlling a robot arm through four mental tasks. Neurocomputing 2015, 151, 116–121.
  26. Bell, C.J.; Shenoy, P.; Chalodhorn, R.; Rao, R.P.N. Control of a humanoid robot by a noninvasive brain-computer interface in humans. J. Neural Eng. 2008, 5, 214–220.
  27. Xu, B.; Peng, S.; Song, A.; Yang, R.; Pan, L. Robot-aided upper-limb rehabilitation based on motor imagery EEG. Int. J. Adv. Robot. Syst. 2011, 8, 88–97.
  28. Leeb, R.; Perdikis, S.; Tonini, L.; Biasiucci, A.; Tavella, M.; Creatura, M.; Molina, A.; Al-Khodairy, A.; Carlson, T.; Millán, J. Transferring brain–computer interfaces beyond the laboratory: Successful application control for motor-disabled users. Artif. Intell. Med. 2013, 59, 121–132.
  29. Pfurtscheller, G.; Neuper, C.; Birbaumer, N. Human Brain-Computer Interface; CRC Press: Boca Raton, FL, USA, 2005; pp. 1–35.
  30. Gernot, R.; Muller-Putz, G.R.; Pfurtscheller, G. Control of an Electrical Prosthesis With an SSVEP-Based BCI. IEEE Trans. Biomed. Eng. 2008, 55, 361–364.
  31. Perrin, X.; Chavarriaga, R.; Colas, F.; Siegwart, R.; Millán, J.d.R. Brain-coupled Interaction for Semi-autonomous Navigation of an Assistive Robot. Robot. Auton. Syst. 2010, 58, 1246–1255.
  32. Tanaka, K.; Matsunaga, K.; Wang, H. Electroencephalogram-based control of an electric wheelchair. IEEE Trans. Robot. 2005, 21, 762–766.
  33. Rebsamen, B.; Guan, C.; Zhang, H.; Wang, C.; Teo, C.; Ang, M. A brain controlled wheelchair to navigate in familiar environments. IEEE Trans. Neural Syst. Rehabil. Eng. 2010, 18, 590–598.
  34. Rebsamen, B.; Teo, C.; Zeng, Q.; Ang, M.; Burdet, E.; Guan, C.; Zhang, H.; Laugier, C. Controlling a Wheelchair Indoors Using Thought. IEEE Intell. Syst. 2007, 22, 18–24.
  35. Iturrate, I.; Antelis, J.; Kubler, A.; Minguez, J. A noninvasive brain-actuated wheelchair based on a P300 neurophysiological protocol and automated navigation. IEEE Trans. Robot. 2009, 25, 614–627.
  36. Muller, S.; Celeste, W.; Bastos-Filho, T.; Sarcinelli-Filho, M. Brain-computer interface based on visual evoked potentials to command autonomous robotic wheelchair. J. Med. Biol. Eng. 2010, 30, 407–415.
  37. Diez, P.; Muller, S.; Mut, V.; Laciar, E.; Avila, E.; Bastos-Filho, T.; Sarcinelli-Filho, M. Commanding a robotic wheelchair with a high-frequency steady-state visual evoked potential based brain–computer interface. Med. Eng. Phys. 2013, 35, 1155–1164.
  38. Leeb, R.; Friedman, D.; Müller-Putz, G.R.; Scherer, R.; Slater, M.; Pfurtscheller, G. Self-paced (asynchronous) BCI control of a wheelchair in virtual environments: A case study with a tetraplegic. Comput. Intell. Neurosci. 2007, 7, 7–12.
  39. Tsui, C.; Gan, J.; Hu, H. A self-paced motor imagery based brain-computer interface for robotic wheelchair control. Clin. EEG Neurosci. 2011, 42, 225–229.
  40. Galan, F.; Nuttin, M.; Lew, P.; Vanacker, G.; Philips, J. A brain-actuated wheelchair: Asynchronous and non-invasive brain-computer interfaces for continuous control of robots. Clin. Neurophysiol. 2008, 119, 2159–2169.
  41. Allison, B.; Dunne, S.; Leeb, R.; Del, R.; Millán, J.; Nijholt, A. Towards Practical Brain–Computer Interfaces Bridging the Gap from Research to Real-World Applications; Springer: Berlin/Heidelberg, Germany, 2013.
  42. Brunner, C.; Allison, B.; Krusienski, D.; Kaiser, V.; Muller-Putz, G.; Pfurtscheller, G. Improved signal processing approaches in an offline simulation of a hybrid brain-computer interface. J. Neurosci. Methods 2010, 188, 165–173.
  43. Li, Y.; Pan, J.; Wang, F.; Yu, Z. A hybrid BCI system combining P300 and SSVEP and its application to wheelchair control. IEEE Trans. Biomed. Eng. 2013, 60, 3156–3166.
  44. Long, J.; Li, Y.; Wang, H.; Yu, T.; Pan, J.; Li, F. A Hybrid Brain Computer Interface to Control the Direction and Speed of a Simulated or Real Wheelchair. IEEE Trans. Neural Syst. Rehabil. Eng. 2012, 20, 720–729.
  45. Cao, L.; Li, J.; Ji, H.; Jiang, C. A hybrid brain computer interface system based on the neurophysiological protocol and brain-actuated switch for wheelchair control. J. Neurosci. Methods 2014, 229, 33–43.
  46. Ajrawi, S.; Rao, R.; Sarkar, M. Cybersecurity in brain-computer interfaces: RFID-based design-theoretical framework. Inform. Med. Unlocked 2021, 22, 100489.
  47. Graimann, B.; Pfurtscheller, G.; Allison, B. Brain-Computer Interfaces—Revolutionizing Human-Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2010.
  48. Ladouce, S.; Donaldson, D.; Dudchenko, P.; Ietswaart, M. Understanding Minds in Real-World Environments: Toward a Mobile Cognition Approach. Front. Hum. Neurosci. 2017, 10.
  49. Abiri, R.; Borhani, S.; Sellers, E.W.; Jiang, Y.; Zhao, X. A comprehensive review of EEG-based brain-computer interface paradigms. J. Neural Eng. 2019, 16.
  50. Lazarou, I.; Nikolopoulos, S.; Petrantonakis, P.; Kompatsiaris, I.; Tsolaki, M. EEG-Based Brain–Computer Interfaces for Communication and Rehabilitation of People with Motor Impairment: A Novel Approach of the 21st Century. Front. Hum. Neurosci. 2018, 12, 14.
  51. Shih, J.; Krusienski, D.; Wolpaw, J. Brain-computer interfaces in medicine. Mayo Clinic Proc. 2012, 87, 268–279.
  52. Huang, Q.; Zhang, Z.; Yu, T.; He, S.; Li, Y. An EEG-EOG-Based Hybrid Brain-Computer Interface: Application on Controlling an Integrated Wheelchair Robotic Arm System. Front. Neurosci. 2019, 13.
  53. Yuanqing, L.; Chuanchu, W.; Haihong, Z.; Cuntai, G. An EEG-based BCI system for 2D cursor control. In Proceedings of the IEEE International Joint Conference on Neural Networks, IJCNN, and IEEE World Congress on Computational Intelligence, Hong Kong, China, 1–8 June 2008; pp. 2214–2219.
  54. Donchin, E.; Spencer, K.M.; Wijesinghe, R. The mental prosthesis: Assessing the speed of a P300-based brain-computer interface. IEEE Trans. Rehabil. Eng. 2000, 8, 174–179.
  55. Hong, B.; Guo, F.; Liu, T.; Gao, X.; Gao, S. N200-speller using motion-onset visual response. Clin. Neurophysiol. 2009, 120, 1658–1666.
  56. Karim, A.A.; Hinterberger, T.; Richter, J.; Mellinger, J.; Neumann, N.; Flor, H.; Kübler, A.; Birbaumer, N. Neural internet: Web surfing with brain potentials for the completely paralyzed. Neurorehabilit. Neural Repair 2006, 20, 508–515.
  57. Bensch, M.; Karim, A.A.; Mellinger, J.; Hinterberger, T.; Tangermann, M.; Bogdan, M.; Rosenstiel, W.; Birbaumer, N. Nessi: An EEG-Controlled Web Browser for Severely Paralyzed Patients. Comput. Intell. Neurosci. 2007, 7, 508–515.
  58. Kim, M.; Kim, M.K.; Hwang, M.; Kim, H.Y.; Cho, J.; Kim, S.P. Online Home Appliance Control Using EEG-Based Brain–Computer Interfaces. Electronics 2019, 8, 1101.