Controlling Upper Limb Prostheses Using Sonomyography

A ground-breaking study by Zheng et al. investigated whether ultrasound imaging of the forearm could be used to control a powered prosthesis, and the group coined the term “sonomyography” (SMG) for this technique. Ultrasound signals have recently garnered the interest of researchers in the area of HMIs because they can capture information from both superficial and deep muscles and thus provide more comprehensive information than other techniques. Owing to the high spatiotemporal resolution and specificity of ultrasound measurements of muscle deformation, researchers have been able to infer fine volitional motor activities, such as finger motions, and to achieve dexterous control of robotic hands.

controlling system; human–machine interface; machine learning; non-invasive sensor; prosthesis; sonomyography

1. Introduction

Human–machine interfaces (HMIs) and wearable technologies have attracted a great deal of interest in recent decades because they can be used for a wide range of applications, including immersive games [1], rehabilitation engineering [2][3][4][5], the automotive industry [6][7], tele-operation in space [8], and virtual reality [9]. HMIs are also frequently employed in the control systems of prostheses and exoskeletons. In contrast to the many advancements in mechanical design, significant challenges remain at the higher levels of the control hierarchy. One particular class of interface can be used to predict a patient’s volitional movement from residual muscle contractions or neural activities [10][11]. However, detecting a user’s motion intention quickly enough to coordinate with the device remains an open problem that requires further study [12]. A range of sensing modalities have been used to drive human–machine interfaces, and sensing technologies for HMIs have been developed to provide accurate and reliable information for inferring movement intention.
The most common approach to controlling prostheses is the use of biological signals, which may be recorded by a variety of sensors and electrodes interfacing with either the peripheral nervous system (PNS) or the central nervous system (CNS) [13][14]. These techniques are classified as either non-invasive, including surface electromyography (sEMG), electroencephalography (EEG), forcemyography (FMG), mechanomyography (MMG), magnetoencephalography (MEG), and force-sensitive resistance (FSR), or invasive, including magnetomicrometry (MM), recently developed at MIT [15], implanted electromyography (iEMG), myoelectric implantable recording arrays (MIRAs), electroneurography (ENG), electrocorticography (ECoG), and brain–chip interfaces (BCHIs) [16]. Among all of these techniques, sEMG is the most commonly used method for prosthesis control and has been studied very extensively [17][18][19][20].
Over the last decade, there has been a concentrated effort to non-invasively monitor user intention and intuitively operate the various degrees of freedom of cutting-edge prostheses. Non-invasive techniques involve placing electrodes on the skin of the scalp or over skeletal muscles, often with conductive gel applied to improve the contact area and conductivity between the electrodes and the skin surface [21]. To collect the low-amplitude electrical impulses of skeletal muscles, bipolar electrodes are placed on the muscles to record their activity. A difficulty with non-invasive techniques, however, is that the data obtained by the sensors can be substantially influenced by a variety of factors, including electrode placement and movement, perspiration, and noise from the electronics. Moreover, these methods have poor spatial resolution due to interference between the signals generated by neighboring or overlapping muscles. Surface EMG is also unable to accurately record the activity of deep muscles, which makes it difficult to use this approach to control prostheses with multiple degrees of freedom [22]. Additionally, training users to control robots with biological signals is difficult and time-consuming, another drawback of these interface methods [23], as the signals are often not linearly related to muscle outputs such as force or joint angle [22].
Biomaterials have been used for implants for a long time [24]. Implanted myoelectric sensors, peripheral nerve implants, targeted muscle reinnervation, brain–computer interfaces [25], and implanted stimulators [26] are examples of new technologies that have the potential to bring significant improvements and new opportunities to neurological research. Invasive techniques involve placing neural implants deep in the brain, on nerves, or in skeletal muscles [16], and recording signals from the cerebral cortex or from muscle activity. These implants connect with the brain, nerves, and muscles to collect electrical signals during nerve or muscle activation; they can also deliver electrical impulses to neurons and transmit signals between neurons and computers, or between computers and neurons, through a chip [24][27]. While invasive approaches may increase the stability of biological signals and give more accurate information about brain or muscle activity [28], these novel interface methods raise many concerns regarding the safety and efficacy of the procedures, which involve surgery or implanted devices [23]. Furthermore, like non-invasive techniques, these signals are also subject to noise.
Researchers have also made significant efforts in recent years to employ new technologies and propose novel techniques for controlling prosthetic hands, such as augmented-reality (AR) glasses [29], inductive tongue control systems (ITCSs) [30], voice commands, and inertial measurement units (IMUs) [31][32]. Some of these concepts have shown that even the simplest techniques can yield compelling results.
These techniques are, however, typically limited to prostheses with a single degree of freedom. The analysis and classification of biological signals therefore requires intelligent algorithms capable of accurately classifying the collected signals with as few errors as possible [33]. In recent years, significant improvements in the processing and classification of biological signals have been made using a variety of machine learning methods, including deep learning. For example, machine learning has yielded good results and high accuracy across a wide variety of applications, including the rehabilitation and re-education of physically handicapped human limbs [34]. For enhancing robot control, algorithms such as K-nearest neighbors (KNN), support vector machines (SVMs), principal component analysis (PCA), linear discriminant analysis (LDA), artificial neural networks (ANNs), convolutional neural networks (CNNs), and Bayesian networks can be used to classify signals with an accuracy of approximately 90% [35]; a minimal sketch of such a classification pipeline is shown below.
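As a rough illustration of how such classifiers are compared, the following Python sketch trains three of the algorithms named above with scikit-learn. The synthetic feature matrix is a stand-in only; real systems would use feature vectors extracted from biological signals, and all sizes and parameters here are illustrative assumptions.

```python
# Minimal sketch: comparing classifiers commonly used for gesture
# classification from muscle-activity features. Synthetic data stands
# in for real extracted feature vectors.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Placeholder for extracted features: 500 samples, 32 features, 5 gestures.
X, y = make_classification(n_samples=500, n_features=32, n_informative=16,
                           n_classes=5, random_state=0)

classifiers = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf", C=1.0),
    "LDA": LinearDiscriminantAnalysis(),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```

Cross-validated accuracy of this kind is the usual basis for the roughly 90% figures reported in the literature [35].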
Recently, it has been shown that replacing biological signals with ultrasound (US) imaging, which can provide real-time dynamic images of internal tissue movement linked to physical and physiological activity, enables better discrimination between discrete motions and categorization of full finger flexion [36]. Changes in muscle architecture can be detected by placing an ultrasound probe on the residual limb, and different hand gestures can be classified from the resulting muscle movement and activity to control a prosthesis [37].

2. Sonomyography (SMG)

A prosthesis driven by the user’s physiological signals must respond quickly to perform well. EEG, sEMG, and other intuitive interfaces can detect neuromuscular signals that appear before the onset of motion [38][39][40]. Ultrasound imaging can likewise detect kinematic and kinetic characteristics of skeletal muscle [41] that reflect the ongoing formation of cross-bridges during motor unit recruitment, prior to the generation of muscular force [39][42]; these changes occur during sarcomere shortening, when muscle force exceeds segmental inertial forces and before the onset of joint motion [39]. Because these kinetic and kinematic ultrasound properties of muscle change before the joint moves, present and future prosthetic hands based on them will be able to respond more quickly.

2.1. Ultrasound Modes Used in SMG

Real-time dynamic images of muscle activity can be provided by US imaging systems. There are five different ultrasound modes, each generating different information, but only some are applicable to controlling artificial robotic hands. The most popular ultrasound modes utilized in prosthesis control are A-mode, B-mode, and M-mode; the sketch after the list below illustrates how they relate.
(1) A-mode SMG: One of the most basic types of US is A-mode, which offers data in one dimension in the form of a graph in which the y-axis indicates information about echo amplitude and the x-axis represents time, similar to the way that EMG signals indicate muscle activity.
(2) B-mode SMG: B-mode, or 2D mode, provides a cross-sectional image of tissues or organs and is one of the most popular US modes used in a wide range of medical applications. In B-mode US, organs and tissues show up as points of different brightness in 2D grayscale images made from the echoes. B-mode ultrasound can provide a real-time image of muscles under contraction.
(3) M-mode SMG: An M-mode (motion mode) scan uses a series of A-mode scan signals, normally obtained by selecting one line in B-mode imaging, to depict tissue motion over time. Using M-mode, it is possible to estimate the velocity of individual organ structures. In comparison to B-mode and A-mode, the motion mode scans at a higher rate along the selected line and provides more comprehensive information about the tissue’s motion.
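To make the relationship between the three modes concrete, the following minimal Python sketch extracts an A-line and builds an M-mode trace from a sequence of B-mode frames. The random arrays are placeholders for real echo data, and all array names and sizes are illustrative assumptions.

```python
# Sketch of the A-/B-/M-mode relationship: an M-mode trace is formed by
# stacking one scan line (one A-line position) from successive B-mode
# frames over time.
import numpy as np

n_frames, depth, width = 200, 256, 128
frames = np.random.rand(n_frames, depth, width)  # B-mode sequence (t, z, x)

a_line = frames[0, :, width // 2]      # A-mode: echo amplitude vs. depth
m_mode = frames[:, :, width // 2].T    # M-mode: depth x time image

print(a_line.shape)  # (256,)   one column of echoes
print(m_mode.shape)  # (256, 200) tissue motion along one line over time
```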

2.2. Muscle Location and Probe Fixation

It is vital to note that the position and location of the probe are critical for achieving greater control over robotic hands. The main muscles performing the different types of finger flexion are the flexor digitorum superficialis (FDS), flexor digitorum profundus (FDP), and flexor pollicis longus (FPL). Different wrist movements, however, involve the pronator teres, flexor carpi radialis, flexor carpi ulnaris, palmaris longus, and pronator quadratus (Figure 1).
Figure 1. Illustration of fiber tractography and textbook anatomical structure of the main forearm flexor muscles [43]. (A) Superficial muscle group of the forearm. (B) Deep muscle group of the forearm.
Hence, placing the sensors where these muscle activities can be captured is important for better and more reliable control over the robot. After collecting data from healthy volunteers, Akhlaghi et al. [44] found that muscle deformation was most significant at 30–50% of the forearm length from the elbow and that this region is the best place to record muscle movement for controlling robots. However, after testing various locations and fixation positions on a range of healthy individuals, McIntosh et al. [45] found that the wrist region is the most effective site for classifying discrete motions. Furthermore, they observed that the diagonal position is the most effective for collecting data to identify discrete gestures, whereas the diagonal and transverse positions are the most effective for predicting finger angles (Figure 2).
Figure 2. (a–e) A comparison of the ultrasound probe’s various hand-mounting positions, along with the corresponding images [45].

2.3. Feature Extraction Algorithm

To classify finger movements and different hand gestures, it is important to use algorithms that extract features from the signals or images captured by US transducers, because machine learning algorithms cannot efficiently process the raw data in full. It is worth mentioning that a machine learning algorithm can classify different hand gestures without extracted features, but the accuracy would be significantly lower. A simple illustration of this dimensionality reduction follows.
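The sketch below illustrates why feature extraction matters: a raw B-mode frame contains hundreds of thousands of pixels, whereas a compact feature vector (here, simple block means; a deliberately crude stand-in for the methods discussed next) is far easier for a classifier to handle. Frame size and block size are illustrative assumptions.

```python
# Reduce a raw B-mode frame to a compact feature vector of block means.
import numpy as np

frame = np.random.rand(480, 640)             # placeholder B-mode frame
blocks = frame.reshape(48, 10, 64, 10)       # split into 10x10 pixel blocks
features = blocks.mean(axis=(1, 3)).ravel()  # one mean intensity per block

print(frame.size)     # 307200 raw pixels
print(features.size)  # 3072 features
```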
Shi et al. [46] captured forearm muscle activity with B-mode ultrasound and controlled a hand prosthesis, using AI to classify the finger movements. Before the collected data were used for training, a deformation field was constructed to extract features after registering each ultrasound image pair with the demons registration algorithm. Ortenzi et al. [21] used the SMG technique as a valid HMI method to control a robotic hand. To classify ten different hand gestures and grasp forces, visual characteristics such as region-of-interest gradients and histogram of oriented gradients (HOG) features were extracted from the collected images, and these features were used to train three machine learning algorithms; a sketch of such a pipeline follows.
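The following Python sketch shows one plausible version of the HOG-plus-classifier pipeline, using scikit-image and scikit-learn. The HOG parameters, frame sizes, and labels are assumptions for illustration; the original work’s exact settings are not specified here.

```python
# Sketch: HOG features from B-mode frames feeding an SVM classifier.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def extract_hog(frame):
    # Gradient-orientation histograms over local cells (illustrative settings).
    return hog(frame, orientations=9, pixels_per_cell=(16, 16),
               cells_per_block=(2, 2))

# Placeholder data: 100 frames, 10 gesture labels.
frames = np.random.rand(100, 128, 128)
labels = np.repeat(np.arange(10), 10)

X = np.array([extract_hog(f) for f in frames])
clf = SVC(kernel="linear").fit(X, labels)  # one of several possible classifiers
print(clf.score(X, labels))                # training accuracy (sanity check)
```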
The activity pattern was generated using an image-processing method developed by Akhlaghi et al. [47]. MATLAB (MathWorks, Natick, MA, USA) was used to extract an activity pattern for each kind of hand movement from the B-mode ultrasound image frames. Pixel-wise differences between sequential frames of each series were computed and then averaged across a time span to identify the spatial distribution of intensity variations corresponding to muscle activity (the raw activity pattern). In this way, each hand motion was mapped to a single activity pattern. Based on a global thresholding level and a decimal block size, the raw activity pattern was then converted into a binary image. This database was then used to train a nearest-neighbor classification algorithm; the sketch below illustrates the idea.
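A minimal Python version of this activity-pattern idea is sketched below (in place of the original MATLAB code). The threshold, block size, and data shapes are illustrative assumptions, not values from [47].

```python
# Sketch: average pixel-wise inter-frame differences over a movement,
# threshold the result into a binary activity pattern, then classify
# patterns with a nearest-neighbour rule.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def activity_pattern(frames, threshold=0.1, block=8):
    diffs = np.abs(np.diff(frames, axis=0))  # pixel-wise frame differences
    raw = diffs.mean(axis=0)                 # averaged over the time span
    h, w = raw.shape
    raw = raw[:h - h % block, :w - w % block]
    coarse = raw.reshape(raw.shape[0] // block, block,
                         raw.shape[1] // block, block).mean(axis=(1, 3))
    return (coarse > threshold).astype(np.uint8).ravel()  # binary pattern

# Placeholder: 20 movement sequences of 30 frames each, 4 hand motions.
sequences = np.random.rand(20, 30, 128, 128)
labels = np.repeat(np.arange(4), 5)

X = np.array([activity_pattern(s) for s in sequences])
clf = KNeighborsClassifier(n_neighbors=1).fit(X, labels)  # nearest neighbour
```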

2.4. Artificial Intelligence in Classification

To achieve dexterous and precise control over prostheses, different deep learning and machine learning algorithms have been developed to classify different hand gestures and intended movements from SMG with high accuracy.
To control a prosthetic device in real time, Shi et al. [48] examined the sum of absolute differences (SAD), the two-dimensional logarithmic search (TDL), the cross-correlation (CC) method, and SAD and TDL in conjunction with streaming single-instruction multiple-data extensions (SSE). They utilized a block-matching method to measure muscle deformation during contraction. Comparing TDL with and without SSE, the findings revealed good execution efficiency, with a mean correlation coefficient of about 0.99, a mean standard root-mean-square error of less than 0.75, and a mean relative root-mean-square error of less than 8.0%. Their tests showed that a prosthetic hand can be controlled from a single muscle position, allowing the level of muscle contraction to be tracked. They concluded that SMG is effective for controlling prosthetic hands, allowing them to open and close proportionally and quickly; a block-matching sketch follows.
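The following Python sketch shows an exhaustive SAD block match between two frames, as a simplified stand-in for the methods compared in [48]; block size, search range, and frame contents are illustrative assumptions.

```python
# Sketch of SAD block matching for tracking muscle deformation: slide a
# reference block over a search window in the next frame and pick the
# displacement with the smallest sum of absolute differences.
import numpy as np

def sad_match(prev_frame, next_frame, top, left, size=16, search=8):
    block = prev_frame[top:top + size, left:left + size]
    best, best_dy_dx = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            cand = next_frame[y:y + size, x:x + size]
            if cand.shape != block.shape:
                continue  # candidate block falls outside the frame
            sad = np.abs(block - cand).sum()  # sum of absolute differences
            if sad < best:
                best, best_dy_dx = sad, (dy, dx)
    return best_dy_dx  # estimated block displacement in pixels

frames = np.random.rand(2, 128, 128)  # placeholder ultrasound frames
print(sad_match(frames[0], frames[1], top=50, left=50))
```

The TDL variant examined in the same study reduces the number of candidate displacements logarithmically instead of testing every offset, which is where its speed advantage comes from.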
To capture muscle activity during finger flexion and extension and evaluate the potential of ultrasound devices for HMI, Shi et al. [46] employed B-mode ultrasound imaging. A deformation field was used to extract features, which were then fed into an SVM classifier to identify finger movements. The experimental results revealed an overall mean recognition accuracy of around 94%, indicating that the method has high accuracy and reliability. They assert that the proposed approach might be used in place of surface electromyography for recognizing distinct finger movements; a sketch of computing such a deformation field follows.
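A deformation field of this kind can be obtained with a demons-type registration; the Python sketch below uses SimpleITK’s built-in filter as one possible implementation (the original work’s exact implementation and settings may differ, and the random images are placeholders).

```python
# Sketch: dense deformation field between consecutive ultrasound frames
# via demons registration, usable as classifier input features.
import numpy as np
import SimpleITK as sitk

fixed = sitk.GetImageFromArray(np.random.rand(128, 128).astype(np.float32))
moving = sitk.GetImageFromArray(np.random.rand(128, 128).astype(np.float32))

demons = sitk.DemonsRegistrationFilter()
demons.SetNumberOfIterations(50)   # illustrative setting
demons.SetStandardDeviations(1.0)  # Gaussian smoothing of the field

displacement = demons.Execute(fixed, moving)  # dense deformation field
field = sitk.GetArrayFromImage(displacement)  # (128, 128, 2) array
print(field.shape)  # per-pixel (dy, dx) displacements used as features
```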

References

  1. Van Dijk, L.; van der Sluis, C.K.; van Dijk, H.W.; Bongers, R.M. Task-oriented gaming for transfer to prosthesis use. IEEE Trans. Neural Syst. Rehabil. Eng. 2015, 24, 1384–1394.
  2. Liu, H.; Dong, W.; Li, Y.; Li, F.; Geng, J.; Zhu, M.; Chen, T.; Zhang, H.; Sun, L.; Lee, C. An epidermal sEMG tattoo-like patch as a new human–machine interface for patients with loss of voice. Microsyst. Nanoeng. 2020, 6, 16.
  3. Jiang, N.; Dosen, S.; Muller, K.-R.; Farina, D. Myoelectric control of artificial limbs—Is there a need to change focus? IEEE Signal Process. Mag. 2012, 29, 150–152.
  4. Nazari, V.; Pouladian, M.; Zheng, Y.-P.; Alam, M. A Compact and Lightweight Rehabilitative Exoskeleton to Restore Grasping Functions for People with Hand Paralysis. Sensors 2021, 21, 6900.
  5. Nazari, V.; Pouladian, M.; Zheng, Y.-P.; Alam, M. Compact Design of A Lightweight Rehabilitative Exoskeleton for Restoring Grasping Function in Patients with Hand Paralysis. Res. Sq. 2021. Preprint.
  6. Zhang, Y.; Yu, C.; Shi, Y. Designing autonomous driving HMI system: Interaction need insight and design tool study. In Proceedings of the International Conference on Human-Computer Interaction, Las Vegas, NV, USA, 15–20 July 2018; pp. 426–433.
  7. Young, S.N.; Peschel, J.M. Review of human–machine interfaces for small unmanned systems with robotic manipulators. IEEE Trans. Hum.-Mach. Syst. 2020, 50, 131–143.
  8. Wilde, M.; Chan, M.; Kish, B. Predictive human-machine interface for teleoperation of air and space vehicles over time delay. In Proceedings of the 2020 IEEE Aerospace Conference, Big Sky, MT, USA, 7–14 March 2020; pp. 1–14.
  9. Morra, L.; Lamberti, F.; Pratticó, F.G.; La Rosa, S.; Montuschi, P. Building trust in autonomous vehicles: Role of virtual reality driving simulators in HMI design. IEEE Trans. Veh. Technol. 2019, 68, 9438–9450.
  10. Bortole, M.; Venkatakrishnan, A.; Zhu, F.; Moreno, J.C.; Francisco, G.E.; Pons, J.L.; Contreras-Vidal, J.L. The H2 robotic exoskeleton for gait rehabilitation after stroke: Early findings from a clinical study. J. Neuroeng. Rehabil. 2015, 12, 54.
  11. Pehlivan, A.U.; Losey, D.P.; O’Malley, M.K. Minimal assist-as-needed controller for upper limb robotic rehabilitation. IEEE Trans. Robot. 2015, 32, 113–124.
  12. Zhu, C.; Luo, L.; Mai, J.; Wang, Q. Recognizing Continuous Multiple Degrees of Freedom Foot Movements With Inertial Sensors. IEEE Trans. Neural Syst. Rehabil. Eng. 2022, 30, 431–440.
  13. Russell, C.; Roche, A.D.; Chakrabarty, S. Peripheral nerve bionic interface: A review of electrodes. Int. J. Intell. Robot. Appl. 2019, 3, 11–18.
  14. Yildiz, K.A.; Shin, A.Y.; Kaufman, K.R. Interfaces with the peripheral nervous system for the control of a neuroprosthetic limb: A review. J. Neuroeng. Rehabil. 2020, 17, 43.
  15. Taylor, C.R.; Srinivasan, S.; Yeon, S.H.; O’Donnell, M.; Roberts, T.; Herr, H.M. Magnetomicrometry. Sci. Robot. 2021, 6, eabg0656.
  16. Ng, K.H.; Nazari, V.; Alam, M. Can Prosthetic Hands Mimic a Healthy Human Hand? Prosthesis 2021, 3, 11–23.
  17. Geethanjali, P. Myoelectric control of prosthetic hands: State-of-the-art review. Med. Devices Evid. Res. 2016, 9, 247–255.
  18. Mahmood, N.T.; Al-Muifraje, M.H.; Saeed, T.R.; Kaittan, A.H. Upper prosthetic design based on EMG: A systematic review. In Proceedings of the IOP Conference Series: Materials Science and Engineering, Duhok, Iraq, 9–10 September 2020; p. 012025.
  19. Mohebbian, M.R.; Nosouhi, M.; Fazilati, F.; Esfahani, Z.N.; Amiri, G.; Malekifar, N.; Yusefi, F.; Rastegari, M.; Marateb, H.R. A Comprehensive Review of Myoelectric Prosthesis Control. arXiv 2021, arXiv:2112.13192.
  20. Cimolato, A.; Driessen, J.J.; Mattos, L.S.; De Momi, E.; Laffranchi, M.; De Michieli, L. EMG-driven control in lower limb prostheses: A topic-based systematic review. J. NeuroEngineering Rehabil. 2022, 19, 43.
  21. Ortenzi, V.; Tarantino, S.; Castellini, C.; Cipriani, C. Ultrasound imaging for hand prosthesis control: A comparative study of features and classification methods. In Proceedings of the 2015 IEEE International Conference on Rehabilitation Robotics (ICORR), Singapore, 11–14 August 2015; pp. 1–6.
  22. Guo, J.-Y.; Zheng, Y.-P.; Kenney, L.P.; Bowen, A.; Howard, D.; Canderle, J.J. A comparative evaluation of sonomyography, electromyography, force, and wrist angle in a discrete tracking task. Ultrasound Med. Biol. 2011, 37, 884–891.
  23. Ribeiro, J.; Mota, F.; Cavalcante, T.; Nogueira, I.; Gondim, V.; Albuquerque, V.; Alexandria, A. Analysis of man-machine interfaces in upper-limb prosthesis: A review. Robotics 2019, 8, 16.
  24. Haque, M.; Promon, S.K. Neural Implants: A Review Of Current Trends And Future Perspectives. Europe PMC 2022. Preprint.
  25. Kuiken, T.A.; Dumanian, G.; Lipschutz, R.D.; Miller, L.; Stubblefield, K. The use of targeted muscle reinnervation for improved myoelectric prosthesis control in a bilateral shoulder disarticulation amputee. Prosthet. Orthot. Int. 2004, 28, 245–253.
  26. Miller, L.A.; Lipschutz, R.D.; Stubblefield, K.A.; Lock, B.A.; Huang, H.; Williams, T.W., III; Weir, R.F.; Kuiken, T.A. Control of a six degree of freedom prosthetic arm after targeted muscle reinnervation surgery. Arch. Phys. Med. Rehabil. 2008, 89, 2057–2065.
  27. Wadikar, D.; Kumari, N.; Bhat, R.; Shirodkar, V. Book recommendation platform using deep learning. Int. Res. J. Eng. Technol. IRJET 2020, 7, 6764–6770.
  28. Memberg, W.D.; Stage, T.G.; Kirsch, R.F. A fully implanted intramuscular bipolar myoelectric signal recording electrode. Neuromodulation Technol. Neural Interface 2014, 17, 794–799.
  29. Hazubski, S.; Hoppe, H.; Otte, A. Non-contact visual control of personalized hand prostheses/exoskeletons by tracking using augmented reality glasses. 3D Print. Med. 2020, 6, 6.
  30. Johansen, D.; Cipriani, C.; Popović, D.B.; Struijk, L.N. Control of a robotic hand using a tongue control system—A prosthesis application. IEEE Trans. Biomed. Eng. 2016, 63, 1368–1376.
  31. Otte, A. Invasive versus Non-Invasive Neuroprosthetics of the Upper Limb: Which Way to Go? Prosthesis 2020, 2, 237–239.
  32. Fonseca, L.; Tigra, W.; Navarro, B.; Guiraud, D.; Fattal, C.; Bó, A.; Fachin-Martins, E.; Leynaert, V.; Gélis, A.; Azevedo-Coste, C. Assisted grasping in individuals with tetraplegia: Improving control through residual muscle contraction and movement. Sensors 2019, 19, 4532.
  33. Fang, C.; He, B.; Wang, Y.; Cao, J.; Gao, S. EMG-centered multisensory based technologies for pattern recognition in rehabilitation: State of the art and challenges. Biosensors 2020, 10, 85.
  34. Briouza, S.; Gritli, H.; Khraief, N.; Belghith, S.; Singh, D. A Brief Overview on Machine Learning in Rehabilitation of the Human Arm via an Exoskeleton Robot. In Proceedings of the 2021 International Conference on Data Analytics for Business and Industry (ICDABI), Sakheer, Bahrain, 25–26 October 2021; pp. 129–134.
  35. Zhang, S.; Li, Y.; Zhang, S.; Shahabi, F.; Xia, S.; Deng, Y.; Alshurafa, N. Deep learning in human activity recognition with wearable sensors: A review on advances. Sensors 2022, 22, 1476.
  36. Zadok, D.; Salzman, O.; Wolf, A.; Bronstein, A.M. Towards Predicting Fine Finger Motions from Ultrasound Images via Kinematic Representation. arXiv 2022, arXiv:2202.05204.
  37. Yan, J.; Yang, X.; Sun, X.; Chen, Z.; Liu, H. A lightweight ultrasound probe for wearable human–machine interfaces. IEEE Sens. J. 2019, 19, 5895–5903.
  38. Begovic, H.; Zhou, G.-Q.; Li, T.; Wang, Y.; Zheng, Y.-P. Detection of the electromechanical delay and its components during voluntary isometric contraction of the quadriceps femoris muscle. Front. Physiol. 2014, 5, 494.
  39. Dieterich, A.V.; Botter, A.; Vieira, T.M.; Peolsson, A.; Petzke, F.; Davey, P.; Falla, D. Spatial variation and inconsistency between estimates of onset of muscle activation from EMG and ultrasound. Sci. Rep. 2017, 7, 42011.
  40. Wentink, E.; Schut, V.; Prinsen, E.; Rietman, J.S.; Veltink, P.H. Detection of the onset of gait initiation using kinematic sensors and EMG in transfemoral amputees. Gait Posture 2014, 39, 391–396.
  41. Lopata, R.G.; van Dijk, J.P.; Pillen, S.; Nillesen, M.M.; Maas, H.; Thijssen, J.M.; Stegeman, D.F.; de Korte, C.L. Dynamic imaging of skeletal muscle contraction in three orthogonal directions. J. Appl. Physiol. 2010, 109, 906–915.
  42. Jahanandish, M.H.; Fey, N.P.; Hoyt, K. Lower limb motion estimation using ultrasound imaging: A framework for assistive device control. IEEE J. Biomed. Health Inform. 2019, 23, 2505–2514.
  43. Froeling, M.; Nederveen, A.J.; Heijtel, D.F.; Lataster, A.; Bos, C.; Nicolay, K.; Maas, M.; Drost, M.R.; Strijkers, G.J. Diffusion-tensor MRI reveals the complex muscle architecture of the human forearm. J. Magn. Reson. Imaging 2012, 36, 237–248.
  44. Akhlaghi, N.; Dhawan, A.; Khan, A.A.; Mukherjee, B.; Diao, G.; Truong, C.; Sikdar, S. Sparsity analysis of a sonomyographic muscle–computer interface. IEEE Trans. Biomed. Eng. 2019, 67, 688–696.
  45. McIntosh, J.; Marzo, A.; Fraser, M.; Phillips, C. Echoflex: Hand gesture recognition using ultrasound imaging. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 1923–1934.
  46. Shi, J.; Guo, J.-Y.; Hu, S.-X.; Zheng, Y.-P. Recognition of finger flexion motion from ultrasound image: A feasibility study. Ultrasound Med. Biol. 2012, 38, 1695–1704.
  47. Akhlaghi, N.; Baker, C.A.; Lahlou, M.; Zafar, H.; Murthy, K.G.; Rangwala, H.S.; Kosecka, J.; Joiner, W.M.; Pancrazio, J.J.; Sikdar, S. Real-time classification of hand motions using ultrasound imaging of forearm muscles. IEEE Trans. Biomed. Eng. 2015, 63, 1687–1698.
  48. Shi, J.; Chang, Q.; Zheng, Y.-P. Feasibility of controlling prosthetic hand using sonomyography signal in real time: Preliminary study. J. Rehabil. Res. Dev. 2010, 47, 87–98.