Tactile and Force Sensors for Human–Machine Interaction: History

The Human–Machine Interface (HMI) plays a key role in the interaction between people and machines: it allows people to control machines easily and intuitively and to experience the virtual world of the metaverse immersively through virtual reality/augmented reality (VR/AR) technology. Currently, wearable skin-integrated tactile and force sensors are widely used in immersive human–machine interactions due to their ultra-thin, ultra-soft, and conformal characteristics.

  • tactile sensor
  • force sensor
  • HMI
  • VR/AR

1. Introduction

Tactile and force sensing is important for humans to understand and interact with the external world. Human skin, especially the skin of the hand, can sensitively sense pressure, strain, and bending stimuli. In order to imitate the tactile and force sensing capability of human skin and to record tactile information for practical use, flexible tactile and force sensors have been developed in the form of electronic skin [1], electronic fabric [2], smart contact lenses [3], etc. Compared to conventional bulky and rigid devices, flexible tactile and force sensors can be attached to curved and soft surfaces; thus, they are suitable for wearable electronics with high comfort and good fit [4]. Moreover, with the development of material and structural design and micro-nano processing technology, flexible tactile and force sensors have achieved higher sensitivity and shorter response times than conventional devices, and some even surpass the performance of human skin [5]. Flexible tactile and force sensors have been applied in a variety of applications, including health monitoring [6], object recognition [7], intelligent robots [8], and human–machine interaction (HMI) [9], where HMI is receiving increasing attention since it serves as a bridge connecting humans with robots, devices, or virtual avatars.
During the interaction between the user and the machine, the user first enters a signal through the tactile and force sensors; the input signal is then converted into a directive and transmitted to the machine system, and finally the machine system carries out the task corresponding to the directive [4]. The tactile and force sensors are the hardware foundation of an HMI system, since they determine the sensitivity, accuracy, and response time with which the system receives input from the user. Commonly used types of tactile and force sensors include resistive, capacitive, piezoelectric, and triboelectric sensors: resistive sensors have high sensitivity and simple readout, but their power consumption is relatively high; capacitive sensors have low power consumption but are sensitive to electromagnetic interference; piezoelectric and triboelectric sensors are self-powered; and triboelectric sensors can detect not only dynamic but also static tactile signals. Recently, strategies to improve the performance of tactile and force sensors have been proposed, including enhancement of the linear detection range, sensitivity, wearing fitness, and the capability of multi-dimensional tactile sensing, all of which have the potential to be applied in HMI applications [5][10][11][12].
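As a rough illustration of this sense–directive–action loop, the short Python sketch below threads a single (hypothetical) sensor reading through the three stages described above; the threshold, directive names, and the stand-in readout function are illustrative assumptions, not part of any cited system.

    def read_sensor():
        """Stand-in for a normalized tactile/force sensor readout (0-1)."""
        return 0.72

    def to_directive(signal, press_threshold=0.5):
        """Convert the raw input signal into a machine directive."""
        return "ACTUATE" if signal > press_threshold else "IDLE"

    def execute(directive):
        """Stand-in for the machine carrying out the task for a directive."""
        print(f"machine executes: {directive}")

    execute(to_directive(read_sensor()))  # prints: machine executes: ACTUATE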
With the aid of nanofabrication, tactile sensing mechanisms, and advanced recognition methods, novel HMIs in the form of keyboards, gears, or touchscreens [13][14][15] and HMI systems for wireless communication [16] have been developed for advanced performance and intelligent interaction. Apart from these conventional HMI forms, novel forms of HMI such as electronic skin or smart clothing have been demonstrated to meet the increasing demand for wearable HMI applications, especially for robot control. Robot control HMIs consist of multiple tactile and force sensing units that achieve multi-channel monitoring of human body motion, and the group of signals received by the sensing units is processed and recognized to generate directives that control robots [17]. With the development of the metaverse in particular, the HMI acts as a window for people to experience the virtual world, and it must be combined with a corresponding feedback system so that people can immerse themselves in virtual experiences in sports, games, and other fields. Traditional VR/AR technology mostly depends on visual and auditory perception through glasses and earphones. However, as the largest sensory organ of the human body, the skin can perceive richer feedback such as temperature, vibration, and shape. Therefore, more and more researchers are developing skin-integrated electronic sensors as the feedback system for immersive VR/AR experiences [9][18][19]. In addition, other wearable electronic device systems such as smart gloves and rings have also been developed for VR/AR applications [20]. Recently, novel strategies have been rapidly developed for advanced tactile and force sensors applicable to HMIs, and HMIs with novel structures, system designs, and scenarios have been demonstrated for robot control and VR/AR technology. Therefore, it is necessary to provide an overview and a future outlook of tactile and force sensors for HMI.

2. Tactile and Force Sensors for HMI

2.1. Resistive Tactile and Force Sensors

Resistive tactile and force sensors are widely used in human–machine interfaces for their high sensitivity and simple readout circuits, and they can be applied to sense different forms of force, including pressure and strain. Resistive tactile and force sensing is based on the change of the resistance R of the active layer, which is given by:
R = ρL/A
where ρ is the resistivity of the active material, L is the length, and A is the contact area that is continuously changed by the force loaded on the sensor. As a result, each resistance value corresponds to a force value.
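As a hedged numerical illustration of this relation, the Python sketch below assumes a contact area that grows linearly with pressure and prints the resulting resistance; the area model and all constants (a0, k, rho, length) are placeholder assumptions rather than values from a real device.

    def contact_area(pressure_pa, a0=1e-6, k=5e-11):
        """Assumed contact area (m^2) that grows linearly with applied pressure (Pa)."""
        return a0 + k * pressure_pa

    def resistance(pressure_pa, rho=1e-3, length=1e-4):
        """R = rho * L / A for the active layer at a given pressure (ohm)."""
        return rho * length / contact_area(pressure_pa)

    for p in (0, 1e3, 5e3, 1e4):  # pressures in Pa
        print(f"P = {p:>7.0f} Pa -> R = {resistance(p):.3f} ohm")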
A resistive pressure sensor has an active layer that is conductive for measurement and elastic for response to pressure, which can be made by coating a conductive nanomaterial, such as nanowires (NWs) [21], reduced graphene oxide (rGO) [22], carbon nanotubes (CNTs), graphene, or MXene [23], onto an elastic polymer or by combining the conductive material with the polymer to form a composite film [24]. When pressure is applied, the shape, contact area, and resistivity of the active layer are altered, and the resistance changes accordingly [25]. Resistive pressure sensors are used to sense and record the contact between the human body (e.g., finger or foot) and the contact surface for HMI applications [26][27].
Resistive strain sensors for flexible electronic skins consist of nanomaterials on elastic films or directly on human skin, composite polymers [28], or textiles [29]. Graphene [30], CNTs [31], and MXene [32] are widely used nanomaterials for resistive strain sensors. The resistance changes of strain sensors are mainly caused by the piezoresistive effect, geometrical change, contact area change, crack propagation, and the tunneling effect [33]. Resistive strain sensors are commonly used for monitoring skin stretching caused by joint or muscle motion when attached to human skin [17][34], or for intraocular HMIs when attached to smart contact lenses [35].

2.2. Capacitive Tactile and Force Sensors

Capacitive sensors are commonly used for tactile sensing because of their high sensitivity and low power consumption, and they can be designed to reduce interference from temperature fluctuations [36][37]. However, capacitive sensors are sensitive to external electromagnetic interference. Most flexible capacitive tactile and force sensors are parallel-plate capacitors, which consist of an elastic dielectric medium sandwiched between two electrodes [38], and the capacitance C is given by:
C = εA/d
where A is the overlap area of the two electrodes, d is the distance between them, and ε is the permittivity of the medium. The distance d decreases as the applied pressure increases; thus, the capacitance increases continuously with increasing pressure [39]. Another type of capacitive pressure sensor, the interdigital capacitive pressure sensor, has also been reported for tactile sensing [40].
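As a hedged illustration of this parallel-plate relation, the sketch below assumes the electrode gap shrinks linearly with pressure (bounded to avoid collapse) and prints the resulting capacitance; the permittivity, geometry, and compliance values are illustrative assumptions.

    EPS0 = 8.854e-12   # vacuum permittivity, F/m
    EPS_R = 3.0        # assumed relative permittivity of the elastic dielectric
    AREA = 1e-4        # assumed electrode overlap area, m^2 (1 cm^2)
    D0 = 100e-6        # assumed initial electrode gap, m

    def capacitance(pressure_pa, compliance=2e-9):
        """C = eps * A / d, with the gap d shrinking linearly with pressure (F)."""
        d = max(D0 - compliance * pressure_pa, 0.2 * D0)  # keep a minimum gap
        return EPS0 * EPS_R * AREA / d

    for p in (0, 1e3, 1e4, 4e4):  # pressures in Pa
        print(f"P = {p:>7.0f} Pa -> C = {capacitance(p) * 1e12:.1f} pF")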

2.3. Piezoelectric Tactile and Force Sensors

Piezoelectric tactile and force sensors utilize the piezoelectric effect, in which an electric field is generated by dipole separation in the piezoelectric material under the pressure applied to its surface [41]. The piezoelectric effect is described by the constitutive piezoelectric equations [42]:
S = s^E T + d^t E,  D = d T + ε^T E
where S, T, E, and D are the strain, stress, electric field, and electric displacement matrices, respectively; s^E is the compliance tensor at constant electric field; ε^T is the dielectric constant tensor at constant stress; and d is the piezoelectric constant tensor (d^t is the transpose of d). Based on the piezoelectric effect, the mechanical tactile and force signal is converted into an electric signal that is output without an external electrical energy source; thus, piezoelectric tactile and force sensors are self-powered.
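For a single normal force applied along the poling axis, the coupled equations above reduce to the familiar scalar relation Q ≈ d33·F. The sketch below uses that simplification to estimate the generated charge and open-circuit voltage; the d33 coefficient and sensor capacitance are order-of-magnitude assumptions (roughly PVDF-scale), not values from a specific device.

    D33 = 30e-12      # assumed piezoelectric coefficient, C/N (PVDF-scale magnitude)
    C_SENSOR = 1e-9   # assumed sensor capacitance, F

    def piezo_response(force_n):
        """Return (charge in C, open-circuit voltage in V) for a quasi-static normal force."""
        charge = D33 * force_n          # Q = d33 * F
        voltage = charge / C_SENSOR     # V = Q / C
        return charge, voltage

    q, v = piezo_response(5.0)  # a 5 N press
    print(f"Q = {q * 1e9:.2f} nC, V = {v * 1e3:.0f} mV")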

2.4. Triboelectric Tactile and Force Sensors

Triboelectric sensors for tactile sensing are based on triboelectric nanogenerators (TENGs), first reported in 2012 [43], whose principle is to convert irregular mechanical energy into usable electrical energy. The fundamental working mechanism of a TENG is the coupling of contact electrification and electrostatic induction during the contact and separation between two materials with different electronegativities [16]. First, a movable triboelectric layer, such as a triboelectric material or human skin, is separated from an electrode; the movable layer then contacts the electrode and causes a surface charge transfer. When the movable layer is separated again, the induced charges at the electrode decrease to balance the potential with the ground. The process is repeated, and dynamic signals are generated [44]. The materials used for triboelectric tactile sensors include polytetrafluoroethylene (PTFE), polydimethylsiloxane (PDMS), polyvinyl chloride (PVC), etc., as electron acceptor materials, and skin, polyurethane (PU), indium tin oxide (ITO), cotton, etc., as electron donor materials; the materials should be properly chosen to guarantee the correct direction of electron transfer [44][45]. Furthermore, novel materials for triboelectric tactile sensors have been demonstrated for advanced performance, including PAN@ZIF-8 nanofibers that increase the amount of charge generated during electrification [16].
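In the idealized parallel-plate model of a contact-separation TENG, the open-circuit voltage scales with the gap between the charged surfaces, V_oc(t) ≈ σ·x(t)/ε0. The sketch below evaluates that relation over one tapping cycle; the surface charge density, separation distance, and sinusoidal motion profile are illustrative assumptions.

    import math

    EPS0 = 8.854e-12   # vacuum permittivity, F/m
    SIGMA = 8e-6       # assumed triboelectric surface charge density, C/m^2
    X_MAX = 1e-3       # assumed maximum separation, m
    F_TAP = 2.0        # assumed tapping frequency, Hz

    def open_circuit_voltage(t):
        """V_oc = sigma * x(t) / eps0 for a sinusoidal contact-separation motion (V)."""
        gap = 0.5 * X_MAX * (1 - math.cos(2 * math.pi * F_TAP * t))
        return SIGMA * gap / EPS0

    for t in (0.0, 0.125, 0.25, 0.375, 0.5):  # one 2 Hz tapping period
        print(f"t = {t:.3f} s -> V_oc = {open_circuit_voltage(t):.0f} V")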

2.5. Other Types of Tactile and Force Sensors

Apart from these typical sensors, other types of tactile and force sensors have been reported for HMI. An electrical impedance tactile sensor is based on measuring the change of electrical impedance when force is applied [46][47]. An optical fiber tactile sensor is based on the change of transmittance at different light wavelengths caused by the applied force, and an optical microfiber for HMI has been developed accordingly [48]. A magnetic tactile sensor consists of an active layer in which the density and distribution of magnetic particles are changed by the applied force, and a Hall sensor captures the resulting magnetic field change [49]. Another type of magnetic tactile sensor consists of a magnetic layer that can be deformed by pressure and a giant magneto-resistive material layer that senses the deformation of the magnetic layer [50]. Moreover, different tactile sensing mechanisms can be combined for multifunctional tactile sensing; e.g., a heterogeneous tactile sensor able to sense stretching, bending, and compression individually was proposed by combining optical, microfluidic, and piezoresistive sensing mechanisms [51]. In addition to touch-based tactile interaction, proximity interaction is regarded as a generalized form of tactile sensing, and sensing mechanisms including capacitive [52], nearby charge induction [53], optical fiber [48], and magnetic field sensing [50] have been utilized to realize non-contact tactile HMIs.

3. Performance Improvement of Tactile and Force Sensor for Advanced HMI

3.1. Linear Detection Range

The linear detection range is a key specification of tactile and force sensor performance. A tactile sensor with a large linear detection range preserves high pressure resolution over the whole detection range and facilitates calibration and data processing, which is promising for advanced HMI applications [54][55]. Although linearity can be achieved by changing the tactile sensor structure, there is a trade-off between sensor linearity and sensitivity, since a structural change may increase the linear range but decrease the sensitivity [56].
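One simple way to see this trade-off numerically is to fit a straight line to a (synthetic, assumed) saturating pressure response and compare the slope and coefficient of determination R^2 over two candidate ranges, as in the sketch below; the response function and range limits are illustrative assumptions.

    def response(p_kpa):
        """Assumed saturating sensor response (arbitrary units) versus pressure (kPa)."""
        return 10.0 * p_kpa / (p_kpa + 50.0)

    def linear_fit_quality(pressures):
        """Least-squares slope and R^2 of the response over the given pressure points."""
        ys = [response(p) for p in pressures]
        n = len(pressures)
        mean_x, mean_y = sum(pressures) / n, sum(ys) / n
        sxx = sum((x - mean_x) ** 2 for x in pressures)
        sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(pressures, ys))
        slope = sxy / sxx
        intercept = mean_y - slope * mean_x
        ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(pressures, ys))
        ss_tot = sum((y - mean_y) ** 2 for y in ys)
        return slope, 1 - ss_res / ss_tot

    for upper in (10, 100):  # candidate upper limits of the "linear" range, kPa
        slope, r2 = linear_fit_quality([upper * i / 20 for i in range(1, 21)])
        print(f"0-{upper} kPa: sensitivity ~ {slope:.3f} a.u./kPa, R^2 = {r2:.4f}")

In this toy model the narrow range fits almost perfectly, while the wider range trades linearity (lower R^2) and apparent sensitivity for coverage, mirroring the trade-off discussed above.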

3.2. Detection Sensitivity

Sensitivity is an important performance parameter of tactile and force sensors for HMI; highly sensitive tactile sensors can detect slight tactile signals and offer high measurement accuracy [25]. The sensitivity of a pressure sensor is defined as the relative change of the output signal (such as resistance, capacitance, or voltage) per unit pressure. For strain sensors, the sensitivity is evaluated by the gauge factor (GF = δ(ΔR/R0)/δε), i.e., the ratio of the relative resistance change to the mechanical strain [33]. Surface microstructuring is a commonly used way to increase sensitivity, since active layers with microstructures undergo larger deformation and larger contact area changes under applied forces. Widely used surface microstructures include cylinders [57], domes [58], pyramids [59], etc. In addition to regular microstructures, irregular structures such as rough surfaces molded with abrasive paper [27] or salt and sugar [60] have been used to texture the material surface, especially for resistive tactile sensors, since a change in the applied pressure not only alters the equivalent resistance but also changes the equivalent number of resistors between the electrodes, which further enhances the sensitivity [61].
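The two sensitivity metrics mentioned above can be computed directly from calibration points, as in the hedged sketch below; the example numbers are illustrative, not taken from a specific sensor.

    def pressure_sensitivity(delta_x_over_x0, delta_p_kpa):
        """S = (relative output change) / (pressure change), in kPa^-1."""
        return delta_x_over_x0 / delta_p_kpa

    def gauge_factor(r0, r_strained, strain):
        """GF = (delta_R / R0) / strain for a resistive strain sensor (dimensionless)."""
        return ((r_strained - r0) / r0) / strain

    print(f"S  = {pressure_sensitivity(0.45, 1.0):.2f} kPa^-1")  # 45% output change per 1 kPa
    print(f"GF = {gauge_factor(1000.0, 1150.0, 0.05):.1f}")      # 15% resistance change at 5% strain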
An increase in sensitivity is not always beneficial, because for capacitive tactile sensors it may come at the cost of a reduced detection range; there is therefore a trade-off between sensitivity and detection range for capacitive tactile sensors. A strategy to increase sensitivity without reducing the detection range, called iontronic tactile sensing, was developed; it utilizes the change of the electric double layer (EDL) at the interface between the ionic material and the electrode. When pressure is loaded, the contact area changes, which alters the amount of charge induced at the interface and thereby changes the EDL capacitance.

3.3. Multi-Dimensional Sensing

In addition to single force detection such as pressure or strain sensing, tactile sensors have been developed to sense multi-dimensional forces, including pressure, shear force, strain, and torsion. Tactile sensors with a single channel responding to multi-dimensional forces have been developed [62][63][64]; although the readout circuit is simple, the force and torque vectors cannot be obtained individually from the single output signal. Electronic skins with two strain sensing units perpendicular to each other to sense two-dimensional strains have been developed [65][66][67][68], and multi-dimensional force sensors have been developed accordingly [69]. However, this type of design cannot distinguish in-plane forces with the same absolute value but opposite directions.
Various types of multi-dimensional tactile sensors able to distinguish forces with different directions and magnitudes have been proposed [70][71][72]. A widely used method is to add a bump structure on top of four tactile sensing units and measure the torsion strains to determine the planar shear forces [73]; sensors based on resistive [74], capacitive [75], piezoelectric [76], transistor [77], microfluidic [78], and conductive liquid [79] mechanisms have been demonstrated. Moreover, capacitive three-dimensional force sensors containing multiple capacitors whose lower electrodes share the same upper electrode have been developed [80]; the pressure and shear forces change the electrode distance and the equivalent overlap area of the four capacitors, respectively, and can therefore be distinguished by signal processing.
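A hedged sketch of the decoupling step for such a four-unit capacitive taxel is shown below: the common-mode capacitance change is taken as a proxy for normal pressure and the two differential changes as proxies for the shear components. The channel layout and calibration constants are assumptions for illustration only.

    def decouple(dc_left, dc_right, dc_bottom, dc_top, k_normal=1.0, k_shear=1.0):
        """Return (normal, shear_x, shear_y) in arbitrary calibrated units
        from the four relative capacitance changes dC/C0 of one taxel."""
        normal = k_normal * (dc_left + dc_right + dc_bottom + dc_top) / 4.0  # common mode
        shear_x = k_shear * (dc_right - dc_left)                             # differential, x
        shear_y = k_shear * (dc_top - dc_bottom)                             # differential, y
        return normal, shear_x, shear_y

    # A press combined with a push toward +x: the right-hand unit compresses more.
    print(tuple(round(v, 3) for v in decouple(dc_left=0.08, dc_right=0.12, dc_bottom=0.10, dc_top=0.10)))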

3.4. Wearing Fitness

With the development of wearable electronics, wearable human–machine interfaces with tactile and force sensors attached to clothing or human skin have been demonstrated [9]. Wearable human–machine interfaces should fit the surface they are attached to, preserve the real tactile sense of the user, and not disturb the user's normal life. One of the specifications of wearing fitness is flexibility. Tactile and force sensors on human skin should be flexible and mechanically matched to the skin to ensure comfortable wearing, or even be imperceptible to the user, and to adapt to the dynamic motion of the human body [81][82][83][84]. For HMIs that have single-channel tactile sensing units or multiple units distributed over many locations, flexible nanomaterials are utilized for the tactile sensing of each unit [26][34][85][86][87]. For array-based HMIs, rigid electrodes and electrodes printed on polyimide (PI) or polyethylene terephthalate (PET) films are non-stretchable and hard to fit to human skin. Therefore, stretchable electrodes, created by printing metal electrodes on patterned plastic films, have been developed and used for wearable tactile interfaces [81].

4. HMIs for Dexterous Robotic Manipulation

4.1. Multi-Channel Control

Robot technology is a crucial field in the age of intelligence, as robots can reach hazardous places where humans cannot or should not work in person. Given the variety of scenarios and HMI requirements, a diversity of multi-channel HMI solutions for robot manipulation has been proposed. A basic way to control a robot is to manipulate its motion in space. Mishra et al. proposed a convenient control HMI to manipulate the planar motion of a wheelchair by eye movement via multi-channel electrooculogram (EOG) signals [88]. To expand HMIs for robot control from two dimensions (2D) to three dimensions (3D), Xu et al. demonstrated an EOG and tactile collaborative HMI that enables 3D control [39].
The human hand possesses multiple joints and thus a large number of degrees of freedom (DOF). In order to give robots a motion ability similar to that of the human hand, multi-channel sensing systems with units distributed at multiple places on the human skin or on wearable devices have been proposed [89]. Tactile and force sensors are attached to hands or knuckles to monitor finger bending and to control a robot hand to reproduce different hand gestures or to grasp objects [51][85][86][90]. Tao et al. developed a triboelectric tactile sensor based on micro-pyramid-patterned double-network ionic organo-hydrogels [86]. The sensors were attached to knuckles, and their high sensitivity (45.97 mV/Pa) and fast response (about 20 ms) enabled them to capture the bending signals of the hand and realize robot hand control. In order to enable an HMI to sense multiple stimuli from the user, Kim et al. proposed a heterogeneous tactile sensor that could sense stretching, bending, and compression individually, and it was demonstrated that the robot successfully responded to eight different types of tactile directives with higher than 95% accuracy [51]. The human–machine interactive system is an important component of intelligent robot technology, and there is a trend toward combining tactile signals with multiple physicochemical signals in advanced HMI systems that establish closed loops of robot control and robot information feedback [17].
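As a hedged sketch of how such multi-channel finger-bending signals can be turned into robot directives, the example below thresholds five normalized channels into a binary bend pattern and looks it up in a small gesture table; the threshold, channel order, and gesture set are hypothetical and not taken from the cited works.

    BEND_THRESHOLD = 0.5   # assumed normalized signal level that counts as "bent"

    GESTURES = {           # hypothetical mapping from bend pattern to robot directive
        (0, 0, 0, 0, 0): "open_hand",
        (1, 1, 1, 1, 1): "grasp",
        (0, 1, 1, 0, 0): "pinch",
    }

    def recognize(channels):
        """Map five normalized finger-bend signals to a robot directive string."""
        pattern = tuple(int(v > BEND_THRESHOLD) for v in channels)
        return GESTURES.get(pattern, "unknown")

    print(recognize([0.1, 0.8, 0.9, 0.2, 0.1]))  # -> "pinch"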

4.2. Machine Learning-Enhanced Control

An intelligent human–robot interactive system is a synthesis of hardware sensors and software algorithms, and a suitable algorithm to analyze and recognize the signals obtained by the sensors can further enhance the power of the interactive system [91]. Many machine learning (ML) algorithms have been applied to process tactile information for object recognition [7], material sensing [92], touch modality classification [47], and HMI [17] applications because of their powerful pattern recognition capability, including convolutional neural networks (CNNs) [7], support vector machines (SVMs) [39], k-nearest neighbors (KNN) [17], etc. CNNs are good at extracting information from output signals and detecting multiple low-level features with high accuracy [91][93]. SVMs are among the most efficient machine learning algorithms commonly used for classification, and they produce a unique solution, which makes them more trustworthy across different samples than neural network algorithms [94]. KNN has the advantages of simplicity and high accuracy in hand gesture recognition [17]. Many ML algorithms can be applied to tactile signal recognition, and the most suitable algorithm for a specific scenario should be determined by testing and comparison.
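As a hedged illustration of that testing-and-comparison step, the sketch below benchmarks an SVM and a KNN classifier on synthetic "tactile feature" vectors using scikit-learn; the feature generator and class layout are stand-ins for real sensor data.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_classes, n_per_class, n_features = 4, 50, 8
    # One Gaussian cluster of feature vectors per (synthetic) gesture class.
    X = np.vstack([rng.normal(loc=c, scale=0.8, size=(n_per_class, n_features))
                   for c in range(n_classes)])
    y = np.repeat(np.arange(n_classes), n_per_class)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    for name, clf in [("SVM", SVC(kernel="rbf")),
                      ("KNN", KNeighborsClassifier(n_neighbors=5))]:
        clf.fit(X_tr, y_tr)
        print(f"{name} accuracy: {clf.score(X_te, y_te):.2f}")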
For HMI applications, ML algorithms are often introduced to recognize many kinds of human motion patterns from tactile signals obtained by tactile and force sensors, which are hard for humans to interpret directly. For example, Xu et al. proposed an EOG and tactile perception collaborative interface and applied an SVM to successfully recognize nine types of eye motion states with an accuracy of 92.6% [39]. Moreover, Hou et al. demonstrated a mouthguard integrated with an optoelectronic sensing system that can receive bite control signals [95]. An artificial neural network (ANN) algorithm was developed to extract the features of multi-channel bite signals and recognize the bite control directive from the user, and the accuracy was validated by wheelchair and virtual keyboard control tests (more than 94.2%). In order to reduce the number of readout circuits and simplify the electrode structure, Xu et al. demonstrated a handwriting panel with only one output channel [96].

5. HMIs for Virtual/Augmented Reality Applications

Virtual reality (VR) and augmented reality (AR) technologies create human experiences related to the physical world by replicating visual and auditory stimuli. Most VR and AR systems use head-mounted displays, accelerometers, and speakers as the basis for 3D computer-generated environments, which can exist independently or as an overlay on actual scenes. The eyes and ears are therefore the main channels for obtaining information in a virtual reality experience. Although the skin is the largest organ of the human body, it remains relatively underexploited in VR technology compared to the eyes and ears. Tactile interaction is now gradually receiving attention. However, current tactile devices used in VR and AR usually rely on motors to apply vibrations to the skin, and users need to wear heavy wires and batteries to achieve tactile interaction, which greatly limits applications. In order to experience virtual reality scenes immersively, more and more researchers have developed skin-integrated interfaces for feedback and interaction in virtual reality technology, which have great application value in games, sports, medicine, and other fields [18][97].
In addition to electronic skin tactile systems, other tactile interfaces such as socks, gloves, rings, and styluses are also used for AR/VR applications [20][98][99][100][101]. Lee et al. developed triboelectric smart socks for IoT-based gait analysis and established a digital human body system by mapping the physical signals collected by the socks into virtual space, which is beneficial for motion monitoring, medical care, recognition, and future smart home applications [98]. Wen et al. designed a triboelectric smart glove for sign language recognition and bidirectional communication with VR space. The recognition and communication system, composed of the smart glove, an AI block, and a back-end VR interface, can recognize words and sentences with high accuracies of 91.3% and 95% in a non-segmented framework, indicating its potential for advanced and practical sign language recognition. Furthermore, the VR platform can provide opportunities for speech/hearing-impaired people to use sign language directly to interact with people who do not use sign language [99].

This entry is adapted from the peer-reviewed paper 10.3390/s23041868

References

  1. Chortos, A.; Liu, J.; Bao, Z. Pursuing prosthetic electronic skin. Nat. Mater. 2016, 15, 937–950.
  2. Luo, Y.; Li, Y.; Sharma, P.; Shou, W.; Wu, K.; Foshey, M.; Li, B.; Palacios, T.; Torralba, A.; Matusik, W. Learning human–environment interactions using conformal tactile textiles. Nat. Electron. 2021, 4, 193–201.
  3. Kim, J.; Park, J.; Park, Y.G.; Cha, E.; Ku, M.; An, H.S.; Lee, K.P.; Huh, M.I.; Kim, J.; Kim, T.S.; et al. A soft and transparent contact lens for the wireless quantitative monitoring of intraocular pressure. Nat. Biomed. Eng. 2021, 5, 772–782.
  4. Yin, R.; Wang, D.; Zhao, S.; Lou, Z.; Shen, G. Wearable Sensors-Enabled Human–Machine Interaction Systems: From Design to Application. Adv. Funct. Mater. 2020, 31, 2008936.
  5. Duan, S.; Yang, H.; Hong, J.; Li, Y.; Lin, Y.; Zhu, D.; Lei, W.; Wu, J. A skin-beyond tactile sensor as interfaces between the prosthetics and biological systems. Nano Energy 2022, 102, 107665.
  6. Liu, Z.; Wang, G.; Ye, C.; Sun, H.; Pei, W.; Wei, C.; Dai, W.; Dou, Z.; Sun, Q.; Lin, C.T.; et al. An Ultrasensitive Contact Lens Sensor Based On Self-Assembly Graphene For Continuous Intraocular Pressure Monitoring. Adv. Funct. Mater. 2021, 31, 2010991.
  7. Sundaram, S.; Kellnhofer, P.; Li, Y.; Zhu, J.Y.; Torralba, A.; Matusik, W. Learning the signatures of the human grasp using a scalable tactile glove. Nature 2019, 569, 698–702.
  8. Boutry, C.M.; Negre, M.; Jorda, M.; Vardoulis, O.; Chortos, A.; Khatib, O.; Bao, Z. A hierarchically patterned, bioinspired e-skin able to detect the direction of applied pressure for robotics. Sci. Robot. 2018, 3, eaau6914.
  9. Liu, Y.; Yiu, C.; Song, Z.; Huang, Y.; Yao, K.; Wong, T.; Zhou, J.; Zhao, L.; Huang, X.; Nejad, S.K.; et al. Electronic skin as wireless human-machine interfaces for robotic VR. Sci. Adv. 2022, 8, eabl6700.
  10. Ma, C.; Xu, D.; Huang, Y.C.; Wang, P.; Huang, J.; Zhou, J.; Liu, W.; Li, S.T.; Huang, Y.; Duan, X. Robust Flexible Pressure Sensors Made from Conductive Micropyramids for Manipulation Tasks. ACS Nano 2020, 14, 12866–12876.
  11. Lee, J.H.; Heo, J.S.; Kim, Y.J.; Eom, J.; Jung, H.J.; Kim, J.W.; Kim, I.; Park, H.H.; Mo, H.S.; Kim, Y.H.; et al. A Behavior-Learned Cross-Reactive Sensor Matrix for Intelligent Skin Perception. Adv. Mater. 2020, 32, e2000969.
  12. Zhang, X.; Li, Z.; Du, W.; Zhao, Y.; Wang, W.; Pang, L.; Chen, L.; Yu, A.; Zhai, J. Self-powered triboelectric-mechanoluminescent electronic skin for detecting and differentiating multiple mechanical stimuli. Nano Energy 2022, 96, 107115.
  13. Wu, C.; Ding, W.; Liu, R.; Wang, J.; Wang, A.C.; Wang, J.; Li, S.; Zi, Y.; Wang, Z.L. Keystroke dynamics enabled authentication and identification using triboelectric nanogenerator array. Mater. Today 2018, 21, 216–222.
  14. Hou, C.; Geng, J.; Yang, Z.; Tang, T.; Sun, Y.; Wang, F.; Liu, H.; Chen, T.; Sun, L. A Delta-Parallel-Inspired Human Machine Interface by Using Self-Powered Triboelectric Nanogenerator Toward 3D and VR/AR Manipulations. Adv. Mater. Technol. 2020, 6, 2000912.
  15. Xiang, S.; Tang, J.; Yang, L.; Guo, Y.; Zhao, Z.; Zhang, W. Deep learning-enabled real-time personal handwriting electronic skin with dynamic thermoregulating ability. npj Flex. Electron. 2022, 6, 59.
  16. Pandey, P.; Thapa, K.; Ojha, G.P.; Seo, M.-K.; Shin, K.H.; Kim, S.-W.; Sohn, J.I. Metal-organic frameworks-based triboelectric nanogenerator powered visible light communication system for wireless human-machine interactions. Chem. Eng. J. 2023, 452, 139209.
  17. Yu, Y.; Li, J.; Solomon, S.A.; Min, J.; Tu, J.; Guo, W.; Xu, C.; Song, Y.; Gao, W. All-printed soft human-machine interface for robotic physicochemical sensing. Sci. Robot. 2022, 7, eabn0495.
  18. Jung, Y.H.; Yoo, J.-Y.; Vázquez-Guardado, A.; Kim, J.-H.; Kim, J.-T.; Luan, H.; Park, M.; Lim, J.; Shin, H.-S.; Su, C.-J.; et al. A wireless haptic interface for programmable patterns of touch across large areas of the skin. Nat. Electron. 2022, 5, 374–385.
  19. Wang, K.; Yap, L.W.; Gong, S.; Wang, R.; Cheng, W. Nanowire–ased Soft Wearable Human–Machine Interfaces for Future Virtual and Augmented Reality Applications. Adv. Funct. Mater. 2021, 31, 2008347.
  20. Sun, Z.; Zhu, M.; Shan, X.; Lee, C. Augmented tactile-perception and haptic-feedback rings as human-machine interfaces aiming for immersive interactions. Nat. Commun. 2022, 13, 5224.
  21. Du, Q.; Liu, L.; Tang, R.; Ai, J.; Wang, Z.; Fu, Q.; Li, C.; Chen, Y.; Feng, X. High-Performance Flexible Pressure Sensor Based on Controllable Hierarchical Microstructures by Laser Scribing for Wearable Electronics. Adv. Mater. Technol. 2021, 6, 2100122.
  22. Jia, J.; Huang, G.; Deng, J.; Pan, K. Skin-inspired flexible and high-sensitivity pressure sensors based on rGO films with continuous-gradient wrinkles. Nanoscale 2019, 11, 4258–4266.
  23. Wu, Z.; Wei, L.; Tang, S.; Xiong, Y.; Qin, X.; Luo, J.; Fang, J.; Wang, X. Recent Progress in Ti(3)C(2)T(x) MXene-Based Flexible Pressure Sensors. ACS Nano 2021, 15, 18880–18894.
  24. Zhao, Y.; Shen, T.; Zhang, M.; Yin, R.; Zheng, Y.; Liu, H.; Sun, H.; Liu, C.; Shen, C. Advancing the pressure sensing performance of conductive CNT/PDMS composite film by constructing a hierarchical-structured surface. Nano Mater. Sci. 2022.
  25. Pyo, S.; Lee, J.; Bae, K.; Sim, S.; Kim, J. Recent Progress in Flexible Tactile Sensors for Human-Interactive Systems: From Sensors to Advanced Applications. Adv. Mater. 2021, 33, e2005902.
  26. Li, Y.; Zhao, M.; Yan, Y.; He, L.; Wang, Y.; Xiong, Z.; Wang, S.; Bai, Y.; Sun, F.; Lu, Q.; et al. Multifunctional biomimetic tactile system via a stick-slip sensing strategy for human–machine interactions. npj Flex. Electron. 2022, 6, 46.
  27. Pang, Y.; Zhang, K.; Yang, Z.; Jiang, S.; Ju, Z.; Li, Y.; Wang, X.; Wang, D.; Jian, M.; Zhang, Y.; et al. Epidermis Microstructure Inspired Graphene Pressure Sensor with Random Distributed Spinosum for High Sensitivity and Large Linearity. ACS Nano 2018, 12, 2346–2354.
  28. Kanoun, O.; Bouhamed, A.; Ramalingame, R.; Bautista-Quijano, J.R.; Rajendran, D.; Al-Hamry, A. Review on Conductive Polymer/CNTs Nanocomposites Based Flexible and Stretchable Strain and Pressure Sensors. Sensors 2021, 21, 341.
  29. Wang, J.; Lu, C.; Zhang, K. Textile-Based Strain Sensor for Human Motion Detection. Energy Environ. Mater. 2020, 3, 80–100.
  30. Qiao, Y.; Wang, Y.; Tian, H.; Li, M.; Jian, J.; Wei, Y.; Tian, Y.; Wang, D.Y.; Pang, Y.; Geng, X.; et al. Multilayer Graphene Epidermal Electronic Skin. ACS Nano 2018, 12, 8839–8846.
  31. Wang, S.; Xiao, P.; Liang, Y.; Zhang, J.; Huang, Y.; Wu, S.; Kuo, S.-W.; Chen, T. Network cracks-based wearable strain sensors for subtle and large strain detection of human motions. J. Mater. Chem. 2018, 6, 5140–5147.
  32. Guo, Q.; Zhang, X.; Zhao, F.; Song, Q.; Su, G.; Tan, Y.; Tao, Q.; Zhou, T.; Yu, Y.; Zhou, Z.; et al. Protein-Inspired Self-Healable Ti(3)C(2) MXenes/Rubber-Based Supramolecular Elastomer for Intelligent Sensing. ACS Nano 2020, 14, 2788–2797.
  33. Amjadi, M.; Kyung, K.-U.; Park, I.; Sitti, M. Stretchable, Skin-Mountable, and Wearable Strain Sensors and Their Potential Applications: A Review. Adv. Funct. Mater. 2016, 26, 1678–1698.
  34. Wu, Y.; Karakurt, I.; Beker, L.; Kubota, Y.; Xu, R.; Ho, K.Y.; Zhao, S.; Zhong, J.; Zhang, M.; Wang, X.; et al. Piezoresistive stretchable strain sensors with human machine interface demonstrations. Sens. Actuators A Phys. 2018, 279, 46–52.
  35. Zha, X.J.; Zhang, S.T.; Pu, J.H.; Zhao, X.; Ke, K.; Bao, R.Y.; Bai, L.; Liu, Z.Y.; Yang, M.B.; Yang, W. Nanofibrillar Poly(vinyl alcohol) Ionic Organohydrogels for Smart Contact Lens and Human-Interactive Sensing. ACS Appl. Mater. Interfaces 2020, 12, 23514–23522.
  36. Zhang, C.; Gallichan, R.; Budgett, D.M.; McCormick, D. A Capacitive Pressure Sensor Interface IC with Wireless Power and Data Transfer. Micromachines 2020, 11, 897.
  37. Yang, J.C.; Kim, J.-O.; Oh, J.; Kwon, S.Y.; Sim, J.Y.; Kim, D.W.; Choi, H.B.; Park, S. Microstructured Porous Pyramid-Based Ultrahigh Sensitive Pressure Sensor Insensitive to Strain and Temperature. ACS Appl. Mater. Interfaces 2019, 11, 19472–19480.
  38. Mishra, R.B.; El-Atab, N.; Hussain, A.M.; Hussain, M.M. Recent Progress on Flexible Capacitive Pressure Sensors: From Design and Materials to Applications. Adv. Mater. Technol. 2021, 6, 2001023.
  39. Xu, J.; Li, X.; Chang, H.; Zhao, B.; Tan, X.; Yang, Y.; Tian, H.; Zhang, S.; Ren, T.L. Electrooculography and Tactile Perception Collaborative Interface for 3D Human-Machine Interaction. ACS Nano 2022, 16, 6687–6699.
  40. Malik, M.S.; Zulfiqar, M.H.; Khan, M.A.; Mehmood, M.Q.; Massoud, Y. Facile Pressure-Sensitive Capacitive Touch Keypad for a Green Intelligent Human-Machine Interface. Sensors 2022, 22, 8113.
  41. Liu, Q.; Wang, X.X.; Song, W.Z.; Qiu, H.J.; Zhang, J.; Fan, Z.; Yu, M.; Long, Y.Z. Wireless Single-Electrode Self-Powered Piezoelectric Sensor for Monitoring. ACS Appl. Mater. Interfaces 2020, 12, 8288–8295.
  42. Jung, J.; Lee, W.; Kang, W.; Shin, E.; Ryu, J.; Choi, H. Review of piezoelectric micromachined ultrasonic transducers and their applications. J. Micromech. Microeng. 2017, 27, 113001.
  43. Fan, F.-R.; Tian, Z.-Q.; Lin Wang, Z. Flexible triboelectric generator. Nano Energy 2012, 1, 328–334.
  44. Akram, W.; Chen, Q.; Xia, G.; Fang, J. A review of single electrode triboelectric nanogenerators. Nano Energy 2023, 106, 108043.
  45. Zhang, R.; Olin, H. Material choices for triboelectric nanogenerators: A critical review. EcoMat 2020, 2, e12062.
  46. Wu, Y.; Jiang, D.; Liu, X.; Bayford, R.; Demosthenous, A. A Human-Machine Interface Using Electrical Impedance Tomography for Hand Prosthesis Control. IEEE Trans. Biomed. Circuits Syst. 2018, 12, 1322–1333.
  47. Park, K.; Yuk, H.; Yang, M.; Cho, J.; Lee, H.; Kim, J. A biomimetic elastomeric robot skin using electrical impedance and acoustic tomography for tactile sensing. Sci. Robot. 2022, 7, eabm7187.
  48. Liu, H.; Song, X.; Wang, X.; Wang, S.; Yao, N.; Li, X.; Fang, W.; Tong, L.; Zhang, L. Optical Microfibers for Sensing Proximity and Contact in Human-Machine Interfaces. ACS Appl. Mater. Interfaces 2022, 14, 14447–14454.
  49. Yan, Y.; Hu, Z.; Yang, Z.; Yuan, W.; Song, C.; Pan, J.; Shen, Y. Soft magnetic skin for super-resolution tactile sensing with force self-decoupling. Sci. Robot. 2021, 6, eabc8801.
  50. Ge, J.; Wang, X.; Drack, M.; Volkov, O.; Liang, M.; Canon Bermudez, G.S.; Illing, R.; Wang, C.; Zhou, S.; Fassbender, J.; et al. A bimodal soft electronic skin for tactile and touchless interaction in real time. Nat. Commun. 2019, 10, 4405.
  51. Kim, T.; Lee, S.; Hong, T.; Shin, G.; Kim, T.; Park, Y.L. Heterogeneous sensing in a multifunctional soft sensor for human-robot interfaces. Sci. Robot. 2020, 5, eabc6878.
  52. Li, T.; Sakthivelpathi, V.; Qian, Z.; Kahng, S.J.; Ahn, S.; Dichiara, A.B.; Manohar, K.; Chung, J.H. Ultrasensitive Capacitive Sensor Composed of Nanostructured Electrodes for Human–Machine Interface. Adv. Mater. Technol. 2022, 7, 2101704.
  53. Zhu, S.; Li, Y.; Yelemulati, H.; Deng, X.; Li, Y.; Wang, J.; Li, X.; Li, G.; Gkoupidenis, P.; Tai, Y. An artificial remote tactile device with 3D depth-of-field sensation. Sci. Adv. 2022, 8, eabo5314.
  54. Ji, B.; Zhou, Q.; Lei, M.; Ding, S.; Song, Q.; Gao, Y.; Li, S.; Xu, Y.; Zhou, Y.; Zhou, B. Gradient Architecture-Enabled Capacitive Tactile Sensor with High Sensitivity and Ultrabroad Linearity Range. Small 2021, 17, e2103312.
  55. Pyo, S.; Lee, J.; Kim, W.; Jo, E.; Kim, J. Multi-Layered, Hierarchical Fabric-Based Tactile Sensors with High Sensitivity and Linearity in Ultrawide Pressure Range. Adv. Funct. Mater. 2019, 29, 1902484.
  56. Ji, B.; Zhou, Q.; Hu, B.; Zhong, J.; Zhou, J.; Zhou, B. Bio-Inspired Hybrid Dielectric for Capacitive and Triboelectric Tactile Sensors with High Sensitivity and Ultrawide Linearity Range. Adv. Mater. 2021, 33, e2100859.
  57. Yang, J.; Luo, S.; Zhou, X.; Li, J.; Fu, J.; Yang, W.; Wei, D. Flexible, Tunable, and Ultrasensitive Capacitive Pressure Sensor with Microconformal Graphene Electrodes. ACS Appl. Mater. Interfaces 2019, 11, 14997–15006.
  58. Peng, S.; Blanloeuil, P.; Wu, S.; Wang, C.H. Rational Design of Ultrasensitive Pressure Sensors by Tailoring Microscopic Features. Adv. Mater. Interfaces 2018, 5, 1800403.
  59. Ruth, S.R.A.; Beker, L.; Tran, H.; Feig, V.R.; Matsuhisa, N.; Bao, Z. Rational Design of Capacitive Pressure Sensors Based on Pyramidal Microstructures for Specialized Monitoring of Biosignals. Adv. Funct. Mater. 2019, 30, 1903100.
  60. Wang, J.; Suzuki, R.; Shao, M.; Gillot, F.; Shiratori, S. Capacitive Pressure Sensor with Wide-Range, Bendable, and High Sensitivity Based on the Bionic Komochi Konbu Structure and Cu/Ni Nanofiber Network. ACS Appl. Mater. Interfaces 2019, 11, 11928–11935.
  61. Wang, Z.; Wang, S.; Zeng, J.; Ren, X.; Chee, A.J.; Yiu, B.Y.; Chung, W.C.; Yang, Y.; Yu, A.C.; Roberts, R.C.; et al. High Sensitivity, Wearable, Piezoresistive Pressure Sensors Based on Irregular Microhump Structures and Its Applications in Body Motion Sensing. Small 2016, 12, 3827–3836.
  62. Pang, C.; Lee, G.Y.; Kim, T.I.; Kim, S.M.; Kim, H.N.; Ahn, S.H.; Suh, K.Y. A flexible and highly sensitive strain-gauge sensor using reversible interlocking of nanofibres. Nat. Mater. 2012, 11, 795–801.
  63. Choi, D.; Jang, S.; Kim, J.S.; Kim, H.-J.; Kim, D.H.; Kwon, J.-Y. A Highly Sensitive Tactile Sensor Using a Pyramid-Plug Structure for Detecting Pressure, Shear Force, and Torsion. Adv. Mater. Technol. 2019, 4, 1800284.
  64. Ji, B.; Zhou, Q.; Chen, G.; Dai, Z.; Li, S.; Xu, Y.; Gao, Y.; Wen, W.; Zhou, B. In situ assembly of a wearable capacitive sensor with a spine-shaped dielectric for shear-pressure monitoring. J. Mater. Chem. 2020, 8, 15634–15645.
  65. Ma, L.; Yang, W.; Wang, Y.; Chen, H.; Xing, Y.; Wang, J. Multi-dimensional strain sensor based on carbon nanotube film with aligned conductive networks. Compos. Sci. Technol. 2018, 165, 190–197.
  66. Sui, C.; Yang, Y.; Headrick, R.J.; Pan, Z.; Wu, J.; Zhang, J.; Jia, S.; Li, X.; Gao, W.; Dewey, O.S.; et al. Directional sensing based on flexible aligned carbon nanotube film nanocomposites. Nanoscale 2018, 10, 14938–14946.
  67. Lee, J.H.; Kim, J.; Liu, D.; Guo, F.; Shen, X.; Zheng, Q.; Jeon, S.; Kim, J.K. Highly Aligned, Anisotropic Carbon Nanofiber Films for Multidirectional Strain Sensors with Exceptional Selectivity. Adv. Funct. Mater. 2019, 29, 1901623.
  68. Mousavi, S.; Howard, D.; Zhang, F.; Leng, J.; Wang, C.H. Direct 3D Printing of Highly Anisotropic, Flexible, Constriction-Resistive Sensors for Multidirectional Proprioception in Soft Robots. ACS Appl. Mater. Interfaces 2020, 12, 15631–15643.
  69. Ma, Y.; Ouyang, J.; Raza, T.; Li, P.; Jian, A.; Li, Z.; Liu, H.; Chen, M.; Zhang, X.; Qu, L.; et al. Flexible all-textile dual tactile-tension sensors for monitoring athletic motion during taekwondo. Nano Energy 2021, 85, 105941.
  70. Lee, J.I.; Pyo, S.; Kim, M.O.; Kim, J. Multidirectional flexible force sensors based on confined, self-adjusting carbon nanotube arrays. Nanotechnology 2018, 29, 055501.
  71. Won, S.M.; Wang, H.; Kim, B.H.; Lee, K.; Jang, H.; Kwon, K.; Han, M.; Crawford, K.E.; Li, H.; Lee, Y.; et al. Multimodal Sensing with a Three-Dimensional Piezoresistive Structure. ACS Nano 2019, 13, 10972–10979.
  72. Choi, E.; Hwang, S.; Yoon, Y.; Seo, H.; Lee, J.; Yeom, S.; Ryu, G.; Yang, H.; Kim, S.; Sul, O.; et al. Highly Sensitive Tactile Shear Sensor Using Spatially Digitized Contact Electrodes. Sensors 2019, 19, 1300.
  73. Sun, X.; Sun, J.; Li, T.; Zheng, S.; Wang, C.; Tan, W.; Zhang, J.; Liu, C.; Ma, T.; Qi, Z.; et al. Flexible Tactile Electronic Skin Sensor with 3D Force Detection Based on Porous CNTs/PDMS Nanocomposites. Nano-Micro Lett. 2019, 11, 57.
  74. Zhang, W.; Xi, Y.; Wang, E.; Qu, X.; Yang, Y.; Fan, Y.; Shi, B.; Li, Z. Self-Powered Force Sensors for Multidimensional Tactile Sensing. ACS Appl. Mater. Interfaces 2022, 14, 20122–20131.
  75. Cheng, M.Y.; Lin, C.L.; Lai, Y.T.; Yang, Y.J. A polymer-based capacitive sensing array for normal and shear force measurement. Sensors 2010, 10, 10211–10225.
  76. Yuan, F.; Wang, W.; Liu, S.; Zhou, J.; Wang, S.; Wang, Y.; Deng, H.; Xuan, S.; Gong, X. A self-powered three-dimensional integrated e-skin for multiple stimuli recognition. Chem. Eng. J. 2023, 451, 138522.
  77. Oh, H.; Yi, G.C.; Yip, M.; Dayeh, S.A. Scalable tactile sensor arrays on flexible substrates with high spatiotemporal resolution enabling slip and grip for closed-loop robotics. Sci. Adv. 2020, 6, eabd7795.
  78. Nie, B.; Li, R.; Brandt, J.D.; Pan, T. Microfluidic tactile sensors for three-dimensional contact force measurements. Lab Chip 2014, 14, 4344–4353.
  79. Noda, K.; Matsumoto, K.; Shimoyama, I. Stretchable tri-axis force sensor using conductive liquid. Sens. Actuators A Phys. 2014, 215, 123–129.
  80. Viry, L.; Levi, A.; Totaro, M.; Mondini, A.; Mattoli, V.; Mazzolai, B.; Beccai, L. Flexible three-axial force sensor for soft and highly sensitive artificial touch. Adv. Mater. 2014, 26, 2659–2664.
  81. Sim, K.; Rao, Z.; Zou, Z.; Ershad, F.; Lei, J.; Thukral, A.; Chen, J.; Huang, Q.A.; Xiao, J.; Yu, C. Metal oxide semiconductor nanomembrane-based soft unnoticeable multifunctional electronics for wearable human-machine interfaces. Sci. Adv. 2019, 5, eaav9653.
  82. Huang, S.; Liu, Y.; Zhao, Y.; Ren, Z.; Guo, C.F. Flexible Electronics: Stretchable Electrodes and Their Future. Adv. Funct. Mater. 2018, 29, 1805924.
  83. Low, Z.W.K.; Li, Z.; Owh, C.; Chee, P.L.; Ye, E.; Kai, D.; Yang, D.P.; Loh, X.J. Using Artificial Skin Devices as Skin Replacements: Insights into Superficial Treatment. Small 2019, 15, e1805453.
  84. Jeong, J.W.; Yeo, W.H.; Akhtar, A.; Norton, J.J.; Kwack, Y.J.; Li, S.; Jung, S.Y.; Su, Y.; Lee, W.; Xia, J.; et al. Materials and optimized designs for human-machine interfaces via epidermal electronics. Adv. Mater. 2013, 25, 6839–6846.
  85. Zhou, J.; Long, X.; Huang, J.; Jiang, C.; Zhuo, F.; Guo, C.; Li, H.; Fu, Y.; Duan, H. Multiscale and hierarchical wrinkle enhanced graphene/Ecoflex sensors integrated with human-machine interfaces and cloud-platform. npj Flex. Electron. 2022, 6, 55.
  86. Tao, K.; Chen, Z.; Yu, J.; Zeng, H.; Wu, J.; Wu, Z.; Jia, Q.; Li, P.; Fu, Y.; Chang, H.; et al. Ultra-Sensitive, Deformable, and Transparent Triboelectric Tactile Sensor Based on Micro-Pyramid Patterned Ionic Hydrogel for Interactive Human–Machine Interfaces. Adv. Sci. 2022, 9, 2104168.
  87. He, T.; Sun, Z.; Shi, Q.; Zhu, M.; Anaya, D.V.; Xu, M.; Chen, T.; Yuce, M.R.; Thean, A.V.-Y.; Lee, C. Self-powered glove-based intuitive interface for diversified control applications in real/cyber space. Nano Energy 2019, 58, 641–651.
  88. Mishra, S.; Norton, J.J.S.; Lee, Y.; Lee, D.S.; Agee, N.; Chen, Y.; Chun, Y.; Yeo, W.H. Soft, conformal bioelectronics for a wireless human-wheelchair interface. Biosens. Bioelectron. 2017, 91, 796–803.
  89. Li, Z.; Guo, W.; Huang, Y.; Zhu, K.; Yi, H.; Wu, H. On-skin graphene electrodes for large area electrophysiological monitoring and human-machine interfaces. Carbon 2020, 164, 164–170.
  90. Jin, T.; Sun, Z.; Li, L.; Zhang, Q.; Zhu, M.; Zhang, Z.; Yuan, G.; Chen, T.; Tian, Y.; Hou, X.; et al. Triboelectric nanogenerator sensors for soft robotics aiming at digital twin applications. Nat. Commun. 2020, 11, 5381.
  91. Wang, M.; Wang, T.; Luo, Y.; He, K.; Pan, L.; Li, Z.; Cui, Z.; Liu, Z.; Tu, J.; Chen, X. Fusing Stretchable Sensing Technology with Machine Learning for Human–Machine Interfaces. Adv. Funct. Mater. 2021, 31, 2008807.
  92. Pang, Y.; Xu, X.; Chen, S.; Fang, Y.; Shi, X.; Deng, Y.; Wang, Z.-L.; Cao, C. Skin-inspired textile-based tactile sensors enable multifunctional sensing of wearables and soft robots. Nano Energy 2022, 96, 107137.
  93. Mahmood, M.; Mzurikwao, D.; Kim, Y.-S.; Lee, Y.; Mishra, S.; Herbert, R.; Duarte, A.; Ang, C.S.; Yeo, W.-H. Fully portable and wireless universal brain–machine interfaces enabled by flexible scalp electronics and deep learning algorithm. Nat. Mach. Intell. 2019, 1, 412–422.
  94. Karamizadeh, S.; Abdullah, S.M.; Halimi, M.; Shayan, J.; Rajabi, M.j. Advantage and drawback of support vector machine functionality. In Proceedings of the 2014 International Conference on Computer, Communications, and Control Technology (I4CT), Langkawi, Malaysia, 2–4 September 2014; pp. 63–65.
  95. Hou, B.; Yi, L.; Li, C.; Zhao, H.; Zhang, R.; Zhou, B.; Liu, X. An interactive mouthguard based on mechanoluminescence-powered optical fibre sensors for bite-controlled device operation. Nat. Electron. 2022, 5, 682–693.
  96. Xu, W.; Liu, S.; Yang, J.; Meng, Y.; Liu, S.; Chen, G.; Jia, L.; Li, X. Self-powered flexible handwriting input panel with 1D output enabled by convolutional neural network. Nano Energy 2022, 101, 107557.
  97. Yu, X.; Xie, Z.; Yu, Y.; Lee, J.; Vazquez-Guardado, A.; Luan, H.; Ruban, J.; Ning, X.; Akhtar, A.; Li, D.; et al. Skin-integrated wireless haptic interfaces for virtual and augmented reality. Nature 2019, 575, 473–479.
  98. Zhang, Z.; He, T.; Zhu, M.; Sun, Z.; Shi, Q.; Zhu, J.; Dong, B.; Yuce, M.R.; Lee, C. Deep learning-enabled triboelectric smart socks for IoT-based gait analysis and VR applications. npj Flex. Electron. 2020, 4, 29.
  99. Wen, F.; Zhang, Z.; He, T.; Lee, C. AI enabled sign language recognition and VR space bidirectional communication using triboelectric smart glove. Nat. Commun. 2021, 12, 5378.
  100. Zhu, M.; Sun, Z.; Zhang, Z.; Shi, Q.; He, T.; Liu, H.; Chen, T.; Lee, C. Haptic-feedback smart glove as a creative human-machine interface (HMI) for virtual/augmented reality applications. Sci. Adv. 2020, 6, eaaz8693.
  101. Kim, G.; Hwang, D.; Park, J. Effect of 2.5D haptic feedback on virtual object perception via a stylus. Sci. Rep. 2021, 11, 18954.