Five Senses plus One of Robotics

Robots can be equipped with a range of senses that allow them to perceive and interact with the world in a more natural and intuitive way. These senses can include vision, hearing, touch, smell, and taste. Vision allows a robot to see, recognize objects, and navigate its environment. Hearing enables it to recognize sounds and respond to vocal commands. Touch lets it perceive the texture, shape, and temperature of objects. Smell enables it to recognize and classify different odors, and taste enables it to identify the chemical composition of materials. The specific senses used in a robot depend on the needs of the application, and many robots combine several senses to perceive and interact with the environment.

  • robotic sensing
  • vision
  • hearing
  • tactile
  • electronic nose
  • electronic tongue
  • Sixth Sense

1. Introduction

Robotics is an interdisciplinary field of computer science and engineering that is rapidly advancing and transforming the world. The field of robotics aims to design machines that can help and assist humans in various tasks. This field integrates knowledge and expertise in mechanical engineering, electrical engineering, information engineering, mechatronics, electronics, bioengineering, computer engineering, control engineering, software engineering, mathematics, and more. Our senses give us the power to explore the world around us! With our five senses—sight, hearing, touch, smell, and taste—we can perceive the world and its changes. Sensors are the devices that help robots do the same. To make robots even more effective, engineers have been exploring ways to give them sensory abilities, such as odor-sensing, vision, tactile sensing, hearing, and taste. In addition to the traditional five senses, some researchers are exploring the idea of a “Sixth Sense” for robots. Have you ever wondered how robots can see, hear, smell, taste, and touch?
Robots can sense, plan, and act. They are equipped with sensors that go beyond human capabilities: from exploring the surface of Mars to lightning-fast global deliveries, robots can do things humans can only dream of. When designing and building robots, engineers often use animal and human models to help decide which sensors they need. For instance, bats can serve as a model for sound-detecting robots, ants as a model for odor tracking, and bees as a model for pheromone-based signaling. Human touch helps us to sense various features of our environment, such as texture, temperature, and pressure. Similarly, tactile sensors in robots can detect these qualities and more. For instance, the Roomba robot vacuum cleaner uses sensors to detect objects through contact [7][1]. However, as with sight and sound, a robot may not always know the precise nature of what it touches (a bag, a soft cake, or a hug from a friend); it just knows that there is an obstacle to be avoided or found. Tactile sensing is a crucial element of intelligent robotic manipulation, as it allows robots to interact with physical objects in ways that other sensors cannot [8][2].
Robots are increasingly being used in industrial, military, and healthcare applications. One of their most important features is the ability to detect and respond to environmental changes, and odor-sensing technology is a key component of this capability. A survey presented in [9][3] evaluates various techniques that are available for detecting chemicals and how they can be used to control the motion of a robot; it also discusses the importance of controlling and measuring airflow close to the sensor in order to infer useful information from readings of chemical concentration. Robot vision is an emerging technology that uses cameras and sensors to allow robots to interpret and respond to their environment, with numerous applications in the medical, industrial, and entertainment fields. It requires artificial intelligence (AI) techniques to produce devices that can interact with the physical world, and the accuracy of these devices depends on the vision techniques used. A survey in [10][4] summarizes data processing and domain-based data processing, evaluating various robot vision techniques, tools, and methodologies.
Robots can also hear. The sound waves heard by human ears can be detected by robot sensors such as microphones, and other robot sensors can detect waves beyond our capabilities, such as ultrasound. Cloud-based speech recognition systems use AI to interpret a user’s voice and convert it into text or commands; they enable robots to interact with humans in a more natural way, automate certain tasks, and are hosted on the cloud for increased reliability and cost-effectiveness [11][5].
The sense of taste is the most challenging sense to replicate in robots. A great deal of research has been conducted on this subject, but a definitive solution has not yet been reached. The human tongue, despite its small size, is highly complex, with different parts responsible for perceiving different flavors (bitter, sour, and salty), which adds to the difficulty of reproducing it electronically. Nevertheless, robots can now have a sense of taste: an electronic tongue can be programmed to detect flavors and distinguish between different tastes.
This is used in the food industry to ensure that food products meet the required quality standards [13][6]. An electronic tongue consists of a sensor array composed of several types of sensors, each sensitive to a different taste. By analyzing the output of these sensors, the electronic tongue can detect and differentiate between various tastes and flavors; it can also measure the concentration of a specific substance in a sample, as well as its bitterness and sweetness. The Sixth Sense is a new technology that can help bridge the gap between humans and machines. It uses advanced artificial intelligence to recognize and respond to the user’s environment and surroundings. This technology can be used to create a more personal and interactive experience with machines, making them more human-like and improving the overall user experience. Its potential applications are wide-ranging, and it may change how humans interact with machines and technology [14][7]. In that work, the researchers developed a gesture-controlled robot with an Arduino microcontroller and a smartphone; it uses a combination of hand gestures and voice commands to allow for a more intuitive way of controlling robots. With this technology, robots can be given complex commands with a few simple gestures.

2. Vision

Robot vision is a game-changer for automated processes. By giving robots the ability to see, researchers have unlocked a whole new level of precision and accuracy in smart automation. Robotic vision works similarly to human vision: it captures valuable data from 3D images and applies it to the robot’s programming algorithm. With this, robots can identify colors, find parts, detect people, check quality, process information, read text, and much more [18][8]. Robot vision also helps simplify complex and expensive fixtures, giving robots the power to find objects in their working envelope and adapt to variations in part size, shape, and location. All of this ultimately reduces costs and improves system efficiency [19][9]. Robot vision and machine vision are two related but distinct fields: robot vision is a subset of machine vision that focuses specifically on the use of computer vision techniques for robotic applications, whereas machine vision is a broader field that encompasses a range of technologies and techniques for extracting information from visual inputs, including still images and videos [20][10]. The major components of a machine vision system include lighting, a lens, an image sensor, vision processing, and communications. Lighting illuminates the part to be inspected, allowing its features to stand out so they can be clearly seen by the camera. The lens captures the image and presents it to the sensor in the form of light. The sensor in a machine vision camera converts this light into a digital image, which is then sent to the processor for analysis [21][11].
  • Lighting: Proper lighting is vital to the success of robotic vision systems. Poor lighting can cause an image to be undetectable to a robot, resulting in inefficient operation and loss of information.
  • Lenses: The lens of a vision system directs the light in order to capture an image.
  • Image sensor: Image sensors convert the light captured by the lens into a digital image for the robot to analyze.
  • Vision processing: Vision processing is how robotic vision systems extract data from an image, which the robot analyzes to determine the best course of action for operation.
  • Communications: Communications connect and coordinate all robotic vision components, allowing them to interact quickly and effectively for a successful system.
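As a minimal illustration of how these stages fit together in software, the sketch below simulates the digitized sensor output with NumPy and runs a toy vision-processing step with OpenCV; the frame contents and threshold are invented placeholders, not values from any particular system.

```python
# Minimal machine-vision sketch (lighting -> lens -> sensor -> processing
# -> communications), assuming OpenCV and NumPy are available.
import numpy as np
import cv2

# Simulated sensor output: a dark scene with one brightly lit "part".
frame = np.zeros((240, 320), dtype=np.uint8)
cv2.rectangle(frame, (140, 100), (180, 140), 255, -1)  # the illuminated part

# Vision processing: segment the part from the background ...
_, mask = cv2.threshold(frame, 128, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# ... and report its location so the robot controller can act on it.
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    print(f"part found at ({x + w // 2}, {y + h // 2}), size {w}x{h}")
```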

3. Tactile Sense

Tactile sense, also known as the sense of touch, allows humans and animals to perceive and interpret information about the texture, shape, and temperature of objects through physical contact. In robotics, the tactile sense is often simulated using sensors placed on the surface of a robot’s skin or limbs [45][15]. These sensors can detect pressure, temperature, and other physical sensations and send this information to the robot’s control system. The control system can then use this information to decide how to interact with the environment and how to move the robot’s body. Some robots also use haptic feedback, which allows them to transmit a sense of touch to the user by vibrating or applying pressure to the skin [46][16]. A typical tactile pipeline proceeds as follows:
  • Sensing: The robot uses sensors to detect physical sensations, such as pressure, temperature, and force. These sensors may be mounted on the surface of the robot’s skin or limbs, and they may be connected to the control system through wires or wireless signals.
  • Data collection: The sensor data are collected by the control system and stored in a buffer or memory [22][12].
  • Data processing: The control system processes the sensor data using algorithms that interpret the data and provide the robot with a sense of touch. This may involve filtering the data to remove noise or errors and applying algorithms to extract information about the shape, size, and texture of objects in the environment.
  • Decision-making: The control system uses the processed sensor data to make decisions about how to interact with the environment and how to move the robot’s body. This may involve adjusting the robot’s grip on an object, avoiding collisions, or navigating around obstacles.
  • Actuation: The control system sends commands to the robot’s actuators, which are responsible for moving the robot’s body. The actuators may be motors, servos, or other types of mechanical devices, and they use the commands from the control system to move the robot’s limbs and other body parts [47][17].
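The sketch below ties these five steps together in a toy grip-force loop; the sensor model, target force, and gain are hypothetical placeholders, since a real robot would read from and drive actual hardware.

```python
# Minimal sensing -> collection -> processing -> decision -> actuation loop
# for a tactile gripper; all hardware interfaces are simulated stand-ins.
from collections import deque

TARGET_FORCE_N = 2.0   # assumed desired grip force
GAIN = 0.1             # assumed proportional gain

def read_pressure_sensor(step):
    """Hypothetical tactile sensor: force rises as the gripper closes."""
    return min(0.5 * step, 3.0)

def set_gripper(delta):
    """Hypothetical actuator command: positive closes, negative opens."""
    print(f"gripper command: {delta:+.3f}")

buffer = deque(maxlen=5)                 # data collection: short history
for step in range(10):
    force = read_pressure_sensor(step)    # sensing
    buffer.append(force)
    smoothed = sum(buffer) / len(buffer)  # processing: simple noise filter
    error = TARGET_FORCE_N - smoothed     # decision-making
    set_gripper(GAIN * error)             # actuation
```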

4. Hearing Sense

There are several ways that robots can “hear”. One common method is to use microphones or other sensors that detect sound waves and convert them into electrical signals. These signals can then be processed by the robot’s computer system and used to understand spoken commands or other sounds in the environment. Another method some robots use is to detect vibrations in the environment with lasers or other types of sensors; the vibrations caused by sound waves can likewise be interpreted as spoken commands or other sounds. Overall, the ability of robots to “hear” depends on the specific sensors and technology used, as well as the programming and algorithms in place to interpret and understand the signals being received [31][13]. A robotic hearing system, also known as an auditory system, is a sensor system that allows a robot to detect and interpret sound waves. Its main components include [32][14]:
  • Microphones or other sound sensors: These devices detect sound waves and convert them into electrical signals. Many different types of microphones and sound sensors can be used, including those that use diaphragms, piezoelectric crystals, or lasers to detect vibrations.
  • Amplifiers: These electronic devices amplify the electrical signals generated by the microphones or sound sensors, improving the sensitivity and accuracy of the hearing system.
  • Analog-to-digital converters (ADCs): These devices convert the analog electrical signals from the microphones or sound sensors into digital data that can be processed by the robot’s computer system.
  • Computer system: This is the central processing unit of the robot, responsible for controlling its various functions and sensors. It processes the digital data from the ADCs to interpret and understand spoken commands or other sounds in the environment.
  • Algorithms and software: These are the instructions and programs the computer system uses to analyze and interpret the digital data. They may be designed to recognize specific words or sounds, or to understand and respond to more complex spoken commands.
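As a rough illustration of the chain from sound sensor to software, the sketch below simulates digitized microphone samples with NumPy and flags high-energy frames, the crudest possible sound detector; the sampling rate, signal, and threshold are invented for illustration.

```python
# Minimal microphone -> amplifier -> ADC -> processing sketch; the analog
# stages are simulated, where a real system would read an audio driver.
import numpy as np

RATE = 16000                      # assumed ADC sampling rate (Hz)
t = np.arange(RATE) / RATE

# Simulated ADC output: background noise with a 440 Hz "voice" burst.
signal = 0.01 * np.random.randn(RATE)
signal[4000:8000] += 0.5 * np.sin(2 * np.pi * 440 * t[4000:8000])

# Processing: slice into 25 ms frames and flag frames whose energy
# stands well above the noise floor.
frame_len = RATE // 40
frames = signal[: len(signal) // frame_len * frame_len].reshape(-1, frame_len)
energy = (frames ** 2).mean(axis=1)
active = energy > 10 * np.median(energy)
print(f"sound detected in {active.sum()} of {len(frames)} frames")
```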

5. Electronic Nose

An electronic nose is a technology that emulates the architecture and function of the biological olfactory system in order to recognize complex, volatile molecules [60][18]. Olfactory sensing is gaining significant importance in domains such as the food industry, environmental monitoring, and medical treatment. Electronic noses have considerable application prospects in the identification of scents, such as those of wines, vegetables, and cigarettes, and are broadly employed in odor sensing, raw-material inspection, quality marking, and process administration, making them essential tools for quality assurance and quality control [61][19]. In fruit and vegetable testing, the electronic nose is mainly employed to evaluate quality, detect maturity, and identify species. In medical diagnostics, it is a cutting-edge tool for the early detection of diseases: by collecting only a small volume of human breath, an electronic nose can capture the various scents within it using a bioreceptor, which then forms a processed chemical map [62][20]. Sensors based on various chemical detection principles have been used in electronic noses for clinical disease diagnosis. An electronic nose typically consists of the following components:
  • Sensors: These are the components that detect the chemical compounds present in the sample being analyzed [63][21].
  • Data acquisition system: This component is responsible for collecting and storing the data from the sensors [64][22].
  • Data analysis system: This component is responsible for analyzing the data from the sensors and determining the specific chemicals present in the sample [65][23].
  • Display or output device: This component is used to present the results of the analysis to the user [66][24].
  • Sample introduction system: This component is responsible for introducing the sample to be analyzed into the electronic nose [67][25].
  • Power supply: This component provides the electrical power required to operate the electronic nose.
  • Housing or enclosure: This component encloses and protects the other components of the electronic nose [66][24].
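The sketch below illustrates, under invented calibration data, how the data-analysis stage can map a sensor-array reading to an odor class with a nearest-centroid rule; real systems are calibrated against reference samples and use more sophisticated pattern recognition.

```python
# Toy e-nose data-analysis stage: match a 4-sensor reading to the nearest
# calibrated odor signature. Responses and labels are invented.
import numpy as np

# Hypothetical calibration data: mean response of the array per odor.
centroids = {
    "ripe fruit":    np.array([0.80, 0.20, 0.50, 0.10]),
    "spoiled fruit": np.array([0.30, 0.90, 0.70, 0.60]),
    "clean air":     np.array([0.05, 0.05, 0.10, 0.05]),
}

def classify(reading):
    """Nearest-centroid match in sensor space."""
    return min(centroids, key=lambda k: np.linalg.norm(reading - centroids[k]))

sample = np.array([0.75, 0.25, 0.45, 0.15])  # one sample-introduction cycle
print("detected odor:", classify(sample))     # display/output stage
```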

6. Electronic Tongue

An electronic tongue is a device that can mimic the ability of a human tongue to taste and distinguish different flavors. It typically consists of a series of sensors that can detect and measure various chemical properties. Electronic tongues are used in a variety of applications, including quality control in the food and beverage industry, monitoring the purity of water and other liquids, and the development of new flavors for products [78][26]. There are a wide variety of materials that can be used for sensing in electronic tongues. The specific materials used depend on the type of flavor or chemical property being detected, as well as the specific design and requirements of the electronic tongue [79][27].
  • Conductive polymers: These are special polymers that are highly conductive and can be used to detect changes in conductivity, which can be indicative of certain flavors or chemical properties [80][28].
  • Ion-selective electrodes: These are electrodes that are selectively sensitive to particular types of ions, such as sodium or potassium. They can be used to detect changes in the concentration of these ions, which can be indicative of certain flavors or chemical properties [81][29].
  • Optical fibers: These are fibers made of special materials that can transmit light over long distances. They can be used to detect changes in the refractive index or other optical properties of a substance, which can be indicative of certain flavors or chemical properties [82][30].
  • Piezoelectric materials: These are materials that produce an electrical charge when subjected to mechanical stress or strain. They can be used to detect changes in the mechanical properties of a substance, which can be indicative of certain flavors or chemical properties [83][31].
  • Surface acoustic wave devices: These are devices that use sound waves to detect changes in the properties of a substance. They can be used to detect changes in the viscosity, density, or other properties of a substance, which can be indicative of certain flavors or chemical properties [84][32].
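As a toy illustration of how such a sensor array can be interpreted, the sketch below assumes an idealized linear response for three hypothetical channels (conductivity, ion-selective, optical) and inverts it to estimate taste levels; the sensitivity values are invented, whereas a real device is calibrated against known taste standards.

```python
# Toy e-tongue analysis: invert an assumed linear sensor model to recover
# taste concentrations from raw array readings. All numbers are invented.
import numpy as np

# Assumed response model: reading = S @ concentrations, with columns
# corresponding to (sweet, salty, bitter).
S = np.array([[0.9, 0.1, 0.2],
              [0.1, 1.2, 0.1],
              [0.2, 0.1, 0.8]])

reading = np.array([0.47, 0.65, 0.33])  # raw output of the three sensors

# Data analysis: solve the calibrated model for the taste levels.
sweet, salty, bitter = np.linalg.solve(S, reading)
print(f"sweet={sweet:.2f}, salty={salty:.2f}, bitter={bitter:.2f}")
```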

7. Sixth Sense

A Sixth Sense could refer to a range of different things, depending on the context in which it is used. Some people might use the term to describe a heightened intuition or a sense of awareness that goes beyond the five physical senses of sight, hearing, taste, touch, and smell. In the context of robotics, a Sixth Sense might refer to a system or ability that allows a robot to perceive and interact with its environment in a way that goes beyond its basic sensors and actuators. This could include the ability to sense and respond to temperature, pressure, or other physical phenomena, or the ability to process and understand complex visual or auditory information [87][33].

7.1. The Difference between the Sixth Sense and the Other Five Senses in Robots

The idea of a Sixth Sense is often used colloquially to refer to a supposed extra sensory ability beyond the five traditional senses (sight, hearing, taste, touch, and smell). While there is no scientific evidence to support the existence of a distinct Sixth Sense, there are certainly other senses and abilities that go beyond the traditional five, such as proprioception (the ability to sense the position and movement of one’s own body) and interoception (the ability to perceive internal bodily sensations). It is true that many of these senses and abilities involve similar sensory components as the traditional senses (such as receptors in the skin for touch or the inner ear for balance) [88][34]. However, the ultimate difference lies in the way the brain processes and integrates these various sensory inputs, as well as the conscious experiences and interpretations that result. For example, while vision and hearing rely on distinct sensory organs and pathways in the brain, they work together to create a cohesive and multi-dimensional perception of the world [89][35]. Similarly, proprioception and interoception involve a complex interplay of sensory inputs and motor outputs, allowing us to navigate and interact with our environment in a seamless and coordinated way. Ultimately, the idea of a Sixth Sense may be more of a metaphorical concept than a literal one, representing the idea that our sensory experiences and abilities go beyond a simple sum of their parts, and are shaped by complex interactions between our brains, bodies, and the world around us.

7.2. Components of a Sixth Sense System

A Sixth Sense robot would likely include the following components [90][36]:
  • Sensors: These would include visual sensors (cameras), auditory sensors (microphones), tactile sensors for detecting touch, gustatory sensors for detecting taste, and olfactory sensors for detecting smell.
  • Manipulators: These would include arms, hands, or other devices that allow the robot to interact with its environment, such as picking up objects or manipulating tools.
  • Processor: This would be the central "brain" of the robot, responsible for processing the data from the sensors, executing commands, and controlling the manipulators.
  • Power supply: This would provide the electrical power required to operate the robot.
  • Housing or enclosure: This would enclose and protect the other components of the robot.
  • Machine learning system: This would allow the robot to learn and adapt to new situations and environments, using techniques such as artificial neural networks and other machine learning algorithms.
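A minimal software sketch of how these components might be wired together is shown below; every class and callable is a hypothetical placeholder rather than a real robotics API.

```python
# Hypothetical skeleton tying together sensors, processor, learning
# system, and manipulators for a Sixth Sense robot.
class SixthSenseRobot:
    def __init__(self, sensors, manipulators, model):
        self.sensors = sensors            # cameras, microphones, tactile, ...
        self.manipulators = manipulators  # arms, grippers, ...
        self.model = model                # machine learning system
        self.buffer = []                  # data-collection memory

    def step(self):
        readings = {name: s() for name, s in self.sensors.items()}
        self.buffer.append(readings)      # collect raw data
        action = self.model(readings)     # processor + learning system
        for m in self.manipulators:
            m(action)                     # drive the manipulators

robot = SixthSenseRobot(
    sensors={"camera": lambda: "frame", "microphone": lambda: "audio"},
    manipulators=[lambda a: print("arm executes:", a)],
    model=lambda readings: f"react to {sorted(readings)}",
)
robot.step()
```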

7.3. Functionality of a Sixth Sense System

In terms of functionality, a Sixth Sense robot might be able to [91][37]:
  • Navigate its environment using visual and/or other sensors to avoid obstacles and locate objects or destinations.
  • Identify and classify objects and other beings using visual and/or other sensors, and possibly use machine learning algorithms to improve its ability to recognize and classify new objects and beings.
  • Interact with objects and other beings using its manipulators, and possibly using force sensors and other sensors to gauge the appropriate amount of force to apply.
  • Communicate with other beings using various modalities such as speech, gestures, and facial expressions.
  • Learn and adapt to new situations and environments using machine learning algorithms and other techniques to improve its performance over time.
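One simple way to arbitrate among such capabilities is a priority-ordered policy, sketched below with hypothetical percept flags; a real system would derive these flags from its sensors and learned models.

```python
# Toy priority-based arbitration among the capabilities listed above.
def choose_action(percepts):
    """Pick the highest-priority behavior supported by current percepts."""
    if percepts.get("obstacle_close"):      # navigation and safety first
        return "avoid obstacle"
    if percepts.get("human_speaking"):      # communication
        return "listen and respond"
    if percepts.get("object_recognized"):   # identification -> interaction
        return "grasp object"
    return "explore"                        # default: gather more data

print(choose_action({"human_speaking": True}))
print(choose_action({"obstacle_close": True, "human_speaking": True}))
```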

7.4. Types of Sixth Sense Sensors

There are many different types of sensors that could potentially be used to enable a Sixth Sense in a robot. Some examples of sensors that might be used for this purpose include:
  • Temperature sensors: These sensors can detect changes in temperature and could be used to enable a robot to sense and respond to changes in its environment [92][38].
  • Pressure sensors: These sensors can detect changes in pressure and could be used to enable a robot to sense and respond to changes in its environment, such as changes in the amount of force being applied to it [93][39].
  • Humidity sensors: These sensors can detect changes in humidity and could be used to enable a robot to sense and respond to changes in its environment [98][44].
  • Cameras: These sensors can capture images and video and could be used to enable a robot to perceive and understand its environment in a more sophisticated way [94][40].
  • Microphones: These sensors can detect and record sound waves and could be used to enable a robot to perceive and understand its environment through hearing [94][40].
  • LiDAR sensors: These sensors use lasers to measure distance and can be used to enable a robot to build up a detailed 3D map of its environment [95][41].

7.5. Sixth Sense Techniques

There are many different ways that a Sixth Sense could work in a robot, depending on the specific capabilities and goals of the system; a rule-based sketch follows this list.
  • Using sensors to detect and interpret physical phenomena that are not directly visible to the robot, such as temperature, pressure, or humidity. For example, a robot might have a Sixth Sense for temperature that allows it to detect changes in the ambient temperature and adjust its behavior accordingly [96][42].
  • Using machine learning algorithms to process and interpret complex visual or auditory information in real time. For example, a robot might be equipped with cameras and machine learning algorithms that allow it to recognize and classify objects in its environment, or to understand and respond to spoken commands [97][43].
  • Using neural networks or other artificial intelligence techniques to enable the robot to make decisions and take actions based on its environment and its goals. For example, a robot might be programmed to navigate through a crowded environment by using its Sixth Sense to avoid obstacles and find the optimal path.
Overall, the specific capabilities and implementation of a robot’s Sixth Sense will depend on the goals of the system and the technological resources available.
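The rule-based sketch below illustrates the first technique, mapping invented temperature, pressure, and humidity readings to behavior adjustments; the thresholds are placeholders, not values from any deployed system.

```python
# Toy "sixth sense" policy: non-visual ambient readings drive behavior.
def sixth_sense_policy(temp_c, pressure_kpa, humidity_pct):
    """Map ambient readings to a behavior adjustment."""
    if temp_c > 45:
        return "retreat: surface too hot to touch"
    if pressure_kpa > 120:
        return "loosen grip: excessive contact pressure"
    if humidity_pct > 90:
        return "slow down: condensation risk on sensors"
    return "continue current task"

print(sixth_sense_policy(temp_c=48, pressure_kpa=100, humidity_pct=40))
print(sixth_sense_policy(temp_c=22, pressure_kpa=100, humidity_pct=95))
```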

7.6. The Most Popular Trends and Challenges for a Robot’s Sixth Sense

The concept of the Sixth Sense in robotics typically refers to the ability of robots to perceive and interact with the world in ways that go beyond the five traditional senses of sight, hearing, touch, taste, and smell. Here are some of the most popular trends and challenges in the development of Sixth Sense capabilities for robots:
  • Multi-sensor fusion: One of the main challenges in developing Sixth Sense capabilities for robots is the need to integrate data from multiple sensors and sources. This involves developing sophisticated algorithms for data fusion and interpretation that can combine information from a wide range of sensors, such as cameras, microphones, pressure sensors, and temperature sensors [99][45].
  • Machine learning: Machine learning and artificial intelligence (AI) are key technologies that can help robots develop a Sixth Sense. By analyzing and interpreting data from multiple sensors, robots can learn to recognize patterns and make predictions about their environment. This can help robots navigate complex environments, detect and avoid obstacles, and interact more intelligently with their surroundings [100][46].
  • Haptic feedback: Haptic feedback, which involves providing robots with the ability to feel and respond to physical stimuli, is a key part of developing a Sixth Sense for robots. This involves developing sensors and actuators that can provide feedback to robots about their environment, such as changes in pressure or temperature [101][47].
  • Augmented reality (AR): Augmented reality technology can be used to enhance a robot’s perception of the world by providing additional visual or auditory information. This can help robots recognize and interact with objects more effectively, even in complex and changing environments [102][48].
  • Human–robot interaction: Developing a Sixth Sense for robots also requires the ability to interact with humans in a natural and intuitive way. This involves developing sensors and algorithms that can recognize human gestures and expressions, as well as natural language processing capabilities that enable robots to understand and respond to human speech [103][49].
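As a small illustration of the multi-sensor fusion challenge above, the sketch below combines two noisy distance estimates by inverse-variance weighting, so the more reliable sensor counts more; the sensor values and variances are invented.

```python
# Toy multi-sensor fusion: inverse-variance weighting of two range
# estimates (e.g., camera depth and LiDAR) of the same obstacle.
def fuse(estimates):
    """Combine (value, variance) pairs; lower-variance sensors weigh more."""
    weights = [1.0 / var for _, var in estimates]
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    return value, 1.0 / sum(weights)

camera_depth = (2.10, 0.09)   # meters, variance (invented)
lidar_range  = (2.02, 0.01)
value, var = fuse([camera_depth, lidar_range])
print(f"fused distance: {value:.2f} m (variance {var:.3f})")
```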
 

References

  1. Coggins, T.N. More work for Roomba? Domestic robots, housework and the production of privacy. Prometheus 2022, 38, 98–112.
  2. Blanes, C.; Ortiz, C.; Mellado, M.; Beltrán, P. Assessment of eggplant firmness with accelerometers on a pneumatic robot gripper. Comput. Electron. Agric. 2015, 113, 44–50.
  3. Russell, R.A. Survey of Robotic Applications for Odor-Sensing Technology. Int. J. Robot. Res. 2001, 20, 144–162.
  4. Deshmukh, A. Survey Paper on Stereo-Vision Based Object Finding Robot. Int. J. Res. Appl. Sci. Eng. Technol. 2017, 5, 2100–2103.
  5. Deuerlein, C.; Langer, M.; Seßner, J.; Heß, P.; Franke, J. Human-robot-interaction using cloud-based speech recognition systems. Procedia Cirp 2021, 97, 130–135.
  6. Tan, J.; Xu, J. Applications of electronic nose (e-nose) and electronic tongue (e-tongue) in food quality-related properties determination: A review. Artif. Intell. Agric. 2020, 4, 104–115.
  7. Chanda, P.; Mukherjee, P.K.; Modak, S.; Nath, A. Gesture controlled robot using Arduino and android. Int. J. 2016, 6, 227–234.
  8. Chakraborty, E. What Is Robotic Vision?|5+ Important Applications. Lambda Geeks. Available online: https://lambdageeks.com/robotic-vision-important-features/ (accessed on 4 February 2023).
  9. YouTube. Robotic Vision System AEE Robotics Part 9. 2021. Available online: https://www.youtube.com/watch?v=7csTyRjKAeE (accessed on 4 February 2023).
  10. Tao, S.; Cao, J. Research on Machine Vision System Design Based on Deep Learning Neural Network. Wirel. Commun. Mob. Comput. 2022, 2022, 4808652.
  11. LTCC, PCB and Reticle Inspection Solutions—Stratus Vision AOI. Stratus Vision AOI. Available online: https://stratusvision.com/ (accessed on 4 February 2023).
  12. Pan, L.; Yang, S.X. An Electronic Nose Network System for Online Monitoring of Livestock Farm Odors. IEEE/ASME Trans. Mechatron. 2009, 14, 371–376.
  13. Attanayake, A.M.N.C.; Hansamali, W.G.R.U.; Hirshan, R.; Haleem, M.A.L.A.; Hinas, M.N.A. Amigo (A Social Robot): Development of a robot hearing system. In Proceedings of the IET 28th Annual Technical Conference, Virtual, 28 August 2021.
  14. ElGibreen, H.; Al Ali, G.; AlMegren, R.; AlEid, R.; AlQahtani, S. Telepresence Robot System for People with Speech or Mobility Disabilities. Sensors 2022, 22, 8746.
  15. Dahiya, R.S.; Metta, G.; Valle, M.; Sandini, G. Tactile Sensing—From Humans to Humanoids. IEEE Trans. Robot. 2009, 26, 1–20.
  16. Thai, M.T.; Phan, P.T.; Hoang, T.T.; Wong, S.; Lovell, N.H.; Do, T.N. Advanced intelligent systems for surgical robotics. Adv. Intell. Syst. 2020, 2, 1900138.
  17. Liu, Y.; Aleksandrov, M.; Hu, Z.; Meng, Y.; Zhang, L.; Zlatanova, S.; Ai, H.; Tao, P. Accurate light field depth estimation under occlusion. Pattern Recognit. 2023, 138, 109415.
  18. Göpel, W. Chemical imaging: I. Concepts and visions for electronic and bioelectronic noses. Sens. Actuators D Chem. 1998, 52, 125–142.
  19. Fitzgerald, J.E.; Bui, E.T.H.; Simon, N.M.; Fenniri, H. Artificial Nose Technology: Status and Prospects in Diagnostics. Trends Biotechnol. 2017, 35, 33–42.
  20. Kim, S.; Chen, J.; Cheng, T.; Gindulyte, A.; He, J.; He, S.; Li, Q.; Shoemaker, B.A.; Thiessen, P.A.; Yu, B.; et al. PubChem in 2021: New data content and improved web interfaces. Nucleic Acids Res. 2021, 49, D1388–D1395.
  21. James, D.; Scott, S.M.; Ali, Z.; O’Hare, W.T. Chemical Sensors for Electronic Nose Systems. Microchim. Acta 2005, 149, 1–17.
  22. Chueh, H.-T.; Hatfield, J.V. A real-time data acquisition system for a hand-held electronic nose (H2EN). Sens. Actuators B Chem. 2002, 83, 262–269.
  23. Pan, L.; Yang, S.X. A new intelligent electronic nose system for measuring and analysing livestock and poultry farm odours. Environ. Monit. Assess. 2007, 135, 399–408.
  24. Ampuero, S.; Bosset, J.O. The electronic nose applied to dairy products: A review. Sens. Actuators B Chem. 2003, 94, 1–12.
  25. Simpkins, A. Robotic Tactile Sensing: Technologies and System (Dahiya, R.S. and Valle, M.; 2013) (On the Shelf). IEEE Robot. Autom. Mag. 2013, 20, 107.
  26. Rodríguez-Méndez, M.L.; Apetrei, C.; de Saja, J.A. Evaluation of the polyphenolic content of extra virgin olive oils using an array of voltammetric sensors. Electrochim. Acta 2010, 53, 5867–5872.
  27. Ribeiro, C.M.G.; Strunkis, C.D.M.; Campos, P.V.S.; Salles, M.O. Electronic nose and tongue materials for Sensing. In Reference Module in Biomedical Sciences; Elsevier: Amsterdam, The Netherlands, 2021.
  28. Sierra-Padilla, A.; García-Guzmán, J.J.; López-Iglesias, D.; Palacios-Santander, J.M.; Cubillana-Aguilera, L. E-Tongues/noses based on conducting polymers and composite materials: Expanding the possibilities in complex analytical sensing. Sensors 2021, 21, 4976.
  29. Yan, R.; Qiu, S.; Tong, L.; Qian, Y. Review of progresses on clinical applications of ion selective electrodes for electrolytic ion tests: From conventional ISEs to graphene-based ISEs. Chem. Speciat. Bioavailab. 2016, 28, 72–77.
  30. Floris, I.; Sales, S.; Calderón, P.A.; Adam, J.M. Measurement uncertainty of multicore optical fiber sensors used to sense curvature and bending direction. Measurement 2019, 132, 35–46.
  31. Kiran, E.; Kaur, K.; Aggarwal, P. Artificial senses and their fusion as a booming technique in food quality assessment—A review. Qual. Assur. Saf. Crop. Foods 2022, 14, 9–18.
  32. Zhou, B. Construction and simulation of online English reading model in wireless surface acoustic wave sensor environment optimized by particle swarm optimization. Discret. Dyn. Nat. Soc. 2022, 2022, 1633781.
  33. Cominelli, L.; Carbonaro, N.; Mazzei, D.; Garofalo, R.; Tognetti, A.; De Rossi, D. A Multimodal Perception Framework for Users Emotional State Assessment in Social Robotics. Future Internet 2017, 9, 42.
  34. Grall, C.; Finn, E.S. Leveraging the power of media to drive cognition: A media-informed approach to naturalistic neuroscience. Soc. Cogn. Affect. Neurosci. 2022, 17, 598–608.
  35. Bonci, A.; Cen Cheng, P.D.; Indri, M.; Nabissi, G.; Sibona, F. Human-robot perception in industrial environments: A survey. Sensors 2021, 21, 1571.
  36. Laut, C.L.; Leasure, C.S.; Pi, H.; Carlin, S.M.; Chu, M.L.; Hillebr, G.H.; Lin, H.K.; Yi, X.I.; Stauff, D.L.; Skaar, E.P. DnaJ and ClpX Are Required for HitRS and HssRS Two-Component System Signaling in Bacillus anthracis. Infect. Immun. 2022, 90, e00560-21.
  37. Bari, R.; Gupta, A.K.; Mathur, P. An Overview of the Emerging Technology: Sixth Sense Technology: A Review. In Proceedings of the Second International Conference on Information Management and Machine Intelligence: ICIMMI 2020, Jaipur, India, 24–25 July 2020; Springer: Singapore, 2021; pp. 245–254.
  38. Wikelski, M.; Ponsford, M. Collective behaviour is what gives animals their “Sixth Sense”. New Sci. 2022, 254, 43–45.
  39. Xu, G.; Wan, Q.; Deng, W.; Guo, T.; Cheng, J. Smart-Sleeve: A Wearable Textile Pressure Sensor Array for Human Activity Recognition. Sensors 2022, 22, 1702.
  40. Hui, T.K.L.; Sherratt, R.S. Towards disappearing user interfaces for ubiquitous computing: Human enhancement from Sixth Sense to super senses. J. Ambient. Intell. Humaniz. Comput. 2017, 8, 449–465.
  41. Li, N.; Ho, C.P.; Xue, J.; Lim, L.W.; Chen, G.; Fu, Y.H.; Lee, L.Y.T. A Progress Review on Solid-State LiDAR and Nanophotonics-Based LiDAR Sensors. Laser Photonics Rev. 2022, 16, 2100511.
  42. Randall, N.; Bennett, C.C.; Šabanović, S.; Nagata, S.; Eldridge, L.; Collins, S.; Piatt, J.A. More than just friends: In-home use and design recommendations for sensing socially assistive robots (SARs) by older adults with depression. Paladyn J. Behav. Robot. 2019, 10, 237–255.
  43. Shih, B.; Shah, D.; Li, J.; Thuruthel, T.G.; Park, Y.-L.; Iida, F.; Bao, Z.; Kramer-Bottiglio, R.; Tolley, M.T. Electronic skins and machine learning for intelligent soft robots. Sci. Robot. 2020, 5, eaaz9239.
  44. Anagnostis, A.; Benos, L.; Tsaopoulos, D.; Tagarakis, A.; Tsolakis, N.; Bochtis, D. Human Activity Recognition through Recurrent Neural Networks for Human–Robot Interaction in Agriculture. Appl. Sci. 2021, 11, 2188.
  45. Tsanousa, A.; Bektsis, E.; Kyriakopoulos, C.; González, A.G.; Leturiondo, U.; Gialampoukidis, I.; Karakostas, A.; Vrochidis, S.; Kompatsiaris, I. A review of multisensor data fusion solutions in smart manufacturing: Systems and trends. Sensors 2022, 22, 1734.
  46. Kumar S.N., N.; Zahid, M.; Khan, S.M. Sixth Sense Robot For The Collection of Basic Land Survey Data. Int. Res. J. Eng. Technol. 2021, 8, 4484–4489.
  47. Saracino, A.; Deguet, A.; Staderini, F.; Boushaki, M.N.; Cianchi, F.; Menciassi, A.; Sinibaldi, E. Haptic feedback in the da Vinci Research Kit (dVRK): A user study based on grasping, palpation, and incision tasks. Int. J. Med. Robot. Comput. Assist. Surg. 2019, 15, E1999.
  48. García, A.; Solanes, J.E.; Muñoz, A.; Gracia, L.; Tornero, J. Augmented Reality-Based Interface for Bimanual Robot Teleoperation. Appl. Sci. 2022, 12, 4379.
  49. Akalin, N.; Kristoffersson, A.; Loutfi, A. Evaluating the sense of safety and security in human–robot interaction with older people. In Social Robots: Technological, Societal and Ethical Aspects of Human-Robot Interaction; Springer: Berlin/Heidelberg, Germany, 2019; pp. 237–264.