Olfactory Displays in Education and Training: History

Olfactory displays are defined as human–computer interfaces that generate and diffuse or transmit one or more odors to a user for a specific purpose. Computer-generated odors, in conjunction with other sensory information, have been proposed and used in education and training settings over the past four decades, supporting the memorization of information, helping to immerse learners in 3D educational environments, and complementing or supplementing the human senses.

  • olfactory displays
  • smell
  • human–computer interfaces

1. Olfactory Applications That Support Training

Designers and developers of immersive 3D virtual environments, including the virtual simulations and serious games used for training, typically aim to faithfully recreate real-world scenarios. In fact, it has been suggested that “achieving multisensory digital experiences is the holy grail of human–technology interaction” in general [83]. Traditionally, however, emphasis has been placed on recreating the visual and, perhaps to a lesser extent, auditory scenes, while the other senses (including touch, smell, and taste) are ignored despite their importance in the real world [86]. As a result, current digital experiences are primarily based on audiovisual stimulation and involve other sensory stimulation to a lesser extent [87]. Olfactory stimuli, in particular, are widely neglected, although the sense of smell influences many of our daily life choices, affects our behavior, and can catch and direct our attention [6]. Incorporating pleasant and congruent ambient odors into a virtual reality experience can enhance sensory stimulation, which in turn can influence affective and behavioral reactions both directly and indirectly (through ease of imagination) [82]. Finally, incorporating olfactory technologies into virtual environments has been shown to be safe and effective for targeting several aspects of psychological and physical health, such as anxiety, stress, and pain [88]. Simulating the senses of touch, smell, and taste is not trivial and presents many technological challenges. Recently, a large effort has been made toward simulating the sense of touch (e.g., haptics), and this effort has accelerated with the availability of consumer-level haptic devices. Although much work remains, olfaction has been applied in virtual reality (3D) learning environments, as described below.
Olfactory stimuli have also been applied to support training in a variety of configurations and situations. Cater (1994) [32] developed a backpack-mounted firefighter training system at the Deep Immersion Virtual Environment Laboratory, which generated odors and delivered them to a firefighter oxygen mask in order to teach trainees to distinguish different types of smoke. However, the author reported that the system generated strong odors, causing extreme discomfort in some trainees.
A review conducted by Spencer (2006) [89] outlined a number of research projects that used artificially generated odors in medical simulations for education and training. Olfactory information is a key factor in medicine, complementing the correct diagnosis of many diseases. The review [89] argued that it is technically feasible to use odors in virtual reality simulations for medical training, thanks to recent technological advances that have produced devices for odor generation in a computer interface and efficient ways to disperse odors remotely over the Internet and on local networks. It showed that adding simulated odors to a virtual reality medical simulator effectively complements medical diagnosis and enhances the training of medical students. Kent et al. (2016) [90] discussed the use of olfactory stimuli in healthcare simulators, carrying out a systematic review of the literature that identified five relevant papers describing the use of smell in medical simulations. They found that olfaction is very rarely used in medical simulations, and the five reviewed papers reported mixed results. The researchers determined that clinically relevant odors (such as the odor of iodine scrub) could be more effective than general smells for supporting training, and that smell can be very useful for improving simulation fidelity.
A compelling application of olfactory human–computer interfaces has been developed for military training. According to Vlahos (2006) [91], theme-park designers and the University of Southern California developed a virtual reality simulator to train U.S. soldiers, integrating a number of odors to enhance the ambiance of a virtual war scenario. The soldiers wear an electronic scent-generating device around their necks, and each odor is activated wirelessly according to events generated in the virtual reality war simulation. For instance, when soldiers fire a gun, they perceive the smell of gunpowder coming from the device they wear. Preliminary research indicated that the use of odors in a simulated war environment enhanced soldiers’ mental immersion, a key element that supports training.
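The published account summarized here gives no implementation details, but the event-driven pattern it implies, mapping simulation events to wireless scent-release commands, can be illustrated with a minimal C++ sketch. The event names, scent identifiers, and the sendWireless placeholder below are assumptions for illustration only and are not taken from the cited system.

    // Illustrative sketch only: event-to-scent dispatch for a body-worn scent device.
    // SimEvent values, scent IDs, and sendWireless() are hypothetical placeholders.
    #include <cstdint>
    #include <iostream>
    #include <map>

    enum class SimEvent { GunFired, ExplosionNearby, EnterMarketplace };

    // Each scent cartridge on the wearable device is addressed by a small ID.
    const std::map<SimEvent, std::uint8_t> kScentForEvent = {
        {SimEvent::GunFired,         1},  // gunpowder
        {SimEvent::ExplosionNearby,  2},  // burning rubber
        {SimEvent::EnterMarketplace, 3},  // spices
    };

    // Placeholder for the wireless link to the neck-worn device.
    void sendWireless(std::uint8_t scentId, std::uint16_t durationMs) {
        std::cout << "release scent " << int(scentId) << " for " << durationMs << " ms\n";
    }

    // Called by the simulation's event loop whenever a relevant event occurs.
    void onSimulationEvent(SimEvent e) {
        auto it = kScentForEvent.find(e);
        if (it != kScentForEvent.end()) {
            sendWireless(it->second, 500);  // brief burst to limit odor buildup
        }
    }

    int main() {
        onSimulationEvent(SimEvent::GunFired);  // e.g., the player fires a weapon
    }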
Tsai and Hsieh (2012) [21] explored the use of two types of computer-generated odors to support the training of computer programmers in writing efficient software code. The purpose of the olfactory display was to help improve coding style and to identify code errors. The researchers built the olfactory display using an Arduino™ microcontroller board connected to a pair of off-the-shelf household aromatizers. Results from a test indicated that more than 80% of participants reported that the smell was useful for identifying errors and improving their coding style.
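No firmware details are given in this entry, so the Arduino-style C++ sketch below only illustrates how such a microcontroller-plus-aromatizer setup might work; the pin assignments, relay wiring, and one-byte serial command protocol are assumptions for illustration.

    // Illustrative Arduino-style sketch (C++). Pin numbers and the one-byte
    // serial protocol are assumptions; they are not taken from the cited paper.
    const int ERROR_AROMATIZER_PIN = 7;   // relay driving the "code error" aromatizer
    const int STYLE_AROMATIZER_PIN = 8;   // relay driving the "coding style" aromatizer
    const unsigned long BURST_MS = 2000;  // short burst to limit odor buildup

    void setup() {
      pinMode(ERROR_AROMATIZER_PIN, OUTPUT);
      pinMode(STYLE_AROMATIZER_PIN, OUTPUT);
      Serial.begin(9600);                 // commands arrive from the code-analysis PC
    }

    void pulse(int pin) {
      digitalWrite(pin, HIGH);
      delay(BURST_MS);
      digitalWrite(pin, LOW);
    }

    void loop() {
      if (Serial.available() > 0) {
        char cmd = Serial.read();
        if (cmd == 'E') pulse(ERROR_AROMATIZER_PIN);  // a code error was detected
        if (cmd == 'S') pulse(STYLE_AROMATIZER_PIN);  // a style issue was detected
      }
    }

Keeping each release to a short, timed burst is one way to avoid the odor buildup and discomfort reported for earlier systems such as Cater's.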
Narciso et al. (2019) [92] developed a virtual reality system to analyze how olfactory stimuli support the training of firefighters. The researchers set up a multimodal virtual environment simulating a closed container in which firefighters held a virtual fire, with a burnt-wood odor added as the olfactory stimulus. The odor was preloaded into a SensoryCo SmX-4D™ aroma system and released using compressed air. A between-groups experiment showed that although the multimodal virtual environment was successful in transferring knowledge overall, the addition of the smell stimulus for the group of firefighters that experienced it did not significantly influence any of the measured variables of presence, cybersickness, fatigue, stress, and knowledge transfer. The researchers suggested that a possible cause of the odor's low effectiveness was that the firefighters were not fully equipped with protective gear and that the multimodal virtual reality environment (including its olfactory display) was not immersive enough in the experiment, which resulted in lower reported values of fatigue and stress.

2. Educational Olfactory Applications

Computer-generated odors can be an important support for learning, since they can help provoke positive emotions in learners, thus reducing their stress levels at school [60]. In addition, odors can help “enhance memory performance through better problem solving, reduce response times, produce fewer errors, increase recall, recognition, and retention, and enhance productivity, alertness, and physical performance” [61]. Youngblut et al. (1996) [62] pointed out a number of benefits of olfaction in educational virtual reality, such as reducing students’ stress and improving information retention and recall. The following paragraphs summarize examples of research projects that employ odors to support learning.
Olfactory applications have been used in art exhibitions to enhance visitors’ educational experiences of multimodal art. Lai (2015) [63] developed an interactive art exhibition where patrons perceived five odors (scents of grass, baby powder, whiskey tobacco, dark chocolate, and leather) generated by mist diffusers. In one part of the exhibition, the odors were linked to artwork such as origami boxes that patrons rearranged themselves, allowing other visitors to perceive the odors in a different order. In another part, sweet or pleasant aromas were generated according to the visitors’ walking direction, allowing other visitors to perceive them. Lai reported that the odors worked as a powerful communication medium and as a complement to the other senses in artistic perception. Another example of a multimodal experience including olfactory information was presented at the Tate Britain art gallery in London, UK [64], where a combination of visual, touch, auditory, and olfactory stimuli was used to support the artistic appreciation of paintings. Visitors were allowed to hold and smell 3D-printed scented objects related to some of the paintings on display while listening to related sounds. Vi et al. (2017) [64] collected and analyzed post-questionnaire data completed by 50 visitors (participants), showing that “all participants strongly acknowledged that stimulating all the senses added another layer, dimension, and perspective to the experience of the paintings and thus opened new ways of thinking and interpreting art”. Qualitative data showed that odors were useful in supporting understanding of the paintings’ meaning and artistic renderings.
Tijou et al. (2006) [65] developed an immersive desktop virtual reality (VR) system to investigate the effects of olfaction on the learning, recall, and retention of the 3D structures of organic molecules. The VR system included two commercial olfactory display devices generating up to six odors related to the virtual molecular models presented in the system; for example, a molecular model of vanillin was paired with a vanilla odor. The 3D graphical molecular models were displayed on both a computer monitor and a head-mounted display (HMD), and users interacted with them using a special 3D mouse or a hand-movement tracking system. The paper demonstrated the feasibility of a multimodal VR system for learning in the sciences. However, the authors [65] did not report any testing conducted with students, and proposed future work to study the role of olfaction, interaction techniques, and depth cues while learning molecular structures.
Richard et al. (2006) [66] introduced the “Nice-smelling Interactive Multimedia Alphabet” project, which involved developing a multimodal computer application that included olfactory, visual, and auditory information. The main objective of this multimodal application was to support the learning of the letters of the alphabet. However, Richard et al. did not provide further details on the project.
Miyaura et al. (2011) [67] developed an olfactory display that diffused odors of ylang-ylang and peppermint using inkjet technology. An electrocardiogram was used to measure users’ concentration levels while they performed basic addition tasks. The researchers reported that the odors helped decrease errors in the additions. The aim of the olfactory display was to help learners re-engage in the addition tasks by releasing odors once a concentration lapse was detected with the electrocardiogram.
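The closed-loop idea described above, monitoring a physiological estimate of concentration and releasing an odor when attention lapses, can be sketched in C++ as follows. The readConcentrationIndex and pulseDiffuser stubs, the threshold, and the refractory period are assumptions for illustration; this entry does not describe the cited system's actual signal processing.

    // Illustrative closed-loop sketch: pulse the diffuser when the estimated
    // concentration index drops below a threshold. All values and stubs are assumed.
    #include <chrono>
    #include <cstdlib>
    #include <iostream>
    #include <thread>

    // Stub: a real system would derive this index from electrocardiogram analysis.
    double readConcentrationIndex() { return std::rand() / double(RAND_MAX); }

    // Stub: a real system would drive the inkjet-style odor head here.
    void pulseDiffuser(int durationMs) { std::cout << "odor burst, " << durationMs << " ms\n"; }

    int main() {
        const double kLapseThreshold = 0.4;                 // assumed tuning value
        const auto kRefractory = std::chrono::seconds(60);  // limit odor adaptation
        auto lastPulse = std::chrono::steady_clock::now() - kRefractory;

        for (int tick = 0; tick < 20; ++tick) {             // short demo loop
            if (readConcentrationIndex() < kLapseThreshold &&
                std::chrono::steady_clock::now() - lastPulse >= kRefractory) {
                pulseDiffuser(500);                         // brief peppermint burst
                lastPulse = std::chrono::steady_clock::now();
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }
    }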
Kwok et al. (2009) [68] introduced the SAMAL (Smart Ambience for Affective Learning) system, which included the development and testing of a multimodal ambient room with visual, auditory, and olfactory stimuli. One of the objectives of SAMAL was to integrate cognitive and affective issues with the purpose of enhancing learning and to study the emotional and affective experience of learners while they perceived multisensory stimuli from the SAMAL system. The ambient room included 3D stereo projection, 3D interaction using a Wii-mote, high-fidelity audio, and an olfactory display system with spray dispensers, among other features. The SAMAL system provided several ambient “scenarios” to evoke different affective and cognitive states of mind and feelings. For example, a scenario called “Blue Hat Smart Ambient” displayed a 3D projection of a quiet road along with a rain sound effect and provided students with a smell of violets. According to Kwok et al. (2009) [68], all of these multimodal stimuli were designed to promote a “feeling of calm and wakening needed for better control and direction” of students while solving a problem. In another scenario, a green apple odor was dispersed to stimulate the “fresh, liberated and free-thinking feelings needed for triggering new or wild ideas.” Preliminary findings from post-tests applied to learners showed that the SAMAL system generally did influence students’ affective experiences, improving their learning effectiveness. However, Kwok et al. (2009) [68] did not describe any effects of odors on students’ cognitive processes, nor did they describe the odor generation system in greater detail.
Garcia-Ruiz et al. (2008) [69] described a usability study that tested the integration of an odor into an educational 3D virtual environment (a virtual town with buildings, street lamps, and roads with street-name signs) developed for second language learning. Twelve computer science students tested the virtual environment, following oral instructions in English (their native language was Spanish) about going from point A to point B and using a regular mouse to “walk around” the streets of the virtual town. While walking along the virtual streets, all participants were presented with an odor of fresh mint leaves (Mentha spicata). After the test, participants completed the System Usability Scale (SUS) questionnaire, a 10-item questionnaire with five response options per item, from “Strongly agree” to “Strongly disagree”, which is a reliable tool for measuring the usability of a system [70]. Preliminary results indicated that all participants perceived the usability of the multimodal virtual environment as very good. In addition, participants reported that the mint odor helped lower their anxiety when listening to the oral instructions in English. Experimental results obtained by Herz et al. (2004) [71] have shown that a mint odor (among other pleasant smells) can stimulate or activate the mood of learners, and that pleasant responses to odors are learned through emotional associations. The use of mint to affect mood has also been examined in marketing applications [72]. Moreover, Ho and Spence (2005) [73] demonstrated experimentally that olfactory stimulation with mint facilitates tactile performance of complex tasks, which may be useful for supporting further training in computer simulations that require dexterity. Oliver (2012) [74] incorporated odors into a language learning seminar, which explored how odors can be used to teach literary concepts of the English language at different levels, pointing out that odors can work as an important learning tool.
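The SUS questionnaire mentioned above has a standard scoring procedure: odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the sum is multiplied by 2.5 to yield a score from 0 to 100. The short C++ example below applies that procedure to a fictitious set of responses.

    // Standard SUS scoring applied to a made-up set of ten responses (1-5 each).
    #include <array>
    #include <iostream>

    double susScore(const std::array<int, 10>& responses) {
        double sum = 0.0;
        for (int i = 0; i < 10; ++i) {
            // Items 1, 3, 5, 7, 9 are positively worded; items 2, 4, 6, 8, 10 are negatively worded.
            sum += (i % 2 == 0) ? (responses[i] - 1) : (5 - responses[i]);
        }
        return sum * 2.5;  // final score on a 0-100 scale
    }

    int main() {
        const std::array<int, 10> example = {5, 1, 4, 2, 5, 1, 4, 2, 5, 1};  // fictitious data
        std::cout << "SUS = " << susScore(example) << "\n";  // prints SUS = 90
    }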
Czyzewski et al. (2010) [39] developed a computer-controlled device capable of generating tiny drops of scented oil, previously stored in a glass pipe and rapidly released into the environment using compressed air, a technique the authors called “cold air diffusion.” The device can generate up to four odors simultaneously and was tested using educational software showing animated animal cartoons. This multimedia software was designed to measure the degree of concentration in young students, who had to concentrate on the movements of a bee character while being purposely distracted by other moving characters. In addition, a particular odor was released by the device while the animations were shown to the students, although the authors did not describe which odors were used. The researchers [39] reported that although the results of an initial test were inconclusive, the test served to correct technical problems with the device and to analyze its effectiveness. They also pointed out that their device could be further used to support multisensory stimulation in science-based education.
Covaci et al. (2018) [75] developed a multiplayer serious game intended to teach high-school students about the seventeenth-century Age of Discovery (a period in history when several European kingdoms began exploring the world by sea to trade goods such as spices). In a study, participants played the serious game and were required to open and smell small jars containing odors of real spices and beans, including coffee beans, cocoa beans, ginger, pepper, cinnamon, and clove. The spices were smelled at the same time as an image of each spice appeared in the game, and participants presumably closed each jar when it was no longer needed. This research aimed to explore ways of designing multisensory experiences in serious games. The researchers’ goal was to examine any differences in students’ performance and enjoyment while playing the game on desktop computers and mobile devices, in the absence or presence of olfactory stimulation. They found, through a self-reported engagement questionnaire, that stimulating multiple senses in a serious game engaged the users, potentially improving the learning process. However, results of pre-test and post-test knowledge questionnaires showed that the olfactory feedback did not yield an improvement in performance. This was possibly due to the participants’ cultural background, which prevented them from effectively associating the images of the spices with their odors, their limited ability to discriminate among the odor stimuli, and the fact that students did not verbalize the odors used in the game. As Ghinea and Ademoye (2011) [16] highlighted, it is difficult to effectively measure the user experience (UX) of olfactory data presented through olfactory displays.
In the real world, our senses constantly respond to specific physical phenomena, providing the perceptual system with data that is processed and integrated by the nervous system, producing multisensory information that allows us to acquire knowledge [76]. The integration of environmental information across multiple sensory channels (such as the olfactory channel) is critical to guiding decisions and behaviors [77]. The different senses also interact with one another and alter each other’s processing and, ultimately, the resulting perception. Stability of perception in everyday life is preserved by the integration of multimodal information, and the perception of synchrony in cross-modal combinations plays an important role in maintaining perceptual stability in a continually changing environment [78]. Various “real-world” studies suggest that sound can have a significant effect on the perception of the other senses; for example, sound (noise) can affect the perceived gustatory properties of food, its crunchiness, and how well it is liked [79]. In other words, our construct of our environment and our ability to interact with that environment are very much determined by the interaction of our senses. An understanding of this multimodal interactivity in the real world can therefore inform our development of environments in the virtual space. Most scientific research suggests that the more modalities that are integrated into a virtual representation of our environment (including the smell modality), the greater the sense of presence or immersion in that space [1]. Additional modalities can reinforce existing information or provide additional information that cannot be obtained by a single modality alone. For instance, although vision tends to dominate, auditory information can tell us what is behind a door, or behind or inside our bodies. With respect to education, training, and memory, most studies acknowledge that multiple sensory inputs (including odors) result in improved processing and retention [80]. The term cross-modal interaction (or integration) refers to the process of coordinating multimodal information, that is, information that stimulates multiple senses, from different sensory channels into a final percept [81]. Signals that incorporate multiple sensory modalities enhance cognitive processes, including learning and decision-making [82]. As Flavian et al. (2021) [83] point out, “achieving multisensory digital experiences is the holy grail of human–technology interaction”, and “providing multisensory experiences in digital environments is one of the future priorities in technology development”.
However, some care should be taken when considering multimodal stimuli. As Kapralos et al. (2017) [1] describe, redundancy, that is, the repetition of the same message across modalities, can increase communication and facilitate knowledge transfer. However, due to channel limits, that is, limits on how much information we can absorb at any one time (also known as cognitive load), redundancy can at times decrease effectiveness: redundancy that does not provide new information runs the risk of hindering, rather than facilitating, communication. In other words, we have a limited cognitive ability to take in information across multiple senses at the same time, and an overload of information, that is, too much sensory stimulation, can impede task performance. Additional modalities with incongruent information add cognitive load when they are not carefully balanced with the main means of communication.
Klašnja-Milićević et al. (2018) [84] investigated olfaction-based applications in multimodal human–computer interfaces for learning, with the goal of determining how the senses of smell, taste, vision, and hearing interact and how they can improve memorization in a virtual reality educational application focused on the solar system. Sixty university students participated in an experiment in which they were randomly divided into two groups. One group used the educational VR system while consuming chocolate and/or coffee and being presented with the vapors of three essential oils: citrus, rosemary, and mint. The other group used the educational VR system without the olfactory and taste stimuli. A within-groups testing protocol revealed that participants who consumed the chocolate, drank the coffee, and smelled the citrus oil vapor while using the VR learning application scored higher on the pre-/post-tests.
Alkasasbeh and Ghinea (2019) [85] developed a multimodal website for learning about geography. Four odors were used to represent information about four countries (Brazil = coffee, India = curry, Japan = green tea, and South Africa = wild grass) and were delivered to web users through a dry-air scent diffuser with fans directing the odors toward the users. The researchers hypothesized that memory and recall of information about those countries would be improved by the odors. A test was conducted with 32 participants who were randomly distributed across eight conditions (text only; audio only; olfactory media only; images; audio and text; audio and olfactory media; audio, text, and olfactory media; and text and olfactory media), each depicting geographical information about the four countries. The researchers administered pre-tests and post-tests with questions about the four countries. Results showed that post-test scores on questions for which the olfactory media was synchronized with the audiovisual media were significantly higher than the scores of participants who were not provided with any olfactory stimuli. However, the odor-only questions yielded no significant difference. The use of odors for recalling information is consistent with previous research [35].

This entry is adapted from the peer-reviewed paper 10.3390/mti5100064
