Bimodal Processing and Learning in Insect Models

The study of sensory systems in insects has a history spanning almost a century. Olfaction, vision, and gustation have been thoroughly researched in several robust insect models, and new discoveries are made every day on the more elusive thermo- and mechanosensory systems. A few specialized senses, such as hygro- and magnetoreception, have also been identified in some insects. In light of recent advances in the scientific investigation of insect behavior, it is important to study sensory modalities not only individually but also in combination, because a combinatorial approach mimics the real-world environment of an insect, in which a wide spectrum of information is available simultaneously. As a field now gaining fresh insight, multimodal integration in insects serves as a fundamental basis for understanding complex behaviors including, but not limited to, navigation, foraging, learning, and memory.

Keywords: sensory systems; olfaction; bimodal processing; honeybees
Subjects: Zoology
Update Date: 01 Jun 2022

    1. Introduction

    Insects perform precisely controlled tasks on extremely small time scales to navigate their ecological niches, and therefore serve as an excellent system for studying complex behaviors and their origins. Reliable cues from the external environment are critical for decision-making, and most insect behaviors arise from the simultaneous consolidation of input through multiple sensory channels. For example, a predatory robber fly processes olfactory, visual, and directional cues simultaneously before executing a calculated aerial attack on potential prey [1]. Similarly, a pollinating bumblebee in flight receives an overload of sensory information from a colorful flower emitting attractive volatiles [2]. In both examples, diverse cues from the surroundings translate into information and elicit meaningful responses from the insects. Such multimodal integration has also been investigated in the context of courtship, mating, fleeing, and feeding behaviors. Owing to the ease of approach and experimental design under laboratory conditions, unimodal sensory processing has been studied in different insect models and its isolated functioning has been extensively investigated [3][4][5][6][7]. However, multimodal integration, the behaviors that utilize multimodal cues, and their neuronal workings are still open fields for exploration. New discoveries in this field emphasize the role of multimodal integration in improving the decision-making ability of insects by altering the speed and accuracy of directed responses to stimuli, underscoring the importance of investigating sensory signals in combination rather than individually. To that end, several studies have adopted virtual-reality arenas that give insects control of their own flight environments. This technique allows the design of ecologically relevant experiments to study visual behaviors in navigation [8] and the mechanism of color learning in honeybees [9].

    2. Honeybees (Apidae)

    The honeybee has been one of the most sustained research models for several decades, owing to its eusocial way of living and the huge repertoire of complex behaviors that comes with it. With a total of only 950,000 neurons [10], the remarkable cognitive capabilities of these insects and their neural correlates have been the subject of intrigue in the field of learning and memory [11]. Individual hive members consolidate multi-channel information within and outside the hive to perform sophisticated tasks specific to their rank. Starting from the early works of Nobel laureate Karl von Frisch in 1965, different ethological approaches have uncovered the role of bimodal integration in pollinators, especially in a plant-pollinator context [12][13][14][15][16][17][18]. The transition between the 20th and the 21st century saw the development of classical conditioning experiments in restrained honeybees, in which pairing a unimodal olfactory or visual stimulus with a reward or punishment leads to robust associative learning of the stimulus. Since then, several studies have addressed olfactory and visual learning separately, establishing that honeybees are excellent learners of both modalities. The most popular paradigm was the proboscis extension response (PER), which was later followed by the tethered flight arena. Under controlled laboratory conditions, the use of both visual and olfactory stimuli in a single PER paradigm opened up the possibility of presenting a restrained honeybee with a combinatorial CS+ (i.e., the conditioned stimulus paired with a positive or negative reinforcer). One such study was among the first to demonstrate a positive interaction between the two modalities, in which previous training with a visual stimulus enhanced olfactory learning [19]. These findings contradicted the overshadowing effect of bimodal training reported in earlier works on foraging bees.
The earliest reports of synergistic effects of color on odor learning indicated stronger memory formation compared to isolated unimodal training [20][21][22][23]. In later years, compound learning paradigms also addressed the effect of simultaneous stimulus presentation during training. In such a paradigm, positive patterning pairs the US with the compound stimulus but not with its individual components, whereas negative patterning pairs the US with the individual components but not with the compound. Such experiments show that ultraviolet light can be learned better than other wavelengths and can specifically interfere with the reinforcement of a reward with an olfactory-visual combination [24]. Notably, the honeybee was the first insect model demonstrated to solve both positive and negative patterning tasks involving more than one modality [25][26]. Therefore, further physiological investigation of this behavior can help narrow down the specific neuronal correlates directly underlying sensory integration. It is also noteworthy that trained honeybees can exhibit cross-modal associative behavior, recalling the specific color that was present alongside an odor during reward reinforcement [27][28]. This effect, previously described in humans, provides evidence for information transfer between different sensory modalities during active flight, thus simplifying the process of foraging and increasing fitness in these insects [18][29]. In the past decade, aversive conditioning paradigms using an electric shock as punishment were employed to examine both olfactory and visual conditioning in similar assays, enabling more direct behavioral comparisons between them [11][12][19][23][24][30]. Such methods have also been widely replicated to identify the brain centers that underlie the learning behaviors of both modalities [31].
As a robust research model offering a large variety of multimodal behaviors, the honeybee provides fertile ground for investigating cognitive tasks that involve complex sensory processing, both within and outside the laboratory environment.

    3. Ants (Formicidae)

    The ant is another hymenopteran model that has received much attention from ethologists and neurobiologists alike for exhibiting sophisticated social behaviors that are largely integrative in nature. Members of the ant family Formicidae comprise more than 12,000 species and show large diversity in anatomy, physiology, and behavior. They occur in different terrestrial habitats in huge numbers and therefore offer great sampling access for population studies.
    The foraging members of an ant colony are regularly faced with the challenge of finding food, which may be miles away from their nest, and then finding their way back home. Several studies have identified multiple navigational techniques employed by ants of different species, and by extension other hymenopterans, to perform this task [32][33][34][35]. Path integration is the most important tool in this repertoire, as it allows the ant to continuously update its current position relative to the nest [36][37]. By counting the number of steps taken in a direction and using the celestial compass for orientation, foraging ants are able to form trajectories back to the nest. In fixed terrestrial habitats with specific panoramic views, foraging wood ants of the species Formica rufa learn and encode diverse visual cues of the scene and recall them during navigation on a match-the-view basis, providing evidence for very strong long-term memories [38][39]. In contrast to such rich visual habitats, desert ants of the genus Cataglyphis perform more challenging navigational tasks in a featureless habitat, with very few food rewards scattered far from their nests, while constantly under risk of predation by robber flies and jumping spiders [40][41]. In such a scenario, desert ants are capable of utilizing consistent landmarks when making the trip back to the nest [42]. However, recent consensus holds that multimodal input is necessary to navigate to and from a potential food source [43]. Initially, olfactory cues were thought to influence only inter-colony communication and nest mate identification. Although visual cues render navigation possible, only olfaction provides chemical tracking of a potential food source. With only visual cues, the ant can reach within a few meters of the source, but when there is no odor plume to direct it further, it keeps altering its course until it encounters one.
This final indicator of the food reward marks the foraging trip as a success and makes the ant remember this route for future trips [44][45]. Such studies have also revealed that the navigational strategies do not lead directly to the food, but to the location where the odor plume was first detected [46]. It has also been shown that certain environment-specific blends in the desert habitat evoke significant electro-antennogram (EAG) activity in Cataglyphis ants and can therefore be detected by the animals [47][48]. These could potentially be used alongside landmarks for fine navigation. Furthermore, the same study also showed that these ants can use olfactory cues to remember the nest entrance when its visual representation is inconspicuous, implying a role for chemosensation in homing behavior. To investigate if and how these two modalities are integrated, a later study published by the same group combined visual and olfactory cues to represent a landmark in the path of a foraging ant [49]. The combined cues enabled the ants to focus their search immediately after the first training trial, thereby saving time and energy while returning to the nest. When trained once to a combination of both sensory modalities, the ants still showed strong recognition of the individual olfactory and visual cues in the test phase. Interestingly, upon extending the training with the combined cue, the responses to the single cues became broader and more ambiguous, implying that a stronger reinforcement of the combined cue served as a more reliable indicator of the landmark. This behavior substantiates the efficiency of processing multisensory cues when learning landmarks during difficult foraging tasks.
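The path-integration scheme described above (accumulating step vectors oriented by a celestial compass into a running home vector) can be illustrated with a minimal toy model. This is a sketch under simplifying assumptions: ideal 2D geometry, perfect step counting, and no cue noise; all names and values are illustrative, not drawn from the cited studies.

```python
import math

def home_vector(steps):
    """Accumulate outbound steps into a home vector.

    Each step is (heading_deg, length): a compass bearing (as the ant
    might read it from the celestial compass) and a distance (which the
    ant approximates by counting strides). Returns the bearing and
    distance of the direct path back to the nest.
    """
    x = y = 0.0
    for heading_deg, length in steps:
        x += length * math.cos(math.radians(heading_deg))
        y += length * math.sin(math.radians(heading_deg))
    # The home vector is simply the negation of the accumulated position.
    home_bearing = math.degrees(math.atan2(-y, -x)) % 360.0
    home_distance = math.hypot(x, y)
    return home_bearing, home_distance

# A two-leg outbound trip: 10 m at 0 degrees, then 10 m at 90 degrees.
bearing, distance = home_vector([(0.0, 10.0), (90.0, 10.0)])
```

The nest-ward shortcut falls out of simple vector addition, which is why path integration lets the ant return directly rather than retracing its outbound route.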
    The remarkable integrative behaviors of foraging ants in the field have frequently been used to understand the significance of complex cue computation. However, the mechanistic framework supporting these behaviors is yet to be fully understood, owing to the poorly elucidated neural circuitry of the ant brain. Although considerable similarity exists between the anatomy of the ant brain and that of the bee, functional and physiological experiments on live ants remain a technical challenge, especially when the aim is to establish ecologically relevant paradigms. Nevertheless, the last ten years have seen the development of genetic tools, including the use of CRISPR-Cas9 to generate transgenic ants [50][51][52]. These techniques aim to manipulate and genetically trace the olfactory centers while observing the consequent effects on social behaviors, thereby opening a new portal for in-depth exploration of the ant brain.

    4. Flies (Drosophilidae)

    The reputation of the vinegar fly D. melanogaster is formidable. The ease of genetic manipulation, amenability to different experimental approaches, and rapid, affordable upscaling make it one of the most indispensable basic research model systems worldwide. A century's worth of work has been done to understand various physiological and behavioral pathways of the insect brain, enabling scientists to use it as a template to draw countless parallels to more complex vertebrate systems. It is therefore no surprise that when multisensory behaviors began garnering attention, neurobiologists turned to the humble fly for answers [53].
    Decades of fly research have focused on discrete behaviors that arise as a consequence of sensory input, with special attention given to vision and chemosensation, while the basis of multimodal mechanisms is still a nascent field of study. The most obvious instances of multimodal interaction are reported in fly feeding behaviors. While food-derived odorants provide the main input for the tracking of a potential food source by the fly, several other important features such as taste, texture, color, temperature, and wind direction are also received and processed during the decision-making process. A synchronous addition of a mechanosensory and an olfactory cue to the taste stimulus enhances the proboscis extension response (PER) in flies and initiates feeding [54]. Wind directions indicating the odor plume trajectory and visual input via optic flow are vital for navigation in the wild [55]. Contributions from different sensory modalities are essential for behaviors such as egg laying, where picking a suitable substrate ensures the safety of the eggs and sustenance for the developing larvae. Alongside parasitoid-specific odor cues, D. melanogaster females also utilize visual cues to detect the presence of the wasp, which activates a signaling pathway to suppress egg laying [56]. Sensory integration also plays a major role in the communication of mating signals during the courtship ritual of animals [57]. A classic example is found in the male courtship behavior of the fly, where a combination of olfaction, gustation, and vision is required for the male fly not just to initiate courtship with a virgin female, but also to sustain its sex drive, carry the courtship to completion, and succeed in copulation [58].
It has also been shown that the presence of a food odor can increase the salience of the male-released pheromone, cis-vaccenyl acetate (cVA), thereby preventing males from making futile attempts to court a mated female on feeding sites [59][60].
    Combinatorial processing of sensory information is also seen in studies of larval locomotion, where synergistic activation of mechanosensory and nociceptive neurons increases the likelihood of rolling, an escape behavior exhibited by the larvae [61]. Larvae also perform integrative computations just before deciding to turn [62]. Based on CO2 levels, light intensity, and the presence of attractive odors, D. melanogaster larvae use head sweeps to scan spatial gradients in the environment and linearly combine olfactory and visual signals before executing a turn [63][64][65].
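The linear cue combination reported in these larval studies can be sketched as a toy decision rule: evidence sampled during each head sweep is weighted and summed, and the larva turns toward the side with the stronger combined drive. The weights and sample values below are purely hypothetical placeholders, not measured quantities from the cited work.

```python
def turn_drive(odor_gradient, light_gradient, w_odor=1.0, w_light=-0.5):
    """Linear combination of sensory evidence sampled during one head sweep.

    Odor is treated as attractive (positive weight) and light as aversive
    (negative weight). Both weights are illustrative placeholders.
    """
    return w_odor * odor_gradient + w_light * light_gradient

def choose_turn(left_sample, right_sample):
    """Compare combined drives from a left and a right head sweep and
    turn toward the side with the stronger drive."""
    return "left" if turn_drive(*left_sample) > turn_drive(*right_sample) else "right"

# Strong odor with a little light on the left vs. weak odor on the right:
# left drive = 1.0*0.8 - 0.5*0.1 = 0.75; right drive = 0.3, so turn left.
decision = choose_turn((0.8, 0.1), (0.3, 0.0))
```

Because the combination is linear, one modality can tip the decision only when the other provides weak or conflicting evidence, matching the idea of signals being summed rather than gated.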
    With the advent of tethered-flight paradigms, many research groups have examined the role of sensory integration during flight maneuvers [66]. Any behavior exhibited by a flying insect requires motor control with high spatio-temporal precision while multimodal navigational cues that change in real time are simultaneously processed. Wing-beat analyzers coupled to a flight simulator provide a conducive set-up for studying motor responses to varying stimuli. Although different sensory organs relay different information about the environment, the flight control system integrates input from the halteres (i.e., modified hind-wings essential for flight) and the optic lobes in such a manner that the motor response to the combined input is always greater than that elicited by either modality alone [67]. Such summation models have also been observed in the integration of visual and mechanosensory stimuli during a turn executed by a tethered fly [68]. Closed-loop tethered flight experiments also show that visual feedback can increase olfactory acuity by regulating odor-motor responses of the fly [69]. Conversely, the same system has been used to show modulation of visual salience by odor activity in a context-dependent manner [70]. An attractive odor, or the optogenetic representation of one, can reverse the aversive nature of a small object in the fly's visual field [71][72]. In the wild, such inter-modality regulation can narrow down search behaviors and greatly increase foraging efficiency. The ease of stimulus presentation in the tethered flight system, coupled with recent advancements in optogenetic neuronal control, provides a strong foundation for exploring the circuit dynamics of multi-sensorimotor responses in the fly model.
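The summation described for flight control can be captured in a minimal additive model, in which visual and haltere-derived rotation signals are scaled by separate gains and summed into a single motor command. The gains below are invented for illustration; they are not values reported in the cited studies.

```python
def turn_response(visual, mechanosensory, g_v=0.6, g_m=0.8):
    """Additive model of the turning response (e.g., the difference in
    left vs. right wing-beat amplitude) to visual and haltere-derived
    rotation signals. Gains g_v and g_m are hypothetical placeholders."""
    return g_v * visual + g_m * mechanosensory

# With same-sign inputs, the bimodal response exceeds either unimodal one,
# reproducing the summation property described in the text.
r_visual = turn_response(1.0, 0.0)
r_haltere = turn_response(0.0, 1.0)
r_both = turn_response(1.0, 1.0)
```

A purely additive model is the simplest account consistent with "combined response greater than either unimodal response"; nonlinear (e.g., superadditive) interactions would require extra terms.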
    The first instance of using more than one kind of stimulus in a learning paradigm also came from the tethered flight simulator. When two different types of visual stimuli, colors and patterns, are used as a CS to be associated with a heat stimulus, flies show robust compound learning, with equally strong associations being produced for each stimulus individually [73]. The study specifies that flies can acquire, store, and retrieve the two stimuli separately and also as a compound. A similar paradigm was then used to study cross-modal integration in flies, where instead of two stimuli of the same modality, the CS involved a combination of an odor and a visual pattern [74]. The study performed two key experiments that pointed to possible information transfer between the two sensory modalities. In the first experiment, the flies were trained to unimodally associate a heat reinforcement with an odor of low concentration and with a visual pattern stimulus (both elicited very low learning responses on their own). In this scenario, bimodal conditioning that paired the two stimuli together with the heat reinforcement produced stronger learning performances, both when retrieved as a compound memory and as individual components. This observation supports the principle of inverse effectiveness, whereby a weak memory reinforcement can be amplified by a cross-modal percept [75]. In the second experiment, a combination of both stimuli was presented simultaneously for "sensory preconditioning", after which each sensory modality was paired with the heat reinforcement individually. In the testing phase following such a preconditioned training paradigm, even retrieval with the non-reinforced stimulus produced a robust learning response, signifying a very strong cross-modal transfer of memory.
In line with the experiments done in honeybees, these observations were pivotal to understanding the bimodal information transfer that occurs during operant conditioning, especially in situations of sensory deficit, where a different source of input reinforcing the same consequence can greatly aid quick decision-making.
    Every sensory element that constitutes a context holds weight in how an experience is remembered and can be used to retrieve the memory at a later point in time. A recent study [76] utilized specific components of the "Tully T-maze", a two-choice learning paradigm [77], to illustrate this concept in D. melanogaster. This involved replicating specific aspects of the aversive training paradigm (except the US) in the testing phase, such as the color of the light, the temperature of the chamber, and the visual and mechanosensory input from the copper coil that conducts the electric shock. The study aimed to understand the substrates of aversive conditioning and the long-term memory (LTM) generated thereafter. The findings show that flies can perform context-based multimodal integration in response to an aversive learning experience. They use this information as a basis for forming long-lasting memories retrievable even after 14 days. When the context in which the reinforcement (here, an electric shock) was delivered was replicated during the testing phase, a significant long-term memory was evident even after a single training trial. In contrast to older studies, targeted blocking of protein consolidation did not impair this behavior in flies, indicating that such a context-dependent LTM (cLTM) does not require protein synthesis. However, when the copper grid or the visual context of reinforcement was removed from the testing phase, or when the perception of visual input was genetically inhibited, the cLTM was significantly abolished, implying the importance of vision in the retrieval process. The importance of an encoding context in enabling efficient memory retrieval is often described in psychology and observed in complex vertebrate models, including humans. However, extensive physiological and molecular work is required to pinpoint the neural substrates that relay information transfer between distinct sensory modalities.

    References

    1. Nicholas, S.; Supple, J.; Leibbrandt, R.; Gonzalez-Bellido, P.T.; Nordström, K. Integration of Small-and Wide-Field Visual Features in Target-Selective Descending Neurons of Both Predatory and Nonpredatory Dipterans. J. Neurosci. 2018, 38, 10725–10733.
    2. Kulahci, I.G.; Dornhaus, A.; Papaj, D.R. Multimodal Signals Enhance Decision Making in Foraging Bumble-Bees. Proc. R. Soc. B Biol. Sci. 2008, 275, 797–802.
    3. Giurfa, M. Visual Learning in Social Insects: From Simple Associations to Higher-Order Problem Solving BT-Sensory Perception: Mind and Matter; Barth, F.G., Giampieri-Deutsch, P., Klein, H.-D., Eds.; Springer: Vienna, Austria, 2012; pp. 109–133.
    4. Fleischer, J.; Pregitzer, P.; Breer, H.; Krieger, J. Access to the Odor World: Olfactory Receptors and Their Role for Signal Transduction in Insects. Cell. Mol. Life Sci. 2018, 75, 485–508.
    5. Masse, N.Y.; Turner, G.C.; Jefferis, G.S.X.E. Olfactory Information Processing in Drosophila. Curr. Biol. 2009, 19, R700–R713.
    6. Shanbhag, S.; Müller, B.; Steinbrecht, R. Atlas of Olfactory Organs of Drosophila melanogaster 1. Types, External Organization, Innervation. Int. J. Insect Morphol. Embryol. 1999, 28, 377–397.
    7. Avargués-Weber, A.; Deisig, N.; Giurfa, M. Visual Cognition in Social Insects. Annu. Rev. Entomol. 2011, 56, 423–443.
    8. Stowers, J.R.; Hofbauer, M.; Bastien, R.; Griessner, J.; Higgins, P.; Farooqui, S.; Fischer, R.M.; Nowikovsky, K.; Haubensak, W.; Couzin, I.D.; et al. Virtual Reality for Freely Moving Animals. Nat. Methods 2017, 14, 995–1002.
    9. Lafon, G.; Howard, S.R.; Paffhausen, B.H.; Avarguès-Weber, A.; Giurfa, M. Motion Cues from the Background Influence Associative Color Learning of Honey Bees in a Virtual-Reality Scenario. Sci. Rep. 2021, 11, 21127.
    10. Witthöft, W. Absolute Anzahl Und Verteilung Der Zellen Im Him Der Honigbiene. Z. Morphol. Tiere 1967, 61, 160–184.
    11. Menzel, R. The Honeybee as a Model for Understanding the Basis of Cognition. Nat. Rev. Neurosci. 2012, 13, 758–768.
    12. Sun, X.; Yue, S.; Mangan, M. How the Insect Central Complex Could Coordinate Multimodal Navigation. Elife 2021, 10, e73077.
    13. Kantsa, A.; Raguso, R.A.; Dyer, A.G.; Sgardelis, S.P.; Olesen, J.M.; Petanidou, T. Community-Wide Integration of Floral Colour and Scent in a Mediterranean Scrubland. Nat. Ecol. Evol. 2017, 1, 1502–1510.
    14. Kunze, J.; Gumbert, A. The Combined Effect of Color and Odor on Flower Choice Behavior of Bumble Bees in Flower Mimicry Systems. Behav. Ecol. 2001, 12, 447–456.
    15. Balamurali, G.S.; Rose, S.; Somanathan, H.; Kodandaramaiah, U. Complex Multi-Modal Sensory Integration and Context Specificity in Colour Preferences of a Pierid Butterfly. J. Exp. Biol. 2020, 223, jeb223271.
    16. Leonard, A.S.; Dornhaus, A.; Papaj, D.R. Flowers Help Bees Cope with Uncertainty: Signal Detection and the Function of Floral Complexity. J. Exp. Biol. 2011, 214, 113–121.
    17. Westerman, E.L.; Monteiro, A. Odour Influences Whether Females Learn to Prefer or to Avoid Wing Patterns of Male Butterflies. Anim. Behav. 2013, 86, 1139–1145.
    18. Leonard, A.S.; Masek, P. Multisensory Integration of Colors and Scents: Insights from Bees and Flowers. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 2014, 200, 463–474.
    19. Gerber, B.; Smith, B.H. Visual Modulation of Olfactory Learning in Honeybees. J. Exp. Biol. 1998, 201, 2213–2217.
    20. Couvillon, P.A.; Mateo, E.T.; Bitterman, M.E. Reward and Learning in Honeybees: Analysis of an Overshadowing Effect. Anim. Learn. Behav. 1996, 24, 19–27.
    21. Funayama, E.S.; Couvillon, P.A.; Bitterman, M.E. Compound Conditioning in Honeybees: Blocking Tests of the Independence Assumption. Anim. Learn. Behav. 1995, 23, 429–437.
    22. Mota, T.; Giurfa, M.; Sandoz, J.C. Color Modulates Olfactory Learning in Honeybees by an Occasion-Setting Mechanism. Learn. Mem. 2011, 18, 144–155.
    23. Zhang, L.Z.; Zhang, S.W.; Wang, Z.L.; Yan, W.Y.; Zeng, Z.J. Cross-Modal Interaction between Visual and Olfactory Learning in Apis cerana. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 2014, 200, 899–909.
    24. Becker, M.C.; Rössler, W.; Strube-Bloss, M.F. UV Light Perception Is Modulated by the Odour Element of an Olfactory–Visual Compound in Restrained Honeybees. J. Exp. Biol. 2019, 222, jeb201483.
    25. Devaud, J.M.; Papouin, T.; Carcaud, J.; Sandoz, J.C.; Grönewald, B.; Giurfa, M. Neural Substrate for Higher-Order Learning in an Insect: Mushroom Bodies Are Necessary for Configural Discriminations. Proc. Natl. Acad. Sci. USA 2015, 112, E5854–E5862.
    26. Mansur, B.E.; Rodrigues, J.R.V.; Mota, T. Bimodal Patterning Discrimination in Harnessed Honey Bees. Front. Psychol. 2018, 9, 1529.
    27. Zhang, S.W.; Lehrer, M.; Srinivasan, M.V. Honeybee Memory: Navigation by Associative Grouping and Recall of Visual Stimuli. Neurobiol. Learn. Mem. 1999, 72, 180–201.
    28. Srinivasan, M.V.; Zhang, S.W.; Zhu, H. Honeybees Link Sights to Smells. Nature 1998, 396, 637–638.
    29. Reinhard, J.; Srinivasan, M.V.; Guez, D.; Zhang, S.W. Floral Scents Induce Recall of Navigational and Visual Memories in Honeybees. J. Exp. Biol. 2004, 207, 4371–4381.
    30. Giurfa, M. Learning and Cognition in Insects. Wiley Interdiscip. Rev. Cogn. Sci. 2015, 6, 383–395.
    31. Plath, J.A.; Entler, B.V.; Kirkerud, N.H.; Schlegel, U.; Galizia, C.G.; Barron, A.B. Different Roles for Honey Bee Mushroom Bodies and Central Complex in Visual Learning of Colored Lights in an Aversive Conditioning Assay. Front. Behav. Neurosci. 2017, 11, 98.
    32. Freas, C.A.; Schultheiss, P. How to Navigate in Different Environments and Situations: Lessons from Ants. Front. Psychol. 2018, 9, 841.
    33. Schwarz, S.; Wystrach, A.; Cheng, K. Ants’ Navigation in an Unfamiliar Environment Is Influenced by Their Experience of a Familiar Route. Sci. Rep. 2017, 7, 14161.
    34. Wystrach, A.; Mangan, M.; Webb, B. Optimal Cue Integration in Ants. Proc. R. Soc. B Biol. Sci. 2015, 282, 20151484.
    35. Zeil, J.; Narendra, A.; Stürzl, W. Looking and Homing: How Displaced Ants Decide Where to Go. Philos. Trans. R. Soc. B Biol. Sci. 2014, 369, 20130034.
    36. Collett, T.S.; Collett, M. Path Integration in Insects. Curr. Opin. Neurobiol. 2000, 10, 757–762.
    37. Andel, D.; Wehner, R. Path Integration in Desert Ants, Cataglyphis: How to Make a Homing Ant Run Away from Home. Proc. R. Soc. B Biol. Sci. 2004, 271, 1485–1489.
    38. Lent, D.D.; Graham, P.; Collett, T.S. Visual Scene Perception in Navigating Wood Ants. Curr. Biol. 2013, 23, 684–690.
    39. Buehlmann, C.; Woodgate, J.L.; Collett, T.S. On the Encoding of Panoramic Visual Scenes in Navigating Wood Ants. Curr. Biol. 2016, 26, 2022–2027.
    40. Schmid-Hempel, P.; Schmid-Hempel, R. Life Duration and Turnover of Foragers in the Ant Cataglyphis bicolor (Hymenoptera, Formicidae). Insectes Soc. 1984, 31, 345–360.
    41. Narendra, A.; Si, A.; Sulikowski, D.; Cheng, K. Learning, Retention and Coding of Nest-Associated Visual Cues by the Australian Desert Ant, Melophorus bagoti. Behav. Ecol. Sociobiol. 2007, 61, 1543–1553.
    42. Bisch-Knaden, S.; Wehner, R. Landmark Memories Are More Robust When Acquired at the Nest Site than En Route: Experiments in Desert Ants. Naturwissenschaften 2003, 90, 127–130.
    43. Shams, L.; Seitz, A.R. Benefits of Multisensory Learning. Trends Cogn. Sci. 2008, 12, 411–417.
    44. Buehlmann, C.; Graham, P.; Hansson, B.S.; Knaden, M. Desert Ants Locate Food by Combining High Sensitivity to Food Odors with Extensive Crosswind Runs. Curr. Biol. 2014, 24, 960–964.
    45. Wolf, H.; Wehner, R.; Institut, Z.; Zürich, U.; Zürich, C. Pinpointing Food Sources: Olfactory And Anemotactic Orientation In Desert Ants, Cataglyphis fortis. J. Exp. Biol. 2000, 868, 857–868.
    46. Collett, M.; Cardé, R.T. Navigation: Many Senses Make Efficient Foraging Paths. Curr. Biol. 2014, 24, 362–364.
      Sachse, S.; Thiagarajan, D. Bimodal Processing and Learning in Insect Models. Encyclopedia. Available online: https://encyclopedia.pub/entry/23615 (accessed on 06 February 2023).