A growing number of studies have highlighted the sensory interactions involved in the musical experience, such as relationships between music and dimensions of taste, olfaction, and visual qualities, including associations between pitch and the size of images or objects, spatial location and frequency, and instrumental timbres and visual shapes. These studies share the premise that the way we relate to the musical phenomenon, whether in the processes of production, perception, or understanding, emerges from an integrated and intrinsically multisensory perceptual event. Nevertheless, because music is part of everyday life and because this experience is inherently subjective, such interactions tend to occur so naturally and seem so obvious that they have been relegated to common sense. On the other hand, evidence indicates that sensory interactions constitute a fundamental ancestral mechanism for cognitive and neuronal development governed by non-arbitrary tendencies, multiple variables, and patterns of predictability. The novel contribution of this review is to advance a dynamic theoretical model of multisensory musical experience that takes crossmodal correspondences as its central organising axis, articulated through three structuring principles (universality, congruence effect, hierarchical tendency) and their interaction with musical organisation, cognitive structure, and the sensory systems mobilised by music. A future research agenda is also proposed to broaden and deepen investigations in the field of music psychology and human development.
As an experience involving different sensory mechanisms, music has been recognised for its intrinsically multisensory nature
[1]. In fact, from the manipulation of sound through musical instruments to the reading of signs and notation to physical involvement, whether creating, playing, or listening, the musical phenomenon activates and interacts with different perceptual senses, such as auditory, tactile, visual, and kinesthetic, as well as emotional and cognitive mechanisms.
Although seemingly obvious, a growing body of research shows that sensory interactions involving the musical phenomenon are governed by non-arbitrary tendencies, as in the case of the relationships between pitch and the size of images and objects
[2,3,4,5,6,7], spatial location and frequency
[8], brightness and visual forms
[9], in addition to instrumental timbres and visual forms
[10]. Other studies have reported connections between music and colours
[11,12,13], icons and figures
[14,15,16], and paintings
[17,18,19]. In addition, relationships between music and taste
[20,21,22,23,24,25] and between music and the sense of smell
[26] have been reported. These studies share the premise that the way we relate to the musical phenomenon, whether in its production, perception, or understanding, emerges from an integrated perceptual event. There is evidence that combining information across sensory systems constitutes a fundamental ancestral mechanism for cognitive and neural development
[27], manifesting itself in different ways among individuals. Consequently, interactions between sensory domains are believed to be central features of the human mind
[28]. In seeking to understand the foundations that underpin this integration, this review draws on philosophical assumptions and empirical evidence to propose a theoretical model of the multisensory nature of music, while also offering an agenda for future research in this area.