2. The Social Brain and Social Cognitive Neuroscience
Despite the clear anthropological and evolutionary connection between social cognition, social behavior, and the social brain described above, this framework has not yet been fully integrated into our current understanding of social cognitive neuroscience. Indeed, the neural computations that underpin group living in primates are substantial, supporting activities such as coalition formation, tactical deception, the organization of grooming cliques, social play, and social learning
[1]. In humans, a complex network of brain regions underlies important social activities, including the recognition and cognitive processing of social signals, recognizing faces, evaluating mental states (i.e., mentalizing or theory of mind), perceiving emotions, sharing attention, determining friends from foes, evaluating others’ perceptions and beliefs, social learning, relationship formation, and social bonding
[1][4][9].
In a preliminary, noteworthy model of the social brain in the 1990s, neuroscientist Leslie Brothers [10] highlighted the contributions of the amygdala, orbitofrontal cortex (OFC), superior temporal sulcus (STS), and fusiform gyrus (FFG) to social information processing. More recently, functional magnetic resonance imaging (fMRI) has revealed an interconnected network of regions linking the parietal and temporal lobes to the prefrontal cortex [4][11].
Broadly speaking, the OFC is implicated in social reinforcement and social reward processing [4][12]. More specifically, the STS region, particularly the right-hemisphere posterior STS (pSTS) area, processes biological motion signals, such as hand, eye, and salient body motions, to predict and interpret the intentions and behaviors of other agents [4][12]. In addition to this area, the right inferior temporal gyrus, fusiform gyrus, right parietal lobule, and middle temporal gyrus in each hemisphere are differentially activated when processing the direction of gaze [4][12].
Moreover, the default mode network (DMN)—comprising the dorsal medial prefrontal cortex (mPFC), posterior cingulate cortex, precuneus, angular gyrus, and, occasionally, the right temporoparietal junction (rTPJ)—is known for its activation when an individual is unfocused on the external world and the brain is at conscious rest, yet it further appears to be active when an individual is thinking about the self, the past and future, and, most intriguingly, when evaluating the mental states of other people (i.e., mentalizing or theory of mind) [7]. Further, much work now reveals that the social brain hypothesis explains not only variation in brain volume across primate species, but also individual differences in brain volume among humans, with regard to several different features of human social networking and social cognition. In particular, gray matter volume in the OFC, anterior cingulate cortex (ACC), ventromedial prefrontal cortex (vmPFC), amygdala, and STS is associated with individual differences in higher-order intentionality capacity (i.e., advanced mentalizing or theory of mind) and social network size [8][13][14].
Lastly, recent studies of ‘mirror neurons’—neurons that activate both when an organism performs an action and when the same organism observes that action performed by another—have postulated that these cells are integral to mentalizing or theory of mind, language, empathy, comprehending the intentions and acts of other agents, and imitative learning [10][11][12]. In other words, studies of this mirror system, especially in adult monkeys, suggest that observing an action and producing the same action oneself are neurally equivalent, and, at least in monkeys, this capacity appears to occupy a role in social comprehension and imitation [10][11][12]. Though mirror neurons have been observed directly in non-human primates—most notably, in macaques—in humans, only brain activity consistent with mirror neurons has been found, in the primary somatosensory cortex, inferior and superior parietal lobules, inferior frontal cortex, premotor cortex, and supplementary motor area [15].
3. The Social Brain and Cognitive Neuroscience of Language
In a similar fashion, despite the clear anthropological and evolutionary connection between the social brain and social communication, as described above, this framework has not yet been fully integrated into our current understanding of the cognitive neuroscience of human language
[16][17]. Perhaps most critically, a complex neurological system of communication—one that regulates interactions and social bonding with important members of the group—appears to be crucial for many non-human primates, as well as for human social relationships
In humans, a complex network of brain regions underlies the processing of language, from speech comprehension and production to substantive integration with the social brain (e.g., social-semantic working memory), encompassing regulation from the level of neural networks down to that of neurotransmitters, including social neurotransmitters such as oxytocin, endorphins, and dopamine
[17][18][19][20][21][22][23].
In an influential and noteworthy model of the cognitive neuroscience of language, Pierre Paul Broca determined in 1861 that language processing areas are located primarily in the left cerebral hemisphere of the brain
[24]. In later years, much research, including neuroanatomical analyses by Geschwind and Galaburda, further suggested left hemisphere dominance in brain areas dedicated to language
[25][26], with more heavily myelinated axons and larger pyramidal neurons in the left hemisphere allowing for more rapid and efficient processing of linguistic information
[27][28].
Nonetheless, more recent work has shown that subcortical structures, including the putamen, caudate nucleus, and internal capsule, also appear to play roles in language processing [29], while very young children additionally show significant activity in the inferior frontal and superior temporal regions of the right cerebral hemisphere—homologs of the traditional left-hemisphere language areas—with an activation profile in the right hemisphere that appears to diminish with age [30]. Intriguingly, homologous brain regions of Broca’s area and Wernicke’s area have also been discovered in the brains of social, group-adapted, nonhuman primates, strongly suggesting a shared evolutionary or phylogenetic history [31][32]. Though their function in nonhuman primates is poorly understood, an evolutionary perspective would suggest that they are probably central to nonhuman primate vocalization processing, in ways similar to human language processing [33][34][35][36][37][38][39].
4. The Social Brain and First Language Acquisition
4.1. Social Signals That Facilitate Early Language Acquisition
Social interaction skills crucially aid the early acquisition of language; these include play, reading, reference or joint attention between an infant or child and a parent or guardian toward an outward object, and the face-to-face interactions involved in speaking in natural language environments (see
Figure 1;
[20][40][41][42][43][44][45]). In particular, infant-directed speech (IDS) and child-directed speech (CDS), or the face-to-face communication cues between an infant or child and parent or guardian, aid language acquisition by delivering relevant social signals (e.g., gestures, facial and emotional expressions, and directed eye-gaze), provoking infant attention, and emphasizing important pragmatic signals. Crucially, social interaction appears to impact the development of both speech perception and comprehension
[29],
[30][31][32].
Figure 1. Social interaction skills, including play, reading, joint attention, and the face-to-face interactions involved in infant-directed speech (IDS) in natural language environments crucially aid the early acquisition of language. IDS aids language acquisition by providing relevant social signals (e.g., gestures, facial and emotional expressions, and directed eye gaze) that provoke infant attention and emphasize important pragmatic signals. Adapted image from the public domain.
Several important developments accompany an infant’s and parent’s capacity to understand reference and joint attention toward an outward thing or object [46]. By 9 months of age, youngsters start to participate in individual–object–individual triadic activities in which interest is devoted to objects using gaze, which provokes attention from other individuals, also known as joint attention [47][48]. This shared perception of communicative intentions is likely to be critical for the infant’s learning of language [47][48][49], as well as for understanding others as intentional agents [47][50].
In addition, the quality and quantity of speech stimuli (e.g., vocabulary diversity, number of word units, and mean length of utterance (MLU)) are further associated with infant vocabulary growth [51][52][53]. Unfortunately, while most language acquisition research has been conducted on families of high socioeconomic status (SES), the multiple challenges faced by families in poorer communities can affect caregiver interactions, leading to greater variability in language abilities [54][55]; although, see [56] for a recent alternative perspective.
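MLU, one of the quality measures mentioned above, is conventionally computed as the average number of morphemes per utterance; a minimal sketch of the computation, using a word-based approximation (a common rough proxy for morpheme counts) and a purely hypothetical transcript sample:

```python
# Word-based approximation of mean length of utterance (MLU).
# Clinical MLU is counted in morphemes; words are a common rough proxy.

def mlu_words(utterances):
    """Mean length of utterance, in words, across a transcript sample."""
    if not utterances:
        raise ValueError("need at least one utterance")
    word_counts = [len(u.split()) for u in utterances]
    return sum(word_counts) / len(word_counts)

# Hypothetical transcript sample from a caregiver-child interaction.
sample = ["look at the big dog", "dog", "where did it go", "it went home"]
print(round(mlu_words(sample), 2))  # 13 words over 4 utterances -> 3.25
```

In morpheme-based clinical practice, inflections such as plural "-s" or past-tense "-ed" would each add to the count, so a word-based figure systematically underestimates true MLU.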
4.2. Infant-/Child-Directed Speech and Face-to-Face Communication
Infant-directed speech (IDS) and child-directed speech (CDS) are intrinsically multimodal, and many nonverbal social signals are present during this sort of communication (e.g., gestures, facial and emotional expressions, and directed eye gaze; see
Figure 1). Previous research has shown that directed eye gaze is a key form of nonverbal communication, as it facilitates language acquisition in several regards, including language processing, development of vocabulary, and perceptual mapping of form-to-object
[33][57][35]. For instance, gaze following and directed eye contact provoke arousal and attention by emphasizing important social stimuli and facilitating the infant’s or child’s social engagement
[36][37].
Directed gaze as a tool for language learning is typically distinguished by an early developmental trajectory whereby infants display a proclivity for open eyes on upright faces, involving the specialization of cortical areas associated with gaze processing [58][59][60][61][62]. Infants develop the capacity for gaze following beginning at 3–4 months of age, and it becomes a consistent communication signal from 6–8 months of age [63][61]. However, it is not until 9–12 months that directed gaze begins to become an important tool for indicating reference, facilitating language acquisition by providing directed eye-gaze signaling [64][65][66][67][68].
Quality and quantity of speech during interactions are also significant factors in language learning, especially the growth of vocabulary [38][69][46][70]. For instance, studies have shown that the quantity of child-directed speech at 18 months predicts vocabulary growth at 2 years [71]. Parental engagement, namely vocal reactions to infant vocalizations with either words or vowels, rapidly influences infant vocal productions, as infants start to assimilate the phonological sound patterns spoken by the parent, facilitating the acquisition of new vocalizations [72].
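Quantity-predicts-vocabulary relationships of this kind are typically assessed with simple correlational statistics; a minimal sketch, using a hand-rolled Pearson correlation and purely synthetic, illustrative numbers (not data from any cited study):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical counts: child-directed words per hour heard at 18 months
# vs. productive vocabulary size at 24 months (illustrative values only).
cds_words = [400, 750, 1200, 1500, 2100]
vocab_24m = [80, 140, 260, 300, 420]
print(round(pearson_r(cds_words, vocab_24m), 2))  # strong positive correlation
```

Published work in this area typically uses regression models with covariates (e.g., SES, maternal education) rather than a raw correlation, but the underlying association being tested is the same.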
Child–parent social interactions are further affected by a number of environmental factors, such as socioeconomic status (SES). More specifically, SES affects both the quantity and quality of parental speech stimuli [38][73]; for instance, children from low-SES families tend to display slower real-time efficiency of linguistic processing and slower subsequent growth of vocabulary [74].
5. The Social Brain, Cognitive Neuroscience of Language, and First Language Acquisition
In light of the significance of child–parent interactions for acquiring language, neuroimaging research has recently begun to investigate how this sort of communication may impact the developing brain. As previously discussed, studies have found that, during early maturation, the
quantity of linguistic information, as calculated by infant exposure to the number of adult words, is strongly predictive of myelination in white matter association tracts related to adult language abilities—especially the left arcuate fasciculus (AF) and superior longitudinal fasciculus (SLF) in younger children at 30 months of age—as well as of youngsters’ developing linguistic abilities
[75]. On the other hand, the
quality of linguistic information—word richness, dialogue experience, and mean length of utterance (MLU)—appears to be more crucial for older youth 4–6 years of age, who show greater white matter connectivity involving left AF and SLF
[76][77] and greater cortical volume in the left inferior frontal gyrus (IFG) and supramarginal gyri
[78], as well as older children 5–9 years of age, who show increased cortical surface area in left perisylvian regions
[47]. Additional social cognitive neuroscience studies have revealed that the neural circuits underpinning the discrimination of a mother’s voice (an important component of social bonding, often associated with ‘motherese’)—including voice-perception and auditory areas of the temporal lobe; reward-circuit areas in the orbitofrontal cortex (OFC), nucleus accumbens (NAc), and ventromedial prefrontal cortex (vmPFC); affective processing areas, especially the amygdala; and areas related to visual face processing, especially the fusiform cortex—predict the communication and linguistic capacities of older youth at 7–12 years of age
[19].
That said, functional magnetic resonance imaging (fMRI) work on social interactions in youngsters has, until recently, primarily centered on brain activation in infants or children in reaction to a one-way social signal. However, a newly utilized technique, known as ‘hyperscanning’, allows for the concurrent collection of brain activation data from multiple individuals taking part in a social interaction at the same time [79]. More specifically, real-time social interactions between a child and parent can be correlated with the temporal alignment of their brainwaves during such interactions. In particular, as a consequence of non-verbal and verbal signaling during social interaction, neural synchronization can occur [80][81]. Further, in at least one recent neuroimaging study of a live two-way social interaction involving differences in speech prosody, eye gaze, and joint attention between adults and infants 9–15 months of age, distinctive paired activation occurred in infant and adult brains as a function of the social importance of these signals.
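The kind of inter-brain synchrony computation that hyperscanning analyses perform can be sketched as a sliding-window correlation between two simultaneously recorded signals. The sketch below uses purely synthetic sinusoids and a Pearson-correlation measure; real pipelines typically work with EEG or fNIRS data and measures such as coherence or phase-locking value, and the window length here is arbitrary:

```python
import math

def windowed_sync(sig_a, sig_b, window):
    """Sliding-window Pearson correlation between two signals, a simple
    stand-in for the inter-brain synchrony measures used in hyperscanning."""
    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / math.sqrt(vx * vy)
    return [pearson(sig_a[i:i + window], sig_b[i:i + window])
            for i in range(0, len(sig_a) - window + 1, window)]

# Two synthetic oscillations: in phase for the first half of the session,
# anti-phase for the second, mimicking synchrony rising and then breaking.
t = [i / 10 for i in range(200)]
child = [math.sin(2 * math.pi * x) for x in t]
adult = [math.sin(2 * math.pi * x) if x < 10 else -math.sin(2 * math.pi * x)
         for x in t]
sync = windowed_sync(child, adult, window=50)
print([round(s, 2) for s in sync])  # -> [1.0, 1.0, -1.0, -1.0]
```

The output traces synchrony over time: near +1 while the signals align, near -1 once they diverge, which is the quantity child–parent hyperscanning studies relate to interactive behavior.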
6. The Social Brain and Second Language Acquisition
A relatively more recent body of work has further explored the influence of bilingualism on mentalizing or perspective-taking, as well as on empathy, in young children. For instance, at least one recent study found bilingual-speaking youngsters to be more accurate than monolingual speakers in a task that required analyzing an observer’s perspective from different positions
[48]. Moreover, a recent meta-analysis appears to indicate these general findings are robust
[49]. Though it is not fully understood how bilingualism provides this advantage, it has been suggested that bilingualism perhaps allows for additional opportunities to develop executive function, metalinguistic comprehension, and improved sensitivity to the nuances of typical sociolinguistic interactions
[48][49].
7. The Social Brain, Developmental Dysfunctions, and Psychopathologies
Over the last few decades, increasing numbers of psychologists and neuroscientists have come to understand that many psychopathologies and developmental disorders can be largely attributed to dysfunctions of the evolved social brain
[50]. In the majority of cases, such dysfunctions typically involve substantive deficiencies in social cognition, social communication, and linguistic abilities. In particular, autism spectrum disorder (ASD) is a heterogeneous disruption of social cognition, generally entailing various social deficits, such as dysfunctions in social communication (e.g., atypical facial expressions and vocal tone), social interactions (e.g., joint attention, eye gaze, and gesture), imitation and social norms, mentalizing, empathy, non-literal language (e.g., sarcasm and jokes), handling unfamiliar situations, imagination (e.g., make-believe or play), and planning for or predicting future events
[82]. Intriguingly, given the profound neurogenetic and neurodevelopmental causes, as well as the serious dysfunctions in social cognition, that define ASD, the disorder presents an opportunity for neuroscientists, anthropologists, and psychologists to investigate the biological genesis of the social cognition and social behaviors inherent to human nature.
8. The Social Brain and Autism Spectrum Disorder
Autism spectrum disorder (ASD) is a neurodevelopmental dysfunction characterized by chronic deficits in social interaction, non-verbal and verbal communication, and social cognition, including deficits in mentalizing or the ability to understand the mental states of another individual
[52][53]. Intriguingly, the complex interrelated genetic, social, and neurodevelopmental pathways and deficits found in ASD present perhaps one of the clearest and most compelling connections between the social brain, language function, social cognition, and social bonding
[12]. As the name suggests, autism is situated on a spectrum: some individuals’ verbal capacities fall within the typical range of abilities, while others never learn to speak
[54]. Interestingly, in those with adequate language and cognitive capacities, such as individuals with Asperger’s syndrome and high-functioning autism (HFA), it is specifically the social communicative capacities that ostensibly remain impaired. In other words, communication is typically unidirectional and used instrumentally and non-socially rather than for socially related functions
[55]. Neurological studies of cortical development in language-related areas of the frontal and temporal lobes have further correlated structural differences with linguistic impairments in ASD, including a reversal of the typical asymmetry of the frontal lobes
[83][64][58], superior and anterior shifting of the left-hemisphere superior temporal sulcus and inferior frontal sulcus
[84], bilateral decreases in gray matter volume in the superior temporal sulcus
[85], and apparently overall reduced left-hemispheric dominance. Intriguingly, though it is challenging to disentangle the respective contributions of social cognition deficits to linguistic deficits in autism, several recent studies in both autistic and neurotypical adults and children appear to suggest that mentalizing, which is impaired in autistic individuals, may be integral for the cognitive and linguistic ability to build subordinate and recursive embedded clauses (e.g., “Mary thinks that Sandra believes the broom is in the closet”) (see
Figure 2;
[86][63][60]), suggesting another direct link between social cognition and language ability.
Figure 2. Several studies of both autistic and neurotypical adults and children appear to suggest that higher-order mentalizing (i.e., inferring the mental states of more than one individual) may be important for the syntactic ability to build subordinate and recursive embedded clauses (e.g., “Mary thinks that Sandra believes the broom is in the closet”), suggesting a direct link between social cognition and language ability. Adapted image from the public domain.
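The recursion at issue here can be made concrete with a short sketch that wraps a base proposition in one mental-state clause per attributed agent, so that each additional agent adds one order of intentionality. The function name and the example names/verbs are illustrative, not from any cited study:

```python
# Illustrative generator of recursively embedded mental-state clauses,
# mirroring the "Mary thinks that Sandra believes ..." examples used to
# probe the link between mentalizing depth and recursive syntax.

def embed_clause(agents, verbs, base):
    """Wrap a base proposition in one mental-state clause per agent,
    innermost agent last, so embedding depth equals len(agents)."""
    clause = base
    for agent, verb in zip(reversed(agents), reversed(verbs)):
        clause = f"{agent} {verb} that {clause}"
    return clause

# Second-order intentionality: two embedded mental states.
print(embed_clause(["Mary", "Sandra"], ["thinks", "believes"],
                   "the broom is in the closet"))
# -> Mary thinks that Sandra believes that the broom is in the closet
```

Each extra agent adds one subordinate clause, which is why tasks probing higher orders of intentionality and tasks probing recursive syntax can be matched in structure.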
9. Early Biomarkers of Language-Related Abilities and Relevant Clinical Applications
Describing the early development of neurotypical and neuroatypical language neurobiology is critical for the early identification and potential treatment of clinical language disorders. Crucially, delays in language and speech in infants and children can negatively affect important social and academic skills such as attention, reading, writing, social interactions, and, of course, later educational outcomes
[61]. For instance, delays in language acquisition from 2–5 years of age are implicated in substandard reading comprehension in the classroom
[62][65]. If such language delays persist after 5 years of age, related challenges often follow in the subsequent maturation of attention, directed eye gaze, and socialization
[61][66]. The majority of language delays are noticed during parental observations or clinical check-ups, when an important developmental landmark, such as the onset of speech, fails to appear on time or syntactic challenges emerge. As a consequence of this rather crude ‘sit-and-wait’ approach, most youngsters are unfortunately not identified as having a language delay or disorder until 2–3 years of age, often noted by the absence of combinatorial speech, or the capacity to combine words into complete thoughts and sentences
[61][87].
An alternative approach emphasizes identifying early indications, or biomarkers, of ultimate language capacities early enough in development that clinical interventions for speech and language delays and disorders can provide the greatest benefits. Perhaps surprisingly, there are currently no standardized or universally agreed-upon criteria for screening for language and speech deficiencies.
As might be expected, the diagnosis of language delays and disorders is usually grounded in the comparable maturational landmarks observed in neurotypical language learning [88]. Children with language delays typically adhere to a normal maturational trajectory, albeit at slower rates than would be expected [89], whereas children with language disorders tend to display regressions in language development (e.g., word loss from 14–21 months of age in ASD), serious and persistent delays in language learning (e.g., challenges with syntax in youngsters with specific language impairment (SLI)), or impairments in at least two domains of development (e.g., motor function and language impairments in global developmental delay (GDD)) [88][90][91]. As a general rule of thumb, language delays typically require clinical intervention when the rate of development drops beneath 3/4 of the rate expected, for example, when a standard developmental landmark typically observed at 2 years of age has still not been met at 30 months of age [92].
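One common way to operationalize this rule of thumb is a developmental quotient: the age at which the missed milestone is typically reached, divided by the child's chronological age. A minimal sketch, assuming the 3/4 cut-off above (the function name is illustrative, and exact clinical operationalizations of the rate vary):

```python
def needs_intervention(milestone_age_months, chronological_age_months,
                       threshold=0.75):
    """Flag a language delay under the rule of thumb that intervention is
    warranted when the developmental quotient (typical age of the missed
    milestone / current age) falls below ~3/4 of the expected pace."""
    rate = milestone_age_months / chronological_age_months
    return rate < threshold

# A milestone typical at 24 months still unmet at 33 months: 24/33 = 0.73,
# below the 3/4 threshold, so the sketch flags it for intervention.
print(needs_intervention(24, 33))  # -> True
```

As the quotient form makes clear, the longer a given milestone remains unmet, the further the ratio falls below 1, so borderline cases near the threshold call for clinical judgment rather than a mechanical cut-off.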
Nonetheless, speech and linguistic interventions should arguably begin even earlier in development. In fact, speech processing already begins in utero, even though the more observable first 24 months are distinguished by more obvious mappings of form-to-meaning at 5–7 months of age and proficiency at distinguishing native sounds from 6–12 months of age [93].