Tracking Eye Movements as a Window on Language Processing

This entry overviews the pioneering experimental studies that exploited eye movement data to investigate language processing in real time. After examining how vision and language were found to be closely related, the discussion focuses on the evolution of eye-tracking methodologies for investigating children’s language development. To conclude, the entry provides some insights into the use of eye-tracking technology for research purposes, focusing on data collection and data analysis.

  • visual world
  • eye-tracking
  • language processing
  • language acquisition
Until the 1970s, experimental studies on linguistic competence and processing relied exclusively on offline measures of comprehension. In classical psycholinguistic paradigms such as lexical decision [1] or sentence–picture verification tasks [2][3], participants are asked to evaluate the truthfulness of the linguistic input provided, either against pictures or against their word knowledge. In these paradigms, sentence comprehension is assessed by measuring participants’ response latencies and accuracy in expressing metalinguistic evaluations after being presented with the linguistic stimulus. However, while response choices and reaction times are behavioral measures that provide information on linguistic comprehension, such tasks do not tap into the real-time processing of spoken language and, as a consequence, reveal less about the speaker’s efficiency and knowledge.
Another paradigm is the visual world paradigm, an experimental methodology that records participants’ eye movements during listening tasks. Unlike long-established psycholinguistic paradigms, eye movement data provide rich information on the time course of language comprehension, as well as relevant insights into how visual and linguistic sources of information interact in real time. In a typical visual world set-up, participants are instructed to listen to sentences carefully and to look wherever they want on the screen, or to interact with objects or screen-based pictures (e.g., by moving them). The simplicity of such a set-up makes task execution extremely effortless, as it relies on the human tendency to look at relevant parts of the visual scene as critical words are mentioned. In fact, participants are not asked to do anything different from what they do in everyday life, when they automatically integrate visual or written information with spoken input (e.g., while listening to the news on TV). The unchallenging nature of visual world studies makes this experimental paradigm extremely suitable for investigating language comprehension in populations with language disorders such as aphasia [4][5] or developmental dyslexia [6][7][8][9][10], as well as in infants and young children [11][12][13].
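In visual world analyses, the standard dependent measure is the proportion of looks to the target picture over time, aggregated into time bins relative to the onset of the critical word. The sketch below illustrates this computation on toy data; the sample values, area-of-interest labels, and 200 ms bin size are illustrative assumptions, not part of any particular study's protocol.

```python
# Sketch: computing the proportion of looks to a target picture over time,
# the typical dependent measure in visual world analyses.
# Sample data, AOI labels, and bin size are illustrative assumptions.

from collections import defaultdict

# Each sample: (time in ms relative to target-word onset, fixated area of interest)
samples = [
    (0, "distractor"), (50, "distractor"), (100, "target"),
    (150, "target"), (200, "target"), (250, "distractor"),
    (300, "target"), (350, "target"),
]

def target_proportions(samples, bin_ms=200):
    """Return {bin_start_ms: proportion of samples fixating the target AOI}."""
    counts = defaultdict(lambda: [0, 0])  # bin start -> [target looks, total looks]
    for t, aoi in samples:
        b = (t // bin_ms) * bin_ms
        counts[b][1] += 1
        if aoi == "target":
            counts[b][0] += 1
    return {b: tgt / total for b, (tgt, total) in sorted(counts.items())}

print(target_proportions(samples))  # e.g. {0: 0.5, 200: 0.75}
```

A rise in the target proportion shortly after word onset is taken as evidence that listeners incrementally map the unfolding speech signal onto the visual scene.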
This entry offers a detailed overview of how the visual world paradigm can be used efficiently to assess linguistic comprehension in children. Section 2.1 reviews the pioneering eye-tracking studies that established the visual world paradigm in psycholinguistic research. Section 2.2 illustrates the main experimental procedures typically used in the visual world paradigm with both adult and child participants. The remainder of the entry is devoted to the different eye-tracking methodologies that exploit the relation between language and vision to study online language processing in infants and children, namely the Preferential-Looking Paradigm (Section 3.1) and the Looking-While-Listening Task (Section 3.2). Specific limitations and advantages of these tasks are also discussed. In conclusion, the entry gives some details about eye-tracking technology, focusing on data collection and data analysis, and discusses some methodological limitations (Section 3.3).


  1. Meyer, D.E.; Schvaneveldt, R.W. Facilitation in recognizing pairs of words: Evidence of a dependence between retrieval operations. J. Exp. Psychol. 1971, 90, 227–234.
  2. Wason, P.C. Response to Affirmative and Negative Binary Statements. Br. J. Psychol. 1961, 52, 133–142.
  3. Carpenter, P.A.; Just, M.A. Sentence comprehension: A psycholinguistic processing model of verification. Psychol. Rev. 1975, 82, 45–73.
  4. Dickey, M.W.; Choy, J.J.; Thompson, C.K. Real-time comprehension of wh- movement in aphasia: Evidence from eyetracking while listening. Brain Lang. 2007, 100, 1–22.
  5. Yee, E.; Blumstein, S.E.; Sedivy, J.C. Lexical-Semantic Activation in Broca’s and Wernicke’s Aphasia: Evidence from Eye Movements. J. Cogn. Neurosci. 2008, 20, 592–612.
  6. Rayner, K. Eye movements in reading and information processing: 20 years of research. Psychol. Bull. 1998, 124, 372–422.
  7. De Luca, M.; Di Pace, E.; Judica, A.; Spinelli, D.; Zoccolotti, P. Eye movement patterns in linguistic and non-linguistic tasks in developmental surface dyslexia. Neuropsychologia 1999, 37, 1407–1420.
  8. Desroches, A.S.; Joanisse, M.F.; Robertson, E.K. Specific phonological impairments in dyslexia revealed by eyetracking. Cognition 2006, 100, B32–B42.
  9. Huettig, F.; Brouwer, S. Delayed Anticipatory Spoken Language Processing in Adults with Dyslexia—Evidence from Eye-tracking. Dyslexia 2015, 21, 97–122.
  10. Benfatto, M.N.; Seimyr, G.Ö.; Ygge, J.; Pansell, T.; Rydberg, A.; Jacobson, C. Screening for Dyslexia Using Eye Tracking during Reading. PLoS ONE 2016, 11, e0165508.
  11. Joseph, H.S.S.L.; Nation, K.; Liversedge, S.P. Using Eye Movements to Investigate Word Frequency Effects in Children’s Sentence Reading. Sch. Psychol. Rev. 2013, 42, 207–222.
  12. Mani, N.; Huettig, F. Word reading skill predicts anticipation of upcoming spoken language input: A study of children developing proficiency in reading. J. Exp. Child Psychol. 2014, 126, 264–279.
  13. Tribushinina, E.; Mak, W.M. Three-year-olds can predict a noun based on an attributive adjective: Evidence from eye-tracking. J. Child Lang. 2016, 43, 425–441.