Eye Tracking Technology

Eye tracking is a technique for detecting and measuring eye movements and characteristics. An eye tracker can sense a person’s gaze locations and features at a certain frequency.

Keywords: eye tracking; gaze input; museums and exhibitions

1. Introduction

Eye tracking is a technique for detecting and measuring eye movements and characteristics [1]. An eye tracker can sense a person’s gaze locations at a certain frequency. Knowing the gaze position over time allows fixations and saccades to be identified. Fixations, which typically last between 100 and 600 ms [2][3], are time periods during which the eyes are almost still, with the gaze being focused on a specific element of the scene. Saccades, on the other hand, which normally last less than 100 ms [3], are very fast eye movements occurring between consecutive pairs of fixations, whose purpose is to relocate the gaze on a different element of the visual scene.
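To make the distinction concrete, the following is a minimal sketch of a dispersion-based fixation detector in the spirit of the classic I-DT algorithm. It is an illustration, not a method taken from this entry, and the thresholds (a 0.1 s minimum duration matching the typical durations above, and a dispersion limit in pixels) are assumptions chosen for the example.

```python
# Minimal dispersion-based (I-DT-style) fixation detector; an
# illustrative sketch, not the authors' method. Consecutive gaze
# samples form a fixation when their spatial dispersion stays below
# a threshold for at least a minimum duration.

def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """samples: list of (t_seconds, x_px, y_px) gaze samples.
    Returns fixations as (start_t, end_t, centroid_x, centroid_y)."""
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while its dispersion stays below the limit.
        while j + 1 < len(samples):
            xs = [s[1] for s in samples[i:j + 2]]
            ys = [s[2] for s in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        if samples[j][0] - samples[i][0] >= min_duration:
            xs = [s[1] for s in samples[i:j + 1]]
            ys = [s[2] for s in samples[i:j + 1]]
            fixations.append((samples[i][0], samples[j][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1  # continue after the detected fixation
        else:
            i += 1     # no fixation starting here; slide forward
    return fixations
```

The gaps between consecutive detected fixations can then be treated as saccades.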

2. Eye Tracking Technology

Electro-oculography, scleral contact lens/search coil, photo-oculography, video-oculography and pupil center-corneal reflection are some of the eye-tracking technologies that have been developed over time [1].
Electro-oculography (EOG), one of the oldest methods to record eye movements, measures the skin’s electrical potential differences through small electrodes placed around the eyes [4]. This solution allows recording of eye movements even when the eyes are closed, but is generally more invasive and less accurate and precise than other approaches.
Scleral contact lens/search coil is another old method consisting of small coils of wire inserted in special contact lenses. The user’s head is then placed inside a magnetic field to generate an electrical potential that allows estimation of eye position [5]. While this technique has a very high spatial and temporal resolution, it is also extremely invasive and uncomfortable, and is used practically only for physiological studies.
Photo- and video-oculography (POG and VOG) are generally video-based methods in which small cameras, incorporated in head-mounted devices, measure eye features such as pupil size, iris–sclera boundaries and possible corneal reflections. The assessment of these characteristics can be performed either automatically or manually. However, these systems tend to be inaccurate and are mainly used for medical purposes [1].
Pupil center-corneal reflection (PCCR) is the most widely used eye tracking technique today. Its basic principle consists of using infrared (or near-infrared) light sources to illuminate the eyes and detect reflections on their surface (Figure 1); this allows the gaze direction to be determined [1]. Infrared light is employed because it is invisible and also produces a better contrast between pupil and iris. The prices of these eye trackers range from a few hundred to tens of thousands of euros, depending on their accuracy and gaze sampling frequency. All the works analyzed in the present review employ this technology.
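As an illustration of the PCCR principle, the sketch below assumes a single camera and a single corneal reflection (“glint”), with a second-order polynomial mapping from the pupil–glint vector to screen coordinates fitted on calibration targets; commercial devices use more elaborate, vendor-specific models.

```python
# Illustrative PCCR gaze mapping (a sketch under simplifying
# assumptions, not a vendor's algorithm): the offset between pupil
# center and corneal reflection is mapped to screen coordinates with
# a polynomial fitted during calibration.
import numpy as np

def features(pupil, glint):
    # Pupil-glint difference vector plus second-order terms.
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    return np.array([1.0, dx, dy, dx * dy, dx ** 2, dy ** 2])

def calibrate(pupils, glints, screen_points):
    # Least-squares fit on calibration targets the user fixated
    # (at least 6 targets are needed for 6 coefficients per axis).
    F = np.array([features(p, g) for p, g in zip(pupils, glints)])
    S = np.asarray(screen_points, dtype=float)   # shape (n, 2)
    coeffs, *_ = np.linalg.lstsq(F, S, rcond=None)
    return coeffs                                # shape (6, 2)

def gaze_point(pupil, glint, coeffs):
    # Estimated on-screen (x, y) for a new pupil/glint measurement.
    return features(pupil, glint) @ coeffs
```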
Figure 1. Example of eye detection with the Gazepoint GP3 HD eye tracker: above, the eyes detected within the face; below, pupil/corneal reflections.
There are two main kinds of eye trackers, namely remote and wearable. Remote eye trackers (Figure 2, left) are normally non-intrusive devices (often small “bars”) positioned below a standard display; they are currently the most widespread kind. Wearable eye trackers (Figure 2, right), on the other hand, are frequently used to study viewing behavior in real-world settings. Recent wearable eye trackers look more and more like ordinary glasses, making them much more comfortable than in the past.
Figure 2. Examples of remote and wearable eye trackers. On the left, highlighted in red, a Tobii 4c (by Tobii) remote device; on the right, a PupilCore (by Pupil Labs) wearable tool.
Psychology [6], neuroscience [7], marketing [8], education [9], usability [10][11] and biometrics [12][13] are all fields in which eye tracking technology has been applied, for instance, to determine the user’s gaze path while looking at something (e.g., an image or a web page) or to obtain information about the screen regions that are most frequently inspected. When using an eye tracker as an input tool (i.e., for interactive purposes, in an explicit way), gaze data must be evaluated in real time, so that the computer can respond to specific gaze behaviors [14]. Gaze input is also extremely beneficial as an assistive technology for people who are unable to use their hands. Several assistive solutions have been devised to date, including those for writing [15][16][17], surfing the Web [18][19] and playing music [20][21].
Two common ways to provide gaze input are dwell time and gaze gestures. Dwell time, the most widely used approach, consists of fixating a target element (e.g., a button) for a certain time (the dwell time), after which an action connected to that element is triggered. The duration of the dwell time can vary depending on the application, but it should be chosen so as to avoid the so-called “Midas touch problem” [22], i.e., involuntary selections occurring when simply looking at the elements of an interface.
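A minimal sketch of dwell-time selection could look as follows; the get_gaze callback, the element rectangles and the 0.8 s threshold are illustrative assumptions rather than values prescribed in the cited literature.

```python
# Illustrative dwell-time selection loop: an element is activated
# only after the gaze has stayed inside it for the whole dwell time,
# which mitigates the Midas touch problem.
import time

def dwell_select(get_gaze, elements, dwell_time=0.8):
    """get_gaze: callable returning the current (x, y) gaze point;
    elements: dict mapping a name to an (x, y, w, h) rectangle.
    Returns the name of the first element selected by dwelling."""
    current, entered_at = None, None
    while True:
        gx, gy = get_gaze()
        hit = next((name for name, (x, y, w, h) in elements.items()
                    if x <= gx < x + w and y <= gy < y + h), None)
        if hit != current:
            # Gaze entered a new element (or left all of them).
            current, entered_at = hit, time.monotonic()
        elif hit is not None and time.monotonic() - entered_at >= dwell_time:
            return hit                      # dwell threshold reached
        time.sleep(0.01)                    # poll at roughly 100 Hz
```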
Gaze gestures consist of gaze paths performed by the user to trigger specific actions. This approach can be fast and is immune to the Midas touch problem, but it is generally less intuitive than dwell time: since the user needs to memorize a set of gaze gestures, it may have a steep learning curve. For this reason, gaze gestures are recommended only for applications meant to be used repeatedly, such as writing systems (e.g., [23][24]).
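As a toy example of the idea (not one of the cited systems), the screen can be divided into a 3 × 3 grid and the sequence of grid cells crossed by the gaze matched against a small, hypothetical gesture vocabulary:

```python
# Toy gaze-gesture recognizer: encode the gaze path as the sequence
# of 3x3 grid cells it crosses (cells numbered 1-9, left to right,
# top to bottom) and look that sequence up in a gesture vocabulary.

GESTURES = {                     # hypothetical vocabulary
    "4-5-6": "next_page",        # horizontal stroke, left to right
    "6-5-4": "previous_page",    # horizontal stroke, right to left
    "1-5-9": "confirm",          # diagonal stroke
}

def cell(x, y, width, height):
    col = min(int(3 * x / width), 2)
    row = min(int(3 * y / height), 2)
    return 1 + 3 * row + col

def recognize(gaze_path, width, height):
    """gaze_path: iterable of (x, y) points on a width x height screen.
    Returns the matched action name, or None."""
    seq = []
    for x, y in gaze_path:
        c = cell(x, y, width, height)
        if not seq or seq[-1] != c:  # record only cell transitions
            seq.append(c)
    return GESTURES.get("-".join(map(str, seq)))
```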
Hybrid approaches that mix dwell time and gaze gestures (e.g., for interacting with video games [25]) have also been proposed, while other gaze input methods (such as those based on blinks [26] or smooth pursuit [17]) are currently less common.
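For smooth pursuit input, a common idea is to let each selectable item move along its own trajectory and select the item whose motion correlates best with the recorded gaze; the sketch below is a generic illustration of this correlation scheme, not the specific design of [17], and assumes time-aligned sample arrays of equal length.

```python
# Illustrative smooth-pursuit selection: pick the moving target whose
# trajectory best correlates with the gaze trajectory (Pearson
# correlation averaged over the x and y axes).
import numpy as np

def pursuit_select(gaze_xy, target_trajs, threshold=0.8):
    """gaze_xy: array of shape (n, 2) with recent gaze samples;
    target_trajs: dict mapping a target name to an (n, 2) array of
    that target's positions at the same instants.
    Returns the best-matching target name, or None."""
    best, best_corr = None, threshold
    gx, gy = gaze_xy[:, 0], gaze_xy[:, 1]
    for name, traj in target_trajs.items():
        corr = 0.5 * (np.corrcoef(gx, traj[:, 0])[0, 1] +
                      np.corrcoef(gy, traj[:, 1])[0, 1])
        if corr > best_corr:        # must exceed the threshold
            best, best_corr = name, corr
    return best
```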
After some pioneering studies dating back to the late 1980s [27], the first attempts to use eye tracking technology in museums focused on studying visitors’ behavior [28][29][30], their cognitive processes while observing artwork [31][32], and their emotional reactions [33]. More recently, various museums have used gaze-based solutions to both collect data and engage visitors, for instance by showing them, after the visit, how they observed the items on display and/or highlighting possible similarities between their eye paths and those of other visitors. In this regard, notable examples include the Cleveland Museum of Art (https://mw18.mwconf.org/glami/gaze-tracker/ (accessed on 20 April 2023)) (United States), the ARoS Art Museum (https://userexperienceawards.com/2017-submissions/aros-art-museum-aros-public/ (accessed on 20 April 2023)) (Denmark) and the M-Museum Leuven (https://www.mleuven.be/en/research-support/research/eye-tracking-research-how-do-we-look-art (accessed on 20 April 2023)) (Belgium). These can mostly be considered “passive” uses of eye tracking technology, since there is no actual, explicit interaction: gaze data are used for some kind of “a posteriori” analysis. There are also cases, such as the Science Museum of Trento (https://www.srlabs.it/en/project/muse/ (accessed on 20 April 2023)) (Italy), in which eye tracking interaction per se (therefore not necessarily connected with the exhibited works) is simply included in science museums for informational purposes, to make the existence of this technology known to the general public.

References

  1. Duchowski, A.T. Eye Tracking Methodology: Theory and Practice, 3rd ed.; Springer International Publishing AG: Cham, Switzerland, 2017.
  2. Velichkovsky, B.M.; Dornhoefer, S.M.; Pannasch, S.; Unema, P.J. Visual Fixations and Level of Attentional Processing. In Proceedings of the ETRA 2000 Symposium on Eye Tracking Research & Applications, Palm Beach Gardens, FL, USA, 6–8 November 2000; ACM: New York, NY, USA, 2000; pp. 79–85.
  3. Robinson, D.A. The mechanics of human saccadic eye movement. J. Physiol. 1964, 174, 245–264.
  4. Shackel, B. Pilot study in electro-oculography. Br. J. Ophthalmol. 1960, 44, 89.
  5. Robinson, D.A. A Method of Measuring Eye Movement Using a Scleral Search Coil in a Magnetic Field. IEEE Trans. Bio-Med. Electron. 1963, 10, 137–145.
  6. Mele, M.L.; Federici, S. Gaze and eye-tracking solutions for psychological research. Cogn. Process. 2012, 13, 261–265.
  7. Popa, L.; Selejan, O.; Scott, A.; Mureşanu, D.F.; Balea, M.; Rafila, A. Reading beyond the glance: Eye tracking in neurosciences. Neurol. Sci. 2015, 36, 683–688.
  8. Wedel, M.; Pieters, R. Eye tracking for visual marketing. Found. Trends Mark. 2008, 1, 231–320.
  9. Cantoni, V.; Perez, C.J.; Porta, M.; Ricotti, S. Exploiting Eye Tracking in Advanced E-Learning Systems. In Proceedings of the CompSysTech ’12: 13th International Conference on Computer Systems and Technologies, Ruse, Bulgaria, 22–23 June 2012; Association for Computing Machinery: New York, NY, USA, 2012; pp. 376–383.
  10. Nielsen, J.; Pernice, K. Eyetracking Web Usability; New Riders Press: Thousand Oaks, CA, USA, 2009.
  11. Mosconi, M.; Porta, M.; Ravarelli, A. On-Line Newspapers and Multimedia Content: An Eye Tracking Study. In Proceedings of the SIGDOC ’08: 26th Annual ACM International Conference on Design of Communication, Lisbon, Portugal, 22–24 September 2008; Association for Computing Machinery: New York, NY, USA, 2008; pp. 55–64.
  12. Kasprowski, P.; Ober, J. Eye Movements in Biometrics. In Biometric Authentication; Maltoni, D., Jain, A.K., Eds.; Springer: Berlin/Heidelberg, Germany, 2004; pp. 248–258.
  13. Porta, M.; Dondi, P.; Zangrandi, N.; Lombardi, L. Gaze-Based Biometrics From Free Observation of Moving Elements. IEEE Trans. Biom. Behav. Identity Sci. 2022, 4, 85–96.
  14. Duchowski, A.T. Gaze-based interaction: A 30 year retrospective. Comput. Graph. 2018, 73, 59–69.
  15. Majaranta, P.; Räihä, K.J. Text entry by gaze: Utilizing eye-tracking. In Text Entry Systems: Mobility, Accessibility, Universality; Morgan Kaufmann: San Francisco, CA, USA, 2007; pp. 175–187.
  16. Porta, M. A study on text entry methods based on eye gestures. J. Assist. Technol. 2015, 9, 48–67.
  17. Porta, M.; Dondi, P.; Pianetta, A.; Cantoni, V. SPEye: A Calibration-Free Gaze-Driven Text Entry Technique Based on Smooth Pursuit. IEEE Trans. Hum.-Mach. Syst. 2022, 52, 312–323.
  18. Kumar, C.; Menges, R.; Müller, D.; Staab, S. Chromium based framework to include gaze interaction in web browser. In Proceedings of the 26th International Conference on World Wide Web Companion, Geneva, Switzerland, 3–7 April 2017; pp. 219–223.
  19. Casarini, M.; Porta, M.; Dondi, P. A Gaze-Based Web Browser with Multiple Methods for Link Selection. In Proceedings of the ETRA ’20 Adjunct: ACM Symposium on Eye Tracking Research and Applications, Stuttgart, Germany, 2–5 June 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–8.
  20. Davanzo, N.; Dondi, P.; Mosconi, M.; Porta, M. Playing Music with the Eyes through an Isomorphic Interface. In Proceedings of the COGAIN ’18: Workshop on Communication by Gaze Interaction, Warsaw, Poland, 15 June 2018; ACM: New York, NY, USA, 2018; pp. 5:1–5:5.
  21. Valencia, S.; Lamb, D.; Williams, S.; Kulkarni, H.S.; Paradiso, A.; Ringel Morris, M. Dueto: Accessible, Gaze-Operated Musical Expression. In Proceedings of the ASSETS ’19: 21st International ACM SIGACCESS Conference on Computers and Accessibility, Pittsburgh, PA, USA, 28–30 October 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 513–515.
  22. Jacob, R.J. Eye movement-based human-computer interaction techniques: Toward non-command interfaces. Adv. Hum.-Comput. Interact. 1993, 4, 151–190.
  23. Wobbrock, J.O.; Rubinstein, J.; Sawyer, M.W.; Duchowski, A.T. Longitudinal Evaluation of Discrete Consecutive Gaze Gestures for Text Entry. In Proceedings of the ETRA ’08: 2008 Symposium on Eye Tracking Research & Applications, Savannah, GA, USA, 26–28 March 2008; ACM: New York, NY, USA, 2008; pp. 11–18.
  24. Porta, M.; Turina, M. Eye-S: A Full-Screen Input Modality for Pure Eye-Based Communication. In Proceedings of the ETRA ’08: 2008 Symposium on Eye Tracking Research & Applications, Savannah, GA, USA, 26–28 March 2008; ACM: New York, NY, USA, 2008; pp. 27–34.
  25. Istance, H.; Bates, R.; Hyrskykari, A.; Vickers, S. Snap clutch, a moded approach to solving the Midas touch problem. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, Savannah, GA, USA, 26–28 March 2008; pp. 221–228.
  26. Królak, A.; Strumiłło, P. Eye-blink detection system for human–computer interaction. Univ. Access Inf. Soc. 2012, 11, 409–419.
  27. Buquet, C.; Charlier, J.; Paris, V. Museum application of an eye tracker. Med. Biol. Eng. Comput. 1988, 26, 277–281.
  28. Wooding, D.S.; Mugglestone, M.D.; Purdy, K.J.; Gale, A.G. Eye movements of large populations: I. Implementation and performance of an autonomous public eye tracker. Behav. Res. Methods Instrum. Comput. 2002, 34, 509–517.
  29. Wooding, D.S. Eye movements of large populations: II. Deriving regions of interest, coverage, and similarity using fixation maps. Behav. Res. Methods Instrum. Comput. 2002, 34, 518–528.
  30. Milekic, S. Gaze-tracking and museums: Current research and implications. In Museums and the Web 2010: Proceedings; Archives & Museum Informatics: Toronto, ON, Canada, 2010; pp. 61–70.
  31. Eghbal-Azar, K.; Widlok, T. Potentials and Limitations of Mobile Eye Tracking in Visitor Studies. Soc. Sci. Comput. Rev. 2013, 31, 103–118.
  32. Villani, D.; Morganti, F.; Cipresso, P.; Ruggi, S.; Riva, G.; Gilli, G. Visual exploration patterns of human figures in action: An eye tracker study with art paintings. Front. Psychol. 2015, 6, 1636.
  33. Calandra, D.M.; Di Mauro, D.; D’Auria, D.; Cutugno, F. E.Y.E. C.U.: An Emotional eYe trackEr for Cultural heritage sUpport. In Empowering Organizations: Enabling Platforms and Artefacts; Springer International Publishing: Cham, Switzerland, 2016; pp. 161–172.