Fidelity, Virtual Human Assistants, and Engagement in Immersive Virtual Learning Environments: The Role of Temporal Functional Fidelity
Contributors: Bill Kapralos, Alvaro Quevedo
Advances in consumer virtual reality (VR) and artificial intelligence (AI) have accelerated the use of immersive virtual learning environments (iVLEs) for skills training. Learner engagement is a critical determinant of training effectiveness, which can be shaped by VR system features (e.g., visual, auditory, and tactile immersion) coupled with interaction mechanics and instructional design integrated with the instructional behaviors of virtual human assistants (VHAs). Although visual and behavioral fidelity in VHAs have been extensively studied, functional fidelity (i.e., the extent to which the iVLE and/or VHAs support cognitive, perceptual, and motor processes required to perform a task regardless of visual realism), and particularly the temporal alignment of instructional guidance with learners’ cognitive and motor demands, remains underexamined. This article highlights research on VHAs in iVLEs with a special emphasis on temporal functional fidelity as an emerging requirement for synchronizing instructional support with user workload and task phases. By consolidating existing findings and highlighting gaps in current empirical work, this article outlines key implications for the design and evaluation of VHAs and identifies directions for future research aimed at optimizing instructional timing in iVLEs. The goal is to inform principled VHA design and clarify how fidelity dimensions should be integrated to support effective, pedagogically grounded immersive learning experiences.
  • virtual human
  • virtual human assistant
  • temporal fidelity
  • engagement
  • immersive virtual learning environment
  • serious game
  • virtual simulation
Advances in computational processing power and the miniaturization of consumer electronics have substantially reduced the cost and increased the accessibility of immersive technologies, contributing to the rapid expansion of virtual reality (VR) as a widely available platform for simulation and training [1]. VR enables users to interact within computer-generated environments through head-mounted displays (HMDs), motion-tracked controllers, and hand tracking, capabilities that have been shown to reshape human–computer interaction and support immersive learning experiences [2]. When combined with artificial intelligence (AI), these capabilities have accelerated the development of immersive virtual learning environments (iVLEs), including virtual simulations and serious games that provide adaptive instruction, enhance conceptual understanding, and improve knowledge retention [3].
A central determinant of iVLE effectiveness is learner engagement, a multidimensional construct associated with persistence, achievement, and enjoyment [4][5][6]. Engagement is particularly consequential in VR, where immersive experiences can increase motivation and support authentic learning and performance [7]. Within iVLEs, virtual humans (VHs) used as instructional or representational agents, also referred to as virtual human assistants (VHAs), play a critical role in shaping engagement by mediating how learners attend to, interact with, and interpret the virtual learning experience. Research indicates that VHAs with appropriate representational qualities can strengthen embodiment and user acceptance compared to abstract representations, thereby supporting sustained involvement and task engagement [8].
VHAs play an increasingly prominent role in iVLEs by providing instruction, feedback, and adaptive support [9][10]. Their effectiveness is shaped by diverse fidelity dimensions, including visual, behavioral, auditory, cognitive, and functional, all of which influence learner perception and interaction [9][11]. Although fidelity is often discussed broadly as the realism of a simulation [12][13], research demonstrates considerable inconsistency in its definition and application [14][15]. For example, structural fidelity refers to perceptual realism, whereas functional fidelity concerns how well task interactions align with learning objectives. In contrast, realism reflects subjective user perception and varies based on expectations and context [16][17][18].
Efforts to increase fidelity have revealed challenges. Visual or behavioral hyper-realism may elicit uncanny valley responses [19], and mismatches in audio fidelity or vocal characteristics can disrupt presence and engagement [20]. Empirical findings remain mixed. More specifically, some studies report that higher fidelity enhances social presence and perceived attractiveness, whereas others show that medium fidelity animations can increase eeriness or bias user judgments [21]. As VR research increasingly prioritizes photorealistic rendering [22][23] and AI-driven interactivity [24], the need to clarify how fidelity affects learning has become more pressing.
A key unresolved question concerns the optimal level and type of fidelity needed to support engagement and performance in iVLEs. Although fidelity is frequently assumed to improve learning, current evidence remains inconclusive [25][26]. These findings suggest that learning effectiveness depends not only on how realistic a virtual environment or instructor (VHA) appears, but on how it is functionally applied [14]. In this context, temporal functional fidelity (i.e., the alignment of instructional timing with learners’ cognitive and motor demands) has received comparatively limited attention despite its strong theoretical relevance for managing cognitive load and supporting procedural skill acquisition.
This article highlights key implications for VHA design and evaluation and identifies future research directions for optimizing instructional timing in iVLEs. The aim is to inform principled VHA design and clarify how fidelity dimensions should be integrated to support effective, pedagogically grounded immersive virtual learning.

This entry is adapted from the peer-reviewed paper 10.3390/encyclopedia6040077

References

  1. Riener, R.; Harders, M. Virtual Reality in Medicine; Springer: London, UK, 2012; ISBN 9781447140115.
  2. Crogman, H.T.; Cano, V.D.; Pacheco, E.; Sonawane, R.B.; Boroon, R. Virtual Reality, Augmented Reality, and Mixed Reality in Experiential Learning: Transforming Educational Paradigms. Educ. Sci. 2025, 15, 303.
  3. Fernández-Cerero, J.; Fernández-Batanero, J.M.; Montenegro-Rueda, M. Possibilities of Extended Reality in Education. Interact. Learn. Environ. 2025, 33, 208–222.
  4. Fredricks, J.A.; Blumenfeld, P.C.; Paris, A.H. School Engagement: Potential of the Concept, State of the Evidence. Rev. Educ. Res. 2004, 74, 59–109.
  5. Collie, R.J.; Martin, A.J. Motivation and Engagement in Learning; Oxford University Press: Oxford, UK, 2019.
  6. Shute, V.J.; Ventura, M.; Bauer, M.; Zapata-Rivera, D. Melding the Power of Serious Games and Embedded Assessment to Monitor and Foster Learning. In Serious Games; Ritterfeld, U., Cody, M., Vorderer, P., Eds.; Routledge: New York, NY, USA, 2009.
  7. Liévano Taborda, C. Enhancing Learner Engagement and Attention in XR Environments: Metrics and Strategies. In Proceedings of the 2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Saint Malo, France, 8–12 March 2025.
  8. Lee, Y.J.; Ji, Y.G. Effects of Visual Realism on Avatar Perception in Immersive and Non-Immersive Virtual Environments. Int. J. Hum. Comput. Interact. 2025, 41, 4362–4375.
  9. van Rijmenam, M. Step into the Metaverse: How the Immersive Internet Will Unlock a Trillion-Dollar Social Economy; John Wiley & Sons Inc.: Hoboken, NJ, USA, 2022; ISBN 9781119887577.
  10. Gao, H.; Xie, Y.; Kasneci, E. PerVRML: ChatGPT-Driven Personalized VR Environments for Machine Learning Education. Int. J. Hum. Comput. Interact. 2026, 42, 40–54.
  11. Cui, L.; Liu, J. Virtual Human: A Comprehensive Survey on Academic and Applications. IEEE Access 2023, 11, 123830–123845.
  12. Hays, R.T.; Singer, M.J. (Eds.) Simulation Fidelity in Training System Design; Recent Research in Psychology; Springer: New York, NY, USA, 1989; ISBN 978-0-387-96846-9.
  13. Berro, E.A.; Dane, F.C.; Knoesel, J. Exploring the Relationships among Realism, Engagement, and Competency in Simulation. Teach. Learn. Nurs. 2023, 18, e241–e245.
  14. Hamstra, S.J.; Brydges, R.; Hatala, R.; Zendejas, B.; Cook, D.A. Reconsidering Fidelity in Simulation-Based Training. Acad. Med. 2014, 89, 387–392.
  15. Hubal, R. Rethinking Some Virtual Human Applications. Annu. Rev. Cybertherapy Telemed. 2024, 22, 28–33.
  16. Witmer, B.G.; Singer, M.J. Measuring Presence in Virtual Environments: A Presence Questionnaire. Presence Teleoper. Virtual Environ. 1998, 7, 225–240.
  17. INACSL Standards Committee; Persico, L.; Wilson-Keates, B.; DiGregorio, H.; Decker, S.; Xavier, N. Preamble: Grounded in Excellence: The Cornerstone Healthcare Simulation Standards of Best Practice®; Elsevier: Amsterdam, The Netherlands, 2025; Volume 105.
  18. Tun, J.K.; Alinier, G.; Tang, J.; Kneebone, R.L. Redefining Simulation Fidelity for Healthcare Education. Simul. Gaming 2015, 46, 159–174.
  19. Hepperle, D.; Purps, C.F.; Deuchler, J.; Wölfel, M. Aspects of Visual Avatar Appearance: Self-Representation, Display Type, and Uncanny Valley. Vis. Comput. 2022, 38, 1227–1244.
  20. Shin, M.; Kim, S.J.; Biocca, F. The Uncanny Valley: No Need for Any Further Judgments When an Avatar Looks Eerie. Comput. Hum. Behav. 2019, 94, 100–109.
  21. Amadou, N.; Haque, K.I.; Yumak, Z. Effect of Appearance and Animation Realism on the Perception of Emotionally Expressive Virtual Humans. In Proceedings of the 23rd ACM International Conference on Intelligent Virtual Agents, IVA 2023; Association for Computing Machinery, Inc.: New York, NY, USA, 2023; pp. 1–8.
  22. Zibrek, K.; Kokkinara, E.; McDonnell, R. The Effect of Realistic Appearance of Virtual Characters in Immersive Environments—Does the Character’s Personality Play a Role? IEEE Trans. Vis. Comput. Graph. 2018, 24, 1681–1690.
  23. Xu, L. Virtual Reality Gaming Systems: Hardware Integration and Human-Computer Interaction Analysis. World J. Math. Phys. 2024, 2, 78–83.
  24. Lycett, M.; Reppel, A. Humans in (Digital) Space: Representing Humans in Virtual Environments. In Proceedings of the 2022 ACM International Conference on Advanced Visual Interfaces; Association for Computing Machinery: New York, NY, USA, 2022; pp. 1–5.
  25. Cao, Q.; Yu, H.; Charisse, P.; Qiao, S.; Stevens, B. Is High-Fidelity Important for Human-like Virtual Avatars in Human Computer Interactions? Int. J. Netw. Dyn. Intell. 2023, 2, 15–23.
  26. Ye, X.; Backlund, P.; Ding, J.; Ning, H. Fidelity in Simulation-Based Serious Games. IEEE Trans. Learn. Technol. 2020, 13, 340–353.