Tracking Technology in Augmented Reality: Comparison

Augmented reality (AR) is one of the fastest-growing immersive technologies of the 21st century. AR has brought a revolution to different realms including health and medicine, teaching and learning, tourism, design, manufacturing, and similar industries, whose adoption has accelerated the growth of AR at an unprecedented pace. Tracking technologies are the building blocks of AR: they establish a point of reference for movement and create an environment where virtual and real objects are presented together. To achieve a realistic experience with augmented objects, several tracking technologies are presented here.

  • augmented reality
  • virtual reality
  • tracking technology

1. Augmented Reality Overview

Augmented reality provides a composite view to the user by superimposing computer-generated virtual content, i.e., audio, graphics, text, or video, on real-world objects. The major components of the AR process are tracking the position at which virtual objects are placed in the real environment and displaying the virtual content to the user. Tracking in AR means following a defined pattern in the real world, using a computer or mobile device, so that virtual objects are placed correctly in the real world, while display technologies present the virtual content in front of the viewer's eyes.

For many years, people have used lenses, light sources, and mirrors to create illusions and virtual images in the real world [1][2][3]. Ivan Sutherland was the first person to truly generate an AR experience; his Sketchpad, developed at MIT in 1963, is the world's first interactive graphics application [4]. Figure 1 gives an overview of the development of AR technology from its beginnings to 2022. Bottani et al. [5] reviewed the AR literature published during 2006–2017. Moreover, Sereno et al. [6] use a systematic survey approach to detail the existing literature at the intersection of computer-supported collaborative work and AR.
Figure 1. Augmented reality advancement over time for the last 60 years.

1.1. Head-Mounted Display

Ens et al. [7] review existing work on design exploration for mixed-scale gestures, where the HoloLens AR display is used to interweave larger gestures with micro-gestures.

1.2. AR Towards Applications

The ARToolKit tracking library [8] aimed to provide computer-vision tracking of a square marker in real time, which addressed two major problems: enabling interaction with real-world objects and tracking the user's viewpoint. Researchers then conducted studies to develop handheld AR systems. Hettig et al. [9] present a system called "Augmented Visualization Box" to assess surgical augmented reality visualizations in a virtual environment. Goh et al. [10] present a critical analysis of 3D interaction techniques in mobile AR. Kollatsch et al. [11] introduce a system that creates and integrates production data and maintenance documentation into AR maintenance apps for machine tools, aiming to reduce the overall cost of the necessary expertise and the planning process of AR technology. Bhattacharyya et al. [12] introduce a two-player mobile AR game known as Brick, in which users can engage in synchronous collaboration while inhabiting a real-time, shared augmented environment. Kim et al. [13] suggest that this decade is marked by a tremendous technological boom, particularly in rendering and evaluation research, while display and calibration research has declined. Liu et al. [14] expand the information feedback channel from industrial robots to the human workforce for human–robot collaboration development.

1.3. Augmented Reality for the Web

Cortes et al. [15] introduce new techniques for collaboratively authoring 3D surfaces on the web using mobile AR. Qiao et al. [16] review the current implementations of mobile AR, the enabling technologies of AR, the state of the art, approaches for potential web AR provisioning, and the challenges AR faces in a web-based system.

1.4. AR Application Development

The AR industry grew tremendously from 2015 onward, extending from smartphones to the web and to head-worn display systems such as Google Glass. In this regard, Agati et al. [17] propose design guidelines for the development of an AR manual assembly system, covering ergonomics, usability, corporate-related factors, and cognition.
AR for Tourism and Education: Shukri et al. [18] introduce design guidelines for mobile AR in tourism by proposing 11 principles for efficient AR design that reduce cognitive overload, support learning, and help users explore content while traveling in Malaysia. In addition, Fallahkhair et al. [19] introduce new guidelines for building AR technologies with enhanced user satisfaction, efficiency, and effectiveness in cultural and contextual learning using mobile devices, thereby enhancing the tourism experience. Akçayır et al. [20] show that AR has the advantage of placing virtual images on real objects in real time, while pedagogical and technical issues should be addressed to make the technology more reliable. da Silva et al. [21] suggest that AR has a positive impact on learning but requires some advancements.
Sarkar et al. [22] present an AR app known as ScholAR, which aims to enhance students' learning skills and inculcate conceptualization and logical thinking among seventh-grade students. Soleimani et al. [23] suggest that the use of AR improves abstract-genre writing as compared to VR.

1.5. AR Security and Privacy

Hadar et al. [24] scrutinize security at all steps of AR application development and identify the need for new strategies for information security and privacy, with the main goal of capturing and mapping these concerns at design time. Moreover, in the industrial arena, Mukhametshin et al. [25] focus on sensor tag detection, tracking, and recognition in a client-side AR app for Siemens, used to monitor equipment at remote facilities.

2. Tracking Technology of AR

Tracking technologies introduce the sensation of motion into virtual and augmented reality worlds and perform a variety of tasks. Once a tracking system is rightly chosen and correctly installed, it allows a person to move within a virtual or augmented environment and to interact with people and objects within it. The selection of tracking technology depends on the type of environment, the type of data, and the available budget. For AR technology to meet Azuma's definition of an augmented reality system, it must satisfy three main criteria:
  • it combines virtual and real content;
  • it is interactive in real time;
  • it is registered in three dimensions.
The third condition, being "registered in three dimensions", refers to the capability of an AR system to project virtual content onto the physical surroundings in such a way that it seems to be part of the real world. To register virtual content in the real environment, the position and orientation (pose) of the viewer with respect to some real-world anchor must be identified and determined. This anchor may be dead-reckoning from inertial tracking, a defined location in space determined using GPS, or a physical object such as a paper image marker or a magnetic tracker source. In short, the real-world anchor depends on the application and the technologies used. With respect to the type of technology used, there are two phases of registering an AR system in 3D:
  • Determining the position and orientation of the viewer relative to the real-world anchor: the registration phase;
  • Updating the viewer's pose with respect to a previously known pose: the tracking phase.
In this text, the word "tracking" covers both phases as common terminology. There are two main types of tracking techniques, explained in the following sections and depicted in Figure 2.
Figure 2. Categorization of augmented reality tracking techniques.
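Before turning to the individual techniques, the registration idea can be made concrete with a minimal sketch. Assuming a viewer pose expressed as a rotation plus a translation (all values below are hypothetical, not taken from any cited system), virtual content anchored in the world can be transformed into the viewer's coordinate frame for rendering:

```python
import numpy as np

def pose_matrix(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical viewer pose: rotated 90 degrees about the vertical axis,
# eye height 1.6 m, standing 2 m from the world anchor.
theta = np.pi / 2
R_wc = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                 [0.0,           1.0, 0.0],
                 [-np.sin(theta), 0.0, np.cos(theta)]])
t_wc = np.array([0.0, 1.6, 2.0])

T_world_cam = pose_matrix(R_wc, t_wc)     # camera pose in the world frame
T_cam_world = np.linalg.inv(T_world_cam)  # world-to-camera transform

# A virtual object anchored 1 m in front of the world origin:
anchor_world = np.array([0.0, 0.0, -1.0, 1.0])
print(T_cam_world @ anchor_world)         # its coordinates in the viewer's frame
```

Each tracking technology discussed below is, in effect, a different way of estimating this rotation and translation continuously.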

2.1. Markerless Tracking Techniques

Markerless tracking techniques have two subtypes: sensor-based and vision-based.

2.1.1. Sensor-Based Tracking

Magnetic Tracking Technology: This technology includes a tracking source and two sensors, one for the head and another for the hand. The tracking source creates an electromagnetic field in which the sensors are placed. The computer then calculates the orientation and position of each sensor from the signal attenuation of the field. This gives a full 360° range of motion, i.e., allowing the user to look all the way around the 3D environment, and also allows movement in all three positional degrees of freedom. The hand tracker has control buttons that let the user navigate through the environment, pick things up, and understand the size and shape of objects [26]. Figure 3 illustrates these tracking techniques to give the reader a better understanding.
Figure 3. Augmented reality tracking techniques presentation.
Frikha et al. [27] introduce a new handler for the mutual occlusion problem, which occurs when real objects are in front of virtual objects in the scene. The authors use a 3D positioning approach and surgical instrument tracking in an AR environment, introducing a paradigm based on monocular image processing. The experimental results suggested that this approach is capable of handling mutual occlusion automatically in real time.
One of the main issues with magnetic tracking is the limited positioning range [28]. Orientation and position can be determined by attaching the receiver to the viewer [29]. Receivers are small and lightweight, and magnetic trackers are indifferent to optical disturbances and occlusion; therefore, they have high update rates. However, the resolution of a magnetic tracker declines with the fourth power of the distance, and the strength of the magnetic field declines with the cube of the distance [30]. Therefore, magnetic trackers have a constrained working volume. Moreover, magnetic trackers are sensitive to magnetic materials and fields in the surrounding environment and are also susceptible to measurement jitter [31].
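The cubic falloff is worth seeing numerically. The toy loop below (illustrative only, with a made-up field constant) shows why the working volume is so constrained:

```python
# Illustrative dipole-like falloff: field strength B ~ k / r^3 (k is device-specific).
k = 1.0  # hypothetical relative field strength at r = 1 m
for r in [0.5, 1.0, 2.0, 4.0]:
    print(f"r = {r:3.1f} m -> relative field strength {k / r**3:.4f}")
# Doubling the distance cuts the field to 1/8; resolution (~1/r^4) degrades even faster.
```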
Magnetic tracking technology is widely used in AR systems, with applications ranging from maintenance [32] to medicine [33] and manufacturing [34].
Inertial Tracking: Magnetometers, accelerometers, and gyroscopes are examples of the inertial measurement units (IMUs) used in inertial tracking to evaluate the velocity and orientation of the tracked object. An inertial tracking system finds the three rotational degrees of freedom relative to gravity. Moreover, the change in the tracker's position can be determined from the inertial velocity and the tracker's update period.
Advantages of Inertial Tracking: It does not require a line of sight and has no range limitations. It is not prone to optical, acoustic, magnetic, or RF interference sources. Furthermore, it provides motion measurement with high bandwidth. Moreover, it has negligible latency and can be processed as fast as one desires.
Disadvantages of Inertial Tracking: Inertial trackers are prone to drift in orientation and position over time, with the major impact on position measurement, because position must be derived from the velocity measurements. Using a filter can help mitigate this issue; however, the filter can decrease the responsiveness and the update rate of the tracker [35]. For the ultimate correction of drift, the inertial sensor should be combined with another kind of sensor, for instance ultrasonic range measurement devices or optical trackers.
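A small dead-reckoning sketch illustrates the drift problem: even a tiny constant accelerometer bias (the 0.01 m/s² figure below is hypothetical), once double-integrated, produces position error that grows quadratically with time:

```python
import numpy as np

# 1D dead-reckoning of a stationary IMU with a small uncorrected bias (hypothetical).
dt = 0.01                                # 100 Hz sampling
bias = 0.01                              # m/s^2 accelerometer bias
accel = np.full(6000, bias)              # 60 s of "measurements" while standing still

vel = np.cumsum(accel) * dt              # integrate acceleration -> velocity
pos = np.cumsum(vel) * dt                # integrate velocity -> position
print(f"position drift after 60 s: {pos[-1]:.1f} m")  # ~18 m, from bias alone
```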

2.1.2. Vision-Based Tracking

Vision-based tracking is defined as the set of tracking approaches that ascertain the camera pose using data captured from optical sensors and use it for registration. The optical sensors can be divided into the following three categories:
  • visible light tracking;
  • 3D structure tracking;
  • infrared tracking.
In recent times, vision-based tracking for AR has become highly popular due to the improved computational power of consumer devices and the ubiquity of mobile devices, such as tablets and smartphones, making them an ideal platform for AR technologies. Vision-based tracking has been demonstrated using an effective object tracking algorithm [37] known as clustering of static-adaptive correspondences for deformable object tracking (CMT). Chakrabarty et al. [36] contribute to the development of autonomous tracking by integrating CMT into image-based visual servoing (IBVS), studying the impact on rigid and deformable targets in indoor settings, and finally integrating the system into the Gazebo simulator. Gupta et al. [38] detail a comparative analysis of the different types of vision-based tracking systems.
Moreover, Krishna et al. [39] explore the use of electroencephalogram (EEG) signals in user authentication, which is similar to facial recognition in mobile phones, and evaluate it in combination with eye-tracking data. This research contributes a novel evaluation paradigm and a biometric authentication system integrating these modalities. Furthermore, Dzsotjan et al. [40] delineate the usefulness of eye-tracking data recorded during lectures for determining the learning gain of the user. The Walk the Graph app, designed for the Microsoft HoloLens 2, was used to generate the data, and binary classification was performed on the kinematic graphs with which users reported their own movement.
Ranging from smartphones to laptops and even wearable devices with suitable cameras, visible light cameras are the most commonly used optical sensors. These cameras are particularly important because they can both capture video of the real environment and register virtual content to it, and can thereby be used in video see-through AR systems.
Chen et al. [41] address the shortcomings of the deep lighting model (DAM) by combining a method for transferring a regular video to a 3D photo-realistic avatar with a high-quality 3D face tracking algorithm. The evaluation of the proposed system suggests its effectiveness in real-world scenarios with variability in expression, pose, and illumination. Furthermore, Rambach et al. [42] detail a pipeline for 6DoF object tracking using scanned 3D images of the objects. The scope of the research covers the initialization of frame-to-frame tracking, object registration, and the implementation of these aspects to make the experience more efficient, and it addresses the challenges posed by occlusion, illumination changes, and fast motion.

2.1.3. Three-Dimensional Structure Tracking

Three-dimensional structure information has become very affordable because of the development of commercial sensors capable of capturing it, beginning with the development of Microsoft Kinect [43]. Syahidi et al. [44] introduce a 3D AR-based learning system for pre-school children. Different types of sensors can be used to determine the three-dimensional points in a scene; the most commonly used are structured light [45] or time-of-flight [46] sensors. These technologies work on the principle of depth analysis, in which depth information about the real environment is extracted by mapping and tracking [47]. The Kinect system [48], developed by Microsoft, is one of the most widely used and well-developed approaches in augmented reality.
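As an illustration of the depth-analysis principle, the sketch below back-projects a depth image into a 3D point cloud with the pinhole camera model; the Kinect-like intrinsics are assumed values, not taken from the cited work:

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an Nx3 point cloud, pinhole model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Hypothetical Kinect-like intrinsics and a synthetic flat depth map at 1.5 m:
depth = np.full((480, 640), 1.5)
points = depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(points.shape)  # (307200, 3)
```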
Rambach et al. [49] present the idea of augmented things: utilizing off-screen rendering of 3D objects, the realization of application architecture, universal 3D object tracking based on high-quality scans of the objects, and a high degree of parallelization. Viyanon et al. [50] focus on the development of an AR app known as "AR Furniture" for providing customers with the experience of visualizing designs and decorations. Customers fit the pieces of furniture in their rooms and were able to make decisions based on this experience. Turkan et al. [51] introduce new models for teaching structural analysis, integrating 3D visualization technology with mobile AR, which considerably improved the learning experience. Students can explore different loading conditions by switching loads, and feedback is provided in real time by the AR interface.

2.1.4. Infrared Tracking

Tracking objects that emit or reflect light is one of the earliest vision-based tracking techniques used in AR [52][53]. The high brightness of such targets compared to the surrounding environment makes them easy to track, and self-light-emitting targets are also indifferent to drastic illumination effects such as harsh shadows or poor ambient lighting. These targets can either be fixed to the tracked object with the cameras placed externally in the environment, known as "outside-looking-in" [54], or placed externally in the environment with the camera attached to the tracked object, known as "inside-looking-out" [55]. Compared to an outside-looking-in sensor, the inside-looking-out configuration has greater resolution and higher accuracy of angular orientation. It has been used in the development of several systems [56][57][58][59], typically with infrared LEDs mounted on the ceiling and a head-mounted display with an externally facing camera.
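The brightness advantage can be sketched in a few lines: threshold a grayscale frame and take blob centroids, the basic step behind LED-target trackers. This is an illustrative OpenCV example with synthetic data, not code from the cited systems:

```python
import cv2
import numpy as np

def bright_marker_centroids(gray, thresh=220):
    """Find centroids of bright blobs (e.g., IR LEDs) in a grayscale frame."""
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    # Skip label 0 (background); drop single-pixel noise.
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= 4]

# Synthetic frame with two hypothetical "LED" spots:
frame = np.zeros((240, 320), np.uint8)
cv2.circle(frame, (80, 60), 3, 255, -1)
cv2.circle(frame, (200, 150), 3, 255, -1)
print(bright_marker_centroids(frame))
```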

2.1.5. Model-Based Tracking

Three-dimensional tracking of real-world objects has long been a subject of research interest. Although it is not as popular as natural feature tracking or planar fiducials, a large amount of research has been done on it. In the past, the three-dimensional model of a tracked object was usually created by hand: lines, cylinders, spheres, circles, and other primitives were combined to describe the structure of the object [60]. Wuest et al. [61] focus on the development of a scalable, performant pipeline for creating a tracking solution. The structural information of the scene was extracted using edge filters, and for pose determination, the edge information was matched against the primitives [62].
In addition, Gao et al. [63] explore a tracking method that identifies the vertices of a convex polygon, which works well since most markers are square. The coordinates of the four vertices are used to determine the transformation matrix of the camera. Experimental results suggested that the algorithm is robust enough to withstand fast motion and large ranges, making tracking more accurate, stable, and real-time.
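A minimal sketch of the four-vertex idea follows: given a marker's known square corners and their detected image positions (both hypothetical here), OpenCV's getPerspectiveTransform recovers the 3×3 transformation between the marker plane and the image:

```python
import cv2
import numpy as np

# Known corners of a 10 cm square marker in its own plane:
marker = np.float32([[0, 0], [0.1, 0], [0.1, 0.1], [0, 0.1]])
# Hypothetical detected vertex positions in the camera image (pixels):
image = np.float32([[210, 120], [330, 135], [318, 250], [198, 238]])

H = cv2.getPerspectiveTransform(marker, image)  # 3x3 homography
# Project the marker center through H to check the mapping:
center = np.float32([[[0.05, 0.05]]])
print(cv2.perspectiveTransform(center, H))
```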
The combination of edge-based tracking and natural feature tracking has the following advantages:
  • It provides additional robustness [64].
  • It enables spatial tracking and is thereby able to operate in open environments [65].
  • For variable and complex environments, greater robustness is required; therefore, the concept of keyframes [66] was introduced in addition to the primitive model [67].
Gül [68] presents a series of studies conducted at the university level in which participants were asked to create building mass models. The first study examined designers working solo with two tools: the MTUIs of the AR apps and analog tools. The second study examined designers collaborating while using analog tools. The studies had two goals: to observe changes in designer behavior while using AR apps, and to assess the affordances of different interfaces.
Simultaneously developing and updating a map of the real environment has also been a subject of interest in model-based tracking, with several notable developments. First, simultaneous localization and mapping (SLAM) was primarily developed for robot navigation in unknown environments [69]; in augmented reality [70][71], this technique was used for tracking an unknown environment in a drift-free manner. Second, parallel tracking and mapping [70] was developed especially for AR technology: the mapping of the environment and the camera tracking are performed as separate functions, which improved tracking accuracy as well as overall performance. However, like SLAM, it lacked the capability to close large loops in constrained environments and areas (Figure 4).
Figure 4. Hybrid tracking: inertial and SLAM combined and used in the latest mobile-based AR tracking.
Oskiper et al. [72] propose a simultaneous localization and mapping (SLAM) framework for sensor fusion, indexing, and feature matching in AR apps, built around a parallel mapping engine and an error-state extended Kalman filter (EKF). Zhang et al.'s [73] Jaguar is a mobile AR application with low-latency, flexible object tracking. The paper discusses the design, implementation, and evaluation of Jaguar, which enables markerless tracking through its client built on top of Google's ARCore. ARCore also provides context awareness, estimating the physical size of objects and recognizing them.
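The sensor-fusion role of the EKF in such pipelines can be illustrated with a much simpler 1D Kalman filter that blends fast, drifting inertial predictions with slower, noisy visual position fixes; all noise values and rates below are invented for illustration:

```python
import numpy as np

# 1D toy Kalman filter: inertial prediction at 10 Hz, visual fix every 1 s.
x, P = 0.0, 1.0            # position estimate and its variance
Q, R = 0.01, 0.25          # hypothetical process / measurement noise variances
dt, v = 0.1, 1.05          # time step and slightly biased inertial velocity

rng = np.random.default_rng(0)
for k in range(1, 101):
    x, P = x + v * dt, P + Q                        # predict (inertial dead-reckoning)
    if k % 10 == 0:                                 # a visual (e.g., SLAM) fix arrives
        z = k * dt * 1.0 + rng.normal(0.0, R**0.5)  # true velocity is 1.0 m/s
        K = P / (P + R)                             # Kalman gain
        x, P = x + K * (z - x), (1.0 - K) * P       # correct toward the vision fix
print(f"fused position: {x:.2f} m (ground truth: {100 * dt:.2f} m)")
```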

2.1.6. Global Positioning System—GPS Tracking

This technology refers to outdoor positional tracking with reference to the Earth. The present accuracy of GPS is about 3 m; however, improvements are available with advancements in satellite technology and other developments. Real-time kinematic (RTK) positioning is one example: it works by using the carrier phase of the GPS signal and can improve accuracy to the centimeter level. Feiner's Touring Machine [74] was the first AR system to utilize GPS in its tracking system, using an inclinometer/magnetometer and differential GPS positional tracking. The military, gaming [75][76], and viewers of historical data [77] have applied GPS tracking for AR experiences. As GPS supports only positional tracking at low accuracy, it is mainly useful in hybrid tracking systems or in applications where pose registration is not important. Ar et al. [78] use a GPS-INS receiver to develop more precise models of object motion. Ashutosh [79] explores the hardware challenges of AR technology, in particular two main components: battery performance and the global positioning system (GPS). Table 1 provides a succinct categorization of the prominent tracking technologies in augmented reality, referring to example studies while highlighting the advantages and challenges of each type of tracking technology and suggesting possible areas of application.
Table 1. Summary of tracking techniques and their related attributes.
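To show how a GPS fix becomes a local AR anchor, the sketch below converts latitude/longitude into east/north offsets from an origin fix using an equirectangular approximation (the coordinates are hypothetical, and the approximation only suits the short ranges relevant to AR anchoring):

```python
import math

def gps_to_local_enu(lat, lon, lat0, lon0):
    """Approximate east/north offsets (meters) of (lat, lon) from an origin fix.

    Equirectangular approximation: adequate over the short ranges where
    meter-level GPS/RTK anchoring for AR makes sense.
    """
    R = 6378137.0  # WGS-84 equatorial Earth radius, meters
    east = math.radians(lon - lon0) * R * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * R
    return east, north

# Hypothetical anchor and device fixes roughly 15 m apart:
print(gps_to_local_enu(48.85661, 2.35243, 48.85650, 2.35230))
```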

2.1.7. Miscellaneous Tracking

Yang et al. [98] propose cover recognition and tracking methods to distinguish the different forms of hatch covers having similar shapes. The results of the experiment demonstrate the approach's real-time performance and practicability, with tracking accuracy sufficient for an AR inspection environment. Kang et al. [99] propose a pupil tracker with several features that make AR more robust: key point alignment, eye-nose detection, and a near-infrared (NIR) LED that turns on and off depending on the illumination. The limitation of this detector is that it cannot be applied in low-light conditions. Moreover, Bach et al. [100] introduce an AR canvas for information visualization that is quite different from the traditional canvas, identifying the dimensions and essential aspects of visualization design for the AR canvas while enumerating several limitations of the process. Zeng et al. [101] discuss the design and implementation of FunPianoAR for creating a better AR piano learning experience; however, a number of discrepancies occurred with this system, and initiating a hybrid system is a more viable option. Rewkowski et al. [102] introduce a prototype AR system for visualizing the laparoscopic training task; it is capable of tracking small objects and supports surgical training using widely compatible and inexpensive borescopes.

2.1.8. Hybrid Tracking

Hybrid tracking systems were used to improve the following aspects of the tracking systems:
  • Improving the accuracy of the tracking system.
  • Coping with the weaknesses of the respective tracking methods.
  • Adding more degrees of freedom.
Gorovyi et al. [88] detail the basic principles that make up AR by proposing a hybrid visual tracking algorithm in which direct tracking techniques are combined with the optical flow technique to achieve precise and stable results. The results suggested that the two can be incorporated into a hybrid system and confirmed its success on devices with limited hardware capabilities. Previously, magnetic trackers [89] or inertial trackers [90] were used alongside vision-based tracking systems. Isham et al. [91] use a game controller and hybrid tracking to identify and resolve the ultrasound image position in a 3D AR environment. Such a hybrid system is beneficial for the following reasons:
  • Low drift of vision-based tracking.
  • Low jitter of vision-based tracking.
  • A robust sensor with high update rates; these characteristics decrease invalid pose computations and ensure the responsiveness of graphical updates [103].
  • More developed inertial and magnetic trackers, which extend the range of tracking and do not require a line of sight.
The above benefits suggest that utilizing a hybrid system is more beneficial than using inertial trackers alone.
In addition, Mao et al. [104] propose a new tracking system with a number of unique features. First, it accurately translates relative distance into absolute distance by locating the reference points at new positions. Second, it uses separate senders and receivers. Third, it resolves the discrepancy in sampling frequency between the sender and receiver. Finally, the frequency shift due to movement is explicitly handled by the system. Moreover, the combination of an IMU sensor and Doppler shift with distributed frequency-modulated continuous waveform (FMCW) enables continuous tracking of a mobile device across multiple time intervals. Evaluation suggested that the system can run on existing hardware and achieves millimeter-level accuracy.
A GPS tracking system alone provides only positional information and has low accuracy, so GPS tracking systems are usually combined with vision-based tracking or inertial sensors; this combination yields full 6DoF pose estimation [105]. Moreover, backup tracking systems have been developed as an alternative for when GPS fails [98][106]. Optical tracking systems [80] or ultrasonic rangefinders [81] can be coupled with inertial trackers to enhance efficiency; since the differential measurement approach causes drift, these hybrid systems help resolve it. Furthermore, the use of gravity as a reference normally binds inertial sensors to static platforms; a hybrid system allows them to operate in a simulator, a vehicle, or any other moving platform [107]. The introduction of accelerometers, cameras, gyroscopes [108], global positioning systems [109], and wireless networking [110] in mobile devices such as tablets and smartphones also creates an opportunity for hybrid tracking. Moreover, these devices are capable of determining accurate poses outdoors as well as indoors [111].
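The vision-plus-inertial pattern described above can be reduced to a complementary-filter sketch: high-rate inertial predictions accumulate drift, and each low-rate vision fix pulls the estimate back. The blending weight and rates below are illustrative assumptions, not values from the cited systems:

```python
# Complementary-filter sketch for hybrid tracking (1D position, illustrative).
ALPHA = 0.9  # hypothetical weight on the inertial prediction at each vision fix

def fuse(pos_inertial, pos_vision):
    """Blend a high-rate inertial estimate with a low-rate, low-drift vision fix."""
    return ALPHA * pos_inertial + (1.0 - ALPHA) * pos_vision

pos = 0.0
for step in range(1, 101):
    pos += 0.011                        # inertial update with a slight velocity bias
    if step % 25 == 0:                  # vision runs at 1/25 of the inertial rate
        pos = fuse(pos, step * 0.010)   # vision reports the unbiased position
print(f"fused estimate: {pos:.3f} m, ground truth: 1.000 m")
```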

2.2. Marker-Based Tracking

Fiducial Tracking: Artificial landmarks added to the environment to aid tracking and registration are known as fiducials. The complexity of fiducial tracking varies significantly depending on the technology and the application. Early systems typically used pieces of paper or small colored LEDs, which could be added to the environment and detected using color matching [112]. If enough fiducials with well-known positions are detected in the scene, the pose of the camera can be determined. Positioning a new fiducial on the basis of a well-known previous position brings the additional benefit that the workspace can be extended dynamically [113]. A QR-code-based fiducial/marker has also been proposed by some researchers for marker-/tag-based tracking [95]. As work progressed on the concept and complexity of fiducials, additional features such as multi-rings were introduced so that fiducials could be detected at much larger distances [96]. A minimum of four points with known positions is needed to calculate the pose of the viewer [97]. To make sure that four points are visible, these simpler fiducials demand more care and effort when being placed in the environment. Examples of such fiducials are ARToolkit and its successors, whose registration techniques are mostly based on planar fiducials. In the following section, AR display technologies are discussed, which fulfill the remaining conditions of Azuma's definition.
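The four-known-points requirement corresponds to the perspective-n-point (PnP) problem. The sketch below uses OpenCV's solvePnP with a hypothetical square fiducial, made-up corner detections, and assumed camera intrinsics to recover the viewer pose:

```python
import cv2
import numpy as np

# Four corners of a 20 cm square fiducial in world coordinates (meters):
object_pts = np.float32([[0, 0, 0], [0.2, 0, 0], [0.2, 0.2, 0], [0, 0.2, 0]])
# Hypothetical pixel positions where those corners were detected:
image_pts = np.float32([[320, 240], [420, 245], [415, 345], [315, 340]])
# Hypothetical pinhole intrinsics; None below means no lens distortion:
K = np.float32([[800, 0, 320], [0, 800, 240], [0, 0, 1]])

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
print(ok, rvec.ravel(), tvec.ravel())  # viewer pose relative to the fiducial
```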

2.3. Summary

The text above provides comprehensive details on tracking technologies, which are broadly classified into markerless and marker-based approaches. Both types have many subtypes whose details, applications, advantages, and drawbacks are provided above. The different categories of tracking technologies are presented in Figure 2, while a summary of tracking technologies is provided in Figure 5. Among the different tracking technologies, hybrid tracking technologies are the most adaptive.
Figure 5. Steps for combining real and virtual content.

References

  1. Kerber, R. Advanced tactic targeted grocer. The Boston Globe. 2008. Available online: https://seclists.org/isn/2008/Mar/126 (accessed on 20 October 2022).
  2. Mansfield-Devine, S. Interview: BYOD and the enterprise network. Comput. Fraud Secur. 2012, 2012, 14–17.
  3. Nofer, M.; Gomber, P.; Hinz, O.; Schiereck, D. Blockchain. Bus. Inf. Syst. Eng. 2017, 59, 183–187.
  4. Behzadan, A.H.; Aziz, Z.; Anumba, C.J.; Kamat, V.R. Ubiquitous location tracking for context-specific information delivery on construction sites. Autom. Constr. 2008, 17, 737–748.
  5. Bottani, E.; Vignali, G. Augmented reality technology in the manufacturing industry: A review of the last decade. IISE Trans. 2019, 51, 284–310.
  6. Sereno, M.; Wang, X.; Besançon, L.; McGuffin, M.J.; Isenberg, T. Collaborative work in augmented reality: A survey. IEEE Trans. Vis. Comput. Graph. 2020, 28, 2530–2549.
  7. Ens, B.; Quigley, A.; Yeo, H.S.; Irani, P.; Piumsomboon, T.; Billinghurst, M. Counterpoint: Exploring mixed-scale gesture interaction for AR applications. In Proceedings of the Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–6.
  8. Khan, D.; Ullah, S.; Yan, D.M.; Rabbi, I.; Richard, P.; Hoang, T.; Billinghurst, M.; Zhang, X. Robust tracking through the design of high quality fiducial markers: An optimization tool for ARToolKit. IEEE Access 2018, 6, 22421–22433.
  9. Hettig, J.; Engelhardt, S.; Hansen, C.; Mistelbauer, G. AR in VR: Assessing surgical augmented reality visualizations in a steerable virtual reality environment. Int. J. Comput. Assist. Radiol. Surg. 2018, 13, 1717–1725.
  10. Goh, E.S.; Sunar, M.S.; Ismail, A.W. 3D object manipulation techniques in handheld mobile augmented reality interface: A review. IEEE Access 2019, 7, 40581–40601.
  11. Kollatsch, C.; Klimant, P. Efficient integration process of production data into Augmented Reality based maintenance of machine tools. Prod. Eng. 2021, 15, 311–319.
  12. Bhattacharyya, P.; Nath, R.; Jo, Y.; Jadhav, K.; Hammer, J. Brick: Toward a model for designing synchronous colocated augmented reality games. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–9.
  13. Kim, K.; Billinghurst, M.; Bruder, G.; Duh, H.B.L.; Welch, G.F. Revisiting trends in augmented reality research: A review of the 2nd decade of ISMAR (2008–2017). IEEE Trans. Vis. Comput. Graph. 2018, 24, 2947–2962.
  14. Liu, H.; Wang, L. An AR-based worker support system for human-robot collaboration. Procedia Manuf. 2017, 11, 22–30.
  15. Cortés-Dávalos, A.; Mendoza, S. Collaborative Web Authoring of 3D Surfaces Using Augmented Reality on Mobile Devices. In Proceedings of the 2016 IEEE/WIC/ACM International Conference on Web Intelligence (WI), Omaha, NE, USA, 13–16 October 2016; pp. 640–643.
  16. Qiao, X.; Ren, P.; Dustdar, S.; Liu, L.; Ma, H.; Chen, J. Web AR: A promising future for mobile augmented reality—State of the art, challenges, and insights. Proc. IEEE 2019, 107, 651–666.
  17. Agati, S.S.; Bauer, R.D.; Hounsell, M.d.S.; Paterno, A.S. Augmented reality for manual assembly in industry 4.0: Gathering guidelines. In Proceedings of the 2020 22nd Symposium on Virtual and Augmented Reality (SVR), Porto de Galinhas, Brazil, 7–10 November 2020; pp. 179–188.
  18. Shukri, S.A.A.; Arshad, H.; Abidin, R.Z. The design guidelines of mobile augmented reality for tourism in Malaysia. In AIP Conference Proceedings; AIP Publishing LLC: Melville, NY, USA, 2017; Volume 1891, p. 020026.
  19. Fallahkhair, S.; Brito, C.A. Design Guidelines for Development of Augmented Reality Application with Mobile and Wearable Technologies for Contextual Learning. Braz. J. Technol. Commun. Cogn. Sci. 2019, 7, 1–16.
  20. Akçayır, M.; Akçayır, G. Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educ. Res. Rev. 2017, 20, 1–11.
  21. da Silva, M.M.; Teixeira, J.M.X.; Cavalcante, P.S.; Teichrieb, V. Perspectives on how to evaluate augmented reality technology tools for education: A systematic review. J. Braz. Comput. Soc. 2019, 25, 1–18.
  22. Sarkar, P.; Pillai, J.S.; Gupta, A. ScholAR: A collaborative learning experience for rural schools using Augmented Reality application. In Proceedings of the 2018 IEEE Tenth International Conference on Technology for Education (T4E), Chennai, India, 10–13 December 2018; pp. 8–15.
  23. Soleimani, H.; Jalilifar, A.; Rouhi, A.; Rahmanian, M. Augmented Reality and Virtual Reality Scaffoldings in Improving the Abstract Genre Structure in a Collaborative Learning Environment: A CALL Study. J. Engl. Lang. Teach. Learn. 2019, 11, 327–356.
  24. Hadar, E. Toward Development Tools for Augmented Reality Applications—A Practitioner Perspective. In Workshop on Enterprise and Organizational Modeling and Simulation; Springer: Berlin/Heidelberg, Germany, 2018; pp. 91–104.
  25. Mukhametshin, S.; Makhmutova, A.; Anikin, I. Sensor tag detection, tracking and recognition for AR application. In Proceedings of the 2019 International Conference on Industrial Engineering, Applications and Manufacturing (ICIEAM), Tokyo, Japan, 12–15 April 2019; pp. 1–5.
  26. Santoni, F.; De Angelis, A.; Moschitta, A.; Carbone, P. MagIK: A Hand-Tracking Magnetic Positioning System Based on a Kinematic Model of the Hand. IEEE Trans. Instrum. Meas. 2021, 70, 1–13.
  27. Frikha, R.; Ejbali, R.; Zaied, M. Handling occlusion in augmented reality surgical training based instrument tracking. In Proceedings of the 2016 IEEE/ACS 13th International Conference of Computer Systems and Applications (AICCSA), Agadir, Morocco, 29 November–2 December 2016; pp. 1–5.
  28. Wang, M.; Shi, Q.; Song, S.; Meng, M.Q.H. A novel magnetic tracking approach for intrabody objects. IEEE Sensors J. 2020, 20, 4976–4984.
  29. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User acceptance of computer technology: A comparison of two theoretical models. Manag. Sci. 1989, 35, 982–1003.
  30. Davison, A.J.; Reid, I.D.; Molton, N.D.; Stasse, O. MonoSLAM: Real-time single camera SLAM. IEEE Trans. Pattern Anal. Mach. Intell. 2007, 29, 1052–1067.
  31. De Smet, J. The Smart Contact Lens: From an Artificial Iris to a Contact Lens Display. Ph.D. Thesis, Ghent University, Ghent, Belgium, 2014.
  32. Dissanayake, M.G.; Newman, P.; Clark, S.; Durrant-Whyte, H.F.; Csorba, M. A solution to the simultaneous localization and map building (SLAM) problem. IEEE Trans. Robot. Autom. 2001, 17, 229–241.
  33. Dodgson, N.A. Autostereoscopic 3D displays. Computer 2005, 38, 31–36.
  34. De Smet, J.; Avci, A.; Joshi, P.; Schaubroeck, D.; Cuypers, D.; De Smet, H. Progress toward a liquid crystal contact lens display. J. Soc. Inf. Disp. 2013, 21, 399–406.
  35. Heidemann, G.; Bax, I.; Bekel, H. Multimodal interaction in an augmented reality scenario. In Proceedings of the 6th International Conference on Multimodal Interfaces, State College, PA, USA, 13–15 October 2004; pp. 53–60.
  36. Chakrabarty, A.; Morris, R.; Bouyssounouse, X.; Hunt, R. Autonomous indoor object tracking with the Parrot AR. Drone. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 7–10 June 2016; pp. 25–30.
  37. Buttner, S.; Sand, O.; Rocker, C. Exploring design opportunities for intelligent worker assistance: A new approach using projection-based AR and a novel hand-tracking algorithm. In Proceedings of the European Conference on Ambient Intelligence, Malaga, Spain, 26–28 April 2017; Springer: Berlin/Heidelberg, Germany, 2017; pp. 33–45.
  38. Gupta, S.; Chaudhary, R.; Gupta, S.; Kaur, A.; Mantri, A. A survey on tracking techniques in augmented reality based application. In Proceedings of the 2019 Fifth International Conference on Image Information Processing (ICIIP), Shimla, India, 15–17 November 2019; pp. 215–220.
  39. Krishna, V.; Ding, Y.; Xu, A.; Höllerer, T. Multimodal biometric authentication for VR/AR using EEG and eye tracking. In Proceedings of the Adjunct of the 2019 International Conference on Multimodal Interaction, Suzhou, China, 14–18 October 2019; pp. 1–5.
  40. Dzsotjan, D.; Ludwig-Petsch, K.; Mukhametov, S.; Ishimaru, S.; Kuechemann, S.; Kuhn, J. The Predictive Power of Eye-Tracking Data in an Interactive AR Learning Environment. In Proceedings of the Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers, Virtual, 21–26 September 2021; pp. 467–471.
  41. Chen, L.; Cao, C.; De la Torre, F.; Saragih, J.; Xu, C.; Sheikh, Y. High-fidelity Face Tracking for AR/VR via Deep Lighting Adaptation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 13059–13069.
  42. Rambach, J.; Pagani, A.; Schneider, M.; Artemenko, O.; Stricker, D. 6DoF object tracking based on 3D scans for augmented reality remote live support. Computers 2018, 7, 6.
  43. Ha, T.; Billinghurst, M.; Woo, W. An interactive 3D movement path manipulation method in an augmented reality environment. Interact. Comput. 2012, 24, 10–24.
  44. Syahidi, A.A.; Tolle, H.; Supianto, A.A.; Arai, K. AR-Child: Analysis, Evaluation, and Effect of Using Augmented Reality as a Learning Media for Preschool Children. In Proceedings of the 2019 5th International Conference on Computing Engineering and Design (ICCED), Purwokerto, Indonesia, 5–6 August 2019; pp. 1–6.
  45. Wang, X.; Dunston, P.S. User perspectives on mixed reality tabletop visualization for face-to-face collaborative design review. Autom. Constr. 2008, 17, 399–412.
  46. Wang, X.; Dunston, P.S. Comparative effectiveness of mixed reality-based virtual environments in collaborative design. IEEE Trans. Syst. Man Cybern. Part C 2011, 41, 284–296.
  47. Hauptmann, A.G. Speech and gestures for graphic image manipulation. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 30 April–4 June 1989; pp. 241–245.
  48. Heath, C.; Luff, P. Disembodied conduct: Communication through video in a multi-media office environment. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 27 April–2 May 1991; pp. 99–103.
  49. Rambach, J.; Pagani, A.; Stricker, D. Augmented things: Enhancing AR applications leveraging the internet of things and universal 3D object tracking. In Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Nantes, France, 9–13 October 2017; pp. 103–108.
  50. Viyanon, W.; Songsuittipong, T.; Piyapaisarn, P.; Sudchid, S. AR furniture: Integrating augmented reality technology to enhance interior design using marker and markerless tracking. In Proceedings of the 2nd International Conference on Intelligent Information Processing, Bangkok Thailand, 17–18 July 2017; pp. 1–7.
  51. Turkan, Y.; Radkowski, R.; Karabulut-Ilgu, A.; Behzadan, A.H.; Chen, A. Mobile augmented reality for teaching structural analysis. Adv. Eng. Inform. 2017, 34, 90–100.
  52. Dorfmüller, K. Robust tracking for augmented reality using retroreflective markers. Comput. Graph. 1999, 23, 795–800.
  53. Danielsson, O.; Holm, M.; Syberfeldt, A. Augmented reality smart glasses for operators in production: Survey of relevant categories for supporting operators. Procedia CIRP 2020, 93, 1298–1303.
  54. Dörner, R.; Geiger, C.; Haller, M.; Paelke, V. Authoring mixed reality—A component and framework-based approach. In Entertainment Computing; Springer: Berlin/Heidelberg, Germany, 2003; pp. 405–413.
  55. Drascic, D.; Milgram, P. Positioning accuracy of a virtual stereographic pointer in a real stereoscopic video world. In Proceedings of the Stereoscopic Displays and Applications II, San Jose, CA, USA, 25–27 February 1991; Volume 1457, pp. 302–313.
  56. Drascic, D.; Grodski, J.J.; Milgram, P.; Ruffo, K.; Wong, P.; Zhai, S. ARGOS: A display system for augmenting reality. In Proceedings of the INTERACT’93 and CHI’93 Conference on Human Factors in Computing Systems, Amsterdam, The Netherlands, 24–29 April 1993; p. 521.
  57. Drascic, D.; Milgram, P. Perceptual issues in augmented reality. In Proceedings of the Stereoscopic Displays and Virtual Reality Systems III. International Society for Optics and Photonics, San Jose, CA, USA, 30 January–2 February 1996; Volume 2653, pp. 123–134.
  58. Dünser, A. Supporting low ability readers with interactive augmented reality. Annu. Rev. Cybertherapy Telemed. 2008, 6, 39–46.
  59. Dünser, A.; Hornecker, E. Lessons from an AR book study. In Proceedings of the 1st International Conference on Tangible and Embedded Interaction, Baton Rouge, LA, USA, 15–17 February 2007; pp. 179–182.
  60. Gibson, L.; Hanson, V.L. Digital motherhood: How does technology help new mothers? In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France, 27 April–2 May 2013; pp. 313–322.
  61. Wuest, H.; Engelke, T.; Wientapper, F.; Schmitt, F.; Keil, J. From CAD to 3D Tracking—Enhancing & Scaling Model-Based Tracking for Industrial Appliances. In Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Merida, Mexico, 19–23 September 2016; pp. 346–347.
  62. LaViola, J.J. A discussion of cybersickness in virtual environments. ACM SIGCHI Bull. 2000, 32, 47–56.
  63. Gao, Y.F.; Wang, H.Y.; Bian, X.N. Marker tracking for video-based augmented reality. In Proceedings of the 2016 International Conference on Machine Learning and Cybernetics (ICMLC), Jeju Island, Republic of Korea, 10–13 July 2016; Volume 2, pp. 928–932.
  64. Szalavári, Z.; Eckstein, E.; Gervautz, M. Collaborative gaming in augmented reality. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, Taipei, Taiwan, 2–5 November 1998; pp. 195–204.
  65. Dolata, M.; Agotai, D.; Schubiger, S.; Schwabe, G. Pen-and-paper rituals in service interaction: Combining high-touch and high-tech in financial advisory encounters. Proc. ACM Hum.-Comput. Interact. 2019, 3, 1–24.
  66. Butz, A.; Hollerer, T.; Feiner, S.; MacIntyre, B.; Beshers, C. Enveloping users and computers in a collaborative 3D augmented reality. In Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR’99), San Francisco, CA, USA, 20–21 October 1999; pp. 35–44.
  67. Müller, J.; Rädle, R.; Reiterer, H. Remote collaboration with mixed reality displays: How shared virtual landmarks facilitate spatial referencing. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 6481–6486.
  68. Gül, L.F. Studying gesture-based interaction on a mobile augmented reality application for co-design activity. J. Multimodal User Interfaces 2018, 12, 109–124.
  69. Benko, H.; Ishak, E.W.; Feiner, S. Collaborative mixed reality visualization of an archaeological excavation. In Proceedings of the Third IEEE and ACM International Symposium on Mixed and Augmented Reality, Arlington, VA, USA, 5 November 2004; pp. 132–140.
  70. Franz, J.; Alnusayri, M.; Malloch, J.; Reilly, D. A comparative evaluation of techniques for sharing AR experiences in museums. Proc. ACM Hum.-Comput. Interact. 2019, 3, 1–20.
  71. An, Z.; Xu, X.; Yang, J.; Liu, Y.; Yan, Y. Research of the three-dimensional tracking and registration method based on multiobjective constraints in an AR system. Appl. Opt. 2018, 57, 9625–9634.
  72. Oskiper, T.; Samarasekera, S.; Kumar, R. CamSLAM: Vision Aided Inertial Tracking and Mapping Framework for Large Scale AR Applications. In Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Nantes, France, 9–13 October 2017; pp. 216–217.
  73. Zhang, W.; Han, B.; Hui, P. Jaguar: Low latency mobile augmented reality with flexible tracking. In Proceedings of the 26th ACM International Conference on Multimedia, Seoul, Republic of Korea, 22–26 October 2018; pp. 355–363.
  74. Tokusho, Y.; Feiner, S. Prototyping an outdoor mobile augmented reality street view application. In ISMAR Workshop on Outdoor Mixed and Augmented Reality; Citeseer: Princeton, NJ, USA, 2009; Volume 2.
  75. Henrysson, A.; Ollila, M. UMAR: Ubiquitous mobile augmented reality. In Proceedings of the 3rd International Conference on Mobile and Ubiquitous Multimedia, College Park, MD, USA, 27–29 October 2004; pp. 41–45.
  76. Henrysson, A.; Ollila, M.; Billinghurst, M. Mobile phone based AR scene assembly. In Proceedings of the 4th International Conference on Mobile and Ubiquitous Multimedia, Christchurch, New Zealand, 8–10 December 2005; pp. 95–102.
  77. Hilliges, O.; Kim, D.; Izadi, S.; Weiss, M.; Wilson, A. HoloDesk: Direct 3d interactions with a situated see-through display. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; pp. 2421–2430.
  78. Ar, Y.; Ünal, M.; Sert, S.Y.; Bostanci, E.; Kanwal, N.; Güzel, M.S. Evolutionary Fuzzy Adaptive Motion Models for User Tracking in Augmented Reality Applications. In Proceedings of the 2018 2nd International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT), Ankara, Turkey, 19–21 October 2018; pp. 1–6.
  79. Ashutosh, K. Hardware Performance Analysis of Mobile-Based Augmented Reality Systems. In Proceedings of the 2020 International Conference on Computational Performance Evaluation (ComPE), Shillong, India, 2–4 July 2020; pp. 671–675.
  80. Hu, X.; Hua, H. Design of an optical see-through multi-focal-plane stereoscopic 3d display using freeform prisms. In Frontiers in Optics; Optical Society of America: Washington, DC, USA, 2012; p. FTh1F-2.
  81. Hourcade, J.P. Interaction Design and Children; Now Publishers Inc.: Delft, The Netherlands, 2008.
  82. Kang, D.; Ma, L. Real-Time Eye Tracking for Bare and Sunglasses-Wearing Faces for Augmented Reality 3D Head-Up Displays. IEEE Access 2021, 9, 125508–125522.
  83. Jeong, J.; Lee, C.K.; Lee, B.; Lee, S.; Moon, S.; Sung, G.; Lee, H.S.; Lee, B. Holographically printed freeform mirror array for augmented reality near-eye display. IEEE Photonics Technol. Lett. 2020, 32, 991–994.
  84. Park, S.g. Augmented and mixed reality optical see-through combiners based on plastic optics. Inf. Disp. 2021, 37, 6–11.
  85. Lee, Y.H.; Zhan, T.; Wu, S.T. Prospects and challenges in augmented reality displays. Virtual Real. Intell. Hardw. 2019, 1, 10–20.
  86. Jang, C.; Mercier, O.; Bang, K.; Li, G.; Zhao, Y.; Lanman, D. Design and fabrication of freeform holographic optical elements. ACM Trans. Graph. (TOG) 2020, 39, 1–15.
  87. Yu, C.; Peng, Y.; Zhao, Q.; Li, H.; Liu, X. Highly efficient waveguide display with space-variant volume holographic gratings. Appl. Opt. 2017, 56, 9390–9397.
  88. Gorovyi, I.M.; Sharapov, D.S. Advanced image tracking approach for augmented reality applications. In Proceedings of the 2017 Signal Processing Symposium (SPSympo), Auckland, New Zealand, 27–30 November 2017; pp. 1–5.
  89. Hix, D.; Gabbard, J.L.; Swan, J.E.; Livingston, M.A.; Hollerer, T.H.; Julier, S.J.; Baillot, Y.; Brown, D. A cost-effective usability evaluation progression for novel interactive systems. In Proceedings of the 37th Annual Hawaii International Conference on System Sciences, Big Island, HI, USA, 5–8 January 2004.
  90. Hodges, S.; Williams, L.; Berry, E.; Izadi, S.; Srinivasan, J.; Butler, A.; Smyth, G.; Kapur, N.; Wood, K. SenseCam: A retrospective memory aid. In Proceedings of the International Conference on Ubiquitous Computing, Seoul, Republic of Korea, 21–24 September 2008; Springer: Berlin/Heidelberg, Germany, 2006; pp. 177–193.
  91. Isham, M.I.M.; Mohamed, F.; Siang, C.V.; Yusoff, Y.A.; Abd Aziz, A.A.; Dewi, D.E.O. A framework of ultrasounds image slice positioning and orientation in 3D augmented reality environment using hybrid tracking method. In Proceedings of the 2018 IEEE Conference on Big Data and Analytics (ICBDA), Seattle, WA, USA, 10–13 December 2018; pp. 105–110.
  92. Park, J.H.; Kim, S.B. Optical see-through holographic near-eye-display with eyebox steering and depth of field control. Opt. Express 2018, 26, 27076–27088.
  93. Chakravarthula, P.; Peng, Y.; Kollin, J.; Fuchs, H.; Heide, F. Wirtinger holography for near-eye displays. ACM Trans. Graph. (TOG) 2019, 38, 1–13.
  94. Peng, Y.; Choi, S.; Padmanaban, N.; Wetzstein, G. Neural holography with camera-in-the-loop training. ACM Trans. Graph. (TOG) 2020, 39, 1–14.
  95. Ruan, W.; Yao, L.; Sheng, Q.Z.; Falkner, N.J.; Li, X. Tagtrack: Device-free localization and tracking using passive rfid tags. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, London, UK, 2–5 December 2014; pp. 80–89.
  96. Ellis, S.R.; Menges, B.M. Studies of the Localization of Virtual Objects in the Near Visual Field. In Fundamentals of Wearable Computers and Augmented Reality, 1st ed.; CRC Press: Boca Raton, FL, USA, 2001.
  97. Evennou, F.; Marx, F. Advanced integration of WiFi and inertial navigation systems for indoor mobile positioning. EURASIP J. Adv. Signal Process. 2006, 2006, 1–11.
  98. Yang, X.; Fan, X.; Wang, J.; Yin, X.; Qiu, S. Edge-based cover recognition and tracking method for an AR-aided aircraft inspection system. Int. J. Adv. Manuf. Technol. 2020, 111, 3505–3518.
  99. Kang, D.; Heo, J.; Kang, B.; Nam, D. Pupil detection and tracking for AR 3D under various circumstances. Electron. Imaging 2019, 2019, 55-1–55-5.
  100. Bach, B.; Sicat, R.; Pfister, H.; Quigley, A. Drawing into the AR-CANVAS: Designing embedded visualizations for augmented reality. In Workshop on Immersive Analytics; IEEE Vis: Piscataway, NJ, USA, 2017.
  101. Zeng, H.; He, X.; Pan, H. FunPianoAR: A novel AR application for piano learning considering paired play based on multi-marker tracking. J. Phys. Conf. Ser. 2019, 1229, 012072.
  102. Rewkowski, N.; State, A.; Fuchs, H. Small Marker Tracking with Low-Cost, Unsynchronized, Movable Consumer Cameras For Augmented Reality Surgical Training. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Recife, Brazil, 9–13 November 2020; pp. 90–95.
  103. Hoffman, H.G. Physically touching virtual objects using tactile augmentation enhances the realism of virtual environments. In Proceedings of the IEEE 1998 Virtual Reality Annual International Symposium (Cat. No. 98CB36180), Atlanta, GA, USA, 14–18 March 1998; pp. 59–63.
  104. Mao, W.; He, J.; Qiu, L. Cat: High-precision acoustic motion tracking. In Proceedings of the 22nd Annual International Conference on Mobile Computing and Networking, New York, NY, USA, 3–7 October 2016; pp. 69–81.
  105. Höllerer, T.; Wither, J.; DiVerdi, S. “Anywhere augmentation”: Towards mobile augmented reality in unprepared environments. In Location Based Services and TeleCartography; Springer: Berlin/Heidelberg, Germany, 2007; pp. 393–416.
  106. Hong, J. Considering privacy issues in the context of Google glass. Commun. ACM 2013, 56, 10–11.
  107. Hua, H.; Brown, L.D.; Gao, C.; Ahuja, N. A new collaborative infrastructure: SCAPE. In Proceedings of the IEEE Virtual Reality, 2003. Proceedings, Los Angeles, CA, USA, 22–26 March 2003; IEEE: Piscataway, NJ, USA, 2003; pp. 171–179.
  108. Huang, Y.; Weng, D.; Liu, Y.; Wang, Y. Key issues of wide-area tracking system for multi-user augmented reality adventure game. In Proceedings of the 2009 Fifth International Conference on Image and Graphics, Xi’an, China, 20–23 September 2009; pp. 646–651.
  109. Huber, M.; Pustka, D.; Keitler, P.; Echtler, F.; Klinker, G. A system architecture for ubiquitous tracking environments. In Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, 13–16 November 2007; pp. 211–214.
  110. Hugues, O.; Fuchs, P.; Nannipieri, O. New augmented reality taxonomy: Technologies and features of augmented environment. In Handbook of Augmented Reality; Springer: Berlin/Heidelberg, Germany, 2011; pp. 47–63.
  111. Inami, M.; Kawakami, N.; Sekiguchi, D.; Yanagida, Y.; Maeda, T.; Tachi, S. Visuo-haptic display using head-mounted projector. In Proceedings of the Proceedings IEEE Virtual Reality 2000 (Cat. No. 00CB37048), New Brunswick, NJ, USA, 18–22 March 2000; IEEE: Piscataway, NJ, USA, 2000; pp. 233–240.
  112. Ekman, P.; Friesen, W.V. The Repertoire of Nonverbal Behavior: Categories, Origins, Usage, and Coding; De Gruyter Mouton: Berlin, Germany, 2010.
  113. Ellis, S.R.; Menges, B.M. Judgments of the distance to nearby virtual objects: Interaction of viewing conditions and accommodative demand. Presence Teleoperators Virtual Environ. 1997, 6, 452–460.