Geomatic Sensors for Heritage Documentation: History

Geomatic technologies have been widely adopted for cultural heritage applications, and the scientific field is quite broad: from underwater to close-range to low-altitude and satellite observations. Geomatic sensors have been used in applications such as close-range approaches with red-green-blue (RGB) cameras and Terrestrial Laser Scanners (TLS), as well as underwater studies. Low-altitude sensors on Unmanned Aerial Vehicles (UAVs) have also been widely used with RGB and multispectral cameras, as well as lidar and thermal sensors.

  • sensors
  • heritage
  • cameras

1. Introduction

In the past, a variety of sensors has been used for the documentation and monitoring of heritage sites [1][2][3][4][5][6]. As technology has advanced and sensor capabilities have increased, studies on the subject have multiplied. Geomatic sensors have been used in applications such as close-range approaches with red-green-blue (RGB) cameras and Terrestrial Laser Scanners (TLS), as well as underwater studies [7][8][9][10][11]. Low-altitude sensors on Unmanned Aerial Vehicles (UAVs) have also been widely used with RGB and multispectral cameras, as well as lidar and thermal sensors [12][13][14][15]. Additionally, researchers have been interested in using aerial and satellite sensors for observing heritage sites and monuments on a macro scale [16][17].

2. Geomatic Sensors for Heritage Documentation

2.1. Close-Range Sensors

2.1.1. RGB Sensors

RGB sensors are passive sensors, commonly used for close-range photogrammetric applications. These sensors operate in the visible part of the spectrum, between 380 and 750 nm. While CMOS sensors are sensitive to approximately 350–1050 nm, an infrared filter (750–1000 nm) is applied to reproduce the natural colors visible to humans. By replacing the infrared filter with a red one, some cameras can easily be converted to near-infrared cameras, so that the CMOS sensor records infrared wavelengths in the red channel of the RGB image file [18][19].
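As a rough illustration of the near-infrared conversion described above, the sketch below (NumPy; function names and the channel ordering are this example's own assumptions) isolates the NIR response that such a converted camera records in the red channel and computes a simple vegetation-style index from it:

```python
import numpy as np

def extract_nir_channel(rgb_image: np.ndarray) -> np.ndarray:
    """For a camera converted to near infrared (IR-cut filter replaced by a
    red-pass filter), the red channel of the RGB file carries the NIR
    response; this simply isolates it. Channel order assumed R, G, B."""
    return rgb_image[..., 0]

def ndvi_like_index(rgb_image: np.ndarray) -> np.ndarray:
    """An illustrative vegetation-style index contrasting the NIR-laden
    red channel with the visible blue channel."""
    nir = rgb_image[..., 0].astype(float)
    vis = rgb_image[..., 2].astype(float)
    return (nir - vis) / np.maximum(nir + vis, 1e-9)  # avoid divide-by-zero
```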
The versatility and variety of available RGB cameras in the market are unmatched by any other sensor in the cultural heritage field [20]. Cameras can be classified based on their sensors and lenses [21]. CMOS sensors are usually classified according to their physical size and resolution, which affects the physical pixel size and amount of recorded light. Other sensor characteristics of interest include the color pattern and an antialiasing digital filter.
Lenses are characterized by their focal length and material. Focal length affects the size of the covered area at a given object-to-camera distance and depends on the application. Two lens materials are available: plastic (acrylic) and glass (crystal), with the latter being preferable. Other important characteristics are the number of elements in the lens, chromatic aberrations, distortion, and whether it is a prime or zoom lens [21].
Camera manufacturers prioritize different characteristics, based on the intended application and the final cost of the camera. For example, camera rigidity is desirable for 3D reconstruction applications, but it increases camera weight, making it challenging to mount the camera on a drone. Cameras with interchangeable lenses are more versatile, but this feature also increases camera size and weight. LCD screens and control dials are desirable for professional users, but useless if the device is mounted on a drone. Other characteristics of interest for specific applications include recording in raw format, a wired or wireless connection for remote control, triggering and data recording, a hot shoe for flash, and synchronization for precise triggering.
The primary purpose of such sensors is general-purpose recording and documentation, but they are increasingly used for 3D reconstruction through Structure from Motion (SfM) and Multi-View Stereo (MVS) techniques [22][23]. Calibration is necessary for both types of measurements to achieve high standards [24]. A color checker in the frame of each photo is usually sufficient to achieve color accuracy, while accurate 3D reconstruction requires a rigid camera [25][26]. The camera must be either calibrated before the photo acquisition or self-calibrated during post-processing, while several ground control points must be measured using a higher-order accuracy method, e.g., a total station, to ensure high geometric accuracy and georeferencing.
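The georeferencing step mentioned above, fitting an arbitrarily scaled SfM model onto surveyed ground control points, can be posed as a 3D similarity (Helmert) transform. The snippet below is a minimal sketch using the standard Umeyama/Kabsch procedure; the function name and interface are illustrative, not from a specific library:

```python
import numpy as np

def similarity_transform(model_pts, control_pts):
    """Estimate scale s, rotation R, translation t mapping model points
    onto ground control points, minimizing ||Y - (s R X + t)||^2
    (3D Helmert transform via the Umeyama/Kabsch procedure)."""
    X = np.asarray(model_pts, float)   # SfM model coordinates
    Y = np.asarray(control_pts, float) # surveyed control coordinates
    mx, my = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - mx, Y - my
    U, S, Vt = np.linalg.svd(Yc.T @ Xc)        # cross-covariance SVD
    d = np.sign(np.linalg.det(U @ Vt))          # guard against reflection
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt
    s = (S * np.diag(D)).sum() / (Xc ** 2).sum()
    t = my - s * R @ mx
    return s, R, t
```

With at least three well-distributed control points this resolves the model's scale and datum; in practice the points are also used to assess residual geometric accuracy.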
While there are many different sensors in the market, it is essential to highlight the 360° cameras, which have gained attention from the community as an easy means to record data quickly without missing any information [27][28][29]. There are two main applications for such cameras: virtual tours and 3D reconstruction. The former is served even by affordable commercial cameras, but the latter requires high-end dedicated cameras. Such cameras consist of an array of sensors, which are triggered simultaneously. The most common and affordable approach is two small image sensors mounted back-to-back, coupled with 180° spherical lenses. More expensive implementations consist of 3–25 sensors and lenses connected by a rigid body and triggered simultaneously. The advantage of more sensors is that each covers a much smaller field of view, limiting lens distortions and increasing the overall resolution. The same considerations as for single-lens cameras apply to each sensor–lens pair. It should be mentioned that a camera with multiple lenses/sensors should be used for 3D reconstruction if good results are expected [27].
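As a rough sizing illustration (a hypothetical rule of thumb, not a manufacturer's specification), each lens in an N-sensor rig must cover at least 360°/N horizontally, plus some overlap so neighboring images can be stitched:

```python
def per_lens_fov(num_sensors: int, overlap_fraction: float = 0.2) -> float:
    """Horizontal field of view each lens must cover so that N lenses
    arranged around 360 degrees overlap by the given fraction.
    Illustrative sizing rule only."""
    return 360.0 / num_sensors * (1.0 + overlap_fraction)
```

For example, a 6-sensor rig with 20% overlap needs roughly 72° per lens, whereas a 2-sensor back-to-back design needs fisheye lenses near or beyond 180°, which is why the multi-sensor designs suffer less distortion.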

2.1.2. Terrestrial Laser Scanners

Terrestrial Laser Scanners (TLS) are active sensors based on LiDAR technology [30], emitting laser pulses in the 900–1064 nm wavelength range, which are reflected by surrounding objects and returned to the scanner. The scanner measures the time of flight (TOF) or phase shift and calculates the distance to the reflecting surface. Modern TLS cover a 360° × 270° window or even more, and they can acquire points at rates between 30 K and 2 M points per second [31][32][33].
Besides the laser measurement technology (TOF or phase shift), which directly affects acquisition rate and range, other essential characteristics of TLS include angular resolution, distance accuracy, signal-to-noise ratio, multiple returns, and a coaxial RGB camera. The final point accuracy from the laser head is a combination of distance and angular accuracy, and varies roughly in the 5–15 mm @ 100 m range.
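The TOF ranging principle and the point error budget described above can be sketched as follows; the function names and the example accuracy figures are illustrative:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(round_trip_time_s: float) -> float:
    """Time-of-flight ranging: the pulse travels to the surface and back,
    so range = c * t / 2."""
    return C * round_trip_time_s / 2.0

def point_error(range_m: float, dist_acc_m: float, angular_acc_rad: float) -> float:
    """Combine distance accuracy with the lateral error that angular
    accuracy induces at a given range (simple quadrature sketch)."""
    lateral = range_m * angular_acc_rad  # arc length at that range
    return math.sqrt(dist_acc_m ** 2 + lateral ** 2)
```

For instance, a scanner with 3 mm distance accuracy and 50 µrad angular accuracy would yield a combined point error of about 6 mm at 100 m, consistent with the 5–15 mm @ 100 m figure quoted above.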
The collected point clouds from TLS are co-registered or geo-registered during post-processing. The former may be done using sphere targets, and the latter using targets measured with other methods, usually a combination of total stations and Global Navigation Satellite Systems (GNSS). For the final alignment, Iterative Closest Point (ICP) algorithms are employed [34][35][36][37]. Most TLS use complementary sensors, such as GNSS, barometers, and digital compasses, to estimate the initial position and accelerate alignment during post-processing. Some modern TLS use LiDAR or visual Simultaneous Localization And Mapping (SLAM) techniques to co-register neighboring scans instead of the aforementioned sensors.
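The ICP alignment mentioned above can be illustrated with a minimal point-to-point sketch: pair each source point with its nearest target point, solve the best rigid transform (Kabsch), and repeat. Production implementations add subsampling, outlier rejection, and k-d trees; this brute-force version is for illustration only:

```python
import numpy as np

def best_rigid_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B
    via the Kabsch algorithm."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)              # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cb - R @ ca

def icp(source, target, iterations=20):
    """Minimal point-to-point ICP: brute-force nearest neighbours plus a
    rigid-transform solve per iteration. Returns the aligned source."""
    src = np.asarray(source, float).copy()
    tgt = np.asarray(target, float)
    for _ in range(iterations):
        d = ((src[:, None, :] - tgt[None, :, :]) ** 2).sum(-1)
        matched = tgt[d.argmin(axis=1)]    # closest target per source point
        R, t = best_rigid_transform(src, matched)
        src = src @ R.T + t
    return src
```

ICP only converges from a reasonable initial guess, which is exactly why the complementary sensors (GNSS, compass, barometer) or sphere targets are used to provide the coarse alignment first.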
Simultaneous Localization And Mapping (based on visual, inertial, or combined data) is also used to eliminate the need for the scanner to be stationary [38][39][40][41][42]. Such scanners may be handheld, backpack-, car-, or drone-mounted. The user holds a rotating laser profiler while walking around and inside the monument. Recorded data are stored and merged into a single point cloud during post-processing. Such methods are faster in data acquisition, i.e., a monument can be covered in a fraction of the time required by stationary TLS. However, they are of inferior accuracy, varying from 30 mm to 50 mm @ 100 m range. Professional calibration and servicing of TLS is necessary, as they are complex and sensitive instruments [43][44].

2.2. Low-Altitude Sensors

Similar sensors are also used in low-altitude applications. The drone RGB sensor is like the RGB sensor discussed previously, but mounting it onboard a drone allows for more advantageous positions and angles for photography. The rise of location-aware drones equipped with single-frequency GNSS at the beginning of the 2010s allowed for autonomous flights aimed at large-scale mapping [45][46]. In the following years, multicopter drones were extensively used with oblique photographs for detailed 3D reconstruction of cultural heritage monuments and sites [47][48][49][50][51].
Cameras onboard drones have similar characteristics to standard ones but must be optimized for weight and space. Additionally, given that the object-to-camera distance may easily be altered by choosing a proper flying height, the need for interchangeable lenses is limited. The image scale can be controlled by the flying height rather than the lens focal length. Wide-angle lenses are adopted in most cases, since they also provide a favorable base-to-height ratio for better height precision.
Drone vendors prefer small and light cameras, hence cameras free of LCD screens, dials and buttons, viewfinders, etc. In fact, they adopt small custom-made (Original Equipment Manufacturer) cameras, focusing on the best lens–sensor combination and optimizing for size and weight.
Although the camera and drone should be considered as two separate pieces of equipment, each with its own characteristics, vendors dominating the recreational market have introduced combo solutions and have unified characteristics for their products, limiting users’ choices. Some drones allow for payload choices, including various RGB cameras, thermal, multispectral, hyperspectral, and LiDAR sensors [52][53][54][55][56][57], but these are aimed at specialized applications/customers.

2.3. Underwater Sensors

The underwater RGB camera is a passive sensor, but it becomes active when flash or artificial lights are used. Natural sunlight is heavily attenuated with depth, and taking photos without an artificial light source becomes impracticable. Apart from the passive/active nature of the underwater RGB sensor and the limitations imposed by the environment, two more shortcomings need to be noted concerning the recorded information.
Water strongly absorbs the infrared, red, and green wavelengths (in that order, from shallow to deep), so the recorded color is reduced to blue. Color accuracy therefore cannot be ensured, even with color checkers, because light attenuation depends on environmental parameters and on the lights-to-object-to-camera distance, which varies from pixel to pixel. As a result, intense illumination and color differences appear in underwater photos. This problem is an active field of research on haze-removal and color-restoration techniques [58]. So far, no algorithm works universally.
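The wavelength-dependent attenuation can be sketched with the Beer-Lambert law; the per-channel coefficients below are rough illustrative values for clear ocean water, not measured constants, and real attenuation varies strongly with water type and turbidity:

```python
import math

# Rough per-metre attenuation coefficients for clear ocean water
# (illustrative assumptions only; red is absorbed fastest, blue slowest).
ATTENUATION_PER_M = {"red": 0.30, "green": 0.07, "blue": 0.03}

def remaining_intensity(channel: str, path_length_m: float) -> float:
    """Beer-Lambert falloff I/I0 = exp(-k * d) along the full
    lights-to-object-to-camera path length d."""
    return math.exp(-ATTENUATION_PER_M[channel] * path_length_m)
```

Over a 10 m path, red would retain only about 5% of its intensity under these assumptions, while blue retains about 74%, which is why uncorrected underwater photos appear blue-green.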
Having the camera in a watertight enclosure means the light travels through multiple media (water, glass, air, glass, sensor); hence, the photogrammetric principle of straight-line light propagation is invalid. Provided that the camera is rigidly fixed to the lens port and there are no severe misalignments, the geometric image deformations are radial and tangential about the principal point or near it. Therefore, they can be compensated for with existing lens-distortion models, and the whole process is resolved through camera self-calibration. Dome ports are more suitable than flat ones, since the latter introduce several other deformations, such as strong chromatic aberration. After the emergence of SfM–MVS techniques, several applications for underwater heritage geometric documentation have been published [59][60][61][62][63][64].
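The radial and tangential deformations mentioned above are commonly described with the Brown-Conrady lens-distortion model, whose coefficients are the quantities that self-calibration estimates; a minimal sketch in normalized image coordinates:

```python
def brown_distortion(x: float, y: float,
                     k1: float, k2: float, k3: float,
                     p1: float, p2: float):
    """Brown-Conrady model: radial (k1..k3) and tangential (p1, p2)
    displacement of normalized image coordinates (x, y) measured
    relative to the principal point."""
    r2 = x * x + y * y                                   # squared radius
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3   # radial factor
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d
```

With all coefficients zero the mapping is the identity; self-calibration during bundle adjustment fits k1–k3 and p1–p2 (together with the interior orientation) so that the combined water-glass-air refraction is absorbed into the same model.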

2.4. Aerial and Satellite Sensors

Aerial and satellite sensors have been widely used for cultural heritage [65][66][67]. Aerial photogrammetry is one of the oldest techniques for reconnaissance over extensive archaeological landscapes and heritage objects [68]. Archive aerial images are now considered of great value, as they can provide valuable information on landscapes that have been changing due to modern construction [69][70]. Similarly, the role of sensors onboard satellite platforms has increased in the last decade, mainly due to improvements in the space sector, which can now provide imagery of enhanced spatial and spectral resolution. Satellites today can provide multispectral and hyperspectral data covering approximately 380 nm to 2500 nm, while thermal sensors are becoming available at very high resolution (5 m) [71][72][73][74][75].

This entry is adapted from the peer-reviewed paper 10.3390/heritage6100357

References

  1. Markiewicz, J.; Tobiasz, A.; Kot, P.; Muradov, M.; Shaw, A.; Al-Shamma’a, A. Review of surveying devices for structural health monitoring of cultural heritage buildings. In Proceedings of the 2019 12th International Conference on Developments in eSystems Engineering (DeSE), Kazan, Russia, 7–10 October 2019; pp. 597–601.
  2. Adamopoulos, E.; Rinaudo, F. Close-range sensing and data fusion for built heritage inspection and monitoring—A review. Remote Sens. 2021, 13, 3936.
  3. Kot, P.; Markiewicz, J.; Muradov, M.; Lapinski, S.; Shaw, A.; Zawieska, D.; Tobiasz, A.; Al-Shamma’a, A. Combination of the photogrammetric and microwave remote sensing for Cultural Heritage documentation and preservation–preliminary results. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 43, 1409–1413.
  4. Lercari, N.; Jaffke, D.; Campiani, A.; Guillem, A.; McAvoy, S.; Delgado, G.J.; Bevk Neeb, A. Building Cultural Heritage Resilience through Remote Sensing: An Integrated Approach Using Multi-Temporal Site Monitoring, Datafication, and Web-GL Visualization. Remote Sens. 2021, 13, 4130.
  5. Vileikis, O.; Khabibullaeyev, F. Application of Digital Heritage Documentation for Condition Assessments and Monitoring Change in Uzbekistan. In Proceedings of the ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Beijing, China, 10 September 2021; Volume 8, No. M-1-2021. pp. 179–186.
  6. Bräuer-Burchardt, C.; Munkelt, C.; Bleier, M.; Heinze, M.; Gebhart, I.; Kühmstedt, P.; Notni, G. Underwater 3D Scanning System for Cultural Heritage Documentation. Remote Sens. 2023, 15, 1864.
  7. Yilmaz, H.M.; Yakar, M.; Gulec, S.A.; Dulgerler, O.N. Importance of digital close-range photogrammetry in documentation of cultural heritage. J. Cult. Herit. 2007, 8, 428–433.
  8. Rüther, H.; Smit, J.; Kamamba, D. A comparison of close-range photogrammetry to terrestrial laser scanning for heritage documentation. South. Afr. J. Geomat. 2012, 1, 149–162.
  9. Lerma, J.L.; Navarro, S.; Cabrelles, M.; Villaverde, V. Terrestrial laser scanning and close range photogrammetry for 3D archaeological documentation: The Upper Palaeolithic Cave of Parpalló as a case study. J. Archaeol. Sci. 2010, 37, 499–507.
  10. Lee, T.O. An Examination of Close-Range Photogrammetry and Traditional Cave Survey Methods for Terrestrial and Underwater Caves for 3-Dimensional Mapping. Doctoral Dissertation, University of Southern California, Los Angeles, CA, USA, 2018.
  11. Menna, F.; Agrafiotis, P.; Georgopoulos, A. State of the art and applications in archaeological underwater 3D recording and mapping. J. Cult. Herit. 2018, 33, 231–248.
  12. Murtiyoso, A.; Grussenmeyer, P. Documentation of heritage buildings using close-range UAV images: Dense matching issues, comparison and case studies. Photogramm. Rec. 2017, 32, 206–229.
  13. Bakirman, T.; Bayram, B.; Akpinar, B.; Karabulut, M.F.; Bayrak, O.C.; Yigitoglu, A.; Seker, D.Z. Implementation of ultra-light UAV systems for cultural heritage documentation. J. Cult. Herit. 2020, 44, 174–184.
  14. Li, Z.; Yan, Y.; Jing, Y.; Zhao, S.G. The design and testing of a LiDAR Platform for a UAV for heritage mapping. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 17–24.
  15. Brumana, R.A.; Oreni, D.A.; Van Hecke, L.; Barazzetti, L.U.; Previtali, M.A.; Roncoroni, F.A.; Valente, R.I. Combined geometric and thermal analysis from UAV platforms for archaeological heritage documentation. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, 2, 49–54.
  16. Monna, F.; Rolland, T.; Denaire, A.; Navarro, N.; Granjon, L.; Barbé, R.; Chateau-Smith, C. Deep learning to detect built cultural heritage from satellite imagery: Spatial distribution and size of vernacular houses in Sumba, Indonesia. J. Cult. Herit. 2021, 52, 171–183.
  17. Agapiou, A.; Hadjimitsis, D.G.; Alexakis, D.; Sarris, A. Observatory validation of Neolithic tells (“Magoules”) in the Thessalian plain, central Greece, using hyperspectral spectroradiometric data. J. Archaeol. Sci. 2012, 39, 1499–1512.
  18. Berra, F.E.; Gaulton, R.; Barr, S. Commercial Off-the-Shelf Digital Cameras on Unmanned Aerial Vehicles for Multitemporal Monitoring of Vegetation Reflectance and NDVI. IEEE Trans. Geosci. Remote Sens. 2017, 55, 4878–4886.
  19. Geert, V.; Philippe, S.; Dirk, P.; Frank, V. Spectral Characterization of a Digital Still Camera’s NIR Modification to Enhance Archaeological Observation. IEEE Trans. Geosci. Remote Sens. 2009, 47, 3456–3468.
  20. Maas, H.G. Close-Range Photogrammetry Sensors. In Advances in Photogrammetry, Remote Sensing and Spatial Information Science: 2008 ISPRS Congress Book; CRC Press: Boca Raton, FL, USA, 2008; pp. 63–72.
  21. Luhmann, T.; Fraser, C.; Maas, H.G. Sensor modelling and camera calibration for close-range photogrammetry. ISPRS J. Photogramm. Remote Sens. 2016, 115, 37–46.
  22. Kholil, M.; Ismanto, I.; Fu’Ad, M.N. 3D Reconstruction Using Structure from Motion (SFM) Algorithm and Multi View Stereo (MVS) Based on Computer Vision. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2021; Volume 1073, p. 012066.
  23. Torresani, A.; Remondino, F. Videogrammetry vs. photogrammetry for heritage 3D reconstruction. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-2/W15, 1157–1162.
  24. Balletti, C.; Guerra, F.; Tsioukas, V.; Vernier, P. Calibration of action cameras for photogrammetric purposes. Sensors 2014, 14, 17471–17490.
  25. Remondino, F.; Fraser, C. Digital camera calibration methods: Considerations and comparisons. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2006, 6, 266–272.
  26. Cronk, S.; Fraser, C.; Hanley, H. Automated metric calibration of colour digital cameras. Photogramm. Rec. 2006, 21, 355–372.
  27. Herban, S.; Costantino, D.; Alfio, V.S.; Pepe, M. Use of low-cost spherical cameras for the digitisation of cultural heritage structures into 3d point clouds. J. Imaging 2022, 8, 13.
  28. Murtiyoso, A.; Grussenmeyer, P.; Suwardhi, D. Technical considerations in Low-Cost heritage documentation. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 225–232.
  29. Fangi, G.; Pierdicca, R.; Sturari, M.; Malinverni, E.S. Improving spherical photogrammetry using 360 omni-cameras: Use cases and new applications. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 331–337.
  30. Zhien, W.; Massimo, M. Challenges and Opportunities in Lidar Remote Sensing. Front. Remote Sens. 2021, 2, 641723.
  31. Abmayr, T.; Härtl, F.; Reinköster, M.; Fröhlich, C. Terrestrial laser scanning: Applications in cultural heritage conservation and civil engineering. In Proceedings of the ISPRS Working Group V4 2005, Mestre-Venice, Italy, 22–24 August 2005.
  32. Nuttens, T.; De Maeyer, P.; De Wulf, A.; Goossens, R.; Stal, C. Terrestrial Laser Scanning and Digital Photogrammetry for Cultural Heritage: An Accuracy Assessment; FIG Working Week: Marrakech, Morocco, 2011; p. 10.
  33. Grussenmeyer, P.; Landes, T.; Doneus, M.; Lerma, J. Basics of Range-Based Modelling Techniques in Cultural Heritage 3D Recording. In 3D Recording, Documentation and Management of Cultural Heritage; Whittles Publishing: Dunbeath, UK, 2016.
  34. Kushwaha, S.K.; Dayal, K.R.; Sachchidanand Raghavendra, S.; Pande, H.; Tiwari, P.S.; Agrawal, S.; Srivastava, S.K. 3D Digital Documentation of a Cultural Heritage Site Using Terrestrial Laser Scanner—A Case Study. In Applications of Geomatics in Civil Engineering: Select Proceedings of ICGCE 2018; Springer: Singapore, 2020; pp. 49–58.
  35. Grussenmeyer, P.; Landes, T.; Voegtle, T.; Ringle, K. Comparison methods of terrestrial laser scanning, photogrammetry and tacheometry data for recording of cultural heritage buildings. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 37, 213–218.
  36. Bernat, M.; Janowski, A.; Rzepa, S.; Sobieraj, A.; Szulwic, J. Studies on the use of terrestrial laser scanning in the maintenance of buildings belonging to the cultural heritage. In Proceedings of the 14th Geoconference on Informatics, Geoinformatics and Remote Sensing, SGEM. ORG, Albena, Bulgaria, 19–25 June 2014; Volume 3, pp. 307–318.
  37. Klapa, P.; Mitka, B.; Zygmunt, M. Application of Integrated Photogrammetric and Terrestrial Laser Scanning Data to Cultural Heritage Surveying. In IOP Conference Series: Earth and Environmental Science; IOP Publishing: Bristol, UK, 2017; Volume 95, p. 032007.
  38. Keitaanniemi, A.; Rönnholm, P.; Kukko, A.; Vaaja, M.T. Drift analysis and sectional post-processing of indoor simultaneous localization and mapping (SLAM)-based laser scanning data. Autom. Constr. 2023, 147, 104700.
  39. Barba, S.; Ferreyra, C.; Cotella, V.A.; di Filippo, A.; Amalfitano, S. A SLAM integrated approach for digital heritage documentation. In Proceedings of the International Conference on Human-Computer Interaction, Málaga, Spain, 22–24 September 2021; pp. 27–39.
  40. Ortiz-Coder, P.; Sánchez-Ríos, A. An integrated solution for 3D heritage modeling based on videogrammetry and V-SLAM technology. Remote Sens. 2020, 12, 1529.
  41. Rodríguez-Gonzálvez, P.; Jiménez Fernández-Palacios, B.; Muñoz-Nieto, Á.L.; Arias-Sanchez, P.; Gonzalez-Aguilera, D. Mobile LiDAR System: New Possibilities for the Documentation and Dissemination of Large Cultural Heritage Sites. Remote Sens. 2017, 9, 189.
  42. Lauterbach, H.A.; Borrmann, D.; Heß, R.; Eck, D.; Schilling, K.; Nüchter, A. Evaluation of a Backpack-Mounted 3D Mobile Scanning System. Remote Sens. 2015, 7, 13753–13781.
  43. Lichti, D.; Stewart, M.P.; Tsakiri, M.; Snow, A.J. Calibration and testing of a terrestrial laser scanner. Int. Arch. Photogramm. Remote Sens. 2000, 33, 485–492.
  44. Rietdorf, A.; Gielsdorf, F.; Gruendig, L. A concept for the calibration of terrestrial laser scanners. In Proceedings of the INGEO 2004 and FIG Regional Central and Eastern European Conference of Engineering Surveying, Bratislava, Slovakia, 11–13 November 2004; Volume 11, p. 13.
  45. Gowroju, S.; Santhosh Ramchander, N. Applications of Drones—A Review. In Drone Technology; Mohanty, S.N., Ravindra, J.V.R., Surya Narayana, G., Pattnaik, C.R., Mohamed Sirajudeen, Y., Eds.; Wiley-Scrivener: Austin, TX, USA, 2023; pp. 183–206.
  46. Meyer, D.; Fraijo, E.; Lo, E.; Rissolo, D.; Kuester, F. Optimizing UAV Systems for Rapid Survey and Reconstruction of Large Scale Cultural Heritage Sites. In 2015 Digital Heritage; IEEE: Piscataway, NJ, USA, 2015; Volume 1, pp. 151–154.
  47. Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E.K. Evaluating unmanned aerial platforms for cultural heritage large scale mapping. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 355–362.
  48. Gong, Y.; Zhang, F.; Jia, X.; Huang, X.; Li, D.; Mao, Z. Deep Neural Networks for Quantitative Damage Evaluation of Building Losses Using Aerial Oblique Images: Case Study on the Great Wall (China). Remote Sens. 2021, 13, 1321.
  49. Oczipka, M.; Bemmann, J.; Piezonka, H.; Munkabayar, J.; Ahrens, B.; Achtelik, M.; Lehmann, F. Small Drones for Geo-Archaeology in the Steppes: Locating and Documenting the Archaeological Heritage of the Orkhon Valley in Mongolia. In Remote Sensing for Environmental Monitoring, GIS Applications, and Geology IX; SPIE: Bellingham, WA, USA, 2009; Volume 7478, pp. 53–63.
  50. Bagnolo, V.; Paba, N. UAV-based photogrammetry for archaeological heritage site survey and 3D modeling of the sardus pater temple (Italy). Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 45–51.
  51. Stek, T.D. Drones over Mediterranean landscapes. The potential of small UAV’s (drones) for site detection and heritage management in archaeological survey projects: A case study from Le Pianelle in the Tappino Valley, Molise (Italy). J. Cult. Herit. 2016, 22, 1066–1071.
  52. Matyukira, C.; Mhangara, P. Advancement in the Application of Geospatial Technology in Archaeology and Cultural Heritage in South Africa: A Scientometric Review. Remote Sens. 2023, 15, 4781.
  53. Uribe, P.; Angás, J.; Romeo, F.; Pérez-Cabello, F.; Santamaría, D. Mapping Ancient Battlefields in a multi-scalar approach combining Drone Imagery and Geophysical Surveys: The Roman siege of the oppidum of Cabezo de Alcalá (Azaila, Spain). J. Cult. Herit. 2021, 48, 11–23.
  54. Koutsoudis, A.; Ioannakis, G.; Pistofidis, P.; Arnaoutoglou, F.; Kazakis, N.; Pavlidis, G.; Chamzas, C.; Tsirliganis, N. Multispectral aerial imagery-based 3D digitisation, segmentation and annotation of large scale urban areas of significant cultural value. J. Cult. Herit. 2021, 49, 1–9.
  55. Materazzi, F.; Pacifici, M. Archaeological crop marks detection through drone multispectral remote sensing and vegetation indices: A new approach tested on the Italian pre-Roman city of Veii. J. Archaeol. Sci. Rep. 2022, 41, 103235.
  56. Khelifi, A.; Ciccone, G.; Altaweel, M.; Basmaji, T.; Ghazal, M. Autonomous service drones for multimodal detection and monitoring of archaeological sites. Appl. Sci. 2021, 11, 10424.
  57. Patrucco, G.; Cortese, G.; Giulio Tonolo, F.; Spanò, A. Thermal and optical data fusion supporting built heritage analyses. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 43, 619–626.
  58. Vlachos, M.; Skarlatos, D. An Extensive Literature Review on Underwater Image Colour Correction. Sensors 2021, 21, 5690.
  59. Diamanti, E.; Løvås, H.S.; Larsen, M.K.; Ødegård, Ø. A multi-camera system for the integrated documentation of Underwater Cultural Heritage of high structural complexity; The case study of M/S Helma wreck. IFAC-Pap. OnLine 2021, 54, 422–429.
  60. Selmo, D.; Sturt, F.; Miles, J.; Basford, P.; Malzbender, T.; Martinez, K.; Thompson, C.; Earl, G.; Bevan, G. Underwater reflectance transformation imaging: A technology for in situ underwater cultural heritage object-level recording. J. Electron. Imaging 2017, 26, 011029.
  61. Skarlatos, D.; Agrafiotis, P. Image-Based Underwater 3D Reconstruction for Cultural Heritage: From Image Collection to 3D. Critical Steps and Considerations. In Visual Computing for Cultural Heritage; Springer Series on Cultural Computing; Liarokapis, F., Voulodimos, A., Doulamis, N., Doulamis, A., Eds.; Springer: Cham, Switzerland, 2020.
  62. Skarlatos, D.; Demestiha, S.; Kiparissi, S. An ‘open’ method for 3D modelling and mapping in underwater archaeological sites. Int. J. Herit. Digit. Era 2012, 1, 1–24.
  63. Drap, P.; Merad, D.; Hijazi, B.; Gaoua, L.; Nawaf, M.M.; Saccone, M.; Chemisky, B.; Seinturier, J.; Sourisseau, J.-C.; Gambin, T.; et al. Underwater Photogrammetry and Object Modeling: A Case Study of Xlendi Wreck in Malta. Sensors 2015, 15, 30351–30384.
  64. Hu, K.; Wang, T.; Shen, C.; Weng, C.; Zhou, F.; Xia, M.; Weng, L. Overview of Underwater 3D Reconstruction Technology Based on Optical Images. J. Mar. Sci. Eng. 2023, 11, 949.
  65. Lindsay, I.; Mkrtchyan, A. Free and Low-Cost Aerial Remote Sensing in Archaeology: An Overview of Data Sources and Recent Applications in the South Caucasus. Adv. Archaeol. Pract. 2023, 11, 1–20.
  66. Uribe, P.; Pérez-Cabello, F.; Bea, M.; De La Riva, J.; Martín-Bueno, M.; Sáenz, C.; Serreta, A.; Magallón, M.A.; Angás, J. Aerial mapping and multi-sensors approaches from remote sensing applied to the roman archaeological heritage. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, XL-5/W4, 461–467.
  67. Agapiou, A.; Alexakis, D.D.; Hadjimitsis, D.G. Spectral sensitivity of ALOS, ASTER, IKONOS, LANDSAT and SPOT satellite imagery intended for the detection of archaeological crop marks. Int. J. Digit. Earth 2014, 7, 351–372.
  68. Winton, H.; Horne, P. National archives for national survey programmes: NMP and the English heritage aerial photograph collection. Landsc. Through Lens. Aer. Photogr. Hist. Enviroment. Aer. Archaeol. Res. Group 2010, 2, 7–18.
  69. Cowley, D.C.; Stichelbaut, B.B. Historic aerial photographic archives for European archaeology. Eur. J. Archaeol. 2012, 15, 217–236.
  70. Cowley, D.; Ferguson, L. Historic Aerial Photographs for Archaeology and Heritage Management. In Space Time and Place, Proceedings of the III International Conference on Remote Sensing in Archaeology, Tiruchirappalli, India, 17–21 August 2009; BAR International Series 2118; British Archaeological Reports Ltd.: Oxford, UK, 2010; pp. 17–21.
  71. Agapiou, A. Remote sensing heritage in a petabyte-scale: Satellite data and heritage Earth Engine© applications. Int. J. Digit. Earth 2017, 10, 85–102.
  72. Pappu, S.; Akhilesh, K.; Ravindranath, S.; Raj, U. Applications of satellite remote sensing for research and heritage management in Indian prehistory. J. Archaeol. Sci. 2010, 37, 2316–2331.
  73. Lasaponara, R.; Masini, N. Satellite Remote Sensing: A New Tool for Archaeology. In Proceedings of the I International EARSeL Workshop “Advances in Remote Sensing for Archaeology and Cultural Heritage Management”, Rome, Italy, 30 September 2008; Springer: Dordrecht, The Netherlands, 2012; p. 366.
  74. Agapiou, A.; Hadjimitsis, D.G.; Alexakis, D.D.; Papadavid, G. Examining the phenological cycle of barley (Hordeum vulgare) using satellite and in situ spectroradiometer measurements for the detection of buried archaeological remains. GISci. Remote Sens. 2012, 49, 854–872.
  75. Agapiou, A.; Lysandrou, V.; Sarris, A.; Papadopoulos, N.; Hadjimitsis, D.G. Fusion of satellite multispectral images based on ground-penetrating radar (GPR) data for the investigation of buried concealed archaeological remains. Geosciences 2017, 7, 40.