Teague, J.; Megson-Smith, D.; Allen, M. Diver-Based Optical Survey Techniques for Coral Monitoring. Encyclopedia. Available online: https://encyclopedia.pub/entry/21237 (accessed on 13 June 2024).
Diver-Based Optical Survey Techniques for Coral Monitoring

Monitoring the health of coral reefs is essential to understanding the damaging impacts of anthropogenic climate change; as such, non-invasive methods to survey coral reefs are the most desirable. Optics-based surveys, ranging from simple photography to multispectral satellite imaging, are well established. The techniques are broadly separated by the primary method in which data are collected: by divers and/or robots directly within the environment, or by remote sensing, where data are captured above the water’s surface by planes, drones, or satellites.

Keywords: coral monitoring; diver-based; RGB; underwater

1. Introduction

Simple photography techniques are well established in current coral surveying, as they achieve non-invasive measurements, provide a permanent record of the data for future analyses, and support a wide array of analysis techniques that can be applied to the collected data to derive a metric or metrics for 'health'. More complex optical techniques such as multispectral/hyperspectral imaging are emerging within the field as the associated cost of the systems rapidly reduces; as such, they are becoming more accessible to researchers, who can make use of the higher resolution data afforded by these systems.

2. Red, Green, Blue (RGB) Imaging

Digital cameras used in current photography surveys detect light in three broad colour channels: red, green, and blue (RGB), centred at approximately 660 nm, 520 nm, and 450 nm, respectively. This enables images to be taken that are similar to those perceived by the human eye, which also sees using only these RGB channels [1]. Cameras can be employed in many coral reef surveying techniques, such as photo-quadrat sampling, in which quadrats are imaged via high-resolution digital cameras. This secures a visual record suitable for subsequent laboratory analysis, as opposed to manual in situ coverage estimates. Imaging thus allows for a reduction in diver ‘bottom time’ compared with non-image-based surveying. The advantage of laboratory analysis is that images can be run through machine learning software such as CoralNet [2], which attempts to fully or partially automate classification and benthic cover estimates. However, classification can only be made after sufficient training data are provided to the algorithm [3][4]. These types of machine learning systems are still in their infancy and, to date, can only effectively estimate the cover of common coral genera.
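As a sketch of how benthic cover is estimated from point annotations (the step that tools like CoralNet automate), assuming hypothetical class labels and point counts:

```python
def estimate_cover(labels):
    """Estimate percent benthic cover from point annotations.

    `labels` holds one class label per annotated point, as produced by
    point-count tools such as CoralNet (labels here are hypothetical).
    """
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: 100.0 * c / len(labels) for lab, c in counts.items()}

# Hypothetical annotations for one photo-quadrat (50 points)
points = ["coral"] * 18 + ["algae"] * 22 + ["sand"] * 10
print(estimate_cover(points))  # {'coral': 36.0, 'algae': 44.0, 'sand': 20.0}
```

In practice the points are placed randomly or on a grid over each quadrat image, and the per-class percentages are averaged across quadrats.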
Images can also be interpreted to provide rudimentary colour analysis. This yields outputs similar to diver assessments undertaken qualitatively by eye using colour charts or wheels. The comparison of corals against known colour hues corresponding to different concentrations of symbionts [5] enables divers to quickly identify the extent of any coral bleaching, though other aspects of coral health may be overlooked. Performing such surveys using traditional digital camera (RGB) images is an improvement, since the intensity of individual colour channels can be recorded and interpreted. For example, the intensity of red wavelengths indicates the extent of chlorophyll absorption [6]. However, the value of the method is limited, as only three broad colour bands are recorded by the camera sensor. The method can aid in preliminary assessments of bleaching, but by the point at which bleaching becomes detectable in RGB images, typically around 70% or more of symbionts have already been expelled [7]. A general point to remember is that these types of survey typically only cover an individual coral colony and may not be representative of the whole reef system. They also only provide a snapshot of an environment and cannot be extrapolated to an understanding of the ongoing population dynamics of the whole reef system.
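A minimal sketch of this channel-level analysis, using an illustrative (uncalibrated) red-intensity threshold as the bleaching indicator:

```python
def mean_channel(pixels, channel):
    """Mean intensity of one RGB channel over a region of interest.

    `pixels` is an iterable of (R, G, B) tuples sampled from the coral;
    channel: 0 = red, 1 = green, 2 = blue.
    """
    vals = [p[channel] for p in pixels]
    return sum(vals) / len(vals)

def bleaching_flag(pixels, red_threshold=180):
    """Crude bleaching indicator: healthy tissue absorbs strongly in the
    red (chlorophyll), so a high mean red intensity suggests pigment
    loss. The threshold here is illustrative, not a calibrated value."""
    return mean_channel(pixels, 0) > red_threshold

roi = [(200, 190, 185), (210, 200, 195), (190, 185, 180)]  # pale pixels
print(bleaching_flag(roi))  # True
```

A real workflow would first white-balance the image and sample many regions per colony before thresholding.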
An emerging and more advanced use of RGB imaging is the creation of three-dimensional (3D) reconstructions of coral systems using photogrammetry. With recent advances in photogrammetric processing, this is now relatively quick and easy to conduct [8]. Photogrammetry uses a set of overlapping images of a target area, collected either by video or still photography. Ideally, adjacent images should have 60–80% overlap [9]. Physical parameters can be obtained from 3D reconstruction models of coral reefs, such as surface topography, estimations of rugosity and surface area, as well as coral cover and distribution [10]. Crucially, a wide range of additional information can be extracted from the same original dataset, making it a useful analysis technique whenever image sets are recorded. However, when using standard digital cameras, the method is still limited: the resulting models can only capture changes in coral colour or fluorescent emissions as detected by the three RGB channels.
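The overlap requirement translates directly into capture spacing; a small sketch, assuming a hypothetical camera footprint of 2 m of seabed per frame:

```python
import math

def capture_spacing(footprint_m, overlap=0.7):
    """Distance to advance between frames so that adjacent images share
    the given forward-overlap fraction (0.6-0.8 is recommended)."""
    return footprint_m * (1.0 - overlap)

def n_images(transect_m, footprint_m, overlap=0.7):
    """Number of frames needed to cover a transect of the given length."""
    step = capture_spacing(footprint_m, overlap)
    return math.ceil((transect_m - footprint_m) / step) + 1

# Hypothetical camera covering 2 m of seabed per frame, 70% overlap
print(round(capture_spacing(2.0), 2))  # 0.6 m between frames
print(n_images(30.0, 2.0))             # 48 frames for a 30 m transect
```

The footprint itself depends on altitude above the reef and the camera's field of view, so spacing is usually recomputed per survey.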

3. Underwater Spectroscopic Techniques

Spectroscopic techniques can image in numerous narrower wavelength bands across the whole visible light spectrum and mark an improvement over simple RGB imaging. The use of spectral data enables a more definitive discrimination between live coral, macroalgae, and other photoactive organisms by using the specific spectral “signature” or “fingerprint” associated with a certain organism or type of organism [11][12]. It can also identify whether corals are displaying a decline in ‘normal health’ by measuring the relative intensity of the spectral signatures arising from specific pigments associated with health, such as chlorophyll. This can be achieved by using reference targets to correct for incident light variations, thereby normalising spectra so they can be compared between datasets to track changes in pigment intensity and thus bleaching.
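One common way to match a measured spectrum against reference signatures is the spectral angle, which is insensitive to overall brightness; a sketch with made-up 4-band signatures (not real library spectra):

```python
import math

def spectral_angle(a, b):
    """Spectral angle (radians) between two spectra; smaller = more
    similar. Brightness-invariant, which helps under the variable
    illumination found underwater."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

# Illustrative 4-band reference "signatures" (hypothetical values)
live_coral = [0.05, 0.08, 0.20, 0.35]
macroalgae = [0.04, 0.15, 0.10, 0.40]
pixel      = [0.10, 0.16, 0.40, 0.70]  # brighter, same shape as coral

closest = min([("live_coral", live_coral), ("macroalgae", macroalgae)],
              key=lambda kv: spectral_angle(pixel, kv[1]))[0]
print(closest)  # live_coral
```

Because the angle ignores magnitude, a shaded and a sunlit patch of the same benthos classify the same way.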
Underwater spectrometry can be achieved by using laboratory spectrometers enclosed in waterproof housings with fibre optic probes to record radiance reflectance measurements. The fibre optic probes are held at an orthogonal angle to the solar incidence angle, approximately 0.5–1.0 cm from the target [13]. An accompanying reference measurement is required to normalise for variations in ambient illumination. This is achieved by taking a reflectance measurement from a well-characterised, white Lambertian reflectance target such as polytetrafluoroethylene (PTFE) or Spectralon (Labsphere, USA). The reference spectrum enables a correction function to be applied to the data. Specialised spectrometers can also be employed for certain niche applications. For example, pulse amplitude modulation (PAM) fluorometers specifically measure fluorescence to determine the photosynthetic yield. Chlorophyll density can be used to determine relative electron transport rates of photosynthetic organisms to provide a measurement of photosynthetic efficiency [14]. This is a measure of how well chlorophyll converts light into energy and detects compromised tissues that are less efficient. The Diving PAM I and II (Walz, Germany) are examples of underwater fluorometers and are the most commonly used devices in studies using this technique [15][16][17]. PAM devices do have limitations, notably the requirement for the sampling optical fibre probe to be held in near contact (<5 mm) with the sampled object for a long time (>30 s) to obtain accurate readings.
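The white-reference normalisation step can be sketched as a per-band ratio, with optional dark-current subtraction (all counts below are hypothetical):

```python
def reflectance(sample, white, dark=None):
    """Normalise a raw radiance spectrum against a Lambertian white
    reference (e.g. Spectralon), with optional dark-current subtraction.
    All inputs are per-band intensity lists of equal length."""
    if dark is None:
        dark = [0.0] * len(sample)
    return [(s - d) / (w - d) for s, w, d in zip(sample, white, dark)]

raw   = [120.0, 340.0, 560.0]  # hypothetical counts from the coral
white = [400.0, 800.0, 800.0]  # same bands, white reference target
dark  = [20.0, 40.0, 60.0]     # shutter-closed dark frame
print(reflectance(raw, white, dark))  # [0.263..., 0.394..., 0.675...]
```

The resulting reflectance values are dimensionless and comparable between dives and days, which is what makes pigment tracking over time possible.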
Spectrometers and fluorometers are able to generate more accurate spectral data but also suffer from many of the same pitfalls as RGB imaging. Data acquisition is typically slow when used to cover a whole reef system. This is mainly due to the small sampling area of the probes and the requirement to make point measurements. This limitation makes the technique particularly unsuitable for large-area surveys. Additionally, multiple points are often sampled on individual corals to obtain average spectra. However, the small number of measurements precludes confidence that these average spectra are truly representative of the whole organism or a whole reef system.
Conversely, spectroscopy techniques using imagers (multispectral and hyperspectral imaging) can generate spectra for every pixel in an image within one data acquisition. This makes the process of data collection quicker and more efficient, thereby facilitating the collection of datasets that are more comprehensive and representative. In turn, imagers can categorise and quantify colour. Spectral imagers generally comprise a dispersive element (either a prism or diffraction grating) or filter, which splits or filters incoming light into wavelengths, and an imaging detector such as a charge-coupled device (CCD) or complementary metal–oxide–semiconductor (CMOS) device.
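The per-pixel spectra of a hypercube can be pictured as a three-index array (row, column, band); a toy sketch using nested lists:

```python
def pixel_spectrum(cube, row, col):
    """Return the full spectrum recorded at one spatial pixel.

    `cube` is a hypercube as nested lists: cube[row][col][band],
    i.e. every spatial pixel carries one intensity per spectral band.
    """
    return cube[row][col]

# Tiny illustrative cube: 2 x 2 pixels, 4 spectral bands each
cube = [
    [[0.1, 0.2, 0.3, 0.4], [0.2, 0.2, 0.1, 0.1]],
    [[0.5, 0.4, 0.3, 0.2], [0.1, 0.1, 0.1, 0.1]],
]
print(pixel_spectrum(cube, 0, 0))  # [0.1, 0.2, 0.3, 0.4]
```

Real cubes are far larger (hundreds of bands, megapixel frames), so array libraries are used in practice, but the indexing idea is the same.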
Multispectral imagers record data across multiple spectral bands, typically between three and 15 bands [18]. Conversely, hyperspectral imaging records in hundreds of spectral bands, which means data may be collected and processed across the whole visible and/or near infrared spectrum with improved spectral resolution.
Previously, some multispectral systems have been deployed to assess specific marine monitoring cases. These included determining coral fluorescence using narrow bandpass filters [19] and filter wheel style imagers for classification via spectral discrimination [20]. Other imagers have been produced for applications such as the exploration of marine minerals and ores [21], but are not currently being used in coral monitoring surveys [22].
Underwater hyperspectral imaging (UHI) is a relatively new, emerging technology with limited published instances to date. Current diver-operated hyperspectral systems such as the “HyperDiver” [23] can generate hyperspectral and traditional RGB images simultaneously, capturing synchronised high-resolution digital images, hyperspectral data, and topographic data [23]. The system utilises a push-broom hyperspectral imager (Pika 2, Resonon Inc., Bozeman, MT, USA) with a spectral range of 400–900 nm sampled at ~1.5 nm resolution with 480 fixed bands and 640 spatial pixels [23].
Push-broom or line-scanning imaging methods acquire full spectral data one spatial line at a time. The line is imaged onto the entrance slit of a spectrometer, which disperses the light into its spectral components before it reaches the sensor array. The composite image is constructed by either moving the slit across the image plane or by moving the entire system across the scene [24]. This is advantageous as spectral data can be gathered whilst the imager is moving, providing both full spectral and spatial data. Other hyperspectral systems, such as ‘full data cube snapshot’ imagers, work from fixed viewpoints, similar to traditional RGB imagers. In this case, a push-broom effect is achieved by optically scanning a linear field of view across the hyperspectral detector within the device. The need for a stable platform and the delicate nature of the optics involved make them generally unsuitable for use in UHI.
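The line-by-line acquisition can be sketched as follows, with a stand-in function in place of the real sensor read-out:

```python
def acquire_pushbroom(scan_line, n_lines):
    """Assemble a hypercube line by line, as a push-broom imager does
    while the platform translates over the scene.

    `scan_line(i)` returns one spatial line: a list of per-pixel
    spectra (each a list of band intensities) for scan position i.
    """
    return [scan_line(i) for i in range(n_lines)]

# Stand-in for the sensor read-out: 3 spatial pixels x 2 bands per line
def fake_line(i):
    return [[i, i + 0.5] for _ in range(3)]

cube = acquire_pushbroom(fake_line, 4)  # 4 lines -> 4 x 3 x 2 cube
print(len(cube), len(cube[0]), len(cube[0][0]))  # 4 3 2
```

The cross-track axis comes from the sensor, the along-track axis from platform motion, which is why uneven diver or vehicle speed distorts the image geometry.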
UHI presents additional potential applications using an ‘objects of interest’ (OOI) identification technique, as described by Johnsen [25], which includes mapping and monitoring of seafloor habitats for minerals or soft versus hard bottom; seafloor pipeline inspections to determine type of material, cracks, rust, and leakage; shipwrecks (type and state of wood, nails, rust, and artefacts); deep-water coral reefs and sponge fields for species identification, area coverage and physiological state, and kelp forests (species identification, area coverage, physiological state, and growth rates of benthic organisms).
Current UHI technologies are generally bulky systems that are difficult to deploy and manoeuvre. For example, the “HyperDiver” system [23], including all its additional sensors and payloads, weighs 32 kg in air. Other sensors, specifically the tunable LED-based underwater multispectral imaging system (TuLUMIS) and ocean vision (UHI OV), are designed to be mounted on an unmanned underwater vehicle (UUV). The UUV provides the interface system to operate the camera as well as a translation platform. These are not easily deployed by a diver.
The use of UUVs does, however, eliminate the limitations imposed by diver reliance. Dive surveys require substantial amounts of time, as there is a finite period a diver can spend underwater, usually dependent on air-tank capacity and depth. Subsequent dives can be achieved through the use of multiple air-tanks, but ultimately a diver will fatigue. The corresponding issue for UUV-based surveys is battery life, although multiple batteries can be used to extend the survey time. Crucially, UUVs do not suffer fatigue and can be deployed for longer than their human counterparts. A UUV can also cover a larger distance in a shorter time. For example, a 120 m² area may take two scuba divers up to 2.5 h [26], equating to a surveying rate of 0.13 m²/s. Comparatively, a low-cost remotely operated vehicle (ROV) such as the BlueROV2 can achieve survey rates of 1 m²/s. Key limitations for both divers and UUVs are repeatability and accuracy when surveying reefs, because the global positioning system (GPS) does not work underwater. Acoustic transponder networks designed for UUVs provide a way of translating GPS coordinates underwater and thus improve repeatability and accuracy by recording accurately georeferenced data [8].
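To put the quoted rates in perspective, a quick calculation of the time needed to survey one hectare at each rate (the 1 m²/s and 0.13 m²/s figures are the ones given above):

```python
def time_to_cover(area_m2, rate_m2_per_s):
    """Seconds needed to survey a given area at a mean survey rate."""
    return area_m2 / rate_m2_per_s

# One hectare (10,000 m^2) at the two quoted rates
rov_hours = time_to_cover(10_000, 1.0) / 3600
diver_hours = time_to_cover(10_000, 0.13) / 3600
print(round(rov_hours, 1), round(diver_hours, 1))  # 2.8 21.4
```

At these rates the ROV covers in a single battery cycle what would take a dive team many days of repeat dives.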
The use of UHI on UUVs is currently limited, with only a few studies having been reported. One such study [27] used a prototype UHI system for mapping the seafloor for the automated identification of seabed, habitat, and OOI in coral reefs. Other studies [12], specifically using hyperspectral imaging with corals, have mainly focused on coverage and benthic discrimination with machine learning to classify corals, and have not focused on assessing health or disease. The current generation of commercially available hyperspectral imagers is often cost prohibitive to both acquire and insure for marine studies. Consequently, there exists a need for technology development and application to study marine environments, such as the surveying of coral health.

4. The Potential of ‘New’ Underwater Hyperspectral Imagers

A new type of hyperspectral imaging technology that utilises linear variable filters (LVFs) has recently emerged. An LVF is an optical filter whose bandpass window varies continuously across its surface [28]. LVFs allow for lower-cost imagers to be produced [28]. For example, LVFs have been integrated with consumer-grade digital cameras to convert them into hyperspectral imagers [28][29][30]. The Bi-Frost DSLR [28] reduces the financial burden of spectral imagers by up to 75%. Using DSLR cameras offers key advantages, as they are already well implemented in underwater photography (in both scientific and hobbyist applications), making them affordable and accessible in both supply and use. Non-specialised personnel can access and use the technology with relatively little training, thus reducing dependency on highly skilled divers and marine scientists for surveying.
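The principle of an LVF-based imager can be sketched as a linear mapping from sensor column to transmitted wavelength; the 400–900 nm range here is illustrative, not a specific filter's specification:

```python
def lvf_wavelength(col, n_cols, wl_min=400.0, wl_max=900.0):
    """Centre wavelength (nm) transmitted at a given sensor column.

    Under a linear variable filter the bandpass shifts linearly across
    the filter surface, so each column samples a different wavelength.
    The 400-900 nm range is illustrative, not a real filter's spec.
    """
    return wl_min + (wl_max - wl_min) * col / (n_cols - 1)

# On a hypothetical 1000-column sensor:
print(lvf_wavelength(0, 1000))    # 400.0 (first column)
print(lvf_wavelength(999, 1000))  # 900.0 (last column)
```

Because each column sees a different wavelength, the camera must be scanned across the scene (or the scene across the camera) to build a full spectrum for every spatial point, much like a push-broom system.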
The Bi-Frost DSLR can be implemented in two operational modes: hyperspectral reflectance imaging (HyRi) and fluorescence imaging (HyFi). These modes use the same Bi-Frost DSLR, with the only difference being the lighting conditions used for imaging: HyRi images under sunlight/white light, and HyFi under ultraviolet (UV) light produced by light-emitting diodes (LEDs).
The methodology for gathering HyRi/HyFi data is similar to that of underwater photogrammetry. A delivery platform such as a diver or UUV translates the imager across a target scene, in a single line for measurements at colony scale or in a ‘lawn mower’ pattern at reef scale [31], ensuring sufficient overlap between adjacent lines [8].
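The ‘lawn mower’ pattern can be sketched as a waypoint generator, with the swath width and sidelap as assumed parameters:

```python
def lawnmower(width_m, length_m, swath_m, overlap=0.2):
    """Waypoints for a 'lawn mower' survey pattern: parallel transects
    spaced so adjacent swaths overlap by the given fraction."""
    spacing = swath_m * (1.0 - overlap)
    waypoints, y, forward = [], 0.0, True
    while y <= width_m:
        x0, x1 = (0.0, length_m) if forward else (length_m, 0.0)
        waypoints += [(x0, y), (x1, y)]
        y += spacing
        forward = not forward
    return waypoints

# 10 m x 20 m plot, 2 m swath, 20% sidelap -> transects every 1.6 m
wps = lawnmower(10.0, 20.0, 2.0)
print(len(wps) // 2, "transects")  # 7 transects
```

Alternating the transect direction (the boustrophedon path) minimises turning time for both divers and vehicles.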
Spectral data can be extracted from the raw images generated by Bi-Frost cameras using photogrammetric approaches. Specifically, software was developed to interface with commercially available photogrammetric solutions (PhotoScan 1.3.4, Agisoft, Russia). This software can produce hypercubes of intensity data with three spatial coordinates and one wavelength coordinate. In this way, its use enables the construction of 3D models with full spectral information for each 3D surface point. The sizes of the datasets are ultimately limited by the computing power required to run the reconstruction software. This approach generates camera positions in 3D space relative to the scene for each image taken. This, in turn, allows for the accurate derivation of the correction factors needed to compensate for optical attenuation between the source and the detector. A single survey with a HyRi system therefore gathers both spectral data (enabling coral identification, zonation, and physiological assessments) and 3D data (reef structure, rugosity), effectively doubling its value.
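The attenuation correction enabled by known camera-to-surface distances can be sketched with a simple Beer-Lambert model (the attenuation coefficient below is illustrative):

```python
import math

def correct_attenuation(intensity, distance_m, k_per_m):
    """Compensate a measured band intensity for water-column attenuation
    over the camera-to-surface distance recovered from photogrammetry,
    assuming a simple Beer-Lambert model I = I0 * exp(-k * d).
    `k_per_m` is the (band-dependent) attenuation coefficient."""
    return intensity * math.exp(k_per_m * distance_m)

# Red light attenuates quickly underwater; illustrative values only
measured_red = 0.12
print(round(correct_attenuation(measured_red, 2.0, 0.5), 3))  # 0.326
```

Because attenuation is strongly wavelength-dependent, a separate coefficient is applied per spectral band, which is why the per-image camera positions from the reconstruction are so valuable.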

References

  1. Bokolonga, E.; Hauhana, M.; Rollings, N.; Aitchison, D.; Assaf, M.; Das, S.R.; Biswas, S.N.; Groza, V.; Petriu, E.M. A compact multispectral image capture unit for deployment on drones. In Proceedings of the Conference Record-IEEE Instrumentation and Measurement Technology Conference, Taipei, Taiwan, 23–26 May 2016.
  2. Falkowski, P.; Knoll, A.H. Evolution of Primary Producers in the Sea; Academic Press; Elsevier: New York, NY, USA, 2007.
  3. Beijbom, O.; Edmunds, P.J.; Roelfsema, C.; Smith, J.; Kline, D.I.; Neal, B.P.; Dunlap, M.J.; Moriarty, V.; Fan, T.Y.; Tan, C.J.; et al. Towards Automated Annotation of Benthic Survey Images: Variability of Human Experts and Operational Modes of Automation. PLoS ONE 2015, 10, e0130312.
  4. Williams, I.D.; Couch, C.; Beijbom, O.; Oliver, T.; Vargas-Angel, B.; Schumacher, B.; Brainard, R. Leveraging Automated Image Analysis Tools to Transform Our Capacity to Assess Status and Trends of Coral Reefs. Front. Mar. Sci. 2019, 6, 222.
  5. Siebeck, U.E.; Marshall, N.J.; Klüter, A.; Hoegh-Guldberg, O. Monitoring coral bleaching using a colour reference card. Coral Reefs 2006, 25, 453–460.
  6. Winters, G.; Holzman, R.; Blekhman, A.; Beer, S.; Loya, Y. Photographic assessment of coral chlorophyll contents: Implications for ecophysiological studies and coral monitoring. J. Exp. Mar. Bio. Ecol. 2009, 380, 25–35.
  7. Fitt, W.K.; McFarland, F.K.; Warner, M.E.; Chilcoat, G.C. Seasonal patterns of tissue biomass and densities of symbiotic dinoflagellates in reef corals and relation to coral bleaching. Limnol. Oceanogr. 2000, 45, 677–685.
  8. Teague, J.; Scott, T.B. Underwater Photogrammetry and 3D Reconstruction of Submerged Objects in Shallow Environments by ROV and Underwater GPS. J. Mar. Sci. Res. Technol. 2017, 1, 1.
  9. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97.
  10. Bayley, D.T.I.; Mogg, A.O.M. A protocol for the large-scale analysis of reefs using Structure from Motion photogrammetry. Methods Ecol. Evol. 2020, 11, 1410–1420.
  11. Hedley, J.D.; Mumby, P.J. Biological and remote sensing perspectives of pigmentation in coral reef organisms. Adv. Mar. Biol. 2002, 43, 277–317.
  12. Hochberg, E.J.; Atkinson, M.J. Spectral discrimination of coral reef benthic communities. Coral Reefs 2000, 19, 164–171.
  13. Leiper, I.A.; Siebeck, U.E.; Marshall, N.J.; Phinn, S.R. Coral health monitoring: Linking coral colour and remote sensing techniques. Can. J. Remote Sens. 2009, 35, 276–286.
  14. Jones, R.J.; Kildea, T.; Hoegh-Guldberg, O. PAM Chlorophyll Fluorometry: A New in situ Technique for Stress Assessment in Scleractinian Corals, used to Examine the Effects of Cyanide from Cyanide Fishing. Mar. Pollut. Bull. 1999, 38, 864–874.
  15. Ralph, P.J.; Gademann, R.; Larkum, A. Zooxanthellae expelled from bleached corals at 33 °C are photosynthetically competent. Mar. Ecol. Prog. Ser. 2001, 220, 163–168.
  16. Chauka, L.J.; Steinert, G.; Mtolera, M.S.P. Influence of local environmental conditions and bleaching histories on the diversity and distribution of Symbiodinium in reef-building corals in Tanzania. Afr. J. Mar. Sci. 2016, 63, 57–64.
  17. Kurihara, H.; Takahashi, A.; Reyes-Bermudez, A.; Hidaka, M. Intraspecific variation in the response of the scleractinian coral Acropora digitifera to ocean acidification. Mar. Biol. 2018, 165, 38.
  18. Chang, C.I. Hyperspectral Imaging: Techniques for Spectral Detection and Classification; Springer Science & Business Media: Berlin, Germany, 2003.
  19. Zawada, D.G. The Application of a Novel Multispectral Imaging System to the in vivo Study of Flourescent Compounds in Selected Marine Organisms. Ph.D. Thesis, University of California, San Diego, CA, USA, 2002.
  20. Gleason, A.C.R.; Reid, R.P.; Voss, K.J. Automated classification of underwater multispectral imagery for coral reef monitoring. In Proceedings of the Oceans Conference Record (IEEE), Charleston, WV, USA, 29 September–4 October 2007.
  21. Sture, O.; Ludvigsen, M.; Soreide, F.; Aas, L.M.S. Autonomous underwater vehicles as a platform for underwater hyperspectral imaging. In Proceedings of the OCEANS 2017, Aberdeen, UK, 19–22 June 2017.
  22. Liu, B.; Liu, Z.; Men, S.; Li, Y.; Ding, Z.; He, J.; Zhao, Z. Underwater Hyperspectral Imaging Technology and Its Applications for Detecting and Mapping the Seafloor: A Review. Sensors 2020, 20, 4962.
  23. Chennu, A.; Färber, P.; De’Ath, G.; De Beer, D.; Fabricius, K.E. A diver-operated hyperspectral imaging and topographic surveying system for automated mapping of benthic habitats. Sci. Rep. 2017, 7, 7122.
  24. Maglione, P. Very High Resolution Optical Satellites: An Overview of the Most Commonly used. Am. J. Appl. Sci. 2016, 13, 91–99.
  25. Johnsen, G.; Ludvigsen, M.; Sørensen, A.; Sandvik Aas, L.M. The use of underwater hyperspectral imaging deployed on remotely operated vehicles—methods and applications. IFAC-Papers 2016, 49, 476–481.
  26. Willis, B.L.; Page, C.A.; Dinsdale, E.A. Coral Disease on the Great Barrier Reef. Coral Health Dis. 2002, 69–104.
  27. Johnsen, G.; Volent, Z.; Dierssen, H.; Pettersen, R.; Ardelan, M.; Søreide, F.; Fearns, P.; Ludvigsen, M.; Moline, M. Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties. In Subsea Optics and Imaging; Elsevier: Amsterdam, The Netherlands, 2013.
  28. Pust, O. Innovative Filter Solutions for Hyperspectral Imaging. Opt. Photonik 2016, 11, 24–27.
  29. Renhorn, I.G.E.; Bergström, D.; Hedborg, J.; Letalick, D.; Möller, S. High spatial resolution hyperspectral camera based on a linear variable filter. Opt. Eng. 2016, 55, 114105.
  30. Song, S.; Gibson, D.; Ahmadzadeh, S.; Chu, H.O.; Warden, B.; Overend, R.; Macfarlane, F.; Murray, P.; Marshall, S.; Aitkenhead, M.; et al. Low-cost hyper-spectral imaging system using a linear variable bandpass filter for agritech applications. Appl. Opt. 2020, 59, A167–A175.
  31. Burns, J.; Delparte, D.M.; Gates, R.D.; Takabayashi, M. Integrating structure-from-motion photogrammetry with geospatial software as a novel technique for quantifying 3D ecological characteristics of coral reefs. PeerJ 2015, 3, e1077.