Over the last few centuries, mapping the ocean seabed has been a major challenge for marine geoscientists. Knowledge of seabed bathymetry and morphology has significantly shaped our understanding of our planet's dynamics. The history and scientific trends of seabed mapping can be assessed by mining prior studies. Here, we have mined the scientific literature using the keyword “seabed mapping” to trace the evolution of mapping methods and emphasize the main trends and challenges over the last 90 years. An increase in related scientific production was observed at the beginning of the 1970s, together with increased interest in new mapping technologies. The last two decades have seen a major shift in ocean mapping. The future of seabed mapping brings high expectations, considering that this is one of the main research and development themes for the United Nations Decade of Ocean Science. We may expect a new, higher-resolution map of the ocean seafloor that might be as influential as The Floor of the Oceans map.
Although the first documented studies concerning the sea date back to Aristotle’s time (384–322 BC), the beginning of marine sciences is generally considered to be the 17th century, with Boyle’s “The Saltness of the Sea” (1674). One of the first bathymetric maps, if not the first, produced long before geophysical methods, was based on plumb-line measurements (in fathoms) and was published in an 1855 textbook by Matthew Fontaine Maury, who is considered to be the father of modern oceanography. The HMS Challenger expedition (1872–1876) can be considered a milestone in the history of seabed mapping; much of the information preceding the 20th century presented herein was recovered from the “Report of The Voyage of HMS Challenger”. More than 500 plumb soundings were acquired during the Challenger expedition, revealing the depth of the Mariana Trench and the Dolphin, Connecting, and Challenger Ridges, known today as the Mid-Atlantic Ridge. At the beginning of the 20th century, an important theory regarding the ocean basins was put forth: Wegener’s hypothesis of continental drift (the concept of seafloor spreading was proposed only decades later). In 1977, another milestone of seafloor mapping, produced by Bruce Heezen and Marie Tharp, helped to consolidate Wegener’s hypothesis and illustrated submarine morphology reliably, in a manner similar to what we know today.
Prior to the 19th century, methods for depth measurement were archaic. The “lead and line” system is a classical example: a plumb attached to a line sank through the water column until it reached the seafloor. Such measurements were greatly influenced by vessel drift and marine currents. There were a few attempts to improve the method, e.g., by Hooke in the 17th century and Brooke in the 19th century. In parallel, the lack of knowledge about the seafloor allowed sometimes eccentric hypotheses to emerge, such as a perpetual ice cover over the seabed (French naturalist Péron, 1805, mentioned in Thompson and Murray, 1885, Report on the Scientific Results of the Voyage of H.M.S. “Challenger”, p. 37). The laying of the first telegraph cable across the Atlantic, in the 1850s, was an event of great importance. It created a need for better knowledge of ocean depths and encouraged technological advances.
By 1920, the development of acoustic technology changed the course of seabed mapping, and “lead and line” measurements were gradually replaced by echo sounders. The German Meteor expedition (1925–1927) surveyed the South Atlantic Ocean using echo-sounding equipment and other oceanographic tools. During this expedition, among other findings, the continuity of the Mid-Atlantic Ridge was observed; the ridge is now known to be by far the largest geological formation on Earth. Seismic methods also came into use during this period, with the first records in water-covered areas in 1926 (historical events can be found in ), and the first important marine survey with this technique was conducted in 1938 . During the 1950s and 1960s, substantial progress was made on electronic stabilization, interferometric modulation, and positioning . In the 1970s, new perspectives for acoustic seabed mapping were opened by a revolutionary multibeam bathymetry system . In 1977, during the Jean Charcot expedition, the first non-military multibeam systems were employed: Seabeam and Hydrochart, for deep and shallow waters, respectively. These systems greatly shaped the future of seabed mapping and the quality of bathymetric surveys , already enabling the inference of seafloor characteristics from backscatter data . After these enhancements to acoustic systems, the science of seabed classification saw major developments and innovations, e.g., geomorphological and habitat classification using geophysical data such as bathymetry and side-scan sonar. In recent years, correlations between acoustic properties and biological and/or geological parameters have been used successfully, although this type of transformation of acoustic properties into physical properties has been employed in submarine acoustics for substantially longer .
Over time, seabed mapping gained further importance due to technological developments, providing substantive contributions to a better understanding of tectonics and ocean spreading. In addition to its unquestionable scientific importance and essential relationship with other oceanographic fields, the role of seabed mapping has been paramount for national and international jurisdictional limits, first established in 1982 with the Law of the Sea. This subject is closely related to fields such as economic zoning, natural resources management, and the exploration of marine mineral and oil and gas resources. The main points and implications of Article 76 of the United Nations Convention on the Law of the Sea were reviewed , identifying, among other issues, the importance of geophysical methods. Even today, one often hears or reads—with a degree of frustration or regret—that we know more about the moon’s surface and even some other planets than about our own oceans, and that around 82% of the ocean floor is as yet unmapped . A notable example is the mapping of the southeastern Indian Ocean conducted during the search for a Malaysia Airlines flight that disappeared. As a byproduct, that dataset revealed interesting geomorphological findings that improved the understanding of that region of the ocean floor . Alongside the efforts of “The Nippon Foundation—GEBCO Seabed 2030 Project” to map the ocean floor by 2030, the importance of seabed mapping for several fields is also corroborated by the United Nations declaration of the Decade of Ocean Science for Sustainable Development (2021–2030), which includes mapping the oceans among its research and development priorities.
Here, the goal is not to provide an audacious and challenging historical review of marine sciences , nor an ocean mapping review , as either would demand a large historiographical effort and a study of cartographic aspects. The purpose is to provide a consistent picture of seabed mapping supported by scientific studies, to visualize the evolution of mapping methods (from the 1930s until now), and to emphasize the main trends and challenges. In this context, the analyses were based on a set of scientific manuscripts (dated between 1930 and 2019) retrieved from the Brazilian virtual library “Portal de Periódicos da CAPES”. The web search was conducted using the keyword “seabed mapping”. The assessments comprised primarily a vocabulary analysis, presenting statistical results based on the most frequent words in the set of manuscripts from each decade. The logic was to apply an approach common in “machine learning” and “data science”, in which an algorithm first scans a dataset and reveals results and connections that would be difficult for a single individual to notice, rather than the usual sequence in which a hypothesis is posed and then tested for validity against data .
Nevertheless, it is important to recognize that there are very relevant book references that are not considered in this analysis, as they are not necessarily included in citation libraries. These are mainly older scientific books, such as the seminal Submarine Geology (1948) by Francis Shepard and The Floors of the Oceans: I. The North Atlantic (1959) by Bruce Heezen, Marie Tharp, and Maurice Ewing. Also, there are important publications associated with geological/hydrographic surveys that were not included in the analysis. The importance of these agencies is unquestionable and difficult to measure. Among other functions, in most nations these agencies have played a lead role in data collection, interpretation, and the publication of contributions to geosciences (e.g., natural resource sustainability and fundamental geological mapping) . An important goal of these geological/hydrographic surveys is to produce the information required by governments to map the limits of economic zones and to provide basic seafloor knowledge for national or multilateral marine spatial planning.
The applied methodology is summarized in the following main steps: (i) search for scientific literature; (ii) elaboration and processing of the input data (text files); (iii) performance of statistical analyses.
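Steps (ii) and (iii) above can be illustrated with a minimal per-decade word-frequency sketch. This is not the authors' actual pipeline; the stop-word list, file organization, and corpus strings are illustrative assumptions.

```python
import re
from collections import Counter

# Illustrative stop-word list; the actual analysis presumably used a fuller one.
STOPWORDS = {"the", "of", "and", "a", "in", "to", "is", "for", "with", "on"}

def word_frequencies(text, top_n=10):
    """Tokenize a text corpus and return the top_n most frequent content words."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS and len(t) > 2)
    return counts.most_common(top_n)

# Hypothetical per-decade corpora (in practice, the concatenated manuscript texts).
corpora = {
    "1970s": "echo sounder survey of the ocean floor using echo sounding",
    "2010s": "multibeam backscatter habitat mapping and seabed classification",
}

for decade, text in sorted(corpora.items()):
    print(decade, word_frequencies(text, top_n=3))
```

Comparing the resulting rankings across decades is what reveals the vocabulary shifts discussed below (e.g., from "echo" toward "multibeam" and "habitat").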
Scientific manuscript mining using “seabed mapping” as the keyword collected a total of 454 papers dated between the 1930s and 2019. Supplementary Material (S1) provides the complete reference list.
The results and discussion presented here provide an overview of research concerning seabed mapping, making clear its importance and worldwide adoption across several fields.
Overall, lexical analysis based on extensive reference data mining allowed us to trace the evolution of the state of the art in seabed mapping through the scientific literature over almost a century. Seabed mapping started with words indicating exploratory investigations (“ocean”, “interpretation”, “investigation”), shifted toward terms reflecting the significant influence of technology on scientific outcomes (“echo”, “side scan”, “multibeam”), and then incorporated a more integrated approach, using terms such as “habitat mapping” and concepts of seabed classification and backscatter, involving statistical analyses that try to predict biological distributions. Growth in scientific production related to seabed mapping is observed over the decades. It may follow the trend of investments in research, science, and technology, but it is also related to national and international demands: defining countries’ exclusive economic zones, interest in marine mineral and oil and gas exploitation, marine sites for renewable energy, the need for spatial planning, and the scientific challenge of understanding climate variability and tectonic processes.
The range of applications for seabed mapping is clear. The perspective for the 2020s is that scientific production in seabed mapping will follow new trends: technological advances in autonomous or unmanned surface vessels for hydrographic surveys; the application of ever more advanced statistical modelling and artificial intelligence to predict and automate seabed mapping and the prediction of biodiversity distribution; and the use of even higher-resolution maps, in both shallow and deep waters, to improve marine spatial planning. Possibly the most important outcome of this decade would be a high-resolution global map of the ocean floor, with an open dataset that will flood the scientific community with data to produce a better understanding of our oceans.
The well-known seafloor map produced by Marie Tharp and Bruce Heezen influenced generations of marine geoscientists, and the expectation for this decade is to produce a map that will similarly generate a new vision of our ocean’s seabed morphology, influencing future generations. Based on the set of publications, it was possible to note some methodological trends through the word frequency and similarity graphs. The currently observed trend toward repeatable and quantitative methodologies is consistent with the approach adopted in this review, in which statistical calculations were employed to assess a variable that is essentially qualitative: words.